MOUNTAIN OF CODE

For my Raspberry Pi Image Manager project I wanted to show a progress bar as an image is downloaded. This was easy: a simple HEAD request returns a Content-Length header in the response telling me the size, from which progress can be calculated.
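That mechanism can be sketched like this. In live use it would be a `curl -sI` against the image's URL; here the response headers are inlined so the example is self-contained:

```shell
# A HEAD request (curl -I) returns only the headers, and Content-Length
# gives the total size the progress bar needs. The live form would be:
#   curl -sI "$IMAGE_URL"
# The captured response below is a made-up example so this runs offline.
headers='HTTP/1.1 200 OK
Content-Type: application/zip
Content-Length: 1387790336'

size=$(printf '%s\n' "$headers" \
    | awk 'tolower($1) == "content-length:" { print $2 }' \
    | tr -d '\r')
echo "image size: $size bytes"
```

The `tr -d '\r'` matters with real responses, since HTTP header lines end in CRLF.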

This worked until I wanted to include the RetroPie image. The RetroPie devs store their images on GitHub; not a problem, you might think, business as usual. GitHub, however, doesn't allow HEAD requests on its downloads, no idea why. So now I had no way of getting the image size without downloading the whole image.

At least not without some hackery...

Read More

I am very much an advocate for two-factor authentication tokens or keys such as a Yubikey. These devices can interface with web browsers through a JavaScript API that browsers expose, or, in the case of Firefox, don't expose.

There is a community-made extension that fills this gap until the Firefox devs get the U2F JS API implemented in version 57 or 58. However, I was never able to get it to work; no matter what I tried and no matter how many times I ran the test, it just kept popping up with a message saying "Please plug in your U2F device".

Then I found the source code on GitHub and the last line in the README contained the key...

Read More

I recently stumbled across the Pirate Box project and instantly loved the idea. Put simply, it's a Wi-Fi based file sharing platform that's completely disconnected from the internet. The only way to access or upload files on a Pirate Box is to be physically within range of its access point.

Once I saw that it had an SD card image for the Raspberry Pi I naturally wanted to build my own but also wanted to do something a little different...

Read More

The team at work and I make extensive use of BitBucket's issue tracker to document and track bugs, issues and suggestions against projects, and this works really well for a single repository.

The problem comes when you want to see all the issues in a group of repositories: you can see issues from ALL repositories, but you can't then order, sort or filter them. This makes it really hard to get an overview of the issues in a related group of repositories.

Fortunately BitBucket has an API and I have a bunch of spare time...

Read More

I was recently involved in setting up a complex, load-balanced, auto-scaling multi-server setup, and to make life easy I wanted to set a header containing the server's hostname so it was clear which server behind the load balancer satisfied each request.

I thought this would be easy... Not so much! But I managed it and here's how...
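As one sketch of the idea, assuming nginx is running on each backend server (the header name is an arbitrary choice; nginx's built-in $hostname variable does the work):

```nginx
server {
    listen 80;
    server_name example.com;   # placeholder

    # $hostname is a built-in nginx variable holding the machine's
    # hostname; "always" emits the header on error responses too.
    add_header X-Served-By $hostname always;
}
```

Apache needs a different mechanism, since its Header directive can't read the hostname directly.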

Read More

Super capacitors are awesome, cheap, easily obtainable and can be a little dangerous. They have a massive energy density for a capacitor and are willing to give up their energy very, VERY quickly.

You don't have to discharge all this energy at once, though, as fun as sublimating copper and throwing sparks is. They can be discharged at any rate, meaning you could connect one up to an LED to power it forevermore, or, for example, a Pi for a good while.
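As a rough sanity check of that claim (the figures here are illustrative, not measurements from the experiment): a capacitor stores E = ½CV², so a 100 F supercap charged to 5 V holds 1250 J, and a Pi drawing roughly 2 W could, ideally, run for about ten minutes:

```shell
# Back-of-the-envelope runtime: E = 0.5 * C * V^2, t = E / P.
# C (farads), V (volts) and P (watts) are illustrative figures.
C=100; V=5; P=2
awk -v c="$C" -v v="$V" -v p="$P" 'BEGIN {
    e = 0.5 * c * v * v          # stored energy in joules
    printf "Energy: %.0f J, ideal runtime: %.0f s\n", e, e / p
}'
```

In practice the runtime is shorter, because the capacitor's voltage falls continuously as it discharges and drops below the regulator's minimum input long before the capacitor is empty.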

So I gave it a go and it worked!

Read More

I maintain and work on a number of repositories on BitBucket, both for work and in my own time, and use a separate account for each. SSH is used to talk to the remotes and I use my Multi SSH Key Manager to manage the keys.

The problem with this is that the remotes for all BitBucket repos have the same username and server, git@bitbucket.org, and as soon as I associate a key for git@bitbucket.org with my work account, I can't associate it with my personal account.

I could link the accounts together and then they could both use the same key but I want to keep them separate, so I needed to find a way of telling Git to use a certain key with a certain remote.

Here's how I did it...
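A common way to achieve this (the alias names and key paths below are illustrative) is one SSH host alias per account in ~/.ssh/config:

```
# ~/.ssh/config -- one alias per BitBucket account.
Host bitbucket-work
    HostName bitbucket.org
    User git
    IdentityFile ~/.ssh/id_rsa_work
    IdentitiesOnly yes

Host bitbucket-personal
    HostName bitbucket.org
    User git
    IdentityFile ~/.ssh/id_rsa_personal
    IdentitiesOnly yes
```

A work repo's remote then becomes bitbucket-work:team/repo.git instead of git@bitbucket.org:team/repo.git, and SSH substitutes the real host and the right key. IdentitiesOnly stops the agent offering every key it holds.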

Read More

A lot of the PHP applications I've worked on that allow file uploads place the files into a directory that is publicly accessible. This isn't a problem, so long as your upload script never, ever allows scripts to be uploaded.

It doesn't matter how good you think your MIME type or extension filtering is: why let the PHP interpreter anywhere near files you never expect to be interpreted?
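One belt-and-braces way to enforce that (a sketch assuming Apache with mod_php; with nginx you'd simply not define a PHP location block for the uploads path):

```apache
# .htaccess inside the uploads directory (or an equivalent
# <Directory> block in the main config). Paths are illustrative.

# mod_php: turn the interpreter off entirely for this directory.
php_flag engine off

# Belt and braces: stop Apache treating .php files as scripts at all.
RemoveHandler .php .phtml
RemoveType .php .phtml
```

With this in place even a successfully uploaded .php file is served as inert text rather than executed.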

Read More

Sometimes you want to delete a file and for it to stay deleted forever: SSH/SSL private keys, sensitive documents, old password databases, etc...

Anyone who has ever accidentally deleted a file or had a hard disk fail knows there are a million and one tools out there that will undelete and recover these files.

This is where shred comes in...
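A minimal sketch of shred in action, on a throwaway file:

```shell
# Create a throwaway "secret", overwrite it 3 times (-n 3), add a
# final pass of zeros (-z) to hide the shredding, then unlink it (-u).
echo "super secret key material" > secret.key
shred -n 3 -z -u secret.key

# The file is gone, and the blocks it occupied have been overwritten.
[ ! -e secret.key ] && echo "secret.key shredded"
```

Worth noting that shred's own manual warns it relies on the filesystem overwriting data in place, so it can't make guarantees on journaling or copy-on-write filesystems, or on SSDs that remap blocks.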

Read More

My PC is on literally 24/7, as I run a FAH (Folding At Home) client and Transmission to seed the RPi and Ubuntu image torrents, but when I want to use my machine I have to pause them both as they bog the machine down.

This was a manual task I had to do every time I sat down and I had to remember to set them both going again when I was done, something I didn't always remember to do at 3AM after a session of "I'll just do a little more..."

Out of habit I lock my PC whenever I leave it and I thought that was an ideal trigger to pause/resume the FAH client and Transmission!

Read More

The other day I was tidying up the various documents, images and scripts that seem to slowly become scattered about my home directory and noticed something strange had happened: the MATE Places menu now had a bookmarks submenu?

I thought I had broken something with my attempts to change the icons in that menu earlier the same day; turns out it's way simpler than that. All that's happening is that if you have more than 8 bookmarks, MATE will show them under a Bookmarks submenu.

So for once I hadn't broken anything; this was actually a feature. I thought I would share this as I couldn't find any mention of it when I was trying to find out what was going on.

Today I was writing a script that needs to run without user interaction and needed to get the latest version of a single file from a private BitBucket Git repo over SSH.

BitBucket allows you to do this over HTTPS, and I could use something like curl or wget with digest auth, but that would require the username and password to be added to the script in plain text...

Not ideal, especially when SSH keys are already set up and are far more secure than passwords. But there is a solution...
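One approach that fits these constraints is git archive over SSH, which BitBucket supports (GitHub notably doesn't). A sketch, using a throwaway local repo in place of the real remote so it actually runs:

```shell
# Real-world form (repo address and file path are placeholders):
#   git archive --remote=git@bitbucket.org:team/repo.git HEAD path/to/file | tar -xO
#
# Demonstrated against a throwaway local repo standing in for the remote:
repo=$(mktemp -d)
git init -q "$repo"
echo "db_host=localhost" > "$repo/app.conf"
git -C "$repo" add app.conf
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -qm "add config"

# Grab just app.conf from the "remote" without cloning anything;
# git archive emits a tar stream, and tar -xO unpacks it to stdout.
git archive --remote="$repo" HEAD app.conf | tar -xO
```

Because the transport is plain SSH, the existing keys (or an ssh-agent) handle authentication with nothing stored in the script.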

Read More

Sometimes you want to clone a repository but you don't want all the history and git stuff too. This is a pretty simple task:

git clone user@repohost:repo-name
cd repo-name
git checkout the-branch-i-want
rm -rf .git

But it can be easier!
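A shallow, single-branch clone gets there in one step, and fetches far less in the first place. Sketched here against a throwaway local repo so it runs; the remote and branch names in the comment are placeholders:

```shell
# With a real remote this would be:
#   git clone --depth 1 --branch the-branch-i-want user@repohost:repo-name
# --depth 1 fetches only the newest commit; --branch picks the branch.
src=$(mktemp -d)
git init -q "$src"
git -C "$src" -c user.name=demo -c user.email=demo@example.com \
    commit -qm "first" --allow-empty
git -C "$src" -c user.name=demo -c user.email=demo@example.com \
    commit -qm "second" --allow-empty
branch=$(git -C "$src" symbolic-ref --short HEAD)

git clone -q --depth 1 --branch "$branch" "file://$src" shallow-copy
count=$(git -C shallow-copy rev-list --count HEAD)
echo "commits in shallow copy: $count"

# .git still exists but is tiny; remove it as before for a clean export.
rm -rf shallow-copy/.git
```

Note the file:// URL: with a plain local path git hardlinks the full repo and ignores --depth.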

Read More

In Linux you can run lsusb on the command line to list all currently connected USB devices, but you may find a number of the entries give a pretty generic or blank name for the device.

This is simply because those devices aren't listed in your copy of the USB ID database. This database can be easily updated with USBUtils' update-usbids.

Awesome, problem solved... So what's the problem? If you run update-usbids and then run it again right away, you'll download the database twice, even though you already have an up-to-date copy...

Read More

I use SSH literally every single day, at work and at home, so for security and because I don't want to spend time typing long secure passwords I use SSH keys for authentication.

What's the problem?

Usually you'll generate a key pair with ssh-keygen, copy the public key to any server you want to log in to, and you're done. So what's the problem with that? Well, if you ever want to renew that single key, increase its length for better security, find out which user and server the key is authorised for, etc., then you are going to have to change the public key on every one of the servers you can access.

It would be much better if we had a key pair per user per server, then we can renew, change or delete a key for a single login. We have complete control.
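A sketch of the per-login workflow (the key filename and comment format are just a convention, not anything SSH requires; a temp dir stands in for ~/.ssh so the example is self-contained):

```shell
# One key pair per user/server combination; the filename and comment
# record exactly which login the key belongs to.
keydir=$(mktemp -d)   # in real use this would be ~/.ssh

ssh-keygen -q -t ed25519 -N "" \
    -f "$keydir/deploy_at_web01" \
    -C "deploy@web01.example.com"

# Install just this key for that one login (placeholder host):
#   ssh-copy-id -i "$keydir/deploy_at_web01.pub" deploy@web01.example.com

# Renewing or revoking later only ever touches this one pair.
ls "$keydir"
```

The comment (-C) travels with the public key into authorized_keys, so on the server side each entry says exactly who and where it belongs to.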

Read More

Every time I need to write a new image to my Pi, usually because I've broken it, I have to look up how to write the image, check mounts, and find and download the latest version of the image I want. Even then I have no idea if dd is actually progressing or how long I'm going to have to wait...

There wasn't really anything out there that could take an image name and a location and do the rest for me. Now there is!
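On the dd progress point specifically, newer versions can report progress natively; a sketch with a scratch file standing in for the SD card:

```shell
# status=progress (GNU coreutils 8.24+) makes dd report bytes written
# as it goes. Writing to a scratch file here; for a real card, of=
# would be the SD card's block device and the command would need root.
dd if=/dev/zero of=scratch.img bs=1M count=32 status=progress conv=fsync
rm scratch.img
```

On older coreutils, sending dd a USR1 signal (kill -USR1 <pid>) makes it print its stats, or pv can be put in the pipeline instead.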

Read More

Using a Raspberry Pi seems to make a lot of sense if you're in the market for a small NAS server: it's low cost, low profile, low energy, etc...

Is It Up To It?

The only possible problem is the storage: the only way to attach any real storage to a Pi is via USB. USB 2.0 has a maximum theoretical throughput of 480 Mbit/s, but I don't think you'll ever get anywhere near that speed.

We only really need to achieve around 10 MB/s, as at that point the Ethernet connection becomes the bottleneck.

Benchmarking

I wanted to see what kind of throughput I could actually get with USB devices connected to a Pi, so I wrote a device benchmarker tool that is able to measure both the read and write speeds of one or more block devices.
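The core of such a write-speed measurement can be sketched with dd alone (a scratch file stands in for a real block device here; the details of the actual tool will differ):

```shell
# Crude write benchmark: time how long 64 MiB of zeros takes to land
# on the device. conv=fdatasync makes dd flush the data out before
# reporting, so the throughput figure isn't just the page cache.
dd if=/dev/zero of=bench.tmp bs=1M count=64 conv=fdatasync 2>&1 | tail -n 1
rm bench.tmp
```

Measuring the read side needs the page cache dropped first (as root: sync; echo 3 > /proc/sys/vm/drop_caches), otherwise the kernel serves the file straight from RAM and the number is meaningless.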

Read More

I've always thought the Linux way of everything auto-updating for you, albeit after asking you first, was the best way forward so it always felt kind of wrong having to run composer self-update manually.

What better tool is there to periodically run a script than cron? It's as simple as adding the following to a file located, at least on Ubuntu, at /etc/cron.daily/composer:

#!/bin/sh
    
# Update Composer to the latest version
composer self-update

It's probably a good idea to redirect the output to a log file with a timestamp, but I'm not that worried. Composer complains at you if it's older than 30 days anyway:

Warning: This development build of composer is over 30 days old.
It is recommended to update it by running "composer self-update" to get the latest version.

There was a new version of the Linux kernel released today, 3.13.0-49. Once again I came across the issue of my /boot partition not having enough free space to fit the new kernel, and was greeted by this message:

The upgrade needs a total of XX M free space on disk /boot.
Please free at least an additional XX M of disk space on /boot

There's an easy, if a little dangerous, fix to be found on the AskUbuntu site.

TL;DR version:

sudo apt-get purge linux-image-3.13.0-{X,Y,Z}-generic

Where X, Y and Z are the versions you want to delete.

I can't help but think there is a way to automate purging all but, say, the last 2 kernel versions any time a new one is released...
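The selection half of that automation can be sketched in isolation. The package list below is a hard-coded sample; on a real system it would come from dpkg-query, and the running kernel (uname -r) should always be excluded as well:

```shell
# Sample "installed kernel packages" list; in real use it would come from:
#   dpkg-query -W -f '${Package}\n' 'linux-image-[0-9]*'
installed='linux-image-3.13.0-45-generic
linux-image-3.13.0-46-generic
linux-image-3.13.0-48-generic
linux-image-3.13.0-49-generic'

# Version-sort, keep the newest two, and everything before them is a
# purge candidate (head -n -2 is GNU head: all but the last 2 lines).
purge=$(printf '%s\n' "$installed" | sort -V | head -n -2)
printf '%s\n' "$purge"

# Feeding that list into "sudo apt-get purge" would do the actual removal.
```

sort -V matters here: plain lexical sort would mis-order versions once the numbers reach double digits.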