I recently had fiber broadband installed at the house. This meant switching provider and getting a whole new router. ISP routers, by and large, are terrible, and this one was the type that only allows changing a limited set of options through the web-based admin page.

For a while it was working well enough, but I started getting lots of DNS issues; accessing sites was terribly slow due to looooooong lookup times – when the lookup succeeded at all! I looked for the option to switch to the OpenDNS servers, but there was no way to do this through the UI.

Of course, I figured someone had to have run into and fixed this problem before, and with a little hunting around, I was proved right – Pete Cooper had documented how to change these settings through the archaic and arcane wonder of telnet.

Logging into my router through the console using Pete’s instructions, it soon became apparent his steps had been broken by a firmware update – only a couple of the commands worked. But now that I had a lead, I was sure I could figure it out. With a little digging around, and judicious use of the help command, I was able to put together this sequence of commands to update the DNS settings:

# List your current DNS servers
dns server forward dnsset list
# Add a new primary DNS server with higher priority than the default
# (208.67.222.222 and 208.67.220.220 are the OpenDNS resolvers)
dns server forward dnsset add set=0 dns=208.67.222.222 label=None metric=4 intf=Internet
# Add the secondary as above
dns server forward dnsset add set=0 dns=208.67.220.220 label=None metric=4 intf=Internet
# Save our changes (saveall is the save command on this style of Thomson/Technicolor CLI)
saveall

With the commands entered, my web surfing instantly got a massive speed boost as the DNS issues went away :) I should point out that I left the default PlusNet servers in there as back-up. If for some reason I can’t connect to OpenDNS, the router will fall back to the PlusNet DNS.


RIP Winamp

Chris McLeod —  Dec 22 2013 — Leave a comment

Winamp shut down yesterday. Even though I hadn’t used it in years, this makes me a little sad, as Winamp was iconic. It was a hero of the early world-wide web, helping to kick-start the internet music age for a great many people like myself.


I first discovered Winamp around 14 years ago, during my first year at university. Back then, you could run Winamp from any old folder without installing it, so everyone used to have a copy in their network profile. This was the early days… MP3s were still a rarity here in the UK, so you would listen mainly to CDs (Windows Media Player was a world of suck on Windows NT), or the 2-3 MP3s you had downloaded from Napster.

As time went on, MP3s became more and more common, and Winamp became the de facto music player for a lot of people. Imitators sprang up elsewhere. It was small, customisable, and with plugins was able to do almost anything – like managing an MP3 player, if you were the early adopter who splashed out a few hundred for one of the early, pre-iPod devices. Ahem.

Then the iPod happened, and with it, iTunes. Once iTunes for Windows hit, that was the end of Winamp’s glory days. Owned by AOL, it sank into irrelevance. Full-blown music library management, with integrated store and device management, was the order of the day – all things Winamp was woeful at, even with plugins – relegating it to a niche of nostalgia and a small number of users who couldn’t do without some feature or other. Winamp 3 was a mess, and Winamp 5.5 moved away from the minimal UI. There was even an Android version. It was terrible.

By that time, we had all moved to streaming music services. Why store gigabytes of music files on your computer, when someone else can do it for you, and high-speed access is increasingly common? The need for an application like Winamp kept shrinking. At least Spotify has honoured its legacy by releasing Spotiamp.


And so yesterday, Winamp ceased to be. The site is still there, and for now at least, it seems you can still download v5.666… but that will be turned off soon.

So long, Winamp. You really whipped that ass for as long as you could.

Setting Up Chef

Chris McLeod —  Dec 20 2013 — Leave a comment

I just finished setting up Chef, to have a play around with this DevOps stuff I keep hearing about. While Chef is quite well documented, I found myself struggling in places where things weren’t quite clear enough. So naturally, I’m posting how I got myself up and running.

[Note: I haven't actually done anything with this setup yet, other than get it working.]

Step One: Get A Server

There are two parts to a Chef install: the client and the server. You can run both on one machine, but given how much Chef slows down my Joyent VM, I’d suggest keeping it off your day-to-day workstation.

I used my Joyent credit to set up a new Ubuntu 12.04 64-bit server. Chef Server only supports 64-bit Ubuntu or RedHat/CentOS. Once the server was provisioned, I followed this 5-minute guide to lock down the server enough for my needs (this being just an experiment and all…)

Step Two: Set the Server FQDN

Once the server is prepared, make sure it has a resolvable, fully qualified domain name before going any further. While the Chef docs do mention this, they do so after the rest of the setup instructions. This was one area where I was banging my head for ages, wondering why the bundled Nginx server wasn’t working.

Setting the hostname on my Joyent VM was a case of running:

    $ sudo hostname 'chef.example.com'
    $ echo "chef.example.com" | sudo tee /etc/hostname

As I wasn’t on the same network as my Chef server, I added a DNS A record to match the server FQDN.
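If you can’t (or don’t want to) publish a public DNS record, an entry in the server’s own /etc/hosts is usually enough for the install to resolve its FQDN. A sketch, with a placeholder IP (192.0.2.x is the documentation range – substitute your server’s real address):

```
192.0.2.10    chef.example.com    chef
```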

Step Three: Install Chef Server

This bit was really easy, probably the easiest part of the whole setup. In short: download the latest Chef Server package for your platform, install the package, run the reconfigure tool. In my case, this was:

    $ wget https://opscode-omnibus-packages.s3.amazonaws.com/ubuntu/12.04/x86_64/chef-server_11.0.10-1.ubuntu.12.04_amd64.deb
    $ sudo dpkg -i chef-server_11.0.10-1.ubuntu.12.04_amd64.deb
    $ sudo chef-server-ctl reconfigure

The Chef installer will whirr away, using Chef to set up your new installation automatically. How cool is that?

Step Four: Copy Server Certificates to Your Workstation

This wasn’t mentioned anywhere I could see, but I pieced it together from snippets written around the web. To successfully set up the Chef client, you need some security certificates from your new server. I used SCP from my local PC:

    $ scp user@chef.example.com:/etc/chef-server/admin.pem ~/tmp/
    $ scp user@chef.example.com:/etc/chef-server/chef-validator.pem ~/tmp/

If you find you don’t have permission to copy directly from their default location, SSH to the server and sudo copy them to somewhere you can.

Step Five: Install the Chef Client

Now we should be armed with everything we need to install the client tools. I’m using the Debian-derived Crunchbang, but any *NIX-based OS should be roughly the same as below. If you’re on Windows, I’m afraid you’re on your own.

Run the Chef “omnibus installer”:

    $ curl -L https://www.opscode.com/chef/install.sh | sudo bash

Create a .chef folder in your home directory, and add the certificates copied from the server:

    $ mkdir ~/.chef
    $ cp ~/tmp/*.pem ~/.chef

Configure Knife (the main Chef CLI utility):

    $ knife configure --initial
    WARNING: No knife configuration file found
    Where should I put the config file? [/home/chris/.chef/knife.rb] /home/chris/.chef/knife.rb
    Please enter the chef server URL: [https://localhost:443] https://chef.example.com:443
    Please enter a name for the new user: [chris]
    Please enter the existing admin name: [admin]
    Please enter the location of the existing admin's private key: [/etc/chef-server/admin.pem] /home/chris/.chef/admin.pem
    Please enter the validation clientname: [chef-validator]
    Please enter the location of the validation key: [/etc/chef-server/chef-validator.pem] /home/chris/.chef/chef-validator.pem
    Please enter the path to a chef repository (or leave blank):
    Creating initial API user...
    Please enter a password for the new user:
    Created user[chris]
    Configuration file written to /home/chris/.chef/knife.rb
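For reference, the knife.rb generated by a session like the one above ends up looking roughly like this – the values reflect my answers, yours will differ, and the exact set of settings can vary between Chef versions:

```ruby
log_level                :info
log_location             STDOUT
node_name                'chris'
client_key               '/home/chris/.chef/chris.pem'
validation_client_name   'chef-validator'
validation_key           '/home/chris/.chef/chef-validator.pem'
chef_server_url          'https://chef.example.com:443'
```

It’s a plain Ruby file, so if anything about your setup changes later (a new server URL, say), you can just edit it directly rather than re-running `knife configure`.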

Test Knife by listing all users:

    $ knife user list

Wrap Up

That’s it! You now have a working Chef installation. Or at least, I do. Steps two and four are the steps I had to hunt out and piece together myself to get Chef up and running. Everything else is more or less as documented.

All that’s left to do now is figure out how to use Chef!

Thanks to the travelling, my body-clock was a little on the fritz, which meant I was wide-awake at around 4am local time. Not ideal, but it meant I got to see a fairly spectacular sunrise, coming up over San Francisco Bay. Feeling a little inspired, I set my camera up on its tripod, opened the window shutters, and experimented with a few long-exposure shots. I need a little more practice (sunset, anyone??), but I’m pretty pleased with how a couple of the shots came out.

The remaining photos are from a walk I took along the pier-front (Embarcadero). I didn’t go all the way along – I was tempted to walk right around as far as Fisherman’s Wharf (I started at South Beach), so I might get a few snaps of the Golden Gate Bridge, but I decided I would cut back into downtown at Market Street, so I could get a few things for my stay. I think I’ll make my trip up that way on Tuesday, perhaps taking in a ferry ride of the Bay, and maybe a tour of Alcatraz.

As I’m writing this, I’m 36,000 feet over Canadian airspace, on my way to San Francisco (you may have guessed this already, from the title). By the time you’re reading this, I should be safely on the ground again (no in-flight wifi to let me post from the air). It’s a little bit of an impromptu visit; I certainly hadn’t dreamed I’d be making this trip, even as recently as a couple of months ago. But that’s by-the-by at this point – there’s no turning back now!

This will be only my second trip to the United States – my first being Houston in 2011 – so I’ll be very interested to see the (no doubt many) differences. It’s only a short trip too, as I fly back to the UK on Wednesday, so I’ll need to try to cram a lot in to make the most of it!

I have one particular bit of business to do while I’m in town1, but the rest of the time is mine, and to be honest, it’s a very welcome break. Things have been so hectic and stressful over the last few months (and not entirely in a good way) that I’m in desperate need of some “R&R”. Hopefully this trip will provide some of that!

As this trip might be a once in a lifetime thing, I’ve packed my full set of camera equipment, so hopefully I can get some memorable photos while I’m here. If I can manage, I’ll try to post them up at the end of each day.

Now, if only I wasn’t missing the live broadcast of The Day of The Doctor, this trip might’ve been even more perfect. Guess what my first priority is, when I land?

  1. That’s a story for another day.2 
  2. Huzzah! WordPress.com finally supports Markdown natively! 

Source: Important Announcement (GW Coverage) – Beasts of War.

In June of this year, GW published a new set of trade terms that their trade customers must adhere to. In these terms was a clause that effectively meant Wayland Games would have been punished for any advance reporting of any GW release by Beasts of War Ltd (Article 9.4), despite Wayland Games not providing any such information to Beasts of War, and despite both companies being separate.

As a result we feel there is no option but to abide by terms set out by GW.

I’m as much a GW fanboy as the next guy, and I try to give them the benefit of the doubt in many of the questionable moves they’ve made over the last couple of years, despite how they treat the wider community.

But this is really a dick move as far as I’m concerned.

I’ll hopefully manage to post a more in depth entry about this later, but I’m super busy with work at the moment, so for now I just wanted to get this out there.

In the meantime, if you want to support Beasts of War, please consider buying a Backstage Pass.

Those of you who follow me on Twitter probably know I was at Games Day 2013 in Birmingham yesterday. There’s a photo gallery coming later with the many pictures I took through the day, but I’m still waiting on those to transfer off my iPad and upload to the server. In this post I’ll give a bit of an overview of the day, and recount my experiences.

I’ll mention up front that this was my first Games Day in a long time. The last time I attended, I was still young enough to enter the Young Bloods painting competition, so it’s been 18-19 years at least!

This entry is very long, so it’s behind a “Read More” link. Photos will be in a separate gallery post.

It started as a throwaway comment:

But like all interesting ideas, it took root in my brain, and I started to wonder “no, really, what if…”

This morning I woke up deciding to finally learn Vim. It’s been on my “to-do” list for quite a while – years, in fact. My old boss used to be a bit of a Vim guru, who could edit files on the Linux boxes 10x faster than I could with GEdit, which I used to marvel at as a junior.

Beyond the “code from my iPad” idea above, I have a couple more reasons for learning Vim:

  • Vim is near universal. It runs on pretty much any platform, and it’s standard on virtually any Linux distro
  • Since switching back to Linux, I spend half my time in the CLI anyway – Vim cuts down the need to switch away to something like Sublime
  • Having a CLI-based workflow reduces the amount of “stuff” I need to set-up on a computer to write some code

But coming back to the iPad, I’m quite keen to get this working because I’ll be moving house soon, and it’s more than likely I won’t have space for my current PC set-up, and a new laptop is out of my budget for now. So if this works it will let me keep tinkering on projects even without a “proper” computer. I’ve been in this position before, so I know the limitations. This time, however, I’m using an iPad Mini and a Logitech Bluetooth keyboard case.

My Set-up/Learning Resources

To get myself up and running, I’ve installed and configured Vim on my PC, and will install GVim on my work laptop. I’m going to try and use it in place of Sublime Text as much as I can, but I make no promises when it comes to anything work-related, as that generally needs done fast.

I’ve fired up an Ubuntu box on Joyent (starting with the first 5-minutes post I linked to last week), and replicated my PC’s Vim configuration, so I can start working from the iPad straight away. I installed Git, and authorised the box with my Github account, so I can push and pull to my heart’s content.

As I’m just starting out, the configuration I’ve gone for is very basic and minimal, but it gives me room to grow as I get more experienced. Essentially, I’ve installed Pathogen, and applied the recommended beginner defaults.
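For anyone following along, the canonical Pathogen bootstrap is just a few lines at the top of ~/.vimrc (plugins then live as folders under ~/.vim/bundle/):

```vim
" Load everything under ~/.vim/bundle/ onto the runtimepath
execute pathogen#infect()
" Sensible baseline: syntax highlighting and filetype-aware plugins/indentation
syntax on
filetype plugin indent on
```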

For learning resources, I’ve found there’s a heck of a lot out there for Vim. So much so, I’ve chosen to limit myself for now so I don’t get overwhelmed.

  • The Pragmatic Programmers books are somewhere I always check when I want to learn something new. I find their books to be practical, well written, and informative. True to form, Practical Vim (from what I’ve read so far) is an excellent introduction to “real world” Vim.
  • Vim Adventures is a neat, interactive learning tool, which turns learning Vim into a 16-bit adventure game.
  • VimCasts has 50 free screencasts, dealing – unsurprisingly – with learning Vim

Feel free to suggest some more good resources though – as I get more experience I’ll need to branch out into other areas!