What are My Thoughts on Age of Sigmar?

I have none, because we haven’t seen enough full information – in context – to make any informed opinions.

And neither have you. I get it, change is scary. But stop whining on the internet about AoS before you have all the information. Please? It’ll make the transition much more pleasant for you, me, and everyone else.

I’m flabbergasted by how quickly it all went from “ok, this looks like it could be fun and interesting,” to “ZOMG! The sky is falling! F-you GW! This is the most ridiculous and crappy game EEEEHHHVVAR!”

And it hasn’t even been officially revealed yet. Careful; your knee is jerking so hard you might do yourself an injury.

I do have one final, parting thought to leave you with:

If you want a balanced, tournament-friendly (and 1st-party supported!) Fantasy massed-battle game that plays like a “Warhammer 9th” – basically what everyone complaining the loudest seems to be lamenting Age of Sigmar is not – then I humbly suggest you go check out Kings of War. 2nd Edition is right around the corner, with the beta rules available for free download. A number of Warhammer Fantasy armies port over to KoW with little-to-no modification or need to buy new models. It’s fast, deceptively simple, fun, well written, and actively supported. If you’re up in arms about AoS, it wouldn’t hurt to check it out.

So I’m Building a New PC

I mentioned a few weeks back I was considering my choices for how to upgrade my aging computer equipment, and of the choices, building my own custom PC would be the most rewarding path to take. I swithered a bit on whether I really wanted to do this, but in the end I gave in to the temptation to build something entirely my own.

Great, I know what I want to do, now how do I get there? It’s been several years since I built a PC1, and I haven’t been keeping up with the trends, or what’s the latest and greatest in terms of performance, price, or anything really.

I had a few ideas of what I wanted – it needed to be small, as space in the office is at a premium. It needed to be as powerful as I could afford, so it would last a decent amount of time until it needed major upgrades, while being flexible enough to tackle many different types of task – development, gaming, photo (and potentially basic video/audio) editing, for example. In a perfect world, I wanted it to be as quiet as possible and look good.

The last few weeks have been spent doing research, going back and forth over potential configurations using PC Part Picker before settling on an outline of what I wanted. I took it over to /r/BuildAPC for a sense check, and was told my best bet was to change the graphics card for something more powerful than I had picked out. I rejigged a few things to make that possible, and ended up with the spec below:

Type          Item
CPU           Intel Core i5-4690K 3.5GHz Quad-Core Processor
CPU Cooler    Cooler Master Nepton 120XL 76.0 CFM Liquid CPU Cooler
Motherboard   Asus MAXIMUS VII IMPACT Mini ITX LGA1150 Motherboard
Memory        Corsair Vengeance Pro 8GB (2 x 4GB) DDR3-1866 Memory
Storage       Samsung 850 EVO-Series 250GB 2.5″ Solid State Drive
Video Card    Gigabyte GeForce GTX 960 2GB Video Card
Case          Silverstone FT03B-MINI (Black) Mini ITX Tower Case
Power Supply  Silverstone 500W 80+ Gold Certified Fully-Modular SFX Power Supply

The graphics card might still be swapped for another, similarly specced one, but otherwise this is what I’ll be building in a little over a week’s time, when I have some time off. I’ll be talking more about the build closer to the time, as I have a few things planned which will make it a bit more interesting than just a straight PC build.


  1. It was in 2008. I checked my order history. 

Strange iPhone Reboots

Over the last couple of weeks, my iPhone 5S has been rebooting itself during the night. Once (last Saturday) it got stuck in a reboot loop on the Apple logo screen. Strangely, it seemed to be emitting some kind of tone every time it restarted… maybe that was my woken-at-3am brain imagining things, but I’m sure it also made a noise in the early hours of this morning when it rebooted.

The most annoying thing about this is that it’s only happening at night, while I’m asleep. I know it’s happening because my lock screen tells me so, and I can’t use TouchID to unlock the phone. That, and the fact that the display flashing up the stark white loading screen sometimes wakes me up. Throughout the day, everything appears fine. It’s really quite bizarre.

I’d reset the phone to factory settings, but there are a couple of security-related apps installed which would be a massive PITA to have to de-authorise and set up again.

Has anyone else experienced this?

Mistakes Were Made (Google Account Follow-up)

Earlier on I was trying to find a way to “downgrade” a Google Apps account to a personal account. Well, I found a way. Kinda. Ok, not really – I slipped up and deleted my Google account.

I was a bit naive about what removing a Google Apps subscription entailed. In the absence of any clear documentation, I assumed (hoped, really) that it would remove the baggage of Google Apps, leaving me with a normal personal Google account (especially as the account predated Apps). It didn’t actually remove Google Apps… but it did remove my access to pretty much every useful Google service. I was locked out of Drive/Docs, Browser Sync… everything I use on a regular basis.

It turns out that if you want to be rid of Google Apps, cancelling your subscription is only a partial measure. In most services, “cancel subscription” means “I’m done, so remove all my stuff and let me go”; with Apps, you have to cancel, and then take the non-obvious extra step of explicitly deleting your domain from the service.

At this point, my choice was either to buy a new subscription to Apps, putting me back at square one (only now paying for it), or to completely delete everything to do with the Apps account. So deletion it was.

Eventually I tracked down where in the mess that is the Apps admin area I could find the delete domain button, held my breath, and clicked.

Milliseconds later I was dumped out of Google Apps, and everything was gone. Everything. Even the stuff you’d forgotten about, like your Google+ profile, OAuth logins to other sites, logins on other devices, and accounts you forgot were merged, e.g. my YouTube account and subscriptions. My iPhone complained, WordPress complained, Feedly complained, Chrome complained, and so did many, many more! Years of settings, data, and integrations, gone in a button click.

Immediately I had a wave of regret, but also a slight sense of a weight being lifted. I no longer had to worry about the schizophrenic nature of my old account. If I wanted to try a new Google service, I didn’t have to wait for it to be Apps-enabled. Yes, a whole bunch of data was gone, but in a way, that was good. I would be starting over from scratch, without all the cruft that had accumulated over the many years.

So I guess it’s not that bad, really. Just a little inconvenient in the short-term. I’ve created a new account, relinked any complaining devices, and generally started rebuilding.

But please, Google, make the whole Apps/Account integration more user-friendly!

Google Account Frustrations

I like to think of myself as generally a smart person. I have my weaknesses, but I’m usually pretty good at figuring something out – particularly if it’s tech related. Problem solving is generally one of my strong points.

So why, oh why, can I not figure out how to “downgrade” or migrate a Google Apps account to a “normal” Google account?

For background, I have a legacy Google Apps account, from when I used to run my own-domain email account through the service. I switched to Fastmail a couple of years ago, but by this point the Apps account was my “main” Google account – the one I was logged into all the time and thus had my data attached to.

I wanted to get rid of the Apps part of the account, as it causes some weird issues now and again, doesn’t work with all Google services, and I don’t use it for the intended purpose any more.

But it’s increasingly looking like this might not be possible. I can think of a number of enterprise-y reasons why not, but I can also think of a few use cases where it should be possible to at least allow it. I’ll keep hunting for now.

Synchronising GitHub and an Internal Git server

Note: I found this mini How-To while having a clean-up of my GitHub repositories. I figured it would be worth sharing on my blog. Hopefully it is of use to someone. If you want to play around with the steps, but don’t want to use one of your existing projects, you can use this repository.


The Problem

  1. I have my repository hosted on GitHub
  2. I have an internal Git server used for deployments
  3. I want to keep these synchronised using my normal workflow

Getting Started

Both methods I’ll describe need a “bare” version of the GitHub repository on your internal server. This worked best for me:

cd ~/projects/repo-sync-test/
scp -r .git user@internalserver:/path/to/sync.git

Here, I’m changing to my local working directory, then using scp to copy the .git folder to the internal server over ssh.

More information and examples of this can be found in the online Git Book:

4.2 Git on the Server – Getting Git on a Server
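As an aside, if you’d rather not copy the .git folder by hand, git clone --bare produces the same bare layout. Here’s a quick sketch, using a throwaway repository in place of a real project (the repository name and commit are made up for illustration):

```shell
set -e
cd "$(mktemp -d)"

# Throwaway stand-in for the local working repository (made-up name
# and commit, for illustration only):
git init -q -b master repo-sync-test
git -C repo-sync-test -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "first commit"

# clone --bare produces the same bare layout as copying the .git folder:
git clone -q --bare repo-sync-test sync.git

# Then ship it to the internal server as before, e.g.:
# scp -r sync.git user@internalserver:/path/to/sync.git
```

Either route ends with a bare repository (no working copy, core.bare set to true) ready to sit on the internal server.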

Once the internal server version of the repository is ready, we can begin!

The Easy, Safe, But Manual Method:

        +---------+          +----------+       /------>
        | GitHub  |          | internal | -- deploy -->
        +---------+          +----------+       \------>
             ^                     ^
             |                     |
             |     +---------+     |
             \-----|   ME!   | ----/
                   +---------+

This is the method I’ve used before, and it’s the least complex. It needs the least setup, but doesn’t sync the two repositories automatically. Essentially, we are going to add a second Git remote to the local copy, and push to both servers as part of our workflow:

In your own local copy of the repository, checked out from GitHub, add a new remote a bit like this:

git remote add internal user@internalserver:/path/to/sync.git

This guide on help.github.com has a bit more information about adding Remotes.

You can change the remote name of “internal” to whatever you want. You could also rename the remote which points to GitHub (“origin”) to something else, so it’s clearer where it is pushing to:

git remote rename origin github

With your remotes ready, to keep the servers in sync you push to both of them, one after the other:

git push github master
git push internal master

  • Pros: Really simple
  • Cons: It’s a little more typing when pushing changes
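As a variation on this method, Git can also push to both servers with a single command, by giving one remote a second push URL. Below is a self-contained sketch; in an existing clone you would only need the two set-url lines, and the URLs here are the placeholders used in this post, not real endpoints:

```shell
set -e
# Throwaway repository for illustration only.
cd "$(mktemp -d)"
git init -q repo && cd repo
git remote add origin https://github.com/example/repo-sync-test.git

# A remote can carry several push URLs. Adding any push URL stops the
# fetch URL being used for pushes, so re-add the GitHub URL first:
git remote set-url --add --push origin https://github.com/example/repo-sync-test.git
git remote set-url --add --push origin user@internalserver:/path/to/sync.git

# "git push origin master" would now push to both servers in one go.
git remote -v
```

The trade-off is the reverse of renaming the remotes: a single push updates both servers, but you lose the ability to push to only one of them without extra flags.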

The Automated Way:

        +---------+         +----------+        /------>
        | GitHub  | ======> | internal | -- deploy -->
        +---------+         +----------+        \------>
             ^
             |
             |              +---------+
             \------------- |   ME!   |
                            +---------+

The previous method is simple and reliable, but it doesn’t really scale that well. Wouldn’t it be nice if the internal server did the extra work?

The main thing to be aware of with this method is that you wouldn’t be able to push directly to your internal server – if you did, then the changes would be overwritten by the process I’ll describe.

Anyway:

One problem I had in setting this up initially is that the local repositories on my PC are cloned from GitHub over SSH, which would have required a lot more setup to let the server fetch from GitHub without any interaction. So what I did was remove the existing remote, and add a new one pointing to the HTTPS URL:

(on the internal server)
cd /path/to/repository.git
git remote rm origin
git remote add origin https://github.com/chrismcabz/repo-syncing-test.git
git fetch origin

You might not have to do this, but I did, so best to mention it!

At this point, you can test everything is working OK. Create or modify a file in your local copy, and push it to GitHub. On your internal server, do a git fetch origin to sync the change down to the server repository. Now, if you were to try a normal git merge origin at this point, it would fail, because we’re in a “bare” repository. And if we were to clone the server repository to another machine, it would still reflect the previous state of the branch.

Instead, to see our changes reflected, we can use git reset (I’ve included example output messages):

git reset refs/remotes/origin/master

Unstaged changes after reset:
M   LICENSE
M   README.md
M   testfile1.txt
M   testfile2.txt
M   testfile3.txt

Now if we were to clone the internal server’s repository, it would be fully up to date with the repository on GitHub. Great! But so far it’s still a manual process, so let’s add a cron task to remove the need for human intervention.

In my case, adding a new file to /etc/cron.d/, with the contents below was enough:

*/30 * * * * user cd /path/to/sync.git && git fetch origin && git reset refs/remotes/origin/master > /dev/null

What this does is tell cron that every 30 minutes it should run our command as the user “user”. Stepping through the command, we’re asking it to:

  1. cd to our repository
  2. git fetch from GitHub
  3. git reset like we did in our test above, while sending the messages to /dev/null

That should be all we need to do! Our internal server will keep itself up-to-date with our GitHub repository automatically.

  • Pros: It’s automated; only need to push changes to one server.
  • Cons: If someone mistakenly pushes to the internal server, their changes will be overwritten
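As an aside, there is an alternative to the fetch + reset pair: a bare repository can be brought up to date in one step by fetching straight into the branch with an explicit refspec. This isn’t the method described above, just another option; here is a self-contained sketch with made-up repository names and commits:

```shell
set -e
cd "$(mktemp -d)"

# Throwaway stand-ins: "upstream" plays GitHub, "sync.git" the mirror.
git init -q -b master upstream
git -C upstream -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "first commit"
git clone -q --bare upstream sync.git

# A new commit arrives upstream...
git -C upstream -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "second commit"

# ...and one fetch with an explicit refspec force-updates the bare
# repository's master branch directly; no reset step needed.
git -C sync.git fetch -q "$PWD/upstream" +master:master
```

The leading “+” allows non-fast-forward updates, so this carries the same caveat as the cron approach: anything pushed directly to the mirror will be overwritten.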

The Upgrade Conundrum

I’m in the market for a new computer1, but I have no idea what way to go. I’ve been making do with older kit for the last few years, but all of it is pretty much at the end of its usable life.

I recently set up a new “office” area in the house, and the way I did it allows me to swap between my work-supplied laptop, and a computer of my own, just by plugging into the right monitor input and swapping a USB cable. This setup also allows my son to make use of the desk if he needs to.

Until recently, the computer I used most around the house was a 9 year old Dell Latitude laptop which I had made usable by putting an SSD into it, and building a lightweight Arch Linux installation. This was primarily because a laptop was all I had space for. Actually, I tell a lie – the “computer” I use most is my iPhone, but for times the iPhone can’t cut it (for whatever reason) I used the Dell2. While this arrangement worked, it showed its age, and it was fiddly at times.

I’ve had a 6 year old Mac Mini lying around for a while, doing nothing. It’s only barely more powerful than the Dell3, and the one time I had it plugged into the living room TV, it was just plain awkward to use. With the new office I was able to plug it in to a proper monitor/keyboard/mouse arrangement which made it more viable. So this past weekend I took the SSD from the Dell, put it in the Mac, and made that my “home computer.” It’s just fast enough to not induce rage when trying to do anything more taxing than surf the web and other light duties.

Now I’ve got a “proper” desk and space, I’ve been thinking I should look at getting something which will last me another few years. The cheapest upgrade I could do is to spend ~£60 and double the RAM in the Mac Mini, going from 4GB to 8GB. I’m sure that would give a noticeable boost to OS X, but it doesn’t really change the fact the system is on borrowed time. It could buy me another 6-12 months, but at some point, likely soon, something is going to fail. The way I see it, my choices are:

  1. Buy a newer Mac, probably a laptop for flexibility (plus that’s where all their non-iOS/Watch innovation seems to be going).
  2. Buy a Windows laptop.
  3. Build a custom PC.

Of the choices, #3 is likely the most satisfying, and would have the most upgrade potential further down the line, though I would be constrained later by choices I made now. It also has the potential to get very expensive; I priced up a high-end Mini-ITX system for a bit of fun, and it came to roughly £1000 before choosing a suitable graphics card. I could definitely price something for less, and would probably have to, but it would have to be weighed against longevity of usable performance and upgradability. I am a little space constrained, so a massive tower is never going to be practical, but there are plenty of options between Mini-ITX and mATX nowadays.

A Windows laptop feels like it would be a cop-out, and there’s not much out there I feel inspired enough to part with my money for. There’s a couple of nice laptops I’ve seen4, but none I feel would last as long as I’d like them to.

Getting a new Mac has been the direction I’ve been leaning towards for a while, but I’ve always struggled to justify it vs. other spending priorities. Plus, when you factor in how fast Apple iterate their hardware and the lack of after-sale upgradability, you’re always hoping to “time it right”. That said, as an iPhone/iPad owner there’s a lot of upside to getting a Mac, for example: close integration through Handoff/Continuity (granted, which I can’t currently use with the Mini), and iCloud Photo Library. I guess I could set up something more “cross-platform” for the photo library, using Dropbox, but I found Apple’s solution to be that little bit nicer to work with.

So the gist of this much-longer-than-I-planned stream of consciousness is that I need to start thinking about replacing the old and almost-busted computer kit I have with something new. I don’t know what that will be yet, and I’d hoped getting my thoughts out would help me focus my mind on what I want to do.

No such luck though. Any ideas?


  1. Anyone who knows me probably knows I’ve actually been talking about it for ~4 years. 
  2. And what of my iPad? I mainly just use it for Hearthstone and Games Workshop rulebooks. Since iOS 8 (I think), my iPad has taken a huge hit in performance, and just isn’t as capable as it once felt. 
  3. On paper, at least. In practice it was severely hamstrung by the old-school HDD and running OS X. 
  4. My work laptop is quite nice; it’s a Dell Ultrabook, thin, light, and performant enough. But the consumer pricing is higher than I’d value it at.