Docker swarm monitoring

I’ve seen a bunch of posts lately about how to set up Docker swarm monitoring with tools like cAdvisor and node_exporter, which advise running them like this:

docker service create --mode global -p 9100:9100 ...

That will indeed run one container on each swarm node, but it has a subtle problem: when you connect to host:9100, the ingress network routes you to a random instance on each connection (for swarm-routing values of random). You will indeed get some metrics back, but they will be for whichever host you happened to be routed to this time.

You can demonstrate this problem quite simply with something like:

docker service create --name hello --mode global -p 8080:80 dockercloud/hello-world

Then just reload the page and you’ll see the hostname (container ID) change from time to time. You might see it more readily with curl than with a browser.

A simple solution is to just run these containers on each node as normal docker containers outside swarm management. In the setup I’m working on just now, we used the same Puppet automation that provisions the swarm to start the containers.

docker run -d -p 9100:9100 ...
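For node_exporter, a fuller version of that command might look something like this — the image name, restart policy and mounts are my additions, not from the original setup:

```shell
# Run node_exporter directly on each host, outside swarm management,
# so port 9100 always answers with that host's own metrics
docker run -d --name node_exporter \
  --restart unless-stopped \
  -p 9100:9100 \
  -v /proc:/host/proc:ro \
  -v /sys:/host/sys:ro \
  prom/node_exporter \
  --path.procfs=/host/proc \
  --path.sysfs=/host/sys
```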

Git post-receive hook for Puppet control repo updates

I made a fairly simple post-receive hook setup to automatically update my Puppet master when I push changes to my control repo. I keep the repo in gitolite, so I wanted to use a regular git hook rather than web hook magic (or even magicer Puppet Enterprise Code Manager magic).

My control repo itself is based on the puppetlabs control-repo on github. Essentially the idea is that every branch in the repo becomes a Puppet environment on your master, complete with automatically updated modules based on a Puppetfile. The r10k tool takes care of the heavy lifting here, and its documentation explains how it works in some detail.

But we don’t have the patience for that! First, install r10k on your Puppet master and configure it in /etc/puppetlabs/r10k/r10k.yaml so it can yank your control repo, something like this:
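The original config has been lost from this archive, but a minimal r10k.yaml looks roughly like this — the remote URL is a placeholder for your own repo:

```yaml
# /etc/puppetlabs/r10k/r10k.yaml
:cachedir: /var/cache/r10k
:sources:
  puppet:
    remote: 'gitolite3@git.example.com:puppet'
    basedir: /etc/puppetlabs/code/environments
```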

Make sure you have SSH key relationships set up so that you can pull the repo. Running r10k deploy environment --verbose info should let you see what’s going on. Once it works, continue on.

Create an SSH key on your git server for the user that runs git. In my case on Debian that user is gitolite3, but it’s whichever user your repos run under.

Copy the public key and install it in your Puppet master’s /root/.ssh/authorized_keys:

What’s this, I hear you cry? I’m glad you asked:

Pretty straightforward. Obviously you’ll want to point at wherever you have r10k installed, and make sure the script is executable. This setup takes whatever command you try to run over SSH with this key and appends it to /usr/local/bin/r10k deploy environment. The security-conscious may want to add some extra sanity checks too. I did say it was simple!
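The snippets themselves are lost from this archive, but from the description above they would have looked something like this — the wrapper path, key type and comment are my inventions:

```shell
# /root/.ssh/authorized_keys on the Puppet master: force every use of
# this key through the wrapper script (path is hypothetical)
command="/usr/local/bin/r10k-deploy",no-port-forwarding,no-X11-forwarding,no-pty ssh-ed25519 AAAA... gitolite3@gitserver
```

```shell
#!/bin/sh
# /usr/local/bin/r10k-deploy: append whatever command was requested over
# SSH to the fixed r10k invocation, as the post describes.
# $SSH_ORIGINAL_COMMAND is deliberately unquoted so that multiple
# arguments (e.g. a branch name plus a flag) are split into words.
exec /usr/local/bin/r10k deploy environment $SSH_ORIGINAL_COMMAND
```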

The meat of the matter is the post-receive hook itself. This should go on your git server, inside the puppet control repo’s hooks directory. In my case this is ~gitolite3/repositories/puppet.git/hooks/post-receive. It, too, should be executable.

Like the comment says, you’ll want to make a ~/.config/puppet-update to tell it where your Puppet master lives.
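The hook itself is also missing from this archive. From the behaviour described, a sketch would look something like this — the config variable name and the flag handling are my guesses:

```shell
#!/bin/sh
# post-receive: deploy each pushed branch as a Puppet environment via
# the restricted SSH key on the master.
# ~/.config/puppet-update defines where the master lives, e.g.:
#   PUPPET_MASTER=root@puppet.example.com
. ~/.config/puppet-update

while read oldrev newrev refname; do
  branch=${refname#refs/heads/}
  # only redeploy the modules when the Puppetfile itself changed
  # (ssh -n stops ssh eating the rest of the refs on stdin)
  if git diff --name-only "$oldrev" "$newrev" -- Puppetfile 2>/dev/null | grep -q .; then
    ssh -n "$PUPPET_MASTER" "$branch" --puppetfile
  else
    ssh -n "$PUPPET_MASTER" "$branch"
  fi
done
```

A deleted branch (newrev all zeros) would need a little extra handling here to trigger the environment cleanup.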

Now, make a commit to one of your branches and push it. You should see r10k working away in the git push output. Yay! Pushing any changes to any branches will update those environments. If you add or delete branches, it will deploy new environments or clean up. In a handy bit of time saving, it will only deploy the modules from the Puppetfile if the Puppetfile has actually been changed.

This works rather nicely for me, but I’d be interested to hear how it works for other people, or what changes you made.

Function for bash or zsh to generate SSL requests and certificates

Rather than memorising annoying OpenSSL options, stick this in your profile, edit the ‘SUBJ’ bit, and you’ll be generating keys with ease.
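The original function is missing from this archive; here is a sketch of the idea — the function names and the SUBJ contents are my own, so edit to taste:

```shell
# Edit SUBJ for your organisation; the CN is filled in per-call
SUBJ='/C=GB/ST=Scotland/L=Glasgow/O=Example Ltd'

# sslreq example.com -> example.com.key + example.com.csr
sslreq() {
  local cn=$1
  openssl req -new -newkey rsa:2048 -nodes -sha256 \
    -subj "${SUBJ}/CN=${cn}" \
    -keyout "${cn}.key" -out "${cn}.csr"
}

# sslcert example.com -> self-signed example.com.crt from the CSR above
sslcert() {
  local cn=$1
  openssl x509 -req -days 365 -sha256 \
    -in "${cn}.csr" -signkey "${cn}.key" -out "${cn}.crt"
}
```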

Useful Haskell Learnings

Most useful.

Import Things tasks into Apple’s Reminders

OK, so I was making a list of things to do today, but then I decided that having created them in Things, I wanted to move them to Apple’s Reminders. Don’t ask – I’m a task list fetishist.

A neat trick you can do with Cultured Code’s Things is to select a bunch of tasks and drag them to a text editor, which will create one line per task with any note appended in brackets. Looks sort of like this:
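The original sample is missing here, but the shape is one task per line with the note in brackets — these example tasks are invented for illustration:

```
Buy milk (the oat kind, remember)
Book dentist appointment
Write up the AppleScript hack
```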

This is all very well, but there’s no simple way to get that list into Reminders without copying and pasting the relevant bits individually. That sounded boring, so instead I learned enough AppleScript to do it automatically. It probably took more time, but it was definitely more amusing. Anyway, it was that or complete the bunch of tasks I’d just written down.

Here’s the AppleScript code to accomplish this feat.
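The original script has been lost from this archive, so what follows is a hypothetical reconstruction of the approach rather than the author’s exact code:

```applescript
-- Sketch: read the Things text export and create one Reminder per line,
-- splitting off a bracketed note if there is one
set taskFile to choose file with prompt "Pick the exported task list"
set taskLines to paragraphs of (read taskFile)
set AppleScript's text item delimiters to " ("
tell application "Reminders"
	repeat with taskLine in taskLines
		set lineText to contents of taskLine
		if lineText is not "" then
			set taskParts to text items of lineText
			set taskName to item 1 of taskParts
			set taskNote to ""
			if (count of taskParts) > 1 then
				-- drop the trailing ")" from the bracketed note
				set taskNote to text 1 thru -2 of (item 2 of taskParts)
			end if
			make new reminder with properties {name:taskName, body:taskNote}
		end if
	end repeat
end tell
```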

Yes, I know. AppleScript is weird.

So, now that I’ve written the script to migrate the list of tasks I made this morning from one task manager to another, and then the blog post about it, it’s this afternoon. Yay!

Vag klokke

I localised my awesome script to Norwegian and, uh, to regional time-reading standards. Behold the glory! A little terminal in the top right of my screen now proudly proclaims:

Datoen er 2012-11-13
Klokka er fem på elleve

It was fun to work out how to handle “x på/over halv” without too many horrible range conditions. I haven’t bothered to remove stale things like ‘tjuefem’ from the minute list, because it ain’t broke. Now you, too, can have the power of a clock that isn’t very accurate. Now with added date!
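The “x på/over halv” handling can be sketched like this — the function name and exact wording are my guesses, not the author’s script:

```shell
# Fuzzy Norwegian clock: takes hour and minute, prints the spoken time.
# Past 15 minutes, Norwegian counts towards the *next* hour via "halv".
vag_klokke() {
  local h=$1 m=$2
  local timer=(tolv ett to tre fire fem seks sju åtte ni ti elleve)
  local r=$(( (m + 2) / 5 * 5 ))      # round to the nearest five minutes
  if (( r >= 60 )); then r=0; h=$((h + 1)); fi
  local denne=${timer[h % 12]}        # the hour we are in
  local neste=${timer[(h + 1) % 12]}  # the hour we speak towards
  case $r in
     0) echo "Klokka er $denne" ;;
     5) echo "Klokka er fem over $denne" ;;
    10) echo "Klokka er ti over $denne" ;;
    15) echo "Klokka er kvart over $denne" ;;
    20) echo "Klokka er ti på halv $neste" ;;
    25) echo "Klokka er fem på halv $neste" ;;
    30) echo "Klokka er halv $neste" ;;
    35) echo "Klokka er fem over halv $neste" ;;
    40) echo "Klokka er ti over halv $neste" ;;
    45) echo "Klokka er kvart på $neste" ;;
    50) echo "Klokka er ti på $neste" ;;
    55) echo "Klokka er fem på $neste" ;;
  esac
}
```

The case table keeps the på/over halv forms to four entries rather than a pile of range conditions.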

Hardcoded GNOMEish composition

From the Ubuntu docs about the Compose key:

The compose key sequences used by Gnome are derived from the X compose tables of XFree86 version 4.0 with further modifications to provide a Gnome standard for all locales. They are hard coded into the program in source file gtk+-2.10.7/gtk/gtkimcontextsimple.c

Digging into the current Debian gtk+ source verifies this:

/* This file contains the table of the compose sequences,
 * static const guint16 gtk_compose_seqs_compact[] = {}
 * It is generated from the script.
 */
#include "gtkimcontextsimpleseqs.h"

So they start with the X Input Method layer which has nice unixy text config files (check out /usr/share/X11/locale) and they want to extend it with some extra sequences. What’s the best way to do that? Clearly snarf what’s already there, bolt on your own bits and hardcode the lot into your binary.

Well done, chaps.

Diablo III EU “Error 33” Fix

If you had the Diablo III beta installed, you might come across error 33 (“ is down for maintenance”) while trying to log in. This is because there is an old registry setting pointing at the US rather than the EU login server.

Take a look in HKEY_CURRENT_USER\Software\Blizzard Entertainment\\D3.

I solved this by deleting the entire ‘Blizzard Entertainment’ key. From orbit, just to be sure. Works like a charm.

How to find our flat

As our building is a bit of a maze, we thought it would be useful to tell you how to find the flat. There are two exciting ways to get here, but this is the easiest to follow.

First, go to Øvre Storgate 1B. You can find it with Google Maps. Press buzzer 4A. Wait for the nice people to let you in.

You find yourself in a hallway. There is a welcoming door here. Go through the door and turn right.

There’s a door on the right to the garage. Go through it and turn left.

Cross the garage. There’s a raised area at the back with a door on the left.

Go through the door which leads to the bottom of the back stairwell. The complicated bit is now over!

Climb up 2 flights of stairs.

Oh no! Another flight of stairs. Nearly there though.

Woo, you made it! Go through this door 😀

If you’re feeling adventurous, you can try to find the other way on your own. Hint: it starts with the front stairwell. There are bad instructions on the welcoming door!

Get off of my iCloud

Like several people I know, I have two AppleID accounts for personal use. I have one that I set up back when an iPod was my only Apple gadget and they first opened the iTunes Store. This has all my (large number of) iTunes purchases associated with it. I also have a MobileMe account that I set up a while later when I first bought a Mac, which I now largely use for the email address associated with it. So far this has worked nicely. I log in to the iTunes account for iTunes and to the MobileMe account for email, sync etc.

The age of iCloud is rapidly approaching. We MobileMe users have an easy migration path, and until next summer to migrate. The idea with iCloud is that you need to log in to only one service for everything. This makes sense: you have everything bound to a single identity, you log in to that, and you get all your email, calendar, backups, media, apps and so on. Shiny. Except…

The MobileMe-iCloud migration FAQ makes it clear that accounts can’t be merged. This is Apple’s historical policy – whatever opinion one has about it, it’s not unexpected. Generally the opinion one has about it is that it’s somewhat lame, especially when the merging of iTunes and MobileMe is the entire *point* of iCloud (other than gaining distance from the now-embarrassing MobileMe brand and using the word ‘cloud’ in something).

The trick is that they also don’t make it clear whether you can do a ‘manual’ merge. If I close my MobileMe account, I don’t know whether that will make my email address available to attach to my iCloud account. I’ve just spoken to Apple’s MobileMe support and they don’t know either, though iCloud is still in developer beta so it’s likely they just haven’t been given details at the consumer support level yet. I have previously investigated the possibility of moving my iTunes purchases’ “association” to the MobileMe account. Apparently that can’t be done either.

So now it seems that MobileMe users in this situation might end up having to manually switch between accounts depending on whether they want to send email or watch a movie right now, or sacrifice their existing MobileMe address and pick a new one for iCloud. This is really my main concern: I’ve used that address on so many sites I have lost track, so it’s really not feasible to chase them all down and change them. More to the point, I don’t want to and I shouldn’t have to.

My hope is that there’s a gap in my knowledge and some path will be open to merging or migration. It seems a little unfortunate that when Apple are finally starting up an online service that is free and, hopefully, reliable, they are causing more headaches for the people who paid to support its overpriced, unreliable predecessor.