I’m absolutely terrible at marketing, but it will definitely help me do at least some of it if you can provide me with some or all of the following:
If you’ve got a standard headshot or publicity photo, that’s great. I’ll use that to make the YouTube preview card.
If you don’t, here’s a few tips for taking your own. These are basic guidelines, not rules. You can ignore all of them and still get a great photo, but if you follow them, you should at least get a decent one.
Seriously, I would have killed for something the quality of even my iPhone SE back when I first started doing digital photography.
If you don’t like having your photo taken, it can show and you’ll end up with another photo you don’t like, so try this. For the duration of the shoot, pretend you’re playing the part of someone who actually likes having their photo taken. Weirdly, that will almost certainly make the shoot go faster because you’ll get a good shot nice and early and you can stop pretending.
It’s just easier. You can use a tripod or a selfie stick, but another human being’s your best choice. If you’ve got a friend who’s an experienced photographer, then for heaven’s sake ask them and ignore the rest of this list – be guided by them. Or book a session with a professional if you have the time, money and desire.
Big light sources are the most forgiving and flattering ones. I like a nice big north facing window, or bright overcast days. It’s not exciting or dramatic light, but that’s fine.
You know the typical passport photo shot? The one where you look like you’ve been lined up against a wall, standing square to the camera and looking nervous. Don’t do that.
Here’s a pretty foolproof lighting/posing idea:
Stand facing the biggest window you can find at a time when the sun isn’t shining directly through it.
Turn about 45° to the left or right
Look across at your friend with the camera who’s standing at 90° to the window light.
Your friend will, of course, take an otherwise fabulous photograph of you with your eyes closed, so repeat the last step until they manage a shot you like.
CHECK YOUR BACKGROUND! Make sure there are no dead flowers apparently growing out of your ears, no embarrassing cat licking its arse, and no pile of washing in the shot.
Ideally you’ll have a nice smooth out of focus thing going on there (portrait mode, if your phone has it, can help), but if you can’t manage that, shoot for something that’s not too busy. You can’t go far wrong with a bookshelf, but stand as far away from it as you can so it’s not tack sharp and distracting.
Bear in mind that I’m going to be putting your photo into a square frame in the YouTube title card, so a wide image probably won’t work too well.
Got a website? Bandcamp? YouTube channel? Tell me about them and I’ll link to them in the event description and during the show.
When I make the YouTube event for a song swap, I write something about my guests. I’ll write something anyway, but it really helps if I’ve got something from you to base it on. After all, I know what I think about you, and I like you or I’d not have asked you to be my guest, but unless you tell me, I don’t know what you think is important about yourself. A short bio can really help there.
A song swap isn’t some kind of forensic interview process; it’s supposed to be an informal chat interspersed with songs. If there’s something you really want to talk about, please let me know. The same goes if there’s anything you want me to steer clear of. It’s fine if the conversation gets dark – folk music’s full of murders and misogyny after all – but it’s really not fine if it gets uncomfortable for you.
Zoom isn’t my favourite piece of software and its default settings are definitely slanted towards business meetings rather than helping musicians sound good. The most important thing you can do in your Zoom settings is to turn on ‘Original Sound For Musicians’, which should show up in the top left of your screen. It’s off by default, so click it and turn it on. If it doesn’t show, you’ll need to dig into Zoom’s audio settings and check the box to make it available.
There’s a catch: turning on original sound disables all of Zoom’s audio processing, including the echo cancellation magic, so it’s really best if you can use a pair of headphones or earbuds rather than speakers to hear me. Or you can just remember to turn original sound on when you start singing and off again when we start chatting. Headphones are easier though.
If your laptop or phone’s built in camera and microphone are all you have, don’t worry, we can work with that. They’re not the best, but they are optimised for someone sitting within reach of the keyboard and making human noises from their mouth hole. You’ll look and sound fine.
If you’ve got money to spend and you want to look and sound better on stream then I have a few suggestions, but this is an area where there are no right answers, so do shop around and talk to anyone you know who you reckon looks good on your Zoom calls.
Prioritize your spending on gear. In general (and especially for musicians) sound is more important than lighting, which is more important than camera quality. A pin sharp, beautifully lit video of a muddy sounding performer is much harder to watch than a blurry, crystal clear sounding performer singing in a murky cave.
If you’re an unaccompanied singer, this is pretty easy because we don’t have any problems getting your sound balanced and into Zoom. If you’re an instrumentalist, or a band, things get a little more fun, so let’s break it down based on the number of mics you’ll need.
Bear in mind that there are entire books written on this subject; I’m barely scratching the surface here. If you’re serious about getting good sound for streaming and/or recording, it’s worth doing your own research. If all else fails, for your upcoming song swap, buy some of the gear recommended here from a reputable mail order site and rely on the 30 day, no questions asked returns policy most of them offer (UK distance selling rules guarantee you at least 14 days in any case). But MAKE SURE that when you know what you really want, you buy from the same supplier.
If the sound you make in the room is the sound you’re happy with (unaccompanied singers, acoustic guitarists, acoustic bands) then it’s just a matter of choosing between a USB mic and an XLR mic with an audio interface. Unless you’re in a particularly noisy environment, I’d recommend some kind of condenser mic. The Blue Yeti has been the standard starter USB mic for years, and you could do far worse than starting there yourself. However, I would definitely recommend going down the slightly more expensive XLR mic and audio interface route as it’s significantly more flexible and upgradeable.
You’ll want a cardioid pattern large diaphragm condenser microphone (I love my Aston Spirit, which looks and sounds great, but it’s a multi-pattern mic and nearly £300 new). Brands like Rode, Aston and SE Electronics make great mics, and honestly, pretty much anything that comes up on an Amazon search for ‘large diaphragm condenser microphone XLR’ will still sound better than your laptop or webcam’s microphone. You’ll use a balanced XLR cable to connect that to your interface (a cheap cable’s fine; more expensive ones with Neutrik brand connectors and the like might prove more durable and/or reassuring).
With a single mic setup you only need a really basic audio interface. Something like the Focusrite Scarlett Solo or 2i2 or any cheap class compliant USB audio interface will do the trick so long as it has phantom power available. Just plug your interface in, connect the mic, turn on 48V/phantom power and tweak the gain until, at your loudest, you’re not quite going into the red on the meters, select the interface in Zoom, turn on direct monitoring and you’re good.
If you’re a band, or a guitarist who wants to adjust the balance between your voice and your guitar, you’re going to need something a little more sophisticated. How much more sophisticated is up to you, of course, and I’m a little out of my depth here, but I’ve got a few suggestions anyway. Don’t hesitate to chat to any live sound engineers of your acquaintance – buy the sound engineer at your local open mic a drink and quiz them, for instance.
The thing to remember here is that Zoom is pretty crap when it comes to audio handling – it doesn’t know anything about pro audio gear, it just expects to receive a mono mix on the first channel of the audio device you select (or a stereo mix on channels 1&2 if you turn stereo on in the audio settings), so you’ll need to do the mixing yourself, either with a standalone mixer, the facilities of your audio interface or some other software on your computer.
I’m going to skip over the software option here, but if you want to explore it, investigate software like Loopback on the Mac and Voicemeeter on Windows. Using your audio interface can be more or less easy, depending on what capabilities the interface has, of course. Modern interfaces are generally more capable in this area. I’ve not actually tried it, but something like the Zoom AMS-24 looks like it would be ideal for a guitarist (put it in ‘streaming’ mode and turn off loopback).
Once you get past a couple of input channels, you’re going to have to go down the mixer route. You can get a dumb mixer with at least as many inputs as you have instruments and mics, sort out your stereo mix and feed that into a simple two channel audio interface, which will make Zoom happy. That’s fine if you’re primarily interested in live performance, but if you want to do any recording, you’d be better off with a mixer that can also work as a multi-channel audio interface so you can record your vocals and instruments as separate tracks. Something like the Roland GigCasters or the RødeCaster Pro II, for instance. I think Mackie do something in this space too. I’ve not used any of them myself, but I do know there are plenty of options.
If your guitar’s got a built in mic or pickup, just plug it into your mixer with a balanced cable; otherwise, you’ll need to mic it up too. There are whole books written on mic placement, but it’s generally accepted that a good starting point for miking an acoustic guitar is to point the mic at the place where the neck meets the body rather than at the sound hole, maybe a foot or so away. Again, if you know anyone with any audio engineering experience, then talk to them, not me.
Once you’ve got your mixer, you’ll need mics. Generally you’ll be close miking things to allow you to mix the different sound sources (if you get lots of bleed between mics, then you have fewer options when it comes to mixing). In theory you could use condenser mics for all of this: just put them close to your mouth or instrument, turn the gain down and rely on the inverse square law to give you some separation. In practice, it’s more common to grab a dynamic microphone or two and use them. The canonical mic for the job is probably the Shure SM58, which is built like a tank and looks exactly like you think a stage microphone should look. The SM57 is well regarded as an instrument mic too, and there are plenty of knock-offs of both. I have an SE Electronics V7 that I use for open mics, and it sounds pretty good too.
It really helps to have someone else fiddling with the knobs to get your sound dialled in, because they’re not hearing the sound in your head. With decent headphones on, they’re primarily hearing the sound that’s going through the mixer, so they have a better chance of getting pre-amp gain dialled in, EQing your voice and instruments so they don’t overlap too much, and balancing your levels nicely. If at all possible, get some help here.
EQing is a dark art that I’m only vaguely aware of, and mostly don’t have to worry about anyway as an unaccompanied singer. I mostly just leave things flat and hope. There’s plenty of advice to be found on YouTube or sites like Sound On Sound, so I recommend investigating those.
Again, there are entire books on this, and there’s no end of gear you can buy if you really fall down the rabbit hole. Elly Lucas made a ‘Visual Content Level Up Tutorial’ that’s a great starting point.
Seriously. Just watch that. I was going to write more, but she covers pretty much everything I was going to say.
If at all possible, don’t use the webcam in your laptop. Investigate ways to use your phone as a webcam. Certainly that’s possible if you’re in Apple World – Zoom can treat your iPhone as a webcam, and the camera in your phone is substantially better than almost any webcam you can find. If you’ve got a mirrorless or dSLR camera, more recent ones often have software that lets you use them as a webcam – check your manufacturer’s website. If they can’t be used directly as a webcam, check whether they have what’s referred to as a ‘clean HDMI’ output and look at getting a cheap and cheerful USB HDMI capture card (or spend rather more on something like the Elgato Cam Link. I went with the cheap and cheerful option and it’s fine).
Stick your camera or phone on a good solid tripod; wobbly cameras are really distracting. Get it level, too: you don’t want people wondering why the things on your shelves aren’t sliding off. Plain backgrounds make this less critical.
If you’re shooting for a solo video, you probably want to frame things so you’re slightly off centre in the frame. However, it makes life much easier for me setting up the Song Swap if you frame yourself in the middle of the shot. One of these days I might fix things so that’s not necessary, but for now centred is best.
Plain walls are great. If you can arrange to hang something like a duvet behind you, that acts as quick and dirty audio treatment of your room. I have a duvet and a rug hanging behind me as a background and it definitely helps with the sound. Getting some distance between you and your background helps blur it a little and make it less distracting.
The advice on making your own headshot applies here as well.
The aim of a song swap is to have a good old natter and sing a bunch of songs and to enjoy ourselves while we do it. It’s not an interview, if only because I’m a terrible gobshite with a tendency to go off on tangents. Don’t hesitate to tell me to shut up. I try to rein myself in, but every time I watch a show back I think “Yeah, you could have shut up a bit more there Piers.”
If you’re the sort who likes a set list, I’d suggest planning to cover as much material as you’d get through in a forty minute folk club set with maybe a couple of encore pieces. Nobody’s run out of material yet.
It’s a song swap tradition that the first question I ask a new guest is to ask them to talk about their first encounter with Martin Carthy, whether in person or recorded. Chris Manners memorably described first hearing Martin singing The Bedmaking on the John Peel show playing a guitar ‘hard enough to drive rivets through concrete walls’ and rearranging his entire world. My first encounter was at the Soles and ‘Eels folk club in Northampton. Everyone had hyped the gig up beforehand like it was the Second Coming and I bloody hated it. Martin was great, as usual, I just wasn’t in the right place to realise that. We’re not about gatekeeping here though, so if you don’t know who I’m talking about, let me know and I’ll probably ask you how you came to discover you were a musician.
The show is supported by people chucking money in the virtual hat at https://ko-fi.com/pdcawley. On the following Monday, I tot up the total takings and send you half through the magic of Paypal.
It’s common practice in music production that, when you’re layering your vocals with harmonies or doubles, you pan everything but the lead vocal around the stereo field. That gives a sense of physical separation between the voices and it feels more realistic – or artificial, if that’s what you’re going for.
With loopers though, you’ll more usually have your overdubs all in the same place, which is fine at the local open mic, where the front of house PA is often mono anyway, but for streamers like me, it feels a little limiting, so I’ve been working on a way to get that effect.
This is still a work in progress, but enough people expressed an interest when I showed it off in a “How Do You Loop?” chat with John Paul Music UK here that it seemed worth writing up.
If you don’t want the gory details of how the pandomiser works, here’s how to add it to your Loopy Pro setup and get going quickly:
Install miRACK from the App store. It’s a full-featured modular synthesizer simulator, completely free, and it’s capable of far more and weirder things than we’re making it do here. Check out the examples.
Grab the pandomiser bundle from my website and unzip it with the Files app on your iDevice. Copy pandomiser.mrk to your miRACK folder so the plugin can find it.
Open the mixer, tap the ⊕ button, choose ‘Add MIDI’, then choose the miRACK ‘MIDI FX’ option
Open the miRACK interface and tap on the four squares icon in the top left, choose ‘Open Existing’ and load pandomiser.mrk
I’m a pure vocal looper, so I’m going to assume you have a single audio input for your vocals that’s routed to all Loopy’s colours. If you’ve got separate inputs for your voice and other instruments and not everything is routed to all the colours, you might have to add multiple panning busses, but what follows should give you enough information to get started. This setup also assumes that you don’t want to affect the panning of your live audio, only where it gets placed in your loops and oneshots.
Now your input audio is going to your loops via this new bus, which means we can mess with its balance setting and it won’t affect the live sound going to your speakers. We can do anything here, and it won’t be heard until the clip it’s recorded on gets played out. Use this power responsibly. All we’re going to do here is mess with the ‘balance’ setting.
Let’s go.
Test everything is working by opening the mixer and bringing up the keyboard in the miRACK window then holding down C4. You should see your panning bus’s balance jumping to a new random value, then resetting to the centre each time you press and release C4.
Here’s one simple way of ensuring that all your overdubs are placed at different points in the stereo field using a couple of follow actions.
Now, whenever you overdub a clip, as soon as your overdubbing starts, Loopy will ‘hold down’ C4 and the pandomiser will set the recording balance to a new random value. Once overdubbing stops, Loopy will ‘release’ the key and the balance will return to the centre.
In my Loopy setup, I don’t want to change the balance of every overdub of every clip. There might be a couple of clips that will be overdubbed for each verse of the song, and I want the pan to remain the same for each overdub associated with the verse. So I can set the first clip to send the noteOn message when I start overdubbing and set the second to send a noteOff when I finish.1 You might only want to mess with the panning for certain colours, or even for specific clips, or you might want to set up widgets to record with or without messing with the panning. All you have to do is send C4 to miRACK when you want to randomise the balance, and release C4 when you want it reset to the centre. Have fun.
A modular synthesizer like miRACK thinks in terms of voltages, and those voltages can mean different things. We use a MIDI trigger input module that we’ve configured to send +10V whenever C4 is held, and we connect that to the Gate input of a Sample & Hold (S&H) module, which we’re using as a source of randomness. Whenever the module’s Gate voltage goes high, the module ‘samples’ the voltage at its input and sets its output voltage to the same until the next time it detects a rising edge. If there’s nothing connected to the input though, it samples an internal noise generator and outputs that voltage. I’ve configured the module so that its noise source is a white noise generator with a range of ±5V, and we can think of that as ranging from a hard left pan at -5V through the centre at 0V and on to a hard right pan at +5V.
The output of the S&H module is now jumping to a new random value every time we press C4, but we really want to output 0V when C4 isn’t pressed, so we feed its output into the input of another staple of a modular setup, a Voltage Controlled Amplifier (VCA). We’ve set this VCA up at unity gain, which means that when it sees 10V at its Control Voltage (CV) input, it sends 100% of its input voltage to its output, and when it sees 0V CV, it sends 0% of its input. So, if we connect the C4 trigger to the VCA’s CV input, we’ve turned it into a gate – whenever C4 is held, the VCA sends the random voltage from the S&H, otherwise it sends 0V.
Now we just need to convert that into something MIDI understands, an unsigned value between 0 and 127. The MIDI CC Output module can do some of this for us, but it’s expecting a voltage in the range 0–10V, and right now we’ve got something in the range of ±5V. So we feed the signal from the VCA into the A input of a CONST ADD MULT module, set the constant to 5V, and feed the associated A+B output to one of our CC outputs that we’ve configured to send CC8 to our host app. Job jobbed!
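If it helps to see that as code rather than patch cables, the whole chain boils down to arithmetic along these lines (this is my reading of it, assuming the CC output module maps 0–10V linearly onto 0–127; the function name is made up for illustration):

#+BEGIN_SRC emacs-lisp
;; Illustrative only: the same sum the patch performs with voltages.
;; Shift the ±5V pan voltage up by 5V, then scale 0-10V onto MIDI CC 0-127.
(defun pandomiser-volts-to-cc (volts)
  "Map a pan voltage VOLTS in [-5.0, 5.0] to a MIDI CC value in [0, 127]."
  (round (* 127 (/ (+ volts 5.0) 10.0))))

;; (pandomiser-volts-to-cc -5.0) => 0          ; hard left
;; (pandomiser-volts-to-cc  0.0) => 63 or 64   ; centre, give or take rounding
;; (pandomiser-volts-to-cc  5.0) => 127        ; hard right
#+END_SRC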
I think the next version of this will allow me to sequence the balance for the first few steps, so it might go: hard left, hard right, mid left, mid right, then random pans until the system is reset. I have ideas about how to implement this too, but if you beat me to it, please let me know!
I’ll also be making a YouTube video walking you through it (think of this as a draft script for that video).
I don’t actually do that, because I have a slightly more complex setup that supports multiple song structures with the same set of clips. I’ll explain all that later. ↩︎
Back in 2012, we were living in Cornwall and used to go to the regular Farmers’ Markets in Mullion and Helston. One week, at Mullion, a new trader showed up selling home made bean to bar chocolate under the name of Chocolarder. I got chatting with Mike, the guy who made the stuff, and bought a few bars and some of his sea salt caramel truffles. If we weren’t actually his first customer, we were damned close.
Every month, he showed up with plain looking bars of amazing chocolate. One time he’d bought a bunch of rose petals (apparently, they can be had quite reasonably after Valentine’s Day because there’s something of a glut), desiccated them, ground them fine and added them to the chocolate. Bloody delicious!
I told Mike about some milk chocolate with sea salt that I’d tried and really loved and suggested he do something similar. He’d have a go, he said. Months later (after many experiments, apparently) there he was with some bars of salted milk chocolate, so I bought as many as I had the cash for and loved every mouthful. He didn’t make them again though. Except, today, with a bit of birthday money burning a hole in my pocket, I thought “I’d love some Chocolarder chocolate, it’s been an age” and what did they have? You guessed it: Cornish sea salt milk chocolate. So that’s a chunk of birthday money spent.
Yes, it’s a lot of money for a bar of chocolate, but believe me, it’s amazing stuff and Mike is as committed to ethical and sustainable manufacture as anyone I’ve ever met. We’ve visited the factory a couple of times, and on one visit he was more excited about showing off his new, plastic free packaging than he was about the chocolate. He buys direct from cacao farmers and has been known to get his beans shipped by sail rather than container ship.
Scrooby show was lovely on Saturday. Nice to catch up with a few folk I’ve not seen in quite a while and the weather was flat out gorgeous. I took some modelling balloons, planning to do a few balloon animals and hats, but had something like a 60% burst rate with the Sempertex 260Ss that were all I could get hold of at short notice. Won’t be using those again. Qualatex all the way, I think.
It’s great to get out of the house sometimes. Gill coped really well by herself too – I’m a full time carer, but it’s definitely good to know that I can have the odd day off without it completely buggering things up.
Whitby Folk Festival FOMO is real. But also, there’s a COVID spike going on, and crowded rooms full of unmasked singers aren’t the safest of environments, so I think I’ll comfort myself in the knowledge that at least I won’t be likely to bring an infection home with me.
I swear I’m going to wrap my head around how to optionally build a custom formatted date tree using org-capture, but for now I’ve just tweaked the template I use to add #+hugo: more to the heading for the week. This means that, when Hugo’s rendering the index page, the week’s notes will be represented by a summary which links to the extended per-day notes for the week.
Furthermore, I’ve added a new capture to let me capture a week’s summary. I think I’ll probably end up wrapping that in a summarize-week command that will show me the wider weeknote context while I write the note, then mark the week as DONE, then bring up magit so I can commit and push the changes. But maybe not for a while yet, on the “fake it until you can’t stand not to automate it” principle.
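For the curious, the tweak boils down to something like this (a sketch from memory with an assumed file name, not my exact template):

#+BEGIN_SRC emacs-lisp
;; Hypothetical week-note capture template: a week-grouped datetree entry
;; whose body starts with "#+hugo: more", so Hugo truncates the index page
;; summary right after the heading.
(setq org-capture-templates
      '(("w" "Week note" entry
         (file+olp+datetree "~/org/weeknotes.org") ; assumed path
         "* Week %<%V>, %<%Y>\n#+hugo: more\n\n%?"
         :tree-type week)))
#+END_SRC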
So, I have the ADHD thing of putting a thing down and completely forgetting where I put it, or even its very existence. Object permanence is clearly not a thing with me.
Or so I thought.
We have a house guest right now, and she has this habit of trying to help by tidying: putting stuff in ‘sensible’ places. So I’ll find the squash in amongst the bottles of oil, vinegar and sauces in a completely different part of the kitchen; or Gill’s socks will show up in an admittedly convenient, but surprising, new place after I’ve given up looking for them in the place I usually put them, and the airer I dry them on, and in the washing machine and laundry basket. Maybe they’re on the floor between any of those places… Grrr.
It turns out I’m kind of comfortable with not quite knowing where a thing is, but I am absolutely viscerally infuriated by finding said thing in the wrong fucking place, somewhere I would never ever ever in a month of Sundays deliberately put it.
And don’t… don’t get me started on the utter utter utter wrongness of using the lids of things as shelves. I will end up foaming at the mouth and shouting. Ask me how I know.
It’s all the more distressing because I hate getting angry about stuff, especially objectively trivial stuff like this, and so the rage spirals. Bah!
Had a great day at Mill Con 2 down in Peterborough. I still think filk music as a genre is a bit weird, but there’s no denying that the people who make it are lovely people, and it’s hard to beat singing in company for lifting your mood.
Mike Whitaker was kind enough to give me a concert spot at a week’s notice too, so I did a forty minute set with a couple of songs with Loopy Pro in what was only the second time I’ve used the gear in any setting but my ‘streaming studio.’ It went well, but there’s still a way larger profusion of wires than I’m happy with, and I definitely want to assemble some kind of all in one pedal setup if I’m going to be taking the gear out of the house more often.
Not a bad week, this week. My step daughter and her family called in on their way back from holiday on Friday night and we spent a pleasant evening with them and a few Cawleys who were knocking about, sat outside the Wool Market. Mostly good food, but apparently the Greek place isn’t that good. Rustic Pizza is still good though.
Most of the proceeds of my Magic: The Gathering cards will be spent on repairing my grandfather’s recliner. But… I wouldn’t be me if I didn’t spend some of it on something gamelike. So I bought myself a ‘GameDad’. In my case, an Anbernic RG 353VS and it’s a hell of a thing. Not much bigger than an old school Game Boy (and cheaper! Not just in real terms, but the Game Boy launched at $89.99 and I got mine for $87.99), but with a large, bright colour screen and enough grunt to play SNES and PlayStation games at full tilt. Apparently, you can even make it play Nintendo 64 stuff, but not necessarily at full speed.
I don’t really care about emulating consoles I never owned though. I want to play Manic Miner, Tempest, Dig Dug, Galaxians and the other games that gladly ate my pocket money, ten pence at a time, down the local arcade (the building’s still there in all its new brutalist concrete glory, but the arcade where I boggled at I, Robot and thrilled to the exploits of the masters of Defender and Robotron is long gone).
So, in search of that heady thrill and those unmistakable sound effects, I’ve been frequenting archive.org’s library of delights and installing a few of my old favourites.
The first to get seriously played was my old favourite Tempest – Atari’s miracle of colour vector graphics where you controlled a spiky yellow thing running around the top of a blue tube shooting the terrifying geometric shapes that were climbing up towards you with deadly intent. When I first started playing it, I’d hold the fire button down and spin madly round the top of the tube and die all too quickly. But it was such fun I’d just shove another coin in the slot until my money was all gone. Then, in an arcade in Whitby, I watched someone playing the game in an entirely new way and my mind was blown. The walls weren’t blue! The colours were different and there were new, scary shapes. This guy wasn’t spinning around, and he wasn’t just holding down the fire button either.
Tempest was unusual for the time in that it had autofire. If you held down the fire button on most games of the era, you’d fire one shot, then nothing would happen. But in Tempest, you’d autofire bursts of eight missiles, then a slight pause and the cycle would repeat. And it was the slight pause that would kill you. The Whitby guy had sussed that out and was mashing the fire button at a measured speed that kept up a constant stream of evenly spaced bullets that were far more likely to save you when a Flipper had reached the top of the tube and was making its way towards you; they could only kill you if they were in the same space as you, and they were vulnerable to your shots while they were flipping that last step towards you. If you were simply relying on autofire, you could bet that that flip would happen during the short pause between bursts.
Whitby guy had also worked out that the larger the angle a flipper had to flip through, the more chance you had of killing it before it killed you, so for lots of levels, it was just a matter of finding the safest place and staying there. There are a couple of levels where you were only ‘safe’ from flippers coming from one side. Those were the levels that killed you unless you got good at moving from place of safety to place of safety.
I watched intently and, when I returned to my home arcade, suddenly the top three scores – the ones that got burned into non volatile memory – on the arcade’s machine belonged to PDC. I could reliably reach the red levels and even the next, yellow, set.
I can’t do that on the Game Dad. Not yet at least. I’m old enough and RSI’d enough that the thought of bashing the fire button at 8 Hertz just gives me the shivers.
But! Modern emulators have all sorts of convenience functions, surely I could configure something that would emulate the steady rate of fire that my youthful fingers were capable of. And maybe I could do something about the incredibly sensitive controls, where even the lightest touch of the analog stick would see me moving two or more segments when what I really wanted was a surgical one step move.1
It turns out that I could. But, frustratingly, not via the very slick UI. I had to edit text files! I had to make new text files. And because popping the Micro SD card out of the Game Dad and into a card reader, editing a file, putting it back in the GD, testing it and then having to fiddle with the text file again is… less than ideal, I did it by logging into my handheld games console via SSH from my iPad, editing the file and just restarting the game!
You can’t do that with an original Game Boy can you? The damned thing’s running Linux. I’m at once annoyed that I had to log in to it and fiddle with text files and astonished that I could even do that.
It’s not the nostalgia that’s making me feel old, it’s my assumptions about what’s capable of what kind of computing.
Real Tempest machines have a gorgeously weighted ‘spinner’ – a free spinning rotary encoder with enough resolution that there are still multiple ‘steps’ per tube segment. It meant that your blaster had a lovely fluid movement with very direct control. Basically, Tempest is pretty damned close to perfect on its original hardware and if you ever get the chance to play on a well maintained machine, you should grab it! ↩︎
Also, Good Omens 2 is a delight. Still enough of Terry’s character hanging around it, and the new writers help it not feel too Neil-y.
After a bit of fiddling, I’ve worked out how to add helpers to the Emacs `C-x 8` keymap, so now I have shortcuts for typing ‘λ’, ‘🙂’ and various other characters that I type more or less frequently. Beats the crap out of doing `C-x 8 <ret>` and then typing out the name of the character I’m looking for.
In case you’re interested, here’s the code:
(general-define-key
:keymaps 'iso-transl-ctl-x-8-map
". ," "…"
": )" "🙂"
": D" "😀"
"; )" "😉"
"\\" "λ"
"a ^" "↑"
"a u" "↑"
"a v" "↓"
"a d" "↓"
"a |" "↕")
If you’re not using `general`, but you’ve got `use-package` installed, you can do something similar with `bind-keys`:
(bind-keys
:map 'iso-transl-ctl-x-8-map
(". ," . "…")
(": )" . "🙂")
(": D" . "😀")
(":|" . "😐")
("; )" . "😉")
("\\" . "λ")
("a ^" . "↑")
("a u" . "↑")
("a v" . "↓")
("a d" . "↓")
("a |" . "↕"))
You can no doubt use define-key as well, but I find `general` or `bind-keys` to be much nicer to work with. The latter has the advantage that it’s included in Emacs as part of `use-package` and plays nice with `which-key`, so I might go and redo my key bindings and get rid of `general`, nice as it is, since the real selling point of that library is how easy it is to bind stuff in `evil-mode` states.
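For reference, the bare define-key version of a couple of those bindings looks something like this (no extra packages needed):

#+BEGIN_SRC emacs-lisp
;; iso-transl is where the C-x 8 keymap lives; requiring it makes sure
;; iso-transl-ctl-x-8-map exists before we add to it.
(require 'iso-transl)

(define-key iso-transl-ctl-x-8-map (kbd ". ,") "…")
(define-key iso-transl-ctl-x-8-map (kbd "\\") "λ")
#+END_SRC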
I still miss Twitch Sings. It’s how I started streaming—long before the Friday night Song Swaps and folk streams. I’d be happily belting out Lady Gaga’s Bad Romance, hamming it up to You Spin Me Round or giving it my best Johnny Cash on Hurt. It was just huge fun and a great way to make friends on Twitch.
Twitch ended up pulling the plug because it was a free app and… well, free apps and sync rights really don’t play well together.
You’ll still find people doing Karaoke on Twitch though, many of them the same faces I met back in Twitch Sings days. This morning, I woke up early and spotted some friends Karaoke-ing it up on a Discord, so I pulled on pyjamas and went and joined ’em for a few songs. These days, I just use Loopy Pro rather than searching YouTube for backing tracks. It’s great fun though, and definitely makes for a more enjoyable way to spend the occasional hour or so of early morning insomnia.
Singing in company, even virtual company, is still the best thing you can do in public with your clothes on. I encourage you all to sing more. What’s the worst that could happen?
Made a capture template for adding a week note. Support functions are currently not the prettiest, and don’t deal with a bunch of corner cases, but they seem to work for my case, so I’ll leave ’em be for the time being. I plan to write it up in a longer post, and that will no doubt tweak my coder pride enough to make things suck a little less.
Oh god, once I start fiddling with my Emacs configuration, it’s impossible to stop!
Nipped over to Mum and Dad’s for lunch at Zini’s, and to borrow dad’s drills for my ongoing cigar box MIDI controller project. Managed to get eight holes placed accurately enough that I only had to drill 7.8mm holes for the M7 threaded rotary encoders I’d soldered to my stripboard. I’m calling that a win! Next trick: get the microcontroller wired up and appropriate software written.
Also discussed making the PID controller I promised dad for his heat treatment setup a while back. A Pi Pico and one of its mini displays looks like it should do the job nicely. The plan is to make an extension cable with an SSR as a separate bit of kit, then control that from the prototype controller. Once they’re working as separate parts, we can work out how to bring it all into one container. I shall wuss out of making the kind of thing I saw in a commercial radio controlled plug, which powered the control circuit with a very simple capacitor based power supply, with the slightly worrying wrinkle that the controller’s 0V line was floating at around 5V below mains Vmax. Clever, sure, but scarier than I’m prepared to work with.
Holy crap, but old Magic the Gathering cards are getting horrifically pricy. According to the buy list of the shop I just took my cards in to, I should be expecting about £400 for just four of my cards. And probably another couple of hundred for the two dual lands (assuming they’re not from the Unlimited set, in which case they’re worth a lot more). All being well that’s covered the cost of getting my grandfather’s old recliner reupholstered and fixed.
If I could be arsed with it, I could probably get a lot more by selling direct on eBay, but I was already losing the will to live just sorting things out to take in to the shop.
Do not ask me about the Tabernacle at Pendrell Vale and Black Lotus that I sold far too early, because that might make me grumpy.
Overwhelm had left us with a huge pile of washing up to do, filling the sink, the draining board and various work surfaces, and I kept putting off tackling it because my brainweasels just saw the sheer amount of work involved and shut down. No fun. Anyway, Gill grabbed her perching stool and set to, and before I knew it there was a full dishwasher burbling away, an empty sink, a full draining board and the beginnings of a system to keep it that way. The goal is to keep the sink empty and the draining board full. Before I cook, I clear the draining board. Any pans I use go in the sink and the next time I make tea, I wash up what’s in the sink and anything that’s still not been done of the mahoosive pile, until there’s a full draining board again. Next time I’m brewing up, I can put the dry stuff away and, if I have the spoons, chip away at some more of the pile (though, post-COVID, I rarely have the spoons for much – I can only stand for so long).
It’s not perfect, but it gets stuff done, and I’m calling that a win.
Emily came over for rehearsal and Carcassonne. It was mostly Carcassonne, if I’m honest, but The Housewife’s Lament is starting to seem like we’ve learned it, as is We’ll Sit Upon The Gate, though that one feels like it could use another verse. The Mary Ellen Carter is pretty damned solid too, which is good.
We set a new record city score in Carcassonne too, managing to share a 105 point city. We even managed my first ever draw in the second game.
A solo stream this week. Again with the Overwhelm getting in the way of getting more guests booked, but I’m starting to fill the diary again, which is good. I’ve got Alex Cumming as my guest in a couple of weeks, and Helen Edwards, Talis Kimberley, Emily, and a folk legend who will remain nameless for the time being lined up for August and September.
Loopy Pro was mostly rock solid. The one-shot overdub whine cropped up once, and there was a hard crash to the home screen at one point, but a restart was fast and clean. Perils of running beta versions of software, I guess.
I toddled over for the morning and afternoon sessions at the Bradfield Traditional Music Weekend – spent a happy few hours singing Americana (I sang Cabin in Glory and We’re Gonna Camp a Little While in the Wilderness which seemed to go down well) in the first session, then there was a lovely, ballad heavy, song session. I sang Tamlyn about as well as I’ve ever sung it, and it went down really well. I was knocked out by a cracking version of The Famous Flower of Serving Men in particular, but the whole session was great.
Dim sum at the China Palace for lunch with Dougal & Liz, Matt and Jo and a few of the kids. Excellent as always. We may have overordered.
I’m off to the closing session of BTMW later. I suspect it will be as good as the Saturday sessions.
Back when lockdown started, for all we were both classed as ‘critically vulnerable’, we did pretty well. We had each other for company, the house is paid for, we have some private outdoor space, Doncaster Market remained open and I’d got a 25kg bag of flour. Life was… tolerable.
Except, as any musician will tell you, making music needs company. At least it does if you catch the magic that happens when a group of people are together making something beautiful. Which is why I started singing folk songs to the internet1 every Friday night.2 At first, I just sang into the dark to the two or three people who showed up in the text chat but I was still missing harmonies. So I worked out how to use Logic for live sound and did on the fly multitracking of chorus harmonies and that was great. Recently, I’ve switched to using Loopy Pro on my iPad3 and it’s been great – I can sing harmonies on far more songs now and I’ve started experimenting with fancier arrangements too.4
A key realisation came after a late night conversation with Squeezy John [Spiers] on Twitter about how a YouTube archive of performances can become a millstone around an artist’s neck. After all, if someone can catch a performance online from a couple of years ago, why would they pay to watch you doing it again? I don’t think most people think this way, but the life of a professional musician is marginal enough that it doesn’t take many to have an effect. So I started making the catch-up videos private after a week.
It was incredibly liberating. I stopped worrying too much about repeating myself every week – I always open with the same song, I usually close with one of about three songs, there’s a couple of staples that people seem to miss if I don’t sing them and if I fuck something up, it’s gone in a week and I can concentrate on doing better next time. My audience seem to be fine with it. If I do something particularly well, or something funny happens like my cat bringing the backdrop down on me, then I have the original video files, so I can edit and upload a high quality clip that I’m happy to leave available forever.
And it’s closer to the experience of music in person, too. Live music is a magical transient thing that lasts as long as the song and then it’s gone. Recordings are souvenirs: they can be delightful, but they simply can’t be the same as being in the moment.
It’s gone. I might sing it again next week, but just as you can’t cross the same river twice, that performance will be different. And that’s brilliant.
I’ll tell you what is permanent: the repeating Friday night entry in my diary that reads ‘Folk Stream.’ It doesn’t matter whether I’m streaming for 2 people and no tips or a busy room and £100+ in Ko-fi’s hat, singing to the internet never fails to lift my mood. It’s like I’ve made an appointment with joy.
Catch me every Friday night at https://youtube.com/pierscawley/live from 8pm UK Time. ↩︎
I tried a few of the Zoom singarounds that sprang up, but for reasons I can’t quite put my finger on, I found them far more stressful than just singing to the camera and interacting with an audience in text chat. The audio only Clubhouse ballad sessions that I started were way less stressy for me too. ↩︎
And a fancy dual-USB audio interface that means I can capture Loopy’s audio really cleanly. Check out the iConnectivity AUDIO4c (affiliate link) if that’s a thing you might need. There are other dual-USB audio interfaces that are probably at least as good, but this is the one I’ve got experience of using. ↩︎
Check out for an example. ↩︎
I think I’ve got the crossposting to Mastodon looking less awful, and I know I’ve got my front page looking better. Isn’t that lovely?
In the happier timeline, Elton John bought Twitter and it became even more fabulous with every passing day. In the far more depressing timeline we find ourselves living in, Elon seems to be determined to tank the company and fuck the community.
So I’ve buggered off to Mastodon. At the time of writing, you’ll find me at @pdcawley@mendeddrum.org and, you might even be reading a version of this via Mastodon rather than directly on the site.
Once I’d added the Mac Mini to the rack, there was a space in the bracket it was mounted on that was designed to hold one or two Raspberry Pis. I had a Pi sitting about, so of course I added it to the rig thinking “I’ll work out what to do with that later.”
It’s proved invaluable…
Sometimes folk ask why there’s a Raspberry Pi in the rack case that holds my streaming rig and I admit that the primary reason is that I had one kicking about the place, and the rack unit that holds my M1 Mac Mini is designed to hold a couple of Pis in the space that isn’t holding the Mac, so I might as well attach it. I had the feeling it would come in handy.
It turns out, I was right. It runs OBS to handle motion graphics, and Companion to control the ATEM Mini from a streamdeck. In the future I plan to use it as a router so the gear in the rack can live on its own private network with the Pi handling the connection to the outside world either via a network cable or (hopefully not) wifi or a tethered phone.
The latest “Oh, of course! It can do that too!” job it’s taken on is pretending to be an Apple TV, so I can mirror my iPad’s screen to it without mucking up the Mac’s display.
Once again, I’m reminded that computers can be anything you can program them to be… sometimes all at once. The Pi 4 is a mindbogglingly capable bit of kit. I’ve bought USB hubs that cost more and do less than my Pi.
There’s a Pi 3 plugged in somewhere that runs Pi Hole and helps eliminate intrusive ads from my web experience. It has sufficient spare power that, once I get around to it, I plan to hook up a cheap workhorse laser printer to it and configure it to work as a print server, which should let me retire the crappy unreliable bubblejet printer that only prints when you stand over it with a big stick and is always demanding the blood of innocents before it’ll deign to even try to print something.
About the only thing wrong with them (at the moment) is that they’re almost permanently out of stock everywhere. I watch rpilocator and it’s clear that people do get stock, but they sell out almost immediately. A chum who works for Raspberry Pi says that they are shipping lots and lots of boards, but the demand stays sky high. Which is a nice problem to have for them, I suppose.
I added a Raspberry Pi to my streaming rig on spec, and of course now it's indispensable
Then Twitch pulled the plug on Sings and I started doing my regular Friday night folk streams. At first this involved running OBS and Logic Pro on my aging MacBook Pro and it was sort of okay, until I started having guests and doing Song Swaps – the MacBook simply wasn’t up to the job of running Logic, OBS and Zoom, so for a while I had some unholy lashup, with Zoom and OBS running on the PC, my Mac running Logic, with patch leads between the Mac and PC audio interfaces so I could hear my guests and vice versa. It wasn’t pretty.
What saved it was adding an ATEM Mini Pro ISO from Blackmagic. Now I could hand off all the capturing, streaming and recording duties to that box and run Logic and Zoom on the Mac, and everything just worked (with an utter rat’s nest of cables on my desk). The ATEM takes HDMI input from up to four sources and lets me switch what gets sent to the stream between them. There’s also a monitor preview output that can be switched independently between those four sources, as well as multiview and stream previews. Of course, I quickly used up all four sources, and I don’t even have a second camera! The way things are arranged by default is that I have a camera feed, my Mac’s primary and secondary displays, and the output from a Raspberry Pi. Mac screen 2 is where the Zoom window lives, and is connected to the ATEM via an HDMI splitter with the second output feeding a 7-inch field monitor that sits on a teleprompter setup so I can look guests in the eye.
The primary Mac screen is where I set up streams and drive Logic from. It feeds into the ATEM mostly so I can see it in the preview window or full size when I need to without having to have another screen, but if I ever get a second camera, I’ll break Mac screen 1 out to its own monitor and dedicate input 2 to that secondary camera. Input 4 is a Raspberry Pi that I use for motion graphics. This used to be a copy of Chromium running in kiosk mode to display the Ko-fi stream widget which displays donations as they happen, but I’ve recently managed to compile OBS Studio with a working browser plugin and I’m using that instead now, which should allow me to add more overlays later. The only catch with that is that I’ve not yet managed to get Companion to compile, so I think I’ll have to move that onto the Mac.
This all worked fine until recently. You see, the thing I love about unaccompanied singing is singing harmonies. And harmonies don’t work unless the timing is properly tight. That means I can’t sing harmonies with Zoom guests because the speed of light screws things up completely. I got around this by harmonising with myself – either by using a Looper plugin in Logic1 or just by using Logic’s multi-tracking2 to record multiple layers of a song’s chorus. It all works, and works well, but I could never work out how to record harmonies for shanty type refrains.
Consider:
Oh the rain it rains all day long
Bold Reilly oh, bold Reilly!
And them Northern winds they blow so strong
Bold Reilly oh’s gone away
Chorus:
Goodbye my sweetheart, goodbye my dear-o
Bold Reilly oh, bold Reilly
Goodbye my darling, goodbye my dear-oh
Bold Reilly oh’s gone away
Ideally, I want to be layering up harmonies on the Bold Reilly oh, bold Reilly! and Bold Reilly oh’s gone away lines within each verse, as well as on the chorus (and potentially on any chorus repeats too). It should be possible to set up Logic’s live looping feature to enable this sort of thing, but I could never work out how, and I wasn’t ready to switch to Ableton Live (where I couldn’t be sure I knew how to make it work either). So I stuck to just harmonising on the chorus and wishing there was a better way.
I’ll write a full review of Loopy Pro one of these days, but suffice it to say, there is now a better way. Loopy Pro is the long awaited successor to Loopy HD, a software looper for iOS. It’s been a very long time coming, but by god, does it deliver! As well as being a looper, it’s a great replacement for MainStage and (at least for live use) Logic Pro, and it runs on even pretty ancient iPhones and iPads. Its predecessor was remarkably capable, but was ‘just’ a looper with a configurable number of loops available. Loopy Pro is fully customisable, supporting any number of loops, one shot samples, beat slicing, AUv3 hosting, mix busses, control widgets, faders and dials. There’s a deep system of actions and ‘follow actions’ that allow you to customize its behaviour as well as its appearance, and the audio routing capabilities could embarrass some far more expensive DAW software. It’s astonishingly capable.
I’ve been fiddling with it since it was released, and now have it set up to allow me to sing shanty style songs with harmonies on the refrain as well as stuff like this:
so now I’m running Friday night stream audio from the iPad, leaving the Mac to run Zoom. This has all been made much easier since I added a new audio interface: the iConnectivity AUDIO4c is a remarkable bit of kit. Uniquely, as far as I can tell, it can be used as an audio interface simultaneously by my Mac and my iPad, and can route audio from the iPad to the Mac and vice versa. It’s got four inputs and six outputs, which is two more than my existing interface. That means I can have my guest and me fed to the ATEM Mini on separate channels, as well as giving myself a different headphone mix. And it’s only one rack unit high!
The remaining bit of the puzzle is to reliably capture the iPad display. I’m looking for, and so far failing to find, a powered USB-C hub that has a bulletproof iPad HDMI connection and gigabit ethernet, which is much more reliable when it comes to remote control of the ATEM. It’s a frustrating search. Everything I’ve found so far has intermittent HDMI dropouts, which would be annoying enough if it weren’t for another ‘feature’ of iOS and iPadOS.
It works like this: on iDevices, you don’t have the option to choose which audio interface you want to use; instead the OS autoselects whichever interface was plugged in most recently. Which would be fine (sort of) if it weren’t for the fact that an HDMI connection to an audio capable device is treated as a new audio interface. So… if the HDMI connection were reliable, getting prepped for a stream would just involve unplugging the audio interface from the hub, connecting the HDMI and reconnecting to the AUDIO4c. But then the HDMI drops and comes back, and suddenly it is the most recently connected interface and the stream’s audio is buggered.
If you know of an iPad friendly USB-C hub that has rock solid HDMI, then I’d love to hear about it because my search is getting really frustrating. Right now I’m working around it by disconnecting the audio, screen sharing to the Mac and reconnecting the audio, but even with a dual screen Mac setup, screen sharing takes over both screens and the whole thing has a bunch of latency that you don’t really want – it’s surprising how little latency will start making audio software in particular seem seriously out of sync. I can correct for that when editing, but not so much on a live stream.
But… once I have reliable HDMI from the iPad I run into another problem. Where do I plug it in? Suddenly four inputs on the ATEM aren’t really enough.
Time to start saving up for an ATEM Mini Extreme ISO, which is the extra-wide version of the ATEM Mini Pro ISO, complete with 8 HDMI inputs, 2 HDMI outputs, 2 USB-C connections, a 3.5mm audio jack output (so I can monitor the audio going to the stream rather than just looking at the VU meters on the multiview display) and something called a ‘Super Source’, which would definitely simplify the business of setting up the split screen view when I have a Zoom guest.
The extreme version appears to fix pretty much everything that I find slightly annoying about the Mini, which is brilliant, but could be brillianter. Ah well, a boy can dream.
If you’re interested in seeing what all this technology ends up looking like on stream, then I stream at 8pm UK time every Friday night on my YouTube channel. Maybe pop by, and if you like what you see and hear, don’t forget to like and subscribe.
I used the delightfully named Augustus Loop from Expert Sleepers combined with a Lua script I wrote to make it behave more or less the way I wanted it to. Kind of fiddly to set up, but repaid the effort. There’s still a few things that AL can do that I can’t do with Loopy Pro, but as I write those are due in the next big Loopy patch. ↩︎
Logic isn’t really set up to do live sound, but MainStage, which is, can’t do multi track recording and playback and I couldn’t work out how to configure its looper to emulate that. So I just used Logic. ↩︎
Will I ever learn to leave well enough alone? I've written up the state of my streaming setup here.
Also, I hope, I’ve added some gadgetry that means that brid.gy will automatically tweet a link to this page once I’ve posted it.
I've been doing a bit of gardening on my blog and hopefully setting up auto tweeting too.
The beauty of using a static site generator to build your website is supposed to be that it’s all delightfully simple. Simple markdown formatted files go in at one end and a slim, fast and easy to serve website comes out the other end. All that remains is to upload those files to the appropriate directory on your server and all is well.
But never underestimate the ability of a long time Emacs user to complicate things. For instance, markdown is all well and good, but I’ve been doing most of my writing in Org Mode1 so I really want to stay in Org mode to write these blog posts. Hugo understands .org files, so I could just lean on that, but the way Hugo treats org files seems slightly out of whack with what I think of as the Org way and I’d end up having to stick with the subset of org syntax that Hugo knows. So I use ox-hugo. There’s a bit of configuration needed to make it work the way I like, but I prefer to change software to accommodate me rather than change me to accommodate software.
I’ve had all that set up for a while. As I say, a tad fiddly at first, but once it’s in place, it just works.
Except…
ox-hugo works by generating .md files from an org source, which are then used to generate the site. I had things set up to autogenerate the HTML whenever I committed to the main branch of the blog repo, and the git server hook based system I was using only worked if those exported files were in the repo.
That’s the sort of thing that makes me itch, because there were two files for any given article:
all-posts.org
article.md
The generated file is an artefact of the build process and simply repeats the info in the org file, which should be our single source of truth. It’s not a file that should be left around to be edited willy nilly because it could get out of sync with its source file. It’s certainly not the sort of file that should live in the repository.
I didn’t worry about this for ages, but it niggled at me. Then one day I read an article about using GitHub Actions to build an ox-hugo based site by installing Emacs and ox-hugo on the VM that does the build step and generating the markdown files during the build by running Emacs in batch mode. The markdown files never exist anywhere that anyone can edit them. So, of course I had to do that. Again, fiddly to set up, and arguably only of philosophical benefit, but worth it, I think.2
I could’ve left it there, but the thing I miss about the old, slow, hard-to-maintain version of this site is the sense of connection. The old site had comments, and pingback links to other blogs. There was a sense of connectedness that’s missing from a collection of articles. I want some of that back.
There is a way. In the time I’ve been mostly not blogging, some of the folks who kept at it have been cooking up a collection of tools, technologies and standards under the IndieWeb banner. There’s a whole suite of technologies involved, but the piece of the puzzle that I’m interested in right now is the WebMention, described as
… an @ mention that works across websites; so that you don’t feel immovable from Twitter or Fb
Roney Ngala (@rngala) on Twitter
Now we’re talking! It’s a really simple standard too. When you mention, like, comment on, repost, reply to, bookmark or simply publicly interact with an “h-entry”3 on the IndieWeb, you can send a webmention by POSTing a couple of form-encoded parameters – the source and target URLs – to the webmention endpoint of the entry you mentioned. Assuming all the content is marked up correctly, sending a webmention is delightfully easy. You can do it with curl, if that’s your thing, but I’m in an Emacs buffer, so let’s use restclient.
We mention https://indieweb.org in this post, so let’s find out its webmention endpoint.
HEAD https://indieweb.org
#+BEGIN_SRC html
<!-- HEAD https://indieweb.org -->
<!-- HTTP/1.1 200 OK -->
<!-- Server: nginx/1.24.0 -->
<!-- Date: Mon, 04 Mar 2024 13:42:00 GMT -->
<!-- Content-Type: text/html; charset=UTF-8 -->
<!-- Connection: keep-alive -->
<!-- Link: <https://webmention.io/indiewebcamp/webmention>; rel="webmention" -->
<!-- Cache-Control: no-cache -->
<!-- X-No-Cache: 1 -->
<!-- X-Cache: BYPASS -->
<!-- Request duration: 0.932852s -->
#+END_SRC
We’re looking for the 「Link: … ; rel="webmention"」 line. This tells us that to send a webmention targeting https://indieweb.org, we need to post it to https://webmention.io/indiewebcamp/webmention. Which is almost as simple as finding the endpoint. Here we go:
POST https://webmention.io/indiewebcamp/webmention
Content-Type: application/x-www-form-urlencoded
source=https://bofh.org.uk/2022/04/24/not-so-simple&target=https://indieweb.org
{
"status": "queued",
"summary": "Webmention was queued for processing",
"location": "https://webmention.io/indiewebcamp/webmention/dELUcbvGREHxYZ2n1xqI",
"source": "https://bofh.org.uk/2022/04/24/not-so-simple",
"target": "https://indieweb.org"
}
// POST https://webmention.io/indiewebcamp/webmention
// HTTP/1.1 201 Created
// Content-Type: application/json;charset=UTF-8
// Content-Length: 236
// Connection: keep-alive
// Status: 201 Created
// Cache-Control: no-store
// Access-Control-Allow-Origin: *
// Location: https://webmention.io/indiewebcamp/webmention/dELUcbvGREHxYZ2n1xqI
// X-Content-Type-Options: nosniff
// Date: Mon, 04 Mar 2024 13:42:00 GMT
// X-Powered-By: Phusion Passenger 5.3.1
// Server: nginx/1.14.0 + Phusion Passenger 5.3.1
// Request duration: 0.489982s
The job is done, and we get a nice JSON formatted summary of what’s going on to boot.
Of course, if a webmention is so simple to send then it’s probably a pain in the bum to receive, and it is… sort of. To receive a webmention request, you need to (roughly speaking):

1. Accept the POST and check that the source and target are sensible, distinct URLs, and that the target is actually a page on your site.
2. Fetch the source document.
3. Parse it, verify that it really does link to the target, and work out from its microformats what kind of mention it is.
4. Store the mention somewhere and, ideally, display it.
Steps 1–3 aren’t particularly hard, but they’re fiddly to get right and involve making web connections to potentially unsafe sites, and I’m using Hugo to generate this site because I don’t want to be running potentially insecure code that’s exposed to the internet on a server that I own if I can possibly help it. Thankfully, I don’t have to. I can take a leaf out of indieweb.org’s book and just delegate that part to webmention.io. Webmention.io handles all that icky visiting of foreign websites and parsing out of microformats for you and instead presents you with a feed, in a variety of formats, of all the webmentions that’ve been sent to your site. I’ve been consuming their .jf2-formatted feed for a while now. JF2 is a JSON representation of the microformats associated with the webmention’s source. Let’s grab something from that feed:
GET https://webmention.io/api/mentions.jf2?per-page=2&page=0&sort-dir=up&target=https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/
{
"type": "feed",
"name": "Webmentions",
"children": [
{
"type": "entry",
"author": {
"type": "card",
"name": "David Gallows",
"photo": "https://webmention.io/avatar/pbs.twimg.com/e7b750d847ffdcfc174845aadc9196125b83647258a1789bb2b92b493d223e8b.jpg",
"url": "https://twitter.com/DavidGallows"
},
"url": "https://twitter.com/pdcawley/status/1517783526049001472#favorited-by-877428607",
"published": null,
"wm-received": "2022-04-23T09:59:17Z",
"wm-id": 1385464,
"wm-source": "https://brid.gy/like/twitter/pdcawley/1517783526049001472/877428607",
"wm-target": "https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/",
"wm-protocol": "webmention",
"like-of": "https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/",
"wm-property": "like-of",
"wm-private": false
},
{
"type": "entry",
"author": {
"type": "card",
"name": "David Gallows",
"photo": "https://webmention.io/avatar/pbs.twimg.com/e7b750d847ffdcfc174845aadc9196125b83647258a1789bb2b92b493d223e8b.jpg",
"url": "https://twitter.com/DavidGallows"
},
"url": "https://twitter.com/DavidGallows/status/1517852498555555840",
"published": "2022-04-23T13:06:50+00:00",
"wm-received": "2022-04-23T15:12:04Z",
"wm-id": 1385681,
"wm-source": "https://brid.gy/comment/twitter/pdcawley/1517783526049001472/1517852498555555840",
"wm-target": "https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/",
"wm-protocol": "webmention",
"content": {
"html": "enjoyed reading about it :)\n\nI've been a trackball man since the word go, But have never been able to move away from Qwerty keyboards\n<a class=\"u-mention\" href=\"http://bofh.org.uk/\"></a>\n<a class=\"u-mention\" href=\"https://twitter.com/DrugCrazed\"></a>\n<a class=\"u-mention\" href=\"https://twitter.com/keyboardio\"></a>\n<a class=\"u-mention\" href=\"https://twitter.com/pdcawley\"></a>",
"text": "enjoyed reading about it :)\n\nI've been a trackball man since the word go, But have never been able to move away from Qwerty keyboards"
},
"in-reply-to": "https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/",
"wm-property": "in-reply-to",
"wm-private": false
}
]
}
// GET https://webmention.io/api/mentions.jf2?per-page=2&page=0&sort-dir=up&target=https://bofh.org.uk/2013/03/10/in-which-piers-prepares-to-void-the-warranty/
// HTTP/1.1 200 OK
// Content-Type: application/json;charset=UTF-8
// Content-Length: 2073
// Connection: keep-alive
// Status: 200 OK
// Cache-Control: no-store
// Access-Control-Allow-Origin: *
// X-Content-Type-Options: nosniff
// Date: Mon, 04 Mar 2024 13:42:00 GMT
// X-Powered-By: Phusion Passenger 5.3.1
// Server: nginx/1.14.0 + Phusion Passenger 5.3.1
// Request duration: 0.077520s
Lots of lovely structured data. Webmention.io has worked out that one mention was a like-of the blog post, and the other was in-reply-to it. We get details of the author of the mentioning post and, where appropriate, its content. If I wanted to run more JavaScript on here (and I want to run less), I could attach a script which would consume the post’s feed and build a display of all of the mentions. It has a certain appeal: just add one script to the site and a dummy <div> or <ul> somewhere and I’m laughing. Plenty of sites do just that.
This is not one of those sites. It’s not even the first statically generated site to go down the route of statically generating a post’s webmentions. I was mostly inspired by Brian Wisti’s post about consuming the webmention.io API (except, of course, I don’t use any of his actual code.)
The site’s GitHub repo is configured so that any commit on the main branch fires off a workflow that builds the site and ships all the files over to the webserver using rsync. If I take Brian’s idea for grabbing all my webmentions, ignore his warning about splitting it out into Hugo data files, and just do it, I can start building the webmentions for posts. Huzzah!
It started so innocently. I have a server here that hosts a couple of Docker images, and one of them is N8N, a super-powered, self-hosted, open source replacement for IFTTT with all sorts of hooks into other services and a much more powerful GitHub client than the IFTTT offering. It’s a bit… JavaScript-y for my tastes, but you can’t have everything.
With a bit of fiddling, I had something that grabbed the webmention.io feed for the site every few hours, split it out into multiple files in data/mentions and updated GitHub. That’s what I was celebrating in We have WebMentions. I’ve moved on since then, because of course sorting out de-duplication and remembering information between runs of the script is annoyingly fiddly and full of edge cases. Basically, I ended up trying to emulate a proper database. Which is why the latest iteration of webmention handling uses a proper database. I would have used SQLite, but N8N doesn’t have a SQLite node available out of the box. It does have a PostgreSQL one though, and recent versions of that have really good JSON support. I’d tell you more, but wc-mode tells me I’m nearly 2,500 words into this article, so I think I’ll wrap up for now and promise to give you the gory details in an upcoming article.
org-mode is an Emacs outliner that grew into a calendar/outliner/spreadsheet/document processor/literate programming tool/dessert wax/floor topping.
It’s what I used to use to manage my bakery, and it’s amazing.
Like Emacs itself, it’s almost infinitely flexible, which makes it incredibly hard to get started with. There’s oodles of org configurations out there to crib from and all of them are a mixture of the useful and irrelevant, because it turns out that people have different opinions about how they want to organise their writing and/or life. My config is very much under construction. ↩︎
The GitHub Actions based build process is also substantially more reliable than the hand rolled server hook I was using. There’s something to be said for assembling your build pipeline from a bunch of stuff that lots of other people use (and maintain). Also, it reduces the number of moving parts on the Raspberry Pi that’s serving these pages, which is no bad thing. ↩︎
An h-entry is something that a web user might want to mention. At present, all the h-entries on this site are articles, but other people use them to mark up photos, videos, notes, calendar entries or anything else that makes sense to think of as an entry in a collection of stuff. If you look at this page in your browser’s inspector, you’ll see that the content is wrapped in <article class="h-entry" …>…</article> tags. Other tags within that block are marked with other classes (so the title has p-name and the body has e-content), according to the definition of the h-entry microformat. By marking the site up with these microformats, I make life much easier for any IndieWeb tool that wants to extract the appropriate information from it. ↩︎
After much fiddling with N8N, webmention.io, and the usual combination of Hugo’s powerful yet inscrutable templating language and my tenuous understanding of CSS, we’re now displaying our webmentions. We’ve been directing them to webmention.io for years now, but I’ve been scratching my head over what to do with them.
The way it works at the moment is that I run a task every few hours that checks webmention.io, merges the results with the stuff we already know about and commits the updated data files to GitHub, which triggers a GitHub Action that rebuilds the site. This is… inefficient. My next step is either to expose the n8n workflow via a webhook, or to work out how to retain some information from the previous run and use that to ensure we only fetch the mentions that’ve arrived since the last time we checked. But that’s work for another day. Right now I’m calling what I have a win, merging this branch to main and basking in the warm glow of taking one more step down the IndieWeb road.
It’s been a while hasn’t it? I’ve been blogging on and (mostly) off since 2004 (at least according to the oldest article on here), and the IndieWeb movement reminds me of those heady days before Facebook, Twitter and the other monoliths scooped up all the bloggers.
It was probably Twitter that killed my regular blogging – before Twitter, if I had something to say, I’d write a blog (or a LiveJournal for more personal stuff) post. Maybe a few days later, someone would reply, or write a blogpost of their own as a reaction and I’d get a pingback. These days, when I blog, my posts sit in splendid isolation, which wasn’t really a thing back in Blogging’s heyday. Spam killed my will to support comments and the growing complexity of blogging software was a real turn off.
I burned out.
I ran this site on Typo, but most of the work I was doing on it was implementing features I didn’t need which made the code slower and harder to understand, so I stepped back just as Twitter started its rise.
I’ve made a few abortive returns to blogging since, prompted by the rise of static site generation engines and the fact that I like having something to fiddle with. I could have just installed WordPress, but the idea of simply serving up a pile of static files (and no JavaScript!) seems way more sustainable (and secure) to me.
Not running code on my server makes it a bit tricky to be fully engaged in the IndieWeb ideal of a connected web of websites using WebMentions to make those interactions visible, but it can be done, and I too shall achieve it! One day. Baby steps, eh? I might resort to a JavaScript-based setup initially, but long term I want to keep the site completely script-free and fast.
I’ve written a few pieces now for Jon Wilks’ new and rather wonderful Tradfolk website. You can find those (and any future articles) at https://tradfolk.co/author/pierscawley/ if you’re interested in my suggestions on how to get started singing without accompaniment and building your repertoire.
I think that’s what’s got me returning to this site frankly. I’d forgotten how much I enjoyed writing long-ish form stuff rather than 280 character miniatures.
I suspect that, like every other IndieWeb blogger, I’ll have a few upcoming articles detailing how I make things work here, but there’s a few things in my drafts folder that I want to return to, and probably some discussion of my experiences streaming folk songs every Friday night for the best part of two years now.
Let’s revisit this in a couple of months and see if it’s still the most recent article on the site, eh?
Hugo1 sort of supports this out of the box with its youtube, instagram, vimeo etc. built-in shortcodes. The thing is, they’re not lazy enough – you have to dig into each URL to extract a content ID and pass that in to {{% youtube kb-Aq6S4QZQ %}} or whatever. Which would be kind of fine, if you weren’t used to the way sites like Facebook, Twitter, Tumblr and so on work. With those sites, you enter a URL and they disappear off behind the scenes and fill in a fancy preview of the page you linked to. Why can’t Hugo do that?
Well, it can. It just takes a little work.2 The question to ask is: how do all those user-friendly sites do their thing? Twitter and Facebook, being the walled garden behemoths that they are, do it by dictating two different microformats that live in a page’s HEAD section. The microformat approach has a good deal to be said for it: in theory, you can just fetch the URL you’re interested in, parse out the microformat of your choice and build your own media card. I’ve not worked out how to do this yet though. However, before Twitter and FB started throwing their weight around, there was an open standard that lots of sites support, and it’s really easy to use. It’s called oembed and it’s great. The idea is that it too is discoverable from the HEAD of your media page. You look for something matching <link rel="alternate" type="application/json+oembed" href="..." ...>, make a JSON request to the href URL and paste in the contents of the html key in the object you get back. The catch, of course, is that you still end up having to parse the document’s HEAD.

The cool thing about oembed, though, is that while you can discover its endpoints that way, there’s also a big list of known endpoints on the Oembed homepage, which is also available as a big old JSON object if you want to go the full programmatic route. There are JavaScript libraries available that will walk your webpage and the JSON object and replace all your links with chunks of embedded content, and that’s what I used to use on this site. But… that’s not how I currently roll at Just A Summary. There are currently no <script> tags to be found on here and I plan to keep it that way. So I wrote a Hugo shortcode. Here it is:
{{ $url := (.Get 0) }}
{{- range $.Site.Data.embed }}
{{- if le 1 ( findRE .pattern $url | len ) }}
{{- with (getJSON .endpoint "?" (querify "format" "json" "url" $url)) }}
{{ .html | safeHTML }}
{{ end }}
{{ end }}
{{ end }}
We use it like this: {{< embed "https://youtu.be/kb-Aq6S4QZQ" >}}, which displays like this:
“But how does it work?” I hear you ask. It works in conjunction with some per-site data entries that I’ve added to the directory data/embed in this site’s base directory. You might have guessed that the data entries are maps with two entries, a pattern and an endpoint. If the URL argument matches the .pattern, then we make a getJSON request to .endpoint with a sanitised version of the URL argument tacked on as our query string, and insert the JSON response’s .html entry.
I made the data files by taking the big JSON object from https://oembed.com/providers.json and massaging the supplied patterns into regular expressions. In theory, I could write a script to do the conversion for me, but I’m only really interested in four providers for this site, so I just did it by hand. So the entry for Instagram:
{
"provider_name": "Instagram",
"provider_url": "https:\/\/instagram.com",
"endpoints": [
{
"schemes": [
"http:\/\/instagram.com\/*\/p\/*,",
"http:\/\/www.instagram.com\/*\/p\/*,",
"https:\/\/instagram.com\/*\/p\/*,",
"https:\/\/www.instagram.com\/*\/p\/*,",
"http:\/\/instagram.com\/p\/*",
"http:\/\/instagr.am\/p\/*",
"http:\/\/www.instagram.com\/p\/*",
"http:\/\/www.instagr.am\/p\/*",
"https:\/\/instagram.com\/p\/*",
"https:\/\/instagr.am\/p\/*",
"https:\/\/www.instagram.com\/p\/*",
"https:\/\/www.instagr.am\/p\/*",
"http:\/\/instagram.com\/tv\/*",
"http:\/\/instagr.am\/tv\/*",
"http:\/\/www.instagram.com\/tv\/*",
"http:\/\/www.instagr.am\/tv\/*",
"https:\/\/instagram.com\/tv\/*",
"https:\/\/instagr.am\/tv\/*",
"https:\/\/www.instagram.com\/tv\/*",
"https:\/\/www.instagr.am\/tv\/*"
],
"url": "https:\/\/api.instagram.com\/oembed",
"formats": [
"json"
]
}
]
}
becomes
endpoint: "https://api.instagram.com/oembed/"
pattern: "^https?://(www\\.)?instagr(\\.am|am\\.com)/((.*/)?p/|tv/)"
Collapsing all those scheme entries down to a single regular expression was a slight pain to do by hand, and I’m not entirely sure the regular expression will match exactly what the schemes match, but it’s not broken on any of the Instagram links I’ve thrown at it so far, so that’s good enough for me.
This isn’t the shortcode’s final form – it’s not as robust as I’d like in the face of a missing or temporarily down oembed endpoint, so it would be good to have some kind of fallback in case an endpoint changes or goes away. Also, there are some sites that have their own methods for embedding previews but don’t support oembed, and it would be great to get at those somehow. I suspect I will end up with a shortcode which is essentially a big case statement dispatching to different partials which handle the real rendering. Again… watch this space.
Hugo is the static site generator I use to build this blog. Another example of letting the computer do all the fiddly repetitive bits. In this case, to handle all the fiddly bits of writing full HTML pages, building index pages and the rest. ↩︎
It’s also annoyingly temperamental at the moment; I’m working on that though. ↩︎
It all kicks off at 7pm, UK time, with a kind of Q&A session and introduction to Twitch for newcomers. I’m particularly planning to help other independent musicians reach their audience through the platform.
Then, I plan to follow the Royal Traditions/Singing Together format of two forty minute sets of folk material with a 10 minute refreshment and raffle break in the middle.
After the folk club concert I’ll be jumping onto Twitch Sings to round out the evening singing implausible songs with friends from that community and any folky friends who’ve managed to get themselves up on Twitch by then. I’m hoping it’ll be a lot of fun.
It’s the internet! It won’t cost you a penny to watch me perform. However, right now, daft stuff like this is my only potential source of income, so I would be deeply grateful if you could either “Buy me a coffee” via Ko-Fi or sign up for a free Twitch account and subscribe to my channel.
If you are an Amazon Prime subscriber and you don’t yet have a Twitch Subscription, there’s a wonderful thing you can do that means that Amazon will give me (or any other streamer you enjoy) some money and it won’t cost you a penny. Sign up for Twitch Prime, which is just like a regular twitch account, but you can subscribe to one channel for free each month. The streamer gets paid by Twitch as if you’d signed up for a regular subscription, but you don’t get charged a penny because you’re already paying Amazon for your Prime account. The only difference between a Prime subscription and a regular one on Twitch is that you can’t set up a Prime sub to renew. If you would like to keep making regular payments to the streamer of your choice, you need to remember to resubscribe every month.
A Ko-Fi coffee comes in at £3, but if you want to tip me or any other twitch performer with an arbitrary amount, then Twitch Bits are your friend. You buy ’em from Twitch and can then use them as a virtual currency. For the performer, 100 bits is equivalent to $1, but they will cost you more than that to buy, because Twitch are (understandably, it’s not a cheap platform to run!) going to have to take a cut somewhere. By loading it on the cost of bits to the giver, they make things really transparent. It’s not like the weird alchemy where you pay a music streaming site 69p for a track or whatever and, unbeknownst to you unless you really dig into it, the artist sees maybe 6p of that.
Other performers have other ways for you to support them, whether it be public amazon wishlist, paypal tip jar, patreon page or some other service I’ve not heard of yet. In some ways, it’s never been easier for you to support the work of artists you love.
Before I read Tolkien at the suggestion of the wonderful Miss Reese, my teacher for my last year of primary school; before I pulled Diana Wynne Jones, Alan Garner, Susan Cooper and others from the shelves of Bawtry’s small, but enchanting branch library; before Anne McCaffrey’s DragonSong found me in my school library and set a fire in my imagination. Before all that, I read A Wizard of Earthsea and it stuck with me.
I remember one Saturday with 50p in my pocket from singing for a couple of weddings at St George’s church in Doncaster (25p for each wedding, paid cash in hand on the day. It always felt like a bonus after singing Bach’s Jesu, Joy of Man’s Desiring in the side chapel as the register was signed in the vestry, and Magnus Black, the choir and organ master, brought that beautiful tune dancing with such delicacy from an instrument that would shake the walls later as the happy couple left the church to Widor’s Toccata). I was never one for saving, I’m still not, so I was straight round to Donny’s nearest thing to a bookshop, the WH Smith in the Arndale, in search of something to read. A voracious reader, I’d gone through all the Swallows and Amazons and Narnia books and I needed more. The cover of the second Puffin edition, with its white youths and bizarre half-man, half-hawk, fascinated me, so I handed over my 50p and headed home with my prize.
I read A Wizard of Earthsea once or twice and loved it, but I’ve not reread it since. As a kid, I borrowed the rest of the then trilogy from the library and found them rather hard going (my memory says that I found The Tombs of Atuan a real slog. I got through it, but it took a couple of goes and at least one renewal to get to the end). A few weeks ago though, I went to the Sound Post ‘Modern Fairies’ singing weekend and fell into conversation with Terri Windling about the books that had shaped me. I told her about my experience with the Earthsea trilogy and how I thought maybe I’d been a little too young for them (I think I was eight or nine when I read AWoE, and maybe twelve when I read The Tombs of Atuan and The Farthest Shore). I hadn’t revisited them since. Terri made me promise to reread them and to let her know what I thought. So that’s what I’m doing. Terri, this book report’s for you. I owe you a few more and I promise I’ll get to them.
By the way, if you’ve never read A Wizard of Earthsea, there will be spoilers in this article. Read the book before continuing. It shouldn’t take you long, and it’s well worth the time.
It’s not so much what happens in this story as the way it’s told that left its impression on me. Earthsea is made of words – all stories are, of course – sung into being by Segoy. Words are power. A wizard spends a large part of his education learning “the Deeds of heroes and the Lays of wisdom”, and a year under the Master Namer just learning the true names of things in the Old Speech: the language of dragons; the language in which the world was made. In the period when the book is set, there is written language, but I get the feeling that it’s very much the preserve of the wise. Songs, orally transmitted, are how the people of the archipelago hold their history, and Le Guin’s language reflects that. Every sentence seems to have been shaped to be spoken, and beautifully so. I kept stopping and reading passages out because the words were just so… right.
I sometimes wonder who the tale’s narrator is telling the story to. It’s a question that can break a lot of first person SF and exposes lazy storytelling. If a book that’s supposedly the product of a completely different culture or time feels like it’s written for an early 21st century reader, it breaks the book for me. The language and idiom of A Wizard of Earthsea seem entirely right and consistent. We learn so much about the world as Ged’s story is told from things mentioned in passing. We know that this happened a long time ago and it’s assumed we already know about The Deed of Ged, The Creation of Ea and all the other songs, deeds and festivals that are referred to in passing through the book. At the end we are told that no songs have survived that tell how Ged came to terms with his shadow – the entire book is a footnote in a much larger story that’s just out of reach. I’m reminded of the fact that we only have the Norse myths we know because an ancient Icelander worried that readers wouldn’t recognize the allusions in the sagas and eddas, so they wrote down the bones of the older stories to help future readers understand. If Le Guin had left Earthsea at this point, all we would know of Earthsea would be the glimpses of it in this story. And what glimpses they are.
You can find echoes of A Wizard of Earthsea in so much subsequent fantasy literature: the possibility of a wizard being trapped in another form, for instance, which Pratchett plays with and develops in the Witches sequence of Discworld books. I loved this sentence though: “And no one knows how many of the dolphins that leap in the waters of the Inmost Sea were men once, wise men, who forgot their wisdom and their name in the joy of the restless sea.” If I had the power to become a dolphin I wouldn’t be keen to return to the body of a fat 51-year-old with diabetes and a bunch of aches and pains that I try not to think about. You can keep your wisdom sometimes.
As a kid, I didn’t really understand what was going on with Ged and his Shadow. It was easy to see myself in the young Ged (I don’t think I ever noticed that he didn’t have the same colour skin as me). I loved learning and especially knowing. It wasn’t hard to take my undoubtedly superior intelligence as analogous to a wizard’s power. Then, though, the shape of the story confused me, especially the ending. Ged and his friend sail off the page. The sea becomes land. Ged steps off the boat and confronts the Shadow, addressing it with his own name. And the shadow disappears/merges with Ged. And they all live on to do the Deeds which are sung of them. What? Nine-year-old me had no idea what was going on there, but the imagery stuck.
Now, of course, it all seems a lot clearer. Thesis. Antithesis. Synthesis. Ged does a terrible thing in his ignorance and pride. In shame he runs from it, almost losing his humanity in the process. He is tempted by a dark power, but rejects it. A friend and teacher restores him to himself and tells him that running is the sure road to doom. He turns and chases his Shadow instead. Finally he comes to an acceptance that the Shadow is a part of himself, and by giving it his name he reintegrates that part into himself and finally becomes a whole man. There you go – no need to read the book now.
Of course you need to read this book. Its language sings and the places and people it evokes are beautifully drawn. Rereading this after more than 30 years, so much was familiar. I would have said I’d forgotten almost all of it but the bare outline of the story and a few character names, but that stuff clearly went in deep and helped make me myself, because as I read, the whole shape of the thing unfolded in my head. It was almost like recognising roads and pathways in a place you holidayed repeatedly as a kid, then didn’t return to for 20 years. Familiar and surprising at the same time. “Oh yeah, that’s where Daniel used to live! And do you remember walking up there to buy ice creams at the village shop? Oh! I’d forgotten this view!”
Right… onwards to The Tombs of Atuan.
And here’s how we represent that in the database:
name | ingredient | quantity |
---|---|---|
Small Seedy Malt | Seedy Malt Dough | 0.63 kg |
Small White Wild | Basic White Sour | 0.63 kg |
Basic White Sour | Organic white flour | 2.00 kg |
Basic White Sour | Sea salt | 0.06 kg |
Basic White Sour | Water | 1.10 kg |
Basic White Sour | 80% starter | 1.80 kg |
Seedy Malt Dough | 5 Seed Soaker | 4.00 kg |
Seedy Malt Dough | Water | 3.80 kg |
Seedy Malt Dough | Sea salt | 0.22 kg |
Seedy Malt Dough | 80% starter | 3.60 kg |
Seedy Malt Dough | Organic light malthouse flour | 8.00 kg |
5 Seed Soaker | Water | 1.20 kg |
5 Seed Soaker | 5 seed mix | 1.00 kg |
Mother | Water | 3.20 kg |
Mother | Organic white flour | 4.00 kg |
Suppose we have an order for 8 Small White loaves. We need to know how much starter to mix tonight. We know that we need 0.63 kg of dough for each loaf, so that’s a total of 5.04 kg of Basic White Sour. The formula for Basic White Sour makes a total of \(1.10 + 1.80 + 0.06 + 2.00 = 4.96 \mathrm{kg}\) of dough. So we need to multiply each quantity in that formula by the weight of dough we need divided by the total weight of the recipe \((5.04/4.96 = 1.016)\). This is straightforward enough for flour, water and salt, which are basic ingredients, but we’ll need to do a similar calculation to work out how much flour and water we’ll need to make \(1.016 × 1.8 = 1.829 \mathrm{kg}\) of starter. You can see how this might become a little tedious.
If I were going to be doing these calculations by hand, it would definitely pay me to normalize my intermediate formulae so they all made a total of 1 kg of stuff. But screw that, we have a computer, so we can make it do the work.
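Spelled out generally: if a formula contains ingredient amounts \(a_1, \dots, a_n\) (totalling \(T = \sum_i a_i\)) and we need \(W\) kg of it, then we need

\[ \text{needed}(i) = W \times \frac{a_i}{T} \]

of each ingredient, and any ingredient that is itself a formula gets the same rule applied again with its own \(W = \text{needed}(i)\). That’s the rule the recursive query below applies at every step down the graph.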
I’m going to simplify things a little (the real database understands about dates, and we need to know a little more about recipes, products and ingredients than will fit in the recipe_item table that describes the graph) but this should give you an idea of the recursive queries that drive production planning.
Let’s introduce a production_order table, where we stash our orders:
product | quantity |
---|---|
Small White Wild | 5 |
Large White Wild | 5 |
And that’s all we need to fire off a recursive query.
WITH RECURSIVE po(product, quantity) AS (
SELECT 'Small White Wild', 5
UNION
SELECT 'Large White Wild', 5
), rw(recipe, weight) AS (
SELECT recipe, sum(amount)
FROM bakehouse.recipe_item
GROUP BY recipe
), job(product, ingredient, quantity) AS (
SELECT po.product,
ri.ingredient,
po.quantity * ri.amount
FROM po
JOIN bakehouse.recipe_item ri ON po.product = ri.recipe
JOIN rw ON ri.recipe = rw.recipe
UNION
SELECT job.ingredient, ri.ingredient, job.quantity * ri.amount / rw.weight
FROM job
join bakehouse.recipe_item ri on job.ingredient = ri.recipe
join rw on job.ingredient = rw.recipe
)
SELECT product formula, ingredient, ROUND(sum(quantity), 2) quantity
FROM job
GROUP BY job.product, job.ingredient
ORDER BY formula;
Which gives the following result:
formula | ingredient | quantity |
---|---|---|
Basic White Sour | Sea salt | 0.09 |
Basic White Sour | Water | 1.72 |
Basic White Sour | Mother | 2.81 |
Basic White Sour | Organic white flour | 3.13 |
Large White Wild | Basic White Sour | 4.65 |
Mother | Organic white flour | 1.56 |
Mother | Water | 1.25 |
Small White Wild | Basic White Sour | 3.10 |
A quick sanity check seems to show this is correct (we’re making 7.75kg of Basic White Sour, which tallies with the weights needed to make the loaves).
So what’s going on in the query? In SQL, WITH is a way of giving names to your intermediate results, akin to let in a Lisp. We fake up a table to hold our production orders (po), and the rw clause totals the weights of all our recipes (in the real database, it’s a view). The magic really starts to happen when you use the WITH RECURSIVE form. With RECURSIVE in play, the last query is treated differently. Instead of being a simple two-part UNION, what happens is that we first run:
SELECT po.product, ri.ingredient, po.quantity * ri.amount
FROM po
JOIN bakehouse.recipe_item ri on po.product = ri.recipe
JOIN rw on ri.recipe = rw.recipe
and call the results job, and then run the second query, adding any extra rows generated to the results, and repeating that query until the result set stops growing. If we didn’t have WITH RECURSIVE available, and we knew the maximum depth of recursion we would need, we could fake it by making a bunch of intermediate clauses in our WITH. In fact, until I worked out how WITH RECURSIVE works, that’s exactly what I did.
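For the curious, the fixed-depth workaround looks roughly like this. It’s a sketch rather than the real thing – the clause names job1, job2 and job3 are made up for illustration, and it only copes with formulae nested up to three levels deep:

#+begin_src sql
-- A sketch of the fixed-depth workaround: unroll the recursion by hand.
-- job1 expands the ordered products one level, job2 expands job1's
-- sub-formulae, job3 expands job2's, and the final SELECT glues the
-- levels back together.
WITH po(product, quantity) AS (
    SELECT 'Small White Wild', 5
    UNION ALL
    SELECT 'Large White Wild', 5
), rw(recipe, weight) AS (
    SELECT recipe, sum(amount)
    FROM bakehouse.recipe_item
    GROUP BY recipe
), job1(product, ingredient, quantity) AS (
    SELECT po.product, ri.ingredient, po.quantity * ri.amount
    FROM po
    JOIN bakehouse.recipe_item ri ON po.product = ri.recipe
), job2(product, ingredient, quantity) AS (
    SELECT j.ingredient, ri.ingredient, j.quantity * ri.amount / rw.weight
    FROM job1 j
    JOIN bakehouse.recipe_item ri ON j.ingredient = ri.recipe
    JOIN rw ON j.ingredient = rw.recipe
), job3(product, ingredient, quantity) AS (
    SELECT j.ingredient, ri.ingredient, j.quantity * ri.amount / rw.weight
    FROM job2 j
    JOIN bakehouse.recipe_item ri ON j.ingredient = ri.recipe
    JOIN rw ON j.ingredient = rw.recipe
)
SELECT product formula, ingredient, round(sum(quantity), 2) quantity
FROM (SELECT * FROM job1 UNION ALL
      SELECT * FROM job2 UNION ALL
      SELECT * FROM job3) AS job
GROUP BY formula, ingredient
ORDER BY formula;
#+end_src

Every extra level of nesting means another hand-written clause, which is exactly the sort of tedium WITH RECURSIVE exists to remove.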
Have you spotted the mistake in the recursive query? I didn’t, until a few bakes went horribly wrong.
Here’s what happens when we have an order for three small loaves and two large ones:
formula | ingredient | quantity |
---|---|---|
Basic White Sour | Sea salt | 0.02 |
Basic White Sour | Water | 0.41 |
Basic White Sour | Mother | 0.68 |
Basic White Sour | Organic white flour | 0.75 |
Large White Wild | Basic White Sour | 1.86 |
Mother | Organic white flour | 0.38 |
Mother | Water | 0.30 |
Small White Wild | Basic White Sour | 1.86 |
We’re only making 1.86 kg of dough? What’s going on?
It turns out that the way a UNION works is akin to doing SELECT DISTINCT on the combined table, so it selects only unique rows. When two orders end up requiring exactly the same amount of the ‘same’ dough, they get smashed together and we lose half the weight. This is not ideal.
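You can see the deduplication in miniature without any bakery data at all; this toy pair of queries is purely illustrative:

#+begin_src sql
-- UNION folds identical rows into one: this returns a single row.
SELECT 'Basic White Sour' AS formula, 1.86 AS quantity
UNION
SELECT 'Basic White Sour', 1.86;

-- UNION ALL keeps both rows.
SELECT 'Basic White Sour' AS formula, 1.86 AS quantity
UNION ALL
SELECT 'Basic White Sour', 1.86;
#+end_src

Swapping in UNION ALL would probably have been one way out of the hole, since the recipe graph has no cycles, but the fix I actually went for keeps UNION and makes the colliding rows distinct instead.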
I fixed it by adding a ‘path’ to the query, keeping track of how we arrived at a particular formula. Something like:
WITH RECURSIVE po(product, quantity) AS (
SELECT 'Small White Wild', 3
UNION
SELECT 'Large White Wild', 2
), rw(recipe, weight) AS (
SELECT recipe, sum(amount)
FROM bakehouse.recipe_item
GROUP BY recipe
), job(path, product, ingredient, quantity) AS (
SELECT po.product,
po.product,
ri.ingredient,
po.quantity * ri.amount
FROM po
JOIN bakehouse.recipe_item ri ON po.product = ri.recipe
JOIN rw ON ri.recipe = rw.recipe
UNION
SELECT job.path || '.' || job.ingredient,
job.ingredient,
ri.ingredient,
job.quantity * ri.amount / rw.weight
FROM job
join bakehouse.recipe_item ri on job.ingredient = ri.recipe
join rw on job.ingredient = rw.recipe
)
SELECT product formula, ingredient, round(sum(quantity), 2) weight
FROM job
GROUP BY formula, ingredient
ORDER BY formula;
This query gives us:
formula | ingredient | weight |
---|---|---|
Basic White Sour | Sea salt | 0.05 |
Basic White Sour | Water | 0.83 |
Basic White Sour | Mother | 1.35 |
Basic White Sour | Organic white flour | 1.50 |
Large White Wild | Basic White Sour | 1.86 |
Mother | Organic white flour | 0.75 |
Mother | Water | 0.60 |
Small White Wild | Basic White Sour | 1.86 |
This time we’re making 3.74 kg of dough, which is right.
In order to see what’s going on, we can change the final SELECT to SELECT formula, path, ingredient, round(quantity,2) weight FROM job, and now we get:
formula | path | ingredient | weight |
---|---|---|---|
Large White Wild | Large White Wild | Basic White Sour | 1.86 |
Basic White Sour | Large White Wild.Basic White Sour | Mother | 0.68 |
Basic White Sour | Large White Wild.Basic White Sour | Organic white flour | 0.75 |
Basic White Sour | Large White Wild.Basic White Sour | Water | 0.41 |
Basic White Sour | Large White Wild.Basic White Sour | Sea salt | 0.02 |
Mother | Large White Wild.Basic White Sour.Mother | Water | 0.30 |
Mother | Large White Wild.Basic White Sour.Mother | Organic white flour | 0.38 |
Small White Wild | Small White Wild | Basic White Sour | 1.86 |
Basic White Sour | Small White Wild.Basic White Sour | Organic white flour | 0.75 |
Basic White Sour | Small White Wild.Basic White Sour | Sea salt | 0.02 |
Basic White Sour | Small White Wild.Basic White Sour | Water | 0.41 |
Basic White Sour | Small White Wild.Basic White Sour | Mother | 0.68 |
Mother | Small White Wild.Basic White Sour.Mother | Organic white flour | 0.38 |
Mother | Small White Wild.Basic White Sour.Mother | Water | 0.30 |
Which shows that we’re considering two lots of Basic White Sour with exactly the same weights, but we (and more importantly, the database engine) know that they’re distinct amounts because we get to them through different routes. Hurrah! The problem is solved and we can accurately work out what we should be mixing.
As a baker, I know that if I’ve got an order for bread on Friday, then I need to mix the starters on Wednesday night, then spend Thursday mixing, fermenting and shaping the loaves, which will spend the night in the retarder ready to be baked at 4 on Friday morning. But the schema I’ve outlined here doesn’t. In my full bakehouse schema, I have a few extra tables which hold timing data and such. In particular, I have a product table, which knows about everything I sell. This table holds info about how many I can make per hour of work and the bake time and temperature. Then there’s a recipe table which holds information about how long a formula needs to rest.
The real queries take this into account to allow us to work back from the due_date of a real order to the day we need to do the work. If you want to dig into how I handle dates, you can check out the repository at https://github.com/pdcawley/bakehouse/.
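To make the shape of those extra tables concrete, here’s a minimal sketch. The column names are illustrative guesses for the purposes of this article; the real schema lives in the repo:

#+begin_src sql
-- Illustrative sketch only; the real tables carry more information.
CREATE TABLE bakehouse.product (
    name         text PRIMARY KEY,
    recipe       text NOT NULL,   -- the formula this product is shaped from
    per_hour     integer,         -- how many I can make in an hour of work
    bake_minutes integer,         -- time in the oven
    bake_temp    integer          -- oven temperature, °C
);

CREATE TABLE bakehouse.recipe (
    name          text PRIMARY KEY,
    rest_interval interval        -- how long the formula needs to rest before it's used
);
#+end_src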
Never write your work up for your blog. Especially if you’re mostly happy with it. As I was writing this, I realised there’s an annoying bit of code duplication that I think I can eliminate. In the current code, I repeat what’s essentially the same query structure in a couple of different views, but the formula graph is essentially static unless I add or adjust a recipe. Now I’m wondering if I could make a materialised view that has enough information to shortcut the calculations for both making the production list (what needs to be mixed, when) and for working out my costings (to put a price on a loaf, you need to know how much the raw ingredients cost, and that involves walking the tree again). Maybe a table like:
product | sub_formula | ingredient | factor | lead_time |
---|---|---|---|---|
Large White Wild | Basic White Sour | White Flour | 0.403 | 1 day |
Large White Wild | Basic White Sour | Salt | 0.012 | 1 day |
Large White Wild | Basic White Sour | Water | 0.222 | 1 day |
Large White Wild | Basic White Sour | 80% Starter | 0.462 | 1 day |
Large White Wild | 80% Starter | White Flour | 0.288 | 2 days |
Large White Wild | 80% Starter | Water | 0.173 | 2 days |
If we have that table, then two days before our bread is due, if we have an order for 10 white loaves, we’ll need to mix \(9.3 × .288 \approxeq 2.68\) kg of flour and \(9.3 × 0.173 \approxeq 1.61\) kg of water. Which we can do with a simple non-recursive SELECT. Something like:
WITH weighted(formula, ingredient, weight, due) AS (
SELECT precalc.sub_formula,
precalc.ingredient,
precalc.factor * po.quantity * rw.weight,
po.due_date - precalc.lead_time
FROM precalc
JOIN production_order po ON precalc.product = po.product
JOIN recipe_weight rw ON precalc.product = rw.recipe
)
SELECT formula, ingredient, sum(weight)
FROM weighted
WHERE due = 'today'
GROUP BY formula, ingredient
We can use the same table to calculate the raw material costs for a given recipe, using a simple non-recursive query too.
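Something like the sketch below would do it, assuming a hypothetical ingredient_cost table holding a price per kilo for each raw ingredient – that table isn’t part of anything I’ve shown here, it’s just for illustration:

#+begin_src sql
-- Sketch only: ingredient_cost(ingredient, price_per_kg) is a hypothetical table.
-- Only raw ingredients have a price, so the join quietly drops the intermediate
-- formulae (starters, soakers) and avoids counting their contents twice.
SELECT p.product,
       round(sum(p.factor * c.price_per_kg), 2) AS raw_cost_per_kg
FROM precalc p
JOIN ingredient_cost c ON p.ingredient = c.ingredient
GROUP BY p.product;
#+end_src

Multiply that per-kilo figure by a loaf’s dough weight and you’ve got the raw ingredient cost of the loaf.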
I think, however, I’m going to leave it alone until I have to write another recursive view that walks the same graph, at which point I’ll bite the bullet and do the pre-calculated version.
One of the big changes that came with going pro was that suddenly I was having to work out how much stuff I needed to mix to fill the orders I’d taken. On the face of it, this is really simple: just work out how much dough you need, then work out what quantities to mix to make that much dough. Easy. You can do it with a pencil and paper. Or, in traditional bakers’ fashion, by scrawling with your finger on a floured work bench.
And that’s how I coped for a few weeks early on. But I kept making mistakes, which makes for an inconsistent product (bread is very forgiving, you have to work quite hard to make something that isn’t bread, but consistency matters). I needed to automate.
I’d been on one of Bread Matters’ “Baking for a Living” courses and as part of the course materials had received a copy of a spreadsheet that could be used to go from a list of orders to a list of ingredients to mix alongside accurate costings and other useful bits and bobs. It was great and certainly opened my eyes to the possibilities for automation of this part of the job.
And then I tried to add a new recipe.
Spreadsheets aren’t my favourite computational model so maybe it was just my lack of experience with them, but adding a new recipe was like pulling teeth; lots of tedious copying, pasting and repetition of formulae. It just seemed wrong, especially as the underlying computations were so straightforward (ish). There had to be a better way.
The key insight is that a bakery formula is so cliched that it can be represented as data. Here’s the formula for seedy malt loaves:
recipe | ingredient | quantity |
---|---|---|
Small Seedy Malt | Seedy malt dough | 0.61 kg |
Large Seedy Malt | Seedy malt dough | 0.92 kg |
Of course, that’s not the full set of formulae, because it doesn’t tell you how to make ‘Seedy malt dough’, but that’s just another formula, which consists of flour, water, starter, salt and a multiseed ‘soaker’, where the starter and the soaker are the results of other formulae, which are (finally) made from basic ingredients. I did consider reaching for the object-oriented hammer at this point, but thought that I might be able to do everything I needed without leaving SQL. It was relatively straightforward to move the shape of the calculations in the Bread Matters spreadsheet into my database schema, the only real sticking point being the recursive nature of the formulae, but it turns out that recursive queries are a thing in modern SQL, albeit a little tricky to get absolutely right first time. If you’re curious about the details of the schema, you can find it in my GitHub repo for the bakery.
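At its heart, the schema is just a table of edges in that formula graph – one row per (recipe, ingredient) pair. Here’s a minimal sketch of the central table; the real thing in the repo carries rather more detail:

#+begin_src sql
-- One row per edge in the formula graph. An ingredient is either a raw
-- material (flour, water, salt…) or the name of another formula.
CREATE TABLE bakehouse.recipe_item (
    recipe     text    NOT NULL,  -- the formula being described
    ingredient text    NOT NULL,  -- what goes into it
    amount     numeric NOT NULL,  -- kg of that ingredient in one batch
    PRIMARY KEY (recipe, ingredient)
);
#+end_src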
So now, a few days before a bake, I’d set up my production_order table with the orders for the bake, and run a query on the production_list view to find out what I needed to mix when. And all was great. Well, sort of. I had to add a bit extra onto the quantities in the initial starter mix to allow for the bits that get stuck to the bowl and lost to the final dough, and it was all very well until I wanted to bake two days in a row (a bake is a two day process, from mixing the starters on a Wednesday evening, through mixing, fermenting and shaping on Thursday, to baking the resulting loaves at four on Friday morning). But, vitally, it was much, much easier to add and adjust formulae, and the limitations were no worse than the limitations of the spreadsheet. All was well.
It’s the nature of business that you need to keep records. How much got baked? How much sold? Did we clean the floor? Were there any accidents? What sort? How do we prevent them next time? The list is endless. It all needs to be recorded, for both legal and pragmatic reasons. So I started a day book. This is just an .org file. Every day I come into the bakery, I run org-capture and I get a template for the day’s entry in the daybook, which I fill in as the day goes on.
One of the features of org-mode is org-babel, a literate programming environment, which lets me write something like:
#+begin_src sql
SELECT ingredient, quantity
FROM bakehouse.production_list
WHERE work_date = 'today';
#+end_src
and then, with the cursor somewhere in the code block, hit C-c C-c, whereupon Emacs will run that SQL against the bakery database and populate a table like:
ingredient | quantity |
---|---|
Old starter | 1.3 |
Water | 2.08 |
White flour | 2.6 |
… | … |
If that were all org-mode did to assist, it’d be awesome enough, but the queries I make are a little more complex than that: the current version of the database understands about dates and can cope with overlapping bakes, and that makes the queries more involved. Org-mode helps with that too, because I can file away snippets of code in a ’library of babel’ and just reference them from the daybook. And I can set arbitrary variables at any point in the hierarchy of the document.
So I have a bit of code in my Emacs config that tweaks the day’s entry in the daybook like so:
(defun pdc//in-bakery-daybook? ()
"Are we in the bakery daybook?"
(equal (buffer-name) "CAPTURE-loafery-daybook.org"))
(defun pdc/set-daybook-entry-properties ()
"Set the properties we rely on in our boilerplated daybook queries"
(save-excursion
(while (not (looking-at "*+ [[:digit:]]\\{4\\}\\(-[[:digit:]]\\{2\\}\\)\\{2\\}"))
(org-up-element))
(let ((entry-date (first (s-split " " (org-entry-get (point) "ITEM")))))
(org-entry-put
(point)
"header-args+"
(format ":var work_date=\"'%s'\"" entry-date)))
(org-babel-execute-subtree)))
(defun pdc/org-capture-before-finalize-daybook-entry ()
(when (pdc//in-bakery-daybook?)
(pdc/set-daybook-entry-properties)))
(add-hook 'org-capture-before-finalize-hook
#'pdc/org-capture-before-finalize-daybook-entry)
It won’t win any code beauty contests, but it does the job of setting a work_date variable for the day’s entry and running any code in the subtree as part of the capture process. The capture template has lines like #+call:mixes(), which call stored code snippets that reference the variable set in the current subtree, and so make the query for the right day. This means that all I have to do to know what I should be doing when I get into the bakehouse is to run an org-capture and check the resulting entry in my daybook. Provided, that is, that I’ve added the appropriate rows to the database.
The software isn’t done, of course; no software ever is. But it’s good enough that it’s been managing my mixes without a hitch for the last few months, telling me what to pack for which customer and generally removing the need to work anything out with a pencil and paper. It’s nowhere near as mature or capable as commercial production management software, but it fits me. I understand what it does and why, how it does it, the limitations it has and how to work around them. When it becomes annoying enough, I might sit down and work out how to fix it, but I’ll do that when I’m in the right frame of mind. My current list of niggles looks something like this:
org-ledger
Computers are amazing. They are versatile tools even if you don’t know how to program them, because there’s almost always an app for what you want, or something close enough that you can work around its infelicities. It’s quite remarkable the things that folks can do with their kit with no programming skill at all.
But… learn to program, and a whole other vista of possibility opens up to you. With good programmable tooling you’re only really limited by your skill and understanding. Instead of accommodating yourself to your software, you can accommodate your software to you, and make the right functionality trade-offs for you. There’s a brilliant commercial piece of music looping software I use that could be massively more brilliant if there were a way of picking up the tempo automatically from the first recorded loop – it would free me from having to sing to a click and generally make the whole process easier. The developers have other (understandable) priorities, like porting the app to Windows. And they’re not wrong to do so. There were folk clamouring for a Windows version, and if a developer isn’t making money from a commercial application, then development will stop. I’m definitely not complaining – the feature isn’t so dramatically necessary that I’m prepared to spend the time learning how to do real-time music programming in order to implement it – but if I want software to dance to my tune, then doing it myself is the only way.
So… choose tools that let you program them. I choose emacs and PostgreSQL, you might choose vim and SQLite or Atom and a NoSQL database, or you might just live in your Smalltalk image. Once you start to see your computing environment as truly soft and malleable, you can do amazing things, assisted by a computer that is truly yours.
Seriously, why does the Devil need an advocate? If you want to play DA because you think the position you want to argue has some merit, then argue the position honestly and own it. If it doesn’t survive the discussion (or is shouted down), then say “Ah right, I hadn’t thought of that, you’re right”, or words to that effect, and file that position in your memory as a bad one (along with the skeleton of why it’s bad). Nothing wrong with holding strong opinions; the thing that’s bad is holding onto them if they’re shown to be bad. If the group you’re talking with just shouts you down and doesn’t convince you that your position is a bad one, maybe find a different group? Or agree with them to steer clear of that topic.
What’s really intellectually dishonest is to say “I was only playing Devil’s Advocate!” after an idea has been shot down. I’m sure your intentions are entirely honourable, but what if they weren’t? Say you genuinely held that the best thing to do with the children of the poor was to turn them into cheap and delicious meals for the richest in society. Say you advanced this position to your friends and they were utterly appalled by the idea. Then maybe you’d try to distance yourself from it by saying “Whoah! Guys! I was only playing Devil’s Advocate!”
When I hear someone playing that card, how am I supposed to distinguish between the well-meaning “There is this argument I’ve come across that I’m not sure I agree with, but it maybe has some merit and I don’t know how I’d argue convincingly against it” types and the assholes who were flying a kite? Maybe the non-assholes will have friends who’ll tell me “They might seem like a bit of an arse, but they’re not really.” I’ve been that guy, and I don’t want to be him again. Why is it okay for me to load the work of explaining that I’m not a dickhead onto my friends rather than just not acting like a dickhead in the first place? Eventually, friends get tired. Eventually they’ll shift to “Yeah, I know he seems like an ass, and he kind of is, but…” and then one day, they won’t be your friends any more.
Before you introduce the idea you want to play Devil’s Advocate for, say something like “D’you mind if I play Devil’s Advocate for a moment?” And when the group tells you “Yes, we do mind. Why help the devil?” listen to them. If it’s genuinely that you’ve heard some argument that on the face of it seems repugnant, but you can’t find a hole in it, then say as much: “What’s wrong with this idea? Clearly feeding poor babies to the 1% is utterly repellant, but I can’t find an effective counterargument.”
Don’t keep doing it, mind, or you’ll start looking like the kite flying asshole again.
I know! It’s been a while. But we’re in! I have baked, and it was good. There’s still a ton of stuff to do (plumbing, mostly) but the really important bits of kit are all in place and looking good.
We celebrated getting in by turning one of the decks up as high as it would go and making lots of pizzas and a few loaves of bread.
In my last entry (over a year ago, argh! Gill is much better) the oven and all my kit were still in my garage, up on blocks waiting for Dad to build an A-frame so we could winch it up and assemble it. Which happened, and we managed to get one section of oven up onto the base. And there we stopped, because the fully assembled oven is very tall, and the A-frame isn’t tall enough to accommodate a fully assembled oven + the winch + space for the straps (and my garage roof isn’t high enough to accommodate a sufficiently tall A-frame). Still, it allowed us to start in on the process of breaking the heads of very old brass machine screws and generally failing to get the oven beds out where they could be cleaned. This was frustrating, but it’s not like I was unused to frustration.
Meanwhile, the bakehouse site moved again. We had thought it would be a relatively easy (and thus cheap) matter to run the necessary 3-phase power to the space, but it turns out there wasn’t quite enough power going to the building to support what we needed. That would mean a new substation and some very expensive cable laying. So it wasn’t going to fly. Luckily, there is also an old cafe in the yard. And it already has 3 phase, and enough 3 phase at that. So we set about making that into a bakehouse. A lick of paint; some new flooring; wider, taller doorways so we could get the oven in. Minor stuff like that.
By now we’re up to late spring of 2017. I’d given up on trying to renovate the decks myself, so I got onto Martin Passey at Becketts and arranged for them to sort out the electrics and replace the rusty steel beds with ceramic ones instead, which are generally reckoned to be the best choice if you want to make ‘hearth’ breads on the oven floor. We just needed to work out how to get the oven from Doncaster to Heywood.
Guess what? It wasn’t straightforward.
When we picked up my oven from the Isle of Wight, we’d got it into a large Luton bodied van with a tail lift, and it was kind of fine. I suppose I could have hired another one, roped in a few volunteers and driven it over myself, but the fact that we’d partially assembled the oven was going to make that rather trickier than it could have been. Disassembling it was going to be tricky too - after we got the straps out from the top decks when we’d assembled it, we discovered we’d been very lucky indeed, and the strap had very nearly broken.
The best option was to get a flatbed truck with a Hiab or similar hydraulic crane which would make short work of getting the oven up onto the truck and off to be fettled. But the access (up a 10 foot wide back lane) proved daunting. All the haulage companies I talked to took one look at it and backed away, muttering darkly and making the sign of the cross. “Get a bunch of strong Polish lads to carry it down the alley and stick it on the back of a truck” was the best (but very unofficial) suggestion. Not ideal.
So now it’s July and I’m chatting to a fellow guest at my brother’s silver wedding anniversary about my shipping woes. “You want to talk to Dan!” he said.
“Dan?”
“Yeah, Dan Punchard. He’s great, he’s moved a couple of lathes for me with some really tight access.”
“Thanks!”
My informant was not wrong. Dan was brilliant. We exchanged a few emails and photos of the access and bang the oven was off into the tender loving care of Becketts for its electrical fettling and new floors. And soon the money was flowing out of my savings as I bought a new spiral mixer, wire cooling baskets, steel work table, scales (both electronic for weighing ingredients into the mixer and a balance scale, which is still the fastest way of scaling dough when you divide it), lots of bannetons (probably not enough) from Bakery Bits, workwear, flour, wire shelving, and a bewildering amount of janitorial bits and bobs from Nisbets. Fettling the oven wasn’t exactly cheap, but wow, do those baskets add up!
On the 8th of December last year I sent Martin some mail with the subject “It’s arrived, and it fits!” and over the next couple of weeks the rest of the stuff I needed to bake arrived and, on the 20th of December, I fired up the mixer for a batch of 16 loaves and what proved to be far too many pizza doughballs.
On Friday 21st of December, I pulled my first loaves out of my ~40 year old oven, and damn, but they were good.
Of course, no enterprise like this is ever finished; here’s a selection from my to-do list.
A bigger sink! Water near the scale so I’m not carrying buckets back and forth! A handwash basin!
Environmental Health Officers do like you to have a certificate to show you’re not a complete moron when it comes to hygiene. Breadmaking is relatively low risk because everything gets so very hot during the cooking process, but even so.
Right now everything’s at the ambient temperature, which can mean staying in the bake house until the early hours in order to get the loaves into the oven when they’re perfectly proved. A better approach would be to stick the dough into a retarder (big fridge, racked for standard bakers’ sheet pans) and bake them first thing in the morning after a decent night’s sleep. I have a retarder, but transport is annoyingly tricky because it’s 2m tall, and should ideally be transported vertically too.
Right now, I can just about cope with two bakes a week, but if I’m going to actually make money at this, I’m going to need to be able to manage more. Hopefully, as I bake it’ll improve my fitness, so as demand grows I’ll be able to meet it.
Oh boy, do I suck at marketing? Still, the product is good and there’s nobody else in the local area making this sort of bread, so I have a few advantages. I still haven’t made a Loafery website though. At least I have the loafery.co.uk domain.
If I can get people ordering online, I can use that to produce production schedules, and generally have a better idea of how much to make on each bake day, which will help minimise any wasted bread. With two bakes done this year, I’ve sold every loaf - I’d like to keep that up.
I’ll talk about how a bake goes and the process of developing an initial range of products, sourcing flour and other ingredients and hopefully some news about online ordering.
You may have noticed a distinct lack of Bakehouse Diary posts. It’s mostly because dad’s in the process of building some A-frames to allow him to get engines out of old cars and such, and to help me get the oven assembled so we can test it in the garage. Which would be the work of not very long at all, if dad wasn’t doing a million and one other things. He’s just back from a trip to Holland and France, “following” the route south that my Grandfather took after his Lancaster was shot down in 1943. Which is an entirely other post.
It’s also because, a couple of weeks ago, I took my wife into A&E because she was short of breath and, after a desperately apologetic bit of Friday night queue jumping (“Um… I’m sorry, but, err… would you mind awfully if I jumped the queue here? It’s just that, ah, my wife’s having trouble breathing”) and prompt action by a triage nurse, she crumpled.
I was whisked away to a small room with teamaking facilities where I could fret without getting in the way. Over the next hour or so the team of heroes in the resus ward kept her alive through cardiac and respiratory arrests. By the time they let me see her again, she was looking a good deal better and proceeded to bounce back with astonishing speed and was discharged just under a week after she was admitted. She’s much, much better now.
Something like that gives you pause.
I’ve not been happy in my work for a while. A combination of persistent RSI preventing the kind of extended runs of almost trance-like coding that can be so energizing, general low level depression, and a sense that helping car hire companies make more money from their punters wasn’t exactly meaningful work. When you’ve been in the lucky position where what you do to make a living is what you’d be doing for enjoyment anyway, it comes as a shock when the pleasure drains away. So I’ve done the potentially stupid thing and jacked it in to spend more time with my wife. I’m pretty sure this isn’t sustainable, but I’m equally sure that the kind of stress I was placing myself under wasn’t worth the generous salary. My rough plan now is to get the bakery to the point where I can do a “selling” batch of bread and pastries at least once a week and, assuming that doesn’t snowball and become commercially viable by the new year, start looking for part-time programming work again in January.
If you’re looking for an experienced programmer, then by all means get in touch before then. I’d love to hear from you, but bear the following in mind.
I’ve worked around this with a voice coding environment that’s remarkably effective, but it still means:
I don’t want to still be a dedicated programmer in 2020. Or even 2018. My long term goal is to get the bakery to the point where it’s paying me a living wage and I can go back to programming for fun or to scratch my own business itches.
I used to believe that working from home wasn’t a great way of building code. That a co-located team was emphatically the way to go. But most of the open source software I used on a day to day basis isn’t written that way, it’s written by distributed, loosely connected teams coordinated via IRC, source control and infrequent meetups at conferences and hackathons.
Clearly face to face matters, and regular meetings in an office are a good thing, but I’m never doing another weekly commute. Nor will I spend over an hour travelling every day.
Seriously, if you can’t find someone from an underrepresented group who’s at least as good as me, then you should be putting some money into nurturing new talent from those same groups.
At least, I think I am. There’s not much on github right now (though I’m working on getting my voice coding setup into a documented and releasable state), but colleagues have said nice things about my code, and various technical talks I’ve given have been well received.
I’ve spent a good deal of my career understanding other people’s code and I’m good at breaking down what’s good and bad, what it does and how it can be improved.
Folk with my skills generally don’t come cheap. But the people with the deepest pockets aren’t necessarily the people I want to work for. Selling adverts doesn’t motivate me. Interesting and/or socially valuable work does. And I’m happy to take that into account when negotiating a salary.
I wrote back with suggestions and a bucket of enthusiasm. Fay’s reply was lovely:
Still in feasibility stages and I’m fluctuating between thinking it’s a brilliant idea and that I’m mad
Those exciting plans turned into the first Dungworth Singing Weekend. It was magical. Great singers, fine singing in the Royal and thought provoking discussion and teaching. Serious, but far from po-faced. That was Soundpost’s first event and they’ve been delivering enjoyable and stimulating events ever since.
Arts Emergency makes badges which read “sometimes if you want something to exist you have to make it yourself”. The beauty of Soundpost is that I can make something else I want to exist because Fay, Jon, Sam, Andy and co. have already made Soundpost and it’s wonderful.
There is a new folk revival going on and organizations like Soundpost are welding the best of the old and the new together and building something which will last; a community of musicians and singers, supporting each other and welcoming the world in to a music that can seem cliquey and stand-offish to newcomers. More power to their collective elbow.
.zshenv
MacOS comes with a tool called path_helper. It’s great. It sets up more or less sane PATH and MANPATH environment variables based on the contents of /etc/paths.d and /etc/manpaths.d respectively. This means that the install step of an application that includes some command line tools that really shouldn’t be dropped in the existing /usr/bin can keep those tools in, say, /opt/toolname/bin and drop a file in /etc/paths.d, and /opt/toolname/bin will appear on everyone’s default path. Very handy.
The default shell rc files that get run when a shell starts up call path_helper and bam, you have a sane path.
Well… yeah… except in zsh. There are two global zsh startup files in /etc: /etc/zshenv and /etc/zshrc. When zsh starts up it sources a bunch of files to set up the environment. At the very least, it uses /etc/zshenv and $HOME/.zshenv. If the shell is interactive, it also uses /etc/zshrc and $HOME/.zshrc, but it always loads zshenv and ~/.zshenv.
So… like a good zsh user, my PATH and other environment variables are set up in my ~/.zshenv. As well as all the system paths, there’s a bunch of stuff in my home directory that wants to be on the path as well, and .zshenv is the place to do that.
But, every time I upgrade MacOS I forget that whoever is responsible for the global zsh startup files at Apple is, not to put too fine a point on it, a fucking idiot. Because as well as running path_helper in /etc/zshenv, where it makes sense, this upstanding member of society sees fit to run it again in /etc/zshrc. Which would be fine if it weren’t for the fact that the very first thing path_helper does is nuke the existing path. Which throws away all my path customisations from ~/.zshenv and leaves me wondering what on earth just happened until I remember the zsh startup sequence and fix /etc/zshrc by deleting the path_helper stanza.
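For reference, the stanza I end up deleting is just the stock path_helper invocation. From memory it looks something like the following, but check your own /etc/zshrc rather than taking my word for it:
# In /etc/zshrc. Running path_helper again here (it has already run in
# /etc/zshenv) is what clobbers the PATH set up by ~/.zshenv.
if [ -x /usr/libexec/path_helper ]; then
	eval `/usr/libexec/path_helper -s`
fi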
If you’re being driven up the wall by this, then this is what my /etc/zshrc looks like now:
# Correctly display UTF-8 with combining characters.
if [ "$TERM_PROGRAM" = "Apple_Terminal" ]; then
setopt combiningchars
fi
disable log
I hope you find it useful (and even if you don’t, I’m sure I’ll find it useful come the next upgrade).
For some reason now, when my laptop wakes from sleep, my ssh-agent has forgotten some of my authentication keys that were added via the system keychain. After a while, I got bored of typing ssh-add -A every time I failed to ssh into a host and installed sleepwatcher via Homebrew like so:
$ brew install sleepwatcher
$ brew services start sleepwatcher
$ cat <<EOF > ~/.wakeup
#!/bin/bash
ssh-add -A
EOF
$ chmod +x ~/.wakeup
which fixed that.
Those are the only two things that I’ve noticed so far in the move to Sierra that have annoyed me enough to find out how to fix them. If any more command line niggles crop up, I’ll update this page with any fixes I find.
Doesn’t time fly when there’s nothing much you can be doing? Right now we’re pretty much where we were back in week three, except that Dad is slowly progressing on building the lifting frame which we’re going to use to reassemble the deck oven in my garage so I can finish the process of either getting the rust off the soles, or replacing them with ‘stone’. Apparently steel has been ordered so it’s just a matter of time.
The final bakehouse site is still lined up and it should be possible to sort out the 3-phase and plumbing in there quickly once we’re definitely ready to pull the trigger on moving the oven in. So, yes… right now we’re twiddling our fingers.
Meanwhile, the first Singing Together Night at the Doncaster Brewery & Tap went really well. We sang a lot of songs and Ian, the landlord, sold a lot more beer than he usually does on a Thursday night, so we’re chalking that up as a success. If you find yourself in or near Doncaster next Thursday, we’re presenting the fabulous Cath and Phil Tyler. You should come and listen.
It’s the first week in August. As soon as I finished my programming work, I drove down to Sidmouth for a couple of days of folkie immersion.
There’s not all that much to say. I didn’t get to bed before 2am each night. I sang a bunch of songs in good company. I listened to a bunch more and spent an evening singing old chestnuts, meeting house songs and school hymns with Sheila Kay Adams, Carol Anderson1 and a brilliant crowd in the garden of the Swan. Magic.
Tune in next week for more bakehouse related bakehouse diary.
An amazing Scots fiddler and singer who doesn’t have a website. If you want to know what she sounded like 6 years ago, seek out the fabulously titled Single track road trip. She’s only getting better though. ↩︎
Not much progress on the “getting the oven ready” front this week. I sheared another machine screw!
https://www.instagram.com/p/BIUhqezAacy/
At least this one was on the retaining bar that holds the oven soles in place; I’d rather not have sheared it, but since I’m going to be drilling stuff out anyway, it’s not the end of the world. Basically I’m blocked on doing much more with the oven ’til I’ve moved the decks and got them the right way up rather than on their sides because I don’t want to undo the last screw in the retaining bar and cope with a couple of large lumps of metal and (possibly) the insulation and heating elements falling out. Once the ovens are the right way up, gravity will be my friend.
So, I’m going to talk about the software side of things.
Last year, I went up to Bread Matters in the Scottish Borders and did their excellent Baking for a Living four day course. I can’t recommend it highly enough; if you’re remotely interested in taking the next step in your baking and going pro, then this will give you a solid grounding in stuff you might not have thought about.
One of the exercises was to work out the costings for a product. If you follow the rule of thumb that’s often bandied about for restaurants and catering (multiply the ingredient costs by 3 and you’ve got the final price) you are almost certain to go bust. The working formula is more along the lines of:
$$\begin{aligned}
cost_{raw} & = cost_{ingredients} + cost_{labour} + cost_{packaging} + cost_{transport} \\
Price_{wholesale} & = \frac{cost_{raw}}{1 - gross\_margin} \\
Price_{RRP} & = (1 + retail\_markup) \times Price_{wholesale}
\end{aligned}$$
With rules of thumb like $cost_{transport} = 0.1 \times Price_{wholesale}$1 and $cost_{labour} \approx 1.1 \times cost_{ingredients}$, which can be replaced with $cost_{labour} = hourly\_rate \div production\_rate$ once you know how many loaves you can make with an hour of work.2
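To make that concrete, here’s a worked example with made-up numbers (they’re purely illustrative, not my actual costings): say a loaf uses £0.50 of ingredients, so the labour rule of thumb gives $cost_{labour} \approx 1.1 \times 0.50 = £0.55$, packaging costs £0.05, and I want a 60% gross margin on top of the 10% transport allowance. Using the solved-out form from footnote 1:
$$Price_{wholesale} = \frac{0.50 + 0.55 + 0.05}{1 - 0.1 - 0.6} = \frac{1.10}{0.30} \approx £3.67$$
and with, say, a 40% retail markup, $Price_{RRP} \approx 1.4 \times 3.67 \approx £5.13$.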
Working all this stuff out is a complete pain in the arse. It’s not the sort of thing you want to be doing every time you change a product formula, or you negotiate a wholesale rate with your miller, or the pound goes through the floor and suddenly you’re paying 30% more for your dried fruit.
On the course, after we’d done the calculations by hand, for one product, we were given a spreadsheet, which was great so long as you were plugging in new numbers for ingredient costs or your hourly rate. But it’s in the nature of spreadsheets that, once you want to start adding a new product, there was an awful lot of copying and pasting of formulae and general faffing about.
If there’s one thing my time as a programmer has taught me, it’s that faffing about is the sort of thing you should really be leaving to the computer.
So, once I got back from the course (and on and off over the nine months since), I set about converting the spreadsheets to a relational database. The great thing about using a database for this sort of thing is that, if you set up your tables and views correctly, you only have to type formulae like the one for the wholesale price in once and after that the whole thing becomes data driven.
If you’re happy with absolutely zero documentation and a simple SQL command line, then you can take a look at my progress by looking at my bakehouse project on Github.
If you have a collection of bread books, you’ll find that the ones aimed at the home baker often have a slightly different dough recipe for every kind of loaf you might want to make. Which is fine when, in the course of the day, you’ll be baking one loaf (or batch of buns or whatever). When you’re baking commercially, this doesn’t really fly. You may be making 10 or more different products, and if you can make three different products (say large loaves, small loaves, and baps) from a single huge lump of dough, then it makes sense to do that. And if you can carve off a big lump of that dough before the bulk ferment and add some dried fruit so you can make teacakes as well, then that makes sense too. In fact, if you look carefully at the proportions in the average domestic bread cookery book, you can probably pull out a few common doughs which are then jazzed up with other ingredients to make most of the things in the book.
A commercial formula for a product will almost certainly be multi-stage. First you take some starter from yesterday and feed it with flour and water to make a production leaven, then after four hours you take some of the production leaven, flour, water and salt and make up your basic doughs (white, wholemeal, granary) which you’ll leave for their bulk ferment before scaling, shaping and leaving to prove (overnight in the retarder unless you want to be baking at 3 in the morning) so you can come in in the morning, fire up the ovens, bake off the buns, then the loaves, then the fancy breads and ship everything out to your customers. Fancier products might involve more fiddling about. Stollen is a classic ‘complicated’ multi-stage recipe which involves making up:
- a festive ferment of fresh yeast, milk, sugar and a little white flour;
- marzipan from ground almonds, sugar and egg;
- the fruits (sultanas, raisins and candied peel), soaked in rum.
Then you make up your Stollen dough with the ferment, eggs, more flour, more milk and work in some butter too, which gives you a lovely soft rich dough. After an hour or so, you mix in the fruit, let it relax, then scale up the dough and the marzipan and make up your stollen, prove, bake, slather with melted butter and try to resist scoffing all at once.
If I were simply after working out the ingredient cost per loaf, I could represent this recipe in the database with a simple table like so:
product | ingredient | weight(kg) |
---|---|---|
Stollen | Ground almonds | 0.060 |
Stollen | Caster Sugar | 0.020 |
Stollen | Icing Sugar | 0.020 |
Stollen | Whole egg | 0.020 |
Stollen | Sugar | 0.005 |
Stollen | Fresh yeast | 0.005 |
Stollen | Milk | 0.060 |
Stollen | White flour | 0.050 |
Stollen | Sugar | 0.030 |
Stollen | White flour | 0.110 |
Stollen | Whole egg | 0.050 |
Stollen | Salted butter | 0.050 |
Stollen | Sultanas | 0.070 |
Stollen | Raisins | 0.060 |
Stollen | Candied mixed peel | 0.050 |
Stollen | Rum | 0.020 |
but that throws away a bunch of information and is a surprising pain in the arse to type. It’s also very hard to look at that table and deduce anything about what needs doing when in the stollen making process. So we write the table a little more naturally.
The recipe_ingredient table:
recipe | ingredient | weight(kg) |
---|---|---|
Stollen | Stollen Dough | 0.560 |
Stollen | Marzipan | 0.120 |
Stollen | Beaten Egg | 0.010 |
Stollen | Butter | 0.050 |
Stollen | Icing Sugar | 0.010 |
Stollen Dough | Stollen Fruits | 0.200 |
Stollen Dough | Festive Bread Dough | 0.360 |
Stollen Fruits | Sultanas | 0.070 |
Stollen Fruits | Raisins | 0.060 |
Stollen Fruits | Candied Mixed Peel | 0.050 |
Stollen Fruits | Rum | 0.020 |
Marzipan | Ground Almonds | 0.060 |
… | … | … |
Festive Bread Dough | White Flour | 0.110 |
Festive Bread Dough | Festive Ferment | 0.120 |
… | … | … |
Festive Ferment | Sugar | 0.005 |
Festive Ferment | Fresh Yeast | 0.005 |
Festive Ferment | Milk | 0.060 |
Festive Ferment | White Flour | 0.050 |
The product table:
If we want to calculate an accurate price, we also need to capture the scale weight of a ‘piece’ of product, the number of pieces we can make per hour of labour, and (think of batched buns, for instance) the number of pieces that go to make up a product. This gives us a table like:
product | pieces_per_hour | scale_weight(kg) | pieces |
---|---|---|---|
Stollen | 16 | 0.680 | 1 |
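To make the queries that follow concrete, here’s roughly what those tables might look like as Postgres DDL. This is a simplified sketch for illustration only; the actual schema in the bakehouse repo is structured differently (see footnote 3), and ingredient_cost is a table I’m inventing here to hold raw ingredient prices.
-- Simplified sketch only, not the real schema.
CREATE TABLE ingredient_cost (
    ingredient   text PRIMARY KEY,       -- raw ingredients only
    price_per_kg numeric NOT NULL
);

CREATE TABLE recipe_ingredient (
    recipe     text NOT NULL,
    ingredient text NOT NULL,            -- a raw ingredient or another recipe
    weight     numeric NOT NULL,         -- kg of this component per batch
    PRIMARY KEY (recipe, ingredient)
);

CREATE TABLE product (
    product         text PRIMARY KEY,    -- also appears as a recipe
    pieces_per_hour integer NOT NULL,
    scale_weight    numeric NOT NULL,    -- kg of dough per piece
    pieces          integer NOT NULL     -- pieces that make up one saleable item
);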
So, now we just need to calculate the cost of all the ingredients in a loaf, which we can use as the inputs to the formulae above. At this point, I’m just grateful to be a programmer because the SQL query that does that calculation is… complicated. The spreadsheet version of the calculation copes with the arbitrarily nested nature of a recipe by linking formulae together by hand (which is part of why the spreadsheet version is hard to extend); the Postgres version ends up with a rather icky recursive query which calculates the per kilo price of each product and intermediate recipe by starting with a table of just the raw ingredient unit prices and, at each iteration, calculating the unit price of all the recipes which can be made using the ingredients and intermediates we already know the price of. If you want the gory details, check out unit_cost.sql.3
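To give a flavour of the approach, here’s a cut-down illustration against the sketch tables above. This is not unit_cost.sql, and it assumes a batch weighs the sum of its components; it flattens every recipe down to its raw ingredients with a recursive CTE and only aggregates at the end:
-- Cost per kilo of every recipe and intermediate, by flattening nested
-- recipes down to the raw ingredients they're ultimately made from.
WITH RECURSIVE
fraction AS (
    -- what share of a recipe's total weight each component contributes
    SELECT recipe,
           ingredient,
           weight / SUM(weight) OVER (PARTITION BY recipe) AS share
    FROM   recipe_ingredient
),
flattened AS (
    SELECT recipe, ingredient, share
    FROM   fraction
    UNION ALL
    -- expand any component that is itself a recipe, multiplying shares down
    SELECT f.recipe, c.ingredient, f.share * c.share
    FROM   flattened f
    JOIN   fraction c ON c.recipe = f.ingredient
)
SELECT fl.recipe,
       SUM(fl.share * ic.price_per_kg) AS cost_per_kg
FROM   flattened fl
JOIN   ingredient_cost ic ON ic.ingredient = fl.ingredient  -- raw ingredients only
GROUP  BY fl.recipe
ORDER  BY fl.recipe;
Multiply cost_per_kg by a product’s scale_weight and pieces and you have the ingredient cost that feeds the pricing formulae above.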
So, that’s the price breakdown sorted; now I have some numbers to plug into a business plan and, indeed, to stick on a pricelist. Huzzah. What’s next?
Here’s a question I want the software to answer: say I arrive at the bakehouse to start the day’s work. I’ve got orders for a couple of dozen large sourdough loaves, a dozen large wholemeal loaves, four dozen burger buns and a pile of cheese straws. What do I do first?
This is where the order and production_sheet tables come in. The order sheet is pretty straightforward: two columns, product and quantity (I might get fancy and add a third column for client, but that’s for later). The production sheet less so.
Right now I know how to make a view which has recipe, ingredient and quantity columns; the thing that’s currently puzzling me is how to add a ’time’ column so I can have a view like:
time | product | ingredient | quantity |
---|---|---|---|
t-15h | 100% Sponge | White flour | 5kg |
t-15h | 100% Sponge | Water | 5kg |
t-15h | 100% Sponge | Fresh Yeast | 25g |
t-3h | Basic White Dough | 100% Sponge | 10.025kg |
t-3h | Basic White Dough | White Flour | 5kg |
t-3h | Basic White Dough | Salt | 200g |
t-3h | Basic White Dough | Water | 1.6kg |
t-1h30m | Scale | 18xLarge White | 800g |
t-1h30m | Scale | 6xSmall White | 400g |
t | Bake | Large White | |
t | Bake | Small White | |
t+40m | Pull | Small White | |
t+50m | Pull | Large White |
That needs a better representation of a recipe in a database. One which captures details of how long each step takes and what resources are needed while it’s taking place. Watch this space.
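I haven’t settled on what that representation should be yet, but as a sketch of the direction I’m thinking in (table and column names invented here for illustration; nothing like this exists in the repo yet):
-- Hypothetical sketch of a step-aware recipe representation.
CREATE TABLE recipe_step (
    recipe       text    NOT NULL,
    step_no      integer NOT NULL,
    description  text    NOT NULL,   -- 'mix sponge', 'scale and shape', 'bake', ...
    offset_mins  integer NOT NULL,   -- start time relative to the bake; negative = before
    active_mins  integer NOT NULL,   -- hands-on labour the step needs
    elapsed_mins integer NOT NULL,   -- wall-clock time before the next step can start
    resource     text,               -- 'mixer', 'retarder', 'oven', or NULL for bench work
    PRIMARY KEY (recipe, step_no)
);
Joining something like that to the order sheet and summing quantities per recipe and time offset ought to get me most of the way to the production view above.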
I’d really like to produce a schedule for multiple potentially overlapping products being prepped at once and constrained by available labour (only one activity can be carried out at once because I’m my only employee) and retarder and oven space. Unless I’m missing something, that’s a proper NP-complete problem and that’ll have me hitting Knuth’s preprints on SAT solvers. Which will be fun.
Most of the people who can help me getting the oven on its feet are going to be away next week, so I doubt there’ll be much progress on oven commissioning, but I hope I’ll be able to keep making progress on the bakehouse software.
You will note that the transport cost is defined as a fraction of the wholesale price, but the wholesale price is defined in terms of the raw cost of the product, which includes the transport cost. At this point, it’s good to remember your algebra. Substitute $cost_{transport} = transport\_allowance\_rate \times Price_{wholesale}$ into the wholesale price formula and, after a certain amount of fiddling, it’s possible to define the wholesale price (and hence the transport cost) solely in terms of the other input costs and the desired markup. It comes out as
$$\frac{\sum\nolimits_{i \in \lbrace ingredients, labour, packaging \rbrace} cost_{i}}{1 - transport\_allowance\_rate - gross\_margin}$$
which can get entertainingly huge as $transport\_allowance\_rate + gross\_margin$ approaches 1. So we make sure they don’t. ↩︎
Which can be a bit of a pain in the arse to calculate. You should only really count the time you spend actively working on the components of a product and not the time you spend waiting around for dough to prove or bake or whatever. It’s worth doing though. Accurate numbers matter. ↩︎
Bear in mind that the table structure in the app isn’t quite the same as the tables described here. ↩︎
Almost a year ago now, I hired a tail lift van and hared off down to the Isle of Wight to fetch my lovely ’new’ deck oven.
https://www.instagram.com/p/BIKWHSogUz2/?taken-by=theloafery
We manhandled the bits off the van with a pallet truck, put ’em up on blocks in the garage and left them to their own devices while the folk at the market dithered over whether I’d be allowed to install it on a market stall (I got the answer just over a month ago; it was “No”, but that’s probably good news).
What I should have been doing in this time was checking out the wiring and maybe trying to test one or two of the decks.
But now I’ve got a bakehouse site lined up, which means there’s no time like the present…
The bread I like best isn’t baked in a tin or on a baking tray. Instead it’s turned out of a proving basket onto a peel and then slid onto the sole of a hot oven. The end result is a lovely crusty loaf, crisp on the bottom and with a lovely variation in the crust where the dough is slashed just before it goes into the oven. Something like this.
https://www.instagram.com/p/7hdnWbiHiD/?taken-by=theloafery
When I bake these at home, I put a big lump of stone into the oven before I turn it on and even when the oven’s up to heat I’ll give it a while before putting the dough in to get the stone itself up to heat too. It really does make a difference to the end result.
Traditionally, if you want to bake on the sole of the oven, you get ceramic or stone soles, but I used to bake on the sole of an Aga and I know that a good heavy metal sole will do the job just as well. Which is just as well because the soles in the deck oven are either steel or cast iron.
There’s a catch though.
https://www.instagram.com/p/BIKTottAW9K/?taken-by=theloafery
Metal you want to cook on shouldn’t be that colour.
So, my current project is to get the sole plates out of at least one of the decks and see how well they clean up (either with an angle grinder or possibly something more lazily chemical involving vinegar or the like). This is proving problematic.
In theory, the sole plates are held in place by a single retaining bar; I just have to unscrew that and bingo, I can get the plates out and get on with cleaning them.
In theory.
This is a two part job. Before I can get at the screws that hold the retaining bar in place, I first have to undo the brass screws that hold the lovely polished stainless steel fascia that forms the front of the oven.
Those brass screws have been in place for a long time. So far I’ve successfully unscrewed six screws and sheared off the heads of six more. Joy. I see fun with left-handed drill bits in my future.
https://www.instagram.com/p/BIKUgLKA2X9/?taken-by=theloafery
Then there’s the machine screws that hold the retaining bars in place. These are proving to be right little sods. I’ve been dosing them with WD-40 “Fast release penetrant spray” for ages now and they’re still refusing to shift. A friend on Twitter has recommended another product, Plus Gas, which I’ll be trying as soon as I can get hold of some, but otherwise… well, at least I already needed to get a left-handed drill to cope with the brass screws.
https://www.instagram.com/p/BIKU5uZgfrs/?taken-by=theloafery
Well, yes, I am a programmer. But I’m also suffering from RSI (which makes programming painful), low level depression, and a general sense of “what’s the bloody point?” about programming as a career.1 Making a real product that people really need in the real world is really appealing to me. Really. And doing a job that will force me to take some exercise has its charms too.
Also, if you’re going to have a mid-life crisis, at least get something tasty out of it.
I’m also not planning to leave programming entirely. I’m unhappy with the idea of the professional programmer, but I’m not unhappy about programming as something embedded in another job. Being able to write a program to make your life easier is something we should all be able to do. There’s plenty of tedious arithmetic involved in working out how much of the various doughs, soakers and pre-ferments are needed for a given day’s bake. Work that I can eliminate with a bit of programming know-how.
A deck oven is the weapon of choice for your average artisan baker. It’s a stack of independent boxes of hot. Mine has 5 decks, each of which can hold 2 British standard baking trays, which are 30″×18″.2
If you’re used to domestic oven dimensions, you might be thinking that a stack of 5 ovens will be impossibly tall, but since we’re never going to have to roast a turkey in the deck oven, each deck needs to be no taller than the tallest loaf of bread plus a little headroom, so the oven volume is 30″ deep, 36″ wide and only about 8″ tall. My oven also has pipes for feeding steam into the chamber, which some bakers swear by, but I don’t have any steam generation fitted yet and I’m not sure I’m going to.
Why yes, I will have more to say about this in a later blog post. Thanks for asking. ↩︎
Ah, standards, don’t you love them. So many to choose from.
So, the standard baking sheet is thirty inches by eighteen. Unless you’re an American, when it’s twenty six inches by eighteen. Or from continental Europe, when it’s (these days at least) sixty centimetres by forty, unless you’re Portuguese because the Portuguese (apparently) still use the same size as us Brits. I have no idea why. Maybe it’s got something to do with St George.
The inch sizes don’t really play nicely with modern domestic oven sizes either. I have some US-sized half sheet trays (lovely solid things, from http://bakerybits.co.uk/, highly recommended) that have to be slightly finessed to fit into my Miele oven; they clash with the rails but fit in happily once the bake stone is in place. I’ve just checked, and I think Aga runners take a US half-sheet. It’s a minefield I tell you. ↩︎
If you’re reading this, I finally got my act together and shifted this blog over from Publify (Née Typo) to a static site generated by Hugo.
I wish I’d done it earlier.
I also wish I’d not decided that Textile was the right lightweight markup language for me back in 2003 when I started this blog, and that I’d not used platform-specific techniques for code highlighting. Converting all that to Github Flavoured Markdown was a bear, massively helped by pandoc and some very hacky text munging in Perl. Some stuff is still looking very ugly though; I chopped and changed my code formatting over the years, so as time and inclination allows, I’ll try and get stuff fixed by hand.
The straw that broke this camel’s back was when I updated Publify and Ruby and spent the next few hours wrestling with Apache and Passenger trying to get the site back up. When the error messages told me that things weren’t working because I didn’t have a javascript runtime installed on my host, I knew that the time had come to just go with a system of files in a directory, served up by a simple minded webserver.
This first pass at migration has eliminated all the comments. They’re still in my database, but I’ve not worked out how to get them imported into the site’s structure yet. Again, as time and inclination allows, I’ll try and port them over. Who knows, I might even add Disqus comments to the site and let someone else deal with the pain of spam management.
I’m making no promises about restarting blogging, I’m afraid. The shift to static generation has been an itch I’ve needed to scratch and, as I was building another site with Hugo anyway, now seemed like the opportune moment to move things over.
Something strange happened at the end of the war. In 1914, only around 30% of the adult population had the vote. By February 1918, a general election was years overdue. The Russians had killed the Tsar and were embracing communism; the women’s suffrage movement was threatening to start up again; and millions of returning soldiers — men used to violence by now — would have no say in how they would be governed.
Parliament read the tea leaves and passed the Representation of the People Act, extending the franchise to all men over 21 and many women over 30. This tripled the size of the electorate, 43% of which was now female (if they’d allowed younger women the Vote, then women would have had a clear majority because the war had killed so many men. Voting ages were equalised in 1928).
In the election, not much changed. The Tories won the most seats with a new class of MP, mostly coming from trade and commerce. Labour’s share of the vote increased dramatically, but the nature of the electoral system meant they only won 57 seats (fewer than Sinn Féin, who basically won Ireland). The Liberals came third in the popular vote (second in seats, first past the post really sucks) but Lloyd George remained prime minister, promising a land “fit for heroes”.
He didn’t deliver. The Irish had to fight for their independence and won it in 1921 (ooh look, another stupid war) and in the 1922 election Labour took over from the Liberals as the second party of British politics.
Without the first world war, I wonder how long it would have been before parliament was shamed into extending the franchise to all adults. The expanded electorate may not have got the government it deserved, but the Vote was won.
Seventy years ago, the next big war ended. This time the returning soldiery weren’t going to be fobbed off with fine words and broken promises. Young men came home from defeating fascism in Europe and saw a sitting government still dominated by the party that had blundered into the war in the first place, still promising more of the same. They heard Labour’s promises of full employment, a National Health Service, a cradle to grave welfare state and a compelling vision of the future. And they voted Labour. Oh, how they voted Labour.
Labour won the kind of majority that politicians dream about and went straight to work. Attlee’s government nationalised roughly 20% of the economy; built social housing and encouraged the growth of new towns; introduced national insurance, unemployment benefit and the family allowance; expanded on the universal free education introduced with the Education Act of 1944; and created our National Health Service and what came to be known as the “Postwar Consensus”.
In five years.
In the face of austerity that made our current conditions seem like the lap of luxury.
They didn’t just deliver homes, health and education. They found money for the Arts Council too. Because once you’ve dealt with the worst that physical poverty can bring, shouldn’t you look to do something about poverty of aspiration too?
Few revolutions are so successful. No others have achieved so much without violence. A generation came back from war, said to itself, “We deserve better than this” and did something about it. If you’ve got a grandparent living who voted in that election, go and thank them. Stopping Hitler was a towering achievement, but our grandparents managed to surpass even that.
The Tories hated it. Every time they’ve had power since they’ve chipped away at the Postwar Consensus. They’ve had to be sneaky about it though. Once you’ve won the right to fall ill without fearing bankruptcy; once your children are guaranteed a decent education; once you have a roof over your head that isn’t two pay cheques away from being taken away… Well, you get attached to such things.
The 1944 education act was a Tory act, and rather than replace the old system, it added state schools to the mix. The rich were able to opt out and keep their children in the public school system. The public schools and their associated ‘old boy’ networks survived. Etonians don’t just learn Latin and Greek and the art of fagging; they learn that glib smoothness, the art of masking base and selfish motives behind a veneer of affability. They learn to help their friends and the Devil take the hindmost.
The thing about villains is, they think they’re heroes. They think there’s nothing nobler than helping a chum. They think the world is just. If you’re blessed with the kind of money that Cameron and Osborne inherited you’re going to convince yourself that you somehow deserve your wealth. And if you deserve your wealth, then it’s a small step to thinking that the poor deserve their poverty.
If the world is as it is because everyone deserves their station, then the welfare state is going to seem like the next best thing to evil. The state wants to take some of your money and use it to pay some loser’s rent? It wants to give a drunk a liver transplant? Disgusting! If those people really cared about keeping their home, they’d get a decent job — it’s not hard, just have a word with a friend. And the drunk has only himself to blame. They’ve made their bed and they should lie in it.
The real trick though, is convincing those who really are a pay cheque or two from disaster (which is pretty much anyone with a mortgage or in private rented accommodation when you stop and think about it) that the enemy is the poor bastard on benefits. Not the landlord who banks their housing benefit. Not the employer who doesn’t pay a living wage; who lets the taxpayer top up their employees’ pay packets. And certainly not the government which won’t let local authorities build new social housing to help reduce housing costs (which would pay for itself in short order).
This government has that down pat. They’ve used a financial crisis — one whose seeds were sown when Thatcher and Reagan deregulated the markets and fertilised by every bloody government since (there are no innocents in this fiasco) — as the excuse and are dismantling what was so hard won by our grandparents. A government that promised “No top down reorganisations of the NHS” is gutting it. The poor are being forced out of rich areas by the benefits cap and the bedroom tax. The young are… oh god, the young… the coalition seems to read A Modest Proposal as sound policy. When I went to university, my fees were fully paid (Thatcher had frozen maintenance grants, not that I’d’ve got one after means testing). My step-grandson is looking at a minimum debt of £27,000 — assuming he can live for nothing. If you’ve got the cash to get your kid the best education money can buy, you don’t want some bright lass from the local comprehensive competing with them for the plum jobs. Pull up the ladder Jack!
It doesn’t have to be like this. Ask yourself how it is that, in 1945, when the country was on the bones of its arse with precious few lines of credit and an industrial base battered by years of bombing we built a welfare state and a national health service that have lasted for seventy years? Ask how we could, at the same time, find the money to subsidise the Royal Opera House and Sadlers Wells and many other arts organisations? Ask how we could afford, as a country, to support our university students so they could spend their time concentrating on their degrees and the life of the university and not miring themselves in debt?
Ask how we can afford not to do those things now.
There is no excuse for what our government is doing to the poorest among us. Or for what it’s doing to the middle classes come to that. An underclass is a handy thing. They keep those on lower middle incomes so bloody scared of falling into poverty that they’ll put up with gross abuse just so they can hang on to what they have. Some guard their little portion with such jealousy that they will not just tolerate the abuse of the poor, they will be baying for blood.
It pains me to say this, but not everything the coalition has done is evil. And I don’t just mean Equal Marriage. Even Michael “Stopped Clock” Gove’s been right about something — the emphasis on learning to code rather than merely drive Powerpoint and Microsoft Word is a good thing. The gov.uk initiative is good news — anything which reduces the influence of KPMG, Capita, G4S and their cronies (and which employs so many of my more technical friends) can’t be bad. But a ‘good in parts’ government is still intolerable.
There’s an election due in 2015. 2015, the 70th anniversary of the Attlee revolution. It’s time to do it again. Vote. Vote progressive. Vote independent or green. Hold your nose and vote Liberal or Labour. Join a fucking party and work to change their outlook. Vote pragmatic. But, whatever you do, vote. Especially if you’re young. Politicians only care about keeping the people who vote happy — if you don’t vote, they’ll ignore you. If it makes some other part of their constituency happy, they’ll shit on you from a great height (though I think that may backfire yet — the thing about grandparents is, they tend to like their grandchildren and don’t like seeing them get the shitty end of the stick).
You could listen to Russell Brand and not vote ’cos it’s “irrelevant” — there’s a revolution coming! You could. But you’d be an idiot and you’d be waiting a long time. There’s been one progressive revolution that actually stuck in this country, and that was achieved by voting.
Demand the nationalisation of public goods; the Post Office, Rail, Water, Gas, Electricity. Encourage small businesses and making stuff. Build new public housing. Demand real transparency in markets and government. Fuck landlords. Fuck rentiers.
Change the world. Our grandparents did it seventy years ago. We deserve better. Let’s take a leaf out of their book and do it again.
There appear to be two camps in the Moose::Role world, busily arguing about whether the following code should emit a warning:
package Provider {
    use Moose::Role;
    sub foo { 'foo' };
    1;
}

package Consumer {
    use Moose;
    with 'Provider';
    sub foo { 'no, bar' }
    1;
}
One camp holds that the code should at least emit a warning and ideally blow up at compile time. The other camp (Moose as implemented), holds that it shouldn’t. The debate gets somewhat heated, people end up appealing to the Traits paper as if it were some kind of holy writ. What’s annoying is that the folk who appeal to that paper appear to have read a different paper from the one I remember reading. So I went and read it again, and here’s what it has to say about overriding methods got from traits:
Trait composition enjoys the flattening property. This property says that the semantics of a class defined using traits is exactly the same as that of a class constructed directly from all of the non-overridden methods of the traits. So, if class A is defined using trait T, and T defines methods a and b, then the semantics of A is the same as it would be if a and b were defined directly in the class A. Naturally, if the glue code of A defines a method b directly, then this b would override the method b obtained from T. Specifically, the flattening property implies that the keyword super has no special semantics for traits; it simply causes the method lookup to be started in the superclass of the class that uses the trait.
Another property of trait composition is that the composition order is irrelevant, and hence conflicting trait methods must be explicitly disambiguated (cf. section 3.5). Conflicts between methods defined in classes and methods defined by incorporated traits are resolved using the following two precedence rules.
- Class methods take precedence over trait methods.
- Trait methods take precedence over superclass methods. This follows from the flattening property, which states that trait methods behave as if they were defined in the class itself.
Which is pretty much as I remember, and strongly implies that Moose is right not to issue a warning.
The paper has more to say on overriding trait implementations in its section on ‘Evaluation against the identified problems’:
Method conflicts may be resolved within traits by explicitly selecting one of the conflicting methods, but more commonly conflicts are resolved in classes by overriding conflicts.
And (relevant to another argument around role composition that’s more or less current):
… sometimes a trait needs to access a conflicting feature, e.g., in order to resolve the conflict. These features are accessed by aliases, rather than by explicitly naming the trait that provides the desired feature. This leads to more robust trait hierarchies, since aliases remain outside the implementations of methods. Contrast this approach with multiple inheritance languages in which one must explicitly name the class that provides a method in order to resolve an ambiguity. The aliasing approach both avoids tangled class references in the source code, and eliminates code that is hard to understand and fragile with respect to change.
There are folk arguing for removing the aliasing support from Moose role composition, but I have to say that I find this argument compelling.
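For what it’s worth, here’s roughly what the aliasing approach looks like in Moose today. This is a minimal sketch reusing the Provider role from the snippet above; the consuming package and the wrapped method body are mine, invented for illustration:
package PoliteConsumer {
    use Moose;

    # Compose Provider, but pull its foo in under another name so this class's
    # own foo can delegate to it instead of silently shadowing it.
    with 'Provider' => {
        -alias    => { foo => 'provider_foo' },
        -excludes => ['foo'],
    };

    sub foo {
        my $self = shift;
        return 'wrapped: ' . $self->provider_foo;
    }
    1;
}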
When it comes down to it, referencing the traits paper is just argument from authority, which is one of the classic logical fallacies. However, if you are going to appeal to an authority, try to make sure that you’re not misrepresenting what that authority says. The original traits paper does not suggest that overriding a method got from a trait should come with a warning. On the contrary, it recommends overriding as the right way to resolve conflicts between multiple composed roles.
You may well think that this is problematic. You may be able to show examples where silent overriding has bit you on the arse. You may even have a good argument for introducing warnings. But “Because that’s how the traits paper says you should do it!” is a lousy argument, made doubly lousy by the fact that is precisely not what the traits paper says you should do.
Wow.
Seriously. Wow. He’s talking about programming the ACE, the ‘pilot’ version of which didn’t run its first program until 1950. And the Manchester ‘Baby’, the first stored program electronic computer, was more than a year away from running its first program. It sounds like it might be dreadfully speculative and either handwavy or as out there and daft as the usual crop of ‘futurologist’ type predictions.
As you can probably guess from the fact that I’m bothering to write this up, it was nothing of the sort. I suggest you nip off and read it for yourself. It won’t take you long and it’s well worth your time. Then come back here and find out if the same things struck you that struck me.
Here’s the sentence that brought me up short like a slap:
Computers always spend just as long writing numbers down and deciding what to do next as they do in actual multiplications, and it is just the same with the ACE.
I got to the end of the sentence before it clicked that back then a computer was a human being performing a computation. What we think of today as ‘a computer’ was what Turing called ’the ACE’ and back then it certainly deserved that definite article.
Then I read it again and recognised the deep truth of it. Back in Turing’s day, the ACE was planned to have a memory store made up of 5 foot tubes full of mercury acting as an acoustic delay line. Each tube could hold 1K bits and an acoustic pulse took 1 millisecond to get from one end of a tube to the other, so the average access time for a single bit of memory was around 500 microseconds. When it was finally built, it was the fastest computer in the world, running at the mighty speed of 1MHz. Nowadays we think that a cache miss that costs 200 processor cycles is bad news and our compilers and processors are designed to do everything in their power to avoid such disasters. In Turing’s day there were no caches, every time something was fetched from memory it cost 500 cycles. (Well, in 1947 that would be 500 cycles + a year and a half before there was a computer to fetch the memory from in the first place).
Curiously, the gold standard of high performance memory in Turing’s day was the same circuit as you’ll find in high speed SRAM today - the bistable flip flop - but done with valves and hope rather than by etching an arcane pattern on a bit of silicon.
Turing seems to have invented the idea of the subroutine. Admittedly it’s implicit in his implementation of a Universal Turing machine in On Computable Numbers…, but it’s explicitly described here. And, rather wonderfully, the pipedream of extensive code reuse is there in the computer science literature right from the start:
The instructions for the job would therefore consist of a considerable number taken off the shelf together with a few made up specially for the job in question.
There are several moments when reading the paper where I found myself thinking “Hang on, he means that literally rather than figuratively doesn’t he?” and this is one of them. When your code is embodied in punched Hollerith cards, a library is just that. Row upon row of shelves carefully indexed with reusable code stacked on them like so many books.
Elsewhere he says:
It will be seen that the possibilities as to what one may do are immense. One of our difficulties will be the maintainence of an appropriate discipline, so that we do not lose track of what we are doing. We shall need a number of efficient librarian types to keep us in order.
That’s my emphasis, and ain’t that the truth? I’m not sure that Turing would have foreseen that the nearest thing we have to ‘a number of efficient librarian types’ would turn out to be Google’s computers though. One wonders whether he’d be horrified or delighted.
Here he is, having painstakingly explained how the use of loops can reduce the size of a program:
It looks however as if we were in danger of getting stuck in this cycle and unable to get out. The solution of this difficulty involves another tactical idea, that of ‘descrimination’. ie. of deciding what to do next partly according to the results of the machine itself instead of according to data available to the programmer.
And there we have the nub of what makes computing so powerful and unpredictable. The behaviour of any program worth writing isn’t necessarily what you expect because it’s making decisions based on things you didn’t already know (if you already knew them, you wouldn’t have to compute them in the first place). This is why I’m optimistic about AI in the long run. I think that given that the behaviour of a single neuron is understandable and simulatable then, eventually we’ll manage to connect up enough virtual neurons and sensors that the emergent behaviour of those simulated neurons is as near to a ‘real’ consciousness as makes no odds. I’m far less convinced that we’re ever going to be able to upload our brains to silicon (or whatever the preferred computing substrate is by then). Whether we’ll able to communicate with such a consciousness is another question entirely, mind.
The masters are liable to get replaced because as soon as any technique becomes at all stereotyped it becomes possible to devise a system of instruction tables which will enable the electronic computer to do it for itself. It may happen however that the masters will refuse to do this. They may be unwilling to let their jobs be stolen from them in this way. In that case they would surround the whole of their work with mystery and make excuses, couched in well chosen gibberish, whenever any dangerous suggestions were made
Oh, did Turing nail it here. 1947 and he’s already foreseen ‘job security’ code. I’ve seen this kind of behaviour all the time and it drives me up the wall. What the peddlars of well chosen gibberish always fail to see is that, if you get it right, the computer ends up doing the boring parts of your work for you. And your time is then free to be spent on more interesting areas of the problem domain. Software is never finished, it’s always in a process of becoming. There’s a never ending supply of new problems and a small talent pool of people able to solve them; if you’re worth what you’re paid today then you’ll be worth it again tomorrow, no matter how much you’ve delegated today’s work to the computer. And tomorrow’s work will be more interesting too.
Automating the shitwork is what computers are for. It’s why I hate the thought of being stuck writing code with an editor that I can’t program. Why I love Perl projects like Moose and Moo. Why I’ll spend half a day trawling metacpan.org looking to see if the work has already been done (or mostly done - an 80/20 solution gets me to ‘interesting’ so much quicker).
Job security code makes me so bloody angry. There are precious few of us developers and so much work to be done. And we piss our time away on drudgery when we simply don’t have to. We have at our fingertips the most powerful and flexible tool that humanity has ever built, and we use it like a slide rule. Programming is hard. It demands creativity and discipline. It demands the ability to dig down until we really understand the problem domain and what our users and customers are trying to do and to communicate the tradeoffs that are there to be made - users don’t necessarily understand what’s hard, but they’re even less likely to understand what’s easy. But its very difficulty is what makes it so rewarding. It’s hard to beat the satisfaction of seeing a way to simplify a pile of repetitive code, or a neat way to carve a clean bit of testable behaviour off a ball of mud. Sure, the insight might entail a bunch of niggly code clean up to get things working the new way, but that’s the kind of drudgery I can live with. What I can’t stand is the equivalent of washing the bloody floor. Again. And again. And again. I’d rather be arguing with misogynists - at least there I might have a chance of changing something.
I’m not scared that I’m going to program myself out of a job. I’m more worried that I’m never going to be able to retire because as a society and a profession we’re doing such a monumentally piss poor job of educating the next generation of programmers and some of us seem to be doing a bang up job of being unthinkingly hostile to the 50% of the talent pool who are blessed with two X chromosomes. But that’s probably a rant for another day.
I watched the rugby yesterday. England vs Wales at Cardiff Arms Park. It was a great game of rugby - England were comprehensively outthought by a Welsh side with more experience where it counts, but by gum, they went down fighting to the very end. It’s going to be an interesting few years in the run up to the next World Cup.
While the game was going on, I found myself wondering why the crowd’s singing sounded so very good. It’s not a particularly Welsh thing (though Cwm Rhonda, Bread of Heaven and the whole Welsh crowd’s repertoire all have fabulous tunes). The Twickenham crowd getting behind Swing Low, Sweet Chariot sound pretty special too, even if I wish they still sang Jerusalem occasionally. How come a crowd of thousands, singing entirely ad lib with no carefully learned arrangements or conductor can sound so tight?
After all, if you took, say, 30 people and asked ’em to sing a song they all know, it would sound ropey as hell (unless they were a choir in disguise and had already practiced). Three or four together might sound good because, with that few of you, it’s much easier to listen to your fellow singers and adapt, but 30’s too many for that and without some kind of conductor or leader, things aren’t likely to sound all that great.
I think it’s a statistical thing. Once you get above a certain number of singers, even though everyone’s going to sing a bum note now and again, or indeed be completely out of tune and time with everyone else, the song is going to start to make itself heard. Because, though everyone is wrong in a different way, everyone is right in the same way. So the wrongs will start to cancel themselves out and be drowned by the ‘organised’ signal that is the song. And all those voices, reinforcing each other, make a mighty noise.
That’s how big data works too. Once you have sufficient data (and for some signals, sufficient is going to be massive) then the still small voices of whichever fraction of that data is saying the same thing will start to be amplified while the noise is dissipated.
Just ask an astrophotographer. I have a colleague who takes rather fine photographs of deep space objects that are, to the naked eye, nothing more than slightly fuzzy patches of space, only visible on the darkest of nights, but which, through the magic of stacked imaging, yield images of stunning depth and clarity.
If you’ve ever taken photographs with a digital camera at the kind of high ISO settings that Mike used to take this, you’ll be used to seeing horrible noisy images. But it turns out that, by leveraging the nature of the noise involved and the wonder of statistics, great photographs like this can be pulled out of noisy data. It works like this:
Any given pixel in a digital photograph is made up of three different components: the signal you actually want (the light from whatever you pointed the camera at), random noise that’s different on every exposure, and systematic error that’s baked into the camera and optics.
The job of an astrophotographer is to work out some way of extracting the signal at the expense of the noise. And to do that, they have one massive advantage compared to the landscape or portrait photographer. The stars and nebulae may be a very, very long way away. They may be very dim. But they don’t move. Once you’ve corrected for the motion of the earth, if you point your scope at the Horsehead Nebula today it’s going to look the same as it did yesterday and the day before that. Obviously, things do change, but, from the distance we’re looking, the change only happens on multi-hundred-year timescales. This constancy makes the astrophotographer’s task, if not easy, at least possible.
So… the stars (like the tune of Cwm Rhondda) are unchanging, but the noise is different with every exposure (that’s why it’s called noise, after all). Even if, on any given exposure, the noise is as strong as the signal, by taking lots and lots of exposures and then averaging them, the noise will get smeared away to black (or very dark grey) and the stars will emerge from the gloom. Sorry. The stars and the systematic error will emerge from the gloom. So, all that remains is to take a photograph of the systematic error and take that away from the image.
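To make the averaging step concrete, here’s a minimal sketch in C of what ‘stack and average’ means at the pixel level. The function name and layout are my own invention for illustration; real stacking software does a great deal more (alignment, outlier rejection and so on).

```c
/* A minimal sketch of stacking: average the same pixel across every
 * exposure.  The fixed signal reinforces itself; the random noise,
 * being different on every frame, averages away towards a flat grey. */
#include <stddef.h>
#include <stdint.h>

/* frames: n_frames exposures laid end to end, each `pixels` samples long.
 * out:    the stacked (averaged) image. */
void stack_frames(const uint16_t *frames, size_t n_frames, size_t pixels,
                  float *out)
{
    for (size_t p = 0; p < pixels; p++) {
        double sum = 0.0;
        for (size_t f = 0; f < n_frames; f++)
            sum += frames[f * pixels + p];
        /* the random component shrinks roughly as 1/sqrt(n_frames) */
        out[p] = (float)(sum / n_frames);
    }
}
```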
Huh? How does one take a photograph of systematic error? You do it by photographing a grey sheet. Or, because it’s probably easier, by throwing your telescope completely out of focus so that what you see is, to all intents and purposes, a grey sheet, and taking a photograph (or lots of photographs - you’ve still got noise to contend with…). Subtract the resulting error map from your stack of photographs and bingo, you’re left with an image that’s mostly signal. All that remains is to mess with the levels and curves and possibly to stack in a few false-colour images grabbed from the infrared or the hydrogen alpha line, where there’s lots of detail, and you’re on your way to a cracking photograph.
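And, continuing the sketch above (and following the description here rather than any particular astrophotography package), the subtraction step is nothing fancier than this:

```c
/* Subtract a stacked 'error map' (the averaged grey-sheet frames)
 * from the stacked image, leaving mostly signal.  Purely a sketch of
 * the idea described above. */
#include <stddef.h>

void subtract_error_map(const float *stacked, const float *error_map,
                        float *result, size_t pixels)
{
    for (size_t p = 0; p < pixels; p++) {
        float v = stacked[p] - error_map[p];
        result[p] = (v > 0.0f) ? v : 0.0f;  /* a pixel can't be darker than black */
    }
}
```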
Obviously, it’s not as easy as that - telescope mounts aren’t perfect, they drift, camera error changes over time. It’s bloody cold outside on a clear night. Sodium street lights play merry hell with the sky. And so on. But if you persevere, you end up with final images like the one above. That sort of thing’s not for me, but I’m very glad there are folk like Mike taking advantage of every clear night to help illuminate the awesome weirdness of our universe.
Noisy data is a pain, but we’re starting to realise that, if you have enough data and computing power, you can pull some amazing signals out of it. Whether that’s the sound of thousands of Welsh rugby fans combining to sound like the voice of God; an improbably clear photograph of something that happened thousands of years ago a very long way away; your email client getting the spam/ham classification guess right 99 times out of 100; or Google tracking flu epidemics by analysing searches - if you have enough data and the smarts to use it, you can do some amazing things.
Some of them are even worth doing.
However, the Maltron has a slightly different layout and I’m less gung ho about getting rid of the extra little finger keys, especially the left hand control and the shifts. The layout I’m starting from looks a little like this:
If you count that up, it’s 60 keys. There are 112 keys in the original Maltron layout arranged in 8 rows of 16 columns (which means that the total number of keys that could be accommodated in the matrix is 16 * 8, or 128). Because I was using only 60 keys, I could fit everything in an 8x8 matrix, which I wired up like this:
Once all the keys were wired up, I tacked ribbon cable in place to pick up the signals, crimped terminations on the other end, plugged in the Teensy++ and went searching for firmware.
Jesse had settled on the Humble Hacker Keyboard Firmware, but I found I couldn’t get on with it, and I ended up with the tmk firmware, if only because it’s the first one I managed to get working and I found the documentation a wee bit more comprehensible. However, it was driving me up the wall for a while because I simply couldn’t get it to recognise key presses as single keypresses. Keys would bounce, or wouldn’t register, and I couldn’t work out what was going on until I read this tip on the Teensy website. It turns out that electronics is more subtle than I realised.
I’m a software guy, so I thought that the obvious way of detecting a signal was to look for a positive voltage on your controller input pin: zero volts implies that the input bit is false (zero in boolean logic). It’s a little bit more complicated than that, though. It turns out that you get a clearer signal if you treat a pin being pulled to ground as true. To do this, we need some way of arranging for our input pin to sit at 5V when the switch is open and no current is flowing and, which (if you don’t know the trick) is the tricky part, at 0V when the switch is closed. Enter the pull-up resistor. Consider the following schematic:
All we need to know to understand what’s going on now is Ohm’s Law. Ohm’s Law is almost laughably simple, but once you’ve grasped it, understanding electronics gets much easier. The law states that the voltage (V) dropped across a load is equal to the product of the current flowing through it (I, in amps) and its resistance (R, in ohms): V = IR.
So, when the switch is open (as in the diagram), we can see that the voltage between P0 and ground is equal to 5V - IR, but no current is flowing which makes IR equal to zero and so P0 is at 5V. So… what happens when the switch is closed?
We know that the voltage between the power rail and ground is 5V, and we choose R to be so much larger than the resistance of the closed switch that the switch’s resistance might as well be zero. Virtually all of the 5V is therefore dropped across R, which means that the voltage at P0 is 0V, or as near as makes no odds, so we have our two logic levels. When the switch is open, the input pin is at 5V, which we call false, and when it’s closed the pin is pulled down to ground (0V), which we call true.
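To put some numbers on it (numbers I’ve picked purely for the sake of the example, not from any particular schematic): suppose R is 10kΩ. With the switch closed, Ohm’s Law says the current through R is I = V/R = 5/10,000, or 0.5mA, so essentially the whole 5V is dropped across the resistor and P0 sits within a whisker of 0V. With the switch open, I = 0, the drop across R is 0V, and P0 sits at the full 5V.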
So we recast our matrix driver: rather than applying a voltage to each column in turn and checking the row pins to see if they’re high, we set up pull-up resistors on the column pins and set all our rows to 5V. Then, to scan the matrix, we set a row to ground, check which columns go to ground too, and move on to the next row. The beauty of the Teensy is that we can do that without any extra hardware; we just set a couple of registers to appropriate values and we’re golden. Once I’d done this and rebuilt my debugging firmware, suddenly the debugging output was making more sense. No missed keys. No strange repeats. No keys I hadn’t touched suddenly deciding they’d been pressed. Lovely.
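If you’re curious what ‘a couple of registers’ looks like in practice, here’s a minimal sketch of the idea in C for an AVR chip like the one on the Teensy++. The pin assignments (columns on PORTB, rows on PORTD) are invented for illustration and don’t match my wiring; the tmk firmware does the same job with considerably more care.

```c
/* Active-low matrix scanning using the AVR's internal pull-ups.
 * Columns are inputs with pull-ups enabled; rows are outputs that
 * idle at 5V.  To scan, drop one row to ground at a time and see
 * which column pins get dragged down with it. */
#include <avr/io.h>
#include <stdint.h>

static void matrix_init(void)
{
    DDRB  = 0x00;   /* columns: all inputs...            */
    PORTB = 0xFF;   /* ...with internal pull-ups enabled */
    DDRD  = 0xFF;   /* rows: all outputs...              */
    PORTD = 0xFF;   /* ...idling at 5V                   */
}

/* Fills keys[] with one byte per row; a 1 bit means "key held down". */
static void matrix_scan(uint8_t keys[8])
{
    for (uint8_t row = 0; row < 8; row++) {
        PORTD = (uint8_t)~(1 << row);   /* pull just this row to ground    */
        __asm__ __volatile__("nop");    /* real firmware waits rather      */
                                        /* longer for the lines to settle  */
        keys[row] = (uint8_t)~PINB;     /* a low column means key pressed  */
    }
    PORTD = 0xFF;                       /* all rows back to 5V             */
}
```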
There’s another possible problem with keyswitches, called ‘bouncing’, that the firmware takes care of for me out of the box. In theory keyswitches are dead simple: you press the button and the circuit goes from not conducting to conducting with no shilly-shallying around. In practice… watching the voltage across even the best switch with a suitable oscilloscope is a lesson in the damnable imperfection of mechanical bits and pieces. The voltage is high. Then low. Then high. Then low. Then low. Then high. Then low and staying there. If you don’t take this into account in your driver you’re going to be registering far too many keypresses. Which is why any firmware worthy of the name has software debouncing (there are hardware debouncing solutions, but it’s much, much cheaper and more convenient to do the compensation in software), and the tmk firmware is no different. I’m sufficiently lazy that I’ve not really looked at how it works in any detail. Basically, if it detects a switch change it reads the same pin multiple times and, if the switch state is still changed at the end of that process, treats it as a real keyup or keydown event.
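In other words, something along these lines - and I should stress this is a sketch of the general idea rather than what tmk actually does, since I haven’t read that code closely:

```c
/* Software debouncing in miniature: only believe a change of state
 * once it has stayed changed across several consecutive reads. */
#include <stdbool.h>
#include <stdint.h>

#define DEBOUNCE_READS 5    /* an arbitrary but plausible number */

/* read_pin:   whatever reads the raw switch state.
 * last_state: the state we currently believe the switch is in. */
bool debounced_state(bool (*read_pin)(void), bool last_state)
{
    bool sample = read_pin();
    if (sample == last_state)
        return last_state;              /* nothing happening */

    for (uint8_t i = 0; i < DEBOUNCE_READS; i++) {
        /* a real driver would wait a millisecond or so between reads */
        if (read_pin() != sample)
            return last_state;          /* still bouncing - ignore it */
    }
    return sample;                      /* the change stuck: a real keyup/keydown */
}
```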
The tmk firmware is capable of substantially more than I’ve explored in any depth yet. I’m experimenting with what I want to do with the blue shift layer, and with distinguishing between taps, chording and other possibilities, by setting up my ‘blue shift’ keys to send the F12 and F13 keycodes, and I’m using KeyRemap4MacBook to do most of my messing with stuff, but once I’ve worked out what I want, I expect to push as much as possible into the firmware so I don’t have to duplicate a bunch of work (and indeed find appropriate driver software) when I want to use the keyboard on a Linux or, in extremis, Windows box.
The keyboard on your computer is (unless you’re a weirdo like me and you’ve got a Kinesis, Maltron or some other alternative input device) a living fossil. It takes the form it does because, back when typewriters were invented, the mechanical constraints of needing to have typebars strike paper forced the designers to stagger the rows of keys. The keyboard layout was (allegedly) designed not to slow typists down, but to try and avoid getting keys tangled up with each other during typing by keeping common key combinations apart (I’m not entirely convinced that this is true, given that ‘e’ is next to ‘r’ and ’t’ and ‘h’ are such near neighbours, but it’s pretty obvious that the qwerty layout isn’t really designed to minimise finger travel while touch typing (one wonders if they’d even thought of touch typing when they designed the thing)). There’s no real reason to remain tied to this design.
The Maltron case is designed so that there’s not much lateral movement of your fingers or wrist flexion while typing. Once you’ve learned the layout, it’s a delight to type with. But even with the radical case design and rejigged layout, the Maltron is a surprisingly conservative design. The microcontroller I’m using to drive the keyboard is a pretty capable 8-bit computer running at 16MHz, with 8K of RAM, 4K of EEPROM and 128K of flash memory to hold the program. Scanning an 8x8 matrix doesn’t come close to pushing it.
So, if we’re not tied to ‘one key one action’, what can we do?
Here’s what I’ve been experimenting with so far:
Distinguishing between tapping and press and then release. And between typing a key by itself and using it as a modifier. So at the moment I have:
If I tap (press and release quickly without pressing another key) the left blue shift, the firmware pretends I actually tapped the tab key. If I press the key and, while holding it down, hit another key, it sends the ‘blue shift’ symbol associated with that key, or just L_ALT + the original keycode if there’s no blue shift symbol. The right blue shift works similarly, but instead of sending tab on tap, it sends RET. If I press either key, hold it for a while and then release it, nothing gets sent.
The two keys on the far left (shift and control) send ESC when tapped.
I’ve also arranged things so that both ALT keys send R_ALT. I realise that might seem weird, but I’ve also configured my Emacs to treat R_ALT as a SUPER key, which lets me bind actions to blue-shifted keys. So when I’m in Emacs, all those keys without a blue symbol on them have more or less complicated actions associated with them. Others have used Teensy-based firmwares to have certain key combinations move the mouse pointer or trigger complex sequences of actions.
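For the curious, the tap/hold behaviour described above boils down to a small state machine, something like the sketch below. The keycode names, the send_key()/send_blue_shifted() helpers and the tap window are placeholders of my own, not tmk’s real API.

```c
/* A toy state machine for a 'blue shift' key: tap sends one keycode,
 * hold-plus-another-key sends the blue-shifted symbol, and hold then
 * release on its own sends nothing. */
#include <stdint.h>

#define TAP_TIMEOUT_MS 200          /* an invented tap window */

extern void send_key(uint8_t code);            /* placeholder helpers */
extern void send_blue_shifted(uint8_t code);   /* blue symbol or L_ALT + code */

enum blue_state { BLUE_UP, BLUE_PENDING, BLUE_HELD };

struct blue_key {
    enum blue_state state;
    uint16_t        pressed_at;
    uint8_t         tap_code;       /* TAB for the left key, RET for the right */
};

void on_blue_press(struct blue_key *k, uint16_t now_ms)
{
    k->state = BLUE_PENDING;
    k->pressed_at = now_ms;
}

void on_other_key(struct blue_key *k, uint8_t code)
{
    if (k->state == BLUE_UP) {
        send_key(code);             /* blue shift not involved at all */
    } else {
        k->state = BLUE_HELD;
        send_blue_shifted(code);    /* held down: act as a modifier */
    }
}

void on_blue_release(struct blue_key *k, uint16_t now_ms)
{
    if (k->state == BLUE_PENDING &&
        (uint16_t)(now_ms - k->pressed_at) < TAP_TIMEOUT_MS)
        send_key(k->tap_code);      /* a quick tap: pretend it was TAB or RET */
    /* held for a while and released without another key: send nothing */
    k->state = BLUE_UP;
}
```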
I’ve also got enough pins spare on the Teensy (and enough holes in the case) that I’m seriously considering using hot glue to mount a few RGB LEDs behind some of the holes in the middle of the case so that, if I end up cooking up more keyboard layers, I can indicate the keyboard (and Emacs, perhaps?) state with blinkenlights. Because how can a project be complete if there aren’t blinkenlights?
Where next? I’m not sure. I’m still experimenting with the possibilities that open up once you realise that just because we’ve always simulated a mechanical typewriter there’s no reason to keep doing it. Hardware doesn’t have to be dumb.
And then there’s the fact that a sixty key layout in a case designed to hold over a hundred keys looks scruffy. Until I started hacking my keyboard I’d tended to think that a desktop 3d printer was, for me at least, a solution looking for a problem. But now I’m trying to work out how to build a better keyboard case… Well, I think I’ve found my problem.
One thing that worried me about both Jesse’s and Maltron’s wiring was the fiddly nature of the way they wired the diodes in. The Maltron wiring only had diodes on a few keys, but I was looking to experiment with some serious remapping and possibly chording layouts - hardwiring a limited set of modifier keys wasn’t in my plan.
Before I talk about how I solved that problem, I’d best explain what the problem is. Consider the average computer keyboard with 105 or so keys. How do you work out which keys have been pressed without needing 105 I/O pins on your microcontroller? You arrange things in a matrix. We’ll worry about the physical layout of the board later, but here’s a schematic of a 5x5 key matrix which, with a bit of cunning, allows us to read which of 25 keys have been pressed with only 10 microcontroller pins.
Suppose we apply a signal to the first column. Then, by looking at the pins attached to the rows, we can tell which switches in the selected column have been pressed by checking which rows see the signal.
By cycling the signal from column to column rapidly, we can scan the whole matrix.
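In code, the scan is nothing more than a loop. The sketch below is deliberately abstract, with drive_column() and read_rows() standing in for whatever your particular microcontroller actually does to its I/O pins.

```c
/* Naive matrix scan: put the signal on one column at a time and read
 * all the row pins to see which switches in that column are closed. */
#include <stdint.h>

#define COLS 5
#define ROWS 5

extern void    drive_column(uint8_t col);   /* placeholder: energise one column */
extern uint8_t read_rows(void);             /* placeholder: one bit per row     */

void scan_matrix(uint8_t pressed[COLS])
{
    for (uint8_t col = 0; col < COLS; col++) {
        drive_column(col);
        pressed[col] = read_rows();  /* bit r set => key at (row r, this column) */
    }
}
```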
When two keys are pressed at the same time, we can spot them with this arrangement, but what happens when, say, three keys are pressed? Let’s press switches S1, S11 and S13 and find out. When we scan column one, all is well, we get the signal out on rows 1 and 3 as we expected, but when we scan column 3, which only has one key held down, we get a signal out on rows 1 and 3 as well. What’s going on?
Let’s trace the signal and see if we can work out how it gets to row one. The signal comes in on column 3, through S13 and onto row 3. But S11 is also on row 3, which means that the signal can flow through S11 onto column 1 and once it’s on column 1, it flows through S1, onto row 1 and confusion reigns.
Welcome to the wonderful world of ghosting.
If you’re happy to live with only detecting when two keys are pressed simultaneously, you can correct for this in software by ignoring ambiguous signals (or by ignoring all signals which show more than two keys, if you’re feeling lazy), and you can correct things for WASD gamers with some careful matrix layout to make sure that most of the common 3+ key combos aren’t ambiguous. Or you can spend a quid on a hundred 1N4148 diodes and invest some extra soldering time to wire things up like so:
« Signal: C3. Closed: C1R1, C1R3, C3R1. All Diodes »
With the diodes in place, the signal can’t go back through S1 and confuse things and your controller software can be much simpler.
The original Maltron wiring doesn’t put diodes on every key, just on all the modifier keys. Diodes may be cheap, but getting them wired into the matrix in the way Maltron did is a complete pain; doing it for over 100 keys was never going to be fun. But dammit, diodes on every key is just the Right Thing. There had to be a better way.
If you’ve ever spent time soldering stuff, you’ll be aware that humans have a hard time soldering due to an acute shortage of hands. Generally you need to hold one component in one hand, another component in the other, the soldering iron in your other other hand and the solder wire in the… hmm… we appear to have run out of hands. Which is where things like vises come in handy. That way you can hold one component in the vise and lock the wire or component you’re trying to attach to it by twisting, wrapping or bending things, leaving your hands free for the iron and solder. Wiring diodes up to the keyswitches, which have to be installed in the keyboard case while you’re doing it, is the very definition of fiddly.
However, if you go and read things like the MX keyswitch’s datasheet, it makes reference to switches with diodes fitted and when you look at the bottom of a switch you’ll see a diode symbol and four small holes.
Time to crack a switch open
There’s quite a bit going on in there, but right where the four holes are is the interesting bit. We can bend the legs of a 1N4148 diode and feed ’em through the holes, pop the lid back on and it fits, clean as a whistle. We’re onto something here.
I’d decided to go with a reduced layout based on the ‘blue shift’ layout that Jesse cooked up so, although it was fiddly, it didn’t take that long to pop open another 59 switches and put diodes in them. After that, it was easy enough to wrap the anode lead around one of the switch’s pins, solder it in place and clip off the excess wire:
Now I had 60 keyswitches with diodes installed, which I could pop into the keyboard case and wire up with magnet wire. I love solderable magnet wire. It’s magic stuff. Just wrap it tight round a pin, or the cathode lead of the diode, and it stays put while you solder it in place. I’m not going to pretend it was the work of moments, but it was pretty straightforward and I haven’t needed to resolder a single joint. There’s something very satisfying about watching capillary action suck molten solder into the joint. Physics is awesome.
Tune in next time to really learn about the importance of the pull-up resistor, how to roll your own (or someone else’s) keyboard firmware and some thoughts on where next.