Turing

Written by Piers Cawley


Today is Alan Turing’s 100th birthday. I’ve been thinking about him lately, in particular about a story that demonstrates the perils of working with genius.

The story goes that, when Turing was working with the Manchester Baby (the first stored-program computer ever built. Just.), a colleague wrote the first ever assembler, which would take (relatively) human-readable assembly language and turn it into the ones and zeroes of machine code that the machine could actually execute. He showed it to Turing, who blew up at him, ranting that using the computer for ‘trivial’ jobs like that was a massive waste of expensive computer time.

The problem with working with a genius, from the point of view of more ordinary mortals, is that the genius has only a very rough idea of what is actually easy, and what is only easy to them. From today’s vantage point, when computers are as freely available as they are now, the idea of not letting the computer do the shitwork for you seems utterly ludicrous - programmer time is more valuable than computer time.

What’s less obvious is that the same was true in Turing’s time (when there was precisely one computer) too. It only takes one programmer to make a mistake in translating from assembler to machine code, running a job that, for instance, gets stuck in an infinite loop, and you’ve probably wasted more computer time (and programmer time) than if you’d just run it through the assembler in the first place. Turing didn’t see that, because the process of translating from symbols to binary wasn’t something that he found particularly complicated. To him, what was needed wasn’t more and better tools, it was more and better Turings.

I’m not entirely sure that I believe the story (and I can’t remember where I heard it, so it may be a phantom of my memory). It certainly doesn’t chime with the Turing who was instrumental in mechanising the shitwork of finding the day’s rotor settings so that Enigma traffic could be cracked. The history of Bletchley Park is a story of building better and better machines to do the dully repetitive jobs that humans find so hard to stick to and which machines excel at.

The “Just write code without bugs in the first place” school of programming is alive and well today. It’s not my school though. I’m very much an exploratory coder. I like to have tests to show me the way and to keep me honest so that I don’t go breaking things when I change this bit here. I’ve customised my editing environment to help me as much as possible. I have a nasty habit of writing bare words that I should have quoted, so I have a keystroke that shoves the last word into quotes for me. Another keystroke will align all the broad arrows at the current level of nesting. I’ve got snippets set up that fill in standard boilerplate when I start a new class, a huge list of common spelling mistakes that get autocorrected while I’m not looking, and an expanding list of other little helpers that I need to write when I get around to it. Automation lets me go faster, make more and better mistakes and recover from them faster. I am a bear of little brain and I want all the help I can give myself.
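The arrow-aligning helper above is the sort of thing that is only a few lines of code in any scripting language. As a sketch (in Python rather than editor script, and assuming the “broad arrows” are `=>` fat commas; the function name is my own invention), the core of such a helper might look like this:

```python
import re

def align_fat_arrows(lines):
    """Align every '=>' in a block of lines to the same column.

    A sketch of the editor helper described above: find the longest
    left-hand side, then pad the text before each arrow to match.
    Lines without an arrow pass through untouched.
    """
    # Split each line at the first '=>', keeping (left, right) parts.
    parts = []
    for line in lines:
        m = re.search(r"\s*=>\s*", line)
        parts.append((line[: m.start()].rstrip(), line[m.end() :]) if m else None)

    # Column just past the longest left-hand side.
    width = max((len(p[0]) for p in parts if p), default=0)

    out = []
    for line, p in zip(lines, parts):
        if p:
            left, right = p
            out.append(f"{left:<{width}} => {right}")
        else:
            out.append(line)
    return out
```

Fed a block like `["foo => 1", "quux => 2"]`, it pads `foo` out so both arrows sit in the same column. A real editor binding would also need to work out where the current level of nesting begins and ends, which is the fiddly part this sketch skips.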

I’m not sure that Turing would be in the same camp as me.

Here’s the thing though - Turing was, definitely, a genius. But his paper “On Computable Numbers, with an Application to the Entscheidungsproblem” (the one that gave the world the Turing Machine) has a bug in it. In fact, it has two. The first is obvious enough that I spotted it when I read the paper for the first time. The second bug is rather more subtle (but still fixable. It’s okay, the field of computing is not built on sand).

I love that The Paper - the one that’s the theoretical basis for the modern world (first published 75 years ago, round(ish) number fans) - has a bug. It gives me a strange kind of hope. Fallibility gets us all - even people who have saved the world. I think we should celebrate the humanity as well as the genius. Turing was, by all accounts, a very odd fish, but the world is undeniably richer for his contributions.

So, let’s raise a glass to the memory of Alan Turing tonight, marathon runner, gay icon, saviour of the world and the pleasantly fallible inventor of the modern world. Not a bad CV, when you think about it.

Updated

Apparently I’m wrong about it being Turing who didn’t like the assembler, but von Neumann (another genius). See below for details. And phew! How nice to find that a hero doesn’t have feet of clay.
