Vol. 28 No. 12 · 22 June 2006

My Life as a Geek

Thomas Jones


In 1979-80, a six-part documentary called The Mighty Micro was broadcast on ITV. Written and presented by the late Christopher Evans of the National Physical Laboratory, and based on his book of the same name, the series looked at the ways the world might be changed by the microcomputer revolution. The BBC responded with a more practically minded, educational series called The Computer Programme – a terrible if irresistible pun – which first aired on 11 January 1982. The aim was to give viewers lessons in computer literacy. When it went into development, a suitable machine for use in the series didn’t exist. So the BBC drew up a list of specifications, and went in search of someone to build them a computer.

The contract eventually went to Acorn Computers, a company based in Cambridge, who were developing a machine they were going to call the Proton, a successor to the Acorn Atom which was already on the market. The Proton was released as the BBC Micro in late 1981. Acorn expected to sell about 12,000 of them; in the event, they sold more than a million.

One of them, a Model B, was bought by the Joneses in the summer of 1984, when I was seven. It cost £399, and had twice the RAM of the Model A (‘random access memory’, crudely speaking, is the kind of memory that software can be loaded into): a whopping 32 kilobytes. The laptop I’m writing this piece on, by way of comparison, has 240 megabytes, which is 7680 times as much. My laptop also has a 40 gigabyte hard drive. The BBC Micro had no hard drive. The limitations of a 32K memory revealed themselves most bluntly in the fact that our computer couldn’t count any higher than 32,767. If you typed in ‘32768’, it would reply: ‘Syntax error’. This was its second favourite error message. If you typed in a word it didn’t understand, it would say simply: ‘Mistake’.

It spoke in such user-friendly terms because it came preloaded with a language called BBC BASIC. In keeping with the BBC’s pedagogical intentions, BASIC allowed users to give the computer instructions using a limited range of English words, and was therefore much easier to get the hang of than the more esoteric codes used by more advanced programmers. If you instructed the computer to ‘PRINT “Hello”’, for example, it would obligingly say ‘Hello’. You could tell it what COLOUR it should say ‘Hello’ in, or make it DRAW a few lines across the screen. It also understood such logical terminology as IF, AND, THEN and ELSE. You could string a series of commands together in the form of a program by prefixing them with a number – ‘10 COLOUR 1’, for example, ‘20 PRINT “Hello”’ – and when you typed RUN the computer would carry them out in order: in this case, setting the text colour to red and then saying ‘Hello’.
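A complete beginner's program in this vein — a hypothetical example, not one from the series — would have looked something like this:

```basic
10 REM Greet the user and pass judgment on their age
20 COLOUR 1
30 PRINT "Hello"
40 INPUT "How old are you ", age
50 IF age < 8 THEN PRINT "Younger than I was" ELSE PRINT "Older than I was"
60 END
```

Type RUN, and the computer works through the numbered lines in order; type LIST, and it recites the program back to you.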

I never saw The Computer Programme, but there were plenty of other ways to get to grips with a Beeb: it came with a cassette tape full of programs that showcased its many and various abilities, and there were innumerable manuals and guides jostling for the attention of apprentice programmers. And so with their help, I taught myself BASIC. I soon also took out – or had taken out on my behalf – a subscription to Micro User, a magazine vast swathes of which were entirely incomprehensible to me, but which every month included the code for a BASIC game. The monthly game, and other software, later came free with the magazine on a floppy disk. But if you didn’t have a disk drive, you would have to type in the code for the game yourself. So I would come home from school, settle down at the computer with a plate of Marmite sandwiches, and spend much of the evening, not entirely unlike a medieval monk, absorbed in transcribing pages of code. Peering at the screen through my NHS spectacles for hours on end can’t have done my already poor eyesight any favours.

It was a tedious yet peculiarly rewarding pastime. The games often weren’t up to much, from the point of view of simply playing them, but it was satisfying to be able to relate the dry lines of code to their modestly spectacular effects, to have access to the mechanism of the game, to see how it worked – and then to be able to tinker with it. From typing out and messing with other people’s games, it was a short step to trying to write my own.

A computer game can be thought of as having both an argument and a story. If the argument held up, the game would work; if it didn’t, the computer would let you know, stopping the program to announce tersely that you’d made a ‘Mistake’ or a ‘Syntax error’, or that there was ‘No such variable at line 140’. I could, most of the time, get the argument to work in the games I wrote, with the occasional help of an older, savvier boy called Mark who lived round the corner.

There was no objective way to determine whether the story was any good, however: that depended on whether or not anyone enjoyed playing it. And the only person who ever played the games I wrote was me. My older sister wasn’t interested; my other sister was too young; and when friends came round we were more likely to spend the time out on our bikes or climbing trees or playing football. Writing computer games, like any kind of writing, or even like reading, was a solitary activity. If we did huddle round the computer, it was to play proper games, the kind that cost money and came on tapes (or, later, disks) in shiny boxes. They were written in languages that were much harder for small boys to learn than BASIC, but closer to the computer’s own binary thought processes, so could run faster and be more complicated because less memory had to be wasted translating the programmer’s instructions.

Much of the satisfaction I derived from playing the games I wrote was that of the unduly proud creator, blind to their shortcomings and ungainliness. I once wrote to the editors of Micro User asking for advice on how to go about getting a few of my games distributed more widely; they replied, in much the same manner that the LRB might write to a pre-teen poet, that if I really thought they were good enough I could perhaps try them out on a few of the smaller publishers, but shouldn’t get my hopes up. Taking the hint, I kept my creations to myself.

In the broadest structural terms, the stories of almost all computer games are pretty much identical. The player must guide a protagonist through an environment that contains hazards and rewards, doing his best to avoid the hazards and collect the rewards. As the game progresses, the hazards may get harder to avoid, and the rewards increase accordingly. There may also be a number of puzzles to solve, but these aren’t strictly necessary. The first commercial game I played was Snapper, closely modelled on Pac-Man – so closely, in fact, that the manufacturers had to alter the graphics to avoid a lawsuit. You had to guide a little man around a maze, munching up the green dots and avoiding four ghouls that pursued you. There were four larger, flashing green dots, which would give you the ability for a limited period to munch the ghouls, too. It was culpably simple, and astonishingly addictive.
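That universal structure – protagonist, hazards, rewards, rising difficulty – fits in a dozen lines of BBC BASIC. The skeleton below is a hypothetical sketch, with the game-specific details left as comments:

```basic
10 REM Skeleton of a collect-and-avoid game (hypothetical)
20 score% = 0 : lives% = 3
30 REPEAT
40   REM ... read the keyboard and move the protagonist ...
50   REM ... move the hazards, a little faster as score% rises ...
60   IF gotreward% THEN score% = score% + 10
70   IF hithazard% THEN lives% = lives% - 1
80 UNTIL lives% = 0
90 PRINT "Final score: "; score%
```

Everything that distinguishes one game from another – the explorer, the crocodiles, the sans-culottes – lives in the lines the comments stand in for; the loop itself never changes.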

Towards the other extreme of sophistication on the BBC was Michael Jakobsen’s Citadel, in which you had to explore a large castle and its environs, picking up keys, unlocking doors, shooting large, malevolent monks between the eyes, taking chickens from the freezer and roasting them in the kitchen, boiling up bones in a witch’s cauldron to cast a spell to make her disappear so you could explore her house, filling a bucket with water from the well so you could put out a fire and climb the chimney, and so on.

It was an absorbing game, but also an infuriating one, not least because it took ages to complete, and it wasn’t possible to save your position and then pick up again where you’d left off a few hours or days or weeks later. Every time you died, you had to go all the way back to the beginning. What’s more, the game was identical every time you played it: all the objects you needed were always in the same place; the same monsters always haunted the same rooms and followed the same patterns of movement. Puzzles that were satisfying the first time round, or rooms that were initially exciting to enter, soon became tedious and predictable. This was characteristic of many BBC games; some got round the problem by giving you passwords at key points in the story, allowing you to skip ahead next time. When Citadel was released, the manufacturers offered a substantial cash prize to the first person to complete it, and no wonder: success depended not only on a considerable amount of ingenuity and dexterity, but also on a phenomenally high boredom threshold.

Indisputably the greatest game ever written for the BBC was Elite, by David Braben and Ian Bell. The aim was to travel through eight galaxies, each with 256 solar systems, trading cargo, battling enemies and becoming a steadily more feared space pirate. Not only were you able to save your progress as you went along, but the game made impressive use of 3D graphics and – here was the stroke of genius – generated each star system as you entered it. This radically efficient use of the BBC’s limited memory capacity made for an almost infinite variety of environments, in a way that simply wasn’t possible with games such as Citadel, in which every detail was mapped out in advance. The world of Elite was not only enormous but also never the same twice.
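The principle – store a seed, not a universe – is easy to sketch. The fragment below is a toy stand-in, not Braben and Bell’s actual generator (which stepped a Fibonacci-like sequence through each galaxy): because the same seed always unfolds into the same systems, none of them need be kept in memory.

```basic
10 REM Toy procedural generation: each system derived from the last seed
20 seed% = 1234
30 FOR n% = 1 TO 5
40   seed% = (seed% * 75 + 74) MOD 32767 : REM cheap pseudo-random step
50   PRINT "System "; n%; ": size "; seed% MOD 10; ", wealth "; seed% DIV 100
60 NEXT n%
```

Run it twice from the same seed and the same five systems come out, in the same order: a universe that costs one integer to store.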

Elite was released in 1984. The great leap forward it represented in programming terms completely passed me by: I was content to churn out, in laborious BASIC, umpteen variations on the theme of a little man running about the screen collecting goodies and avoiding baddies. In one permutation, the protagonist was an explorer collecting exotic fruit in a jungle, crisscrossing a river and trying not to be eaten by crocodiles. (For some reason it never occurred to me to write a game in which a little boy in shorts and NHS specs ran about gobbling Marmite sandwiches and avoiding cricket balls.)

In another, the crowning achievement of my modest career, the player was the Scarlet Pimpernel, guiding an endless procession of aristocrats from the gates of Paris through the countryside of Northern France to safety in England, with an increasing number of frenetic sans-culottes in lukewarm pursuit. Several guillotines were mysteriously scattered through the woods, and would kill any aristo unfortunate enough to stumble into one. You could also pick up gold coins, which again isn’t very realistic – but perhaps they had been dropped by earlier fugitives. Occasionally, in a fit of revolutionary zeal or plain boredom, I would direct one of the aristos into a waiting guillotine or the arms of his pursuers. And even though it was only a computer game – moreover, a very simple computer game, and one that I had written myself – this trivial defiance of the rules gave me a minor thrill of transgression.

The feeling was altogether stronger with proper games. In Stryker’s Run, for example, you controlled a character called Commander John Stryker, who had to convey an urgent top-secret message from one futuristic army base to another. What this meant, in practice, was running along past a monotonous scrolling backdrop of uniform mountains, shooting any enemy soldiers you encountered along the way with your laser gun, occasionally stealing their helicopters, and trying not to get shot yourself. Every so often, you would encounter one of your own troops. They didn’t do much; they certainly didn’t do anything useful; and you could, if you felt like it, shoot them too, and watch them crumple uncomplainingly into a heap of bones. You didn’t get any points for it, but you didn’t incur any penalty either. In the moral economy of the game, it was an absolutely neutral act. And yet, perhaps precisely because it was wholly gratuitous, and went unpunished, it felt peculiarly wrong.

Citadel offered the possibility of a different kind of transgression. Before you started playing, you were asked: ‘Are you Male or Female? Please enter M or F.’ I was just about old enough to have a fully developed sense of gender constancy, and so unhesitatingly pressed M. ‘You are male,’ the computer responded. ‘Correct (Y/N)?’ Oh yes, I was quite sure. You can check with Piaget (I didn’t think). But after playing a few times, I began to wonder what would happen if I claimed to be female. And so, with some trepidation, the next time I loaded the game, I tapped the F key. ‘You are female. Correct?’ I recoiled in horror and with masculine firmness pressed N. But then my curiosity got the better of me and, praying that no one would ever know, told the computer that I was female. Yes, I was female. The game began. The only difference was that the protagonist had long hair. In fact, she was more fun than her crewcut male counterpart, because her hair streamed out behind her when she jumped. Any stigma attached to such virtual transvestism – if any ever existed outside the mind of a nine-year-old boy – was dispelled once and for all with the advent of Tomb Raider. Lara Croft, with her impossible curves (her breasts swelled ever larger with every sequel), was, unlike the overwhelming majority of those who controlled her, unquestionably female: beyond female, even.

The opportunity to be bad, too, has been incorporated into a great many modern video games. In Ultimate Spider-Man, for example, which was released last autumn, you can choose to play Spider-Man and keep the streets of New York safe from comic book villains; or you can be one of those villains, and be rewarded for terrorising the city. In Grand Theft Auto, you score points for running over innocent pedestrians. It’s not so much that these games take the fun out of being bad, as that they make being bad simply impossible: you’re working within the rules rather than breaking them.

I realise that this is to use the phrase ‘being bad’ in a very limited sense. Just because a computer game gives you points for running over virtual passers-by in your virtual motor car doesn’t mean that in real life (or IRL, in webspeak) it’s OK to run down Saturday morning shoppers on the Holloway Road in your Citroën Saxo. And there is a perennial fear with violent computer games, as with violent movies and pop songs with violent lyrics, that they will encourage some of the people who play, watch or listen to them – susceptible young men and teenagers, or ‘youths’, as the press likes to say; well-behaved young people are never called ‘youths’ – to act in violently imitative ways. Witness David Cameron’s recent crowd-pleasing remark in response to a question from the editor of Good Housekeeping: ‘I would say to Radio 1, do you realise that some of the stuff you play on Saturday nights encourages people to carry guns and knives?’ This may be true, up to a point, and ‘encourages’ is a conveniently vague word, but I’d be interested to know whether the Tory leader has any statistical evidence to back up his assertion.

Fantasy rarely bleeds into reality to the extent that people, however foolish, will start behaving as if they were living in a computer game. All the same, the games we play, like the books we read, influence the way we see the world. And a game such as 50 Cent: Bulletproof, in which you get to play a computerised fantasy version of the rapper fighting his way up from the streets, can’t fail to perpetuate a racial stereotype in the minds of the people, of whatever race, who play it. That said, it’s generally more useful to look at cultural artefacts as symptoms rather than causes of the society that produces them: you’d have to be a complete idiot to think that if Radio 1 would only stop playing gangsta rap, South London’s rude boys would swap their knives for knitting needles.

Technological advances have made modern computer games a thing of wonder, unimaginable twenty years ago, in which anatomically detailed protagonists move with astonishing fluidity and acrobatic grace through environments of almost cinematic verisimilitude. But the structure underlying the stories hasn’t really changed: except, that is, in the case of multiplayer games on the internet.

There have been multiplayer computer games for as long as there have been computer games. In 1958, William Higginbotham at Brookhaven National Laboratory on Long Island invented a game, played on an oscilloscope, called Tennis for Two. A variation of it was released by Atari as the arcade game Pong in 1972, which enjoyed a brief period of enormous popularity. In 1961, three computer scientists at MIT developed a game called Spacewar on a DEC PDP-1. Two spaceships had to try to shoot each other while avoiding falling into the gravity well of a star. The game has recently been resurrected on the only working PDP-1 still in existence, at the Computer History Museum in Mountain View, California. The BBC Micro had its fair share of two-player games, and in theory there was the capacity for more, since it was easy enough to network a group of computers together. But that happened for the most part in schools, where playing games was never the primary purpose.

The internet has thrown the possibilities for multiplayer gaming wide open. There are countless MMORPGs (massively multiplayer online role-playing games) out there, of varying degrees of seriousness and complexity, catering to all tastes, each with several thousand players. One of them, Project Entropia, made headlines a couple of years ago when it was revealed that one of its 300,000 players had paid £13,700, in real money, for a virtual island. Everyone thought he was a nutcase, until it transpired that he not only recouped his investment in less than a year, but made a tidy profit, by selling the property piecemeal to other players.

For the less dedicated, or less wealthy, there are games such as Urban Dead, a free, low-tech, text-based MMORPG, in which roughly fifty thousand characters, half of them human, half of them zombies – you can be turned from one to the other by being either killed or ‘revivified’ – struggle for control of a post-apocalyptic city called Malton. What sets these games apart is that the environment is defined not so much by a set of predetermined parameters as by the sum of the behaviour of the other players. In addition, the game’s mastermind – in the case of Urban Dead, someone from Lewes called Kevan Davis – is able to fine-tune the rules in response to player activity. A few months ago, for example, a large number of disgruntled zombies, who felt they were getting a raw deal compared to the humans, went on strike. They stopped attacking people and headed for a park in the centre of Malton where they held a mass demo. The high concentration of characters in one place caused problems for the server, and the refusal of so many to play – the strike received a huge amount of support, from zombies and humans alike – threatened to sink the game. Kevan (everyone’s on first-name terms in Malton) responded to their demands by improving the zombies’ lot; the strike came to an end; and, to celebrate, the zombies went on a rampage round the city, attacking each of the shopping malls in turn, in a campaign that became known as Mall Tour ’06, and once again significantly changed the dynamics of the game. It’s perhaps noteworthy that the zombies tend to have a much better sense of humour than their more po-faced human counterparts.

It’s immensely silly, certainly, but it’s also a lot more fun than accumulating points by blowing out the brains of rival crack dealers in hyperreal cartoonish technicolour. Urban Dead offers plenty of scope for being bad, for which the game won’t punish you but the other players almost certainly will – if you get caught. There’s also plenty of scope for genderbending, though most of it’s hopelessly unconvincing, with many female characters describing their appearance in sentences that were all too clearly written by a man.

It’s 17 years since I stopped writing computer games: a combination of my going to a new school, the onset of adolescence and the BBC Micro becoming obsolete. There are still a few functioning Beebs around the place: a number of Britain’s railway stations apparently still use them to run their platform displays. But if, for nostalgia or any other reason, you’d like to get your hands on one, you don’t need to go to the trouble of robbing a railway station, because you can easily, free of charge, download an emulator from the internet. This is a piece of software that allows you to pretend that your PC is a BBC: the ultimate downgrade. A couple of days ago, I wrote a short BASIC program which allowed me to make a little man run about the screen. I even solved a problem that I’d completely forgotten had ever bothered me. It was a very simple problem, which had nothing to do with my understanding of BASIC and everything to do with my inability to work out the logical steps underpinning the procedure. The results were utterly unremarkable, and would have been equally unremarkable (to anyone apart from me) in the mid-1980s. But it gave me quiet satisfaction all the same: I worked out, after twenty years, how to make the little man jump.
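For what it’s worth, the logic I had been missing is roughly this – a reconstruction, not the original program: a jump is just a vertical velocity that gravity claws back, one frame at a time.

```basic
10 REM A jump, reconstructed: y% decreases, then gravity pulls it back
20 x% = 10 : y% = 20 : vy% = 0 : ground% = 20
30 REPEAT
40   IF INKEY(-99) AND y% = ground% THEN vy% = -3 : REM space bar starts a jump
50   PRINT TAB(x%, y%); " " : REM rub out the old little man
60   y% = y% + vy%
70   IF y% < ground% THEN vy% = vy% + 1 ELSE y% = ground% : vy% = 0
80   PRINT TAB(x%, y%); "*"
90 UNTIL FALSE
```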

Send Letters To:

The Editor
London Review of Books,
28 Little Russell Street
London, WC1A 2HN

Please include name, address, and a telephone number.


Vol. 28 No. 15 · 3 August 2006

Thomas Jones writes that ‘the limitations of a 32K memory revealed themselves most bluntly in the fact that our computer couldn’t count any higher than 32,767’ (LRB, 22 June). The BBC Micro used 32-bit integer variables, so it had no problems with numbers far larger than 32,767 and, in any event, that limitation would have had nothing to do with the amount of memory.

Andy Armstrong
Alston, Cumbria

Vol. 28 No. 17 · 7 September 2006

It’s not quite right to say, as Andy Armstrong does, that the BBC Micro used 32-bit integer variables (Letters, 3 August). Like nearly all modern computers, the 6502 central processing unit (CPU) that the Beeb was based on uses binary digits (bits). The 6502 has instructions built into its hardware to move bits around in groups of eight and to add and subtract 8-bit numbers: it is an 8-bit micro. For any other arithmetic operations (addition involving numbers bigger than 255, multiplication, division etc) someone would have to write software.

More modern CPUs move bits around in groups of 32 or 64, and have built-in instructions for a wide range of arithmetic operations. BBC BASIC has several built-in data types, including 32-bit integers. Different software running on the BBC Micro can manipulate much larger integers. BBC BASIC first ran on BBC microcomputers, but has since been made available on many other, newer computer systems. Armstrong seems to have confused the hardware (BBC Micro) and its inherent capabilities with the software (BBC BASIC), which could be run on a different computer, or be replaced by different software with better or different capabilities.

Thomas Jones got a ‘syntax error’ when he typed 32768 at the BBC BASIC command line because BBC BASIC assumes that any input which begins with a number is a line of a program. But its internal data structure allocated only 15 bits for storing line numbers, so the highest possible line number was 32767. My favourite BBC BASIC error message is line number related, too. Typing ‘RENUMBER 10, 0’ at the command line provokes the reply: ‘Silly’.

Roddy Graham
Motherwell, Lanarkshire

