Normal accident theory, developed by the sociologist Charles Perrow in the 1980s, predicts that sooner or later there will be a devastating accident with a nuclear weapon. The list of near misses is terrifying. In Command and Control (2013), Eric Schlosser compiled a minute-by-minute account of an accident involving a Titan II nuclear missile based at Damascus, Arkansas, in September 1980. It began with a trivial event and escalated dramatically: a technician dropped a tool during routine maintenance; the tool struck the liquid fuel tank of the missile, causing a leak; the Titan’s tightly coupled systems meant that any rise in temperature would almost certainly cause the leaked fuel in the enclosed rocket chamber to ignite. The danger was compounded by a lack of clear on-the-spot information and a rigid response procedure. Schlosser concludes that, by sheer luck, ‘none of those leaks and accidents led to a nuclear disaster. But if one had, the disaster wouldn’t have been inexplicable or hard to comprehend. It would have made perfect sense.’

Biological weapons systems are candidates for ‘normal accidents’; so too are advanced virological research programmes, whose scientists – a small group, spread across the world – know each other well and fiercely protect their community. For candid analysis of the perils of virological research, we must turn to the Bulletin of the Atomic Scientists, a journal founded by physicists troubled by the destructive potential of their discipline. Their counterparts in microbiological science haven’t faced that reckoning and have no comparable journal. In 2014, bioweapons specialist Martin Furmanski wrote in the Bulletin that ‘many laboratory escapes of high-consequence pathogens have occurred, resulting in transmission beyond laboratory personnel. Ironically, these laboratories were working with pathogens to prevent the very outbreaks they ultimately caused.’

Virological hazards begin in the field, with expeditions to collect samples of dangerous pathogens. In search of bats that host unknown coronaviruses, virus hunters clamber into unmapped caves, risking snakebite, lesions from razor-edged rocks and rockfalls, as well as infection from the bite or scratch of a bat, or from the bat shit that coats every surface. Macho camaraderie encourages short cuts, such as not wearing cumbersome protective suits. Further perils follow once field workers emerge with captive bats and prepare to draw blood samples: base camp staff, including local guides and student assistants labelling the specimens, are now in close contact with potential vectors.

In the laboratory there are also risks. Living with Covid means every sore throat or runny nose is a quandary. Should I self-isolate and take a test right away or leave it for a day to see how the symptoms develop? Research virologists working with deadly pathogens run through the same mental calculations as the rest of us – they will often decide to shrug off a mild fever and press on with their work. Biosafety is scrupulously regulated, but mostly by the researchers themselves. There has been little public debate about the way these risks should be assessed.

In 1977 a strain of H1N1 influenza reappeared after a twenty-year absence, an event with a probability in the natural world approaching zero. It wasn’t particularly virulent and was superseded by another strain the next year, but its appearance was a mystery. The most likely explanation is that it escaped from a Chinese or Russian laboratory during a vaccine trial. Other escapees include smallpox (in Birmingham in 1978), Sars (twice, in outbreaks after the epidemic had ended) and foot and mouth disease in the UK. Human error is usually to blame.

In order to predict the ways in which viruses might become more transmissible or more severe, virologists genetically manipulate them in the laboratory – what’s known as ‘gain of function’ research. If we can identify the most crucial elements in a virus, for instance those that allow it to infect human cells, then we can develop a vaccine more effectively – or so the thinking goes. This became controversial in 2011 when researchers experimenting with the H5N1 influenza virus submitted a paper for publication in Science. The editors hesitated, not out of concern about the scientific quality of the work but because of the ethical dilemma it posed. Even publishing the blueprint for a virus might be tantamount to releasing it, since it would allow malign researchers to reproduce the experiment. The US National Institutes of Health ordered a suspension of the project and convened an expert group – the National Science Advisory Board for Biosecurity – to assess the risks and benefits. The board held two symposia, wrote new ethics guidelines and published a thousand-page report weighing the need for pandemic preparedness research against biosafety, and ultimately decided that scientists should regulate themselves.

It’s possible that Covid-19 was a laboratory leak. The strongest point in favour of the theory is that, despite intensive investigation, no one has identified how the virus was transferred to Wuhan, either directly or through an intermediate host, from bats living many hundreds of miles away. The Wuhan Institute of Virology was one of the very few places in the world in which potential immediate antecedents to Sars-CoV-2 may have been stored or used in research. The hypothesis of a laboratory leak is both circumstantial and conjectural and can probably never be confirmed, not least because if there was any incriminating evidence it would have been destroyed. In an opaque process, China and the World Health Organisation agreed the terms of reference for a joint inquiry earlier this year ‘to identify the zoonotic source of the virus and the route of introduction to the human population’. There were no biosafety specialists among the investigators and the possibility of a laboratory origin was relegated to ‘extremely unlikely’ on a show of hands in an open meeting – hardly a scientific approach. The report was released on 30 March. The same day, the WHO director general, Tedros Adhanom Ghebreyesus, pointedly said: ‘All hypotheses remain on the table.’ Last month the WHO set up a new Scientific Advisory Group for the Origins of Novel Pathogens; its members include specialists in laboratory safety and security. In a paper for Science, Tedros and two colleagues wrote that a lab accident ‘cannot be ruled out until there is sufficient evidence to do so and those results are openly shared’.

The Covid-19 pandemic may well have been a ‘normal accident’; it’s equally possible that ‘Disease X’, the WHO’s codename for the next pandemic, will be another. If so, it will be the by-product of our total war on microbes, our determination, since the acceptance of germ theory 150 years ago, to collect, classify, experiment with and sometimes exterminate them. As with the Manhattan Project, demand for ever more powerful munitions justifies risk-taking of a kind that the scientists involved don’t fully comprehend.

At a symposium organised by the Wildlife Conservation Society in New York in 2004, scientists and policy advisers with backgrounds in food, agriculture, geology, wildlife, environmental law, disease control and public health came up with twelve recommendations – the Manhattan Principles – for preventing epizootic disease and maintaining ecosystem diversity. This collaboration, between countries and across disciplines, played an important role in the emergence of the ‘One Health’ initiative, which has won the backing of the WHO, the US Centres for Disease Control and Prevention and others. If taken seriously, it would entail radical changes to industrial farming, along with the protection of threatened ecosystems and the promotion of biodiversity. One thing that’s missing from this agenda is the regulation of virological research to minimise risk. But the greater ambition should be the refashioning of public health as a democratic project. The politics of global health requires activism of the sort seen in response to HIV/Aids, which turned medical research into a collaboration between scientists and their subjects, and forced public health policy to deal with questions of inequality and stigma. The investigation into the origins of Covid-19 and the work currently underway to prevent Disease X should be driven by a debate about the risks we are willing to take and the technologies we choose to employ.
