A Different Universe: Reinventing Physics from the Bottom Down 
by Robert Laughlin.
Basic Books, 254 pp., £15.50, September 2005, 9780465038282

‘There is nothing new to be discovered in physics now,’ William Thomson, Lord Kelvin, asserted at the British Association meeting in 1900. ‘All that remains is more and more precise measurement.’ Not to be outdone, the American scientist Albert Michelson said: ‘The grand underlying principles have been firmly established; further truths of physics are to be looked for in the sixth place of decimals.’

Nature repeatedly reveals the limits of our collective imagination. The discovery of the nuclear atom, and the rise of quantum mechanics and relativity, showed how naive Thomson and Michelson had been. Yet, undeterred by the lessons of history, many modern theorists and popular science magazines now tout the current candidate for a Theory of Everything: superstring theory. This posits that all physical forces are gravity acting in a Universe with ten dimensions, whose matter is made up of strings on a scale so small that a billion billion of them could fit into the nucleus of a hydrogen atom. Even without the intervention of experiments that might show such a theory to be highly presumptuous, Robert Laughlin cautions against searching ‘on smaller and smaller scales for meaning that is not there’.

Laughlin’s central argument is that instead of becoming obsessed with ultimate theories we would do better to focus on those properties of matter that ‘emerge’ from the organisation of large numbers of atoms. Examples of emergence include the hardness of crystals, the self-organisation of vast numbers of atoms that we know as life, and even the most fundamental laws of physics, such as Newton’s laws of motion.

Emergence is said to occur when a physical phenomenon arises from the organisation of many component pieces, even though the same phenomenon is not produced by the individual pieces alone. Thus in art, the individual brush strokes in a canvas by Renoir appear randomly shaped and coloured when seen from close up, yet viewed from a distance the whole becomes the image of a field of flowers. It is the very inadequacy of the brush strokes themselves that shows the emergence of the painting to be a result of their organisation. Analogously, individual atoms can form an organised whole which can do things that individual atoms, or even small groups of atoms, cannot. Thus one proton or electron is identical to another. All they can do individually is ensnare one another by their electrical attraction, thereby forming atoms. The electricity within atoms enables groups of them to join up, making molecules. Put enough molecules together and they can become self-aware, in the form of human beings.

Certain metals, when cooled to ultra-low temperatures, lose all electrical resistance and expel magnetic fields, producing what is known as superconductivity; yet the individual atoms that make up the metal can do no such thing. A more everyday example is the emergence of solids, liquids and gases from a large collection of molecules: we take it for granted that the floor of a plane flying at 40,000 feet will not suddenly lose its rigidity and release us into the clouds below, just as Eskimos trust the rigidity of the ice pack beneath them, even though a small rise in temperature could cause it to melt, leaving them stranded in the sea.

In a crystalline solid, the orderly arrangement of individual molecules into a lattice ensures the crystal’s solidity and also its beauty: carbon atoms may organise themselves into diamond, or into soot. In a solid, the individual atoms are locked in place relative to one another, but a rise in temperature causes them to jiggle a bit so that each is slightly displaced. The positional ‘errors’ do not accumulate, however, and the whole can retain its apparent solidity. In the liquid phase, the jiggling becomes so agitated that the atoms break ranks and flow.

In some materials the change from a solid to a liquid state is abrupt: on the ice a fractional temperature shift above or below 0°C can be the difference between life and death. In other materials it is not: there is no meaningful way to tell whether glass is a solid or a highly viscous liquid. Helium is a gas at room temperature and a liquid when cold, but it never freezes, however much you lower the temperature. On the other hand, subject it to enough pressure and it will crystallise. The equations that describe the behaviour of individual atoms are known, but solving them is possible in only a few simple cases, and using them to derive the conditions of existence of solids and liquids is all but impossible. Yet this doesn’t prevent engineers from designing solid structures or hydraulic systems.

The reason predictive science is possible is that laws operating at the level of individual atoms become organised into new laws as one moves up to complex systems. Thus the laws of electrical charges beget those of thermodynamics and chemistry; these in turn lead to the laws of rigidity and then of engineering. We may not be able to derive the liquid state for this or that substance from first principles, but liquids still have general properties that transcend the details of any particular substance. Liquids will not tolerate pressure differences between any two points other than those caused by gravity; this is the principle behind the mercury barometer and all hydraulic machinery. It’s a property of the organised liquid state, and the underlying laws at the atomic level are essentially irrelevant.
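
To put the emergent law in symbols (standard hydrostatics, nothing specific to Laughlin’s book): in a liquid at rest, the pressure at depth $h$ below the surface is

\[ p(h) = p_0 + \rho g h , \]

where $\rho$ is the density and $p_0$ the pressure at the surface; and because applied pressure is transmitted undiminished throughout the liquid, a hydraulic press with pistons of areas $A_1$ and $A_2$ obeys $F_1/A_1 = F_2/A_2$, so a small force on a small piston balances a large one on a large piston. The mercury barometer is the same law read backwards: the atmosphere supports a column of height $h = p_0/\rho g$, about 760 mm of mercury.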

It is this hierarchy of structures and laws that enables us to understand and describe the world: the outer layers rely on the inner, yet each has an identity of its own and can often be treated in isolation. Thus an engineer can design a bridge without needing to know the atomic physics that underpins the laws of stress and strain; and although we know that atoms come apart when they collide fast enough, that their nuclei split at higher speeds, and that at extreme speeds they effectively melt into constituent quarks and ‘glue’, a chemist has no need of quarks and gluons when it comes to designing drugs.

Laughlin rails against reductionist science and its practitioners: ‘large experimental laboratories’ which ‘defend their work from criticism . . . by forming self-refereeing monopolies that define certain ideas to be important, whether they actually are or not’, making ‘measurements that serve no purpose other than to expand journals and fatten frequent flyer accounts’. String theory is a favourite target: ‘immensely fun to think about’ but without ‘practical utility . . . other than to sustain the myth of the ultimate theory’. He takes issue with the string theorists’ view that the output of today’s ‘mighty accelerators’ is merely ‘“low-energy phenomenology” – a pejorative term for transcendent emergent properties of matter impossible to calculate from first principles’. He categorises string theory as a textbook case of a ‘Deceitful Turkey, a beautiful set of ideas that will always remain just out of reach’.

You can’t argue with the fact that even if we had a Theory of Everything, it would be quite another matter to work out all its consequences. At the human scale such a theory already exists. Mathematical relationships accounting for everything bigger than the atomic nucleus have been with us since the work of Schrödinger, Heisenberg and Dirac eighty years ago. The equations governing the behaviour of electrons and atoms are taught to students, but their simplicity is ‘highly misleading’: as Laughlin notes, they are difficult to manipulate and impossible to solve outside a few simple cases. It is only with the development in recent years of powerful computers that the range of solvable problems has grown. No one has deduced from these equations the properties of simple amino acids, let alone the workings of DNA (though this has hardly held back the astonishing development of modern biology). Similarly, we can predict solar or lunar eclipses with certainty, but not the weather.
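
For the record (the review never displays it), the central such equation is Schrödinger’s:

\[ i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi . \]

A single line, yet it has exact solutions only for a handful of idealised systems, the hydrogen atom among them; everything else must be approximated, which is where the powerful computers come in.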

More contentious is the thesis that the laws of physics too are emergent. Newton’s laws of motion – that things move at a constant speed unless forced to alter their motion; that the same amount of force accelerates a heavy thing less than a light one; that acceleration takes place in the same direction as the force that causes it – underpin all of engineering and technology. In three hundred years of careful experimentation, their only failures have come when they are applied to objects moving near to the speed of light, when Einstein’s relativity theory takes over, and at atomic scales, where the laws of quantum mechanics replace them.
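
In symbols (mine, not the book’s), the second of these laws is

\[ F = m a , \qquad \text{so} \qquad a = \frac{F}{m} : \]

the same 10-newton push accelerates a 1 kg mass at 10 m/s² but a 5 kg mass at only 2 m/s², and in vector form the acceleration points along the force.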

Our immediate experiences are of bulk matter – our senses are blind to the existence of atoms – but clues to the restless agitation of the atomic architecture are all around. As I watch my plants grow, I don’t see the carbon and oxygen atoms pulled from the air and transformed to build their leaves; my breakfast cereal mysteriously turns into me because its molecules are being rearranged. In all cases the atoms are calling the tune: we lumbering macro-beings see only the large end products. Newton’s laws apply only to the behaviour of these bulky things.

Two hundred years after Newton, experimental techniques had progressed to the point at which the atomic architecture was beginning to be evident. By the start of the 20th century, numerous strange empirical facts about atomic particles had accumulated that seemed incompatible with Newton’s laws: billiard balls bounce off one another in a predictable way, but beams of atoms scatter in some directions more than others, forming areas of intensity and scarcity like the peaks and troughs of water waves diffracted through an opening. We try to describe this weirdness to students using familiar Newtonian language, but we fail, or cause confusion, or both.
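
The quantitative key, supplied here for clarity, is de Broglie’s relation: a particle of momentum $p$ behaves like a wave of wavelength

\[ \lambda = \frac{h}{p} , \]

where $h$ is Planck’s constant, so a beam of atoms passing through a sufficiently small opening diffracts exactly as water or light waves do.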

The solution to the conundrum is, as Laughlin says, ‘a beautiful case history of how science advances by making theories conform to facts rather than the other way round’. The laws of quantum mechanics, the mechanics of very small things, were discovered in the 1920s. Quantum mechanics works: it makes predictions that in some cases have been confirmed to accuracies of parts per billion. Yet it creates mind-bending paradoxes that charlatans have exploited to convince the public that scientists seriously consider parallel universes where Elvis still lives, or that telepathic communication is possible.

The famous illustration of the supposedly fundamental paradox of quantum mechanics is the tale of Schrödinger’s cat. Schrödinger imagined a cat shut in a box which contains a radioactive atom, a Geiger counter and a cyanide capsule rigged to release the cyanide when the Geiger counter clicks. The idea is that at the moment the atom decays, the cat is killed. The uncertainty enters when one applies the rules of quantum mechanics to interpret the result. According to quantum mechanics, as Laughlin puts it, ‘a mysterious quantity called the wave function leaks out of the atom slowly, the way that air escapes from a balloon, so that a finite but diminishing amount of this wave function is still inside.’ The amount left inside gives the probability that the atom has not decayed – and this is the crucial bit – at the time it is measured. To measure it, you must open the box; until you do so, according to quantum mechanics, the system essentially contains a combination of alive and dead cat.
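
In conventional notation (Laughlin’s ‘amount of wave function still inside’ is the survival probability), an atom with mean lifetime $\tau$ remains undecayed at time $t$ with probability

\[ P(t) = e^{-t/\tau} , \qquad t_{1/2} = \tau \ln 2 , \]

and until the box is opened the formalism assigns the cat the corresponding weights: $P(t)$ for alive, $1 - P(t)$ for dead.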

For some, this paradox has mystical overtones, as though twisting one’s mind around it were a step on the path to enlightenment. Laughlin rightly criticises this and argues that ‘in science one becomes enlightened not by discovering ways to believe things that make no sense but by identifying things that one does not understand and doing experiments to clarify them.’ He stresses here that what is not understood in the cat paradox is the measurement process. Where a large number of atoms are involved, their co-operation means that Newtonian laws can emerge from the underlying quantum weirdness. All means of determining the cat’s status involve a large number of atoms, whether we shine a light on it, or even sniff it. Detecting the radioactive decay of one atom by using something as small as another atom would make no sense, as it would merely replace one unmeasurable thing by another. What we recognise as ‘measurement’ requires a large apparatus, and Laughlin argues that this is the key factor, since if the process of observing an object changes that object, then it doesn’t qualify as an act of observation. This is why it isn’t possible to observe a single atom. Size is the key, and certainty emerges from the organisation of billions of atoms acting co-operatively.

Laughlin’s interpretation of Schrödinger’s cat is interesting, but I’m not convinced the facts of the matter are so simple. Experiments in high-energy particle physics show quantum mechanics at work in many ways, one of which is particularly instructive because it doesn’t readily lend itself to the thesis of organisational emergence. At CERN in Geneva, over a ten-year period, beams of electrons were collided with beams of their antimatter counterparts, positrons. Electron and positron were duly annihilated, and out of the maelstrom new particles of matter and antimatter formed, emerging in two important classes known as 2-jet and 3-jet events. These present the same uncertainty as the cat in the box: until I look at the photo of the collision, I don’t know which class of event it was.

Events of the 2-jet class are about ten times more likely than their 3-jet alternatives. Over the course of time, more than 10 million such events were recorded, and each one was different. This is as quantum mechanics predicts. Yet when the full set was added together, the physicists found that those in the 2-jet class did indeed outnumber those in the 3-jet class tenfold. Other ‘sensible’ expectations also emerged, even though from picture to picture the scientists couldn’t know what would occur.
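
The statistical point can be made with a toy simulation (a minimal sketch in Python, not the actual CERN analysis; the 10:1 odds are simply assumed from the figures above): no single draw is predictable, yet the ratio sharpens as the sample grows.

import random

# Toy model: each event is independently 2-jet or 3-jet.
# The probability encodes the roughly 10:1 ratio quoted above.
P_TWO_JET = 10 / 11

rng = random.Random(42)
for n_events in (1_000, 100_000, 1_000_000):
    two_jet = sum(rng.random() < P_TWO_JET for _ in range(n_events))
    three_jet = n_events - two_jet
    ratio = two_jet / max(three_jet, 1)  # guard against a zero count
    print(f"{n_events:>9,} events: 2-jet/3-jet ratio = {ratio:.2f}")

Each pass through the loop is as unpredictable in detail as the photographs were; only the aggregate is lawful.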

All this highlights a widely held misconception of what physical science is and isn’t. Newtonian laws enable us to predict what will happen in certain circumstances but not in all. Why the rules governing gravity’s workings, for example, are as they are is to some degree within science’s ability to explain: the fact, to take one instance, that the attraction between two bodies dies away as the square of the distance of separation is intimately related to the fact that space has three dimensions. We don’t yet know why space should have three dimensions, and I support Laughlin in urging caution on those string theorists who would suggest that we do.
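
The geometric argument, compressed into a line: whatever ‘flux’ a point mass emits spreads over a sphere of surface area $4\pi r^2$, so its influence must thin out as

\[ F \propto \frac{1}{r^2} ; \]

in a space of $d$ dimensions the same reasoning gives $F \propto 1/r^{d-1}$, so measuring how gravity falls off amounts to counting dimensions.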

I can’t agree totally, on the other hand, with his thesis that ‘the frontier of reductionism is now closed at the level of everyday things, yet the list of difficult problems grows.’ That the latter is true is beyond argument, yet there are phenomena at the human scale whose explanation seems to depend critically on the underlying physical laws. Why is matter electrically neutral (if it weren’t, gravity wouldn’t dominate on cosmic scales)? It seems that if the underlying quantum theory is not to contain certain ‘anomalies’, the quarks, electrons and neutrinos have to be related in such a way that the neutrality of matter necessarily emerges. And then, why is gravity so much weaker than the other fundamental forces (I can jump off the ground by performing a small chemical reaction in my muscles and thereby temporarily overcome the gravitational attraction of the whole Earth)? Why do chemicals and their mirror images affect the body in radically different ways? To answer questions such as these, the principle of emergence is necessary, as Laughlin argues, but it hardly seems to be sufficient.
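
The disparity can be made quantitative with a standard estimate (not taken from the book): for two protons, the electrical repulsion exceeds the gravitational attraction by a factor

\[ \frac{F_e}{F_g} = \frac{e^2}{4\pi\varepsilon_0 G m_p^2} \approx 10^{36} , \]

independent of their separation, since both forces fall off as the square of the distance.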
