By the Numbers: Numeracy, Religion and the Quantitative Transformation of Early Modern England 
by Jessica Marie Otis.
Oxford, 264 pp., £18.99, April, 978 0 19 760878 4

It is an instructive irony of English political history that the Houses of Parliament were burned down not by revolutionaries but by bureaucrats. In 1834, John Phipps, an assistant surveyor for London in the Office of Woods and Forests, was tasked with finding more office space in the cluttered Exchequer buildings at Westminster. He discovered that a whole suite of rooms was being used for the storage of old tally-sticks, great stacks of obsolete financial records notched on wood. The tallies were ‘entirely useless’, according to the Treasury. Phipps and his colleague Richard Weobley came up with an economical solution: they would send the tallies, two cartloads’ worth of fiscal kindling, as extra fuel for the stoves under the House of Lords. The Times leader the next day called the conflagration, which began when the overloaded stoves set the flues alight, a ‘spectacle of terrible beauty’.

Tallies were long, squared-off pieces of wood – often hazel – cut with horizontal notches to represent quantities of money or goods that had changed hands between two people. After the transaction, the tally was split in half along the length, into a ‘stock’, for the creditor to hold on to, and a ‘foil’ for the debtor. When the debt was settled, stock and foil were matched to demonstrate that the amount was correct.

The Exchequer had issued tallies on its debts as far back as the 12th century. They were so recognisable in England that in the Middle Ages they became a kind of secondary currency, traded to third parties or exchanged at a discount for hard cash; the stock of the tally is one possible origin of the term ‘stock market’. Samuel Pepys, working for the Navy Office, once nervously carried £17,500 of government debt tallies back to his house ‘fearful every step of having one of them fall out, or snatched from me’.

Exchequer stocks made up just a few trees’ worth in a whole forest of wooden exchange. Almost everyone made and used tallies. With small change in short supply, they served as a record of routine credits and debts: people ran up tabs with alewives, bakers, butchers and other traders, and notched them on sticks. The Nonconformist writer Thomas DeLaune, writing in 1681, declared that ‘this Antient way of striking of Tallies hath been found, by long experience, to be absolutely the best way that was ever invented, for it is Morally impossible so to Falsifie or Counterfeit a Tally’: the ‘Natural growth or shape’ of the wood guarded against fraud.

In early modern England, numbers were something you could touch. On tally-sticks and abacuses, counting boards and jettons, arithmetic was a feat of hand-eye co-ordination. The word ‘calculus’ is derived from the Latin for ‘little pebble’: thinking numerically was a matter for the fingers as much as the mind. John Cannon, an 18th-century diarist, recalled his grandfather keeping his accounts with beans. Daniel Defoe claimed to know a shopkeeper who ‘knew nothing of figures, but he kept six spoons in a place on purpose, near his counter, which he took out when he had occasion to cast up any sum’.

Jessica Otis’s By the Numbers surveys what she calls the ‘quantitative transformation’ of early modern England, exploring ‘how numbers came to matter so much – and be so widely embedded – in English culture’. In the 16th century, the utility of numbers could not be taken for granted. With his popular textbook The Grounde of Artes (1543), Robert Recorde made a polemical claim for arithmetic as the foundation of all human knowledge. ‘Without nomberynge a man can do almost nothynge,’ he wrote, but ‘with the helpe of it, you maye attayne to all thyng.’ In many respects, what Otis describes was a documentary transformation: in these two centuries, cheap printed materials, technical education and Arabic numerals combined to flatten the hands-on world of object-based reckoning into the abstract arithmetic of pen and paper. Counting with things became old hat. Less than two centuries after Recorde was writing, a treatise entitled The Gentleman Accomptant disdained tallies as ‘obsolete’, except for ‘ordinary Use in keeping Accompts with illiterate People’.

The most immediate way in which most people learned to count was on their fingers: ‘the natural and simple way of numbring and computation’, according to the physician John Bulwer, ‘borne with us and cast up in our Hand from our mothers wombe, by Him who made all things in number, weight and measure’. Thomas Hobbes speculated that finger-counting predated the emergence of numerical language itself. There is something to this: modern research has found that humans (and some animals) can ‘subitise’ numbers up to five, grasping quantity without counting. Early modern educators taught pupils to recite the names of numbers – to solidify them with words – as the first rule of arithmetic.

The absorption of some basic numerical knowledge was taken for granted as a part of growing up. Everyone was expected to be able to count higher than ten; twenty was the threshold set by some contemporary legal theorists for mental competency, and with it, the ability to own property. Ambrose Bennett was interrogated by the Court of Wards in 1628 to determine whether he was an ‘idiot’: prompted by the assessors, he correctly deducted his living expenses of £120 from his yearly annuity of £200, and showed that he knew how much interest his fortune would generate (a rate of 8 per cent). Bennett passed the test, but it is hard not to sense a sneer in the comments of his examiner, who ‘told him he was very skilful in his estate’.

It was estate accounting that had long furnished the most widespread incentive for learning arithmetic. A 13th-century treatise on household management, printed in 1589 as The Booke of Thrift, made it clear that the first task of a competent landlord was to find trustworthy accountants. The second task was to learn enough arithmetic to prevent them from defrauding him. He was unlikely to find such instruction in conventional schooling; mathematics had little place in a predominantly literary curriculum. The natural philosopher John Aubrey remarked that ‘a Barre-boy at an Alehouse will reckon better and readier than a Master of Arts in the University.’

Where it was taught formally, arithmetic was an applied science, part of a bureaucratic education in writing documents and casting accounts. John Wallis, who went on to devise the ∞ symbol for infinity, recalled that during his childhood in the 1630s he had only learned arithmetic after becoming jealous of his younger brother, who was learning to account during his apprenticeship. ‘For Mathematicks (at that time, with us) were scarce looked upon as Accademical studies, but rather Mechanical; as the business of Traders, Merchants, Seamen, Carpenters, Surveyors of Lands, or the like; or perhaps some Almanak-makers in London.’

The mechanical aspect of accounting lay in a knowledge of the counting board, or ‘reckoning cloth’, chequered with black and white squares – the origin of the name for the royal ‘Exchequer’. Clerks pushed metal counters across the grid to represent quantities. Made from brass, copper or lead, the counters were mass-produced in Tournai and Nuremberg through the 16th century, made to look like coins but blazoned with symbols or the alphabet. They were sold in ‘casts’ or ‘nests’ of one hundred, costing seven pence a set. To ‘know the lines’, as it was called, was to understand arithmetical operations through spatial representation. In the most common variant, known as merchant’s use, the top line indicated units of £20, followed by lines for single pounds, shillings (s) and pence (d). Placing a counter to the right side of the column indicated one unit; on the left it indicated five (or for pence, six).

In the diagram here, the left column shows £26, 3s, 2d added to the central column of £101, 5s, 1d, to generate a total in the right column of £127, 8s, 3d. Though it seems finicky, it was (apparently) intuitive and highly flexible. The horizontal lines could be adapted to mean whatever the calculator wanted, allowing arithmetic between the mix of base-twelve (pence in the shilling) and base-twenty (shillings in the pound) systems of English money.
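The mixed-base carrying that clerks performed by shifting counters between lines can be put in modern terms. The sketch below (my own illustration, not from the book) adds two sums of money in pounds, shillings and pence, carrying at twelve and at twenty, and reproduces the worked example above:

```python
# A sketch of the mixed-base arithmetic the counting board handled:
# pence carry into shillings at 12 (base-twelve), shillings into
# pounds at 20 (base-twenty).
def add_lsd(a, b):
    """Add two sums of money given as (pounds, shillings, pence) tuples."""
    pence = a[2] + b[2]
    shillings = a[1] + b[1] + pence // 12
    pounds = a[0] + b[0] + shillings // 20
    return (pounds, shillings % 20, pence % 12)

total = add_lsd((26, 3, 2), (101, 5, 1))
print(total)  # (127, 8, 3) – £127, 8s, 3d, as in the worked example
```

The board's horizontal lines did the work of the modulo and floor-division steps here: five counters on a line were swapped for one in the left-hand position, and a full line was swapped for a single counter on the line above.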

But a counting board merely calculated. For the recording of numbers, Roman numerals were required. They were not used for calculation: X + X = XX makes some sense as a visual sequence, but XVII + XVII = XXXIV does not. The German classicist Theodor Mommsen suggested that they may have originated as glyphs of finger-counting; as Otis and others have pointed out, the figures – particularly the iterative function of I II III – bear a strong resemblance to notches on tally-sticks.

Roman numerals were valued for their affinity with Latin, the de facto language of administration. But the qualities of Arabic numerals – known in Europe since the end of the first millennium – slowly overturned this attachment to tradition. Their great virtue was that they entrenched a decimal base into the representation of figures, creating a closer correspondence between the names of numbers (above twenty) and their symbols. Arabic numerals were rarely written in England before the 16th century, and then mainly for calendar dates – they weren’t regularly used in accounting until the later 17th century. Otis suggests that they were distrusted as a means of recording financial information because it was relatively easy to alter them. Roman numerals must be written in sequence to make sense: the only alteration that can be made to MMXXIV is to add some greater number at the beginning. But if you write 2024 in Arabic numerals, you can squeeze in a number anywhere in the order to change the quantity dramatically.
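The asymmetry is easy to demonstrate. In this toy illustration (mine, not Otis’s), a standard Roman-numeral parser shows why MMXXIV resists tampering – each symbol’s value depends on its neighbours – while an Arabic figure can be inflated by slipping a single digit in anywhere:

```python
def roman_to_int(s):
    """Convert a Roman numeral to an integer (subtractive notation)."""
    values = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}
    total = 0
    # A symbol smaller than its successor is subtracted (IV = 4),
    # so an inserted symbol garbles the sequence rather than inflating it.
    for ch, nxt in zip(s, s[1:] + ' '):
        v = values[ch]
        total += -v if nxt in values and values[nxt] > v else v
    return total

print(roman_to_int('MMXXIV'))  # 2024

# An Arabic figure, by contrast, admits a digit anywhere:
print(int('2024'[:1] + '9' + '2024'[1:]))  # 29024 – a ninefold swindle
```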

Arabic numerals allow the conception of ‘zero’ as a natural number. (In 2004 Robert Kilroy-Silk wrote a bilious article in the Sunday Express claiming that ‘we owe Arabs nothing.’ As critics pointed out, he was so wrong that he was right.) In early modern England, this was their defining quality: the use of Arabic numerals was called ‘ciphering’, from ṣifr, the Arabic word for zero, or ‘algorism’, after the Persian astronomer al-Khwarizmi. In fact, as Otis points out, ciphering was not quite as radical a leap forward as is sometimes made out. The counting board, too, drew on a system of place-value that relied on a spatial representation of number – its idea of zero was an absence rather than an abstraction. What was new about Arabic numerals was their speed: writing out calculations was much faster than the cumbersome business of pushing counters around. John Palgrave, writing in 1530, recorded what was perhaps a familiar boast among the cipherers: ‘I shall reken it syxe tymes by aulgorisme [bef]or you can caste it ones by counters.’ But as the comment indicates, throughout the period people continued to use both. In a set of late 16th-century household accounts kept for Anne Stanhope, duchess of Somerset, the scribe used Arabic numerals for dates and quantities of goods, counting board dot diagrams for financial calculations in the margins, and Roman numerals to record prices.

The 16th and early 17th centuries yielded vital breakthroughs, both in theory and in practical application. The Scottish mathematician John Napier discovered logarithms, and also devised a calculating machine called ‘Napier’s Bones’, a set of rods inscribed with numbers which made it possible for users to perform complex multiplication and long division via the simpler operations of adding and subtracting. Edmund Gunter drew on logarithms to design measuring devices with trigonometric functions for navigation at sea and for land surveying.

These advances came on the back of a long push for better mathematical training. In 1573 Sir Thomas Smith, the Regius Professor of Civil Law at Cambridge, left a bequest to Queens’ College on the condition that undergraduates should not proceed to the BA ‘before that they be well expert in the parts of Arithmatique’. He endowed two lectureships for the purpose, with the stipulation that teachers should not lecture ‘as of a preacher out of a pulpit’, but rather ‘with a penn on paper or tables, or a sticke or compasse in sand or duste to make demonstracon that his schollers maie both understand … and also do it themselves’.

For the most part, however, those who wanted a better knowledge of mathematics had to seek it out. In 1662 Pepys, a Cambridge graduate, engaged Mr Cooper, the one-eyed mate of the Royall Charles, to teach him arithmetic (after a month of lessons together they fell to tinkering with model ships instead). There was a ready market for such instruction, which printers were keen to tap. The stationer Thomas Rooks reissued Hodder’s Arithmetick in 1667, explaining that ‘in this bad time of trade of Books, in less than ten months, I sold of them 1550 … now I present you with a 4th Edition.’ They were inexpensive – about 4s new or 6d secondhand – and a good investment: they could, after all, help you put your finances in order.

As Otis sees it, the writers of these textbooks were the guiding stars of the quantitative transformation. They refined the technical arithmetic of craftsmen, sailors and clerical workers into a science fit for gentlemen; in doing so, they also developed new cultural distinctions between kinds of knowledge. The mathematician John Kersey, reprinting a textbook by his friend Edmund Wingate in 1658, suggested that it would be useful for ‘Learners, as desire only so much skill in Arithmetick, as is useful in Accompts, Trade and such like ordinary employments … before any entrance be made into the craggy paths of Fractions, at the sight whereof some Learners are so discouraged’. Perhaps he had in mind readers such as Hobbes, who had griped publicly about the ‘scab of symbols’ littered through Wallis’s work on conic sections. Wallis gave a fractious reply: ‘Sir, they were not written for you to read, but for them that can.’ By the later 17th century it was embarrassing, in certain circles, not to know something about numbers. Edward Cocker addressed his 1678 textbook to ‘the pretended Numerists of this vapouring age’.

Through the mists, it is possible to make out the origins of a quantified political culture. In the 16th century, anxious Tudor governments sought to collect more information about their subjects, launching a proliferation of national inquiries, musters, assessments and surveys. With ever greater frequency, men on horseback came around asking questions that required some counting. How many acres lie under the plough, and how many homesteads have been abandoned? How many in the vill have £1 or more in goods, lands or chattels, and how do they get their money? In 1538, Cromwell mandated that every parish should keep a register to record ‘the day and year of every wedding, christening and burying made … and also there insert every person’s name that shall be so wedded, christened or buried’. People feared that it was a ruse for a new tax. The truth was subtly worse: it was an attempt to pin them down, to fasten social identity to irrefutable documentary evidence. Big data for the Leviathan.

The burial records would take on a life of their own. In an age of recurrent plague, there was a great deal of popular interest in mortality figures; with the advent of cheap print, lists of parish dead were published on bills sold for a penny apiece. By 1603, London plague bills had weekly print runs of as many as six thousand copies (for a city with a population of about 141,000), giving parish-by-parish totals of plague deaths, general mortality and new christenings. People were terrified of the bills, but they couldn’t look away. The preacher Francis Raworth gave a homily to the Providential zeroes: ‘For these twelve moneths and above, I finde there nothing but Ciphers: Ah Lord, how unthankful are we for such a blessing! when thou might’st as justly as suddenly, turn our Ciphers into Figures.’ Pepys used his new arithmetical skills to work out the weekly rate of increases in mortality. Matthew Mead, a Nonconformist minister, wished bitterly that ‘we had Weeklie Bills of such Sins.’

These skills would soon be put to more utilitarian uses. In 1662 John Graunt published Natural and Political Observations, a pioneering work of demography that tried to find out ‘how many People there be of each Sex, State, Age, Religion, Trade, Rank, or Degree’. From the bills of mortality, he ‘reduced several great confused Volumes into a few perspicuous Tables, and abridged such Observations as naturally flowed from them’. From what he called the ‘Mathematiques of my Shop-Arithmetique’, he estimated that there were six and a half million people in England and Wales; he was probably about a million over.

Graunt saw a clear link between mathematical ratios and political harmony. An understanding of the composition of the population, he wrote, made for ‘good, certain, and easie Government, and even to balance Parties, and factions both in Church and State’. William Petty, an official in colonial Ireland, took the idea further still. Calculating the English and Irish populations of Protestants and Catholics, he proposed a scheme of forced migration to achieve the correct ratio of righteousness. Exchanging 200,000 Irish Catholics for the same number of English Protestants, he argued, would create an Anglican majority in Ireland, while leaving a Catholic minority of less than 2 per cent in England.

Although Petty’s plan was never put into practice, his methods were influential. Gregory King, whose statistical work was later incorporated into Adam Smith’s The Wealth of Nations, claimed that ‘Mathematical Reasoning’ was essential to public discourse, a foundation of neutrality from which rational debate could proceed. Writing with his colleague Charles Davenant, King claimed that arithmetic was ‘not only applicable to Lines and Numbers, but affords the best means of Judging in all concerns of human life.’ Bureaucrats have always been revolutionaries in their own way.

After the Great Fire of 1666, enterprising Londoners established new schemes of home insurance; Nicholas Barbon’s company set premiums at 2.5 per cent of annual ground rent for brick houses, double for those built with timber. One scheme, the Amicable Contributors, insured more than 13,000 houses in the capital by 1708. These policies weren’t based on a statistical analysis of risk. The calculation of the premiums used a basic arithmetic of building material and property value; the companies were principally designed to generate profit for their shareholders. Each one maintained its own brigade, who wore colourful branded uniforms and affixed distinctive fire-marks to the buildings under their protection. They were widely suspected of refusing to extinguish fires in buildings they had not insured. By the end of the 18th century, the quantitative transformation had given way to what Ian Hacking called an ‘avalanche of statistics’.

Nowadays, critiques of quantification are ten a penny; it is easy to see the deficiencies of metrics when they are so often used against you. The attempt to count people can be invasive, even offensive to human dignity. A long-standing Christian tradition held that King David had sinned when he carried out his census of the Israelites. According to an early 17th-century commentary, ‘it belonged vnto God onely to number that which was innumerable.’ There has always been something about numbers that leaves people cold; a disaffection with their abstracting effects, the distance they seem to place between experience and the world. And there is an accompanying mistrust of people who can intuit with them. Edward Worsop, who in 1582 wrote a book pointing out geometrical errors made by land surveyors, complained that people ‘which have no understanding in mathematicall arts’ were ignorant and jealous of those who did; ‘When they see a fellow … especially if he be studious, and given to solitarines, [they] say in way of scorning, he hath a mathematicall head.’

What mathematical heads can sense – and what the rest of us would do well to remember – is that at the bottom of number there is a holy mystery. Can something truly be the same as itself? (If so, it can be counted.) Numbers present us with the oscillation between likeness and difference, mathematics with a language that might bridge the human and divine. Adelard of Bath, who pioneered the translation of Arabic mathematical works into Latin in the 12th century, wrote that ‘all visible things are subject to number … [which] is latent in the reality of things themselves.’ He quoted Xenocrates: ‘The soul is number moving itself.’ In premodern England, this soul was alive and kicking, present in the fingers, the grain of the tally, the geometrical line drawn in the sand. Number had a texture. Three barleycorns laid end to end made an inch. Twelve inches to the foot, twelve pence to the shilling, twelve apostles, twelve jurymen. The ‘long’ hundred was 120 for herring, but 112 for measures of tin. Four saltfish made a warp.

Ideas cast from objects take a long time to die. Twenty, or thereabouts, is still a score – a notch on the old tally. When in 1783 the Exchequer finally replaced tallies with a system of paper cheques, it gave them indented edges that mimicked the form of a stock and foil. Well into the 20th century, dockers and miners continued to be issued with brass ‘tallies’ as a means of clocking in at work. Some numbers are still odd.

Send Letters To:

The Editor
London Review of Books,
28 Little Russell Street
London, WC1A 2HN

letters@lrb.co.uk

Please include name, address, and a telephone number.

Letters

Vol. 46 No. 22 · 21 November 2024

Tom Johnson quotes the early modern mathematicians Robert Recorde and John Wallis, and remarks on Wallis’s invention of the lemniscate symbol to denote infinity (LRB, 24 October). Recorde had a far greater influence on the way we write mathematics: his book The Whetstone of Witte (1557) introduced the = symbol for equality, with the rationale that ‘noe 2 thynges, can be moare equalle’ than a pair of parallel lines. The book is also notable as the first in English to use the modern plus and minus signs. (Recorde also coined the term zenzizenzizenzic to mean an eighth power, which survives today as the solitary entry in the OED with six ‘z’s.)

Artie Prendergast-Smith
Loughborough

Vol. 46 No. 21 · 7 November 2024

Tom Johnson is right to say that the concept of zero reached Europe via Arab scholars, but its invention dates back considerably further, to Indian mathematicians of the third century BCE (LRB, 24 October).

Dave Morris
London SW12
