Like other beliefs and forms of behaviour to be met with in this country in the course of the present century, much of the Thatcherite approach to social policy was imported from the United States. Joe Rogaly commented recently that if under Mrs Thatcher 10 Downing Street was attached ‘by threads of steel’ to right-wing think tanks such as the Centre for Policy Studies, the latter were attached ‘by underwater cable’ to their counterparts in New York and Washington. If the steel threads endure under Mr Major, what messages will be coming down the underwater cables?
In the Eighties the Reagan Administration exploited the public disillusionment with government that had grown steadily throughout the previous decade and a half. By the end of the Seventies, only 25 per cent of those surveyed said they would trust the government in Washington ‘to do what is right’. A basic proposition of the Reagan years was that the Federal Government was trying to do too much. It was consuming too large a share of national resources. It should stick to ‘delivering the mail and protecting the coasts’, as the conservative slogan has it.
Linked to this were several other propositions. First, governments are inherently inefficient; they waste whatever resources they take or are given. Secondly, people make better decisions for themselves than governments can make for them. Thirdly, governments are ineffective – especially in matters of social welfare. Many, if not most, governmental attempts to deal with social problems have failed. (Social science research, unfortunately, was often unable to demonstrate the contrary.) American public confidence in many programmes became so frail that they were liable, in the words of one academic, to be ‘undermined by anecdote’. ‘In the war against poverty, poverty won,’ said Ronald Reagan.
American conservatives argued that government welfare programmes created a whole set of ‘perverse incentives’ which, directly or indirectly, actually made things worse. This thesis was popularised in the influential book Losing Ground, by Charles Murray, published in 1984. Murray noted the inconclusive findings of social research. He pointed out that since 1965 generous rates of social security benefit had been available to workers and non-workers alike. He claimed that these benefits had encouraged young men to withdraw from the labour market, young women to become pregnant outside marriage and young fathers to decline to marry the mothers of their babies. In the black ghettos, young men seeking jobs were not encouraged to do so by their families, friends or neighbourhoods. Moreover, government policies actually succeeded, in the course of time, in changing social norms: ‘For the first time in American history it became socially acceptable within poor communities to be unemployed, because working families too were receiving welfare.’
Though some liberals, like Daniel Patrick Moynihan, derided such sweeping assertions, Murray’s thesis appealed both to common sense and to prejudice. It justified cuts in welfare spending and reductions in taxes and in state intervention. It also presented a pragmatic, morally neutral case for penalising idleness and sexual promiscuity. In January 1985 a Federal official told a Congressional committee: ‘Today poverty has been institutionalised by the Government’s anti-poverty programs.’ In mid-1990, the Bush Administration announced that nothing more could usefully be done to fight poverty. ‘We have decided to abjure a glitzy, splashy high-profile announcement of new programs and a brand-new strategy,’ an official said. ‘We concluded that there were no obvious things we should be doing that we weren’t doing that would work.’ On this a staff member at a right-wing US think tank commented that the War on Poverty was already too glitzy and splashy: it had been ‘a disaster, ruining millions of lives’.
Rhetoric of this kind has become familiar in Britain, too. It has crossed the Atlantic spontaneously, like seeds borne on the wind: a recent article in the Daily Telegraph used, almost in passing, the phrase ‘welfare-induced poverty’. But it has also been introduced deliberately. The Manhattan Institute – where for some years Charles Murray was a visiting fellow – has organised seminars in Britain with the Centre for Policy Studies and others. At these seminars there seems to have been general agreement with such propositions as that the answer is to compel young fathers to take more responsibility for their children. Murray himself was brought to Britain last year by the Sunday Times. He wrote an article for them, and identified in Britain the same problems as in the United States, the same misconceived responses, the same build-up of ‘perverse incentives’ exacerbating the problems they were intended to solve, the same signs of a developing, welfare-dependent ‘underclass’.
But even as such views are being peddled in Britain, in post-Reagan America they are coming under severe challenge. Once-dominant conservative orthodoxies have started to look tarnished in the light of the facts. One set of facts is the legacy of the Eighties. Not only liberals are distinctly embarrassed by some aspects of contemporary US capitalism: the Savings and Loan scandal, the collapse (sometimes amid criminal charges), or drastic contraction, of some of the glitzier financial institutions, the involvement in some of these episodes of leading politicians. New Yorkers are starting to panic in the face of media stories claiming that the city is on the verge of collapse. The phrase ‘private affluence and public squalor’ seems to hang in the air.
In the field of health care, the failings of the basically private US system are front-page, tabloid material. The United States spends over 11 per cent of GNP on health care, nearly twice as much as ourselves and more even than its wealthy neighbour, Canada. But this does not mean that the average American receives health care twice as good as the average Briton. Competition among providers in a system where the bills are paid mainly by insurance companies (whose bills are in turn paid mainly by employers) has meant remarkably high standards of care for those insured and thus with access to care. But these high standards have to be paid for; since the late Sixties the costs of hospital care have increased twice as fast as the cost of living. ‘Today’s hospital system is extravagant, visible, flamboyant, exclusive and money-oriented,’ writes the author of the best recent book on the subject. Meanwhile over thirty million Americans have no health insurance at all.
These headline stories have drawn attention to underlying social problems. Infant mortality among American blacks is a little better than in Hungary, a little worse than in Costa Rica. Child poverty and homelessness are increasing. Among homeless children, once-declining diseases like whooping cough and tuberculosis are becoming common again. Between 1977 and 1990 the real income of the poorest 20 per cent of the population fell by 9 per cent. The real income of the lucky richest 5 per cent went up by nearly 53 per cent. In The Politics of Rich and Poor the political analyst Kevin Phillips discusses the growing financial inequality in American society. As one of his chapter epigraphs he quotes Business Week, no less: ‘That the Great Divide between rich and poor in America has widened is perhaps the most troubling legacy of the 1980s.’ All this has prompted liberals to reassess the Sixties and Seventies, including the War on Poverty. It can convincingly be shown that where the greatest efforts were made, poverty was reduced. The elderly were the main beneficiaries from higher social expenditures, mainly on health care and social security. In the Sixties there was more poverty among elderly people than among the population at large. There is now less. Between 1965 and 1975 infant mortality declined by 33 per cent.
The past few years have also seen an increasingly effective challenge to the hypotheses of Charles Murray and others. Murray’s conclusions have been repeatedly shown not to follow from the facts which he cited. Take his claim that the growth in one-parent families was primarily caused by the rise in benefit levels. The number of children in one-parent families certainly rose steadily during the Sixties, Seventies and Eighties. But the levels of benefits payable to such families stopped rising in 1972, and fell from 1976. Levels of the main relevant benefit, Aid to Families with Dependent Children (AFDC), vary widely, moreover, between States. It should follow from Murray’s thesis that there would be more one-parent families in the States where AFDC benefits were highest. There is no such correlation. Indeed, many States with the lowest benefits have the largest percentages of children living in one-parent families.
Such analysis knocks the legs from under the argument that since social problems have worsened despite government programmes, the next step must be to scrap the programmes. Emphasising the Bush Administration’s reluctance to increase educational spending, one of Bush’s advisers has said: ‘We’ve got to shift the debate from resources to results.’ On the contrary, replied a leading state education officer, ‘the issue should not be reforms v. more money, as if spending more money is a reform strategy or as if initiating high-pay-off reforms can be done for free. They’ve got to get off this argument that money doesn’t count. Money well targeted is crucial.’ Much of the historical background to all this is documented by Karger and Stoesz. Their conclusion is ruefully pessimistic. They note that ‘the welfare state has served as the perfect “fall guy” for those who are displeased with government social programs,’ and foresee a broadly conservative orientation to US social welfare policy for the rest of the century.
Two other recent publications, confronting the same facts, reflect and may reinforce the change of mood described above. Marmor, Mashaw and Harvey’s excellent book begins with the statement: ‘This book has a simple message: America’s social welfare efforts are taking a bum rap.’ They continue: ‘The vision of social welfare policy generated during these two decades has often been misleading and misdirected – indeed riddled with myths.’
Part of the problem as they see it is the exaggerated claims made by welfare legislators for their programmes and for subsequent ‘reforms’. These boasts led to exaggerated expectations, and to inevitable disappointment. This provided fertile soil for the counterclaims of small-government conservatives such as Murray, whose analyses are once again convincingly demolished. The authors argue that the need is to fill the gaps in government provision, not to dismantle what is there.
A new quarterly, The American Prospect, has much the same concerns. ‘America now has another agenda on which its security depends: to rebuild our nation’s strength by investing in the capacities of our economy and our people.’ And: ‘That agenda requires the creative use of government for common purpose. After a decade of celebrating the private virtues, we need to remember that our system depends equally on the vitality of our public institutions and public life.’ Such statements suggest that in the United States a new orthodoxy is being born. A role may be rediscovered for the Federal Government which goes beyond protecting the coasts and delivering the mail. The implications for this country are clear. American tobacco companies are now vigorously marketing in the Third World the cigarettes they can no longer sell at home. We should be cautious lest, in the same way, we find policy-related concepts for which there is no longer an American market being peddled in Britain.
One such concept is ‘underclass’. With a familiar transatlantic time-lag, the term ‘underclass’ seems to be establishing itself here just as it is being tested to destruction in the United States. The term was first popularised there by a journalist in 1981. He wanted a term – as a later commentator put it – to embrace ‘chronically jobless men, perennial welfare mothers, alcoholics, drug dealers, street criminals, de-institutionalised schizophrenics, and all the other walking wounded who crowded New York City’s sidewalks in the late Seventies’. The term was further defined, and given academic respectability, by the sociologist William Julius Wilson. Taking broadly the same groups, he characterised them as ‘that heterogeneous grouping of families and individuals who are outside the mainstream of the American occupational system’. Wilson argued that as the middle class and the upwardly mobile moved out of inner-city ghettos, those left behind became isolated from the networks that lead to jobs, and from role models. They could not cope with the difficulties of finding employment. Unemployed young men had less to offer as husbands and fathers. Their girlfriends were left to bring up their babies by themselves.
Charles Murray’s more recent definition, in his article on Britain, resembled Wilson’s, but with a subtle variation. Wilson’s underclass was outside the system (and found it impossible to get back in). Murray’s underclass is ‘a subset of poor people who chronically live off mainstream society (directly through welfare or indirectly through crime) without participating in it’ [my italics]. Shades of MacHeath and Fagin! Murray does not actually state that welfare policies cause crime. But he juxtaposes an alarming account of rising crime rates, garnished with anecdote, with surveys of recent British social legislation, illegitimate births and youth unemployment. The inferences about cause and effect are easy to draw. In response to this kind of usage, Wilson has now virtually forsworn his own term. The conclusion for us must be that if we wish to use it, we should be very clear how we define it.
Moreover, for conservatives like Murray, a large part of the problem is government itself. It is a common wish to preserve or return to the past – or rather, to an imagined past, in which things were somehow better and simpler. ‘Dream of London small and white and clean, the clear Thames bordered by its gardens green,’ as William Morris wrote. Such dreamers are bound to feel that our own actions have caused some of today’s ills. The most specific kind of action identifiable is government policy.
One effect of British social policies in the Sixties and Seventies, Murray argues, using almost the same language that aroused liberal derision in the United States several years ago, was that ‘the rules of the game changed fundamentally for low-income young people, changing behaviour in their wake.’ The implication is, of course, that if government through its programmes has bent people’s behaviour in directions of which we disapprove, then government – by abolishing existing programmes or introducing new ones – can bend behaviour back again.
This does not, however, seem to be the case. As convincingly demonstrated by Marmor and his colleagues, American experience shows no consistent causal relationships between, for example, welfare policies and changing patterns of child-bearing or family formation – let alone between welfare policies and underlying social norms. It seems clear that most feasible policy changes will have little or no impact on family structure or on ‘welfare dependency’, or on public views about such matters. Another American commentator observes of the ‘make-em-suffer’ strategy of social policy that, if its goal is to discourage single motherhood, it has failed: ‘Policy cannot recapture the past. The energy spent lamenting the emergence of new family forms could more productively be directed towards ensuring young women with children a decent chance to avoid a life of poverty.’ And, as Marmor and colleagues might put it, towards thinking realistically yet positively about the role of social policy. This kind of argument is now increasingly heard in the United States. If the underwater cables are still in place, they should be carrying the message to those who need to know it over here.