Freedom itself is under threat as the ever-growing State damages liberalism and individualism, argues historian and IPA Adjunct Fellow, Bradley Bowden.
The success of modern liberal-democracies has been built upon three pillars: democracy, economic and political liberalism, and an embrace of an energy-intensive economy. Many assume democracy is the key element of the three, as millions laid down their lives in World War II in the name of democracy. Repeatedly, however, democratic majorities have heeded the siren call of authoritarianism in one form or another; for example, Germany in the 1930s, post-Soviet Russia, and the Middle East in the wake of the Arab Spring of 2010.
In fact, liberalism—as a system of legal protections and social traditions that supports individual choice and liberty—has been most decisive for the modern West’s success. Liberalism underpins individualism, creativity and entrepreneurship. It is, however, also the most elusive element, hardest to maintain, and most easily pushed aside. We need to look at the factors in modern society which work against the liberal tenets of individualism and liberty, including the growth of the university-educated bureaucratic and managerial class associated with the expanding role of government.
In looking at the rise-and-rise of the State and its pernicious effect on liberty, we will also put to rest once and for all the idea that a ‘neo-liberal’ consensus has ruled in the West: the fiction spread by the left that somehow government has been in sustained retreat in recent decades. The opposite is true, and historical perspective also shows just how staggering the growth of the State has been over the past 120 years in particular. This also drives the rise in numbers and power of the bureaucratic-managerial class, with all the negative effects that has on democracy and freedom.
The appeal of liberalism and its universal twin, individualism, should be self-evident. For the individual, however, freedom necessarily entails risk associated with the choices one makes, whether they relate to job selection, investment decisions, or provision for one’s retirement. Conversely, collectivism—such as clan or tribal-based decision-making, socialism, and fascism—frees the individual not only from the risk involved in their choices but also from choice itself. Collectivism does not, however, negate the need for choice; decision-making is merely transferred to a collective entity. In modern societies, invariably this is the State.
For liberals around the world, the exercise of power by governments during the COVID-19 pandemic has come as an undoubted shock, threatening the freedoms and individual liberties most of us hold dear. As with most State-based responses, the promise of ‘security’ has come at the expense of not only risk but also choice. Those in secure State and quasi-State employment—such as at universities, schools, or in health work—have prospered. Small and medium-sized businesses have been thrown to the wolves. Unfortunately, what we have witnessed in the COVID-19 pandemic is not an aberrant outcome. Rather, it exposes a monster that has steadily grown in size and power even as the political left bemoaned the supposed ascendancy of ‘neo-liberal’ virtues over those of the ‘social-democratic’ State.
FROM LAISSEZ-FAIRE TO STATE SUPREMACY
Prior to World War I, the principles of self-reliance and laissez-faire (allowing the market to work) were more than theoretical abstractions. They were the lived reality across the liberal-democratic societies of Northwest Europe, North America, and Australasia. To the extent the State intervened in the economy it typically involved tariff protection and, in the case of Australia, immigration controls. Invariably, such interventions were directed towards fostering private-sector manufacturing rather than an expansion of the State’s own economic footprint.
On the eve of WWI, as Figure 1 indicates, governments in New World societies such as Australia and the US emphasised self-reliance to an even greater degree than Old World nations such as Britain and Sweden. In the US and Australia, governments were responsible for less than $3 of every $100 of economic output. Even in Britain, however, government spending amounted to slightly less than 10% of Gross Domestic Product (GDP) as late as 1910.
Britain pioneered the interventionist welfare state.
In the popular imagination Sweden and other Scandinavian countries are believed to be the birthplace of the modern welfare state. In truth, however, Sweden long remained loyal to a laissez-faire economic model. Although government spending as a share of GDP rose from 10% to 24% in Sweden between 1930 and 1940, the country thereafter sharply reduced its dependency on government spending. In 1960, government spending represented only 19% of Swedish GDP, a total lower than in Australia (24%) and the USA (30%) at that time.
Among the Western societies that avoided a descent into totalitarianism, Britain–not Sweden–was the true pioneer of the interventionist welfare state.
By 1950, as the Marxist historian Eric Hobsbawm approvingly declared, Britain boasted “the most State-planned and State-managed economy ever introduced outside a frankly socialist economy”. In addition to legislating in favour of a cradle-to-grave system of welfare protections, the Attlee Labour government (1945-51) nationalised coal mining, road transport, the railways, steel production, and health care. Most private homes constructed in the immediate post-War years were also government-built. By the early 1950s, more than one-fifth of all homes in England and Wales were publicly-owned and (in addition) nearly one-third were subject to rent controls. Government spending rose from 10% of GDP in 1910 to 37% in 1950. Although Conservative governments re-privatised some industries (notably road transport) in the 1950s, the State’s economic footprint grew ever larger.
Unlike Britain, where government spending already represented 32% of GDP in 1940, the USA and Australia were still societies with a comparatively small State sector on the eve of World War II. In the USA, government funding accounted for 10% of GDP in 1940, and it was only very slightly less in Australia. The vast literature that suggests Franklin Roosevelt’s ‘New Deal’ of the 1930s represented a gigantic Keynesian ‘stimulus spending’ program is thus founded in myth: a conveniently self-serving myth for the post-War generation of Keynesian economists. Yes, Roosevelt’s administration in the 1930s doubled Federal spending compared to the previous Hoover government, which had, in turn, doubled its spending in response to the Wall Street crash. Such increases, however, occurred from a low base. The Hoover and Roosevelt administrations also offset much of their increased spending with increased taxation.
As Price Fishback notes, the resultant deficits were minuscule. State governments also expanded their range of taxes, even as many cut back spending. In consequence, by the late 1930s the State government sector as a whole was running a surplus.
In the USA there was of course an expansion in the size of the State because of World War II, but then there was a rapid retrenchment in the wake of victory. There, as in Australia and Sweden, the large-scale expansion of the State sector was primarily a post-1950 phenomenon. In 1950, government spending in the USA and Australia represented 15% and 13% of GDP, respectively. Twenty years later the totals were 32% and 28%: an effective doubling. Although the economic footprint of the State sector of the USA grew only modestly to 38% between 1970 and the early 1990s, in Australia the big-spending policies of the Whitlam (1972-75) and Hawke-Keating (1983-96) governments had an outsized effect. By the early 1990s, government spending represented 40% of Australian GDP; nearly half again as big as the 1970 figure.
Following the election of the British Thatcher government (1979-1990), Western societies witnessed for the first time a sustained attempt to limit the State’s seemingly inexorable rise. In Britain, privatisations were associated with a marked shrinking of the government’s economic footprint. Between 1980 and the early 1990s, government spending as a share of GDP fell from 53% to 38%. In Australia, under John Howard (1996-2007), and in the USA, reductions were more modest. In Australia, government spending’s share of GDP fell from around 40% in the early 1990s to 37.5% in 2000, and in the USA declined from 38% of GDP to 34%. In ‘social democratic’ Sweden, government spending was cut back from an unsustainable 63% to ‘only’ 50%.
Spending on ‘social expenditures’ had detrimental effects.
Superficially impressive, the ‘neo-liberal’ reforms of the 1980s and 1990s disguised a failure to fundamentally alter society’s growing dependence on the State for a range of social benefits and subsidies, ranging from child care to various ‘green’ initiatives including rooftop solar panels and wind farms. Most of the Thatcherite-era achievements were associated with the privatisation of often unprofitable industries, such as railways and airlines.
Temporary reductions in government debt levels were also obtained by selling off highly-profitable assets including power utilities. At the same time, however, as Figure 2 indicates, governments maintained or increased ‘social expenditure’ (on welfare, education, and health), even if these services were often delivered through outsourced third parties.
In Australia, government social expenditure rose from 10% of GDP in 1980 to 17% on the eve of the COVID-19 pandemic. In the USA it rose from 16% to 20% during the same period. In France, which surpassed Sweden as the world’s great ‘welfare state’, it rose from 20% to 31% of GDP.
The expansion of government spending on ‘social expenditures’ had many effects, all detrimental. Almost everywhere, government spending began to creep upwards again. By 2019, Australian government spending represented almost 44% of GDP; a gain of six percentage points on the 2000 figure. Similar trends were apparent in Britain and the USA. Taxes also tended to creep upwards. By 2019, taxes amounted to a third of British GDP, a total little different to when Thatcher was first elected. In Australia, taxes consumed 29% of GDP in 2019, a higher figure than in 1990. Rarely, however, were these tax increases commensurate with increases in spending.
The result, as Figure 3 indicates, was a steady growth in government debt. On this score, Anglosphere countries were by 2020 performing as badly as, if not worse than, their European counterparts. For while places like Britain and Australia spent less than nations such as Sweden, they also taxed less. By 2020, Australian government debt reached nearly 100% of GDP; more than three times its level when Kevin Rudd was elected to office at the end of 2007.
As the State once more increased its economic footprint, and more-and-more people relied on government ‘social expenditure’ for jobs and income, the inevitable result was a collapse in productivity. Across the OECD, the last 20 years have been lost years in terms of multifactor productivity (the benchmark measure incorporating improvements in the productivity of labour and capital). France, Canada and Australia experienced negative productivity growth in five of the 13 years from 2007 to 2019. In Britain, productivity went backwards in six of those years. Even in the USA—where the state’s economic footprint is less than elsewhere—productivity declined twice during that period.
THE STATE’S HANDMAIDENS
In his recent study, The Coming of Neo-Feudalism, the American demographer and social critic Joel Kotkin argues that the emergence of the salaried middle class is a fundamentally new phenomenon, heralding what he refers to as the ‘clerisy’, a ‘new form of aristocracy’. (Read a review of his book by IPA Director of Research, Daniel Wild, in the Spring 2020 edition of the IPA Review). A quarter of a century earlier Christopher Lasch in his The Revolt of the Elites and the Betrayal of Democracy put forward a similar argument. The ‘new professional and managerial elites’ associated with jobs in academia, government and the corporate world, Lasch recorded, represented a fundamentally ‘new class’, whose “livelihoods rest not so much on the ownership of property as on the manipulation of information and professional expertise”. The IPA’s Centre for the Australian Way of Life is extending the analysis to our own country.
While the social weight of the ‘new’ middle class is, no doubt, greater than ever, its ascent had begun and was noticed much earlier than perhaps even Lasch and Kotkin realise. When the modern bureaucratic State first emerged during the Age of Absolutism (c.1550-1789), the university-trained professional was from the outset the State’s handmaiden. Lacking independent means of wealth, the fate of the professional bureaucrat and the State that they served were symbiotic. Neither thrived without the other.
Universities groomed graduates for careers in the civil service.
Reflecting on the role of the early-modern universities, British historian Henry Kamen (1936-) observed they primarily existed to serve the state. Their training in canon and civil (statute) law groomed university graduates for a civil service career. At the University of Salamanca, the great recruiting ground for the Spanish bureaucracy, almost all of the two to three thousand students who graduated each year between 1570 and 1640 boasted degrees in canon law. Virtually all went on to jobs in the church or State bureaucracies. At Oxford and Cambridge, canon and statute law were also the most popular courses by a considerable margin. Typically, this training in canon and statute law made university graduates near-useless when it came to representing clients before a court. This is why barristers in the common law countries were, until quite recently, trained in the various Inns of Court.
Conversely, a university law degree became almost mandatory for those interested in a political or civil service career. At the University of Salamanca, one chair in canon law had to be filled 61 times in the course of the 17th century as the beneficiaries of appointment moved across to the Spanish bureaucracy.
Significant as salaried professionals were to the rise of the modern State, they were a numerically insignificant force before 1850. Even after that, their rise to numerical and social significance initially depended not on State bureaucracies but rather on the expansion in the private sector of what business historian Alfred Chandler (1918-2007) referred to as the multiunit enterprise; an enterprise that boasted its own purchasing, marketing, production, and internal audit functions. By 1860, Chandler estimated, America’s railroads “probably employed more accountants and auditors than the Federal or any State government”. In Britain, this new class of private sector managers, administrators, and the like grew almost ten-fold from 1851 to 1911. On the eve of World War I, Hobsbawm estimated, this salaried middle-class (Kotkin’s ‘clerisy’) was already numerically more significant than Britain’s old economically-independent middle class of shopkeepers, small-business owners and self-employed professionals (which Kotkin labels the ‘yeomanry’).
Until the 1960s, the political inclinations of the professional middle class were almost always conservative and pro-business. Indeed, the tendency of the professional middle class to vote for the right, and the working class to vote for the left, appeared an iron law of politics. Today, as most understand, the reverse generally applies. What explains this shift? The short answer is that the university-educated professional—whether in the public or private sector—has reverted to primarily being a handmaiden of the state. As with the growth of the 16th century Absolutist state, the expansion of State ‘social expenditures’ demanded an ever-greater professional workforce in the State’s direct or indirect service.
The shifting attitudes of the professional middle class also reflect changes in its educational profile. In 1970, as indicated in Figure 4, the university-educated amounted to only 11% of the adult population, even in the USA. In Australia, Britain and France they represented 6% to 9% of the total. Salaried professionals as a class were much larger, but without a university degree any claim to intellectual and social leadership was threadbare. As the university-educated cohort grew exponentially after 1970, however, this picture changed in ways that made the university-educated less sympathetic to the worlds of business and production.
Reflecting on the growing predilection of France’s university-educated for postmodernist ideas—a tendency mirrored in Australia and other Western societies—Harvard sociologist Michèle Lamont argued that ‘consumption’ of such philosophies represented a “cultural produit de luxe” for a class of people for whom educational advancement was no longer reflected in significantly higher incomes. Barely “accessible even for the highly-educated”, the capacity to discuss such ideas with any degree of competence, Lamont argued, set one off from the less well educated. That is, even when the truly astounding incomes accrued only to the higher ranks (and those salaries are astounding, and appalling), all of the university-educated clerisy could at least assert their relative status. Such trends today are a near-universal feature of the ever-growing class of university-educated professionals.
The clerisy demarcate themselves by embracing ‘woke’ political causes.
Whereas their grandparents had distinguished themselves from the blue-collar working class by voting conservative, today’s members demarcate themselves from their educational inferiors through an embrace of ‘woke’ political causes. (Read my own discussion of the intellectual deficiencies of postmodernism in the December 2018 edition of the IPA Review).
As the university-educated grew in number, they also increasingly found jobs not just in the public sector but in private sector jobs subsidised by the taxpayer (such as in health) and in private sector jobs justifiable only because of government regulation (such as HR, diversity, compliance). This trend is apparent in virtually every Western economy, setting those who benefit from the expansion of the State (welfare recipients and the university educated) against those who suffer from increased taxation and lower productivity (everyone else).
In the US in 2019 there were far more people employed as professionals in State-subsidised education and health sectors (18.5 million) than there were in either manufacturing (15.1 million) or construction (8.4 million). Collectively, those in professional jobs (56.6 million) easily outnumbered the combined total found in manufacturing, construction and transport (30.2 million). In Australia in August 2020, the number employed in public administration (836,000) easily outnumbered those employed in manufacturing (739,000), construction (745,000) and hospitality (671,000), considered individually. Collectively, those in the latter three industries (2.1 million) barely exceeded the combined employment found in public administration and education (1.9 million).
Invariably, modern States—whether the Soviet Union, Britain in the 1950s, or Australia in 2021—justify the extension of their power by claiming to promote greater equality and ‘social good’ over ‘market outcomes’. Without fail, however, they produce greater levels of inequality, not less. The reason for this paradoxical outcome is clear. A welfare cheque seldom provides a fair exchange for a job lost in a slowing economy.
Even economist Thomas Piketty recognised this in his much-cited Capital in the Twenty-First Century (2013). As Piketty conceded, the main causal factor behind increased inequality over the past few decades is a “return to a regime of relatively slow growth”. Under this regime, Piketty observed, economic mobility is inevitably curtailed. In consequence, wealth concentrates in the hands of two groups, whom Piketty describes as a new class of ‘supermanagers’ and those benefitting from ‘hyperpatrimonial’ outcomes, such as inherited wealth.
The greatest problem associated with the rise-and-rise of the modern State, however, is not economic. Rather, it is found in the fundamental threat to the primacy of the individual and his or her autonomy.
In any society at any given point in time, there is typically little unanimity as to moral values or personal objectives. Most would place the economic wellbeing of immediate family at the centre of their moral universe, and liberalism allows a great deal of latitude (once called tolerance, when that was a virtue) in the beliefs of different people and groups in society. When, therefore, a government makes one set of concerns and moral values the guiding light for government interference in the economy—such as we currently witness in the global embrace of a ‘Net Zero by 2050’ objective—it does so at the expense of other possible moral claims.
In the process, one set of moral values is legitimised, while others are delegitimised. One group, those who identify their interests with those of the State, are made to feel morally virtuous. Those who do not are demonised. So it was in Joseph Stalin’s Russia pursuing collectivisation, and so it has been with Victorian Premier Dan Andrews pursuing eradication of COVID-19, where the clerisy in the media and the commanding heights of social media made sure dissenters could not merely be wrong; they had to be bad.
Thus, of the three pillars of the successful nation-state, it is not strictly speaking democracy which is under most threat. The clerisy, through its dependence on and effective control of the State, can dominate society and the bounds of its morality so it at least appears the will of the people is being done. A ‘majoritarian’ legitimacy suffices for even the most illiberal measures, such as Vaccine Passports.
True liberalism is under sustained challenge.
Even the forms of political liberalism can survive, since the clerisy realises its power survives undimmed regardless of any change in the political party occupying the Government benches. The centre-right may continue the trend of the ever-growing State, perhaps at a slightly slower rate, but no fundamental challenge will be allowed.
What is under sustained challenge is true liberalism, built on individualism, self-reliance, personal autonomy, individual conscience, and liberty. The ever-growing State is reducing the room not just for choice and action in the exercise of liberty, but even the scope to claim the moral value of doing so.
That the system is economically unsustainable—turning its back on entrepreneurship and even the harnessing of energy as a source of prosperity—only means that any reckoning and reversal will necessarily be painful. It will require leadership from those who still cleave to the ideals of liberty.
Bradley Bowden is an Adjunct Fellow at the Institute of Public Affairs and a Professor in Griffith University’s Griffith Business School.