In this series of essays on barriers to scientific progress, we’ll examine the many interesting points made by Michael Bhaskar in his book Human Frontiers: The Future of Big Ideas in an Age of Small Thinking.
As Bhaskar writes:
Go back to the lives of your great-great-great-grandparents. They were probably alive towards the middle of the nineteenth century. Like you, they had hopes, fears and dreams. They craved what we crave: security, love, entertainment. Yet their world, just a generation or two out of reach, was more dangerous, backbreaking, boring, precarious and limited than anything we in the West ever experience, coronavirus and its aftermath included. In the mid-nineteenth century, buildings were often wooden and rudimentary. In cities especially, this meant devastating fires were common. The boomtown of Chicago was almost eradicated in a vast urban conflagration in 1871. Sewage was a ubiquitous disease-carrying presence, in the home and out. Animals were everywhere, their dung clogging the streets. Those animals were scrawny and ill fed: a cow produced 1000 lb of milk a year, compared to 16,000 lb today. Clean water and unspoiled food, monotonous and meagre though it might be, were rarities. Disease was rife and little understood. Between a fifth and a quarter of all infants died before their fifth birthday. Unimaginable personal tragedy was an unexplained but near inevitable part of life … Most people never moved far from where they were born, and never encountered much outside their own small cultures. The railway had started to change that, but for the most part travel was, as it had been for millennia, dictated by the pace of hoof and sail. Likewise, although there were a few industrial jobs (in grim, dark, limb-mauling places), most people worked the land, using technology and methods that again changed at glacial pace. Illiteracy was the norm, access to knowledge and entertainment media scant. In the evenings people relied on the faint, expensive and polluting light of bad candles or whale oil. Most went to bed when it got dark … Our recent ancestors, in some ways so close and so similar, inhabited a very different world. 
But it had already begun changing at a blistering pace. In a little more than a generation, their sons and daughters and grandchildren would be living in a landscape that is recognisably modern. For the first time in history there was exponential population growth: the US started the nineteenth century with just 5.3 million citizens but finished with 76 million, bigger than any European country save Russia. In the space of a few decades homes were transformed from quasi-medieval hovels into the ‘networked’ house: clean hot and cold water ran in and out. Electricity powered a host of ‘electric servants’ that began to take over some of the back-breaking domestic labour which had dominated the lives of women. The lightbulb and the motor car were both effectively invented in 1879. At the end of the nineteenth century they were still novelties; twenty years later, both were produced by the million. And it wasn't just cars and lights: telephones, aeroplanes, canned and processed food, the modern corporation and production methods, radio, refrigeration and the first plastics all burst into society around this time. No wonder there was an unprecedented revolution in productivity. Inventions like Cyrus McCormick's threshing machine led to a 500 per cent increase in output of wheat per hour. Isaac Singer's sewing machine meant the time spent making a shirt was reduced from over fourteen hours to just one hour and sixteen minutes. Big ideas were conceived and executed at what felt like an accelerating pace.
In his fascinating book Civilization: The West and the Rest, historian Niall Ferguson attributes this dramatic progress in the West to what he refers to as “six killer apps,” a phrase he uses to describe the six fundamental reasons why Western civilization led the way in this regard:
Those who decry “Eurocentrism” as if it were some distasteful prejudice have a problem: the Scientific Revolution was, by any scientific measure, wholly Eurocentric. An astonishingly high proportion of the key figures – around 80 per cent – originated in a hexagon bounded by Glasgow, Copenhagen, Kraków, Naples, Marseille and Plymouth, and nearly all the rest were born within a hundred miles of that area. In marked contrast, Ottoman [Empire] scientific progress was non-existent in this same period … The best explanation for this divergence was the unlimited sovereignty of religion in the Muslim world. Towards the end of the eleventh century, influential Islamic clerics began to argue that the study of Greek philosophy was incompatible with the teachings of the Koran. Indeed, it was blasphemous to suggest that man might be able to discern the divine mode of operation, which God might in any case vary at will. In the words of Abu Hamid al-Ghazali, author of The Incoherence of the Philosophers, “It is rare that someone becomes absorbed in this [foreign] science without renouncing religion and letting go the reins of piety within him.” Under clerical influence, the study of ancient philosophy was curtailed, books burned and so-called freethinkers persecuted; increasingly, the madrasas became focused exclusively on theology at a time when European universities were broadening the scope of their scholarship. Printing, too, was resisted in the Muslim world. For the Ottomans, script was sacred: there was a religious reverence for the pen, a preference for the art of calligraphy over the business of printing. “Scholar’s ink”, it was said, “is holier than martyr’s blood.” In 1515 a decree of Sultan Selim I had threatened with death anyone found using the printing press. This failure to reconcile Islam with scientific progress was to prove disastrous. Having once provided European scholars with ideas and inspiration, Muslim scientists were now cut off from the latest research.
If the Scientific Revolution was generated by a network, then the Ottoman Empire was effectively offline.
Ferguson continues:
Why did the West dominate the Rest [of the world] and not vice versa? I have argued that it was because the West developed six killer applications that the Rest lacked. These were:

1. Competition, in that Europe itself was politically fragmented and that within each monarchy or republic there were multiple competing corporate entities
2. The Scientific Revolution, in that all the major seventeenth-century breakthroughs in mathematics, astronomy, physics, chemistry and biology happened in Western Europe
3. The rule of law and representative government, in that an optimal system of social and political order emerged in the English-speaking world, based on private property rights and the representation of property-owners in elected legislatures
4. Modern medicine, in that nearly all the major nineteenth- and twentieth-century breakthroughs in healthcare, including the control of tropical diseases, were made by Western Europeans and North Americans
5. The consumer society, in that the Industrial Revolution took place where there was both a supply of productivity-enhancing technologies and a demand for more, better and cheaper goods, beginning with cotton garments
6. The work ethic, in that Westerners were the first people in the world to combine more extensive and intensive labour with higher savings rates, permitting sustained capital accumulation.

Those six killer apps were the key to Western ascendancy.
In any case, these “killer apps” led to tremendous growth in material well-being. As Bhaskar writes:
Our great-great-great-grandparents experienced that rare thing: for perhaps the first time in history every dimension of their human frontier changed. This was a historic break, built on rapid, vaunting advances, decade after decade … It seemed inevitable that the frontiers would continue to gloriously unfurl. But it’s become clear that the trajectory is more complicated. Change is rapid in some areas, yet has slowed in others … [T]hinkers point to flat growth rates; incremental and derivative technology; paradigms of knowledge and culture that have been stuck for decades … Entrepreneur and investor Peter Thiel famously encapsulated the argument as “We wanted flying cars, instead we got 140 characters.” … For most of recorded history (about 97 per cent, in fact), not much changed over the course of the average lifetime … Since the late eighteenth century, per capita GDP has risen by up to 10,000 per cent in the richest parts of the world, an astonishing and completely unprecedented change ... In the words of the economic historian Deirdre Nansen McCloskey, “Our riches did not come from piling brick on brick, or bachelor's degree on bachelor's degree, or bank balance on bank balance, but from piling idea on idea.” The most important transition in recent history was built on ideas … Darwin read Adam Smith, and was thus familiar with the idea that an undirected process with numberless small instances of local competition could have extraordinary results … Big ideas “broker” other ideas in interesting ways – whether that's Elvis Presley brokering gospel and the blues or Gutenberg's printing press coupling the wine press with the idea of casting a seal. 
Johannes Kepler united the previously disparate fields of physics and astronomy, using new data uncovered by Tycho Brahe to prove the elliptical orbits of the planets … A 2019 paper analysed millions of US patents from 1840 to 2010 to find patterns in what the authors call “breakthrough innovation”, “distinct improvements at the technological frontier”. To do that they needed a way of identifying such innovation. They used two criteria, which follow the approaches taken here: a measure of originality, and a measure of influence. Breakthrough innovations are patents which exhibit the highest scores on both, adjusted for time. Using the text-analysis methods, they compared patents’ language, looking for novel uses of language (originality) and echoes of it down the patent record (influence). They then produced indices of patents that isolate major breakthroughs: the telegraph, the telephone, the automobile, the aeroplane, plastics, microprocessors and genetic engineering all show up clearly. The organisations and companies behind the big ideas also look familiar: General Electric and Westinghouse, IBM and RCA, Microsoft and Apple. These researchers weren't plucking breakthrough innovation out of thin air – they'd found robust statistical means of identifying it.
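The 2019 paper’s actual text-analysis pipeline is far more elaborate, but the underlying idea — score a patent’s originality by how little its language resembles earlier filings, and its influence by how much later filings echo it — can be sketched in a few lines. Everything below (the toy corpus, the bag-of-words cosine measure) is illustrative only, not the authors’ method:

```python
import math
from collections import Counter

def bag(text: str) -> Counter:
    """Crude bag-of-words representation of a patent's text."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors (0 = no overlap)."""
    dot = sum(a[w] * b[w] for w in a)  # Counter returns 0 for missing words
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def originality(patent: str, earlier: list[str]) -> float:
    """High when the patent's language resembles nothing that came before."""
    if not earlier:
        return 1.0
    return 1.0 - max(cosine(bag(patent), bag(e)) for e in earlier)

def influence(patent: str, later: list[str]) -> float:
    """High when later patents echo this patent's language."""
    if not later:
        return 0.0
    return sum(cosine(bag(patent), bag(l)) for l in later) / len(later)

# Toy corpus: the middle "patent" introduces vocabulary later filings reuse.
earlier = ["improved steam boiler with riveted iron plates"]
patent = "apparatus for transmitting speech by electrical telephone signals"
later = ["telephone exchange switching electrical speech signals",
         "carbon microphone for telephone speech transmission"]

print(originality(patent, earlier))  # 1.0: shares no words with the prior art
print(influence(patent, later))      # well above 0: later filings echo its terms
```

A breakthrough, in this framing, is simply a document that scores high on both measures at once, adjusted for its era.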
But things have begun to change, and scientific and material progress is slowing in many ways. As Bhaskar writes:
The discovery of drugs appears to obey a rule christened Eroom's Law. In a nutshell, the number of drugs approved for every billion dollars’ worth of research and development (R&D) halves every nine years. This pattern has remained largely consistent for over seventy years. Since 1950, the cost of developing a new drug has risen at least eighty-fold. A Tufts University study suggests that the cost of developing a drug approved by the US Food and Drug Administration (FDA) rose at least thirteen times between 1975 and 2009. By the mid-2000s it was $1.3 billion. Today it stands at above $2.6 billion, although science writer Matthew Herper estimates it as $4 billion. In the 1960s, by contrast, costs per drug developed were around $5 million. Timelines, at least pre-Covid, are likewise extended. Eroom's Law shows that it takes more and more effort and money to develop new drugs. Achieving a pharmaceutical breakthrough is on a trend of increasing difficulty … Eroom is not a person. Eroom's Law simply reverses the name Moore, as in Moore's Law (the idea that the number of transistors on a chip will double every two years, driving an exponential increase in computational power) … Every year it takes more money, researchers, time and effort to achieve breakthroughs. Each and every one of us is affected – our families, our friends, our basic quality of life.
Bhaskar continues:
Eroom's Law is far from the only example. We face a world where the remaining problems – and the new ones – are of a higher order. At a certain point, endeavours hit a breakthrough problem, where despite the improved capacity for making big new ideas happen, they don't … We can easily imagine a future of bustling skies filled with great airships, tiny darting drones, hypersonic intercontinental jetliners, solar-powered long-distance cruisers, blizzards of AI-controlled car-like transportation systems, adrenalin junkies getting to meetings on jetpacks, information teleported around the world in real time. We can easily imagine it, and, in fact, we have done for a century ... But despite that, it's not here. We are fairly good at the conception stage, less so at the execution and, especially, purchase phases. Many if not most of these technologies are still in a perilous untested state. To reach this point has already taken volumes of R&D capital massively in excess of those deployed for the creation of previous technologies … In 2005 a hitherto obscure physicist working at the Pentagon's Naval Air Warfare Center in California made a stir with a paper entitled “A possible declining trend for worldwide innovation”. Jonathan Huebner wanted to puncture triumphalist narratives of scientific and technological success; he didn't believe they followed an ever-escalating curve. Instead he claimed to detect a reversal. The curve of innovation, he argued, at first rose slowly over millennia, then massively increased before starting to tail off at the moment of its greatest triumph. The data suggested a peak year for human innovation, and it wasn't in the late twentieth or early twenty-first centuries. It was, he said, 1873. From the whole of history Huebner had landed on this arrestingly specific and unanticipated date. Huebner started by looking at the rate of innovation, defined as the number of important technological developments per year divided by world population.
This gives you a rough measure of how much each individual contributes to a given technological development. Analysing 7,198 scientific and technological developments from the Dark Ages to the present, he plotted them against estimates of world population drawn from the US Census Bureau. Doing this suggests that the rate of major innovations per head peaked in 1873; after that date the development of a major scientific or technical innovation required ever more people. Population is a blunt instrument, but the picture looks worse if you examine the ratio of innovation against either GDP or education expenditures, which are arguably better indicators. Per capita GDP grew by a factor of 9.62, whereas population only increased by a factor of 3.77. The result is that “the number of innovations when normalized to world per capita GDP declined 2.55 times more rapidly during the twentieth century than when normalized to population.” Education expenditures, recorded as a proportion of GDP, went up by even more. Adjusting for investment in education, the decline is yet steeper. Many of the innovations listed from the late twentieth century, moreover, are only improvements of earlier technologies. Looking at patent numbers against population shows a similar pattern, although with a later skew: the peak here is 1916. Better to have been born in the nineteenth century than the twentieth or twenty-first if you want to create a great invention or make a dazzling discovery … Some years later, the geneticist Jan Vijg looked at the number of significant inventions per decade as judged by Wikipedia's timeline of historic inventions. … It showed a similar pattern: a rise from the 1830s to the end of the century, a genuine golden age in the early twentieth century and a later drop-off. The 2000s were a notable low. Huebner's paper is, if not conclusive, then suggestive. It was an early shot in the debate, posing an uneasy question about the arrow of innovation. 
It implied that the breakthrough problem isn't isolated in a few significant areas, but operates across human civilisation; that big ideas are subject to diminishing returns. [Note: a similar phenomenon of diminishing marginal returns on bureaucracy was examined in a previous essay series.] And it focuses attention on a fascinating period around 1873, the time when Maxwell transformed physics and Pasteur invented modern medicine, the era of our near ancestors’ life transformation … 1873 stood on the cusp of what came to be known as the Second Industrial Revolution (2IR). If the First Industrial Revolution (1IR) lit the touchpaper of modernity, the 2IR saw it catch fire. In the words of the polymath researcher Vaclav Smil, who places this birth of a new world in the years 1867 to 1916, it “created the twentieth century”. … [T]he 2IR [included the] electric light to clean running water, elevators to large multinational corporations, production lines to the beginnings of social security, consumer brands to telephones, radio to the tabloid newspaper, moving pictures to motor cars … The 1IR had been powered by coal; the 2IR hit upon a new, even richer energy source: oil. In the twenty years between 1859 and 1879, American oil production grew from 8500 barrels of refined crude to 26 million. In parallel the price fell from $16 a barrel to just a dollar. The two technologies had a host of knock-on effects: buildings could be taller, and transport more efficient and powerful. Macro-inventions came fast. New chemicals from dyes to dynamite, aspirin to fertiliser were harnessed and mass produced. Electricity generation and the electricity network, a general purpose technology surely as significant as any in history, came of age; then advances were made in the manufacturing process itself, a total system exemplified by Fordism with its interchangeable parts and production line. 
Changes in food supply from the use of nitrates to refrigeration improved quality and length of life; and a revolution in communications media introduced a now familiar informational mix.
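Huebner's normalizations are simple ratios, and the figures in the quote above can be checked directly: if per capita GDP grew 9.62-fold while population grew only 3.77-fold, then innovations measured against per capita GDP must fall faster than innovations measured per head by exactly the ratio of those two growth factors. A minimal arithmetic check (the growth factors are the quote's; the framing is mine):

```python
# Growth factors over the twentieth century, as given in the quote.
gdp_per_capita_growth = 9.62
population_growth = 3.77

# Innovation rate per capita:                 I / P
# Innovation rate per unit of per-cap GDP:    I / g
# For the same count of innovations I, the second measure shrinks faster
# than the first by the ratio g / P of the denominators' growth factors.
relative_decline = gdp_per_capita_growth / population_growth
print(round(relative_decline, 2))  # 2.55, matching "declined 2.55 times more rapidly"
```

The same template explains why adjusting for education spending (which rose even faster than GDP) makes the decline look steeper still: the faster the denominator grows, the sharper the measured fall in innovation per unit of input.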
Bhaskar then describes the age we live in now, the “third industrial revolution” (3IR):
We too are living through an industrial revolution, the third (3IR, roughly since 1970). The 3IR is all about digitisation, the equivalent of steam or electricity. Jobs, consumption, knowledge, social lives and relationships seem to have been swallowed whole by the 3IR … In the world's wealthiest countries, those closest to the frontier, long-term economic growth as measured by GDP is in a pronounced pattern of slowdown. The growth rate in real per capita GDP (in the US) has slowed from 2.25 per cent, smoothed out across the twentieth century, to 1 per cent in the twenty-first century. If anything this slowdown in growth is becoming more marked as the century goes on. This seems strange, to say the least, in a time when tech and economic understanding should be spurring growth on. Productivity figures, a useful shorthand for the delivery of new ideas within an economy, are also significant. Productivity growth has been much slower over the 3IR than before. Since 1970, Total Factor Productivity (TFP), the key measure of how technology boosts growth, has grown at only a third of the pace achieved between 1920 and 1970, leaving us fully 73 per cent behind the postwar trend. In the words of Tyler Cowen and Ben Southwood: “TFP growth probably is the best contender for how to measure scientific progress. And overall TFP measures do show declines in the rate of innovativeness, expressed as a percentage of GDP.” In short, it appears that recent ideas have failed to have the same impact as those of the earlier generation. Even the US Congressional Budget Office and Federal Reserve have begun working with forecasts of lower long-term productivity growth. We have gone from making major gains to marginal improvements, and the results are commensurate … Patents have increased in absolute numbers, but not in quality. 
Recall that patents are one area where researchers have established quantitative models for novelty and impact; those same researchers conclude that the quality of patents, judged by how original and influential they are, was higher in the 1850s and 1860s than it is today. During the 2IR, change happened across every dimension of human experience, from housing to communications, transport to healthcare. In contrast, argues economist Robert Gordon, the 3IR concentrated fundamental advances – big ideas – in entertainment, information and communications technologies. Yes, these are significant. But Gordon likes to ask people: if you could have all the innovations from the mid-1990s on, or access to hot and cold running water in your house, which would you choose? Most of us probably wouldn't take the joys of social media over not having to haul gallons of water on cold winter mornings, or having electric light in the evening, or a car. Aside from informational and communications goods, the signature advances of modern life occurred decades ago: not only light or the car, but electricity, refrigeration, transport infrastructure, household labour-saving goods.
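The "73 per cent behind" figure follows from simple compounding. Assuming (these rates are my illustrative assumptions, not the book's) postwar TFP growth of about 1.9 per cent a year, a post-1970 rate of one third of that, and a horizon of roughly 44 years (1970 to the mid-2010s), the counterfactual trend ends up about 73 per cent above the actual path:

```python
def tfp_gap(trend_rate: float, actual_rate: float, years: int) -> float:
    """Fraction by which the compounded trend level ends above the actual level."""
    trend = (1 + trend_rate) ** years
    actual = (1 + actual_rate) ** years
    return trend / actual - 1

# Assumed, illustrative rates: ~1.9%/yr for 1920-70, one third of that after 1970.
gap = tfp_gap(trend_rate=0.019, actual_rate=0.019 / 3, years=44)
print(round(gap, 2))  # roughly 0.73, i.e. ~73 per cent behind the postwar trend
```

The point of the exercise is how unforgiving compounding is: a seemingly modest drop of about 1.3 percentage points a year in the growth rate, sustained for four decades, leaves the economy nearly three quarters below where the old trend would have put it.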
I remember reading a book called Daily Life in the United States, 1920-1940: How Americans Lived Through the Roaring Twenties and the Great Depression and noticing that, except for the internet and other digital things, so many of the modern conveniences introduced a hundred years ago are still the sorts of things we use today, only in slightly improved form, like dishwashers, air conditioning, televisions and radios, and a whole variety of other electricity-driven devices. And regarding things digital, Bhaskar writes:
In reality, the 3IR has funnelled progress into the area of least resistance: software. Whereas the 2IR saw innovation across almost every endeavour, the 3IR boiled it all down to little screens. It allows work in a frictionless and easily malleable sphere, the contained and supple world of code. Grad students and dropouts could build the world's major software corporations. One or two graduate students could not deliver a redesigned jet engine or nuclear power plant; barriers in the physical world are too high. Whereas the 2IR built the modern world, the gains and innovations of the 3IR are narrower, its economic impact surprisingly muted at the macro level, debatable according to the World Bank. Digital technology gave the sensation of acceleration, but exponential improvements like Moore's Law cannot be ported over to the material world … We have genetic engineering, but its foundations were laid over a century ago.
Bhaskar writes that evidence of slowdowns is everywhere:
[P]er capita energy use grew for centuries in what has been called the Henry Adams curve, underpinning technological and societal change. This growth has stopped: if pre-1970 energy use trends had continued, we would have access to thirty times as much energy today, likely supplied by novel forms of nuclear power. But at the frontier, the availability of energy has stagnated, even as it is growth in available energy that underwrote almost everything about modernity … Civilisational advances are predicated on an expanding and secure energy base. Societal collapse is strongly correlated with crises in energy supply … [W]e are seeing a slowdown not in just absolute economic growth, or major innovations, or energy use, but also in debt (student, automobile and mortgage), the number of books published per year, population and, with it, fertility rates, the number of relationships and age of marriage, improvements in living standards, median wage growth, property and other asset price inflation, increases in human height, the introduction of new significant consumer appliances and enrolment in tertiary education … This slowing – and in some cases outright reversal of growth – isn't necessarily all bad but it does imply we have hit an historic inflection point. In contrast to the past, the breakthrough problem spans fields, its tentacles suffocating the economy. Gen Z – my children's generation – will have lives broadly similar to my own or, worryingly, worse. Their houses, jobs, health, appliances, work, media consumption and lifestyle will be quite recognisable. That was not true of my parents and their parents, or even more my grandparents and great-grandparents. Beyond the screen, ‘the frontier of last resort’, this is an age of consolidation … [I]nnovation has been directed at ways to order pizza or take better selfies; the softest frontiers around. 
To be fair, Google Maps, Zoom, Minecraft, Spotify – these are marvels and big ideas that have found purchase, and digital is the bright spot. But it only makes the contrast with other areas starker.
In the next essay in this series, we’ll examine how our information pipelines have become clogged.