Innovation Is Slowing Down
Publication Date: July 03, 2019

What explains the decline in productivity growth across the developed world? How will this impact jobs already threatened by the advance of robotics and artificial intelligence? Professor of economics at Stanford since 2005, Nicholas Bloom argues that growth is being curtailed as rich countries struggle to match their previous success as prolific innovators. Despite soaring investment in R&D, he says, the ‘ideas productivity’ of individual researchers has been in freefall for decades.
Productivity growth is slowing down in the US (see Figure 1) and other developed countries (see Figure 2). In the 1950s American productivity was rising by more than 3% a year. This period of incredible progress was driven by the rapid expansion of research universities like Harvard, MIT, and Stanford; research labs in firms like General Electric and Ford; and the commercialization of technologies developed in World War II. By the 1980s, however, productivity growth had halved to 2%. It has now fallen to just 1% per year.
This slowdown has sparked a debate among economists over the sources of the problem. Are statisticians underestimating output? Is the US mired in “secular stagnation”, a prolonged period of low economic growth caused by insufficient investment? Or are recent innovations simply not as productive as those of the past?
In research with three fellow economists (Chad Jones and Mike Webb, Stanford; and John Van Reenen, MIT), I argue that ideas productivity - the productivity of science and discovery - has been falling for decades. Scientific discoveries and technical advances are getting harder to find, and spending on R&D has not been increasing fast enough to offset these declines in productivity. Innovation is slowing down.
Running to Stand Still
The creation of ideas is central to economic growth. This is driven by two things: the number of researchers (scientists and engineers) and the productivity of these individuals (ideas per researcher). Our analysis found that while there are a rising number of researchers, each one is becoming less productive over time (see Figure 3). R&D efforts have been rising steeply for decades, but research productivity - the number of ideas being produced per researcher - has fallen rapidly.
Our analysis revealed that more than 20 times as many Americans are engaged in R&D today as in 1930, yet their average productivity has dropped by a factor of more than 40. The only way the US has been able to maintain even its current lackluster GDP growth rate has been to throw more scientists and engineers at research problems. The US economy has had to double its research efforts every 13 years just to sustain the same overall rate of economic growth.
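The doubling claim can be translated into an annual growth rate: under continuous compounding, a quantity that doubles every T years grows at ln(2)/T per year. A minimal sketch of that arithmetic (the function name is mine, not from the paper):

```python
import math

def doubling_time_to_rate(years: float) -> float:
    """Continuously compounded annual growth rate implied by a doubling time."""
    return math.log(2) / years

# Doubling research effort every 13 years implies effort
# growing at roughly 5.3% per year.
rate = doubling_time_to_rate(13)
print(f"Implied annual growth in research effort: {rate:.1%}")
```

In other words, sustaining a constant 1-2% growth rate has required research inputs to grow several times faster than output itself.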
The acceleration of computer processing power is a telling example. Named after the co-founder of the computer chip giant Intel, Moore’s Law holds that the transistor density of silicon chips will double roughly every two years.
Such advances have enabled the creation of ever more powerful computers, which have transformed modern society. But maintaining that regular doubling today requires more than 18 times as many researchers as were needed in the early 1970s.
A similar pattern shows up in the agricultural and pharmaceutical industries. For agricultural yields between 1970 and 2007, research effort went up by a factor of two, while research productivity declined by a factor of four, at an annual rate of 3.7%. In the pharmaceutical sector, research effort went up by a factor of nine between 1970 and 2014, while research productivity declined by a factor of five, at an annual rate of 3.5%.
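The annual rates quoted above follow from the cumulative decline factors and the lengths of the periods. A quick check, assuming continuous compounding (the helper function is mine, for illustration only):

```python
import math

def annualized_rate(factor: float, years: float) -> float:
    """Continuously compounded annual rate implied by a cumulative factor."""
    return math.log(factor) / years

# Agriculture: productivity fell by a factor of 4 over 1970-2007 (37 years).
print(f"Agriculture: {annualized_rate(4, 2007 - 1970):.1%} per year")
# Pharmaceuticals: productivity fell by a factor of 5 over 1970-2014 (44 years).
print(f"Pharma: {annualized_rate(5, 2014 - 1970):.1%} per year")
```

Under this convention the agriculture figure matches the 3.7% quoted; the pharmaceutical figure comes out slightly above 3.5%, so the paper presumably uses a marginally different span or compounding convention.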
We also examined the track records of more than 15,000 US public firms between 1980 and 2015. We found that even as spending on R&D rose, a vast majority of the firms experienced rapid declines in ideas productivity. The average firm now needs 15 times as many researchers as it did 30 years ago to produce the same rate of growth.
The Apple Tree Model
So why has the productivity of scientists and engineers fallen so much? One explanation is that the low-hanging fruit of ideas have been plucked. To explain this ‘apple tree model’ of growth, we should travel back to the start of the Industrial Revolution in England.
Before 1750, productivity growth was close to zero. Most of the population in 1700 still worked on farms and were not much more productive than their ancestors under the Romans 2,000 years before. But from the late 1700s until about 1950, productivity growth began to accelerate. This is the era of “standing on the shoulders of giants”. Each new invention - the steam engine, electric lighting, penicillin, and so on - made future inventors more productive. Growth took off as firms started creating industrial R&D labs, starting with those of Thomas Edison in 1876, while universities began to focus more on science and engineering research.
By 1950, however, the tide began to turn. The US reached its peak productivity growth of around 4% per year before the third phase of the ‘apple tree model’ began to set in. Humanity had already made many of the easiest discoveries, and unearthing new scientific truths started getting harder.
What about the future?
In an accounting sense, productivity growth in the US has been slow because output growth has been low while employment growth has been high (see Figure 4). In other words, robots have not been stealing our jobs (on net). In fact, the US has just experienced the longest and largest stretch of job creation since World War II (before which we lack accurate national accounts data). So the robot jobpocalypse is nowhere to be seen. One reason for this is that the types of low-skilled jobs that robots would replace have seen falling wages. So low-skilled workers have effectively been forced to accept lower wages to price themselves into jobs (see Figure 5).
So I suggest two predictions:
A) The next 10 to 20 years are likely to see productivity growth of around 1% per year, much like the recent past. Compared to the 1950s to 1990s that is slow, but compared to the longer sweep of history going back 2,000 years it is still a blistering rate of advance.
B) In the longer run, artificial intelligence will be the great leveler - it has the potential to replace some types of higher-paid managerial and cognitive jobs that until recently have been safe from automation.
These higher-paid managerial jobs are also relatively expensive, so the returns to developing technologies to replace them are greater. As such, artificial intelligence could reduce inequality and herald massive savings for companies employing large cohorts of graduate employees. But for these graduates - including the authors of this piece - it is less clear how beneficial that will be.
Find Out More
Research by Nicholas Bloom on a range of topics such as innovation, management and IT, including his 2017 working paper ‘Are Ideas Getting Harder to Find?’, is available to view at: https://nbloom.people.stanford.edu/research.
The preceding is republished on TAP with permission by its author, Professor Nicholas Bloom, and by the Toulouse Network for Information Technology (TNIT). “Innovation Is Slowing Down” was originally published in TNIT’s June 2019 newsletter.