Credit: Jim Henderson/Creative Commons
Two examples: A decade ago, a) it cost US taxpayers $3 billion to sequence the human genome, vs. individuals now being able to have it done for about six grand; b) not a single vehicle in a DARPA driverless car challenge made it even 5% of the way through a desert course, vs. the Google car today zipping all across America.
Those and other technological advances, including the continuing progress of Moore’s Law, are enough to persuade MIT’s Andrew McAfee that it is “not going to take anything close to a century for digital technology to transform our economy into something out of science fiction.”
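As a rough check on the scale of that first example, here is a back-of-the-envelope sketch of the implied annual cost decline, using only the $3 billion and $6,000 figures cited above; the ten-year span and the Moore's Law benchmark of halving every two years are assumptions for illustration, not numbers from the article:

```python
# Back-of-the-envelope: implied annual cost decline for genome sequencing,
# using the figures cited above ($3 billion then, ~$6,000 now) and an
# assumed ten-year span. Moore's Law-style halving every two years is used
# as a benchmark. Illustrative only.
cost_then = 3_000_000_000   # ~$3 billion for the first human genome
cost_now = 6_000            # ~$6,000 today
years = 10                  # assumed span for "a decade ago"

annual_decline = 1 - (cost_now / cost_then) ** (1 / years)
print(f"Implied annual cost decline: {annual_decline:.0%}")      # ~73% per year

moore_decline = 1 - 0.5 ** (1 / 2)                               # halving every 2 years
print(f"Moore's Law-style annual decline: {moore_decline:.0%}")  # ~29% per year
```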
But then there’s this chart — presented by economist Robert Gordon in an Economist debate with McAfee — showing a slowdown in US productivity growth over the past decade:
Credit: Robert Gordon
To Gordon, more of a “great stagnation” guy, “the productivity data provide a clear verdict on the topic of this debate.” The IT revolution is over. Now, as judged by the magazine’s readers, McAfee easily won the debate: yes, technological change is accelerating. Of course, most of us would prefer a future of high growth and new gadgets versus the same-old, same-old, so McAfee’s argument is inherently more appealing. But McAfee also attempts to wave away the inconvenient statistics, as follows:
1. The Great Recession has spoiled the data. The big drop in output made the economy look less productive, at least for a bit.
2. There are long lags as society figures out the best and most efficient uses of new technology. So the fruits of these advances will play out over a number of years and decades.
3. Government economic statistics aren’t much good at capturing dynamic technological change. McAfee:
Data from online searches and social networks are letting us track outbreaks of cholera, flu and other diseases better and faster than ever before. Several centuries of digitised books are revealing how verbs become regular, how fame has become more fleeting and how long the effects of censorship last. A single plane equipped with gear for precise positioning and laser-based mapping recently scanned more than 50 square miles of Central American jungle down to the inch, and found evidence of several previously unknown major archaeological sites. None of these advances was possible even a decade ago. The constant price declines and performance improvements summarised by Moore’s law, the staggering quantities of digital data now available, and the imagination and talent of countless innovators, entrepreneurs and tinkerers are combining to bring us into a second machine age.
AEI’s Stephen Oliner stakes out a middle position in a recent paper: While the Great Recession and its aftermath may have decreased productivity somewhat, the slower pace of productivity growth over the past decade can be largely explained by a reduced contribution from IT after a decade-long tech boom. On the other hand, semiconductor innovation “is continuing at a rapid pace, raising the possibility of a second wave in the IT revolution, and we see a reasonable prospect that the pace of labor productivity growth could rise to its long-run average of 2¼ percent or even above. … No, the information technology revolution is not over.”
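To put that 2¼ percent figure in perspective, here is a minimal sketch of what different productivity growth rates imply for how long output per hour takes to double; the 2.25% rate is the long-run average Oliner cites, while the 1% "slowdown" comparison rate is an illustrative assumption, not a number from his paper:

```python
import math

def doubling_time(annual_growth: float) -> float:
    """Years for output per hour to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

# 2.25% is the long-run average cited above; 1% is an illustrative slower pace.
for rate in (0.0225, 0.01):
    print(f"{rate:.2%} growth -> output per hour doubles in about {doubling_time(rate):.0f} years")
# 2.25% growth -> output per hour doubles in about 31 years
# 1.00% growth -> output per hour doubles in about 70 years
```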
Indeed, a recent report from the McKinsey Global Institute highlighted a host of new technologies and their potential economic impact — and then went on to warn about how these disruptive technologies might affect the bulk of the labor force:
By 2025, technologies that raise productivity by automating jobs that are not practical to automate today could be on their way to widespread adoption. … Given the large numbers of jobs that could be affected by technologies such as advanced robotics and automated knowledge work, policy makers should consider the potential consequences of increasing divergence between the fates of highly skilled workers and those with fewer skills. The existing problem of creating a labor force that fits the demands of a high-tech economy will only grow over time.