Economics, Pethokoukis

Is the computer revolution already over?

Image Credit: Shutterstock

Technological innovation drives productivity, which drives economic growth and rising living standards. And the more rapid the increase in the capability of computing equipment, the more rapid the decline in its price, holding capability fixed. An average computer today sells for about what it did a decade ago, say, $1,000. But today’s version is far more powerful. An average 2003-era computer would sell for far less today. And during the 1990s, computer prices, adjusted for quality, fell sharply.
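The arithmetic behind quality adjustment is worth making explicit. A minimal sketch — the $1,000 sticker price and the doubling of capability are illustrative numbers, not measured data:

```python
# Quality-adjusted price: if a $1,000 computer today does the work of two
# computers from a few years ago, the price per unit of capability has
# halved even though the sticker price is unchanged. Numbers are illustrative.
sticker_then, capability_then = 1000.0, 1.0
sticker_now, capability_now = 1000.0, 2.0

price_per_unit_then = sticker_then / capability_then
price_per_unit_now = sticker_now / capability_now

decline = 1 - price_per_unit_now / price_per_unit_then
print(f"Quality-adjusted price decline: {decline:.0%}")  # 50%
```

This is the sense in which a flat sticker price can hide a steep "real" price decline — and why a slowdown in measured quality-adjusted declines is read as a slowdown in capability gains.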

So here’s the problem: Prices for information technology equipment are declining at the slowest pace in over a generation. And to the economic team at JPMorgan, this suggests the pace of technological advance is also slowing. If they’re right, this phenomenon would have a big impact on the US economy and workers.

Two charts from the bank’s new report, “Is I.T. Over?,” display the slowing pace of price declines:

Economist Michael Feroli:

Gains in information technology are routinely credited with the strong growth in the supply side of the US economy in the 1995-2005 period. If that technological growth is slowing—as indicated by the earlier observation on tech prices—then this could have quite significant implications for the US economy’s potential growth rate.

Note that this is not an exercise in futurology. Northwestern University economist Robert Gordon’s recent claim that US economic growth is over has attracted a fair bit of attention. This note, however, does not speculate on whether growth in information technology has reached some natural limit, or whether further revolutionary advances are coming. Instead, by looking at tech prices—and incorporating some economic reasoning—we can infer what is currently occurring on the tech frontier.

While we are not qualified to speculate on the future of technological change, we do observe that the growth in tech prices is positively serially correlated, which is another way of saying the pace of innovation in the near future is likely to resemble that in the recent past.
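The serial-correlation claim can be stated concretely: compute the lag-1 autocorrelation of the series of price changes, and a positive value means this period's pace of decline tends to carry over into the next. A toy sketch with made-up year-over-year changes (not the actual JPMorgan data):

```python
def lag1_autocorr(xs):
    """Lag-1 serial correlation: do this period's changes predict the next?"""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

# Hypothetical year-over-year % changes in quality-adjusted tech prices:
changes = [-15.0, -14.0, -12.5, -11.0, -9.0, -8.5, -7.0, -6.0]
print(lag1_autocorr(changes))  # positive => the recent trend tends to persist
```

With a steadily shrinking rate of decline like this, the autocorrelation comes out strongly positive — which is exactly why Feroli reads the recent past as a guide to the near future.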

The downside here is that a slowdown in price declines has been accompanied by a slowdown in tech investment. And less tech investment, Feroli explains, “means less capital deepening, which could help explain why productivity growth has been soft in recent years.”

But there might be an upside to all this, he argues. Slower gains in technology and productivity, at least for a bit, might make it easier to absorb workers — labor in place of capital — back into the economy.

Then there’s the income inequality issue. To the extent that its increase over the past few decades has been driven by the increased return to high-skilled, highly educated workers, “then workforce skills may be better able to catch up with the level of technology.” That, combined with the rise in college enrollment, “suggests the march toward increasing income inequality could soon reverse itself.”

Perhaps. But if Feroli’s version of “the great stagnation” argument is correct, I would rather increase opportunity and absolute incomes by better educating workers and creating more entrepreneurs to supply the next wave of innovation. More smart research investment by government, too. Or it could be that this data is not accurately capturing ongoing IT innovation or its diffusion throughout the economy. Indeed, perhaps the third industrial revolution has only just begun. As Kevin Kelly wrote in Wired recently:

Right now it seems unthinkable: We can’t imagine a bot that can assemble a stack of ingredients into a gift or manufacture spare parts for our lawn mower or fabricate materials for our new kitchen. We can’t imagine our nephews and nieces running a dozen workbots in their garage, churning out inverters for their friend’s electric-vehicle startup. We can’t imagine our children becoming appliance designers, making custom batches of liquid-nitrogen dessert machines to sell to the millionaires in China. But that’s what personal robot automation will enable.

Everyone will have access to a personal robot, but simply owning one will not guarantee success. Rather, success will go to those who innovate in the organization, optimization, and customization of the process of getting work done with bots and machines. Geographical clusters of production will matter, not for any differential in labor costs but because of the differential in human expertise. It’s human-robot symbiosis. Our human assignment will be to keep making jobs for robots—and that is a task that will never be finished. So we will always have at least that one “job.”

Is the tech revolution over? Maybe. Or maybe it is just paused. Maybe not even that. But we should act as if it has slowed and take whatever policy steps we can to hit the gas pedal.

Faster, please.

28 thoughts on “Is the computer revolution already over?”

  1. “The downside here is that a slowdown in price declines has been accompanied by a slowdown in tech investment”…

    Two problems I see here…

    The cost of chip fabrication has gone up…

    The price of technology has gone down…

    Still there are folks who don’t think the tech gravy train is really slowing down that badly…

    CE Industry Revenues to Reach Record-High $209 Billion in 2013, According to CEA

    Arlington, VA – 01/08/2013 – Revenues for the consumer electronics (CE) industry are projected to grow nearly three percent, reaching a new record-high of $209.6 billion, according to the semi-annual industry forecast released today by the Consumer Electronics Association (CEA)®. The forecast also shows 2012 industry revenues reached $204 billion, up five percent from the previous year. CEA President and CEO Gary Shapiro announced the forecast in his opening remarks today at the 2013 International CES®, the world’s largest annual innovation event.(there’s more)

  2. It’s a pause driven by a slow economy. It isn’t the cause of the slow economy. Look around: do companies have the “ultimate” systems in place that deliver the maximum amount of productivity possible? Not even close.

    If we ever get the economy back on track, investment in tech will increase, driving further innovation.

  3. It also depends on what you measure and what you call IT. The PC business is stagnant. But the IT industry is moving to cloud, mobile (tablet, smartphone), and embedded computing, enabling major gains in productivity. Drivers don’t get lost and now take optimal routes to save fuel. Small manufacturers use cloud simulation services to make safer, more effective, cheaper products. Computing is now part of everything and the enabler of innovation.

  4. It could be that the market has been distorted by gov’t fiat. Electronic Health Records were mandated by the Bush Administration. The mandate is now being realized due to the timetable in the mandate. Without this mandated switch to EHR, I would not have increased my computer, LAN, and software investment by 300 to 350%. Then there are the many IT man-hours I would not have chosen to pay for to integrate my existing equipment into my new EHR system. Mandated PC/LAN/software purchases by healthcare providers, hospitals, and clinics across the country are causing an artificial demand spike leading to sectoral inflation. This effect will likely dissipate in 2 years or so as the mandate is fulfilled.

    • “Electronic Health Records were mandated by the Bush Administration”…

      Could you just drop your employee health benefits and not have to deal with all that mandated nonsense?

        • “The OD in his name would indicate he is an optometrist”…

          I understood that part jethro, hence the reason I used the term, ‘employee‘…

          • In order to stay in business you have to be part of an Accountable Care Organization, in order to do that, you have to have EHR that connect you to the rest of your organization.

            Also you wouldn’t believe how much money is wasted because records are kept in files in some office somewhere. Imagine paying for 7 different MRIs because the right hand doesn’t know what the left hand is doing, even within the same hospital.

          • “In order to stay in business you have to be part of an Accountable Care Organization, in order to do that, you have to have EHR that connect you to the rest of your organization”…

            OMG! Here I thought having to file quarterly tax returns was over the top!!

            Thanks for that info chris

  5. Personally, I thought I’d be getting to work in my flying car, as GM predicted we’d be doing by the 1990s. But now I’d be satisfied if my car would drive me to the ’bot serving Big Macs at the drive-thru. Who’d a thunk it?

    Oh, and I’m still waiting for the Japanese to take over the computer industry. And, lastly, electronic ‘computers’ have been with us for nearly 70 years now… Jeez, that’s a h_ll of a long revolution, eh?

  6. IT will grow in direct proportion with the service industry which will eventually surpass manufacturing as America’s prime employer and producer of US GDP.

  7. I’ve owned a personal computer since 1981. I have an M.S. in Computer Science and work in this field. There is absolutely no way the computer revolution is over. We are only at the beginning. The room for growth is incredible. Most manufacturing jobs will be taken over by robots within 20 years. Almost all cars will be self-driving shortly thereafter. Computers will be attached to our bodies and integrated into our brains. We are only scratching the surface. The future is both fascinating and scary.

  8. Enabling technologies that spawn entire industries are by nature unpredictable.
    Never underestimate the catalytic reaction between a young, prepared mind, persistence, and luck.
    And, by all means, avoid extrapolating existing trends into the indefinite future.
    Technology does not evolve linearly. It evolves in unanticipated jumps.

    • Well said. Technology is sort of quantum in nature. There are thousands of ideas, thousands of patents, but it’s what a larger collective population does when clever new users say, “Wow, look what I can do with this!” That’s when there’s a quantum change & the world changes. Revolutions in technology are not predictable, but you know it when there’s exponential adoption & growth.

  9. I own a computer repair and network service company. I can tell you, firsthand, that the price of technology is as low as it can get, which is why you do not see it falling further.

    At a certain price point, it is simply not worth building. I turn people away when they ask us to build custom computers or servers; competing with off-the-shelf stuff is totally unprofitable.

    The downside to this is that pretty much everything you pick up off the shelf at Best Buy is a piece of junk that will not last 2 years without some kind of malfunction. The race to the bottom has hit bottom. There is nowhere left to go.

    • “The downside to this is that pretty much everything you pick up off the shelf at Best Buy is a piece of junk that will not last 2 years without some kind of malfunction.”

      Which is the corollary to Moore’s Law: if it’s going to be obsolete in two years, why build it to last any longer?

  10. Which makes me wonder: is the Industrial Revolution already over? Will things go back to how they were in the 1700s?

    I.e., it’s an absurdist crap of a question.

  11. “take whatever policy steps we can to hit the gas pedal.”

    that statement makes my hair stand up in fear … government policies have caused the problems and certainly won’t solve the current ones …

  12. Easy to imagine devices that will cause another explosion in horsepower requirements – hologram displays (and pin-point audio to go with them), air sensors that monitor you and your family’s health profiles, tax-buddy systems that keep refining your daily transactions to leverage the multi-gazillion-page tax code to your advantage… I think the constant, increasing load of the welfare state and statism in general repels investments.

  13. Those charts suggest that the diminishing percentage of IT price declines tracks rather well to both Dubya’s weak dollar policy and the easy money regime under Ben Bernanke. The other thing it suggests is that we’ve already extracted the easy value out of IT outsourcing to India and electronics manufacturing to China; future value will need to be realized through more difficult and innovative methods.

  14. I design chips using those bleeding edge technologies, so I can offer some guidance.

    Yes, the revolution is slowing. It will soon break. We’re hitting physical limits.

    1) For a 16nm FinFET, you’re talking about a transistor about 160 atoms wide and about 15 atoms thick. That dimension used to scale by about 70% per generation in the old version of Moore’s Law. Even today we’re hitting a fair number of quantum mechanical effects, and by the time we’re at 5nm in a few years devices will be ruled by quantum mechanical effects, and let’s just say computations will get interesting there. At some point “1” won’t mean “1”.

    2) Moore’s Law talks about density. It says nothing about the speed of the devices. That’s been flatlining since about 65nm. At that point the device scaling rules completely broke and we’ve been making tradeoffs. Basically, we’ve been holding the transistor threshold voltage constant while we scale the devices. That’s because if we didn’t, the chips would “leak” too much charge, and that wasted power would easily match or exceed the useful switching energy by the time we hit 28nm. But that means that device speeds have to go down. FinFETs help to some extent, but not enough to keep frequency scaling like it used to.
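    The threshold-voltage tradeoff described above can be sketched with a toy model. The subthreshold-swing value and alpha-power exponent below are illustrative textbook ballpark figures, not real process data: lowering Vth makes gates faster, but subthreshold leakage grows exponentially as Vth drops.

```python
# Toy model of the Vth tradeoff. All constants are illustrative, not process data.
SUBTHRESHOLD_SWING = 0.090  # volts per decade of leakage current (~90 mV/dec)

def leakage(vth):
    """Normalized subthreshold leakage: ~10x more for every 90 mV Vth drops."""
    return 10 ** (-vth / SUBTHRESHOLD_SWING)

def delay(vdd, vth, alpha=1.3):
    """Normalized gate delay (alpha-power-law style): slower as Vdd - Vth shrinks."""
    return vdd / (vdd - vth) ** alpha

# Raising Vth (to cut leakage) slows the gate; lowering it explodes leakage.
for vth in (0.25, 0.35, 0.45):
    print(f"Vth={vth:.2f}V  delay={delay(1.0, vth):.2f}  leakage={leakage(vth):.1e}")
```

    The exponential in `leakage` is the whole story: a 0.2 V drop in Vth buys a modest speedup but costs roughly two orders of magnitude in standby current, which is why designers hold Vth and give up frequency instead.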

    So, the short version is: yes, chips are slowing down. There are fundamental, physical effects coming into play and essentially the easy gains of the last five decades are over. And unless there’s a radical (and I mean really radical) innovation, your grandkids won’t remember the days of exponential growth in compute power.

    But just because chips are slowing down and costs are exploding doesn’t mean that the computer revolution is over. I’d argue the opposite. The hardware guys have been blowing ahead so fast that the software guys haven’t been able to keep up.

    What’s happening now is that compute power is so cheap that the software guys have just begun to see what they can do with the power out there. We’ve just started to automate things, sensors are just starting to be really deployed, and networking is really just starting. Remember, nearly all the gains in your car’s fuel efficiency since the 70s have been due to electronics, and that hasn’t even begun to slow yet.

    The chip revolution will stall in maybe a decade or two, but the software/computer revolution has much, much longer than that to run.

  15. The increasing number and variety of software patents are also stifling the industry. Unlike manufacturing and physical engineering, there are only so many variations on how to make a computer perform tasks, and if you keep patenting those limited variations, you soon reach the point of no longer being able to innovate, and you start running into what we currently face: increasing “patent wars.”

    Mathematical formulas were once not patentable, and software from the beginning was treated like mathematical equations (which, in terms of Boolean algebra, it still should be), which prevented software from being patented. Companies in the 80s didn’t like the limited and insufficient protection that copyright afforded, and so they pushed for patenting software. I remember when AT&T patented the “exclusive OR (XOR)” method of covering and exposing windows on a screen; this stifled the ability of the open source “X Window” package to implement window exposing and hiding, which all windowing systems use.
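    The XOR trick at issue is simple to demonstrate: because `x ^ s ^ s == x`, blitting a cursor or window outline onto the screen with XOR is self-inverting, so drawing it a second time erases it and restores the background without saving a copy. A minimal sketch:

```python
# XOR drawing in miniature: blitting a sprite onto a framebuffer with XOR
# is self-inverting, so a second identical blit restores the original pixels.

def xor_blit(framebuffer, sprite):
    """Return the framebuffer with the sprite XORed onto it."""
    return [fb ^ sp for fb, sp in zip(framebuffer, sprite)]

screen = [0b10110010, 0b01101100]   # arbitrary background pixels
cursor = [0b00011000, 0b00011000]   # sprite to flash on top

shown = xor_blit(screen, cursor)    # cursor visible over the background
hidden = xor_blit(shown, cursor)    # XOR again: background fully restored
assert hidden == screen
```

    That one algebraic identity is the entire "invention" — which is why patenting it looked, to many developers, like patenting arithmetic.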

    Software patents are far more insidious than manufacturing patents. Since the product of software development is intangible and “virtual”, and because it really is a way of thinking, any time a developer determines a process of manipulating certain bits to achieve a certain different set of bits, the developer is forced to perform a patent search to see if anyone else has thought of this bit-processing method.

    There are many patentable ways of making iron, but there is only one way of performing an XOR operation on bits, and I would like to see reform of the patent process to either eliminate software patents altogether, or at least limit them to one or two years at most.

  16. Yep, it’s over. In fact, the acceleration phase has been over for about 20 years now, and we’ve just been coasting. It was obvious in the 90s, when you could buy for your desktop stuff that wasn’t seriously different from what was being used in supercomputing centers.

    I guess everyone thought that was because there was some newly zippy conduit from the leading research edge to the commodity user. Nonsense. It’s because the leading research edge in computing hardware crapped out in the 80s. And for that matter, so did system-level software and major computing paradigms. No one has invented anything nearly as remarkable as what was invented in the 70s and 80s. It’s all just been refining, polishing, reducing the size, adding bells and whistles, filling in the obvious extensions. You can coast quite a long time on that — but not forever.

    This is just the way technology goes. There was an absolute explosion of automobile tech from the 1890s through maybe the 1950s, but since then, it’s just been refining and polishing up. Same with airplanes from the 1900s through the 1970s. You pick all the low-hanging fruit and then… things get very tough, and advances slow to a one-per-generation crawl.

    The only people truly shocked are going to be those who naively made a simple extrapolation of computing tech in the 90s and 00s and predicted that by the 2020s and 2030s the Singularity will have arrived and we’ll all be living in virtual reality, uploading our consciousnesses, talking to our phones in natural language, et cetera. Nope. This is the same mistake made by people who made a straight-line extrapolation of aerospace progress in the 1940s-1960s and predicted colonies on Mars by 2015.

  17. No one else seems to have mentioned it, but I’d also be curious to see the charts with Apple’s insane margins removed from the data. I’m sure the slow economy accounts for reduced R&D expenditures, but aside from the sharp uptick in prices following the collapse of the dot-com bubble there only seems to be about a 3% shift, which captures Apple’s growth to 10 or (claimed) 20% market share along with their outsized margins. To the extent that matters, the market isn’t failing but simply incorporating people’s willingness to pay more for improved user-friendliness, product design, and social status, as opposed to the raw computing power being measured here.

  18. I can’t edit this into my last comment for some reason (poor comment software) but, yes, the ability of Apple to thus far monetize its improvements in user-friendliness and design are in great part attributable to the current rentseeker-captured regime of IP law.

  19. The IT computer revolution is most definitely not over. It is going in a different direction.
    The ‘Commercial Computing Revolution’ era, for me, started in 1959, when I worked on the first commercial computer designed and built in England. Time has ended this era. Commercial computing now has well-known practices. When enterprises used this technology they had to make transformative organizational changes to reduce labour costs and so maximize their benefits to stay competitive. Economists created ways to identify and measure these changes.
    We are now in what I call the ‘Societal/Mobility Revolution’ era. It will continue to create thousands of new IT jobs, but I think economists and others will need to create new ways to measure its impact, since the impacts will not necessarily be in our enterprises.
    The coincidental convergence of several different areas of technology results in this change in direction for the IT revolution and its subsequent impacts.
