For some time now, we have been reading about the tech revolution: Entire industries are being "disrupted," the way we do the simplest things is changing, the pace of innovation is accelerating. According to economic data, however, nothing much is happening. Should we believe our eyes or the data? It's not an obvious choice.
Former U.S. Treasury Secretary Larry Summers talked about one aspect of the paradox at a recent conference on productivity. If technological innovation is driving low-skilled workers out of the workforce -- in 1965, 19 out of 20 men between the ages of 25 and 54 were working in the United States, but now only 17 do -- shouldn't productivity rise faster? Intuitively, it should, both because the economy keeps growing with fewer people employed and because it's the least productive workers who are being "dis-employed."
Instead, total factor productivity growth in the U.S. has halved in the last decade, compared with the previous one.
Summers' intuitive answer -- he pointed out in his speech that productivity wasn't his academic field -- is that perhaps economic growth is mismeasured: The methodology that served governments for generations is ill-suited for the brave new world. Summers argued, for example, that there's no obvious way to account for quality improvements brought about by innovation. He asked his audience to vote: "Which would you rather have for you and your family, 1980 healthcare at 1980 prices or 2015 healthcare at 2015 prices?" That's a no-brainer.
Summers then argued that, in reality, the quality improvement in health care made inflation in this sector negative for the last 30 years, producing about 0.3-0.4 percent of gross domestic product in unaccounted-for growth. He added:
"What's present in healthcare, I would suggest is to at least some degree, present in many other areas. You go to the store, there is much more variety in the store than there used to be. It really is easier going through the airport with my boarding pass on my cell phone than it used to be when I carried a ticket to a ticket counter, which was swapped for a boarding pass. I would happily pay 30 percent, even 40 percent of the price of my Boston-Washington ticket, to have the extra convenience associated with the way we travel now."
The measurement failure argument is also Goldman Sachs' favorite explanation of the productivity paradox. The investment bank argues that in software and digital content, "quality-adjusted prices and real output are much harder to measure than in most other sectors."
Economists who specialize in measuring productivity don't quite buy that. At the same Peterson Institute conference where Summers spoke, John Fernald of the San Francisco Federal Reserve Bank acknowledged that the difficulties of accounting for the qualitative improvements could lead to lower growth and productivity estimates, but said the underestimation was limited to 0.1 percent to 0.2 percent of gross domestic product, not enough to explain the big productivity slowdown.
Besides, many of the improvements aren't really making us more productive at work. "The benefits you get from watching cat videos on the internet, that doesn't get counted in GDP," Fernald said. The varied digital entertainment and communication we're enjoying because of the tech revolution mainly affects our non-market activity. Another way to think about it is that many of the recent innovations only make it easier and more fun to do things we hardly ever noticed were hard and no fun. People use internet messaging apps where they once relied on email, or they pay with their phones where they used to pull out a credit card. There is no perceptible change even in time use -- we're just switching to a new, seemingly more perfect way to conduct the same old transactions. Much of the "Internet of Things" (now often referred to as the Internet of Everything) -- connected light bulbs and faucets, excessive electronics and software in cars -- provides this kind of "quality improvement": Gadgets are wonderful, but rarely essential.
A way could be worked out to include these benefits in the GDP calculation, but that would mean changing the whole model -- essentially the ideology -- of how the economy is measured. Instead of resolving the productivity paradox, such changes would simply obscure it.
Finding a plausible explanation within the current set of rules isn't impossible. In a just-released paper, Ryan Decker of the Federal Reserve Board and his co-authors point out that the fast productivity growth of the 1980s and 1990s was driven by "high-growth young firms" -- in other words, start-ups. It's somewhat counter-intuitive given the current visibility of the start-up culture and the preference Generations Y and Z show for entrepreneurship, but, according to the Decker paper, business dynamism in the U.S. has sharply declined since 2000: "The U.S. has a much lower pace of start-ups, and those that do enter are less likely to be high-growth firms."
Business dynamism has declined as much in hyped-up sectors such as tech as in more traditional ones -- retail and manufacturing. Decker and his collaborators didn't research the reasons for the phenomenon, but one possible explanation is that the technological revolution really took place in the 1980s and 1990s. Personal computing was the major breakthrough. Twenty years ago, though, some expected more productivity growth from computerization than it actually delivered, and discussed mismeasurement-related explanations. The more things change, the more they stay the same.
The rise of mobile computing and social networks has brought significant changes, but not revolutions in terms of productivity and effect on the economy. The productivity paradox will disappear when the next Really Big Thing comes. At the moment, we're living through the tail end of the boom that a previous generation brought about -- and perhaps the gestation period for the next leap forward.
Bloomberg View contributor Leonid Bershidsky is based in Berlin. For more columns from Bloomberg View, visit http://www.bloomberg.com/view