The Dark Side
John Komlos is professor emeritus of economics and of economic history at the University of Munich.
Illustrations by Jacob Thomas
Published January 20, 2015
It has been six years since Joseph Stiglitz, the Columbia University Nobel-Prize-winning economist, coined the term "GDP fetishism" to explain the almost religious reverence paid to GDP as a surrogate for societal welfare. But GDP isn't the only concept to command veneration from the congregation. "Disruptive innovation," the rocket fuel allegedly propelling growth, has also been assigned an honored place in the pantheon of underexamined economic virtues.
But I get ahead of myself. It first makes sense to take another look at an immensely influential seven-decade-old concept that lies behind much that we celebrate in modern economies. In 1942, Joseph Schumpeter, the Austrian-born Harvard economist, famously dubbed the process of growth-powering innovation "creative destruction." In his then-novel framework, profit-seeking entrepreneurs invent products or processes in order to increase efficiency, improve quality or lower prices. The old is swept away by the new in the relentless Darwinian (or, for history-of-ideas buffs, Spencerian) competition for survival.
Thus, economic creativity in Schumpeter's conceptualization is (like natural selection) at once constructive and destructive: there are almost inevitably losers as well as gainers. Nonetheless, Schumpeter and those who took up the Schumpeterian banner have stressed that creative destruction was, in the main, welfare-enhancing.
I am less convinced. There is, indeed, a significant downside to creative destruction that Western societies – in particular, the United States – are disinclined to notice. In fact, I'll go a step further: the character of disruptive innovation is evolving in ways that lead to more destruction and less creation.
Creative Destruction – Emphasis on the Latter
The destructive component of innovation, whether organizational or technological, can be viewed as a negative "externality" – a cost borne by third parties in the way that the consequences of pollution spewed by a factory are borne by its neighbors rather than by its owners or customers. To take a simple example: integrating cameras into mobile phones rapidly led to the decline in demand for stand-alone point-and-shoot cameras, and may well have hastened the demise of once-mighty Eastman Kodak. In 1998, that iconic camera and film manufacturer employed 86,000 people (and paid them decent wages). In 2014, after emerging from bankruptcy, it had a skeleton workforce of 8,000.
Look a bit more closely at the tension between creation and destruction. Suppose a new invention adds $50 million to the wealth of the inventor and another $50 million to the welfare of consumers. One might conclude that the societal gain equals $100 million. But further suppose that the invention makes the capital equipment of a rival obsolete, worthless for any use in a competitive marketplace. Suppose further that the specialized labor that operated the now-obsolete capital equipment no longer has value in the market and joins the ranks of the unemployed. Then, to calculate the full impact of the innovation on societal welfare, the depreciation of the physical capital and the skills of the labor force need to be netted from the aforementioned $100 million gain.
One would still expect some net gain from the creative destruction. That is, the loss to the losers would probably be less than $100 million; otherwise, it would pay the losers to buy the rights to the invention and bury it. But that is not necessarily the case in the real world, where firings and plant closings can lead to chain reactions of socioeconomic displacement that reverberate through families or whole communities. And, in any case, all too often the gains are celebrated while the losses are ignored – or even rationalized in social Darwinian terms as the just deserts of the unproductive.
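The netting logic above can be sketched in a few lines of Python. The $50 million gains are the article's hypothetical figures; the specific loss values assigned to written-off capital and skills are illustrative assumptions, not numbers from the text:

```python
# Hypothetical welfare accounting for an innovation.
# The two gains use the article's illustrative figures;
# the two losses are assumed values for the sake of the sketch.
inventor_gain = 50_000_000        # wealth added for the inventor
consumer_gain = 50_000_000        # welfare added for consumers
gross_gain = inventor_gain + consumer_gain

capital_written_off = 30_000_000  # rival's equipment made worthless (assumed)
skills_written_off = 20_000_000   # displaced workers' devalued skills (assumed)
destruction = capital_written_off + skills_written_off

net_social_gain = gross_gain - destruction
print(net_social_gain)  # 50000000: still positive on these assumptions,
                        # but half the headline $100 million
```

The point of the exercise is simply that the conventional celebration counts only `gross_gain`; honest accounting subtracts the depreciation of displaced capital and labor first.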
Apple, arguably one of the firms most responsible for Kodak's death spiral, has but 47,000 employees, two-thirds of whom are earning below-middle-class wages. More broadly, there is every reason to believe that the digital revolution has, on balance, destroyed a lot of jobs. U.S. employment in the Internet-publishing, broadcasting and search-portals sector has increased by 87,000 since 1999; in the same period, however, the number of jobs in newspaper publishing was halved, with a decline of 212,000 positions.
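The employment figures just cited can be netted the same way, using the two sector totals given above:

```python
# Net U.S. job change across the two sectors the article cites, 1999-2014.
internet_sector_gain = 87_000   # internet publishing, broadcasting, search portals
newspaper_jobs_lost = 212_000   # decline in newspaper publishing

net_change = internet_sector_gain - newspaper_jobs_lost
print(net_change)  # -125000: a substantial net loss of jobs
```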
The externalities from creative destruction may also fall on consumers. Consider the case in which a new product wipes out the market for an existing one – as when, in very short order, the DVD made the videocassette player obsolete. Videocassette players still played tapes. But the innovation stopped the sale of new content on tape, requiring consumers to buy DVD players if they wished to watch new movies and the like.
Note, too, the intrusion of the awkward issue of what is commonly called "planned obsolescence." Classical economic theory assumes consumer tastes are formed independently; producers merely strive to satisfy them. That surely isn't always the case – otherwise, the only effective advertising would be purely informational. It is hard to say, though, whether firms often design new models with the primary goal of reducing the value of their old models. While Apple's regular introduction of new iPhone models may have that effect, the company has little choice if it is to stay competitive with the latest offerings from Samsung and HTC – or to smother new market entrants in their cribs.
But one doesn't need to believe that producers are planning obsolescence to recognize that many consumer innovations are superficial – or simply figments of marketers' imaginations – and may lead to relatively small welfare gains for consumers, even as they reduce the perceived value of existing equipment. By the same token, one can imagine that, as often as not, electronics producers deliberately make new stuff that's incompatible with the last generation of connectors and software – or resist efforts to create uniform industry standards for hardware and software – in order to raise the cost of upgrading to the latest and greatest.
Clothing fashion, where there's little argument that most change is for change's sake, is another example of an industry in which innovation tests our implicit bias in favor of innovation. This year's fashion may generate consumer welfare even if the clothing offers no objective improvement, because novelty is fun. But there's no denying the cost in terms of devaluing last year's fashion. Note, moreover, that the new fashion was not demanded by consumers; demand came after the fact. In the case of "positional goods," whose benefits consist entirely of keeping their owners ahead of the Joneses, new ones surely create little or no net value.
More Destruction, Less Creation?
In his deification of the innovating entrepreneur, Schumpeter was thinking of the great disruptive innovations associated with both the first and second industrial revolutions – everything from the steam engine to the telephone to the automobile to the radio. The negative externalities associated with these technologies were small or even negligible compared with the gains in productivity and the resulting improvements in the quality of life. That's because many of these were completely new products (penicillin) or improved productivity across old sectors and new (electrification). Moreover, all of them were capable of capturing economies of scale previously undreamed of, and all satisfied a need innate to human nature, so consumers required little persuading to adopt them if they could afford them. Then, too, the firms they displaced were generally small-scale operations with little capital to depreciate.
What's more, these new technologies used labor on a massive scale so that the workers displaced by the innovations could easily find employment in the new sectors of the economy (though not always in industries that valued their specialized skills). Hence, the destructive force of those innovations was usually small in absolute terms and virtually always small relative to the creative component.
For example, the societal value of replacing kerosene lamps with incandescent bulb lighting was undeniably enormous in terms of reliability, convenience, health and safety. On the other side of the coin, the capital made obsolete and the labor displaced in the kerosene lamp industry was not a major loss to the economy. Similarly, the telephone was a new technology that replaced nothing but some mail – whose volume continued to grow, anyway.
The closer the substitutability between the new and the old products (or the new and old ways of making something), the greater the risk that the cost of displacement is relatively high. Most of the innovations of the 19th and early 20th centuries led to little such substitution, however. But that, I would conjecture, is no longer the case.
The Problematic Arc of Creative Destruction
It's hard to quantify this alleged trend toward more destruction and less creation. But, at the anecdotal level, the change is striking. Take mobile communications, which seems to be stranding the huge investment in land-line telephony in rich countries and devaluing the specialized skills of the myriad workers who built and maintained it. Ironically, mobile communications seems to be eating its own entrails, and at an accelerated pace, as marketers have transformed smartphones (which cost as much as laptops to build) into rapidly depreciating fashion statements. You still have an iPhone 5? It is so yesterday, now that the iPhone 6 – and the Samsung Galaxy S5 – are available.
Consider, too, that modern products ranging from digital electronics to pharmaceuticals require huge investments in development, but generally cost relatively little to manufacture. Moreover, many exhibit network effects, in which the value to one consumer depends on how many other consumers have adopted the product. As a result, competition in these markets often leads to winner-take-all outcomes, in which the product with a slight edge in features or marketing acumen obliterates rivals. By no coincidence, Silicon Valley is full of rags-to-riches-and-back-to-rags stories, in which successful products are rapidly displaced by the Next Big Things.
Time Warner, you may remember, effectively paid close to $200 billion for AOL, the Internet service pioneer, in what was seen (by Time Warner, anyway) as a merger of equals in 2000. Come to think of it, you probably don't remember, since Time Warner stockholders lost their entire investment within a few years as AOL sank like a stone in a world of changing technology and marketing-driven tastes.
One could argue that the collective gains from social networking facilitated by Facebook, Twitter, Pinterest, Tumblr, Instagram, etc. look more like earlier generations of innovation in which it's easy to see the gains and hard to see the losses. But at the risk of sounding ancient, I would suggest that the endless online chatter thereby facilitated mostly replaces older ways of socializing without adding much to well-being. The market capitalization of Facebook is a humongous $200 billion (as this was being written), but one has to wonder whether it is just monetizing activities previously left outside the market's purview.
The current list of disruptive technologies widely seen as likely to usher in future waves of innovation ranges from autonomous vehicles to artificial intelligence to education based on massive open online courses. They may or may not deliver on the hype. But it seems pretty clear that, if they do, they are going to displace a lot of jobs. Economists are too easily inclined to dismiss these losses as frictional, problems to be solved by labor markets made more efficient by searching, telecommuting and retraining at rapidly declining cost, thanks (again) to the Internet.
Maybe. But that is far from a consensus view. In a recent book, The Second Machine Age, Erik Brynjolfsson and Andrew McAfee of MIT warn that labor-saving innovation is overwhelming the capacity of the economy to create jobs that are sufficiently productive to yield decent wages. [Read an excerpt in our Third Quarter 2014 issue – the editors.] What used to be called automation, they suggest, is about to make many sorts of skills uncompetitive with machines at virtually any price.
Actually, in this litany of concerns we have thus far failed to take account of the most familiar negative externalities of innovation, the ones that threaten health, safety and the environment – what Joel Mokyr of Northwestern calls the "bite-backs" that range from asbestos-induced cancer to that monster in the closet, fossil-fuel-induced climate change. [See Mokyr's analysis in our Second Quarter 2014 issue – the editors.]
Wait; there's more. Not all successful market-driven innovations even aim to enhance productivity or increase consumer welfare. Many are designed for what economists call "rent seeking" – capturing existing economic surpluses that would have otherwise gone to others. The share of GDP going to the financial services industry roughly doubled in the two decades preceding the Great Recession. And while some innovations in these years did make capital markets more efficient (think index funds and asset securitization), others did little more than shift risk from private financial intermediaries to government agencies (uninsured money market funds, Fannie Mae and Freddie Mac) or lure low-income households into mortgages they couldn't afford. The "bite-back": trillions in lost output and untold misery for the unemployed and foreclosed during the recession.
So, What Can We Do About It?
We shouldn't (and really couldn't) stop innovation, but we should recognize the dark side and begin to think of ways to mitigate the pain of the victims. This means acknowledging that innovation generates losers who, in some circumstances, should be compensated for their losses.
It has become a cliché (no less true for its repetition) that, since the 1970s, technological and organizational change (including globalization) has meant that the bounty of growth in the industrialized world has largely gone to the rich, even as the volatility of income increased for the bottom half. It has also become a cliché (equally true) that interfering directly with market outcomes to redistribute income creates the risk of slowing productivity-enhancing innovation. But this hardly implies that incomes shouldn't be redistributed by indirect means – notably through tax policy and the delivery of productivity-enhancing services (education, health care) to the losers in the games of markets.
By the same token, there is a fairly strong case to be made for reducing the income volatility that has been exacerbated by innovation. For example, research suggests that increasing the duration of unemployment compensation has only a marginal impact on the willingness of the jobless to search for work.
Then, there is the question – really, questions – of the regulation of innovation. There's a long, and for the most part positive, history of requiring innovators in endeavors ranging from automobiles to pharmaceuticals to take steps to minimize bite-back. And in cases where some bite-back is the price of vital change, there's some precedent for compensating the losers. Thus, the very few children harmed by epidemic-preventing vaccinations are compensated from a no-fault fund financed by fees on vaccine sales.
But in an era of rapid innovation – and, I believe, growing costs in harm to workers, consumers and the environment – it would make a lot of sense to get serious about finding ways to compensate the losers and, when practical, to shift the burdens to those who create the costs in the first place. That doesn't necessarily mean, say, protecting bricks-and-mortar bookstores from online booksellers. But it does mean enforcing the antitrust laws to prevent the accumulation of market power by innovators. That doesn't necessarily mean barring the use of coal in generating electricity. But it does mean enforcing environmental and safety laws on the coal industry and taxing coal to reflect its impact on climate.
The transition to a post-industrial economy has been far from advantageous to a substantial share of the population. Just because we have been innovating and growing successfully for a quarter of a millennium by no means implies that the process will, or should, continue indefinitely. No such economic law exists, and the historical record indicates that there are times when economic regimes reach a tipping point and abruptly change direction.
That is what I believe is happening now. At the very least, it is time to acknowledge the possibility.