Growth and Technology: The Wild Ride Ahead

Joel Mokyr is an economist at Northwestern University who specializes in the history of technology.

Published April 28, 2014.

 

The economic history of the 20th century was nothing short of miraculous. The century experienced more technological progress in areas that truly affect material well-being than in all of previous history. Even a casual examination of the technological menus available to consumers, from lighting to dentistry and antibiotics, from laundering to musical entertainment and social interaction, confirms this.

Vaclav Smil, an eminent historian of 20th century technology at the University of Manitoba, has described what happened as the "astonishing concatenation of technical advances" creating "a new kind of civilization." He points out that most of the world's six billion people [today, more than seven billion] reside in "largely or overwhelmingly man-made rather than natural environments." Economic growth, more than ever before, was technology-driven.

Can this continue? A wave of pessimism has swept the economics profession – with many analysts concluding that the best is behind us, that the low-hanging fruit of technology has been picked and that we can no longer replicate the enormous technological successes attained during the second Industrial Revolution (1870-1914) and in the last decades of the 20th century. Some, notably my Northwestern University colleague Robert Gordon, have made this notion concrete by predicting a precipitous decline in per capita growth in the future. From a rate of about 2 percent annually in the United States in the 20th century (and similar figures for the rest of the industrialized world), we are told that in the coming decades it will be, at best, 0.5 percent – and not even that for those who find themselves in the "bottom 99 percent." Things look even worse in terms of productivity growth.

One objection I have to these calculations is that computations of productivity and growth are mostly designed for very short-term comparisons – say, to measure this year's results against last year's. But the longer the period, the dicier the comparisons become, especially during an era of rapid technological change. New products appear on the market that augment consumer welfare in ways that would have been unimaginable before, while existing products are improved in so many dimensions that it seems silly to compare them with those of a decade earlier. In how many ways is an Apple iPhone 5 "better" than a Nokia flip-phone, vintage 1995? The same is true for services: how does one compare the reliability and certainty of ordering a taxi from Uber or Gettaxi with the Hail Mary service and the long waits of phone-operated taxi companies of yore?

Bite-Back

There is a deeper and more troubling dimension to those comparisons, though, that is worth a close look – a phenomenon that sheds a different light on the dispute between techno-pessimists and techno-optimists. The problem with technological progress is not just that we are hooked on it to raise living standards. Far more often than not, implementation initiates a journey into the unknown, with consequences that cannot be foreseen at the time the innovation is introduced.

This is true almost by definition. To predict the full ramifications and fallout of every new technology, we would need a complete understanding of the forces that govern it. Yet such is rarely, if ever, the case: when pharmaceutical scientists develop a new drug, they cannot foresee all the side effects (though not for lack of trying). Indeed, most technologies developed in the 20th century had unanticipated side effects, most of them negative.

This means the social costs of new techniques (as opposed to the costs captured in market prices) are systematically underestimated. In more technical terms, some of the gains in productivity were attained through "inputs" that were either not seen as scarce or else not paid for because nobody realized they were being used at all.

Yet accurate productivity computations require subtracting all inputs from the estimated output. If we fail to do so, we underestimate the costs of production and thus overestimate the gains from innovation. Eventually, society must pay the bill, either by living with (and adjusting to) the consequences or by coming up with (often costly) fixes to modify the technique and repair the damage.

Although formal national income accounting calculations are not exactly the stuff of great excitement, the issue here is sufficiently important to merit some emphasis. Suppose that a new technique is invented that adds 2 percent to GDP, but also suppose that this technique is later discovered to cause damage that needs to be remedied at the cost of 0.5 percent of GDP. This means that the original gains were overestimated by one-third and that the full gains are not realized until the damage is repaired.
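
To spell out that arithmetic – a quick sketch using only the hypothetical figures above, writing $g$ for the measured gain and $d$ for the later remediation cost:

\[
g = 2\%\ \text{of GDP}, \qquad d = 0.5\%\ \text{of GDP}, \qquad \text{true gain} = g - d = 1.5\%\ \text{of GDP},
\]
\[
\frac{g - (g - d)}{g - d} \;=\; \frac{0.5}{1.5} \;=\; \frac{1}{3},
\]

so the measured gain overstates the true gain by one-third, and the full gain is realized only after the 0.5 percent of GDP has actually been spent repairing the damage.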

How common are such cases of unanticipated costs? Very common; indeed, it is hard to come up with examples of a major breakthrough in technology in which it was not later realized that the accompanying "creative destruction" included some of the uncreative sort. Unfortunately, correcting national income calculations to account for such effects is difficult because the exact costs of the "omitted input" are not known (and by definition are not paid for).

Bite-Back, Up Close and Personal

The mother of all omitted inputs, surely, is climate stability. We now know, as certainly as one can ever know such things, that the engine of much economic growth, the burning of fossil fuels, uses resources that were never imagined to be scarce by those who built the first coal-burning steam engines in the early 18th century: climate stability, sea-water temperature and acidity, the size of the Arctic ice cover, and the surface size of the oceans.

Robert Pindyck of MIT, one of the foremost experts on the economics of climate change, has (much like Socrates) concluded that the only thing we know about it is that we do not know anything. But it's plain enough that, had we subtracted even a rough proxy of the full social cost of the energy used so profligately in the 20th century from the value of output it produced, productivity growth would have been much lower than is generally believed.

The problematic relationship between energy technology and climate change comes up in many other contexts. As the technology analyst Edward Tenner noted in his seminal 1996 book Why Things Bite Back: Technology and the Revenge of Unintended Consequences, most of the path-breaking inventions of the 20th century have unwittingly used up some valuable resource that was not paid for because the fact of its existence (and scarcity) was only discovered much later. Chlorofluorocarbons, once used almost universally as refrigerant gases, were found to destroy a scarce resource nobody before paid any attention to: the atmosphere's ozone layer.

Meanwhile, DDT, a wondrously effective insecticide discovered on the eve of World War II, proved as dangerous to two- and four-legged creatures as it was to six-legged ones.

More generally, our war on noxious critters seems to be a continuous series of forward moves followed by reverses, as rapidly multiplying organisms mutate around whatever poison we throw at them. Antibiotics, one of the most significant discoveries of all time, have a built-in bite-back mechanism: with enough exposure, bacteria mutate sufficiently to become drug-resistant. Antibiotics' ancillary benefit to agricultural productivity in the second half of the 20th century has been significant. But the cost in terms of loss of their efficacy in containing human disease must be weighed against those benefits.

It is thus now plain that we have overestimated the productivity gains associated with technological change in the 20th century. The degree of overestimation depends on the costs of remedying the damage, or of finding an alternative way of producing the gain. Yet, since such costs are still unknown in most areas, the calculations by Gordon and others suggesting that we cannot possibly match the productivity growth of the 20th century are robbed of much of their meaning. This ignorance has been historically costly: almost three decades ago, The Economist asked rhetorically whether the internal combustion engine, had it been charged its full environmental cost from the start, would have been adopted at all.

Why is bite-back so common? When we invent something, we know enough about the underlying science to make it work, but rarely know enough to assess all the potential side effects. This is well-recognized in pharmaceutics – hence FDA testing. But it has been equally true in the disruption of ecological systems (which are enormously complex), and in many other areas of economic activity. Like the sorcerer's apprentice, we sometimes unleash forces we do not fully understand and cannot control.

The surprising discovery of omitted inputs is particularly interesting in the case of one of the most important inventions of the 20th century, the Haber-Bosch process for making ammonia from atmospheric nitrogen. There can be no doubt that existing supplies of nitrates from mineral sources alone could not have provided enough fertilizer to feed a rapidly growing humanity. By the year 2000, half the nutrients supplied by the world's crops and 40 percent of proteins could be traced to Haber-Bosch. But it was not suspected until fairly recently that the casual application of nitrates to agriculture threatened water supplies.

Fertilizer runoff has become a serious threat to both aquifers – in sufficient concentrations, nitrates make water non-potable – and coastal ecologies. Man-made eutrophication has led to massive algae blooms and the appearance of large "dead zones" in coastal waters. The dead zone in the Gulf of Mexico was estimated in 2011 at about 6,700 square miles, an area the size of Connecticut. The same is true for phosphorus, another essential ingredient of fertilizers (and thus plant life).

But the bite-back effects of technological progress are often much more insidious than environmental damage. Unintended consequences come from unexpected corners. The history of sugar is a case in point.

For much of human history, sugar was rare and its consumption limited to the very rich. However, cultivation of sugar cane on New World plantations and, later, the development of sugar beets that flourished in cooler climates meant that sugar became available to all. One result was a sharp increase in tooth decay in the industrialized world. Thus, part of the added output of dentists needs to be subtracted from the national accounts, because dentistry was in large part necessitated by easy access to sugar.

Quantitatively, this is, of course, a tiny effect, but the concept scales up to agricultural productivity in general. The growth of agricultural productivity since 1890 has increased the consumption of calories from proteins and fat. While this was at first a desirable outcome, it eventually led to an epidemic of obesity and associated health problems.

Obesity is rarely taken into account as a negative unanticipated side effect of technological progress, but it should be. Junk food is cheap because we are very efficient at making and distributing it. Much of the developing world's population today is struggling with rising obesity, even as many others must still worry about widespread malnutrition – 70 percent of all Mexicans are overweight, and a third are clinically obese. A recent study by the Overseas Development Institute puts the number of overweight people in developing countries at around 900 million, three times the figure in 1980.

Healing these self-inflicted wounds would be very costly; the cost thus should have been subtracted as "omitted inputs" in productivity calculations. Of course, this was not done and could not have been done. Who could have known in 1921 that adding a lead compound to gasoline to make car engines run better would lead to an enormous cost in terms of lead poisoning? Some scholars have even argued that the lead in gasoline was in part responsible for rising crime rates.

(Thomas Midgley, the General Motors chemist who developed leaded gasoline, might be described as the king of technological bite-back. He later developed Freon, the gas that was long used as a refrigerant but was later phased out because it damaged the atmosphere's ozone layer.)

The same is true for construction materials: lead-based paints and asbestos, to name just the most obvious ones, were later discovered to cause serious health problems. Asbestos, known since antiquity (the word is derived from ancient Greek), has fascinated engineers and chemists for centuries and was widely hailed as a miracle material – one that was abundant, strong, malleable and fire-resistant and, in combination with rubber or cement, a very effective building insulator. In 1939, the New York World's Fair had an exhibit celebrating asbestos' "service to humanity." Only in the 1960s were the dangers of asbestos fully recognized. The campaign to stop its use and remove it from millions of structures has cost $50 billion in the past 20 years.

The point here should now be clear: by not adjusting productivity calculations for these bite-back effects, we make the 20th century look better than it really was and, by implication, probably make the future look worse than it will be. Only when additional advances take account of the omitted costs involved in employing new technologies will we be able to know how much they contributed to productivity.

In some cases, such as asbestos, the gain may in fact be a loss. In others, such as antibiotics, we simply need to put in a lot of effort to retain the gains we have already made. Leaded gasoline turned out to be easy to fix. Ocean acidity will not be.

More is More

Contrary to the suggestions of some of the more wild-eyed technophobes, my conclusion is not that technological progress has been an unmitigated disaster. Technological change does not need to be slowed. Quite the reverse: we need more of it. Unlike the sorcerer's apprentice, we eventually learn, adjust and correct. Technology creates problems, and technology fixes them. The remedy for technology's unintended consequences is to fix, whenever possible, the techniques causing them, and/or to replace problematic technologies with more benign ones.

This is not wishful thinking. In the past, adaptation has worked more often than not. Burning coal for home heating, electricity generation and manufacturing (made possible by continuous cost declines in the production and transportation of coal since 1800) led to massive urban air pollution. The problem was largely solved by switching to low-sulfur coal, cleaning up smokestacks or moving on to natural gas. Sugar-induced tooth decay was drastically reduced by adding fluoride to drinking water. The need for tetraethyl lead in gasoline was eliminated by technical advances in automotive engineering and petroleum chemistry.

Consider the issue of global warming, about which so much is being written. It seems, as of now, highly unlikely that a political solution will be negotiated that drastically curbs carbon emissions. So some technological fix will have to be found. The possibilities vary from more reliance on renewable fuels (such as solar and wind power) to geoengineering that reduces the amount of the sun's energy trapped by the atmosphere (although the possible bite-back effects here could be horrendous).

More plausibly, we will be driven to partial technological adaptations. For example, those who live on land increasingly vulnerable to flooding because of rising sea levels may be resettled or protected by barriers. We may also need to change building codes and construct dwellings on stilts to protect them from occasional surges.

The ongoing acidification of the world's oceans, largely a consequence of the seas absorbing carbon dioxide from the atmosphere, poses another major challenge. The water's acidity has increased substantially (with average surface pH already declining from 8.2 to 8.1), endangering shell-forming organisms and plankton. That, plus serious overfishing, almost guarantees that we will grow ever more dependent on farm-raised seafood. Painful as it may sound, this will be an adaptation to technological bite-back that closely parallels one experienced thousands of years ago. As hunting technology improved, land animals became scarce, and their domestication in the Neolithic age was an adaptation to that scarcity.
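
To put that pH decline in perspective – a back-of-the-envelope calculation, since pH is a logarithmic scale – a drop of 0.1 units means

\[
\frac{[\mathrm{H}^{+}]_{\mathrm{pH}=8.1}}{[\mathrm{H}^{+}]_{\mathrm{pH}=8.2}} \;=\; 10^{\,8.2-8.1} \;=\; 10^{0.1} \;\approx\; 1.26,
\]

that is, roughly a 26 percent increase in the concentration of hydrogen ions in surface waters.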

We do not eat much game anymore, and we think little of it. Modern technology, using best-practice physiology and genetics, computer-controlled fish ponds and robots, is certainly up to the task of providing fish lovers with what they want, even if the oceans are eerily empty.

Adaptation will be made possible by a group of technologies developed in the last three decades: genetic engineering. The potential of genetically modified organisms to "repair" the damage done by previous technologies is now recognized, but its full impact is still in the future. There are already glimpses, though, of what can be done.

One of the biggest bite-backs of agricultural technology is the salinization of soils and ground water resulting from water overuse and drought. The problem is particularly acute in Africa and the Middle East, but is also serious in Texas and China. Genetically modified saline-tolerant crop varieties have already been developed – for example, by inserting a gene from a plant that grows well on saline soils into a rice variety.

It is also possible that genetic engineering will come up with new fish varieties that thrive in more acidic oceans – in which case the bleak prediction of fishless oceans may not come to pass after all. Genetically modified organisms may also be the answer to nitrate pollution: some plants, such as clover, are able to produce their own nitrogen fertilizers by cultivating symbiotic bacteria that convert atmospheric nitrogen into fertilizer. Genetic research is trying to "teach" other plants to do the same by inserting into them the appropriate genes from nitrogen-fixing plants. The GMO frontier is huge. Among other advances to date: soybeans modified to resist insects without the use of pesticides and "golden rice" fortified with vitamin A.

From this perspective, political opposition to GMOs seems particularly misplaced. If you love the environment, you should like these new plants. But more than anything else, they will help humanity clean up the mess left by earlier innovations.

To be sure, GMOs may generate bite-back, too. Precisely because the science of genetic modification is very young, we cannot know in advance what its own unintended consequences might be. It is those effects that people who object to GMOs are concerned about. But there are solid reasons to believe it is unlikely that the bite-back will be so large that the costs exceed the benefits (the "asbestos syndrome").

The nightmare scenario in which some "Frankenfood" wipes out other crops or causes some unanticipated disaster is very unlikely. While it cannot be ruled out altogether, as our knowledge of molecular genetics increases exponentially with time, the risks seem manageable.

* * *

The human species has been on a wild techno-ride for millennia, as innovation after innovation disrupted business as usual. Bite-back is common, and in some cases disastrous. Yet, while technological progress is never riskless, the risks of stasis are far more troubling. Getting off the roller coaster midride is not an option.
