* Princeton University Press (2023).
All rights reserved.
Angus Deaton may not have the public name recognition of, say, Paul Krugman or Ben Bernanke, but within the economics profession he’s secured his place as a giant. Scottish-born and Cambridge-educated, Sir Angus (yes, really...) has been teaching at Princeton for the past 40 years and researching a variety of data-driven topics related to human development – often with his wife, the Princeton economist Anne Case. His work earned him a Nobel Prize in 2015 (and that knighthood in 2016). But his visibility today is mostly linked to Deaths of Despair and the Future of Capitalism, which he wrote with Case in 2020. That book documented the shocking fall in life expectancy in America in recent years and its likely cause. His just-published book, Economics in America: An Immigrant Economist Explores the Land of Inequality, is an anecdote-rich account of what economists do for a living. Here, we excerpt the chapter that names names in assigning economists part of the blame for the country’s division.
— Peter Passell
Published October 23, 2023
The financial crisis that began in 2008 with the collapse of Lehman Brothers was a pivotal event, not just for those who were harmed in the subsequent recession but also for stimulating discussion about whether the American and global economies were fit for purpose. The discussion has gone on long after the crisis itself. Many serious commentators continue to worry that democracy is incompatible with capitalism, at least capitalism as currently practiced and regulated. The rich whose actions had caused the crisis made off with hundreds of millions of dollars and were never punished, while many ordinary people lost their jobs and their homes. Most economists, including forecasting institutions such as the International Monetary Fund and the Organization for Economic Cooperation and Development, did not predict the crisis, a public failure that inspired the Queen on a visit to the London School of Economics to ask, “Why did no one see it coming?” Before the crisis, many economists had promoted the elaborate financial engineering that underlay the collapse, confident in the power of financial markets to create wealth and to regulate themselves.
Once it happened, economists were far from united on what to do about it. Alan Blinder wrote in 2022, “The financial crisis was the result of a series of grievous errors, misjudgments, and even frauds by private-sector companies and individuals, aided and abetted by leaders such as George W. Bush and Alan Greenspan, who were unduly enamored of laissez-faire and viscerally attached to the vaunted wisdom of the market.”
This is a tale that cannot be told too often, of government-enabled rent-seeking and destruction supported by the ideology of market fundamentalism. Not all economists supported that ideology, but many did and still do.
Not entirely unrelated to these events came a populist upsurge that brought the election of Donald Trump in 2016, his failure to acknowledge his defeat by Joe Biden in 2020 and an ongoing threat to American electoral democracy. One can perhaps understand why so many are unfazed by a threat to democracy as it is currently working, given that it has long failed to work for them.
Economists did not cause the financial crisis, nor did they bring deaths of despair. But many would assign them a good deal of responsibility for their reckless enthusiasm for markets in general and financial markets in particular, and they were often relaxed about the growing inequality that markets were generating. As to health, there is always a ready supply of economists denouncing government interference or price control in a health care system whose exorbitant costs are destroying good jobs and spreading despair.
The big question is whether today’s American capitalism – and to a lesser extent capitalism in other rich countries – continues to be compatible with liberal democracy. I do not have the answer to this question, but I do want to explore the question of the responsibility (if any) of my profession in bringing us to this pass.
Business cycles have long been a central topic in economics, and many individuals in the generation before mine, who came of age in the Great Depression, became economists to better understand the horrors of mass unemployment, and dedicated their professional lives to ensuring that it would never happen again. To a large extent, they, and we, thought that they had succeeded, if not perfectly, then close enough. The crash in the fall of 2008 was a great surprise, rather like being told that the plague was back. Then, in the spring of 2020, the plague did come back (though that is a different story).
Encountering the financial crisis, or the Great Recession as it was called both to echo and to separate it from the Great Depression, was therefore rather like meeting a dinosaur or attending the premiere of a Shakespeare play instead of reading about them in history books. As always, textbooks leave things out, and so the experience seemed fresh. When I was an undergraduate, my Cambridge University teachers explained how the Great Depression need never have happened, if the benighted policymakers had only understood John Maynard Keynes’s insight that government spending – stimulus policies – could cure unemployment and restart the factories, just as diabetics need never have died if they had only known about insulin. As in too much of the economics we learned, politics was little mentioned, but in 2007 and 2008, the politics came back with a vengeance.
The Republican Party was unanimously anti-Keynesian and robustly challenged the post-crisis stimulus policies. Republicans accused the Obama administration of printing money, debasing the dollar, stealing from future generations and turning the USA into the USSA – the first S stands for socialist. A sinister purpose was even read into the visit to Washington in March 2009 of Britain’s socialist prime minister Gordon Brown, leader of the Labour Party, who had previously been a successful and orthodox finance minister. Such talk would not have been unfamiliar 80 years ago. Many politicians and much of the media take it as obvious that the stock market measures social welfare and that the job of any administration is to keep it high. As a result, the fall in the market in the early days of the Obama administration was taken as showing that its policies had failed.
Most American economists – including many who have advised and worked with Republican administrations – did not argue against government stimulus spending in and of itself. Yet there has been no unanimity in the profession. Robert Barro of Harvard, who is one of the top-ten most cited economists in the world, wrote about what he called “Voodoo multipliers” and sounded a common theme, that the crisis does “not invalidate everything we have learned about macroeconomics since 1936.” The multiplier he refers to is the factor by which stimulus spending will add to national income, a number that the administration’s economists believed was greater than one; after all, the post-crash unemployment of labor and capital left unused resources that could be brought into play.
Barro, by contrast, argued that the multiplier is zero, because the government cannot do anything that the market cannot do better, and will simply replace private spending that would otherwise have taken place. Barro is most famous for his argument that deficit spending generates offsetting saving by consumers. He argues that people realize that, in the end, the government will have to pay the money back, that the repayment will have to be financed by higher taxes, and so they will save in anticipation of the day that they or their descendants will have to make restitution by paying those taxes.
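The textbook arithmetic behind this disagreement is simple, and worth making explicit. In the standard Keynesian story, each dollar of stimulus is respent at the marginal propensity to consume, so the multiplier is the sum of a geometric series; the figures below are a stylized illustration with an assumed propensity, not numbers from the debate itself:

```latex
% Stylized Keynesian spending multiplier: a stimulus \Delta G is respent
% round after round at the marginal propensity to consume c, giving
\Delta Y = \Delta G \,(1 + c + c^2 + \cdots) = \frac{\Delta G}{1 - c}.
% With an assumed c = 0.6, the multiplier is 1/(1 - 0.6) = 2.5.
% Barro's position amounts to the claim that private spending falls
% one-for-one with government spending, so \Delta Y = 0: a multiplier of zero.
```

The administration's economists believed the true number was greater than one; Barro's Ricardian argument, by contrast, implies households offset the stimulus with saving, collapsing the series entirely.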
For most economists, including me, this insanity is an embarrassment, and the fact that Barro is taken seriously – and is a professor at Harvard, rather than a fringe blogger – is a sure indication that, indeed, macroeconomics has regressed, not progressed, since 1936. Still, there is some justice to the claim that it is possible to find some well-credentialed economist who will support any policy. And that Barro’s ideas on this are taken seriously does not redound to the credit of the profession.
Instead of a stimulus, Barro recommended eliminating the corporate income tax as a “brilliant” way to address the crisis. The late Ed Prescott of Arizona State noted that it is not true that all economists agree on the effectiveness of a fiscal stimulus, though “if you go down to the third-tier schools, yes, but they are not the people advancing the science.” Prescott won the Nobel Prize in 2004 for advancing the science, and specifically for understanding “the driving forces behind business cycles.” But even his presence does not propel Arizona State into the top tier. According to U.S. News and World Report, the graduate program there is tied for 38th place, far behind Harvard, MIT, Stanford and Princeton, many of whose economists are fans of fiscal stimulus.
The (libertarian) Cato Institute, one of whose co-founders was Charles Koch, found 200 economists to sign a full-page advertisement stating that government expenditure had not stimulated economies in the past, and would not do so then. Prominently absent from the signatories were professors of economics at the “third-tier schools” such as Harvard, MIT and Princeton, perhaps because so many of their faculty were in Washington, helping to construct the stimulus. It is not clear how many of the 200 signers agreed with Barro’s or Prescott’s economic analysis, and many may simply have been skeptical of the effectiveness of large government programs under American political conditions. Yet many economists do not appear to recognize that such programs might act differently in an economy in a slump rather than at full employment, which was Keynes’s point. Nor would they learn such a thing in many of today’s graduate courses in macroeconomics.
Most of the economists I know and talk to do not take Prescott’s or Barro’s work as a serious guide to policy. They accept that it is clever, is original and has opened up avenues that were not previously explored, even if those avenues would perhaps have been better left that way. The same is true of the other recent innovative approaches to macroeconomics, several of which have earned Nobel Prizes but have had little or no impact on policymaking in Washington.
Perhaps it would have been good if they had had more impact, though I believe not. Either way, whether it is the profession or the policymakers who are to blame, it is a distressing fact about my profession that 80 years of work in macroeconomics, much of which earned the highest accolades, has had so little effect on the policies that it nominally addressed. I also find it profoundly depressing that, at least as far as macroeconomic policy is concerned, there is no consensus that would convince an intelligent but skeptical layperson. Indeed, it is worse than that. Paul Krugman’s discussion of why economists got it so wrong, with which I am largely in sympathy, points to the huge divides among macroeconomists but is honest enough to admit that even those on the opposite side from Barro and Prescott have nothing like the coherent understanding of the aggregate economy that would support sound policymaking.
Lest I am taken as claiming that only macroeconomics is in trouble, there are other areas that are doing equally badly. In December 2008, I attended a meeting to “celebrate” 30 years of research on economic development at the World Bank, followed immediately by the American Economic Association (AEA) meetings in San Francisco. (Full disclosure: I organized the program at the AEA meetings.) Both had a feeling of crisis. At the Bank, it was clear that the model of economic development through aid or concessionary lending was broken, and that the research agenda that supports those loans and is financed out of them had become untethered from any chance of promoting development around the world. The atmosphere was dreary, the gloom unrelieved. It seemed that the idea that the international organizations, guided by economics, could promote growth in the world and eliminate poverty, an idea born after World War II and midwifed by Keynes among others, was dead.
The AEA meetings were not engineered to talk about crises in financial markets or in the profession, if only because the program was set nine months ahead and before the crisis. Yet much could be arranged at the last minute, and instead of gloom and depression there was a sense of invigoration, of a task to be done, and of the talent to deal with it. Over and over again people happily argued that, at last, macroeconomics would change. Perhaps so, and at the time of writing in early 2023, there is a great ferment of debate, with many mainstream economists challenging ideas that would not have been challenged fifteen years ago.
Deaths of Despair
One of the most important divisions in America today is between those who have a four-year college degree and those who do not. The bachelor’s degree has increasingly become a passport not only to a good job – the kind of job that is worth doing and whose rewards have steadily increased over the past half-century – but also to good health, to longevity, and to a flourishing social life. Without it, you risk being a second-class citizen, with implications for life at home and at work, and in time spent with others. Michael Sandel notes that “the idea that a college degree is a condition of dignified work and social esteem has a corrosive effect on democratic life. It devalues the contributions of those without the diploma, fuels prejudice against less-educated members of society, effectively excludes most working people from representative government, and provokes political backlash.”
In our book, Deaths of Despair and the Future of Capitalism, Anne Case and I tell the story of how the lives of Americans without a college degree have, on average, and in many dimensions, fallen behind those with a college degree. The gap began opening 50 years ago (around 1970), and has continued since, including the last few years of the pandemic. Keep in mind that, even today, when educational attainment is much more widespread than it used to be, only a third of American adults have a four-year college degree.
An epidemic of what Anne Case and I called “deaths of despair” – deaths from suicide, drug overdose and alcohol abuse – began before the financial crisis and continues to this day. For Americans without a college degree, life expectancy at age 25 has been falling since 1990. Among the worst villains of this story are rich pharma companies that exploited the despair in an economy and a society that was no longer serving the majority, and enriched themselves by promoting addiction and death. Yet the backdrop of despair that pharma exploited came from decades of an economy that was not delivering a good life for the two-thirds of the population without a four-year college degree.
Perhaps the most prominent gaps are seen in mortality and in life expectancy. After a century of increasing life expectancy – not only an indicator of health but, as many argue, a sensitive indicator of the state of the economy and society – life expectancy in the United States fell for three years in a row, from 2014 through 2017, something that had not happened in the century since the last pandemic in 1918-19. The rising mortality came not only from rising deaths of despair – suicide, drug overdose and alcoholic liver disease – but from a simultaneous slow-down and eventual cessation of the decline in deaths from cardiovascular disease that had been the main engine of mortality improvement in the last quarter of the 20th century.
Remarkably, this rising epidemic of deaths has almost entirely spared those with a four-year college degree. For those without that qualification, we draw a parallel to Émile Durkheim’s analysis of suicide, where people found themselves in an economy and society that no longer worked for them and no longer provided the support that they needed to make their lives worth living.
Even in normal times, there are suicides, drug overdoses and deaths from alcoholism among both the more and less educated, and indeed, until the last part of the 20th century, it had been believed that suicide was more common among those with more education. But the increase in deaths of despair, around 100,000 every year since the mid-1990s, is confined to those without a college degree. It is as if those without the degree must wear a scarlet badge denoting their inferior status. Suicide itself is now more common among those without a college degree, those wearing the badge.
Death is the terminus at the end of the long road of despair. The starting point is a labor market that increasingly excludes those without a four-year college degree from good jobs. The fraction of non-elderly adults who are employed has been declining for less-educated men for half a century, and for less-educated women since 2000. Participation in work increases in boom times and falls back in recessions, but the rise in the next boom never attains the previous peak. The same is true for the real value of wages, falling and rising around a falling trend. For men, even in the boom leading into the pandemic – when the rise in wages for less-educated men was being loudly celebrated – the purchasing power of wages for men without the degree was lower than at any date in the 1980s.
The failing labor market spills over into the rest of life. Unions are now almost nonexistent in the private sector. Unions not only raised wages for their members, as well as for many nonmembers, but also kept an eye on working conditions – federal authorities are not always effective in preventing even illegal practices – and were often a center for social life. Bob Putnam’s famous solitary bowler was bowling in a union hall; neither would likely be there today. Unions provided countervailing power for working people not only at work but in local and national politics. Unions have little power in Washington today, and even the most powerful union lobbies are outspent by several individual corporations like Facebook and Google.
Marriage has declined among the less educated, but not among those with college degrees. Instead of marrying, many Americans participate in serial cohabitations, often having children, with the result that men in middle age, although often father to several children, do not know their kids, who are living with their mothers or perhaps other men. These nontraditional family and childbearing patterns may appear to promise personal and sexual liberty for the young, but for those who are middle-aged and older, they cannot provide the comfort and stability of traditional arrangements, at least when they work well.
Morbidity has risen alongside mortality. In an extraordinary reversal of a law of nature, middle-aged Americans now report more pain than do elderly Americans. Once again, this is true only for those without a four-year college degree and is not, in fact, a reversal of the process of aging but happens because those in midlife today have experienced more pain throughout their lives than have today’s elderly.
The largest part of the increase in deaths of despair comes from opioid overdoses. For this, pharmaceutical companies bear huge blame; the initial wave of opioid deaths was a result of wealth-seeking pharma companies pursuing profits by addicting people. Pharma knew to target the less-educated, because it was the less-educated whose lives were in disorder; more broadly, historical opioid epidemics have happened in places and at times of social turmoil and disintegration. Pharma and their distributors were supported and defended by politicians, some “representing” the places most deeply affected. Money speaks very loudly in American politics, and when it comes to choosing between the interests of your voters and campaign finance, the choice is often the latter.
Meanwhile, suicide rates rose to levels that used to characterize only the worst societies on earth: the former Soviet Union and its satellites. Even in those countries, as throughout the world, suicide rates have been falling. American suicide rates – especially those for less-educated Americans – are a notable and disgraceful exception.
Economists and Deaths of Despair
Economists are perhaps less split over the causes of deaths of despair than they are over the causes of the financial crisis. Yet, the familiar divides between right and left soon appear. The facts themselves are not in dispute, and the National Center for Health Statistics (part of the Centers for Disease Control and Prevention) confirmed Anne Case’s and my calculations soon after our first publications, but different writers assign blame differently.
Our own story sees the decline in good jobs for less-educated Americans as the key. This decline, in response to globalization and, more importantly, technical change (robots), is made much worse in the United States than elsewhere by the grotesquely exorbitant cost of health care. Much of the cost is financed through health insurance premiums paid by employers, premiums that are much the same for low- and high-income workers, making the former much more expensive relative to their contribution to the firm. Beyond that, when bad things happen and people need help, the safety net in the United States is fragmentary compared with those in other rich countries.
Others lay the blame on the victims themselves. Although he does not explicitly write about deaths of despair, Charles Murray identifies the same rising gaps between the more and less educated but attributes them to a decline in virtue among the latter, particularly the virtue of industriousness. People are not working, on this account, because they are lazy. Murray previously made the same argument about Black American communities in the 1960s and 1970s. But there was a more compelling story from William Julius Wilson, who saw the loss of jobs as the key, just as Anne Case and I argue today. Indeed, if ever-lazier workers were turning down jobs, we would expect wages to rise, not fall, as workers became scarcer relative to available jobs. Nicholas Eberstadt tells a story similar to Murray’s – that less-educated workers are choosing not to work, with their choice enabled by government benefits, particularly disability benefits.
It was not long before those arguments were brought to bear on the opioid crisis, and once again, some on the right argue that government benefits are making things worse. The story begins with work by Alan Krueger on the long-term decline in employment. He reports survey evidence that half of those not in employment were using pain medication, and that two-thirds of those were using prescription pain medication. Nicholas Eberstadt quoted this study in an article in Commentary, in which he wondered how these people, out of employment, could afford to be “stoned,” given that painkillers like oxycodone are not cheap. The answer, according to Eberstadt, is Medicaid. It was the government providing subsidized opioids through Medicaid. He wryly comments, “In 21st-century America, ‘dependence on government’ has thus come to take on an entirely new meaning.”
President Trump’s Council of Economic Advisers, while recognizing the role of pharma companies pressing doctors to write prescriptions, focused on the prices of opioid drugs, arguing that the expansion of government health care programs, particularly Medicare Part D (which covers prescription drugs), had made opioids cheaper and encouraged their consumption. The Senate Committee on Homeland Security and Government Affairs, chaired by the execrable Senator Ron Johnson of Wisconsin, issued a 164-page report whose message is summarized by its title, “Drugs for Dollars: How Medicaid Helps Fuel the Opioid Epidemic.” Yet, according to a leading health care information company, only 8 percent of opioid prescriptions between 2006 and 2015 were paid for by Medicaid.
We might wonder how it is that rich European countries, which subsidize or even have free prescription drugs, have managed to avoid opioid epidemics. Perhaps it is because those countries’ governments do not allow opioids to be used outside of hospitals or clinical settings. Nor are pharma companies allowed to send their representatives to doctors’ offices to persuade them to prescribe opioids, often bringing misleading information. The U.S. government does, indeed, bear much responsibility for the epidemic. But its guilt lies in yielding to relentless and well-funded lobbying by pharma and their distributors to write favorable laws and to hamper investigations that attempt to counter abuse. In a better regulatory environment, providing cheaper drugs to consumers would be a good thing, not a bad thing.
Once the Covid-19 pandemic arrived, deaths of despair were used as an argument for not imposing lockdowns. President Trump argued that stay-at-home orders would be worse for people’s health than the virus: “You’re going to have suicides by the thousands.” Others, including Health Secretary Alex Azar, pointed to the likelihood of mass casualties from alcoholism and opioid overdoses, and warned about deaths from postponing medical screenings and treatment.
In fact, suicides fell at the beginning of the pandemic, not just in the United States but around the world. Perhaps this could not have been predicted, but the studies linking unemployment and suicide had broken down long before the pandemic (for example, during the 2008-9 financial crisis). Drug overdoses rose rapidly during the pandemic, as did deaths from alcoholic liver disease, and Casey Mulligan, who served as the chair of President Trump’s Council of Economic Advisers, has long argued that the pandemic, and government responses to it, are largely responsible. Yet opioid overdoses were rising rapidly in January and February 2020, and there is no obvious increase in the upward trend at the time of the emergency. It is certainly possible that, later on, some of the benefit checks were spent on street drugs.
But Mulligan’s argument linking unemployment and unemployment benefits to deaths from alcohol is an almost perfect caricature of the kind of economic story that only economists could love. Here it is. Before the pandemic, people liked to go to bars, drink and hang out with friends and other drinkers. With bars closed, that is impossible, so people have to drink at home. Alcohol at home is cheaper than in a bar; there are no markups and no time costs of going out. When prices fall, people drink more, leading to the sort of lockdown casualties that the Trump administration predicted.
Perhaps. But what people care about – or at least most people – is the experience of having a drink; otherwise, they would behave like “stoned” people who sleep and drink on the street, dosing themselves with the cheapest alcohol they can find. So, using Mulligan’s version of price theory, what would I actually expect? Even if the “price” of a drink is lower, the “price” of socializing with alcohol is now higher than before. Consumption should fall. Do I believe this? I don’t know, but the problem with this sort of theorizing is that once we depart from the actual observable prices in the store or in a bar, we can make up any story we like about the “price” that we think is relevant.
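The point that the conclusion flips depending on which “price” you count can be made concrete with a toy calculation. The numbers and the demand curve below are entirely hypothetical, chosen only to illustrate the logic of the two framings, not drawn from any data:

```python
# Toy illustration (hypothetical numbers) of why price-theory stories
# about lockdown drinking flip depending on which "price" is counted.

def demand(price, elasticity=-0.5, scale=10.0):
    """Constant-elasticity demand: quantity = scale * price**elasticity."""
    return scale * price ** elasticity

# Mulligan-style framing: only the money price of a drink matters.
bar_price = 8.0        # a drink at a bar, markup included
home_price = 2.0       # the same drink poured at home

# Alternative framing: the relevant "price" of a drink includes the lost
# companionship -- drinking alone at home carries a nonmoney cost.
social_cost = 10.0     # hypothetical value placed on the bar's company

q_before = demand(bar_price)                 # bars open
q_money_only = demand(home_price)            # bars shut, money price only
q_full_price = demand(home_price + social_cost)  # bars shut, full price

print(q_money_only > q_before)   # cheaper drinks -> more drinking predicted
print(q_full_price < q_before)   # pricier experience -> less drinking predicted
```

Both predictions come out of the same demand curve; the only thing that changes is the analyst's choice of which price to plug in, which is exactly why this style of theorizing can be made to tell any story one likes.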
What I find so distressing in these alternative accounts of deaths of despair is the deflection of blame away from the pharmaceutical companies and their enablers in Congress, where it rightfully belongs, toward the victims themselves. Policy is helpless, government is always the problem and never the solution, and the best that we can do is to tell people to be more virtuous. Economics does not have to be like this.