Andrew L. Yarrow, a former New York Times reporter, is the author of the forthcoming book, LOOK: How a Highly Influential Magazine Helped Define Mid-20th-Century America.
Published September 8, 2021
The millions of workers sidelined by the pandemic faced the immediate problem of keeping themselves and their families alive. It’s all the more striking, then, that 67 percent of workers in the tech sector and 44 percent of the total workforce set aside time to worry about whether their jobs would be made redundant by artificial intelligence, robots, machine learning and other drivers of automation. Those numbers, found in an April 2020 study by KPMG, the consulting firm, were roughly the same as survey results in European countries ranging from Britain to Hungary. And on this point, anyway, workers were on the same page as management. Some 43 percent of businesses surveyed by the World Economic Forum in October 2020 said they expected to cut jobs “due to technology integration.”
Everywhere we turn, it seems that machines are replacing humans. Think of self-checkout lines at grocery stores, mobile apps handling routine bank transactions, QR codes delivering restaurant menus — not to mention those frustrating automated phone “help” lines that too often consign you to computer purgatory. What’s more, some of these machines “think” rather than just slavishly follow algorithms. “It’s hard to overstate how big of an impact [artificial intelligence] is going to have on society over the next 20 years,” declared Jeff Bezos, whose company has blasted through the brick-and-mortar retail sector like a neutron bomb.
On this subject, though, economists are the iconoclasts, inclined to forsake their traditional role as purveyors of the dismal science. Even as technology destroys jobs, they argue, it fills the holes with new ones. And, at least sometimes, they’ve been right. Nobody wonders what happened to the tens of thousands of people who took care of work horses when the internal combustion engine took over. Ditto for the myriad telephone operators who were sent packing when manual switchboards were replaced by electrical switches.
The Good (?) Old Days
Right or wrong, the doomsayers have rarely had trouble gaining traction in the popular imagination. And at no time did fears — and hopes — about automation so grip the United States as at the end of the Eisenhower administration and beginning of the Kennedy years.
Between 1956 and 1964, the word “automation” appeared in 625 headlines in The New York Times. Congressional subcommittees on automation operated from the 84th to 87th Congresses (1955-63). JFK’s Labor Secretary, Arthur Goldberg, even created an Office of Automation and Manpower. “To paraphrase the ballad of Davy Crockett” — then a hit TV show starring Fess Parker — “automation has become the king of the social frontier,” began a June 1957 Times story.
In most cases, reactions were predictable. For the left and the part of the labor movement with roots in more radical times, automation presaged the Marxian nightmare in which armies of the unemployed would fight over the scraps left by the captains of industry. “We are stumbling blindly into the automation era with no concept or plan to reconcile the need of workers for income and the need of business for cost-cutting and worker-displacing innovations,” wrote the Nation magazine in 1958. It warned that while the workforce would grow, “latter-day industrial technology is contractive of man-hours.”
“Automation is rapidly becoming a curse to our society,” George Meany, president of the AFL-CIO, declared at the labor confederation’s 1963 convention. “It could bring us to national catastrophe.”
On the other side, one certainly found the usual suspects. “Automation is urgently needed to help individual companies, and the nation as a whole, try to be able to meet the new competition from abroad,” Ralph Cordiner, CEO of General Electric, told Congress in 1960. GE practiced what it preached, installing one of the first mainframe computers, an eight-ton UNIVAC I, at its Louisville factory in 1954.
For its part, the U.S. Information Agency, ever ready to parry a potential Cold War propaganda threat, wrote that workers would be the biggest beneficiaries, as automation “opens up opportunities for [them] to exercise skill and judgment on the job.”
But back to the aforementioned Pollyanna economists. “Instead of being alarmed about growing automation, we ought to be cheering it on,” University of Chicago free marketeer Yale Brozen wrote in 1963, concluding that technology had created seven million net new jobs during the 1950s.
And America’s most famous Keynesian economist couldn’t have agreed more. Alvin Hansen of Harvard, who had argued that the U.S. economy was doomed to “secular stagnation” and would need continuing fiscal stimulus after World War II to sustain full employment, was euphoric in a January 1958 paper: “The United States, having reached the automation age of technical progress, can afford to relegate material goods to a secondary role.” Instead, he continued, automation would free Americans to build communities that “measure up to the great values.”
Meanwhile, cracks in the labor movement illustrated the growing divide between industrial unions built on worker solidarity and the smaller craft unions, labor’s royalty, whose leaders saw the potential for higher wages in tech innovation. James Carey, head of the electrical workers union, argued that automation, together with atomic energy, could not only “end poverty [and] abolish hunger and deprivation,” but also “create a near-paradise on earth, a world of plenty and equal opportunity, a world in which the pursuit of happiness has become reality rather than a hope and dream.”
Too much of a good thing, of course, can be a problem — and in some early 1960s commentary, the problem of vanishing work gave way to the “problem” of overabundant leisure. One scholar worried about the danger of “barren boredom.” The Ford Foundation funded a Center for the Study of Leisure at the University of Chicago, and Look magazine’s John Poppy wrote that “people of the future will need to know how not to work.”
Needless to say, as the economy boomed in the 1960s, fears of automation-led job loss all but disappeared. And even when the economy stagnated in the 1970s, little attention was paid to automation. Two-earner couples became common as wages stagnated, and worries about how to fill abundant leisure similarly vanished in a society working long hours to keep up with middle-class aspirations.
Chickens Coming Home to Roost?
Fast forward to our digital age. Of course, the American and global economies are much different from what they were 60 years ago, and one would be hard-pressed to find many non-retirees complaining of too much leisure.
There is no doubt that millions of jobs will be lost because of AI, robotics and other new technologies. But, as in the 1960s, many of the real and self-appointed experts see silver linings in those clouds. The same late 2020 World Economic Forum report that forecast massive job losses from the “robot revolution” illustrates the yin-and-yang nature of the automation debate. While it predicted 85 million jobs would be eliminated during the next five years, it also predicted that 97 million new jobs would be created.
Many Americans will no doubt struggle and suffer, but many will benefit. How that calculus sorts out will be a function of markets, public policy — and chance. However, even those bullish on automation, like economist Robert Theobald in the 1960s or Twitter CEO Jack Dorsey today, believe that the worker churn will produce a lot of losers.
In their view (and mine), the path forward is pretty clear. Governments must act to provide an income floor, as well as retraining for willing workers who are displaced or are at risk of becoming redundant. But that is another story for another post.