
Automation and the Future of Work

 

James Manyika is chairman and a director of the McKinsey Global Institute, the business and economics research arm of McKinsey & Company.

Published October 29, 2018

 

The SpotMini robot trapped in a room with its mate looks uncannily like a dog — well, maybe a mutant dog with a canary-yellow body and a proboscis sticking out where its head should be. The robot confidently scurries to the door, raises itself to the perfect height, and then uses its appendage to press down the handle. Pulling open the door, the robot allows its mate to be first across the threshold before the two of them make their dash for freedom.

Such is the content of a short video by Boston Dynamics that has gone viral since it was released in February. Reactions to it on social media have ranged from amusement (“soon they’ll be opening our fridges and stealing our beer”) to alarm (“this is how we die”). It is the latest evidence of recent remarkable advances in robotics and artificial intelligence: robots can now perform backflips and ride bicycles, and machines empowered by AI are able to analyze X-ray images better than expert radiologists and to teach themselves to play the ancient game of Go better than world champions.

Yet the dog clip is also powerful because it conjures up a more worrisome vision of robots on the loose, beyond the control of us humans. Thus, it plays into a powerful, angst-laden contemporary narrative about technology and the future of work — and maybe even of mankind itself.

What, really, are the risks from this new wave of technology? And will the opportunities these advances bring with them outweigh those risks?

Some Perspective, Please

To address such questions about the future, it is useful — indeed, necessary — to look to the past. Technology has for centuries both excited the human imagination and prompted fears about its effects. Back in 1589, Queen Elizabeth I of England refused to grant a patent to a stocking frame invented by one William Lee because Her Majesty was concerned about the effect of the contraption on the livelihoods of those who knitted stockings by hand.

The Luddite weavers in Britain and silk workers in Lyons, France, famously staged revolts in the early 19th century against new automated looms that threatened their skilled (and well-paid) jobs. And in the realm of fiction, imagination has run riot since the Czech writer Karel Capek coined the word “robot” (from a Slavic word for “work”) in his 1920 play Rossum’s Universal Robots (a.k.a. R.U.R.), which ends with the destruction of mankind. As one of Capek’s characters explains: “Robots are not people. Mechanically they are more perfect than we are, they have an enormously developed intelligence, but they have no soul.”

 

Almost a century later, that same concern about soulless hyper-intelligence infuses not just movies like The Matrix and mind-bending video series like Philip K. Dick’s Electric Dreams, but also warnings about AI and its potential consequences expressed by Elon Musk and the late Stephen Hawking, among others. Back in the 1950s, the German philosopher Martin Heidegger fretted about the risk that humans would become enslaved by the technologies we rely on. With the rise of machines able to mimic many human traits, from speech (albeit still falteringly) to facial recognition (which machines already perform significantly more accurately than humans), that time appears to be much closer.

Yet when we switch from philosophy to economics, we find a very different story — a long historical record of innovation that shows technological change has been overwhelmingly positive for productivity and surprisingly benign when it comes to employment. Job displacement has occurred in waves, first with the structural shift from agriculture to manufacturing, and then with the move from manufacturing to services. Throughout, productivity gains generated by new technology have been reinvested, and the GDP bounce from that productivity eventually raises consumption and, on balance, increases the demand for labor.

In McKinsey’s research, we have sought to quantify this trend and its net impact in several ways. For example, we looked at the very big picture — what happened to employment as productivity grew over the past 50 years. We found no tradeoff between the two: for 100 percent of continuous 10-year periods in the United States, technologically driven productivity gains have gone hand in hand with rising employment. Even when measured in one-year snapshots, productivity and employment have risen together four years in five.

The personal computer provides an iconic case to support the compatibility of technological progress with job growth. We measured how many jobs were lost and how many gained in the United States between 1980 and 2015 as a result of the diffusion of computer technology. Several hundred thousand bookkeepers and auditing clerks, secretaries and typists did lose their jobs. But the overall balance was strongly positive: the desktop/laptop computer created more than 19 million jobs in industries ranging from computer hardware to enterprise software to online retail sales, against the loss of about 3.5 million jobs — a net gain of 15.7 million. That figure amounts to 18 percent of all the net U.S. employment created in the period — almost one in five jobs.


Something similar happened with the invention of the automobile, which destroyed the horse and buggy business but gave rise to a wealth of new, sometimes unimagined products, from petroleum-based synthetic rubber to motels.

That said, we ignore those left along the way at our peril. For the many workers who are displaced, painful transitions are a reality. In Britain during the Industrial Revolution, starting in about 1800, wages of the masses stagnated for almost a half century despite a strong surge in productivity that enriched the owners of capital — a phenomenon often referred to as “Engels’ pause” because it was first noted by the German philosopher who co-authored (with Karl Marx) The Communist Manifesto.

Wages picked up again as the fruits of productivity gains eventually trickled down to workers. But “eventually” can be a long time: the Luddites’ fears were not entirely unfounded. And once again today, the link between productivity gains and wage growth seems bent if not broken. U.S. median hourly compensation rose only 11 percent from 1973 to 2016, even as hourly labor productivity grew by 75 percent.

While it is natural to focus on labor displacement through technology, one element often overlooked in the discussion of the future of work is the number of jobs that are neither lost nor gained, but fundamentally changed for reasons that are hard to predict. History again provides some examples.

 
 

When ATMs became as common as vending machines in the 1970s, the number of bank tellers actually increased. That’s because the new technology reduced the costs of processing individual transactions, giving the banks incentives to open more branches to meet local demand for other financial services. The tellers’ primary tasks changed from cashing checks and taking deposits to providing advice and selling securities, mortgages and the like.

To be sure, this tale is not yet complete: with the rise of online banking, the demand for tellers has again waned, and so their numbers have thinned, especially after the 2008 financial crisis. But modern economies are nothing if not dynamic, and the only certainty is change.

There’s a similar story to be told about jobs and information technology. Thanks to the internet and some ingenious software, anyone can now dig up information at the blink of a cursor. Yet the need for data analysis has exploded, even as the need for data-gathering clerks and the like has plummeted. And so the number of information analysts of the species Homo sapiens has quintupled since 1980, to about two million today.

Yes, But…

This largely comforting history isn’t much help in determining whether this time will be different. To the point: the rise of ever-more-sophisticated technologies, especially artificial intelligence that can be brought to bear on any number of tasks, is likely to usher in a new age of rapid workforce transitions.

The McKinsey Global Institute recently estimated that in our fastest automation adoption scenario, as many as 375 million workers worldwide, or about 14 percent of the global workforce, will need to switch occupational categories by 2030 in order to avoid obsolescence. Even if the pace of adoption is slower, as in our midpoint scenario, some 75 million workers will need to switch occupational categories in the next 10 to 15 years. Among the work most vulnerable to automation: jobs that require routinized physical activity in a predictable setting, and collecting and processing data.

Some aspects of the current technological transformation seem in line with those of previous eras — the rise of computers, for example, and earlier epochs of technical change such as the diffusion of the steam engine, which transformed work across multiple sectors of the economy simultaneously. Every new wave of automation seems remarkable at the time, and AI is not the first technology to chip away at cognitive tasks. Since the birth of computerized spreadsheets in the 1980s, machines have tackled ever-more-sophisticated problems that previously required human brainpower.

Still, other aspects of the new technologies do seem qualitatively different from previous waves of innovation. The rate of technological change today is strikingly rapid and, while hard to measure, appears to be accelerating. Consider in this regard the development of deep learning and reinforcement-learning techniques based on neural networks, the exponential increase in computing capacity made available through the cloud and the sheer volume of data that can be generated. These advances are driving AI — for example, enabling the development of autonomous vehicles — but also heralding breakthrough applications such as a heightened ability to diagnose disease with non-invasive techniques. Accordingly, the share of jobs that can be automated may also be larger than historical precedent suggests.

 

That said, we see no evidence to date suggesting that the time it takes for new technologies to be diffused throughout the economy is shrinking. Looking at a range of high-value technologies over the past 60 years, from airbags to MRI machines to smartphones, we find that the pace of adoption hasn’t changed. It still takes between 5 and 16 years to reach 50 percent adoption and 5 to 28 years to reach 80 percent adoption. Even highly popular social media applications do not beat this clock. Facebook, for example, was founded in 2004 and had attracted more than 2.2 billion users 13 years later, but the untapped market remains vast.

Consider, too, that new technology is rarely applied to a blank canvas; a variety of economic and social factors play a big role in determining the pace of adoption. They include the cost of developing and deploying the hardware and software, labor market dynamics, the regulatory environment and social acceptance. Computer algorithms in state-of-the-art aircraft can do just about anything asked of them without a helping hand, but we still insist on humans in the cockpit for peace of mind. And who wants to be cared for in a hospital solely by a robot?

We remain disconcerted by the “uncanny valley,” the term coined by the Japanese robotics guru Masahiro Mori to characterize that unsettling feeling people may experience when facing humanoid robots. Indeed, for tech luminaries ranging from the MIT economist Erik Brynjolfsson to the former Google chairman Eric Schmidt, the future of work looks set to be not a tale of machines replacing humans, but of machines complementing humans in the workplace.

The rise of AI and the ascendance of self-learning algorithms whose logic can be hard to explain pose some arguably unique problems. A recent report from a who’s who of experts on the malicious use of AI lists a number of the darkest scenarios, from political manipulation through hyper-personalized disinformation campaigns to a major increase in sophisticated computer hacking attacks. For now, AI pioneers, including Google DeepMind’s chief executive, Demis Hassabis, are clear that, while machines are proving formidable in specific areas such as games, we remain a long way from the time when they will have mastered a general artificial intelligence enabling them to think and act like humans. And a number of groups, including Stanford University’s AI100, with which I am associated, are trying to look ahead, to study and anticipate how AI will affect the way people live, work and play long into the future.

 
 

The challenges here for policymakers and business leaders worldwide are significant. We will need to institute retraining of workers on a scale we have not seen for generations. We will need to rethink and adapt our social systems to help the hundreds of millions of workers affected by the new technologies, providing both new forms of income and job-transition support, and hopefully handling the shift better than we have handled globalization. We will need to find ways to tame the wilder, unpredictable side of AI. For their part, CEOs will need to rethink their organizational structures and processes to take advantage of the technologies and the performance-enhancing effects they can have — or risk ending up in the recycle bin of business history.

While all of this may seem intimidating, it is still the best way forward. We should embrace automation technologies for the productivity benefits they bring, even as we deal proactively with the workforce transitions that will accompany adoption. Given that demographic trends — increased longevity, low birth rates — are rapidly increasing the dependent population, our societies really need that productivity boost. Beyond the prosperity (and employment) these technologies will create, there will be other societal benefits: new “moonshots,” as we use the power of AI to tackle daunting problems from cancer to climate change.

We can already see one of the key challenges, that of worker retraining (or “reskilling,” as some like to call it), advancing to the top of the corporate agenda and, in some countries, also receiving renewed political attention. In our recent work on the skills that will be needed by the workforce, we found that demand will rise sharply not just for technological skills of all kinds, but also for social and emotional skills (for example, leadership and empathy with customers), as well as for higher cognitive skills linked to creativity and complex information processing.

Business leaders tell us that they believe finding the right talent to help implement automation is critical to their future financial performance, and they expect companies to take the lead in the skills upgrade that is required. Some firms are already putting in place large-scale retraining initiatives. AT&T is using external education providers, including, for example, Udacity. At SAP, the German enterprise-software company, the focus is on in-house efforts to create “learning journeys” for thousands of employees, including engineers who need to acquire “softer” skills as their roles change to service customers directly. Walmart has set up more than 100 academies in the United States that provide hands-on training for positions that include customer service managers and online grocery pickup.

Not Just the Business of Business?

While welcome, these disparate retraining efforts are barely a start. Given the scale of the occupational and skill shifts that we see coming in the next decade, much more will need to be done — and not just by business. Governments and educational establishments working together with philanthropic foundations will, of necessity, be part of the mix. Foundations have the advantage of being able to test innovative programs as pilots — and, unlike employers, they need not worry that workers with new skills will subsequently jump ship for another company.

As a starter, governments will need to reverse the 20-plus-year decline in spending on labor-related services — no small task in light of tight government budgets and, in the United States, reflexive opposition to government intervention. They will also need to be more flexible in order to adapt to the rapidly changing workplace.

Mobility is one key, yet labor markets have become markedly less dynamic in recent years. In the United States, for example, job reallocation rates have declined by 25 percent since 1990. This is the time to test ideas such as “portable” benefits that are owned by workers and not tied to particular jobs or companies. Technology itself can help facilitate greater mobility, especially digital platforms turbocharged by AI that enable matching of people to jobs.

But we also need to rethink some basic assumptions. In order for changing jobs across sectors to become a common practice, companies will need to agree on definitions and qualifications for specific skills. And a major debate is overdue about credentialing. Today, higher education encourages the world to judge graduates by their subject knowledge rather than by their skills in problem-solving and creative thinking — skills that will be especially valuable in an AI-enhanced workplace.

More than anything, we need to prepare for this new era in which humans must work alongside smart machines far more intensively. President Lyndon Johnson was prescient in this regard, writing half a century ago:

Automation is not our enemy. Our enemies are ignorance, indifference and inertia. Automation can be the ally of our prosperity if we will just look ahead, if we will understand what is to come, and if we will set our course wisely after proper planning.

Back then, nobody other than science fiction writers could have imagined robot dogs opening doors and escaping to wherever those androids from Blade Runner found refuge. But LBJ’s words resonate: planning for change has never been more important.
