The Massachusetts Institute of Technology (MIT), in Cambridge, ranks among the most prestigious universities in the United States and has a reputation for extolling the virtues of technological progress. Here, along the banks of the Charles River, researchers helped lay the foundations for modern computer technology and played a key role in paving the way to the digital age.
Andrew McAfee and Erik Brynjolfsson, two respected economists and directors at the MIT Center for Digital Business, are deeply committed to this tradition. This prompted them to set out to research the flood of IT innovations in recent years and write a book about how wonderful the digital revolution is for the entire economy.
But something didn't add up here. The theory didn't match reality.
"We examined the data and, at one point, Erik and I looked at each other and said we have to rethink what we're trying to do with this book," says McAfee. "We have seen some flourishing of innovation in many different industries and this is great for the economy," he notes, "but there are some troubling trends." The two MIT economists have nevertheless decided to publish their results as a book. The conclusion, though, is much different than originally anticipated -- which is precisely why it has caused quite a stir among economists, politicians and technology experts. And that conclusion is: The digital revolution is destroying jobs faster than it is creating them.
The worldwide application of computer technology has become so much more cost-effective and efficient that people are no longer only replaceable in certain sectors -- autoworkers on assembly lines, for instance -- but in entire occupational areas. Cashiers are being replaced by self-service check-out lines, airline employees by self check-in kiosks, financial traders by algorithms and travel agencies by online travel sites.
This development has been apparent for roughly a decade. But, says McAfee: "You ain't seen nothing yet. Looking ahead to what technology is going to do over the next five to 10 years, I'm really concerned."
In addition to the ongoing turmoil from the financial crisis, Western economies may have to face "tectonic shifts in employment," the economists warn -- and Asian countries won't be able to escape this development, either. In the battle between man and machine, many workers in Chinese production plants will also lose out.
Fresh Causes for Concern
Fears of the impact of technical progress are nothing new. Back in 1930, renowned British economist John Maynard Keynes warned of a "new disease" that he dubbed "technological unemployment." Nonetheless, the world's economies and their labor markets have always managed to swiftly adapt to even major changes and, ultimately, more jobs were created in new industries than were lost in obsolete ones.
Why should this be any different for the digital revolution, which has produced global technology giants, such as Microsoft and Google, over the past couple of decades and created countless new jobs around the world?
It goes without saying that advances in computer technology have generated millions of new jobs around the globe, more than any other economic sector, says McAfee. But he hastens to add that, at the same time, this very progress could wipe out even more jobs in other areas of the economy.
The two MIT professors are not the only ones issuing such warnings. Politicians and economists of all ideological stripes have similar concerns, and current unemployment figures from early April appear to confirm these fears. Despite the current positive economic climate and rising consumer confidence, far fewer jobs have been created than expected.
"Can innovation and progress really hurt large numbers of workers, maybe even workers in general?" asked Nobel laureate and economist Paul Krugman, for example, in his column for the New York Times last December. "I often encounter assertions that this can't happen," Krugman wrote. "But the truth is that it can."
Richard Posner, a prominent legal scholar and law professor at the University of Chicago, argues in his most recent contribution to his joint blog with Nobel laureate in economics Gary Becker that the economy cannot absorb extremely rapid technological advances: "The result will be soaring unemployment that will retard normal market processes by reducing incomes and in turn production and therefore the demand for workers."
And even technology gurus are voicing their concern. Jaron Lanier, the man who popularized the term virtual reality, warns in his most recent book that members of the global middle class could become the big losers of a new breed of technological capitalism while an increasingly powerful elite class of digital entrepreneurs emerges.
Earlier technological evolutions took place gradually over many decades, as with factory automation. But technological progress has recently advanced far more rapidly. Moore's Law, a long-standing observation from the semiconductor industry, holds that the number of transistors on a chip -- and with it processing power -- roughly doubles every two years. Because this growth is exponential, each doubling represents a larger absolute jump than the one before, so technological advancements come in ever greater leaps and bounds. Today's smartphones already have more computing power than cutting-edge PCs did seven years ago. In recent years, software and hardware have grown more sophisticated and complex at breakneck speed.
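The exponential arithmetic behind this claim can be sketched in a few lines. This is only an illustration of a clean two-year doubling, not a model of actual chip performance; the function name and the seven-year comparison are taken from the text, everything else is an assumption.

```python
# Illustrative sketch: processing power under an idealized Moore's Law,
# assuming an exact doubling every two years (a simplification).

def growth_factor(years, doubling_period=2.0):
    """Relative processing power after `years`, starting from 1.0."""
    return 2 ** (years / doubling_period)

# The smartphone-vs-PC comparison in the text spans seven years:
print(round(growth_factor(7), 1))  # about an 11.3-fold increase

# Each successive doubling is a larger absolute jump than the last:
for year in (0, 2, 4, 6, 8, 10):
    print(year, growth_factor(year))
```

The loop makes the "ever greater leaps" point concrete: going from year 8 to year 10 adds far more absolute computing power than going from year 0 to year 2, even though both are single doublings.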
Higher Productivity, Fewer Jobs
The two MIT professors soon intend to publish a new book that supports their research results, which were first made public in late 2011. McAfee says that he has no doubt that "the list of activities in which people are better than machines is rapidly shrinking."
Unlike with previous technological revolutions, today, it is no longer primarily poorly trained and educated workers in less challenging jobs who are threatened, but also the broad middle range of the workforce, consisting of service providers and white-collar workers. Call-center staff are being replaced by telephone robots, paralegals by computer programs that can more quickly and effectively comb through documents, and tax advisers by more cost-effective software.
Journalists are also affected by this development. In the US, news agencies are already releasing sports news flashes written by computer programs.
The MIT researchers' observations are underpinned by economic data from the US that has been a cause for concern among economists and politicians for years: The productivity of the world's largest economy is swiftly rising, while the number of jobs is stagnating.
The 2000s were the first decade since the Great Depression to end with a net loss in jobs despite the fact that per capita gross domestic product (GDP) in the US is one-third higher than it was 20 years ago -- and the country produces 75 percent more goods than it did back then.
According to a widely accepted rule of thumb -- known among economists as Okun's law -- unemployment should decline when the economy grows rapidly, often by 1 percentage point for every 3 percent increase in GDP. According to this formula, the US should have almost full employment by now. Instead, even before the financial crisis, no additional jobs were created, although productivity surged at the fastest pace since World War II.
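The rule of thumb above reduces to a one-line calculation. The 1-point-per-3-percent ratio is the figure cited in the text; treating it as a fixed linear relationship is a deliberate simplification of Okun's law, and the function name is my own.

```python
# Simplified version of the rule of thumb cited in the text:
# unemployment falls roughly 1 percentage point for every
# 3 percent of GDP growth (a stylized Okun's-law relation).

def expected_unemployment_change(gdp_growth_pct, ratio=3.0):
    """Predicted change in the unemployment rate, in percentage points."""
    return -gdp_growth_pct / ratio

print(expected_unemployment_change(3.0))  # -1.0
print(expected_unemployment_change(6.0))  # -2.0
```

The puzzle the article describes is precisely that this predicted decline failed to materialize: output and productivity rose, but the employment side of the equation stayed flat.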
Do the old laws of economics no longer apply? Apparently not "if more and more is produced by machines," says McAfee, who adds that "the definition of automation is that there are fewer jobs than there used to be, but with the same amount of output."