
Independent Thinking®
The Coming Employment Crisis
April 24, 2016
There can be little doubt that computers, robotic technologies and other forms of job automation have been getting far more capable, and that as this trend continues, more workers are certain to be displaced in the relatively near future.
A very large percentage of jobs are, on some level, essentially routine and repetitive in nature. In other words, the job can be broken down into a discrete set of tasks that tend to get repeated on a regular basis. It seems likely that, as both hardware and software continue to advance, a large fraction of these job types are ultimately going to be susceptible to machine or software automation.
I’m not talking about far-fetched science fiction-level technology here: This is really a simple extrapolation of the expert systems and specialized algorithms that can currently land jet airplanes, trade autonomously on Wall Street, or beat nearly any human being at a game of chess. As technology progresses, these systems will begin to match or exceed the capability of human workers in many routine job categories – and this includes a lot of workers with college degrees or other significant training. Many workers will also be increasingly threatened by the continuing trend toward self-service technologies that push tasks onto consumers.
One of the most extreme historical examples of technologically induced job losses is, of course, the mechanization of agriculture. In the late 1800s, about three-quarters of workers in the United States were employed in agriculture. Today, the number is around 2%-3%. Advancing technology irreversibly eliminated millions of jobs.
Obviously, when agriculture mechanized, we did not end up with long-term structural unemployment. Workers were absorbed by other industries, and average wages and overall prosperity increased dramatically. The historical experience with agriculture is, in fact, an excellent illustration of the so-called “Luddite fallacy.” This is the idea – and I think it is generally accepted by economists – that technological progress will never lead to massive, long-term unemployment.
The reasoning behind the Luddite fallacy goes roughly like this: As laborsaving technologies improve, some workers lose their jobs in the short run, but production also becomes more efficient. That leads to lower prices for the goods and services produced, and that, in turn, leaves consumers with more money to spend on other things. When they do so, demand increases across nearly all industries – and that means more jobs. That seems to be exactly what happened with agriculture: Food prices fell as efficiency increased, and then consumers went out and spent their extra money elsewhere, driving increased employment in the manufacturing and service sectors.
The question we have to ask is whether that same scenario is likely to play out again. The problem is that this time we are not talking about a single industry being automated: These technologies are going to penetrate across the board. When agriculture mechanized, there were clearly other labor-intensive sectors capable of absorbing the displaced workers. There’s little evidence to suggest that’s going to be the case this time around.
It seems to me that, as automation penetrates nearly everywhere, there must come a “tipping point,” beyond which the overall economy is simply not labor intensive enough to continue absorbing workers who lose their jobs due to automation (or globalization). Beyond this point, businesses will be able to ramp up production primarily by employing machines and software – and structural unemployment then becomes inevitable.
If we reach that point, then we also have a serious problem with consumer demand. If automation is relentless, then the basic mechanism that gets purchasing power into the hands of consumers begins to break down. As a thought experiment, imagine a fully automated economy. Virtually no one would have a job (or an income); machines would do everything. So where would consumption come from? If we’re still considering a market (rather than a planned) economy, why would production continue if there weren’t any viable consumers to purchase the output? Long before we’d reach that extreme point of full automation, it seems pretty clear that mass-market business models would become unsustainable.
If, at some point in the future, consumers look out the window and see a landscape where jobs are relentlessly getting automated away, and if it appears that getting additional education or training provides little protection, there’s likely to be a significant negative impact on consumer sentiment and discretionary spending. If we someday get into a reinforcing cycle driven by fear of automation, a very dark scenario could ensue. It’s difficult to see how traditional policies like stimulus spending or tax cuts would be effective because they wouldn’t address consumers’ concerns about long-term income continuity.
If you look at issues like stagnating or declining wages for average workers, growing income inequality, rising productivity, and consumption supported by debt rather than income, you can find evidence suggesting that we may already be approaching that “tipping point” at which structural unemployment becomes a problem.
This article represents the views of Martin Ford, and not necessarily the views of Evercore Wealth Management.
To learn more about Evercore Wealth Management events in California, please contact Iain Silverthorne at [email protected].
Editor’s note: Martin Ford is the author of The New York Times bestseller “Rise of the Robots: Technology and the Threat of a Jobless Future.” He addressed Evercore Wealth Management clients in several cities earlier this year, including at the de Young Museum in San Francisco. He subsequently contributed this article to Independent Thinking.