The workers were furious. Believing that new mechanical looms threatened their jobs, they broke into factories, seized machinery, brought it into the street and set it afire, all with widespread public support, even tacitly from the authorities.
That was in 1675. And those English textile workers were neither first nor last in the long procession of worriers about the potential harm to jobs from labor-saving devices. Several centuries earlier, the adoption of the fulling mill caused an uproar among workers forced to find other occupations. Almost exactly 60 years ago, Life magazine warned that the advent of automation would make “jobs go scarce” — instead, employment boomed.
Now, the launch of ChatGPT and other generative A.I. platforms has unleashed a tsunami of hyperbolic fretting, this time about the fate of white-collar workers. Will paralegals — or maybe even a chunk of lawyers — be rendered superfluous? Will A.I. diagnose some medical conditions faster and better than doctors? Will my next guest essay be ghostwritten by a machine? A breathless press has already begun chronicling the first job losses.
Unlike most past rounds of technological improvement, the advent of A.I. has also birthed a small armada of non-economic fears, from disinformation to privacy to the fate of democracy itself. Some suggest in seriousness that A.I. could have a more devastating impact on humanity than nuclear war.
While acknowledging the need for substantive guardrails, I’ll leave those valid concerns to others. When it comes to the economy, including jobs, the reassuring lessons of history (albeit with a few warning signals) are inescapable. At the moment, the problem is not that we have too much technology; it’s that we have too little.
We’ve had forms of artificial intelligence, broadly defined, for millenniums. The abacus, thought to have been invented in Babylonia more than 4,000 years ago, replaced more laborious methods of mathematical calculation, saving time and therefore reducing work.
When I began my career in finance in the early 1980s, we had only hand-held calculators to help with our numerical analysis, which we painstakingly wrote in pencil on large sheets of paper (hence, the term “spread sheets”) and which were then typed by a secretarial pool. Any changes meant redoing the entire spread sheet. Now, all that happens with the click of a mouse.
Less than three decades ago, library-type research could require hours combing through dusty volumes; now it takes a few keystrokes. Not surprisingly, the number of librarians has been flat since 1990, while total employment has grown by more than 40 percent.
Other job categories have almost completely disappeared. When was the last time you talked to a telephone operator? Or were conveyed by a manned elevator? In the place of these and so many other defunct tasks, a vast array of new categories has been created. A recent study co-authored by M.I.T. economist David Autor found that approximately 60 percent of jobs in 2018 were in occupations that didn’t exist in 1940.
And so the Great American Jobs Machine ground on. In the decade after Life magazine decried the robot invasion, the United States created 20.2 million jobs, and today, the unemployment rate sits at 3.6 percent, a hair above its 50-year low. Of course, the number of Americans employed in finance has boomed, even as computers, Excel and other technologies have made them far more productive.
Higher worker productivity translates into higher wages and cheaper goods, which mean more purchasing power, which stimulates more consumption, which induces more production, which creates new jobs. That, essentially, is how growth has always happened.
This makes A.I. a must-have, not just a nice-to-have. We can only achieve lasting economic progress and rising standards of living by increasing how much each worker produces. Technology — whether in the form of looms or robots or artificial intelligence — is central to that objective.
Generative A.I. — dazzling and scary because it may prove a particularly transformative innovation — is just another step in the continuum of progress. Were our ancestors any less startled when they first witnessed other exceptional inventions, like a telephone transmitting voice or a light bulb illuminating a room?
In the heyday of commercial innovation — between 1920 and 1970 — productivity rose at a 2.8 percent annual rate. Since then, except for a brief interval of acceleration between 1995 and 2005 (the modern computer revolution), the annual rate of growth has averaged a modest 1.6 percent. Pessimists read that slowdown as evidence that the most impactful technological advances are behind us. To me, it means full speed ahead on A.I.
What constitutes “full speed ahead” remains to be seen. For all those who believe that A.I. will prove revolutionary, there are others skeptical that it will be a game changer. My best guess is that it will help nudge productivity upward but not back to the halcyon days of the last century.
To be sure, the benefits of productivity growth don’t always reach workers as fully and efficiently as we’d like. Even the meager productivity growth of recent decades has largely failed to filter down to workers: since 1990, labor efficiency has risen by 84 percent, but average real (inflation-adjusted) hourly compensation has increased by only 56 percent.
That forgone worker compensation has largely gone into corporate profits, fueling a stock market boom and record income inequality. Why the disconnect? There are a variety of contributors, from declining union membership to imports to anti-labor practices by companies, like noncompete clauses for hourly workers.
Government can help ameliorate these dislocations. For more than a century, redistribution — yes, that can be a dirty word in America — has been a necessary part of managing the fruits of the industrial and technological improvements.
The progressive income tax, introduced in 1913, was designed, in part, to offset the vast income inequality generated during the Gilded Age. More factory improvements and more income inequality in the 1920s helped stimulate a variety of New Deal policies, from additional protection for labor to the introduction of Social Security.
Today, we can easily see the consequences of Washington failing to hold up its end of the bargain. Disgruntled white factory workers in the Midwest with stagnant or falling real wages became supporters of Donald Trump, even though his policies favored the wealthy. With only 22 percent of Americans saying our country is on the right track, America feels more divided politically and socially than at any time in my 70-year lifetime.
We did a lousy job of preparing Americans for the transition from a manufacturing economy to one dominated by services. We have to do a better job this time.
If artificial intelligence proves as transformative as its acolytes (and some antagonists) believe, we could face a vast need for better education and training. The impact will fall not just on factory workers but on Americans across industries and up and down the employment chain, from financial analysts and coders to graphic designers, customer service agents and call-center workers.
A recent report from Goldman Sachs, among the most bullish of the techno-bulls, concluded that A.I. can help return our productivity growth rate to the halcyon days of the mid-20th century. I, for one, am fervently hoping that the Goldman report proves correct and that A.I. unleashes a new era of technological and economic progress — and that we take the right steps to be sure the rewards are widely shared.