Are technological advances contributing to the jobless "recovery" evident in the U.S. government's latest feeble employment report? That question is both age-old and up to the minute. And the answer is vital for determining not only when the economy will rebound, but indeed whether it will ever reach its previous heights.
Economist and technology theorist W. Brian Arthur fears it won't. Writing in McKinsey Quarterly, the Santa Fe Institute scholar and former Stanford University professor describes how digitization is creating what he calls a "second economy" (registration required) beneath the surface economy where consumers and businesses interact. At this deeper layer, which most of us may feel more than see, business processes once conducted by people are now handled electronically.
Air travelers check in with the swipe of a credit card rather than with someone's assistance. That single scan activates a whole network of computers and telecom systems working together to check everything from your travel history to whether you are a person of interest to U.S. intelligence agencies. Or consider how RFID technology has reduced the number of workers required to ship freight and packages around the world.
Such developments may boost labor productivity, but they can be hell on labor. Arthur says:
What used to be done by humans is now executed as a series of conversations among remotely located servers....
I am concerned that there is an adverse impact on jobs. Productivity increasing, say, at 2.4 percent in a given year means either that the same number of people can produce 2.4 percent more output or that we can get the same output with 2.4 percent fewer people. Both of these are happening. We are getting more output for each person in the economy, but overall output, nationally, requires fewer people to produce it.

Were the Luddites right?
Concern about technology -- the loom, tractor, switchboard, or ATM, to name a few innovations -- supplanting people goes back to the dawn of the Industrial Revolution. In early 19th-century Britain, for instance, the "Luddites" famously torched factories to protest the mechanization of textile production.
These anxieties have only intensified as technology has grown more potent and intelligent. It isn't only auto workers or forklift operators who are being automated out of a job. A growing number of professions are increasingly endangered, from customer-service reps manning the phone to white-collar professionals formerly thought to be immune to such "progress." In large corporations across America, algorithms are beginning to replace doctorate-level statisticians and other eggheads.
Within the high-tech industry itself, cloud computing threatens to make the data center and network managers who run corporate information systems as superfluous as last century's buggy-whip makers. Arthur writes:
Now business processes -- many in the service sector -- are becoming "mechanized" and fewer people are needed, and this is exerting systematic downward pressure on jobs. We don't have paralegals in the numbers we used to. Or draftsmen, telephone operators, typists, or bookkeeping people. A lot of that work is now done digitally. We do have police and teachers and doctors; where there's a need for human judgment and human interaction, we still have that. But the primary cause of all of the downsizing we've had since the mid-1990s is that a lot of human jobs are disappearing into the second economy. Not to reappear.

Of course, this is how "creative destruction" is supposed to work. New technologies -- steam engines, railroads, electricity, transistors -- open new avenues of growth and wealth. And they create jobs -- smartphones need someone to sell them.
But what if that process creates profits for employers, boosting productivity and lowering labor costs, while destroying jobs for employees? In some ways, there is no need to speculate, because the effects of that pattern are already written all over the U.S. economy -- productivity rises, jobs vanish, wages sag:
"There's a permanent reduction in the workforce, due to productivity gains. There will be no short-term reduction in unemployment," says Jeffery Weiner, Managing Partner at the public accounting and advisory services firm Marcum LLP.

Mind the gap
Compounding the pain of this widening gap between productivity and wages in the U.S. is another gap: the one between the wealthiest Americans and, well, the rest of us. As Arthur notes (and as the Occupy Wall Street protests underscore), the role of an economy isn't only to create wealth -- it is also to distribute that wealth in ways that allow the economy to flourish as a whole.
Income equality in this country isn't only a matter of economic justice, in other words -- it's a matter of economic survival. Software doesn't go to the mall, putting money in the hands of job-creating merchants. The robots increasingly responsible for assembling our cars, electronic gadgets, and other products don't repair to the corner bar after their shift is done, requiring a flesh-and-blood bartender to pour them a round. As one commenter to Arthur's piece succinctly puts it:
Who will be able to afford the goods and services so produced if no one is employed?

It's an iron law: Workers, at least the kind with families and mortgages, are also consumers and customers. And with the economy continuing to belch smoke and most economists predicting slow job creation for years to come, that question is likely to become more pressing.
The future is, as always, hidden. It's notoriously difficult to say at what point the macroeconomic risks of digitization may outweigh its benefits. But even Moore's law is no match for Murphy's law. Just because innovation has saved us in the past doesn't mean it will do so in the future.