Have computers made us more productive? Have all the installs, upgrades, MIPS, modems, and migration plans been worth it? Have the billions and billions invested in high-tech infrastructure actually made people more prolific?
The common notion today is that computer technology will swiftly lead society to vast improvements in productivity. But even though computers are everywhere in our lives, a doubt lingers: innovation in computers has yet to translate into a recognizable impact on our overall productivity. This quandary has been termed the “Productivity Paradox.”
In the last two decades we have seen rapid innovation in computer technology. But over the same 20 years we’ve seen disappointingly slow gains in measured productivity. According to P.A. David
from Stanford University, “the recent boom in office automation and the rise of computer-intensity of service industries have not been accompanied by surging output per hour in those activities.”
Even though computer technology accounts for approximately one-half of the United States gross investment in equipment, economic trend-watchers like Robert Solow remind us that “we see the computers everywhere but in the economic statistics.”
Even one of the nation’s foremost computer scientists, Michael Dertouzos, bemoans the current state of computer technology. “I marvel at how the great promise that computers would improve human productivity is more easily discussed than implemented.” In a recent article for MIT’s Technology Review, Dr. Dertouzos decries how computers are misused and abused. “I see ridiculous duplication of effort. People are doing everything they used to do before computers plus the added work required to keep computers happy or to make people appear modern.” We encounter situations every day — wading through automated phone answering systems, for example — that collectively erode our enthusiasm for computer innovation and subconsciously remind us that something is amiss.
But should we expect to see an upwelling in productivity from computer technology at this time? Or do we suffer from “telescopic vision,” where the future seems closer at hand and possibilities appear nearer than they really are?
It’s natural for people to focus on the future and be riveted by the prospect of dramatic improvements to our lifestyle. The culture in the United States, for example, was forged with intrepid,
inquisitive hope. Americans sailed the Atlantic in search of freedom, pioneered the West in search of gold, and now push toward the final frontier, to Mars and beyond. If not for dreamers, inventors
and futurists, our accomplishments would be few. Today, when it comes to computers, we are on a societal quest toward the modern frontier of technology.
But forward-thinking has drawbacks. Our culture tends to view far-sightedness as less worrisome than myopia, and our telescopic vision of computer technology has us concentrating on the arrival rather than the journey. In our rush to satisfy our high expectations, perhaps we lose sight of the intricacies associated with high-tech change.
History teaches that the pace of realized improvement is not tightly tied to the rate of innovation. Look, for example, at the not-so-distant economic and technological developments around 1900: a similar productivity paradox marked the outset of the “electric technology” age. As industrial countries moved from an age built on steam to one built around electricity, productivity growth declined. Measured productivity suffered for decades even while electric technology was innovating rapidly.
The full transformation to the new electric technology was drawn out and uncertain. From the first electric dynamos around 1870, engineers saw the revolutionary potential of electric technology, but nearly half a century passed before American facilities were widely electrified. Part of the delay was due to the durability of old steam-powered technology. Steam had staying power much the way, as Dertouzos points out, old procedures linger alongside modern computer-aided methods.
When it finally came, the payoff of electric technology was big. Measurements of productivity and total growth soared in the 1920s once electric technology became commonplace. Today, decades later, electric technology is so firmly a part of our technological culture that it is hard to imagine our society without it.
We will be well into the new millennium before ours officially becomes “the computer technology age.” The assimilation of computer technology is enormously more subtle and complex than the transformation to electric technology, and it will take time for us to fully understand the nature of human-machine interaction and to design a truly human-friendly interface. But if we take the long view of technology, we may be able to counteract our impatience with computers. By looking at the story of electric technology, perhaps we can avoid immoderate expectations and undue impatience on our journey into the information age.
Dertouzos, Michael L. “Creating the People’s Computer.” Technology Review, Cambridge, MA, April 1997, pp. 20–31.
David, P.A. “Computer and Dynamo.” Discussion Paper #172, Stanford Center for Economic Policy Research, Stanford University, 1991.
© 1997 Kevin Craine, EDPP
One time rights. Duplication not authorized without consent of the author