In 2011, Marc Andreessen famously wrote that "software is eating the world." I love the phrase, but after stumbling upon an old cover article from the Saturday Review, I'd revise it to say: "Software continues to eat the world."
I've been thinking about this topic recently after discovering a pile of old magazines while cleaning out my great aunt's Iowa townhouse. One of them caught my eye: a Saturday Review from June 23, 1979. (Saturday Review no longer exists in the same form -- it's now a medical journal.) The cover illustration was mesmerizing, not to mention prescient:
The story, "Computer Shock: The Inhuman Office of the Future," was written by Jon Stewart, an editor at Pacific News Service. Jon told me he and his colleague John Markoff (subsequently of The New York Times) did a series of syndicated articles "on the productivity promises of desktop computing, exploring some of the potentially darker aspects of those expectations."
Thirty-six years later, though the article's subtitle sounds dystopian, I find the story more balanced than that subtitle suggests. And my present-day reading has me reflecting on just how completely our lives have been transformed by technology over the past 40 years -- for better and worse -- and how often we take these things for granted. (And we do take them for granted -- see Louis C.K.'s "everything's amazing and nobody's happy" YouTube clip as evidence.)
Just consider a few of the ways the world has dramatically changed. (Note: All italicized quotes to follow come from "Computer Shock.")
"A recent IBM study concludes that major corporations pay $6.41 to create, type, revise, and mail a one-page letter."
Adjusted for inflation, that's $21.04 in today's dollars. Is it any wonder that in the United States, email outnumbers "snail mail" 81 to 1?
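For the curious, that inflation adjustment is just a ratio of price levels. Here's a back-of-the-envelope sketch; the CPI index values below are my own rough approximations for 1979 and 2015, not figures from the article:

```python
# Rough CPI-based inflation adjustment.
# The index values are approximate annual CPI-U levels (an assumption,
# not official figures cited in the article).
CPI_1979 = 72.6
CPI_2015 = 237.0

def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Scale a historical dollar amount by the ratio of price levels."""
    return amount * cpi_now / cpi_then

cost_1979 = 6.41  # IBM's estimated cost of a one-page business letter
cost_2015 = adjust_for_inflation(cost_1979, CPI_1979, CPI_2015)
print(f"${cost_2015:.2f}")  # roughly $21, in line with the figure above
```

Small differences in which month's CPI you pick explain why this lands a few cents off the $21.04 quoted above.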
Radicati Group, a technology market research firm, estimates that in 2014, business users sent and received an average of 121 emails per day. It predicts that figure will grow to 140 per day by 2018.
"Frank Piedad of Automatic Banking Systems predicted that researchers will be experimenting with 'limited teleporting' of packages in which parcels are rendered into molecules for electronic mailing and reassembled by computers at the receiving end -- much as letters are today transformed into digital signals for transmission."
While not exactly what Frank Piedad predicted, 3-D printing comes remarkably close. Read "3D printing helps doctors safely deliver baby" to get a sense of just how widespread this technology already is: "We see this as OBs could potentially have a 3D printer right by the ultrasound machine."
"Integrated office systems, combining data and word processing, will be common even sooner; by the turn of the century the executive should be able to automate himself right out of the office with a sophisticated communications and computing system in his home. With instant access to any individual or data bank anywhere in the world, by merely instructing a home computer in spoken English, why should anyone bother to congregate in a skyscraper miles from home every day?"
Through the rise of smartphones, laptops, Wi-Fi, and cloud computing, among other things, remote working has taken over corporate America. GlobalWorkplaceAnalytics.com has found that "50% of the U.S. workforce holds a job that is compatible with at least partial telework, and approximately 20%-25% of the workforce teleworks at some frequency."
It's also found that workers are not at their desks "50%-60% of the time."
"There will be less personal interaction in routine tasks, such as bank transactions, as well as in high-level decision-making, such as weighing the merits of closing down a plant. The undeniable logic of the silicon chip will tend to take us by the hand and lead us to the 'right' decision."
That big data has taken over corporate decision-making is obvious -- you'd struggle to find anyone who'd admit to making a decision by instinct rather than data analysis. Harvard Business Review called the use of big data a "management revolution."
Growth in productivity
"It has been estimated that the performance of computers has increased 10,000-fold in 15 years, while the price of 'each unit of performance' has declined 100,000-fold since 1960. Digest those figures and you will not find it astonishing when sober and intelligent people compare the impact of the microprocessor to that of James Watt's steam engine that ushered in the industrial revolution more than 200 years ago."
My colleague Morgan Housel has published the following chart before. It shows the number of hours an American worker would have to work today to produce as much as in 1950:
That steam engine analogy isn't far-fetched.
Growth in productivity, part 2
"According to a 1977 report to Congress by the comptroller general, the government employed 2 million civilians and two computers in 1950, when it had a $40 billion budget. By fiscal 1977 the budget had risen to $400 billion, but personnel was up only 25% to 2.5 million. By then, however, the government employed no less than 10,000 computers."
Leaving aside the politics of the current size and efficiency of the federal government, there is little doubt that, true to the trend highlighted in the above passage, computers enabled massive gains in public sector productivity.
The Bureau of Labor Statistics measured public sector productivity up until 1994 through a survey called the Federal Productivity Measurement Program. In the final report, the BLS found that the "overall average annual increase in productivity of the federal government approximated that of nonfarm businesses between 1967 and 1982, but lagged behind it from 1982 to 1994."
Private sector productivity growth has surely outpaced public sector productivity growth since then, but I feel confident the public sector has still seen impressive gains of its own.
Labor market impact and income inequality
"Peter Schwartz, the Stanford Research Institute futurist, believes that this revolution contains the seeds of social disaster, as well as the possibility of greater democracy and a better world. 'We are going to have information-rich people and information-poor people,' he says. The Information Age could thus create a new 'underclass' of people who lack the skills necessary to take advantage of the new technology."
In 2013, The Globe & Mail described the field of data analytics as "the fastest-growing job market you've never heard of." One industry expert said there are 4,000 new such jobs being created annually, and that supply would struggle to keep up with demand.
The move toward computer science, engineering, and mathematics has alarmed the humanities, but the real divide is not between STEM and the liberal arts. It's between those who have ready access to the Internet and those who do not.
According to the Pew Research Center, Internet use divides sharply along lines of education and income: adults with a college or graduate degree, and those with annual household incomes of at least $75,000, go online at far higher rates than those who did not complete high school or whose household income is below $30,000.
A report from NBC News says the profile of those not using the Internet -- less educated, less affluent, and generally older -- "correlates closely with the demographics of those suffering the fastest rises in unemployment, an analysis of data from the U.S. Bureau of Labor Statistics shows."
Earlier this month, an old New York Times headline was making the rounds on Mockery Twitter:
That story was published in June 1979 -- the exact month and year that Jon Stewart's "Computer Shock" hit newsstands -- and starts, regrettably: "What has happened to the home computer? What was once thought to be a billion-dollar, mass-market item that would revolutionize our lives seems suddenly to have lost its luster."
Still, that bit of forecasting stupidity isn't nearly as egregious as Newsweek's "Why the Web Won't Be Nirvana," published in 1995. After AOL had already made shareholders a fortune. After Netscape was founded. (Newsweek then was one of the two major newsweeklies in the U.S., but its fortunes since have been dramatically altered by you-know-what.)
In that context, it's astounding that the United Nations declared Internet access a basic human right in 2011. It didn't take long for widespread computer and Internet usage to go from "what's the point?" to basic human right. This is all a good reminder that the current state of the world was far from inevitable.
Of my takeaways from this exercise, the first is likely the most obvious: Accept that the future is unpredictable, but trust that technology will make it different from what it is today -- and not always in a linear fashion. (A related thought: Be skeptical of any large-cap valuation model going out five years or more.)
The second is to look for similar current-day narratives: things that are at once impossible and inevitable. I see that in the world of artificial intelligence; a four-book (!) review from the Financial Times last weekend started, "The 'robot invasion' meme has become a staple of publishing."
Silicon ... Gulch?
For anyone who wishes to read the entire Saturday Review article, you can find a digitized version here. It's worth your time -- did you know, for instance, that the original name for Silicon Valley was "Silicon Gulch"?
That a journalist got this all so right is impressive, but also not the point. Software will continue to "eat" the world, the digital age will affect our lives in ways most of us can't anticipate, and the world will look different -- maybe even radically so -- in 10 years' time.
These are facts we ought to accept, as citizens, employees, consumers, and investors.