Friday, January 4, 2008

Computing Utility - the Next Big Thing?

The recent cover article in Business Week about Google becoming the foremost computing utility reminded me of an article by David Warsh in the Boston Globe, circa 1990, in which he described research by Paul A. David, a Stanford economist. Both the Business Week article and Nicholas Carr's new book, The Big Switch: Rewiring the World, from Edison to Google, build on the theme that Paul David described in 1990.

One of David's key points is that this kind of transformation from one technology paradigm to another takes a minimum of 20-30 years. The Warsh article is particularly interesting because, in the years since it appeared, the dot-com boom and bust have both occurred.

Let's look at what Warsh and Paul David had to say in 1990:

"Many people in the computer business are feeling a little blue as the new decade begins, and not just International Business Machines Corp., either. Earnings at Digital Equipment Corp. and Apple Computer Inc. are down, too. IBM's plans to seed a merchant semiconductor consortium have collapsed, the Japanese are said to be tightening their hold on the market for commodity chips. The software industry is out of the headlines. Maybe the good days are over.

"That means it's a good time to step back and search for a little perspective. That's what Paul David has done. In a recent study, the Stanford economic historian has drawn a relatively tight historical comparison between the way the computer has made its way into everyday life so far in the 20th century, and the way the electrical revolution unfolded through the advent and adoption of the dynamo, starting in the 19th.

"The episodes lend themselves to analogizing, he says, for both the computer and the dynamo are "general purpose engines," capable of being engineered in sizes ranging from tiny to immense, the computer tossing off information as the dynamo tosses off power. Moreover, we have travelled just about the same distance today from the invention of the idea of the computer as the world stood in 1900 from the first mechanically generated electricity - about 60 years on.

"So what does David's analysis of the coming of the age of electricity tell us about what to expect of the computer revolution?

"Well, for one thing, it suggests that much of the unfolding of the benefits of the information age still lies ahead. And it casts intriguing new light on the riddle of lagging US productivity as well. Indeed, his very use of the quaint old fashioned word "dynamo" for what today we know as an electrical generator, suggests that we may have to say goodbye to the very word computer, before we know its age has truly arrived.

"A little history is in order.

"Electricity clearly dominates the story of the last third of the 19th century, but (as with computers) the initial run-up from idea to practical invention took just as long - about 30 years. At first there were chemical batteries and telegraphy, but only after mechanical generators - "magnetos" and "dynamos" - were created did the technology begin to truly gather force. The first direct current magneto was invented in 1841; a more efficient version in 1856 - the first light was generated in a lighthouse in 1858. Soon thereafter followed a series of dramatic breakthroughs. The ring-winding dynamo was invented in 1870; the incandescent light bulb dates to 1879; and Thomas Edison had the first central generating stations up and running in New York and London in 1881. The first electric tram service began in 1885.

"At first dynamos were confined to the sites they served. But the superiority of alternating current was demonstrated in the 1890’s, and the electricity business began to evolve into a series of centralized power sources and decentralized systems with many customers, many uses. Thus dynamos were not a novelty by the turn of a century but a ubiquitous urban fact - even if the universal utility form the system would soon take wasn't apparent to many outside the industry. Surveying the great Paris Exhibition of 1900, Henry Adams felt he was standing at the pinnacle of the age. He wrote, "It is a new century and what we used to call electricity is its God."

"Yet if you looked at the situation in 1900 through the lens of present-day economists, you might have had reason to worry then, David notes. Labor productivity had sunk to its lowest levels in England since the 18th century; it was declining in the United States, too, as waves of immigration from southern Europe entered the labor markets. Financial wizards were engaged in a binge of "paper entrepreneurism," peddling stock and arranging mergers, rather than pursuing technological improvements or bringing new products to market. True, the science and technology establishments of the industrial nations were expanding vigorously. But the industrial and economic leadership of the world was shifting from England to the United States, and the "Edwardian boom" of the first 15 years of the century was interpreted as a kind of Indian Summer by many commentators at the time. In short, it was a period very much like today, and those who, like Henry Adarns, felt they had already seen the fulfillment of the promise of electricity, had some reason to be disappointed.

"According to economist David, Adams was suffering from a tendency that might be called "technological presbyopia," or far-sightedness. It is a diagnosis deserving wider fame, for it is frequently to be found in the vicinity of new "generic" or systemic technologies - biotech, say, or the revolution in materials science. Its central symptom, according to David, is its misplaced focus: "on the arrival and not the journey," is the way he puts it. Technological presbyopia causes analysts to lose sight of the enormous complexity of the processes they are studying - the ways that new businesses tie into the economic, social, political, and legal transformations that they trigger. His prescription is for a set of economic lenses designed to correct for buoyant techno-optimism on the one hand, for the "depressing conviction that something has gone awry" on the other.

"For in fact, the "dynamo revolution" was just getting started in 1900, David writes. The building of the great grid that would connect cities and whole continents had barely begun. For a time, innovative businesses installed electrically-driven systems on top of pre-existing power trains, with thoroughly mixed results. Power costs didn't begin to fall sharply until 1907-1917. The great productivity-enhancing powers of the new techniques became unmistakable only after World War I. Then, at last, businesses went on a building binge. Not until the great investment boom of the 1920s did electrical power - secondary motors, in particular - penetrate deep into modern factories. It was then that productivity began to soar.

"In his essay, David takes pains to locate his comparison of computers and electrical generators in the context of the current debate about lagging American productivity. It's well-known that American productivity fell well behind its post-war trend during the 1970s. And though many experts see the brilliant record of the 1950s and 1960s as a kind of unrepeatable "great leap forward" stemming from the sustained doing-without of the Great Depression, combined with World War II, there is still widespread puzzlement about the effect of computers on America's economic strength. "We see computers everywhere but in the economic statistics," as MIT's Robert Solow has put it. Does that mean that the much ballyhooed productivity-enhancing effects of computers are so much hot air? Not necessarily.

"On the experience of the coming of electricity, David thinks the good news on productivity may be yet to come - that is, when manufacturers begin truly switching over from present-day methods of record keeping and control to fully electronic systems, in everything from airplane and automotive controls to bank accounts. "You don't get the full productivity effects until about two-thirds of the way into the diffusion process," he told a session of the American Economic Association last month. “The productivity surge is located in that period.”

"If David is right, of course, it means there is good news up ahead for the computer industry in particular – and for the United States in general - though not necessarily for the companies that dominate the industry now. Hardware makers could see their opportunities subside as quickly as did the big dynamo manufacturers in the 1930s. Does anybody now remember General Electric's many competitors at the turn of the century? Mightn't IBM someday become as slim a splinter of the total market for computer gear as Thomas Edison's GE is in the electricity business today?

"As memories become cheaper, and architectures become more complex, and fiber-optic transmission becomes more efficient, computing could follow the example of electricity: towards utility-style organization, with artfully distributed processing nodes scattered wherever needed. "Computers" then might be everywhere and nowhere; networks might become the economically important item. Who knows, in time, maybe the most basic everyday terminology itself may change. If everything you buy has a certain degree of computer "smartness" built into it, maybe your monthly "processing" bill becomes the important thing.

"Whatever the case, it seems likely that there will continue to be plenty of demand for new information-processing goods and services. That means investment opportunities and jobs, if not tomorrow, then in due course. As the computer revolution proceeds, manufacturing employment as a percentage of the total American work force might then be expected to come down from its present 25 percent, with no more ultimately adverse consequences than were suffered during the dramatic decline in farm jobs from around 40 percent of the total in 1940 to less than 3 percent today. If Paul David's analogy with the history of the electrical revolution is as fruitful as it seems, then it is merely half-time in the information revolution. The biggest opportunities (if not the greatest hoopla) are still ahead. "

The original source articles by Paul A. David are "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox" (American Economic Review, May 1990) and the longer working paper "Computer and Dynamo: The Modern Productivity Paradox in a Not-Too-Distant Mirror."

One of the major challenges Google faces is getting its new software engineers to understand the scale of a computing utility. Christophe Bisciglia illustrates the problem with a question he asks in interviews: "Tell me what you would do if you had 1,000 times more data?" Bisciglia has since developed courseware to help students build these skills while they are still in college. An overview of this courseware is available online at Google.
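To make the scale question concrete, here is a minimal sketch in Python of the MapReduce-style programming model that courseware of this kind typically centers on: you write a small mapper and a small reducer, and the framework takes responsibility for spreading the work across as many machines as the data demands. The function names and the tiny single-process driver below are hypothetical teaching stand-ins, not Google's actual infrastructure.

# Illustrative only: a toy, single-machine MapReduce-style word count.
# The mapper/reducer signatures and the run_mapreduce() driver are
# hypothetical stand-ins for teaching, not Google's actual framework.
from collections import defaultdict
from typing import Dict, Iterable, Iterator, Tuple

def mapper(document: str) -> Iterator[Tuple[str, int]]:
    # Emit a (word, 1) pair for every word in one document.
    for word in document.lower().split():
        yield word, 1

def reducer(word: str, counts: Iterable[int]) -> Tuple[str, int]:
    # Sum all the partial counts emitted for a single word.
    return word, sum(counts)

def run_mapreduce(documents: Iterable[str]) -> Dict[str, int]:
    # Single-process driver: group mapper output by key, then reduce.
    # A real cluster would shuffle these keys across many machines.
    groups = defaultdict(list)
    for doc in documents:
        for key, value in mapper(doc):
            groups[key].append(value)
    return dict(reducer(key, values) for key, values in groups.items())

if __name__ == "__main__":
    docs = ["the dynamo and the computer", "the computer becomes a utility"]
    print(run_mapreduce(docs))  # {'the': 3, 'computer': 2, 'dynamo': 1, ...}

The habit of mind the exercise builds is that the mapper and reducer stay exactly the same whether they run over two documents or two billion; scaling out is the framework's problem, which is what the "1,000 times more data" question is probing for.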

This notion that the "cloud" is the new computer yields an insightful observation captured in the Business Week article:

"As the sea of business and scientific data rises, computing power turns into a strategic resource, a form of capital. 'In a sense,' says Yahoo Research Chief Prabhakar Raghavan, 'there are only five computers on earth.' He lists Google, Yahoo, Microsoft, IBM, and Amazon. Few others, he says, can turn electricity into computing power with comparable efficiency. "

Carr, in The Big Switch, points out that what is bringing the computing utility to life is the repeal of Grove's Law, which held that telecommunications bandwidth doubles only every century. Carr states:

"The network barrier has, in just the last few years, begun to collapse. Thanks to all the fiberoptic cable laid by communications companies during the dotcom boom - enough, according to one estimate, to circle the globe more than 11,000 times - Internet bandwidth has become abundant and abundantly cheap. Grove's Law has been repealed. And that, when it comes to computing at least, changes everything. Now that data can stream through the Internet at the speed of light, the full power of computers can finally be delivered to users from afar. It doesn't matter much whether the server computer running your program is in the data center down the hall or in somebody else's data center on the other side of the country. All the machines are now connected and shared - they're one machine. As Google's chief executive, Eric Schmidt, predicted way back in 1993, when he was the chief technology officer with Sun Microsystems, 'When the network becomes as fast as the processor, the computer hollows out and spreads across the network.'"

Thomas Friedman, in The World Is Flat: A Brief History of the Twenty-first Century, also chronicles the repeal of Grove's Law and looks at the many ways business has gone global as a result of cheap network bandwidth.

The challenge for those of us creating disruptive business plans for our clients is to think differently about how our business models change when two shifts interact: no longer having to invest in an IT infrastructure, and having access to thousands of times more data than we do today. It's rare for a business that a single innovation can affect both the expense side of the equation (reduced IT expenses) and the revenue side of the equation (scaling to a far higher level of useful data). This intersection of insights is what Ian Ayres describes in SuperCrunchers: Why Thinking by Numbers is the New Way to Be Smart.
