Communicating About Communications
Thank you. I want to thank Governor Leavitt, Ladd Christensen and his colleagues at Smart Utah, and Gina Crezee of the Utah Department of Community and Economic Development for inviting me to participate.
As I was preparing these remarks, I visited my 80-year-old Grandpa, who lives in a rural area of Northern Wisconsin. We were sitting at the kitchen table when he asked me to explain what I do for a living. I took a business card from my wallet and handed it to him. I told him about the National Information Infrastructure, and about the National Information Infrastructure Testbed consortium. He just stared at my card. When I finished, he looked at me and said: "I've always wanted to know: What is 'fax'?" I had explained the NII and the NIIT in excruciating detail. It never dawned on me that my Grandfather -- like millions of other happy and successful Americans -- has never seen or used a fax machine.
When I told Grandpa the "National Information Infrastructure" is just a fancy name for the information superhighway, he said, "Why does anybody need 500 channels of TV when there are too many commercials already?" When I told him the NIIT is a group of U.S. companies, national laboratories, universities and supercomputing centers that have joined hands to build that highway, he said, "Well, at least you quit practicing law."
And it isn't just my Grandpa. I remember my first NIIT meeting. The discussion focused on "ATM" -- short-hand for asynchronous transfer mode, a high-speed networking technology. I couldn't figure out why they were talking about automatic teller machines. When I was still in private law practice, I argued it was time for our firm to join the Internet. One of the senior partners muttered, "The last thing this firm needs is another health club membership."
The point is, those of us promoting the NII tend to forget our most important audience: the American people. According to a recent poll, only a third of Americans have ever heard of the information superhighway. Only a fraction of those know what it is, much less why it's important. If my Grandpa, or your Grandpa, or our parents or brothers and sisters don't believe they're part of this revolution -- or don't even know there is a revolution -- that's our problem, not theirs. If America is to lead the world in the 21st Century, we must demystify "cyberspace," "the Telecosm," "the Third Wave" and all the other shorthand phrases we currently use when we talk to ourselves. We need to speak plainly about how information technology will affect our families, our jobs and our future.
We can start by translating the term "National Information Infrastructure" into kitchen-table English. Infrastructure literally means "the structure within." It is the basic framework, the underlying foundation upon which the rest of society will be built. That foundation is needed to convey information -- the "raw material" of the New Economy. The Industrial Revolution is sometimes called the Age of Steel. Steel came to symbolize how man could harness energy to transform raw materials like iron and carbon into ships and railroads, bridges and buildings, cities and factories. Man had been smelting iron since 1000 B.C. Yet it was the advent of a national energy infrastructure that enabled the development of large-scale manufacturing, transportation and communications.
A New Economy of Scale
In much the same way, new computing and telecommunications technologies have greatly reduced the cost of harnessing information. People have been communicating with each other for thousands of years. Yet until now, the cost of doing so on a mass scale was high and characterized by scarcity, by gatekeepers, or both. A century after Johannes Gutenberg invented the printing press around 1440, it still took at least a month to send a single book from his hometown in Germany to any major city in Europe. Today, IBM Corporation has joined forces with the Vatican Library to save some of those original books by digitizing them -- converting them into "1s" and "0s", the language of computers -- and using the Internet to make them easily accessible to historians. In 1915, when transcontinental telephone service was first introduced, a four-minute call between New York and Los Angeles cost $30. Today, not accounting for inflation, the same $30 buys an hour-long videoconference. When I first started law practice, the U.S. Code occupied three library shelves and cost hundreds of dollars. Today you can buy a single CD-ROM with the entire code -- plus annotations -- for $39.95.
Putting the power of information into the hands of individuals from all walks of life -- and from every location -- is the essence of this revolution. That's why this infrastructure must be national in scope. Here the superhighway metaphor breaks down. This isn't an interstate highway system, planned and financed by government, that connects only the big spots on the map. The goal of the NII is to make information accessible to every American -- quickly, easily and inexpensively. It is upon this foundation that the economic future of our nation depends.
The Role of the NIIT
Conceptually, the National Information Infrastructure seems straightforward: Connect different computers so they can "talk" to each other, then build "on ramps" so each of us can plug into one vast system. The reality is something else entirely. Different computer systems speak different languages: in the jargon of the industry, systems must be engineered so they can "interoperate." What's more, a single electronic message might travel over the telecommunications network, the cable system or the Internet; by satellite or radio; even over the electrical power system, which in some parts of Europe is already providing basic phone service. Each of these networks was designed for a specific purpose. Each is a different kind of highway. You can't just plug them together. If you've traveled abroad and tried to plug your electric razor or hairdryer into a strange power socket, you have some idea what I mean.

Cable systems, for instance, are one-way expressways. They're terrific at delivering programs from your cable company but must be redesigned to carry two-way, interactive communications.

Or take the Internet, a "network of networks" with at least 30 million users worldwide. It was originally designed in the late 1960s by the U.S. Department of Defense to provide a data communications network that could survive a nuclear attack. The Internet relies on "packet-switching," which means a single message is broken down into separate packets by computers on one end of the network, routed to various other computers -- called "data nodes" -- and reassembled by computers on the other end. Accordingly, a message often travels to many relatively obscure places along the Internet before it reaches its final destination. The Internet's routing system makes sense for a network that is supposed to function even if large portions of it have been destroyed. But it makes much less sense in today's business world.
To illustrate this, a friend of mine once traced a single Internet electronic mail message from his office in Boulder to mine in Denver about 35 miles away. His e-mail was routed to 13 different data nodes before it reached my desk -- including three separate trips to the West Coast and back. The security risks associated with such a system pose obvious challenges for businesses using the Internet to transact electronic commerce. Without adequate protection, a hacker might intercept your message at virtually any data node along the way.
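[For readers of this transcript who want a concrete picture: the packet-switching process just described can be sketched as a toy simulation. This is a deliberately simplified model, not real Internet routing; the function names and node names are hypothetical.]

```python
# Toy model of packet switching: a message is split into numbered
# packets, each packet may hop through different intermediate nodes,
# packets may arrive out of order, and the receiver reassembles them.
import random

def send(message, packet_size=4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + packet_size])
            for i in range(0, len(message), packet_size)]

def route(packets, nodes):
    """Each packet independently travels through some of the nodes;
    delivery order is not guaranteed."""
    delivered = []
    for seq, chunk in packets:
        path = random.sample(nodes, k=random.randint(1, len(nodes)))
        delivered.append((seq, chunk, path))
    random.shuffle(delivered)  # simulate out-of-order arrival
    return delivered

def reassemble(delivered):
    """Sort by sequence number and rejoin the chunks."""
    return "".join(chunk for _, chunk, _ in sorted(delivered))

packets = send("HELLO FROM BOULDER")
arrived = route(packets, nodes=["Denver", "Palo Alto", "Chicago"])
print(reassemble(arrived))  # -> HELLO FROM BOULDER
```

However circuitous the path each packet takes, the sequence numbers let the receiving end restore the original message -- which is exactly why the network keeps working even when individual nodes disappear.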
Add to this the rapid pace of technological change. Our capabilities are doubling every year in computing and photonics -- the lightwaves that carry signals over fiberoptics -- and every 18 months in silicon chips. The next generation of advanced microprocessors will enable Personal Communications Services, or PCS -- portable devices you can carry with you anywhere on the planet. Without question, the unpredictability of today's technology means that the private sector -- and most emphatically not the federal government, as I'll explain later -- must lead the way in creating the National Information Infrastructure. Yet the requirements of the NII go beyond the ability of any single company or institution.
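[To make those doubling rates concrete: compounded over a decade, a 12-month doubling period and an 18-month doubling period lead to dramatically different outcomes. The small calculation below is purely illustrative; the doubling periods are the speech's round numbers, and the function name is hypothetical.]

```python
# Compound growth implied by a fixed doubling period:
# capability multiplies by 2^(horizon / doubling_period).
def growth_factor(doubling_months, horizon_months):
    return 2 ** (horizon_months / doubling_months)

# Over a decade (120 months):
print(growth_factor(12, 120))  # doubling yearly: 1024x
print(growth_factor(18, 120))  # doubling every 18 months: ~102x
```

A tenfold difference in doubling period would be easy to grasp; a mere 50 percent difference quietly compounds into an order-of-magnitude gap within ten years, which is why nobody can confidently predict where these technologies will be by the next decade.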
That's why we need the NIIT, the National Information Infrastructure Testbed. A "testbed" is a proving ground, a test-track where scientists and engineers can learn how to connect and redesign different networks and make different computer systems interoperate. It's also a place where customers and providers can assess critical requirements so that applications can be tailored to fit their needs.

Let's say -- purely hypothetically, of course -- that you live in Denver and want to build an automated baggage handling system for your new airport. You could build a scale model of that system -- a prototype -- and try it out with customers one step at a time. Granted, it would cost you some money up front, but over the long run you would learn from practical experience what works and what doesn't -- improving the system and avoiding unnecessary costs and delays. The alternative -- again hypothetical -- is to forgo testbedding, build the entire baggage system from scratch, and hope for the best. Unlike a private business venture, this hypothetical airport is a public works project, so taxpayer dollars are at stake. As my Grandpa might say, how hard you gamble depends on whose money it is.
Testbedding and prototyping are relatively new words, but the concept is as old as the Industrial Age. In 1879, when Thomas Edison first told the world he was building an electrical lighting system that would illuminate all of Manhattan, he had two minor problems: He had not yet invented either the light bulb or the electrical power system. After he created the first bulb, the Wizard of Menlo Park built a testbed to see how many different bulbs he could string together at once. After a month, his network had grown to 40 continuously burning bulbs -- not exactly enough to light up Manhattan, but a crucial step forward.
Engineers call this process "scaling up": Creating a small-scale pilot project that is the precursor to a large-scale production environment. This is the essence of testbedding -- and it is the mission that defines NIIT. Created in September 1993, NIIT is a privately funded, industry-led consortium with more than 60 members from telecommunications and computing companies, the national laboratories and supercomputing centers, and public and private universities. NIIT is not a commercial venture. We do not sell products or services. Instead, our members use the NII Testbed to develop practical applications in areas such as health care, electronic commerce and manufacturing. These applications can then be scaled up to improve their capabilities and performance, as well as that of the underlying infrastructure.

Who's on First?
Testbeds like the NIIT are one reason why U.S. industry is leading the world into the Information Age. I recently returned from Brussels, Belgium, where NIIT represented the United States at the first-ever Information Society Showcase. The Showcase featured the economic ministers of the Group of Seven industrialized nations, "the G7" -- Germany, Japan, Italy, France, Britain, Canada and the U.S. Each of the G7 nations selected companies to demonstrate the latest information technology.
The untold story of the G7 was the commanding presence of the United States. Back home, the event was hardly mentioned by the press. But it made headlines in Europe and Japan. The tone of those stories was somber: How can the other G7 nations compete with the U.S. in telecommunications and computers? The first night in Brussels, I had dinner with friends from several European Union countries. They argued about how many years their respective countries were behind the U.S. in information technology. Estimates ranged from five to 15 years. The only consensus was that America is pulling ahead.
I must confess that the tenor of these discussions still surprises me. Despite my best efforts -- and the wealth of evidence to the contrary -- I sometimes catch myself believing that the U.S. is falling behind the Europeans and Japanese. Perhaps you have had the same experience. Many of us were taught that way. When I was in college a decade ago, the conventional wisdom was that the U.S. would never again be the world's leading economic nation; that we were destined to take second place behind the Europeans and Japanese; and that the future belonged to them, not us.
This wasn't the first time -- and doubtless won't be the last -- that the conventional wisdom turned out to be wrong. In fact, the Myth of American Decline has been fashionable for much of the 20th Century. Recently, I stumbled across a book called The Wave of the Future, written by Anne Morrow Lindbergh, wife of aviator Charles Lindbergh. Published shortly before America's entry into World War II, this book predicts the imminent and inevitable triumph of dictatorship over democracy. Lindbergh proclaims that in the final analysis, the trend toward totalitarianism is really all for the best -- as evidenced by what she characterizes as the brilliant artistic, cultural and industrial achievements of Nazi Germany and its enlightened if strong-willed leader, Adolf Hitler.

Like most myths, the Myth of American Decline persists because it has at least some superficial plausibility. Take Japan. The growth of the Japanese economic powerhouse -- whose share of world industrial output rose from 2.5 percent in 1913 to 5 percent in 1938 to roughly 10 percent today -- has fueled the conventional wisdom that while Japan lost World War II militarily, it will eventually prevail over us economically.
This view, which gained currency in the mid-1980s, is once again becoming trendy. No less an authority than The Washington Post -- which likes to think of itself modestly as the most important newspaper in the most important city in the world -- recently featured articles by two often-quoted political analysts, Edward Luttwak and Clyde Prestowitz. Both claim that the U.S. is in long-term economic decline. Luttwak -- who previously predicted that Desert Storm would lead to an American bloodbath -- warns that "[s]tructural changes induced by the arrival of new technologies . . . are impoverishing a net majority of all working Americans." He urges Congressional leaders to embrace Japanese industrial policy -- before it's too late. Prestowitz -- who less than two years ago predicted the demise of the U.S. auto industry -- ridicules America's lead in new information technologies. He suggests that our economic future is so grim that we may have to "persuade Mexico to bail us out." Joining this pair is the Post's own Hedrick Smith, who hosts a PBS show and has just written a book -- both appropriately called Rethinking America -- based on the premise that Americans can only succeed in the global economy if we become less like ourselves and more like the Japanese.
These gloomy forecasts may sell books and articles and appeal to the Corporation for Public Broadcasting, but they just don't square with the facts. Manufacturing is still the same percentage of U.S. Gross National Product as it was when Lindbergh predicted our imminent collapse. In fact, U.S. manufacturing output has risen more than 50 percent since 1980. It is now twice as high as in 1970 and five times as high as 1950. In 1991, we regained our position as the world's largest exporter. Last year, for the first time since 1979, we produced more cars than Japan. Our country also leads the way in entire new manufacturing industries. For example, U.S. companies account for half the world's shipments of fiberoptic cable.
Yet manufacturing output is just one measure of national economic strength. What matters most in today's economy is a nation's ability to increase its productivity in both the manufacturing sector and the services sector. On the manufacturing side, the productivity of our industrial workforce is increasing at 3 percent a year -- the biggest increase in decades. In 1980, it took an American worker 10 hours to produce a ton of steel. It takes four hours today. On the services side, America's commanding lead is a direct function of our success in creating knowledge-based industries and information processing technologies. Here in Utah, one out of every four new jobs is in services, and one out of every nine is in information technologies.
So don't let anyone tell you our nation is destined for decline. The fact is, we don't need to imitate the Europeans, the Japanese or anyone else to compete and win in the global economy.
Let me offer a recent example from the computer industry, again using Japan. Back in the mid-80s, Japanese industrial policy was trumpeted as "superior" to the American free-enterprise system. Japan's powerful Ministry of International Trade and Industry was investing heavily in supercomputers. A decade and several billion dollars later, Japan is competitive in what turned out to be a dying industry. By 1993, a single desktop personal computer had all the power of a supercomputer occupying an entire room. PCs had captured more than 55 percent of the total computer market. More than a third of Americans have them, compared to less than 10 percent of Japanese and Europeans. The Age of the Supercomputer never materialized.

There are scores of similar examples, such as the massive subsidies Japan poured into analog-based High-Definition Television -- even after digital HDTV emerged as the winner in the competitive marketplace. A small but growing number of Japanese now believe the only way to keep up with the U.S. in information technology is to embrace American-style free enterprise. Not surprisingly, the Japanese equivalents of Luttwak, Prestowitz and Smith are catering to their worst fears. A current bestseller in Tokyo bookstores is entitled, "The Threat of the Information Superhighway and the Annihilation of the Japanese Information Industry."
A High Stakes Game
As Americans, we know that a national economic policy based on free and open markets is still the best way to take full advantage of rapidly changing technology. We live in a society that still encourages free men and women to predict the future direction of new technology. They take the risks so they might reap the rewards. We likewise know from experience that government officials are not subject to the same degree of discipline and accountability as their private sector counterparts. That is not meant as an indictment against the necessity of government in its proper sphere, or as a criticism of our public servants, no matter how well-intentioned. It is merely to say that government was not and is not designed for predicting where new commercial technology will flourish and where it will fail. An economist might say that policymakers do not internalize the risks of their own decisions. When entrepreneurs get it wrong, they take their own lumps. The rest of us go on as before. When bureaucrats fail, they may or may not pay the price. But the rest of us most certainly will.
The history of information technology is littered with individuals who tried to predict the future but failed -- and of visionaries who succeeded where few would tread. Here's what Western Union had to say in 1882 about Alexander Graham Bell's proposal to build the first municipal telephone network: "Bell's proposal to place his instrument in every home and business is, of course, fantastic in view of the capital costs involved in installing endless numbers of wires. . . . [A]ny development of the kind and scale which Bell so fondly imagines is utterly out of the question."
A more recent example is the cellular communications industry. When AT&T first started selling cellular telephones in the mid-'80s, it predicted a total U.S. market of 900,000 phones. Today, nearly 20 million Americans own cellular phones; usage has nearly doubled since 1992; and 14,000 new customers are signing up every day as rates continue to fall. It's one thing for AT&T to underestimate the potential of its own customers to buy its own products. But what if the U.S. Congress had adopted a national industrial policy a decade ago? Using AT&T's own market statistics, the Federal Communications Commission might well have concluded that cellular technology was a luxury item and given that share of the electromagnetic spectrum to other, "more promising" technologies.
Instead, the success of cellular technology has surpassed all predictions. Everyone wants to get into the game. I recently visited 3M Company, an $18 billion manufacturing corporation that makes 60,000 different product lines -- from computer disks to Scotch tape to Post-It notes. It turns out 3M is also the third-largest owner of billboards in the U.S. and may be buying even more. One reason: Billboard posts make great places for cellular phone antennas.

What's more, many Less-Developed Countries that have never had reliable telephone service -- or any service at all -- are now working with cellular companies like AT&T, U S West, and AirTouch Communications to connect millions of new subscribers. This is no small matter given that three billion people on this planet have never made a telephone call.
Lowering the Roadblocks
As has so often happened, America has achieved her leadership role in advanced information technology at a defining moment in history. The global market for information products and services is valued at $853 billion and growing at more than 10 percent annually. Worldwide investment in telecommunications infrastructure alone will exceed $200 billion by 2004. With aggressive deregulation and pro-competitive policies, the National Information Infrastructure could boost overall U.S. productivity by as much as 40 percent over the next 15 years.
One such policy is to eliminate the legal and regulatory roadblocks that are grinding traffic on some parts of the information highway to a standstill. One of the great ironies of the 1990s is that heavy-handed laws and regulations are driving innovative U.S. companies offshore. In Great Britain, for example, consumers can purchase both cable television and telephone service from a single company. U S West and TCI have crossed the Atlantic to provide services that are usually illegal to sell at home. The remarkable success of these two U.S. companies in a comparatively modest market of 50 million people foreshadows what American entrepreneurs can achieve in our own country once legal and regulatory barriers are removed.
A Larger Purpose
Let me emphasize that much more is at stake than material wealth. Each generation of Americans is summoned to a great national purpose. Ours is no different. Until recently, that purpose was winning the Cold War. That victory, like so many others, was achieved at enormous cost in money, resources and human lives. To be worthy of that legacy, we should resolve to use today's information technologies to democratize the coming millennium. Held firmly in American hands, these technologies have unprecedented potential to promote democratic values and institutions around the world. Like those who have gone before us, we must ensure that freedom remains the birthright of every American -- and work to make it as universal as the global information superhighway we are building.

Democracy is still our last, best hope for peace and prosperity. As we prepare for the bright days of the 21st Century, we should reflect on the darkest hours of the 20th. There was a time, not so very long ago, when dictators were on the march. Totalitarianism threatened to become the new world order, although it was hardly the celebration of art and culture that Anne Morrow Lindbergh predicted. Our commemoration of the 50th anniversary of the end of World War II takes on new meaning when we recall that at the start of that bloody conflict -- where 15 million people perished on the battlefield -- there were only nine democracies left on earth.
Of all the lessons we could learn from this century, none is more brutally clear than this: Dictators murder their own people and wage war on the rest of us. As Philip Burgess has written, dictators are "unrestrained by the checks and balances inherent in the institutions of a democratic society -- including, first and foremost, a free press that can blow the whistle on official outrages and otherwise facilitate public accountability." Since 1900, more than 120 million people have been killed in domestic conflicts, such as genocide, insurrections and civil wars. Only about 3 percent of those deaths occurred in democratic societies.
Modern information technology is a dictator's worst nightmare. Long before the Persian Gulf War, we knew who and what Saddam Hussein was when we saw -- through the eye of a video camera -- that he had gassed his own citizens, the Kurds. During the Soviet coup, Boris Yeltsin and his democratic reformers were saved by a single fax machine, which allowed them to rally supporters outside the besieged Parliament Building. Fax machines and videocassette recorders also enabled the pro-democracy demonstrators in Tiananmen Square to capture the world's attention. Tragically, the freedom fighters there did not prevail -- this time. But someday soon, Chinese students may own inexpensive PCS devices, linked by satellite to hundreds of millions of Internet users worldwide. Their next protest might turn out differently.
Information technology also strengthens democracy through increased trade and investment. From a political standpoint, free trade reduces the power of government bureaucrats to interfere with private transactions. Information technology allows many more transactions to occur -- creating greater economic wealth and a strong middle class of entrepreneurs with a stake in the future. This is happening today in Mexico because of the North American Free Trade Agreement and helps explain why six decades of one-party rule by a privileged elite is finally giving way to democratic reforms.
For these reasons and many more, using information technology to democratize the millennium is an urgent national priority. Preserving our own democracy in the 21st Century means projecting democratic values to every corner of the globe -- from Cuba to North Korea, from Baghdad to Beijing. The measure of our generation will be the extent to which we use the National Information Infrastructure to promote the values that have made America great -- free enterprise, equal opportunity, respect for the individual, and the rule of law. Those values are the key to building a safer, more prosperous world. The NII can help us make them the world's values.
In his Iron Curtain speech in 1946, Winston Churchill said, "The United States stands at this time at the pinnacle of world power. It is a solemn moment for the American democracy. For with the primacy of power is also joined an awe-inspiring accountability for the future." Those words are still true today. America still stands at the pinnacle of world power. And we are still accountable for the future. Let us use this solemn moment to build the National Information Infrastructure. For it is upon that foundation that we can create not just the Information Age, but the Age of Democracy. Thank you.