IT Doesn’t Matter
In 1968, a young Intel engineer named Ted Hoff found a way to put the circuits necessary for computer processing onto a tiny piece of silicon. His invention of the microprocessor spurred a series of technological breakthroughs--desktop computers, local and wide area networks, enterprise software, and the Internet--that have transformed the business world. Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve. Hardly a dollar or a euro changes hands anymore without the aid of computer systems.
As IT's power and presence have expanded, companies have come to view it as a resource ever more critical to their success, a fact clearly reflected in their spending habits. In 1965, according to a study by the U.S. Department of Commerce's Bureau of Economic Analysis, less than 5% of the capital expenditures of American companies went to information technology. After the introduction of the personal computer in the early 1980s, that percentage rose to 15%. By the early 1990s, it had reached more than 30%, and by the end of the decade it had hit nearly 50%. Even with the recent sluggishness in technology spending, businesses around the world continue to spend well over $2 trillion a year on IT.
But the veneration of IT goes much deeper than dollars. It is evident as well in the shifting attitudes of top managers. Twenty years ago, most executives looked down on computers as proletarian tools--glorified typewriters and calculators--best relegated to low-level employees like secretaries, analysts, and technicians. It was the rare executive who would let his fingers touch a keyboard, much less incorporate information technology into his strategic thinking. Today, that has changed completely. Chief executives now routinely talk about the strategic value of information technology, about how they can use IT to gain a competitive edge, about the "digitization" of their business models. Most have appointed chief information officers to their senior management teams, and many have hired strategy consulting firms to provide fresh ideas on how to leverage their IT investments for differentiation and advantage.
Behind the change in thinking lies a simple assumption: that as IT's potency and ubiquity have increased, so too has its strategic value. It's a reasonable assumption, even an intuitive one. But it's mistaken. What makes a resource truly strategic--what gives it the capacity to be the basis for a sustained competitive advantage--is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can't have or do. By now, the core functions of IT--data storage, data processing, and data transport--have become available and affordable to all.[1] Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none.
IT is best seen as the latest in a series of broadly adopted technologies that have reshaped industry over the past two centuries--from the steam engine and the railroad to the telegraph and the telephone to the electric generator and the internal combustion engine. For a brief period, as they were being built into the infrastructure of commerce, all these technologies opened opportunities for forward-looking companies to gain real advantages. But as their availability increased and their cost decreased--as they became ubiquitous--they became commodity inputs. From a strategic standpoint, they became invisible; they no longer mattered. That is exactly what is happening to information technology today, and the implications for corporate IT management are profound.
Vanishing Advantage
Many commentators have drawn parallels between the expansion of IT, particularly the Internet, and the rollouts of earlier technologies. Most of the comparisons, though, have focused on either the investment pattern associated with the technologies--the boom-to-bust cycle--or the technologies' roles in reshaping the operations of entire industries or even economies. Little has been said about the way the technologies influence, or fail to influence, competition at the firm level. Yet it is here that history offers some of its most important lessons to managers.
A distinction needs to be made between proprietary technologies and what might be called infrastructural technologies. Proprietary technologies can be owned, actually or effectively, by a single company. A pharmaceutical firm, for example, may hold a patent on a particular compound that serves as the basis for