IT Ethics
By: Fatih • Research Paper • February 23, 2010
Business Ethics
Most companies today have an IT department. This appears to be an obvious observation. However, it is worth recognizing that, within the memory of more than half the working population of the United States, a company department organized solely around information technology was unheard of. In a mere 40 years, the IT department has evolved from a narrowly focused data-processing arm of the accounting department into a function that supports and, in many cases, drives nearly every area of a company. Stand-alone IT departments are thus a relatively recent development. The number of people working in technology-related jobs grew six times faster between 1983 and 1998 than the U.S. workforce at large, and information technology-related industries doubled their share of the U.S. economy between 1977 and 1998. Practically overnight, technology-related services have become a global, trillion-dollar industry.
The principal driver behind the remarkably rapid creation of this vibrant, sophisticated, and enormous industry, and behind the attendant inclusion of a dedicated IT department in every credible company, is the quest for improved business productivity.
The notion of technology investment as a driver of United States business productivity has a controversial history. The benefits of technology investments (and IT departments) were not always so apparent. Productivity growth in the United States faltered from the mid-1970s through the early 1990s, in spite of large technology investments by most major U.S. corporations. The disconnect between heavy capital and expense investment and the theoretically associated improvements in productivity became known as the productivity paradox. Reacting to the failure of such large investments to produce the expected productivity gains, MIT Nobel Laureate Robert Solow famously remarked in 1987, “You can see the computer age everywhere but in the productivity statistics.”
More recent research suggests that the productivity benefits from the deployment of technology have had a massive, albeit delayed, impact on the U.S. and world economy. A variety of researchers have concluded that investments in IT have been instrumental in the improved productivity seen in the U.S. economy beginning in the mid-1990s. In early 2000, the Federal Reserve gave information technology investments credit for approximately $50 billion in productivity improvement, which represents more than 65 percent of the total $70 billion in productivity gains seen by businesses in the United States in the last half of the 1990s.
Even the Internet, which hindsight from the dot-com era shows was clearly oversold, remains a revolutionary achievement. Certainly, it has not lived up to the price/earnings expectations reflected in the NASDAQ of the late 1990s, but, on the other hand, would anyone care to do without it? In short, no reasonable person today contemplates a life without corporate systems supporting every business function, from manufacturing, to finance, to sales and customer support, to say nothing of desktop office-automation products such as spreadsheets and word processors. Corporations have adopted technology to improve productivity, reduce costs, drive revenues, offer new capabilities to customers and suppliers, and maintain competitive parity. Researchers, educators, economists, and, most importantly, business managers agree that investments in information technology are not only unavoidable, but essential to remaining competitive.