EssaysForStudent.com - Free Essays, Term Papers & Book Notes

Computer History

By:   •  Research Paper  •  2,468 Words  •  April 23, 2010  •  1,146 Views



Many people do not know how, or even when, the first computers were made. Computing is often said to have started with the abacus, a simple counting device believed to have been built in Babylon in the fourth century B.C. The “First Generation” of electronic computers began in the very late 1930s. These machines were grotesquely slow, colossal in size, generated a great deal of heat, consumed hundreds of kilowatts of power, and were about as reliable as a used-up match (www.pbs.org). As time went on, computers moved from vacuum tubes to transistors; the invention of the transistor in 1947 marks the beginning of the “Second Generation.” Unfortunately, the Second Generation did not advance as much as most people hoped, but the “Third Generation” was eventually brought about by the invention of the integrated circuit in 1958. Integrated circuits replaced transistors, and many computer languages emerged in this period. Many more computer companies were founded during this time, which eventually led to personal computers for everyday use. The microprocessor introduced the “Fourth Generation,” an era in which computers were in almost every house (www.cs.princeton.edu).

Had the automobile developed at a pace equal to that of the computer during the past twenty years, today a Rolls Royce would cost less than $3.00, get a million miles to the gallon, and six of them would fit on the head of a pin! (www.crews.org)

Early computers, starting with the “First Generation,” were not what we would think of as computers today. Almost every one filled an entire room, and few were smaller than a modern van. Compared with today’s machines, these first computers were as slow as snails and nowhere near as dependable. In 1936, Alan Turing showed that a machine can solve any problem that can be expressed as a finite number of steps the machine can carry out (www.csudh.edu). The first computers were really little more than calculators. In 1939, John Atanasoff built a small prototype machine to test his ideas, and he and Clifford Berry then began work on the Atanasoff-Berry Computer (ABC); the project was halted by World War II and never finished. The ABC used 300 vacuum tubes to perform calculations, capacitors to store binary data, and punched cards for input/output (www.cs.princeton.edu).

The computer “code” called binary writes numbers in base two instead of the familiar base ten. Each place value is a power of two: the one’s place, the two’s place, the four’s place, and so on. For example, the number five is written 101. Early programmers found converting between bases difficult, so they came to think of one as true and zero as false (www.cs.princeton.edu).
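The place-value idea above can be sketched in a few lines of Python. This is only an illustration of the essay’s example (five as 101); the helper name `to_binary` is ours, not from any source.

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its base-two digit string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the bit for the current place
        n //= 2                    # shift attention to the next power of two
    return "".join(reversed(digits))

print(to_binary(5))   # five is 101: one four, no two, one one
print(int("101", 2))  # reading it back: 1*4 + 0*2 + 1*1 = 5
```

The built-in `int("101", 2)` performs the reverse conversion, summing each digit times its power of two, which matches the place-value description above.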

A man by the name of Howard Aiken built the Mark I, which had mainly mechanical parts but some electronic parts as well. The Mark I was the first computer financed by IBM; it was about 50 feet long and 8 feet tall and used mechanical switches to open and close its electric circuits. It contained over 500 miles of wire and 750,000 parts, and weighed well over 5 tons. Aiken went on to hire one of the most famous women in computer history, Grace Hopper, as the lead programmer of the Mark I. Early in her time in that role, she came across the first computer “bug”: an actual moth trapped in one of the machine’s relays. It is amazing that the name for “debugging” a system came about because of a moth (www.csudh.edu).

In 1945, John von Neumann wrote a paper describing how a binary program could be stored in a computer’s memory. A stored program would let the computer alter the operations it performed depending on the results of previous steps. This concept greatly increased computers’ capabilities, and the EDVAC (Electronic Discrete Variable Automatic Computer) was among the first computers designed around this idea. It was followed shortly by the UNIVAC I (UNIVersal Automatic Computer) (www.cs.princeton.edu).

During World War II, the Defense Department needed an easier way to compute firing and ballistic tables. J. Presper Eckert and John Mauchly developed the ENIAC (Electronic Numerical Integrator and Computer), completed in 1946, to solve the military’s problem. The ENIAC’s first job was to calculate the feasibility of a design for a hydrogen bomb. It filled a thirty-by-fifty-foot air-conditioned room and weighed over thirty tons. Its more than 18,000 vacuum tubes could perform about 5,000 additions a second, roughly 1,000 times faster than the Mark I. Operators had to use plug boards and wires to program the computer for each new task. Dials were turned until the desired numbers corresponded
