History Of Computers
Computers play a very important role in everyday life around the world. Just about every nation has computers running vital parts of its military and government. I will explain the history, types, different uses, and growth of computers.
The history of computing began with an analog machine. In 1623 the German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocket wheels to add and, with the aid of logarithm tables, multiply and divide. The French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642 that added and subtracted, automatically carrying and borrowing digits from column to column. In the early 19th century the French inventor Joseph-Marie Jacquard devised a specialized type of computer: the loom. Jacquard's loom used punched cards to program patterns that the loom then output as woven fabrics. Another early mechanical computer was the Difference Engine, designed in the early 1820s by the British mathematician and scientist Charles Babbage. Although never completed by Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that could solve mathematical problems. Babbage also made plans for another machine, the Analytical Engine, considered the mechanical precursor of the modern computer. The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage's lack of political skill kept him from obtaining the approval and funds to build it. Augusta Ada Byron was a personal friend and student of Babbage. She was the daughter of the famous poet Lord Byron and one of the few women mathematicians of her time. She prepared extensive notes on Babbage's ideas and the Analytical Engine, and her conceptual programs for the Engine led to the naming of the Ada programming language in her honor. Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.
Herman Hollerith, an American inventor, used an idea similar to Jacquard's loom when he combined punched cards with devices that created and electrically read the cards. Hollerith's tabulator was used for the 1890 U.S. census, completing the count three to four times faster than earlier hand counts. Hollerith's Tabulating Machine Company eventually merged with other companies and in 1924 became IBM. In 1936 the British mathematician Alan Turing proposed the idea of a machine that could process equations without human direction. The Turing machine resembled an automatic typewriter that used symbols for mathematics and logic instead of letters. Turing intended the device to be a universal machine that could be programmed to duplicate the function of any other existing machine. Turing's machine was the theoretical precursor of the modern digital computer.
In the 1930s the American mathematician Howard Aiken developed the Mark I calculating machine, which was built by IBM. This electromechanical calculating machine used relays and electromagnetic components in place of purely mechanical ones. In later machines, Aiken used vacuum tubes and solid-state transistors to manipulate binary numbers. Aiken also introduced computers to universities by establishing the first computer science program, at Harvard University. Aiken never trusted the concept of storing a program within the computer; instead, his computer had to read instructions from punched cards. At the Institute for Advanced Study in Princeton, the Hungarian-American mathematician John von Neumann developed one of the first computers used to solve problems in mathematics, meteorology, economics, and hydrodynamics. Von Neumann's 1945 design for the Electronic Discrete Variable Automatic Computer (EDVAC) described the first electronic computer to use a program stored entirely within its memory. John Mauchly, an American physicist, proposed an electronic digital computer, the Electronic Numerical Integrator And Computer (ENIAC), which he and J. Presper Eckert, an American engineer, built at the Moore School of Engineering at the University of Pennsylvania in Philadelphia. ENIAC was completed in 1945 and is regarded as the first successful general-purpose digital computer. It weighed more than 27,000 kg and contained more than 18,000 vacuum tubes; roughly 2,000 of those tubes were replaced each month by a team of six technicians. Many of ENIAC's first tasks were military, such as calculating ballistic firing tables and designing atomic weapons. Because ENIAC was initially not a stored-program machine, it had to be rewired for each new task. Eckert and Mauchly eventually formed their own company, which was then bought by Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications; by 1957, 46 UNIVACs were in use.

In 1948, at Bell Telephone Laboratories, the American physicists Walter Houser Brattain, John Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an electric switch. The transistor had a tremendous impact on computer design, replacing costly, energy-inefficient, and unreliable vacuum tubes. In the late 1960s integrated circuits, tiny transistors and other electrical components arranged on a single chip of silicon, replaced individual transistors in computers. Integrated circuits became ever more miniaturized, enabling more components to be designed into a single computer circuit. In the 1970s refinements in integrated circuit technology led to the modern microprocessor, an integrated circuit containing thousands of transistors; modern microprocessors contain as many as 10 million.
Manufacturers used integrated circuit technology to build smaller and cheaper computers. The first of these so-called personal computers (PCs), the Altair 8800, was sold by Micro Instrumentation and Telemetry Systems (MITS) and appeared in 1975. It used an 8-bit Intel 8080 microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and displayed output on rows of light-emitting diodes. Refinements in the PC continued with the inclusion of video displays, better storage devices, and CPUs with more computational power. Graphical user interfaces were first designed by the Xerox Corporation and later used successfully by Apple Computer with its Macintosh. Today sophisticated operating systems such as Windows 95 and Unix enable computer users to run programs and manipulate data in ways that were unimaginable 50 years ago.
Possibly the largest single calculation was accomplished by physicists at IBM in 1995: solving one million trillion mathematical problems by continuously running 448 computers for two years to demonstrate the existence of a previously hypothetical subatomic particle called a glueball. Japan, Italy, and the United States are collaborating to develop new supercomputers that will run such calculations one hundred times faster.
In 1996 IBM challenged Garry Kasparov, the reigning world chess champion, to a match against a supercomputer called Deep Blue, which could evaluate more than 100 million chess positions per second. Kasparov won the match with three wins, two draws, and one loss, but Deep Blue became the first computer to win a game against a reigning world chess champion under regulation time controls. Many experts predict that such parallel processing machines will soon surpass human chess-playing ability, and some speculate that massive calculating power will one day replace intelligence. Deep Blue serves as a prototype for future computers that will be required to solve complex problems.
Computers can be either digital or analog. Digital refers to processes that manipulate binary numbers (0s and 1s), which represent switches turned on or off by electrical current. Analog refers to numerical values that have a continuous range: 0 and 1 are possible analog values, but so are 1.5 and a number like pi (π). As an example, consider a desk lamp. If it has a simple on/off switch, it is digital, because the lamp either produces light at a given moment or it does not. If a dimmer replaces the on/off switch, the lamp is analog, because the amount of light can vary continuously from on to off and through all intensities in between.
Analog computer systems were the first type to be produced. A popular analog computer used in the 20th century was the slide rule, which performed calculations by sliding a narrow, gauged wooden strip inside a ruler-like holder. Because the sliding is continuous and there is no mechanism to stop at one exact value, the slide rule is analog. New interest has been shown recently in analog computers, particularly in areas such as neural networks that respond to continuous electrical signals. Most modern computers, however, are digital machines whose components have a finite number of states: for example, the 0 or 1, or on or off, of bits. These bits can be combined to denote information such as numbers, letters, graphics, and program instructions.
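To make this concrete, here is a minimal sketch, written in Python purely for illustration (the language itself postdates most of this history), of how one group of on/off bits can be read both as a number and as a letter under the ASCII encoding:

    # A bit is either 0 or 1, like the lamp's on/off switch.
    bits = [0, 1, 0, 0, 0, 0, 0, 1]  # eight bits form one byte

    # Combine the bits into one integer: each position is a power of 2.
    value = 0
    for bit in bits:
        value = value * 2 + bit

    print(value)       # 65 -- the eight switches read as a number
    print(chr(value))  # 'A' -- the same switches read as an ASCII letter

The same pattern of switches means different things depending on how a program interprets it, which is how digital machines use bits to denote numbers, letters, graphics, and instructions alike.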
People use computers in a wide variety of ways. In business, computers track inventories with bar codes and scanners, check the credit status of customers, and transfer funds electronically. In homes, tiny computers embedded in the electronic circuitry of most appliances control the indoor temperature, operate home security systems, tell the time, and turn videocassette recorders on and off. Computers in automobiles regulate the flow of fuel, thereby increasing gas mileage. Computers also entertain, creating digitized sound on stereo systems or computer-animated features from a digitally encoded laser disc. Computer programs, or applications, exist to aid every level of education, from programs that teach simple addition or sentence construction to advanced calculus. Educators use computers to track grades and prepare notes; with computer-controlled projection units, they can add graphics, sound, and animation to their lectures. Computers are used extensively in scientific research to solve mathematical problems, display complicated data, or model systems that are too costly or impractical to build, such as testing the airflow around the next generation of space shuttles. The military employs computers in sophisticated communications to encode and decode messages and to keep track of personnel and supplies.
In conclusion, computers were invented in the early 17th century, and today, because of the computer, technology is making advancements that were once thought unobtainable. Computers play a vital role in our everyday life, and they maintain our way of life. Computers will continue to expand our views of what we can accomplish.
The History Of Computers
Presented To:
R. Knowles
Presented By:
Brandon Waterman
Computer Seminar I
CS 110
3/13/2001
Contents
I. How Computers Got Started
A. Early Computers
B. Early Inventors
II. Types Of Computers
A. Analog
B. Digital
III. Different Uses
A. Commercial
B. Personal
IV. Growth Of Computers