The evolution of modern computers is commonly divided into a few distinct generations. Each generation is characterized by dramatic improvements over the prior era in manufacturing technology, the internal layout of computer systems, and programming languages. There has also been a steady improvement in algorithms, including those used in computational science, though this progress is not usually tied to particular computer generations. The following timeline is organized as a logical breakdown of events and discoveries.
First Generation of Modern Computers 1945-1956
With the beginning of the Second World War, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects hastened technical progress. By 1941 German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers. In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war (Goldstine 250).
American efforts produced a broader achievement. Howard H. Aiken, a Harvard engineer working with IBM, succeeded in producing a large-scale automatic calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer: it used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change), but it could perform basic arithmetic as well as more complex equations (Stern 47).
Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia.
"History of Modern Computers." 123HelpMe.com. 17 Aug 2018
In the mid-1940's John von Neumann joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years. In 1945 von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. This stored-program technique, along with conditional control transfer, which allowed the sequence of computations to be stopped at any point and then resumed, allowed for greater versatility in computer programming. Through the use of a memory large enough to hold both instructions and data, and by using the program stored in memory to control the order of arithmetic operations, EDVAC could run orders of magnitude faster than ENIAC. By storing instructions in the same medium as data, designers could concentrate on improving the internal structure of the machine without worrying about matching it to the speed of an external control. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source (Goldstine 171, 181-183).
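The stored-program idea can be illustrated with a toy machine: instructions and data occupy the same memory array, and a small fetch-decode-execute loop runs the program. The instruction set (LOAD/ADD/STORE/HALT) and layout below are invented purely for illustration, not a model of EDVAC itself.

```python
# A toy stored-program machine: code and data share one memory array.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.
memory = [
    ("LOAD", 4),    # acc = memory[4]
    ("ADD", 5),     # acc += memory[5]
    ("STORE", 6),   # memory[6] = acc
    ("HALT", None),
    2, 3, 0,        # data lives alongside the instructions
]

acc = 0             # accumulator register
pc = 0              # program counter
while True:         # the fetch-decode-execute cycle
    op, arg = memory[pc]
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[6])    # the stored sum: 5
```

Because the program itself is ordinary memory content, replacing it with a different set of instructions retargets the same hardware to a new task, which is exactly the versatility the paragraph above describes.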
Eckert and Mauchly later built on these advances to develop what was arguably the first commercially successful computer, the UNIVAC I (Universal Automatic Computer), released in 1951. Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election. Forty-five minutes after the polls closed, with 7% of the vote counted, the UNIVAC predicted Dwight D. Eisenhower would defeat Adlai Stevenson with 438 electoral votes (Stern 149).
Second Generation Computers 1956-1963
The invention of the transistor in 1947 greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers; as a result, the size of electronic machinery has been shrinking ever since. Transistors were at work in computers by 1956. Coupled with early advances in magnetic-core memory, they led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors. The first large-scale machines to take advantage of transistor technology were early supercomputers: Stretch, by IBM, and LARC, by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle the enormous amounts of data demanded by atomic scientists. The machines were costly, however, and too powerful for the business sector's computing needs, which limited their attractiveness. Only two LARCs were ever installed: one at the Lawrence Radiation Labs in Livermore, California, for which the computer was named (Livermore Atomic Research Computer), and the other at the U.S. Navy Research and Development Center in Washington, D.C. Another advance in second generation computers was assembly language: abbreviated programming codes replaced long, difficult binary codes (Gersting 35).
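The step from machine language to assembly language can be sketched with a toy assembler: each mnemonic stands in for a binary opcode, and a translation pass produces the bit patterns the hardware actually consumes. The mnemonics and encodings below are invented for illustration.

```python
# A toy assembler: mnemonics stand in for binary opcodes.
# The mnemonics and bit patterns here are invented for illustration.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b0000}

def assemble(line):
    """Translate one 'MNEMONIC operand' line into an 8-bit word:
    high 4 bits = opcode, low 4 bits = operand address."""
    parts = line.split()
    op = OPCODES[parts[0]]
    addr = int(parts[1]) if len(parts) > 1 else 0
    return (op << 4) | addr

program = ["LOAD 4", "ADD 5", "STORE 6", "HALT"]
words = [assemble(line) for line in program]
print([format(w, "08b") for w in words])
# The programmer writes "ADD 5" instead of memorizing 00100101.
```

The gain is purely human: the machine still executes binary, but the programmer no longer has to write or read it directly.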
Throughout the early 1960's, there were a number of commercially successful second generation computers used in businesses, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were of solid state design, containing transistors in place of vacuum tubes, and they included all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, and stored programs. One important example was the IBM 1401, which was widely adopted throughout industry and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second generation computers (Gersting 218).
It was the stored program and programming language that gave computers the flexibility to finally be cost effective and productive for business use. The stored program concept meant that instructions to run a computer for a specific function (known as a program) were held inside the computer's memory, and could quickly be replaced by a different set of instructions for a different function. A computer could print customer invoices and minutes later design products or calculate paychecks. More sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) came into common use during this time, and remain in use today. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer. New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second generation computers (Gersting 131).
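What "words, sentences, and mathematical formulas" bought programmers is easiest to see with a small business rule. In a high-level language, the rule reads almost like the algebra it came from; no opcodes or addresses appear at all. The payroll rule and figures below are invented for illustration.

```python
# A business rule expressed as a formula, the way high-level
# languages made possible. The rule and rates are invented examples.
def gross_pay(hours, rate):
    """Pay straight time up to 40 hours, time-and-a-half beyond."""
    regular = min(hours, 40) * rate
    overtime = max(hours - 40, 0) * rate * 1.5
    return regular + overtime

print(gross_pay(45, 10.0))  # 40*10 + 5*15 = 475.0
```

A payroll clerk's rule stated in one breath becomes a few self-describing lines, which is why languages like COBOL opened programming to people outside the machine-room priesthood.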
Third Generation Computers 1964-1971
Despite the fact that transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which could damage the computer's sensitive internal parts. The integrated circuit eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958, combining several electronic components on a single small disc of semiconductor material. Scientists later managed to fit ever more components on a single semiconductor chip, and computers became smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory (Gersting 35-39).
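The "central program" idea behind those early operating systems can be sketched in miniature: a supervisory loop gives each running program a short turn, so several make progress at once on one processor. The jobs here are Python generators and the whole scheme is a simplified illustration, not a model of any real system.

```python
# A minimal sketch of multiprogramming: a central loop (the
# "operating system") gives each program a turn, so several make
# progress at once. The jobs are invented for illustration.
def program(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"    # yield = hand control back to the OS

ready_queue = [program("payroll", 2), program("invoices", 3)]
log = []
while ready_queue:                  # round-robin scheduling
    job = ready_queue.pop(0)
    try:
        log.append(next(job))       # run the job for one time slice
        ready_queue.append(job)     # not finished: back of the queue
    except StopIteration:
        pass                        # job complete, drop it

print(log)  # the two jobs' steps come out interleaved
```

The interleaved log is the whole point: from each job's perspective it simply ran to completion, while the central loop decided who got the machine when.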
Fairchild Camera and Instrument Corp. built the first standard metal-oxide-semiconductor product for data processing applications, an eight-bit arithmetic unit and accumulator. The fundamental components of this semiconductor laid the groundwork for the invention of the microprocessor in 1971. Another company that took advantage of third generation advancements was IBM, with the unveiling of the IBM System/360. The company was making a transition from discrete transistors to integrated circuits, and its major source of revenue moved from punched-card equipment to electronic computer systems.
In 1969 AT&T Bell Laboratories programmers Kenneth Thompson and Dennis Ritchie developed the UNIX operating system on a spare DEC minicomputer. UNIX was the first modern operating system that provided a sound intermediary between software and hardware. UNIX provided the user with the means to allocate resources on the fly, rather than requiring the resources be allocated in the design stages. The UNIX operating system quickly secured a wide following, particularly among engineers and scientists at universities and other computer science organizations.
Fourth Generation Computers 1971-Present
After the invention of the integrated circuit, the next step in the computer design process was to reduce the overall size. Large scale integration (LSI) could fit hundreds of components onto one chip. By the 1980's, very large scale integration (VLSI) squeezed hundreds of thousands of components onto a chip. Ultra-large scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers. It also increased their power, efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minute chip. Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets, and automobiles with electronic fuel injection incorporated microprocessors (Gersting 35 - 39).
Such condensed power allowed everyday people to harness a computer's abilities; computers were no longer developed exclusively for large business or government contracts. By the mid-1970's, computer manufacturers sought to bring computers to general consumers. These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack and Apple Computer. In the early 1980's, arcade video games such as Pac-Man and home video game systems such as the Atari 2600 ignited consumer interest in more sophisticated, programmable home computers.
In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools. The 1980's saw an expansion in computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers to palmtop. In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984. Notable for its user-friendly design, the Macintosh offered an operating system that allowed users to move screen icons instead of typing instructions. Users controlled the screen cursor using a mouse, a device that mimicked the movement of one's hand on the computer screen.
As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software, and information, and to communicate with each other. Whereas a mainframe was a single powerful computer that shared its time among many terminals for many applications, networked computers allowed individual machines to pool their resources while working independently. Using either direct wiring, called a Local Area Network (LAN), or telephone lines, these networks could reach enormous proportions. The Internet, for example, is a global web of computer circuitry that links computers worldwide into a single network of information. During the 1992 U.S. presidential campaign, vice-presidential candidate Al Gore promised to make the development of this so-called "information superhighway" an administrative priority. The ideals Gore and others expressed are in use every day through email, web browsing, and e-commerce, and a new generation of computers is emerging with the use of wireless communications and wide area networking.
- Gersting, Judith L. The Computer: History, Workings, Uses & Limitations. New York: Ardsley House, 1988.
- Goldstine, Herman Heine. The Computer from Pascal to von Neumann. Princeton, NJ: Princeton University Press, 1972.
- Stern, Nancy B. From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers. Bedford, MA: Digital Press, 1981.