A Brief History of Personal Computers

The electronic computer is a relatively modern invention; the first fully operational electronic computer was developed about 50 years ago, at the end of World War II, by a team at the University of Pennsylvania's Moore School of Engineering. The team was headed by John Mauchly and J. Presper Eckert, who named the new machine ENIAC, for Electronic Numerical Integrator and Computer. ENIAC was hardly a personal computer: it occupied a large room and weighed about 30 tons. By today's standards, ENIAC was extremely slow, unreliable, and expensive to operate. In 1945, on the other hand, it was considered a marvel.
Machines like these carried prices far beyond most family incomes of the time. According to Frank of Columbia University, the IBM 610 Auto-Point Computer was developed by John Lentz, with the help of Byron Havens and Robert M. Walker, between 1948 and 1956 (Lentz). The manufacturing price was set at $55,000, with rental at $1,150 per month ($460 per month for academic institutions), and only about 180 units were ever made. With the cost of manufacture so high, few people could afford to buy one.
The abacus provided the fastest method of calculation until 1642, when the French scientist Blaise Pascal invented a calculator made of wheels and cogs. The concept of the modern computer was first outlined in 1833 by the British mathematician Charles Babbage. His design for the Analytical Engine contained all of the essential components of a modern computer: input devices, a memory, a control unit, and output devices. Most of the engine's operations were to be carried out by means of punched cards. Even though Babbage worked on the Analytical Engine for nearly 40 years, he never actually built a working machine.
ENIAC had plenty of drawbacks, though: first and foremost its size, and second the 18,000 vacuum tubes it took to run it. ENIAC and UNIVAC, which came shortly after, were among the greatest technological advances of their era, but they remained useless to the vast majority of people because of their size, cost, and construction time. The invention of the transistor in 1947 solved this problem for the most part, allowing computers to become smaller and more reliable. Still, the cost meant that only the largest private companies and governments could use the machines. By 1964 this had changed: International Business Machines, or IBM as we know it today, introduced the System/360 mainframe, a solid-state computer that could handle many types of data and allowed many conventional businesses to enter the computer age.
Before the development of written numerals, no true written calculations could be made, which makes the number system one of the most essential inventions on the road to computers. Around 830 AD the first algebra textbook was written by Mohammed Ibn Musa Abu Djefar, better known today as al-Khwarizmi. The subject of the textbook was "al-Jabr wa'l-Muqabala," which in today's terms is known as "algebra" (History of Computers). So what does all of this have to do with computers? Without numbers, computers wouldn't exist or have any reason to exist.
Babbage's design called for an automatic calculating machine that was steam powered and could store up to 1,000 50-digit numbers. Unlike its two earliest ancestors, the abacus and Pascal's calculator, Babbage's machine was meant to perform many different operations. It relied on cards with holes punched in them, called "punch cards," which carried out the programming and storage operations for the machine. Unfortunately, Babbage's creation failed because of a lack of mechanical precision and a lack of demand for the product (Soma, 46); the technology of the day was simply not adequate to make the machine operate efficiently. Interest in computers dwindled for many years, and it wasn't until the late 1800s that people became interested in them once again.
Currently, the quantum computer can only calculate elementary math and nothing more. We could use the idea of the qubit to build software that theoretically processes problems much the way a quantum computer would, but it would still be running on a classical machine. In conclusion, a quantum computer is a daunting feat of mechanical and engineering skill that could be built if we understood elementary particles more thoroughly. As the information age has arrived, our culture consumes data so quickly that we need an alternative to classical computing devices.
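To make the idea of classically simulating a qubit concrete, here is a minimal sketch (not from the original essay, and assuming the NumPy library): one qubit is represented as a vector of two complex amplitudes, and a Hadamard gate puts it into an equal superposition.

```python
import numpy as np

# A single qubit, simulated classically as a 2-entry complex vector:
# the amplitudes of the basis states |0> and |1>.
state = np.array([1.0 + 0j, 0.0 + 0j])   # start in |0>

# Hadamard gate: places the qubit in an equal superposition of |0> and |1>.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)
state = H @ state

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)   # [0.5 0.5] -- equal chance of reading 0 or 1
```

The catch is scale: simulating n qubits this way requires storing 2^n amplitudes, which is exactly why such software cannot replace a real quantum computer.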
Charles Babbage was so far ahead of his time that the machine tools of his era were not precise enough to make the parts for his computer. Gulliver states that the first major use for a computer in the US came during the 1890 census, when two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read the information on cards without human intervention (Gulliver 82). By the 1930s, punched-card machine techniques had become so well established that Howard Hathaway Aiken, together with engineers at IBM, built the automatic computer called the Mark I, which ran using prepunched paper tape.
But keep in mind that today's computers are literally a million times faster and more versatile than those of the early 1960s. When the first computer that could add, subtract, multiply, and divide came out, people thought we had reached the end of invention, that nothing more could be created. Today we might say, "Wow, it's just a common calculator," but in fact it was far weaker than the calculators we can buy at the local dollar store.
John Mauchly received his bachelor's, master's, and doctoral degrees in physics at Johns Hopkins University in Baltimore, Maryland. J. Presper Eckert met Mauchly while Eckert was a graduate student at the Moore School. It took Mauchly and Eckert one year to design the ENIAC and 18 months to build it. The ENIAC could hardly be considered "just a computer," given its massive size and speed. One ENIAC programmer described the machine as being "faster than thought," a claim with real substance: the ENIAC could perform 5,000 additions, 357 multiplications, or 38 divisions in one second.
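As a rough, back-of-the-envelope check of the "million times faster" claim above, the arithmetic works out (the modern figure here is an assumption for illustration, not from the essay):

```python
# Comparing ENIAC's quoted addition rate with an assumed modern rate:
# a processor running at a few GHz can sustain on the order of five
# billion integer additions per second.
eniac_adds_per_sec = 5_000            # ENIAC's rate, as quoted above
modern_adds_per_sec = 5_000_000_000   # assumed modern rate

print(modern_adds_per_sec / eniac_adds_per_sec)   # 1000000.0
```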