The work of George Boole on mathematical logic, which gave birth to Boolean algebra, was the most important contribution to the development of computer technology. We are going to see briefly how the prodigy George Boole rose from an economically modest family to change the way we relate to logic, influencing the concepts of computing and information processing. We are also going to analyse how his work in algebra revolutionised the way we think about and symbolise logical sets, making binary code and logical relations the fundamental tools in the evolution of computer systems and networks: from the first breakthrough of Boole's work until many years later, when science recognised that his mathematical set theories could in fact be the basis of circuit operation and of the storage and retrieval of information, the computer as we know it.
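To make that connection concrete, the sketch below (an illustration of ours, not drawn from Boole's own writings) renders his two-valued algebra as Python functions; the half-adder truth table it prints is exactly the logic a binary addition circuit realizes in hardware.

    # Illustrative sketch: Boole's two-valued algebra as Python functions.
    # Each "gate" takes binary values (0 or 1) and returns one, mirroring
    # how digital circuits implement Boolean operations.

    def AND(a, b):
        return a & b   # Boolean product: 1 only when both inputs are 1

    def OR(a, b):
        return a | b   # Boolean sum: 1 when at least one input is 1

    def NOT(a):
        return 1 - a   # complement: flips 0 to 1 and 1 to 0

    # More complex propositions are built from these primitives, e.g.
    # exclusive-or, the heart of binary addition:
    def XOR(a, b):
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    # Print the truth table that a half-adder circuit realizes in hardware.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  sum={XOR(a, b)}  carry={AND(a, b)}")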
George Boole was born in Lincoln, England, in 1815; his mother was a lady's maid and his father a shoemaker who was also an amateur scientist. Not exactly the childhood of a genius: at the age of 16 he was forced to abandon school to help support his family, but perhaps it was the interest his father had in scientific instruments that drew him to science. He went to work as a teacher's assistant and, self-educated, the prodigy taught himself Latin, Italian, Greek, German and French, eventually opening his own school in 1834. His interest in mathematics then began to develop, he educated himself in the subject, and he wrote his first mathematical paper in 1838. He founded a new branch of mathematics called invariant theory, which would later inspire Albert Einstein. He also worked on differential equations, which are still used, and that earned him a Gold Medal of the ...
... middle of paper ...
ABSTRACT: I examine some recent controversies involving the possibility of mechanical simulation of mathematical intuition. The first part is concerned with a presentation of the Lucas-Penrose position and recapitulates some basic logical conceptual machinery (Gödel's proof, Hilbert's Tenth Problem and Turing's Halting Problem). The second part is devoted to a presentation of the main outlines of Complexity Theory as well as to the introduction of Bremermann's notions of transcomputability and the fundamental limit. The third part attempts to draw a connection between Complexity Theory and undecidability, focusing on a new revised version of the Lucas-Penrose position in light of a priori physical limitations of computing machines. Finally, the last part derives some epistemological and philosophical implications of the relationship between Gödel's incompleteness theorem and Complexity Theory for the mind/brain problem in Artificial Intelligence and discusses the compatibility of functionalism with a materialist theory of the mind.
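As a gloss on the "basic logical conceptual machinery" the abstract mentions, here is a minimal sketch of the diagonal argument behind Turing's Halting Problem; the names `halts` and `diagonal` are hypothetical, since the whole point of the argument is that no real `halts` can exist.

    # Hedged sketch of the diagonal argument behind the Halting Problem.
    # `halts` is assumed (for contradiction) to decide whether `program`
    # halts when run on `data`; the argument shows no such decider exists.

    def halts(program, data):
        """Hypothetical total decider, assumed to exist for contradiction."""
        raise NotImplementedError("no such decider can exist")

    def diagonal(program):
        # Do the opposite of whatever `halts` predicts about self-application.
        if halts(program, program):
            while True:   # predicted to halt -> loop forever
                pass
        # predicted to loop forever -> halt immediately

    # diagonal(diagonal) would halt exactly when halts(diagonal, diagonal)
    # says it does not -- a contradiction, so `halts` cannot be implemented.
    # This is Turing's negative answer to the Entscheidungsproblem in code form.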
The study of philosophy has become a lost art. Philosophy does not receive credit for its achievements, even though so much other work stems from it. One important branch of philosophy is logic. Through logic, the world can see how questions are developed, why questions are asked, and how questions can be invalid. Many logicians have been formulating hypotheses centred on logic for years. Aristotle and George Boole are two logicians who are extremely well known for their work in the philosophical field and for their conflicting viewpoints on logic. In the textbook, A Concise Introduction to Logic
With the introduction of Gödel's paper in 1931, a whole new world of mathematics was opened for Turing. In 1935 Turing became aware that the question of decidability, or the Entscheidungsproblem, which asks whether there could exist a method or process by which it could be decided whether a given mathematical assertion was provable, was still open. He provided a negative answer by defining a definite method, an algorithm in today's terms. He analyzed the characteristics of a methodical process and how to perform that process, and expressed his findings in terms of a theoretical machine that would be able to perform operations on symbols on a paper tape. This correspondence between operations, the human mind, and a machine designed to embody a certain physical form was Turing's contribution (Huertas).
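A minimal sketch of such a machine follows, under the assumption that a rule table maps (state, symbol) pairs to a symbol to write, a head movement, and a next state; the rule set shown is a made-up example for illustration, not one of Turing's own.

    # Minimal sketch of the "theoretical machine" described above: a table of
    # (state, symbol) -> (write, move, next_state) rules applied to a tape.

    def run_turing_machine(rules, tape, state="start", pos=0, max_steps=100):
        cells = dict(enumerate(tape))          # sparse tape; blank is "_"
        for _ in range(max_steps):
            symbol = cells.get(pos, "_")
            if (state, symbol) not in rules:   # no applicable rule: halt
                break
            write, move, state = rules[(state, symbol)]
            cells[pos] = write                 # write a symbol on the tape
            pos += 1 if move == "R" else -1    # move the head left or right
        return "".join(cells[i] for i in sorted(cells))

    # Hypothetical example rule table: rewrite every 0 as 1, scanning right
    # until a blank is reached.
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("1", "R", "start"),
    }
    print(run_turing_machine(rules, "0101"))   # prints "1111"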
The subject of this term paper will be computers in the 1950's. The divisions that will be covered are: the types of computers there were, the memory capacity of computers, the programming languages of that time, and the uses of the computers of that era. Information will be gathered from the Internet, from books, from magazines, and from the encyclopedia.
This essay will consist of an exposition and criticism of the Verification Principle, as expounded by A.J. Ayer in his book Language, Truth and Logic. Ayer wrote this book in 1936, but also wrote a new introduction to the second edition ten years later; the latter amounted to a revision of his earlier theses on the principle. It is to both accounts that this essay shall refer.
Computers are a magnificent feat of technology. They have grown from simple calculators into machines with many functions and abilities. Computers have become so common that almost every home has at least one, and schools find them a good source of information and education for their students (Hafner, Katie, unknown). Computers have created new careers and eliminated others, and have left a huge impact on our society. The invention of the computer has greatly affected the arts, the business world, and society and history in many different areas, but to understand how great these changes are, it is necessary to take a look at the origins of the computer.
"Personal Computers." UXL Encyclopedia of U.S. History. Sonia Benson, Daniel E. Brannen, Jr., and Rebecca Valentine. Vol. 6. Detroit: UXL, 2009. 1222-1228. Student Resources in Context. Web. 25 Nov. 2013
Carl Friedrich Gauss was born April 30, 1777, in Brunswick, Germany, to a stern father and a loving mother. At a young age, his mother sensed how intelligent her son was and insisted on sending him to school even though his father displayed much resistance to the idea. The first test of Gauss' brilliance came at age ten in his arithmetic class, when the teacher asked the students to find the sum of all whole numbers from 1 to 100. In his mind, Gauss was able to see that 1+100=101, 2+99=101, and so on, deducing that all 50 such pairs of numbers would each sum to 101. By this logic, all Gauss had to do was multiply 50 by 101 to get his answer of 5,050. Gauss was bound to the mathematics field when, at the age of 14, he met the Duke of Brunswick. The duke was so astounded by Gauss' photographic memory that he financially supported him through his studies at Caroline College and other universities afterwards. A major feat that Gauss achieved while enrolled in college helped him decide that he wanted to focus on studying mathematics as opposed to languages. Besides his life of math, Gauss also had six children, three with Johanna Osthoff and three with his deceased first wife's best fri...
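Gauss' pairing trick is the standard derivation of the closed form for the sum of the first n whole numbers; in modern notation:

    \sum_{k=1}^{n} k = \frac{n(n+1)}{2},
    \qquad\text{so}\qquad
    \sum_{k=1}^{100} k = \frac{100 \cdot 101}{2} = 5050.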
The first computer, known as the abacus, was made of wood and parallel wires on which beads were strung. Arithmetic operations were performed by moving the beads along the wires according to "programming" rules that had to be memorized by the user (Soma, 14). The second earliest computer, invented by Blaise Pascal in 1642, was a "digital calculating machine." Pascal designed this first known digital computer to help his father, who was a tax collector. Pascal's computer could only add numbers, and they had to be entered by turning dials (Soma, 32). It required a manual process like its ancestor, the abacus. Automation was introduced in the early 1800's by a mathematics professor named Charles Babbage. He created an automatic calculating machine that was steam powered and stored up to 1,000 50-digit numbers. Unlike its two earliest ancestors, Babbage's invention was able to perform various operations. It relied on cards with holes punched in them, called "punch cards," which carried out the programming and storing operations for the machine. Unluckily, Babbage's creation flopped due to a lack of mechanical precision and a lack of demand for the product (Soma, 46): the technology of the day was not adequate to make the machine operate efficiently. Computer interest dwindled for many years, and it wasn't until the mid-1800's that people became interested in them once again.
The history of computers is an amazing story filled with interesting statistics. “The first computer was invented by a man named Konrad Zuse. He was a German construction engineer, and he used the machine mainly for mathematic calculations and repetition” (Bellis, Inventors of Modern Computer). The invention shocked the world; it inspired people to start the development of computers. Soon after,
The history of the computer dates back to ancient times. The first step towards the development of the computer, the abacus, was developed in Babylonia in 500 B.C. and functioned as a simple counting tool. It was not until thousands of years later that the first calculator was produced. In 1623, the first mechanical calculator was invented by Wilhelm Schickard; the "Calculating Clock," as it was often called, "performed its operations by wheels, which worked similar to a car's odometer" (Evolution, 1). Still, there had not yet been anything invented that could even be characterized as a computer. Finally, in 1625 the slide rule was created, becoming "the first analog computer of the modern ages" (Evolution, 1). One of the biggest breakthroughs came from Blaise Pascal in 1642, who invented a mechanical calculator whose main function was adding and subtracting numbers. Years later, Gottfried Leibniz improved on Pascal's model by allowing it to also perform such operations as multiplying, dividing, and taking square roots.
computer. The electronic computer has been around for over a half-century, but its ancestors have been around for 2000 years. However, only in the last 40 years has it changed American society. From the first wooden abacus to the latest high-speed microprocessor, the computer has changed nearly every aspect of people's lives for the