The History of the Computer
The computer dates back to ancient Babylon, circa 300 BC, when the Babylonians used the first computing device, the abacus, for simple counting. As human needs grew and societies modernized, the computer advanced dramatically, especially in power and speed. The computer's evolution is not the accomplishment of any one person; it is the result of contributions from numerous inventors over a long period of time.
The abacus served as the earliest counting machine, while papyrus helped people record language and numbers. Early computing devices could not work with great accuracy because the technology of the time limited them to wooden parts rather than metal and steel.
Computer development occurred in stages. Each generation of computing introduced a key technological improvement that fundamentally changed how computers worked. At each stage, computers became less expensive, more powerful, and more reliable, with smaller hardware components.
First Generation: 1940-1956 (Vacuum Tubes)
The first generation of computers used vacuum tubes for circuitry and magnetic drums for memory. These machines were gigantic, often occupying an entire room. They were uneconomical to operate: they consumed enormous amounts of electricity, generated a great deal of heat, and were prone to failure. A first-generation computer could not multitask, and it performed calculations using machine language, the lowest-level programming language a computer understands. Punched cards and paper tape served as input devices, and results were viewed on printouts.
Examples: UNIVAC (the earliest commercially used computer) and ENIAC
Second Generation: 1956-1963 (Transistors)
The second generation of computers used transistors, which were invented in 1947. Transistors were a great improvement over vacuum tubes, although they still generated a lot of heat. They allowed computers to become smaller, faster, and more economical, though the machines still relied on punched cards for input and printouts for output. Programmers moved from cryptic binary machine language to assembly language, which let them write instructions in words. Second-generation computers also moved from magnetic drum to magnetic-core memory and were the first machines to store their instructions in memory.
The atomic energy industry was the first to adopt these computers.
Third Generation: 1964-1971 (Integrated Circuits)
Integrated circuits (ICs) were the next major advance. Transistors were miniaturized and placed on silicon chips, known as semiconductors, which dramatically improved computers' speed and efficiency. Keyboards and monitors replaced punched cards and printouts as the input and output devices, and operating systems were introduced, allowing a machine to run multiple applications at once. Because of their wider availability and lower cost, third-generation computers reached a much larger audience of consumers.
Fourth Generation: 1971-Present (Microprocessors)
The microprocessor revolutionized the computer by combining thousands of integrated circuits on a single small silicon chip. Computers became much faster, more efficient, more reliable, and less expensive. The microprocessor was widely adopted and incorporated into every personal computer, where it became the central controller of all the machine's units, as well as of many other digital devices. Many consumers began using personal computers because of their small size, speed, efficiency, and low cost.