Always Adapting: The History of Computers
Modern life is defined by the computer. Computers are in the phones, televisions, automobiles, and even the washing machines we use every day. They make life easier and many tasks faster. In fact, it's hard to imagine life without the speedy communication and instant access to information we enjoy today. An estimated 76% of all Americans have at least one personal computer. It wasn't always like this, though. While people have always searched for quicker and easier ways to store data and perform computations, computers are largely a 20th-century phenomenon.
The First Computer
Most historians agree that the true genesis of the modern computer began with a man named Charles Babbage. Babbage, who died in 1871, designed a steam-powered engine that could perform complex computations, like those used in calculus. Helping him was Augusta Ada Lovelace, daughter of the poet Lord Byron. She devised the system by which Babbage's machine could receive information, make its computations, and store the results, all using perforated cards. Though the device never got off the drawing board, Babbage is considered the father of the computer, and Ada Lovelace the first computer programmer. She even has a programming language, Ada, named after her.
These early designs made little impact outside of straightforward computation. In 1890, however, a punch card machine was used for the first time to automate the American census. Then, in the 1930s, a man by the name of Alan Turing designed a programmable computing device that read its program as ones and zeros off a special tape. This binary code provided not only the input but also the instructions for what to do with that input. Turing also helped create the machine that decrypted Enigma messages during World War II.
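The key idea, that the same ones and zeros can serve as both data and instructions, can be made concrete with a toy sketch. This is a hypothetical machine with made-up opcodes, not Turing's actual design; it simply shows that how bits are interpreted depends on how the machine reads them:

```python
# A toy illustration: the same group of bits can be read either as a
# number or as an instruction, depending on context.

TAPE = "0001 0010 0011"  # three 4-bit groups, as if read off a tape

# Hypothetical instruction table for this toy machine
OPCODES = {0b0001: "LOAD", 0b0010: "ADD", 0b0011: "STORE"}

for group in TAPE.split():
    value = int(group, 2)          # the bits interpreted as a number
    op = OPCODES.get(value, "?")   # the same bits interpreted as an instruction
    print(f"{group} -> number {value}, instruction {op}")
```

Real stored-program machines work on the same principle: a word fetched from the tape (or, later, from memory) means a number in one context and an operation in another.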
At the same time as Turing was creating his machines, the first of the first-generation computers were being assembled. What set these machines apart from previous devices was the use of vacuum tubes. They also used magnetic drums to store data. The operating systems had to be custom made to perform the task for which the machine was designed, which meant these large computers could only perform one job, unlike the multitasking devices in use today.
The first major computer in this mold was the ENIAC, or Electronic Numerical Integrator and Computer. Completed in 1946, it filled an entire room measuring 30 feet by 50 feet and used about 18,000 vacuum tubes. Programming the device meant plugging wires into particular sockets and was a very difficult job to learn. The first generation of the computer era lasted from 1939 until 1956.
Computers Through the Years
The second generation of computers (1956-1963) used transistors instead of vacuum tubes. Though the transistor was invented in 1947, it wasn't until the 1950s that it came into wide use in computers, as well as televisions and radios. Transistors were much smaller and more efficient than vacuum tubes, making this advance the first of many that would lead to smaller and smaller computers. There were also advances in the magnetic storage capabilities of these machines. Still, the transistorized machines of this era were large supercomputers, too big and too expensive even for most large businesses to use.
The third generation of computers, usually dated from 1964 to 1971, began with the use of the integrated circuit. Invented in 1958, the integrated circuit took a few years to be incorporated into computers. Though transistors were a definite advance over vacuum tubes, they still put out a great deal of heat and had other limitations. With the integrated circuit, pioneered at Texas Instruments, engineers could fit far more computing power on a much smaller platform while producing much less heat. At the same time, operating systems were developed that allowed computers, for the first time, to run several programs at once.
The fourth generation of computers, which historians say began in 1971 and continues to the present, was marked by an emphasis on making computers smaller. In the movie Apollo 13, Tom Hanks' character gives a tour of NASA facilities and mentions all the great advancements being made; one day, he says, they might even have a computer that fits in just one room. While that sounds laughable to modern ears, it was a real aspiration in the early 1970s. Computers were still too large and too expensive for most anyone to afford.
Some of the first automotive computers were also being developed during this time. Today, computers communicate with cars through what is called the CAN bus. Since few computers come with a CAN interface built in, USB-to-CAN adapters exist to aid with car communication.
In 1971, the Intel 4004 chip changed all that. For the first time, an entire central processing unit fit on a single chip, the microprocessor, which worked together with memory and input/output controls on a handful of companion components. This achievement was the result of large-scale integration, one of the hallmarks of fourth-generation computers. It also allowed computers to be made for general-purpose use rather than designed for one particular job, and software was written so that anyone, even those without computer training, could use a computer.
This allowed computers to be manufactured not only for business use but for home use as well before the end of the 1970s. Tandy, Commodore, and Apple made some of the first of these affordable computers. In 1981, IBM popularized the term "PC" with the release of its "personal computer." Since then, the rate of advancement has been astonishing. Engineers have not only found ways to get more and more processing power onto smaller and smaller chips, but have also developed ways to store more and more data in smaller and more convenient forms, moving from storage tape and floppy disks the size of dinner plates to thumb drives and minidiscs.

The development of the internet has also transformed the computer from a mere word processor into a communication and information hub. The internet started out slow, requiring a dial-up modem that offered speeds of up to 56 Kbps (kilobits per second), enough to share basic files and information. Demand for richer content drove speeds upward, and high-speed internet is now widely available, with fewer and fewer devices that cannot connect. A side effect of this progress is that the computer has become not only affordable but embedded in many devices besides the one on the desk, like microwaves and televisions.
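To put the 56 Kbps dial-up figure in perspective, a quick back-of-the-envelope calculation (assuming an ideal link with no protocol overhead) shows how long a one-megabyte file would have taken to download:

```python
# How long does 1 MB take at dial-up speed?
# Assumes a perfect 56 Kbps connection with no overhead.

kilobits_per_second = 56
file_size_bytes = 1_000_000                      # 1 megabyte
file_size_kilobits = file_size_bytes * 8 / 1000  # bytes -> kilobits

seconds = file_size_kilobits / kilobits_per_second
print(f"{seconds:.0f} seconds (~{seconds / 60:.1f} minutes)")
```

At that rate, a single megabyte took well over two minutes, which is why early web pages were mostly text.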
Many of the old computers of this era used signal protocols such as GPIB and serial. Most of these interfaces fell by the wayside as new computers were made, but a great deal of industrial and control equipment still uses them. That is why you will find devices such as the USB-to-GPIB controller and the USB-to-serial adapter, which allow modern computers to communicate with older machinery.
Most computer scientists believe that the fifth generation of computers is beginning. Some say it began with the parallel processors now in common use. Other fifth-generation characteristics are voice recognition and complex networks of computers. The ultimate goal is artificial intelligence, in which computers not only perform a multitude of tasks simultaneously but interact with humans using natural human language rather than computer languages. These devices would also be able to self-organize and would be capable of learning, allowing computers to solve problems in a creative way that is not presently possible. Nanotechnology is another area that computer scientists hope to master in the next several years. Whatever the future holds for computers, they are sure to go faster, do more, and continue to become less expensive.
For now, we can have plenty of fun with what computers already offer: watching movies, playing games, and surfing the internet. There are many accessories that can help you get the most out of your computer, such as the USB sound card, which lets you add a high-performance sound system through a USB port.