The Evolution of Technology – The History of Computers
While computers are now an important part of the lives of human beings, there was a time when computers didn't exist. Knowing the history of computers and how much progress has been made can help you understand exactly how complicated and innovative the creation of computers really is.
Unlike most devices, the computer is one of the few inventions that doesn't have one specific inventor. Throughout the evolution of the computer, many people have contributed to its development. Some of those contributions were complete computers in their own right; others were components that had to exist before computers could be developed further.
Perhaps the most crucial date in the history of computers is the year 1936. It was in this year that the first "computer" was developed. It was created by Konrad Zuse and dubbed the Z1 Computer. This computer stands as the first because it was the first system to be fully programmable. There were calculating devices prior to this, but none had the programmability that set the Z1 apart from other machines.
The next major milestone came in 1942, with the completion of the Atanasoff-Berry Computer, or ABC, built by John Atanasoff and Clifford Berry at Iowa State College. Two years later, in 1944, the Harvard Mark I computer was completed, furthering the science of computing.
Over the course of the next few years, inventors all over the world started to delve deeper into the study of computers and how to improve upon them. The next ten years saw the introduction of the transistor, which would eventually become a very important part of the internal workings of the computer, as well as the ENIAC computer and several other systems. The ENIAC is perhaps the most interesting of these, as it required nearly 18,000 vacuum tubes to operate. It was a gigantic machine, and it started the push to build smaller and faster computers.
The age of computers was forever altered by the entry of International Business Machines, or IBM, into the computing industry in 1953. Over the decades that followed, IBM became a significant player in the creation of new systems and servers for public and private use. Its arrival brought about the first real signs of competition within the industry, which helped to spur faster and better development of computers. IBM's first contribution was the IBM 701 EDPM Computer.
A Programming Language Evolves
A year later, the first successful high-level programming language was created. Unlike assembly language or binary machine code, which are considered very low-level, a high-level language lets programmers express instructions in a form closer to ordinary mathematical notation. FORTRAN was written so that more people could begin to program computers easily.
In 1955, the Bank of America, working with the Stanford Research Institute and General Electric, initiated the creation of the first computers for use in banks. MICR, or Magnetic Ink Character Recognition, coupled with the actual computer, ERMA, was a breakthrough for the banking sector. It was not until 1959, however, that the pair of systems was put into use in actual banks.
In 1958, one of the most important breakthroughs in computer history occurred: the creation of the integrated circuit, demonstrated by Jack Kilby at Texas Instruments. This device, also known as the chip, is one of the basic building blocks of modern computer systems. Every motherboard and card within a computer system carries many chips that control what those boards and cards do. Without these chips, the systems as we know them today could not function.
Gaming, Mice, & the Internet
For many computer users now, games are a crucial part of the computing experience. 1962 saw the introduction of the first computer game, Spacewar!, created by Steve Russell at MIT.
The mouse, one of the simplest components of modern computers, was created in 1964 by Douglas Engelbart. It got its name from the cord "tail" leading out of the back of the apparatus.
One of the most important aspects of computing today was invented in 1969. The ARPANET was the original Internet, and it provided the foundation for the Internet we know today. Its creation would lead to the spread of knowledge and business across the entire planet.
It was not until 1970 that Intel entered the scene with the first dynamic RAM chip, which resulted in an explosion of computer science innovation.
On the heels of the RAM chip came the first microprocessor, the Intel 4004, released by Intel in 1971. Both of these components, in addition to the integrated circuit developed in 1958, would number among the core components of modern computers.
A year later, the floppy disk was created, gaining its name from the flexibility of the storage medium. This was the first step toward allowing ordinary people to transfer data between computers that were not connected to each other.
The first networking card was created in 1973, allowing data transfer between connected computers. This is similar in concept to the Internet, but it allows computers to connect over a local network without using the Internet.