The History of Computers
By: Simon Goodman

The early computers

The history of computers dates back far longer than the 1900s; in fact, computing devices have been around for over 5,000 years.

In ancient times a "computer" (or "computor") was a person who performed numerical calculations under the direction of a mathematician.

Some of the better-known early devices were the abacus and the Antikythera mechanism.

Around 1725, Basile Bouchon used perforated paper in a loom to set the pattern to be reproduced on cloth. This ensured that the pattern was always the same and greatly reduced human error.

Later, in 1801, Joseph Jacquard (1752-1834) used the punched-card idea to automate more devices, with great success.

The first computers?

Charles Babbage (1791-1871) was ahead of his time. Using the punched-card idea, he developed the first computing devices intended for scientific purposes. He invented the Difference Engine, which he began in 1823 but never completed. Later he started work on the Analytical Engine, which was designed in 1842.

Babbage is also credited with inventing computing concepts such as conditional branches, iterative loops and index variables.
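In modern terms, those three concepts are the heart of nearly every program. Here is a minimal sketch in Python (obviously not Babbage's own notation) showing all three at once while finding the largest value in a list:

```python
# Conditional branch, iterative loop, and index variable --
# the concepts credited to Babbage, in modern Python.
values = [3, 8, 2, 5]
largest = values[0]
for i in range(1, len(values)):  # iterative loop with index variable i
    if values[i] > largest:      # conditional branch
        largest = values[i]
print(largest)  # 8
```

Every mainstream language since has offered some form of these three building blocks.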

Ada Lovelace (1815-1852) was a colleague of Babbage and is regarded as a founder of scientific computing.

Many people improved on Babbage's inventions. George Scheutz, along with his son Edvard Scheutz, began work on a smaller version, and by 1853 they had constructed a machine that could process 15-digit numbers and calculate fourth-order differences.

One of the first notable commercial uses (and successes) of computing machinery was at the US Census Bureau, which used punched-card equipment designed by Herman Hollerith to tabulate data for the 1890 census.

To compensate for the cyclical nature of the Census Bureau's demand for his machines, Hollerith founded the Tabulating Machine Company (1896), one of the companies that merged in 1911 to form what later became IBM.

Later, Claude Shannon (1916-2001) first suggested the use of digital electronics in computers, and in 1937 J. V. Atanasoff built the first electronic computer, one that could solve 29 simultaneous equations with 29 unknowns. This device, however, was not programmable.

During those troubled times computers evolved at a rapid rate, but because of wartime restrictions many projects remained secret until much later. A notable example is the British military "Colossus", developed in 1943 at Bletchley Park by Tommy Flowers and his team.

During World War II the US Army commissioned John W. Mauchly to develop a device to compute ballistics tables. As it turned out, the machine was only ready in 1945, but the Electronic Numerical Integrator and Computer, or ENIAC, proved to be a turning point in computer history.

ENIAC proved to be a very efficient machine, but not a very easy one to operate. Any change would sometimes require the device itself to be rewired and re-programmed. The engineers were all too aware of this obvious problem, and so they developed "stored-program architecture".

John von Neumann (a consultant to the ENIAC project), together with Mauchly and his team, developed EDVAC; this new design used a stored program.
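The stored-program idea can be sketched as a toy machine whose instructions sit in the same memory as its data, so changing its behaviour means changing memory contents rather than rewiring hardware. The instruction set below is invented purely for illustration; it is not EDVAC's actual one.

```python
# Toy stored-program machine: program and data share one memory.
memory = [("LOAD", 100), ("ADD", 101), ("STORE", 102), ("HALT", 0)]
memory += [0] * 99            # pad so data lives at addresses 100-102
memory[100], memory[101] = 2, 3   # the data: two numbers to add

acc, pc = 0, 0                # accumulator and program counter
while True:
    op, addr = memory[pc]     # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break
print(memory[102])  # 5
```

To make this machine do something different, you would simply store different instructions in memory, which is exactly the flexibility ENIAC's engineers were after.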

Eckert and Mauchly later developed what was arguably the first commercially successful computer, the UNIVAC.

Software technology during this period was very primitive. The first programs were written out in machine code. By the 1950s programmers were using a symbolic notation, known as assembly language, then hand-translating the symbolic notation into machine code. Later programs known as assemblers performed the translation task.
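The assembler's translation task can be sketched in a few lines. The mnemonics and opcode numbers below are invented for illustration; every real machine of the era had its own encoding.

```python
# A toy "assembler": translates symbolic mnemonics into numeric
# machine code, the job early programmers once did by hand.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0x04}

def assemble(program):
    """Translate lines like 'ADD 7' into (opcode, operand) pairs."""
    machine_code = []
    for line in program:
        parts = line.split()
        opcode = OPCODES[parts[0]]
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((opcode, operand))
    return machine_code

print(assemble(["LOAD 10", "ADD 7", "STORE 12", "HALT"]))
# [(1, 10), (2, 7), (3, 12), (4, 0)]
```

Automating this lookup-and-encode step was exactly what the first assemblers did, freeing programmers from hand-translating every instruction.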

The transistor era: the end of the inventor.

The late 1950s saw the end of valve-driven (vacuum-tube) computers. Transistor-based computers took over because they were smaller, cheaper, faster and far more reliable.

Corporations, rather than inventors, were now producing the new computers.

Some of the better-known ones are:

TRADIC, at Bell Laboratories, in 1954
TX-0, at MIT's Lincoln Laboratory
the IBM 704 and its successors, the 709 and 7094; the latter introduced I/O processors for better throughput between I/O devices and main memory
the first supercomputers, the Livermore Atomic Research Computer (LARC) and the IBM 7030 (aka Stretch)
the Texas Instruments Advanced Scientific Computer (TI-ASC)

Now the basis of the modern computer was in place: with transistors, computers were faster, and with stored-program architecture you could use a computer for almost anything.

New high-level programming languages soon arrived: FORTRAN (1956), ALGOL (1958), and COBOL (1959). Cambridge and the University of London cooperated in the development of CPL (Combined Programming Language, 1963), and Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language, 1967).

In 1969 the CDC 7600 was released; it could perform 10 million floating-point operations per second (10 Mflops).
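To put that figure in perspective, here is a rough, machine-dependent sketch that times a loop of floating-point multiplications and estimates its rate. Interpreted Python carries heavy loop overhead, so this measures far below the hardware's peak; it is illustrative only.

```python
# Rough sketch of what "Mflops" means: time N floating-point
# multiplications and estimate operations per second.
import time

N = 1_000_000
x = 1.0
start = time.perf_counter()
for _ in range(N):
    x = x * 1.0000001          # one floating-point multiply per pass
elapsed = time.perf_counter() - start
mflops = N / elapsed / 1e6
print(f"~{mflops:.1f} Mflops (interpreted Python, loop overhead included)")
```

Even measured this crudely, a modern laptop comfortably exceeds the CDC 7600's 1969 figure.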

The network years.

From 1985 onward the race was on to pack as many transistors as possible onto a single chip. Each transistor can perform only a simple operation, but apart from being faster and able to perform more operations, the basic design of the computer has not changed much.

The concept of parallel processing has been more widely used since the 1990s.
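A minimal illustration of the parallel-processing idea, using Python's standard multiprocessing module to spread independent calculations across several worker processes:

```python
# Parallel processing sketch: independent pieces of work are
# divided among worker processes instead of run one after another.
from multiprocessing import Pool

def square(n):
    """An independent unit of work; each call can run on its own core."""
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:            # four worker processes
        results = pool.map(square, range(8))   # work divided among them
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The speedup comes only when the pieces of work are genuinely independent, which is why parallel machines suit some problems far better than others.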

In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace.

Get a more detailed history of computers.

Ever wanted to learn more about your computer? http://www.myoddpc.com gives you information ranging from the history of computers to which computer memory to get, and covers computer software as well as everything you need to know about computer hardware, all in simple terms for the non-technical among us.





