
History of computers


The history of computers, and the history of computers in education

500 BC

Evidence of the abacus, the world’s first calculating machine, exists from as far back as 2,500 years ago in the Tigris-Euphrates Valley. The earliest form of the abacus is a stone or clay tablet that uses pebbles for counting. Grooves are carved into the tablet and pebbles, representing numbers, are placed in these grooves. The pebbles can slide along the grooves from one side of the tablet to the other, thus allowing for easier counting. The abacus also helps ancient peoples perform simple calculations such as addition and subtraction.
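
The groove-and-pebble scheme is, in effect, positional arithmetic. As a rough illustration only (a toy model of our own, not a historical reconstruction), each groove can be treated as a decimal place and the pebbles in it as that place's digit:

    # Toy model of the groove-and-pebble abacus described above (illustrative
    # only): each groove is a decimal place; the pebble count is that digit.

    def to_grooves(n, places=4):
        """Represent n as pebble counts, least significant groove first."""
        return [(n // 10**i) % 10 for i in range(places)]

    def add_on_abacus(x, y):
        """Add by combining pebbles groove by groove, carrying whenever a
        groove holds more than nine pebbles, as an operator would regroup."""
        grooves = [a + b for a, b in zip(to_grooves(x), to_grooves(y))]
        for i in range(len(grooves) - 1):
            grooves[i + 1] += grooves[i] // 10   # carry the excess pebbles
            grooves[i] %= 10
        return sum(d * 10**i for i, d in enumerate(grooves))

    print(add_on_abacus(276, 348))  # prints 624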

1300 AD

In 13th-century China, beads or drilled-out pebbles are threaded onto strings or wires attached to a wooden frame. This becomes the basis for the modern-day abacus.

Early 1600s

John Napier, the Scottish inventor of logarithms and the decimal point, invents a hand-held device to help with multiplication and division. His device is known as Napier’s Rods or Napier’s Bones.

1622

William Oughtred of England invents the slide rule. Unlike later slide rules, his is circular in shape.

1623

Wilhelm Schickard of Germany makes a calculating machine called the Calculating Clock. It is capable of adding and subtracting numbers of up to six digits. The machine and its plans are lost, rediscovered in 1935, lost again, and found once more in 1956. In 1960, Schickard's machine is reconstructed and shown to have worked.

1642

Blaise Pascal of France builds the Pascaline, a mechanical calculator that can add figures. Up to eight digits can be entered into the machine by turning dials. Pascal builds and sells about a dozen of these machines, and some of them still exist today.

1668

Sir Samuel Morland of England produces a non-decimal adding machine suitable for use with English money.

1673

Gottfried Wilhelm Leibniz of Germany, the co-inventor of calculus, designs the Stepped Reckoner, a machine that can multiply numbers of up to 12 digits. It is also capable of dividing and finding square roots as well as adding and subtracting. The machine lies forgotten in an attic until 1879.

1775

Charles, the third Earl Stanhope, of England creates a successful multiplying calculator.

 

1801

Joseph-Marie Jacquard develops an automatic loom controlled by punched cards.

1820

Charles Xavier Thomas de Colmar of France develops the Arithmometer, the first mass-produced calculator that can successfully add, subtract, multiply and divide numbers.

1822

Charles Babbage of England designs the first mechanical computer.

1832

Babbage produces a prototype for the first automatic mechanical calculator. The function of his Difference Engine is to calculate and print mathematical tables. Only one-seventh of the engine is ever assembled by Babbage’s engineer, Joseph Clement. Most of the 12,000 manufactured parts are later melted for scrap.

1833

Babbage begins designing the Analytical Engine. This machine has a storage unit (the "store"), a computing unit (the "mill"), and input and output units.

1853

To produce a set of astronomical tables, the Dudley Observatory in Albany, New York buys the first tabulating machine built by Swedes George Scheutz and his son Edvard. Their machine is based on Babbage’s design.

1854

British mathematician George Boole devises Boolean algebra, the algebra of binary logic. His work is the basis for binary switching, upon which modern computing depends.
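
As a minimal sketch (our own illustration, assuming nothing beyond Boole's basic operations), the three Boolean operations map directly onto switching circuits: AND behaves like switches in series, OR like switches in parallel, and NOT like an inverter.

    # Boole's three basic operations expressed as binary switching
    # (an illustrative sketch; the names AND, OR, NOT are ours).

    def AND(a, b):
        return a & b   # series switches: current flows only if both are closed

    def OR(a, b):
        return a | b   # parallel switches: current flows if either is closed

    def NOT(a):
        return 1 - a   # inverter: open becomes closed, and vice versa

    # Truth table for the expression (a AND b) OR (NOT a):
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, OR(AND(a, b), NOT(a)))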

1878

Ramon Verea, a Spaniard living in New York, invents a calculator with an internal multiplication table.

1886

Herman Hollerith of the United States Census Bureau develops a mechanical device that uses punched cards to compile and tabulate data. Dorr E. Felt of Chicago constructs the first calculator that enters numbers by pressing keys instead of turning dials. His calculator is called the Comptometer.

1889

Felt invents the first printing desk calculator.

1896

Hollerith establishes the Tabulating Machine Company, which eventually becomes the International Business Machines (IBM) Corporation.

 

1931

C. E. Wynn-Williams uses thyratron tubes to construct a binary digital counter for use in physics experiments at Cambridge University.

1935

IBM introduces the IBM 601, a punch card machine with an arithmetic unit based on relays that is capable of doing a multiplication per second.

1937

George Stibitz of Bell Telephone Laboratories constructs a one-bit binary adder using relays. This is one of the first binary computers. Alan Turing publishes his description of the Universal Machine, a theoretical model of general-purpose computation.
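
A one-bit binary adder takes two input bits (plus, in a full adder, a carry bit) and produces a sum bit and a carry-out bit. The sketch below mimics that behaviour in Python; the function name is ours, and the relay wiring is abstracted away entirely:

    # What a one-bit (full) binary adder computes, ignoring the relay hardware.

    def one_bit_adder(a, b, carry_in=0):
        """Add two bits plus an optional carry; return (sum_bit, carry_out)."""
        total = a + b + carry_in
        return total % 2, total // 2

    # Exercise every two-bit input combination, as the relay circuit would:
    for a in (0, 1):
        for b in (0, 1):
            s, c = one_bit_adder(a, b)
            print(f"{a} + {b} -> sum {s}, carry {c}")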

1939

John Vincent Atanasoff and Clifford Berry of Iowa State College complete a prototype 16-bit adder. This is the first machine to calculate using vacuum tubes. Bell Telephone Laboratories develops the Complex Number Calculator.

1941

Atanasoff and Berry complete a special-purpose calculator for solving systems of simultaneous linear equations, later called the Atanasoff-Berry Computer (ABC). This computer has 60 fifty-bit words of memory in the form of capacitors mounted on two revolving drums. Its secondary memory stores numbers by burning holes into punched cards.

1943

Thomas Flowers of England builds the earliest programmable electronic computer, which contains 2,400 vacuum tubes. Called the Colossus Mark I, it processes 5,000 characters per second and uses punched tape for its input. This computer is developed to break messages enciphered by the German Lorenz cipher machine used for high-level military communications.

1945

A moth is found trapped in a computer relay and taped into the machine's logbook, popularizing the terms "bug" and "debugging".

1946

The first vacuum tube-based computers are developed at the University of Pennsylvania. The Electronic Numerical Integrator and Computer (ENIAC) has 18,000 vacuum tubes and takes up 1,800 square feet of space. It is considered the first "true computer" (i.e., the first fully electronic, general-purpose digital computer).

1947

The transistor is invented by William B. Shockley, John Bardeen and Walter H. Brattain at the Bell Telephone Laboratories in the United States.

1951

The baby boom causes an increase in classroom sizes, but little electronic technology is used in schools. The first-generation Universal Automatic Computer (UNIVAC) is delivered to the United States Census Bureau. Whirlwind, the first real-time computer, is built for the U.S. Air Defense System.

1952

UNIVAC is used to predict the 1952 United States presidential election. No one believes its prediction, based on 1% of the vote, that Eisenhower will sweep the election. He does.

 

1955

IBM sells its first commercial computer.

1957

The first transistorized computer, the Transistorized Experimental Computer (TX-0), is completed at the Massachusetts Institute of Technology.

1958

Mainframe host computers are not widely accepted in schools, which are still using the single classroom, teacher-as-manager method of delivering information to students. Jack St. Clair Kilby invents the integrated circuit at Texas Instruments. The late 1950s see the development of two important programming languages: Common Business Oriented Language (COBOL) and List Processor (LISP).

1959

Transistor-based computers come into use. Smaller computers based on transistors and printed circuits are built between 1959 and 1964. These are regarded as "second generation" computers.

1960

Dr. Grace Murray Hopper, a mathematician whose FLOW-MATIC language strongly influences its design, helps complete the COBOL language.

1962

Airlines begin to use a computerized reservation system.

1963

The U.S. Vocational Education Act provides new money to support technology in schools. However, the mainframes and minicomputers use batch-processing methods that do not fit well with the single teacher-as-manager-of-learning method used in most schools. Beginner's All-purpose Symbolic Instruction Code (BASIC), a simple high-level programming language, is developed and used mostly in universities to train programmers. The IBM 360 family of computers is developed. Most computers still use host methods with punch cards as the primary input device. Line printers are still the primary output device. One of the first minicomputers, the PDP-5, is built by Digital Equipment Corporation.

1964

Computers built between 1964 and 1971 are regarded as "third generation" computers; they are based on the first integrated circuits, which enable machines to be made even smaller. IBM releases the PL/1 programming language and launches the IBM 360. The computer mouse and "windows" are invented this year, as is the programming language BASIC. Gordon Moore (later CEO of Intel from 1979 to 1987) predicts that the number of transistors the industry will be able to place on a computer chip will double each year. This prediction later becomes known as Moore's Law.
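
To see what yearly doubling implies, the short sketch below compounds Moore's original prediction over a decade; the starting count is chosen purely for illustration and is not a historical figure.

    # Compounding Moore's original one-year doubling prediction
    # (illustrative starting count, not actual transistor data).

    transistors = 64
    for year in range(1964, 1975):
        print(year, transistors)
        transistors *= 2   # double each year under the original prediction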

1965

Mainframes and minicomputers are put into place in some schools. These are used mainly for administrative purposes and school counselling. The first supercomputer, the Control Data Corporation CDC 6600, is developed.

1967

High-level programming languages such as Formula Translation (FORTRAN) are being taught in universities. School vocational training programs begin to include computer maintenance.

1968

The Intel Corporation is founded.

1969

The United States Department of Defense starts the Advanced Research Projects Agency network (ARPAnet) for research into networking. It is the basis for what is now the Internet. The original Net is a small network of computers used to promote the sharing of research among universities. The first hosts are connected in 1969.

1970

Mainframes and minicomputers are used in some schools, but not extensively in the delivery of instruction. Intel introduces the first commercially available dynamic random-access memory (RAM) chip, the 1103, with a capacity of one kilobit (1,024 bits).

1971

Intel's first microprocessor, the 4004, is developed. It is capable of approximately 60,000 instructions per second (0.06 million instructions per second, or MIPS), running at a clock rate of 108 kHz. The first microcomputers (PCs) are developed. Mainframes and minicomputers are in wide use in business. A few new software companies develop mainframe- and minicomputer-based instructional programs. The floppy disk is invented by IBM engineers led by Alan Shugart. The development of the programming language Pascal (named after Blaise Pascal) is completed.

1972

Computers built after 1972, referred to as "fourth generation" computers, are based on large-scale integration (LSI) circuits (i.e., microprocessors) with 500 or more components on a chip. The C programming language is developed; later developments include C++. The first hand-held scientific calculator (the HP-35), made by Hewlett-Packard, makes slide rules obsolete. Intel releases the 8008 processor. Canada's Automatic Electronic Systems introduces the world's first programmable word processor with a video screen. This computer, named the AES 90, uses magnetic disks for storage and custom-built microprocessors.

1974

The CLIP-4, the first computer with a parallel architecture, is developed. The MITS Altair 8800, the Scelbi and the Mark-8 are introduced, sold mainly in kit form; these are considered the first personal computers. The Apple I computer follows shortly after, also sold in kit form. Telenet, the first commercial version of the ARPAnet, opens.

 

1975

Some Apple I PCs are donated to schools. Some schools adopt mainframes and minicomputers, but most refuse to consider PCs. Bill Gates and Paul Allen write a BASIC interpreter for the Altair 8800, their first product. IBM introduces the first laser printer. Micro-Soft is formed by Bill Gates and Paul Allen. (The hyphen in "Micro-Soft" is dropped later on.)

1977

The Apple II is released.

1979

Fifteen million PCs are estimated to be in use worldwide. The first PC-based spreadsheet programs, notably VisiCalc, are developed. Mainframes and minicomputers are still in wide use. The release of the arcade video game Space Invaders starts the video game craze. Honeywell introduces the programming language Ada, named after Augusta Ada Byron, one of the first computer scientists in history and, surprisingly, the daughter of Lord Byron, the famous Romantic poet.

1980

The TI-99 from Texas Instruments, which uses a television screen as its monitor, becomes one of the world's most popular PCs. Development of MS-DOS/PC-DOS begins. The Sinclair ZX80 is released.

1981

IBM introduces its Personal Computer (PC), becoming the first mainframe manufacturer to develop one. The first educational drill-and-practice programs are developed for personal computers. The Xerox Star, the first commercial Windows, Icons, Menus and Pointing devices (WIMP) system, is developed. ARPAnet has 213 hosts and is growing rapidly. Microsoft introduces MS-DOS version 1.0.

1982

The TCP/IP protocol is established, and the term "Internet" is used for the first time to describe the connected set of networks using TCP/IP. The Commodore 64 is released. Compaq releases its IBM PC-compatible Compaq Portable. IBM launches double-sided 320K floppy disk drives.

1983

IBM PC clones flourish, and the Sperry Corporation becomes the second mainframe manufacturer to develop an IBM PC-compatible computer (developed by Mitsubishi in Japan). The Apple II finds widespread acceptance in education because it fits the teacher-as-manager model of instruction. Simple simulation programs are developed for personal computers. IBM releases the PCjr.

1984

There are still relatively few computers in the classroom. The Apple Macintosh computer is developed and released. Commercial software manufacturers develop computer-based tutorials and learning games. The domain name system (DNS) is introduced to the Internet, which consists of about 1,000 hosts. William Gibson coins the term "cyberspace" in his novel Neuromancer.

 

1985

Microsoft Windows is launched. Lotus, Intel and Microsoft jointly introduce the Expanded Memory Specification (LIM EMS) standard.

1987

The number of Internet hosts exceeds 10,000.

1988

Laptops are developed. The first optical chip is developed. Write Once Read Many times (WORM) disks are marketed for the first time by IBM.

1989

Tim Berners-Lee, seeing the need for a global information exchange, proposes what will become the World Wide Web. The Sound Blaster card is released.

1990

Multimedia PCs are developed. Schools are using videodiscs. Object-oriented multimedia authoring tools are in wide use. Simulations, educational databases and other types of computer-assisted instruction programs are delivered on CD-ROM discs, many of them with animation and sound. ARPAnet is decommissioned; the number of Internet hosts has passed 300,000.

1991

Linus Torvalds of Finland develops Linux, a variant of the UNIX operating system. Intel’s monopoly in the Windows PC world is challenged when AMD releases its Am386 microprocessor.

1992

Schools are using Gopher servers to provide students with online information.

1993

Commercial providers are allowed to sell Internet connections to individuals. Intel's Pentium processor is released. The first graphics-based web browser, Mosaic, becomes available. The PDF (Portable Document Format) standard is introduced by Adobe. AMD releases its Am486 microprocessor to compete with Intel's 80486.

1994

Digital video, virtual reality and 3-D systems capture the attention of many, but fewer multimedia PCs than basic business PCs are sold. Object-oriented authoring systems such as HyperCard, HyperStudio and Authorware grow in popularity in schools. Netscape 1.0 is written as an alternative browser to the National Center for Supercomputing Applications (NCSA) Mosaic. Work begins on Bluetooth, the first short-range wireless technology standard. The Yahoo! Internet search service is launched. The World Wide Web comprises at least 2,000 web servers.

1995

The Internet and the World Wide Web begin to catch on as businesses, schools and individuals create web pages. Most computer-assisted instruction is delivered via CD-ROM discs, which are growing in popularity. Windows 95 is released, as is the Pentium Pro. Netscape announces JavaScript. Gordon Moore revises his law: the number of transistors placed on an integrated circuit will now double every two years. Intel's supremacy in the Windows PC world continues to erode as AMD releases its K5 microprocessor, which offers a cheaper, pin-compatible alternative to the Pentium microprocessor.

1996

The Internet is widely discussed as businesses begin to provide services and advertising using web pages. New graphics and multimedia tools are developed for the delivery of information and instruction over the Internet. Many schools are rewiring for Internet access. A few schools install web servers and provide faculty with a way to create instructional web pages. Netscape Navigator 2.0 is released. The number of computer hosts connected to the Internet approaches 10,000,000. Microsoft releases the first version of Internet Explorer, its proprietary web browser.

1997-1998

The growth of the Internet continues, with new uses and applications. Voice recognition slowly enters the computing mainstream. Educational software is expected to become more popular with the introduction of much larger CD-ROM capacities. Intel releases the Pentium MMX for games and multimedia enhancement, followed by the Pentium II processor. Microsoft releases Windows 98. AMD releases the K6 microprocessor. Palm Computing markets its popular PDA (personal digital assistant), the PalmPilot. E-book technology is introduced. Internet-based distributed computing starts on a large scale with downloadable programs such as SETI@home.

1999

Linux kernel 2.2.0 is released. The number of people running Linux is estimated at about 10 million. Advanced Micro Devices (AMD) releases the 400 MHz K6-III, which in some tests outperforms the Intel Pentium III; it contains about 23 million transistors. Intel launches the Pentium III line of microprocessors. Many e-commerce sites are set up on the Internet. Governments and businesses all over the planet make last-minute preparations for the arrival of the year 2000 (Y2K): while new computers are fully Y2K-compliant, fears remain that too many vulnerable computers are still in use. AMD releases its Athlon chip, which goes on to set a new speed record of 1 GHz, outpacing the competing Pentium microprocessors offered by Intel.

2000

Fears of the Y2K bug prove largely groundless as the new millennium arrives without global computer network crashes. Microsoft launches Windows Millennium Edition (for home PCs) and Windows 2000 (for business networks). Numerous web-based businesses ("dot-coms") go out of business as their shares collapse on the stock market. Microsoft chairman Bill Gates resigns as CEO of his own company to dedicate himself to the development of software. IBM announces Blue Gene, a follow-up to Deep Blue designed to operate at one quadrillion operations per second (one petaflop), about 1,000 times faster than Deep Blue; it will be used for modelling human proteins. Cyber attacks bring down some websites.

2001

The first Linux virus is detected. Gordon Moore says that the law named after him will have to change again, from a doubling of transistors on an integrated circuit every two years to every five years, starting sometime between 2010 and 2020. Amazingly, considering the speed of change in this field of technology, Moore's Law has held for 36 years. Intel launches the Pentium 4. Microsoft releases Windows XP (for both PCs and networks).

2002

Wireless computing becomes widespread: new handheld devices bring together wireless communications modems, dual-mode cell phones, web browsers, palmtop computers, Global Positioning System (GPS) receivers and increasingly sophisticated operating systems and graphical user interfaces, as seen in devices such as the Tablet PC.