Contents
1. Introduction
2. Generations Of Computers
Introduction
The history of computer development is often described in terms of the different generations of computing devices. Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate. Most developments resulted in increasingly smaller, cheaper, more powerful and more efficient computing devices. The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis.
The original definition of a computer was any person who performed computations or was required to compute data as a regular part of their job. Throughout history, several man-made devices, such as the abacus and the slide rule, have been built to aid people in calculating data. This historical timeline covers the origins of the first mechanical and electronic computers through to the beginnings of commercially available computers sold to the public.
Meaning Of Computer
A computer is a general-purpose electronic device that can be programmed to carry out a set of arithmetic or logical operations automatically. Since the sequence of operations can be readily changed, the computer can solve more than one kind of problem.
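The following is a minimal sketch, not from the original article, of why a changeable sequence of operations makes one machine general-purpose: the same "machine" (the run function below, a name chosen here for illustration) solves different problems when it is handed a different program.

```python
# A minimal sketch (illustrative only) of the idea that a computer is
# general-purpose because its sequence of operations -- the program -- can be
# changed while the machine itself stays the same.

def run(program, value):
    """Apply a sequence of simple arithmetic/logical operations to a value."""
    for operation in program:
        value = operation(value)
    return value

# Two different "programs" for the same machine:
add_then_double = [lambda x: x + 3, lambda x: x * 2]
is_even_flag    = [lambda x: x % 2, lambda x: x == 0]

print(run(add_then_double, 5))   # an arithmetic problem -> 16
print(run(is_even_flag, 10))     # a logical problem     -> True
```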
History Of Computer Is Divided Into Two Eras
1. Mechanical
2. Electronic
Generations Of Computers
History Of Mechanical Computers
500 B.C. The abacus was first used by the Babylonians as an aid to simple arithmetic sometime around this date. The abacus in the form we are most familiar with was first used in China in around 1300 A.D.
1623 Wilhelm Schickard (1592-1635), of Tuebingen, Wuerttemberg (now in Germany), made a "Calculating Clock". This mechanical machine was capable of adding and subtracting up to 6-digit numbers, and warned of an overflow by ringing a bell.
1625 William Oughtred (1575-1660) invented the slide rule.
1642 French mathematician Blaise Pascal built a mechanical adding machine (the "Pascaline").
1671 German mathematician Gottfried Leibniz designed a machine to carry out multiplication, the 'Stepped Reckoner'.
1801 Joseph-Marie Jacquard developed an automatic loom controlled by punched cards.
1820 Charles Xavier Thomas de Colmar (1785-1870), of France, makes his "Arithmometer", the first mass-produced calculator.
1822 Charles Babbage (1792-1871) designed his first mechanical computer, the first prototype of his Difference Engine.
1834 Babbage conceives, and begins to design, his "Analytical Engine". Its programs were to be stored on read-only memory, specifically in the form of punched cards.
1842 Babbage's Difference Engine project is officially cancelled. (The cost overruns have been considerable, and Babbage is spending too much time on redesigning the Analytical Engine.)
1858 The first Tabulating Machine is bought by the Dudley Observatory in Albany, New York, and the second one by the British government. The Albany machine is used to produce a set of astronomical tables.
1886 Dorr E. Felt (1862-1930), of Chicago, makes his "Comptometer". This is the first calculator where the operands are entered merely by pressing keys.
1906 Henry Babbage, Charles's son, with the help of the firm of R. W. Munro, completes the mill of his father's Analytical Engine, just to show that it would have worked. It does. The complete machine is never produced.
1938 Konrad Zuse (1910-1995) of Berlin, with some assistance from Helmut Schreyer, completes a prototype mechanical binary programmable calculator, the first binary calculator; it is based on Boolean algebra.
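To illustrate the principle mentioned above, here is a minimal sketch, in a modern language and not Zuse's actual design, of how binary addition can be built entirely from Boolean operations (AND, XOR, OR), which is the idea behind a binary calculator based on Boolean algebra.

```python
# A minimal sketch (illustrative only, not Zuse's design) of binary addition
# built from Boolean operations alone.

def full_adder(a, b, carry_in):
    """Add two bits plus a carry using only AND, XOR and OR."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Add two equal-length bit lists (least significant bit first)."""
    carry, result = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 5 (101) + 3 (011), least significant bit first -> 8 (1000)
print(add_binary([1, 0, 1], [1, 1, 0]))   # [0, 0, 0, 1]
```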
History Of Electronic Computer
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, the first computers generated a lot of heat, which was often the cause of malfunctions.
First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time; it could take days or weeks to set up a new problem. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC (Universal Automatic Computer) and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a client, the U.S. Census Bureau, in 1951.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
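A rough sketch of that idea, written in a modern language purely for illustration rather than as how third-generation operating systems were actually implemented, is a central program that lets several applications share one machine by giving each a turn.

```python
# A minimal sketch (illustrative only) of a central program that runs several
# applications "at one time" by switching between them in turns.

def run_in_turns(programs, steps_per_turn=1):
    """Round-robin between 'programs' (generators) until all have finished."""
    queue = list(programs)
    while queue:
        program = queue.pop(0)
        try:
            for _ in range(steps_per_turn):
                print(next(program))   # let this program do one step of work
            queue.append(program)      # not finished: back of the queue
        except StopIteration:
            pass                       # program finished; drop it

def counter(name, limit):
    for i in range(limit):
        yield f"{name}: step {i}"

# Two toy "applications" sharing the machine; their output is interleaved.
run_in_turns([counter("payroll", 2), counter("report", 3)])
```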



Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
Conclusion
As a result of the various improvements in the development of the computer, we have seen the computer being used in all areas of life. It is a very useful tool that will continue to see new developments as time passes.
Computers are used in various areas of our life. Education, entertainment, sports, advertising, medicine, science and engineering, government, office and home are some of the application areas of computers.