"Computer", who aren't familiar with this word? It is used so extensively that we thought, why not
make a post where we can focus on the evolution of computers throughout history.
What is a computer?
The definition we hear most often goes something like this: "it is an electronic machine that can store,
find and arrange information, calculate amounts and control other machines according to specific
instructions that are written in a program."
The word "computer" has not always referred to an electronic device or a combination of monitor, CPU, and
keyboard. In ancient times, computing devices were used only for arithmetic tasks, aiding basic
calculations such as addition and subtraction.
History of computers
The history of computing devices began as early as the 24th century BC.
Simple early computers, manual devices such as the abacus, helped people do calculations.
The period 2700–2300 BC saw the first appearance of the Sumerian abacus. (Sumer is the earliest
known civilization in the historical region of southern Mesopotamia, now southern Iraq, during the Chalcolithic
and Early Bronze Ages.)
The abacus consisted of a table of successive columns which delimited (marked off) the
successive orders of magnitude of the Sumerians' sexagesimal number system (a numeral system with 60 as its base).
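To make "base 60" a little more concrete, here is a minimal Python sketch (our own illustration; the function name to_sexagesimal is just an illustrative choice) that splits a decimal number into base-60 digits, roughly the way each abacus column held one order of magnitude:

def to_sexagesimal(n: int) -> list[int]:
    """Split a non-negative integer into base-60 digits, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % 60)   # value held by the current column
        n //= 60                # move to the next order of magnitude
    return list(reversed(digits))

# 3661 = 1*60**2 + 1*60 + 1, so its base-60 digits are [1, 1, 1]
print(to_sexagesimal(3661))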
Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks.
More sophisticated electrical machines did specialized analog calculations in the early 20th century.
Early computing methods
The Antikythera mechanism is believed to be the earliest mechanical analog "computer". It was designed
to calculate astronomical positions.
It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera
and Crete, and has been dated to c. 100 BC.
Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a
thousand years later.
The planimeter, used to calculate the area of a closed figure by tracing over it with a mechanical
linkage, the slide rule (not to be confused with the common ruler used for drawing straight lines and
measuring), and many more advancements in both mathematics and electronics led to the computers we use today.
First computer and the father of computers
Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable
computer.
He is considered the "father of the computer". Babbage conceptualized and invented the first
mechanical computer in the early 19th century.
After his revolutionary Difference Engine, he realized in 1833 that a much more general design, the Analytical Engine, was possible.
The Analytical Engine included an arithmetic logic unit, control flow in the form of conditional branching and
loops, and integrated memory, making it the first design for a general-purpose computer.
Some early analog computers
During the first half of the 20th century, many scientific computing needs were met by increasingly
sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for
computation.
However, these were not programmable and generally lacked the versatility and accuracy of modern digital
computers.
The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century.
The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often
attributed to the Greek astronomer Hipparchus.
A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable
of working out several different kinds of problems in spherical astronomy.
[Image: Analog computer (source: Wikipedia)]
The dioptra was a sighting tube or a rod with a sight at both ends, attached to a stand. If fitted with
protractors, it could be used to measure angles.
The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication, and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying, and navigation.
The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872,
which was of great utility for navigation in shallow waters.
The differential analyzer, a mechanical analog computer designed to solve differential equations by
integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the
more famous Lord Kelvin.
Digital computers: electromechanical
By 1938, the United States Navy had developed an electromechanical analog computer small enough to use aboard a
submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo
at a moving target. During World War II similar devices were developed in other countries as well.
Early digital computers were electromechanical; electric switches drove mechanical relays to perform the
calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric
computers, originally using vacuum tubes.
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the
same time that digital calculation replaced analog.
Colossus, the first electronic digital programmable computing device, was used to break German ciphers during
World War II.
It used a large number of valves (vacuum tubes), had paper-tape input, and could be configured to perform a
variety of Boolean logical operations on its data, but it was not Turing-complete.
Turing-complete means that a system can recognize or decide other data-manipulation rule sets.
ENIAC was the first electronic, Turing-complete device, and performed ballistics trajectory calculations for
the United States Army.
The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built
in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was
Turing-complete.
Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and
switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls".
It combined the high speed of electronics with the ability to be programmed for many complex problems. It could
add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to
multiply, divide, and square root. High-speed memory was limited to 20 words (about 80 bytes).
The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum
tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.
Modern computers
Concept of modern computer
The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable
Numbers.
Turing proposed a simple device that he called the "Universal Computing Machine" and that is now known as a
universal Turing machine.
He proved that such a machine is capable of computing anything computable by executing instructions
(program) stored on tape, allowing the machine to be programmable.
The fundamental concept of Turing's design is the stored program, where all the instructions for computing are
stored in memory.
Von Neumann, also referred to as the father of modern computers, acknowledged that
the central concept of the modern computer was due to this paper.
Turing machines are to this day a central object of study in the theory of computation. Except for the
limitations imposed by their finite memory stores, modern computers are said to be Turing-complete.
Turing completeness is used as a way to express the power of such a data-manipulation rule set.
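To make the idea of a machine executing stored instructions a little more concrete, here is a minimal Python sketch (our own illustration, not anything from Turing's paper; names like run_turing_machine are just illustrative) of a tiny Turing machine. The transition table plays the role of the program, and this particular table simply flips the bits written on the tape:

def run_turing_machine(tape, program, state="start", steps=100):
    """Run a one-tape Turing machine.

    program maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), 0 (stay) or +1 (right).
    """
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")          # blank cells read as "_"
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return "".join(tape[i] for i in sorted(tape))

# A trivial "program": walk right, flipping 0s and 1s, halt on a blank.
flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
print(run_turing_machine("1011", flip_bits))  # prints 0100_

Turing's key insight was that a single fixed machine can be given the description of any such table as input and simulate it, which is exactly what makes the universal machine programmable.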
Computers with stored program
World's first
Early computing machines had fixed programs. Changing their function required the re-wiring and
re-structuring of the machine. With the proposal of the stored-program computer, this changed.
A stored-program computer includes by design an instruction set and can store in memory a set of instructions
(a program) that details the computation. The theoretical basis for the stored-program computer was laid by Alan
Turing in his 1936 paper.
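As a rough sketch of what "a set of instructions stored in memory" means in practice (again our own toy illustration, not the design of the Manchester Baby or any other machine named here), the following Python loop keeps its instructions and its data in the same memory list and repeatedly fetches, decodes, and executes them:

# Toy stored-program machine: memory holds both instructions and data.
# Each instruction is a (opcode, operand) pair kept in the same memory list.
memory = [
    ("LOAD", 6),    # 0: acc = memory[6]
    ("ADD", 7),     # 1: acc += memory[7]
    ("STORE", 8),   # 2: memory[8] = acc
    ("PRINT", 8),   # 3: print memory[8]
    ("HALT", 0),    # 4: stop
    None,           # 5: unused
    40,             # 6: data
    2,              # 7: data
    0,              # 8: result goes here
]

acc, pc = 0, 0                      # accumulator and program counter
while True:
    opcode, operand = memory[pc]    # fetch
    pc += 1
    if opcode == "LOAD":            # decode and execute
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "PRINT":
        print(memory[operand])      # prints 42
    elif opcode == "HALT":
        break

Because the program lives in memory, running a different computation only requires loading different memory contents rather than physically re-wiring the machine, which is exactly the change the stored-program concept brought.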
The Manchester Baby was the world's first stored-program computer.
It was designed as a testbed for the Williams tube, the first random-access digital storage device.
Although the computer was considered "small and primitive" by the standards of its time, it was the
first working machine to contain all of the elements essential to a modern electronic computer. As soon as the
Baby had demonstrated the feasibility of its design, a project was initiated at the university to develop it
into a more usable computer, the Manchester Mark 1.
Grace Hopper was the first person to develop a compiler for a programming language.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially
available general-purpose computer.
The LEO I computer became operational in April 1951 and ran the world's first regular routine office computer
job.
Use of transistors in computing
Concept of field-effect transistor
The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and
Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the
point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948.
From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second
generation" of computers.
Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than
vacuum tubes, so they give off less heat.
Junction transistors were much more reliable than vacuum tubes and had a longer, potentially indefinite, service life.
Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.
However, early junction transistors were relatively bulky devices that were difficult to manufacture on a
mass-production basis, which limited them to some specialized applications.
At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using
the newly developed transistors instead of valves.
Their first transistorized computer and the first in the world was operational by 1953, and a second version
was completed there in April 1955.
However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to
read and write on its magnetic drum memory, so it was not the first completely transistorized computer.
That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy
Research Establishment at Harwell.
Use of MOSFETs
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented by
Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.
It was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of
uses. With its high scalability, and much lower power consumption and higher density than bipolar junction
transistors, the MOSFET made it possible to build high-density integrated circuits.
In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage
elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in
computers.
The MOSFET led to the microcomputer revolution and became the driving force behind the computer revolution. The
MOSFET is the most widely used transistor in computers and is the fundamental building block of digital
electronics.
Integrated circuits
The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the
integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the
Ministry of Defence, Geoffrey W.A. Dummer.
The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild
Semiconductor.
Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the
first working integrated example on 12 September 1958.
However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated
circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce.
Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention
was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not.
Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
Modern monolithic ICs are predominantly MOS (metal-oxide-semiconductor) integrated circuits, built from MOSFETs
(MOS transistors).
After the first MOSFET was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, Atalla first
proposed the concept of the MOS integrated circuit in 1960, followed by Kahng in 1961, both noting that the MOS
transistor's ease of fabrication made it useful for integrated circuits.
The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven
Hofstein at RCA in 1962.
General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman.
The development of the MOS integrated circuit led to the invention of the microprocessor and heralded an
explosion in the commercial and personal use of computers.
While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of
agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first
single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS
IC technology, along with Ted Hoff, Masatoshi Shima, and Stanley Mazor at Intel.
In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip.
System on a Chip (SoC)
What is a System on a Chip (SoC)?
Systems on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or
may not have integrated RAM and flash memory.
If not integrated, the RAM is usually placed directly above (known as Package on Package) or below (on the
opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC.
This is all done to improve data transfer speeds, as the data signals do not have to travel long distances.
Since ENIAC in 1945, computers have advanced enormously, with modern SoCs being the size of a coin while also
being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and
consuming only a few watts of power.
The first mobile computers were heavy and ran from mains power. The 50 lb IBM 5100 was an early
example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to
be plugged in.
The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries
– and with the continued miniaturization of computing resources and advancements in portable battery life,
portable computers grew in popularity in the 2000s.
Smartphones and tablets run on a variety of operating systems and recently became the dominant computing
devices on the market. They are powered by Systems on a Chip (SoCs), which, as mentioned earlier, are complete
computers on a microchip the size of a coin.
In a nutshell
Computers have helped humanity throughout history, and their evolution continues day by day. Computer
technology has advanced and evolved enormously over time.
The speed, power, and versatility of computers have been increasing dramatically ever since the use of
semiconductors.
With the use of MOS, the transistor counts are increasing at a rapid pace (as predicted by Moore's law),
leading to the Digital Revolution during the late 20th to early 21st centuries.
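As a rough illustration of how that prediction compounds (a sketch only: the roughly 10,000-transistor figure from the early 1970s mentioned above is used as a starting point, and a clean two-year doubling period is assumed), a few lines of Python show how quickly the counts grow:

# Moore's law sketch: transistor counts doubling roughly every two years.
# Starting point: ~10,000 transistors per chip in the early 1970s (see above).
start_year, start_count = 1972, 10_000

for year in range(start_year, 2023, 10):
    doublings = (year - start_year) / 2
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors")

Real chips have not tracked this idealized curve exactly, so the later numbers overshoot actual transistor counts, but the compounding growth it shows is why the Digital Revolution happened so quickly.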
So now we are trying to achieve quantum computing, which for certain problems promises to be faster than
today's supercomputers but uses very different techniques. A simple way to differentiate today's computers
from quantum computers is the analogy of the candle and the bulb: just as a candle cannot be made into a bulb
no matter how well we craft it, our everyday computers are fundamentally different from quantum computers.