This is one of the assignments we had to do for the subject TIC, and we are leaving it here for everyone who wants to see it. Made by Antonio Fuentes and María Cal.
Future History
Antonio Fuentes and Maria Cal
1ºA Bach, IES Sofía Casanova
A way of
rediscovering
the past
This is the
now of
yesterday
WAY, WAY BACK….
Part 1 – The origins
Chapter I
THE ABACUS
The oldest known calculating device is the abacus.
Its name comes from the Greek abakos, meaning "flat surface". It is known that the Greeks used counting tables in the 5th century BC, or perhaps earlier. The abacus as we know it today is composed of a series of wires with beads strung on them.
This version of the abacus has been used in the Middle East and Asia.
It was invented around 3500 BC in Babylon and was later used by the ancient Greek and Roman civilizations. It was very simple: beads strung on rods, which in turn were mounted on a rectangular frame. It was the first "machine" used to perform computations.
Chapter II
Blaise Pascal and Leibniz
The Pascaline is a 17th-century calculating machine developed by the French mathematician Blaise Pascal. After numerous prototypes, Pascal introduced his machine to the public in 1645. It could only add and subtract, but it gained attention because units were placed in prominent locations throughout Europe.
The Pascaline inspired Gottfried Leibniz to
invent his stepped drum cylinder, a major
improvement that was used in calculators
for centuries.
When accountants began using the Pascaline, their colleagues expressed grave concern that they might be replaced by technology!
Chapter III
Analytical Engine
Charles Babbage was a computer pioneer, and he designed two classes of engine. Difference engines are so called because of the mathematical principle on which they are based. In 1834 Babbage conceived a more ambitious machine, later called the Analytical Engine: a general-purpose programmable computing engine.
The Analytical Engine has many essential features found in the modern digital computer. It was programmable using punched cards (which we will look at in the next chapter), an idea borrowed from the Jacquard loom, used for weaving complex patterns in textiles. The Engine had a 'Store' where numbers and intermediate results could be held, and a separate 'Mill' where the arithmetic processing was performed. It had an internal repertoire of the four arithmetical functions and could perform direct multiplication and division.
The logical structure of the Analytical Engine was essentially the same as that which has dominated computer design in the electronic era: the separation of the memory (the 'Store') from the central processor (the 'Mill').
Chapter IV
Punched cards
Punched cards were first used around
1725 by Basile Bouchon and Jean-
Baptiste Falcon as a more robust form of
the perforated paper rolls then in use for
controlling textile looms in France. This
technique was greatly improved by
Joseph Marie Jacquard in his Jacquard loom in 1801.
From the 1900s, into the 1950s, punched
cards were the primary medium for data
entry, data storage, and processing in
institutional computing.
During the 1960s, the punched card was gradually replaced as the primary means
for data storage by magnetic tape, as better, more capable computers became
available. Punched cards were still commonly used for data entry and
programming until the mid-1980s when the combination of lower cost magnetic
disk storage, and affordable interactive terminals on less expensive minicomputers
made punched cards obsolete for this role as well.
Part 2 – First Generation
First Generation (1940-1956)
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a client, the U.S. Census Bureau, in 1951.
A UNIVAC computer at the Census Bureau.
Chapter I
VACUUM TUBES
In electronics, a vacuum tube is a device used to amplify, switch, or otherwise modify or create an electrical signal by controlling the movement of electrons in a low-pressure space. Some special-function vacuum tubes are filled with low-pressure gas: these so-called soft tubes are distinct from the hard-vacuum type, which have their internal gas pressure reduced as far as possible. Almost all tubes depend on the thermionic emission of electrons.
Vacuum tubes were critical to the development of
electronic technology, which drove the expansion and
commercialization of radio broadcasting, television,
radar, and sound reproduction. Some of these applications pre-dated electronics, but
it was the vacuum tube that made them widespread and practical.
Chapter II
COLOSSUS
Colossus was the world's first electronic digital computer that was at all programmable. The
Colossus computers were developed for British
codebreakers during World War II. Colossus
used vacuum tubes to perform Boolean
operations and calculations. Colossus was
designed by the engineer Tommy Flowers to
solve a problem posed by mathematician Max
Newman.
The prototype, Colossus Mark 1, was shown to be working in December 1943. An
improved Colossus Mark 2 that used shift registers to quintuple the speed, first worked
on 1 June 1944, just in time for the Normandy Landings.
The destruction of most of the Colossus hardware and blueprints, as part of the effort to maintain project secrecy that was kept up into the 1970s, deprived most of those involved with Colossus of credit for their pioneering advancements in electronic digital computing during their lifetimes.
Chapter III
TURING MACHINE
Alan Mathison Turing was a British mathematician, logician, cryptanalyst, philosopher, computer scientist, and mathematical biologist. He was highly influential in the development of computer science, providing a formalization of the concepts of "algorithm" and "computation" with the Turing machine, which can be considered a model of a general-purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence.

During World War II, Turing worked for the Government Code and Cypher School (GC&CS) at Bletchley Park, Britain's codebreaking centre. For a time he led Hut 8, the section responsible for German naval cryptanalysis, and he devised a number of techniques for breaking German ciphers.

A Turing machine, which Turing described in 1936, is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and it is particularly useful in explaining the functions of a CPU inside a computer.
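The tape-and-rules model described above is simple enough to simulate in a few lines. The following sketch (in Python; the state names and the example rule table are invented for this illustration, not Turing's own) runs a tiny machine that increments a binary number:

```python
# A minimal Turing machine simulator. The machine reads the symbol under
# the head, looks up (state, symbol) in its rule table, writes a symbol,
# moves the head one cell, and changes state.
def run_turing_machine(rules, tape, state, halt_state, max_steps=1000):
    """rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right). Unvisited cells read as '_'."""
    cells = dict(enumerate(tape))
    pos = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = cells.get(pos, '_')
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += move
    lo, hi = min(cells), max(cells)
    return ''.join(cells.get(i, '_') for i in range(lo, hi + 1)).strip('_')

# Example program: increment a binary number. The head starts on the
# leftmost digit, walks right past the last digit, then carries leftward.
rules = {
    ('right', '0'): ('0', +1, 'right'),
    ('right', '1'): ('1', +1, 'right'),
    ('right', '_'): ('_', -1, 'carry'),
    ('carry', '1'): ('0', -1, 'carry'),  # 1 + carry -> 0, keep carrying
    ('carry', '0'): ('1', -1, 'done'),   # 0 + carry -> 1, stop
    ('carry', '_'): ('1', -1, 'done'),   # carried past the left end
}
print(run_turing_machine(rules, '1011', 'right', 'done'))  # -> 1100
```

Even this toy machine shows the point made above: a fixed table of local rules, applied one cell at a time, is enough to carry out an arithmetic algorithm.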
Chapter IV
ENIAC
An acronym for Electronic Numerical Integrator And Computer, ENIAC was the first operational electronic digital computer in the U.S., developed by Army Ordnance to compute
World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200
kilowatts of electric power and consisting of 18,000 vacuum tubes, 1,500 relays,
and hundreds of thousands of resistors, capacitors, and inductors, was completed
in 1945. In addition to ballistics, the ENIAC's field of application included weather
prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition,
random-number studies, wind-tunnel design, and other scientific uses. The ENIAC
soon became obsolete as the need arose for faster computing speeds.
Chapter V
UNIVAC
The UNIVAC I (UNIVersal Automatic Computer I) was the second commercial computer produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. UNIVAC I used 5,200 vacuum tubes, weighed 29,000 pounds (13 metric tons), consumed
125 kW, and could perform about 1,905 operations per second running on a 2.25 MHz
clock. The Central Complex alone (i.e. the processor and memory unit) was 4.3 m by 2.4 m by 2.6 m high. The complete system occupied more than 35.5 m² of floor space.
Part 3 – Second generation
Second Generation (1956-1963)
Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in memory, which moved from magnetic drum to magnetic core technology.
Part 4 – Third generation
Third Generation (1964-1971)
Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on chips of semiconductor material (silicon), which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Chapter I
Integrated Circuit
The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar
Establishment of the British Ministry of Defence,
Geoffrey W.A. Dummer (1909–2002). Dummer
presented the idea to the public at the Symposium on
Progress in Quality Electronic Components in
Washington, D.C. on 7 May 1952.
An integrated circuit or monolithic integrated circuit (also referred to as an IC, a
chip, or a microchip) is a set of electronic circuits on one small plate ("chip") of
semiconductor material, normally silicon. This can be made much smaller than a
discrete circuit made from independent components. ICs can be made very
compact, having up to several billion transistors and other electronic components
in an area the size of a fingernail. The width of each conducting
line in a circuit can be made smaller and smaller as the
technology advances. ICs were made possible by experimental
discoveries showing that semiconductor devices could perform
the functions of vacuum tubes and by mid-20th-century
technology advancements in semiconductor device fabrication.
The integration of large numbers of tiny transistors into a small
chip was an enormous improvement over the manual assembly
of circuits using discrete electronic components.
Chapter II
Microprocessors
The advent of low-cost computers on integrated circuits has transformed
modern society. The first use of the term "microprocessor" is attributed to
Viatron Computer Systems describing the custom integrated circuit used in
their System 21 small computer system announced in 1968.
During the 1960s, computer processors were constructed out of small and
medium-scale ICs—each containing from tens of transistors to a few
hundred. These were placed and soldered onto printed circuit boards, and
often multiple boards were interconnected in a chassis. The large number of
discrete logic gates used more electrical power—and therefore produced
more heat—than a more integrated design with fewer ICs. The distance that
signals had to travel between ICs on the boards limited a computer's
operating speed.
The first microprocessors emerged in the early 1970s and were used for
electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit
words.
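The binary-coded decimal (BCD) arithmetic mentioned above stores each decimal digit in its own 4-bit group, rather than converting the whole number to binary, which suited calculator chips working on digits one at a time. A minimal illustrative sketch (in Python; the helper names are ours, not from any calculator's firmware):

```python
def to_bcd(n):
    """Encode a non-negative integer as binary-coded decimal:
    each decimal digit becomes its own 4-bit group."""
    return ' '.join(format(int(d), '04b') for d in str(n))

def from_bcd(bits):
    """Decode a space-separated BCD string back to an integer."""
    return int(''.join(str(int(group, 2)) for group in bits.split()))

print(to_bcd(1971))           # -> 0001 1001 0111 0001
print(from_bcd('0100 0010'))  # -> 42
```

Note that only digit patterns 0000-1001 (0-9) are valid in BCD; the six remaining 4-bit patterns are unused, which is the price paid for easy digit-by-digit handling.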
Chapter III
IBM
The company was founded in 1911 as the
Computing Tabulating Recording Company (CTR)
through a merger of three companies: the
Tabulating Machine Company, the International
Time Recording Company, and the Computing Scale
Company.
In 1963, IBM employees and computers helped
NASA track the orbital flight of the Mercury
astronauts, and a year later, the company moved its
corporate headquarters from New York City to
Armonk, New York. The latter half of that decade
saw IBM continue its support of space exploration,
with IBM participating in the 1965 Gemini flights,
the 1966 Saturn flights, and the 1969 mission to
land man on the moon.
On April 7, 1964 IBM announced the first computer
system family, the IBM System/360. Sold between
1964 and 1978, it was the first family of computers
designed to cover the complete range of applications, from small to large, both
commercial and scientific.
On October 11, 1973, IBM introduced the IBM 3660, a laser-scanning point-of-sale
barcode reader which would become the workhorse of retail checkouts.
Fourth Generation (1971-Present)
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Chapter IV
HOME COMPUTERS
Development of the single-chip microprocessor was an enormous catalyst to the
popularization of cheap, easy to use, and truly personal computers. The Altair 8800 at the
time set a new low price point for a computer, bringing computer ownership to an
admittedly select market in the 1970s. This was followed by the IMSAI 8080 computer,
with similar abilities and limitations. The Altair and IMSAI were essentially scaled-down
minicomputers and were incomplete: to connect a keyboard or teleprinter to them
required heavy, expensive "peripherals".
The MITS Altair, the first commercially successful microprocessor kit, was featured on the
cover of Popular Electronics magazine in January 1975. It was the world's first mass-
produced personal computer kit, as well as the first computer to use an Intel 8080
processor. It was a commercial success with 10,000 Altairs being shipped. The Altair also
inspired the software development efforts of Paul Allen and his high school friend Bill Gates, who developed a BASIC interpreter for the Altair and then formed Microsoft.
The advent of the microprocessor and solid-state memory made home computing
affordable. Early hobby microcomputer systems such as the Altair 8800 and Apple I
introduced around 1975 marked the release of low-cost 8-bit processor chips, which had
sufficient computing power to be of interest to hobby and experimental users. By 1977
pre-assembled systems such as the Apple II, Commodore PET, and TRS-80 began the era
of mass-market home computers.
After the success of the Radio Shack TRS-80, the Commodore PET and the Apple II in
1977, almost every manufacturer of consumer electronics rushed to introduce a home
computer. Large numbers of new machines of all types began to appear during the late
1970s and early 1980s. Mattel, Coleco, Texas Instruments and Timex, none of which had
any previous connection to the computer industry, all had short-lived home computer
lines in the early 1980s. Some home computers were more successful: the BBC Micro, Sinclair ZX Spectrum, Atari 800XL and Commodore 64 sold many units over several years
and attracted third-party software development. During the peak years of the home
computer market, scores of models were produced, usually with little or no thought given
to compatibility between different manufacturers or even within product lines of the
same manufacturer. Except for the Japanese MSX standard, the concept of a computer
platform was still forming, with most companies considering BASIC compatibility
sufficient.
Chapter V
APPLE
Apple was established on April 1, 1976, by Steve Jobs, Steve Wozniak and Ronald Wayne to sell the Apple I personal computer kit. The Apple I was single-handedly designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. It was sold as a motherboard (with CPU, RAM, and basic textual-video chips), which is less than what is today considered a complete personal computer. The Apple I went on sale in July 1976, market-priced at $2,763 in 2014 dollars, adjusted for inflation.

Apple was incorporated on January 3, 1977, without Wayne, who sold his share of the company back to Jobs and Wozniak for $800. By the end of the 1970s, Apple had a staff of computer designers and a production line. The company introduced the Apple III in May 1980 in an attempt to compete with IBM and Microsoft in the business and corporate computing market.

As of June 2014, Apple maintains retail stores in fourteen countries, as well as the online Apple Store and iTunes Store, the latter of which is the world's largest music retailer. Apple is the largest publicly traded corporation in the world by market capitalization, with an estimated market capitalization of $446 billion as of January 2014.
Chapter VI
MICROSOFT
Microsoft Corporation is an American multinational corporation headquartered in Redmond, Washington, that develops, manufactures, licenses, supports and sells computer software, consumer electronics, personal computers and
services. Its best known software products are the Microsoft Windows line of
operating systems, Microsoft Office office suite, and Internet Explorer web
browser. Its flagship hardware products are Xbox game console and the Microsoft
Surface series of tablets. It is the world's largest software maker measured by
revenues. It is also one of the world's most valuable companies.
Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop
and sell BASIC interpreters for the Altair 8800. It rose to dominate the personal
computer operating system market with MS-DOS in the mid-1980s, followed by
Microsoft Windows. The company's 1986 initial public offering, and subsequent
rise in its share price, created three billionaires and an estimated 12,000
millionaires from Microsoft employees. Since the 1990s, it has increasingly
diversified from the operating system market and has made a number of corporate
acquisitions. In May 2011, Microsoft acquired Skype Technologies for $8.5 billion
in its largest acquisition to date
As of 2013, Microsoft is market dominant in both the IBM PC-compatible operating
system and office software suite markets (the latter with Microsoft Office). The
company also produces a wide range of other software for desktops and servers,
and is active in areas including Internet search (with Bing), the video game
industry (with the Xbox, Xbox 360 and Xbox One consoles), the digital services
market (through MSN), and mobile phones (via the Windows Phone OS). In June
2012, Microsoft entered the personal computer production market for the first time,
with the launch of the Microsoft Surface, a line of tablet computers.
Part 5 – Fifth generation
Fifth Generation (Present and Beyond)
Defining the fifth generation of computers is somewhat difficult because the field is in its infancy. The most famous example of a fifth generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL
performed all of the functions currently envisioned for real-life fifth generation
computers. With artificial intelligence, HAL could reason well enough to hold
conversations with its human operators, use visual input, and learn from its own
experiences. (Unfortunately, HAL was a little too human and had a psychotic
breakdown, commandeering a spaceship and killing most humans on board.)
Using recent engineering advances, computers may be able to accept spoken
word instructions and imitate human reasoning. The ability to translate a foreign language is also a major goal of fifth generation computers. This feat seemed a simple objective at first, but proved much more difficult when programmers
realized that human understanding relies as much on context and meaning as it
does on the simple translation of words.
Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. Two such engineering advances are parallel processing and superconductor technology, which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow.
Fifth generation computing devices are still in development. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come.
Consider Moore's Law, an observation that Gordon Moore made back in 1965. He noticed that the number of transistors engineers could cram onto a silicon chip doubled every year or so. That manic pace slowed over the years to a slightly more modest 24-month cycle. Awareness of the breakneck speed at which computer technology develops has seeped into the public consciousness. We've all heard the joke about buying a computer at the store only to find out it's obsolete by the time you get home. What will the future hold for computers?
Assuming microprocessor manufacturers can continue to live up to Moore's Law, the processing power of our computers should double every two years. That would mean computers 100 years from now would be 1,125,899,906,842,624 times more powerful than current models. In 2005, Moore said that as transistors reach the atomic scale we may encounter fundamental barriers we can't cross.

We may get around that barrier by building larger processor chips with more transistors. But transistors generate heat, and a hot processor can cause a computer to shut down. Computers with fast processors need efficient cooling systems to avoid overheating, and the larger the processor chip, the more heat the computer will generate when working at full speed. Another tactic is to switch to multi-core architecture. A multi-core processor dedicates part of its processing power to each core. Multi-core processors are good at handling calculations that can be broken down into smaller components, but they are not as good at large computational problems that cannot be broken down.

Future computers may rely on a completely different model than traditional machines. What if we abandon the old transistor-based processor?
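The large figure quoted above is easy to check: doubling every two years for 100 years means 100 / 2 = 50 doublings, i.e. a factor of 2 to the power of 50. A quick sketch in Python:

```python
# Verify the Moore's Law figure quoted above: doubling every 2 years
# for 100 years means 100 / 2 = 50 doublings, a factor of 2**50.
years = 100
doubling_period = 2  # years per doubling, per the slowed Moore's Law pace
factor = 2 ** (years // doubling_period)
print(f"{factor:,}")  # -> 1,125,899,906,842,624
```

The same exponential arithmetic explains why the "obsolete by the time you get home" joke rings true: even a few doubling periods multiply capability many times over.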