Welcome to the virtual exhibition of the History of Information Technology! This website is an attempt to introduce you to some fascinating devices and events from the history of the creation, storage, transformation, restoration, replication, retrieval, transmission, reception, etc. of information. We hope that you will find this virtual journey into the past of Information Technology (IT) exciting and beneficial.
The word information can be defined in several ways:
as knowledge obtained from investigation, study, or instruction;
as a signal or character (as in a communication system or computer) representing data;
as a quantitative measure of the content of information;
as data at any stage of processing (input, output, storage, transmission, etc.);
as something (such as a message, experimental data, or a picture) that justifies change in a construct (such as a plan or theory) that represents physical or mental experience or another construct. (1)
The last of these definitions serves our purposes best. (Next page)
The process of verbalizing one’s feelings, impressions, sensations, and the like, whether oral or written, is actually an act of encoding them. Depending on their purpose, people have used various encoding systems, from simple to exceedingly complicated. One example is the Polybius square.
The Polybius square is a device invented in ancient Greece and made famous by the historian and scholar Polybius. The alphabet is divided into 5 groups; letters are put in a grid with numbered cells so that each letter is represented by its coordinates in the grid (see the grid for the modern Latin alphabet in the image below). Only 5 numeric symbols are needed.
Later, the Polybius square was used in telegraphy and cryptography, as a basic cipher.
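The scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not a historical artifact: I and J conventionally share one cell so that the 26 letters fit into a 5 × 5 grid, and the function names here are invented for clarity.

```python
# The 5 x 5 Polybius grid for the modern Latin alphabet.
# J is merged with I, so only the digits 1-5 are ever needed.
ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"

def encode(text):
    """Replace each letter with its row and column number, both 1-based."""
    pairs = []
    for ch in text.upper().replace("J", "I"):
        if ch in ALPHABET:
            i = ALPHABET.index(ch)
            pairs.append(f"{i // 5 + 1}{i % 5 + 1}")
    return " ".join(pairs)

def decode(coords):
    """Turn each two-digit pair back into a letter."""
    return "".join(ALPHABET[(int(p[0]) - 1) * 5 + int(p[1]) - 1]
                   for p in coords.split())

print(encode("POLYBIUS"))        # 35 34 31 54 12 24 45 43
print(decode("23 15 31 31 34"))  # HELLO
```

Because every letter becomes a pair of small numbers, the square lent itself naturally to signalling and, later, to telegraphy and cryptography.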
Other encoding systems were used in the ancient world. The header image shows one simple system, dactylonomy, or, finger-counting. Merchants from different countries did not necessarily understand each other’s language. When they wanted to negotiate a price without the cost of an interpreter, they used their hands to explain, literally counting on the fingers.
How did it work? An imaginary dividing line ran across the hand, as if cutting the space in half. All five fingers closed together to form a ring below the line symbolized the number 5; the same ring above the line meant ten. Unfolded fingers below the line, with no ring shown, represented the numbers 1–3. A finger-ring below the line plus one unfolded finger meant a subtraction operation: 5 − 1 = 4.
A ring above the line signified ten. One, two, or three unfolded fingers below the line with a ring above it meant, respectively, 9, 8, and 7 (10 − 1, 10 − 2, 10 − 3). Zero was also shown as a finger-ring above the imaginary line.
This section focuses on three great men whose inventions laid the foundation for modern computer programming: Joseph-Marie Jacquard, Charles Babbage, and Herman Hollerith.
Punch cards were already in use in music boxes and earlier looms, but French silk-weaver Joseph-Marie Jacquard (1752-1834) made such improvements in punch card technology that sophisticated patterns could be produced quickly through the mechanical direction of the punch card’s hole-code system. The Jacquard loom is controlled by a chain of multiple cards punched with holes that determine which cords of the fabric warp should be raised for each pass of the shuttle. The ability to store and automatically reproduce complex operations found wide application in textile manufacturing.(Source)
The Jacquard loom was a marvel of the Industrial Revolution. A textile-weaving loom, it could also be called the first practical information-processing device. The loom worked by tugging various-coloured threads into patterns by means of an array of rods. By inserting a card punched with holes, an operator could control the motion of the rods and thereby alter the pattern of the weave. Moreover, the loom was equipped with a card-reading device that slipped a new card from a prepunched deck into place every time the shuttle was thrown, so that complex weaving patterns could be automated.
What was extraordinary about the device was that it transferred the design process from a labour-intensive weaving stage to a card-punching stage. Once the cards had been punched and assembled, the design was complete, and the loom implemented the design automatically. The Jacquard loom, therefore, could be said to be programmed for different patterns by these decks of punched cards.
For those intent on mechanizing calculations, the Jacquard loom provided important lessons: the sequence of operations that a machine performs could be controlled to make the machine do something quite different; a punched card could be used as a medium for directing the machine; and, most important, a device could be directed to perform different tasks by feeding it instructions in a sort of language—i.e., making the machine programmable.
It is not too great a stretch to say that, in the Jacquard loom, programming was invented before the computer. The close relationship between the device and the program became apparent some 20 years later, with Charles Babbage’s invention of the first computer. (Source)
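The idea that a deck of cards is a program can be sketched in a toy model. This is purely illustrative, not a historical mechanism: each card is one row of possible hole positions, a hole means "raise this warp cord" for one pass of the shuttle, and swapping the deck reprograms the pattern without touching the machine.

```python
# Toy punched-card model: "X" = hole punched (raise the cord), "." = blank.
def weave(deck):
    """Render each card as one woven row: raised cords print dark."""
    return ["".join("#" if pos == "X" else " " for pos in card)
            for card in deck]

deck = [
    "X.X.X.X.",
    ".X.X.X.X",
    "XX..XX..",
]
for row in weave(deck):
    print(row)
```

Changing the strings in `deck` changes the "fabric" while `weave` stays fixed, which is exactly the separation of program from machine that the loom demonstrated.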
In the nineteenth century there were no televisions, aeroplanes, computers, or spacecraft; neither were there antibiotics, credit cards, microwave ovens, compact discs, or mobile phones.
There was, however, an Internet.
During Queen Victoria’s reign, a new communications technology was developed that allowed people to communicate almost instantly across great distances, in effect shrinking the world faster and further than ever before. A world-wide communications network whose cables spanned continents and oceans, it revolutionised business practice, gave rise to new forms of crime, and inundated its users with a deluge of information. Romances blossomed over the wires. Secret codes were devised by some users, and cracked by others. The benefits of the network were relentlessly hyped by its advocates, and dismissed by the sceptics. Governments and regulators tried and failed to control the new medium. Attitudes to everything from newsgathering to diplomacy had to be completely rethought. Meanwhile, out on the wires, a technological subculture with its own customs and vocabulary was establishing itself.
The telegraph unleashed the greatest revolution in communications since the development of the printing press. Modern Internet users are in many ways the heirs of the telegraphic tradition, which means that today we are in a unique position to understand the telegraph — and the telegraph, in turn, can give us a fascinating perspective on the challenges, opportunities and pitfalls of the Internet. (Tom Standage, author, “The Victorian Internet”)
To read about the electric telegraph, see next page.
The teleprinter (or teletypewriter) is a telegraphic instrument that transmits and receives printed messages and data via telephone cables or radio relay systems. Teleprinters became the most common telegraphic instruments shortly after entering commercial use in the 1920s. They were used by operators in local telegraph offices and switching centers, by press associations and other private networks, and by subscribers to international telegraphic message services.
Teleprinters were invented in order to send and receive messages without the need for operators trained in the use of Morse code. A system of two teleprinters, with one operator trained to use a typewriter, replaced two trained Morse code operators. The teleprinter system improved message speed and delivery time, making it possible for messages to be flashed across the country with little manual intervention.
The teleprinter consists of a typewriter-like keyboard and a printer, powered by an electric motor. The two devices are coupled to the motor by clutches that are brought into operation automatically when required. A message is sent by typing on the keyboard. Each key stroke generates a sequence of coded electrical pulses, which are then routed by electronic switching through an appropriate transmission system to the destination. There a receiving teleprinter decodes the incoming pulses and prints the message on paper. (To view a teleprinter in action, click here)
Two different coding schemes have been used for teleprinters. The first was a variation of the Baudot Code, in which letters, numbers, punctuation marks, and keyboard functions were represented by 32 combinations of 5 “on” and “off” pulses. With the advent of digital computers in the 1960s, a new coding scheme, the American Standard Code for Information Interchange (ASCII), was developed and came to be widely used by teleprinters. ASCII employed 7 code pulses and was thus able to provide 128 combinations, giving a much more extensive range of symbols that could be transmitted. Teleprinters utilizing the ASCII code could transmit messages at speeds up to 150 words per minute, compared to 75 words per minute for machines using the Baudot Code. (Read more)
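The arithmetic behind the two schemes is simple to verify: n on/off pulses give 2**n distinct combinations. The short sketch below also shows that ASCII survives essentially unchanged today, with each character still mapping to a 7-bit pattern like the pulse sequences a teleprinter would send.

```python
# 5 pulses vs. 7 pulses: the number of distinct code combinations.
print(2 ** 5)  # 32  combinations in the 5-pulse Baudot-style code
print(2 ** 7)  # 128 combinations in 7-pulse ASCII

# Each character's 7-bit ASCII pattern, shown as "on"/"off" pulses.
for ch in "IT":
    print(ch, ord(ch), format(ord(ch), "07b"))
# I 73 1001001
# T 84 1010100
```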
The Baudot code preceded the American Standard Code for Information Interchange (ASCII). To read more about the importance of these coding systems for information exchange, see next page.
The word telephone, from the Greek roots tēle, “far,” and phonē, “sound,” was applied as early as the late 17th century to the string telephone familiar to children, and it was later used to refer to the megaphone and the speaking tube, but in modern usage it refers solely to electrical devices derived from the inventions of Alexander Graham Bell and others. Within 20 years of the 1876 Bell patent, the telephone instrument, as modified by Thomas Watson, Emil Berliner, Thomas Edison, and others, acquired a functional design that has not changed fundamentally in more than a century. Since the invention of the transistor in 1947, metal wiring and other heavy hardware have been replaced by lightweight and compact microcircuitry. Advances in electronics have improved the performance of the basic design, and they also have allowed the introduction of a number of “smart” features such as automatic redialing, call-number identification, wireless transmission, and visual data display. Such advances supplement, but do not replace, the basic telephone design. (Read more)
To read about telephone components, see next page.
From the earliest days of the telephone, it was observed that it was more practical to connect different telephone instruments by running wires from each instrument to a central switching point, or telephone exchange, than it was to run wires between all the instruments. In 1878 the first telephone exchange was installed in New Haven, Connecticut, permitting up to 21 customers to reach one another by means of a manually operated central switchboard. The manual switchboard was quickly extended from 21 lines to hundreds of lines. Each line was terminated on the switchboard in a socket (called a jack), and a number of short, flexible circuits (called cords) with a plug on both ends of each cord were also provided. Two lines could thus be interconnected by inserting the two ends of a cord in the appropriate jacks. (Read more)
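The jack-and-cord arrangement can be modeled as a small data structure. This is a toy sketch of the idea described above, with illustrative class and method names (not from the source): every line ends in a jack, and a cord joins exactly two jacks at a time.

```python
# Toy model of a manual switchboard: jack -> connected peer (or None).
class Switchboard:
    def __init__(self, lines):
        self.jacks = {line: None for line in lines}

    def connect(self, a, b):
        """The operator plugs the two ends of one cord into jacks a and b."""
        if self.jacks[a] is None and self.jacks[b] is None:
            self.jacks[a], self.jacks[b] = b, a
            return True
        return False  # one of the lines is already engaged

    def disconnect(self, a):
        """Pulling the cord frees both jacks at once."""
        peer = self.jacks[a]
        if peer is not None:
            self.jacks[a] = self.jacks[peer] = None

board = Switchboard(range(1, 22))  # the 1878 New Haven exchange: 21 lines
board.connect(3, 17)
print(board.jacks[3])  # 17
```

The model also shows why exchanges beat direct wiring: 21 lines need 21 jacks and a handful of cords, whereas wiring every pair directly would need 21 × 20 / 2 = 210 separate circuits.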
The idea of automatic switching appeared as early as 1879, and the first fully automatic switch to achieve commercial success was invented in 1889 by Almon B. Strowger, the owner of an undertaking business in Kansas City, Missouri.
Convinced that the local telephone operator, who was the wife of a competitor, was diverting his calls, Strowger set out to create a means of bypassing the operator, and in 1891 he patented “the automatic telephone exchange,” a system that would remain in use throughout the world into the 1970s. (Read more)
To learn about electronic and digital switching, see next page.
The invention of various kinds of machines was attempted in the 19th century. Most were large and cumbersome, some resembling pianos in size and shape. All were much slower to use than handwriting. Finally, in 1867, the American inventor Christopher Latham Sholes read an article in the journal Scientific American describing a new British-invented machine and was inspired to construct what became the first practical typewriter. His second model, patented on June 23, 1868, wrote at a speed far exceeding that of a pen. It was a crude machine, but Sholes added many improvements in the next few years, and in 1873 he signed a contract with E. Remington and Sons, gunsmiths, of Ilion, New York, for manufacture. The first typewriters were placed on the market in 1874, and the machine was soon renamed the Remington. Among its original features that were still standard in machines built a century later were the cylinder, with its line-spacing and carriage-return mechanism; the escapement, which causes the letter spacing by carriage movement; the arrangement of the typebars so as to strike the paper at a common center; the actuation of the typebars by means of key levers and connecting wires; printing through an inked ribbon; and the positions of the different characters on the keyboard, which conform almost exactly to the arrangement that is now universal. Mark Twain purchased a Remington and became the first author to submit a typewritten book manuscript. (Read more)
To learn more about one of the longest lasting technologies, see next page.
In 1877, Thomas Edison and his assistants attached a needle to the diaphragm of a telephone receiver with the idea that the needle could be used to etch an impression of sound onto quickly moving paper, thus creating a recording or sound writing.
Edison understood that sound is the vibration of particles across a medium, such as air, in waves. He developed a way to imprint or record the waves so that they could be played back or turned back into sound using a second needle.
He eventually designed a device he called the phonograph that had a brass cylinder wrapped in tinfoil, which rotated and moved lengthwise when turned by a hand crank. On one side was a diaphragm, or very thin membrane, connected to a needle. When sound waves were forced into the receiving end, they caused the membrane to vibrate and the needle to etch a groove into the foil as the cylinder was turned by the crank, thus recording sound. A second needle and an amplifier were on the other side. When the cylinder was set back to the beginning and the needle placed in the groove, the original sound was reproduced as the vibrations were amplified. (Read more)
To learn how Edison’s phonograph worked, click here.
During the early 1880s a contest developed between Thomas A. Edison and the Volta Laboratory team of Chichester A. Bell and Charles Sumner Tainter. The objective was to transform Edison’s 1877 tinfoil phonograph, or talking machine, into an instrument capable of taking its place alongside the typewriter as a business correspondence device. This involved not only building a better machine, but finding a substance to replace the foil as the recording medium. By the beginning of 1887 both sides had announced the invention of a machine using a wax cylinder that would be incised vertically to match the sound vibrations. The same machine that was used to make the recording would, as with the tinfoil machine, be used for playback. Edison, as he did earlier, termed his wax cylinder apparatus a phonograph; Bell and Tainter named their apparatus a graphophone. Business people preferred the former, but neither machine was much of a success. Since the phonograph did not succeed as a dictating device, Edison’s company began to market pre-recorded wax cylinders of popular music that could be played on the phonograph in the office or home or even on coin-in-slot machines in arcades, saloons, and elsewhere. By the early 1890s a rudimentary recording industry was underway. Meanwhile, Bell and Tainter made considerable improvements to their graphophone, and they, too, entered the entertainment field. Both sides had applied for a patent on the vertical cutting, or incising, of sound vibrations into a wax cylinder. Both sides made recordings with the result that a phonograph cylinder could be played on the graphophone and vice-versa. (Read more)
Meanwhile Emile Berliner in Washington, D.C., began to take a great interest in the future of sound recording and reproduction.
To read about Emile Berliner’s new invention, the gramophone, see next page.
The gramophone’s success in playing recorded sounds depended on the ability to mass-produce records.
The process of making records has its roots in Thomas Alva Edison’s phonograph. First, a master recording is made, usually in a studio where engineers perfect the recorded sound. Then an object called a lacquer is placed on a record-cutting machine, and as it rotates, electric signals from the master recording travel to a cutting head, which holds a stylus, or needle. The needle etches a groove in the lacquer that spirals to the center of the circular disc. The imprinted lacquer is then sent to a production company.
There, the lacquer is coated in a metal, such as silver or nickel, to produce a metal master. When the metal master is separated from the lacquer, the resulting disc has ridges instead of grooves. The metal master is then used to create a metal record, also called the mother, which is then used to form the stamper. Stampers are just negative versions of the original recording that will be used to make the actual vinyl records.
Next, the stamper is placed in a hydraulic press, and vinyl is sandwiched in between. Steam from the press softens the plastic as the stampers push an impression of the master recording onto it. Finally, the disc is stiffened using cool water. (Read more)
The micro-groove, or long-playing, record technology built on advances that had been available, but unused, since the 1930s. By this point the big record companies realized that creating a longer-playing record would require a tougher medium that could hold a smaller groove. They would also require more sensitive electromagnetic pick-ups that would not rapidly wear down the records.
The type of recording medium was the first crucial aspect. Earlier attempts to produce a long-playing record had failed because slower turntable speeds on shellac records resulted in more “rumble” and other undesirable frequency noise. Record companies had tried and failed to add more grooves to records to increase their playing time, but the shellac surface could not effectively hold all of the sound information of the smaller grooves. One of the major advances of the 1930s was the introduction of vinyl resins. These records consisted of a mixture of vinyl chloride and vinyl acetate, known as Vinylite, which was harder and finer than shellac and allowed the discs to be cut with 224 to 226 grooves per inch, a massive improvement over the former 80 to 100 grooves per inch. Decreasing the width and increasing the number of grooves was only one way to make records play longer. The other was to slow down the rate of revolution of the turntable. Throughout the 1930s many companies tried one or both of these techniques to create a longer-playing record, but Columbia was the first to successfully combine both methods. (Read more)
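A back-of-the-envelope calculation shows why finer grooves plus slower rotation lengthen playing time. Only the grooves-per-inch figures come from the account above; the usable groove-band width and the 78 rpm speed in this sketch are illustrative assumptions, not figures from the source.

```python
# The stylus crosses one groove spacing per revolution, so revolutions
# per side = grooves per inch x usable band width in inches.
def minutes_per_side(grooves_per_inch, band_width_inches, rpm):
    revolutions = grooves_per_inch * band_width_inches
    return revolutions / rpm

coarse = minutes_per_side(100, 1.75, 78)       # shellac-era disc
fine = minutes_per_side(225, 1.75, 100 / 3)    # microgroove at 33 1/3 rpm
print(round(coarse, 1), round(fine, 1))  # 2.2 11.8
```

Even with the same band width, the combination of roughly doubled groove density and less than half the rotation speed multiplies the playing time about fivefold.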
Once the record is ready to be played, it will need a proper machine to bring its sounds to life. To learn how records are played, see next page.
Magnetic recording is a method of preserving sounds, pictures, and data in the form of electrical signals through the selective magnetization of portions of a magnetic material. The principle of magnetic recording was first demonstrated by the Danish engineer Valdemar Poulsen in 1900, when he introduced a machine called the telegraphone that recorded speech magnetically on steel wire. (Read more)
Poulsen recorded his voice by feeding a telephone microphone signal to an electromagnet that he moved along a steel piano wire. In 1899 he filed a patent and founded a company to build the telegraphone, a pioneering telephone answering machine. A simple version stored 2 minutes of audio on 130-mm (5-inch) diameter steel disks. A recording medium of steel wire wound around a cylinder held up to 30 minutes of audio. Poulsen’s associate Peder Oluf Pedersen (1874–1941) patented electroplating disks with different magnetizable materials. The telegraphone received a Grand Prix at the 1900 Paris World Exhibition, where it recorded Emperor Franz Josef of Austria. (Read more)
In the years following Poulsen’s invention, devices using a wide variety of magnetic recording mediums have been developed by researchers in Germany, Great Britain, and the United States. Principal among them are magnetic tape and disk recorders, which are used not only to reproduce audio and video signals but also to store computer data and measurements from instruments employed in scientific and medical research. Other significant magnetic recording devices include magnetic drum, core, and bubble units designed specifically to provide auxiliary data storage for computer systems. (Read more)
Wire recording or magnetic wire recording is an analog type of audio storage in which a magnetic recording is made on thin steel or stainless steel wire.
To read about the magnetic wire recorder, see next page.
Although there were many developments in the mechanisms and the materials used, the basic operating principles of mechanical calculators and adding machines did not change much from the late 19th century until their obsolescence in the 1970s.
The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician and philosopher Blaise Pascal between 1642 and 1644. It could only add and subtract, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector. He built 50 of them over the next 10 years.
In 1671 the German mathematician-philosopher Gottfried Wilhelm von Leibniz designed a calculating machine called the Step Reckoner. The Step Reckoner expanded on Pascal’s ideas and did multiplication by repeated addition and shifting.
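The principle of multiplication by repeated addition and shifting is easy to demonstrate. The sketch below is a modern illustration of the idea, not a model of the Step Reckoner's gearing: the multiplicand is added once per unit of each digit of the multiplier, shifted one decimal place per digit position.

```python
# Multiplication by repeated addition and shifting, digit by digit.
def multiply(a, b):
    total = 0
    shift = 0
    while b > 0:
        digit = b % 10                # lowest remaining digit of b
        for _ in range(digit):        # repeated addition...
            total += a * 10 ** shift  # ...of a, shifted by the digit's place
        b //= 10
        shift += 1                    # move to the next decimal place
    return total

print(multiply(127, 46))  # 5842
```

For 127 × 46 the machine-style procedure adds 127 six times, then adds 1270 four times, which is exactly what a human does in long multiplication.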
With the Industrial Revolution of the 18th century came a widespread need to perform repetitive operations efficiently. In 1820 Charles Xavier Thomas de Colmar of France built his Arithmometer, the first commercial mass-produced calculating device. It could perform addition, subtraction, multiplication, and even division. Based on Leibniz’s technology, it was extremely popular and sold for 90 years. The Arithmometer was large enough to cover a desktop.
The first mechanical calculating machines were not very reliable. But they became the basis for the highly successful mechanical calculators built throughout the 19th and 20th centuries, when more accurate gears and wheels became available.
To learn how pre-computer calculating machines worked, see next page.
The Macintosh, or Mac, is a series of several lines of personal computers manufactured by Apple Inc. The first Macintosh was introduced on January 24, 1984, by Steve Jobs, and it was the first commercially successful personal computer to feature two already known, but still unpopular, features: the mouse and the graphical user interface, in place of the command-line interface of its predecessors. (Read more)
Mac was a truly personal computer, ideal for the home as well as professional and educational environments. It was remarkable in that it had:
A Graphical User Interface (GUI) that even kids could figure out.
A mouse- and menu-based operating system that did not require memorizing commands or typing arcane incantations on the keyboard.
A bit-mapped screen that allowed drawing any shape or picture, not just characters. (Read more)
“Insanely great”: Steve Jobs could hardly put into words his enthusiasm for the launch of the Macintosh. At the legendary annual general meeting of January 24, 1984, in the Flint Center not far from the Apple Campus in Cupertino, the Apple co-founder first quoted Bob Dylan’s “The Times They Are A-Changin’” and then polemicized against IBM’s imminent dominance of the young computer industry. (Read more)
To read about the beginning of Macintosh computers, see next page.