TEXT I. FROM CALCULI TO THE MODERN COMPUTER
Although the first modern automatic computers began to work in 1944, the story of the ideas, devices, and machines that went into the automatic computer goes back a long time into the past. Problems of calculating with numbers, and of recording numbers, have pressed upon human beings for more than five thousand years.
Probably the first idea for dealing with numbers was the idea of using small objects, such as pebbles, seeds, or shells, to count with, supplementing the fingers.
People, however, find it troublesome to count only in units - it takes too much time and effort. So, early on, a second idea appeared: the idea of composing a new unit equal to ten of the old units. The source of this idea is clearly the fact that a man has ten fingers; with this idea you could designate 87 by referring to all the fingers of 8 men, and then 7 more fingers on one more man.
In order to deal with numbers in their physical form of counted objects, a third idea appears: a specialized, convenient place upon which to lay out the counted objects. Such a place may be a smooth piece of ground, a slab of stone, or a board.
It becomes convenient to mark off areas on the slab according to the size of unit you are dealing with - one area for ordinary units, one area for tens, one area for hundreds, and so on. These developments gave birth to the abacus, the first computing machine. This device consisted of a slab divided into areas and a supply of small stones used as counters to keep track of numbers. The Greek word for slab was abax, and the Latin word for the small stones was calculi; so the abacus, the first computing machine, consisted originally of a slab and counting stones, and later of a frame of rods strung with beads, for keeping track of numbers while calculating.
The system of numbering and the abacus go hand in hand. The abacus is still the most widely used computing machine in the world.
Then appeared the Arabic positional notation for numerals, which reached Western Europe in the 1200's. Just as the small counting stones or calculi could be used in any area on the slab, so the digits 1, 2, 3, 4, 5, 6, 7, 8, 9 could be used in any position of a numeral. Just as the position on the slab answered the question as to whether units, tens, hundreds, etc., were being counted, so the place or column or position of the digit (as in 4786 with its four places) answered the question as to what kinds of units were being counted. And - this was the final key idea - just as a place on the slab could be empty, so the digit 0 could mark "none" in a place or column of a numeral.
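The place-value rule described here can be written out directly. The short Python sketch below (the helper name place_values is ours, chosen for illustration) decomposes the text's example 4786 into the value each place contributes, and shows how the digit 0 holds a place open just as an empty area on the slab did.

```python
# A minimal sketch of positional notation: each digit's value depends on
# its place, and 0 marks an empty place.
def place_values(numeral: str) -> list[int]:
    """Return the value contributed by each digit of a decimal numeral."""
    n = len(numeral)
    return [int(d) * 10 ** (n - 1 - i) for i, d in enumerate(numeral)]

# 4786 = 4*1000 + 7*100 + 8*10 + 6*1, as in the text's four-place example.
print(place_values("4786"))       # [4000, 700, 80, 6]
print(sum(place_values("4786")))  # 4786
# The zero keeps the hundreds place open, like an empty area on the slab:
print(place_values("4086"))       # [4000, 0, 80, 6]
```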
That idea, by the way, required centuries to develop. The Romans did not have a numeral for zero, but about 300 B.C. in Babylon a symbol for zero was used. Then the Hindus developed the numerical notation that we call Arabic. About 800 A.D. the Arabs used the word "sifr", meaning "vacant", for zero. About 1200 A.D. the Arabic word was translated into Latin, giving rise subsequently to the two English words "cipher" and "zero".
The first machine which would add numbers mechanically was invented by the French mathematician and philosopher Blaise Pascal in 1642. It contained geared counter wheels, each of which could be set at any one of ten positions from 0 to 9. Each wheel had a little tooth for nudging the next counter wheel when it passed from 9 to 0, so as to carry 1 into the next column.
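The carry mechanism can be modelled in a few lines. This is a minimal sketch, assuming a simple decimal counter; the class name Pascaline and the method names are our own invention, not historical terms.

```python
# A sketch of Pascal's carry mechanism: each wheel counts 0-9, and when a
# wheel passes from 9 back to 0 its tooth nudges the next wheel up by 1.
class Pascaline:
    def __init__(self, wheels: int = 6):
        self.wheels = [0] * wheels  # wheels[0] is the units column

    def add_one(self, column: int = 0) -> None:
        """Advance one wheel by a single tooth, carrying as needed."""
        self.wheels[column] = (self.wheels[column] + 1) % 10
        if self.wheels[column] == 0 and column + 1 < len(self.wheels):
            self.add_one(column + 1)  # the carry tooth nudges the next wheel

    def value(self) -> int:
        return sum(d * 10 ** i for i, d in enumerate(self.wheels))

m = Pascaline()
for _ in range(1999):
    m.add_one()
print(m.value())  # 1999; three carries ripple when 999 becomes 1000
```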
Some 30 years later, in 1673, another mathematician, G. W. Leibnitz, invented a device which would automatically control the amount of adding to be performed for a given digit, and in this way he invented the first multiplying machine.
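The idea behind Leibnitz's machine - multiplication as adding that is repeated automatically, with the number of repetitions controlled by each digit of the multiplier - can be sketched as follows. This says nothing about the actual gearing; the function name is illustrative.

```python
# Multiplication as digit-controlled repeated addition, in the spirit of
# Leibnitz's multiplying machine.
def multiply_by_repeated_addition(multiplicand: int, multiplier: int) -> int:
    total = 0
    shift = 1  # place value of the current multiplier digit
    while multiplier > 0:
        digit = multiplier % 10
        for _ in range(digit):             # add the multiplicand "digit" times,
            total += multiplicand * shift  # shifted to the digit's column
        multiplier //= 10
        shift *= 10
    return total

print(multiply_by_repeated_addition(47, 86))  # 4042 == 47 * 86
```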
Pascal's and Leibnitz's machines and their improved successors have given rise to electric-powered but hand-operated adding machines and desk calculating machines which are found throughout offices today.
The idea of an automatic machine which would not only add, subtract, multiply, and divide but also perform a sequence of steps automatically was probably first conceived in 1812 by Charles Babbage, a professor of mathematics at Cambridge University, England. Babbage intended that his machine should compute the values of tabulated mathematical functions and print out the results. No attention would be needed from the human operator once the starting data and the method of computation had been set into the machine.
The construction of this machine was begun with aid from the government. For 20 years, however, little progress was achieved. In 1833 Babbage changed his plans in favour of another computing machine, which he called the analytical engine. This was to consist of three parts: (1) the "store", where numbers were to be stored or remembered; (2) the "mill", where arithmetical operations were to be performed on numbers taken from the store; and (3) the "sequence mechanism", which would select the proper numbers from the store and instruct the mill to perform the proper operation.
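Read this way, the three parts correspond roughly to the memory, arithmetic unit, and control unit of a modern computer. The toy model below follows that reading; the instruction format (op, src1, src2, dest) is an assumption made for illustration, since Babbage's actual engine was to be directed by punched cards.

```python
# A toy model of Babbage's three parts: the "store" holds numbers, the
# "mill" performs arithmetic, and the "sequence mechanism" steps through
# instructions that pick numbers from the store and name an operation.
def run(program, store):
    mill = {"add": lambda a, b: a + b,
            "sub": lambda a, b: a - b,
            "mul": lambda a, b: a * b}
    for op, src1, src2, dest in program:   # the sequence mechanism
        store[dest] = mill[op](store[src1], store[src2])
    return store

store = {0: 7, 1: 5, 2: 0, 3: 0}
program = [("add", 0, 1, 2),   # store[2] = 7 + 5 = 12
           ("mul", 2, 0, 3)]   # store[3] = 12 * 7 = 84
print(run(program, store))     # {0: 7, 1: 5, 2: 12, 3: 84}
```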
Once Augusta Ada Byron, Countess of Lovelace and daughter of the great English poet Lord Byron, saw that computing machine. A brilliant mathematician, she was the first to fully appreciate the idea behind Babbage's automatic computer. She later wrote to him that she was greatly impressed by his invention. They worked together for some years, probably from about 1840, perhaps later, until about 1850. (She died in 1852, when she was only 36.) Nowadays she is considered to be the first programmer in the world.
But neither the first nor the second of Babbage's machines was completely constructed, although small parts of them were. Both Babbage and his son, who also tried to carry out his father's ideas, died without seeing the result of their work. The failure to construct those machines was due to the absence of sufficiently accurate machine tools and of the mechanical and electrical devices that finally became available around 1900-1910.
Another historical step in the development of automatic machines came about 1886, when Dr. Herman Hollerith decided to experiment with cards with punched holes and with electrical devices to detect and count the holes. He realized that cards bearing human language were not readable by a machine, but cards could be prepared using a machine language - a language of punched holes.
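The point about "a language of punched holes" can be made concrete: a hole either is or is not present at a given position, so a card column can encode a digit by where its hole sits, and a machine only needs to sense holes, not read human writing. The sketch below uses an invented 10-row column layout; later standard punched cards actually had 12 rows.

```python
# A sketch of a punched-card column: rows 0-9, a hole in row d encodes
# digit d. The 10-row layout here is a simplification for illustration.
def punch(digit: int) -> list[bool]:
    """Return a 10-row column with one hole at the digit's row."""
    return [row == digit for row in range(10)]

def detect(column: list[bool]) -> int:
    """An electrical contact closes where there is a hole."""
    return column.index(True)

card = [punch(d) for d in (1, 8, 8, 6)]  # the year 1886, one digit per column
print([detect(col) for col in card])     # [1, 8, 8, 6]
```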
Hollerith's experiments and machines were successful and have led to a great development of machines using punched cards for business, accounting, and statistical purposes. These punched-card calculating machines have become the basis of business calculations and reports all over the world.
The first automatic digital computer that worked was a machine called the Complex Computer, constructed in 1939. Dr. George R. Stibitz, an engineer, noticed around him a great deal of troublesome arithmetic: multiplying and dividing complex numbers, the numbers which electrical engineers find necessary for analyzing alternating-current circuits. Every multiplication of two complex numbers requires four multiplications and two additions of ordinary numbers. Every division of one complex number by another requires six multiplications, two additions, one subtraction, and two divisions of ordinary numbers. The sequence of the operations with the ordinary numbers is always monotonously the same.
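These operation counts follow directly from the textbook formulas for complex arithmetic, as the sketch below shows (the text counts the subtraction in multiplication together with the additions).

```python
# Counting the ordinary-number operations in complex arithmetic, following
# (a+bi)(c+di) = (ac-bd) + (ad+bc)i  and
# (a+bi)/(c+di) = ((ac+bd) + (bc-ad)i) / (c*c + d*d).
def complex_multiply(a, b, c, d):
    # 4 multiplications and 2 additions (one of them a subtraction):
    return (a * c - b * d, a * d + b * c)

def complex_divide(a, b, c, d):
    # 6 multiplications, 2 additions, 1 subtraction, 2 divisions:
    denom = c * c + d * d           # 2 mult, 1 add
    real = (a * c + b * d) / denom  # 2 mult, 1 add, 1 div
    imag = (b * c - a * d) / denom  # 2 mult, 1 sub, 1 div
    return (real, imag)

print(complex_multiply(3, 2, 1, 4))  # (-5, 14) == (3+2i)(1+4i)
print(complex_divide(3, 2, 1, 4))    # (11/17, -10/17), i.e. (0.647..., -0.588...)
```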
Stibitz decided that ordinary telephone relays could be wired together to do this annoying task. So he represented each decimal digit by a code of 1's and 0's, so that four relays, energized or not energized, could express the code and designate each digit. The machine was completed in 1940 and demonstrated.
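Four relays, each on or off, give 2^4 = 16 patterns, which is enough for the ten decimal digits. The sketch below shows plain binary-coded decimal as the simplest such code; Stibitz's actual machine is usually described as using a different four-bit code (excess-3), so treat the particular encoding here as an assumption.

```python
# A sketch of a four-relay code for decimal digits, using plain
# binary-coded decimal (BCD). Four on/off relays suffice: 2**4 = 16 >= 10.
def relays_for_digit(d: int) -> tuple[int, int, int, int]:
    """Four relay states (1 = energized) encoding one decimal digit."""
    return tuple((d >> bit) & 1 for bit in (3, 2, 1, 0))

for d in range(10):
    print(d, relays_for_digit(d))
# 0 (0, 0, 0, 0)
# 1 (0, 0, 0, 1)
# ...
# 9 (1, 0, 0, 1)
```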
In 1944 the first general-purpose automatic digital computer was built. It ran 24 hours a day, seven days a week, and continued to operate for many years. This machine was the first working realization of Charles Babbage's analytical engine, and it quickly led to more automatic digital computers with numerous improvements.
In 1946 an automatic electronic digital computer was built. Instead of relays, this machine used standard radio tubes and parts and aimed at high speed. It was ENIAC (Electronic Numerical Integrator and Calculator). It contained 20 registers in which numbers of 10 decimal digits could be stored or accumulated, and it could add numbers at the rate of 5,000 additions per second. It also contained a multiplier which could carry out from 360 to 500 multiplications per second, a "divider-square-rooter", and other units.
Since 1952 the addition speed of computers has risen to more than 100,000 additions per second, and the multiplication speed to more than 10,000 per second. The amount of storage capacity, or memory, has grown from 72 storage registers to millions of registers. The reliability of automatic computers has increased to the point where between one billion and ten billion operations take place between errors. Besides, automatic checking has been built into computers so that no wrong results are allowed out.
The description of the history of the invention and construction of computers and data processors is only part of the story. What caused this development?
There have been two trends in the causes for this development. One is the growth of scientific and engineering knowledge. Take, for example, astronomy. Isaac Newton and Albert Einstein expressed general laws for the behaviour of heavenly bodies. But the actual calculations for knowing where to look in the sky to see any particular heavenly body at any particular time have to be carried out numerically. Furthermore, the laws were general and in simple form, ignoring many complexities; the actual calculations for particular heavenly bodies were specific and had to take into account many uncomfortable details. Take, for example, calculating the orbit of the moon: the bulge of the earth at the equator, where the earth is wider than it is at the poles, has an effect on the orbit of the moon, and this has to be calculated in order to predict to the minute and second where the moon will be at any particular time. Such calculations are laborious. Similar laborious calculations occur in electrical engineering, in physics, in chemistry, in nucleonics, and elsewhere.
The other main trend is from the world of business. Here enormous quantities of records and calculations are required in order that business may function.
The growth of a great civilization has produced an enormous growth in the information to be handled and operated on. This provides the push, the energy, for the development of electronic computers.
Notes
1. abacus - a counting frame
2. to keep track of numbers - to keep a record of numbers
3. a frame of rods strung with beads - a frame of rods with beads threaded on them
4. while calculating - during calculation
5. positional notation for numerals - a positional system of writing numbers
6. Just as ..., so ... - in exactly the same way as ..., so too ...
7. as to whether units, ... were being counted - concerning whether units, ... were being counted
8. empty - vacant, unoccupied
9. about 300 B.C. - around the year 300 before our era
10. A.D. - of our era (Anno Domini)
11. cipher - a code, coded writing
12. geared counter wheels - toothed counting wheels
13. for nudging - for pushing along
14. No attention would be needed - no attention would be required
15. Countess - a noblewoman, the feminine counterpart of a count
16. troublesome - annoying, burdensome
17. for analyzing alternating electrical circuits - for the analysis of alternating-current circuits
18. laws for the behaviour of heavenly bodies - laws of the behaviour of celestial bodies
19. to take into account - to take into consideration
20. the bulge of the earth at the equator - the swelling of the earth at the equator
21. effect - influence
22. enormous - huge, immense