History of the development of computer technology. Computing in the pre-electronic era

The word "computer" means "calculator", i.e. a computing device. The need to automate data processing, including calculations, arose long ago. More than 1,500 years ago, counting sticks, pebbles and the like were used for counting.

Nowadays it is difficult to imagine doing without computers. But not so long ago, until the early 1970s, computers were available to a very limited circle of specialists, and their use, as a rule, remained shrouded in secrecy and little known to the general public. However, in 1971 an event occurred that radically changed the situation and, with fantastic speed, turned the computer into an everyday working tool for tens of millions of people. In that undoubtedly significant year, the then almost unknown company Intel, from a small American town with the beautiful name of Santa Clara (California), released the first microprocessor. It is to this device that we owe the emergence of a new class of computing systems - personal computers - which are now used by essentially everyone, from primary school students and accountants to scientists and engineers.

By the end of the 20th century it had become impossible to imagine life without a personal computer. The computer has firmly entered our lives and become man's main assistant. Today there are many computers in the world, from different companies and of different complexity, purpose and generation.




History of the development of computer technology.

Before the advent of the computer.

It is believed that the first counting device was invented in ancient China at the end of the second millennium BC. It was an ordinary counting board. The positional principle arose later, in the 3rd century BC, and in this form, with minor changes, the device has survived to our time; it is still used in China today under the name suan-pan. Counting on it proceeded from bottom to top: the addends were laid out at the bottom of the board, and summation was carried out from the highest digits to the lowest. The numbers were laid out with small sticks according to the additive principle, and zero was not indicated in any way - an empty space was simply left in its place.

The Russian abacus appeared at the turn of the 16th-17th centuries. The most common counting tool in pre-Petrine Rus' was "counting with bones" (tokens), carried out on a special board or table that had to be ruled with horizontal lines before calculation. The four arithmetic operations were performed by moving pebbles, fruit stones or special tokens.

In 1642, the French mathematician Blaise Pascal designed the world's first mechanical adding machine, which could add and subtract. Legend has it that in 1709 a certain Venetian, Poleni, built a calculating machine that worked using gears with a variable number of teeth. Having learned that Pascal had made an arithmetic machine much earlier (although its design was different), Poleni smashed his own machine. The first arithmometer, which laid the foundation of the calculating-machine industry, was invented in 1818 by Charles Xavier Thomas (Thomas de Colmar), the head of a Parisian insurance company.

In 1670–1680, the German mathematician Gottfried Leibniz designed a calculating machine that performed all four arithmetic operations.

In 1812, the English mathematician Charles Babbage began work on a "Difference" Engine, which carried out calculations automatically according to a fixed program of actions. By 1822 he had built a small working model operating on 18-digit numbers and calculated a table of squares on it.

In 1833, Babbage began developing the Analytical Engine. Its design included: a device for storing numbers; a device for performing arithmetic operations; a device controlling the sequence of the machine's operations; and devices for entering data and printing the results.

Programs for this machine were recorded on punched cards. The first program developer was Ada Lovelace.

In 1888, to automate the United States census, Herman Hollerith created a tabulator in which information punched on cards was read using electric current. The company Hollerith founded later became part of IBM (the name adopted in 1924).

First generation. 1949-1958

In 1942, the American physicist John Mauchly (1907-1980), after a detailed acquaintance with Atanasoff's project, presented his own design for a computer. Two hundred people took part in the ENIAC (Electronic Numerical Integrator and Computer) project under the leadership of John Mauchly and John Presper Eckert. The computer was built in the spring of 1945 and declassified in February 1946. ENIAC contained 17,468 vacuum tubes of six different types, 7,200 crystal diodes and 4,100 magnetic elements, occupied an area of 300 square meters and was 1,000 times faster than relay computers. The machine remained in service for nine years and was switched on for the last time in 1955.

Simultaneously with the construction of ENIAC, and also in secrecy, a computer was being created in Great Britain. Secrecy was necessary because the device was designed to decipher the codes used by the German armed forces during the Second World War. The mathematical decryption method was developed by a group of mathematicians that included Alan Turing. During 1943, the Colossus machine, using 1,500 vacuum tubes, was built in London. Its developers were M. Newman and T. F. Flowers.

In 1937, Harvard mathematician Howard Aiken proposed a project for a large calculating machine. The work was sponsored by IBM president Thomas Watson, who invested 500 thousand dollars in it. Design of the Mark-1 began in 1939; the computer was built by IBM of New York. It contained about 750 thousand parts, 3,304 relays and more than 800 km of wire.

In 1946, John von Neumann, based on a critical analysis of the ENIAC design, proposed a number of new ideas for organizing computers, including the concept of a stored program, i.e. storing the program in a storage device. As a result of the implementation of von Neumann's ideas, a computer architecture was created, which in many features has survived to the present day.
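The stored-program idea is easier to see in a toy model. The short Python sketch below is purely illustrative (the instruction names and memory layout are invented for this example, not taken from any historical machine): the program and the data it operates on sit in the same memory, and the processor simply fetches and executes instructions one after another.

```python
# A minimal sketch of the stored-program principle: instructions and data
# share one memory, and the processor runs a fetch-decode-execute loop.
# The instruction format used here is an assumption made for this example.

memory = [
    ("LOAD", 10),    # acc <- memory[10]
    ("ADD", 11),     # acc <- acc + memory[11]
    ("STORE", 12),   # memory[12] <- acc
    ("HALT", None),
] + [None] * 6 + [2, 3, 0]   # data cells: memory[10] = 2, memory[11] = 3, memory[12] = result

def run(memory):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, addr = memory[pc]       # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

run(memory)
print(memory[12])   # 5: the program stored in memory computed 2 + 3
```

Because the program is just data in memory, it can be loaded, replaced or even modified like any other data, which is the essence of the von Neumann architecture.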

In 1948, Sergei Aleksandrovich Lebedev (1902-1974) and B.I. Rameev proposed the first project for a domestic digital electronic computer. Under the leadership of Academicians S.A. Lebedev and V.M. Glushkov, domestic computers were developed: first MESM, the Small Electronic Calculating Machine (1951, Kyiv), then BESM, the High-Speed Electronic Calculating Machine (1952, Moscow). In parallel with them, the Strela, Ural, Minsk, Hrazdan and Nairi machines were created.

In 1951, work was completed on the creation of UNIVAC (Universal Automatic Computer). The first example of the machine, the UNIVAC-1, was built for the US Census Bureau. This synchronous, sequential computer was created on the basis of ENIAC and EDVAC. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage of 1,000 twelve-digit decimal numbers was implemented on 100 mercury delay lines. The computer is interesting in that it was aimed at relatively mass production without changes to the architecture, and particular attention was paid to the peripheral part (input-output facilities).

Grace Hopper, US Navy officer and head of a group of programmers, then a captain (later a rear admiral and at the time the only woman of that rank in the Navy), developed the first translator program while working at Remington Rand; she called it a compiler. This program translated an entire program, written in an algebraic form convenient for people, into machine language.

Jay Forrester patented magnetic core memory. Such memory was first used in the Whirlwind-1 machine: it consisted of two cubes of 32x32x17 cores, which provided storage for 2,048 sixteen-bit binary words, each with one parity bit. This machine was also the first to use a universal, non-specialized bus, and two devices served as its input-output system: a Williams cathode-ray tube and a typewriter with punched paper tape (Flexowriter).

In Great Britain, in June 1951, at a conference at the University of Manchester, Maurice Wilkes presented a report on "the best method for constructing an automatic machine," which became a pioneering work on the fundamentals of microprogramming. Trial operation of the domestic BESM-1 computer began. In the USSR, in 1952-1953 A.A. Lyapunov developed the operator method of programming, and in 1953-1954 L.V. Kantorovich developed the concept of large-block programming. IBM released its first industrial computer, the IBM 701 - a synchronous, parallel computer containing 4,000 vacuum tubes and 1,200 germanium diodes.

In 1951, the first domestic computer, MESM, was created under the leadership of S.A. Lebedev; in 1952 the BESM computer was created.

The first serial domestic computer, Strela, was released.

The first experimental transistor computer, the TX-0, was developed at the Massachusetts Institute of Technology (it was put into operation in 1955). The first magnetic tape drive, the IBM 726, appeared; its recording density was 100 characters per inch and its speed 75 inches per second.

Second generation of computers 1959 – 1963

"Tradis" - the first transistor computer from Bell Telephone Laboratories - contained 800 transistors, each of which was enclosed in a separate case 1955

In 1959, the domestic computer Setun, which operated in the ternary number system, was released. In 1956, the IBM 350 RAMAC first introduced disk memory (magnetized aluminum disks with a diameter of 61 cm). In 1958-1959, Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invented the integrated circuit. J. McCarthy and C. Strachey proposed the concept of time-sharing.

Stanford Research Institute researcher Douglas (Doug) Engelbart demonstrated the first computer mouse.

In 1964, IBM announced the creation of six models of the IBM System/360 family, which became the first computers of the third generation. The models had a single instruction set and differed from each other in the amount of RAM and in performance.

Third generation. 1964-1976

In 1965, Digital Equipment Corp. (DEC) released one of the first minicomputers, the PDP-8.

In 1967, under the leadership of S.A. Lebedev and V.M. Melnikov, the high-speed computing machine BESM-6 was created at the Institute of Precision Mechanics and Computer Engineering (ITMiVT). Earlier, in 1956, IBM had developed the first disk memory subsystem, the IBM RAMAC 305, with a capacity of only 5 MB on fifty two-foot platters.

In 1968, the Burroughs company in the USA released the first high-speed computers based on large-scale integrated circuits (LSIs), the B2500 and B3500. In December 1968, an unusual demonstration took place at a computer conference: a video stream broadcast from Palo Alto showed highlights of the work of Douglas Engelbart's group at the Stanford Research Institute (SRI). The cornerstones of the new information era were shown: interactive programming, database sharing, video conferencing, navigation in virtual spaces, and a prototype window interface.

In 1969, IBM separated the concepts of hardware and software: the company began selling software separately from hardware, marking the beginning of the software industry. Under the auspices of the US Advanced Research Projects Agency (ARPA), the development and implementation began of a defense computer network connecting research laboratories in the United States. October 29, 1969 is considered the birthday of this network, the future ARPANET.

Fourth generation. 1977-1985

In 1971, Intel created the first microprocessor: a minimal but complete processor, containing 2,250 transistors, formed on a single chip.

In 1977, Apple Computer (S. Jobs and S. Wozniak) launched the production of personal computers, built around a "friendly" approach to a person's work with the computer.

In 1982, IBM began producing a computer that became the reference model of the personal computer.

IBM released the hardware documentation and software specifications, allowing other firms to develop hardware and software for it.

Computer generations (summary).
First generation (1949-1958): element base - vacuum tubes and relays; speed - 3·10^5 operations per second; RAM - up to 64 KB; typical models - EDSAC, ENIAC, BESM; software - machine codes, autocodes, assemblers; storage media - punched tape.
Second generation (1959-1963): element base - transistors; speed - up to 3·10^6 op/s; RAM - up to 512 KB; typical models - RCA-501, IBM 7090, BESM-6; software - programming languages; storage media - punched cards.
Third generation (1964-1976): element base - integrated circuits (ICs) and large-scale integrated circuits (LSI); speed - up to 3·10^7 op/s; RAM - up to 16 MB; typical models - IBM/360, PDP, ES and SM computers; software - application packages, DBMS, operating systems; storage media - magnetic tape.
Fourth generation (1977-1985): element base - very-large-scale integrated circuits (VLSI); speed - more than 3·10^7 op/s; RAM - more than 16 MB; typical models - IBM PC/XT/AT, PS/2; software - parallel programming systems; storage media - magnetic disks.
Fifth generation (1986-...): element base - VLSI; speed - more than 3·10^8 op/s; RAM - 128 MB and more; typical models - SX-2; software - the Windows platform; storage media - magnetic and optical disks.

Homework: Topic 24, questions on page 380; written exercises Nos. 7 and 8.

Verification work.

1. In what century did the first devices capable of performing arithmetic operations appear? a) the 16th; b) the 17th; c) the 18th; d) the 19th.

2. The world's first programmer was: a) G. Leibniz; b) A. Lovelace; c) B. Pascal; d) S. Lebedev.

4. The abacus is: a) a jukebox; b) a counting frame; c) a device that works according to a given program; d) the first mechanical machine.

5. The first Analytical Engine was invented by: a) C. Babbage; b) W. Schickard; c) J. Jacquard; d) B. Pascal.

Answers: 1 - b; 2 - b; 3 - a; 4 - b; 5 - a. Grading: 5 correct answers - "5"; 4 - "4"; 3 - "3".


Slide 2

Computing in the pre-electronic era. First generation computers. Second generation computers. Third generation computers. Personal computers. Modern supercomputers.

Slide 3

Computing in the pre-electronic era

The need to count objects in humans arose in prehistoric times. The oldest method of counting objects consisted of comparing objects of a certain group (for example, animals) with objects of another group, playing the role of a counting standard. For most peoples, the first such standard was fingers (counting on fingers). The expanding needs for counting forced people to use other counting standards (notches on a stick, knots on a rope, etc.).

Slide 4

Every schoolchild is familiar with counting sticks, which were used as a counting standard in the first grade. In the ancient world, when counting large quantities of objects, a new sign began to be used to indicate a certain number of them (for most peoples - ten), for example, a notch on another stick. The first computing device to use this method was the abacus.

Slide 5

The ancient Greek abacus was a board sprinkled with sea sand. There were grooves in the sand, in which numbers were marked with pebbles. One groove corresponded to units, the next to tens, and so on. If ten pebbles collected in any groove during counting, they were removed and one pebble was added to the next digit. The Romans improved the abacus, moving from sand and pebbles to marble boards with chiseled grooves and marble balls.
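The exchange rule just described (ten pebbles in one groove are replaced by a single pebble in the next groove) is ordinary positional, base-ten counting with a carry. The following Python sketch is only an illustrative model of that rule; the grooves are represented as a simple list of pebble counts.

```python
# Illustrative model of the Greek abacus described above: each groove holds
# the pebbles for one decimal digit, and ten pebbles in a groove are
# exchanged for one pebble in the next (higher) groove.

def add_units(grooves, count):
    """Add `count` single pebbles to the units groove, carrying as needed."""
    grooves = list(grooves)          # grooves[0] = units, grooves[1] = tens, ...
    grooves[0] += count
    for i in range(len(grooves)):
        while grooves[i] >= 10:      # ten pebbles collected in this groove:
            grooves[i] -= 10         # remove them...
            if i + 1 == len(grooves):
                grooves.append(0)    # ...opening a new groove if necessary...
            grooves[i + 1] += 1      # ...and add one pebble to the next digit
    return grooves

# 27 laid out on the abacus (7 units, 2 tens) plus 15 more units gives 42.
print(add_units([7, 2], 15))   # [2, 4], i.e. 2 units and 4 tens
```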

Slide 6

As economic activity and social relations became more complex (monetary payments, problems of measuring distances, time, areas, etc.), the need for arithmetic calculations arose. To perform the simplest arithmetic operations (addition and subtraction), people began to use the abacus and, centuries later, the Russian abacus (schoty).

Slide 7

The development of science and technology required increasingly complex mathematical calculations, and in the 19th century mechanical calculating machines - arithmometers - were invented. Arithmometers could not only add, subtract, multiply and divide numbers, but also remember intermediate results, print calculation results, and so on.

Slide 8

In the middle of the 19th century, the English mathematician Charles Babbage put forward the idea of creating a program-controlled calculating machine that had an arithmetic unit, a control unit, as well as input and printing devices.

Slide 9

Babbage's Analytical Engine (the prototype of modern computers) was built by enthusiasts from the London Science Museum based on surviving descriptions and drawings. The analytical machine consists of four thousand steel parts and weighs three tons.

Slide 10

The calculations were carried out by the Analytical Engine in accordance with the instructions (programs) developed by Lady Ada Lovelace (daughter of the English poet George Byron). Countess Lovelace is considered the first computer programmer, and the ADA programming language is named after her.

Slide 11

Programs were recorded on punched cards by punching holes in thick paper cards in a certain order. The punched cards were then placed into the Analytical Engine, which read the location of the holes and performed computational operations in accordance with a given program.

Slide 12

Development of electronic computer technology. First generation computers

In the 40s of the 20th century, work began on the creation of the first electronic computers, in which vacuum tubes replaced mechanical parts. First-generation computers required large halls for their placement, since they used tens of thousands of vacuum tubes. Such computers were created in single copies, were very expensive and were installed in the largest research centers.

Slide 13

First generation computer

In 1945, ENIAC (Electronic Numerical Integrator and Computer - electronic numerical integrator and calculator) was built in the USA, and in 1950 MESM (Small Electronic Computing Machine) was created in the USSR.

Slide 14

First-generation computers could perform calculations at a speed of several thousand operations per second; the sequence of operations was specified by programs. Programs were written in machine language, whose alphabet consisted of two characters: 1 and 0. They were entered into the computer using punched cards or punched tape: the presence of a hole on a punched card corresponded to the character 1, and its absence to the character 0. The results of calculations were output by printing devices in the form of long sequences of zeros and ones. Only qualified programmers who understood the language of the first computers could write programs in machine language and decipher the results of calculations.

Slide 15

Second generation computer

In the 1960s, second-generation computers were created on a new element base - transistors, which were tens and hundreds of times smaller and lighter than vacuum tubes, more reliable, and consumed significantly less electrical power. Such computers were produced in small series and installed in large research centers and leading higher educational institutions.

Slide 16

In the USSR, in 1967, the most powerful second-generation computer in Europe, BESM-6 (Big Electronic Calculating Machine), which could perform 1 million operations per second, came into operation.

Slide 17

BESM-6 used 260 thousand transistors, external memory devices on magnetic tape for storing programs and data, and alphanumeric printing devices for outputting calculation results. The work of programmers was significantly simplified, since programs could now be written in high-level programming languages (Algol, BASIC, etc.).

Slide 18

Third generation computer

From the 1970s, integrated circuits began to be used as the element base of third-generation computers. An integrated circuit (a small semiconductor wafer) can contain thousands of densely packed transistors, each about the size of a human hair.

Slide 19

Computers based on integrated circuits have become much more compact, fast and cheaper. Such mini-computers were produced in large series and were available to most scientific institutes and higher educational institutions.

Slide 20

Personal computers

The development of high technology has led to the creation of large-scale integrated circuits (LSIs) containing tens of thousands of transistors. This made it possible to begin producing compact personal computers affordable to the general public.

Slide 21

The first personal computer was the Apple II (the "grandfather" of modern Macintosh computers), created in 1977. In 1982, IBM began manufacturing IBM PC personal computers (the "grandfathers" of modern IBM-compatible computers).

Slide 22

Modern personal computers are compact and thousands of times faster than the first personal computers (they can perform several billion operations per second). Every year, almost 200 million computers affordable to the mass consumer are produced around the world. Personal computers come in various designs: desktop, portable (laptops) and pocket (palmtops).

Slide 24

Literature used and image links

Ugrinovich N.D. Computer Science and ICT. Basic level: textbook for grade 11. 3rd ed. Moscow: BINOM. Knowledge Laboratory, 2009. Image links: http://www.radikal.ru/users/al-tam/istorija-razvitija-vychtehniki


Lesson topic: History of the development of computer technology. Lesson objectives:

  • Get acquainted with the main stages of the development of computer technology.
  • Study the history of the development of domestic and foreign computer technology.
The main stages of the development of computer technology
  • 1. Computing in the pre-electronic era.
  • 2. First generation computer.
  • 3. Second generation computer.
  • 4. Third generation computer.
  • 5. Personal computers.
  • 6. Modern supercomputers.
  • The need to count objects in humans arose in prehistoric times. The oldest method of counting objects was to compare objects of a certain group (for example, animals) with objects of another group, playing the role of a counting standard. For most peoples, the first such standard was fingers (counting on fingers).
  • The expanding needs for counting forced people to use other counting standards (notches on a stick, knots on a rope, etc.).
Computing in the pre-electronic era
  • Every schoolchild is familiar with counting sticks, which were used as a counting standard in the first grade.
  • In the ancient world, when counting large quantities of objects, a new sign began to be used to indicate a certain number of them (for most peoples - ten), for example, a notch on another stick. The first computing device to use this method was the abacus.
Computing in the pre-electronic era
  • The ancient Greek abacus was a board sprinkled with sea sand. There were grooves in the sand, in which numbers were marked with pebbles. One groove corresponded to units, the next to tens, and so on. If ten pebbles collected in any groove during counting, they were removed and one pebble was added to the next digit. The Romans improved the abacus, moving from sand and pebbles to marble boards with chiseled grooves and marble balls.
  • Abacus
Computing in the pre-electronic era
  • As economic activities and social relations became more complex (monetary payments, problems of measuring distances, time, areas, etc.), the need for arithmetic calculations arose.
  • To perform the simplest arithmetic operations (addition and subtraction), people began to use the abacus and, centuries later, the Russian abacus (schoty).
  • In Russia, the abacus appeared in the 16th century.
Computing in the pre-electronic era
  • The development of science and technology required increasingly complex mathematical calculations, and in the 19th century mechanical calculating machines - arithmometers - were invented. Arithmometers could not only add, subtract, multiply and divide numbers, but also remember intermediate results, print calculation results, and so on.
  • Adding machine
Computing in the pre-electronic era
  • In the middle of the 19th century, the English mathematician Charles Babbage put forward the idea of creating a program-controlled calculating machine that had an arithmetic unit, a control unit, as well as input and printing devices.
  • Charles Babbage
  • 26.12.1791 - 18.10.1871
Computing in the pre-electronic era
  • Babbage's Analytical Engine (the prototype of modern computers) was built by enthusiasts from the London Science Museum based on surviving descriptions and drawings. The analytical machine consists of four thousand steel parts and weighs three tons.
  • Babbage's Analytical Engine
Computing in the pre-electronic era
  • The calculations were carried out by the Analytical Engine in accordance with the instructions (programs) developed by Lady Ada Lovelace (daughter of the English poet George Byron).
  • Countess Lovelace is considered the first computer programmer, and the ADA programming language is named after her.
  • Ada Lovelace
  • 10.12.1815 - 27.11.1852
Computing in the pre-electronic era
  • Programs were recorded on punched cards by punching holes in thick paper cards in a certain order. The punched cards were then placed into the Analytical Engine, which read the location of the holes and performed computational operations in accordance with a given program.
First generation computer
  • In the 40s of the 20th century, work began on the creation of the first electronic computers, in which vacuum tubes replaced mechanical parts. First-generation computers required large halls for their placement, since they used tens of thousands of vacuum tubes. Such computers were created in single copies, were very expensive and were installed in the largest research centers.
First generation computer
  • In 1945, ENIAC (Electronic Numerical Integrator and Computer - electronic numerical integrator and calculator) was built in the USA, and in 1950 MESM (Small Electronic Computing Machine) was created in the USSR.
  • ENIAC
  • MESM
First generation computer
  • First-generation computers could perform calculations at a speed of several thousand operations per second, the execution sequence of which was specified by programs. Programs were written in machine language, the alphabet of which consisted of two characters: 1 and 0. Programs were entered into the computer using punched cards or punched tapes, and the presence of a hole on the punched card corresponded to the 1 sign, and its absence – to the 0 sign.
  • The results of calculations were output by printing devices in the form of long sequences of zeros and ones. Only qualified programmers who understood the language of the first computers could write programs in machine language and decipher the results of calculations.
Second generation computer
  • In the 1960s, second-generation computers were created on a new element base - transistors, which were tens and hundreds of times smaller and lighter than vacuum tubes, more reliable, and consumed significantly less electrical power. Such computers were produced in small series and installed in large research centers and leading higher educational institutions.
Second generation computer
  • In the USSR, in 1967, the most powerful second-generation computer in Europe, BESM-6 (Big Electronic Calculating Machine), which could perform 1 million operations per second, came into operation.
  • BESM-6 used 260 thousand transistors, external memory devices on magnetic tape, as well as alphanumeric printing devices to output calculation results.
  • The work of programmers was significantly simplified, since programs could now be written in high-level programming languages (Algol, BASIC, etc.).
  • BESM - 6
Third generation computer
  • From the 1970s, integrated circuits began to be used as the element base of third-generation computers. An integrated circuit (a small semiconductor wafer) can contain thousands of densely packed transistors, each about the size of a human hair.
Third generation computer
  • Computers based on integrated circuits have become much more compact, fast and cheaper. Such mini-computers were produced in large series and were available to most scientific institutes and higher educational institutions.
  • The first minicomputer
Personal computers
  • The development of high technology has led to the creation of large-scale integrated circuits (LSIs) containing tens of thousands of transistors. This made it possible to begin producing compact personal computers affordable to the general public.
  • The first personal computer was the Apple II (the “grandfather” of modern Macintosh computers), created in 1977. In 1982, IBM began manufacturing IBM PC personal computers (the “grandfathers” of modern IBM-compatible computers).
  • Apple II
Personal computers
  • Modern personal computers are compact and have thousands of times greater speed compared to the first personal computers (they can perform several billion operations per second). Every year, almost 200 million computers are produced around the world, affordable for the mass consumer.
  • Personal computers come in various designs: desktop, portable (laptops) and pocket (palmtops).
  • Modern PCs
Modern supercomputers
  • These are multiprocessor systems that achieve very high performance and can be used for real-time calculations in meteorology, military affairs, science, etc.

People first learned to count on their own fingers. When this was no longer enough, the simplest counting devices appeared, and a special place among them was occupied by the abacus, which became widespread in the ancient world. Making an abacus is not at all difficult: it is enough to rule a board into columns or simply draw columns in the sand. Each column was assigned a digit value: units, tens, hundreds, thousands. Numbers were represented by sets of pebbles, shells or twigs placed in the different columns. By adding or removing a certain number of pebbles in the corresponding columns, it was possible to perform addition and subtraction, and even multiplication and division as repeated addition and subtraction, respectively.


The Russian abacus (schoty) is very similar in principle to the abacus. Instead of columns, it has horizontal guides with beads. In Rus', the abacus was used with real mastery: it was an indispensable tool for merchants, clerks and officials. From Russia this simple and useful device spread to Europe.


The first mechanical calculating device was the calculating machine built in 1642 by the eminent French scientist Blaise Pascal. Pascal's mechanical "computer" could add and subtract. The "Pascaline," as the machine was called, consisted of a set of vertically mounted wheels with the digits 0 to 9 marked on them. When a wheel made a complete turn, it engaged the adjacent wheel and turned it by one division. The number of wheels determined the number of digits: two wheels allowed counting up to 99, three up to 999, and five wheels let the machine handle still larger numbers. Counting on the Pascaline was very simple.
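The wheel mechanism described above is an odometer-style carry: when a wheel passes from 9 back to 0, it turns the next wheel by one division. The small Python model below is only an illustrative sketch of such a set of wheels; the function name and wheel layout are assumptions made for this example.

```python
# Illustrative model of the Pascaline's counting wheels: wheels[0] is the
# units wheel, wheels[1] the tens wheel, and so on. When a wheel completes
# a full turn (passes from 9 to 0), it advances the adjacent wheel by one.

def turn_units_wheel(wheels, steps):
    """Advance the units wheel by `steps` positions, propagating carries."""
    wheels = list(wheels)
    for _ in range(steps):
        i = 0
        while True:
            wheels[i] = (wheels[i] + 1) % 10
            if wheels[i] != 0 or i + 1 == len(wheels):
                break               # no carry needed, or no higher wheel left
            i += 1                  # this wheel completed a turn: carry over
    return wheels

# A five-wheel machine showing 00099; three more turns of the units wheel give 00102.
print(turn_units_wheel([9, 9, 0, 0, 0], 3))   # [2, 0, 1, 0, 0]
```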


In 1673, the German mathematician and philosopher Gottfried Wilhelm Leibniz created a mechanical calculating device that could not only add and subtract, but also multiply and divide. Leibniz's machine was more complex than the Pascaline.


The number wheels, now geared, had teeth of nine different lengths, and calculations were carried out by the meshing of the wheels. It was slightly modified Leibniz wheels that became the basis of mass-produced calculating instruments - arithmometers - which were widely used not only in the 19th century but also, relatively recently, by our grandparents. There are scientists in the history of computing whose names, associated with the most significant discoveries in the field, are known today even to non-specialists. Among them is the 19th-century English mathematician Charles Babbage, often called the "father of modern computing." In 1823, Babbage began working on his calculating machine, which consisted of two parts: a calculating part and a printing part. The machine was intended to help the British naval department compile various nautical tables.


The first, calculating part of the machine was almost completed by 1833, and the second, printing part was about half finished when the costs grew too large. There was no more money, and the work had to be stopped. Although Babbage's machine was never finished, its creator put forward ideas that formed the basis of the design of all modern computers. Babbage came to the conclusion that a computing machine must have a device for storing the numbers intended for calculation, as well as instructions (commands) telling the machine what to do with those numbers. The commands that followed one after another came to be called the computer's "program", and the device for storing information - the machine's "memory". However, storing numbers, even together with a program, is only half the battle: the main thing is that the machine must perform the operations specified in the program on those numbers. Babbage realized that for this the machine must have a special computing unit - a processor. Modern computers are designed on exactly this principle.

Babbage's scientific ideas captivated the daughter of the famous English poet Lord George Byron, Countess Ada Augusta Lovelace. At that time there was no such concept as computer programming, but Ada Lovelace is nevertheless rightfully considered the world's first programmer - the name now given to people who are able to "explain" a task to a machine in a language it understands. The fact is that Babbage did not leave a single complete description of the machine he had invented; this was done by one of his students in an article written in French. Ada Lovelace translated it into English, adding programs of her own with which the machine could carry out complex mathematical calculations. As a result, the original article tripled in volume, and Babbage gained an opportunity to demonstrate the power of his machine. Many of the concepts introduced by Ada Lovelace in describing those first programs in the world are widely used by modern programmers, and one of the most modern and advanced programming languages, Ada, is named after the world's first programmer.


New technologies of the twentieth century turned out to be inextricably linked with electricity. Soon after the appearance of vacuum tubes, in 1918, the Soviet scientist M.A. Bonch-Bruevich invented the tube trigger - an electronic device capable of storing electrical signals. The principle of operation of a trigger resembles a swing with latches installed at its two upper points. When the swing reaches one upper point, the latch engages and the swing stops; it can remain in this stable state for as long as desired. When the latch is released, the swing moves to the other upper point, where the latch again engages and stops it - and so on, as many times as you like.
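A trigger is thus a bistable element: once switched into one of its two states it holds that state until it is deliberately switched into the other, which is exactly what allows it to store one binary digit. The tiny Python class below is only an abstract model of that behaviour (a set/reset element), not of any particular tube circuit.

```python
# Abstract model of a trigger (flip-flop): a bistable element that keeps its
# state (0 or 1) indefinitely until an explicit set or reset signal arrives,
# like the swing held at one of its two top points by a latch.

class Trigger:
    def __init__(self):
        self.state = 0      # the stored bit; stable until explicitly changed

    def set(self):          # "push the swing to one top point"
        self.state = 1

    def reset(self):        # "release it and push it to the other top point"
        self.state = 0

t = Trigger()
t.set()
print(t.state)   # 1 - the signal is remembered for as long as needed
t.reset()
print(t.state)   # 0
```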


The first computers were thousands of times faster than mechanical calculating machines, but they were very bulky. One such computer occupied a room measuring 9 x 15 m, weighed about 30 tons and consumed about 150 kilowatts of electric power. It contained about 18 thousand vacuum tubes.


The second generation of electronic computers owes its appearance to the most important electronics invention of the twentieth century - the transistor. This miniature semiconductor device made it possible to sharply reduce the size of computers and their power consumption, while their speed rose to a million operations per second. The invention of integrated circuits in 1958 - semiconductor crystals containing a large number of interconnected transistors and other elements - made it possible to reduce the number of separate electronic components in a computer by a factor of hundreds. Third-generation computers based on integrated circuits appeared in 1964.


In June 1971, a very complex universal integrated circuit, called a microprocessor, was developed for the first time - the most important element of fourth-generation computers.


