Von Neumann machine diagram. Classical computer architecture and von Neumann principles. Development of the concept of a stored program


John von Neumann (1903-1957) was a Hungarian-American mathematician of Jewish origin who made important contributions to quantum physics, quantum logic, functional analysis, set theory, computer science, economics, and other branches of science.


Computer architecture is the internal structure of a machine, its logical organization, which determines how data is processed and encoded, as well as the composition, purpose, and principles of interaction of its hardware and software.


[Diagram: a von Neumann machine - memory, arithmetic logic unit (ALU), control unit (CU), input device, output device]

In 1945, John von Neumann described this computer architecture.

A von Neumann machine consists of a storage device (memory), an arithmetic logic unit (ALU), a control unit (CU), and input and output devices.



In 1946, J. von Neumann, H. Goldstine and A. Burks outlined new principles for the construction and operation of computers in a joint article. The first two generations of computers were subsequently built on these principles. There have been some changes in later generations, but von Neumann's principles remain relevant even today.

Herman Goldstine

Arthur Burks

John von Neumann



The binary number system uses only two digits, 0 and 1; in other words, two is the base of the binary number system.

Its advantage over the decimal number system is that the devices can be made quite simple and that arithmetic and logical operations in the binary system are also performed quite simply.


Number systems

Decimal

Binary

Octal

Hexadecimal
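For concreteness, here is a short Python sketch (the variable names are illustrative) showing one and the same value written in each of the four number systems listed above, and why binary arithmetic reduces to very simple digit rules:

```python
# One value, four positional number systems.
n = 156

print(bin(n))  # '0b10011100' - binary
print(oct(n))  # '0o234'      - octal
print(hex(n))  # '0x9c'       - hexadecimal

# Converting back: int() accepts the base as a second argument.
assert int("10011100", 2) == 156
assert int("234", 8) == 156
assert int("9c", 16) == 156

# Binary addition reduces to four digit rules, which is why binary
# hardware is so simple to build: 0+0=0, 0+1=1, 1+0=1, 1+1=10 (carry).
print(bin(0b1011 + 0b0110))  # 11 + 6 = 17 -> '0b10001'
```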


The operation of the computer is controlled by a program consisting of a set of commands. Commands are executed sequentially one after another. The creation of a machine with a stored program was the beginning of what we call programming today.


In this case, both program commands and data are encoded in the binary number system, i.e. their recording method is the same. Therefore, in certain situations, you can perform the same actions on commands as on data.


At any time, you can access any memory cell by its address. This principle opened up the possibility of using variables in programming.


Despite the fact that commands are executed sequentially, programs can implement the ability to jump to any section of code.


Achievements of John von Neumann.

John von Neumann was awarded the highest academic honors. He was elected a member of the Academy of Exact Sciences (Lima, Peru), the American Academy of Arts and Sciences, the American Philosophical Society, the Lombard Institute of Sciences and Letters, the Royal Netherlands Academy of Sciences and Arts, and the US National Academy of Sciences, and received honorary doctorates from many universities in the USA and other countries.



· The principle of binary coding

· According to this principle, all information entering a computer is encoded using binary signals (binary digits, bits) and is divided into units called words.

· The principle of memory homogeneity

· Programs and data are stored in the same memory. Therefore, the computer does not distinguish between what is stored in a given memory cell - a number, text or command. You can perform the same actions on commands as on data.

· The principle of memory addressability

· Structurally, the main memory consists of numbered cells; any cell is available to the processor at any time. This implies the ability to name memory areas so that the values stored in them can later be accessed or changed during program execution using the assigned names.

· Sequential program control principle

· Assumes that the program consists of a set of commands that are executed by the processor automatically one after another in a certain sequence.

· The principle of architectural rigidity

· Immutability of the topology, architecture, and list of commands during operation.

· Computers built on these principles are classified as von Neumann computers.

· The most important consequence of these principles is that the program is no longer a permanent part of the machine (as it is, for example, in a calculator). It became possible to change the program easily, while the equipment remains unchanged and very simple.

· By comparison, the program of the ENIAC computer (which did not have a stored program) was determined by special jumpers on a panel. Reprogramming the machine (setting the jumpers differently) could take more than a day. And although programs for modern computers may take years to write, they run on millions of computers within a few minutes of being installed on a hard drive.


· A von Neumann machine consists of a storage device (memory), an arithmetic logic unit (ALU), a control unit (CU), and input and output devices.

· Programs and data are entered into memory from the input device through the arithmetic logic unit. All program commands are written to adjacent memory cells, while data for processing can be contained in arbitrary cells. For any program, the last command must be a halt command.

· A command consists of an indication of which operation is to be performed (out of the operations possible on the given hardware), the addresses of the memory cells storing the data on which the operation is to be performed, and the address of the cell where the result should be written (if it needs to be saved in memory).


· The arithmetic logic unit performs the operations specified by the instructions on the specified data.

· From the arithmetic logic unit, results are sent to memory or to an output device. The fundamental difference between memory and an output device is that data is stored in memory in a form convenient for processing by the computer, while output devices (printer, monitor, etc.) receive it in a form convenient for a person.

· The control unit controls all parts of the computer. From the control device, other devices receive signals “what to do”, and from other devices the control unit receives information about their status.

· The control device contains a special register (cell) called the “program counter”. After loading the program and data into memory, the address of the first instruction of the program is written to the program counter. The control unit reads from memory the contents of the memory cell, the address of which is in the program counter, and places it in a special device - the “Command Register”. The control unit determines the operation of the command, “marks” in memory the data whose addresses are specified in the command, and controls the execution of the command. The operation is performed by the ALU or computer hardware.

· As a result of executing any command, the program counter is incremented by one and therefore points to the next command of the program. When the command to be executed is not the one immediately following the current command, but one separated from it by some number of addresses, a special jump command supplies the address of the cell to which control must be transferred.
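The fetch-execute cycle described in the bullets above can be sketched as a toy interpreter. This is a minimal illustration, not any real instruction set: the opcode names, the instruction layout, and the memory map are invented for the example. A single list serves as the one memory holding both commands and data, `pc` plays the role of the program counter, and a jump command overwrites it:

```python
# A toy von Neumann machine: one memory holds both instructions and data.
def run(memory):
    pc = 0                        # program counter
    while True:
        instr = memory[pc]        # fetch into the "command register"
        op = instr[0]
        if op == "HALT":          # every program ends with a halt command
            return memory
        if op == "ADD":           # ADD a b dest: mem[dest] = mem[a] + mem[b]
            _, a, b, dest = instr
            memory[dest] = memory[a] + memory[b]
            pc += 1               # normal case: point at the next command
        elif op == "JZ":          # conditional jump if memory[addr] == 0
            _, addr, target = instr
            pc = target if memory[addr] == 0 else pc + 1
        elif op == "JUMP":        # unconditional jump: load a new address
            pc = instr[1]
        else:
            raise ValueError(f"unknown opcode {op!r}")

# Program occupies cells 0..3; data lives in cells 4..6 of the SAME memory.
memory = [
    ("ADD", 4, 5, 6),   # cell 0: mem[6] = mem[4] + mem[5]
    ("JZ", 6, 3),       # cell 1: if mem[6] == 0, jump to cell 3
    ("HALT",),          # cell 2: normal end
    ("HALT",),          # cell 3: jump target
    2,                  # cell 4: data
    3,                  # cell 5: data
    0,                  # cell 6: result
]
result = run(memory)
print(result[6])  # → 5
```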

16) Structure and architecture of computing systems

A system (from the Greek systema - a whole, a compound made up of parts) is a set of elements that interact with each other, forming a certain integrity, unity.
A computing system is a collection of one or more computers or processors, software and peripheral equipment, organized for the joint execution of information and computing processes.
A distinctive feature of computing systems as compared with individual computers is the presence of several computers that implement parallel processing.
Basic design principles laid down when creating a computing system:
ability to work in different modes;
modularity of the structure of technical and software, which allows you to improve and modernize computing systems without fundamental alterations;
unification and standardization of technical and software solutions;
hierarchy in the organization of process management;
the ability of systems to adapt, self-adjust and self-organize;
providing the necessary service to users when performing calculations
According to their purpose, computing systems are divided into
universal,
problem-oriented
specialized.
Universal ones are designed to solve a wide class of problems. Problem-oriented ones are used to solve a certain range of problems in a relatively narrow area. Specialized ones are focused on solving a narrow class of problems
By type, computing systems are divided into
multi-machine
multiprocessor.
A computing system can be built on the basis of either entire computers (multi-machine computer) or on the basis of individual processors (multiprocessor computer).
By type of computer or processor they distinguish
homogeneous - built on the basis of the same type of computers or processors.
heterogeneous - include various types of computers or processors.
Geographically, computing systems are divided into:
concentrated (all components are located in close proximity to each other);
distributed (components can be located at a considerable distance, for example, computer networks);
According to the method of controlling their elements, computing systems are divided into
centralized,
decentralized
with mixed control.

According to their operating mode, computing systems are divided into those operating in
operational and
non-operational time modes.
In addition, computing systems can be structurally
single-level (there is only one common level of data processing);
multi-level (hierarchical). In hierarchical computing systems, machines or processors are distributed across different levels of information processing; some machines (processors) may specialize in performing certain functions.
Structure of the computing system.
The structure of a computing system is the set of its interconnected elements and their links. The elements of a computing system are individual computers and processors.
In the multi-level structure described above, the classical von Neumann organization of the computing system is implemented, which involves sequential processing of information according to a pre-compiled program.
Architecture of computing systems. Classification of computer system architectures.
System architecture is a set of system properties that are essential for use.
The architecture of a computer is its description at some general level, including a description of user programming capabilities, command systems, addressing systems, memory organization, etc.
Classical architecture (von Neumann architecture): one arithmetic logic unit (ALU), through which the data stream passes, and one control unit (CU), through which the command stream (the program) passes. This is a single-processor computer.
Multi-machine computing system. Here the several computers included in the system do not share a common random access memory; each has its own local memory. Each computer in a multi-machine system has the classical architecture, and such systems are used quite widely.
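A minimal sketch of the multi-machine idea, with an invented class and task split: each `Machine` object stands for a complete computer with its own local memory, and the machines cooperate only by exchanging messages rather than by sharing memory:

```python
# Each Machine models a complete computer with private local memory.
class Machine:
    def __init__(self, local_data):
        self.memory = list(local_data)   # private local memory
        self.inbox = []                  # messages received from other machines

    def compute(self):
        # A machine works only on the data in its own local memory.
        return sum(self.memory)

# Split one task (summing 1..100) across four "machines";
# none of them can see the others' memories.
data = list(range(1, 101))
machines = [Machine(data[i:i + 25]) for i in range(0, 100, 25)]

# Each machine computes its part in isolation (in parallel on real hardware)...
partials = [m.compute() for m in machines]

# ...and a designated machine collects the messages and combines them.
front = machines[0]
front.inbox.extend(partials)
print(sum(front.inbox))  # → 5050
```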
The earliest and most famous is the classification of computer system architectures proposed in 1966 by M. Flynn.

· The classification is based on the concept of a stream, which is a sequence of elements (commands or data) processed by a processor. Based on the number of instruction streams and data streams, Flynn distinguishes four classes of architectures: SISD, SIMD, MISD, MIMD.
SISD (single instruction stream / single data stream) - single instruction stream and single data stream. This class includes, first of all, classic sequential machines, or otherwise von Neumann-type machines, for example, PDP-11 or VAX 11/780. In such machines there is only one stream of commands, all commands are processed sequentially one after another and each command initiates one operation on one stream of data. It doesn't matter that pipelining can be used to increase instruction processing speed and arithmetic speed - both the CDC 6600 with scalar functional units and the CDC 7600 with pipelines fall into this class.
SIMD (single instruction stream / multiple data stream) - single instruction stream and multiple data stream. In architectures of this kind, one stream of commands is retained, which, unlike the previous class, includes vector commands. This allows you to perform one arithmetic operation on many data - vector elements - at once. The method for performing vector operations is not specified, so the processing of vector elements can be done either by a processor matrix, as in ILLIAC IV, or using a pipeline, as, for example, in the CRAY-1 machine.
MISD (multiple instruction stream / single data stream) - multiple instruction streams and a single data stream. The definition implies the presence in the architecture of many processors processing the same data stream. However, neither Flynn nor other experts in the field of computer architecture have yet been able to provide a convincing example of a real-life computing system built on this principle; a number of researchers classify pipeline machines in this category. MIMD (multiple instruction stream / multiple data stream) - multiple instruction streams and multiple data streams. This class covers multiprocessor systems in which each processor executes its own program on its own data.
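The difference between the SISD and SIMD classes can be miniaturized in plain Python (the `vadd` helper is an invented stand-in for a hardware vector instruction): SISD issues one operation per data element, while a SIMD-style vector command applies one operation to all elements of its operand vectors at once:

```python
a = [1, 2, 3, 4]
b = [10, 20, 30, 40]

# SISD: one instruction stream, one data stream - element by element.
sisd = []
for i in range(len(a)):
    sisd.append(a[i] + b[i])

# SIMD-style: a single "vector add" applied to whole operands.
def vadd(x, y):
    return [xi + yi for xi, yi in zip(x, y)]

simd = vadd(a, b)
print(sisd == simd, simd)  # → True [11, 22, 33, 44]
```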

This kind of machine is often referred to as a “von Neumann machine,” but the correspondence between these concepts is not always clear. In general, when people talk about von Neumann architecture, they mean the principle of storing data and instructions in one memory.


    The foundations of the doctrine of computer architecture were laid by von Neumann in 1944, when he became involved in the creation of the world's first vacuum-tube computer, ENIAC. While working on ENIAC at the University of Pennsylvania, in numerous discussions with his colleagues John Mauchly, John Presper Eckert, Herman Goldstine and Arthur Burks, the idea of a more advanced machine called EDVAC arose. Research work on EDVAC continued in parallel with the construction of ENIAC.

    In March 1945, the principles of the logical architecture were set out in a document called the "First Draft of a Report on the EDVAC," prepared for the US Army's Ballistic Research Laboratory, which financed the construction of ENIAC and the development of EDVAC. Since it was only a draft, the report was not intended for publication, only for distribution within the group, but Herman Goldstine, the project supervisor for the US Army, reproduced this scientific work and sent it to a wide circle of scientists for review. Because only von Neumann's name appeared on the first page of the document, readers were left with the false impression that he was the author of all the ideas presented in it. The document provided enough information for its readers to build computers similar to EDVAC on the same principles and with the same architecture, which as a result became known as the "von Neumann architecture."

    After the end of World War II and of the work on ENIAC in February 1946, the team of engineers and scientists broke up: John Mauchly and John Eckert decided to go into business and build computers on a commercial basis. Von Neumann, Goldstine and Burks moved on to create their own computer, the "IAS machine," similar to EDVAC, and to use it for research work. In June 1946, they set out their principles of computer construction in the now classic article "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument." More than half a century has passed since then, but the provisions put forward in it remain relevant today. The article convincingly substantiates the use of the binary system for representing numbers, whereas previously all computers had stored the numbers they processed in decimal form. The authors demonstrated the advantages of the binary system for technical implementation and the convenience and simplicity of performing arithmetic and logical operations in it. Computers later came to process non-numeric kinds of information - text, graphics, sound and others - but binary coding of data is still the information basis of any modern computer.

    In addition to machines that work with binary code, there have been, and still are, ternary machines. Ternary computers have a number of advantages and disadvantages compared to binary ones. Among the advantages are speed (addition is performed roughly one and a half times faster), the availability of both binary and ternary logic, and a symmetric representation of signed integers (in binary logic there is either a pair of zeros, positive and negative, or a number that has no counterpart of the opposite sign). The disadvantage is that the implementation is more complex than for binary machines.

    Another revolutionary idea, the importance of which is difficult to overestimate, is the principle of a “stored program”. Initially, the program was set by installing jumpers on a special patch panel. This was a very labor-intensive task: for example, changing the program of the ENIAC machine required several days, while the calculation itself could not last more than a few minutes - the lamps, of which there were a huge number, failed. However, the program can also be stored as a collection of zeros and ones, and in the same memory as the numbers it processes. The absence of a fundamental difference between the program and the data made it possible for the computer to form a program for itself in accordance with the results of the calculations.

    The presence of a given set of executable commands and programs was a characteristic feature of the first computer systems. Today, a similar design is used to simplify the design of a computing device. Thus, desktop calculators, in principle, are devices with a fixed set of programs that can be executed. They can be used for mathematical calculations, but are almost impossible to use for text processing and computer games, for viewing graphic images or videos. Changing the firmware for this type of device requires almost complete rework, and in most cases is impossible. However, reprogramming of early computer systems was still carried out, but it required a huge amount of manual work to prepare new documentation, reconnect and rebuild blocks and devices, etc.

    The idea of storing computer programs in a shared memory changed everything. By the time it was introduced, the use of architectures based on sets of executable instructions, and the representation of a computation as the process of executing the instructions written in a program, had enormously increased the flexibility of computing systems with respect to data processing. The uniform treatment of data and instructions made it easy to change the programs themselves.

    Von Neumann's principles

    The principle of memory homogeneity. This is the fundamental difference between the "von Neumann" (Princeton) architecture and the "Harvard" architecture. Commands and data are stored in the same memory and are outwardly indistinguishable there; they can be recognized only by the way they are used. The same value in a memory cell can serve as data, as a command, or as an address, depending solely on how it is accessed. This allows the same operations to be performed on commands as on numbers and, accordingly, opens up a number of possibilities. Thus, by cyclically changing the address part of a command, successive elements of a data array can be accessed; this technique is called command modification and is discouraged from the standpoint of modern programming. A more useful consequence of the homogeneity principle is that the instructions of one program can be obtained as the result of executing another program. This possibility underlies translation, the conversion of program text from a high-level language into the language of a specific computer.

    The principle of addressability. Structurally, the main memory consists of numbered cells, and any cell is available to the processor at any time. Binary codes of commands and data are divided into units of information called words and stored in memory cells; to access them, the numbers of the corresponding cells, their addresses, are used.

    The principle of sequential program control. Commands are executed one after another in the order in which they are stored in memory, but this sequence can be changed. The decision to change the order of execution of program commands is made either on the basis of an analysis of the results of previous calculations, or unconditionally.

    The principle of binary coding. According to this principle, all information, both data and commands, is encoded with the binary digits 0 and 1. Each type of information is represented by a binary sequence and has its own format. A sequence of bits within a format that has a specific meaning is called a field. Numeric information usually has a sign field and a significant-digits field; in the simplest case, a command format can be divided into two fields: the operation-code field and the address field.
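As an illustration of the two-field command format just described, the following Python sketch packs an operation-code field and an address field into one binary word; the 4-bit and 12-bit field widths are arbitrary choices made for the example:

```python
# A 16-bit command word: 4-bit opcode field + 12-bit address field.
OPCODE_BITS = 4
ADDR_BITS = 12

def encode(opcode, address):
    # Each field must fit in its allotted number of bits.
    assert 0 <= opcode < 2 ** OPCODE_BITS
    assert 0 <= address < 2 ** ADDR_BITS
    return (opcode << ADDR_BITS) | address

def decode(word):
    # Shift and mask to recover the two fields.
    opcode = word >> ADDR_BITS
    address = word & (2 ** ADDR_BITS - 1)
    return opcode, address

word = encode(opcode=0b0011, address=0b000010100101)
print(f"{word:016b}")  # → 0011000010100101
assert decode(word) == (0b0011, 0b000010100101)
```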

    Computers built on von Neumann principles

    According to the plan, the first computer built according to the von Neumann architecture was to be EDVAC (Electronic Discrete Variable Automatic Computer), one of the first electronic computers. Unlike its predecessor ENIAC, it was a binary rather than a decimal computer. Like ENIAC, EDVAC was developed at the Moore School of the University of Pennsylvania for the US Army's Ballistic Research Laboratory by a team of engineers and scientists led by John Presper Eckert and John William Mauchly, with the active assistance of a mathematician, but until 1951 EDVAC was not launched, owing to technical difficulties in creating reliable computer memory and to disagreements within the development team. Other research institutes, having become familiar with ENIAC and the EDVAC project, managed to solve these problems much earlier. The first computers to implement the main features of the von Neumann architecture were:
    1. the prototype Manchester Small-Scale Experimental Machine - University of Manchester, UK, 21 June 1948;
    2. EDSAC - University of Cambridge, UK, 6 May 1949;
    3. Manchester Mark I - University of Manchester, UK, 1949;
    4. BINAC - USA, April or August 1949;
    5. CSIR Mk 1 (later CSIRAC) - Australia, November 1949;
    6. EDVAC - USA, August 1949, actually launched in 1952;
    7. SEAC - USA, 9 May 1950;
    8. ORDVAC - USA, November 1951;
    9. IAS machine - USA, 10 June 1952;
    10. MANIAC I - USA, March 1952;
    11. AVIDAC - USA, 28 January 1953;
    12. ORACLE - USA, late 1953;
    13. WEIZAC - Israel, 1955.

    In the USSR, the first fully electronic computer close to von Neumann's principles was MESM, built by Lebedev (on the basis of the Kyiv Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR), which passed state acceptance tests in December 1951.

    The bottleneck of von Neumann architecture

    The shared bus for program memory and data memory leads to the von Neumann bottleneck: the bandwidth between the processor and memory is limited relative to the amount of memory. Because program memory and data memory cannot be accessed at the same time, the throughput of the processor-memory channel and the speed of the memory significantly limit the speed of the processor, far more than if programs and data were stored in different places. As processor speed and memory capacity have grown much faster than the bandwidth between them, the bottleneck has become a major problem, increasing in severity with each new generation of processors. It is mitigated by improving caching systems, which in turn gives rise to many new problems.

    The term "von Neumann bottleneck" was coined by John Backus in 1977 in his Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?"

    In 2015, scientists from the USA and Italy announced the creation of a prototype memprocessor with an architecture different from the von Neumann one, and the possibility of using it to solve NP-complete problems.


    In fact, Neumann managed to summarize the scientific developments and discoveries of many other scientists and formulate something fundamentally new on their basis.

    Von Neumann's principles

    1. Use of the binary number system in computers. The advantage over the decimal system is that devices can be made quite simple, and arithmetic and logical operations in binary are also performed quite simply.
    2. Software control of the computer. The operation of the computer is directed by a program consisting of a set of commands, executed sequentially one after another. The creation of a machine with a stored program was the beginning of what we today call programming.
    3. Computer memory is used to store not only data but also programs. Both program commands and data are encoded in the binary number system, i.e. they are recorded in the same way, so in certain situations the same actions can be performed on commands as on data.
    4. Computer memory cells have addresses that are numbered sequentially. Any memory cell can be accessed at any time by its address. This principle opened up the possibility of using variables in programming.
    5. Possibility of a conditional jump during program execution. Although commands are executed sequentially, programs can implement a jump to any section of code.


    The foundations of the doctrine of computer architecture were laid by the outstanding American mathematician John von Neumann. He became involved in the creation of the world's first vacuum-tube computer, ENIAC, in 1944, when its design had already been chosen. In the course of this work, during numerous discussions with his colleagues H. Goldstine and A. Burks, von Neumann put forward the idea of a fundamentally new computer. In 1946, the scientists set out their principles of computer construction in the now classic article "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument." Half a century has passed since then, but the provisions put forward in it remain relevant today.

    The article convincingly substantiates the use of the binary system to represent numbers (it is worth recalling that previously all computers stored processed numbers in decimal form). The authors convincingly demonstrated the advantages of the binary system for technical implementation, the convenience and simplicity of performing arithmetic and logical operations in it. Later, computers began to process non-numeric types of information - text, graphic, sound and others, but binary data coding still forms the information basis of any modern computer.

    Another truly revolutionary idea, the importance of which is difficult to overestimate, is the “stored program” principle proposed by Neumann. Initially, the program was set by installing jumpers on a special patch panel. This was a very labor-intensive task: for example, it took several days to change the program of the ENIAC machine (while the calculation itself could not last more than a few minutes - the lamps failed). Neumann was the first to realize that a program could also be stored as a series of zeros and ones, in the same memory as the numbers it processed. The absence of a fundamental difference between the program and the data made it possible for the computer to form a program for itself in accordance with the results of the calculations.

    Von Neumann not only put forward the fundamental principles of the logical structure of a computer but also proposed a structure for it, which was reproduced throughout the first two generations of computers. The main blocks according to von Neumann are the control unit (CU) and the arithmetic logic unit (ALU), usually combined into the central processing unit, together with memory, external memory, and input and output devices. Note that external memory differs from input and output devices in that data is stored in it in a form convenient for the computer but inaccessible to direct human perception. Thus, a magnetic disk drive belongs to external memory, while the keyboard is an input device and the display and printer are output devices.

    The control device and the arithmetic-logical unit in modern computers are combined into one unit - the processor, which is a converter of information coming from memory and external devices(this includes retrieving instructions from memory, encoding and decoding, performing various, including arithmetic, operations, coordinating the operation of computer components). The processor functions will be discussed in more detail below.

    Memory (memory) stores information (data) and programs. The storage device in modern computers is “multi-tiered” and includes random access memory (RAM), which stores the information with which the computer is working directly at a given time (an executable program, part of the data necessary for it, some control programs), and external storage devices (ESD). ) much larger capacity than RAM. but with significantly slower access (and significantly lower cost per 1 byte of stored information). The classification of memory devices does not end with RAM and VRAM - certain functions are performed by both SRAM (super-random access memory), ROM (read-only memory), and other subtypes of computer memory.

    In a computer built according to the described scheme, instructions are sequentially read from memory and executed. Number (address) of the next memory cell. from which the next program command will be extracted is indicated by a special device - a command counter in the control unit. Its presence is also one of the characteristic features of the architecture in question.

    The fundamentals of the architecture of computing devices developed by von Neumann turned out to be so fundamental that they received the name “von Neumann architecture” in the literature. The vast majority of computers today are von Neumann machines. The only exceptions are certain types of systems for parallel computing, in which there is no program counter, the classical concept of a variable is not implemented, and there are other significant fundamental differences from the classical model (examples include streaming and reduction computers).

    Apparently, a significant deviation from the von Neumann architecture will occur as a result of the development of the idea of ​​fifth-generation machines, in which information processing is based not on calculations, but on logical conclusions.

    Von Neumann's principles

    Principle of memory homogeneity - Commands and data are stored in the same memory and are externally indistinguishable in memory. They can only be recognized by the method of use; that is, the same value in a memory cell can be used as data, as a command, and as an address, depending only on the way it is accessed. This allows you to perform the same operations on commands as on numbers, and, accordingly, opens up a number of possibilities. Thus, by cyclically changing the address part of the command, it is possible to access successive elements of the data array. This technique is called command modification and is not recommended from the standpoint of modern programming. More useful is another consequence of the principle of homogeneity, when instructions from one program can be obtained as a result of the execution of another program. This possibility underlies translation - the translation of program text from a high-level language into the language of a specific computer.

    The principle of addressing - Structurally, the main memory consists of numbered cells, and any cell is available to the processor at any time. Binary codes of commands and data are divided into units of information called words and stored in memory cells, and to access them the numbers of the corresponding cells - addresses - are used.

    Principle of program control - All calculations provided for by the algorithm for solving a problem must be presented in the form of a program consisting of a sequence of control words - commands. Each command prescribes some operation from a set of operations implemented by the computer. Program commands are stored in sequential memory cells of the computer and are executed in a natural sequence, that is, in the order of their position in the program. If necessary, using special commands, this sequence can be changed. The decision to change the order of execution of program commands is made either based on an analysis of the results of previous calculations, or unconditionally.

    Binary coding principle - According to this principle, all information, both data and commands, is encoded with binary digits 0 and 1. Each type of information is represented by a binary sequence and has its own format. A sequence of bits in a format that has a specific meaning is called a field. In numeric information, there is usually a sign field and a significant digits field. In the command format, two fields can be distinguished: the operation code field and the addresses field.