Programming is the process of writing an ordered sequence of instructions (a program) for a computer; it is also the scientific discipline that studies computer programs and the methods used in writing, checking, and improving them.
Every computer is an automaton consisting of a memory (composed of external and main storage units), a control unit, and an arithmetic unit in which certain actions or operations can be performed. The memory takes the form of a numbered sequence of cells, each storing a portion of binary information in the form of a series of 0’s and 1’s. The automatic work of a computer controlled by a program consists of a sequence of time steps. During each step, the control unit selects from the designated memory cell a portion of information. This portion is treated as an order, that is, an instruction to the arithmetic unit to perform a certain operation. The performance of an operation in a computer is usually carried out by taking information stored in certain memory cells, transferring the information to the arithmetic unit for performance of the necessary action, writing the result of this action into the indicated memory cell, and informing the control unit of the number of the next instruction’s cell. The individual actions performed by the computer are very simple; they are arithmetic and logical operations such as comparing or storing a portion of information. Thus, to write a program for a computer means to set forth a procedure for solving a problem in the form of a set of computer instructions (a program). These instructions, stored in the memory and performed in sequence, each instruction calling up the next, lead to the desired computation.
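This fetch-and-execute cycle can be sketched in a few lines of Python; the three-field instruction format and the opcode names (LOAD, ADD, STORE, HALT) are illustrative assumptions, not any real machine’s code.

```python
# A minimal sketch of the fetch-and-execute cycle described above.
# The instruction format (opcode, source_cell, result_cell) and the
# opcode names are illustrative assumptions, not a real machine's code.

def run(memory):
    """Interpret memory: a numbered sequence of cells, where cells may
    hold data or instruction tuples (opcode, source_cell, result_cell)."""
    acc = 0   # the arithmetic unit's working register
    pc = 0    # the control unit's number of the next instruction's cell
    while True:
        op, src, dst = memory[pc]   # select a portion of information; treat it as an order
        if op == "LOAD":            # transfer a cell's contents to the arithmetic unit
            acc = memory[src]
        elif op == "ADD":           # perform the necessary action
            acc = acc + memory[src]
        elif op == "STORE":         # write the result into the indicated cell
            memory[dst] = acc
        elif op == "HALT":
            return
        pc += 1                     # inform the control unit of the next cell

# Cells 0-3 hold the program; cells 4-6 hold data (4 and 5 are inputs).
mem = [("LOAD", 4, None), ("ADD", 5, None), ("STORE", None, 6),
       ("HALT", None, None), 2, 3, 0]
run(mem)
print(mem[6])   # → 5, the sum of cells 4 and 5
```

The elementary nature of each step is the point: even an addition takes several instructions, which is why real programs run to thousands of them.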
The idea of programming arises early in school when students devise “solution plans” for arithmetic problems in the form of a series of questions. The significant difference between this school exercise and real programming is that the program ordinarily prescribes not one but several sequences of actions (branches), the choice between which depends on the value of intermediate results in the problem’s solution. The program is also able to perform some groups of instructions many times, the necessary number of repetitions (iterations) being determined automatically, and to manipulate its own instructions and alter them dynamically.
Another feature of programming is its difficulty. Many programs possess thousands of instructions, and the number of operations performed by them may run to tens of millions. Such volumes, when regarded in the light of the elementary nature of the machine instructions, make programming both a very complex and very monotonous process.
To overcome this contradiction, programming has been given the character of a multistage process, in which each stage gradually makes the plan for the solution of the problem, obtained from the preceding stage, more concrete and detailed. Moreover, if the rules describing the problem’s solution plan in a certain stage are exact, formal, and universal, that is, applicable to any problem, then we may speak of the existence of a certain programming language that is used to write programs.
Programming languages, used as a means of precisely formulating the different stages in the solution of a problem by a computer, have played a fundamental role in the development of programming. Specifically, they make it possible to regard programming as a process of translation; the problem to be solved by the computer is expressed in one language and must be translated into another, the machine language. If the exact
rules of this translation are found and described, then these rules can, in their turn, be programmed into the computer. The programs obtained, which automate the programming process, are called compilers. The various stages of the programming process are outlined in Figure 1.
The content of each stage of programming may be clarified by using as an example the solution of a quadratic equation. In this case, the initial formulation of the problem consists in finding the roots of 50 quadratic equations of the type ax² + bx + c = 0, the coefficients of which are given in the form of the three tables Ai, Bi, and Ci (i = 1, …, 50). The algorithmic description of the problem is obtained after the full mathematical analysis of the statement of the problem, the selection of standard (or search for new) algorithms to perform all necessary computations, and a precise determination of the necessary inputs and consequent outputs. Here, the algorithmic description may have the following form: feed the tables of coefficients Ai, Bi, and Ci into the computer; solve each equation according to the general formula x = (−b ± √(b² − 4ac))/(2a), with analysis of the discriminant b² − 4ac to determine the case of complex roots; and give, for the sake of uniformity, each root of the equation in the form of the complex number x = u + iv, assuming the imaginary part equal to zero in the case of real roots.
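This algorithmic description can be sketched in Python; the function name quadratic_roots is an assumption, and Python’s built-in complex type stands in for the uniform representation x = u + iv.

```python
# A sketch of the algorithmic description above: solve a*x^2 + b*x + c = 0,
# analyzing the discriminant b^2 - 4ac and giving each root uniformly as a
# complex number u + iv, with v = 0 in the case of real roots.
import math

def quadratic_roots(a, b, c):
    d = b * b - 4 * a * c                 # the discriminant
    if d >= 0:                            # real roots: imaginary part is zero
        r = math.sqrt(d)
        return (complex((-b + r) / (2 * a), 0),
                complex((-b - r) / (2 * a), 0))
    else:                                 # complex-conjugate roots
        u = -b / (2 * a)
        v = math.sqrt(-d) / (2 * a)
        return (complex(u, v), complex(u, -v))

print(quadratic_roots(1, -3, 2))   # x^2 - 3x + 2 = 0 has roots 2 and 1
print(quadratic_roots(1, 0, 1))    # x^2 + 1 = 0 has roots ±i
```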
The higher-level language is the most important medium for writing computer programs. Common features of these languages are their machine-independent nature and their sentence structure. This, taken together with the use of certain annotations, suggests a resemblance to natural languages. Sentences are usually separated by semicolons; the subordination of sentences is indicated by means of the words “begin” and “end,” which act as left and right parentheses. The statements making up the program are either command statements or data declaration statements. The command statement governs a single mathematical operation and may be unconditional, where a calculation is carried out according to a given formula assuming a computed value for the given variable, or conditional, where one branch of computations or another is selected, depending on the result of a test. There are also command statements for looping, that is, executing a group of statements a number of times. Data declaration statements give the properties of variables and other designations used in the program. The procedural nature of higher-level language is an important characteristic; a symbolic functional representation may be applied to any previously written program that solves a certain particular problem. The text of the program, together with its symbols, is called a procedure declaration, or subroutine. If this procedure declaration is required when composing new programs, it is sufficient merely to render the functional representation in the form of a procedure statement rather than to copy the full text of the subroutine.
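These statement kinds can be illustrated in Python, standing in here for the higher-level languages the article describes; the procedure mean is an arbitrary example, not drawn from the text.

```python
# A sketch of the statement kinds described above, with Python standing in
# for a higher-level language; the procedure mean is an arbitrary example.

# Procedure declaration (subroutine): written and named once, reusable.
def mean(values):
    total = 0
    for v in values:       # loop statement: execute a group of statements repeatedly
        total = total + v  # unconditional command statement: assignment by formula
    if len(values) > 0:    # conditional statement: select one branch of computation
        return total / len(values)
    return 0.0

# Procedure statement: invoking the declaration by name instead of
# copying its full text into the new program.
print(mean([2, 4, 6]))   # → 4.0
```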
In the 1970’s, there exists a whole series of such programming languages: ALGOL-60 and FORTRAN for solving engineering and scientific problems, COBOL for commercial and business problems, SIMULA languages for programming mathematical models, and the more powerful ALGOL-68 and PL/1 languages, which encompass all types of computer applications. All of these languages possess corresponding compilers, which ensure the automatic translation into machine programs of problems expressed in a higher-level language.
The program for solving a quadratic equation written in ALGOL-60 (adapted) is as follows:
begin real array A, B, C [1:50];
  real a, b, c, u1, v1, u2, v2; integer i;
  read (A, B, C);
  for i := 1 step 1 until 50 do
  begin a := A[i]; b := B[i]; c := C[i];
    if b ↑ 2 - 4 × a × c > 0 then
    begin v1 := v2 := 0;
      u1 := (- b + sqrt(b ↑ 2 - 4 × a × c))/(2 × a);
      u2 := (- b - sqrt(b ↑ 2 - 4 × a × c))/(2 × a)
    end
    else
    begin v1 := sqrt(4 × a × c - b ↑ 2)/(2 × a);
      v2 := - v1; u1 := u2 := - b/(2 × a)
    end;
    write (u1, v1, u2, v2)
  end
end
A machine-oriented language represents programs in terms of computer instructions expressed in symbols more convenient to use than direct binary representation. Such a language is used either in an intermediate stage of the process of automatic translation from a higher-level language or as a programming language when the program must be rapidly devised in terms of machine instructions. In the latter case, a flow chart often replaces the higher-level language. In this case, the structure of the program, that is, the order of performance of its blocks, and the presence of branches and loops are shown in graphic form. The functions of each block are written in an arbitrary textual form. An example of a flow chart for solving a quadratic equation is shown in Figure 2.
After the program has been written, an essential stage is debugging, that is, finding and correcting mistakes made during programming. The chief means of debugging is the test run, during which a test program, whose output can be easily scanned for errors, is fed into the computer. The test program is devised so that, while it runs, it reveals the sequence of operations performed and the intermediate values produced, in which errors can be detected. Study of the test program’s output makes it possible to determine how well the program meets the programmer’s intention.
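A test run in this spirit can be sketched in Python, assuming a small quadratic-root routine (real roots only) as the program under test; the test data are chosen so the correct answers are known in advance and the printed results can be scanned for errors.

```python
# A sketch of a test run, assuming a small quadratic-root routine (real
# roots only) as the program under test. Each test case pairs inputs with
# the output known in advance, so discrepancies are visible in the output.

def solve(a, b, c):                 # the program being debugged
    d = b * b - 4 * a * c
    r = d ** 0.5
    return ((-b + r) / (2 * a), (-b - r) / (2 * a))

test_cases = [                      # (inputs, output known in advance)
    ((1, -3, 2), (2.0, 1.0)),       # x^2 - 3x + 2 = (x - 1)(x - 2)
    ((1, -2, 1), (1.0, 1.0)),       # double root at x = 1
]

for inputs, expected in test_cases:
    got = solve(*inputs)
    status = "ok" if got == expected else "ERROR"
    print(inputs, "->", got, status)
```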
The development of programming as a science began in 1947 with the work of the American mathematicians J. von Neumann, A. Burks, and H. Goldstine, who described the principles of a computer controlled by a program stored in its memory. They also introduced the use of the flow chart to depict a program. The concept of subroutines and the methods for using them were introduced in 1951 by the British scientists M. Wilkes, D. Wheeler, and S. Gill. The Soviet mathematician A. A. Liapunov, who in 1952 at Moscow State University was the first in the Soviet Union to give a course in computer programming, defined programming as a multistage process and introduced a guide to symbolic representation, which proved to be the forerunner of higher-level programming languages. The idea of automating programming by translating a program written in a programming language was realized by J. W. Backus (FORTRAN) and G. Hopper in the United States and by S. S. Kamynin, E. Z. Liubimskii, M. R. Shura-Bura, and A. P. Ershov in the USSR (1954–56). By 1960, the COBOL language and the international programming language ALGOL-60 had been developed, the latter by a group of scientists from six countries.
During the 1960’s, the development of programming involved refinement and universalization of programming languages, which found expression in the ALGOL-68, PL/1, and SIMULA languages, development of methods for strict, formal description of programming languages, development of the theory and technique of building compilers, and creation of libraries of standard subroutines. There has been a marked development in the machine-oriented languages toward combining a number of the features of higher-level languages (procedural quality, sentence structure) with adaptability to the characteristics of a specific computer. For certain classes of problems, successful attempts have been made to expand the area of application of automated programming by formalizing the methods of the algorithmic description of a problem or even its initial formulation. This has led to the concepts of, for example, problem-oriented programming languages and nonalgorithmic programming languages.
REFERENCES
Lavrov, S. S. Vvedenie v programmirovanie. Moscow, 1973.
Lavrov, S. S. Universal’nyi iazyk programmirovaniia (ALGOL 60), 3rd ed. Moscow, 1972.
Zhogolev, E. A., and N. P. Trifonov. Kurs programmirovaniia, 3rd ed. Moscow, 1971.
Germain, C. B. Programmirovanie na IBM/360, 2nd ed. Moscow, 1973. (Translated from English.)
Stabley, D. Logicheskoe programmirovanie v sisteme 360. Moscow, 1974. (Translated from English.)
A. P. ERSHOV
programming
Creating a computer program. The steps are:
1. Developing the program logic to solve the particular problem.
2. Writing the program logic in a specific programming language (coding the program).
3. Assembling or compiling the program to turn it into machine language.
4. Testing and debugging the program.
5. Preparing the necessary documentation.
The logic is generally the most difficult part of programming. However, depending on the programming language, writing the statements may also be laborious. One thing is certain: documenting the program is considered the most annoying activity by most programmers. See programming concepts, estimating a programming job and peer review.
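The translation in step 3 can be glimpsed with a short sketch; Python compiles source into bytecode for its virtual machine, which serves here only as an analogy for machine language, and the function add is an arbitrary example.

```python
# A glimpse of step 3: translating source statements into lower-level
# instructions. Python compiles to bytecode for its virtual machine,
# used here only as an analogy for machine language; the function add
# is an arbitrary example.
import dis

def add(a, b):
    return a + b

dis.dis(add)   # prints the instruction sequence the source compiles to
```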