Theory of Algorithms

the branch of mathematics concerned with the general properties of algorithms. Seminal ideas relating to the notion of an algorithm can be found in all periods of the history of mathematics. However, they congealed into the algorithm concept proper only in the 20th century. Algorithms became a subject of independent study (apparently for the first time, and in a rather nebulous form) only in the 1920’s in the works of such representatives of mathematical intuitionism as L. E. J. Brouwer and H. Weyl. The year 1936 may be regarded as the beginning of the systematic development of the theory of algorithms. This was when A. Church published the first precise definition of a calculable function. He proposed identifying the concept of an everywhere defined calculable function having natural arguments and values with the concept of a general recursive function and gave the first example of a function that is not calculable. A. M. Turing and E. L. Post offered the first explications of the algorithm concept in terms of an idealized calculating machine—the Turing machine. Subsequently the theory of algorithms received further development in the works of S. C. Kleene, E. L. Post, A. A. Markov, and others. A. A. Markov proposed refining the algorithm concept by means of what he called a normal algorithm. The most general approach to a refinement of the concept of the algorithm was proposed by A. N. Kolmogorov.

Basic concepts of the theory of algorithms. The domain of applicability of an algorithm is the set of objects to which the algorithm is applicable. Of an algorithm 𝔄 we say that it (1) “calculates a function f” if its domain of applicability coincides with the domain of definition of f and if 𝔄 transforms any x in its domain of applicability into f(x); (2) “solves a set A with respect to a set X” if it is applicable to any x in X and transforms any x in X ∩ A into the word “yes” and any x in X\A into the word “no”; and (3) “enumerates a set B” if its domain of applicability is the set of positive integers and the collection of its results is B. A function is said to be computable if there exists an algorithm that calculates it. A set is said to be solvable with respect to X if there exists an algorithm that solves it with respect to X. A set is said to be enumerable if it is empty or if there exists an algorithm that enumerates it.
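As an illustration (not part of the original definitions, and using the set of even numbers purely as a hypothetical example), the three roles of an algorithm can be sketched in Python:

```python
# Illustrative sketch: the three roles of an algorithm from the
# definitions above, taking the even numbers as a hypothetical example.

def calculates_f(x):
    """Calculates the function f(x) = x * x on the natural numbers."""
    return x * x

def solves_evens(x):
    """Solves the set of even numbers with respect to the naturals:
    answers "yes" for members and "no" for non-members."""
    return "yes" if x % 2 == 0 else "no"

def enumerates_evens(n):
    """Enumerates the even numbers: its domain of applicability is the
    positive integers, and the collection of its results is the set."""
    return 2 * (n - 1)

print([enumerates_evens(n) for n in range(1, 5)])  # [0, 2, 4, 6]
print(solves_evens(7))                             # no
```

Here the even numbers are both solvable with respect to the naturals and enumerable, since every even number appears among the results of the enumerator.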

A detailed analysis of the algorithm concept reveals that (1) the domain of possible initial data and the domain of applicability of any algorithm are enumerable sets, and (2) conversely, for any pair of nested enumerable sets it is possible to choose an algorithm for which the larger set serves as the domain of possible initial data and the smaller set as the domain of applicability. The following basic theorems hold. (3) A function f is computable if and only if its graph, that is, the set of all pairs of the form (x, f(x)), is enumerable. (4) A subset A of an enumerable set X is solvable with respect to X if and only if both A and X\A are enumerable. (5) If A and B are enumerable, then A ∪ B and A ∩ B are also enumerable. (6) Every infinite enumerable set X contains an enumerable subset with a nonenumerable complement; by virtue of (4), this subset is unsolvable with respect to X. (7) For every infinite enumerable set X, there exists a computable function defined on a subset of X that is not extendable to a computable function defined on the whole of X. Together, assertions (6) and (2) provide an example of an algorithm 𝔄 with an unsolvable domain of applicability.
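Theorem (4) admits a simple constructive reading: if enumerators for both A and X\A are given, membership in A can be decided by running the two enumerations alternately until x turns up in one of them. A minimal sketch, assuming X is the natural numbers and taking A to be the multiples of 3 purely as a hypothetical example:

```python
# Sketch of theorem (4): if A and its complement X\A are both
# enumerable, then A is solvable with respect to X. Here X is the
# naturals and A the multiples of 3 (an illustrative choice).

def enum_A(n):
    """Enumerates A: 0, 3, 6, 9, ..."""
    return 3 * n

def enum_complement(n):
    """Enumerates X\A: 1, 2, 4, 5, 7, 8, ..."""
    return n + n // 2 + 1

def decides_A(x):
    """Dovetails the two enumerations; every natural x appears in
    exactly one of them, so the loop terminates."""
    n = 0
    while True:
        if enum_A(n) == x:
            return "yes"
        if enum_complement(n) == x:
            return "no"
        n += 1

print(decides_A(9))  # yes
print(decides_A(7))  # no
```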

Algorithmic problems. The problem of constructing an algorithm possessing one or another set of properties is referred to as an algorithmic problem. As a rule, the property of the sought-for algorithm is formulated in terms of the properties of the correspondence that must obtain between the initial data and the results of the algorithm. Important examples of algorithmic problems are the problem of calculating a given function, requiring the construction of an algorithm capable of calculating the function; the problem of solving a given set, requiring the construction of an algorithm that will solve the set with respect to some other set; and the problem of enumerating a given set, requiring the construction of an algorithm capable of enumerating the given set. The unsolvability of an algorithmic problem signifies the absence of an appropriate algorithm; theorems establishing the unsolvability of such problems are among the most important theorems in the theory of algorithms.

Metric theory of algorithms. The theory of algorithms may be divided into the descriptive (qualitative) and the metric (quantitative) theories. The first investigates algorithms from the point of view of the correspondences that the algorithms establish between the initial data and the results; included here are the algorithmic problems discussed in the preceding section. The second investigates algorithms from the point of view of the complexity both of the algorithms themselves and of the “calculations” specified by the algorithms—that is, the processes of sequential transformation of the constructive entities. It is important to emphasize that the complexity of algorithms and of calculations can be defined in various ways; it may turn out that by one measure an algorithm A is more complex than an algorithm B, while by another measure it is less complex. To be able to speak of the complexity of algorithms, we must first describe some precise language for recording algorithms; by the phrase “complexity of an algorithm” we then mean the complexity of its notation. In turn, the complexity of a notation can be defined in various ways—for example, as the number of symbols of a given type that appear in the notation, or as the set of such numbers calculated for the various types of symbols. To be able to speak of the complexity of a calculation, we must specify exactly how the calculation is represented in the form of a chain (sequence) of constructive entities and what is to be regarded as the complexity of such a chain (whether only the number of terms in the chain—the “number of steps” in the calculation—is counted, or whether the “size” of the terms is also taken into account, and so forth). In any case, the complexity of a calculation depends on the datum that initiates the calculation. For this reason, the complexity of a calculation is a function that associates with each entity in the domain of applicability of the algorithm the complexity of the corresponding chain.
The development of methods for estimating the complexity of algorithms and calculations has important theoretical and practical significance. However, in contrast to the descriptive theory of algorithms, which has become a complete mathematical discipline, the metric theory of algorithms is only taking its first steps.
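The idea of the complexity of a calculation as a function of the initial datum can be made concrete with a small sketch; here the “chain” is the sequence of division steps performed by Euclid’s algorithm, chosen purely as an illustration:

```python
# Illustrative sketch: the "complexity of a calculation" as a function
# of the initial datum, measured as the number of steps in the chain
# of transformations performed by Euclid's algorithm.

def gcd_with_step_count(a, b):
    """Returns (gcd(a, b), number of division steps in the calculation)."""
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return a, steps

# The complexity function associates with each pair (a, b) the length
# of its chain: (21, 13) requires more steps than (12, 4).
print(gcd_with_step_count(21, 13))  # (1, 6)
print(gcd_with_step_count(12, 4))   # (4, 1)
```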

Applications of the theory of algorithms. Applications are found in all areas of mathematics where algorithmic problems arise. In mathematical logic and the theory of models, the problem is formulated, for every theory, of solving the set of all true or demonstrable propositions of the theory with respect to the set of all its propositions; theories are subdivided into solvable and unsolvable, depending on the solvability or unsolvability of this problem. In 1936, A. Church established the unsolvability of the solution problem for the set of all true propositions of the logic of predicates. Further important results along this line were achieved by A. Tarski, A. I. Mal’tsev, and others. Algorithmic problems are encountered in algebra (the identity problem for semigroups and, in particular, for groups). The first examples of semigroups with an unsolvable identity problem were discovered in 1947 independently by A. A. Markov and E. L. Post, while an example of a group with an unsolvable identity problem was discovered in 1952 by P. S. Novikov. Algorithmic problems also appear in topology (the homeomorphism problem, whose unsolvability for an important class of cases was demonstrated by A. A. Markov in 1958), in number theory (the still open problem of the solvability of Diophantine equations), and in other areas of mathematics.
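The identity (word) problem for semigroups mentioned above can be sketched as a string-rewriting search. The defining relations below are hypothetical; the point of the Markov and Post results is that in general no bound on the search can be computed in advance, so a “yes” found by the search is conclusive while exhausting any bound proves nothing:

```python
# Illustrative sketch: the identity (word) problem for a semigroup
# given by defining relations, as a breadth-first rewriting search.
from collections import deque

RELATIONS = [("ab", "ba"), ("aa", "a")]  # hypothetical relations,
                                         # applicable in both directions

def neighbours(word):
    """All words obtainable from `word` by one application of a relation."""
    for lhs, rhs in RELATIONS:
        for l, r in ((lhs, rhs), (rhs, lhs)):
            i = word.find(l)
            while i != -1:
                yield word[:i] + r + word[i + len(l):]
                i = word.find(l, i + 1)

def equal_words(u, v, max_steps=10_000):
    """Searches for a chain of relations from u to v. A True answer is
    conclusive; False only means no chain was found within the bound."""
    seen, queue = {u}, deque([u])
    while queue and len(seen) < max_steps:
        w = queue.popleft()
        if w == v:
            return True
        for nxt in neighbours(w):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(equal_words("aab", "ba"))  # True: aab -> ab -> ba
```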

The theory of algorithms is closely associated with mathematical logic, since the algorithm concept underlies one of the central concepts of mathematical logic, that of a calculus. For this reason, for example, Gödel’s theorem on the incompleteness of formal systems can be obtained as a corollary to theorems of the theory of algorithms. The theory of algorithms is also closely associated with the foundations of mathematics, where a central place is occupied by the problem of the relationship between the constructive and the nonconstructive; in particular, the theory of algorithms supplies the apparatus needed to develop the constructive trend in mathematics. In 1965, A. N. Kolmogorov proposed using the theory of algorithms as a basis for information theory. The theory of algorithms also forms the theoretical foundation for a number of problems in computational mathematics and is closely associated with cybernetics, in which the study of control algorithms is important. Finally, the algorithm concept is fundamental to programmed instruction.


General problems:
Mal’tsev, A. I. Algoritmy i rekursivnye funktsii. Moscow, 1965.
Markov, A. A. Teoriia algorifmov. Moscow-Leningrad, 1954. (Tr. Matem. instituta AN SSSR, vol. 42.)
Specific problems:
Kolmogorov, A. N. “Tri podkhoda k opredeleniiu poniatiia ‘kolichestvo informatsii’.” Problemy peredachi informatsii, 1965, vol. 1, issue 1.
Ershov, Iu. L., et al. “Elementarnye teorii.” Uspekhi matematicheskikh nauk, 1965, vol. 20, issue 4.
Markov, A. A. “O normal’nykh algorifmakh, sviazannykh s vychisleniem bulevykh funktsii.” Izvestiia AN SSSR—Seriia matematicheskaia, 1967, vol. 31, issue 1.
Trakhtenbrot, B. A. Slozhnost’ algoritmov i vychislenii. Novosibirsk, 1967.

