[Figure: Information Structures. When information is stored electronically, it is structured according to the way it is used. Databases support all kinds of information.]

[Figure: Information Has “Meaning.” An excerpt from an R&R Report Writer ad illustrating that information is more usable to a manager than raw data. (Image courtesy of Concentric Data Systems, a subsidiary of Wall Data, Inc.)]
Information: originally, a message transmitted from some persons to others by verbal, written, or any other means (for example, by conventional signals or with the use of technological means), as well as the processes themselves of transmitting or receiving such a message.
Information has always played a very important part in human life. However, in the mid-20th century the role of information increased immeasurably as a result of social progress and the vigorous development of science and technology. In addition, a rapid expansion in the volume and diversity of information is occurring, a phenomenon that has been named the “information explosion.” As a result, the need has arisen for a scientific approach to information and for the elucidation of its most characteristic properties, which has led to two principal changes in the interpretation of the concept of information. First, it was broadened to include the exchange of information not only between man and man but also between man and machine and between machine and machine, as well as the exchange of signals in the animal and plant worlds. The transmission of characteristic traits from cell to cell and from organism to organism also began to be regarded as the transmission of information. Second, a quantitative measure of information was proposed (in the works of C. Shannon and A. N. Kolmogorov), which led to the creation of information theory.
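The quantitative measure referred to here is, in Shannon’s formulation, the entropy of a message source; the notation below is the standard textbook form rather than anything given in this article:

$$ H = -\sum_{i=1}^{n} p_i \log_2 p_i $$

where $p_i$ is the probability of the $i$-th possible message and $H$ is measured in bits. For a fair coin toss, with two outcomes of probability 1/2 each, $H = 1$ bit.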
A more general approach to the concept of information than was formerly used, as well as the appearance of a precise quantitative measure of information, aroused great interest in the study of information. Since the 1950’s attempts have been undertaken to use the concept of information (which does not as yet have a unique definition) to clarify and describe extremely diverse phenomena and processes.
The investigation of problems connected with the scientific concept of information has proceeded in three principal directions. The first of these consists of the development of a mathematical apparatus reflecting the fundamental properties of information.
The second direction consists of the theoretical development of various aspects of information on the basis of already existing mathematical methods and the investigation of various properties of information. For example, from the moment when information theory was created, the complex problem arose of the measurement of the value of information with respect to its use. In most work on information theory this property is not taken into consideration; however, its importance cannot be doubted. In the quantitative theory advanced in 1960 by A. A. Kharkevich, the value of information is defined as the increase in the probability of achieving a given goal as the result of using a given piece of information. Closely related studies, such as those of R. Carnap, attempt to give a strict mathematical definition of the quantity of semantic information.
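Kharkevich’s definition is commonly written as a log-ratio of goal probabilities; the formula below is that standard rendering, assumed here rather than quoted from the article:

$$ V = \log_2 \frac{p_1}{p_0} $$

where $p_0$ is the probability of achieving the goal before the information is received and $p_1$ the probability after. On this measure, useful information has positive value, irrelevant information zero value, and misinformation negative value.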
The third direction is connected with the use of information methods in linguistics, biology, psychology, sociology, education, and other fields. In linguistics, for example, the information capacity of languages has been measured. After the statistical processing of a large number of texts, performed with the help of computers, as well as comparison of the lengths of translations of a text into various languages and numerous experiments on the guessing of letters in a text, it has been established that, if each linguistic unit carried a uniform load of information, texts could be shortened by a factor of four or five. Thus, from this viewpoint, the redundancy of natural languages has been demonstrated, along with a rather precise measure of its magnitude, which is practically the same in all natural languages. In neurophysiology, information methods have helped researchers to understand better the mechanism of the fundamental law of psychophysics, the Weber-Fechner law, which asserts that sensation is proportional to the logarithm of excitation. Precisely such a relation must hold if the nerve paths carrying signals from the receptors to the brain are to have the properties of the idealized communication channel of information theory. The information approach has also played an important role in genetics and molecular biology, where it has permitted, in particular, the role of the RNA molecule as a carrier of information to be understood more deeply. Research is also being carried out on the use of information methods in studies of the arts.
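The “factor of four or five” corresponds to the usual definition of redundancy; the notation here is the standard one, assumed rather than taken from the article:

$$ R = 1 - \frac{H}{H_{\max}} $$

where $H$ is the actual per-symbol entropy of the language and $H_{\max}$ is the entropy of a source using the same symbols equiprobably and independently. Shortening a text by a factor of four to five implies $R$ of roughly 0.75 to 0.8. The Weber-Fechner law cited above is usually written $S = k \log (E / E_0)$, with $E$ the stimulus intensity, $E_0$ the threshold intensity, and $k$ a constant.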
Such a diversified use of the concept of information has led certain scholars to impart a general scientific significance to it. The founders of this general approach were the English neurophysiologist W. R. Ashby and the French physicist L. Brillouin. They investigated the kinship between the concept of entropy in information theory and in thermodynamics, treating information as negative entropy (negentropy). Brillouin and his followers began to study information processes from the viewpoint of the second law of thermodynamics, regarding the transmission of information in a system as an improvement in the system that leads to a decrease in its entropy. In several philosophical works the thesis has been advanced that information is one of the fundamental universal properties of matter. The positive side of this approach is that it relates the concept of information to the concept of reflection.
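Brillouin’s information-negentropy correspondence is usually stated through Boltzmann’s constant; this is the standard physical statement, not a formula appearing in the article:

$$ N = I \, k \ln 2 $$

where $I$ is the quantity of information in bits, $k$ is Boltzmann’s constant, and $N$ is the associated negentropy; one bit thus corresponds to an entropy of $k \ln 2 \approx 0.96 \times 10^{-23}$ J/K.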
V. N. TROSTNIKOV