

in law: see indictment (in criminal law, a formal written accusation naming specific persons and crimes; persons suspected of crime may be rendered liable to trial by indictment, by presentment, or by information).


any unit of data or knowledge. The character and extent of the recorded information available to a society is a major feature differentiating types of society. For example, in its possession of written records a literate culture has a decisive adaptive advantage over a nonliterate culture. This is seen, for example, in the rise to supremacy of the STATE, which was associated from the outset with the development of record-keeping and WRITING. The capacity of modern societies to marshal and store information has grown massively in recent times as a result of major technological innovations such as printing, audio and video recording, and especially computers. The centrality of knowledge and information in today's highly technological and highly administered societies has led some commentators to coin the term ‘information society’ for such societies (see POSTINDUSTRIAL SOCIETY). A further aspect of this increased capacity to collect and store information is the greatly increased power of the state in the monitoring and SURVEILLANCE of its citizens.



originally, a message transmitted from certain persons to other persons by verbal, written, or any other means (for example, with the help of conventional signals or with the use of technological means), as well as the processes themselves of transmission or reception of the message.

Information has always played a very important part in human life. In the mid-20th century, however, the role of information increased immeasurably as a result of social progress and the vigorous development of science and technology. In addition, a rapid expansion in the mass of diversified information is occurring, a phenomenon that has been named the “information explosion.” As a result, the need has arisen for a scientific approach to information and for the elucidation of its most characteristic properties, which has led to two principal changes in the interpretation of the concept of information. First, the concept was broadened to include information exchange not only between one person and another but also between man and machine and between machine and machine, as well as the exchange of signals in the animal and plant worlds. The transmission of characteristic traits from cell to cell and from organism to organism also came to be regarded as the transmission of information. Second, a quantitative measure of information was proposed (in the works of C. Shannon and A. N. Kolmogorov), which led to the creation of information theory.
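
That quantitative measure identifies the information content of a message with its entropy. As a minimal illustration, not drawn from the article itself, the following Python sketch estimates per-symbol entropy from the relative frequencies of the symbols in a message:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = sum(p * log2(1/p))
    over the relative frequency p of each distinct symbol."""
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# A message of identical symbols carries no information per symbol;
# eight equally likely symbols carry three bits each.
print(shannon_entropy("aaaaaaaa"))  # 0.0
print(shannon_entropy("abcdefgh"))  # 3.0
```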

A more general approach to the concept of information than was formerly used, together with the appearance of a precise quantitative measure of information, aroused great interest in its study. Since the 1950s, attempts have been made to use the concept of information (which does not yet have a single accepted definition) to clarify and describe extremely diverse phenomena and processes.

The investigation of problems connected with the scientific concept of information has proceeded in three principal directions. The first of these consists of the development of a mathematical apparatus reflecting the fundamental properties of information.

The second direction consists of the theoretical development of various aspects of information on the basis of already existing mathematical methods and the investigation of various properties of information. For example, from the moment information theory was created, the complex problem arose of measuring the value of information with respect to its use. In most work on information theory this property is not taken into consideration, yet its importance cannot be doubted. In the quantitative theory advanced in 1960 by A. A. Kharkevich, the value of information is defined as the increase in the probability of achieving a given goal as the result of using a given piece of information. Closely related studies, such as those of R. Carnap, attempt to give a strict mathematical definition of the quantity of semantic information.
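
One common rendering of Kharkevich's measure, offered here as an illustrative sketch rather than his exact formulation, takes the value of information as the logarithm of the ratio of goal-achievement probabilities after and before the information is used:

```python
import math

def information_value(p_before: float, p_after: float) -> float:
    """Value of a piece of information, in bits, as the increase in the
    probability of achieving a goal: V = log2(p_after / p_before)."""
    return math.log2(p_after / p_before)

# A tip that raises the chance of reaching the goal from 1-in-100
# to 1-in-4 is worth log2(25), about 4.64 bits; misleading
# information (p_after < p_before) scores negative.
print(information_value(0.01, 0.25))
```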

The third direction is connected with the use of information methods in linguistics, biology, psychology, sociology, education, and other fields. In linguistics, for example, the informational capacity of languages has been measured. After the statistical processing of a large number of texts with the help of computers, comparison of the lengths of translations of a text into various languages, and numerous experiments on the guessing of letters in a text, it was ascertained that, with a uniform information load on spoken units, texts could be shortened by a factor of four or five. Thus, from this viewpoint, the redundancy of natural languages was established, together with a rather precise measure of its magnitude, which is practically the same in all natural languages. In neurophysiology, information methods have helped researchers better understand the mechanism of the fundamental law of psychophysics, the Weber-Fechner law, which asserts that sensation is proportional to the logarithm of excitation. Precisely such a relation must hold if the nerve paths carrying signals from the receptors to the brain are to have the properties of the idealized communication channel of information theory. The information approach has played an important role in genetics and molecular biology, permitting, in particular, a deeper understanding of the role of the RNA molecule as a carrier of information. Research is also being carried out on the use of information methods in studies of the arts.
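
To make the redundancy estimate concrete, here is a rough Python sketch, an illustration rather than the procedure of the studies cited, that computes first-order redundancy from single-letter frequencies:

```python
import math
from collections import Counter

def redundancy(text: str) -> float:
    """First-order redundancy R = 1 - H / H_max, where H is the entropy
    of the single-letter frequency distribution and H_max = log2(k) for
    the k distinct symbols actually used."""
    counts = Counter(text)
    total = len(text)
    h = sum((n / total) * math.log2(total / n) for n in counts.values())
    return 1 - h / math.log2(len(counts))

# Single-letter statistics alone understate the true figure; accounting
# for dependencies between letters, words, and syntax pushes natural
# languages toward the 75-80% implied by fourfold-to-fivefold shortening.
print(redundancy("the quick brown fox jumps over the lazy dog " * 50))
```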

Such diversified use of the concept of information has stimulated certain scholars to impart a general scientific significance to it. The founders of this general approach were the English neurophysiologist W. R. Ashby and the French physicist L. Brillouin. They investigated the common character of the concept of entropy in information theory and in thermodynamics, treating information as negative entropy (negentropy). Brillouin and his followers began to study information processes from the viewpoint of the second law of thermodynamics, regarding the transmission of information to a system as an improvement of the system that leads to a decrease in its entropy. In several philosophical works the thesis has been advanced that information is one of the fundamental universal properties of matter. The positive side of this approach is that it relates the concept of information to the concept of reflection.
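
The entropy-negentropy correspondence can be put in numbers. Assuming the standard identification of one bit of information with k ln 2 units of thermodynamic entropy (consistent with Brillouin's principle, though the figures below are an illustration, not taken from the article):

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant, J/K

def negentropy_of_bits(bits: float) -> float:
    """Thermodynamic entropy decrease (J/K) associated with acquiring
    a given number of bits: delta_S = k * ln(2) * bits."""
    return BOLTZMANN_K * math.log(2) * bits

# Even a gigabyte (8e9 bits) corresponds to a vanishingly small
# thermodynamic entropy, about 7.7e-14 J/K.
print(negentropy_of_bits(8e9))
```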


REFERENCES
Ashby, W. R. Vvedenie v kibernetiku. Moscow, 1959. (Translated from English.)
Kharkevich, A. A. “O tsennosti informatsii.” In the collection Problemy kibernetiki, issue 4. Moscow, 1960.
Shannon, C. E. Raboty po teorii informatsii i kibernetike. Moscow, 1963. (Translated from English.)
Kolmogorov, A. N. “Tri podkhoda k opredeleniiu poniatiia ‘kolichestvo informatsii.’” Problemy peredachi informatsii, 1965, vol. 1, issue 1.
Brillouin, L. Nauchnaia neopredelennost’ i informatsiia. Moscow, 1966. (Translated from English.)
Ursul, A. D. Informatsiia. Moscow, 1971.



Data which has been recorded, classified, organized, related, or interpreted within a framework so that meaning emerges.


1. Law
a. a charge or complaint made before justices of the peace, usually on oath, to institute summary criminal proceedings
b. a complaint filed on behalf of the Crown, usually by the attorney general
2. Computing
a. the meaning given to data by the way in which it is interpreted
b. another word for data


The result of applying data processing to data, giving it context and meaning. Information can then be further processed to yield knowledge.

People or computers can find patterns in data to perceive information, and information can be used to enhance knowledge. Since knowledge is a prerequisite to wisdom, we always want more data and information. But as modern societies verge on information overload, we especially need better ways to find patterns.

1234567.89 is data.

"Your bank balance has jumped 8087% to $1234567.89" is information.

"Nobody owes me that much money" is knowledge.

"I'd better talk to the bank before I spend it, because of what has happened to other people" is wisdom.


Information is the summarization of data. Technically, data are raw facts and figures that are processed into information, such as summaries and totals. But since information can also be the raw data for the next job or person, the two terms cannot be precisely defined, and both are used interchangeably.
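
A minimal sketch of that processing step, using made-up figures: raw transaction records (data) are rolled up into per-region summaries and totals (information), which can in turn serve as raw data for the next job:

```python
# Raw facts and figures (data); regions and amounts are hypothetical.
sales = [
    ("east", 120.00), ("west", 340.50),
    ("east", 75.25),  ("west", 12.00),
]

# Processed into summaries and totals (information).
totals: dict[str, float] = {}
for region, amount in sales:
    totals[region] = totals.get(region, 0.0) + amount

print(totals)                # {'east': 195.25, 'west': 352.5}
print(sum(totals.values()))  # 547.75 -- raw data for the next step
```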

It may be helpful to view information by the way it is structured and used, namely: data, text, spreadsheets, pictures, voice, and video. Data are discretely defined fields. Text is a collection of words. Spreadsheets are data in matrix (row and column) form. Pictures are lists of vectors or frames of bits. Voice is a continuous stream of sound waves. Video is a sequence of image frames. See universal server.
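
For illustration only, with hypothetical values, the same facts can be carried by several of these structures:

```python
# Data: discretely defined fields.
record = {"name": "Ada", "dept": "R&D", "salary": 91000}

# Spreadsheet: data in matrix (row and column) form.
sheet = [
    ["name", "dept", "salary"],  # header row
    ["Ada", "R&D", 91000],       # one row per record
]

# Text: a collection of words carrying the same facts.
text = "Ada works in R&D and earns 91000."
```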

Information Structures
When information is stored electronically, it is structured according to the way it is used. Databases support all kinds of information.

Information Has "Meaning"
[Figure: excerpt from an R&R Report Writer advertisement illustrating that information is more usable to a manager than raw data. Image courtesy of Concentric Data Systems, a subsidiary of Wall Data, Inc.]