What is information? A brief definition in computer science, and the evolution of the concept

The concept of information in modern science is ambiguous. Although the term appeared long ago, scientists still cannot give it a strict definition and argue that it should be treated as an undefinable primary concept. Computer science textbooks offer definitions such as the following:

Information is knowledge about something: knowledge that can be collected, stored, transferred, processed and used. Information is the basic concept of computer science, since computer science is the science of information, its structure and properties, and the methods of its processing and transmission. Informatics studies information with the help of its main tool, the computer.

The term itself comes from the Latin and means "explanation" or "exposition". Today science is searching for the common properties and patterns inherent in information, but for now information in computer science remains an intuitive concept that takes on different meanings in different areas of human activity.

Despite the large number of definitions, let us single out the most general and understandable one: information is a reflection of the surrounding world through signals and signs. Its value lies in the new knowledge it contains.

In informatics, information is classified by mode of perception: it can be taken in through sight, hearing, smell, taste and touch. By form of presentation, textual, numerical, graphic and sound information are distinguished; video information belongs to the same classification.

Information in computer science has a number of properties, including completeness, reliability, timeliness, accessibility, security, relevance (the ability to meet requests) and ergonomics.

As a special kind of resource, information has characteristic properties of its own: memorability, reproducibility, transferability, convertibility and erasability.

A carrier of information can be any object of the material world: matter in any state, or waves (acoustic, electromagnetic, etc.). There are also machine media, such as magnetic tape. In computer science, information is transmitted by means of signals.

A signal is a physical process that has informational value; it can be discrete or continuous. A discrete signal takes only a finite number of values at particular moments of time, while a continuous (analog) signal changes continuously in both time and amplitude. Signals that carry symbolic or textual information are discrete; telephone communication and television are examples of analog signals.
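The distinction can be illustrated by sampling: a computer turns a continuous signal into a discrete one by reading its value only at fixed moments of time. A minimal sketch in Python (the function names and the 8 samples-per-second rate are illustrative assumptions, not from the text):

```python
import math

def sample(signal, duration, rate):
    """Discretize a continuous signal: read its value at `rate` evenly spaced moments per second."""
    return [signal(k / rate) for k in range(int(duration * rate))]

# A continuous 1 Hz sine wave, reduced to 8 discrete values over one second.
samples = sample(lambda t: math.sin(2 * math.pi * t), duration=1.0, rate=8)
```

The list `samples` is a discrete signal: it keeps only finitely many values of the original continuous process.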

Our world is developing very rapidly, and a number of new areas are already distinguished within the science of information, such as programming, cybernetics, artificial intelligence, computer technology, information systems and theoretical computer science. The word "information" is relatively new in the human lexicon, and despite its widespread use in speech, its content remains fuzzy and blurred. Computer science is concerned with information and its processing in computers. At an intuitive level everything is clear; on closer inspection, however, the question turns out to be far more complicated than it might seem at first glance.

Since computers are widespread today and humanity is experiencing an information boom, the basics of computer science should be understood by every modern person who wants at least to keep up with the times. These factors led to computer science being introduced into the school curriculum, so that every young person has the opportunity to master this new but very interesting and necessary science.

Information is information about something.

The concept and types of information, transmission and processing, search and storage of information

Information: definitions

Information is any intelligence received, transmitted or stored by various sources; it is the entire body of knowledge about the surrounding world and all kinds of processes taking place in it that can be perceived by living organisms, electronic machines and other information systems.

Information is significant knowledge about something, where the form of its presentation is itself information; that is, it has a formatting function in accordance with its own nature.

Information is everything that can supplement our knowledge and assumptions.

Information is knowledge about something, regardless of the form of its presentation.

Information is the mental product of any psychophysical organism, produced by it with the help of some means, called a means of information.

Information is knowledge perceived by a person and (or) special devices as a reflection of the facts of the material or spiritual world in the process of communication.

Information is data organized in such a way that it makes sense to the person dealing with it.

Information is the meaning a person assigns to data on the basis of the known conventions used to represent it.

Information is information, explanation, presentation.

Information is any data or information that anyone is interested in.

Information is knowledge about the objects and phenomena of the environment, their parameters, properties and state, which information systems (living organisms, control machines, etc.) perceive in the process of life and work.

The same information message (a newspaper article, announcement, letter, telegram, reference, story, drawing, radio broadcast, etc.) may contain a different amount of information for different people, depending on their prior knowledge, their level of understanding of the message and their interest in it.

When automated processing of information by technical devices is discussed, what matters is not the content of the message but how many characters it contains.

In relation to computer data processing, information is understood as a sequence of symbolic designations (letters, digits, encoded graphic images, sounds, and so on) that carries a semantic load and is presented in a form understandable to a computer. Each new character in such a sequence increases the information volume of the message.
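This can be made concrete: if every character is stored in a fixed number of bits, each new character adds that many bits to the message's volume. A small Python sketch (the function name and the 8-bit-per-symbol assumption are illustrative):

```python
def volume_in_bits(message, bits_per_symbol=8):
    """Information volume of a message: number of symbols times bits per symbol."""
    return len(message) * bits_per_symbol

short = volume_in_bits("informatics")    # 11 characters at 8 bits each
longer = volume_in_bits("informatics!")  # one more character adds 8 more bits
```

Appending any single character, regardless of its meaning, increases the volume by the same fixed amount, which is exactly the "soulless" counting described above.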

Currently there is no single definition of information as a scientific term; each field of knowledge describes the concept by its own specific set of features. For example, the concept of "information" is basic in the computer science course, and it is impossible to define it through other, "simpler" concepts (just as in geometry it is impossible to express the content of the basic concepts "point", "line" and "plane" through simpler ones).

The content of basic concepts in any science must be explained by examples or identified by comparison with the content of other concepts. In the case of "information", the problem of definition is even harder, since it is a general scientific concept used in various sciences (computer science, cybernetics, biology, physics, etc.), and in each science it is associated with a different system of concepts.

The concept of information

In modern science, two types of information are considered:

Objective (primary) information is the property of material objects and phenomena (processes) to generate a variety of states, which are transferred to other objects through interactions (fundamental interactions) and imprinted in their structure.

Subjective (semantic, secondary) information is the semantic content of objective information about the objects and processes of the material world, formed by the human mind with the help of semantic images (words, images and sensations) and fixed on some material carrier.

In the everyday sense, information is information about the surrounding world and the processes taking place in it, perceived by a person or a special device.

According to Claude Shannon's conception, information is removed uncertainty: information is what removes, to one degree or another, the uncertainty the recipient had before receiving it, and expands his understanding of the object with useful knowledge.

From Gregory Bateson's point of view, the elementary unit of information is "a difference that makes a difference": an effective difference for some larger perceiving system. Differences that are not perceived he calls "potential", and perceived ones "active". "Information consists of differences that are not indifferent"; "any perception of information is necessarily the acquisition of information about a difference." From the point of view of computer science, information has a number of fundamental properties: novelty, relevance, reliability, objectivity, completeness, value, and so on. The analysis of information is primarily the concern of the science of logic. The word "information" comes from the Latin informatio, which means information, clarification, familiarization. The concept of information was already considered by ancient philosophers.

Before the industrial revolution, defining the essence of information remained the prerogative mainly of philosophers. Later the questions of information theory were taken up by cybernetics, then a new science.

Sometimes, in order to comprehend the essence of a concept, it is useful to analyze the meaning of the word that denotes this concept. Elucidation of the internal form of the word and study of the history of its use can shed unexpected light on its meaning, eclipsed by the usual "technological" use of this word and modern connotations.

The word information entered the Russian language in the Petrine era. For the first time it is recorded in the "Spiritual Regulations" of 1721 in the meaning of "representation, concept of something". (In European languages, it was fixed earlier - around the 14th century.)

Based on this etymology, information can be considered any significant change of form or, in other words, any materially fixed traces formed by the interaction of objects or forces and amenable to understanding. Information is thus a converted form of energy. The carrier of information is the sign, and its mode of existence is interpretation: revealing the meaning of a sign or a sequence of signs.

The meaning may be an event reconstructed from the sign that caused it (in the case of "natural" and involuntary signs, such as traces and evidence) or a message (in the case of conventional signs characteristic of the sphere of language). It is the second kind of sign that makes up the body of human culture, which, by one definition, is "a set of non-hereditarily transmitted information".

Messages may contain information about facts or interpretation of facts (from Latin interpretatio, interpretation, translation).

A living being receives information through the senses, as well as through reflection or intuition. The exchange of information between subjects is communication (from Latin communicatio, message, transmission, derived in turn from communico: to make common, to inform, to talk, to connect).

From a practical point of view, information is always presented as a message. An informational message is associated with a message source, a message recipient, and a communication channel.

Returning to the Latin etymology of the word "information", let us try to answer the question: to what exactly is form given here?

It is obvious that, first, form is given to a certain meaning which, being initially formless and unexpressed, exists only potentially and must be "built up" in order to become perceivable and transmittable.

Second, to the human mind, which is trained to think structurally and clearly. Third, to a society which, precisely because its members share these meanings, gains unity and the ability to function.

Information as expressed, intelligible meaning is knowledge that can be stored, transmitted, and serve as the basis for generating other knowledge. The forms of preserving knowledge (historical memory) are diverse: from myths, chronicles and pyramids to libraries, museums and computer databases.

Information is knowledge about the surrounding world and the processes taking place in it, perceived by living organisms, control machines and other information systems.

The word "information" is Latin. Over its long life its meaning has evolved, at times expanding and at times narrowing its boundaries to the limit. At first the word meant "representation" or "concept"; later, "information" and "the transmission of messages".

In recent years scientists have decided that the usual (universally accepted) meaning of the word "information" is too elastic and vague, and have given it a narrower meaning: "a measure of the certainty in a message".

Information theory was brought to life by the needs of practice. Its emergence is associated with Claude Shannon's work "A Mathematical Theory of Communication", published in 1948, though its foundations rest on results obtained by many scientists. By the second half of the 20th century the globe was buzzing with transmitted information running through telephone and telegraph cables and radio channels; later, electronic computers, processors of information, appeared. At that time the main task of information theory was, above all, to increase the efficiency of communication systems. The difficulty in designing and operating means, systems and channels of communication is that it is not enough for the designer and engineer to solve the problem in physical and energy terms: from those standpoints the system may be perfectly sound and economical. When creating transmission systems it is also important to consider how much information will pass through them. Information can, after all, be counted and calculated, and in such calculations one proceeds in the most ordinary way: one abstracts from the meaning of the message, just as one renounces concreteness in familiar arithmetic operations (passing from the addition of two apples and three apples to the addition of numbers in general: 2 + 3).

The scientists admitted that they "completely ignored human evaluation of information". To a sequence of 100 letters, for example, they assign a definite amount of information, regardless of whether that sequence makes sense and whether its practical application makes sense in turn. The quantitative approach is the most developed branch of information theory. By this definition, a collection of 100 letters, whether a 100-letter phrase from a newspaper, from Shakespeare's play or from Einstein's theorem, contains exactly the same amount of information.

This quantification of information is highly useful and practical. It corresponds exactly to the task of the communications engineer, who must convey all the information contained in a submitted telegram regardless of its value for the addressee: the communication channel is soulless. Only one thing matters for the transmission system: to transmit the required amount of information in a given time. How, then, is the amount of information in a particular message calculated?

The assessment of the amount of information is based on the laws of probability theory; more precisely, it is determined through the probabilities of events. This is understandable: a message has value and carries information only when we learn from it the outcome of an event of a random character, when it is to some degree unexpected. A message about what is already known contains no information. If, for example, someone calls you on the telephone and says, "It is light during the day and dark at night," such a message surprises only by the absurdity of stating the obvious and well known, not by the news it contains. Another matter is, for example, the result of a horse race: who will come in first? The outcome is hard to predict, and the more random outcomes an event of interest has, the more valuable a message about its result and the greater the information. A message about an event that has only two equally probable outcomes contains one unit of information, called a bit. The choice of this unit is not accidental: it is connected with the most common, binary way of encoding information during transmission and processing. Let us try, in the most simplified form, to present the general principle of the quantitative assessment of information, which is the cornerstone of all information theory.

We already know that the amount of information depends on the probabilities of the outcomes of an event. If an event has, as scientists say, two equally probable outcomes, the probability of each outcome is 1/2: the probability of getting heads or tails when tossing a coin. If an event has three equally probable outcomes, the probability of each is 1/3. Note that the sum of the probabilities of all outcomes always equals one, for one of the possible outcomes will certainly occur. An event can also have unequally probable outcomes. In a football match between a strong and a weak team, for example, the probability of the strong team winning is high, say 4/5; the probability of a draw is much smaller, say 3/20; and the probability of defeat is very small.
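These ideas can be sketched numerically with Shannon's formula, in which the amount of information in bits is the sum of -p·log2(p) over the outcome probabilities. The Python sketch below is illustrative; the loss probability 1/20 is an assumption chosen so that the match probabilities sum to one:

```python
import math

def entropy(probabilities):
    """Average amount of information, in bits, carried by a message about an outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equally probable outcomes carry exactly one bit.
coin = entropy([0.5, 0.5])

# The football match from the text: win 4/5, draw 3/20, loss 1/20 (assumed).
match = entropy([4/5, 3/20, 1/20])
```

The lopsided match carries less than one bit: its outcome is less uncertain than a coin toss, so a message about it removes less uncertainty.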

It turns out that the amount of information is a measure of the reduction of uncertainty in some situation. Different amounts of information are transmitted over communication channels, and the amount passing through a channel cannot exceed its capacity, which is determined by how much information passes per unit of time. One of the characters in Jules Verne's novel "The Mysterious Island", the journalist Gideon Spilett, transmitted a chapter from the Bible over the telephone so that his competitors could not use the line. The channel was loaded completely, yet the amount of information conveyed was zero, because the subscriber was receiving what he already knew: the channel ran idle, passing a strictly defined number of pulses without loading them with anything. Meanwhile, the more information each of a given number of pulses carries, the more fully the channel's capacity is used. It is therefore necessary to encode information intelligently and to find an economical, frugal language for transmitting messages.

Information is "sifted" in the most thorough way. In telegraphy, frequently occurring letters, combinations of letters, and even whole phrases are represented by a shorter set of zeros and ones, and rarer ones by a longer set. When the length of a codeword is reduced for frequently occurring symbols and increased for rare ones, one speaks of efficient encoding of information. In practice, however, a code produced by the most thorough "sifting", however convenient and economical, can be distorted by interference, which unfortunately always exists in communication channels: sound distortion in the telephone, atmospheric noise in radio, distortion or darkening of the image in television, transmission errors in telegraphy. This interference, which experts call noise, falls upon the information, producing the most incredible and, of course, unpleasant surprises.
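Efficient encoding of this kind is exactly what Huffman's classic algorithm does: frequent symbols receive short codewords, rare ones longer codewords. A minimal Python sketch of the principle (not a production encoder):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a binary prefix code in which frequent symbols get shorter codewords."""
    freq = Counter(text)
    # Heap entries are (frequency, tie-breaker, tree); a tree is a symbol or a pair of trees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                   # degenerate case: a single distinct symbol
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # merge the two rarest subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):      # internal node: branch on 0 / 1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                            # leaf: record the symbol's codeword
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

For the string "abracadabra", the frequent letter "a" receives a one-bit codeword, and the whole string encodes in 23 bits instead of the 33 that a fixed three-bit-per-letter code would need.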

Therefore, to increase reliability in the transmission and processing of information, extra characters, a kind of protection against distortion, have to be introduced. These extra characters carry no actual content of the message; they are redundant. From the point of view of information theory, everything that makes a language colourful, flexible, rich in shades, many-sided and many-valued is redundancy. How redundant, from such a position, is Tatyana's letter to Onegin; how much informational excess it contains beyond the short and clear message "I love you"! And how informationally precise are the drawn signs understandable to everyone who enters the metro today, where laconic symbols replace the words and phrases of announcements: "Entrance", "Exit".
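The simplest example of such a protective extra character is a parity bit: one redundant bit appended so that the total number of ones is even, letting the receiver detect any single-bit distortion. A small Python sketch (the function names are illustrative assumptions):

```python
def add_parity(bits):
    """Append one redundant bit so the total number of 1s becomes even."""
    return bits + [sum(bits) % 2]

def looks_intact(bits):
    """True if the word passes the parity check (no single-bit error detected)."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # the appended bit carries no message content
corrupted = [1, 1, 1, 1, 1]       # the same word with one bit flipped in transit
```

The parity bit adds nothing to the message's content, yet without it the receiver could not tell the corrupted word from a valid one.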

In this regard it is useful to recall an anecdote told by the famous American scientist Benjamin Franklin about a hatter who invited his friends to discuss the design of a sign. It was to show a hat and read: "John Thompson, hatter, makes and sells hats for cash." One friend observed that the words "for cash" were redundant: such a reminder would offend the buyer. Another found the word "sells" superfluous, since it goes without saying that a hatter sells hats rather than giving them away. A third thought that "hatter" and "makes hats" were a needless tautology, and the latter words were dropped. A fourth suggested dropping "hatter", since the painted hat says clearly enough who John Thompson is. Finally, a fifth assured them that it was a matter of complete indifference to the buyer whether the hatter was called John Thompson or something else, and proposed dispensing with that indication too. In the end nothing remained on the sign but the hat. Of course, if people used only codes of this kind, without redundancy in their messages, all "forms of information" - books, reports, articles - would be extremely brief; but they would lose in intelligibility and beauty.

Information can be divided into types according to various criteria. By truth value: true and false.

according to the way of perception:

Visual - perceived by the organs of vision;

Auditory - perceived by the organs of hearing;

Tactile - perceived by tactile receptors;

Olfactory - perceived by olfactory receptors;

Taste - perceived by taste buds.

by form of presentation:

Text - transmitted in the form of symbols intended to designate lexemes of the language;

Numerical - in the form of numbers and signs denoting mathematical operations;

Graphic - in the form of images, objects, graphs;

Sound - oral or in the form of a recording, the transmission of language lexemes by auditory means.

by purpose:

Mass - contains trivial information and operates with a set of concepts understandable to most of the society;

Special - contains a specific set of concepts; when it is used, information is transmitted that may not be understandable to the bulk of society but is necessary and understandable within the narrow social group where this information is used;

Secret - transmitted to a narrow circle of people and through closed (secure) channels;

Personal (private) - a set of information about a person that determines social status and the types of social interactions within a population.

by value:

Relevant (up-to-date) - information valuable at the present moment;

Reliable - information received without distortion;

Understandable - information expressed in a language understandable to the person to whom it is intended;

Complete - information sufficient to make the right decision or understanding;

Useful - the usefulness of information is determined by the subject who received it, depending on the range of possibilities for its use.

The value of information in various fields of knowledge

Many systems, methods, approaches and ideas are being developed in information theory today, and scientists believe that new ideas and new directions will continue to appear. As proof of this assumption they point to the "living", developing nature of the science and note how quickly and firmly information theory has been introduced into the most diverse areas of human knowledge: physics, chemistry, biology, medicine, philosophy, linguistics, pedagogy, economics, logic, the technical sciences and aesthetics. In the experts' own view, the doctrine of information, which arose from the needs of communication theory and cybernetics, has stepped beyond their limits. Now we may speak of information as a scientific concept that puts into the researchers' hands a theoretical-informational method with which one can penetrate many sciences of animate and inanimate nature and of society, allowing one not only to look at all problems from a new side but also to see the previously unseen. That is why the term "information" has become so widespread in our time, entering into such concepts as information system, information culture and even information ethics.

Many scientific disciplines use information theory to emphasize a new direction in the old sciences; thus information geography, information economics and information law arose. The term "information" has become especially important in connection with the development of the latest computer technology, the automation of mental work, new means of communication and information processing, and particularly with the emergence of computer science. One of the most important tasks of information theory is the study of the nature and properties of information and the creation of methods for processing it, in particular the transformation of a wide variety of modern information into computer programs, by means of which mental work is automated: a kind of amplification of the intellect, and hence a development of society's intellectual resources.

The concept of "information" is used in various sciences, and in each it is associated with a different system of concepts. Information in biology: biology studies living nature, and there the concept of "information" is associated with the purposeful behaviour of living organisms. In living organisms, information is transmitted and stored by means of objects of various physical natures (for example, the state of DNA) that are regarded as signs of biological alphabets. Genetic information is inherited and stored in all the cells of living organisms. The philosophical approach: information is interaction, reflection, cognition. The cybernetic approach: information is a characteristic of a control signal transmitted over a communication line.

The role of information in philosophy

Early definitions of information as a category, concept and property of the material world were dominated by a subjective, traditional viewpoint. Information exists outside our consciousness and can be reflected in our perception only as the result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information is immaterial, as are all properties of matter. Information stands in the series: matter, space, time, systemicity, function and so on, which are the fundamental concepts of a formalized reflection of objective reality in its distribution and variability, diversity and manifestations. Information is a property of matter and reflects its properties (state or the ability to interact) and quantity (measure) through interaction.

From the material point of view, information is the order of the objects of the material world. For example, the order of letters on a sheet of paper according to certain rules is written information; the sequence of multicoloured dots on a sheet of paper according to certain rules is graphic information; the order of musical notes is musical information; the order of genes in DNA is hereditary information; the order of bits in a computer is computer information, and so on. The implementation of an information exchange requires necessary and sufficient conditions.

The necessary conditions:

The presence of at least two different objects of the material or non-material world;

The presence in the objects of a common property that allows the objects to be identified as carriers of information;

The presence in the objects of a specific property that allows the objects to be distinguished from one another;

The presence of a spatial property that makes it possible to determine the order of the objects. For example, the arrangement of written information on paper is a specific property of paper that allows letters to be arranged from left to right and from top to bottom.

There is only one sufficient condition: the presence of a subject capable of recognizing information, such as a person, human society, a society of animals, a robot, and so on. An informational message is constructed by selecting copies of objects from a basis and arranging these objects in space in a certain order. The length of the informational message is defined as the number of copies of basis objects and is always expressed as an integer. One must distinguish the length of an information message, which is always measured as an integer, from the amount of knowledge contained in it, which is measured in a unit of measure as yet unknown. From the mathematical point of view, information is a sequence of integers written as a vector. The numbers are the indices of the objects in the information basis. The vector is called the information invariant, since it does not depend on the physical nature of the basis objects: one and the same informational message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of these.
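This idea of an information invariant can be sketched directly: a message becomes a vector of integer indices into a chosen basis, and the same vector can then be rendered in any other basis of sufficient size. A small Python illustration (the particular bases are assumptions for the example):

```python
def to_invariant(message, basis):
    """Represent a message as a vector of integers: each symbol's index in the basis."""
    return [basis.index(sym) for sym in message]

def render(vector, basis):
    """Give the invariant a physical form again, in a possibly different basis."""
    return [basis[i] for i in vector]

latin = list("abcdefghijklmnopqrstuvwxyz")
digits = [str(d) for d in range(26)]  # another basis of the same size

vector = to_invariant("cab", latin)   # the invariant does not depend on the basis
```

Here render(vector, latin) restores "c", "a", "b", while render(vector, digits) expresses the very same invariant in a different physical alphabet.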

The role of information in physics

Information is knowledge about the surrounding world (an object, process, phenomenon or event) that is the object of transformation (including storage, transmission, etc.) and is used to develop behaviour, to make decisions, to manage or to learn.

The characteristics of information are as follows:

It is the most important resource of modern production: it reduces the need for land, labour and capital and cuts the consumption of raw materials and energy. For example, having the ability to archive your files (that is, possessing such information), you need not spend money on buying new floppy disks;

Information brings new production processes to life. For example, the invention of the laser beam led to the emergence and development of the production of laser (optical) discs;

Information is a commodity, and the seller does not lose it after the sale. Thus, if a student tells his friend the class schedule for the semester, he does not thereby lose this data himself;

Information gives additional value to other resources, in particular to labour. Indeed, an employee with higher education is valued more highly than one without.

As follows from the definition, three concepts are always associated with information:

The source of information is the element of the surrounding world (object, phenomenon, event) about which information is the object of transformation. Thus, the source of the information received by the reader of this study guide is computer science as a sphere of human activity;

The acquirer of information is the element of the surrounding world that uses information (to develop behaviour, to make decisions, to manage or to learn). The acquirer of this particular information is the reader himself;

A signal is a material carrier that captures information for its transfer from a source to a recipient. In this case, the signal is electronic in nature. If the student takes this manual in the library, then the same information will be on paper. Being read and memorized by a student, the information will acquire another carrier - biological, when it is “recorded” in the memory of the student.

The signal is the most important element in this chain. The forms of its presentation, as well as the quantitative and qualitative characteristics of the information it carries, which matter to the acquirer of information, are discussed later in this section of the textbook. The main characteristics of the computer, the chief tool that maps the source of information into a signal (link 1 in the figure) and "brings" the signal to the recipient of information (link 2 in the figure), are given in the Computer section. The structure of the procedures that implement links 1 and 2 and make up the information process is considered in the part Information process.

The objects of the material world are in a state of continuous change, characterized by the object's exchange of energy with the environment. A change in the state of one object always leads to a change in the state of some other object in the environment. This phenomenon, regardless of how it occurs, which particular states change, and which particular objects are involved, can be considered a transmission of a signal from one object to another. The change in the state of an object when a signal reaches it is called signal registration.

A signal, or a sequence of signals, forms a message that can be perceived by the recipient in one form or another and in one volume or another. Information in physics is a term that qualitatively generalizes the concepts of "signal" and "message". If signals and messages can be quantified, then we can say that signals and messages are units of measurement of the amount of information. A message (signal) is interpreted differently by different systems. For example, one long and two short beeps in sequence means the letter D in Morse code terminology, but a video card malfunction in the terminology of an Award BIOS.
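The point that one and the same signal is interpreted differently by different systems can be sketched as follows. Both lookup tables are simplified single-entry illustrations drawn from the example in the text, not complete code references:

```python
# The same physical signal, two different interpretation agreements.
signal = ("long", "short", "short")

# Simplified fragments of two code tables (illustrative, not exhaustive).
morse_table = {("long", "short", "short"): "D"}                  # dash-dot-dot
bios_table = {("long", "short", "short"): "video card malfunction"}  # Award BIOS beeps

print(morse_table[signal])  # -> D
print(bios_table[signal])   # -> video card malfunction
```

The signal itself carries no meaning; each table embodies a separate agreement between sender and receiver.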

The role of information in mathematics

In mathematics, information theory (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information, describes its properties, and establishes limiting relationships for data transmission systems. The main branches of information theory are source coding (compression coding) and channel (error-correcting) coding. Mathematics is more than a scientific discipline: it creates the common language of all science.

The subjects of mathematical research are abstract objects: number, function, vector, set, and others. Moreover, most of them are introduced axiomatically, i.e., without any connection to other concepts and without any definition.

Information is not among the subjects studied by mathematics. However, the word "information" is used in mathematical terms: self-information and mutual information, which belong to the abstract (mathematical) part of information theory. In the mathematical theory, the concept of "information" is associated exclusively with abstract objects, namely random variables, whereas in modern information theory this concept is treated much more broadly, as a property of material objects. The connection between these two uses of the same term is undeniable: it was the mathematical apparatus of random variables that was used by the author of information theory, Claude Shannon. He himself understood by the term "information" something fundamental (irreducible). Shannon's theory intuitively assumes that information has content: information reduces overall uncertainty and information entropy, and the amount of information is measurable. However, he warned researchers against mechanically transferring concepts from his theory to other areas of science.
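Shannon's measure can be illustrated with a short computation of information entropy. The formula is the standard textbook one, H = -Σ p·log2(p), in bits; the probability distributions below are arbitrary examples:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))   # -> 1.0
# A heavily biased coin is more predictable, so each outcome carries less
# information on average.
print(entropy([0.9, 0.1]))   # -> about 0.469
```

This quantifies the statement above that information reduces uncertainty: the more skewed the distribution, the less entropy there is to remove.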

"The search for ways to apply the theory of information in other fields of science is not reduced to a trivial transfer of terms from one field of science to another. This search is carried out in a long process of putting forward new hypotheses and their experimental verification." K. Shannon.

The role of information in cybernetics

The founder of cybernetics, Norbert Wiener, spoke of information as follows:

"Information is information, not matter or energy." But the main definition of information, which he gave in several of his books, is the following: information is a designation of the content we receive from the external world in the process of our adaptation to it, and of the adaptation of our senses to it.

Information is the basic concept of cybernetics, just as economic information is the basic concept of economic cybernetics.

There are many definitions of this term, and they are complex and contradictory. The reason, evidently, is that information as a phenomenon is dealt with by various sciences, of which cybernetics is only the youngest. Information is the subject of study of the science of management, of mathematics, of genetics, of the theory of mass media (print, radio, television), and of computer science, which deals with problems of scientific and technical information, and so on. Finally, in recent times philosophers have shown great interest in the problem of information: they tend to regard information as one of the basic universal properties of matter, connected with the concept of reflection. In all interpretations, the concept of information presupposes the existence of two objects: a source of information and an acquirer (receiver) of it. Information is transferred from one to the other by means of signals which, generally speaking, may have no physical connection with its meaning: this relationship is determined by agreement. For example, a stroke of the veche bell meant that people were to gather in the square, but to those who did not know of this arrangement it conveyed no information at all.

In the situation with the veche bell, a person party to the agreement on the meaning of the signal knows that at the given moment there are two alternatives: the veche assembly will take place or it will not. In the language of information theory, an uncertain event (the veche) has two outcomes. The received signal reduces the uncertainty: the person now knows that the event has only one outcome, namely that it will take place. However, if it was known in advance that the veche would take place at such-and-such an hour, the bell announced nothing new. It follows that the less probable (i.e., the more unexpected) a message is, the more information it contains; conversely, the more probable an outcome is before the event, the less information the signal carries. Reasoning of roughly this kind led in the 1940s to the emergence of the statistical, or "classical", theory of information, which defines the concept of information through a measure of the reduction of uncertainty about the occurrence of an event (such a measure was called entropy). N. Wiener, C. Shannon, and the Soviet scientists A. N. Kolmogorov, V. A. Kotelnikov, and others stood at the origins of this science. It became possible to measure quantities such as the capacity of communication channels and the storage capacity of information devices, which served as a powerful stimulus for the development of cybernetics as a science and of electronic computing technology as a practical application of its achievements.
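The rule that a less probable message carries more information corresponds to the standard self-information measure, -log2(p); the probabilities below are illustrative:

```python
import math

def self_information(p):
    """Information carried by an outcome of probability p, in bits."""
    return -math.log2(p)

# The bell announces one of two (assumed equally likely) outcomes: 1 bit.
print(self_information(0.5))   # -> 1.0
# A rarer, more surprising event carries more information.
print(self_information(0.25))  # -> 2.0
# An outcome known in advance (p = 1) carries 0 bits: the bell adds nothing.
```

This is exactly the bell example in quantitative form: certainty means zero information, surprise means more of it.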

As for defining the value or usefulness of information for the recipient, much remains unresolved and unclear. If we proceed from the needs of economic management and, consequently, of economic cybernetics, then information can be defined as all the facts, knowledge, and messages that help solve a particular management problem (that is, reduce the uncertainty of its outcomes). Some possibilities then open up for evaluating information: it is the more useful and valuable, the sooner or at the lower cost it leads to a solution of the problem. The concept of information is close to the concept of data. There is, however, a difference between them: data are signals from which the information must still be extracted, and data processing is the process of bringing them to a form suitable for this.

The process of their transfer from source to acquirer, and their perception as information, can be considered as passage through three filters:

A physical, or statistical, filter (a purely quantitative limitation on the channel's bandwidth, regardless of the content of the data, i.e., a syntactic limitation);

A semantic filter (selection of those data that the recipient can understand, i.e., that correspond to the thesaurus of his knowledge);

A pragmatic filter (selection, among the understood data, of those useful for solving the given problem).

This is well shown in a diagram taken from E. G. Yasin's book on economic information. Accordingly, three aspects of the study of information problems are distinguished: syntactic, semantic, and pragmatic.

By content, information is subdivided into socio-political, socio-economic (including economic information), scientific and technical, and so on. In general, there are many classifications of information, built on various grounds. As a rule, because the concepts are so close, data classifications are built in the same way. For example, information is subdivided into static (constant) and dynamic (variable), while data are divided into constants and variables. Another division is into primary, derivative, and output information (data are classified similarly). A third division is into managing and informing information; a fourth into redundant, useful, and false; a fifth into complete (continuous) and selective. Wiener's idea quoted earlier points directly to the objectivity of information, i.e., to its existence in nature independently of human consciousness (perception).

Modern cybernetics defines objective information as an objective property of material objects and phenomena to generate a variety of states that are transferred from one object (process) to another through fundamental interactions of matter and imprinted in its structure. A material system in cybernetics is considered as a set of objects that themselves can be in different states, but the state of each of them is determined by the states of other objects in the system.

In nature, the set of system states is information, the states themselves are the primary code, or source code. Thus, each material system is a source of information. Cybernetics defines subjective (semantic) information as the meaning or content of a message.

The role of information in computer science

The subject of computer science is precisely data: the methods of their creation, storage, processing, and transmission. Content (also "filling" or "site content") is a term denoting all the types of information (both textual and multimedia: images, audio, video) that make up the content of a website as visualized for a visitor. It serves to distinguish the information that makes up the internal structure of a page or site (its code) from what is ultimately displayed on the screen.

The word "information" comes from the Latin word informatio, which means information, clarification, familiarization. The concept of "information" is basic in the course of computer science, but it is impossible to define it through other, more "simple" concepts.

The following approaches to the definition of information can be distinguished:

Traditional (ordinary) - used in computer science: information is information, knowledge, messages about the state of affairs that a person perceives from the outside world with the help of the senses (vision, hearing, taste, smell, touch).

Probabilistic - used in the theory of information: information is information about objects and phenomena of the environment, their parameters, properties and state, which reduce the degree of uncertainty and incompleteness of knowledge about them.

Information is stored, transmitted and processed in symbolic (sign) form. The same information can be presented in different forms:

Written sign form, consisting of various signs, among which are symbolic signs in the form of text, numbers, and special symbols, as well as graphic, tabular, and other forms;

The form of gestures or signals;

Oral verbal form (conversation).

Information is presented by means of languages, i.e., sign systems built on the basis of a certain alphabet and possessing rules for performing operations on signs. A language is a particular sign system for representing information. There are:

Natural languages, i.e., spoken languages in oral and written form. In some cases colloquial speech can be replaced by the language of facial expressions and gestures, or by the language of special signs (for example, road signs);

Formal languages are special languages for various areas of human activity, characterized by a rigidly fixed alphabet and stricter rules of grammar and syntax. Examples are the language of music (notes), the language of mathematics (numbers and mathematical signs), number systems, programming languages, etc. At the heart of any language lies an alphabet, a set of symbols or signs. The total number of symbols in an alphabet is called the cardinality of the alphabet.
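The connection between an alphabet's cardinality and its expressive capacity can be sketched as follows: over an alphabet of cardinality N, exactly N to the power L distinct messages of length L exist. The alphabets chosen here are arbitrary examples:

```python
# Number of distinct messages of a given length over a given alphabet.
binary = "01"             # cardinality 2
decimal = "0123456789"    # cardinality 10

def message_count(alphabet, length):
    """Distinct messages of the given length: cardinality ** length."""
    return len(alphabet) ** length

print(message_count(binary, 8))   # -> 256 (one byte of binary symbols)
print(message_count(decimal, 3))  # -> 1000
```

A richer alphabet packs more distinguishable messages into the same length, which is why cardinality matters when comparing languages and number systems.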

Information carriers are a medium or a physical body used for the transmission, storage, and reproduction of information. (Examples are electrical, light, thermal, sound, and radio signals, magnetic and laser discs, printed publications, photographs, etc.)

Information processes are processes associated with the receipt, storage, processing, and transmission of information (i.e., actions performed on information). That is, they are processes during which the content of information or the form of its presentation changes.

To support an information process, a source of information, a communication channel, and an acquirer of information are needed. The source transmits (sends) the information, and the receiver receives (perceives) it. The transmitted information travels from source to receiver by means of a signal (code); it is the change in the signal that makes it possible to obtain information.

Being an object of transformation and use, information is characterized by the following properties:

Syntax is a property that determines the way information is presented on a carrier (in a signal). So, this information is presented on electronic media using a specific font. Here you can also consider such information presentation parameters as the style and color of the font, its size, line spacing, etc. The selection of the required parameters as syntactic properties is obviously determined by the proposed transformation method. For example, for a visually impaired person, the font size and color are essential. If you intend to enter this text into a computer through a scanner, the paper size is important;

Semantics is a property that defines the meaning of information as the correspondence of a signal to the real world. Thus, the semantics of the signal "computer science" lies in the definition given earlier. Semantics can be viewed as an agreement, known to the acquirer of information, about what each signal means (the so-called interpretation rule). For example, it is the semantics of signals that a novice motorist studies when learning the rules of the road and the road signs (here the signs themselves act as signals), and the semantics of words (signals) is what a student of a foreign language learns. We can say that the point of teaching computer science is to study the semantics of various signals, i.e., the essence of the key concepts of this discipline;

Pragmatics is a property that determines the influence of information on the behavior of the acquirer. Thus, the pragmatics of the information received by the reader of this study guide is, at the very least, successful passing of the computer science exam. One would like to believe that the pragmatics of this work will not be limited to that, and that it will serve the reader's further education and professional activity.

It should be noted that signals of different syntax can have the same semantics. For example, the signals "computer" and "EVM" (the Russian abbreviation for electronic computing machine) both denote an electronic device for converting information; in this case one usually speaks of signal synonymy. On the other hand, one signal (i.e., information with one syntactic property) can have different pragmatics for different consumers, and different semantics. Thus, the road sign known as a "brick", with its well-defined semantics ("no entry"), means a ban on entry for a motorist but does not affect a pedestrian in any way. At the same time, the signal "key" can have different semantics: a treble clef, a spring (a source of water), a key that opens a lock, or a key used in computer science to encode a signal in order to protect it from unauthorized access (in this case we speak of signal homonymy). There are also antonym signals with opposite semantics, for example "cold" and "hot", "fast" and "slow", etc.

The subject of study of the science of informatics is precisely data: the methods of their creation, storage, processing, and transmission. The information recorded in the data, its semantic content, is of interest to users of information systems, specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist in geological information, a businessman in commercial information, and so on (including the computer scientist, who is interested in information on working with data).

Semiotics - the science of information

Information cannot be imagined without its receipt, processing, transmission, etc., that is, outside the framework of information exchange. All acts of information exchange are carried out by means of symbols or signs, with the help of which one system influences another. Therefore, the main science that studies information is semiotics - the science of signs and sign systems in nature and society (the theory of signs). In each act of information exchange, one can find three of its "participants", three elements: a sign, an object that it designates, and a recipient (user) of the sign.

Depending on which elements' relations are considered, semiotics is divided into three branches: syntactics, semantics, and pragmatics. Syntactics studies signs and the relationships between them, abstracting from the content of the sign and from its practical value for the recipient. Semantics studies the relationship between signs and the objects they designate, abstracting from the recipient of the signs and their value for him; clearly, studying the regularities of the semantic representation of objects in signs is impossible without taking into account the general regularities of the construction of sign systems studied by syntactics. Pragmatics studies the relationship between signs and their users. Within pragmatics, all the factors that distinguish one act of information exchange from another are studied, along with all questions of the practical results of using information and of its value for the recipient.

At the same time, many aspects of the relations of signs to one another and to the objects they designate are inevitably touched upon. The three branches of semiotics thus correspond to three levels of abstraction from the characteristics of specific acts of information exchange. The study of information in all its diversity corresponds to the pragmatic level. Abstracting from the recipient of information, excluding him from consideration, we pass to studying it at the semantic level; abstracting further from the content of signs, we transfer the analysis of information to the syntactic level. This interpenetration of the main branches of semiotics, associated with different levels of abstraction, can be represented by the scheme "Three branches of semiotics and their interrelation". Accordingly, information is measured in three aspects: syntactic, semantic, and pragmatic. The need for such different measurements of information, as will be shown below, is dictated by the practice of designing and operating information systems. Consider a typical production situation.

At the end of a shift, the site planner prepares data on the fulfillment of the production schedule. These data are sent to the information and computing center (ICC) of the enterprise, where they are processed and issued to managers in the form of reports on the current state of production. On the basis of the data received, the shop manager decides whether to change the production plan for the next planning period or to take other organizational measures. Obviously, for the shop manager the amount of information the summary contains depends on the magnitude of the economic effect obtained from using it in decision-making, on how useful the information was. For the site planner, the amount of information in the same message is determined by the accuracy of its correspondence to the actual state of affairs on the site and by the degree of unexpectedness of the reported facts: the more unexpected they are, the sooner they must be reported to management, and the more information the message contains. For the ICC staff, the number of characters, the length of the message carrying the information, is of paramount importance, since it determines the loading time of the computing equipment and the communication channels; neither the usefulness of the information nor a quantitative measure of its semantic value is of practical interest to them.

Naturally, when organizing a production management system and building decision-making models, we will use the usefulness of information as the measure of the information content of messages. When building an accounting and reporting system that provides management with data on the progress of the production process, the novelty of the received information should be taken as the measure. The procedures for the mechanical processing of information, in turn, require measuring the volume of messages as the number of processed characters. These three essentially different approaches to measuring information do not contradict or exclude one another; on the contrary, by measuring information on different scales they allow a fuller and more comprehensive assessment of the information content of each message and a more efficient organization of the production management system. In the apt expression of Prof. N. E. Kobrinsky, when it comes to the rational organization of information flows, the quantity, novelty, and usefulness of information turn out to be as interconnected as the quantity, quality, and cost of products in production.

Information in the material world

Information is one of the general concepts associated with matter. Information exists in any material object in the form of a variety of its states and is transmitted from object to object in the process of their interaction. The existence of information as an objective property of matter logically follows from the well-known fundamental properties of matter - structure, continuous change (movement) and interaction of material objects.

The structure of matter manifests itself as the internal articulation of a whole, a regular order of connection of elements within it. In other words, any material object, from a subatomic particle to the Universe as a whole, is a system of interconnected subsystems. As a result of continuous movement, understood broadly as movement in space and development in time, material objects change their states. The states of objects also change when they interact with other objects. The set of states of a material system and of all its subsystems represents information about the system.

Strictly speaking, owing to the inexhaustibility of the properties and structure of matter, the amount of objective information in any material object is infinite. This information is called complete. It is possible, however, to single out structural levels with finite sets of states. Information that exists at a structural level with a finite number of states is called private, and for private information the concept of the amount of information is meaningful.

From the above representation, the choice of the unit of measure for the amount of information follows logically and simply. Imagine a system that can be in only two equally probable states. Let's assign code "1" to one of them, and "0" to the other. This is the minimum amount of information that the system can contain. It is the unit of measurement of information and is called a bit. There are other, more difficult to define, methods and units for measuring the amount of information.
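The choice of the bit as the unit can be restated numerically: identifying one of N equally probable states requires log2(N) bits, and the minimal two-state system carries exactly one bit. A small sketch:

```python
import math

def bits_for_states(n):
    """Bits needed to identify one of n equally probable states."""
    return math.log2(n)

# The minimal system from the text: two states, codes "1" and "0".
print(bits_for_states(2))    # -> 1.0 (one bit)
print(bits_for_states(256))  # -> 8.0 (one byte's worth of states)
```

Doubling the number of states always adds exactly one bit, which is why the two-state system is the natural minimum.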

Depending on the material form of the carrier, information can be of two main types - analog and discrete. Analog information changes in time continuously and takes values ​​from a continuum of values. Discrete information changes at some points in time and takes values ​​from a certain set of values. Any material object or process is the primary source of information. All its possible states constitute the code of the source of information. The instantaneous value of states is represented as a symbol ("letter") of this code. In order for information to be transmitted from one object to another as a receiver, it is necessary that there be some kind of intermediate material carrier that interacts with the source. Such carriers in nature, as a rule, are rapidly propagating processes of the wave structure - cosmic, gamma and x-ray radiation, electromagnetic and sound waves, potentials (and maybe not yet discovered waves) of the gravitational field. When electromagnetic radiation interacts with an object, its spectrum changes as a result of absorption or reflection, i.e. the intensities of some wavelengths change. The harmonics of sound vibrations also change during interactions with objects. Information is also transmitted during mechanical interaction, but mechanical interaction, as a rule, leads to large changes in the structure of objects (up to their destruction), and the information is greatly distorted. Distortion of information during its transmission is called misinformation.

The transfer of source information to a carrier structure is called encoding. In this case, the source code is converted into the carrier code. A carrier with a source code transferred to it in the form of a carrier code is called a signal. The signal receiver has its own set of possible states, which is called the receiver code. The signal, interacting with the receiving object, changes its states. The process of converting a signal code into a receiver code is called decoding. The transfer of information from a source to a receiver can be considered as an information interaction. Information interaction is fundamentally different from other interactions. With all other interactions of material objects, there is an exchange of matter and (or) energy. In this case, one of the objects loses matter or energy, while the other receives them. This property of interactions is called symmetry. During information interaction, the receiver receives information, and the source does not lose it. Information interaction is not symmetrical. Objective information itself is not material, it is a property of matter, such as structure, movement, and exists on material carriers in the form of its codes.
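The encoding and decoding chain described above can be sketched with an arbitrary carrier code. The choice of 8-bit character codes as the "carrier code" here is purely illustrative:

```python
# Source code -> carrier code (encoding); the carrier with the code is
# the signal; carrier code -> receiver code (decoding).
def encode(text):
    """Encoding: map each source symbol onto an 8-bit carrier code."""
    return ["{:08b}".format(ord(ch)) for ch in text]

def decode(signal):
    """Decoding: map the carrier code into the receiver's code (text)."""
    return "".join(chr(int(bits, 2)) for bits in signal)

signal = encode("bit")
print(signal)          # the signal: a sequence of carrier-code symbols
print(decode(signal))  # -> bit
# Note the asymmetry described above: the source still holds "bit";
# encoding copies information to the carrier, it does not remove it.
```

The round trip decode(encode(x)) == x models a lossless channel; real channels add the distortions discussed in the preceding paragraph.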

Information in living nature

Living nature is complex and varied. The sources and receivers of information in it are living organisms and their cells. An organism has a number of properties that distinguish it from inanimate material objects.

Main:

Continuous exchange of matter, energy and information with the environment;

Irritability, the body's ability to perceive and process information about changes in the environment and the internal environment of the body;

Excitability, the ability to respond to the action of stimuli;

Self-organization, manifested as changes in the body to adapt to environmental conditions.

An organism, considered as a system, has a hierarchical structure. Relative to the organism itself, this structure is subdivided into internal levels: the molecular level, the cellular level, the level of organs and, finally, the organism itself. The organism also interacts with supra-organismic living systems, whose levels are the population, the ecosystem, and living nature as a whole (the biosphere). Flows not only of matter and energy but also of information circulate between all these levels. Information interactions in living nature occur in the same way as in inanimate nature; at the same time, in the course of evolution living nature has created a great variety of sources, carriers, and receivers of information.

The reaction to the influences of the external world is manifested in all organisms, since it is due to irritability. In higher organisms, adaptation to the environment is a complex activity that is effective only with sufficiently complete and timely information about the environment. The receivers of information from the external environment are the sense organs, which include sight, hearing, smell, taste, touch and the vestibular apparatus. In the internal structure of organisms, there are numerous internal receptors associated with the nervous system. The nervous system consists of neurons, the processes of which (axons and dendrites) are analogous to information transmission channels. The main organs that store and process information in vertebrates are the spinal cord and brain. In accordance with the characteristics of the sense organs, the information perceived by the body can be classified as visual, auditory, gustatory, olfactory and tactile.

Falling on the retina of the human eye, a signal excites its constituent cells in a particular way. The nerve impulses of the cells are transmitted through axons to the brain, and the brain remembers this sensation as a certain combination of states of its neurons. (The example is continued in the section "Information in human society".) By accumulating information, the brain builds a connected information model of the surrounding world within its own structure. In living nature, an important characteristic of information for the organism receiving it is its availability. The amount of information that the human nervous system is able to feed into the brain when reading text is approximately 1 bit per 1/16 of a second.

The study of organisms is hampered by their complexity. The abstraction of structure as a mathematical set, acceptable for inanimate objects, is hardly acceptable for a living organism, because to create a more or less adequate abstract model of an organism, all the hierarchical levels of its structure must be taken into account. It is therefore difficult to introduce a measure of the amount of information, and difficult to determine the relationships between the components of the structure: even if it is known which organ is the source of information, what exactly is the signal, and what is the receiver?

Before the advent of computers, biology, dealing with the study of living organisms, used only qualitative, i.e. descriptive models. In a qualitative model, it is practically impossible to take into account the information links between the components of the structure. Electronic computing technology has made it possible to apply new methods in biological research, in particular, the method of machine modeling, which involves a mathematical description of known phenomena and processes occurring in the body, adding hypotheses about some unknown processes to them, and calculating possible variants of the body's behavior. The resulting options are compared with the actual behavior of the organism, which allows you to determine the truth or falsity of the hypotheses put forward. Information interaction can also be taken into account in such models. Extremely complex are the information processes that ensure the existence of life itself. And although it is intuitively clear that this property is directly related to the formation, storage and transmission of complete information about the structure of the body, an abstract description of this phenomenon seemed impossible for some time. However, the information processes that ensure the existence of this property have been partly revealed through the deciphering of the genetic code and reading the genomes of various organisms.

Information in human society

The development of matter in the process of motion is directed towards the complication of the structure of material objects. One of the most complex structures is the human brain. So far, this is the only structure known to us that possesses the property man himself calls consciousness. Speaking about information, we, as thinking beings, a priori assume that information, in addition to being present in the signals we receive, also carries some kind of meaning. Forming in his mind a model of the surrounding world as an interconnected set of models of its objects and processes, a person operates with semantic concepts rather than with information itself. Meaning is the essence of a phenomenon that does not coincide with the phenomenon itself and connects it with a wider context of reality. The word itself directly indicates that the semantic content of information can be formed only by thinking receivers of information. In human society it is not information itself that acquires decisive importance, but its semantic content.

Example (continued). Having experienced such a sensation, a person assigns the concept "tomato" to the object and the concept "red color" to its state. In addition, his consciousness fixes the connection "tomato" - "red". This is the meaning of the received signal. The ability of the brain to create semantic concepts and connections between them is the basis of consciousness. Consciousness can be viewed as a self-developing semantic model of the surrounding world. Meaning is not information: information exists only on a physical medium, whereas human consciousness is considered intangible. Meaning exists in the human mind in the form of words, images and sensations. A person can pronounce words not only out loud but also "to himself"; he can likewise create (or recall) images and sensations "to himself". However, he can retrieve the information corresponding to this meaning by speaking the words aloud or writing them down.

Example (continued). If the words "tomato" and "red color" are the meaning of concepts, then where is the information? The information is contained in the brain in the form of certain states of its neurons. It is also contained in the printed text consisting of these words: when the letters are encoded with a three-digit binary code, its amount is 120 bits. If the words are spoken aloud, there will be much more information, but the meaning will remain the same. The greatest amount of information is carried by a visual image, which is reflected even in folklore: "it is better to see once than to hear a hundred times." Information recovered in this way is called semantic information, since it encodes the meaning of some primary information (semantics). Hearing (or seeing) a phrase spoken (or written) in a language he does not know, a person receives information but cannot determine its meaning. Therefore, in order to transmit the semantic content of information, certain agreements are required between the source and the receiver on the semantic content of the signals, i.e. the words. Such agreements can be reached through communication, one of the most important conditions for the existence of human society.

In the modern world, information is one of the most important resources and, at the same time, one of the driving forces of the development of human society. Information processes taking place in the material world, in living nature and in human society are studied (or at least taken into account) by all scientific disciplines, from philosophy to marketing. The increasing complexity of the tasks of scientific research has made it necessary to involve large teams of scientists of various specialties in their solution, so almost all the theories considered below are interdisciplinary. Historically, two complex branches of science, cybernetics and informatics, have been directly involved in the study of information.

Modern cybernetics is a multidisciplinary branch of science that studies supercomplex systems, such as:

Human society (social cybernetics);

Economics (economic cybernetics);

Living organism (biological cybernetics);

The human brain and its function, consciousness (artificial intelligence).

Informatics, which took shape as a science in the middle of the last century, separated from cybernetics and studies methods for obtaining, storing, transmitting and processing semantic information. Both branches rely on several fundamental scientific theories, including information theory and its sections: coding theory, algorithm theory and automata theory. Studies of the semantic content of information are based on a complex of scientific theories under the general name of semiotics. Information theory is a complex, mainly mathematical theory that describes and evaluates methods for extracting, transmitting, storing and classifying information. It treats information carriers as elements of an abstract (mathematical) set, and interactions between carriers as ways of arranging elements in this set. This approach makes it possible to describe the information code formally, that is, to define an abstract code and study it by mathematical methods. For these studies it applies probability theory, mathematical statistics, linear algebra, game theory and other mathematical theories.

The foundations of this theory were laid by the American scientist R. Hartley in 1928, who defined a measure of the amount of information for certain communication problems. Later the theory was significantly developed by the American scientist C. Shannon and the Russian scientists A.N. Kolmogorov, V.M. Glushkov and others. Modern information theory includes as sections coding theory, algorithm theory, the theory of digital automata (see below) and some others. There are also alternative information theories, for example the "qualitative information theory" proposed by the Polish scientist M. Mazur. Any person is familiar with the concept of an algorithm without even knowing it. Here is an example of an informal algorithm: "Cut the tomatoes into circles or slices. Add chopped onion, pour over vegetable oil, then sprinkle with finely chopped capsicum and mix. Before serving, sprinkle with salt, put in a salad bowl and garnish with parsley" (tomato salad).

The first rules for solving arithmetic problems in the history of mankind were developed by one of the famous scientists of the past, Al-Khwarizmi, in the 9th century AD. In his honor, formalized rules for achieving a goal are called algorithms. The subject of the theory of algorithms is finding methods for constructing and evaluating effective (including universal) computational and control algorithms for information processing. To substantiate such methods, the theory of algorithms uses the mathematical apparatus of information theory. The modern scientific concept of algorithms as ways of processing information was introduced in the works of E. Post and A. Turing in the 1930s (the Turing machine). A great contribution to the development of the theory of algorithms was made by the Russian scientists A. Markov (the normal Markov algorithm) and A. Kolmogorov. Automata theory is a branch of theoretical cybernetics that studies mathematical models of actually existing or fundamentally possible devices that process discrete information at discrete moments of time.

The concept of an automaton originated in the theory of algorithms. If there are universal algorithms for solving computational problems, then there must be devices (albeit abstract ones) for implementing such algorithms. In fact, the abstract Turing machine considered in the theory of algorithms is at the same time an informally defined automaton. The theoretical justification for the construction of such devices is the subject of automata theory. Automata theory uses the apparatus of mathematical theories: algebra, mathematical logic, combinatorial analysis, graph theory, probability theory, etc. Automata theory, together with the theory of algorithms, is the main theoretical basis for creating electronic computers and automated control systems. Semiotics is a complex of scientific theories that study the properties of sign systems. The most significant results have been achieved in the branch of semiotics called semantics, whose subject of research is the semantic content of information.

A sign system is a system of concrete or abstract objects (signs, words), each of which is associated in a certain way with a certain value. In theory it is proved that there can be two such correspondences. The first type of correspondence directly defines the material object denoted by the word and is called the denotation (or, in some works, the nominatum). The second type of correspondence determines the meaning of the sign (word) and is called the concept. Properties of these correspondences such as "meaning", "truth", "definability", "entailment" and "interpretation" are studied using the apparatus of mathematical logic and mathematical linguistics. The foundations of semiotics were laid by F. de Saussure in the 19th century and were formulated and developed by C. Peirce (1839-1914), C. Morris (born 1901), R. Carnap (1891-1970) and others. The main achievement of the theory is the creation of an apparatus of semantic analysis that allows the meaning of a text in a natural language to be represented as a record in some formalized semantic language. Semantic analysis is the basis for creating devices (programs) for machine translation from one natural language to another.

Information is stored by transferring it to material carriers. Semantic information recorded on a material storage medium is called a document. Mankind learned to store information a very long time ago. The most ancient forms of information storage used the arrangement of objects: shells and stones on the sand, knots on a rope. A significant development of these methods was writing, the graphic representation of symbols on stone, clay, papyrus and paper. Of great importance in the development of this direction was the invention of printing. Over its history, humanity has accumulated a huge amount of information in libraries, archives, periodicals and other written documents.

At present, the storage of information in the form of sequences of binary characters has gained particular importance. These methods are implemented by various storage devices, which form the central link of information storage systems. In addition, such systems use means of information retrieval (search engines), means of obtaining information (information and reference systems) and means of displaying information (output devices). Organized according to the purpose of the information, such information systems form databases, data banks and knowledge bases.

The transfer of semantic information is the process of its spatial transfer from the source to the recipient (addressee). Man learned to transmit and receive information even earlier than to store it. Speech is the method of transmission that our distant ancestors used in direct contact (conversation), and we still use it now. To transmit information over long distances, much more complex information processes are needed. To implement such a process, information must be formalized (presented) in some way. To represent information, various sign systems are used: sets of predetermined semantic symbols, such as objects, pictures, and written or printed words of a natural language. The semantic information about some object, phenomenon or process presented with their help is called a message.

Obviously, in order to transmit a message over a distance, information must be transferred to some kind of mobile carrier. Carriers can move in space with the help of vehicles, as is the case with letters sent by mail. This method ensures complete reliability of information transmission, since the addressee receives the original message, but requires a significant amount of time for transmission. Since the middle of the 19th century, methods of transmitting information have become widespread, using a naturally propagating carrier of information - electromagnetic oscillations (electrical oscillations, radio waves, light). The implementation of these methods requires:

Preliminary transfer of the information contained in the message to the carrier - encoding;

Ensuring the transmission of the signal thus obtained to the addressee via a special communication channel;

Reverse conversion of the signal code into the message code - decoding.
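As a sketch, this encode-channel-decode chain can be modeled in a few lines of Python (the function names are illustrative, and the channel here is an idealized, noise-free stand-in for a real link):

```python
def encode(message: str) -> bytes:
    # Step 1: transfer the information in the message onto a carrier code
    # (here: UTF-8 bytes stand in for the signal code).
    return message.encode("utf-8")

def channel(signal: bytes) -> bytes:
    # Step 2: an idealized, noise-free communication channel.
    return signal

def decode(signal: bytes) -> str:
    # Step 3: reverse conversion of the signal code into the message code.
    return signal.decode("utf-8")

received = decode(channel(encode("tomato")))
print(received)   # prints "tomato": the addressee recovers the original message
```

A real channel would add noise between steps 2 and 3, which is exactly why the quality-assurance measures discussed below are needed.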

The use of electromagnetic media makes the delivery of a message to the addressee almost instantaneous, but it requires additional measures to ensure the quality (reliability and accuracy) of the transmitted information, since real communication channels are subject to natural and artificial interference. Devices that implement the process of data transmission form communication systems. Depending on the method of presenting information, communication systems can be divided into sign systems (telefax), sound systems, video systems and combined systems (television). The most developed communication system of our time is the Internet.

Data processing

Since information is not material, its processing consists of various transformations. Processing includes any transfer of information from one medium to another. The information to be processed is called data. The main type of processing of primary information received by various devices is its transformation into a form that can be perceived by the human senses. Thus, space photographs obtained in X-rays are converted into ordinary color photographs using special spectrum converters and photographic materials. Night vision devices convert an image obtained in infrared (thermal) rays into an image in the visible range. For some communication and control tasks, analog information must be converted; for this, analog-to-digital and digital-to-analog signal converters are used.

The most important type of processing of semantic information is determining the meaning (content) contained in a certain message. Unlike primary information, semantic information has no statistical characteristics, that is, no quantitative measure: the meaning is either there or it is not, and how much of it there is, if any, cannot be established. The meaning contained in the message is described in an artificial language that reflects the semantic relationships between the words of the source text. A dictionary of such a language, called a thesaurus, resides in the message receiver. The meaning of the words and phrases of the message is determined by assigning them to groups of words or phrases whose meaning has already been established. The thesaurus thus makes it possible to establish the meaning of the message and is, at the same time, replenished with new semantic concepts. This type of information processing is used in information retrieval systems and machine translation systems.
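The thesaurus mechanism described above can be sketched roughly as follows (the word groups and the `interpret` helper are invented for illustration, not taken from any real system):

```python
# Hypothetical thesaurus: words already assigned to established semantic groups.
thesaurus = {
    "tomato": "vegetable",
    "red": "color",
}

def interpret(message: str, thesaurus: dict) -> dict:
    """Assign each word of the message to a known semantic group; words that are
    not yet known are recorded as new concepts, replenishing the thesaurus."""
    meaning = {}
    for word in message.lower().split():
        meaning[word] = thesaurus.setdefault(word, "new-concept")
    return meaning

print(interpret("red tomato", thesaurus))   # {'red': 'color', 'tomato': 'vegetable'}
interpret("ripe tomato", thesaurus)
print(thesaurus["ripe"])                    # new-concept: the thesaurus was replenished
```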

One widespread type of information processing is the solution of computational and automatic control problems with the help of computers. Information processing is always done with a purpose, and to achieve it the order of actions on the information that leads to the given goal must be known. Such a procedure is called an algorithm. Besides the algorithm itself, a device that implements the algorithm is also needed; in scientific theories such a device is called an automaton. The most important feature of information should be noted here: owing to the asymmetry of information interaction, new information arises during information processing, while the original information is not lost.

Analog and digital information

Sound is wave vibrations in a medium, such as air. When a person speaks, the vibrations of the vocal cords are converted into wave vibrations of the air. If we consider sound not as a wave but as oscillations at a single point, then these oscillations can be represented as air pressure changing over time. A microphone can pick up the pressure changes and convert them into electrical voltage. Air pressure has thus been transformed into fluctuations of electrical voltage.

Such a transformation can occur according to various laws; most often it is linear. For example:

U(t)=K(P(t)-P_0),

where U(t) is the electrical voltage, P(t) is the air pressure, P_0 is the mean air pressure and K is the conversion factor.

Both electrical voltage and air pressure are continuous functions of time. The functions U(t) and P(t) carry information about the vibrations of the vocal cords. These functions are continuous, and such information is called analog. Music is a special case of sound, and it can also be represented as some function of time; that would be an analog representation of music. But music is also recorded in the form of notes. Each note has a duration that is a multiple of a predetermined duration, and a pitch (do, re, mi, fa, sol, etc.). If this data is converted into numbers, we get a digital representation of music.

Human speech is also a special case of sound. It, too, can be represented in analog form. But just as music can be broken down into notes, speech can be broken down into letters. If each letter is given its own set of numbers, we obtain a digital representation of speech. The difference between analog and digital information is that analog information is continuous, while digital information is discrete. The transformation of information from one type to another is named differently depending on the kind of transformation: simple transformations are called "conversion", such as digital-to-analog and analog-to-digital conversion; complex transformations are called "encoding", e.g. delta coding and entropy coding; transformations between characteristics such as amplitude, frequency or phase are called "modulation", for example amplitude-frequency modulation and pulse-width modulation.
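As an illustration of both ideas, the linear law U(t) = K(P(t) - P_0) and a subsequent analog-to-digital conversion can be sketched in Python (the values of K, P_0, the voltage range and the 8-bit quantizer are all assumed for the example):

```python
import math

K, P0 = 0.5, 101325.0   # assumed conversion factor and mean air pressure (Pa)

def pressure_to_voltage(p: float) -> float:
    # Analog-to-analog conversion by the linear law U = K * (P - P0).
    return K * (p - P0)

def quantize(u: float, u_min: float = -1.0, u_max: float = 1.0, bits: int = 8) -> int:
    # Analog-to-digital conversion: map the voltage to one of 2**bits levels.
    levels = 2 ** bits
    u = min(max(u, u_min), u_max)            # clip to the converter's range
    return min(int((u - u_min) / (u_max - u_min) * levels), levels - 1)

# Sample a 440 Hz pressure oscillation at 8000 samples per second.
samples = [quantize(pressure_to_voltage(P0 + 2.0 * math.sin(2 * math.pi * 440 * t / 8000)))
           for t in range(8)]
print(samples)   # a discrete (digital) representation of the analog signal
```

Sampling at discrete moments and rounding to a finite number of levels is exactly what makes the result digital rather than analog.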

Usually, analog conversions are fairly simple and are easily handled by various devices invented by man. A tape recorder converts magnetization on tape into sound, a voice recorder converts sound into magnetization on tape, a video camera converts light into magnetization on tape, an oscilloscope converts electrical voltage or current into an image, and so on. Converting analog information to digital is much more difficult. Some transformations cannot be performed by a machine at all, or only with great difficulty: for example, converting speech into text, or converting a concert recording into sheet music. Even for information that is digital by nature, it is very difficult for a machine to convert text on paper into the same text in computer memory.

Why, then, use the digital representation of information if it is so difficult? The main advantage of digital information over analog is noise immunity. When digital information is copied, it is copied exactly as it is and can be duplicated an almost infinite number of times, whereas analog information picks up noise during copying and its quality deteriorates. Usually analog information withstands no more than about three generations of copying. If you have a dual-cassette tape recorder, you can run an experiment: try copying the same song from cassette to cassette several times, and after a few such re-recordings you will notice how much the recording quality has deteriorated. The information on the cassette is stored in analog form. Music in mp3 format, by contrast, can be copied as many times as you like without any loss of quality, because the information in an mp3 file is stored digitally.

Amount of information

A person, or some other receiver of information, resolves some uncertainty by receiving a portion of information. Take a tree as an example. When we see the tree, we resolve a number of uncertainties: we learn the height of the tree, its species, the density of the foliage, the color of the leaves and, if it is a fruit tree, we see the fruits on it and how ripe they are. Before we looked at the tree, we knew none of this; after we looked at it, we resolved the uncertainty and received information.

If we go out into a meadow and look at it, we receive different information: how big the meadow is, how tall the grass is, what color the grass is. If a biologist enters the same meadow, he will, among other things, be able to determine what varieties of grass grow there, what type of meadow it is, which flowers have already bloomed and which are about to, and whether the meadow is suitable for grazing cows. That is, he will receive more information than we do: he had more questions before he looked at the meadow, so the biologist resolves more uncertainties.

The greater the uncertainty resolved in the process of obtaining information, the more information we have received. But this is a subjective measure of the amount of information, and we would like an objective one. There is a formula for calculating the amount of information: if some uncertainty has N possible outcomes, each with its own probability, then the amount of information received can be calculated using the formula that Shannon proposed:

I = -(p_1 log_2 p_1 + p_2 log_2 p_2 + ... + p_N log_2 p_N), where

I - amount of information;

N is the number of outcomes;

p_1, p_2, ..., p_N are the probabilities of the outcomes.

The amount of information is measured in bits - short for English words BInary digiT, which means binary digit.

For equiprobable events, the formula can be simplified:

I = log_2 N, where

I - amount of information;

N is the number of outcomes.

Take, for example, a coin and toss it on the table. It will land either heads or tails: we have 2 equally likely events. After tossing the coin, we have received log_2 2 = 1 bit of information.

Let's find out how much information we get after rolling a die. The die has six faces, that is, six equally likely events. We get log_2 6 approx 2.6: after rolling the die on the table, we have received approximately 2.6 bits of information.
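Both results can be checked with a direct implementation of Shannon's formula (a minimal sketch; the function name is ours):

```python
import math

def information(probabilities):
    """Shannon's formula: I = -(p_1*log2(p_1) + ... + p_N*log2(p_N))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(information([0.5, 0.5]))            # 1.0 bit for the coin
print(round(information([1/6] * 6), 1))   # 2.6 bits for the die
```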

The chance of seeing a Martian dinosaur when we leave our house is one in ten billion. How much information about the Martian dinosaur will we get after we leave the house?

I = -((1/10^10) log_2(1/10^10) + (1 - 1/10^10) log_2(1 - 1/10^10)) approx 3.4*10^(-9) bits.

Suppose we tossed 8 coins. We have 2^8 coin drop options. So after tossing coins we get log_2(2^8)=8 bits of information.

When we ask a question and are equally likely to get a yes or no answer, then after answering the question we get one bit of information.

Surprisingly, if we apply Shannon's formula to analog information, we get an infinite amount of information. For example, the voltage at a point of an electrical circuit can take any value from zero to one volt with equal probability. The number of outcomes is infinite, and substituting this value into the formula for equiprobable events, we get infinity: an infinite amount of information.
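This divergence is easy to see with the formula for equiprobable events: every doubling of the measurement resolution adds one bit, without bound (a small numerical sketch):

```python
import math

# A voltage between 0 and 1 V, quantized into N equally likely levels.
for n_levels in [2, 4, 1024, 2**20]:
    print(n_levels, "levels ->", math.log2(n_levels), "bits")
# As the number of levels grows toward infinity, log2(N) grows without bound:
# a truly continuous (analog) value would carry an infinite amount of information.
```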

Now let us show how to encode "War and Peace" with just one notch on a metal rod. We encode all the letters and signs that occur in "War and Peace" with two-digit numbers; these should be enough for us. For example, we give the letter "A" the code "00", the letter "B" the code "01", and so on, likewise encoding punctuation marks, Latin letters and digits. Recoding "War and Peace" with this code, we get a long number, for example 70123856383901874..., and put "0." in front of it (0.70123856383901874...). The result is a number between zero and one. Now let us make a notch on a metal rod so that the ratio of the length of the rod to the left of the notch to the length of the whole rod is exactly equal to our number. If we later want to read "War and Peace", we simply measure the distance from the left end of the rod to the notch and the length of the entire rod, divide one number by the other, obtain the number and recode it back into letters ("00" into "A", "01" into "B", etc.).
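The two-digit coding scheme can be sketched in Python; the code table below is illustrative, and the "rod" is kept as an exact decimal string, which sidesteps the measurement problem discussed next:

```python
# Illustrative two-digit code table: "00" for "A", "01" for "B", and so on.
alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ ."
code = {ch: f"{i:02d}" for i, ch in enumerate(alphabet)}
back = {v: k for k, v in code.items()}

def encode(text: str) -> str:
    # Concatenate the two-digit codes and prepend "0.": a number in (0, 1).
    return "0." + "".join(code[ch] for ch in text.upper())

def decode(number: str) -> str:
    digits = number[2:]                              # strip the leading "0."
    pairs = (digits[i:i + 2] for i in range(0, len(digits), 2))
    return "".join(back[p] for p in pairs)

notch = encode("WAR AND PEACE")   # the position of the single notch on the rod
print(notch)
print(decode(notch))              # the text is recovered exactly
```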

In reality, we will not be able to do this, since we will not be able to determine the lengths with infinite accuracy. Some engineering problems prevent us from increasing the measurement accuracy, and quantum physics shows us that after a certain limit, quantum laws will already interfere with us. Intuitively, we understand that the lower the measurement accuracy, the less information we receive, and the greater the measurement accuracy, the more information we receive. Shannon's formula is not suitable for measuring the amount of analog information, but there are other methods for this, which are discussed in Information Theory. In computer technology, a bit corresponds to the physical state of the information carrier: magnetized - not magnetized, there is a hole - no hole, charged - not charged, reflects light - does not reflect light, high electrical potential - low electrical potential. In this case, one state is usually denoted by the number 0, and the other - by the number 1. Any information can be encoded by a sequence of bits: text, image, sound, etc.

Along with the bit, a unit called the byte is often used; it usually equals 8 bits. While a bit allows one to choose one equally likely option out of two, a byte chooses 1 out of 256 (2^8). To measure the amount of information, it is also customary to use larger units:

1 KB (one kilobyte) = 2^10 bytes = 1024 bytes

1 MB (one megabyte) = 2^10 KB = 1024 KB

1 GB (one gigabyte) = 2^10 MB = 1024 MB

In reality, the SI prefixes kilo-, mega-, giga- should be used for factors of 10^3, 10^6 and 10^9, respectively, but the practice of using factors with powers of two has historically developed.
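The difference between the binary and SI prefixes is easy to check (a trivial sketch):

```python
# Binary (historical) prefixes used in computing:
KB = 2 ** 10          # 1024 bytes
MB = 2 ** 10 * KB     # 1024 KB
GB = 2 ** 10 * MB     # 1024 MB
print(KB, MB, GB)     # 1024 1048576 1073741824
# The SI prefixes would instead mean 10**3, 10**6 and 10**9;
# for a "gigabyte" the discrepancy is already over 7 percent.
print(GB / 10**9)     # 1.073741824
```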

A Shannon bit and a computer bit are the same if the probabilities of a zero or a one occurring in the computer bit are equal. If the probabilities are not equal, the amount of information according to Shannon becomes smaller, as we saw in the example of the Martian dinosaur. The computer amount of information thus gives an upper bound on the amount of information. Volatile memory, after power is applied to it, is usually initialized to some value, for example all ones or all zeros. It is clear that right after power-up the memory contains no information, since the values in the memory cells are strictly determined and there is no uncertainty. Memory can store a certain amount of information, but immediately after power is applied it holds none.
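The gap between a computer bit and a Shannon bit appears as soon as the two states are not equally likely; here is a sketch reusing Shannon's formula for a single binary cell (the function name is ours):

```python
import math

def information(p):
    """Shannon information of a binary cell in which 1 occurs with probability p."""
    return -sum(x * math.log2(x) for x in (p, 1 - p) if x > 0)

print(information(0.5))             # 1.0: a fair bit carries one full Shannon bit
print(round(information(0.9), 3))   # 0.469: a biased cell carries less than one bit
print(abs(information(1.0)))        # 0.0: a cell that is always 1 carries no information
```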

Disinformation is deliberately false information provided to an enemy or business partner for more effective conduct of hostilities or cooperation, for checking for information leakage and the direction of that leakage, or for identifying potential black-market customers. Disinformation (as a process) is also the manipulation of information itself: misleading someone by providing incomplete information, or complete but no longer needed information, distorting the context, or distorting part of the information.

The purpose of such an impact is always the same - the opponent must act as the manipulator needs. The act of the object against which disinformation is directed may consist in making the decision necessary for the manipulator or in refusing to make a decision that is unfavorable for the manipulator. But in any case, the ultimate goal is the action that will be taken by the opponent.

Disinformation is thus a product of human activity, an attempt to create a false impression and, accordingly, to push for the desired actions and/or inaction.

Types of disinformation:

Misleading a specific person or group of persons (including an entire nation);

Manipulation (by the actions of one person or a group of persons);

Creating public opinion about some problem or object.

Misleading is nothing more than outright deception, the provision of false information. Manipulation is a method of influence aimed directly at changing the direction of people's activity. There are the following levels of manipulation:

Strengthening the values (ideas, attitudes) that already exist in people's minds and are beneficial to the manipulator;

Partial change of views on a particular event or circumstance;

A radical change in life attitudes.

The creation of public opinion is the formation in society of a certain attitude towards the chosen problem.



Life in the post-industrial era leaves its imprint on human consciousness. The concept of "information" has become as key a concept in our time as "water" and "air". To understand the importance of this phenomenon, one needs to understand the interpretation of the term.

What is information?

The versatility of the term has given rise to many interpretations, which vary depending on the field in which it is used.

According to the Federal Law of the Russian Federation "On information, information technologies and information protection" (2006), this concept is interpreted as "information (messages) regardless of the form of their presentation."

Thus, information is data presented in various forms. This term is considered key in the work of a journalist.

What is up-to-date information?

One more distinctive feature of this concept is its properties. The attributes of information include its quality, quantity, novelty, value, reliability, complexity and ability to be compressed. Each of these indicators can be measured. Another important property of the concept of "information" is its relevance.

Not all data meet this criterion. The origins of the word "relevance" can be traced to Latin, where it meant "current", "important at the present moment", "topical". The peculiarity of this quality is that it can be lost when more recent data become available; this happens either immediately and completely, or gradually and in parts.

Up-to-date information is data that is in a state that corresponds to reality. Once outdated, they lose their value.

Search for information

Modernity is a boundless ocean of data in which we must every day find what satisfies our requests. To structure the process of information retrieval, a separate science was even created; its founding father is considered to be the American scientist Calvin Mooers. Information retrieval, according to his definition, is the process of identifying, in an indefinite set of documents, those that can satisfy our information needs, that is, contain the necessary data.

The algorithm of actions includes operations for collecting, processing and providing the requested information. To search for information effectively, follow this plan:

  • formulate a query (information that we want to find);
  • find likely sources of needed data;
  • select the required materials;
  • get acquainted with the acquired body of knowledge and evaluate the work done.

This algorithm is able to facilitate the educational process and preparation for writing scientific articles. It was created by the author's realization that information is a boundless space around us. And extracting the necessary data is possible only if you systematize your efforts.
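Mooers' idea of retrieval can be sketched in miniature. The Python fragment below, with an invented document collection and query, treats retrieval in his sense: identifying, among an unordered set of documents, those that contain all the terms of a query.

```python
# Toy sketch of information retrieval in Mooers' sense: pick out, from an
# unordered collection, the documents that satisfy a formulated query.
# The documents and queries here are invented for illustration.

def retrieve(documents, query):
    """Return the documents containing every term of the query."""
    terms = set(query.lower().split())
    return [doc for doc in documents
            if terms <= set(doc.lower().split())]

library = [
    "information theory and coding",
    "history of the printing press",
    "binary coding of information",
]

print(retrieve(library, "coding information"))
# -> ['information theory and coding', 'binary coding of information']
```

Real retrieval systems add indexing, ranking and query expansion on top of this basic matching step, but the shape of the process is the same: formulate a query, scan candidate sources, select the matches.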

Collection and storage of information

Depending on the goals set, data and information can be subjected to various operations. Collection and storage is one of them.

Working with information is possible only after a thorough search. This process is called data collection, that is, the accumulation in order to ensure a sufficient amount for further processing. This stage of working with information is considered one of the most important, because the quality and relevance of the data that will have to be dealt with in the future depend on it.

Data collection phases:

  • primary perception;
  • development of classification of the obtained data;
  • object coding;
  • registration of results.

The next step in working with information is to ensure its safety for later use.

Data storage is a way to organize its circulation in space and time. The process depends on the medium: a disc, a picture, a photograph, a book, etc. The shelf life also differs: a school diary must be kept for the school year, while a subway ticket only for the duration of the trip.

Information is something that exists only on a certain medium. Therefore, the processes of collection and storage can be considered key in working with it.

Information transfer methods

Data circulation is an irreversible process that we face everywhere. The ability to transfer information from person to person is the key to the evolution of the entire civilization. This phenomenon is the movement of signs and information in space in order to organize access to them by other subjects.

Its carriers are media, that is, everything that can serve to store data.

The information transfer scheme consists of the following links: source, communication channel and recipient. Using technical means for this purpose requires first encoding the message into a form convenient for the transmitter and subsequently decoding it. Such means include the telegraph, telephone, TV, radio and the Internet.
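The source-channel-recipient chain can be sketched as a pipeline. The code below is a minimal illustration, not a real transmitter: encoding turns the message into a byte form convenient for transmission, the channel here is assumed ideal and noiseless, and decoding restores the original form for the recipient.

```python
# Sketch of the transfer chain: source -> encode -> channel -> decode -> recipient.
# The encoding is a simple text/bytes round-trip; real transmitters use far
# more elaborate codes (and real channels introduce noise).

def encode(message: str) -> bytes:
    return message.encode("utf-8")      # source message -> transmitter form

def channel(signal: bytes) -> bytes:
    return signal                       # an ideal, noiseless channel

def decode(signal: bytes) -> str:
    return signal.decode("utf-8")       # transmitter form -> recipient message

sent = "hello"
received = decode(channel(encode(sent)))
print(received)   # hello
```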

The mass media have always been considered the most powerful means of data transmission. They operate over vast territories and play a key role in shaping public opinion.

Data protection

Although information about the world around us is generally open for review, some of it has a special status and is closed to outsiders. This includes state, commercial and official secrets, data on new inventions before their official announcement, and personal information about events in a person's life that make it possible to identify him. Protecting such information involves:

  • identifying data that is considered sensitive;
  • restricting access to them by establishing a special procedure for handling and monitoring compliance with this procedure;
  • accounting for those who have access to confidential information;
  • marking “Commercial secret” on material carriers.

All of these are necessary security measures. To adhere to them means to prevent a huge number of crimes, frauds and save the lives of many people.

As you can see, the study of the essence of the term "information" is a process that includes work in many areas and consists in assessing the qualities and methods of processing information that reflects the facts of the surrounding world.

The concept of information

The concept of "information" (from the Latin informatio: information, clarification, presentation) carries a different meaning depending on the field in which it is considered: science, technology, everyday life, etc. Usually, information means any data or information that is of interest to someone (a message about events, about someone's activities, and so on).

The literature offers many definitions of the term "information", reflecting different approaches to its interpretation:

Definition 1

  • Information - information (messages, data) regardless of the form of its presentation (Federal Law of the Russian Federation No. 149-FZ of 27.07.2006 "On Information, Information Technologies and Information Protection");
  • Information - information about the surrounding world and the processes occurring in it, perceived by a person or a special device (Ozhegov's Dictionary of the Russian Language).

Speaking about computer data processing, information is understood as a certain sequence of symbols or signs (letters, numbers, encoded graphic images and sounds, etc.), which carries a semantic load and is presented in a form understandable to a computer.

In computer science, the most commonly used definition of this term is:

Definition 2

Information is conscious information (knowledge expressed in signals, messages, news, notifications, etc.) about the world that is the object of storage, transformation, transmission and use.

The same informational message (an article in a magazine, an announcement, a story, a letter, a reference, a photograph, a TV show, etc.) can carry a different amount and content of information for different people, depending on their accumulated knowledge, the accessibility of the message and their level of interest in it. For example, a news item written in Chinese carries no information for a person who does not know the language, but may be useful to someone who does. A news item in a familiar language carries no new information if its content is unclear or already known.

Information is considered as a characteristic not of a message, but of the relationship between the message and its recipient.

Types of information

Information can exist in various types:

  • texts, drawings, technical drawings, photographs;
  • light or sound signals;
  • radio waves;
  • electrical and nerve impulses;
  • magnetic records;
  • gestures and facial expressions;
  • smells and taste sensations;
  • chromosomes through which the traits and properties of organisms are inherited, etc.

The main types of information are distinguished according to the form of presentation and the methods of encoding and storage:

  • graphic - one of the oldest types; information about the surrounding world was first preserved as rock paintings, and later as paintings, photographs, diagrams and drawings on various materials (paper, canvas, marble, etc.) depicting pictures of the real world;
  • sound (acoustic) - a sound-recording device for storing it was invented in 1877; for musical information, a coding method using special characters was developed, which makes it possible to store it as graphic information;
  • textual - encodes human speech using special characters, letters (each nation has its own); paper is used for storage (notes in notebooks, typography, etc.);
  • numerical - encodes a quantitative measure of objects and their properties using special symbols, numbers (each coding system has its own); it became especially important with the development of trade, economy and money exchange;
  • video - a way of storing "live" pictures of the world, which appeared with the invention of cinema.

There are also types of information for which encoding and storage methods have not yet been invented: tactile information, organoleptic information, etc.

Initially, information was transmitted over long distances by coded light signals; after the invention of electricity, by signals encoded in a certain way and sent over wires; and later, by radio waves.

Remark 1

Claude Shannon is considered the founder of the general theory of information. He also laid the foundation of digital communication with his 1948 work "A Mathematical Theory of Communication", in which he first substantiated the possibility of using a binary code to transmit information.

The first computers were a means of processing numerical information. With the development of computer technology, PCs began to be used for storing, processing and transmitting different kinds of information (text, numerical, graphic, sound and video).

You can store information using a PC on magnetic disks or tapes, on laser disks (CDs and DVDs), or in special non-volatile memory devices (flash memory, etc.). These methods are constantly being improved, and new information carriers are being invented. All actions with information are performed by the PC's central processor.

Objects, processes, phenomena of the material or non-material world, if they are considered from the point of view of their information properties, are called information objects.

A huge number of different information processes can be performed on information, including:

  • creation;
  • reception;
  • combination;
  • storage;
  • transmission;
  • copying;
  • processing;
  • search;
  • perception;
  • formalization;
  • division into parts;
  • measurement;
  • usage;
  • distribution;
  • simplification;
  • destruction;
  • memorization;
  • transformation.

Information properties

Information, like any object, has properties, the most important of which, from the point of view of informatics, are:

  • Objectivity. Objective information - existing independently of human consciousness, methods of fixing it, someone's opinion or attitude.
  • Reliability. Information reflecting the true state of affairs is reliable. Inaccurate information most often leads to misunderstandings or poor decision making. The obsolescence of information can turn reliable information into unreliable information, because it will no longer be a reflection of the true state of affairs.
  • Completeness. Information is complete if it is sufficient for understanding and decision making. Incomplete or redundant information may lead to a delay in decision making or an error.
  • Information Accuracy - the degree of its proximity to the real state of the object, process, phenomenon, etc.
  • The value of information depends on its importance for decision-making, problem solving and further applicability in any kind of human activity.
  • Relevance. Only the timely receipt of information can lead to the expected result.
  • Clarity. If valuable and timely information is unclear, then it is likely to become useless. Information will be understandable when it is, at a minimum, expressed in a language understandable to the recipient.
  • Availability. The information must correspond to the level of perception of the recipient. For example, the same questions are presented differently in school and university textbooks.
  • Brevity. Information is perceived much better when it is presented not in excessive detail and verbosity, but with an acceptable degree of conciseness, without unnecessary details. Brevity of information is indispensable in reference books, encyclopedias and instructions. Logic, compactness and a convenient form of presentation facilitate the understanding and assimilation of information.

Questions:

    The concept of information.

    Information concepts.

    Forms of transmission, presentation and types of information.

    Information properties.

    Measurement of information. Mathematical concept of information

    The concept of the number system.

    Binary coding.

The concept of information

Currently, the concept of information is one of the central in science and practice. And this is not surprising. The role of information in human life has been intuitively recognized since ancient times. "In the beginning was the word" - a thought that permeates the consciousness of man at all times.

With the development of the information approach, which reveals new properties and new aspects of material objects, social phenomena and processes, the very concept of information has grown from an everyday category into a general scientific one, which, despite its prevalence, still provokes a great deal of controversy and many differing points of view. "Of all the sciences, information theory and informatics, although enjoying enormous and well-deserved popularity," writes R.I., "still leave the concept of information strictly undefined. Literally, there are as many definitions of this phenomenon as there are authors writing about information."

Regarding the etymology of the word "information", it comes from the Latin "informatio", meaning the giving of form or properties. In medieval usage, this was the name of the divine "programming": the investment of soul and life into the human body. According to legend, in the 16th century this prerogative of God was appropriated by Rabbi Loew, who created in the Prague ghetto a clay "robot", the Golem, which "came to life" whenever its master put a "program" under its tongue: a text with the name of God (shem). Around the same period, the word "information" began to refer to the transfer of knowledge through books. Thus its meaning gradually shifted from the concepts of "inspiration" and "revival" to those of "message" and "plot", while remaining intuitive and in need of neither precise definition nor philosophical analysis.

In Russia, the term "information" appeared in the Petrine era, but was not widely used. Only at the beginning of the twentieth century, it began to be used in documents, books, newspapers and magazines and used in the sense of reporting, informing, information about something.

A truly scientific understanding of the concept of information became possible, in fact, only due to the rapid development in the 20s of the last century of means and communication systems, the emergence of computer science and cybernetics, which required the development of an appropriate theoretical base.

The history of the doctrine of information began with its quantitative aspect, tied to applied problems of communication, and found its expression in the mathematical (statistical) theory of information proposed in 1948 by the American scientist Claude Shannon. His concept rested on the idea of information as a kind of substance existing in the real world independently of a person. Shannon noted that "the basic idea of communication theory is that information can be viewed as something very similar to a physical quantity such as mass or energy."
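It is Shannon's statistical theory that makes "amount of information" measurable: the entropy H = -Σ p·log2(p) gives the average number of bits per symbol emitted by a source with symbol probabilities p. A minimal illustration:

```python
# Shannon entropy: the average information content, in bits per symbol,
# of a source with the given symbol probabilities.
import math

def entropy(probs):
    """H = -sum(p * log2(p)); zero-probability symbols contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less,
# because its outcomes are more predictable.
print(entropy([0.5, 0.5]))              # 1.0
print(round(entropy([0.9, 0.1]), 3))    # 0.469
```

The drop from 1.0 to roughly 0.469 bits for the biased coin is precisely the sense in which predictable sources carry less information, the insight that underlies data compression.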

Despite the enormous influence that the mathematical theory of information had on the development of science, subsequent attempts to mechanically extend it to other areas of scientific knowledge led to an understanding of the limitations of its provisions and the need to find other approaches to determining the essence of information.

And such an essentially different, additional approach was the cybernetic approach, covering the structures and connections of systems. With the advent of cybernetics as a science about the general laws of information transformation in complex control systems, methods of perception, storage, processing and use of information, the term "information" has become a scientific concept, a kind of tool for studying control processes.

Back in 1941, Wiener published his first work on the analogies between the operation of a mathematical machine and the nervous system of a living organism, and in 1948 the fundamental study "Cybernetics, or Control and Communication in the Animal and the Machine", offering his "information vision" of cybernetics as the science of control and communication in living organisms, society and machines.

Unlike Shannon, Wiener did not believe that information, matter and energy are categories of the same order. He wrote: "The mechanical brain does not secrete thought, as the liver secretes bile, which the former materialists claimed, and does not secrete it in the form of energy, like muscles. Information is information, not matter and not energy."

Speaking about the formation and development of cybernetics, one cannot fail to note the fact that the approach to the study of the phenomenon of information developed within the framework of this science had a truly invaluable impact on the intensification of scientific research in this area. In this regard, the 60-70s of the twentieth century turned out to be very fruitful for scientific research in the field of information problems. It was during these years, at the dawn of informatization, that the phenomenon of information attracted the attention of philosophers, physicists, linguists, physiologists, geneticists, sociologists, historians, etc. The concept of "information" has gained popularity and aroused increased interest in the scientific community.

Thus, the formation and further development of various doctrines about information, views and approaches to determining its essence has led to the fact that information has grown from an intuitive category of everyday communication into a general scientific category, which also required its own philosophical understanding.

Currently, experts count more than 200 existing approaches to defining information, among which there is not a single more or less generally recognized one, and some simply do not stand up to criticism, drawing quite sharp assessments from the scientific community.

Here are just a few of the basic definitions of information:

    Information is knowledge transmitted by someone else or acquired through one's own research or study,

    Information is information contained in this message and considered as an object of transmission, storage and processing,

    Information is the objective content of the connection between interacting material objects, manifested in the change in the states of these objects,

    Information is current data on variables in some areas of activity, systematized information regarding the main causal relationships that are contained in knowledge as a concept of a more general class, in relation to which information is subordinate,

    Information is any communication or transmission of information about something that was not previously known,

    Information is a memorized choice of one option from several possible and equal ones.
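The last definition in the list has a direct quantitative reading: memorizing a choice of one option out of N equally likely ones conveys log2(N) bits. A small illustration:

```python
# Information as choice: selecting one of N equally likely options
# conveys log2(N) bits (the uniform special case of Shannon entropy).
import math

for n in (2, 8, 64):
    print(n, math.log2(n))   # 2 -> 1.0, 8 -> 3.0, 64 -> 6.0 bits
```

So one toss of a fair coin (2 options) carries 1 bit, and naming one square of a chessboard (64 options) carries 6 bits.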

The most general definition is found in philosophy, where information is understood as a reflection of the objective world, expressed in the form of signals and signs.

    Information is a reflection in the minds of people of objective causal relationships in the real world around us,

    Information is the content of reflection processes.

The concept of information implies the existence of two objects: a source and a consumer. To understand the essence of information, other philosophical categories should be taken into account, such as movement, space and time, as well as the problem of the primacy of matter and the secondary nature of cognition. Another important condition for understanding the essence of information and correctly solving information-cognitive problems, which include most legal ones, is the principle of adequate reflection of the reflected object by the reflecting one.

So, information is understood as information about the surrounding world and the processes taking place in it, perceived by a person or special devices to ensure purposeful activity. In addition, information about the object of cognition can be not only perceived by the cognizing subject or a technical device (with appropriate processing), but also, as it were, separated from its primary source - the reflection of the object of cognition.

It follows from this that it can be transferred in space, stored in time, transferred to another cognizing subject or technical device (for example, a computer), subjected to other operations, the totality of which is called information processes. Their composition and sequence are determined in each particular case. In general, information processes are the creation, synthesis, transmission, reception, accumulation, storage, transformation, systematization, analysis, selection, dissemination of information, its presentation in a user-friendly form.

Related to the concept of information are concepts such as signal, message, and data.

Signals reflect the physical characteristics of various processes and objects, and through signs a person perceives the objective world. Thus, a signal is any process that carries information.

It should be noted that the literature sometimes offers definitions based on equating the concept of information with the concept of a message. A "vicious circle" then arises: information is a message, and a message is information. Such a definition can be justified in some cases, despite being tautological; information theory, however, should go further and define information more meaningfully. There are also significant differences between the concepts of information and data. Data are quantities, their relations, phrases and facts, whose transformation and processing make it possible to extract information, i.e. knowledge about a particular process or phenomenon. More precisely, data are facts and ideas presented in a formalized form that allows them to be transmitted or processed by some process or appropriate technical means.

A message is information expressed in a certain form and intended for transmission. Examples of messages are the text of a telegram, a speaker's speech, the readings of a measuring device, control commands, etc. Thus, a message is a form of information presentation.

Data is information presented in a formalized form and intended for processing by technical means such as a computer. That is, data is the raw material for obtaining information.

Now that we have dealt with the general concepts, let's see how the legislator approaches the concept of "information".

In general, the founder of a separate information-law direction in jurisprudence is considered to be A.B. Vengerov. He did not give a clear definition of information, but listed certain features (properties) of information that are important for law. These include, in particular:

a) a certain independence of information from its carrier; b) the possibility of multiple use of the same information; c) its inexhaustibility when consumed; d) preservation of the transferred information by the transferring subject; e) the ability to be saved, aggregated, integrated and accumulated; f) quantitative certainty; g) consistency.

Works on information protection use the following concept: information is the result of the reflection and processing in the human mind of the diversity of the surrounding world; it is information about the objects surrounding a person, natural phenomena, the activities of other people, etc.

It should be noted that recent years have seen the active formation of legislation in the information sphere and the emergence of information law as an independent branch. The legal foundation regulating relations in the field of information is a number of articles of the Constitution of the Russian Federation (in particular, Articles 23, 24, 29 and 44). In addition, for 11 years (until 2006) the Federal Law of February 20, 1995 No. 24-FZ "On Information, Informatization and Information Protection" was in force (as amended by the Federal Law of January 10, 2003 No. 615-FZ; hereinafter referred to as the Information Law of 1995).

This law regulated relations arising from the formation and use of information resources based on the creation, collection, processing, accumulation, storage, search, distribution and provision of documented information to the consumer; creation and use of information technologies and means of their support; protection of information, the rights of subjects participating in information processes and informatization.

On July 27, 2006, Federal Law No. 149-FZ "On Information, Information Technologies and Information Protection" (hereinafter referred to as the Federal Law) was adopted, which regulates relations in the exercise of the right to search, receive, transmit, produce and disseminate information, with the use of information technologies, as well as in ensuring the protection of information, with the exception of relations in the field of protection of the results of intellectual activity and equivalent means of individualization.

The development of a new basic legislative act is due to the need to unify the principles and rules of interaction in this area, both from a conceptual and substantive point of view, to eliminate a number of gaps in it and to bring the legislation of the Russian Federation closer to the international practice of regulating information relations.

Article 2 of this Law introduces a number of basic concepts.

The central concept of legislation on information, information technology and information protection is the concept of "information". In the previous Law on Information of 1995, information was understood as information about persons, objects, facts, events, phenomena and processes, regardless of the form of their provision. In the new Federal Law, the definition of information is presented in a more general form. Information is any information (messages, data) regardless of the form of their provision.

Looking ahead a little, it should be noted that Art. 5 of the Law defines the status of information as an object of legal relations. "Information can be an object of public, civil and other legal relations. Information can be freely used by any person and transferred from one person to another, unless federal laws establish restrictions on access to information or other requirements for the procedure for its provision or distribution."

Let's continue the conversation about the concepts introduced in the Law. Article 2 introduces a definition new to Russian legislation, that of "information technology", which combines the processes and methods of searching, collecting, storing, processing, providing and disseminating information, and the methods of implementing them. Information technology is essential for development because, given the importance of information, the level of development of such technologies determines the potential for further progress in all areas of society.

Technology is a complex of scientific and engineering knowledge implemented in labor methods, sets of material, technical, energy, labor factors of production, ways of combining them to create a product or service that meets certain requirements. Therefore, technology is inextricably linked with the mechanization of the production or non-production, primarily managerial, process. Management technologies are based on the use of computers and telecommunications technology.

According to the definition adopted by UNESCO, information technology is a complex of interrelated scientific, technological and engineering disciplines that study methods for effectively organizing the work of people involved in processing and storing information; computer technology and the methods of organizing it and interacting with people and production equipment; their practical applications; and the social, economic and cultural problems associated with all of this. Information technologies themselves require complex training, high initial costs and high-tech infrastructure. Their introduction should begin with the creation of software and the formation of information flows in specialist training systems.

Thus, information technology is a complex of objects, actions and rules associated with the preparation, processing and delivery of information in personal, mass and industrial communication, as well as all technologies and industries that integrally provide these processes.

The main types of information technologies include:

Highly intelligent information technologies - the generation of technical solutions implementing situational modeling, which make it possible to identify the connections between elements, their dynamics, and the objective laws of the environment;

Auxiliary information technologies - focused on ensuring the performance of certain functions (accounting and statistics, maintaining a personnel system, document management, conducting financial transactions, systems for strategic management, etc.);

Communication information technologies - designed to ensure the development of telecommunications and its systems.

This article also contains an updated definition of the concept of "information system". In the previous Law on Information of 1995, an information system was understood as an organizationally ordered set of documents (files of documents) and information technologies, including the use of computer technology and communications that implement information processes.

An information system is a technological system combining hardware, software and other means that structurally and functionally unite several types of information processes and provide information services.

Signs of an information system:

Performing one or more functions in relation to information;

The unity of the system (the presence of a common file base, common standards and protocols, unified management);

The ability to compose and decompose system objects when performing specified functions (excerpts from laws, bookmarks - all in 1 file).

Basic requirements for the information system:

Efficiency;

Quality of operation (i.e. its accuracy, security, consistency with standards);

Reliability (the thresholds at which the system fails in terms of information quality, access time or performance);

Safety.

The rapid development of information and telecommunications technologies has necessitated the need to enshrine in the legislation such a term as "information and telecommunications network". It is a technological system designed to transmit information over communication lines, access to which is carried out using computer technology.

In accordance with Art. 2 of the Federal Law of July 7, 2003 N 126-FZ "On Communications" (as amended on July 27, 2006), the term "communication lines" refers to transmission lines, physical circuits and line-cable communication structures.

In general, an information and telecommunication network is a means of transmitting information about the surrounding world, its objects, processes and phenomena, objectified in a form that allows their machine processing (decoding). For example, the Internet is a type of information and telecommunication networks.

From a technical point of view, the Internet is the largest telecommunication network, formed by combining more than ten and a half thousand telecommunication networks of various types. Such unification became possible through the use of the TCP/IP internetwork protocol, which acts as a kind of standards translator when transferring data between different types of telecommunication networks.

The Internet as a global information space does not recognize state borders and is not only the most effective means of access to information resources accumulated by mankind, but also becomes a means of disseminating mass information. The functioning of the network is a powerful factor in the development and use of advanced technologies.

On the other hand, use of the Internet carries the risk of uncontrolled dissemination of unauthorized information, penetration of control systems, and violations of human rights, which undoubtedly demands special attention to information security.

The rapid development of the Internet in the civilized world is outpacing the creation and improvement of the regulatory legal acts needed to address emerging problems. As the Internet has grown in recent years, the legal problems of the Web have become more pressing against the backdrop of a noticeable worldwide shift in approaches to their settlement: from an emphasis on self-regulation toward strict legal regulation.

The main problems that require legislative regulation in Russia in connection with the development of the Internet are practically the same as those in other developed countries of the world:

1) ensuring free connection to the Internet and free exchange of information on the network;

2) protection of personal data, in particular data collected in the course of the activities of network operators (including addresses, telephone numbers and other personal data of subscribers or buyers in "electronic commerce" systems);

3) connecting state bodies to the Internet and providing citizens with information about the activities of these bodies;

4) preventing the dissemination of offensive and obscene information and of calls inciting ethnic, racial or religious hatred, etc.;

5) electronic document management, electronic signatures, and confirmation of the authenticity of information in information products and in means of viewing and transmitting information;

6) e-commerce;

7) information security: computer viruses, unauthorized access to information, hacking of servers and networks, destruction and substitution of information;

8) the use of cryptographic protection;

9) jurisdiction: the law of which state should apply to actions taken on the Web.
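The list above mentions electronic signatures and confirming the authenticity of information. A hedged sketch (not part of the source text) can show the verification principle: real qualified electronic signatures rely on asymmetric cryptography and certificates, but here a shared-secret HMAC stands in to demonstrate how tampering with a document is detected. The key and document below are hypothetical.

```python
# Illustrative authenticity check using an HMAC (keyed hash).
# This is NOT a legally qualified electronic signature; it only
# demonstrates the verify-before-trust principle.
import hmac
import hashlib

SECRET = b"demo-shared-key"  # hypothetical key, for illustration only

def sign(document: bytes) -> str:
    """Produce a hex tag binding the document to the key."""
    return hmac.new(SECRET, document, hashlib.sha256).hexdigest()

def verify(document: bytes, signature: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(document), signature)

doc = b"electronic document text"
tag = sign(doc)
print(verify(doc, tag))          # prints "True": document unchanged
print(verify(doc + b"!", tag))   # prints "False": tampering detected
```

A production system would replace the shared secret with a public/private key pair, so that the verifier does not need the signing key.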

An analysis of current Russian legislation shows that the issues of legal regulation related to the functioning and development of the Internet in Russia form an extensive regulatory framework that includes more than 50 federal laws at the federal level alone, not to mention numerous regulatory legal acts of the President and the Government of the Russian Federation. The range of these legislative acts is extremely wide, and interpreting them in light of the specifics of legal relations arising from the use of modern information technologies is difficult, especially since the drafters of these laws did not anticipate such applications. It is clear that for the courts this area of legal relations is also completely new.

The Law also introduces a number of other concepts. The Federal Law brings the conceptual apparatus and regulatory mechanisms in line with the practice of using information technologies, defines the legal status of various categories of information, establishes provisions in the field of creating and operating information systems and general requirements for the use of information and telecommunication networks, and sets out principles for regulating public relations connected with the use of information.

The principle of freedom to search, receive, transfer, produce and distribute information in any legal way is consolidated. At the same time, restriction of access to information can be established only by federal laws.

The law contains provisions aimed at protecting against unfair use or abuse of means of disseminating information that impose unsolicited information on users. In particular, information must include reliable details about its owner or other distributor, in a form and volume sufficient to identify that person. When using means of dissemination that allow the recipients to be determined, including postal items and electronic messages, the distributor is obliged to give the recipient the opportunity to refuse the information.

The basic rules and methods for protecting rights to information, and the information itself, consist in taking legal, organizational and technical (software and hardware) measures. The rights of the owner of information contained in the databases of an information system are protected regardless of copyright and other rights to those databases.

Depending on the category of access, information is divided into publicly available information and information whose accessibility is restricted by federal laws (restricted-access information). A list is established of information to which access cannot be restricted (for example, information on the activities of government bodies and the use of budgetary funds), as well as of information provided free of charge.

The legal literature mainly operates with the concepts of openness, publicity and accessibility, alongside which the term "transparency" is used. In sectoral legislation these concepts serve as fundamental principles. The term "transparency", borrowed from foreign practice, is closely related to publicity, openness and access to information, but in essence it is closest to openness and accessibility.

There is a ban on requiring a citizen (individual) to provide information about his private life, including information constituting a personal or family secret, and to receive such information against the will of a citizen (individual). An exception may be made only in cases provided for by federal laws.

Oddly enough, the text of this Law omits certain concepts used in the previous Law on Information of 1995, namely "informatization", "information processes", "information resources", "information about citizens (personal data)", and "means of support of automated information systems".

The absence of the terms "informatization" and "information resources" in the new Federal Law seems to be an omission, because these concepts are already widely used not only in legislative and other regulatory legal acts (for example, in the Customs Code of the Russian Federation, etc.), but also firmly entrenched in the field of enforcement of information legislation.

The developers of this Federal Law, in excluding the concept of "informatization" from the text, proceeded from the position that while the term may exist in principle, it has no place in legal texts because it can carry different meanings. Precisely because of this ambiguity, its use in the text of the former Federal Law, especially in its title, was in their opinion unfortunate. In addition, the developers of the new Federal Law point to the absence of this term in foreign legislation.

However, it is difficult to agree with this position, because during the operation of the Law on Information of 1995 a number of legislative acts were adopted aimed at the formation and use of information resources, and the concept informs the norms of several codified acts (the Criminal Code of the Russian Federation; the Code of the Russian Federation on Administrative Offenses of December 30, 2001 N 195-FZ, as amended July 26, 2006). The terminology of the earlier 1995 Law on Information was used in many regulatory legal acts. Moreover, foreign legislation operates with comparable terms, for example "computerization". Thus, it appears that the term "informatization" did not need to be excluded from Russian information legislation.

The exclusion from the Law of the definition of the term "information resources" was also, in my opinion, unjustified. Moreover, part 9 of Art. 14 of the same Law states that the information contained in state information systems, as well as other information and documents at the disposal of state bodies, constitute state information resources.

I note that the lack of clarity of certain terms, as well as sometimes unreasonable changes in the definitions of the concepts of information legislation used, do not contribute to the improvement of legal regulation in the information sphere.

The absence of a definition of the concept of "personal data" is explained by the fact that, almost simultaneously with the new Federal Law, Federal Law No. 152-FZ of July 27, 2006 "On Personal Data" was adopted, which ensures the protection of the rights and freedoms of a person and citizen in the processing of his or her personal data, including the protection of the rights to privacy and to personal and family secrets.

Concluding the consideration of this issue, it should be noted that a significant number of terms related to the concept of information are contained in such specialized laws as “On Communications”, “On State Secrets”, “On Foreign Intelligence”, etc.