INFORMATION (Entropy and) 1)
As noted by S. UMPLEBY: "Information and thermodynamic entropy are different concepts. Only the equations are similar".
However, information and entropy stand somehow in a polar relationship. A. RAPOPORT, for example, writes: "WIENER's insight into the meaning of the mathematical connection between entropy and information provides further clarification of the fundamental principles of the living process. Increase of entropy can be viewed as 'destruction of information'. Conversely, information can be used to reduce entropy" (1966, p.6).
J. de ROSNAY explains the antinomic relation between information and entropy in these words: "Any information which results from an observation, a measurement or an experiment, and which tells us what we already know, produces no change in the number of possible answers. It does not reduce our uncertainty. The information provided by a message or an event is the greater, the smaller its probability of occurrence. The information obtained by drawing the correct answer on the first try (for example I = 32/1, in the case of a 32-card pack) is the inverse of the probability of obtaining it before receiving the message (p = 1/32).
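De ROSNAY here identifies information with the inverse probability itself; in SHANNON's convention it is the base-2 logarithm of that inverse which is counted, in bits. A worked restatement under that convention (the logarithm is supplied here for clarity and is not in the quoted text):

```latex
I = \log_2 \frac{1}{p} = \log_2 32 = 5 \ \text{bits}
```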
"Now, probability and entropy are connected by statistical theory.
"One notices by comparing the different mathematical relations, that information is the opposite of the physicist entropy. It is equivalent to an anti-entropy. This is why the term negentropy (which means negative entropy) has been coined, in order to emphasize this important property. Information and negentropy are thus equivalent to potential energy" (which may be used to produce or transmit information).
"However this comparison may be furthered. By conveniently selecting the constant and the unities, it becomes possible to express information by means of thermodynamical units, thus connecting it directly to entropy. In this way it is possible to calculate the smallest "fall" of energy associated to some measure capable to generate a bit of information: in order to obtain an information equal to a bit, a very small, but finite, and thus significant, quantity of the universe's energy must be degraded.
"This very important discovery led some physicists, like L. BRILLOUIN, to generalize in this way CARNOT's principle, in order to express the undissociable relation between information acquired by the brain and the entropy variation of the universe: Any acquisition of knowledge, obtained from an observation or a physical measurement, uses up energy from the laboratory, and thus from the universe" (1975, p.171-2).
This is also true for information obtained by a receiver through a channel, which can transmit nothing without using energy.