
International Encyclopedia of Systems and Cybernetics

2nd edition, as published by Charles François in 2004. Presented by the Bertalanffy Center for the Study of Systems Science, Vienna, for public access.

About

The International Encyclopedia of Systems and Cybernetics was first edited and published by the systems scientist Charles François in 1997. The online version provided here is based on the 2nd edition of 2004. It was uploaded and gifted to the center by ASC president Michael Lissack in 2019; the BCSSS purchased the rights for the re-publication of this volume in 200?. In 2018, the original editor expressed his wish to pass on stewardship over the maintenance and further development of the encyclopedia to the Bertalanffy Center. In the future, the BCSSS seeks to develop the encyclopedia further through open collaboration within the systems sciences. Until the center has found and implemented an adequate technical solution for this, the static website is made accessible for the benefit of public scholarship and education.


INFORMATION MEASUREMENT 2)

J. de ROSNAY explains with great clarity what is meant by information measurement, why and how to measure it, and how the concept of information quantity is totally independent of its meaning (and is purely technical):

"In order to define conveniently what a certain quantity of information represents, one must place oneself in the situation of an observer seeking information about some ill-known system. It may be for example the number of possible answers to a question; or the number of solutions of a problem.

"Obtaining informations on the unknown system may lead the observer to reduce the number of possible answers. A total information could even lead immediately to only one possible answer: the good one. Information is thus a function of the relation between the number of possible answers before the reception of the message (P0) and the number of remaining possible answers after reception (P1)

"A very simple example: the unknown system is a pack of 32 cards. Question: what is the chance to select some predefined card?

"This question introduces an uncertainty, which can be meeted by a relation; i.e. the number of favorable cases respect to the number of possible ones (which means: the probability to sort out the good card) As there is only one favorable case (the chosen card), this probability is of one chance in 32.

"Now, how to meet the quantity of information obtained by sorting one card? Before sorting it, there are 32 possible cases with the same probability (P0). After sorting it, two different situations may appear:

"Either the good card was sorted. In this case, there is only one possible answer (the one in hand). The quantity of information thus obtained corresponds to the relation 32/1, and the information is total.

"Or a useless card was sorted and there are still 31 possible answers. The quantity of information now obtained corresponds to the relation 32/31. The information is partial.

"The information obtained in the first situation solves definitively the problem by reducing to 1 the number of possible cases. In the second, the number of possible cases is only slowly reduced. It thus reduces the denominator of the relation P0/P1: the relation increases and the information too. This is to say that information increases when uncertainty decreases. This is because uncertainty expresses the lack of information about the unknown system.

"Finally, to meet information and to define unities, two conventions are made: information is preferently defined in a substractive way, better than by a relation, as information is the difference between two uncertainties (before and after the message). We substitute thus to the relation P0/P1 a difference of their logarithms. The second convention: as the most practical and used code to translate a message is made of two signs: 0 and 1, the binary language and the base 2 logarithms will be adopted. When using these convention, the quantity of information in a message is meted in bits (i.e. binary digits)…

"Information (quantitative) appears thus as an abstract entity, objective and devoid of any human significance. It is easier to understand what is a quantity of information by assimilating it to material units circulating in some channel: for example as water molecules in a pipe. The flow in the pipe is limited by its section. It is just the same for any transmitting line… as for example a telephone wire. This quantity of information is totally independent of the significance of the message: a song, the results of the horse races or the market's quotations" (1975, p.170-71).

From a different viewpoint, I. PRIGOGINE and I. STENGERS state: "We may use the algorithmic theory of information proposed by CHAITIN and KOLMOGOROV: the measure of information should be the length of the program that would have to be given to a computer… in order to make it able to achieve the structure that we wish (to obtain)" (1992, p. 83). Even in this case, however, information is still related to an observer.
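The length of the shortest such program is not computable in general; in practice, the length of a compressed encoding gives a crude, computable upper bound. The Python sketch below uses zlib as such a stand-in (a common approximation, not the Chaitin-Kolmogorov measure itself):

    import os
    import zlib

    def compressed_length(data: bytes) -> int:
        """Length of a zlib-compressed encoding: a rough upper bound
        on the program-length (algorithmic) information content."""
        return len(zlib.compress(data, 9))

    structured = b"AB" * 500        # 1000 bytes of repeating pattern
    patternless = os.urandom(1000)  # 1000 bytes with no exploitable structure

    print(compressed_length(structured))   # small: a short description suffices
    print(compressed_length(patternless))  # close to 1000: hardly compressible

A highly regular structure can be "achieved" by a very short program, whereas a patternless one requires a program about as long as the structure itself.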

Categories

  • 1) General information
  • 2) Methodology or model
  • 3) Epistemology, ontology and semantics
  • 4) Human sciences
  • 5) Discipline oriented

Publisher

Bertalanffy Center for the Study of Systems Science (2020).

To cite this page, please use the following information:

Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]


We thank our partners for making the open access of this volume possible.