
International Encyclopedia of Systems and Cybernetics

2nd Edition, as published by Charles François in 2004. Presented by the Bertalanffy Center for the Study of Systems Science, Vienna, for public access.

About

The International Encyclopedia of Systems and Cybernetics was first edited and published by the systems scientist Charles François in 1997. The online version provided here is based on the 2nd edition of 2004. It was uploaded and gifted to the center by ASC president Michael Lissack in 2019; the BCSSS had purchased the rights for the re-publication of this volume in 200?. In 2018, the original editor expressed his wish to pass on the stewardship of the maintenance and further development of the encyclopedia to the Bertalanffy Center. In the future, the BCSSS seeks to further develop the encyclopedia through open collaboration within the systems sciences. Until the center has found and implemented an adequate technical solution for this, the static website is made accessible for the benefit of public scholarship and education.


INFORMATION (Amount of) 5)

"A logarithmic measure of the statistical unexpectedness (reciprocal of probability) of the message concerned" (D. Mac KAY 1969, p.79).

Mac KAY states: "Since the unexpectedness of a message need not have direct connection with its semantic content or meaning, SHANNON wisely insisted that the concept of "meaning" was outside the scope of his theory" (Ibid).

Insofar as SHANNON's work relates to information, it should be clearly understood that it is about quantitative information and its measurement.

"As an example let us suppose that we must select one object out of a collection, on the base of a specified criterion: for example the letter 0 within the alphabet. The most economical method is to divide the alphabet in two and ask: is 0 in the first half or in the second one? The answer provides us with one bit of information, corresponding to the fact that 0 is in the second half. Following in the same way, we now divide this second half in two and repeat our question. The answer:"0 is in the first half of the second half" provides us with a second bit of information. Finally, we will find out that the number of bits we need to "place" 0 in the alphabet, is the logarithm base 2 of 26, i.e. 5 when rounded off to the next whole number of bits, taking in account that bits are necessarily whole numbers" (Ibid.).

This is in accordance with SHANNON and WEAVER, who consider the amount of information a basically technical feature which "… characterizes the whole statistical nature of the information, and is not concerned with individual messages" (1949, p.104).

In this quantitative sense, information is thus defined as the logarithm to the base 2 of the number of possible combinations C, i.e. I = log₂ C. As logarithms are exponents, quantitative information is an additive measure of complexity.
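To make the additivity concrete (a numerical sketch based on the definition above, not from the source): two independent selections with C1 and C2 alternatives combine into C1 × C2 alternatives, and the logarithm turns that product into a sum.

    from math import log2

    C1, C2 = 26, 10                  # e.g. one letter and one digit, chosen independently
    joint = log2(C1 * C2)            # information in the combined selection
    summed = log2(C1) + log2(C2)     # information of the two selections, added
    print(joint, summed)             # both ~ 8.02 bits: I = log2(C) is additive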

Apart from the quantitative aspect of information, expressed for example in bits, the qualitative aspect is quite a different matter: it relates to meanings, which "make sense" only in exchanges between sentient and thinking beings, and not, for example, for the telephone sets themselves.

Categories

  • 1) General information
  • 2) Methodology or model
  • 3) Epistemology, ontology and semantics
  • 4) Human sciences
  • 5) Discipline oriented

Publisher

Bertalanffy Center for the Study of Systems Science (2020).

To cite this page, please use the following information:

Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]


We thank our partners for making the open access of this volume possible.