BCSSS

International Encyclopedia of Systems and Cybernetics

2nd Edition, as published by Charles François in 2004. Presented by the Bertalanffy Center for the Study of Systems Science, Vienna, for public access.

About

The International Encyclopedia of Systems and Cybernetics was first edited and published by the systems scientist Charles François in 1997. The online version provided here is based on the 2nd edition of 2004. It was uploaded and gifted to the center by ASC president Michael Lissack in 2019; the BCSSS purchased the rights for the re-publication of this volume in 200?. In 2018, the original editor expressed his wish to pass the stewardship of the maintenance and further development of the encyclopedia to the Bertalanffy Center. In the future, the BCSSS seeks to develop the encyclopedia further through open collaboration within the systems sciences. Until the center has found and implemented an adequate technical solution for this, the static website is made accessible for the benefit of public scholarship and education.


INFORMATION THEORY 1)3)

Under the general heading of "information theory", widely different subjects have been considered:

1- A theory of transmission of information through a physical channel, which is in reality a theory of communication (in the physical sense).

2- A quantitative theory of information, which considers, for example, the number of bits necessary to convey some piece of information.

3- A theory of meaning, which considers the transfer of meaningful information from a semantically able emitter to one or more equally semantically able receivers.
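The quantitative sense in point 2 can be illustrated with a minimal sketch (illustrative only, not part of the original entry): to single out one message among N equally likely alternatives, a fixed-length binary code needs the base-2 logarithm of N bits, rounded up.

```python
import math

def bits_needed(n_messages):
    """Bits required by a fixed-length binary code to distinguish
    one message among n_messages equally likely alternatives."""
    return math.ceil(math.log2(n_messages))

# 26 letters need 5 bits, since 2**5 = 32 >= 26 but 2**4 = 16 < 26
print(bits_needed(26))  # → 5
```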

D. Mac KAY expresses this idea: "Information theory is concerned with the making of representations, i.e. symbolism in the most general sense" (1972, p.161).

According to Mac KAY: "General information theory is concerned with the problem of measuring changes in knowledge. Its key is the fact that we can represent what we know by means of pictures, logical statements, symbolic models, or what you will. When we receive information, it causes a change in the symbolic picture, or representation that we could use to depict what we know" (p.42).

Still in Mac KAY's words: "The scope of information theory thus includes in principle at least three classes of activity: 1) Making a representation of some physical aspect of experience. This is the problem treated in scientific or descriptive information theory;

2) Making a representation of some non-physical (mental or ideational) aspect of experience. This is at the moment outside our concern, being the problem of the Arts;

3) Making a representation in one space B, of a representation already present in another space A. This is the problem of communication theory, B being termed the receiving end and A the transmitting end or a communication channel" (p.162).

Thus, at least in the realm of symbolic coding, communication theory is included within information theory.

However, a serious confusion besets the whole subject. For example, Mac KAY writes: "SHANNON wisely insisted that the concept of "meaning" was outside the scope of his theory".

"This innocent statement by SHANNON has given rise to two unfortunate consequences:

"In the first place the original sense of SHANNON's warning has sometimes been forgotten and he is credited with the view that the whole theory of information (which includes his own theory of the unexpectedness of information as a vital part) has nothing to do with "meaning".

"Secondly, and largely in consequence of this, the idea has become current that the whole subject of meaning is not satisfactory for the information theorist. "Subjective", "vague", "dangerous", are the adjectives with which it is often smothered" (p.80).

From a different angle, M. MARUYAMA has observed that SHANNON's Information theory: "… was trapped in a classification epistemology which sees a structure as consisting of elements which tend to behave independently. In SHANNON's theory, the most probable or "natural" state of a system is that in which each element flips its own coin regardless of the other elements. The amount of information which a structure conveys is defined as the degree of departure of the structure from such probable states of nonstructure. The higher the improbability of the structure, the higher the amount of information. In this epistemology, structures tend to decay to more probable nonstructures" (1976, p.201).
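MARUYAMA's characterization of SHANNON's measure can be sketched numerically (an illustrative example, with assumed figures, not from the entry): the most probable "non-structure" is the uniform distribution, where every element "flips its own coin", and the information a structure carries is its departure from that maximum-entropy state.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The most probable "non-structure": each of 4 symbols equally likely,
# i.e. every element flips its own coin independently.
h_max = math.log2(4)                # 2.0 bits

# A structured, less probable distribution over the same 4 symbols.
h = entropy([0.7, 0.1, 0.1, 0.1])   # roughly 1.36 bits

# Departure of the structure from the probable non-structure:
# the more improbable the structure, the larger this quantity.
departure = h_max - h               # roughly 0.64 bits
```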

In sum, SHANNON's information theory is somewhat at odds with the systemic viewpoint: a bag of loose coin-flipping elements is not a significant network, still less a system.

The confusion about the real meaning of "information" in SHANNON and WEAVER remains deeply ingrained.

D. GERNERT still felt the need, in the year 2000, to write the following:

"The so-called "information theory", created mainly by SHANNON and WEAVER, was originally intended as a theory of information transmission. The sloppy and misleading terms "theory of information" and "information theory" emerged only later and do not at all reflect the real capacities of that theory. SHANNON himself advanced a passionate pleading against the overestimation of the theory essentially originated by himself.

"More and more, albeit still reluctantly, the idea is gaining broader acceptance that a theory of information which really deserves the name must also account for the semantic and pragmatic aspects of information" (2000b, p.253).

Categories

  • 1) General information
  • 2) Methodology or model
  • 3) Epistemology, ontology and semantics
  • 4) Human sciences
  • 5) Discipline oriented

Publisher

Bertalanffy Center for the Study of Systems Science (2020).

To cite this page, please use the following information:

Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]


We thank the following partners for making the open access of this volume possible: