International Encyclopedia of Systems and Cybernetics

2nd Edition, as published by Charles François 2004 Presented by the Bertalanffy Center for the Study of Systems Science Vienna for public access.


The International Encyclopedia of Systems and Cybernetics was first edited and published by the systems scientist Charles François in 1997. The online version provided here is based on the 2nd edition of 2004. It was uploaded and gifted to the center by ASC president Michael Lissack in 2019; the BCSSS purchased the rights for the re-publication of this volume in 200?. In 2018, the original editor expressed his wish to pass on stewardship of the maintenance and further development of the encyclopedia to the Bertalanffy Center. In the future, the BCSSS seeks to develop the encyclopedia further through open collaboration within the systems sciences. Until the center has found and implemented an adequate technical solution for this, the static website is made accessible for the benefit of public scholarship and education.



The more or less relative possibility of making forecasts about the behavior of some process or system

W.R. ASHBY stated: "One common idea, for instance, is that the brain can predict the future. The idea certainly has a superficial plausibility, but is fundamentally wrong. WIENER put the truth succinctly when he said that to predict the future is to carry out an operation on the past. If the past shows repetitions, and if the future sustains the regularity, then the brain will score a hit when the future arrives. But the process is wholly based on past events and on the degree to which the real world has consistent regularities. Let the world produce something really new, and the brain, human or other, is helpless. What happened, for instance, to those who worked with X-rays in 1905? They burned themselves horribly. The brain cannot deal effectively with the really new: it must wait until the new has had time to go adequately into the past. All wisdom is after the event. What we expect of mechanical brains must be based on this fact, if our expectation is to be realistic" (1963, p.207-208).

According to F. HEYLIGHEN: "Processes that are predictable, but not reversible are characterized by the fact that distinctions cannot be created (i.e. distinct effects always result from distinct causes),… but can be destroyed (i.e. distinct causes can have equal effects)" (1989, p.2369).

"Processes that are reversible but not predictable are characterized by the fact that distinctions cannot be destroyed (i.e. distinct causes always lead to distinct effects), but can be created (i.e. equal causes can have distinct effects) (Ibid.).

This abstract statement reflects the fact that predictability is always practically limited. Total reversibility is impossible in the real world, which thus always allows for some measure of unpredictability. Moreover, the existence of giant fluctuations, with the possibility of bifurcations and even, in some cases, the emergence of higher levels of complexity in processes and systems, all but rules out precise predictability.

F. HEYLIGHEN adds: "From the point of view of predictability,… the microscopic causality principle is useless or trivial, since it ignores all repetitions of processes or experiments. Predictability only exists on a macroscopic level, where microscopic differences between non-identical, individual phenomena are ignored, to determine the macroscopically meaningful distinctions between (infinite) classes of phenomena" (p.378).

HEYLIGHEN proceeds, giving the following example: "It is a classic result that phenomena that are completely unpredictable at the microscopic level may be modelled macroscopically by deterministic theories. For example, statistical mechanics shows how the random collisions between molecules in a gas can be described by deterministic equations for macroscopic properties such as temperature, volume and pressure. It is also well known that the indeterminism that quantum mechanics postulates for microscopic particles disappears when going to the "classical limit" of macroscopic objects" (Ibid).
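The statistical-mechanics point can be illustrated with a minimal numerical sketch (an assumption-laden toy, not the encyclopedia's own example): individual "molecular" energies are drawn at random, yet the ensemble average, a stand-in for a macroscopic quantity such as temperature, is almost identical across independent microscopic realizations.

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def mean_energy(n_molecules):
    """Average energy of n molecules with randomly drawn individual energies."""
    return sum(random.expovariate(1.0) for _ in range(n_molecules)) / n_molecules

# Two independent microscopic realizations...
a = mean_energy(100_000)
b = mean_energy(100_000)

# ...agree closely at the macroscopic level, although every single
# molecular energy differs between the two runs.
print(abs(a - b) < 0.05)
```

The larger the ensemble, the tighter the agreement: microscopic unpredictability is averaged away into macroscopic determinism.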

Thus, there is a degree of macro-determinism, framing micro-indetermination, and information about it can be obtained by using statistical methods. (see data reduction)

HEYLIGHEN adds: "On the other hand, recent developments in self-organization models and nonlinear thermodynamics have attracted attention to the opposite phenomenon: microscopically deterministic systems that behave in a completely unpredictable way when considered from a macroscopic viewpoint" (p.378).

As an example, HEYLIGHEN cites: "… certain types of cellular automata whose local dynamical rules are completely deterministic but for which there is no global algorithm, allowing to predict their overall evolution without computing all the individual, microscopic transitions from the given initial state" (According to S. WOLFRAM, 1984).
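A standard concrete case of this kind is WOLFRAM's Rule 30 automaton; the short sketch below (an illustration chosen here, not taken from the entry) applies a completely deterministic local rule, yet no known global shortcut yields the long-term pattern without computing every microscopic transition.

```python
RULE = 30  # the rule number encodes the next state for each 3-cell neighborhood

def step(cells):
    """Apply one deterministic Rule 30 update to a tuple of 0/1 cells (closed ends)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right
        out.append((RULE >> pattern) & 1)  # look up the new state in the rule bits
    return tuple(out)

# Start from a single live cell and iterate: the same input always gives
# the same output (determinism), but the evolving pattern must be
# computed step by step -- there is no predictive global formula.
row = tuple(1 if i == 40 else 0 for i in range(81))
for _ in range(20):
    row = step(row)
print(sum(row))  # number of live cells after 20 deterministic steps
```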

There is thus a relative antinomy – or complementarity, as already noted by POINCARÉ – between levels and types of predictability. Deterministic, or algorithmic, predictability would be rigorous, but is practically never possible (even if it may be approached in some cases). Stochastic predictability is frequently possible, but does not allow specific prediction for individual events. This became clearer with the discovery of chaotic determinism.

As expressed by J. CASTI: "Being deterministic and being predictable are just not the same thing at all" (1994, p.87).
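CASTI's remark is commonly illustrated with the logistic map (an example supplied here, not from the entry): the iteration x → r·x·(1−x) is fully deterministic, yet at r = 4 two initial states differing by far less than any realistic measurement error diverge until one is useless for predicting the other.

```python
def iterate(x, r=4.0, steps=50):
    """Iterate the deterministic logistic map x -> r*x*(1-x) a fixed number of times."""
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = iterate(0.2)
b = iterate(0.2 + 1e-10)  # perturbation far below any practical measurement error

# Determinism: the same initial state always yields the same final state.
print(iterate(0.2) == a)  # True
# Unpredictability: the tiny initial gap has been amplified by many
# orders of magnitude, so a no longer predicts b.
print(abs(a - b) > 1e-6)
```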


  • 1) General information
  • 2) Methodology or model
  • 3) Epistemology, ontology and semantics
  • 4) Human sciences
  • 5) Discipline oriented



To cite this page, please use the following information:

Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]

We thank the following partners for making the open access of this volume possible: