A technique to eliminate supposedly random variations in data flows.
K. De GREENE states: "The smoothing or averaging of information flows in system dynamics can be thought of as an attempt to achieve perfect information. There is no 1:1 relationship between a smoothed information variable and the true physical variable it represents. As in the real world and in all realistic models, information in systems dynamics is subject to bias, amplification, attenuation, noise, and delay" (1994, p.3-21).
Quoting FORRESTER (1961, p.407), he adds: "Smoothing is a process of taking a series of past information and attempting to form an estimate of the present of the underlying significant content of the data". De GREENE comments: "Smoothing is done to attenuate rapid fluctuations and thereby to reduce meaningless noise… Smoothing eliminates random events… it appears that a great deal of real information is lost by averaging and smoothing. The very capability of the system to evolve may be lost" (p.15) (for more on this, see "Systems dynamics").
Of course, these effects result from at least two features of modelization:
1) what the modelizer deems meaningless;
2) classical models of complexity do not take deterministic chaos into account and may even eliminate the very data that could reveal it.
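The trade-off described above can be illustrated with first-order exponential smoothing, the form commonly used in system dynamics models. The following sketch is illustrative only; the smoothing time `T` and the sample signal are hypothetical choices, not values from the sources quoted here. Note how a single genuine spike is damped along with the noise, which is precisely the loss of "real information" De GREENE warns about.

```python
def smooth(series, dt=1.0, T=5.0):
    """First-order exponential smoothing (illustrative sketch).

    Each step moves the smoothed value toward the current input by
    the fraction dt/T, attenuating rapid fluctuations. T (the
    smoothing time) is a hypothetical parameter chosen by the
    modelizer; a larger T discards more variation as "noise".
    """
    smoothed = [float(series[0])]
    for x in series[1:]:
        s = smoothed[-1]
        smoothed.append(s + (dt / T) * (x - s))
    return smoothed

# A roughly steady signal with small noise and one genuine spike (50).
signal = [10, 10, 11, 9, 50, 10, 11, 10]
result = smooth(signal)
# The spike survives only as a muted bump: the smoother cannot
# distinguish a meaningful rare event from random fluctuation.
print(result)
```

Whether the spike was noise or a significant event is exactly what the smoothing operation cannot decide; that judgment is built into the modelizer's choice of `T` before the data are seen.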
To cite this page, please use the following information:
Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]