Entropy and Negentropy in Organized Complexity

By Dr. G. M. Vasilopoulos

Darwin’s famous theory of natural evolution, although it refers to biological structures that better adapt to their environment, is also subject to thermodynamic principles, and it extends further to artificial structures such as entrepreneurial organisms. Thermodynamics, beyond physics, reaches into virtually every philosophical & scientific discipline, biology & cybernetics included. In this light, life is considered a state of very low probability in nature, because of the complicated structures and the large amounts of energy needed to create & sustain it in a well-adapted state; livability constantly confronts a major thermodynamic threat, namely ever-increasing entropy, the measure of disorder. Therefore, organisms of every kind need energy, the vital force received from their ecosystems, to prevent their entropy from increasing and to maintain the complicated structure & organization that is a prerequisite for survival.
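To make “entropy, the measure of disorder” concrete, here is a minimal sketch (added for illustration, not part of the original argument) using Shannon’s information-theoretic formulation: a near-certain, highly ordered state scores low, while a uniform, maximally disordered one scores highest.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p * log2(p)), in bits.
    Higher values correspond to more disorder / uncertainty."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return float(-(p * np.log2(p)).sum())

# A highly ordered state (one outcome nearly certain) has low entropy;
# a uniform distribution over the same outcomes maximizes it.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: near order
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: full disorder
```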

They extract energy from the “nutrients” or raw materials they consume and transduce it inside their functional modules (cells) into new forms of energy (mechanical, chemical, electrical, stored) or into usable molecules & finished goods, a process precisely governed by a complicated cybernetic mechanism, homeostasis, in the service of resilience & sustainability. This remarkable ability of organisms to prevent the increase of their entropy by safeguarding and continually adjusting their organization was identified as negative entropy by E. Schrödinger in What is Life? and as negentropy by Léon Brillouin. The concept was also much stressed, as being of critical importance, by Norbert Wiener in his essay on Cybernetics & Society. Wiener, of whom we shall see more in the follow-up post, was of course (along with Kolmogorov) the one who derived the optimal forecasting and filtering equations and initiated the use of autocovariances and autocorrelations in forecasting.
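Since the post mentions the use of autocovariances in optimal linear prediction, the following sketch shows the idea in its simplest discrete form: estimating autoregressive prediction coefficients from sample autocovariances via the Yule-Walker equations. This is a toy illustration in the Wiener-Kolmogorov spirit, not Wiener’s own derivation; the AR(2) example series, its coefficients, and all function names are assumptions made for the demo.

```python
import numpy as np

def autocovariance(x, lag):
    """Sample autocovariance of series x at a given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    return float(np.dot(x[:n - lag], x[lag:]) / n)

def yule_walker(x, order):
    """Solve the Yule-Walker equations R a = r for AR coefficients,
    built entirely from autocovariances."""
    r = np.array([autocovariance(x, k) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def forecast_one_step(x, coeffs):
    """Linear one-step-ahead prediction from the last `order` values."""
    order = len(coeffs)
    recent = np.asarray(x, dtype=float)[-order:][::-1]  # most recent first
    mean = np.mean(x)
    return float(mean + np.dot(coeffs, recent - mean))

# Hypothetical usage on a simulated noisy AR(2) process:
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal()

a = yule_walker(x, order=2)
print(a)                        # estimates close to [0.6, -0.2]
print(forecast_one_step(x, a))  # prediction for the next value
```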

To secure livability, the robust forecasting that organisms depend on becomes feasible when the entropy principle is used to capture uncertainty; an example is the maximum entropy approach of Jaynes discussed in this post. Automation then relies on algorithms whose probabilistic character lets them tackle uncertainty as well as possible, making machine learning feasible and, within the limits of complexity, efficient.
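As a sketch of Jaynes’ maximum entropy approach, consider his classic dice illustration (the Brandeis dice problem): among all distributions on the faces {1,…,6} consistent with a prescribed mean, the least-biased choice is the one maximizing entropy, which takes the exponential form p_k ∝ exp(−λk). The target mean of 4.5 below is the standard textbook value, not a figure from this post.

```python
import numpy as np
from scipy.optimize import brentq

values = np.arange(1, 7)   # die faces 1..6
target_mean = 4.5          # textbook constraint (assumed for the demo)

def mean_given_lam(lam):
    """Mean of the exponential-form distribution p_k proportional to exp(-lam * k)."""
    w = np.exp(-lam * values)
    p = w / w.sum()
    return float(np.dot(p, values))

# Solve for the Lagrange multiplier matching the mean constraint;
# the implied mean is monotone in lam, so a bracketing root-finder works.
lam = brentq(lambda l: mean_given_lam(l) - target_mean, -5.0, 5.0)
p = np.exp(-lam * values)
p /= p.sum()

print(np.round(p, 4))                 # maximum entropy probabilities, faces 1..6
print(np.dot(p, values))              # reproduces the target mean, 4.5
print(float(-(p * np.log(p)).sum()))  # the entropy achieved, in nats
```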