Laws of evolution parallel the laws of thermodynamics

2018 
Abstract

We hypothesize that concepts from thermodynamics and statistical mechanics can be used to define summary statistics, similar to thermodynamic entropy, that summarize the convergence of processes driven by random inputs subject to deterministic constraints. The primary example used here is biological evolution. We propose that the evolution of biological structures and behaviors is driven by the ability of living organisms to acquire, store, and act on information, and that summary statistics can be developed to provide a stochastically deterministic information theory for biological evolution. The statistical concepts underlying thermodynamic entropy also hold for information, and we show that adaptation and evolution have a specific deterministic direction arising from many random events. Therefore, an information theory formulated on the same foundation as the immensely powerful concepts used in statistical mechanics will provide statistics, similar to thermodynamic entropy, that summarize distribution functions for environmental properties and organism performance. This work thus establishes foundational principles for a quantitative theory that encompasses both behavioral and biological evolution and may be extended to other fields such as economics, market dynamics, and health systems.
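The abstract's central claim, that a process driven by random inputs under deterministic constraints converges in a direction summarized by an entropy-like statistic, can be illustrated with a minimal sketch. The simulation below is an assumption for illustration, not the authors' model: a population of integer "trait" values receives random perturbations while a deterministic rule drifts each value toward a target, and the Shannon entropy of the empirical trait distribution falls as the population converges.

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (in bits) of an empirical distribution given by a
    dict mapping value -> count."""
    total = sum(counts.values())
    h = 0.0
    for c in counts.values():
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

def empirical_counts(values):
    """Tally occurrences of each value in a list."""
    counts = {}
    for v in values:
        counts[v] = counts.get(v, 0) + 1
    return counts

# Hypothetical toy process: random inputs (noise) subject to a deterministic
# constraint (drift toward a target trait value of 16).
random.seed(0)
population = [random.randint(0, 31) for _ in range(1000)]  # near-uniform start
h_initial = shannon_entropy(empirical_counts(population))

target = 16
for _ in range(50):
    population = [
        x
        + (1 if x < target else -1 if x > target else 0)  # deterministic constraint
        + random.choice([-1, 0, 1])                        # random input
        for x in population
    ]
h_final = shannon_entropy(empirical_counts(population))

# Despite every individual step being random, the summary statistic moves in
# a deterministic direction: entropy decreases as the distribution narrows.
print(h_initial, h_final)
```

The individual trajectories are unpredictable, but the entropy of the distribution declines reliably, which is the sense in which many random events yield a specific deterministic direction.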