Entropy, once seen as a measure of disorder in physical systems, is increasingly understood as a measure of an observer's ignorance about a system's precise microscopic state. This perspective links entropy to information theory and challenges traditional views of objectivity in science.
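In information-theoretic terms, this reading is usually expressed through the Gibbs-Shannon formula, which quantifies the missing information about which microstate the system occupies (a standard identity, included here as background rather than taken from the summarized work):

$$ S = -k_B \sum_i p_i \ln p_i , $$

where $p_i$ is the probability assigned to microstate $i$. When all $W$ microstates are equally likely, this reduces to Boltzmann's $S = k_B \ln W$; the more sharply an observer's knowledge pins down the state, the lower the entropy they assign.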
Sorkin and colleagues have derived an equivalent of the second law of thermodynamics for living systems, relating a cell's active uptake of energy to its random-looking path through the entropy that motion produces.
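To illustrate the flavor of such results, stochastic thermodynamics offers a standard link between a random-looking trajectory and entropy production (a generic relation given here for context; it is not necessarily the specific bound derived by Sorkin and colleagues): the average entropy produced is the statistical distinguishability of a path from its time reverse,

$$ \langle \Delta S \rangle = k_B \, D_{\mathrm{KL}}\!\left( P[\omega] \,\Vert\, \tilde{P}[\tilde{\omega}] \right) = k_B \left\langle \ln \frac{P[\omega]}{\tilde{P}[\tilde{\omega}]} \right\rangle \;\ge\; 0 , $$

where $P[\omega]$ is the probability of observing trajectory $\omega$ and $\tilde{P}[\tilde{\omega}]$ that of its time-reversed counterpart. A path that looks statistically identical run forward and backward produces no entropy; detectable irreversibility, such as a cell's energy-driven motion, implies dissipation.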
This review article discusses the concept of entropy in statistical physics and its dual role as a tool for inference and a measure of time irreversibility. It highlights developments in stochastic thermodynamics and the principle of maximum caliber, and emphasizes the importance of cross-talk among researchers in disparate fields.
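For readers unfamiliar with it, the principle of maximum caliber extends maximum-entropy inference from states to trajectories. In its standard formulation (sketched here as background, with notation not drawn from the article itself), one maximizes the path entropy, or caliber,

$$ \mathcal{C} = -\sum_{\Gamma} p_\Gamma \ln p_\Gamma , $$

subject to normalization and constraints on path-averaged observables $\langle A_j \rangle = \sum_\Gamma p_\Gamma A_j(\Gamma)$, which yields the least-biased distribution over paths $\Gamma$:

$$ p_\Gamma = \frac{1}{Z} \exp\!\Big( -\sum_j \lambda_j A_j(\Gamma) \Big) , $$

with Lagrange multipliers $\lambda_j$ fixed by the constraints and $Z$ a dynamical partition function. This is the trajectory-level analogue of deriving equilibrium ensembles from maximum entropy, and it is one concrete bridge between inference and irreversibility that such reviews emphasize.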