information-theory mathematics physics
Definition
Information Theory
Information Theory is the mathematical study of the quantification, storage, and communication of information. It was founded by Claude Shannon in his 1948 paper, "A Mathematical Theory of Communication".
Key Concepts
- Entropy: A measure of the uncertainty or unpredictability of a data source. It quantifies the average amount of information the source produces per symbol, defined as H(X) = -Σ p(x) log₂ p(x).
- Redundancy: Data that can be removed without losing information. Redundant data are compressible.
- Compression: The process of reducing the number of bits needed to represent data. As noted in What Is Intelligence?, data are redundant precisely to the degree that they are predictable or exhibit order; by Shannon's source coding theorem, entropy sets the lower bound on lossless compression (see the sketch after this list).
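A minimal sketch in Python of both ideas (the zlib choice and all names here are illustrative assumptions, not from the source): it estimates per-symbol entropy from frequencies and compares how an ordered byte string and a random one compress.

```python
import math
import os
import zlib
from collections import Counter

def empirical_entropy(data: bytes) -> float:
    """Per-symbol Shannon entropy in bits/byte, estimated from frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

redundant = b"abab" * 2500   # 10,000 bytes, highly ordered
noise = os.urandom(10_000)   # 10,000 bytes, incompressible

for name, data in [("redundant", redundant), ("random", noise)]:
    packed = zlib.compress(data, 9)
    print(f"{name:9s} entropy={empirical_entropy(data):.2f} bits/byte, "
          f"compressed {len(data)} -> {len(packed)} bytes")

# Note: zlib also exploits repetition across symbols, so the ordered string
# compresses far below even what its per-symbol entropy alone would suggest.
```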
Information and Biology
From a biological perspective, DNA and other replicators are highly compressible because they evolved through symbiogenesis: recursive copying of sub-sequences creates a hierarchical, self-similar structure that exhibits power-law scaling, a property information theory can quantify through data compression (a toy illustration follows).
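A toy sketch of that claim (the growth rule, replicate, and its parameters are hypothetical stand-ins, not a model from the source): grow a sequence by repeatedly copying random sub-sequences of itself, then check that it compresses far better than random bytes of the same length.

```python
import os
import random
import zlib

def replicate(seed: bytes, rounds: int, max_copy: int = 64) -> bytes:
    """Grow a sequence by recursively appending copies of its own
    sub-sequences, a crude stand-in for duplication-driven growth."""
    seq = bytearray(seed)
    for _ in range(rounds):
        start = random.randrange(len(seq))
        length = random.randrange(1, max_copy + 1)
        seq.extend(seq[start:start + length])  # copy a sub-sequence to the end
    return bytes(seq)

random.seed(0)  # reproducible toy run
genome = replicate(b"ACGT", rounds=500)
noise = os.urandom(len(genome))

for name, data in [("self-similar", genome), ("random", noise)]:
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name:12s} length={len(data)}  compression ratio={ratio:.2f}")
```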
Note
Information theory provides the bridge between thermodynamics and biology. As John von Neumann realized in his work on self-reproducing automata, the ability of a system to replicate is fundamentally a computational and informational process involving the separation of “instructions” (the tape) from “machinery” (the constructor).
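The tape/machinery split can be made concrete in software with a quine (purely an analogy, not von Neumann's actual construction): the string s below acts as the tape describing the program, and the final line is the constructor that builds a copy from it.

```python
# The string s is the "tape": a passive description of the program.
# The last line is the "constructor": it reads the tape and emits a
# complete copy. Running this prints the two code lines verbatim.
s = 's = %r\nprint(s %% s)'
print(s % s)
```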