Note: Continued Fraction Information Channels
Structural link between continued-fraction expansions and substrate information channels
Appendix F — Information Flow Conservation and the Arithmetic Opacity of $\pi$
F.1 Motivation
This appendix establishes a connection between the information-theoretic classification of mathematical constants (developed in the companion work “Information-Theoretic Classification of Transcendental Constants”, which was planned to live in a LevelOfIrratotionalityOfPi/ tree that never shipped; see ../LeanFormalizationV2/OmegaTheory/Irrationality/ for the Lean-side formalization that took its place) and the principle of information flow conservation in physical systems.
The central observation is that the continued fraction expansion of a mathematical constant functions as an information channel, and that the properties of this channel (capacity, entropy rate, mixing time) determine whether the constant is “self-encoding” or “opaque.” This distinction has implications for the role of transcendental constants in physical theories.
F.2 The Continued Fraction as an Information Channel
Let $x \in (0,1)$ be an irrational number with CF expansion $x = [0; a_1, a_2, a_3, \ldots]$. The Gauss map

$$T(x) = \frac{1}{x} - \left\lfloor \frac{1}{x} \right\rfloor$$

generates the partial quotients via iteration:

$$a_n = \left\lfloor \frac{1}{T^{n-1}(x)} \right\rfloor, \qquad n \ge 1.$$

This defines a dynamical information channel: the source (the definition of $x$) emits the symbol stream $(a_1, a_2, a_3, \ldots)$.
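As a minimal numerical sketch (not part of the formal development), the iteration above can be run in floating point. The helper name `gauss_map_quotients` is illustrative, and double precision is only trustworthy for the first handful of quotients:

```python
import math

def gauss_map_quotients(x, n):
    """Generate the first n partial quotients of the fractional part of x
    by iterating the Gauss map T(x) = 1/x - floor(1/x)."""
    x = x - math.floor(x)          # work on the fractional part
    quotients = []
    for _ in range(n):
        inv = 1.0 / x
        a = math.floor(inv)        # partial quotient a_n = floor(1/T^{n-1}(x))
        quotients.append(a)
        x = inv - a                # apply the Gauss map
    return quotients

# pi = [3; 7, 15, 1, 292, 1, ...]; rounding error grows with each step,
# so only the first few quotients are reliable in double precision.
print(gauss_map_quotients(math.pi, 5))   # [7, 15, 1, 292, 1]
```

For $\pi$ this recovers the familiar opening quotients $7, 15, 1, 292, 1$ of its fractional part; exact rational arithmetic would be needed to go much deeper.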
The channel has:
- Entropy rate: $h = \pi^2 / (6 \ln^2 2) \approx 3.42$ bits/step (Rokhlin, 1961)
- Mixing time: $O(1)$ steps, since correlations decay geometrically as $\lambda^n$ with the Gauss–Kuzmin–Wirsing constant $\lambda \approx 0.3036$ (Wirsing, 1974)
- Memory: finite; the total past–future mutual information $I(\text{past}; \text{future})$ is a bounded number of bits
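These constants are easy to evaluate numerically. The snippet below computes the closed-form entropy rate and quotes the Gauss–Kuzmin–Wirsing decay constant (the numeric value of $\lambda$ is assumed from the literature, not derived here):

```python
import math

# Entropy rate of the Gauss map (Rokhlin): pi^2 / (6 ln 2) nats per step,
# equivalently pi^2 / (6 ln^2 2) bits per step.
h_nats = math.pi**2 / (6 * math.log(2))
h_bits = h_nats / math.log(2)

# Gauss-Kuzmin-Wirsing constant: correlations decay like lam**n.
lam = 0.3036630029   # assumed numeric value; not derived in this snippet

print(f"h = {h_nats:.4f} nats/step = {h_bits:.4f} bits/step")
print(f"memory of the past decays like {lam:.4f}**n")
```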
F.3 Information Conservation in the CF Channel
Theorem F.1 (Information Balance). For a Gauss-measure-typical real number $x$, the following information balance holds at each step $n$:

$$H(a_{n+1}) = I(a_{n+1}; a_1, \ldots, a_n) + \Delta(n).$$

This is a conservation law: the output entropy (a fixed constant of roughly $3.4$ bits under the Gauss–Kuzmin law) is partitioned into information recycled from the past, $I(a_{n+1}; a_1, \ldots, a_n)$, and fresh information $\Delta(n)$ injected by the source.
For self-encoding constants ($\Delta = 0$): All output information is recycled from the past. The channel is closed, with no external input needed. The CF is a conservative dynamical system in the information-theoretic sense.
For opaque constants ($\Delta > 0$): The output information exceeds what the past can supply. The deficit of $\Delta$ bits must be injected at each step by the source, i.e., by the geometric or analytic definition of $x$.
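A numerical sanity check of the fixed per-step output entropy: the Gauss–Kuzmin law $P(a = k) = \log_2\!\left(1 + \tfrac{1}{k(k+2)}\right)$ gives the stationary distribution of a single partial quotient, and its entropy can be summed directly (the truncation bound `K` is an implementation choice; the probability tail beyond it is $O(1/K)$):

```python
import math

# Gauss-Kuzmin law: P(a = k) = log2(1 + 1/(k*(k+2))).
# Its entropy is the per-step output entropy in the information balance
# (for Gauss-typical x). Truncate the sum at K; the tail is negligible.
K = 10**5
H = 0.0
total = 0.0
for k in range(1, K + 1):
    p = math.log2(1 + 1 / (k * (k + 2)))
    total += p
    H -= p * math.log2(p)

print(f"sum of probabilities ~= {total:.6f}")   # close to 1
print(f"single-quotient entropy H ~= {H:.3f} bits")
```

The summed entropy lands slightly above the entropy rate $h \approx 3.42$ bits/step, consistent with a small but positive memory term.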
F.4 Physical Interpretation
In physical systems, information conservation is a fundamental principle: the von Neumann entropy of a closed quantum system is constant under unitary evolution, and Liouville’s theorem in classical mechanics preserves phase-space volume (and hence information).
The CF channel provides an arithmetic analogue:
| Physical System | CF Channel |
|---|---|
| Hamiltonian evolution | Gauss map iteration |
| Phase space density | Gauss–Kuzmin measure |
| Entropy conservation | $H(a_{n+1}) = I(a_{n+1}; \text{past}) + \Delta(n)$ (Theorem F.1) |
| Closed system | Self-encoding ($\Delta = 0$): no external input |
| Open system | Opaque ($\Delta > 0$): requires external source |
The constant $e$, with $\Delta = 0$, behaves like a closed arithmetic system: its digits evolve autonomously, generating no new information beyond what is already encoded in the sequence. This is analogous to integrable Hamiltonian systems, where trajectories are fully determined by initial conditions.
The constant $\pi$, with $\Delta > 0$, behaves like an open arithmetic system: its digits require continuous injection of information from an external source (the geometric definition). This is analogous to dissipative systems coupled to a reservoir, or to quantum measurement, where each observation extracts information from the environment.
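The contrast can be made concrete. The partial quotients of $e$ follow the fully predictable pattern $[2; 1, 2, 1, 1, 4, 1, 1, 6, \ldots]$, so exact rational arithmetic recovers it with no surprises, whereas the quotients of $\pi$ exhibit no known pattern. A sketch (the helper `cf_quotients` is illustrative) using Python’s exact `fractions`:

```python
from fractions import Fraction
from math import factorial

def cf_quotients(x, n):
    """Exact partial quotients of a rational x via the Euclidean algorithm
    (equivalent to iterating the Gauss map on an exact rational)."""
    p, q = x.numerator, x.denominator
    out = []
    for _ in range(n):
        a, r = divmod(p, q)
        out.append(a)
        if r == 0:
            break
        p, q = q, r
    return out

# Rational approximation of e from its series, accurate to ~1/31!,
# which is far more precision than the first dozen quotients need.
e_approx = sum(Fraction(1, factorial(k)) for k in range(31))

# e = [2; 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8, ...]: the "self-encoding" regime.
print(cf_quotients(e_approx, 12))   # [2, 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8]
```

Running the same routine on a comparably precise rational approximation of $\pi$ produces quotients with no discernible regularity.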
F.5 The Self-Referential Entropy
The entropy rate of the Gauss map, which governs the information deficit of opaque constants, is:

$$h = \frac{\pi^2}{6 \ln 2} \ \text{nats/step} = \frac{\pi^2}{6 \ln^2 2} \approx 3.42 \ \text{bits/step}.$$

This quantity contains $\pi$ itself. The information deficit of $\pi$ is therefore:

$$\Delta_\pi \approx h = \frac{\pi^2}{6 \ln^2 2} \ \text{bits/step}.$$
This self-referential structure, in which $\pi$ appears in the formula for its own informational opacity, is reminiscent of fixed-point theorems and self-referential structures in mathematical logic (Gödel numbering) and theoretical physics (the Wheeler–DeWitt equation, where the universe appears in its own wave equation).
We note this as a structural observation, not a formal theorem, but it suggests a deep connection between the arithmetic properties of $\pi$ and the self-referential nature of physical laws.
F.6 Implications for the Omega Framework
Within the broader Omega Theory framework (see Main Paper), the dichotomy between self-encoding and opaque constants may relate to the distinction between:
- Algebraic/structural quantities (coupling constants, symmetry dimensions) that are determined by internal consistency, analogous to self-encoding constants whose digits determine themselves.
- Transcendental/geometric quantities ($\pi$ in the area of the unit circle, $e$ in exponential decay rates) that encode geometric or topological information, where the opacity of $\pi$ reflects the irreducible information content of circular geometry.
The information deficit $\Delta_\pi$ can be interpreted as the minimum information flux required to sustain the computation of a geometric constant. This is a form of information flow conservation: the geometric content of $\pi$ (the relationship between circumference and diameter) cannot be reduced to an arithmetic pattern; it must be continually accessed from the geometric definition.
F.7 Summary
The continued fraction expansion provides a natural information channel for each mathematical constant. The channel’s entropy rate ($h = \pi^2/(6 \ln^2 2)$ bits/step), mixing rate (set by the Gauss–Kuzmin–Wirsing constant $\lambda \approx 0.3036$), and finite memory are fixed by the Gauss map dynamics and are rigorously established.
The information deficit $\Delta$ classifies constants into two regimes:

- Self-encoding ($\Delta = 0$): every output bit is recycled from the past; the digit stream evolves autonomously.
- Opaque ($\Delta > 0$): fresh information must be injected at each step from the source definition.
This dichotomy is a new structural result in the classification of transcendental numbers, and it provides an information-theoretic perspective on why certain mathematical constants ($\pi$) resist arithmetic pattern recognition while others ($e$) do not.
References
Key references for this appendix (the planned LevelOfIrratotionalityOfPi/07-References.md companion bibliography was never written; the core references are listed inline below):
- Rokhlin [1961]: Entropy of the Gauss map
- Wirsing [1974]: Exponential mixing and the Gauss–Kuzmin–Wirsing constant
- Shannon [1948]: Mutual information and entropy
- Khinchin [1964]: Continued fraction foundations