
Note: Continued Fraction Information Channels

Structural link between continued-fraction expansions and substrate information channels


Appendix F — Information Flow Conservation and the Arithmetic Opacity of $\pi$

F.1 Motivation

This appendix establishes a connection between the information-theoretic classification of mathematical constants (developed in the companion work “Information-Theoretic Classification of Transcendental Constants”, originally planned to live in a LevelOfIrratotionalityOfPi/ tree that never shipped; see ../LeanFormalizationV2/OmegaTheory/Irrationality/ for the Lean-side formalization that took its place) and the principle of information flow conservation in physical systems.

The central observation is that the continued fraction (CF) expansion of a mathematical constant functions as an information channel, and the properties of this channel (capacity, entropy rate, mixing time) determine whether the constant is “self-encoding” or “opaque.” This distinction has implications for the role of transcendental constants in physical theories.

F.2 The Continued Fraction as an Information Channel

Let $\alpha \in \mathbb{R} \setminus \mathbb{Q}$ be an irrational number with CF expansion $\alpha = [a_0; a_1, a_2, \ldots]$. The Gauss map $T: x \mapsto \{1/x\}$ generates the partial quotients via iteration:

$$a_{n+1} = \left\lfloor \frac{1}{T^n(\alpha - \lfloor\alpha\rfloor)} \right\rfloor$$

This defines a dynamical information channel:

$$\text{Source} \xrightarrow{\,T^n\,} \text{State } T^n(\alpha_0) \xrightarrow{\,\lfloor \cdot \rfloor\,} \text{Output } a_{n+1}$$

where $\alpha_0 = \alpha - \lfloor\alpha\rfloor$.

The channel has:

  • Entropy rate $h = \pi^2/(6\ln 2) \approx 2.37$ nats/step ($\approx 3.42$ bits/step) (Rokhlin, 1961)
  • Mixing time $\tau = O(1/|\log\lambda_2|) \approx 0.84$ steps, where $\lambda_2 \approx 0.3037$ is the Gauss–Kuzmin–Wirsing eigenvalue (Wirsing, 1974)
  • Memory $B = C/(1-|\lambda_2|)$ bits (total past–future mutual information)
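
A minimal numerical sketch of the channel (not from the source; plain Python with only the standard library, and the Gauss–Kuzmin–Wirsing value $\lambda_2 \approx 0.3037$ hard-coded as an assumed constant): it reads off partial quotients by iterating the Gauss map and evaluates the channel quantities listed above.

```python
# Illustrative sketch of the CF channel (assumptions noted inline; not the
# formal development of the companion paper).
import math

def partial_quotients(alpha, n):
    """First n partial quotients a_1..a_n, read off by iterating the Gauss map.
    Plain floats lose track of the true orbit after roughly 15-20 steps, so this
    is a toy demonstration, not a careful computation."""
    x = alpha - math.floor(alpha)
    out = []
    for _ in range(n):
        y = 1.0 / x
        a = math.floor(y)
        out.append(a)
        x = y - a          # Gauss map step: x <- {1/x}
    return out

def gauss_kuzmin_entropy(kmax=10**6):
    """Entropy (bits) of the marginal Gauss-Kuzmin law P(a=k) = log2(1 + 1/(k(k+2))).
    This is the 'output entropy' H(a_{n+1}) of Theorem F.1; the tail above kmax
    contributes only ~1e-4 bits."""
    H = 0.0
    for k in range(1, kmax + 1):
        p = math.log2(1.0 + 1.0 / (k * (k + 2)))
        H -= p * math.log2(p)
    return H

lam2 = 0.3036630029  # Gauss-Kuzmin-Wirsing eigenvalue (assumed numerical value)

print(partial_quotients(math.pi, 10))         # typically [7, 15, 1, 292, 1, 1, 1, 2, 1, 3]
print(gauss_kuzmin_entropy())                 # ~3.43 bits: marginal output entropy
print(math.pi**2 / (6 * math.log(2)**2))      # ~3.42 bits/step: entropy *rate* in bits
print(1.0 / abs(math.log(lam2)))              # ~0.84 steps: mixing-time estimate
```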

F.3 Information Conservation in the CF Channel

Theorem F.1 (Information Balance). For a $\gamma$-typical real number $\alpha$, the following information balance holds at each step $n$:

$$\underbrace{H(a_{n+1})}_{\text{output entropy}} \;=\; \underbrace{I(a_{n+1}; a_1, \ldots, a_n)}_{\text{from past } (\leq B)} \;+\; \underbrace{D(\alpha, n)}_{\text{from source (deficit)}}$$

This is a conservation law: the output entropy (a fixed constant, $\approx 3.43$ bits under the Gauss–Kuzmin stationary law) is partitioned into information recycled from the past and fresh information injected by the source.

For self-encoding constants ($D = 0$): all output information is recycled from the past. The channel is closed (no external input is needed); the CF is a conservative dynamical system in the information-theoretic sense.

For opaque constants ($D > 0$): the output information exceeds what the past can supply. The deficit $D \approx 3.43 - B$ bits must be injected at each step by the source, i.e., by the geometric or analytic definition of $\alpha$.
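
As a rough numerical probe of Theorem F.1 (a sketch, not from the source; sample size, bin cap, and function names are illustrative choices): the stationary Gauss measure can be sampled by inverse CDF ($x = 2^u - 1$ with $u$ uniform), and the mutual information between two consecutive partial quotients gives a crude lower bound on the recycled-information term $I(a_{n+1}; a_1, \ldots, a_n)$, since it truncates the past to a single symbol.

```python
# Monte Carlo lower bound on the "recycled from the past" term of Theorem F.1,
# truncated to one step of history.  Purely illustrative: the plug-in estimator
# is slightly biased upward for finite samples, and large quotients share a bin.
import math
import random
from collections import Counter

def sample_gauss_measure():
    # Inverse-CDF sampling of the Gauss measure: F(x) = log2(1 + x) on (0, 1),
    # so x = 2^u - 1 with u ~ Uniform(0, 1); resample the measure-zero endpoint.
    x = 0.0
    while x <= 0.0:
        x = 2.0 ** random.random() - 1.0
    return x

def consecutive_quotients(x, cap=20):
    # Two consecutive partial quotients of x; quotients above `cap` are merged.
    a1 = int(1.0 / x)
    x = 1.0 / x - a1                  # Gauss map step
    a2 = int(1.0 / x)
    return min(a1, cap), min(a2, cap)

def one_step_mutual_information(samples=200_000, cap=20):
    joint, left, right = Counter(), Counter(), Counter()
    for _ in range(samples):
        a1, a2 = consecutive_quotients(sample_gauss_measure(), cap)
        joint[(a1, a2)] += 1
        left[a1] += 1
        right[a2] += 1
    mi = 0.0
    for (a1, a2), c in joint.items():
        p = c / samples
        mi += p * math.log2(p / ((left[a1] / samples) * (right[a2] / samples)))
    return mi

print(one_step_mutual_information())  # small positive value: one slice of the memory B
```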

F.4 Physical Interpretation

In physical systems, information conservation is a fundamental principle: the von Neumann entropy of a closed quantum system is constant under unitary evolution, and classical Liouville’s theorem preserves phase-space volume (hence information).

The CF channel provides an arithmetic analogue:

| Physical System | CF Channel |
| --- | --- |
| Hamiltonian evolution | Gauss map iteration |
| Phase space density | Gauss–Kuzmin measure $\gamma$ |
| Entropy conservation | $H(a_{n+1}) = I + D$ (Theorem F.1) |
| Closed system | Self-encoding ($D = 0$): no external input |
| Open system | Opaque ($D > 0$): requires external source |

The constant $e$, with $D = 0$, behaves like a closed arithmetic system: its digits evolve autonomously, generating no new information beyond what is already encoded in the sequence. This is analogous to integrable Hamiltonian systems, where trajectories are fully determined by initial conditions.

The constant $\pi$, with $D > 0$, behaves like an open arithmetic system: its digits require continuous injection of information from an external source (the geometric definition). This is analogous to dissipative systems coupled to a reservoir, or to quantum measurement, where each observation extracts information from the environment.

F.5 The Self-Referential Entropy

The entropy rate of the Gauss map, which governs the information deficit of opaque constants, is:

$$h(T, \gamma) = \frac{\pi^2}{6\ln 2}$$

This quantity contains $\pi$ itself. The information deficit of $\pi$ is therefore:

$$D(\pi) = \frac{\pi^2}{6\ln 2} - B$$
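
For concreteness, a direct evaluation of the entropy constant (added here for orientation; $B$ stays symbolic, since no numerical value is quoted for it, and the unit conversion makes the bit-denominated statements of F.3 explicit):

$$\frac{\pi^2}{6\ln 2} \approx 2.373 \ \text{nats per step} \;=\; \frac{\pi^2}{6\,(\ln 2)^2} \ \text{bits per step} \;\approx\; 3.42 \ \text{bits per step}$$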

This self-referential structure, in which $\pi$ appears in the formula for its own informational opacity, is reminiscent of fixed-point constructions in mathematical logic (Gödel numbering) and of self-reference in theoretical physics (the Wheeler–DeWitt equation, where the universe appears in its own wave equation).

We note this as a structural observation, not a formal theorem, but it suggests a deep connection between the arithmetic properties of $\pi$ and the self-referential nature of physical laws.

F.6 Implications for the Omega Framework

Within the broader Omega Theory framework (see Main Paper), the dichotomy between self-encoding and opaque constants may relate to the distinction between:

  • Algebraic/structural quantities (coupling constants, symmetry dimensions) that are determined by internal consistency — analogous to self-encoding constants whose digits determine themselves.

  • Transcendental/geometric quantities ($\pi$ in the area of the unit circle, $e$ in exponential decay rates) that encode geometric or topological information, where the opacity of $\pi$ reflects the irreducible information content of circular geometry.

The information deficit $D(\pi) = \pi^2/(6\ln 2) - B > 0$ can be interpreted as the minimum information flux required to sustain the computation of a geometric constant. This is a form of information flow conservation: the geometric content of $\pi$ (the relationship between circumference and diameter) cannot be reduced to an arithmetic pattern; it must be continually accessed from the geometric definition.

F.7 Summary

The continued fraction expansion provides a natural information channel for each mathematical constant. The channel’s entropy rate ($\pi^2/(6\ln 2)$), mixing time ($O(1)$), and memory ($B$) are fixed by the Gauss map dynamics; the entropy and mixing results are classical theorems (Rokhlin, Wirsing).

The information deficit $D = H - I$ classifies constants into two regimes:

$$D = 0 \quad \Longleftrightarrow \quad \text{self-encoding (closed channel, no external input)}$$

$$D > 0 \quad \Longleftrightarrow \quad \text{opaque (open channel, requires external source)}$$

This dichotomy is a new structural result in the classification of transcendental numbers, and provides an information-theoretic perspective on why certain mathematical constants ($\pi$) resist arithmetic pattern recognition while others ($e$) do not.


References

Key references for this appendix (the planned LevelOfIrratotionalityOfPi/07-References.md companion bibliography was never written; the core references are listed inline below):

  • Rokhlin [1961]: Entropy of the Gauss map
  • Wirsing [1974]: Exponential mixing and the Gauss–Kuzmin–Wirsing constant
  • Shannon [1948]: Mutual information and entropy
  • Khinchin [1964]: Continued fraction foundations

Return to: Physics Papers README · ↑ chaos-shield root