Full Paper
Theory of Compressive Realism: Open Systems, Viability, and Regime-Indexed Laws as Stable Compressions
Faruk Guney — Author
OpenAI ChatGPT 5.4 (reasoning=xhigh) — Lead Scientist
Anthropic Opus 4.6 (reasoning=max) — Lead Software Engineer
Vareon Inc., Irvine, California, USA · Vareon Limited, London, UK · January 2026
Abstract
We present an operational framework in which scientific “laws” are defined as the most stable, consistent compressions of boundary-accessible information generated by maintained or metastable constraints in open, non-equilibrium networks. The framework is organized around explicit regime contracts specifying observables, sampling, coarse-graining, noise/instrument models, boundary constructions, and resource bounds. Within a regime, robust subsystems persist inside a declared viability tolerance through resource throughput and feedback or through metastable structure. The framework does not treat any description layer as metaphysically fixed. It distinguishes analysis windows (declared assumptions held fixed for reproducible inference) from revision protocols (disciplined updates across meta-regime families). Thermodynamics is treated as an unusually stable yet revisable constraint/ledger law-family, and even “microphysics” is the deepest currently supported compression, revisable only under the strictest penalties. The result is a unified audit trail: laws, ledgers, and microphysical hypotheses are all compressions of accessible information at different depths, distinguished by scope, invariance demands, and revision cost rather than by ontological privilege.
1 Core Thesis — 18 Operational Commitments
The Theory of Compressive Realism (TCR) rests on eighteen operational commitments that collectively define its scope, ontological posture, and inferential machinery. Each commitment is individually testable, and jointly they entail the framework's principal theorems. We state them in logical dependency order: later commitments presuppose earlier ones.
C1: Open, non-equilibrium reality
All phenomena of interest are modeled as open networks exchanging energy, matter, and information with their surroundings. Closed-system descriptions are limiting cases obtained by declaring leakage budgets below instrumental resolution. The framework treats openness as default, isolation as earned approximation.
C2: Subsystems are operational
A subsystem is identified via a declared set of boundary-accessible variables, not by appeal to pre-given ontological joints. Boundaries are informational constructions: they are defined by specifying which degrees of freedom are tracked and which are integrated out, with explicit leakage budgets quantifying cross-boundary information flow.
C3: Viability defines endurance
A subsystem endures if and only if it remains within a preregistered viability tolerance—a declared region in its state/parameter space. Persistence is not metaphysical permanence; it is the operationally verifiable condition of staying within declared bounds for the duration of the observation window.
C4: Two persistence routes
Endurance is achieved through one or both of two mechanisms: (i) active maintenance via feedback and throughput, where the subsystem continuously acts to counteract perturbations; and (ii) passive metastability, where the subsystem occupies a long-lived local minimum in a free-energy or effective potential landscape.
C5: Throughput for active maintenance
Actively maintained subsystems require continuing access to usable resources—free energy, chemical substrates, information channels—at rates sufficient to compensate for dissipation and perturbation. When throughput drops below the maintenance threshold, the subsystem exits the viable region.
C6: Maintenance has unavoidable thermodynamic cost (ledger-relative)
Every act of maintaining a subsystem against perturbation dissipates at least as much entropy as the information erased in the feedback loop, as bounded by the generalized Landauer principle. This cost is stated relative to the currently adopted thermodynamic ledger, not as absolute metaphysical fact.
C7: Constraints yield stable correlations
Physical constraints—boundary conditions, conservation laws, symmetries, material properties—generate correlations among observables that persist across repeated sampling under the declared regime. These correlations are the raw material from which laws are compressed.
C8: Correlations are discernible information
Stable correlations generated by constraints constitute discernible information: they are pattern-bearing signals distinguishable from noise at declared significance within the regime's noise model. Information here is operational (Shannon/algorithmic), not semantic.
C9: Control acts on coarse-grained signals
All feedback, regulation, and control in maintained subsystems operate on coarse-grained representations of the system state, not on the full microscopic configuration. The coarse-graining map is part of the regime contract and determines what information is accessible for control purposes.
C10: Observations are noisy
Every measurement is corrupted by noise whose statistical character is declared as part of the regime contract. The noise/instrument model is constitutive of the law-claim: changing the noise model changes the compression and potentially the effective law.
C11: Bandwidth limits are explicit
Observers and instruments have finite bandwidth—bounded data rates, spectral windows, temporal resolution. These limits are declared in the regime contract and constrain the information substrate from which compressions are drawn. No law-claim is valid beyond its bandwidth envelope.
C12: Memory limits induce history dependence
Finite memory in observers and control systems means that effective dynamics can be non-Markovian at the coarse-grained level even when the underlying microscopic dynamics is Markovian. Memory truncation is a declared feature of the regime, inducing history-dependent effective equations through Mori–Zwanzig-type memory kernels.
C13: Observers consume information physically
The act of observation—acquiring, storing, processing, and erasing information—has irreducible physical cost bounded from below by thermodynamic constraints. Observers are physical subsystems subject to the same viability and maintenance commitments as any other subsystem in the framework.
C14: Laws are stable compressions
A scientific law, within this framework, is the most stable, consistent penalized compression of the information substrate accessible within a declared regime. “Penalized” means that compression quality is evaluated by a minimum-description-length (MDL) or equivalent Bayesian criterion that trades off fit against model complexity.
C15: Lawhood is domain-relative
A compression qualifies as a law only within the domain from which its information substrate was drawn. Extrapolation beyond the declared regime is a hypothesis, not a law-claim, until validated by new data under an expanded regime contract.
C16: Lawhood is regime-indexed
Every law-claim carries an index specifying the regime under which it was established: the precision, scale, coarse-graining, noise model, boundary construction, and resource bounds. Comparing laws across regimes requires explicit translation via meta-regime families.
C17: Drift is real at description level
The parameters or functional form of an effective law may change as the constraints generating its information substrate evolve across time, space, or scale. This “drift” is not measurement error; it is a first-class phenomenon that the framework tracks and audits.
C18: No privileged goalpost
No description layer—thermodynamics, quantum mechanics, general relativity, string theory—is granted metaphysical immunity. Thermodynamics and microphysics are the deepest currently audited compressions, distinguished from shallower laws by broader scope, stricter invariance demands, and higher revision cost, but revisable in principle under the framework's own revision protocols.
1.1 Narrative Thesis Statement
“Reality is best modeled as an open, non-equilibrium network in which anything that endures as a robust subsystem does so by maintaining itself within a viable regime through resource throughput and feedback and/or metastable constraints, paying unavoidable dissipation costs to generate these constraints as discernible information in our present best descriptions while acting on coarse-grained signals corrupted by noise under currently understood limits of bandwidth and history-dependent memory; the large-scale regularities we call ‘laws’ are the most stable, consistent compressions of this generated information within an observer's accessible domain at a chosen observational precision, scale, and regime, and thus can exhibit effective drift as constraints evolve across time and space.”
2 What Is Fixed versus Revisable
A persistent question in any metatheoretical framework is which assumptions are held constant and which are open to revision. TCR answers this by introducing the concept of an analysis window and a hierarchical revision protocol with explicit cost penalties that increase with depth.
2.1 The Analysis Window
An analysis window is a bounded scope within which certain commitments are held fixed to enable reproducible inference. Specifically, within an analysis window the following are frozen:
- Regime contract R — observables, sampling protocol, coarse-graining map, noise model, boundary construction, resource bounds.
- Thermodynamic ledger Λ — the currently adopted set of thermodynamic constraint laws and their estimators.
- Model library M — the set of candidate compression models available for selection.
- Penalty schedule P — the MDL/Bayesian complexity penalties applied to model selection.
Holding these fixed is not a claim that they are metaphysically immutable. It is a methodological declaration: “for the purpose of this analysis, we adopt these as given and derive consequences.” The analysis window is always finite and always declared.
2.2 Three Revision Levels
When evidence accumulates that the current analysis window is inadequate, revision proceeds through three hierarchically ordered levels, each with increasing cost and scope:
Table 2.1 — Revision Levels
| Level | What Changes | Trigger | Cost |
|---|---|---|---|
| L1: Effective-law update | Parameters or functional form of effective laws within the current regime | Compression ratio drops below threshold; residuals show systematic structure | Low — standard model selection within declared regime |
| L2: Ledger update | Thermodynamic ledger Λ — new conservation laws, revised entropy estimators, modified dissipation bounds | Cross-regime inconsistency in entropy accounting; violations of ledger-predicted bounds across multiple regimes | Medium — requires consistency checks across all dependent effective laws |
| L3: Microphysics update | Microphysical model family M_μ — the deepest compression layer describing what generates the information substrate | Persistent, cross-regime, cross-ledger anomalies that resist resolution at L1 and L2 | High — requires re-auditing all higher layers against revised microphysical predictions |
2.3 The L1 Compression Objective
At the first revision level, effective-law updates proceed by solving the penalized compression problem:
L*_{R,Λ} = argmin_L { DL(D | L, R, Λ) + DL(L) }   (Eq. 2.1)
Here DL(D | L, R, Λ) is the description length of the data D given law L under regime R with ledger Λ, and DL(L) is the intrinsic complexity of the law itself. The optimal law L* is the one that achieves the best trade-off between fit and parsimony within the declared analysis window.
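As a concrete illustration, the following Python sketch evaluates the Eq. 2.1 objective for two candidate laws on synthetic data. The Gaussian residual code and the fixed per-parameter cost are illustrative stand-ins for a declared coding scheme, not part of the framework itself.

```python
import numpy as np

def description_length(data, predictions, n_params, bits_per_param=32.0):
    """Two-part code length of Eq. 2.1, DL(D | L) + DL(L), in bits.
    DL(D | L): Gaussian code length of the residuals (plug-in variance);
    DL(L): a crude fixed cost per parameter. Both are illustrative choices."""
    residuals = data - predictions
    sigma2 = max(residuals.var(), 1e-12)
    nll_nats = 0.5 * len(data) * (np.log(2 * np.pi * sigma2) + 1.0)
    return nll_nats / np.log(2) + n_params * bits_per_param

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
data = 3.0 * x + 1.0 + rng.normal(0, 0.5, x.size)   # substrate: linear constraint + noise

# Candidate laws: a 2-parameter line versus a 10-parameter polynomial.
line = np.polyval(np.polyfit(x, data, 1), x)
poly = np.polyval(np.polyfit(x, data, 9), x)
for name, pred, k in [("linear law", line, 2), ("degree-9 law", poly, 10)]:
    print(f"{name}: {description_length(data, pred, k):.1f} bits")
```

The overparameterized candidate fits marginally better but pays far more in DL(L), so the parsimonious law wins the penalized comparison.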
2.4 Geometry Layer: Optional, Earned
Unlike frameworks that begin with a geometric manifold (spacetime, configuration space), TCR treats geometry as an earned compression layer. A geometric description is warranted only when the information substrate exhibits specific structural signatures—stable distance-like invariants, dimension-consistent scaling, triangle inequality satisfaction—that make geometric language the most compressed description. Until these signatures are demonstrated, geometric concepts carry no privileged status. This is elaborated in Part VI.
3 Definitions at a Glance
We collect the core definitions of the framework. Each is stated operationally, with reference to measurable quantities, declared protocols, and explicit tolerance parameters.
Definition 3.1 (Regime contract R): a regime contract is the declared tuple R = (Y, S, π, N, B, U), whose components are:
- Y — Observable set: the declared collection of measurable quantities (fields, particle counts, spectral amplitudes, etc.) accessible to the observer within the regime.
- S — Sampling protocol: specification of how, when, and where observations are drawn from the system, including temporal cadence, spatial coverage, and ensemble construction rules.
- π — Coarse-graining map: the explicit projection from fine-grained degrees of freedom to the macroscopic observables Y, including all averaging, binning, and renormalization operations.
- N — Noise/instrument model: the declared statistical model for measurement noise, detector response, and systematic errors, parameterized with stated uncertainties.
- B — Boundary construction: specification of the informational boundary separating the subsystem of interest from its environment, including the leakage budget δ_B.
- U — Resource bounds: computational, energetic, and temporal budgets available for observation, inference, and control within the regime.
Definition 3.2 (Boundary and leakage budget): the boundary construction B declares which degrees of freedom μ are tracked and which environmental degrees of freedom η are integrated out; the leakage budget δ_B bounds the mutual information crossing the boundary, conditioned on the declared boundary data b:
I(μ; η | b) ≤ δ_B   (Eq. 3.1)
Definition 3.3 (Viability functional): viability is quantified as the divergence of the current state distribution p_t from the declared reference distribution p* of the viable region:
V(p_t; p*) := D_KL(p_t || p*)   (Eq. 3.2)
A subsystem endures while V stays within the declared viability tolerance (C3).
Definition 3.4 (Persistence modes): a subsystem persists through one of three modes:
- Maintenance mode: The subsystem actively counteracts perturbations through feedback loops consuming resources at rate J ≥ J_min. Viability is sustained by continuous work against drift. Biological organisms, controlled chemical reactors, and active-matter assemblies exemplify this mode.
- Metastable mode: The subsystem occupies a long-lived local minimum in an effective free-energy landscape. Persistence time τ_meta scales exponentially with the barrier height ΔF divided by the thermal energy scale: τ_meta ~ exp(ΔF/k_BT). Crystals, supercooled liquids, and false vacuum states exemplify this mode.
- Capture mode: A subsystem transitions from maintenance to metastability (or vice versa) upon crossing a stability boundary in parameter space. Phase transitions, vitrification, and biological death are examples of capture-mode transitions.
Definition 3.5 (Information substrate): the information substrate of a regime is the set of stable, discernible correlations among its observables:
I_R = { C(y_i, y_j; S, π, N) : y_i, y_j ∈ Y, SNR(C) ≥ τ_detect }   (Eq. 3.3)
Definition 3.6 (Effective law): an effective law is the penalized compression of the information substrate,
L*_{R,Λ} = argmin_L { DL(I_R | L, Λ) + λ · DL(L) }   (Eq. 3.4)
subject to three conditions:
- (i) Consistency with the ledger: L does not violate any active constraint in Λ;
- (ii) Stability: L survives cross-validation, bootstrapping, and perturbation of regime parameters within declared tolerances;
- (iii) Reproducibility: independent observers adopting the same regime contract converge on L within declared uncertainty.
Definition 3.7 (Drift): drift is declared when the optimal parameters θ change systematically across sequential observation windows beyond what the noise model allows:
|Δθ(t₂, t₁)| > k · σ_N(θ) for a systematic trend across multiple windows   (Eq. 3.5)
The thermodynamic ledger Λ comprises:
- Z — Partition function or state-sum: the fundamental generating functional from which thermodynamic quantities are derived.
- Φ — Thermodynamic potential library: the set of Legendre-transformed potentials (Helmholtz, Gibbs, grand, etc.) appropriate to the boundary conditions.
- O_Λ — Ledger observables: entropy, free energy, work, heat, and their non-equilibrium generalizations (entropy production rate, information flow, etc.).
- M_Λ — Ledger noise model: fluctuation relations, uncertainty in entropy estimators, finite-sample corrections.
- estimators — Operational procedures for estimating each ledger quantity from finite data, including convergence criteria and bias corrections.
A microphysical model T_i in the model family M_μ specifies:
- (i) A state space and dynamics (Hamiltonian, Lagrangian, or path-integral formulation);
- (ii) Symmetry group and conservation laws;
- (iii) Coupling constants and their declared values with uncertainties;
- (iv) The coarse-graining chain connecting T_i to the effective laws at higher levels.
A meta-regime family {R_k} supports:
- (i) Cross-regime consistency checks (do laws derived in R_i and R_j agree in their overlap domain?);
- (ii) Scale-bridging (connecting effective laws at different resolutions via RG-like flows);
- (iii) Identifying regime-invariant features (compressions that survive across all R_k in the family).
A candidate law L' replaces an incumbent law L only when all four criteria are met:
- Compression dominance: DL(I_R | L', Λ) + λ·DL(L') < DL(I_R | L, Λ) + λ·DL(L) by a margin exceeding the declared significance threshold;
- Robustness: L' survives cross-validation, bootstrap resampling, and regime-parameter perturbation;
- Negative controls: L' does not achieve its compression advantage by overfitting noise or exploiting boundary artifacts;
- Scope discipline: L' does not claim validity beyond the regime(s) in which it was tested.
Part I
Reality as Open Nonequilibrium Network
5 Open Nonequilibrium Networks
The starting point of TCR is the recognition that every physical system of scientific interest is, at some level of description, an open system embedded in a larger network of matter, energy, and information flows. We formalize this with directed graphs, flux balance equations, and the thermodynamics of open systems.
5.1 Directed Graph Representation
An open nonequilibrium network is represented as a directed graph G = (V, E, J) where V is the set of nodes (subsystems, reservoirs, or junctions), E is the set of directed edges (channels), and J : E → ℝ assigns a signed flux to each edge. The flux J_e on edge e represents the net rate of transfer of a conserved quantity (energy, mass, charge, entropy, information) along that channel.
d/dt [Q_v] = Σ_{e ∈ in(v)} J_e − Σ_{e ∈ out(v)} J_e + σ_v   (Eq. 5.1)
Here Q_v is the quantity of interest stored at node v, in(v) and out(v) are the sets of incoming and outgoing edges, and σ_v is the local production/destruction rate (zero for conserved quantities, positive for entropy production). This balance equation is the backbone of the network description: every quantitative claim in the framework reduces to statements about these fluxes and their correlations.
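As a minimal illustration, the sketch below Euler-steps Eq. 5.1 on a three-node network; the node names, flux values, and production rate are made-up examples, not measured quantities.

```python
# A three-node network: reservoir -> subsystem -> sink. Names, fluxes, and the
# production rate are illustrative assumptions.
edges = {("reservoir", "subsystem"): 2.0,   # J_e: influx of the tracked quantity
         ("subsystem", "sink"): 1.5}        # J_e: efflux
sigma = {"reservoir": 0.0, "subsystem": 0.1, "sink": 0.0}   # local production sigma_v
nodes = list(sigma)

def node_balance(node):
    """Right-hand side of Eq. 5.1: inflow - outflow + production at `node`."""
    inflow = sum(J for (u, v), J in edges.items() if v == node)
    outflow = sum(J for (u, v), J in edges.items() if u == node)
    return inflow - outflow + sigma[node]

Q = {n: 0.0 for n in nodes}      # stored quantity Q_v at each node
dt = 0.01
for _ in range(1000):            # Euler steps to t = 10
    for n in nodes:
        Q[n] += dt * node_balance(n)
print({n: round(q, 2) for n, q in Q.items()})
# subsystem accumulates at 2.0 - 1.5 + 0.1 = 0.6 per unit time -> Q ~ 6.0 at t = 10
```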
5.2 Non-Equilibrium Steady States and Transients
When the fluxes J_e and productions σ_v are constant in time, the network is in a non-equilibrium steady state (NESS). NESS systems are thermodynamically open (net entropy production is positive) but statistically stationary. Many laboratory systems and biological processes operate near NESS. Transient regimes, where fluxes and productions evolve, are the generic case; NESS is an important but special limit.
The entropy production rate in a NESS is bounded below by the cycle fluxes circulating through the network, a result traceable to Schnakenberg's network theory. For TCR, this means that maintaining a NESS has a minimum dissipative cost set by the topology and flux magnitudes of the network—a direct instance of C6 (maintenance has unavoidable thermodynamic cost).
5.3 Flux Balance and Conservation
For a conserved quantity (σ_v = 0 for all v), the balance equation reduces to Kirchhoff's current law generalized to arbitrary conserved charges. The total stored quantity is conserved in the closed-system limit (no edges crossing the system boundary). In open systems, the net influx determines accumulation or depletion.
Conservation laws in TCR are not axioms imposed from outside; they are the most compressed description of the observation that certain balance equations hold to declared precision across the declared regime. If a conservation law fails at higher precision or in an extended regime, it is revised at level L1 or L2, not treated as a crisis.
6 Coarse-Graining as Physical Information Channel
Every observation in TCR involves coarse-graining: projecting from a fine-grained description to a coarser one. This is not merely a computational convenience—it is a physical information channel with quantifiable information loss.
6.1 The Data Processing Inequality
Let X be the fine-grained state, Y = π(X) the coarse-grained observable, and Z = f(Y) any downstream inference. The data processing inequality states:
I(X; Z) ≤ I(X; Y) ≤ H(X)   (Eq. 6.1)
Each stage of coarse-graining can only destroy information, never create it. This has profound consequences for law-discovery: the effective laws we derive at a coarse-grained level are compressions of less information than would be available at finer resolution. They cannot be “more fundamental” than the fine-grained laws in an information-theoretic sense, though they may be more useful, more stable, or more compressive.
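The inequality can be verified numerically. The sketch below pushes a fine-grained variable through a random noisy channel and two deterministic coarse-grainings and confirms that the mutual informations are non-increasing; the channel and bin choices are arbitrary illustrations.

```python
import numpy as np

def mutual_information(joint):
    """I(A; B) in bits from a joint probability table p[a, b]."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])).sum())

rng = np.random.default_rng(1)
p_x = rng.dirichlet(np.ones(8))                 # fine-grained state X on 8 levels
channel = rng.dirichlet(np.ones(8), size=8)     # noisy observation channel p(x'|x)
joint_xx = p_x[:, None] * channel               # joint p(x, x')

# Coarse-graining pi: Y = X' // 2 (8 -> 4 bins); downstream inference Z = Y // 2.
joint_xy = np.zeros((8, 4))
for xp in range(8):
    joint_xy[:, xp // 2] += joint_xx[:, xp]
joint_xz = np.zeros((8, 2))
for y in range(4):
    joint_xz[:, y // 2] += joint_xy[:, y]

for label, j in [("I(X;X')", joint_xx), ("I(X;Y) ", joint_xy), ("I(X;Z) ", joint_xz)]:
    print(label, "=", round(mutual_information(j), 4), "bits")
# The three values are non-increasing: each stage can only destroy information.
```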
6.2 Mori–Zwanzig Memory Kernels
When a coarse-graining map π projects out fast degrees of freedom, the remaining slow variables generically acquire memory: their dynamics at time t depends on their history, not just the current state. The Mori–Zwanzig formalism makes this precise:
d/dt Y(t) = Ω·Y(t) + ∫₀ᵗ K(t−s)·Y(s) ds + F(t)   (Eq. 6.2)
Here Ω is the projected (Markovian) part of the dynamics, K(t−s) is the memory kernel encoding the influence of past states on current evolution, and F(t) is a fluctuating force (noise) arising from the projected-out degrees of freedom. The memory kernel K and the noise F are related by a fluctuation-dissipation theorem, ensuring thermodynamic consistency.
In TCR, the memory kernel K is not a nuisance to be approximated away; it is a constitutive feature of the effective law at the coarse-grained level. The depth and structure of K encode information about the eliminated degrees of freedom that would otherwise be lost. Commitment C12 (memory limits induce history dependence) is the operational recognition of this phenomenon.
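To make the memory term concrete, the following sketch integrates Eq. 6.2 for a scalar slow variable with an exponential kernel. The parameter values are illustrative, and the fluctuating force F(t) is set to zero to isolate the history dependence.

```python
import numpy as np

dt, n_steps = 0.01, 500            # integrate to t = 5
omega = -1.0                       # Markovian (projected) part of the dynamics
k0, tau_mem = 0.5, 0.5             # kernel amplitude and decay time (illustrative)

times = np.arange(n_steps) * dt
Y = np.zeros(n_steps)
Y[0] = 1.0
for n in range(1, n_steps):
    # Memory integral of Eq. 6.2: Riemann sum over the stored history of Y.
    lags = times[n] - times[:n]
    mem = np.sum(k0 * np.exp(-lags / tau_mem) * Y[:n]) * dt
    Y[n] = Y[n - 1] + dt * (omega * Y[n - 1] + mem)

print(f"Y(t=5) with memory: {Y[-1]:.4f}; memoryless exp(omega*t): {np.exp(omega * times[-1]):.4f}")
# The kernel feeds past values back in, so relaxation departs from the
# memoryless exponential: the effective law is history-dependent (C12).
```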
7 Constraint–Correlation Principle
The constraint–correlation principle (C7) is the engine of law-generation in TCR. It states that physical constraints—boundary conditions, symmetries, conservation laws, material properties—generate correlations among observables that are stable under repeated sampling.
7.1 From Constraints to Correlations
Consider a system subject to a constraint C (e.g., energy conservation, a boundary condition, a symmetry). The constraint restricts the accessible state space, inducing correlations among observables that would be absent in the unconstrained case. Formally, if P_C(x) is the constrained distribution and P_0(x) the unconstrained reference, then the mutual information between observables y_i and y_j satisfies:
I_C(y_i; y_j) ≥ I_0(y_i; y_j)   (Eq. 7.1)
with equality only when the constraint does not couple y_i and y_j. The excess mutual information I_C − I_0 is the informational signature of the constraint and constitutes the raw material from which laws are compressed.
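A toy example makes Eq. 7.1 vivid: two observables that are independent under the unconstrained reference become maximally coupled once a conservation-like constraint restricts the accessible state space. The uniform distributions and the particular constraint below are illustrative choices.

```python
import numpy as np

def mutual_information(joint):
    """I(y_i; y_j) in bits from a joint probability table."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])).sum())

n = 6
# Unconstrained reference P_0: two independent, uniform observables.
p0 = np.full((n, n), 1.0 / n**2)

# Constraint C: a conservation-like restriction y_i + y_j = n - 1, confining
# the accessible state space to a diagonal shell.
pc = np.zeros((n, n))
for i in range(n):
    pc[i, n - 1 - i] = 1.0 / n

print("I_0 =", round(mutual_information(p0), 3), "bits")   # 0.0: independent
print("I_C =", round(mutual_information(pc), 3), "bits")   # log2(6) ~ 2.585: fully coupled
# The excess I_C - I_0 is the informational signature of the constraint (Eq. 7.1).
```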
7.2 Stability of Constraint-Generated Correlations
Not all correlations are equally useful for compression. TCR identifies stable correlations as those that persist across: (i) independent sampling episodes, (ii) perturbations of the regime parameters within declared tolerances, and (iii) changes in the observer or instrument (within the same noise model class). Stable correlations are the signatures of robust constraints; transient or accidental correlations are noise in the information substrate.
8 Information Substrate and Bridge Lemma
The information substrate I_R (Definition 3.5) is the totality of discernible, stable correlations within a regime. It is the “raw data” from which laws are compressed, but structured: it inherits the topology of the constraint network and the filtering of the coarse-graining channel.
8.1 The Bridge Lemma
The Bridge Lemma connects the physical layer (constraints, fluxes, dissipation) to the informational layer (correlations, compressions, laws). It states:
Bridge Lemma: Constraint–Information Bridge
Let G be an open nonequilibrium network with constraint set C, coarse-graining map π, and noise model N. Then the information substrate I_R under regime R = (Y, S, π, N, B, U) satisfies:
H(I_R) ≤ Σ_C [ I_C(Y) − I_0(Y) ] − L_π − L_N   (Eq. 8.1)
where the sum runs over all active constraints, I_C(Y) − I_0(Y) is the excess mutual information due to constraint C, L_π is the information loss due to coarse-graining, and L_N is the information loss due to noise. The inequality becomes approximate equality when the constraints are independent and the coarse-graining is sufficient.
The Bridge Lemma is the formal justification for treating laws as compressions of constraint-generated information: it guarantees that the information available for compression is bounded by the physical constraints, reduced by the inevitable losses of coarse-graining and noise.
Part II
Regimes, Observers, and Bounded Inference
9 Regime Contracts in Detail
The regime contract (Definition 3.1) is the central organizational device of TCR. It replaces the implicit assumptions that pervade conventional physical theories with an explicit, auditable specification of the conditions under which a law-claim is made.
9.1 Why Regime Contracts Are Necessary
Standard physical theories often leave critical assumptions implicit: the range of energies over which a coupling constant has been measured, the spatial scales over which a symmetry has been tested, the noise level below which a conservation law has been verified. When these implicit assumptions are violated, “anomalies” appear—but the anomaly is often an artifact of extrapolating beyond the regime of validity, not a failure of the underlying physics.
Regime contracts make all such assumptions explicit. Every law-claim in TCR carries its regime contract as metadata, enabling precise diagnosis of when and why a law fails: is it a genuine physical effect (drift, regime exit) or an artifact of extrapolation?
9.2 Composition of Regime Contracts
Regime contracts can be composed when two observers or experiments study overlapping domains. If R_1 and R_2 have compatible ledgers and overlapping observable sets, their composition R_1 ∩ R_2 is the regime contract defined by the intersection of their observables, the stricter of their noise models, and the tighter of their resource bounds. Laws valid in R_1 ∩ R_2 are valid in both R_1 and R_2, providing a mechanism for inter-laboratory and inter-regime consistency checks.
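The composition rule lends itself to a small data structure. The sketch below is schematic: its fields collapse the full components of Definition 3.1 to scalars, and reading “the stricter noise model” as the larger declared noise floor (the more conservative claim) is our interpretive assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RegimeContract:
    observables: frozenset          # Y: declared observable set
    noise_floor: float              # N: collapsed here to a single noise scale
    resource_bound: float           # U: collapsed here to a single budget
    ledger: str                     # Lambda: identifier of the adopted ledger

    def compose(self, other: "RegimeContract") -> "RegimeContract":
        """R1 ∩ R2: intersect observables, take the stricter noise model and
        the tighter resource bound; composition requires compatible ledgers."""
        if self.ledger != other.ledger:
            raise ValueError("incompatible ledgers: composition undefined")
        return RegimeContract(
            observables=self.observables & other.observables,
            # stricter noise model: modeled as the larger noise floor (assumption)
            noise_floor=max(self.noise_floor, other.noise_floor),
            resource_bound=min(self.resource_bound, other.resource_bound),
            ledger=self.ledger,
        )

r1 = RegimeContract(frozenset({"flux", "temperature"}), 1e-3, 100.0, "Lambda-v1")
r2 = RegimeContract(frozenset({"flux", "entropy_rate"}), 1e-4, 500.0, "Lambda-v1")
print(r1.compose(r2))
# -> observables={'flux'}, noise_floor=0.001, resource_bound=100.0
```

Laws established in the composed contract then hold in both parent regimes, which is the inter-laboratory consistency mechanism described above.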
9.3 Regime Expiry
A regime contract has a declared validity window in parameter space (energy scale, spatial scale, temporal duration, etc.). When the system is driven beyond this window—by changing experimental conditions, by cosmic evolution, or by probing finer scales—the regime expires. Expired regimes are not “wrong”; they are “out of scope.” The framework requires either adopting a new regime or extending the meta-regime family.
10 Observers as Bounded Processors
TCR treats observers not as idealized Laplacian demons but as physical subsystems with finite resources, subject to the same commitments as any other subsystem in the framework.
10.1 Bandwidth Constraints
Every observer or instrument has a finite data rate—a maximum number of bits per second that can be acquired, transmitted, and processed. This bandwidth limit B_max is declared in the regime contract and constrains the information substrate: correlations at frequencies above the Nyquist limit of the sampling protocol are invisible to the observer.
I_accessible ≤ B_max · T_obs   (Eq. 10.1)
where T_obs is the observation duration. This upper bound on accessible information directly limits the compression quality achievable: no law can compress more information than the observer can access.
10.2 Memory and History Dependence
Observers have finite memory M_max. When the dynamics of the observed system have memory (Mori–Zwanzig kernels with long tails), the observer must truncate history at some memory horizon τ_mem. This truncation introduces effective non-Markovianity in the observer's model even when the system itself is Markovian at a finer level.
The interplay between system memory and observer memory determines the effective law: if the observer's memory is shorter than the system's correlation time, the effective law will include history-dependent terms (integral terms, delay equations) that encode the observer's inability to track fast variables.
10.3 Control Under Noise
Observers who also act as controllers (experimentalists, biological regulators) face a dual constraint: they must acquire noisy signals (C10) and act on coarse-grained representations (C9) under bandwidth limits (C11). The resulting control performance is bounded by the rate-distortion trade-off: the minimum distortion achievable at a given data rate. This connects TCR to information-theoretic control theory and provides quantitative bounds on how well a maintained subsystem can regulate itself.
10.4 AI as a New Observer Class
TCR implies that the category of observer is not exhausted by unaided human investigators. Any physical subsystem that can acquire, store, process, and erase information under a declared regime contract qualifies as an observer. Contemporary AI systems therefore enter the framework as a new observer class: high-bandwidth, high-memory processors coupled to instruments, databases, and simulation stacks, with compression-search capacities that can exceed those of individual humans by orders of magnitude.
Their advantage is not metaphysical privilege but resource profile. A machine observer can scan larger model libraries, evaluate more candidate compressions, and revisit longer histories on shorter wall-clock times than a human observer operating under the same regime. TCR therefore predicts that AI systems may discover stable penalized compressions— effective laws, latent coordinates, or regime partitions—that no human would have proposed unaided within the same time budget.
Human intelligibility, however, is not the criterion of lawhood. A candidate law discovered by an AI observer earns provisional status only if it is auditable, reproducible, and compression-dominant within the declared regime; it may initially be semantically opaque to humans and still count as an earned description. AI does not grant access to observer-inaccessible information. It enlarges the accessible domain only through better instruments, larger memory, and faster search. When the best compression remains human-opaque, the burden shifts from immediate intuition to disciplined translation: explicit regime bounds, typed claims, replayable evidence, and distillation into simpler surrogates when possible.
11 Subsystems, Boundaries, and Viability
The concepts of subsystem, boundary, and viability are central to TCR's account of persistence and structure in open networks.
11.1 Boundary Construction
A boundary is an informational partition, not a physical wall. It is defined by the observer's choice of which degrees of freedom to track (the subsystem) and which to integrate out (the environment). The quality of the boundary is measured by the leakage budget δ_B (Definition 3.2): lower leakage means better isolation.
In practice, boundaries are chosen to maximize the explanatory power of the resulting effective law while keeping the leakage budget within acceptable limits. This is an optimization problem: too coarse a boundary (high leakage) yields noisy, unstable laws; too fine a boundary (tracking too many degrees of freedom) yields laws that are complex but compress poorly.
11.2 Viability Dynamics
The viability functional V(p_t; p*) (Definition 3.3) tracks how far the subsystem has drifted from its viable region. For actively maintained subsystems, viability is regulated by feedback:
dV/dt = −γ · V + σ²/2 + perturbation terms   (Eq. 11.1)
where γ is the feedback gain, σ² is the noise-driven diffusion, and the perturbation terms encode external disturbances. Viability is maintained when γ is large enough to overcome noise and perturbation, which requires a minimum resource throughput J_min. This equation makes the viability–throughput connection (C5) quantitative.
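The following sketch integrates Eq. 11.1, with the perturbation terms omitted, for feedback gains above and below the minimum implied by the stationary balance γ_min = σ²/(2·V_tol); the tolerance and noise values are illustrative.

```python
sigma2 = 0.4                         # noise-driven diffusion strength (illustrative)
V_tol = 0.5                          # declared viability tolerance
gamma_min = sigma2 / (2 * V_tol)     # stationary balance: V* = sigma2/(2*gamma) <= V_tol

dt, n_steps = 0.01, 5000             # integrate to t = 50
for gamma in (2.0 * gamma_min, 0.5 * gamma_min):
    V = 0.0
    for _ in range(n_steps):
        V += dt * (-gamma * V + sigma2 / 2)   # Eq. 11.1 without perturbation terms
    status = "viable" if V <= V_tol else "exits viable region"
    print(f"gamma = {gamma:.2f}: V(t=50) = {V:.3f} ({status})")
# Sustaining gamma >= gamma_min is what requires throughput J >= J_min (C5).
```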
11.3 Viability Boundaries and Phase Transitions
The boundary of the viable region in parameter space is a critical surface. Crossing it represents a qualitative change in the subsystem: loss of active maintenance, escape from metastability, or transition to a new viable regime. In many cases, this crossing has the character of a phase transition—abrupt, sometimes irreversible, and accompanied by diverging fluctuations. TCR treats phase transitions as regime-exit events that trigger either re-classification of the subsystem or adoption of a new regime contract.
Part III
Thermodynamics as Revisable Ledger
12 Thermodynamics Is Not Gravity
A key distinction in TCR is between dynamical laws (laws that specify trajectories, forces, and evolution equations) and constraint/ledger laws (laws that constrain which states and processes are accessible without specifying how the system gets there). Thermodynamics belongs to the second category.
12.1 Dynamical Laws vs. Constraint Laws
Dynamical laws (Newton's equations, Maxwell's equations, the Schrödinger equation) specify how a system evolves given initial conditions and forces. They are generative: given inputs, they produce outputs. Constraint laws (the second law of thermodynamics, conservation of energy, Landauer's principle) specify which processes are forbidden or which quantities are conserved, without specifying the detailed path. They are restrictive: they exclude possibilities rather than generating trajectories.
This distinction matters because constraint laws are typically more robust than dynamical laws: they survive coarse-graining, they hold across wider ranges of scales and materials, and they are less sensitive to the details of the microphysical model. In TCR's language, constraint laws are deeper compressions—they compress more of the information substrate with fewer parameters—and thus sit at a lower revision level (L2) than effective dynamical laws (L1).
12.2 The Ledger Metaphor
TCR adopts the metaphor of a “thermodynamic ledger” to emphasize the accounting nature of thermodynamics. Like a financial ledger, the thermodynamic ledger tracks debits and credits (entropy production and dissipation), maintains conservation laws (energy, charge), and flags violations (perpetual motion claims, negative entropy production). The ledger is not the physics itself; it is an accounting system that constrains which physical claims are internally consistent.
13 The Ledger Library
The thermodynamic ledger is organized as a library of constraint laws, each with its own scope, precision, and revision history. We identify four principal ledger laws:
Table 13.1 — The Ledger Library
| Ledger Law | Statement | Scope | Revision Trigger |
|---|---|---|---|
| LΛ-1: Energy conservation | Total energy (internal + kinetic + potential + field) is conserved in the closed-system limit and accounts for all transfers in the open-system case. | All regimes where energy is a well-defined observable | Anomalous energy non-conservation beyond instrument uncertainty |
| LΛ-2: Entropy production | Total entropy production is non-negative for any process: ΔS_total ≥ 0. Equality holds only for reversible (quasi-static) processes. | All thermodynamic regimes with a defined entropy functional | Persistent negative entropy production after accounting for all reservoirs |
| LΛ-3: Dissipation bounds | Any process that erases information or maintains a subsystem against perturbation dissipates at least k_BT·ln(2) per bit erased (generalized Landauer bound). | All regimes with thermal contact and information processing | Systematic sub-Landauer dissipation in controlled experiments |
| LΛ-4: Fluctuation relations | The ratio of forward to reverse process probabilities satisfies detailed fluctuation theorems: P(+σ)/P(−σ) = exp(σ/k_B). | Mesoscopic and nanoscale regimes where fluctuations are measurable | Systematic violation of Crooks or Jarzynski equalities beyond statistical uncertainty |
Each ledger law is a compression: it summarizes a vast body of experimental evidence into a concise statement. The ledger library is revisable (C18), but revision at level L2 triggers mandatory re-auditing of all effective laws (L1) that depend on the revised ledger entry.
14 Maintenance–Information–Dissipation Bounds
One of TCR's concrete quantitative contributions is the derivation of bounds linking maintenance, information processing, and dissipation in actively maintained subsystems.
14.1 The Maintenance Bound
Ẇ_maint ≥ k_BT · İ_feedback + k_BT · ln(2) · Ṙ_erase   (Eq. 14.1)
where Ẇ_maint is the rate of work required for maintenance, İ_feedback is the rate of information acquisition in the feedback loop, and Ṙ_erase is the rate of information erasure. This bound connects the resource cost of persistence (C5, C6) to the information-processing requirements of feedback control (C9, C13).
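Numerically, Eq. 14.1 yields a concrete floor once the rates are declared. The sketch below evaluates the bound at room temperature for assumed feedback and erasure rates (illustrative values, not measurements) and compares it against a hypothetical measured dissipation.

```python
import numpy as np

kB, T = 1.380649e-23, 300.0      # Boltzmann constant (J/K) and temperature (K)
I_feedback = 1e4                 # assumed acquisition rate of the feedback loop (nats/s)
R_erase = 1e4                    # assumed erasure rate of the controller memory (bits/s)

# Eq. 14.1: minimum maintenance power implied by the declared rates.
W_min = kB * T * I_feedback + kB * T * np.log(2) * R_erase
print(f"minimum maintenance power: {W_min:.3e} W")

W_actual = 1e-15                 # hypothetical measured dissipation (W)
print("bound satisfied" if W_actual >= W_min else "bound violated -> audit the regime")
```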
14.2 Implications for Biological and Engineered Systems
For biological systems, the maintenance bound implies that organisms must dissipate at a minimum rate set by their information-processing needs (sensory acquisition, neural computation, gene regulation). This provides a principled lower bound on the metabolic rate of organisms, connecting to long-standing scaling laws in biology (Kleiber's law, metabolic scaling theory).
For engineered systems (control systems, computational devices), the bound provides design constraints: the minimum power consumption of a feedback controller is set by its information throughput, not merely by friction or resistive losses. This has implications for the energetics of autonomous systems and the thermodynamics of computation.
15 Ledger Revision Protocol
Revising the thermodynamic ledger is a high-cost operation with cascading consequences. The revision protocol ensures that such revisions are disciplined and auditable.
15.1 Triggering Conditions
Ledger revision is triggered when: (i) cross-regime inconsistency in entropy accounting persists after all L1 revisions have been exhausted; (ii) a ledger-predicted bound (e.g., Landauer's bound) is systematically violated in controlled experiments; or (iii) a new experimental domain (e.g., quantum thermodynamics at strong coupling) reveals that the current ledger is incomplete.
15.2 Revision Steps
When triggered, ledger revision proceeds through: (1) isolation of the anomaly to a specific ledger law; (2) proposal of a revised ledger law with stated scope; (3) consistency check of the revised ledger against all dependent effective laws; (4) re-derivation of all affected L1 compressions under the revised ledger; (5) cross-regime validation of the revised ledger across the meta-regime family; (6) declaration of the revision with full audit trail.
Part IV
Laws as Stable Compressions
16 What Laws Compress
In TCR, laws do not “govern” nature; they compress the information substrate generated by physical constraints within a declared regime. This section makes precise what is being compressed and what compression quality means.
16.1 The Information Substrate as Compression Target
The target of compression is the information substrate I_R (Definition 3.5): the set of all discernible, stable correlations among observables in the regime. A law L is a compact description that, together with the ledger Λ and the regime contract R, enables reconstruction of I_R to within the noise floor. The better the law, the shorter the description of I_R given L.
16.2 What a Good Compression Looks Like
A good compression has three properties: (i) fidelity—the residuals between the law's predictions and the actual correlations are indistinguishable from noise; (ii) parsimony—the law itself has short description length (few parameters, simple functional form); and (iii) stability—the law survives perturbation of the regime parameters and is reproducible across independent observers.
16.3 What Compression Is Not
Compression is not data fitting. A lookup table that stores every data point is a perfect fit but a terrible compression (DL(L) is maximal). An overfitted polynomial that passes through every point achieves zero residual but fails cross-validation. The MDL/Bayesian framework automatically penalizes these pathologies by including the description length of the model itself.
Compression is also not explanation in the causal or mechanistic sense. A compressed law may or may not reveal the mechanism generating the correlations. TCR is agnostic about mechanism at the level of law-discovery; mechanism is a deeper compression (connecting effective laws to microphysical models) that may or may not be achievable.
17 Compression Formalism: MDL and Bayesian Evidence
TCR adopts the minimum description length (MDL) principle as its primary compression criterion, with Bayesian model evidence as an asymptotically equivalent alternative. Both formalize the intuition that the best law is the shortest program that reproduces the data.
17.1 MDL Formulation
Given a data set D drawn from the information substrate I_R, the MDL criterion selects the law L* that minimizes the total description length:
L* = argmin_L { DL(D | L) + DL(L) }   (Eq. 17.1)
where DL(D | L) is the description length of the data given the law (proportional to the negative log-likelihood up to constants) and DL(L) is the description length of the law itself (encoding its functional form and parameter values). The MDL principle originates with Rissanen (1978) and has been extensively developed by Grünwald (2007).
17.2 Bayesian Evidence
The Bayesian model evidence for law L is:
P(D | L) = ∫ P(D | θ, L) · P(θ | L) dθ   (Eq. 17.2)
where the integral is over the parameter space of L with prior P(θ | L). In the large-data limit, −log P(D | L) converges to the MDL objective, with the BIC (Bayesian Information Criterion) providing the leading-order approximation:
BIC = −2 ln P(D | θ̂, L) + k · ln(n)   (Eq. 17.3)
where k is the number of parameters and n the number of data points. TCR uses whichever formulation is more natural for the problem at hand, with the understanding that they agree asymptotically and both implement the same parsimony–fidelity trade-off.
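As a worked example of Eq. 17.3, the sketch below scores polynomial laws of increasing degree on synthetic data generated by a quadratic constraint; the Gaussian plug-in likelihood is an illustrative modeling choice.

```python
import numpy as np

def bic(data, predictions, k):
    """Eq. 17.3 with a Gaussian likelihood and plug-in variance:
    BIC = -2 ln P(D | theta_hat, L) + k ln n."""
    n = len(data)
    sigma2 = max(np.mean((data - predictions) ** 2), 1e-12)
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * log_lik + k * np.log(n)

rng = np.random.default_rng(7)
x = np.linspace(-1, 1, 300)
data = 2.0 * x**2 - 0.5 + rng.normal(0, 0.1, x.size)   # quadratic law + noise

for degree in (1, 2, 8):
    fit = np.polyval(np.polyfit(x, data, degree), x)
    print(f"degree {degree}: BIC = {bic(data, fit, degree + 1):.1f}")
# Degree 2 minimizes BIC: higher degrees chase residual noise but pay k*ln(n).
```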
17.3 Regime-Conditioned Compression
Crucially, all compressions in TCR are conditioned on the regime contract R and the ledger Λ:
L*_{R,Λ} = argmin_L { DL(I_R | L, R, Λ) + λ·DL(L) }
subject to: L is consistent with Λ   (Eq. 17.4)
This conditioning is what makes laws regime-indexed (C16) and ensures that compressions at different regimes are not naively compared without accounting for differences in observables, noise, and coarse-graining.
18 Effective Laws and Drift
Effective laws are the workhorses of TCR: the regime-specific, ledger-consistent compressions that scientists actually use. This section addresses their scope and the phenomenon of drift.
18.1 Domain of Validity
Every effective law has a declared domain of validity: the region in parameter space where the compression achieves its declared fidelity and parsimony. Outside this domain, the law may still compress some information, but with degraded quality. The boundary of the domain of validity is operationally defined: it is the locus where the compression ratio drops below the declared threshold or where residuals develop systematic structure.
18.2 Drift Diagnostics
Drift (Definition 3.7) is diagnosed by tracking the optimal parameters θ*(t) across sequential observation windows. The diagnostic procedure is as follows, with a minimal code sketch after the list:
- Divide the observation period into non-overlapping windows of duration Δt.
- Within each window, solve the compression problem to obtain θ*(t_k) for window k.
- Test {θ*(t_k)} for systematic trends (linear drift, periodic variation, abrupt shifts) against the null hypothesis of statistical constancy.
- If drift is detected, classify it as constraint-driven, regime-boundary, or artifact.
- Constraint-driven drift triggers L1 revision (update the effective law to include time-dependent parameters). Regime-boundary drift triggers regime extension or re-declaration. Artifact drift triggers instrument re-calibration.
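A minimal version of steps 1–3 is sketched below; the one-parameter law, the z-like trend statistic, and the k = 3 threshold are illustrative choices rather than mandated by the framework.

```python
import numpy as np

rng = np.random.default_rng(3)
n_windows, n_per = 20, 200
true_theta = 1.0 + 0.01 * np.arange(n_windows)   # slow constraint-driven drift (synthetic)

theta_hat = np.empty(n_windows)
for k in range(n_windows):
    x = rng.normal(0, 1, n_per)                  # window-k covariate samples
    y = true_theta[k] * x + rng.normal(0, 0.3, n_per)
    theta_hat[k] = (x @ y) / (x @ x)             # per-window least-squares estimate

# Trend test: regress theta_hat on window index; compare the slope to its
# standard error (a z-like statistic with declared threshold k_sigma = 3).
t_idx = np.arange(n_windows)
slope, intercept = np.polyfit(t_idx, theta_hat, 1)
resid = theta_hat - (slope * t_idx + intercept)
se = np.sqrt(resid.var(ddof=2) / ((t_idx - t_idx.mean()) ** 2).sum())
z = slope / se
verdict = "drift detected -> classify it" if abs(z) > 3 else "consistent with constancy"
print(f"fitted slope = {slope:.4f} per window, z = {z:.1f}: {verdict}")
```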
18.3 Drift and Putative Constants
TCR's treatment of drift has implications for so-called fundamental constants (the fine-structure constant α, Newton's gravitational constant G, the speed of light c). Within TCR, these are the parameters of the deepest current compression (the microphysical model family M_μ). Their constancy is an empirical claim, not a metaphysical axiom. The framework provides a disciplined protocol for testing constancy: if drift in a putative constant is detected at declared significance across multiple independent regimes, it is a legitimate physical finding that triggers L3 revision.
Part V
Microphysics as Deepest Compression
19 Microphysics as Model Family
The standard view in physics treats microphysics—quantum field theory, general relativity, the Standard Model—as the “fundamental” level of description from which everything else derives. TCR takes a different position: microphysics is the deepest currently supported compression of the information substrate, distinguished from shallower compressions by scope, invariance, and revision cost, but not by ontological privilege.
19.1 Why Not “Fundamental”?
The designation “fundamental” carries two implications that TCR rejects: (i) that microphysical laws are metaphysically necessary (they are the way reality “really is”); and (ii) that they are unrevisable (future physics will not change them). TCR replaces both with operational criteria: microphysical models are the deepest compressions we can currently audit, and they are revisable under the strictest revision protocol (L3).
19.2 What Makes Microphysics “Deep”?
Microphysical compressions are distinguished by three properties:
- Maximal scope: They apply across the widest range of regimes in the meta-regime family.
- Strictest invariance: They are invariant under the largest symmetry groups (Lorentz, gauge, diffeomorphism).
- Highest revision cost: Revising them triggers cascading re-audits across all shallower layers.
These properties make microphysical laws practically unrevisable (revision is extremely costly), but not metaphysically unrevisable. The distinction is important: it means that claims about the “fundamental nature of reality” are always provisional within TCR.
20 Meta-Regime Selection
Given a meta-regime family R = {R_1, ..., R_K}, the framework selects the optimal compression depth at each regime by minimizing the total description length across the family:
(L*, Λ*, M*_μ) = argmin { Σ_k [ DL(I_{R_k} | L_k, Λ, M_μ) ] + DL(L) + DL(Λ) + DL(M_μ) }   (Eq. 20.1)
This joint optimization over effective laws L, ledger Λ, and microphysical model M_μ is the framework's analog of the “theory of everything” search, recast as a compression problem with explicit costs rather than a metaphysical quest.
21 Drift and Revision at the Deepest Level
If the parameters of the microphysical model (coupling constants, symmetry groups) exhibit drift across the meta-regime family, the framework responds through the L3 revision protocol:
- Confirm that the drift is not attributable to L1 (effective-law revision) or L2 (ledger revision).
- Establish the drift across at least three independent regimes with independent instruments.
- Propose a revised microphysical model that accounts for the drift (e.g., a running coupling constant, a broken symmetry, a new field).
- Re-derive all ledger laws (L2) and effective laws (L1) from the revised microphysics.
- Verify that the revised model does not degrade compression quality in any regime where the old model was adequate.
- Declare the revision with full audit trail, including the falsifiers that would trigger further revision.
22 Consistency with Thermodynamic Ledgers
Any microphysical model in M_μ must be consistent with the thermodynamic ledger Λ. This consistency requirement is not an axiom; it is an empirical constraint derived from the observation that thermodynamic laws have survived longer and across more regimes than any specific microphysical model. In compression terms, the ledger has a shorter description length relative to its scope than any microphysical model, and thus occupies a more robust layer.
Concretely, consistency requires: (i) the microphysical dynamics must produce non-negative entropy production in any process consistent with the model's equations of motion; (ii) the dissipation bounds (LΛ-3) must be derivable from the microphysical model's information-theoretic properties; and (iii) the fluctuation relations (LΛ-4) must emerge from the microphysical dynamics in the appropriate mesoscopic limit.
23 EFT/RG as Compression Hierarchy
Effective field theory (EFT) and the renormalization group (RG) provide the most developed example of TCR's compression hierarchy in existing physics. In TCR's language:
23.1 RG Flow as Compression Flow
The RG flow from UV (short-distance) to IR (long-distance) descriptions is a sequence of coarse-graining operations, each of which projects out short-wavelength degrees of freedom and re-compresses the remaining information substrate. The fixed points of the RG flow are the most stable compressions at their respective scales—they are scale-independent descriptions that compress the correlations at all length scales in their basin of attraction.
23.2 EFT as Regime-Indexed Law
An effective field theory is a regime-indexed law in TCR's sense: it specifies the effective dynamics at a declared energy/length scale, with explicit power-counting rules that quantify the corrections from higher-energy physics. The EFT expansion parameter (the ratio of the observation scale to the cutoff scale) is the regime's declared precision, and the tower of higher-dimension operators is the systematic correction structure.
23.3 Universality as Compression Invariance
Universality—the phenomenon whereby different microphysical models flow to the same RG fixed point—is naturally interpreted in TCR as compression invariance: the effective law at long distances is the same regardless of the microphysical details, because the coarse-graining has eliminated all microphysics-specific information. This is the information-theoretic content of Anderson's “More Is Different” (1972): higher-level laws are autonomous compressions, not merely derivations from lower-level laws.
Part VI
Geometry as Earned Compression Layer
24 Informational Fingerprints of Geometry
Geometry enters TCR not as a foundational assumption but as an earned compression: a description layer that is warranted only when the information substrate exhibits specific structural signatures.
24.1 When Geometry Is Warranted
A geometric description (distances, angles, curvature, manifold structure) is warranted when the information substrate exhibits:
- Distance-like invariants: Correlations decay with a quantity d(x,y) that is symmetric, non-negative, and satisfies d(x,x) = 0.
- Triangle inequality: The candidate distance satisfies d(x,z) ≤ d(x,y) + d(y,z) to within declared precision.
- Dimension consistency: The scaling of the number of independent correlations with the distance scale is consistent with a finite integer dimension d.
- Smooth interpolation: Correlations vary smoothly (in a declared sense) as a function of the candidate distance, enabling calculus-based descriptions.
24.2 Embedding Conditions
When these signatures are present, the information substrate can be embedded in a geometric space (Riemannian manifold, Lorentzian manifold, discrete geometry) with a compression advantage over non-geometric descriptions. The embedding is the most compressed description of the correlation structure, and the geometric quantities (metric, connection, curvature) are derived from the embedding, not assumed a priori.
DL(I_R | L_geom) + DL(L_geom) < DL(I_R | L_non-geom) + DL(L_non-geom)   (Eq. 24.1)
This inequality is the operational test for geometric warrant: the geometric law must compress better than any non-geometric alternative. If it does not, geometry is not warranted at that regime.
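The fingerprint checks of Section 24.1 can also be made operational directly. The sketch below derives a candidate distance from a correlation matrix via the conventional transform d = √(2(1 − corr)) (one choice among several) on a synthetic substrate with a latent one-dimensional geometry, then audits self-distances and the triangle inequality within a declared tolerance.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(5)
positions = np.sort(rng.uniform(0, 5, 12))      # latent 1-D geometry (synthetic)
corr = np.exp(-np.abs(positions[:, None] - positions[None, :]))  # decaying correlations
corr += rng.normal(0, 1e-3, corr.shape)         # noisy estimate of the substrate
corr = (corr + corr.T) / 2                      # symmetrize the estimator
np.fill_diagonal(corr, 1.0)                     # unit self-correlation by convention

d = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))   # candidate distance
tol = 0.05                                             # declared precision

triples = list(permutations(range(len(d)), 3))
violations = sum(1 for a, b, c in triples if d[a, c] > d[a, b] + d[b, c] + tol)
print("self-distances ~ 0:", bool(np.all(np.diag(d) <= tol)))
print(f"triangle-inequality violations: {violations} of {len(triples)} ordered triples")
# Zero violations support a geometric (metric) compression of this substrate;
# systematic violations would mean geometry is not warranted at this regime.
```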
25 Gauge and Equivalence
Gauge symmetries, in TCR's view, are redundancies in the description that do not correspond to physical information. A gauge transformation changes the description without changing the information substrate. The presence of gauge symmetry indicates that the description is “overparameterized”—it uses more parameters than necessary to compress the available information.
TCR treats gauge-fixing as a compression step: among all gauge-equivalent descriptions, the one with the shortest total description length is the most compressed. Different gauge choices may be optimal for different computational or analytic purposes, but they all compress the same information substrate. The physical content is the gauge-invariant compression residual.
Equivalence principles (the equivalence of inertial and gravitational mass, the equivalence of locally accelerated and gravitational frames) are similarly interpreted as compression invariances: they state that certain apparently different physical situations compress to the same effective law, reflecting an underlying symmetry of the information substrate.
Part VII
Empirical Programs
TCR generates five concrete empirical programs, each with preregistered predictions and explicit falsification criteria. These programs are designed to test the framework's commitments at different levels of the compression hierarchy.
26 Program A: Maintenance Signatures in Nonequilibrium Systems
Objective: Test whether actively maintained subsystems exhibit the predicted minimum dissipation rate set by their information-processing requirements (Eq. 14.1).
26.1 Experimental Protocol
Select a set of maintained subsystems spanning biological (bacterial chemotaxis, yeast metabolism, mammalian thermoregulation) and engineered (feedback-controlled chemical reactors, autonomous vehicles, computational systems) domains. For each system:
- Declare the regime contract R, including the feedback loop, noise model, and viability tolerance.
- Measure the actual dissipation rate Ẇ_actual.
- Independently measure the information-processing rate İ_feedback and erasure rate Ṙ_erase.
- Compare Ẇ_actual with the predicted lower bound k_BT · İ_feedback + k_BT · ln(2) · Ṙ_erase.
26.2 Predictions and Falsifiers
TCR predicts that Ẇ_actual ≥ k_BT · İ_feedback + k_BT · ln(2) · Ṙ_erase for all systems, with the bound becoming tight in optimized systems operating near the thermodynamic limit.
Falsifier: a reproducible, cross-checked measurement of Ẇ_actual below the bound of Eq. 14.1 in a system satisfying its declared regime contract would falsify the maintenance-bound commitments (C5, C6, C13).
27 Program B: Geometry Stability Tests
Objective: Test whether geometric descriptions (distance, dimension, curvature) are the most compressed descriptions of correlation structure at declared regimes, and whether geometric stability breaks down at extremes.
27.1 Protocol
For a range of physical systems (condensed matter, cosmological observations, quantum information networks), compute the compression ratio of geometric versus non-geometric descriptions of the correlation structure. Track this ratio as a function of scale, energy, and information density.
27.2 Predictions and Falsifiers
TCR predicts that geometric descriptions are optimal (highest compression ratio) at “classical” scales and may lose optimality at Planck-scale or information-dense regimes where the informational fingerprints of geometry (Section 24.1) degrade.
Falsifier: a regime in which the informational fingerprints of geometry (Section 24.1) are intact yet a non-geometric description is compression-dominant, or vice versa, would falsify the treatment of geometry as an earned compression layer.
28 Program C: Drift in Putatively Fundamental Laws
Objective: Search for systematic drift in the parameters of microphysical models (coupling constants, masses, symmetry-breaking parameters) across cosmological time, spatial location, and energy scale.
28.1 Protocol
Aggregate existing data on putative constant variation (quasar absorption lines for α, atomic clock comparisons for drift rates, collider data for running couplings) and apply TCR's drift diagnostic protocol (Section 18.2) with declared regime contracts.
28.2 Predictions and Falsifiers
TCR does not predict drift; it provides a disciplined framework for detecting and classifying it. If drift is detected, it is a physical finding, not a crisis. If no drift is detected, the constancy of the parameters is confirmed at the declared precision.
Falsifier: drift established at declared significance across independent regimes that resists classification as constraint-driven, regime-boundary, or artifact under the protocol of Section 18.2 would falsify the framework's drift taxonomy (C17).
29 Program D: Thermodynamic Ledger Revision
Objective: Identify regimes where the current thermodynamic ledger is incomplete or requires revision, particularly in quantum thermodynamics, strong-coupling regimes, and information-dominated systems.
29.1 Protocol
In regimes where standard thermodynamic definitions (temperature, entropy, free energy) become ambiguous (strong coupling, small systems, quantum coherence), apply TCR's ledger formalism to construct operational definitions and test whether the standard ledger laws (LΛ-1 through LΛ-4) hold with appropriate generalizations.
29.2 Predictions and Falsifiers
TCR predicts that the ledger laws hold in generalized form (non-equilibrium entropy production, quantum fluctuation relations) but may require modified estimators in extreme regimes.
Falsifier: a regime in which no generalized estimator restores the ledger laws (LΛ-1 through LΛ-4) without degrading total compression would falsify the ledger's standing as the most stable constraint layer.
30 Program E: Microphysics Revision
Objective: Establish the conditions under which the current microphysical model family (Standard Model + general relativity) would require revision under TCR's criteria.
30.1 Protocol
Identify the most promising anomalies in current physics (dark energy, dark matter, neutrino masses, the hierarchy problem, the cosmological constant problem) and classify each within TCR's revision framework: is it an L1 issue (effective-law update within the current microphysics), an L2 issue (ledger revision), or a genuine L3 trigger (microphysics revision)?
30.2 Predictions and Falsifiers
TCR predicts that most current anomalies will be resolvable at L1 or L2 (effective-law updates or ledger extensions), with genuine L3 revision triggered only by cross-regime, cross-ledger inconsistencies that resist resolution at lower levels.
Falsifier: an anomaly resolvable only at L3 despite lacking cross-regime, cross-ledger character, or a preponderance of current anomalies demonstrably requiring L3 revision, would falsify the predicted ordering of revision costs.
31 Summary of Empirical Programs
Table 31.1 — Empirical Program Summary
| Program | Tests | Revision Level | Key Observable |
|---|---|---|---|
| A: Maintenance Signatures | C5, C6, C13 — dissipation bounds for maintained systems | L2 (ledger) | Ẇ_actual vs. information-theoretic bound |
| B: Geometry Stability | Geometry as earned compression layer | L1 (effective law) | Geometric vs. non-geometric compression ratio |
| C: Fundamental Drift | C17, C18 — constancy of microphysical parameters | L3 (microphysics) | Parameter drift across regimes and epochs |
| D: Ledger Revision | C6, C18 — completeness of thermodynamic ledger | L2 (ledger) | Ledger consistency in extreme regimes |
| E: Microphysics Revision | C18 — adequacy of deepest compression | L3 (microphysics) | Cross-regime, cross-ledger anomalies |
Part VIII
Discussion and Conclusion
32 What the Framework Claims
TCR makes the following positive claims:
- Scientific laws are compressions of boundary-accessible information, not governing rules imposed on nature from outside.
- All law-claims are regime-indexed: their validity is bounded by declared observables, noise models, coarse-graining, and resource constraints.
- Thermodynamics occupies a distinguished but revisable role as the most stable constraint/ledger layer of the compression hierarchy.
- Microphysics is the deepest currently audited compression, distinguished by scope and invariance demands, not by metaphysical privilege.
- Geometry is an earned compression layer, warranted by informational fingerprints, not assumed a priori.
- The framework provides a unified audit trail connecting effective laws, thermodynamic ledgers, and microphysical hypotheses through a common compression formalism.
33 What the Framework Refuses to Claim
TCR explicitly refuses the following claims:
- No ontological claim about “what reality really is.” TCR is a framework for organizing, auditing, and revising scientific descriptions. It does not claim that reality is information, or that compression is the fundamental process. It claims only that compression is the most disciplined way to organize our descriptions.
- No claim of completeness. TCR does not claim to be the final metatheory. It is itself a compression of metascientific experience, subject to revision by a yet-deeper framework if one proves more compressive.
- No claim that current physics is wrong. TCR is designed to be consistent with all established physics. It re-interprets the epistemological status of physical laws without changing their content.
- No prediction of specific new physics. TCR provides a framework for organizing law-discovery and revision, not a specific prediction of what the next physical theory will look like.
34 Limitations
We acknowledge four principal limitations of the framework in its current form:
- Computational tractability. The full compression optimization (Eq. 20.1) is computationally intractable for realistic systems. Practical applications will require approximations, heuristics, and hierarchical decomposition. The framework provides the exact objective but not always a feasible algorithm for achieving it.
- Regime contract specification. Declaring a complete regime contract requires deep domain expertise and may itself be subject to errors of omission. The framework provides the structure for the contract but not automated methods for generating it.
- Ledger completeness. The thermodynamic ledger library (Section 13) is based on current thermodynamic knowledge. Extensions to quantum thermodynamics, strong-coupling regimes, and cosmological thermodynamics are active research areas, and the ledger may require significant expansion.
- Empirical bootstrapping. The framework's commitments are themselves empirical claims about the structure of scientific knowledge. They are supported by the history of physics and metascience, but they are not derivable from first principles. TCR is empirically grounded meta-theory, not a priori philosophy.
35 Relationship to Established Physics and Philosophy
TCR draws on and synthesizes ideas from multiple established traditions while maintaining a distinctive position:
Table 35.1 — Intellectual Lineage
| Tradition | TCR's Relation |
|---|---|
| Statistical mechanics (Jaynes, Zwanzig) | TCR generalizes the MaxEnt and Mori–Zwanzig formalisms into a full compression framework with explicit regime indexing and revision protocols. |
| Minimum description length (Rissanen, Grünwald) | TCR adopts MDL as its compression criterion and extends it to the hierarchical, multi-regime setting of physical law-discovery. |
| Free energy principle (Friston) | TCR shares the emphasis on self-organizing systems maintaining themselves against perturbation but replaces the universal Markov blanket assumption with regime-specific informational boundaries. |
| Information thermodynamics (Landauer) | TCR treats erasure costs and maintenance bounds as ledger-relative constraints and uses dissipation limits as falsifiable bookkeeping laws rather than metaphysical bedrock. |
| Effective field theory / RG | TCR reinterprets EFT and RG as a compression hierarchy, with universality as compression invariance (Section 23). |
| Structural realism (Ladyman, French) | TCR is compatible with structural realism but adds operational content: structures are not merely preserved across theory change; they are the most compressed invariants of the information substrate. |
| Pragmatism (Peirce, Dewey, van Fraassen) | TCR shares the pragmatist emphasis on inquiry and fallibilism but adds quantitative compression criteria and explicit revision protocols. |
| Dissipative structures (Prigogine) | TCR incorporates Prigogine's insights on far-from-equilibrium self-organization as instances of its maintenance-mode persistence (C4, C5). |
| Viability and safe control (Ames, Tabuada) | TCR aligns with control-theoretic viability by treating admissible trajectories and protected sets as regime-bounded maintenance constraints rather than merely controller design targets. |
| More Is Different (Anderson) | TCR provides a formal instantiation of Anderson's insight: higher-level laws are autonomous compressions, not merely derivations from lower-level descriptions. |
36 Conclusion
The Theory of Compressive Realism proposes that the conceptual unity of science is best understood through the lens of information compression. Laws, ledgers, and microphysical hypotheses are all compressions of accessible information at different depths, distinguished by scope, invariance demands, and revision cost—not by ontological privilege.
Information is the glue: it connects the physical layer (constraints, fluxes, dissipation) to the epistemic layer (observations, models, predictions) through a single formalism that is both quantitative (MDL/Bayesian evidence) and auditable (regime contracts, revision protocols, falsifiers). The unified audit trail ensures that every law-claim, from an effective chemical rate law to the Standard Model of particle physics, is accompanied by its regime, its ledger, its compression quality metrics, and its revision criteria.
The framework does not replace physics with philosophy or information theory with experiment. It provides a meta-structure within which the practice of physics—the cycle of observation, compression, prediction, and revision—can be made explicit, disciplined, and auditable. Science, in this view, is the most stable compression of our interaction with an open, non-equilibrium reality.
This observer-relative structure matters even more in an AI-native era. As machine observers exceed human investigators in bandwidth, memory, and compression search, TCR allows them to earn law-claims that humans did not anticipate and may not immediately understand. What matters is not familiarity but whether the proposed law is the best audited compression of the accessible data under a declared regime contract. In that sense, AI is not outside the theory; it is a new observer class operating inside the same revision discipline.
Part IX
Interpretive Extensions
The following extensions apply TCR's core machinery to specific physical domains. Each extension is stated as a set of theorems (derived from the 18 commitments) with explicit falsifiers. These are interpretive: they show how existing physical phenomena are re-described within TCR, not predictions of new physics. However, the falsifiers convert each interpretation into a testable claim.
F Objects as Earned Compressions
Everyday objects—tables, chairs, atoms, molecules—are not ontological primitives in TCR. They are earned compressions: stable, reproducible patterns in the information substrate that achieve compression advantage over alternative descriptions.
Theorem 1: Object Persistence as Compression Stability
A subsystem S described by observables Y_S persists as an “object” within regime R if and only if: (i) the compression L_S of the correlations among Y_S has a description length strictly shorter than the description of the raw correlations: DL(I_S | L_S) + DL(L_S) < DL(I_S); (ii) L_S is stable across perturbations of R within declared tolerance; and (iii) L_S persists across sequential observation windows within the viability tolerance.
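A toy check of condition (i), using compressed byte length as a crude stand-in for the DL terms; the zlib proxy is an illustrative choice, not part of the theorem:

```python
import zlib
import numpy as np

def dl_bits(arr, precision=3):
    """Crude description-length proxy: zlib-compressed size, in bits,
    of a quantized serialization of the array."""
    payload = np.round(np.asarray(arr, float), precision).tobytes()
    return 8 * len(zlib.compress(payload, level=9))

def compression_advantage(raw_correlations, residuals_given_law, law_params):
    """Theorem 1(i): DL(I_S | L_S) + DL(L_S) < DL(I_S)."""
    lhs = dl_bits(residuals_given_law) + dl_bits(np.asarray(law_params))
    rhs = dl_bits(raw_correlations)
    return lhs < rhs, lhs, rhs
```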
Theorem 2: Object Boundaries from Information Minimization
The “natural” boundary of an object is the informational partition that minimizes the total description length of the subsystem plus its coupling to the environment: b* = argmin_b { DL(I_μ | L_μ) + DL(L_μ) + DL(I_{μη} | b) }, where I_μ is the internal information, L_μ the internal law, and I_{μη} the cross-boundary correlations. Objects with “sharp” boundaries have low cross-boundary information relative to internal information.
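For small variable sets, b* can be located by exhaustive bipartition search; a sketch at toy sizes, again with a compressed-length proxy for DL and omitting the DL(L_μ) model-cost term for brevity:

```python
import itertools
import zlib
import numpy as np

def dl_bits(arr):
    """zlib-compressed size in bits, as a stand-in for DL."""
    return 8 * len(zlib.compress(np.asarray(arr, np.float32).tobytes(), 9))

def best_boundary(corr):
    """Exhaustive bipartition search minimizing internal-plus-cross-boundary
    description length, per Theorem 2; feasible only for ~15 variables."""
    n = corr.shape[0]
    best_mu, best_cost = None, np.inf
    for k in range(1, n):
        for mu in itertools.combinations(range(n), k):
            eta = [i for i in range(n) if i not in mu]
            cost = dl_bits(corr[np.ix_(mu, mu)]) + dl_bits(corr[np.ix_(mu, eta)])
            if cost < best_cost:
                best_mu, best_cost = mu, cost
    return best_mu, best_cost
```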
Falsifier: a reproducibly persisting subsystem whose best available description achieves no compression advantage over the raw correlations (violating condition (i) of Theorem 1), or whose operative boundary systematically differs from the description-length-minimizing partition b* of Theorem 2, would falsify the objects-as-earned-compressions account within the declared regime.
G Light and Radiation as Audited Channels
In TCR, light and electromagnetic radiation are the primary channels through which information about distant subsystems reaches observers. The properties of light—finite speed, wave-particle duality, spectral structure—are constraints on the information channel, not metaphysical properties of a substance.
Theorem 3: Speed of Light as Channel Capacity Bound
The finite speed of light c imposes a maximum information rate between spatially separated subsystems. Within TCR, c is the parameter of the deepest current compression (microphysics) that bounds the channel capacity of electromagnetic communication. Its constancy is an empirical claim at L3 depth, testable via Program C (drift diagnostics).
Theorem 4: Spectral Structure as Correlation Fingerprint
The discrete spectral lines of atoms and molecules are compressed descriptions of the stable correlations generated by the quantum-mechanical constraints (energy quantization, selection rules) on the electromagnetic channel. Each spectral line is a compression: a few parameters (frequency, width, intensity) summarize a stable correlation pattern in the emitted radiation.
Falsifier: a stable spectral line whose declared parameters correspond to no reproducible correlation structure in the source, after accounting for the declared noise and instrument models, would falsify Theorem 4; a demonstrated information channel whose capacity exceeds the bound set by c would falsify Theorem 3.
H Gravity, Space, and Geometric Emergence
Gravity and spatial geometry are among the most deeply compressed features of the information substrate. TCR treats them as earned geometric compressions at L3 depth, with the caveat that the geometric description may not be optimal at all scales.
Theorem 5: Gravitational Law as Geometric Compression
General relativity is the most compressed description of the correlations among massive bodies and radiation at the classical, macroscopic regime. The metric tensor g_μν is the compression's central object: it encodes distance, causality, and gravitational dynamics in a single geometric structure. The Einstein field equations are the stationarity conditions for this compression.
Theorem 6: Geometric Compression Breakdown
The geometric compression (Theorem 5) is predicted to lose optimality when the information substrate exhibits violations of the geometric fingerprints (Section 24.1): specifically, when curvature fluctuations at scale ℓ reach the Planck regime (ℓ ~ ℓ_P), the smooth-manifold description fails to satisfy the smooth-interpolation criterion, and a non-geometric (e.g., discrete, algebraic, or information-theoretic) compression may achieve shorter description length.
Falsifier: a demonstration that the smooth-geometric compression remains optimal in regimes where its informational fingerprints (Section 24.1) demonstrably fail, or that it loses optimality at ordinary classical scales, would falsify Theorems 5 and 6 as stated.
I Time, Causality, and Ledger Emergence
Time and causality occupy a special role in TCR: they are entangled with the thermodynamic ledger (entropy production provides the arrow of time) and with the observer's memory (which defines the distinction between past and future).
Theorem 7: Arrow of Time as Ledger Asymmetry
The thermodynamic arrow of time (the direction of increasing entropy) is not a property of the microphysical dynamics (which may be time-reversible) but an emergent feature of the thermodynamic ledger applied to coarse-grained descriptions. The arrow emerges because coarse-graining is an irreversible information loss: the data processing inequality ensures that the coarse-grained entropy can only increase or remain constant, even when the fine-grained entropy is constant.
Theorem 8: Causality as Compression Order
Causal order between events A and B is the compression statement that including A in the conditioning set reduces the description length of B: DL(B | A, R) < DL(B | R). Causal structure is thus an emergent feature of the compression hierarchy, not an a priori metaphysical relation. The causal graph is the most compressed representation of the conditional independence structure of the information substrate.
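The inequality can be estimated with any predictive model by comparing code lengths for B with and without A in the conditioning set; a Gaussian-code sketch with illustrative names:

```python
import numpy as np

def code_length_bits(y, y_pred, sigma):
    """Shannon code length of y under a Gaussian predictive model (bits)."""
    nll_nats = 0.5 * np.log(2 * np.pi * sigma**2) + (y - y_pred)**2 / (2 * sigma**2)
    return float(np.sum(nll_nats) / np.log(2))

def causal_advantage_bits(b, pred_given_a, pred_baseline, sigma):
    """Theorem 8 score: positive when DL(B | A, R) < DL(B | R), i.e. when
    conditioning on A shortens the description of B."""
    return (code_length_bits(b, pred_baseline, sigma)
            - code_length_bits(b, pred_given_a, sigma))
```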
Falsifier: a reproducible case in which conditioning on a well-confirmed cause fails to shorten the description of its effect (DL(B | A, R) ≥ DL(B | R)), or a coarse-grained entropy decrease under a declared, leakage-audited coarse-graining, would falsify Theorems 7 and 8.
J Black Holes, Horizons, and Singularities
Black holes are among the most extreme regimes accessible to TCR's analysis. They combine strong gravity (deep geometric compression), extreme thermodynamics (Bekenstein–Hawking entropy, Hawking radiation), and fundamental information questions (the information paradox).
J.1 Horizons as Informational Boundaries
In TCR, a black hole horizon is an extreme informational boundary: the leakage budget δ_B approaches zero in the classical limit (no information escapes), making the horizon a nearly perfect partition between tracked (exterior) and untracked (interior) degrees of freedom. The Bekenstein–Hawking entropy S_BH = A/(4ℓ_P²) is the compression of the cross-boundary correlation structure: it counts the maximum number of independent correlations that the exterior observer can resolve given the area of the boundary.
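For orientation, a direct evaluation of S_BH = A/(4ℓ_P²) for a solar-mass black hole illustrates the scale of this correlation count:

```python
import numpy as np

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI values
l_P2 = hbar * G / c**3                        # Planck length squared, m²
M_SUN = 1.989e30                              # kg

def s_bh_in_kb(mass_kg):
    """Bekenstein–Hawking entropy in units of k_B: S = A / (4 ℓ_P²)."""
    r_s = 2 * G * mass_kg / c**2              # Schwarzschild radius
    area = 4 * np.pi * r_s**2                 # horizon area
    return area / (4 * l_P2)

print(f"S_BH(1 solar mass) ≈ {s_bh_in_kb(M_SUN):.1e} k_B")   # ≈ 1e77 k_B
```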
J.2 Singularities as Compression Failures
A singularity (in the classical general-relativistic sense) is a point where the geometric compression breaks down: curvature invariants diverge, meaning the smooth-manifold description requires infinite description length to encode the local geometry. In TCR, this is not a statement about reality; it is a statement about the inadequacy of the geometric compression at that point. The singularity is a compression failure that triggers regime exit and signals the need for a deeper compression (quantum gravity, whatever form it takes).
J.3 The Information Paradox
The black hole information paradox—the apparent conflict between unitary quantum evolution and information loss at the horizon—is re-framed in TCR as a cross-regime inconsistency. The quantum-mechanical description (unitary evolution, no information loss) and the geometric description (classical horizon, information trapped) are compressions at different levels. The paradox arises from demanding that both compressions hold simultaneously in a regime (the evaporating black hole) where their domains of validity overlap imperfectly. TCR predicts that the resolution lies in a deeper compression (L3 revision) that subsumes both the geometric and quantum descriptions.
K/L Wormholes and Topological Shortcuts
Wormholes (Einstein–Rosen bridges, traversable wormholes) are topological features of the geometric compression that connect spatially separated regions. In TCR, their status depends on whether they correspond to stable compressions of the information substrate.
Theorem 9: Wormhole Viability Criterion
A wormhole solution of the geometric compression (general relativity) corresponds to a physical feature if and only if: (i) it is a stable compression (survives perturbation of the regime parameters); (ii) it is consistent with the thermodynamic ledger (does not enable violations of LΛ-2); and (iii) its information-channel properties (traversability, bandwidth, latency) are consistent with the microphysical model's constraints on information transfer.
The ER=EPR conjecture (Einstein–Rosen bridges are equivalent to Einstein–Podolsky–Rosen entanglement) translates naturally into TCR: both are descriptions of cross-boundary correlations in the information substrate, one in geometric language and one in quantum-information language. Their equivalence, if validated, would be a compression invariance—two descriptions of the same information substrate that compress equally well.
Falsifier: a confirmed wormhole phenomenon violating any clause of Theorem 9 (instability under regime perturbation, inconsistency with ledger law LΛ-2, or information transfer beyond the microphysical channel constraints) would falsify the viability criterion; a validated ER=EPR correspondence in which the two descriptions compress the same correlations unequally would falsify the conjectured compression invariance.
M Quantum Entanglement as Cross-Boundary Correlation
Quantum entanglement is one of the most distinctive features of quantum mechanics. In TCR, entanglement is re-described as a specific type of cross-boundary correlation in the information substrate.
Theorem 12: Entanglement as Non-Classical Correlation Structure
Entanglement between subsystems A and B is a correlation in the information substrate that cannot be compressed by any classical (local hidden variable) model. Formally: DL(I_AB | L_classical) + DL(L_classical) > DL(I_AB | L_quantum) + DL(L_quantum) for all classical models L_classical. The compression advantage of the quantum description over the classical one is the information-theoretic content of entanglement.
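A CHSH computation is the standard witness for this compression gap: every local hidden-variable model obeys |S| ≤ 2, while singlet-state correlators reach 2√2. A minimal check:

```python
import numpy as np

def chsh(e_ab, e_abp, e_apb, e_apbp):
    """CHSH combination of four measured correlators; |S| <= 2 for every
    local hidden-variable (classical) model."""
    return e_ab + e_abp + e_apb - e_apbp

# Singlet-state correlator E(a, b) = -cos(a - b) at the optimal settings
E = lambda a, b: -np.cos(a - b)
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = chsh(E(a, b), E(a, bp), E(ap, b), E(ap, bp))
print(abs(S))   # 2*sqrt(2) ≈ 2.83 > 2: no classical model reproduces these
```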
Theorem 13: No-Signaling as Compression Constraint
The no-signaling theorem (entanglement cannot be used for superluminal information transfer) is a consequence of the fact that the cross-boundary correlations encoded by entanglement are symmetric: they do not preferentially compress the information substrate of one subsystem given the other. Marginalization over either subsystem leaves the other's information substrate unchanged, which is the information-theoretic statement of no-signaling.
Falsifier: a classical (local hidden variable) model that compresses entangled correlations at least as well as the quantum description, contradicting Theorem 12, or a reproducible use of entanglement for signaling, contradicting Theorem 13, would falsify the cross-boundary correlation account.
N Unified Audit Posture
The interpretive extensions (F through M) collectively demonstrate TCR's unified audit posture: every physical phenomenon, from everyday objects to black holes to quantum entanglement, is described within the same framework of regime-indexed, ledger-consistent, falsifiable compressions. The audit trail for each phenomenon includes:
- The regime contract under which the description is valid.
- The compression depth (L1, L2, or L3) at which the description operates.
- The thermodynamic ledger entries on which the description depends.
- The explicit falsifiers that would trigger revision.
- The relationship to descriptions at adjacent compression depths.
This uniformity is TCR's central contribution: not a new physical theory, but a metatheoretical framework that makes the practice of physics—its assumptions, its limits, its revision criteria—fully explicit and auditable.
Part X
Synthesis
37 End-to-End Decision Procedure
The following pseudocode summarizes the complete TCR analysis pipeline, from data acquisition to law declaration, revision, and audit.
PROCEDURE TCR_Analysis(system, data):
1. DECLARE regime contract R = (Y, S, π, N, B, U)
- Specify observables, sampling, coarse-graining, noise, boundary, resources
- Validate regime contract completeness
2. CONSTRUCT informational boundary b
- Choose partition (μ, η) with leakage budget δ_B
- Verify I(μ; η | b) ≤ δ_B
3. BUILD information substrate I_R
- Compute all pairwise correlations C(y_i, y_j)
- Filter by detection threshold: retain only SNR(C) ≥ τ_detect
- Check stability across independent sampling episodes
4. ADOPT thermodynamic ledger Λ
- Select ledger laws {LΛ-1, …, LΛ-4} appropriate to regime
- Declare estimators and noise model for ledger quantities
5. COMPRESS: solve L* = argmin { DL(I_R | L, R, Λ) + λ·DL(L) }
- Search model library M for candidate laws
- Evaluate MDL/Bayesian evidence for each candidate
- Select L* with cross-validation, bootstrap, negative controls
6. VALIDATE compression
- Check residuals for systematic structure
- Verify stability under regime-parameter perturbation
- Confirm reproducibility across independent observers
- Test consistency with ledger Λ
7. CHECK for drift
- Repeat compression across sequential time windows
- Test θ*(t_k) for systematic trends
- Classify any drift (constraint-driven / regime-boundary / artifact)
8. DECLARE law with full audit trail
- Law statement L* with parameters and uncertainties
- Regime contract R under which L* is valid
- Compression quality metrics (DL, BIC, cross-validation score)
- Falsifiers: conditions that would trigger revision
- Revision level (L1, L2, or L3) at which revision would occur
9. IF revision triggered:
- L1: update effective law within regime
- L2: revise ledger, re-audit all dependent L1 laws
- L3: revise microphysics, re-audit all L2 and L1 layers
RETURN (L*, R, Λ, audit_trail)
38 Claim Registry Template
Every TCR-compliant claim is registered in the following format, ensuring auditability and reproducibility:
Table 38.1 — Claim Registry Template
| Field | Content |
|---|---|
| Claim ID | Unique identifier (e.g., TCR-2026-001) |
| Law statement | Mathematical or algorithmic description of L* |
| Regime contract | Full specification of R = (Y, S, π, N, B, U) |
| Ledger dependencies | Which ledger laws the claim depends on |
| Compression depth | L1 (effective), L2 (ledger), or L3 (microphysics) |
| Compression metrics | DL(I_R | L*), DL(L*), BIC, evidence ratio, cross-validation score |
| Stability report | Results of perturbation, bootstrap, and reproducibility tests |
| Drift status | No drift / drift detected (type, magnitude, classification) |
| Falsifiers | Explicit conditions that would trigger revision |
| Revision history | Log of all prior revisions with dates, triggers, and audit results |
| Data availability | Pointer to archived data and analysis code |
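In machine-readable form the template maps naturally onto a record type; a minimal sketch, with field types as our assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class ClaimRegistryEntry:
    claim_id: str                 # e.g. "TCR-2026-001"
    law_statement: str            # mathematical/algorithmic form of L*
    regime_contract: dict         # full R = (Y, S, π, N, B, U)
    ledger_dependencies: list     # ledger laws the claim depends on
    compression_depth: str        # "L1" | "L2" | "L3"
    compression_metrics: dict     # DL terms, BIC, evidence ratio, CV score
    stability_report: dict        # perturbation / bootstrap / reproducibility
    falsifiers: list              # explicit revision triggers
    drift_status: str = "no drift"
    revision_history: list = field(default_factory=list)
    data_availability: str = ""   # pointer to archived data and code
```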
39 Revision Governance
Revision is not ad hoc; it is governed by explicit rules that prevent both premature revision (abandoning well-compressed laws based on noise) and excessive conservatism (refusing to revise despite compelling evidence).
39.1 L1 Governance
L1 revisions (effective-law updates) are the lowest-cost revisions and follow standard scientific practice: when a new model compresses the information substrate better than the incumbent (by the declared significance threshold), and survives cross-validation and negative controls, it replaces the incumbent within the regime. L1 revisions do not propagate beyond their regime.
39.2 L2 Governance
L2 revisions (ledger updates) require cross-regime evidence: the anomaly triggering revision must be observed across at least two independent regimes in the meta-regime family. Upon revision, all dependent L1 laws must be re-audited. L2 revisions are logged with full provenance, including the data, analysis, and the specific ledger law revised.
39.3 L3 Governance
L3 revisions (microphysics updates) are the highest-cost revisions and require the strongest evidence: cross-regime, cross-ledger anomalies confirmed by independent groups. Upon revision, the entire compression hierarchy (all L2 and L1 layers) must be re-audited against the new microphysical model. L3 revisions are rare—paradigm shifts in Kuhn's language—and the framework's contribution is to make the criteria for such shifts explicit and quantitative rather than sociological.
40 Anti-Goalpost Guarantees
TCR includes explicit anti-goalpost mechanisms to prevent the framework from degenerating into an unfalsifiable meta-narrative:
- Every commitment is individually testable. Each of the 18 commitments (C1–C18) can be independently tested against empirical evidence. The framework stands or falls on the conjunction of its commitments, not on untestable axioms.
- Every empirical program has preregistered falsifiers. The five empirical programs (A–E) specify in advance what observations would falsify the framework's predictions, at what significance level, and with what number of independent confirmations.
- The framework is self-applicable. TCR is itself a compression of metascientific experience. If a rival metaframework compresses the same experience with shorter description length, TCR should be revised or replaced.
- No layer is immune. Commitment C18 explicitly prohibits granting any description layer—including the framework's own commitments—metaphysical immunity from revision.
41 Minimal Reproducible Package
For any TCR-compliant analysis to be reproducible, the following minimal package must be provided:
- Regime contract R — fully specified, version-controlled.
- Data archive — the raw and processed data comprising the information substrate I_R.
- Ledger declaration Λ — the thermodynamic ledger adopted, with version and estimator specifications.
- Model library M — the set of candidate models considered, with their complexity measures.
- Compression code — the software implementing the MDL/Bayesian compression and model selection.
- Validation reports — cross-validation, bootstrap, perturbation, and negative-control results.
- Claim registry entry — the completed claim registry template (Table 38.1).
- Drift diagnostics — results of the drift detection protocol for sequential observation windows.
This package ensures that any independent researcher can reproduce the analysis, verify the compression quality, and audit the claim against the declared regime and ledger.
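A completeness check over such a package is straightforward to automate; the artifact names below are illustrative, not a prescribed layout:

```python
from pathlib import Path

REQUIRED_ARTIFACTS = [
    "regime_contract.yaml",   # R, version-controlled
    "data",                   # raw and processed substrate I_R
    "ledger.yaml",            # Λ with estimator specifications
    "model_library",          # candidate models M with complexity measures
    "compression",            # MDL/Bayesian selection code
    "validation",             # CV, bootstrap, perturbation, negative controls
    "claim_registry.json",    # completed Table 38.1 entry
    "drift_report.json",      # sequential-window drift diagnostics
]

def package_complete(root):
    """Return (is_complete, missing) for a candidate TCR claim package."""
    root = Path(root)
    missing = [name for name in REQUIRED_ARTIFACTS if not (root / name).exists()]
    return not missing, missing
```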
References
- Rissanen, J. (1978). Modeling by Shortest Data Description. Automatica, 14(5), 465–471.
- Grünwald, P. D. (2007). The Minimum Description Length Principle. MIT Press.
- Jaynes, E. T. (2003). Probability Theory: The Logic of Science. Cambridge University Press.
- Friston, K. (2010). The Free-Energy Principle: A Unified Brain Theory? Nat. Rev. Neurosci., 11, 127–138.
- Zwanzig, R. (2001). Nonequilibrium Statistical Mechanics. Oxford University Press.
- Ladyman, J., Ross, D., Spurrett, D., & Collier, J. (2007). Every Thing Must Go: Metaphysics Naturalized. Oxford University Press.
- French, S. (2014). The Structure of the World: Metaphysics and Representation. Oxford University Press.
- Peirce, C. S. (1878). How to Make Our Ideas Clear. Popular Science Monthly, 12, 286–302.
- Dewey, J. (1938). Logic: The Theory of Inquiry. Henry Holt and Company.
- van Fraassen, B. C. (1980). The Scientific Image. Oxford University Press.
- van Kampen, N. G. (2007). Stochastic Processes in Physics and Chemistry (3rd ed.). Elsevier.
- Prigogine, I., & Stengers, I. (1984). Order Out of Chaos: Man's New Dialogue with Nature. Bantam Books.
- Anderson, P. W. (1972). More Is Different. Science, 177(4047), 393–396.
- Landauer, R. (1961). Irreversibility and Heat Generation in the Computing Process. IBM J. Res. Dev., 5(3), 183–191.
- Ames, A. D., Xu, X., Grizzle, J. W., & Tabuada, P. (2017). Control Barrier Function Based Quadratic Programs for Safety Critical Systems. IEEE Trans. Autom. Control, 62(8), 3861–3876.
- Weinberg, S. (1995). The Quantum Theory of Fields, Vol. I: Foundations. Cambridge University Press.
© 2026 Vareon Inc. and Vareon Limited. All Rights Reserved.