Related News · May 2026

Correlated Decoherence
as Bilateral Crossing Propagation

A Note on Quantum Processor Noise
and the Relational Structure of Measurement
Dunstan Low — A Philosophy of Time, Space and Gravity — ontologia.co.uk

A measurement is a crossing event at \(\tau_0\).
Its egress record becomes the ingress potential of every neighbouring crossing.
Correlated decoherence is not a bug. It is the relational structure of reality.

I. The Problem in Quantum Computing

As quantum processors scale toward and beyond one thousand qubits, engineers observe a class of noise that standard error models do not account for: errors that are correlated in time, contextually dependent on which qubits were measured first, and non-Markovian — meaning the future error rate depends on the history of past measurements, not just on the present state. In Markovian noise models, each qubit decoheres independently. What is observed at scale is different: a single measurement event appears to influence the decoherence behaviour of neighbouring qubits in a structured, causal cascade.

This is treated as an engineering problem — an unwanted source of correlated noise to be suppressed. The bilateral framework offers a different interpretation: it is not unwanted. It is the expected behaviour of a measurement apparatus operating in a universe that is fundamentally relational.

II. What the Bilateral Framework Says About Measurement

In the bilateral framework, a quantum measurement is a bilateral crossing event. Before measurement, a superposition is an ingress-face state: potential, unwritten, holding multiple outcomes simultaneously. Measurement is the transition from ingress to egress — the actualisation of one outcome at \(\tau_0\), the crossing point where future meets past (Axiom A3).

The critical point is what happens next. The egress record of one crossing — the written, actualised outcome — immediately becomes part of the potential landscape for every subsequent crossing in its causal neighbourhood. The bilateral mesh is relational (Axiom A1: existence is relational; no object exists independently of all others; every state is defined by its intersections). A written crossing record is not isolated; it is woven into the potential face of all neighbouring crossings via the bilateral mesh.

One superposition collapsing is one crossing event at \(\tau_0\). That collapse then triggers a causal chain reaction — because the egress face of one crossing becomes the ingress face of subsequent crossings along the ladder.

This is not collapse propagating instantaneously across the system. It is a localised crossing that writes a record, and that record then structurally constrains the possible outcomes of neighbouring crossings — forward in \(\tau\), causally, at a rate determined by the bilateral ladder and the shape operator.

III. The Mechanism

The bilateral framework identifies three structures that govern how a crossing record propagates through the mesh:

Bilateral structure → In quantum computing terms

Egress record at \(\tau_0\) — the actualised outcome of a crossing, written on the past face
→ A qubit measurement outcome — written, irreversible, now part of the classical record

Ingress potential — the as-yet-unactualised face of neighbouring crossings, shaped by adjacent egress records
→ The modified decoherence landscape of neighbouring qubits — their error probabilities now conditioned on the first measurement

The shape operator \(\mathcal{S}(n)\) — determining which bilateral ladder dominates at each rung, and how far its prominence extends
→ The correlation length of error propagation — how many qubits away a single measurement event exerts measurable influence on decoherence rates

Prime gaps as resonant cavities — the bilateral mesh has irregular spacing between stable crossing points, following the prime gap distribution
→ Non-uniform correlation structure — errors do not propagate uniformly; they cluster and gap in a pattern related to the prime structure of the crossing spectrum

The non-Markovian character of the noise follows directly. In a Markovian model, the future depends only on the present state. In the bilateral framework, the future depends on the entire accumulated egress record — because A3 is irreversible (\(\tau\) monotonically increasing) and the egress record persists as the written history that shapes all subsequent ingress potentials. The processor's noise is non-Markovian because the universe is non-Markovian. It has a \(\tau\)-direction. It accumulates a record. It does not forget.
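The distinction can be made concrete with a toy model. The sketch below is purely illustrative — all parameters (`P_BASE`, `COUPLING`, `MEMORY_TAU`) are hypothetical placeholders, not quantities derived from the framework. It contrasts a Markovian error rate, which ignores history entirely, with a record-dependent rate that grows with the accumulated egress record and relaxes as those records recede into the past:

```python
import math

# Hypothetical parameters for illustration only (not derived from the framework):
P_BASE = 0.001      # baseline per-step error probability
COUPLING = 0.0005   # contribution of each past measurement record
MEMORY_TAU = 5.0    # decay scale of a record's influence, in steps

def markovian_rate(step, history):
    """Markovian model: the past record is ignored entirely."""
    return P_BASE

def record_dependent_rate(step, history):
    """Non-Markovian model: every past measurement written at step t
    still contributes, weighted by how long ago it was written."""
    memory = sum(math.exp(-(step - t) / MEMORY_TAU) for t in history)
    return P_BASE + COUPLING * memory

# Measurements (crossing events) written at these steps:
history = [0, 1, 2, 10]

for step in (3, 12, 30):
    past = [t for t in history if t < step]
    print(step, markovian_rate(step, past),
          round(record_dependent_rate(step, past), 6))
```

In the record-dependent model, the error rate immediately after a burst of measurements exceeds the baseline and then decays toward it — the qualitative signature of noise with memory, which a Markovian model cannot reproduce by construction.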

IV. A Testable Prediction

Prediction

The correlation length of decoherence errors in a large-scale quantum processor — the characteristic distance over which a single measurement event influences neighbouring qubit error rates — should not be uniform. It should follow a distribution related to the prime gap structure of the bilateral crossing spectrum.

Specifically: the correlation function \(C(d)\) between qubits separated by \(d\) rungs on the bilateral ladder should exhibit peaks at distances corresponding to prime gaps, with a decay envelope set by the electromagnetic prominence radius \(r \approx 12.5\) rungs. The correlation length is therefore not a single number but a discrete set of preferred separations, each associated with a prime gap: errors cluster at specific distances, and more densely near prime-indexed qubit positions, rather than decaying in a smooth exponential. The shape operator predicts that the dominant correlation at each scale is set by the bilateral ladder with the largest prominence radius \(r_i = 1/b_i\) at that scale — which for electromagnetic interactions gives \(r \approx 12.5\) rungs, consistent with the observed long-range, non-local character of the error propagation.
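A minimal sketch of the predicted shape of \(C(d)\) can be written down directly: discrete peaks where the separation \(d\) equals a prime gap, under an exponential envelope with decay scale \(r \approx 12.5\) rungs. Only the peak positions (prime gaps) and the envelope scale come from the text; the amplitudes `PEAK` and `BACKGROUND` are hypothetical placeholders:

```python
import math

R_PROMINENCE = 12.5   # electromagnetic prominence radius, in rungs (from the text)
PEAK = 1.0            # amplitude at a preferred (prime-gap) separation -- assumed
BACKGROUND = 0.1      # amplitude at other separations -- assumed

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_p in enumerate(sieve) if is_p]

def prime_gaps(n):
    """Distinct gaps between consecutive primes up to n."""
    ps = primes_up_to(n)
    return {b - a for a, b in zip(ps, ps[1:])}

GAPS = prime_gaps(200)  # preferred separations, e.g. 1, 2, 4, 6, ...

def C(d):
    """Toy correlation between qubits d rungs apart: enhanced where d
    is a prime gap, suppressed otherwise, under the decay envelope."""
    amplitude = PEAK if d in GAPS else BACKGROUND
    return amplitude * math.exp(-d / R_PROMINENCE)

for d in range(1, 16):
    print(d, round(C(d), 4))
```

The testable content is the comb structure: \(C(d)\) at prime-gap separations should stand well above neighbouring separations, with successive peaks falling off on the \(r \approx 12.5\) envelope rather than on a single smooth decay curve.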

This is a concrete, falsifiable prediction that could be tested against IBM's detailed error characterisation data from large-scale processors.

V. Collapse as a Write Operation

The bilateral framework reframes what a measurement is. In the standard view, collapse is an endpoint — coherence is lost, the computation suffers. In bilateral terms, collapse is a write operation: the ingress face (superposition, potential) becomes actual at \(\tau_0\), and an egress record is produced. That record is physical. It propagates.

The egress record of one crossing is carried by whatever physical channel connects that crossing to its neighbours — primarily electromagnetic (photons) and, in principle, gravitational (directly, via spacetime curvature). It then becomes the ingress potential of subsequent crossings: it shapes what outcomes are available to the next measurement event. This is not teleportation or signalling — it is ordinary causal propagation, forward in \(\tau\), at \(c\). What makes it bilateral is that the outcome of crossing A is structurally incorporated into the potential of crossing B, not as noise, but as information.

This reframing has a concrete implication for how correlated decoherence in large processors should be understood. The errors that propagate across a processor after a single measurement event are not random contamination from that event. They are the causal downstream of a crossing record propagating through a relational mesh. Suppressing them entirely would require suppressing the relational character of the measurement itself — which is not possible, because A1 is not an engineering choice. What is possible is predicting the structure of the propagation — which the bilateral ladder and shape operator provide — and using that structure to design error mitigation that works with the cascade rather than against it.

VI. What This Is — and Is Not

A note on scope. This note does not claim that IBM's quantum processor data validates the bilateral framework. The correlated decoherence observed at scale has multiple possible explanations within existing frameworks — crosstalk, two-level system defects, substrate phonon modes. The bilateral framework does not replace these models; it provides a complementary structural interpretation that may guide future mitigation strategies by identifying which correlation lengths and clustering patterns are intrinsic to the relational structure of measurement, rather than reducible to hardware imperfections. The prediction above is testable in principle; its verification or falsification would be meaningful for the framework. This note is a commentary, not a derivation.

VII. What Collapse as Write Implies for Quantum Computing (speculative)

If measurement is a write operation rather than a loss event, the standard scaling narrative of quantum computing — more qubits, more computational power — is incomplete. A single crossing event contains the full potential of the ingress face; what limits computation is not the size of the Hilbert space but the rate at which egress records can be written and read, and the causal structure of how those records propagate. Fault-tolerant quantum computing, on this view, may require not larger qubit arrays but more efficient write and read protocols — protocols that work with the causal cascade structure rather than treating it as error. This perspective follows directly from Axiom A3 (\(\tau\) monotonically increasing, every actualisation a permanent record) and is offered here as a speculative direction, not a derived result.