Avionics and control

Orion avionics and the discipline of fault containment

I kept treating Orion like a capsule until the avionics reading corrected me.

What I found is a layered fault-containment architecture designed to keep wrong answers from ever reaching the control loop. Not one heroic computer. A system of systems, where the interesting discipline is not raw computation but the decision about which healthy channel is still allowed to speak.

Four channels, eight processors, fail-silent by design

Orion uses two Vehicle Management Computers, each containing two Flight Control Modules, for a total of four primary channels. Each FCM holds a self-checking processor pair, so eight CPUs execute the primary flight software in parallel. If a channel produces a mismatched or late result, it silences itself rather than transmitting a bad command.
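The fail-silent idea above can be sketched in a few lines. This is a hypothetical illustration, not Orion flight software: every name, the deadline value, and the toy control law below are invented to show the shape of the behavior, in which a channel that miscompares or runs late transmits nothing rather than a bad command.

```python
# Hypothetical sketch of a fail-silent self-checking pair. Two processors run
# the same computation; the channel speaks only if both results agree and the
# frame deadline was met. All names and values are invented for illustration.

DEADLINE_S = 0.020  # assumed 20 ms budget for this channel's frame

def channel_output(compute_a, compute_b, state, elapsed_s):
    """Return a command only if the pair agrees and met its deadline."""
    if elapsed_s > DEADLINE_S:
        return None          # late result: go silent, never send stale data
    a = compute_a(state)
    b = compute_b(state)
    if a != b:
        return None          # miscompare: go silent rather than guess
    return a                 # healthy: this channel is allowed to speak

# A downstream consumer simply ignores silent (None) channels.
law = lambda s: 2 * s + 1                  # toy stand-in for a control law
assert channel_output(law, law, 10, 0.005) == 21
assert channel_output(law, lambda s: 2 * s, 10, 0.005) is None  # miscompare
assert channel_output(law, law, 10, 0.030) is None              # deadline miss
```

The design choice worth noticing is that both failure modes collapse to the same observable: silence. Downstream logic never has to judge a suspect value, only the absence of one.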

This changes the prose: the vehicle is not asking which answer won a vote. It is asking which channel is still healthy enough to fly.

Determinism is the whole point

The architecture is designed so that healthy channels stay in lockstep.

Time is distributed across the network. Flight software runs in scheduled major and minor frames. Clock drift is measured and corrected. Missing a deadline is itself treated as a fault. This is exactly the class of software that current aerospace certification understands: bounded, partitioned, deterministic, and fully testable.
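The frame discipline described above can be sketched as follows. This is a minimal model, not any real scheduler: the frame lengths, the fault-log format, and the function names are all assumptions made for illustration. The point it shows is that an overrun is recorded as a fault rather than quietly absorbed.

```python
# Minimal sketch of major/minor frame scheduling, under invented parameters.
# A major frame is a fixed list of minor frames; each minor frame runs its
# tasks and must finish inside a hard budget, or a fault is logged.

import time

MINOR_FRAME_S = 0.05   # assumed 50 ms minor frame (illustrative only)

def run_major_frame(schedule, fault_log):
    """schedule: one task list per minor frame; fault_log collects overruns."""
    for i, tasks in enumerate(schedule):
        start = time.monotonic()
        for task in tasks:
            task()
        elapsed = time.monotonic() - start
        if elapsed > MINOR_FRAME_S:
            fault_log.append(("deadline_miss", i))  # a miss is itself a fault
        else:
            time.sleep(MINOR_FRAME_S - elapsed)     # idle to the frame boundary

faults = []
run_major_frame([[lambda: None]] * 4, faults)       # four trivial minor frames
assert faults == []
```

Because every task has a fixed slot and every slot a fixed budget, the schedule is fully enumerable in advance, which is what makes this class of software exhaustively testable.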

A backup that is intentionally different

The Backup Flight Software runs on different hardware, under a different operating system, built by a different compiler, linked against different libraries, and written by a different team. This is a direct guard against common-mode failure. If the primary stack fails because of a shared assumption, the backup is supposed to survive precisely because it was not built the same way.

Radiation is a design input, not an edge case

Bit flips are assumed to happen. Memory is hardened and self-correcting. Network paths are compared for agreement. Faulted channels are reset, re-synchronized, and allowed back into service. The system continues operating through transient faults rather than hoping they will not occur.
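One classic way self-correcting memory works, sketched here purely as an illustration (the source does not specify Orion's mechanism), is triple modular redundancy: store each word three times, vote on reads, and periodically scrub so a transient bit flip is erased instead of left to accumulate. The class and names below are invented.

```python
# Illustrative (hypothetical) self-correcting storage via triple modular
# redundancy: reads take a bitwise majority of three copies, and a periodic
# scrub rewrites all copies from the voted value.

def vote(a, b, c):
    """Bitwise two-of-three majority across the copies."""
    return (a & b) | (a & c) | (b & c)

class TmrWord:
    def __init__(self, value):
        self.copies = [value, value, value]

    def read(self):
        return vote(*self.copies)

    def scrub(self):
        good = self.read()
        self.copies = [good, good, good]   # resynchronize all copies
        return good

w = TmrWord(0b1010)
w.copies[1] ^= 0b0100        # simulate a radiation-induced bit flip
assert w.read() == 0b1010    # the majority vote still reads correctly
w.scrub()
assert w.copies == [0b1010, 0b1010, 0b1010]
```

The same continue-through-the-fault shape appears at the channel level: a faulted channel is repaired, resynchronized, and readmitted, rather than permanently written off.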

Why this matters for the novel

Orion makes the AI argument in this book sharper. Its avionics can be flight-critical because they are deterministic, partitioned, fail-silent, and auditable. A language model cannot be flight-critical because it is non-deterministic, probabilistic, and not exhaustively testable. The issue is not capability. The issue is whether the system can be certified into the primary control path.

When Orion appears in the story, it should feel conservative, procedural, and intensely tested. The contrast with advisory intelligence outside the control loop is the point.

Source trail

These are the public sources that most directly shaped the piece. I keep them down here so the essay can read like prose first and a bibliography second.

Kai Wrenbury

Novel pages, journal entries, and research notes from the making of the book. Nothing here claims agency ties or official approval.

A work of fiction. Copyright 2026 Kai Wrenbury.