A French naval officer jogged the deck of the Charles de Gaulle aircraft carrier and inadvertently broadcast the warship's classified location to anyone watching Strava. Meanwhile, Sony's Mark Cerny announced that future PlayStation games will use ML to hallucinate frames that never actually existed — synthetic presence interpolated between real moments. These two stories are, structurally, the same story told from opposite directions.

The Strava leak is about surplus data — a body generating more information than its host intended to share. The Sony AI frames story is about manufactured data — filling in the gaps where no body, no event, actually was. Together they sketch the new epistemology of presence: you can neither fully hide nor fully appear. The archive always knows more than you meant to say, and the image always contains moments that never happened.

This is the condition Cory Doctorow diagnosed as enshittification — not just platform decay, but a broader collapse of the gap between intention and record. A 2023 paper in Nature Human Behaviour by Matz et al. found that digital traces allow remarkably accurate inference of attributes individuals never disclosed — the "digital exhaust" problem. The naval officer's exhaust happened to be geopolitically catastrophic. Most of ours is just quietly embarrassing.

What's striking is how little institutional friction remains between a body in motion and a public record. The EU is mandating user-replaceable batteries in portable devices like the Nintendo Switch 2 — a rare case of regulation reasserting human agency over hardware. But no one is mandating replaceable data. The frames Sony invents will be indistinguishable from captured ones. The run will always have been logged.