Two tech stories this week occupy the same moral geometry. Bryan Fleming, founder of the stalkerware platform pcTattletale, avoided prison despite being the first person successfully prosecuted for spyware in the US in over a decade. Polymarket quietly removed bets on when the US would confirm the rescue of a downed Air Force officer, but only after a congressman publicly objected. In both cases, the system produced a consequence. In neither case did the consequence feel remotely proportional to the harm.

The Architecture of Non-Accountability

Fleming's case is the easier one to read. His software was used to surveil victims of domestic abuse. He was convicted. He walked. The sentence is not a deterrent; it is a data point confirming that the legal architecture around surveillance technology is structurally lenient toward builders and structurally brutal toward targets. Polymarket is different in register but similar in logic. The platform's defense is market neutrality: we facilitate price discovery, we don't make moral judgments. But a 2026 arXiv paper by Li and Bakker on LLM-written community notes found that AI fact-checking systems on X systematically underperformed on context-sensitive claims, precisely the category Polymarket's war-outcome bets fall into. Neutrality is a position. It just gets to pretend it isn't.

When Fringe Tech Becomes the Infrastructure

The New Yorker's piece on how internet fringe culture infiltrated Republican politics offers the macro frame. The tolerance for norm violations that let pcTattletale operate for years and let Polymarket post military-rescue bets is the same ambient permission structure that moved fringe content into the political mainstream. The platforms are different. The non-accountability architecture is the same.