Two chokepoints broke this week. Iran has effectively shut down the Strait of Hormuz, blocking a significant portion of the world's energy supply in a survival-mode response to U.S. military pressure. Across a different kind of infrastructure, Elon Musk reversed an announced change to X's creator revenue-sharing program within hours of creator backlash. Both events expose the same structural fragility: when one actor controls a critical passage, the moment that actor moves against the interests of those who depend on it, the system reveals how little legitimacy was underwriting the arrangement.

Chokepoint Politics Across Scales

The Hormuz situation is a geopolitical crisis with measurable economic consequences. Bloomberg reports that U.S. import prices jumped by their largest margin since 2022, and that was before the full effect of the blockade had propagated through supply chains. Italy is already rerouting toward Algeria for gas supply. The logic of chokepoint leverage is simple: whoever controls the narrow passage sets the terms for everyone who passes through it, until the cost of that control exceeds its benefit. Iran has calculated that shutting the strait is worth the military and economic price. Musk made a different calculation in the same week: that creator backlash against X's new payout terms was a cost he was not prepared to absorb. His reversal came within hours. Iran's reversal, if it comes, will take months of diplomacy and possibly a ceasefire framework, per Fast Company's reporting on the U.S. ceasefire proposal.

Creator Economies and the Legitimacy Floor

What the X reversal clarifies is that creator revenue-sharing programs are not just economic arrangements. They are legitimacy infrastructure. The platform's value proposition to creators depends on predictable terms. When Musk moved to change those terms unilaterally, the backlash was not just about money. It was about the credibility of the relationship. A 2026 preprint, "Unilateral Relationship Revision Power in Human-AI Companion Interaction," documents a parallel dynamic on AI companion platforms: when providers update AI companions without user consent, users report grief, betrayal, and loss. Those responses are structurally identical to what creators expressed about X's policy shift. The emotional grammar of chokepoint control is the same whether the passage is a waterway or a feed algorithm. Kyle Raymond Fitzpatrick's concept of enshittification as a condition rather than an event is the right frame here: the betrayal is not the policy change. It is the accumulated realization that the terms were always provisional.