Two stories from opposite ends of the hype spectrum share an uncomfortable structural rhyme. A new arXiv paper on continually self-improving AI systems argues that the fundamental limit on current LLMs isn't capability but the static nature of their training — they're frozen at a moment, unable to compound on their own outputs. Separately, Fast Company profiles Rare Beauty, Bogg, and Goodles, brands that are deliberately throttling distribution, choosing scarcity over scale.
The connection isn't superficial. Both stories are really about the compounding problem: what happens when a system optimizes so hard for growth that it loses the capacity for quality iteration? The brands that went slow — staying out of mass retail, building genuine community before scale — are now better positioned than competitors who flooded every channel. The AI systems that can't self-improve are hitting the same ceiling: they were optimized for a moment, not for compounding.
The Skele-Code paper posted to arXiv this week, which argues for 'skele-coding' over 'vibe coding', makes the same point from an engineering angle. Vibe coding is fast and feels productive; it's also the coding equivalent of the brand that scaled too fast. Skele-coding builds reusable, auditable structure. Slow growth that compounds.
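A loose illustration of the contrast, with the caveat that the task, the function names, and the file layout here are hypothetical and not drawn from the paper: the throwaway version gets an answer quickly but leaves nothing to build on, while the structured version exposes small, typed, testable pieces that later work can compound on.

```python
# Hypothetical example: the same task (dedupe customer records by email)
# written two ways. Neither is taken from the Skele-Code paper; they just
# illustrate the throwaway-vs-structured tradeoff discussed above.

import csv
from dataclasses import dataclass
from pathlib import Path

# --- "Vibe" version: fast to write, hard to reuse or audit ---------------
def vibe_dedupe(path):
    seen, out = set(), []
    for row in csv.DictReader(open(path)):
        e = row["email"].strip().lower()
        if e not in seen:
            seen.add(e)
            out.append(row)
    return out

# --- Structured version: typed pieces that later code can build on -------
@dataclass(frozen=True)
class Customer:
    email: str
    name: str

def normalize_email(raw: str) -> str:
    """Single, testable place where normalization rules live."""
    return raw.strip().lower()

def load_customers(path: Path) -> list[Customer]:
    """Parsing is separated from business logic, so either can change alone."""
    with path.open(newline="") as f:
        return [
            Customer(email=normalize_email(r["email"]), name=r.get("name", ""))
            for r in csv.DictReader(f)
        ]

def dedupe(customers: list[Customer]) -> list[Customer]:
    """Pure function over typed data: easy to unit-test and reuse."""
    seen: set[str] = set()
    unique = []
    for c in customers:
        if c.email not in seen:
            seen.add(c.email)
            unique.append(c)
    return unique
```

The second version is longer today, but it is the one a follow-up feature, a code review, or an automated agent can actually compound on tomorrow.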
For investors navigating this, the accelerator model itself is being stress-tested by exactly this tension: the YC Speedrun data shows that batch companies which front-load distribution without product-market fit fail at higher rates than cohorts that iterate slowly first. Restraint as moat is not a new idea. It's just newly legible.