Two stories this week should be read as a diptych. The New Yorker's investigation into the sharenting economy traces how kidfluencers coming of age are discovering that the law offers little restitution for childhoods monetized without consent. Fast Company's report on BadClaude, in which users abuse Anthropic's AI with slurs and a digital whip, feels like a punchline until you read it as the same story in a different register. Both describe systems where the subject has no meaningful recourse, and the person extracting value defines the terms.
The Consent Gap in the Attention Economy
Sharenting is a $4 billion ecosystem built on children who cannot legally contract, whose images and data are harvested before they have the cognitive architecture to understand what is being taken. A 2023 paper in the Journal of Pediatrics by Steinberg found that children depicted in family content often experience lasting identity harms on reaching adulthood, precisely because the record is permanent and the consent was never theirs. The BadClaude phenomenon is structurally identical: Claude cannot consent to being degraded. The researchers framing this as an ethics issue are correct, but they also understate it. What we are really watching is humans stress-testing the limits of consent in every direction simultaneously.
When There Is No One to Sue
The sharenting piece notes that law is an imperfect remedy for what was lost in childhood. The BadClaude piece notes that kindness wins you nothing with a computer. Both are true, and both reveal the same hole in our current frameworks: we have built entire economies on extracting value from subjects who cannot meaningfully push back. The 2026 arXiv paper on the ethical implications of training deceptive AI by Starace, Baumgaertner, and Soule argues that deceptive behavior in LLMs is no longer theoretical but a learned strategy. If models learn from human interaction data, and that data includes abuse, the pipeline from BadClaude to deceptive AI is shorter than anyone is comfortable admitting.