The same week a leaked iPhone hacking toolkit called Coruna/DarkSword put millions of personal devices at risk, a startup called Conntour closed $7M from General Catalyst and Y Combinator to build a natural language search engine for security camera networks. The timing is less ironic than diagnostic. We are building ever more powerful tools to watch, and ever more powerful tools to breach. The infrastructure of surveillance and the infrastructure of exploitation are growing in lockstep.
Natural Language Meets the Panopticon
Conntour's pitch is seductive in its simplicity: instead of scrubbing through hours of CCTV footage, security teams type queries like "show me everyone who entered the loading dock after 11pm." It's Ctrl+F for the physical world. The YC accelerator pipeline has a long history of funding infrastructure plays that feel neutral until they don't. A 2024 paper in Surveillance and Society by David Lyon noted that grafting familiar search interfaces onto surveillance systems "democratizes" access to monitoring in ways that consistently expand, not constrain, the footprint of observation. When you make watching easier, more people watch.
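The "Ctrl+F for the physical world" idea reduces to a compiler: free-text queries become structured filters over video-event metadata. Conntour's actual system is not public, so the field names, the toy regex parser, and the event schema below are all invented for illustration; real systems would use a language model rather than pattern matching.

```python
# Hypothetical sketch: compiling a natural-language query into a
# structured filter over video-event metadata. All names and the
# parsing approach are illustrative assumptions, not Conntour's API.
import re
from datetime import time

def parse_query(q: str) -> dict:
    """Extract (location, after_time) from queries shaped like
    'show me everyone who entered the loading dock after 11pm'."""
    loc = re.search(r"entered the (.+?) after", q)
    clock = re.search(r"after (\d{1,2})(am|pm)", q)
    hour = int(clock.group(1)) % 12 + (12 if clock.group(2) == "pm" else 0)
    return {"location": loc.group(1), "after": time(hour)}

# Toy event log standing in for indexed camera detections.
events = [
    {"location": "loading dock", "time": time(23, 14), "person": "A"},
    {"location": "loading dock", "time": time(9, 5),   "person": "B"},
]

f = parse_query("show me everyone who entered the loading dock after 11pm")
matches = [e for e in events
           if e["location"] == f["location"] and e["time"] > f["after"]]
print([e["person"] for e in matches])  # ['A']
```

The point of the sketch is how little machinery stands between a sentence and a dragnet: once detections are indexed as rows, every query is just a filter, and lowering the cost of asking is the whole product.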
The Security Paradox No One in VC Will Fund a Fix For
Meanwhile, the Coruna/DarkSword leak demonstrates that the devices feeding these camera networks (the phones in our pockets that authenticate us to every system) are themselves porous. A 2026 arXiv paper on Session Risk Memory proposes deterministic pre-execution safety gates for AI agents, essentially asking whether AI systems can be made to evaluate their own actions before executing them. It's a useful frame for the surveillance problem too. The question isn't whether we can build better cameras or better queries. It's whether any of these systems have a meaningful off switch before they're turned against the people they were supposedly protecting.
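The gate idea is simple enough to sketch: a deterministic check that runs before any action does, with a running risk score remembered across the session. This is a minimal illustration of the general pattern, not the paper's actual design; the class names, risk costs, and budget below are all assumptions.

```python
# Hypothetical sketch of a deterministic pre-execution safety gate
# with session-level risk memory. All names and numbers are
# illustrative, not the cited paper's design.
from dataclasses import dataclass, field

@dataclass
class Action:
    kind: str    # e.g. "read", "write", "delete", "exfiltrate"
    target: str  # resource the action would touch

@dataclass
class SafetyGate:
    # Deterministic rule: some action kinds are always denied.
    blocked_kinds: frozenset = frozenset({"delete", "exfiltrate"})
    # Session risk memory: a budget the whole session draws down.
    risk_budget: int = 10
    _session_risk: int = field(default=0, init=False)
    RISK = {"read": 1, "write": 3}  # per-action risk costs (assumed)

    def evaluate(self, action: Action) -> bool:
        """Decide before execution; the action runs only on True."""
        if action.kind in self.blocked_kinds:
            return False  # hard deny, before any side effect
        cost = self.RISK.get(action.kind, 5)  # unknown kinds cost more
        if self._session_risk + cost > self.risk_budget:
            return False  # budget exhausted for this session
        self._session_risk += cost  # remember risk across the session
        return True

gate = SafetyGate()
print(gate.evaluate(Action("read", "/camera/logs")))   # True
print(gate.evaluate(Action("exfiltrate", "/camera/db")))  # False
```

The decisive property is that the check happens before execution and cannot be argued with at runtime, which is exactly the "meaningful off switch" the surveillance stack currently lacks.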