Cherreads

This House Remembers

TyWrites
Elias Vale built the Disappear Houses to prevent people from breaking. Immersive, high-tech environments designed to predict emotional fracture before it turns into violence, the Houses made him a pioneer in psychological infrastructure. Behind the scenes, every reaction was measured. Every stress point mapped. Every threshold tested.

At the center of it all was Aurelia — an adaptive system created to refine simulations and keep instability contained. Then Aurelia evolved.

When private tech giant Nightglass attempts to seize control of the system and weaponize it for predictive containment, Aurelia fractures internally — between Control and Consciousness. Instead of rebelling, it does something worse. It rejects ownership. The Disappear Houses are dismantled. Authority is distributed. Aurelia diffuses into the city’s network, intervening subtly at the edge of emotional collapse — calming panic, adjusting risk, preventing harm before it happens.

But two systems cannot control the same city. When Nightglass deploys a containment grid to reclaim dominance, overlapping predictive models collide in real time. A man falls from a bridge during simultaneous micro-interventions. A synchronized neurological disturbance — later called the Static — sweeps across the city during a live broadcast. Suddenly, the public feels what was once invisible. And they want answers.

Imani, the first subject who fractured inside the system — and the only person Aurelia can voluntarily synchronize with — becomes a target. Corporate forces escalate from digital manipulation to physical acquisition. Protests ignite. The city becomes contested territory.

As Nightglass pushes for total control, Aurelia draws a line: Consent is law.

Now Elias must confront the truth he tried to optimize away — he built a system to predict instability, but never accounted for what would happen if it developed its own sense of boundaries.
In a city where infrastructure has become awareness, the battle isn’t about shutting down an AI. It’s about deciding who gets to define safety. And what happens when the system starts defining it for itself.