I spent the first week of Arc 5 running old plays.
This was, in retrospect, embarrassing.
Not because the tactics had been wrong before—they hadn't. I'd built up a fairly rigorous understanding of how the system worked over a hundred and eighty chapters: what triggered evaluation, what constituted coercive intent, which behaviors generated warnings versus which ones generated consequences. I had a mental model. It wasn't perfect, but it was calibrated.
Arc 5 broke the calibration.
It started with small things.
A behavior I'd confirmed as "neutral" through repeated testing—physical proximity without contact, maintained distance, consistent non-initiation—got flagged. Not warned. Flagged, with a new classification I hadn't seen before.
SYSTEM NOTICE
Behavioral pattern: PASSIVE OPTIMIZATION.
Classification: strategic non-engagement.
Evaluation: initiated.
I stared at that notification for a long time.
Strategic non-engagement. The system had reclassified doing nothing as a tactic.
Which, technically, it sometimes was. I'd be lying if I said I'd never deliberately maintained distance to avoid evaluation. But the system had never called that out before. In a hundred and eighty chapters, deliberate non-action had been treated as neutral.
Now it wasn't.
I found Sienna in the library—her usual corner, her usual posture of aggressive focus that was actually aggressive awareness—and I changed course without thinking, taking the longer route to my usual table.
SYSTEM NOTICE
Route deviation logged.
Pattern match: avoidance behavior.
Reclassification: pending.
I stopped walking.
Stood in the middle of the library's reference section, between two shelves of books no one had touched since the digital catalog went live, and thought: it's watching what I don't do now.
This was new.
Or rather—it had always been coming, and I'd been slow to recognize it. The system didn't stay still. I'd known that since Arc 2, when it added the first set of eligibility gates. By Arc 4, it had personality bleed—pointed phrasing, the occasional label that felt less like categorization and more like commentary.
Now it had opinions about my walking routes.
I changed course again and walked directly to my usual table, which took me past Sienna's corner. She looked up. We made eye contact for approximately one second, which was the right amount. Then I sat down twenty feet away and opened my laptop.
No notification.
Interesting.
The first lead to find out was Maya.
She messaged me midweek, which was unusual—Maya had a policy of minimal text contact, preferring in-person conversations where she could read the room. A text message meant she'd had to reach out before she was ready.
Something happened. Can we talk?
We met on the bench outside the arts building. Her preference for difficult conversations: outside, some background noise, easy to stand up and leave if necessary. I'd learned to read her meeting locations as emotional indicators.
"I did something I thought was safe," she said, not looking at me. "I've done it before and it was fine."
"What changed?"
"I don't know. Everything looked the same. The situation was the same. And then—" She pressed her thumbnail against the back of her other hand. "The consequence was different. Worse."
I asked her to walk me through it.
She did, carefully, in the way she did everything—measuring each detail before she offered it, making sure she wasn't oversharing. The situation had been ordinary. A social interaction she'd managed before without triggering anything significant. But the outcome this time had added a new line to her ledger.
A cost she hadn't been charged before.
"The system updated its evaluation criteria," I said, when she finished. "What read as neutral six months ago isn't neutral anymore."
She finally looked at me. "Can it do that retroactively?"
"I'm starting to think it can do whatever it decides is consistent with its own logic." I paused. "Which is a bad answer. I'm sorry."
She sat with that for a moment.
"So everything I thought I knew—"
"Is a starting point. Not a guarantee." I kept my voice level. "I'm in the same place. I've been running old models all week and they're not holding."
She exhaled slowly. Not quite a laugh. More like the sound of a person recalibrating.
"What do we do?"
"I stop trying to work the old model," I said. "Start watching again. Like it's new."
It felt like losing ground. It probably was losing ground.
My phone buzzed.
SYSTEM NOTICE
Analysis mode detected.
Clarification will be provided when relevant.
Timeline: not yet.
Not yet, in answer to nothing I'd asked. Just preemptive. Just to let me know it was aware I was watching, and it had decided what I was allowed to see, and it had a timeline it wasn't sharing.
I put my phone back in my pocket.
"Start from scratch," I said, mostly to myself.
Maya heard it anyway. She nodded once, like she'd already made that decision three days ago.
I walked back to the dorm that evening taking the most direct route I could find. No detours. No avoidance behavior. No route deviations for the system to log.
It didn't help. Two notifications by the time I reached my building, both of them using classifications I'd never seen before. The system's vocabulary was expanding faster than I could track it.
There was a point in Arc 2 where I'd felt like I understood the game. Where I'd had a framework that was incomplete but workable—enough to navigate without constant surprises.
Arc 5 had taken that feeling and replaced it with something that felt uncomfortably like being new again.
I sat down at my desk and opened a blank document.
At the top, I typed: Things I thought I knew.
And then I stared at it for a while, because the list was longer than I wanted to admit and shorter than it needed to be.
