At 2:17 a.m., the emergency department at Portsmouth Regional Hospital slipped into that narrow, deceptive quiet that falls between surges. The machines didn’t stop. Monitors kept their rhythm, IV pumps clicked, a curtain shifted somewhere down the hall. It just felt, briefly, under control.
At the central station, a nurse wasn’t looking at the screens. She was watching the room itself—the way one patient shifted, the way another’s breathing landed just slightly out of sync with the numbers being displayed. Nothing dramatic. A fraction. The kind of mismatch you miss if you’re watching the chart instead of the person.
She walked into the room before the alarm sounded.
Later, the chart would compress it into a sentence: “patient deterioration noted prior to monitor escalation.” Clean. Precise. Technically correct. It doesn’t capture how the signals actually arrived—simultaneous, overlapping, resolving into a decision without steps in between. When she tried to explain it, she shrugged. “It’s pattern,” she said. “You don’t go one, two, three. You just know when something’s wrong.”
That ability doesn’t make her an easy employee.
It makes her a great nurse.
The system she works in doesn’t quite know what to do with that distinction. It tracks compliance, timing, documentation, protocol—the visible parts—while the thing that brought her into that room sits outside all of it, hard to standardize, harder to train, and almost impossible to audit. Over time, that gap matters, because what a system measures is what it learns to preserve.
You can see the same gap much earlier, long before anyone steps into a hospital, in a classroom outside Manchester where a student’s file sits on a desk with a pattern that has repeated for years: strong test scores, missing assignments, comments that circle the same idea in different language—bright but inconsistent, easily distracted, needs to apply himself. He has just been diagnosed with ADHD.
Nothing about him changed.
Only the explanation did.
School is built around a particular kind of work—sit still, focus on one thing, follow the steps, finish the task—and those are useful skills. They’re also very specific ones, shaped by the kind of world that needed them. About 150 years ago, most work didn’t look like this. Then factories arrived, and everything tightened. Work became repeatable, structured, timed. The economy needed people who could show up, stay on task, and do the same thing the same way, over and over, and schools followed that need with rows of desks, fixed schedules, one subject at a time, and one correct answer.
It worked long enough to feel inevitable.
It scaled well enough to become invisible.
And it sorted people.
If your brain matched that structure, things felt natural. If it didn’t, things got harder—not because you weren’t capable, but because you didn’t fit the system that defined capability in the first place. That sorting held because the economy reinforced it. Employers paid for consistency, compliance, and repeatability, and the labor market reflected that preference with almost mechanical precision.
Now that reinforcement is weakening—not disappearing, but weakening—and the reason is mechanical. Over the past decade, a growing share of routine cognitive work has been absorbed by software. McKinsey estimates that up to 60% of current jobs have at least 30% of tasks that are technically automatable, and the tasks that go first are the ones that are predictable, structured, and repeatable. This isn’t a cultural shift or a change in taste; it’s a supply shock, and supply shocks don’t negotiate.
They reset prices.
When the supply of “routine cognition” explodes, its value drops. The market doesn’t argue with that. It absorbs it, adjusts, and moves on—slowly at first, then faster as the edges begin to fold inward.
You can see that repricing most clearly in places where the job remains but the center of the work has shifted. In a law office in Boston, a junior associate described how his role used to involve long hours reading documents line by line, careful and repetitive, the kind of work that rewarded endurance more than judgment. Now software handles much of that first pass. “I’m not reading everything anymore,” he said. “I’m figuring out what’s wrong.”
That’s not a smaller job.
It’s a different one.
A cab driver in Manchester described the same underlying skill from a different vantage point. After years on the road, he doesn’t track individual cars so much as the pressure between them—how traffic builds, where it releases, when someone is about to move before they commit. “If I have to think it through, I’m already late,” he said, describing pattern recognition operating just ahead of conscious explanation.
That doesn’t show up on a résumé.
It prevents collisions.
For a long time, abilities like that lived at the edges of the economy—useful, sometimes critical, but not central—because the center was built on repetition, structure, and control. That’s what we trained for, and that’s what we rewarded. As machines absorb more of that work, the center doesn’t disappear, but it hollows out, and the value begins to migrate toward what remains difficult to automate: judgment, synthesis, anomaly detection, and the ability to work with information that doesn’t resolve cleanly.
This is where the reframing of neurodivergence begins to matter in a practical sense. The underlying traits haven't changed, but the environment they operate in has, and with it the balance between cost and contribution. ADHD, in a system built on low-stimulation, delayed-reward tasks, looks like a deficit. Imaging research led by Nora Volkow at the National Institute on Drug Abuse suggests that reward circuits in those brains respond less strongly to exactly that kind of work. In environments where signals change quickly and decisions carry immediate consequences, that same sensitivity can become an asset: not universally, and not without cost, but in ways that are increasingly relevant.
Autism shifts the lens again. The difficulty is often with ambiguity and shifting social expectations, but the strength lies in systems: seeing structure, tracing logic, finding where something breaks. Researchers like Simon Baron-Cohen at the University of Cambridge have documented that profile for years, and it becomes more valuable as systems grow more complex and less transparent.
Dyslexia follows a similar pattern in a different direction. Reading may be slower, but pattern recognition across space and structure is often stronger. Research summarized by Maryanne Wolf, along with observations reported from NASA, points to the same trade-off.
Different wiring.
Different payoff.
But this is where the clean version of the story breaks.
The system doesn’t want this shift.
Standardization isn’t just efficient—it’s enforceable. It makes performance legible, outcomes predictable, and people interchangeable enough to manage at scale. Schools, corporations, and bureaucracies aren’t neutral observers of this transition; they are built on the logic that’s being disrupted, and that logic still works well enough to defend itself.
So the result is not a broad revaluation of cognitive difference. It is something narrower and more uneven. The traits that map cleanly to high-value roles—pattern recognition in AI oversight, system analysis, edge-case detection—get pulled upward, often into specialized or elite positions, while the rest remain embedded in systems that still reward predictability and compliance.
That’s not inclusion.
That’s selection under new rules.
What emerges isn’t the end of sorting. It’s a reshuffling—one that elevates certain forms of difference while leaving the underlying structure intact. The system learns to extract value from variance without needing to accommodate it broadly.
Some people adapt to that shift. Others are filtered out faster as the margin for mismatch narrows. Many end up caught between systems, misaligned with the old one and not yet recognized by the new one.
That tension isn’t a bug.
It’s the transition itself.
Back in the classroom, the student with the missing assignments solves a problem in a way that doesn’t follow the steps but still reaches the answer. The teacher pauses—not because it’s wrong, but because it doesn’t fit the expected path—and in that pause you can see the system trying to decide what matters more, the method or the result.
That moment is easy to overlook.
It shouldn’t be.
Because what looks like a small mismatch in a classroom is the same mismatch playing out across the economy. For a century, we selected for people who could follow the system. Now we’re building systems that do that better than we can, and we haven’t decided—at scale—what replaces it.
That might not make him the easiest student to teach.
It might make him the kind of mind the next system depends on.
Or it might mean the system adapts just enough to use him—
and keeps calling everyone else like him the problem anyway.