Judgment as a Depreciating Asset
There’s a paradox hiding in every successful AI deployment. The more effectively AI encodes human judgment, the less humans practice that judgment. And judgment, it turns out, depreciates like any other asset.
The Use-It-or-Lose-It Economy
A 2025 study in Science Advances tracked cognitive skills across thousands of adults over time. The finding was stark: skills decline at older ages only for those with below-average skill usage. People who read and do math regularly held onto those abilities into their sixties. Those who didn't saw their skills peak in their early thirties.
This isn’t metaphor. It’s measured depreciation. Research published in the American Economic Review estimates that unused skills depreciate at roughly 4.3% per year. That’s not nothing: at that rate, half your capability erodes in about 16 years of disuse.
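The half-life implied by a constant annual depreciation rate is easy to check. A quick back-of-the-envelope sketch (the 4.3% rate is from the cited study; the half-life arithmetic here is just an illustration, not from the paper):

```python
import math

rate = 0.043  # estimated annual depreciation of unused skills

# Years until remaining capability falls to 50%:
# (1 - rate)**t = 0.5  =>  t = ln(0.5) / ln(1 - rate)
half_life = math.log(0.5) / math.log(1 - rate)
print(f"{half_life:.1f} years")  # about 16 years
```

The same formula shows how sensitive the horizon is to the rate: doubling depreciation to 8.6% roughly halves the half-life.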
Now consider what happens when AI takes over a task you used to do yourself.
Three Case Studies in Skill Decay
Pilots and Autopilot. NASA researchers studied what happens when experienced airline pilots rely heavily on automation. The surprising finding: their basic instrument scanning and manual control skills remained reasonably intact, but their cognitive skills degraded significantly, specifically tracking aircraft position without map displays and recognizing instrument failures. The mental model eroded even when the physical skills didn’t.
The FAA has warned that excessive autopilot use degrades “stick and rudder” skills, citing accidents where pilots lost situational awareness. IATA reports widespread industry concern about manual flying skill degradation. The pattern is clear: automation doesn’t just do the work—it removes the practice that maintains competence.
Navigation and Spatial Memory. London taxi drivers who complete “The Knowledge” (memorizing 26,000 streets over two or more years) have measurably larger posterior hippocampi than the general population. Gray matter volume correlates with time spent navigating. GPS users, by contrast, show reduced hippocampal activity and volume compared to people who navigate spatially.
This matters beyond navigation. The hippocampus is one of the first brain regions affected by Alzheimer’s. Outsourcing navigation to GPS may have long-term cognitive costs we’re only beginning to understand.
Healthcare and Automation Bias. Medical residents who train with AI diagnostic tools risk becoming less skilled at interpreting subtle findings, because the AI sees them first. This creates “automation bias”: the tendency to trust automated systems uncritically. In one mammography study, radiologists’ accuracy dropped when influenced by AI suggestions. A study of clinical decision support systems found that 5.2% of prescription changes switched from correct to incorrect based on AI advice.
The pattern across domains is consistent: when AI handles the task, humans lose the repetitions needed for expertise.
The Deliberate Practice Problem
Here’s where it gets uncomfortable. Anders Ericsson’s research on expertise shows that skill development requires deliberate practice with immediate, accurate feedback. Experience alone isn’t enough; you need structured repetition where you can see what worked and what didn’t.
AI removes exactly this. When the AI drafts the document, generates the code, or makes the diagnosis, you don’t get the practice. You don’t build the pattern recognition. You don’t develop the intuitions that come from doing the work thousands of times.
Daniel Kahneman’s dual-system framework explains why this matters. System 1—fast, automatic intuition—becomes reliably accurate through repeated practice with feedback. If AI handles the task, your System 1 never calibrates. You develop confidence without expertise.
Gartner predicts that by 2026, atrophy of critical-thinking skills from AI use will push 50% of organizations to require “AI-free” skills assessments. They’re not worried about AI capability. They’re worried about human capability degradation.
The Verification Trap
As AI does more work, humans shift to verification and oversight roles. But here’s the catch: verification requires the same expertise as doing the work.
A lawyer who lets AI draft contracts still needs to spot the missing clause. A developer who lets AI write code still needs to catch the subtle bug. A radiologist who lets AI flag anomalies still needs to recognize when it missed something.
If you haven’t developed judgment through practice, you can’t effectively verify AI output. You end up rubber-stamping decisions you can’t actually evaluate. The oversight becomes theater.
The jagged frontier isn’t static. It’s not just about what AI can and can’t do today. It’s about what you will still be able to do after years of letting AI handle the task. The frontier moves—and not always in your favor.
What This Means
I argued in Making Agents Make Wealth that AI creates value when it encodes and executes judgment. That’s still true. But encoded judgment has a shadow cost: the hollowing out of the human capability that created it.
This isn’t an argument against using AI. It’s an argument for using it deliberately, with awareness of what you’re trading.
The practical implications:
- Maintain practice rotations. Regularly do work without AI assistance, even when AI could help. The inefficiency is an investment in capability maintenance.
- Use AI as training, not replacement. AI that explains its reasoning and invites critique builds skills. AI that just delivers answers atrophies them.
- Distinguish which skills matter. Some capabilities are fine to let depreciate. Others are load-bearing for your professional judgment. Know the difference.
- Watch for automation bias. The more you trust AI output uncritically, the more your ability to evaluate it decays. Skepticism is a skill that requires practice too.
The question isn’t whether AI makes us more productive today. It does. The question is whether we’re optimizing short-term productivity while depleting long-term human capital.
Judgment is an asset. Like any asset, it depreciates without maintenance. And like any asset, you don’t notice the depreciation until you need the capability and find it’s no longer there.
Related: The Jagged Frontier Is Personal, Making Agents Make Wealth