The Most Important Memory is Still the One Inside Your Head
A new study shows how the convenience of offloading memory to AI erodes long-term learning, and why memorisation is still essential to critical thinking.
“Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”
— T. S. Eliot, Choruses from “The Rock” (1934)
We're living through an extraordinary cognitive paradox. Just as artificial intelligence gives us unprecedented access to information, there is evidence to suggest we're becoming less intelligent. IQ scores, which rose steadily for nearly a century, have plateaued and begun to decline in wealthy nations. People born after 1975 score lower on intelligence tests than their parents did at the same age.
Correlation/causation conflations aside, it's notable that the decline coincided with education coming to deride memorisation as a low-order skill and with digital tools becoming ubiquitous. A new study by neuroscientists Barbara Oakley, Terrence Sejnowski, and colleagues offers an insightful explanation for this decline.
The Memory We've Forgotten About
Over the last 50 years, the idea that memorisation matters far less than generalised critical thinking became a largely uncontested orthodoxy in education. Of course, rote memorisation of disconnected facts, never linked to broader concepts and ideas, is a joyless, pointless enterprise. But recent research suggests we may have thrown the baby out with the bathwater, creating a false dichotomy between memory and understanding that has undermined the very cognitive foundations we sought to strengthen.
The problem isn't memorisation itself, but a misunderstanding of how memory actually works. When we memorise multiplication tables or vocabulary words, we're not just storing isolated facts; we're building structured knowledge that enables mathematical intuition and sophisticated reasoning. Memory and understanding aren't competing educational goals; they're complementary processes that work together to create critical thinking and genuine understanding.
This shift away from memorisation is evident in movements such as Reform Mathematics in the 1990s, which explicitly de-emphasised procedural fluency in favour of conceptual understanding and open-ended problem-solving. In literacy, the now-discredited Whole Language approach dominated classrooms from the 1980s onwards. Most influentially, Bloom's Taxonomy, widely adopted in teacher training, reinforced a hierarchical view of learning in which factual recall was treated as a lower-order skill, subordinate to analysis, synthesis, and evaluation.
The emergence of 21st Century Skills frameworks further entrenched this perspective, promoting laudable but ultimately nebulous concepts like creativity, collaboration, and “digital literacy” over actual content knowledge. International assessments such as PISA, launched by the OECD in 2000, reflected and reinforced this trend by prioritising applied problem‑solving rather than factual recall, thus influencing curriculum reforms around the world. Together, these developments reflect a pervasive belief that in a world of easy access to information, memorisation is no longer necessary.
This thinking gained even more traction with the emergence of smartphones and AI, which put all knowledge in our pockets. The logic appears sound: free students from rote memorisation so they can focus on "higher-order thinking" and creativity. But again, this logic rests on a fundamental misunderstanding of how the brain actually learns.
Your Brain's Hidden Learning Systems
Oakley and her colleagues' research illuminates how genuine expertise emerges from the interplay between two complementary memory systems, each with its own character and purpose.
The declarative system handles conscious facts and concepts: the deliberate recall of explicit information. It functions like a rapidly learning but somewhat cumbersome library, carefully cataloging new acquisitions. When you first learn that seven times eight equals fifty-six, your declarative system dutifully records this information.
The procedural system, by contrast, acquires habits, skills, and routines through repeated practice. It operates like a well-trained craftsman who has internalized technique to the point of artistry. With sufficient practice, seven times eight becomes not a fact to be retrieved but a response as automatic as breathing. The crucial insight from this research is that expertise emerges precisely when knowledge transitions from the declarative to the procedural realm.
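To make that transition concrete, here is a minimal sketch in Python. It is my own illustration, not a model from the paper: the class, the two dictionaries, and the thirty-repetition threshold are all invented for the purpose. A fact lands first in a slow declarative store and, after enough practice, is promoted to an automatic procedural response.

```python
# Toy model of the declarative-to-procedural transition (illustrative only).

class Learner:
    AUTOMATICITY_THRESHOLD = 30  # arbitrary number of repetitions

    def __init__(self):
        self.declarative = {}     # consciously recalled facts
        self.procedural = {}      # automatised responses
        self.practice_count = {}

    def study(self, question, answer):
        """New knowledge lands in the declarative 'library' first."""
        self.declarative[question] = answer

    def practise(self, question):
        """Repetition gradually hands the fact over to the procedural system."""
        self.practice_count[question] = self.practice_count.get(question, 0) + 1
        if self.practice_count[question] >= self.AUTOMATICITY_THRESHOLD:
            self.procedural[question] = self.declarative[question]

    def answer(self, question):
        if question in self.procedural:
            return self.procedural[question], "automatic, no bandwidth used"
        return self.declarative.get(question), "effortful recall"

learner = Learner()
learner.study("7 x 8", 56)
for _ in range(30):
    learner.practise("7 x 8")
print(learner.answer("7 x 8"))  # (56, 'automatic, no bandwidth used')
```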
I always think of drummers here. They must perform multiple, often conflicting rhythms: left hand, right hand, left foot, right foot, all in synchrony without conscious effort, each limb maintaining its own pattern while contributing to a unified whole. At first, every movement demands attention. But with repetition, the patterns become procedural; the drummer no longer thinks about each beat and can instead concentrate on more complex things, like the actual song, applying the dynamics, phrasing, and interpretive choices that breathe life into what would otherwise be mere mechanical execution.
This is expertise: not conscious calculation, but fluency born of deep practice. The paradox is that by memorising something thoroughly enough, you get to forget about the very thing that once consumed all your cognitive bandwidth. That automaticity is a liberation: it frees the bandwidth for the kind of creativity that separates novices from experts.
When we constantly outsource basic operations to external systems, we prevent this crucial transition. Students may get correct answers, but they never develop the procedural fluency that enables genuine mathematical thinking. Many of us are familiar with the misconception that "the one doing the talking is the one learning", but a more accurate formulation, as Peps McCrea puts it, is that "the one doing the thinking is the one learning."
The Prediction Error Paradox
One of the study's most compelling revelations concerns how the brain learns from its mistakes. Your neural networks constantly generate predictions about the world, and when reality fails to match these expectations, you experience what neuroscientists call "prediction error": that moment of cognitive surprise that drives learning forward.
Consider a Year 6 student who has deeply internalised multiplication tables. When she sees a worksheet that claims 8 × 7 = 102, her brain immediately flags the error. The mismatch between her schema and the external input triggers a prediction error — releasing dopamine, sharpening attention, and reinforcing the correct association: 8 × 7 = 56.
Now consider her classmate who never memorised multiplication tables and relies exclusively on a calculator. Presented with the same incorrect answer, there is no internal expectation to violate, no alarm bell to ring. The error may pass unnoticed, unchallenged, a kind of cognitive passivity that leaves her at the mercy of external tools.
Without internally stored knowledge, the brain's natural error-detection mechanisms lie dormant. We become not just dependent on external tools but vulnerable to their failures, unable to distinguish sense from nonsense.
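The asymmetry is easy to see in code. Below is a toy Python sketch, mine rather than the study's: only the learner who carries an internal expectation can generate a prediction error at all, while for the calculator-dependent learner there is simply nothing to violate.

```python
# Toy illustration: prediction error requires an internal expectation.

def prediction_error(expected, observed):
    """Mismatch signal; in the brain, large errors drive dopamine-gated learning."""
    return observed - expected

internal_model = {(8, 7): 56}   # the student who memorised her tables
claimed_answer = 102            # the worksheet's wrong answer

expected = internal_model.get((8, 7))
if expected is not None:
    error = prediction_error(expected, claimed_answer)
    if error != 0:
        print(f"Alarm: expected {expected}, saw {claimed_answer} (error {error:+d})")
else:
    # The calculator-dependent student: no expectation, no alarm.
    print("No internal prediction, so the error passes unnoticed.")
```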
Cognitive Karaoke
Recent studies reveal how AI tools can amplify these problems. College students using ChatGPT to write essays produced higher-quality work but showed no actual knowledge improvement when tested later. In other words, they were performing knowing without actually knowing. Ghostwriting their own ignorance.
They exhibited what researchers call "metacognitive laziness": less reflection, fewer self-corrections, and reduced engagement with the material. I am reminded here of Dylan Wiliam’s sage assertion that the purpose of feedback is to improve the student and not the work.
This pattern repeats across disciplines with depressing consistency. High school students using GPT-4 for mathematics practice outperformed their peers during lessons but collapsed on final examinations when the AI was removed. Programming students relying on ChatGPT experienced lower self-efficacy and poorer learning outcomes compared to their conventionally taught counterparts.
This is essentially a kind of cognitive karaoke: a theatre of mastery without memory, effort, or ownership. It is Koriat and Bjork's "illusion of competence" in action: the seductive feeling of understanding that comes from accessing information without truly grappling with it.
What This Means for Learning
The implications are concerning but not hopeless. The research points toward evidence-based strategies that harness both human cognition and artificial intelligence:
Embrace Practice at the Edge of Mastery: According to the paper, the "85% rule" suggests optimal learning occurs when students achieve about 85% accuracy. This sweet spot provides enough challenge to promote neural growth without overwhelming cognitive capacity (a toy sketch of this adaptive targeting follows this list).
Build Procedural Fluency: Certain basics, such as multiplication tables, vocabulary, and scientific formulas, should be practised to automaticity. This isn't mindless drilling; it's creating the cognitive infrastructure that enables later complex thinking and creativity.
Use AI as Amplifier, Not Replacement: When students have solid internal knowledge, AI becomes a powerful thinking partner. Without that foundation, it becomes a cognitive crutch that prevents real learning.
Teach Metacognitive Awareness: Students must understand the difference between knowing where to find information and truly knowing it. Knowing that you can ask ChatGPT about photosynthesis is not the same as understanding how plants convert sunlight into energy.
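As promised above, here is a toy Python sketch of the 85% rule in action. It is entirely my own illustration: the difficulty scale, window size, and step size are invented, not taken from the paper. The loop nudges task difficulty up when a simulated learner finds the work too easy and down when it is too hard, so rolling accuracy settles near the target.

```python
# Toy sketch of the "85% rule": adapt difficulty so rolling accuracy ~ 0.85.
import random

TARGET = 0.85
difficulty = 0.5        # 0 = trivial, 1 = impossible (invented scale)
history = []            # recent correct/incorrect outcomes

def attempt(difficulty):
    """Stand-in for a real learner: success probability falls with difficulty."""
    return random.random() > difficulty

for trial in range(500):
    history = (history + [attempt(difficulty)])[-20:]   # rolling 20-trial window
    accuracy = sum(history) / len(history)
    difficulty += 0.01 if accuracy > TARGET else -0.01  # nudge toward the target
    difficulty = min(max(difficulty, 0.0), 1.0)

# Equilibrium sits where success probability ~ 0.85, i.e. difficulty ~ 0.15.
print(f"settled difficulty ~ {difficulty:.2f}, recent accuracy ~ {accuracy:.2f}")
```

The point is not the numbers but the shape of the loop: keep learners where they fail roughly fifteen per cent of the time.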
Nothing Beside Remains
The very architecture of expertise is built not by exposure, but by effort. Without internalised knowledge, there is no deep understanding, no flexible thinking, no meaningful creativity. The most advanced AI can simulate intelligence, but it cannot think for you. That task remains, stubbornly and magnificently, human.
Yet perhaps the most insidious aspect of cognitive offloading is not what we lose, but our inability to recognise the loss. Like Shelley's traveller encountering the ruins of Ozymandias, we may find ourselves surrounded by the remnants of human intellectual achievement without understanding what once stood there. A generation raised on external cognition may never develop the internal dialogue that characterises deep thought, the ability to hold multiple, contesting ideas in tension, to sense contradictions, to bear the weight of knowledge accumulated through struggle.
This erosion happens silently, imperceptibly. Each generation's cognitive baseline becomes the next generation's assumption about human capability. We may celebrate our efficiency while mourning, without quite knowing why, the loss of something indefinable, that spark of recognition when disparate ideas suddenly connect in our minds, the satisfaction of solving problems through pure thought, the quiet confidence that comes from carrying knowledge as part of our identity rather than merely knowing where to find it.
The loss is not just intellectual but spiritual: we risk becoming strangers to our own minds, fluent performers of intelligence we do not possess, successful at tasks that require no growth, no struggle, no transformation of the self.
Pair this with the fact that the AI systems themselves don't truly "understand" what they're producing. So not only are we shortcutting the human thinking process, we're not getting genuine understanding from the machine either. It's pattern recognition and mimicry all the way down.