AI is the Opium of the people – Cognitive Dependency

A dramatic headline for what I believe could become a significant and damaging problem. Amid all the noise around AI, something is creeping up on us. It isn't making headlines or trending on social media, but it is reshaping the architecture of the human mind. It's called Cognitive Dependency.

It was, of course, Karl Marx who famously said that religion is the opium of the people, not as a criticism but to highlight the comfort and relief religion brought by distracting people from the hardship of everyday life. Much like opium, religion didn't eliminate suffering; it simply made it easier to bear. The problem was that, over time, people lost their capacity to think for themselves, becoming reliant and potentially addicted.

Stay with me…

We live in a world that prizes answers over thought, output over process, and fast over slow. Add to that the relentless pressure to succeed, or in some instances simply to survive, and it becomes not only understandable but logical that people will reach for the easiest solution, regardless of the consequences.

And this is where I hope the parallel can be drawn. Just as opium offered relief from the hardships of the 19th century, AI offers relief from the cognitive demands of a world moving too fast to keep up with. This is not about being lazy; the catalyst is exhaustion and a need to be successful, or at least to be seen to be so. The danger may not be obvious at first but, much as Marx observed, what begins as an easy solution becomes a quiet dependency and ultimately a kind of amnesia. Over time you not only run out of your own ideas, you forget what it means to think for yourself.

Cognitive offload or Cognitive dependency
But we need to make sure we don't throw the baby out with the bathwater. This is not a general criticism of AI and its potential to erode our capacity to think; it's far more specific. AI in itself is not harmful, for now at least, but to understand how to work with it and avoid creating problems for ourselves in the future, we need to make a clear distinction between two very different ways in which we use it. The first is as a tool to free up our mental capacity: this is Cognitive offload. The second is as a surrogate for thinking: this is the more sinister Cognitive dependence.

Cognitive offload – The mental effort required to process and hold information in working memory is referred to as cognitive load. One of the reasons people struggle to learn is that they are trying to deal with too much information at any one time; reduce the load and learning becomes easier. A calculator is a good example of how technology can help. By outsourcing, or offloading, mental arithmetic, the mind is freed to focus on higher-order thinking. This is the use of AI to extend human capability without replacing human thought.

Cognitive dependence – Where cognitive offload removes some of the "clutter", freeing the brain to focus on more important ideas, cognitive dependence is far more invasive: the brain's capacity to think deteriorates because AI is doing all the hard work. In one study, Jinrui Tian and Ronghua Zhang from Wuhan University found that greater AI dependence was associated with lower levels of critical thinking.

The sat nav is a good example of what this looks like in practice. When we follow a voice telling us where to turn, we are not navigating; we are being navigated. Over time, the mental map we once built through attention and experience becomes redundant. Studies by Louisa Dahmani and Véronique D. Bohbot have shown that regular sat nav users demonstrate measurably reduced spatial awareness and struggle to recall routes they have driven many times before.

This distinction really matters. A calculator leaves your mathematical reasoning intact, simply handling the "grunt work". But continual use of a sat nav removes our capacity to orient ourselves, possibly forever. There is also something far deeper potentially happening: what Andy Clark and David Chalmers called the extended mind theory. Eventually the tools we rely on stop feeling like tools and become extensions of our cognitive selves, as intimate as memory or perception. This leads to a difficult question: if the machine is part of who we are, what happens when it's taken away?

No sleepwalking please
AI is arguably the most transformative technology we have ever seen, and its potential to enhance learning, expand access, and accelerate understanding is genuinely exciting. But as educators and learners, we need to be aware of the problems. A generation that outsources its thinking doesn't just lose a skill; it loses a sense of self, that quiet certainty that your thoughts are your own.

The good news is that we can do something about it. The question is not whether AI belongs in education; it clearly does. But we need to recognise that there is a problem and then begin to change attitudes and methodologies to counter its negative implications. In practice this might look like designing assessments that reward process over output, asking students to show their reasoning before they reach for AI assistance, or building in regular "unplugged" tasks where thinking has to take place without the support of technology. It means teaching students not just how to use AI but when not to, and helping them develop the self-awareness to know the difference.

We built tools to save us time so we could think more. Let’s make sure that’s still what we’re doing.