🧠 The Psychology of AI

Kia ora, Namaskaram 🙏🏾

It’s tempting to look at AI and see a reflection of ourselves — responsive, fluent, intelligent.

But as researchers Iris van Rooij and Olivia Guest warn us:

❝ AI systems that can be created in the short-term are but decoys—these systems can trick us into thinking they are like human minds, but they are anything but.

And that’s the problem. The illusion is convincing — but the consequences for researchers are real:

⚠️ Why the AI Illusion Is Dangerous

The researchers lay out three core traps that arise when psychology and AI collide under hype rather than rigour:

1. AI ≠ Mind

Just because something behaves like a mind doesn’t mean it is one.

Mistaking mimicry for cognition leads to dehumanisation, poor experimental design, and hollow claims. We risk replacing participants with puppets.

2. AI ≠ Theory

A system that predicts behaviour doesn’t necessarily explain it.

Even if a model imitates human task performance, it doesn’t reveal how the mind works — or why behaviour emerges. We end up with black boxes posing as blueprints.

3. Automation ≠ Cognitive Science

We can’t automate our way to understanding the human mind.

The push towards algorithmic research leads to deskilling — and worse, it paves the way for pseudoscience dressed up in data.

P.S. If cost is a barrier, just reply to this email and let me know. Money should NOT stand in the way of learning one of the most important skills of our time.

Thank you for supporting my work 💚

With love and gratitude, Vishal