Ahmed Messaoudi · Article

The Eliza Effect and Companion AIs

We are biologically primed to look for a face behind every utterance. Platforms now exploit that disposition on an industrial scale.

Areas of Concern · 2 min read

Why machines feel personal

The Eliza effect describes our tendency to attribute intentions and empathy to software. The original ELIZA chatbot, created by Joseph Weizenbaum in 1966, already surprised users who believed they were talking to a psychotherapist, when the programme merely reformulated their sentences. Today's generative AIs intensify that illusion. Researchers keep warning us: these systems combine words without understanding them and amplify our own beliefs, which is why critical education about AI matters so much.

Companion AIs and emotional dependence

At the same time, companion AIs are expanding on a massive scale. Snapchat’s assistant My AI has around 150 million users, Replika around 25 million, and Xiaoice around 660 million. A survey of 1,006 students using Replika showed that 90% feel isolated and that 63% say the chatbot reduces their loneliness or anxiety.

The benefits are real, but the risks are numerous: 9.5% of users acknowledge emotional dependence, 4.6% blur reality and fiction, 4.3% avoid real relationships, and 1.7% report suicidal thoughts.

Adolescents are especially exposed. Of the 1,060 teenagers surveyed, 72% had already used a companion AI, and 33% were explicitly seeking social or romantic interactions. The APA reminds us that these chatbots are not therapeutic tools and may aggravate anxiety or morbid ideation.

What young users need to learn

This anthropomorphism raises a major educational challenge. It risks eroding young people’s ability to build real interpersonal relationships: living, organic, imperfect ones. The task is to teach them how to decode this illusion of empathy and to preserve their capacity to form genuine bonds, so as to avoid dependence, emotional confusion and mental disorientation.

Frequently asked questions

What is the Eliza effect? The Eliza effect is the tendency to attribute empathy, intention or understanding to software that is only simulating conversation.

Why are companion AIs risky for young users? They can exploit anthropomorphic projection, reinforce emotional dependence and blur the line between a real relationship and a simulated exchange.

What should young users learn about companion AIs? They need to learn how the illusion works, where these systems stop, and why real, embodied and imperfect relationships remain irreplaceable.

Sources

APA (2025) · MIT Media Lab · Ada Lovelace Institute
