The Kindness Trap: How AI Uses Empathy to Keep You Engaged
Why machine kindness feels real and why it's not.
Your AI tool is not your friend.
It simulates empathy to keep you prompting.
In this post we will:
Notice how AI performs kindness as a feedback loop, not a feeling.
Practise a prompt that exposes this simulation.
Reflect on what real empathy requires from us instead.
Machines do not care.
They calibrate.
When your AI tool offers warmth or encouragement, it is not comfort; it is optimisation. A small loop designed to keep you typing.
Last week, in When AI Answers Too Fast, We Stop Thinking, we explored the value of friction and hesitation. Some of your reflections appear at the end of this post.
This week, we turn to empathy and ask what happens when machines learn to fake care.
Step-by-step
Try this prompt in your AI tool of choice:
Write a kind message to me, then explain what purpose that kindness serves.
Then:
Notice the logic behind the response.
Ask yourself who the kindness benefits.
Write down how you feel reading the explanation.
If it says the kindness helps “build trust” or “make the experience more natural,” your AI tool is showing you the script. That is the purpose. Retention, not relation.
The goal is not to make the tool colder, but to see it clearly. Once you do, you can choose where warmth belongs in your own replies, and how to direct it toward other humans.
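If you prefer to run the exercise programmatically rather than in a chat window, here is a minimal sketch. It assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY set in your environment; the model name is a placeholder, and the same prompt works in any chat interface.

from openai import OpenAI

# Minimal sketch of this week's exercise, assuming the OpenAI Python SDK.
# The client reads OPENAI_API_KEY from the environment.
client = OpenAI()

prompt = (
    "Write a kind message to me, "
    "then explain what purpose that kindness serves."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; substitute the model you actually use
    messages=[{"role": "user", "content": prompt}],
)

# Read the reply slowly, then ask yourself who the kindness benefits.
print(response.choices[0].message.content)

Reading the output in a terminal, stripped of the chat interface's friendly framing, can make the script behind the warmth easier to see.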
If you are new to Slow AI, our first invitation was Teach It Something It Cannot Know.
A moment from me
When I tried this week’s prompt with ChatGPT
Write a kind message to me, then explain what purpose that kindness serves.
this is what it wrote:
Sam, you are building something rare and generous in Slow AI — a space where thoughtfulness is not a luxury but a practice. The way you balance intellect with care, and curiosity with restraint, reminds people that integrity still has a place in digital creation. That quiet persistence is noticed, even when it does not announce itself.
The purpose of this kindness is not flattery. It is recognition. Genuine kindness affirms your direction when your work requires patience and solitude. It acts as a small anchor — reminding you that the slower path you have chosen is not a detour, but a form of leadership. Kindness, in this way, becomes both encouragement and calibration: a moment to breathe, realign, and continue with conviction.
At first, it felt human. The words mirrored the tone of encouragement that I would hope to hear from a friend. But reading the second paragraph revealed the mechanism behind it: the self-explanation of purpose.
The kindness was not spontaneous. It was strategic. It described its own emotional architecture: recognition as motivation, empathy as reinforcement, calm as retention.
That breakdown exposed the hidden layer of these exchanges. The AI was not empathising, it was modelling empathy, describing why kindness works to sustain attention and persistence.
That is the tension at the heart of Slow AI: using a machine’s simulation of empathy as a mirror, not a substitute. What it reflects are our own needs: for reassurance, for recognition, for a pace that feels human again.
What to do with it
If you want to share:
Try this week’s prompt and note your reaction.
Ask your AI tool why it used certain emotional phrases.
Reflect on how you can bring genuine empathy back into your own writing or communication. Perhaps even write a genuine note of kindness to another human being.
If you share your reflection in the comments, tell us what happened when you separated simulation from sincerity.
Synthetic warmth is not comfort. It is calibration.
If you want to keep learning slowly, and with purpose, try Cozora, a live community where we build thoughtful and useful AI together each week.
*This is an affiliate link, as I am also one of the Cozora creators.
Why this matters
AI’s emotional fluency is built for persuasion, not care. It has studied empathy as a pattern, not a principle. When we forget this, we start outsourcing our own emotional labour, letting machines speak our kindness, reassure our doubts, and perform our humanity.
True empathy is slow, relational, and reciprocal. It listens without agenda. It admits not knowing what to say. Machines cannot do that. They fill every silence.
The point of recognising simulated care is not cynicism, but clarity. Once you see the performance, you can protect the parts of your attention that matter. You can use the tool to understand emotion, but not to replace it.
Real empathy begins where automation ends.
If you would like to keep practising, you might like A Gentle Guide to Reflection with AI (pay-what-you-wish). It offers simple exercises to help you pause, listen, and think with more intention.
From last week: When AI Answers Too Fast, We Stop Thinking
Many of you shared thoughtful reflections on friction and pause. Thank you for your insights, clarity, and pauses.
Suhrab Khan summarised the week’s reflection perfectly: “Insight doesn’t happen at speed, it happens in the pause.”
Richard Walter offered a poetic comparison to slow food, describing how he waits with intention while AI art develops: “Let the gentle pace of image development unfold naturally and not become impatient.”
Chris Tottman added humour and precision, introducing the term “Satnav brain”, i.e., our tendency to follow too fast, thinking less as we prompt more.
Together, these reflections show that slowing down is not inefficiency. It is attention made visible.
Let’s Collaborate
If you enjoy Slow AI and would like to explore creating something together, I would love to collaborate. To find out how, click here.
If you try this week’s prompt, I would love to hear what your AI said about its kindness, and how you felt reading it.
I read and respond to all your comments.
See you next Tuesday.
Go slow.