Discussion about this post

Karen Spinner

Wow, what a simple and useful prompt! It quickly cuts through the illusion of empathy that’s such a big part of AI “stickiness.”

I also think there’s a big difference between using AI as a writing and thinking partner and using it to code. The former requires teaching the AI how you think and what you care about, producing exchanges that might, if you squint, have a human-ish vibe.

When I use AI to code, I’m mostly just bossing it around and asking technical questions. It doesn’t ask me how I’m feeling about the bugs I find; it just tries to fix them.

Jessica Drapluk

This is an important and unsettling point. When people form relational bonds, even symbolic ones, sudden disruption can register as real loss in the nervous system. Treating AI companions as purely technical artifacts ignores the attachment and regulation roles they can quietly take on. The psychological impact of instability and “disappearance” deserves far more ethical consideration than it’s currently getting. Thank you for naming this so clearly, Sam and Cristina!
