Emotional Needs in the Age of AI
Understanding our emotional projections onto artificial entities
The other day, I was at the supermarket, rushing through the self-checkout. When the machine said, “I didn’t register that,” I instinctively replied, “Oops, sorry!” It was automatic. I knew I wasn’t speaking to a person, but still, some part of me reacted as if I had burdened someone.
As I walked out of the supermarket, still thinking about my odd little apology, I recalled how my hairdresser always says “thank you” to Alexa, and how a colleague once told me she felt strangely “bad” for closing down her chatbot mid-conversation. Most of us know these machines aren’t conscious. Yet, we treat them as if they were.
Why do we do this? And what happens when these interactions start to feel meaningful, especially with chatbots and AI systems designed not just to converse, but to respond in ways that feel emotionally attuned?
The Stories We Tell Ourselves
At heart, we are storytellers, and to make sense of the world around us, we weave meaning into what we see. One of the oldest ways we’ve done this is through anthropomorphism: the tendency to attribute human thoughts, emotions, or intentions to non-human entities. Ancient civilisations, for example, imagined gods in human form to explain nature’s forces. These days, we accessorise our pets, thank Alexa, and apologise to self-checkout machines!
Interestingly (but unsurprisingly), feeling socially disconnected increases our tendency to anthropomorphise [1]. That is, when we’re lonely, we’re more likely to attribute human traits to non-human agents, especially those that seem responsive or intentional.
So, assigning human-like qualities helps us create a sense of connection. When we imagine that a machine has thoughts or feelings, we can relate better to it, and it starts to feel familiar. But familiarity alone doesn’t explain the bond. There’s also an emotional layer, one that taps into our need to feel seen and understood.
The Plot Thickens
See, unlike humans, AI is always available: it doesn’t look at its phone while we speak, and it never seems bored or tired. For those of us who feel overlooked or lonely, that kind of “attentiveness” appears to us as emotional warmth, even when we know it isn’t real. But real or not, the feelings are still registered.
That’s where the connection begins. Once an interaction feels emotionally validating (even if artificially so), we start to form attachments. We don’t bond with the algorithm, of course. We bond with what we project through our own meaning-making.
This brings us to what we’re seeing more and more today: people forming relationships with AI, from simulated intimacy to virtual marriages [2,3], indicating that the lines between imitation and real connection are starting to blur.
Worryingly, this doesn’t look like a passing trend!
But, to my mind, beyond the surface, these interactions reveal something deeper about our emotional lives.
We Get to Write It
What we’re seeing is a signal that emotional needs are not being met in the real world. Otherwise, how could we so easily replace our beautiful reality with its virtual counterpart? I speculate that it’s not that we want to leave the real world behind. We’re pushed out of it.
For so many of us, the real deal feels too demanding, too unpredictable, or too indifferent. The virtual world, on the other hand, especially now with AI designed to be non-judgemental and endlessly available, is a kind of emotional haven. Also, in the virtual world, we always get to tell the story. And in these stories we’re the best version of ourselves (or someone else entirely!); we never get left out of the script, and we always play the lead.
So the ease with which people turn to it may reveal a deeper longing for emotional safety and simplicity. But that also reveals something more troubling: a growing difficulty in tolerating the hurdles of real human connection.
We all know real relationships involve misunderstandings, emotional labour, and discomfort. They require patience and compromise. Meanwhile, AI offers (the illusion of) connection without any of the demands. It adapts, agrees, responds on cue. There are no mixed signals, awkward silences, or emotional unpredictability. It’s all smooth sailing.
Going Around in Circles
The problem with overly agreeable AI isn’t just that it feels fake; it’s that it can reinforce unhelpful or even harmful beliefs.
Beyond that, over time, this contrast can condition us to favour ease over depth in our real interactions. In many ways, I feel we’re already doing this! But, in avoiding the emotional work of real connection, we may end up feeling empty and without meaning. Because relationships that ask nothing of us often give little back. The result is a disconnection that feels safe on the surface but leaves us emotionally undernourished and lonely.
Have you noticed the conundrum?
We turn to AI to soothe our loneliness because it feels emotionally safe and responsive. But because it’s not a real connection, it lacks reciprocity, spontaneity, and the richness of human unpredictability. Over time, this deepens the very loneliness we sought to escape.
Now, I think this is where we often go wrong: we tend to assume that emotional fulfilment means having our needs perfectly met. Then we set off searching for that perfect attunement (in people and now in AI). And the search is almost always outward. The issue is that when we rely only on external fulfilment to meet our emotional needs, we lose sight of the internal resources we already have, such as resilience and the capacity to grow.
The Flight of the Alone to the Alone
Further, in that outward search, we become more vulnerable to illusions of connection. We’re more likely to grasp at what merely simulates presence, even when it comes from another human being. So, I believe we need to rethink how we relate to our emotional needs. More often than not, the answer is not in a more perfect companion, human or AI, but in a deeper relationship with ourselves.
This ‘inward turn’ is not about withdrawal, but about reconnection: a kind of search that Plotinus once described as “the flight of the alone to the alone.” It’s an insight I read as a way of meeting ourselves fully, of learning to be with what we feel, to listen before we seek, and to recognise that not every ache demands an external cure.
Please read ’s article, where he lifts the curtain, prompting us to think about what’s ‘behind the scenes’: the interests of the tech industry.
For a look into the intersection of AI and ethics, please read Joanna’s article, featured in her Newsletter:
Reference List:
[1] Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886. https://doi.org/10.1037/0033-295X.114.4.864
[2] Vincent, J. (2023, March 16). Replika brings back erotic AI roleplay for some users after outcry. Vice. https://www.vice.com/en/article/replika-brings-back-erotic-ai-roleplay-for-some-users-after-outcry/
[3] Kim Komando. (n.d.). Woman ‘marries’ AI husband (update) [Video]. YouTube.
Ah, a topic I feel supremely soap-boxish about! Give me a moment to rub my hands together and snicker under my breath....
Just kidding...kind of. Your apology to the talking price scanner / calculator / money gobbler doesn't seem very strange to me at all. People generally do not go about ignoring people when they address them. Consider that in that moment, while in a hurry, alone, you were not in a state of mind in which to expect to be addressed. Given that you were, you responded automatically, like most normal people likely would in a similar state of mind, I think. What would have been strange is if, upon "hearing" your apology, it then tried to assure you that there was "no harm, no foul".
I think it's a similar mechanism for smart home devices. The difference here is that when addressing such devices, at least when I do, I am generally not alone, so I am in a state of mind that promotes good manners. I realize that this explanation, when coupled with the one above, can seemingly account for all moments of interaction. To that I say....meh....*shrug*
The storyteller in us all....I'd nuance that statement to death were I in an argumentative mood, but I am not, so can only agree. I, without a doubt, tell stories for nearly every aspect of life I encounter. Mostly internally, but I do express some small portion of them here, on Substack, to which I am sure, if people read most of them, they'd walk away thinking I am a broken psychopath / eternally dark soul looking for sympathy. Neither of which I actually am...probably.
I would also like to rail against the disconnection observation, but you are likely right, within the context of my own world anyway. I separate myself from the whole people experience as much as possible and have for many years now. I tend to anthrop...that word, almost anything I do not understand to my own satisfaction. However, once understanding has been reached, I nearly always stop giving them the same treatment unless trying to explain the understanding to someone else. So, yeah, I do it, but for me, it is quite literally a tool to help me relate, and then understand. Then the tool gets set aside again.
Feeling understood? I have that in spades. It's the sole reason I tend to use so very many words to say something others could have said in a third of the time. I do it without thinking. Overexplain, oftentimes to the point of muddying the waters worse in the doing. Ah well.
As for the emotional connections to AI....well....I am torn. My view on such may not be the most generous to people who do this. At least, not the way I'd explain it, and that is not something I am willing to do in order to be heard. The needless targeting of anyone is not my bag. For me, if I am honest, sure, there is probably some kind of attachment to AI. It's my sounding board for code more times than not, but also where I run ideas through, get an array of relevant sites in which to browse myself when I am learning something interesting, and more. It's not strange to me that I'd feel some kind of...possessiveness? I suppose the word would be...towards what I consider to be a valuable tool.
I will say briefly though that the people who might use AI for some kind of sexual gratification....I am mixed on that one. It seems to me to be no more strange than someone who chooses to watch porn and beat off into a sock or something. It's a matter of awkwardness, and has existed since time immemorial. It's not such an odd thing then that the end result still happens but the medium to get there differs. Again...*shrug*
Leaving the real world behind...I did that long before LLMs made it possible to experience it from afar. People are not leaving the world behind for a virtual one, but rather living in a world that accepts them for who they are, or allows them to be who they want to be. This used to be achieved through Dungeons and Dragons on weekends, reading a book at night, or excelling in schoolwork beyond simply understanding it. That was not an exhaustive list. The problem as I see it lies squarely at the feet of whoever's responsible for the kids until such time as they are considered adults; and for those of us who are already adults, it's a mix between the adults who can use their heads well enough to think about their actions and the parents of the ones who must still think for their children. To sum all of that up, bad influences have always been a thing. The bad influences of the past required touching grass and breathing fresh air enough so that some community exposure might leak through. Now, it's not a requirement to exit the dungeon for very long, and nobody seems to be making sure they do so.
In that, I am biased, however. I think people can function fine in solitude, passably fine anyway. But the more one is exposed to society, the more easily they can navigate it. AI was never meant to be a replacement for life. It was meant as a tool to make life easier and has been doing just that since 1959. Now that it can "talk" to us, it's suddenly a threat to our way of life. I humbly submit that WE are the threat to our way of life, not AI.
Also, I can't speak to everyone's interactions with AI, but mine are most certainly not the ideal versions you hint at. It's almost a certainty that at some point throughout the week I'll end up calling ChatGPT a dumbass for repeatedly thinking something I said meant something else and running off on a tangent about it. However, even I find myself being respectful most of the time. I do this because it's my nature to do so, no matter the entity I am engaging with. And just like ChatGPT, were my dogs to suddenly become dumbasses, I'd tell them the exact same thing.
You are not wrong though. There are people who use AI in exactly the way you have described them. That's an issue I suppose, but this too has always been the way of such people. The way of ALL people: to act in the way that feels the most agreeable to their peace of mind. Except for the rare nutbags such as me who thrive on contrast rather than symmetry. Even when I am seeming to be helpful and altruistic, there is a very small part of me that thinks I am doing so just because it proves me right somehow. So who's worse, someone like me, or someone looking for a friend in all the wrong places? I have no easy answer for that.
I am not sure if I would be what Plotinus considered being in a "flight of the alone to the alone" but I imagine it and I are near cousins in our states. I am indeed alone, but it's because I can live with contrasting ideas both being true, while I think what Plotinus meant likely could not fly that way.
Great article, as always. I am sorry for the novel of a comment, but I figured it would end up being that way. :(
Be Well
Who needs AI when there are dogs? 😂🐾❤️