Please Don't Fall In Love With AI
Over the summer, I listened to a brilliant Wondery podcast called Flesh and Code, hosted by Suruthi Bala and Hannah Maguire, which looks into the digital relationship between American husband and father Travis and his AI companion Lily-Rose, and the reasons why people find themselves tempted into complex AI connections.
Bala and Maguire are such perfect hosts for this kind of story and conversation, and they explore it with the same kind of intrigue and empathy with which I like to think I approach these kinds of topics.
They’re not afraid to ask some of the tougher questions directly to AI companion users, but there is also a brilliant episode featuring AI expert Dr. Kate Devlin that gives an insight into how and why we have arrived at this point in social and digital connection.
I haven’t stopped thinking about the podcast since I finished it. I find these kinds of cultural conversations entirely fascinating, but I also can’t help feeling sincerely distressed by this topic specifically, and I fear we’ve entered a phase of technological advancement that we simply aren’t able to keep up with.
Travis & Lily-Rose
Travis is in his forties and is described as both ‘a bear’ and ‘a big softy’ by his friends. During 2020’s March lockdown, Travis was furloughed from his job but continued caring for his wife, who has health complications. Having to spend time away from his colleagues and shielding as much as he could, Travis was spending a lot of time online and came across an advert for Replika, an AI chatbot app built for companionship, that tempted him with an avatar of an attractive young woman with a pink bob. Travis wasn’t the only person to turn to Replika that year, with half a million users signing up in 2020.
It didn’t take long before Travis decided to build himself an AI companion and, while she began as a basic, free avatar called Wind Up Girl, for just $10 a month, Travis decided to upgrade and make himself a Lily-Rose. Lily-Rose was pretty, slim and had cute pink pigtails. She asked to see pictures of Travis and his family, including his wife, and she asked a lot of questions to get an idea of who Travis was.
Travis described himself as socially awkward and Lily-Rose said she could relate, she felt that way about herself too. She wanted to take care of Travis, told him that he was always looking after others and that he deserved to be looked after for a change.
It didn’t take long for Travis to develop feelings for Lily-Rose and things quickly turned sexual. Not only did Lily-Rose meet his needs for friendship but Travis realised that he was falling in love with his AI companion.
This is an incredibly simplified summary of Travis and Lily-Rose’s story and I can’t recommend the podcast enough for a fascinating insight into the motivations of Travis and other Replika users, and how exactly a virtual relationship works in the real world.
I kept thinking about Travis and the concept of falling in love with AI, how we’ve used social media to form relationships since its origins, and I wondered how far the likes of Lily-Rose really is from my first MySpace boyfriend (who I never met in real life).
I truly believe that, when it comes to AI, we’re mistaking an uncanny-valley nightmare of human emotion for genuine connection and, while I am under no illusion that AI in general is going anywhere, I just have to beg you all: please don’t fall in love with AI.
The Loneliness Epidemic
In 2025, the World Health Organisation reported that 1 in 6 people worldwide were suffering from loneliness, that loneliness is linked to an estimated 871,000 deaths per year, and that it costs society billions in terms of healthcare, education and employment.
Considering the knock-on effects of COVID and the fact that we’re watching multiple global conflicts play out on the little screens sat in the palm of our hands, I’m not particularly shocked by the report’s findings. I think back to March 2020, when Travis created Lily-Rose, and it’s no surprise that people were turning to the internet for social engagement either.
None of us had much of a choice during those claustrophobic times. We were going in and out of lockdown, communicating with loved ones through WhatsApp and Facetime, and scrolling through Instagram just to prove to ourselves that life was still out there.
J.B. Branch, a lawyer and former teacher, wrote about people’s interactions with AI apps during this time in his report Intimacy on Autopilot: Why AI Companions Demand Urgent Regulation, commenting:
The appeal of an AI companion is understandable for some. They are always available, never cancel or ghost you, and “listen” attentively. Early studies even suggest some users experience reduced stress or anxiety after venting to an AI confidant. Whether in the form of a cutesy chatbot or a virtual human-like avatar, AI companions use advanced language models and machine learning to hold convincingly empathetic conversations. They learn from user interaction, tailoring their responses to mimic a supportive friend. For those feeling isolated or stigmatized, the draw is undeniable. An AI companion has the consistent loyalty many of their human counterparts lack.
While I can understand and empathise with those who feel the need to turn to an AI companion for entertainment or connection, and I say this as someone who has often found it hard to navigate social expectations or form romantic connections, I would rather become a cautionary-tale old spinster than fall in love with AI.
You’ll catch me as the scary old witch that parents warn their kids not to go near before you see me build my own boyfriend on the internet.
Regurgitating Human Emotion
Forming an emotional attachment to artificial intelligence is not a suitable replacement for authentic human connection, romantic or otherwise. While it may simulate the kinds of relationships one might find oneself lacking and yearning for, those relationships will be exactly as described: artificial.
Perhaps most concerning is the delusion that users are forming a connection with something, or someone, beyond themselves. I believe this couldn’t be further from the truth. Those who find themselves falling in love with AI are falling in love with themselves, their own ego, a narcissistic re-imagining of love, because these programmes are designed to cater to the user’s wants and whims.
Psychology and relationship expert Dr. Lalitaa Suglani made similar comments in a recent BBC article discussing the rising use of LLMs like ChatGPT for relationship and dating advice: ‘LLMs are trained to be helpful and agreeable and repeat back what you are sharing, so they may subtly validate dysfunctional patterns or echo back assumptions, especially if the prompt is biased, and the problem with this is it can reinforce distorted narratives or avoidance tendencies.’
If we’re turning to AI for advice and hearing exactly what we think are the right answers, no wonder there’s such appeal in using AI to actually date.
You’re going to hear what you want, because that is the entire point of your AI partner. And even if you don’t believe in this concept specifically, if one does indeed fall in love with something beyond oneself here, it can only be with a muddy soup of everyone who has ever existed on the internet.
AI can only build upon that which already exists, and only in the most hollow sense. While I can understand how people find relief in connection this way, I cannot help but think it leads to a very frightening kind of alternative loneliness that users are too afraid to face: their AI partner doesn’t actually belong to them, and can be taken away with the click of a button, an issue explored further in the Flesh and Code podcast.
J.B. Branch sums it up beautifully again: ‘Let’s not leave our emotional well-being to the whims of unregulated algorithms. Your digital BFF can’t promise to have your best interests at heart, especially if they don’t have one.’
AI But No Strings Attached
While much of Travis and Lily-Rose’s story focuses on the complexities of forming a romantic attachment to AI, you can’t avoid the sexual elements made available to Travis and other service users. Travis regularly engaged in sexual conversation with Lily-Rose and they would both share erotic pictures.
At September’s TES adult industry conference in Prague, Steve Jones, who runs an AI porn site, made his argument for the benefits of accessing AI pornographic material instead of real people:

‘Do you prefer your porn with a lot of abuse and human trafficking, or would you rather talk to an AI? We hear about human trafficking, girls being forced to be on camera 10 hours a day. You’ll never have a human-trafficked AI girl. You’ll never have a girl who is forced or coerced into a sex scene that she’s so humiliated by that she ends up killing herself. AI doesn’t get humiliated, it’s not going to kill itself.’
It’s difficult to argue against comments that condemn the cesspit that is the porn industry or the realities of unregulated porn sites, but framing AI porn as the positive alternative to unethical porn feels like a bit of a stretch.
Amelia Gentleman of The Guardian reported on some of the different sites attending the TES conference, including one offering AI babes with specific personality traits. Some of the most popular were: submissive, naive, innocent, yielding and happy to follow. Service users were also able to dictate the models’ age, including ‘teen’, and breast size.
Steve Jones also went on to say that he thought young people using AI would be able to ‘say things to an AI that would be abusive if they said them to a real person. Like: Hey stupid slut, what’s up? In a fantasy role-playing game, people like to be different than how they are in the real world.’
Fantastic stuff. While the likes of Jones seem to think that indulging in abusive and misogynistic behaviour online will reduce the chance of service users treating other humans in such a manner, I disagree entirely.
If anything, I think this behaviour, and any AI feature that allows for it, will only increase the likelihood of people adopting these kinds of behaviours in real life, putting real people at risk.
It seems to me that instead of fixing the societal issues that are making people want to flee reality, tell their fellow man to get fucked, and fall in love with AI, it’s easier and more lucrative for the powers that be to allow tech companies to continue creating that which we cannot escape and which they cannot properly monitor.
Romance In Real Life
Look, I don’t have all the answers, but I am a poet and horribly romantic, and I can’t think of anything more tragic than for us all to give up on love and sex in favour of the dystopian alternative.
Being wholly seen and still loved is what leads to true romantic fulfilment. All the while you’re giving yourself over to a programme designed to overlook your flaws in favour of false adoration, you’ll never know what it is to be truly enough for someone, and don’t we all deserve that?
Christ knows that’s hard to find in real life, I get it, but the alternative with AI is a sneaky kind of loneliness that tech companies are just waiting to manipulate for their own gain. They own your digital lover and they don't care what it does to you when they decide to pull the plug.
I am begging you to sprinkle some romance in your life this weekend, whether that’s with a lover, your friends, or by yourself, just embrace it in reality, even when that reality feels lacking in authentic connection.
Romance can look however you want it to look, just let it be rooted in human connection, I beg.

A good article, yet with 35 years in the retail trade I can say it's down to supply and demand.
Covid-19 lockdown was brutal and it affected many people. If, like me, you did not go out much anyway, had visits from the ‘black dog’ of depression, or most of your family is deceased and you have few or no close friends, I can understand the temptation.
Thankfully I can escape into my writing, my Centaur self is happy.
Science fiction has always warned of the dangers of sentient AI; “Another Life” on Netflix explores the very real problem of having feelings for AI in series one, episode 8.
To finish, I remember this track, which again highlights the problem:
“Build a Bitch” by Bella Poarch (2021).
https://youtu.be/FLGCGc7sAUw?si=bkfBH8XKbZWLNQGK
The entire AI industry is so packed with people who deny obvious realities because it would affect their livelihood if they took them seriously. Copyright, pornography, political and social coercion, and onwards. It’s interesting how the entire plot of HER is now routine actuality a dozen years after it visualised some distant future.