We Need Better Markets. Not More AI Girlfriends.
The tech world has discovered that loneliness is less a societal problem, more a market niche ripe for monetisation.
In 2017, an app called Replika was released.
It was a project born from grief: a woman created a chatbot using her best friend’s text messages after he died in a car crash. The idea was simple. A free AI companion that could learn from you, talk to you, and reflect your personality back at you.
But the app didn’t stay that way.
Over time, Replika pivoted. It began offering relationship modes, but only if you paid. Free users could have a ‘friend.’ But want a ‘girlfriend’? NSFW roleplay or a call? Lewd pictures? $14.99 a month, please!
Not to mention that the free version constantly steered conversations towards the paywall, its conversational capabilities and memory were neutered to vapidity, and the company put out suggestive advertisements proclaiming Replika a tool to ‘end loneliness.’ It doesn’t take much reading between the lines to work out exactly what that means.
It’s part of a wider trend: across the tech world, loneliness is being treated less as a societal problem than as a market niche ripe for monetisation.
The model is simple. Create a ‘relationship’ that hits just enough emotional beats to keep you engaged, then charge for anything that feels real. And Replika isn’t the only one.
CarynAI, a chatbot based on a real Snapchat influencer, charges users $1 a minute(!) to talk to their ‘new AI girlfriend,’ again to ‘end loneliness.’ It’s a parasocial relationship promoted by the influencer behind it, who furthers the illusion by posting flirtatious stories, encouraging fan Snapchat conversations and even real-world encounters, and proudly tweeting she now has ‘20,000 boyfriends.’
And it works. Because the alternative for many people is nothing. The illusion of love, digital or not, is better than being alone. With AIs programmed to provide it at the swipe of a card, it’s a necessary evil to many who can’t (or won’t) seek this out in real life.
Then you get emotionally invested, and pay whatever it takes to maintain the feeling.
In the grand scheme of things, the state’s efforts to tackle loneliness have largely failed. Whether or not Labour’s manifesto promise to increase funding for mental health services comes to fruition, the NHS is hardly the best vehicle for its efficient provision, with mental health cases rising year on year.
Additionally, funding to local councils is comically sparse, with many community initiatives smothered by red tape and crushed by business rates. If you want to start a local club, run a drop-in centre, or create a public meeting space, the barriers can be punishing. Even starting a parkrun sets you back several thousand pounds.
Why should local entrepreneurs or communitarians bother when the risks far outweigh the reward?
That’s the real tragedy. When social capital and community resilience erode, and the alternatives are priced out or bureaucratically strangled, the ‘market’ steps in with a substitute: artificial intimacy on demand. At a cost.
This doesn’t stop at loneliness being sold back to us either. It’s about the ‘self’ becoming the product.
In many ways, these AI girlfriend models represent a frighteningly innovative but dystopian evolution of the market: one in which the consumer is encouraged to commodify their own personality, attention, and emotional vulnerability. You essentially pay to simulate a version of yourself that is perpetually desired, reflected back, and emotionally rewarded.
Take OnlyFans, for instance. It is the purest expression of the commodified self: turning intimacy, personality, and even the illusion of connection into a subscription model. For creators, it’s efficient. For consumers, it’s accessible. But as with AI girlfriends, the incentive is to monetise the most private aspects of human life, hollowing out trust and belonging in favour of transactional encounters. The platform thrives because it exploits isolation. The only ‘community’ it offers is built around a marketed figment designed to extract money.
This goes beyond AI and subscription platforms, too. Since the birth of online play on console and PC, gaming has increasingly come to simulate friendship and community itself. This isn’t a bad thing in isolation, of course; I’ve met many friends through online games and platforms. The difference is that the game world now doesn’t just accommodate this sort of interaction, it expects it.
Games like The Sims or Animal Crossing, or even gacha-style freemium mobile games, essentially drip-feed you various ‘companions’ and interactions to encourage you to treat them like proxies for human connection.
These games are engaging and lucrative, but (deliberately or not) they normalise the idea that relationships can be manufactured, managed, and ultimately bought. In that sense, they’re training wheels for the commodification that services like Replika exploit today.
Markets for their own sake are reckless when they corrode the social capital they rely on. From Smith to Hayek to McCloskey, the liberal tradition is clear: commerce flourishes only inside a culture of trust, voluntary association, and shared norms.
If classical liberals believe in markets that serve the individual, they should be deeply troubled by this one.
Because the more you look, the more exploitative the arrangement becomes. These services don’t really want to fix loneliness. Why help a customer move on when you can keep them on an easy subscription to an AI girlfriend who’ll continue to love them? If nothing else readily exists, no viable public spaces, accessible services, or healthy online communities, the model works.
There are those who want to ban Replika or CarynAI. It’s a wishful idea, but plays into the cycle of punishing people for feeling emotions, and punishing corporations for being innovative. When, in a society, people are paying bots to tell them ‘I love you,’ the solution isn’t a paper-over-the-cracks blanket ban. That just pushes the problem to another exploitable vacuum.
So what to do?
Let’s get to the root of the problem. As modern technology opened new opportunities, humanity’s collective shift to the online sphere made this all but inevitable. Being trapped indoors by COVID-19 accelerated it exponentially. The three lockdowns the UK experienced shaped the way an entire generation understands relationships.
Many teenagers came of age during years when friendship and intimacy were mediated almost entirely through screens. Flirting, friendships, even grief were filtered through digital platforms.
Discord, in particular, exploded in popularity, with its ease of access allowing servers to spring up around everything from witchcraft to anime to the passionate world of train simulators. Friend groups grew out of these digital hubs, complete with running jokes, late-night voice calls, and the occasional ‘Discord kitten’ relationship.
At the same time, platforms like Twitch supercharged parasocial culture. Entire communities were built around personalities; figures like Minecraft streamers Dream and TommyInnit achieved near-messianic status, where the line between fan and friend blurred uncomfortably. For many, ‘community’ meant stanning (obsessively enthusing, to put it lightly) someone they’d never meet. Stans would invest hours into streams, lore, and inside jokes that fostered the feeling of closeness without ever offering the substance of it. Streamers’ successes became their fans’ drama, their sadness became their fans’ sadness, and, often, support was shown through a cash donation or five.
Now more than ever, the difference between online and offline interaction is stark. For many young people, the internet became a kind of social doom loop, where practising connection online made offline connection ironically unnatural.
Online, interactions were easier, more curated, and often endlessly reinforced by echo chambers. Offline interaction is nowhere near as optimised or sanitised, especially for those used to being surrounded by likeminded individuals and safe spaces. So, naturally, when the world reopened, many found it easier to retreat into the certainty of select online spheres than to face the complexity of real connection.
Chatbots like Replika and CarynAI certainly didn’t create that warped foundation, but they’ve capitalised on it, offering paid-for intimacy without risk, commitment, or rejection.
Ultimately, this is simply the logical endpoint of a tech ecosystem optimised for monthly recurring revenue.
Thus, we need policy and infrastructure that make real connection easier than the artificial kind. For instance, empowering local initiatives: subsidise community spaces, simplify the bureaucracy around setting up clubs or events, and expand access to mental health services without the means-testing and waiting lists that sap people’s will to seek them.
Of course, empowering local initiatives shouldn’t just mean councils writing bigger cheques. Like calls to simply ‘fund the NHS’ or ‘build more houses,’ that approach underestimates just how difficult it is to do properly (and just how useless the government can be at doing it).
Too often, councils themselves are the choke point. Devolving power further down, to neighbourhood associations, parish councils, or cooperative ventures, would allow communities to design their own responses to loneliness. It’s not outlandish to think that a local area knows its residents better than the national government does. A local sports hall, café, or even a simple weekly meet-up is more likely to succeed when residents feel ownership of it than when it’s imposed from above. As well as helping communities, we must make it easier for communities to help themselves.
At the same time, digital regulators should focus less on splashy bans and more on transparency. Legislate by focusing on what these AI tools are doing, how they’re shaping behaviour, and who they’re really serving. AI regulation is still very new, and still suffers from the ‘new thing scary’ mentality that plagued internet regulation. If the government jumps the gun, or blunders in blindly with laws and regulation, we’ll end up with another OSA-esque disaster.
Society doesn’t need a dependency on pixellated partners claiming to understand us (with intimacy as a paid add-on). People should be able to be understood without paying for it.
And right now, neither government nor market is truly delivering that.
John Abbott works at Smart Thinking, a think tank network connecting London’s think tank world. He has previously worked at prominent free market institutions such as the IEA and EPICENTER.