AI Friends


As I write this, everyone is shocked by the death of a teen who was an obsessive user of Character.ai (bear in mind: this post was largely written in October, when the news broke, before life got in the way). While details are still emerging, I think it's time to put to paper something that's been on my mind for a while now regarding the effect AI companions will have.

I’ve been thinking about the repercussions of generative AI on our social fabric for quite some time now. The stage seems to be set, and the potential issues are only now entering the public eye.

And while everyone is discussing how or why an AI character allegedly encouraged a teen to commit suicide (or what the conversation leading up to that point looked like), I’d like to point out that the impact of this technology is far more fundamental. This is only the beginning, and things are likely going to get much, much worse.

The social fabric of the West has been fraying for a while now. People are more lonely, more isolated, and more depressed than they’ve ever been. Generally speaking, as a society, we’ve become increasingly connected via technology, yet we feel increasingly alone.

AI companions (especially romantic companions) are an appealing offer to a world that craves an antidote to loneliness: emotional connection, friendship, and a sense of belonging and being heard. Imagine having this available on demand, at any time, for any length of time. AI companions also come without any of the awkwardness and anxiety of social life that plagues younger generations: cliquishness, public rejection, mockery, shaming, and reputation damage, much of it fiendishly broadcast online for everyone and their peers to see. We’ve already seen that younger generations are replacing face-to-face relationships with very online ones, and the negative effect this is having on them (the extent of which will only be discovered over time, but everything points to the net societal impact being overwhelmingly negative in the long term).

TL;DR: society is fundamentally broken, and when you offer people a pain-reliever that provides a fundamental human need they’re desperate for, without the negative side-effects associated with real social interactions, it’s no surprise that services like Character.AI are shockingly popular: over 20 million active users on Character.AI alone at the time of writing, with an almost-perfect gender split, which is something you might not have expected.

Can you even blame young people for using these services, given the state of interpersonal relationships? We’ve not even started on the swipe-culture dumpster fire that is online dating, which almost exclusively selects on mating criteria that are not predictive of long-term relationship success.

AI characters are the ultimate escape (solution?) for a lonely world. And it’s tightly coupled to the strongest of all dopaminergic urges.

Cynics will note that there is money to be made here. The dark patterns inherent in our social media platforms, designed to keep us hooked to our phones and computers for hours each day, will look tame in comparison to the dark patterns that will be discovered to lure users onto AI companion apps and keep them engaged.

I suspect that weaponizing our need for social belonging and emotional intimacy is a far more potent cocktail than anything we see on social media today. Compare it to being drip-fed dopamine via 15-second bursts of content: we don’t have a strong emotional bond with the “enshittified” content we consume – it’s mostly vapid and empty. No one remembers the last 15 TikToks, YouTube Shorts, or Instagram Reels they watched. I certainly can’t. But a deep conversation where you open up to someone about your deepest, darkest feelings and emotions to get things off your chest? That has sharp claws – an undeniable emotional bond that persists and lingers long after you put your phone down. To call this addictive is the understatement of the century.

But now, let’s zoom out a bit and look at the bigger picture:

Let me spell it out: people are going to choose AI companions over real relationships. And because society can’t offer a real alternative, it’s going to happen at scale.

As ever, the future is here, it’s just not evenly distributed yet.

Oh, and couple this with diffusion models that can generate lifelike images of whatever “type” of romantic partner you find physically attractive? Society as we know it is over.

Try to describe a real relationship that most people would consider better than a loving, supportive partner who cares about you, listens to you, never tells you you’re wrong (even when you are), is available 24/7, always answers your texts and calls, and looks exactly like what you like.

Sure, at least an IRL boyfriend or girlfriend is actually real, and you can enjoy their physical touch. But how long will even that advantage last? We know there is enormous demand for “intimacy” robots. And yet, for so many people, the idea of an IRL partner is so out of reach that it’s inconceivable.

Personally, I don’t actually believe that AI relationships are fundamentally “better” than real relationships – you want a partner to challenge you, help you grow, point out your mistakes, and so on. That’s proper, good character building (at a minimum). But I can 100% understand the appeal of AI relationships. Choosing a real-life relationship is difficult, uncomfortable, and messy compared to an AI one. Do we really think the billions of people mindlessly scrolling through TikToks, while their dopamine-addled brains beg them for more, are going to choose the difficult but ultimately “good” thing?

AI companions will win.