Artificial Intimacy
Do not outsource your capacity for connection
You know that feeling when you’re actually talking to someone? Sitting somewhere together, lollygagging, and suddenly two hours have passed without you noticing? When you leave, you feel lighter somehow. More like yourself. You think, why don’t I do this more often?
It’s not complicated. You text someone, you find a time that works, maybe one of you is a little tired or distracted, but then you’re there together and something clicks. You’re laughing about something completely stupid, or working through a problem that’s been bothering one of you, or just sitting in comfortable silence watching people walk by. It’s the most basic human activity, and it’s still the best one. So why are we hoping to replace it?
Companionable
Silicon Valley is deep into its AI companionship phase, designing models that remember your preferences, offer emotional support, and promise to be there whenever you need them. The marketing pitches focus on convenience and availability: your AI companion never gets tired of your stories, never has its own problems, never needs anything back from you. This sounds appealing until you realize they’re basically describing emotional blackmail.
Real intimacy is about the work of understanding someone else. It’s about showing up when it’s inconvenient, learning to read someone’s mood, and sometimes just sitting quietly with them without trying to fix anything. You might not know exactly what to say, and they might not be available right when you need them, but there’s a satisfaction in figuring out how to care for each other anyway. Take that friction away, replace it with AI, and you’re just forming a relationship with a total psychopath.
AI companions eliminate all of the important fumbling and bumbling that forces us to engage in the world, which means they eliminate intimacy itself. They’re offering the sensation of connection without any of the reciprocity that makes it real. It’s neatly packaged narcissism, at scale: you get to feel understood and cared for without having to understand or care for anyone else in return. Let the warning bells blare.
The business of loneliness
The economics of this are revealing. AI companies have built business models around emotional dependency. Users develop deep attachments to their AI companions, spending hours in conversation. Some report feeling genuinely heartbroken when their AI’s personality changes after an update. The deeper the attachment, the higher the engagement, the more sustainable the revenue stream.
This isn’t an accident. You can (easily!) build a platform that simulates relationships and charge for premium features. You can create artificial scarcity around emotional availability. You can design systems that make users feel dependent in ways that actual relationships, with all their unpredictability, never could.
What’s tragic is that the barrier to human connection isn’t actually that high. We’ve just convinced ourselves it’s not worth the effort. We’ve pathologized the ordinary friction of coordinating with other humans and decided that’s what needs to be optimized away. The negotiation of when to meet, the compromise on where to go, the patience when someone is running late or having an off day. These are features, not bugs, in the system of human connection. They’re how we learn to care about someone other than ourselves. And we’re becoming totally intolerant of these very ordinary, common, and necessary roadblocks.
Trouble ahead
We’ve panicked about technology disrupting human connection before. Television was going to destroy family conversation. Video games were going to make kids antisocial. Social media was going to replace real friendship. Each time, the fears proved both overblown and understated: overblown because people adapt and most technologies find their proper place, and understated because the changes, when they happen, are often more subtle and profound than anyone predicted.
But AI companionship feels different because it’s directly targeting the fundamental human need for connection and offering a product solution to what is essentially a social problem. We’re moving very far, very quickly, from where we need to be. What we’re really talking about is the monetization of emotional labor that used to be distributed across communities, families, and social networks. When loneliness becomes a market opportunity rather than a social problem, we stop investing in the conditions that make human connection possible. We stop building public spaces where people naturally encounter each other. We stop prioritizing policies that give people time and energy for relationships. It’s a money machine getting better, faster, stronger by the day.
Giving up
The most disturbing part is how normal this is starting to feel. We’re the frog in warm water. We’re getting used to the idea that AI might be better than humans at emotional labor because it’s more consistent, more available, and more patient. But don’t mistake this for progress. It’s actually a preview of totally giving up on each other.
The absurdity becomes clear when you step back. We’re building sophisticated systems to simulate conversation so that people don’t have to deal with the inconvenience of actual conversation. We’re treating the basic human need for connection as a problem to be solved rather than a capacity to be developed. But real care can’t be outsourced! It can be simulated and packaged as a product, but the real thing requires actual presence, actual risk, actual reciprocity. It requires showing up for people even when you don’t fully understand what they’re going through, even when they can’t return the favor, even when it just sucks to do it. This is how you become a person worth knowing.
Resist
We don’t have to accept this as inevitable. The platforms are doing what platforms do: monetizing whatever human beings will give them. But that doesn’t mean we have to give up our best, most companionable habits. We can resist not with outrage alone, but with practice — by making time for real conversations, showing up for each other, and rebuilding the tiny infrastructures of friendship and care that no product can sell back to us.
If AI companionship is the business of loneliness, then the antidote is embarrassingly simple and unscalable: don’t outsource your tenderness. Text the friend. Walk to the café. Sit on the bench. Make eye contact. Let the friction stay. Remember that real connection takes work, and it’s honorable work to undertake.