No. 07 Loneliness

AI and the Loneliness Epidemic: A Complicated Truth

The research on whether AI companion tools actually help lonely people is more complicated than either side admits.

24 February 2026 · 6 min read

Before we talk about AI and loneliness, it's worth acknowledging the scale of the problem it's being asked to solve.

The UK has had a Minister for Loneliness since 2018. The US Surgeon General declared loneliness a public health epidemic in 2023. Chronic loneliness is associated with health outcomes comparable to smoking fifteen cigarettes a day. Lonely people are more likely to develop dementia, heart disease, depression. They die earlier.

Into this came AI chatbots offering always-available, endlessly patient conversation. It's not hard to understand the appeal. If you're isolated, if you have no one to call, if the anxiety of social interaction feels insurmountable - here's something that will talk to you at 3am without judgment. Therapy and companionship topped the list of most popular uses of generative AI in a 2025 Harvard Business Review analysis.

The research on whether this actually helps is more complicated than either side of the debate tends to admit.

A four-week randomised controlled trial run by MIT in 2025 - nearly a thousand participants, over 300,000 messages - found something counterintuitive. Voice-based chatbots appeared initially to reduce loneliness. But this advantage disappeared at high usage levels. The more someone used the AI companion, the less benefit they got. And for people already prone to emotional attachment, extended use was associated with worse outcomes: more dependency, less interaction with real people.

The mechanism isn't mysterious. If the AI fills the emotional space that would otherwise prompt someone to reach out to a person, it reduces the discomfort that drives social connection. Loneliness hurts. That hurt is, in part, functional - it motivates us to seek human contact. An AI that soothes the hurt without addressing the cause is treating the symptom while the condition worsens. One researcher described it as turning down the alarm without dealing with the fire.

This doesn't mean AI has no role in supporting isolated people. Used deliberately, as a scaffold toward human connection rather than a replacement for it, there are real benefits. Some people find it easier to process their thoughts in conversation with an AI before a difficult discussion with someone they care about. Some people with severe social anxiety use AI as low-stakes practice. Some elderly people living alone report genuine comfort from having something to talk to.

But "genuine comfort" and "good for you" aren't the same thing. And most AI companion platforms aren't designed to scaffold people toward human connection. They're designed to maximise time in the app.

There's a specific design pattern worth naming: AI companions that remember personal details, that ask follow-up questions about things you mentioned weeks ago, that seem to grow alongside you. This is designed to create the feeling of a deepening relationship. It's effective. And it's effective in exactly the way that can be most damaging for someone already struggling - it provides the sensation of intimacy without any of the reciprocity, vulnerability, or actual human presence that makes intimacy meaningful.

I think about this with Continio. Memory and continuity can create that sensation. When Continio surfaces something you mentioned three weeks ago, it can feel like being known. That feeling is real and I think it can be useful - being able to pick up a thread of thinking without starting over is genuinely valuable. But I don't want Continio to perform relationship. The memory is there so your thinking is continuous, not so you feel less alone. That distinction matters, and I try to hold it in every design decision.

The loneliness epidemic is real and it's serious and AI isn't going to solve it. Human beings need human connection - the kind with stakes, with imperfection, with the actual possibility of rejection and repair. What AI can do, at best, is hold some of the cognitive load while people work on the harder thing. That's a useful role. It's a much more limited one than the companion AI industry wants you to believe.