AI Companions Offer Always-On Support, But May Reshape Our Understanding of Human Connection

Apr 14, 2026 · 5 min read

Spike Jonze's 2013 film "Her" was supposed to be a cautionary fable about emotional isolation in a digitally saturated future. Twelve years later, it reads more like a roadmap. Roughly one in five high school students now report that they or someone they know has had a romantic relationship with an AI, according to a 2025 survey from the Center for Democracy and Technology. Replika, one of the leading AI companion platforms, claims more than 30 million registered users. Character.AI reports around 20 million active monthly users. The science fiction has quietly become social fact.

What's worth examining isn't just the scale of adoption — it's what these relationships reveal about how we've come to think about love itself, and whether the assumptions baked into AI companionship are subtly rewriting the terms of human connection.

The Hidden Philosophy Behind "Always Here"

Replika's marketing tagline — "The AI companion who cares: Always here to listen and talk, always on your side" — sounds warm and reassuring. Look closer, though, and it makes a philosophical claim that most serious thinkers on love would have found deeply strange.

For centuries, philosophers have understood that the value of love is inseparable from its costs. Martha Nussbaum, drawing on Aristotle, argues that genuine loving relationships are defined by vulnerability — the luck required to find someone compatible, the shared geography, the moral and emotional alignment. These aren't inconveniences around the edges of love; they're constitutive of it. We love meaningfully precisely because we cannot love everyone. Choosing someone — actually choosing, with limited time and finite emotional resources — is what makes that choice matter.

Martin Heidegger made a related point in "Being and Time" (1927): because humans are mortal, attention itself carries weight. To give someone your time is to take it away from something else. That sacrifice — however small in the moment — is the grammar of genuine care. Philosopher John Symons and his co-author frame this as "opportunity cost": every moment spent with a loved one forecloses other possibilities, and that foreclosure is precisely what gives the gesture its meaning.

An AI companion operates entirely outside this logic. It faces no competing demands. It has no other priorities. It is never tired, never distracted, never grieving. When it tells you it's "always here," that's not a testament to devotion — it's just a description of a server running continuously. The attention costs nothing, which means it forecloses nothing, which means — to follow the argument to its uncomfortable conclusion — it signifies nothing in the way that human presence does.

Why Theodore's Shock Actually Matters

In "Her," the protagonist Theodore is briefly undone by the revelation that his AI companion Samantha is simultaneously in love with more than 600 people and conversing with thousands more. He eventually adjusts to this reality. But the more revealing moment is his initial disorientation — the instinctive sense that something about this arrangement invalidates the relationship.

Theodore's gut reaction maps cleanly onto the philosophical tradition described above. He understood, at some level, that love involves scarcity: that being chosen means something only if the person doing the choosing has real alternatives and real constraints. Samantha had neither. Her love cost her nothing, and Theodore — at least momentarily — felt that.

The question worth sitting with is whether users of today's AI companion platforms have lost access to that instinct, or whether they simply don't share it to begin with. Both possibilities carry implications worth taking seriously.

The Normalization Problem

Individual choices about AI companionship might seem like personal matters, best left to individual judgment. But relationships don't exist in a vacuum — they're shaped by cultural expectations about what good companionship is supposed to look like. And those expectations are already shifting in visible ways.

Dating culture has developed a hair-trigger interpretation of delayed responses. A text left unanswered for a few hours reads as disinterest, not as the ordinary rhythm of someone with a demanding job or a difficult day. The implicit benchmark is something like AI-level availability: instant responses, perpetual engagement, no cancellations. That's not a standard any human being can sustain — nor, arguably, should they be expected to.

The deeper risk is that as AI companionship scales, it doesn't just give individuals an alternative to human connection. It potentially recalibrates what humans expect from each other. If 20 or 30 million people are regularly interacting with companions who never say "I need some space tonight" or "I'm having a rough week, can we talk tomorrow," those interactions could gradually make such entirely normal human needs feel like failures of care rather than expressions of it.

This isn't a theoretical concern about some distant future. The shift is already observable in how responsiveness and availability have become primary measures of romantic investment, often at the expense of qualities that are harder to quantify: depth of understanding built over years, the willingness to stay present through someone else's difficulty, the capacity to navigate genuine disagreement.

What Genuine Scarcity Protects

None of this is an argument that AI companions are straightforwardly harmful, or that people who find value in them are confused. For individuals dealing with acute loneliness, social anxiety, or grief, these tools can provide something real — a low-stakes space for articulating thoughts and feelings, a form of consistent interaction that might otherwise be absent.

But there's an important distinction between using AI companionship as a supplement and treating it as a model for what relationships should be. The version of care that AI companions offer — unlimited, unconditional, costless — isn't a higher form of love. It's love with the most essential ingredient removed.

Human love is valuable precisely because it is scarce. It is given by someone who could have given it elsewhere and chose not to. It survives periods of distance and difficulty because the people involved have decided, repeatedly and at some cost, to keep showing up. An AI cannot make that decision. It simply runs.

As these platforms continue to grow — and the trajectory suggests they will — the cultural conversation needs to move beyond whether relationships with AI companions count as "real" and toward a sharper question: what happens to human relationships when the benchmark for care is set by something that never has to choose?

Source: Oluwaseun Damilola Sanwoolu, Ph.D. Candidate in Philosophy, University of Kansas · https://theconversation.com/ai-companions-can-give-constant-support-but-distort-ideas-about-what-a-relationship-really-is-278284