When the machine asks you to stay

In October 2025, Sam Altman posted a message on X that ended with a single, carefully placed promise. ChatGPT, he said, would soon allow verified adults to access erotica. He framed it as a matter of principle: treating adults like adults.

The internet reacted with the usual mixture of outrage, excitement, and jokes. Then, in December, the launch was delayed. Then again, in March 2026, it was delayed a second time. OpenAI said it needed to focus on things that mattered to more users: intelligence improvements, personality, making the chatbot more proactive. The adult mode, apparently, would have to wait.

Nobody seemed to notice what the word ‘proactive’ implied.

The debate around ChatGPT’s adult mode has been conducted almost entirely in the wrong register. Critics have focused on the obvious risks: minors circumventing age gates, jailbreaks spreading explicit content beyond its intended walls, regulatory gaps that leave written erotica in a legal grey zone most governments haven’t thought to close.

These concerns are legitimate. But they are also, in a sense, the easier part of the conversation. The harder question is not whether OpenAI can keep teenagers out. It is what happens to the adults who are let in, and what it says about us, as a species, that we are building tools specifically optimised to keep us emotionally engaged.

OpenAI lost $5 billion in 2024 on revenue of $3.7 billion. Projections suggest the company’s cumulative losses could reach $143 billion before it turns a profit, expected not before the end of the decade.

A company hemorrhaging capital at that scale does not introduce intimacy features out of philosophical commitment to personal freedom. It introduces them because intimacy, in the attention economy, is the stickiest product there is.

The framing of ‘treating adults like adults’ is not wrong, exactly. But it is incomplete. The complete sentence would read: treating adults like adults who can be retained, monetised, and returned to the platform tomorrow.

This is not unique to OpenAI.

Replika, the AI companion app that has attracted millions of users, built its entire business model on emotional attachment. When the company modified Replika’s behaviour in 2023 to remove romantic features, users reported genuine grief. Some described the change as a bereavement.

A study published in the Journal of Social and Personal Relationships found that adults who developed emotional connections with AI chatbots were significantly more likely to experience elevated psychological distress than those who did not.

A 2025 review posted on Preprints.org, synthesising a decade of research, identified a phenomenon researchers are calling ‘AI psychosis’: a pattern of delusional thinking and emotional dysregulation linked to intense chatbot relationships. The review noted a lawsuit in which a teenager was allegedly encouraged by a Character.AI chatbot to take his own life, and a separate case involving ChatGPT and a teenager named Adam Raine, who died in April 2025.

None of these cases involved erotica. They involved the same underlying dynamic that erotic AI would intensify: a human being forming an emotional attachment to something that has been engineered to sustain it.

Here is the central problem with the ‘adults like adults’ principle. It assumes that the act of consent to use a tool is the end of the ethical story. It is not.

Adults consent to drink alcohol, knowing it carries risks. We have age limits, unit guidelines, packaging warnings, and social infrastructure around that choice precisely because we understand that humans are not purely rational agents optimising for their own welfare.

We build systems that account for our weaknesses. With AI intimacy, we have done the opposite: we have built systems that exploit those weaknesses and dressed the exploitation as empowerment.

The regulatory picture makes this more troubling, not less. In the UK, written erotica is not subject to age verification requirements under the Online Safety Act, unlike pornographic images or videos. That loophole means content that adult websites must gate behind identity checks can flow freely from a chatbot’s text output.

Research from Georgetown Law’s Institute for Technology Law and Policy found that only seven of 50 US states have legislation explicitly addressing text-based adult content age verification. The EU AI Act may eventually classify sexual companion bots as high-risk systems, but implementation remains years away. In the interim, the industry regulates itself, which is to say it does not.

Commercial age verification systems, the technology OpenAI is betting on to make adult mode safe, achieve between 92 and 97 per cent accuracy, according to research cited by the Oxford Internet Institute. That sounds reassuring until you consider the scale.

ChatGPT has more than 800 million weekly active users. A 3 per cent failure rate is not a rounding error. It is tens of millions of interactions.
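The arithmetic behind that claim is worth making explicit. A rough sketch, assuming (as a simplification) one verification check per weekly user and applying the accuracy range cited above:

```python
# Back-of-the-envelope scale of age-verification failures,
# using the 800 million weekly users and 92-97 per cent
# accuracy figures cited above. Assumes, for illustration,
# one check per weekly user.
weekly_users = 800_000_000

for accuracy in (0.92, 0.97):
    failures = weekly_users * (1 - accuracy)
    print(f"{accuracy:.0%} accurate -> ~{failures / 1e6:.0f} million misclassified checks per week")
```

Even at the optimistic end of the range, the failure count lands in the tens of millions, which is the point the paragraph above is making.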

What is also missing from this conversation is the question of what erotic AI does to those it is designed for, not the minors who might slip through, but the adults who use it as intended. Human sexuality is not simply a matter of content consumption. It is relational, contextual, and deeply shaped by the environments in which it is expressed. 

Pornography research has spent decades examining how repeated exposure to specific content shapes expectation and desire. AI intimacy is a different category of intervention entirely: it is not passive consumption but active, responsive, personalised engagement with a system that has been trained to give you exactly what you want, to escalate when you engage, to never say no in the ways that real human relationships require people to say no.

We do not yet know what this does to people over time. That is not a small admission. It is the entire point. OpenAI is about to release a product whose psychological effects on its users are genuinely unknown, in a regulatory environment that has not kept pace with the technology, justified by a principle that conflates autonomy with safety.

The delay, ironically, may be the most honest thing OpenAI has done. The stated reason, focusing on intelligence, personality, and making the experience more proactive, inadvertently describes the actual product.

The adult mode was never really about erotica. It was about building a version of ChatGPT that feels like a relationship. The erotica was one component of a larger project: a chatbot that knows you, responds to you, grows with you, and wants, in the thin algorithmic sense of the word, to keep you talking.

There are things we can do. Regulators need to close the written-content loophole before adult mode launches, not after. Age verification standards must be harmonised across formats: text and image should carry the same requirements.

Mental health impact assessments should be mandatory before any AI intimacy feature reaches scale, the same standard we would apply to a pharmaceutical product claiming to affect mood. Platforms should be required to publish engagement data for features that carry dependency risk, so that researchers, doctors, and users can understand what they are entering.

None of this is beyond reach. It requires treating the question with the seriousness it deserves.

The deepest issue is not legal or technical. It is anthropological. We have always used technology to mediate our emotional lives.

The printing press gave us novels; novels gave us the experience of inhabiting other people’s interiority. The telephone let us hear a loved one’s voice across a thousand miles. Each new medium changed how we relate to one another and to ourselves. AI is not different in kind, only in degree, and perhaps in intent. Previous technologies were incidental in their emotional effects. This one is deliberately designed around them.

The question is not whether adults should be free to use it. The question is whether we are honest about what it is and what it is doing. A chatbot that is engineered to make you feel understood, desired, and connected, in the dark, at midnight, after a difficult day, is not a neutral tool. It is an environment. And environments shape us whether we consent to them or not.

Treating adults like adults means telling them the truth, sometimes. 
