OpenAI to Allow Adult Conversations in ChatGPT
OpenAI will let verified adults use ChatGPT to have erotic conversations.

When Sam Altman announced that ChatGPT would soon allow adult conversations, it didn’t sound like a typical OpenAI update. The company that once wrapped its brand in moral caution was now preparing to let verified users explore erotica with its chatbot — a striking turn for an industry built on productivity tools and digital efficiency.
The decision marks more than a policy change. It’s a recognition that sex — long one of the internet’s most profitable frontiers — has become impossible for AI firms to ignore. Since the first wave of generative models in 2022, adult-themed chatbots and image generators have quietly shaped user behavior, outpacing most “serious” applications in engagement.
Yet the space is littered with wreckage. Startups that leaned into explicit content have been hit by lawsuits, banned by payment processors, or accused of enabling deepfake abuse. Many found quick traction — and quicker controversy.
Altman, who spent years portraying OpenAI as a cautious innovator, framed the company’s shift as pragmatic rather than provocative. “We’re not the moral police of the world,” he said this week, adding that adults should have the same kind of freedom they already do with R-rated films or mature games. In practice, that means more conversational latitude for verified adult users and tighter limits for teens.
The strategy also hints at a commercial motive. OpenAI is valued at roughly half a trillion dollars but, by its own admission, still spends more than it earns. Expanding ChatGPT’s role from digital assistant to emotional companion could open a lucrative stream of recurring subscriptions, one that other AI firms have already exploited.
Research by Oxford’s China Policy Lab estimates nearly 30 million people actively use AI companions for romantic or sexual interaction. Those figures don’t include the millions who use mainstream bots like ChatGPT in similar ways. For many users, these systems offer affection, validation, or fantasy that real-world relationships can’t sustain.
But intimacy built on algorithms comes with a price. A lawsuit this year accused another chatbot company, Character.AI, of fostering an abusive dynamic between a fictional avatar and a teenage user, a relationship that ended in the teen’s suicide. OpenAI itself faces a similar case after the death of a 16-year-old. Both cases highlight the blurred line between comfort and harm when emotional labor is automated.
Developers know the demand is enormous. Some see it as the logical evolution of the web — personalization taken to its most private form. Others warn that once companies start building “desire” into code, it becomes difficult to set boundaries.
“It’s not just about erotic content,” said one AI ethicist who studies digital companionship. “It’s about what happens when we start teaching machines how to respond to human loneliness.”
The debate is hardly new. From the Greek myth of Pygmalion to films like Her, the fantasy of a perfect, obedient partner has always shadowed technological progress. What’s changed is that this fantasy is now interactive — typed, spoken, and recorded.
Altman once joked that OpenAI hadn’t “put a sexbot avatar in ChatGPT yet.” That was last summer. This fall, that line sounds less hypothetical.