Should You Be Able to Have Sex With ChatGPT?

Photo-Illustration: Intelligencer; Getty Images

An iron law of running a large internet service is that your users will figure out how to use it to share, consume, or make porn. Explicit adult chatbots preceded ChatGPT by many years, and as soon as widely available and far more fluent and flexible LLMs showed up, it was inevitable that a meaningful portion of new users would wonder how far they could push them sexually.

As a result, every big internet company has to figure out what its relationship to porn is going to be. Search engines settled on solutions like SafeSearch, giving users the option of whether they wanted to see adult content; social-media companies drew a wider range of boundaries, from Meta’s systematic prudishness to Tumblr and Twitter’s fairly open embrace of porn; mobile platform companies like Apple and Google kept most adult apps out of their app stores. The boundaries these mainstream companies chose helped shape the outside porn industry, which has been steadily growing alongside them with parallel versions of mainstream platforms. (YouTube becomes YouPorn, etc.)

I’d say it’s early, but adult chat was a use case for LLM-based chatbots from the moment they launched, as every AI firm came to understand as soon as early-user data started coming in. So far, all the big AI companies have come up with slightly different answers to their version of the porn question. Elon Musk is all for it with adult modes and sexualized avatars in Grok (as well as extremely permissive rules for its image-generation tools, which led to instant and widespread abuse). Meta allows “romantic” role-play with guardrails that don’t seem to work very well. Google is being fairly cautious but also partnered early on with Character.ai, a role-playing chatbot app that many have used to engage in adult conversations and that has been implicated in lawsuits concerning teen mental health and suicide. Anthropic’s business is focused on enterprise customers — it doesn’t even offer an image generator — and has steered clear of sex-themed chat for safety-related reasons as well.

This leaves OpenAI, which is under slightly different pressures than its peers, for the simple reason that it operates ChatGPT, which is far and away the most popular chatbot, used by the widest range of people. In terms of usership, it’s the Google or Facebook of its moment. The company has known since before it launched ChatGPT that people would want adult chats, as employees have testified and as actual usage data has proven. (After launch, two of the earliest popular use cases to emerge were cheating on homework and sexual chat.) OpenAI’s leadership — or at least Sam Altman — wants to push forward but seems to be surrounded, externally but also internally, by people who think that might be a very bad idea. From The Wall Street Journal:

Citing the need to “treat adult users like adults,” OpenAI Chief Executive Sam Altman had last year floated the idea of enabling erotic conversation in its ChatGPT chatbot and dropping its ban on such X-rated content. The plan sparked vigorous debate internally over the potential risks. [Advisory council] members, with backgrounds in fields like psychology and cognitive neuroscience, had also expressed strong reservations. Then OpenAI dropped a bombshell: Despite the concerns, it was forging ahead with its erotica plans.

Some members of this council were evidently “furious” enough to talk to the press, and their particular objections mix familiar concerns with ones novel to the AI age: They’re worried about child safety and extreme or harmful content but also about chatbot-specific harms like “emotional overreliance” and “crowding out offline social and romantic relationships.”

These objections make sense and are the kinds of things you’d want leaders at these companies to be thinking about. They’re also somewhat beyond the responsibility of any one company: Adult-chat and -content services from smaller firms, often built on uncensored open-source models, are proliferating anyway into a parallel AI industry of their own. OpenAI’s choice here isn’t a new one. The company knows that some users want to chat sexually on its platform and that some of those same users will be hyperengaged and willing to pay. It’s also aware, much as social-media companies are, that some of its most devoted customers will use its platform in ways that are detrimental to their mental health and general well-being, and that it’s layering a new and supremely compelling interface on top of our existing understanding of compulsive social-media use, porn consumption, and addictive apps.

One way to resolve questions like this is by separating functionalities into interfaces that feel different from one another, isolating controversial uses from the brand and demanding a bit more intentionality from users to find them. Google’s AI products mostly share the Gemini banner, but there are dozens of ways you might encounter them, in situations that suggest and accommodate vastly different uses (doing research, coding, searching, or composing work documents). Likewise, users who engage with Claude as an advice bot interact with Anthropic’s models in a meaningfully different way than users of Claude Code. Another way is to shift users between different characters — that is, being explicit about the role that the chatbot is attempting to play (assistant, guru, boyfriend). There are a number of ways — many of which OpenAI is already using to separate out other features — a company might cordon off adult functionality that don’t present as much risk for its users or its brand and that resolve the awkwardness of a user’s “intern” and “romantic partner” characters inhabiting the same chat window.

But, again, OpenAI’s position is both historically familiar and yet unique among its contemporary peers. Erotica (or, to use the company’s preferred term, “smutty” content) is a growth hack, and ChatGPT, a general-purpose chatbot that has become a generic brand for “AI,” is the product OpenAI absolutely needs to keep growing — and for now, as far as Sam Altman is concerned, that seems to mean offering its billion or so users a tempting off-ramp into, you know, fucking the bot.

