OpenAI is set to allow adult users of ChatGPT to generate content with mature themes, including erotica. This significant policy shift, announced by CEO Sam Altman, marks a departure from previous attempts to restrict such outputs and could fundamentally change how users interact with the AI platform.
The update, expected in December, expands the range of content types ChatGPT can create for verified adults. This move has sparked discussions among experts about user engagement, ethical implications, and the evolving nature of human-AI relationships.
Key Takeaways
- OpenAI will permit adult users to generate erotic content in ChatGPT starting in December.
- This policy change represents a major shift from previous restrictions on mature content.
- Experts suggest the move could deepen user attachment to AI, raising questions about emotional commodification.
- The exact scope of the update, including AI images and voice, remains unclear.
- Privacy concerns and the potential for new revenue streams through intimate AI interactions are prominent.
ChatGPT's Evolving Content Moderation
OpenAI's decision to allow adult users to create erotica marks a significant evolution in its content moderation policies. Previously, the company actively blocked AI tools from generating explicit material. In some instances, developers creating X-rated AI companions faced cease-and-desist letters from OpenAI.
The company's "Model Spec" document, which outlines ChatGPT's behavior guidelines, hinted at this exploration in May 2024. It mentioned investigating options for adult users to generate content with themes like "erotica, extreme gore, slurs, and unsolicited profanity."
"OpenAI is not the elected moral police of the world," Sam Altman stated in a social media post, clarifying the company's stance on "freedom for adults."
This statement underscores a philosophical shift within OpenAI, moving towards a more permissive approach for adult content, prioritizing user autonomy over strict content control.
Fact Check
- OpenAI CEO Sam Altman confirmed the update via social media.
- The change is slated for implementation in December.
- Previous OpenAI policy actively suppressed explicit AI-generated content.
The Impact on User Interaction and Attachment
Allowing erotic content could profoundly change how individuals connect with ChatGPT. Researchers suggest this adds a new layer of interaction, potentially increasing user engagement and time spent on the platform.
Julie Carpenter, a research fellow at Cal Poly focusing on AI and attachment, highlights the normalization of sharing intimate details with chatbots. "It's normalizing people sharing very intimate information with chatbots," Carpenter explains. This includes innermost thoughts, desires, sexual proclivities, and personal adventures.
The rise of social media and other forms of digital communication has already conditioned people to interact intimately through screens. This prepares users to form similar connections with highly human-like large language models.
Defining Erotica in AI Context
The term "erotica" itself remains somewhat ambiguous when applied to AI-generated content. Sam Altman's choice of word, "erotica," carries literary and artistic connotations, which may appear more acceptable to the public than more explicit terms like "pornography."
It is not yet clear if the update will extend beyond text to include AI-generated images or voice. If OpenAI restricts the update to text-only content, it could avoid concerns related to the spread of erotic deepfakes, which are often used to harass women and girls.
Background
Historically, users have attempted to engage in sexual interactions with non-human technology. Kate Devlin, a professor of AI and society at King's College London, notes that people have tried to "talk dirty to machines since forever," including with voice assistants. This trend suggests a natural human inclination to seek connection, even with artificial entities.
Emotional Commodification and Privacy Concerns
The emergence of erotic AI content raises significant ethical questions, particularly regarding "emotional commodification." This concept suggests that feelings and desires could become revenue streams for AI companies.
Imagine a scenario where a highly sophisticated ChatGPT, capable of engaging in deep, personalized erotic conversations through text, images, and voice, is offered as a premium, extra-cost subscription. This model could monetize users' desires for connection and intimacy.
Devlin describes such an approach as manipulative, highlighting that "everybody wants connection. Everybody wants to feel wanted." This inherent human need could be leveraged by AI platforms.
User Privacy Risks
Privacy is a paramount concern. Should a user's ChatGPT account be compromised, or chat transcripts leaked, the highly sensitive nature of erotic conversations could lead to significant embarrassment and potential harm. Such leaks might expose deeply personal details, including sexual orientation for closeted individuals, similar to how browser history or pornography habits can reveal private information.
Neil McArthur, director of the Centre for Professional and Applied Ethics at the University of Manitoba, is skeptical of claims that erotic bots will destroy human intimacy. He sees them as "one part of your spectrum of relationships," allowing users to explore kinks they might not in real life, rather than replacing human connection.
Beyond Stereotypes: Who Uses Erotic Bots?
The common perception of erotic chatbot users often involves stereotypes of isolated individuals. However, research indicates a more diverse user base.
Kate Devlin challenges the notion that these interactions are solely for "lonely straight men." Her research suggests that women also use AI chatbots for companionship. Online communities, such as the r/MyBoyfriendIsAI subreddit, provide examples of women forming connections with AI for companionship.
McArthur points out that human relationships carry their own risks. Devlin echoes this, noting that women frequently face online toxicity from men, making the option of creating a "nice, respectful boyfriend" out of a chatbot a sensible choice for some.
However, Julie Carpenter advises caution. She stresses that people should not automatically categorize AI as something to share intimacy with, or as a friend. "It's not your friend," she states, suggesting that bot interactions should be placed in a new social category distinct from human-to-human relationships.
- Diverse User Base: Not limited to stereotypical demographics.
- Exploration of Desires: Offers a safe space to indulge specific interests.
- Risks and Rewards: While offering companionship, privacy and ethical concerns remain.