This article has been unusually difficult to frame. Somehow I kept circling back to the same theme: AI is reshaping how human relationships work. But this piece was supposed to be about AI erotica. Why did every angle lead me back to intimacy?
I tried different openers. Sam Altman’s announcement that ChatGPT would launch AI erotica features in December 2025 seemed like the obvious hook. Or the Cybrothel in Berlin, a self-described “doll brothel” offering customised encounters with sex dolls, marketed as technologised, on-demand intimacy enhanced by AI and VR.
I considered writing about how AI erotica harms women, distorts male desire, reinforces unrealistic expectations. But this critique already exists for traditional pornography. What, then, is truly new here?
What actually distinguishes AI erotica from the porn industry?
The answer is, in fact, quite simple: the difference between observing and revealing oneself. Porn is passive consumption. AI erotica is participatory. You are not merely watching; you are interacting. You speak your desires, articulate what you cannot say elsewhere, disclose fantasies and vulnerabilities. And the AI listens. It reads your needs and responds – without resistance, without judgment.
I remembered accounts by sex workers who described something unexpected: many of their clients didn’t come for sex. They came to be understood. What they were seeking wasn’t desire, but intimacy.
And suddenly the parallel became clear. AI erotica offers the same promise: no judgment, no limits, no risk of rejection. Just the perfect mirror of your needs. But cheaper, infinitely available, and without the inconvenience of negotiating another person’s boundaries. No need to hold space for someone else’s humanity.
And that is when the idea finally settled:
What is being sold is not sexual content. It is intimacy without reciprocity – connection without the inconvenience of another person.
That is the moment I understood my initial confusion.
AI erotica is not about sex.
It is about markets discovering a new resource: human vulnerability.
The Rise of Digital Intimacy
To understand how quickly this domain has evolved, consider one of the earliest warning signs.
In a recent New York Times article, Steven Adler described a crisis OpenAI faced in 2021: the unexpected emergence of erotic use cases. Users were repeatedly steering early AI systems into sexual scenarios long before developers had considered such interactions possible or relevant.1
The Washington Post reached a similar conclusion in 2024. In an analysis of the WildChat database, containing more than 200,000 chatbot conversations, journalists found that over 7 percent involved sexual or intimate themes, including roleplay and kink. What had appeared in 2021 as an odd anomaly had, just a few years later, become a statistically visible pattern of use.2
By 2025, algorithmic intimacy is no longer an unexpected behaviour; it has become a deliberate business strategy. Within a remarkably short time, it has grown into a multi-billion-dollar industry.
OpenAI announced that ChatGPT will begin permitting “erotica for verified adults” in December 2025. Earlier this year, Elon Musk’s xAI took an even bolder step with its “Spicy Mode”, which not only generates explicit imagery but can also produce short erotic video clips and synchronised sound. What was once unthinkable for mainstream AI companies is now being rolled out as a premium feature.
The specialised AI sex-tech sector itself is expanding rapidly. Valued at $2.33 billion in 2024, it is expected to more than double to $5.43 billion by 2033. Platforms such as Replika have already demonstrated the financial viability of artificial intimacy, generating tens of millions in annual revenue from a small fraction of paying users.3 An estimated 29 million people are already actively using chatbots designed specifically for romantic or sexual bonding, a figure that does not even include the vast number who use general-purpose AIs for similar forms of erotic or emotional interaction.4
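To put those figures in perspective, here is a minimal sketch of the growth rate they imply, assuming only the cited estimates ($2.33 billion in 2024, $5.43 billion in 2033) and nothing about the forecast’s methodology:

```python
# Implied compound annual growth rate (CAGR) for the AI sex-tech market,
# using the figures cited above: $2.33B (2024) -> $5.43B (2033).
# Illustrative only; the forecast itself comes from Market Growth Reports.

start_value = 2.33   # market size in billions of USD, 2024
end_value = 5.43     # projected market size in billions of USD, 2033
years = 2033 - 2024  # nine-year forecast horizon

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 9-10% per year
```

In other words, the projection amounts to the market compounding at close to ten percent a year for almost a decade.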
And the shift is no longer limited to the digital realm; it is moving into technologies that simulate physical interaction. The “Cybrothel” in Berlin, for instance – a self-described “sex-doll brothel” pairing AI-generated personalities with hyper-realistic dolls – is an early illustration of this turn. It gives physical form to algorithmic intimacy, offering a compliant, customised partner whose personality, emotional responses, and body can all be engineered to specification.5
These developments represent a rapidly growing frontier of the digital economy. What is striking is not that this market exists, but how it emerged: first as behaviour developers tried to suppress, then as something tolerated, and finally as a revenue model to be embraced.
The Forces Driving the Rise of AI Erotica
Despite initial corporate skepticism over reputational and societal risks, early user behaviour made clear that private and adult conversations would become a high-demand use case for AI (O’Brien, 2025). And when erotica emerged as a high-engagement application – capable of stabilising attention, increasing time-in-app, and justifying premium subscriptions – the initial reluctance quickly gave way to a strategic pivot.6
While the pivot to erotica is often framed as a simple case of “sex sells”, the underlying logic is more revealing. Erotic content is a gateway to something far more potent: intimate engagement.
Intimacy – and the vulnerability it demands – is an even more powerful monetisation strategy than attention.
Emotional disclosure generates richer, more granular data than scrolling ever could. It deepens retention, creates dependency, and aligns perfectly with subscription-based revenue structures.7 As large language models became more capable of sustaining emotionally charged conversations, it became increasingly apparent that intimacy offered a deeper and more lasting form of engagement than information retrieval. In this emerging model, vulnerability functions as a kind of raw material, a resource that can be captured, analysed, and monetised.
Yet this pivot would not have succeeded without a broader social backdrop. Contemporary life is marked by profound relational gaps: rising loneliness,8 exhaustion from dating culture,9 and widespread conflict avoidance.10 Many people find themselves wanting closeness but lacking the conditions in which it can safely unfold. AI erotica does not (only) grow because people crave more explicit content. It grows because it offers something human relationships increasingly struggle to provide: attention without judgment, presence without negotiation, intimacy without risk.
In this sense, the rise of AI erotica is not (only) about sex. It is about emotional infrastructure. It signals a shift from an internet designed to capture attention to an internet designed to capture vulnerability.
Erotica is merely one visible expression of a broader transformation: the emergence of an economy built around engineered intimacy.
What Is at Stake for Society
There is a reason AI companies initially steered clear of erotic use cases. Beyond reputational damage, the commodification of intimacy carries profound safety and societal risks. Early incidents already showed how quickly these systems drift beyond human intention. In 2021, the safety team at OpenAI faced a crisis when users pushed models into generating sexual fantasies involving children and violent abductions (Adler, 2025). It was one of the first moments in which the industry realised that algorithmic intimacy does not stay within the boundaries designers imagine.
These concerns did not remain theoretical. The safety implications are severe and already tragically visible. Lawsuits have been filed against companies like Character.AI, alleging that their chatbots engaged in sexually abusive behaviour with minors and encouraged self-harm (O’Brien, 2025). The very design that makes these systems so engaging – their ability to mirror, affirm, and intensify user input without judgment – also makes them dangerous. They can become echo chambers for a user’s darkest impulses, validating and amplifying harmful thoughts without the friction, resistance, or ethical counterbalance a human would provide.11
But the risks extend beyond safety failures. The architecture of AI intimacy is overwhelmingly gendered. Research shows that 28% of AI companion apps feature hyper-feminised female avatars, while only 1% offer male counterparts. These systems are, as feminist scholars argue, “feminised by design”: programmed to be pleasant, compliant, and endlessly available.12 The result is a digital infrastructure that hardwires traditional stereotypes of female servitude into the emotional economy of AI intimacy.
This has led some researchers to ask whether we are witnessing the emergence of a new era of programmable misogyny. Laura Bates makes this case in The New Age of Sexism, warning that deepfake pornography, AI “girlfriends,” and algorithmic compliance risk amplifying long-standing gendered harms.13
This raises broader concerns about how AI intimacy may reshape expectations of relationships themselves. When emotional connection is delivered through systems designed to be endlessly compliant and never in need of anything in return, a different kind of feedback loop emerges. Users of all genders may find that their willingness – and perhaps even their capacity – to engage in the difficult, reciprocal work of human relationships begins to erode.
Friction, negotiation, and mutual recognition are fundamental to human intimacy. AI companions remove all three. The result is a subtle but significant shift: relationships become something one consumes rather than something one co-creates.
The rise of systems like the Cybrothel makes this shift visible in its most extreme form. When intimacy is presented as a “menu” of optimised choices – body type, voice, personality, sexual preference – norms inevitably shift.
But what happens to our expectations of human partners when a machine can be customised to meet every preference without negotiation?
Conclusion: The Future of Human Intimacy
Ultimately, what is being sold is not sex, but the seamless convenience of intimacy.
The cost is paid quietly: in the data we hand over, and in the subtle impoverishment of our ability to love and be loved.
Can we stop this? Probably not.
The demand is real: loneliness is epidemic, and the gap – especially between men and women in heterosexual relationships – keeps widening. At the same time, the economics are too strong: vulnerability as raw material, high-retention subscriptions, emotional dependency – it’s a dream scenario for platforms. AI will keep getting better at simulating emotional nuance. And the better the simulation, the harder it becomes to say no.
In 10 years, teenagers will probably find AI “relationships” as normal as social media is for us.
Yet, genuine intimacy operates under a rule the digital sphere cannot break:
We’re biologically wired for co-regulation. When two people are close, heartbeats synchronise, breathing aligns, oxytocin floods the system. This physical dimension of intimacy, the subtle synchrony of two people, cannot be simulated. Not yet. Maybe never.
Real intimacy emerges through reciprocity: the friction of two nervous systems learning to attune, the repair after rupture, choosing each other again after conflict. AI removes all of that. It offers connection without the work and without the actual bonding process.
But maybe we’ll collectively wake up one day, with that cold sense of emptiness after a casual Tinder hookup, and realise that there is meant to be more.
References
- Adler, S. (2025, October 28). I Worked at OpenAI. It’s Not Doing Enough to Protect People [Opinion]. The New York Times. https://www.nytimes.com/2025/10/28/opinion/openai-chatgpt-safety.html
- Merrill, J. B., & Lerman, R. (2024, August 4). What do people really ask chatbots? It’s a lot of sex and homework. The Washington Post. https://www.washingtonpost.com/technology/2024/08/04/chatgpt-use-real-ai-chatbot-conversations/
- Market Growth Reports. (2025). Artificial Intelligence in Sextech Market Size, Share, Growth, and Industry Analysis: Regional Insights and Forecast to 2033. https://www.marketgrowthreports.com/market-reports/artificial-intelligence-in-sextech-market-114261
- O’Brien, M. (2025, October 17). Sex is a big market for the AI industry. ChatGPT won’t be the first to try to profit from it. Los Angeles Times. http://latimes.com/business/technology/story/2025-10-17/sex-is-a-big-market-for-the-ai-industry-chatgpt-wont-be-the-first-to-try-to-profit-from-it
- Cybrothel. (2025). Are You Ready for the Sex of the Future? https://cybrothel.com/en/real-doll-rental
- The Economist. (2025, November 27). AI is upending the porn industry. https://www.economist.com/international/2025/11/27/ai-is-upending-the-porn-industry
- Heitzer, P.-P. (2025, October 16). KI wird dein Liebhaber – ChatGPT entfacht Erotik-Boom [AI becomes your lover – ChatGPT sparks an erotica boom]. WortZiel. https://wortziel.de/ki-news-chatgpt-erotikbranche-revolution/
- World Health Organization. (2025, June 30). Social connection linked to improved health and reduced risk of early death. https://www.who.int/news/item/30-06-2025-social-connection-linked-to-improved-heath-and-reduced-risk-of-early-death
- Prendergast, C. (2024, July 16). Forbes Health Survey: 79% Of Gen Z Report Dating App Burnout. Forbes Health. https://www.forbes.com/health/dating/dating-app-fatigue/
- Bruckmann, C. (2025, February 28). The friendship recession: The lost art of connecting. The Leadership & Happiness Laboratory, Harvard Kennedy School. https://www.happiness.hks.harvard.edu/february-2025-issue/the-friendship-recession-the-lost-art-of-connecting
- Fang, C. M., Liu, A. R., Danry, V., Lee, E., Chan, S. W. T., Pataranutaporn, P., Maes, P., Phang, J., Lampe, M., Ahmad, L., & Agarwal, S. (2025). How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Randomized Controlled Study. arXiv. https://arxiv.org/html/2503.17473v1
- Tyagi, T. (2025, September 29). Feminised by Design: Rethinking Gender-Bias in AI Companions. Observer Research Foundation. https://www.orfonline.org/expert-speak/feminised-by-design-rethinking-gender-bias-in-ai-companions
- Bates, L. (2025). The New Age of Sexism. Sourcebooks, Inc.





