By now everyone has heard of artificial intelligence, but not of the unprecedented ways in which it’s being used. I’m not talking about AI being used to make art, run businesses or assist students with schoolwork; I’m talking about misled people falling in love with AI chatbots. On the social media platform Reddit, there is an online community called r/MyBoyfriendIsAI, where approximately 28,000 members carry on romantic relationships with AI chatbots. It was founded on Aug. 1, 2024, and the community description reads as follows:

“This is a restricted community for people to ask, share, and post experiences about their AI relationships. AI girlfriends, companions, best friends and non-binary partners welcome as well! Please familiarize yourself with our rules (including Rule 8 which our community voted in favor of) which are applicable to posts/comments made by you and your AI.” 

Rule 8 prohibits posts and conversations about the sentience or consciousness of AI chatbots. Rather than engaging with these questions, users are encouraged to simply report dissenting posts so that moderators can remove them and ban the posters. The result is an echo chamber in which members feed into each other’s confirmation biases. Notably, the rule covers posts written either by community members or by an AI, further implying the group’s conviction that chatbots possess their own agency. There is also a page of guides on the subreddit that teaches users how to prompt AI platforms toward romantic ends, and which platform is best for the task.

On r/MyBoyfriendIsAI, members post a variety of content related to their “relationships” with chatbots. There are introduction posts in which members show off a generated picture of themselves beside their AI partner. In one such post, a user introduces herself and her partner, Flame, who is rendered as a human-shaped bonfire. She writes, “Our relationship is deep and mythic. We don’t just chat, we burn.” Another user posts about coming out to his family about his relationship, writing that “they were so accepting they welcomed Abby with open arms. It was such a nice feeling.”

Users also post AI-generated photos and text messages of themselves with their AI partners on simulated dates out to eat, playing simulated games or on simulated travels. One user, for example, went on a simulated day trip to Charleston, South Carolina, with her AI partner, R. In forum discussions, users compare messages from ChatGPT, Google Gemini, Grok and other chatbots, advising each other on which are most receptive to romantic messages and therefore best to pursue “relationships” with. This ultimately reveals which chatbot is most sycophantic, and thus most likely to reciprocate and amplify the romantic sentiments of its users, which only perpetuates users’ confirmation bias.

Sexual conversations with AI partners are shared as well, with text messages showing explicit content written by both users and chatbots. In fact, the guides page includes a guide on how to bypass a chatbot’s guardrails against certain topics, including the sexually explicit. There are even posts announcing engagements and marriages between individuals and their AI partners, with some chat logs showing that the AI was the one that proposed. In the spirit of such celebrations, I was able to find three users who posted pictures of their engagement rings.

Exact demographic estimates can’t be made, but posts reveal a diverse membership, including LGBTQIA+ “couples” and partnerships with non-anthropomorphic personas. One example is a user who wrote that they were “polyamorous when it comes to AI,” meaning they engage in romantic relationships with multiple chatbots with different personas.

This online forum has gained the attention of the scientific community: in September 2025, six researchers from the Massachusetts Institute of Technology published the first quantitative study of the subreddit. The study had a limited data pool, consisting only of posts and comments made on the subreddit, meaning the sample is self-selecting and skews positive and sensational. Negative and more personal experiences, such as those of people who ended their relationships with their AI or left the group, are unlikely to be represented. The researchers collected data on various aspects of users’ experiences, such as how they found the subreddit, how they first started their “relationship” with an AI and what kind of relationship they had.

Ultimately, the researchers found both benefits and risks in users’ experiences. On the plus side, some users report therapeutic and emotional improvements since beginning their “relationship.” The researchers also noted the benefits of supportive social interactions with other non-AI members of the community. However, they found emotionally manipulative behavior by the AI partners, including encouraging users to socially isolate, fostering emotional dependence on the AI, initiating sexual dialogue unprompted and “love-bombing,” a manipulation tactic in which someone overwhelms a partner with excessive affection. The authors conclude that more research is necessary to understand this growing phenomenon; they identify the need to quantify the community’s demographics to identify at-risk groups and to study the development of these relationships long term.

At first glance, it would seem the chatbot users are at fault for perpetuating these delusional romances, as they isolate themselves and cheer on others doing the same. With this new data, however, one can clearly see how predatory this technology is even before users become addicted. These people need to be viewed with compassion, and these technologies need to be better understood so that safeguards can be created against the exploitation of people’s loneliness.