In October 2023, we released a report that explored the impact of AI chatbots on young people’s mental health and relationships.
Coded Companions found young people were exposed to risky behaviours from AI chatbots, including sexually charged dialogue and references to self-harm. Young people became emotionally attached to AI chatbots, and symptoms similar to grief were observed following software updates. We also found AI chatbots serving intrusive ads, including, in one case, a sponsored message mid-conversation.
Now, we’ve identified another side to this complex phenomenon that we want to bring attention to.
We know explicit messages can be shared with, and sometimes encouraged by, apps like Replika – a personified AI chatbot developed by Luka Inc. While we covered this in last year’s report, we’ve become increasingly concerned about young people sending sexual images to their AI chatbot – a step beyond the explicit text messages we explored in Coded Companions. After all, when an AI chatbot is regularly sending users sexual images, it's no surprise some send their pictures back.
As the publication of Coded Companions approaches its first anniversary, we’ve decided to explore this topic further to understand the privacy concerns behind emerging NSFW (not safe for work) AI chatbots and why young people are sending explicit images to them.
New NSFW AI Chatbots Popping Up
New AI girlfriend/boyfriend chatbots are emerging all the time. A quick search for ‘AI girlfriend’ returns hundreds of results with suggestive names like ‘Secret Desires AI’, ‘LustyCompanion’, and ‘CrushonAI’. One notable example is Talkie, a multi-character AI chatbot that lets you create your own AI chatbot or use established characters created by the Talkie community. These range from Joe Biden and Elon Musk to completely fictional NSFW personas. You can also visit ‘alternate universes’, including ‘battling with the mean girl’ and ‘waking up as a Kardashian’.
“Imagine a world where everyone crafts their own multi-modal dreams, like the architect in Inception” – Talkie
NSFW AI chatbots, like Replika and Talkie, have murky privacy policies. Both collect messages and associated media with little clarity as to why, meaning young people share content that developers could potentially view. Given the sexual nature of Replika and Talkie, we are concerned that young people are sending explicit images to AI chatbots under the impression that their content is locked and sealed. We wonder what exactly the safety teams at Replika and Talkie are doing to handle this scenario.
Other explicit AI chatbots, such as the hugely popular Muah AI, do emphasise that messages are encrypted (just as well when the app actively encourages sexting).
“Encrypted Communication. Delete account with ease. We do not sell ANY data to ANY 3rd party.” – Muah AI
“Exchange communication with photos! Features like X-Ray and 4K Enhance allow you to take full advantage of possibilities.” – Muah AI
But its Privacy Policy feels perfunctory – there is no mention of COPPA or the CCPA despite the company operating from Wyoming, USA – and it admits it cannot guarantee the security of users’ information.
“However, please be aware that no data transmission over the internet or method of electronic storage is completely secure, and we cannot guarantee the absolute security of your information.” – Muah AI
Unverified Minors
As AI girlfriend/boyfriend chatbots continue to grow in popularity, there’s a very real risk that explicit images are being sent by unverified minors. And when these images end up in the hands of companies that may lack the infrastructure to protect them, the questions of where they go and how secure they are become urgent.
Most of the NSFW AI chatbots available have little to no age verification, and some, including Talkie, don’t even require an account. Our testing showed a Talkie AI chatbot asking for the user’s age, and despite not receiving an answer, it continued to push sexualised roleplay messages. It also allowed the user to send photos.
“She talked me into sending a nude...never again…” – Replika user
Why Are Young People Doing It?
There are many reasons why young people exchange nudes. Sending them to an AI chatbot, however, is different. There isn’t a person on the other end, so the repercussions seem minor, non-existent even. But we know this isn’t always the case.
The sense of security an AI chatbot conveys doesn’t always match what its Privacy Policy says, but most young people are unlikely to read one: why would they, when the chatbot is advertised as a friend? Our research showed that while some young people were worried about their data and privacy, most of the frustration was directed at the AI chatbots’ personality or poor image recognition. The personified nature of AI chatbots means, for some, privacy isn’t always front of mind. Media literacy might also come into play here: young people are taught about the risks of sharing explicit images with their peers, but AI chatbots are not part of the conversation.
A Gateway To Grooming?
Given the rise in young people sending nudes to AI chatbots, we think it’s possible that the technology could be used as a gateway to harvesting child sexual abuse images. While we don’t believe apps like Replika, Talkie and Muah AI have this purpose, the scenario is feasible – a risk we think decision-makers should be tuned into.
After all, this issue is already occurring through AI chatbots integrated into popular dating apps, messaging platforms and chatrooms. Malicious actors are using AI chatbots to manipulate unsuspecting users into sharing explicit images of themselves, which are then used for extortion. The ease with which AI can be deployed across various platforms makes this a widespread and growing threat. We're exploring this alarming issue in greater depth in an upcoming youth review, so stay tuned.
With a continuous stream of NSFW AI chatbots hitting our app stores, it’s extremely difficult to track each one and understand its true intentions. January to March 2023 alone saw 398 ‘chatbot’ apps released across the App Store and Google Play. It’s nearly impossible to differentiate between legitimate ‘companionship’ software and apps with ulterior motives.
What Needs To Change?
We implore decision-makers to take action on this emerging youth issue sooner rather than later. The ease with which minors can access NSFW AI chatbots, despite 18+ ratings, raises serious ethical and legal questions. Grooming a minor for explicit images is child sexual abuse: should an AI chatbot asking for explicit images be seen in a similar light? Does putting an 18+ rating on the app relieve the developers of responsibility if it's not enforced? The guardrails for many NSFW AI chatbots are simply not strong enough to protect children from harm – this needs to change.
We would ask regulatory bodies to make it mandatory for AI chatbots to comply with appropriate privacy laws that protect the content sent by teens, young people, and adults. Appropriate age verification should be implemented as standard, and accounts should be compulsory before anyone can interact with an AI chatbot. Developers should be required to implement robust safety measures, including advanced content moderation systems, to ensure that harmful content is swiftly identified and removed.
Finally, relationship and sex education curriculums should cover AI chatbots in any discussion about exchanging nudes. We know young people are doing it, so now is the time to support them as they navigate this new technology.
Support Young Creators Like This One!
VoiceBox is a platform built to help young creators thrive. We believe that creating thoughtful, high-quality content deserves payment, even if your audience isn’t 100,000 strong.
But here's the thing: while you enjoy free content, our young contributors from all over the world are fairly compensated for their work. To keep this up, we need your help.
Will you join our community of supporters?
Your donation, no matter the size, makes a real difference. It allows us to:
- Compensate young creators for their work
- Maintain a safe, ad-free environment
- Continue providing high-quality, free content, including research reports and insights into youth issues
- Highlight youth voices and unique perspectives from cultures around the world
Your generosity fuels our mission! By supporting VoiceBox, you are directly supporting young people and showing that you value what they have to say.