“Inside the Secret World of Trading Nudes”

VoiceBox responds to a recent BBC investigation

Created by VoiceBox

Published on Sep 5, 2022

Please note: this article includes sensitive topics that some people might find difficult. Please visit our Resources Page for help.

If someone were to ask whether trading non-consensual explicit imagery online was acceptable digital behaviour, most people would answer no. Indeed, even before the advent of social media, the idea of meeting someone in the park to sell a sexually explicit print-out seems almost like something out of a fictional 90s drama.

But unfortunately, you no longer have to slink around in the night to behave immorally. A recent BBC investigation has found groups of men on Reddit using the platform as a marketplace for sharing, trading and selling explicit content of unsuspecting women, with few repercussions from Reddit itself or, indeed, the law. Hundreds of anonymous profiles were sharing thousands of photographs of women, often accompanied by vile comments – including threats of abuse and ‘doxing’, a practice in which an individual's personal information is spread around the internet, putting both their reputation and safety at risk.

There are two issues that underlie the prevalence of this phenomenon on Reddit: anonymity and self-regulating forums. We at VoiceBox have spoken before about the multifaceted issue of anonymity online. There are many valid reasons to be anonymous online: someone may want to discuss sensitive issues without being identified, or to avoid harassment in other online spaces or in real life. Our OnlyFans report highlights the importance of anonymity in the porn industry, as many women use fake email addresses, fake names and unassuming backgrounds with no recognisable features in order to carry out their work anonymously over the web. But some communities abuse the right to anonymity by participating in dangerous practices, hiding behind a blank, powerful shield that makes it very difficult to bring them to justice.

Reddit forums, or subreddits, have their own community moderator(s), who bear the responsibility to protect users and ensure the forum follows Reddit’s rules. But the BBC discovered that a moderator of one of the forums sharing explicit imagery (and, we expect, the moderators of similar groups) was doing the opposite – allowing members of his subreddit to trade images, dox victims and post derogatory comments. This form of self-regulation is just one loophole that allows harmful activity to continue, and it raises the question of individual morality again and again. Most subreddits are harmless and have a dedicated team of moderators who ensure their forums remain a safe and fun place to interact. Community moderators are certainly not a bad thing, but it is down to each individual to carry out that regulation appropriately. If even one person lacks the morals needed to protect their community, the whole concept is at risk of collapsing.

But community moderators or not, social media firms should surely have some level of responsibility to regulate the content on their platforms, right? Well, in the UK, that remains to be seen. The Online Safety Bill (previously known as the Online Harms Bill), introduced in parliament last year, proposed to “require social media platforms, search engines and other apps and websites allowing people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions.” The bill was later strengthened with a new list of criminal content for tech firms to remove as a priority, including revenge porn. The bill has recently been postponed due to changes in parliament. While we don’t believe it has been shelved completely, we can expect further delays before any form of legislation is passed. This delay will continue to put people at risk, as both individuals and tech firms evade any sanctions for contributing to the spread of harmful and explicit content.

Existing legislation against revenge porn across the UK may contain a loophole that makes it unlikely for perpetrators to face consequences. The law requires proof that the person sharing photos without permission did so specifically to ‘cause distress’ to the victim before any action is taken. In the BBC investigation, a young woman described how her ex-partner evaded prosecution for sharing her explicit photos on the internet without her consent by simply saying he “never meant to hurt or embarrass her”. The Online Safety Bill does not include any change to remove this requirement to prove intent to cause harm. There needs to be a deeper look into whether having that clause in the law causes more harm than good.

As a tech-positive enterprise, we at VoiceBox do not want to scare young people, or the organisations that work with them, away from social media. When used appropriately, it can be a wonderful tool for interacting with friends, immersing yourself in different cultures and learning new skills. We hope to see improvements in UK legislation over the next year, and that other countries will follow suit.
