Policing The Metaverse: Who’s In Charge?

How will crimes that happen in the Metaverse be handled in the future?
Created by VoiceBox

Published on Feb 20, 2024
[Image: closeup of a VR headset – James Yarema on Unsplash]

Please note: this article includes sensitive topics that some people might find difficult. Please visit our Resources Page for help.

This article was written in February 2024 and may not reflect changes in law and regulation later in time.

Tim Berners-Lee, in all his wisdom, probably didn’t imagine an immersive virtual world of legless characters when he was creating the World Wide Web. And yet, three decades later, we find ourselves grappling with the Metaverse – a 3D space where users are represented by avatars – and the real world is faced with a very tough list of ethical questions to answer. 

One of those questions is how on earth do we police something that exists in limbo between physical and fictitious? Crime in the Metaverse and other VR spaces – such as virtual sexual assault and user harassment – is already known to cause devastation among some children and young people, yet regulation is slow, and outcomes for perpetrators are undetermined. 

So, how do we begin this momentous task? 

Where Are We With Current Regulation?

Well, we aren’t anywhere statutory (yet). But conversations are happening in various spaces, including youth communities and policy groups. In 2022, the European Commission introduced plans to ‘Thrive In The Metaverse’, with a focus on people, technologies and infrastructure. It echoed our concern about vague, partisan regulation, acknowledging that the metaverse should be built on coherence, with no single person or group holding power over laws or terms and conditions.

“We will not witness a new Wild West or new private monopolies.” – European Commission

The following year, the European Commission announced four key strategy pillars for the next technological transition: 

  • Empowering people and reinforcing skills
  • Business: supporting a European Web 4.0 industrial ecosystem 
  • Government: supporting societal progress and virtual public services
  • Shaping global standards for open and interoperable virtual worlds and Web 4.0

They promised that by the first quarter of 2024, their Citizens' Panel would develop a ‘Citizen toolbox’ for the general public to use in virtual spaces. We will be watching closely to see if this happens.

Similarly, UK representatives say the metaverse “should not be beyond the scope of the [Online Safety] Bill”, with Lord Clement-Jones expressing concern that the now-passed Bill has already fallen behind emerging technologies – something VoiceBox has spoken about before.

“I believe that there is a case to answer by the Government. Things are moving very fast—I entirely understand that it is difficult to keep up, in a sense, with the changes—but the metaverse should not be beyond the scope of the Bill and nor should the environments created by it.” – Lord Clement-Jones

Meta – which sits (somewhat) on the other side of the regulation debate – has developed a basic Code Of Conduct to help users behave appropriately in the metaverse. While we welcome its efforts to promote safe play, we are concerned by the lack of meaningful sanctions should a user misbehave. Feature limitations and temporary account restrictions might work in non-immersive spaces (such as Instagram), but when virtual reality is experienced so differently, we wonder if further steps are needed to instil good behaviour.

“...well, my avatar gets killed in a video game all the time - doesn't mean I’m trying to charge people with murder” – Reddit user

“Surely just removing a feature or whatever isn’t enough to bring perpetrators to justice? You wouldn’t just take away someone’s, I don’t know, like, gaming console if they stalked someone in real life, would you?” – VoiceBox Ambassador

Will Crime In The Metaverse Become More Extreme?

The metaverse is incredibly new. It simply hasn’t been around long enough for researchers to properly measure its crime rate.

But we do know that crime is happening in VR spaces, with heinous incidents already worming their way into user experiences. Only recently, a virtual sexual assault on a girl’s avatar – which reportedly caused ‘psychological trauma’ – resulted in a real-life police investigation. Such significant action following a virtual assault is rare because the definition of ‘rape’ and ‘sexual assault’ in UK criminal law requires physical contact between perpetrator and victim. 

But because the attack was intensified by the immersive nature of virtual reality, it reinforces experts’ arguments for changing regulation. There are already cases of users falling over or running into things after becoming too immersed in their headsets. This raises the question: if virtual reality feels ‘real’ enough for users to stumble and hurt themselves, is it not reasonable that an assault in the metaverse might have a psychological impact on the victim? We expect that as the technology develops, crime will become more extreme – including rape, sexual assault, data theft, money laundering, stalking and harassment. 

"VR raping someone is new, yeah. Someone raping you in a convincing simulation of reality is gonna be a different ballgame of bad compared to a creepy DM on Instagram." - Reddit user

“Will be interesting to see how they adapt laws to accommodate the metaverse. If someone follows you IRL and shouts abuse at you, it's considered stalking and harassment. But if they do the same in the metaverse, it's unclear. Will basic internet trolling be a criminal offence one day? Idk.” – Reddit user

“I honestly can’t even begin to comprehend how they’re going to deal with metaverse crime. Why on earth did they just go ahead and make the technology without considering it?” – VoiceBox Ambassador

Given the creeping risk of serious misconduct, many young people are worried that current moderation systems are not reliable enough to handle what’s to come. 

“People aren’t confident the platforms [will] police these things properly – meaning maybe if something is extreme enough the police should step in?” – Reddit user

"Every time I report an account that is clearly a bot or shilling some scam on social media, it seems like nothing is ever done about it. These platforms can't seem to handle the amount of misconduct that happens on them already. I have no idea how they are going to add VR spaces to the mix and have them be safe places for young people to be in." - VoiceBox Ambassador

“Can’t wait to get hate crimed in the metaverse only for Zuck to reply in 10 business days that it ‘doesn’t violate our Community Guidelines’” – Twitter user 

Others, however, aren’t sure that metaverse crimes will have the same impact as real-life crimes.

“Like, no. Hundreds of millions of people say and do things online that they would never do in real life. People harass others online all the time, and it's very easy to unplug and tune them out.” – Reddit user

“There’s a block function for a reason. I’d love to do that with half the people I meet on the street.” – VoiceBox Ambassador

“In real life, you can't mute someone, in the metaverse, you can. Stalking has been going on since the start of the internet” – Reddit user

“I can 100% appreciate that it is an unpleasant experience, and action should be taken against the perpetrators within the platform. But to be honest, to treat it the same way as a real-life sexual assault is a little insulting to those who have experienced these horrible acts in real life” – VoiceBox Ambassador

With two polar-opposite sides to the argument, will any statutory law applied to the metaverse be treated with the same significance as laws in real life? We have already witnessed mockery towards the idea of policing the metaverse – something decision-makers must take into account when considering new legislation. 

“I think it’s a ridiculous concept.” – Reddit user

“But in all seriousness, until such time as virtual worlds become an “alternative” to life as opposed to an extension of it, all crimes committed in virtual worlds should be handled in the real world.” – Reddit user

Potential Remedies 

Policing the metaverse shouldn’t just be a case of arresting perpetrators and putting them behind bars. We believe there are balanced, community-based techniques that have the potential to remedy crime before it spirals out of control. 

Early intervention is an essential method for combatting crime, both in real life and immersive digital environments. Children and young people should be given the opportunity to thrive as civilians in the metaverse, with schools, in particular, holding the key to unlocking good behaviour in virtual spaces. 

We also see prospects in paid community moderators – much like police community support officers – who share some, but not all, the power and responsibility that falls to regional police departments. 

Of course, any police presence in the metaverse should be nuanced. Not every situation will be the same, and some crimes simply won’t have the same psychological effect as crimes in the real world. 

“I feel like touch is a big thing here. You can’t feel physical pain in the metaverse, and I feel like for some crimes, that’s what makes them so much worse?” – VoiceBox Ambassador

“Bro, I was playing CS earlier, and something FAR worse happened to me!!!! Several people SHOT my avatar, burned me with molotov, KNIFED ME, threw a frag grenade at me, and I died from a bomb set up by another person! This is unacceptable, I have PTSD now and want those awful people in jail NOW.” – Reddit user

“They didn’t care for a decade while people were getting tea bagged [sexually assaulted] in CoD, why do they care now?” – Reddit user

“It’s great that they are actually trying to address some of the problems early on, especially since it feels like the regulations are always going to be lagging behind the tech advances. I’m just not sure that putting someone behind bars is the real solution here.” – VoiceBox Ambassador

What About Privacy And Anonymity?

One thing that many find comfort in is the power of anonymity online. Most platforms do not require you to verify your identity when using their services – something particularly crucial for vulnerable groups such as the LGBTQIA+ community. 

Any stringent regulation of the metaverse would likely require everyone to make their identity known to various groups, including the criminal justice system and developers. 

This is a cause for concern. What would happen if powerful players gained access to user identities? Could they abuse their position to dox or suppress communities? Could profiling based on race or sexuality become rife? 

Children's privacy, in particular, shouldn’t be cast aside in favour of data collection.

“I really don’t like the idea of very powerful people having full access to the information and history on my metaverse profile…” – VoiceBox Ambassador

Where Does Responsibility Lie?

We echo the European Commission’s call for no single player to hold power over metaverse law. To achieve this, VoiceBox recommends cross-sector collaboration between governing bodies, charities, tech firms, users and schools. Only then will a balanced, user-centred approach be possible. 

Equally, blocking children from accessing the metaverse is not the answer to safe online exploration. As on any other digital platform, harmful things can happen, meaning digital resilience and media literacy should be central to navigating Web 4.0 and its services. We expect media literacy to be emphasised in any draft legislation to equip people with the tools to learn and recover. 

But the responsibility to protect and nurture online experiences should not fall to everyday users. While peer-to-peer blocking and reporting have their place, no individual in the metaverse should be expected to single-handedly police those around them, just as we are not expected to carry out policing in real-life situations. 

“Parents shouldn't let their kids loose into such games and use them as babysitters. If you don't trust your kid to be outside playing alone they don't need to be on VR.” – Reddit user

Some young people expressed frustration at the lack of time and resources used to combat real-life crime. They felt that virtual crimes should not be prioritised over worsening youth violence – such as knife and gun crime. 

“Someone literally tried to break into my house a couple of years ago. My flatmates were standing in the corridor with a kitchen knife in case we had to use it to threaten the guy off. Did we get any support after it happened? No. Surely that deserves more attention?” – VoiceBox Ambassador

“Somewhere, there are actual crimes being committed against actual people while actual police are wasting time with this.” – Reddit user

And The Answer Is?

While there is currently no concrete solution to policing the metaverse, it’s paramount that policymakers act now. The metaverse may be a foreign body in the world of smartphones, but this will not be the case forever. There are already whispers of virtual reality glasses replacing the phones in our hands, yet there are limited laws and regulations governing the new world we’ve been invited into. 

In the UK, we see merit in the Department for Science, Innovation and Technology funding an independent public body to scope and understand the metaverse. Any recommendations should be reviewed and appended to the Online Safety Act swiftly. We also praise other governments around the world acting on metaverse innovation and law, particularly technological powerhouses such as India and South Korea.

As always, children and young people must be at the heart of any legislation, and we expect them to be consulted throughout. 
