Facebook's Metaverse is unsafe
One incident of abuse and harassment every 7 minutes
The Metaverse is not safe for kids
VRChat, the most-reviewed social app in Facebook's VR Metaverse, is rife with abuse, harassment, racism and pornographic content.
Our researchers found that users, including minors, are exposed to abusive behaviour once every seven minutes on average, including:
Minors being exposed to graphic sexual content.
Bullying, sexual harassment and abuse of other users, including minors.
Minors being groomed to repeat racist slurs and extremist talking points.
Threats of violence and content mocking the 9/11 terror attacks.
We reported each of these incidents to Facebook through its web reporting tool as we found them. Every one of our reports about users who abused and harassed others went unanswered.
It’s just not good enough. When Mark Zuckerberg and Nick Clegg launched their vision of the Metaverse with great fanfare, they said:
"open standards, privacy and safety need to be built into the Metaverse from day one" ... "you really want to emphasize these principles from the start"
This was obviously a hollow promise.
Imran Ahmed, Chief Executive of the Center for Countering Digital Hate, said:
“When Facebook launched the Metaverse for Oculus just in time for Christmas shopping, its CEO, Mark Zuckerberg, pledged that privacy and safety are at the heart of Virtual Reality.
“But our researchers discovered that, contrary to his promises, Metaverse is a haven for hate, pornography and child grooming.
“In our study, Metaverse connects users not just to each other but to an array of predators, exposing them to potentially harmful content every seven minutes on average. If Metaverse is safe for predators, it’s unsafe for users, especially children.
“Any parent who gifted Facebook’s VR Oculus headset for Christmas needs to be aware that they are potentially exposing their children to serious danger.”
How to return your Oculus: