The Metaverse’s Dark Side: Here Come Harassment and Assaults

Chanelle Siggens strapped on a virtual reality headset to play her favorite shooter game. She maneuvered her avatar into a virtual lobby and waited for the action to begin.

But as she waited, another player's avatar approached hers. Ms. Siggens said the stranger then simulated groping her avatar. Shocked, she asked the player to stop.

He shrugged, she said, as if to say he didn't know what to tell her: it was the metaverse, and he would do what he wanted. Then he walked away.

The metaverse is an immersive virtual world where people can play video games, attend gym classes and participate in meetings. In October, Mark Zuckerberg, Facebook's founder and chief executive, said he would invest billions in the effort and renamed his company Meta.

Questions about the metaverse's safety have surfaced even as tech giants bet big on the concept. Virtual reality games, an early piece of the metaverse, offer few ways to report harassment. According to the Center for Countering Digital Hate, an incident that violates platform rules occurs about once every seven minutes in one popular virtual reality game.

Bad behavior in the metaverse can be more severe than today's online harassment. Because virtual reality immerses people in an all-encompassing digital environment, unwanted touches in that world can be made to feel real, and the sensory experience is heightened.

Ms. Siggens said that when another player comes up and gropes your avatar, your mind tricks you into thinking it is happening in the real world. With the full metaverse, she said, it will become even more intense.

Toxic behavior in virtual reality is not new. But as Meta and other huge companies make the metaverse their platform of the future, the issues are likely to be magnified by those companies' reach over billions of people. To entice more people into the metaverse, the companies that make virtual reality headsets cut their prices during the holidays.

An image from a virtual reality game.

In March, Andrew Bosworth, a Meta executive who was set to become chief technology officer in 2022, wrote in an employee memo that moderating what people say and how they act in the metaverse at any meaningful scale is practically impossible. The memo was reported by The Financial Times.

A Meta spokeswoman said the company was working with policymakers, experts and industry partners on the metaverse. In a November post, Meta said it was investing $50 million in global research to develop its products.

Meta has asked its employees to volunteer to test the metaverse, according to an internal memo seen by The New York Times. And a company spokeswoman said a tester of a Meta virtual reality game recently reported being groped by a stranger; the incident was earlier reported by The Verge.

Misbehavior in virtual reality is hard to track because incidents occur in real time and are not recorded.

Titania Jordan, the chief parent officer at Bark, which uses artificial intelligence to monitor children's devices for safety reasons, said she was concerned about what children might encounter in the metaverse. Abusers could target children through chat messages in a game or by speaking to them through headsets, she said, actions that are difficult to document.

Ms. Jordan said that V.R. adds a whole new layer of complexity. The ability to block a bad actor indefinitely, or to impose consequences so the person cannot simply get back on, is still being developed, she said.

Callum Hood, the head of research at the Center for Countering Digital Hate, spent several weeks recording interactions in a popular game that is mostly played through virtual reality headsets. In the game, people can form virtual communities, play cards with virtual friends, party in a virtual club and meet in virtual public spaces to talk. The game is rated as safe for teenagers.

Mr. Hood said that he recorded more than 100 problematic incidents on the virtual chat platform. In some, users made sexual and violent threats against children; in one, someone tried to show sexually explicit content to a minor.

Mr. Hood said the incidents violated the platform's terms of service. He said he had reported his findings to both the game's developer and Meta but had not heard back.

The developers and Facebook have failed to put basic measures in place to ensure that abusive users cannot access their services, he said. At the same time that they are opening up the metaverse, they have created a haven for abusive users.

An image from the virtual reality chat.

A Meta spokeswoman said the company's community standards and virtual reality policy outline what is allowed on its platforms. The company does not allow content that attacks people based on race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, or serious disease or disability, she said.

She said that children under 13 are not allowed to create accounts or use the devices, and that developers of third-party apps bear part of the responsibility for what happens inside them.

The developer of the virtual reality chat platform did not respond to a request for comment.

After facing abuse while playing the shooter game, Population One, Ms. Siggens joined a virtual support group for women who play it. She said members regularly dealt with harassment in the game, whose developer, BigBox VR, was acquired by Meta in June.

Mari DeGrazia, 48, a member of the support group, said she saw harassment and assault happen in Population One two to three times a week.

Sometimes, she said, things that violate the game's rules happen as often as two to three times a day.

BigBox did not reply to a request for comment.

Ms. DeGrazia said the people behind Population One seemed interested in making the game safer. Despite the harassment, she said, she has found a community of virtual friends whom she enjoys interacting with.

She said she would not stop playing because she thought it was important for diverse people to be in the game. Even though it is hard, she said, they were not going to be pushed out of it.

In July, Ms. DeGrazia wore a haptic vest, which conveys physical sensations, while playing Population One. She said it felt awful when another player groped her avatar. Mr. Zuckerberg's description of a metaverse in which people could be fitted with full-body suits that let them feel even more sensations was troubling, she said.

Ms. Siggens said she reported the account of the player who had groped her avatar to the game's developer. She received an automated response saying that action had been taken against the user.

But she does not know whether the player was banned for a day or a week, she said. The behavior just keeps happening.

An hour after the incident with the stranger, Ms. Siggens said, her avatar was groped again by a different user.

Ryan Mac contributed reporting.