The metaverse has a sexual harassment problem and it’s going to get worse – Morning Brew

Nina Jane Patel designed her avatar, a cartoonlike version of herself with blonde hair and freckles, and entered Meta’s Horizon Venues, a 3D digital world, using a virtual reality (VR) headset. She was there for less than a minute before a group of male avatars touched and groped her avatar without consent, taking photos as they harassed her. Patel detailed the “horrible experience” in a December 2021 Medium post. It “happened so fast and before I could even think about putting the safety barrier in place,” she wrote. “I froze.”

Patel isn’t the only woman to report digital sexual harassment on Meta’s virtual reality platforms or in other digital worlds. When Facebook rebranded itself as Meta last fall, it mainstreamed the concept of living a digital life in 3D metaverses, virtual worlds where people can meet and play. But the early presence of digital sexual harassment, which can include nonconsensual touching, verbal harassment, and simulated sexual assault on avatars, raises questions about whether this new immersive tech can shed old tech’s problems.

It’s not that the metaverse creates new opportunities for digital sexual harassment—social media has always been rife with gender-based harassment—but virtual reality technology dissolves the gap between the physical and digital selves, creating immersive experiences that heighten both realism and emotional connection. Users watch as digitally rendered hands grope a representation of their own body, and it all feels increasingly real, just as metaverse designers intended. “All of these innovations and technology that can make a digital life seem like a real one with real feelings, that has exacerbated the impact of sexual misconduct in a metaverse,” Michael Bugeja, a professor at Iowa State University who teaches media ethics and technology, said.

For those like Patel who experience digital sexual harassment, it can be degrading and emotionally devastating. It’s “surreal,” it’s “a nightmare,” Patel wrote. Despite this, and despite Big Tech’s history of ignoring the concerns of groups particularly vulnerable to online harassment, sexual harassment and its aftermath are often glossed over or ignored by developers. It’s a problem that needs a solution, especially as haptic technology (tech that mimics the sensation of touch) evolves.

Researchers at Carnegie Mellon University recently developed a VR headset attachment that sends ultrasound waves to the mouth, allowing people to feel sensations on the lips and teeth (fingertips are the other haptic hotspot, where it’s easiest for developers to send signals of feeling). With the mouth attachment, players can feel spiders, raindrops, and even a stream from a water fountain on their lips. They can simulate brushing their teeth. But headlines ran away with the advances by declaring that users would soon be able to kiss, and feel the kiss on their physical bodies, in the metaverse. For now, that’s not true: another set of lips would produce too large a haptic signal for this type of tech, said Vivian Shen, one of the researchers behind the attachment.

Her research is an example of how developers are looking for ways to mimic touch that feels intimate and real. Another company, Teslasuit, introduced a full-body haptic suit that resembles a wetsuit; it can capture the feeling of bullets, for example, or a hug. Then there’s Meta: Last year it developed a prototype for a haptic glove far more sophisticated than its current hand controls that would allow people to feel an object’s weight and texture when they lift it in a digital world.

Better, more realistic haptic technology is coming to the metaverse even as harassment persists. Women on nearly every social media and gaming platform have told stories about grotesque harassment, stories so common that their narrative arc is now familiar. According to a recent Pew survey, 33% of women under 35 report having been sexually harassed online, compared to 11% of men. Lesbian, gay, and bisexual people are also more likely to be harassed. It’s unsurprising, then, that digital sexual harassment was routine in early virtual worlds. In 1993, reporter Julian Dibbell wrote in the Village Voice about a “rape in cyberspace” in LambdaMOO, an early text-based online world. One player known as Mr. Bungle used a “voodoo doll” gaming feature that let him attribute actions to other people in the community rather than to his own character, even though he was behind the keyboard. He created a narrative in which some players performed nonconsensual sex acts on others. Dibbell’s reporting raised the same questions about the boundary between the digital self and the physical person that the metaverse raises today. The debate has simmered for nearly 30 years, and the tech community is no closer to a consensus on what digital sexual assault means or how to prevent it.

Bugeja has studied what he calls “avatar rape” for more than a decade, and witnessed it in 2007 on a virtual beach in Second Life, where he watched one male avatar “violently rape” another who had been sitting at a boardwalk bar, drinking a martini. That made him a fierce critic of the platform’s use for education. Some saw Second Life as the future of gathering to work and learn (a familiar refrain used to market the metaverse). He worried experiencing digital harassment could trigger a trauma response in someone who had been sexually assaulted in real life. “I’m saying not to get rid of the metaverses,” Bugeja said. “I’m saying, let’s put more controls in the metaverses. Let’s de-emphasize the profit margin.”

Tech companies tend to be reactive rather than proactive when it comes to preventing harassment, and even then they can be slow to act. In 2016, Jordan Belamire, a player of the VR video game QuiVr, wrote a Medium post about her experience being groped in the game. Players there appeared as a floating helmet and a set of disembodied hands. Belamire wrote that one approached her and rubbed where her chest would be, an image made more real by the VR headset she wore and the game’s immersive nature. The makers of QuiVr responded with an in-game fix that allowed users to push other avatars away. In 2018, someone reported to The Sims team that a player with special credentials had abused his position and shared sexual fantasies with teenage boys. EA Games, which runs The Sims, waited three months to remove the player.

Many platforms have anti-harassment policies, but enforcement is lacking, and some put the responsibility on the harassed to take action instead of blocking abusers or proactively preventing abuse. Women who stream on Twitch have had their images manipulated and sexualized, despite a company policy that bans “unsolicited objectifying statements relating to the sexual body parts or practices of another person,” among other forms of harassment. It’s true that anyone who feels uncomfortable can instantly log off from a game or remove their headset to leave the metaverse, but that puts the onus on the victim to take responsibility for the bad behavior of others. It’s also an excuse—especially since Big Tech knows that digital sexual harassment is a persistent problem—that allows metaverse builders to ignore the problem rather than proactively making the metaverse a place where women don’t have to enter with a plan to exit. When a beta tester of Meta’s VR world first reported being groped last fall, the company reviewed the report and determined that the player should have activated a tool called “Safe Zone,” a protective bubble that keeps people from touching or talking to players. That would isolate her from harassment, but it would also halt her interactive experience in the virtual world while the harassers continue to play.

It would make more sense for games to require consent between players before they can touch, rather than only offering an exit after harassment has occurred, said Jesse Fox, an associate professor of communication at The Ohio State University. “If we can’t solve the problems, then ask yourself, is this ready to be a public space? Something companies don’t want to do, because they just care about profits and money and not people,” Fox said. “If we haven’t thought this through or if we have people having negative experiences, should we let this continue?”

After Patel wrote about being digitally harassed at the end of 2021, Meta added a feature called “Personal Boundary” in February. It creates a four-foot invisible bubble around avatars in Horizon Worlds. People can use it to block everyone, or just strangers, from getting too close. They can also disable it. “We believe Personal Boundary is a powerful example of how VR has the potential to help people interact comfortably. It’s an important step, and there’s still much more work to be done,” the company said in the announcement. On this, Patel agrees.

“I think it’s a step in the right direction,” Patel said of Meta’s new protective bubbles in an email to Morning Brew. “Everyone in this industry needs to accept the ethical responsibility that is incumbent upon us as gatekeepers of the future Metaverse.” But whether the tool has put a dent in digital sexual harassment is unclear. A Meta spokesperson declined to provide stats on the number of people who have reported being harassed in Horizon Worlds.
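Mechanically, a feature like Personal Boundary comes down to a proximity check that runs before any avatar-to-avatar interaction is allowed. The sketch below is purely illustrative, assuming the four-foot radius, the everyone-versus-strangers modes, and the disable option described in Meta’s announcement; the names (`Avatar`, `is_blocked`) are hypothetical, not Meta’s actual code.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Avatar:
    player_id: str
    x: float                     # position in feet
    y: float
    boundary_enabled: bool = True          # players can turn the bubble off
    allowlist: set = field(default_factory=set)  # IDs exempt in "block strangers only" mode

BOUNDARY_RADIUS_FT = 4.0  # the four-foot bubble Meta described

def is_blocked(actor: Avatar, target: Avatar) -> bool:
    """True if the boundary should stop `actor` from reaching `target`."""
    if not target.boundary_enabled or actor.player_id in target.allowlist:
        return False
    dist = math.hypot(actor.x - target.x, actor.y - target.y)
    return dist < BOUNDARY_RADIUS_FT
```

Note that the check belongs to the target, not the actor: each player sets their own radius and exemptions, which is what lets one person block everyone while another blocks only strangers.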

Metaverse platforms will encourage people to spend more time and money on their avatars and digital lives, investments that validate the significance of those digital selves. But until those companies invest just as fully in safety, it may be difficult for workplaces, schools, and brands to ethically lure people there with promises of stunning, futuristic experiences. Developers pushing metaverses will need to reach a consensus on what digital harassment means, and the shapers of these worlds should listen to those affected. If they don’t, new digital worlds can’t hope to escape the errors of past ones.