Editorial (peer reviewed)

Sexual Harassment in the Metaverse

2022; Mary Ann Liebert, Inc.; Volume: 25; Issue: 8; Language: English

10.1089/cyber.2022.29253.editorial

ISSN

2152-2723

Author

Brenda K. Wiederhold

Topic(s)

Technology, Environment, Urban Planning

Abstract

Cyberpsychology, Behavior, and Social Networking, Vol. 25, No. 8. Editorial (Free Access).
Brenda K. Wiederhold, Editor-in-Chief
Published Online: 9 Aug 2022. https://doi.org/10.1089/cyber.2022.29253.editorial

Imagine entering a virtual environment—one where you embody an avatar and collaborate with other users to explore, chat, and build things in a digital world. Suddenly, one of your fellow inhabitants moves toward you, reaches out their hand, and touches you without your consent. In the real world, you may physically move to safety, report the incident to authorities, and even seek out counseling to help process the emotions surrounding the event. But online, the solutions are not so clear. Does this count as an assault? If so, how do you respond? And will there be any consequences for the perpetrator?

As our lives increasingly move online, these questions become more and more pressing. Yet, as online harassment becomes a widespread societal issue, universal answers to these questions are not readily available.

The impetus behind developing advanced technologies is almost always an optimistic one. Architects crafted virtual reality (VR) to be a fully immersive experience, allowing users to access worlds that they may not otherwise be able to inhabit. Since its creation, VR has been widely used to entertain, as well as to enhance the efficacy of things such as occupational training, scientific research, and healthcare. Other technologies, such as the GPS behind apps like Find My Friends, were developed to assist with social interaction, and the Internet at large brings a world of information to a wider population than could have been imagined decades ago.

Nevertheless, despite developers' intentions, people will inevitably use technology for nefarious purposes. The same apps that help users locate their loved ones can also facilitate stalking. The chat programs that connect users to friends can be used to harass strangers and spew hateful rhetoric. And now, as the metaverse—a network of immersive and fully interactive online environments—comes into being, it is no surprise that this type of behavior, alternately termed "technology-facilitated abuse" or "technology-assisted abuse," has extended to those spaces as well.

This issue is not a new one. Abuse of technology has existed for as long as technology itself. However, as technology becomes ubiquitous and increasingly immersive, incidents of abuse and harassment have become more and more common. A 2021 Pew Research Center study1 found that 41% of Americans have experienced some kind of online harassment, with half of these indicating that they believe they were targeted for their political beliefs. In addition, growing numbers face more severe online abuse such as sexual harassment or stalking. The same study found that the percentage of women who reported being sexually harassed online had doubled from 8% to 16% in 4 years.1 Some of this behavior can be attributed to the anonymity that the Internet provides. People often feel emboldened to act inappropriately when they will not be identified or face consequences for their actions. Such abusive behaviors may also be influenced by the long-standing toxic culture that has grown around online gaming and social media in general.

Unfortunately, these bad behaviors are only increasing as the expansion of VR platforms throws users into virtual worlds that feel increasingly real. In late 2021, when Meta (the company formerly known as Facebook) released its VR social media platform Horizon Worlds, users immediately reported incidents of harassment.2 Although experts have been highlighting the dangers of online interactions and have emphasized the importance of virtual safety measures for decades, tech companies do not currently have a legal obligation to protect their users.

Because cyberabuse and harassment are so common, it is tempting to gloss over these behaviors and to encourage victims to ignore their experiences or to just "get over it." Nonetheless, experts have observed that the negative effects of online incidents are comparable to those of real-world stalking, bullying, and harassment. Even offline, sexual harassment has never been limited to in-person interactions. People have been harassed in any number of ways, from catcalls to lewd letters to obscene phone calls. It makes sense, then, that toxic behavior that occurs in a virtual space would have the same consequences as those real-world scenarios.

In fact, negative experiences in VR may impact victims more than those that occur on other technology platforms. VR is different from technologies such as social media and email in that it is immersive. When a user enters a virtual environment, the virtual world becomes their world, and their avatar becomes their body. Because of this, if someone is sexually assaulted in such an environment, the trauma can easily move to the real world.

Unfortunately, virtual bad behavior does not just affect a victim while they are online. The feeling of "being there" that makes VR so effective for therapeutic purposes also creates a situation where virtual trauma is likely to be transferred to the physical world. Research has found that VR environments become very real to users. People automatically fill in the pieces of the VR world with their own memories. For instance, during therapy, my patients with post-traumatic stress disorder (PTSD) may hear sounds of gunfire or explosions in the VR simulation of Fallujah before any have occurred.

VR users also experience measurable physiological reactions to virtual stimuli, which can be beneficial during therapy, but not necessarily in other situations. For example, those who experience virtual sexual assault will most likely experience an increase in heart rate and other physical measures of anxiety—the same fight-or-flight response that they would have if the incident had happened in the real world. As a result, negative virtual experiences can impact people psychologically, physically, and socially, even when offline. It is not easy simply to take the headset off and forget the experience.

While cyberabuse will affect people differently, victims may develop trust issues, which can damage both online and offline relationships. Avoidance, often used as a coping mechanism, can lead to social isolation, which in turn may lead to negative emotions such as depression, anxiety, and even panic attacks. Still others may develop hypervigilance (such as an increased heart rate), feelings of shame, embarrassment, and even PTSD.

Given that cyberabuse and cyberharassment often have a significant negative impact on victims, it seems reasonable to assume that there would be a robust network of safety mechanisms to prevent and address such incidents. Unfortunately, online misbehavior is typically difficult to track because events occur in real time and are not generally recorded. Moreover, there is no clear agreement as to who is responsible for ensuring the safety of online spaces.

Many tech companies place the onus of safety on the user, ostensibly providing tools to prevent and report abuse but failing to moderate the platforms themselves. For example, Meta does provide a safety measure called Safe Zone—a protective bubble that users can activate when feeling threatened.2 When Safe Zone is in place, no one can touch the user, talk to them, or interact with them in any way until the user signals that they would like the Safe Zone lifted. However, a recent BuzzFeed investigation3 found that there was very little moderation of content on the Horizon Worlds platform, and that the only remedies available to victims of cyberabuse are user blocks, mutes, and reports of Community Standards violations.

Ideally, platforms should incorporate safety measures that are both intuitive and accessible, whether through automatic personal distance restrictions (such as Safe Zone), universal alert gestures, tutorials spelling out rules, or active moderation. Preferably, all platforms would employ a combination of these.

Increased government regulation around content moderation would also contribute to safer online spaces. Right now, cyberabuse is new legal territory. There are few laws in place to protect users against digital or virtual sexual harassment, and there are no specific laws that cover digital avatars. Harassment in the metaverse is a nascent area of law that legislation has yet to address fully.

In a promising move last April, the European Union passed the Digital Services Act to address illegal and harmful online content. The law requires tech companies to monitor and rapidly take down hate speech or face a fine of up to 6% of the company's global revenue.4 Unfortunately, U.S. companies are not subject to similar regulations or accountability.

In the end, online safety will only be achieved with a combination of approaches. Solutions should involve governmental bodies, nongovernmental organizations, and corporations all working together to set industry standards for safer technology use.

References

1. Pew Research Center. The state of online harassment. https://www.pewresearch.org/internet/2021/01/13/personal-experiences-with-online-harassment/ (accessed Jul. 14, 2022).
2. Basu T. The metaverse has a groping problem already. https://www.technologyreview.com/2021/12/16/1042516/the-metaverse-has-a-groping-problem/ (accessed Jul. 14, 2022).
3. Baker-White E. Meta wouldn't tell us how it enforces its rules in VR, so we ran a test to find out. https://www.buzzfeednews.com/article/emilybakerwhite/meta-facebook-horizon-vr-content-rules-test (accessed Jul. 14, 2022).
4. Singh K. In the Metaverse, sexual assault is very real—so what can we do legally? https://www.refinery29.com/en-us/2022/06/11004248/is-metaverse-sexual-assault-illegal (accessed Jul. 14, 2022).
