Peer-Reviewed Article

Haptics: Making the Metaverse a Touching Experience

2023; Mary Ann Liebert, Inc.; Volume: 26; Issue: 9; Language: English

10.1089/cyber.2023.29278.editorial

ISSN

2152-2723

Authors

Brenda K. Wiederhold

Topic(s)

Media Influence and Health

Abstract

We've all been there. We sense that familiar buzz in our bag, fumble around for our phone, and extricate it only to see a blank lock screen. Or our device vibrates in our pocket, yet when we reach to check it, we see no one has called at all. It's enough to cause confusion and momentary concern. But it's not uncommon. Even 10 years ago, nearly 90% of young adults reported experiencing these types of phantom vibrations.1

The truth is, most of us are so finely tuned to the tiny beeps and boops of our personal devices that we end up suffering sensory hallucinations. While this depiction of the human–machine relationship may seem a bit extreme, devices are intentionally designed to exploit our human senses of sight, sound, and even touch.

If you use technology at all, whether you realize it or not, you are likely familiar with haptics. Haptics, or haptic feedback, is the use of motion to imitate touch. While most current technological devices predominantly stimulate vision and hearing, with haptics, machines can reach out and touch their users. That tiny jolt under your fingertip when you click on a touch screen, the rumbling game controller when you score a goal, a car steering wheel that vibrates when you get too close to another object … it's all haptic feedback.

Humans are social beings; we are wired to respond to touch. So, it is not particularly surprising that we respond strongly when technology uses haptics to excite those tactile pathways. As it turns out, the relatively simple applications for haptics mentioned above are just precursors to what awaits users in the sensory wonderland of the metaverse.

Construction of the metaverse, a new iteration of cyberspace that incorporates advanced technologies such as virtual reality (VR) and augmented reality to create a collective virtual shared space, is already well underway. Experts estimate that by 2026, a quarter of us will spend at least an hour a day working, studying, shopping, and socializing in the metaverse.2 Companies, creators, and developers are all looking for ways to capitalize on these hours, hoping to draw in users with the promise of new immersive experiences.

This type of widespread innovation could provide opportunities to bridge global gaps in both human connection and universal access to information. Yet there are concerns. Devices are known to create a digital wall in the physical realm, preventing people from chatting with those sitting right next to them. If the metaverse becomes so incredibly compelling, what is to stop users from replacing human–human interaction with human–machine interaction?

Haptics is part of the answer. As the boundary between the real and the virtual continues to blur, the power of touch can encourage healing, both online and in person.
While technology has so far offered ample opportunity for connection, most people (about 78%) say they miss the ability to touch people physically when they interact virtually.3 Furthermore, at least half of users want to be able to physically feel or touch virtual things to make their experiences more fulfilling.

For the developing metaverse to be successful, users will need to feel immersed in the digital experience. Relying on sound and sight will no longer be enough to keep them online. To be fully immersed, users will need to form a sensory connection to the virtual world around them. One way to do this is through haptic clothing: wearable body tech that allows users to feel their virtual surroundings.

While the idea of a full-body haptic suit is alluring, wearing that much tech would probably be uncomfortable enough to break the illusion and take the wearer out of the experience, not to mention that the expense of such a garment would likely be prohibitive. The good news is that haptics can contribute to immersion with much simpler, smaller, and more cost-effective solutions.

A range of haptic garments is already on the market, and many more are in development. These range from gloves to vests to headset add-ons, and they employ technologies that include force feedback, vibrating actuators, electrical stimulation, pneumatics, and even ultrasound waves. There are devices that impart sensations of buzzing and tingling, wind and rain, and heat and cold.

Years of research on using VR to improve mental health have confirmed that realism is not essential for a patient to make progress. It is the sense of immersion and presence that determines whether emotions are activated. In the past, we have had to be creative in finding low-tech ways to provide tactile sensations to increase immersion in virtual environments. We used a fan to simulate wind, a heat lamp for the warmth of the sun, and subwoofers to create engine vibrations. All of this haptic feedback helped create virtual worlds that felt real to patients, allowing them to access their emotions and, in turn, take control of them.

These same principles can be used to enhance connection in the metaverse, and haptic technology is better than ever before. A team at Carnegie Mellon4 has created a way to stimulate the human mouth's sense of touch by equipping a VR headset with a series of ultrasonic transducers. By projecting ultrasonic waves onto the mouth and chin, the transducers can simulate a variety of sensations, including drinking water from a fountain or even walking through a spider web. Though these modifications obviously cost more than a heat lamp or subwoofer, they may allow for more precise and realistic sensations, enhancing the efficacy of virtual worlds.

In addition to creating experiences that are more immersive for everyone, haptics can make technology and the metaverse more accessible for those who are neurodivergent or disabled. For example, scientists have created an assistive glove with inflatable "fingers" that allow a user to grip a soda can or a tennis ball.5 It is hoped that this glove can help people with hand or brain injuries, limited mobility, or trauma to the fingers perform tasks virtually that they cannot do in the real world. Meta is also working on a haptic glove prototype.

Haptic technology could also be beneficial in helping people understand each other's emotions. One app in development6 uses haptic feedback to alert users on the autism spectrum to the underlying emotion in a conversation partner's voice. The app records vocal pitch data through a smartphone microphone, runs it through a neural network, and then delivers a specific vibration to a wristband to indicate the likely emotion. It is thought that this app may help autistic individuals learn to interpret the emotions of the people they talk to, and it could also give anyone an opportunity to improve their emotional awareness and enhance their conversation skills and relationships.
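The source describes this pipeline only at a high level (pitch data from a microphone, a neural network, a vibration on a wristband), so the sketch below is purely illustrative rather than the app's actual implementation. The pitch-summary features, the single-layer stand-in for the neural network, the emotion labels, the vibration patterns, and the send_to_wristband helper are all assumptions introduced here, not details from the source.

```python
# Hypothetical sketch of a pitch-to-vibration pipeline; not the actual app's code.
# Assumes pitch has already been extracted per frame (in Hz), a tiny placeholder
# classifier stands in for the app's trained neural network, and the wristband
# accepts a simple (pulses, intensity) command.

import numpy as np

EMOTIONS = ["neutral", "happy", "angry", "sad"]

# Distinct vibration patterns per emotion: (pulses, intensity 0-1) -- illustrative only.
VIBRATION_PATTERNS = {
    "neutral": (1, 0.2),
    "happy":   (2, 0.5),
    "angry":   (3, 0.9),
    "sad":     (1, 0.6),
}


def pitch_features(pitch_hz: np.ndarray) -> np.ndarray:
    """Summarize a pitch contour (mean, spread, range, slope) as classifier input."""
    slope = np.polyfit(np.arange(len(pitch_hz)), pitch_hz, 1)[0]
    return np.array([pitch_hz.mean(), pitch_hz.std(), pitch_hz.max() - pitch_hz.min(), slope])


def classify_emotion(features: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> str:
    """Single-layer stand-in for the app's neural network: scores -> argmax emotion."""
    scores = features @ weights + bias
    return EMOTIONS[int(np.argmax(scores))]


def send_to_wristband(emotion: str) -> None:
    """Placeholder for whatever call would actually drive the wristband's motor."""
    pulses, intensity = VIBRATION_PATTERNS[emotion]
    print(f"vibrate: {pulses} pulse(s) at intensity {intensity} ({emotion})")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder weights; a real system would load a trained model instead.
    weights, bias = rng.normal(size=(4, len(EMOTIONS))), rng.normal(size=len(EMOTIONS))
    # Simulated one-second pitch contour (100 frames) rising from 180 to 260 Hz.
    contour = np.linspace(180, 260, 100) + rng.normal(0, 5, 100)
    send_to_wristband(classify_emotion(pitch_features(contour), weights, bias))
```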
Of course, as with past technological advances, factors such as cost, infrastructure, and availability will mean that some people will have an easier time entering the metaverse than others. Because this new digital frontier will impact all aspects of our lives (education, employment, finances, etc.), it is vital to be proactive in matters of equal access. Governments, nonprofits, and corporations all have a role to play in this endeavor. If developed with intention, the metaverse promises to be a virtual community that is accessible, safe, and healthy for all, one in which we can reach out to those near and far to share truly touching experiences.

References

1. Drouin M, Kaiser DH, Miller DA. Phantom vibrations among undergraduates: prevalence and associated psychological characteristics. Computers in Human Behavior 2012; 28:1490–1496.
2. Rimol M. (2022) Gartner predicts 25% of people will spend at least one hour per day in the metaverse by 2026. https://www.gartner.com/en/newsroom/press-releases/2022-02-07-gartner-predicts-25-percent-of-people-will-spend-at-least-one-hour-per-day-in-the-metaverse-by-2026 (accessed Mar. 29, 2023).
3. National Research Group. (2021) Initial data released on consumer views of the metaverse. https://assets.ctfassets.net/0o6s67aqvwnu/3CI122pu3J5hd1RJFw988b/98e010c5cb766d823a13b1c54c15d023/NRG_Metaverse_Press_Release.pdf (accessed Mar. 29, 2023).
4. Kopp D. (2022) Is the world really asking for VR games that can put spiders in your mouth? https://www.cbr.com/meta-quest-2-haptic-mouth-feedback/ (accessed Mar. 29, 2023).
5. Best S. (2022) An a-peel-ing invention! Scientists create an assistive glove with inflatable "banana fingers" that can grip a Coke can or a tennis ball—using an autonomous knitting machine. https://www.dailymail.co.uk/sciencetech/article-10785683/Scientists-create-glove-inflatable-banana-fingers-grip-Coke-tennis-ball.html (accessed Mar. 29, 2023).
6. Hardesty G. (2022) Good vibrations. https://news.usc.edu/trojan-family/good-vibrations/ (accessed Mar. 30, 2023).
