Building a Responsible Metaverse

Digital Safety

A combination of policies, practices, tools and technology is needed to meet the basic safety needs of users in virtual spaces, both for consumer and enterprise metaverse applications. While the metaverse has the potential to reimagine and enrich how people engage with each other, it can also exacerbate online harms such as cyberbullying and harassment. Safety is a foundational human need; Maslow's Hierarchy of Needs posits that people's need to feel safe overrides most other human needs.

Enforcing digital safety is even more critical in the metaverse because the immersive and embodied nature of virtual experiences blurs the lines between digital and physical harms. Anecdotal user reports confirm that people can viscerally feel the offensive behaviors that occur on immersive platforms. Additionally, platforms that are governed in a decentralized manner will not only struggle with implementing tools to detect objectionable content but will also face challenges deciding what is allowed and prohibited while ensuring compliance with regulatory requirements.12

Moderating virtual environments is more complex largely because interactions in the metaverse occur in real time and involve gestures and speech, not just static data such as pictures, text and videos. This means that digital safety and content moderation in the metaverse may depend more on artificial intelligence (AI) to detect harmful content and behavior. In leveraging these tools, it is important to acknowledge that AI still struggles with contextual differences. Algorithms are also only as good as the data they are trained on, and implicit biases embedded in the data may skew results. For example, historically more men than women have engaged in online gaming, so an AI system may not immediately recognize offensive behaviors toward female gamers.

The metaverse will require hybrid approaches to digital safety that combine technology-driven tools (e.g., AI, analytics), human moderation, and user- and community-driven approaches (e.g., features that let users control their own safety). Our study confirmed as much: when asked about digital safety, consumers indicated that their perceptions of trust were positively influenced by a variety of digital safety features, some human-driven and others technology-driven.15
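To make the hybrid model concrete, the minimal sketch below (in Python) illustrates one way such layers could be combined: user-driven controls are applied first, an AI toxicity score drives automated filtering, and ambiguous cases are escalated to human moderators rather than decided automatically. All names, thresholds and data structures here are hypothetical illustrations, not features of any specific platform or of the study cited above.

# Illustrative sketch of a hybrid moderation decision; every identifier
# and threshold below is hypothetical.
from dataclasses import dataclass, field
from enum import Enum


class Action(Enum):
    ALLOW = "allow"
    BLOCK = "block"          # filtered automatically for this user
    HUMAN_REVIEW = "review"  # escalated to a human moderator


@dataclass
class SafetyPreferences:
    """User- and community-driven controls (e.g., a personal boundary)."""
    muted_users: set[str] = field(default_factory=set)
    personal_boundary: bool = True   # keep other avatars at a distance
    strictness: float = 0.5          # 0 = permissive, 1 = strict


@dataclass
class ModerationEvent:
    """One real-time interaction: speech, gesture or proximity."""
    source_user: str
    target_user: str
    toxicity_score: float            # output of an AI classifier, 0..1
    is_proximity_violation: bool = False


def moderate(event: ModerationEvent, prefs: SafetyPreferences) -> Action:
    """Combine AI detection with user-controlled settings."""
    # User-driven controls take precedence over the AI signal.
    if event.source_user in prefs.muted_users:
        return Action.BLOCK
    if prefs.personal_boundary and event.is_proximity_violation:
        return Action.BLOCK

    # Technology-driven signal: stricter users lower the blocking threshold.
    block_threshold = 0.9 - 0.3 * prefs.strictness
    if event.toxicity_score >= block_threshold:
        return Action.BLOCK

    # Mid-range scores reflect the contextual ambiguity AI still struggles
    # with, so they are routed to human review instead of auto-enforced.
    if event.toxicity_score >= 0.5:
        return Action.HUMAN_REVIEW
    return Action.ALLOW

Routing mid-range scores to human reviewers rather than auto-enforcing them is one way to acknowledge, in system design, that AI still struggles with contextual differences.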
