Managing Online Toxicity: Challenges and Solutions
At the Reboot Develop Blue 2024 conference, Unity's Micaela Hays addressed the pressing issue of online toxicity, framing it not only as an ethical matter but also as a crucial business decision. Between 2021 and 2023, the share of players who experienced some form of online toxicity rose from 68% to 76%, and roughly 49% of players said they had avoided specific games because of it.

Asked about the rising numbers, Hays pointed to the COVID-19 pandemic, which drove a surge in gaming activity and, she argued, an erosion of empathy among players. The lack of face-to-face interaction during that period, particularly among young people, fostered a more self-focused, less empathetic online community. Hays noted that this phenomenon is not unique to gaming but reflects a broader societal issue, and the data shows that toxic behavior is not confined to a small group of individuals but is a widespread problem.

Hays also believes the early 2000s' tolerance for toxic behavior in online gaming contributed to the current situation: players who began their gaming lives in those environments may regard toxicity as a normal part of the experience. Behavioral patterns this deeply ingrained are hard to change, but companies increasingly recognize a responsibility to address them. Hays highlighted a positive trend: roughly 96% of people are willing to take action against online toxicity, and many would even pay a premium for a game with a non-toxic environment.

To combat online toxicity, companies have employed various strategies, including moderation based on user reports, reporting channels on forums or Discord, and speech-to-text transcription.
These methods have limitations, however: they rely on users to file reports, speech-to-text transcription misses nuance, and contextual behavior is hard to evaluate.

Hays' approach to addressing online toxicity is informed by her background as a teacher. Managing online communities, she believes, shares similarities with teaching: it involves not only punishing misbehavior but also educating users. Some games, for instance, reward players who avoid bans, reinforcing positive behavior.

Online toxicity is also closely tied to the well-being of moderators and support staff, who often suffer high levels of stress and burnout from constant exposure to toxic language and behavior. Hays emphasized the need for better support and compensation for these workers.

To address these challenges, Unity has developed Safe Voice, a cross-platform tool that uses machine learning to manage online toxicity. Safe Voice combines transcription with tonal analysis and environmental monitoring to track player behavior and responses, producing detailed, easy-to-parse reports that give moderators the context they need to make informed decisions.

Automated systems risk misreading context or nuance, but Hays believes Safe Voice's customizable design and focus on tonal analysis mitigate that risk: the system is built to adapt to different community standards and can distinguish acceptable from unacceptable language in different contexts. She acknowledges that human intervention is still required to evaluate the flagged data and take action, but hopes that future advances will allow greater automation. Ultimately, the goal is a safer, more positive online environment for all players.
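To make the idea concrete, here is a minimal sketch of the kind of pipeline described above: blending what a player said (a transcript) with how they said it (a tonal-analysis score) and applying a community-tunable threshold before surfacing a clip to a human moderator. Everything here is illustrative; the word list, weights, thresholds, and class names are assumptions for the example, not Unity's actual Safe Voice implementation.

```python
from dataclasses import dataclass

# Placeholder word list; a real system would use a trained text classifier.
FLAGGED_TERMS = {"idiot", "trash", "loser"}

@dataclass
class VoiceClip:
    transcript: str
    hostility_tone: float  # hypothetical tonal-model output: 0.0 (calm) .. 1.0 (aggressive)

def triage_score(clip: VoiceClip, tone_weight: float = 0.6) -> float:
    """Blend flagged words with tone, so playful banter using edgy words
    scores lower than the same words delivered aggressively."""
    words = [w.strip(".,!?") for w in clip.transcript.lower().split()]
    text_score = min(1.0, sum(w in FLAGGED_TERMS for w in words) / 3)
    return tone_weight * clip.hostility_tone + (1 - tone_weight) * text_score

def needs_review(clip: VoiceClip, threshold: float = 0.5) -> bool:
    """Communities can tune the threshold to their own standards."""
    return triage_score(clip) >= threshold

# Same word ("trash") in both clips; only the aggressive one is flagged.
calm = VoiceClip("nice shot, you trash talker", hostility_tone=0.1)
angry = VoiceClip("you absolute trash idiot", hostility_tone=0.9)
print(needs_review(calm), needs_review(angry))  # → False True
```

The design choice mirrors the talk's point about nuance: a text-only filter would flag both clips identically, while weighting in tone lets the system separate trash talk between friends from genuine hostility, and the threshold keeps final judgment configurable per community.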