The Evolution of Online Toxicity Management

Micaela Hays, from Unity, took the stage at Reboot Develop Blue 2024 to address the pressing issue of online toxicity, its implications, and the need to combat it from both an ethical and a business perspective. Between 2021 and 2023, the share of gamers who experienced online toxicity rose from 68% to 76%, and roughly 49% of players reported avoiding certain games for that reason.

When asked about the rising numbers, Hays pointed to the COVID-19 pandemic and the surge in the size of gaming communities that came with it, a surge that has since plateaued as people returned to work and school. Lockdown had a profound impact on mental health and produced a shift in empathy: cut off from face-to-face interaction, players became more self-focused. This is particularly concerning for young people who missed out on crucial human contact during their emotional development. Hays explained that this siloed environment drained the humanity from online interactions, making it easier to engage in toxic behavior behind the anonymity of a screen. The data also points to a generalized increase in toxicity, rather than one driven by a small group of individuals.

The current situation can be traced back to the early 2000s, when there was no anti-toxicity movement and such behavior was normalized in gaming culture. Hays suggests that starting one's gaming life in a toxic environment breeds the expectation that toxicity is simply part of the experience. Changing behavioral patterns that have been normalized for so long is difficult, but a growing number of companies now recognize the importance of the issue and are taking responsibility for the environments they create for their players.

Hays also highlighted a positive finding: approximately 96% of people want online toxicity addressed, and many are willing to pay more for a game with a non-toxic environment. Companies have tried to tackle the problem through moderation, but this usually involves compromises, such as relying on user reports, which are subjective and often lack supporting evidence.

Hays's own approach is rooted in her teaching background, where she learned the value of public speaking and of adapting to different audiences. She believes that steering an online community away from toxic behavior resembles teaching: it involves both educating and sanctioning, and it can take many forms, such as limiting rewards to players who have not received bans.

Toxicity management is also a manpower problem, since moderation and customer support teams suffer high turnover due to the stressful nature of the work. Hays emphasized the need to keep humans in the loop even when using AI-powered tools like Safe Voice, which combines transcription with tonal analysis to supply context while protecting both players and moderators. Automated tools always carry some risk of misreading context and nuance, but Hays believes Safe Voice handles these concerns well and can be customized to accommodate different community standards. The tool is designed to deliver detailed, easy-to-parse context, making it a valuable asset in the fight against online toxicity.
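Hays did not walk through Safe Voice's internals or its API on stage, but the core idea she described — pairing what was said (a transcript) with how it was said (tonal cues), then handing a human moderator an easy-to-parse summary — can be illustrated with a minimal sketch. Everything below is a hypothetical stand-in (the Utterance fields, the keyword list, the thresholds), not Unity's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    """One transcribed line of voice chat, with tonal features attached.

    All fields are hypothetical stand-ins for what a transcription +
    tonal-analysis pipeline might emit; this is not Safe Voice's schema.
    """
    speaker: str
    text: str            # output of speech-to-text
    anger_score: float   # 0.0-1.0, from a tonal/emotion model
    loudness_db: float   # relative loudness of the clip

# Illustrative keyword list; a real system would use a trained
# classifier and per-community configuration instead.
FLAGGED_TERMS = {"idiot", "trash", "uninstall"}

def review_priority(u: Utterance,
                    anger_threshold: float = 0.7,
                    loudness_threshold: float = 6.0) -> str:
    """Combine what was said (transcript) with how it was said (tone).

    The same words score differently depending on delivery, which is
    the context a bare text report lacks.
    """
    has_flagged_text = any(term in u.text.lower() for term in FLAGGED_TERMS)
    sounds_hostile = (u.anger_score >= anger_threshold
                      or u.loudness_db >= loudness_threshold)

    if has_flagged_text and sounds_hostile:
        return "high"    # hostile words, hostile delivery: escalate
    if has_flagged_text or sounds_hostile:
        return "medium"  # ambiguous (banter vs. abuse): needs a human
    return "low"

def build_moderator_report(session: list[Utterance]) -> list[dict]:
    """Produce the kind of easy-to-parse context a human moderator
    can scan, instead of raw audio or an unsupported user report."""
    return [
        {"speaker": u.speaker, "text": u.text, "priority": review_priority(u)}
        for u in session
        if review_priority(u) != "low"  # surface only what needs review
    ]

if __name__ == "__main__":
    session = [
        Utterance("player_a", "nice shot", anger_score=0.1, loudness_db=1.0),
        Utterance("player_b", "uninstall the game, idiot",
                  anger_score=0.9, loudness_db=8.0),
    ]
    for item in build_moderator_report(session):
        print(item)  # only player_b's line surfaces, tagged "high"
```

The point of the sketch is the shape of the output rather than the scoring: flagged lines arrive with their transcript and tonal context attached, and a human makes the final call, mirroring Hays's insistence that automation assists but does not replace human judgment. Thresholds like anger_threshold are exactly where per-community customization would live.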
In conclusion, Hays stressed that addressing online toxicity is not only a moral imperative but also a sound business decision. With the help of tools like Safe Voice and a human-centered approach, it is possible to create a safer and more positive online gaming environment.