Sponsored | Modulate: Reducing Toxicity in Online Games Boosts Profits

Toxicity in online gaming has become a pressing concern, and players increasingly expect game studios to take action. New regulations are also arriving that threaten significant fines for studios that fail to protect their players. Addressing toxicity is therefore not only a moral and legal imperative; it also makes business sense.

Modulate CEO Mike Pappas explains that effective content moderation creates a positive, safe gaming environment, which in turn improves player experience and retention. When players feel comfortable and supported, they are more likely to keep playing and spending money on the platform. This matters especially for live service games, which depend on a loyal user base to generate revenue. According to Pappas, a bad experience can drive player churn, and even those who stay may become disillusioned and stop reporting toxic behavior.

The data bears this out. A survey by Take This found that 61% of players spend less money in games where they experience hate speech or harassment. A study by Dr. Constance Steinkuehler found that average monthly spending in toxic games was $12.09, compared with $21.10 in games with safer, more inclusive communities. The potential financial upside of addressing toxicity is clear.

The cost of inaction can also be direct, with regulators imposing fines for non-compliance. Under the EU's Digital Services Act, fines can reach up to 6% of a company's global turnover; under the UK's Online Safety Act, up to 10%.

Modulate's ToxMod is designed to help studios combat toxicity, and Pappas argues that it is not just a cost but a revenue driver. By reducing toxicity, studios can increase player retention and spending. In one example, a studio using ToxMod saw a 6.3% increase in active players after just three days, and a 27.9% increase after 21 days.
That increase in player activity translates into higher spending and revenue for the studio. ToxMod's voice-native technology also reduces the mental toll on moderation teams, allowing them to prioritize and mitigate harm more effectively. By minimizing the time needed to review harmful audio, moderators can focus on the most impactful cases, improving the player experience and, in turn, revenue.

Ultimately, Pappas argues, the costs of toxicity far outweigh the costs of content moderation. By addressing toxicity and creating a safe, welcoming environment, studios can build player trust, retention, and spending, earning significant financial benefits along with a positive reputation.