Keynote: “Tackling toxicity on FACEIT: Using AI and community moderation to create inclusive player experiences”
Maria Laura Scuri (VP of Community Integrity and Labs at the ESL FACEIT Group)
Toxicity is a common issue in online multiplayer games. From offensive language directed at other players, to deliberately playing loud music in voice chat, to griefing and trolling, different forms of toxic in-game behaviour have a strong negative effect on the player experience. Over the last few years, we have been working on a range of solutions to foster less toxic and more inclusive gaming environments. At the forefront of these efforts sits Minerva, a state-of-the-art in-game moderation system developed at FACEIT. Minerva detects toxicity in real time by analysing millions of chat messages and audio files every hour, issuing warnings and bans to players who engage in toxic behaviour. Outside of the game, we are investing in social solutions that help to de-anonymise the matchmaking experience and nurture more communal forms of online competitive play. Through initiatives such as Community Clans, we are creating moderated online spaces where players can compete with their future friends. We believe that toxicity is a complex issue that needs to be tackled from different angles, and through collaboration between stakeholders, to create positive change in the long run. This is why we look to share knowledge and collaborate to build more inclusive gaming communities, now and in the future.
More information will follow…