How to make online games less toxic? GDC developers debate moderation
SAN FRANCISCO — As long as popular video games depend on online services like matchmaking and chat, those games will suffer from toxicity, harassment, and bullying. Or at least, that’s the assumption some panelists at this year’s Game Developers Conference (GDC) are keen to soften or quash altogether.
Before the conference exhibit hall opened on Wednesday morning, we listened to a few attendees express their hopes for more positive social gaming environments — and three perspectives stood out as a combined pitch for a better future. None of these pitches has proven itself yet, but each points to different, seemingly smarter steps toward a better online gaming ecosystem.
Lower the temperature on the “heat maps”
The first pitch, from game moderation startup Good Game Well Played (GGWP), suggests aiming an AI-powered laser at the problem. Co-founded by professional gamer and entrepreneur Dennis “Thresh” Fong, GGWP is designed to slot into existing game moderation systems and strengthen report-based moderation by coupling it with two types of real-time data: voice chat and gameplay “heat maps.”
The project began at the start of the 2020 pandemic, Fong told Ars Technica, after conversations with game makers about the variety of toxic behaviors in online games. Fong was surprised by one unnamed game’s revelation: less than 1 percent of its user-generated reports were ever moderated. (Industry-wide statistics on this subject remain obscure, in part because player bases are fragmented across a number of online portals, ranging from Xbox Live and PlayStation Network to publisher- and game-specific matchmaking queues. Industry-wide audits don’t really exist to confirm toxic online trends across them all.)
The problem, Fong says, comes down to available resources. In that unnamed game, insults, bad sportsmanship, and even racial slurs sink to the bottom of the moderation queue, while reports of violent threats, self-harm, danger to children, and other extreme cases get attention first.
This iceberg approach leaves enough annoying, gameplay-centric toxicity intact to frustrate players, or drive them out of certain games altogether. So Fong and his eventual GGWP partners began plotting a system that triangulates every in-game report with data from the game sessions in question. Fong says that with a single API call, GGWP can funnel voice chat through its systems and use speech recognition to analyze the language a reported player is using. Other API calls can do the same to track relevant data from each game session and then check whether a reported player shows negative trends across other sessions of the same game.
Behaviors GGWP tracks include friendly fire, rage quitting, body blocking (intentionally standing in your teammates’ way as they try to attack enemies), and feeding (playing badly on purpose to let opponents win). Fong also suggests that players who exhibit none of the negative traits GGWP tracks could benefit from a positive reputation score, although exactly how a game would recognize this remains unclear.
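To make the idea concrete, here is a minimal sketch of how per-session behavior flags could roll up into a reputation score. Everything here is invented for illustration — the class names, penalty weights, and scoring rule are not GGWP’s actual system, which has not been publicly detailed.

```python
# Hypothetical illustration: aggregating flagged behaviors from game
# sessions into a simple reputation score. All names and weights are
# assumptions for this sketch, not GGWP's real API or model.
from dataclasses import dataclass, field

# Invented penalty weights for the behaviors named in the article.
PENALTIES = {
    "friendly_fire": 2.0,
    "rage_quit": 3.0,
    "body_blocking": 1.5,
    "feeding": 4.0,
}

@dataclass
class PlayerRecord:
    sessions: int = 0
    flags: dict = field(default_factory=dict)  # behavior -> count

    def record_session(self, behaviors):
        """Log one game session and any behaviors flagged in it."""
        self.sessions += 1
        for b in behaviors:
            self.flags[b] = self.flags.get(b, 0) + 1

    def reputation(self, baseline=100.0):
        """Subtract weighted penalties, averaged per session, from a
        clean baseline; players with no flags keep the full score."""
        if self.sessions == 0:
            return baseline
        penalty = sum(PENALTIES.get(b, 1.0) * n
                      for b, n in self.flags.items())
        return max(0.0, baseline - penalty / self.sessions)

# A clean player keeps the baseline score...
clean = PlayerRecord()
clean.record_session([])
print(clean.reputation())    # 100.0

# ...while a repeat offender trends downward across sessions.
griefer = PlayerRecord()
griefer.record_session(["friendly_fire", "feeding"])
griefer.record_session(["rage_quit"])
print(griefer.reputation())  # 100 - (2 + 4 + 3) / 2 = 95.5
```

Averaging penalties over sessions, rather than summing them forever, is one way such a system could avoid permanently punishing a player for a single bad night — but again, whether GGWP does anything like this is not public.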
Fong suggests that GGWP’s voice chat tracking systems will recognize session-specific context, especially when the audio in question is between friends; he thinks it will work using an ever-evolving AI model trained on in-game chat. However, when asked about the system’s ability to analyze language that might target traditionally marginalized groups, Fong suggested that the phrase “go back to the kitchen” could mean different things to different people in an online game. (At press time, GGWP’s public website did not list any women on its board.)