Newsletter Subject

How AI-assisted voice moderation can help combat toxicity

From

gamesindustry.biz

Email Address

noreply@gamesindustry.biz

Sent On

Thu, Nov 16, 2023 10:34 AM

Email Preheader Text

Why Modulate’s voice-native moderation tool ToxMod is more beneficial than old forms of grief reporting

Why Modulate’s voice-native moderation tool ToxMod is more beneficial than old forms of grief reporting

You have been sent a sponsored message via GamesIndustry.biz, in association with Modulate.

Modulate: how AI-assisted voice moderation tools can help combat toxicity

Voice moderation is a sensitive issue. Players expect privacy, but the halcyon days of early, friendly online gaming are long gone. Today, when players interact with strangers in online games, it can all too often lead to toxic behaviour. Striking the balance between player privacy and the safety of online communities is the challenge facing games studios today.

Boston-based start-up Modulate wants to help game companies clean up toxic behaviour in their games with machine learning-based tools that promise to empower moderators and protect players. Modulate CEO Mike Pappas told GamesIndustry.biz why its voice-native moderation tool ToxMod is more beneficial than old forms of grief reporting, why studios should build effective codes of conduct amid changing online safety regulations, and how its technology and guidance are helping to make online communities safer.

ToxMod: machine-assisted proactive reporting vs user-generated reporting

User-initiated incident reports for bad behaviour have been standard for many years – just this summer, Xbox rolled out new voice reporting features for its platforms. But Pappas says game studios cannot rely on this method alone.

“User reports are a really important part of the overall trust and safety strategy. You have to give your users ownership,” he says. “But you can’t just rely on that channel.
A lot of users have had bad experiences with user reporting systems in the past, where they feel like their reports go into a black box: they don’t have the tools to submit them, and they don’t know if anything they reported is actually getting addressed.”

Submitting a report often takes players out of the action when playing a game or chatting in a social group. This plays a part in Modulate’s assertion that user reports capture only 10% to 30% of the bad behaviour in studios’ games. This is what the AI-assisted ToxMod, Modulate’s voice-native and comprehensive moderation tool, aims to improve.

“There’s a lot of tools out there to just transcribe audio, but you lose so much valuable nuance from what’s actually being said. We’re not only looking at things like tone and emotion, but we can even look at behavioural characteristics,” says Pappas. “So, if you join a group of people, say one thing, and then everyone goes shocked silent for a second, that’s a warning sign. We can look at all of those kinds of indications.”

“Users still can report, but especially for the really harmful stuff, like child grooming or violent radicalisation, you often have a target that is not aware of what’s happening and not able to report it. So it was really important for us to be able to proactively bring those things to a studio’s attention.”

To read the full article, click here.

Copyright (C) 2023 ReedPop. All rights reserved. You got this email because you signed up to an account on GamesIndustry.biz and agreed to receive promotional emails from our partners. Mailing address: ReedPop, 1-6 Grand Parade, Brighton, East Sussex BN2 9QB, United Kingdom.
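The proactive detection Pappas describes, combining what was said with vocal tone and behavioural context such as a sudden group silence after an utterance, can be illustrated with a toy scoring heuristic. Everything below (the `Utterance` fields, weights, and threshold) is an illustrative assumption, not Modulate's actual system:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    text: str               # transcript of what was said
    toxicity: float         # 0..1 text-toxicity score (assumed upstream classifier)
    anger: float            # 0..1 vocal-emotion score (assumed upstream classifier)
    silence_after_s: float  # seconds of group silence following the utterance

def flag_for_review(u: Utterance, threshold: float = 0.6) -> bool:
    """Toy proactive flagging: blend content, tone, and the
    'shocked silence' behavioural signal into a single score."""
    # Normalise the silence signal, saturating at 3 seconds.
    silence_signal = min(u.silence_after_s / 3.0, 1.0)
    # Weights are arbitrary illustrative choices.
    score = 0.5 * u.toxicity + 0.3 * u.anger + 0.2 * silence_signal
    return score >= threshold
```

The point of the sketch is that a flag can fire even when no target files a report, which is the gap Pappas says proactive tools close.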


Email Content Statistics


Subject Line Length

Data shows that subject lines with 6 to 10 words generated a 21% higher open rate.


Number of Words

The more words in the content, the more time the user will need to spend reading. Get straight to the point with catchy short phrases and interesting photos and graphics.


Number of Images

More images or large images might cause the email to load slower. Aim for a balance of words and images.


Time to Read

Longer reading time requires more attention and patience from users. Aim for short phrases and catchy keywords.


Spam Score

Spam score is determined by a large number of checks performed on the content of the email. For the best delivery results, it is advised to lower your spam score as much as possible.


Flesch reading score

The Flesch reading-ease score measures how complex a text is: the lower the score, the more difficult the text is to read. It combines the average sentence length (measured in words) with the average number of syllables per word to calculate reading ease. Text with a very high score (around 100) is straightforward and easy to read, with short sentences and no words of more than two syllables. A score of 60 to 70 is usually considered acceptable for web copy.
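The formula behind the score is public: reading ease = 206.835 − 1.015 × (words per sentence) − 84.6 × (syllables per word). A minimal Python sketch follows; the vowel-group syllable counter is a crude stand-in (production readability tools use pronunciation dictionaries or better heuristics):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels,
    # discounting a common silent trailing 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short, monosyllabic sentences score very high (the formula can exceed 100), while long words and long sentences drive the score down sharply.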


Technologies

What powers this email? Every email we receive is parsed to determine the sending ESP and any additional email technologies used.


Email Size (not including images)


Copyright © 2019–2024 SimilarMail.