Discord revealed Teen Safety Assist, a new initiative intended to make the platform a safer place for younger users to hang out. The initiative includes safety features enabled by default for teenagers on the app, such as proactive filters and alerts.
“As great as all of these amazing updates to Discord are, none of this matters if you don’t feel safe on Discord,” the company said in a blog post Thursday. Teen Safety Assist will officially roll out next week.
One of the features Discord is implementing is automatically blurring sensitive images shared with teenagers on the platform. The app may have taken a page out of Apple's book: Apple began blurring inappropriate photo messages sent to kids in 2021.
Teens on Discord will also get an alert when they receive a message from a new sender, with prompts asking whether they'd like to reply or block the user.
More than 150 million people use Discord every month, according to the company, and Statista found that 22% of them are younger users aged 16 to 24. The minimum age to join the platform is 13, but Discord is most popular among an older user base in their 20s and 30s.
Teen Safety Assist also includes new warnings for users who break Discord's rules. Offenders will receive a message explaining which rule they broke and whether Discord is issuing them a warning or a violation. These messages give users a chance to understand the rules in place and become better digital citizens.
These warnings, however, apply only to certain rule violations. Discord continues to have a zero-tolerance policy for violent extremism and content that sexualizes children.
Discord has a Safety News Hub where it announces new safety, privacy and policy initiatives. “Creating a safer internet is at the heart of our collective mission,” the company said in a blog post. Roughly 15% of Discord’s employees are focused on safety initiatives.