Tech

TikTok Tries to Make App Safer With New Filters

MATURE

The social media company is testing out new features that would rate certain content as mature and allow users to filter out keywords or hashtags they’re not interested in.


TikTok is testing new safety features that would filter out problematic or mature content, particularly for minors, the company said in a press release Wednesday. One feature, called “Content Levels,” would assign maturity ratings to certain content, much like movie and TV ratings, so that videos deemed mature would not be shown to minors. Parents of teen users have sued the company after their children died attempting dangerous trends they saw on the app. Under Content Levels, Trust and Safety moderators would assign ratings to videos that are gaining popularity or have been reported. The company is also testing a feature that would give users more control over their experience by letting them block words or hashtags they don’t want to see in their For You or Following feeds. “We also acknowledge that what we’re striving to achieve is complex and we may make some mistakes,” Cormac Keenan, TikTok’s head of trust and safety, said in the release.

Read it at TechCrunch