YouTube is preparing to make some serious changes to the way it manages offensive and extremist content, both content that violates its policies and content that does not.
Today, in a blog post, the company said it has begun using an AI-assisted video detection and removal system, which over the past month has taken down more than 75 percent of such videos without the help of humans.
The company is also working with a number of non-governmental organizations, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue.
“These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists,” said YouTube in the blog post.
Videos that contain “controversial religious or supremacist content” but don’t violate any of YouTube’s policies will now be placed in a “limited state.” YouTube said, “The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes.” The limited state will start being applied on desktop in the coming weeks and will hit mobile shortly thereafter.
YouTube said that these changes are just the beginning and that it will share more about its work in the months ahead.