YouTube announced Wednesday that it will ban videos espousing white supremacist ideology and conspiracy theories about the Holocaust and the Sandy Hook school shooting.
"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," the company wrote in a blog post.
The company said the new policy aims to “prevent our platform from being used to incite hatred, harassment, discrimination and violence.”
"In January, we piloted an update of our systems in the U.S. to limit recommendations of borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat," the company wrote. "We’re looking to bring this updated system to more countries by the end of 2019."
The company said that channels that “repeatedly brush up against our hate speech policies” will lose their ability to run ads and monetize their videos.
YouTube under fire for inaction on homophobic harassment:
YouTube has come under heavy criticism after Vox reporter Carlos Maza was repeatedly targeted with anti-gay and anti-Mexican slurs by YouTuber Steven Crowder.
Despite videos in which Crowder calls Maza a “lispy queer,” among other slurs, YouTube announced it would not remove the videos.
“Our teams spent the last few days conducting an in-depth review of the videos flagged to us, and while we found language that was clearly hurtful, the videos as posted don’t violate our policies,” the company said on Twitter.
After coming under fire for inaction, YouTube tweeted that it had suspended monetization on Crowder’s channel but said it would lift the suspension if he removed a link to a site selling T-shirts that say “Socialism is for Fags.”
YouTube algorithm recommended home movies of children on sexually themed content:
YouTube is also under fire after The New York Times reported that its algorithm suggested home movies featuring prepubescent, partially clothed children to users who had watched sexually themed content on its platform.
“Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations,” The Times reported. “So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.”