The Google-owned video-sharing platform has pledged to remove content that glorifies Nazi ideology or denies well-documented events like the Holocaust and 9/11. YouTube is also tweaking its algorithm that recommends videos.
YouTube announced Wednesday it was enforcing a ban on videos that promote racism and discrimination.
The Google-owned streaming service said that content that glorifies ideologies like Nazism, white supremacy and other extremist views had no place on its platform.
The new YouTube policy will prohibit "videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," the company said in a statement.
YouTube will also remove existing content that denies well-documented violent events like the Holocaust, US school shootings, or the September 11, 2001 terrorist attacks.
"YouTube has always had rules of the road, including a longstanding policy against hate speech," the company underlined.
The move comes amid a series of measures taken by tech giants to filter out hateful and violent content, which have triggered calls for tougher internet regulation.
"We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we'll be gradually expanding coverage over the next several months," YouTube said on Wednesday.
The streaming service also said it was making changes to its algorithm that recommends videos to its users.
A curb on free speech?
YouTube, however, acknowledged that its new policy would affect the work of researchers who analyze videos "to understand hate in order to combat it."
Internet restrictions are also opposed by advocates of free speech, who argue that all kinds of ideas should be allowed in the public sphere.
Jonathan Greenblatt, chief executive of the Anti-Defamation League, which researches anti-Semitism, said the move alone was insufficient to curb discrimination. "Many more changes from YouTube and other tech companies" would be needed to "adequately counter the scourge of online hate and extremism," Greenblatt said in a statement.
Social media giants, including Facebook and Twitter, were heavily criticized after the March 15 terror attacks on New Zealand mosques for their perceived inactivity in dealing with material livestreamed by the suspect.
Following the attacks, which saw 51 people killed, Facebook admitted that it had not done enough. The attacker livestreamed the rampage on Facebook for 17 minutes before the company removed it, and clips from the stream had already gone viral. An Australian white supremacist was charged with murder and terrorism.
Facebook subsequently announced that it would ban praise or support of white nationalism and white separatism as part of a crackdown on hate speech.
shs/rt (Reuters, AFP, dpa)