© 2021 WOSU Public Media

YouTube Announces It Will Ban White Supremacist Content, Other Hateful Material


YouTube says it is removing thousands of white supremacist and extremist videos from its platform. Now, this does not happen in a vacuum. YouTube was under intense pressure, and its move comes as a broader debate unfolds over whether social media companies are responsible for hate speech posted to their platforms.

NPR's Hannah Allam is here to talk about YouTube's new policy and the wider challenge of trying to police hate online. Hi, there.


KELLY: So what exactly has YouTube announced it's going to do?

ALLAM: In a company blog post today, YouTube said it was going to ban what it called supremacist content - so videos that YouTube says justify discrimination, segregation or exclusion of specific groups of people. And this goes beyond a 2017 policy YouTube introduced that made it harder to recommend, comment on or share videos with hateful content. And so this move today - it really goes beyond that to thousands of videos by white supremacists, neo-Nazis and other types of extremists. And YouTube says that's also going to apply to videos that deny well-documented violent events, so that means Holocaust deniers and deniers of the Sandy Hook Elementary shooting, for example.

KELLY: All of that will be banned going forward - kind of mind-blowing to think that all of that was allowed on YouTube until now. So what prompted this? Why now?

ALLAM: The context is this bigger debate going on about the responsibility of tech companies like YouTube, Facebook and Twitter to police hate speech on their platforms. There's political and public pressure to clean up those sites. Facebook and Twitter have announced similar efforts to remove hateful content. But YouTube was seen as especially resistant.

KELLY: And why? What was going on specifically at YouTube?

ALLAM: Well, we had one clear example this week in the experience of Carlos Maza, a journalist for Vox who'd been targeted for two years by a right-wing YouTube creator with a following of more than 3 million subscribers. Maza was referred to by slurs about his Cuban heritage and his sexual orientation. And that creator's fans even published his cellphone number at one point. Maza reported the harassment to YouTube, which said the slurs did not violate its policies. And here's what Maza had to say about that today in an interview with BuzzFeed News.


CARLOS MAZA: It should not be a policy that you're allowed to get away with harassment and hate speech on YouTube as long as you're popular enough to make them uncomfortable about shutting you down.

ALLAM: There was this big social media backlash echoing Maza's outrage. And examples like his show how, when tech companies fail to act on this quickly, it can really spiral into a PR problem for companies like YouTube and its parent company, Google.

KELLY: I'm also imagining the challenge of, how do they enforce this new policy with the - I don't know what the number is - but the zillions of new videos getting posted on YouTube every day?

ALLAM: And that's the big question. YouTube was already under fire for not doing more to enforce the old measures it had, which were supposed to make it harder for people to find and profit from these kinds of videos. And the blog post didn't go into much detail on enforcement. Critics like Maza, the Vox journalist, have already said they're skeptical this is going to make a difference.

KELLY: NPR's Hannah Allam reporting on the new YouTube policy announced today. Thank you, Hannah.

ALLAM: Thank you. Transcript provided by NPR, Copyright NPR.