YouTube under fire
YouTube on Wednesday updated its policies to ban videos that promote extremist ideologies such as white supremacy or caste superiority, a move that could see hundreds of thousands of videos removed.
The changes aimed at curbing hate speech and misinformation come amid escalating scrutiny from regulators and lawmakers around the world over how the company, which is owned by Google, deals with bigoted or potentially radicalizing content.
But the ban was derided by critics on Wednesday as a public relations stunt. YouTube announced the policy in the midst of a high-profile controversy over its decision to allow a conservative commentator accused of racist and homophobic harassment to remain on the platform.
Capitol Hill criticism: Tech industry critics on Capitol Hill accused YouTube of putting profits over user safety and failing to dedicate enough resources to prevent the posting of hateful and extremist content to the platform, which boasts a user base of more than 1.8 billion people.
“Congress has focused a great deal on the ways in which Russian operatives exploited Twitter and Facebook in 2016, but an underdeveloped area has been the extent to which YouTube has been used by a range of bad actors, including far-right groups, to facilitate targeted harassment, spread extremist content, and radicalize an entire generation of young users,” Sen. Mark Warner (Va.), the top Democrat on the Senate Intelligence Committee, said in a statement to The Hill.
Sen. Brian Schatz (D-Hawaii) responded with skepticism to YouTube’s announcement on Wednesday, saying the company seems “to only do things when they’re under pressure.”
“What they actually need is human observation of their platform — not just externally from journalists and people on Twitter, but internally as a matter of principle and a matter of how they actually run their company,” Schatz said. “Algorithms are amoral. People can make judgments.”
Activist criticism: Civil rights activists, who have been pressing YouTube to do more about extremists proliferating and making money on its platform, celebrated the new ban on Wednesday but questioned the timing.
“The announcement of this policy, just days after being called out for their continued inaction, underscores YouTube’s flawed approach to handling the growth of white nationalism on its platform: as a public relations crisis first and an operational priority second,” digital civil rights group Color of Change’s president, Rashad Robinson, said.
What YouTube changed: YouTube in a blog post on Wednesday announced it will begin removing videos alleging that any group is superior in order to justify “discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
The Google-owned company said videos that “glorify Nazi ideology” would fall under that category, and it will begin banning videos that promote hoaxes like claims that the Sandy Hook shooting did not happen.
The controversy: But the moves come with the company under fire after it said it would not take down videos from a far-right commentator accused of harassing Vox Media journalist Carlos Maza. Maza, who identifies as gay and is Cuban American, highlighted commentator Steven Crowder using racist and homophobic slurs in videos.
YouTube investigated Crowder and concluded that he had not violated any of the company’s policies.
Backtrack: After criticism, YouTube on Wednesday said it will no longer allow Crowder to make money from ads on his videos, citing “a pattern of egregious actions” that “has harmed the broader community.”
In a tweet shortly after, YouTube clarified that Crowder can have the restrictions lifted if he stops selling merchandise that says “Socialism is for Fags” on his account and resolves other “issues” with his YouTube presence.
The company’s response only highlighted YouTube’s challenges in enforcing its rules and determining where to draw the line on problematic content.
Maza in an interview with The Hill called YouTube’s latest move “a bullshit policy.”
“All of the content that should be targeted by this policy should already have been removed under YouTube’s anti-harassment and anti-hate speech policies,” Maza said. “What content is caught by this policy that shouldn’t already have been caught by the former ones?”