Facebook, Google face tough questions on white nationalism
Testifying before the House Judiciary Committee on Tuesday, Facebook and Google sought to reassure lawmakers that they are cracking down on white nationalism and extremist content online.
The hearing on white nationalism and hate crimes came as the tech giants face threats of tougher regulations in the U.S. and abroad after they struggled to remove viral footage of a mass shooting at two New Zealand mosques last month.
What lawmakers said: Lawmakers asked Google and Facebook to explain how they deal with hateful content, pushing them to account for the role their platforms have played in the resurgence of white nationalism in the U.S. over the past few years.
“These platforms are utilized as conduits to spread vitriolic hate messages into every home and country,” Judiciary Committee Chairman Jerrold Nadler (D-N.Y.) said during his opening remarks. “Efforts by media companies to counter this surge have fallen short, and social network platforms continue to be used as ready avenues to spread dangerous white nationalist speech.”
How tech responded: Facebook public policy director Neil Potts and Google public policy and government relations counsel Alexandria Walden testified, alongside several civil rights advocates, who accused the companies of allowing their platforms to empower white nationalists and white supremacists.
Potts said Facebook’s policies dictate “white supremacists are not allowed on the platform under any circumstances,” noting that Facebook has been increasing its efforts to remove hate content in the last few years.
Walden touted Google’s artificial intelligence tools, which are trained to remove violent and extreme content that violates Google-owned YouTube’s community guidelines. But Walden also warned that “overaggressive” enforcement can censor some voices.
Why it matters: According to data from the Anti-Defamation League, white supremacists have been responsible for more than half of all domestic extremist murders in the past 10 years. In 2018, white supremacists committed 78 percent of all extremist murders in the country. And white supremacists have often used Facebook, Google, Twitter and more fringe social media platforms to organize and recruit new members.
The big takeaway: The pressure on tech companies to do more about extremist content is intensifying, and lawmakers made clear they will be watching closely how the companies address it.
“Figure it out,” Rep. Cedric Richmond (D-La.) warned Google and Facebook. “Because you don’t want us to figure it out for you.”