Human rights groups press TikTok on hate speech

A coalition of tech advocacy and human rights groups is urging TikTok to crack down on the spread of hate speech and extremism after a recent report detailed the reach of such content on the platform.

The top line: More than a dozen groups signed on to a letter sent to top TikTok executives Monday, calling for the platform to take “substantive action” against hate speech and extremism, citing a report released by the Institute for Strategic Dialogue (ISD) last week that outlined the breadth of the harmful content on the video-sharing app.

“We’ve already seen countless examples of online hate translating to offline violence, both in the United States and around the globe, often with deadly consequences. It’s because of this dangerous reality that TikTok has a responsibility to act to address gaps in your content moderation policies and boost transparency more widely for the public and researchers. For some populations around the world, this is a matter of life and death,” the groups wrote.

The background: The report is based on three months of research on a sample of 1,030 videos from 491 accounts posted to TikTok. To conduct the research, ISD generated a list of 157 keywords associated with extremist individuals, groups, ideologies and related incidents or events. After finding that such accounts typically follow or are followed by other accounts with shared ideological interests, ISD used a “snowball methodology” to expand the sample.

Researchers identified 312 videos — 30 percent of the full sample reviewed — promoting white supremacy. They also reported that 246 videos — 24 percent of the sample — featured support for an extremist or terrorist individual or organization.

What’s next: UltraViolet, which spearheaded the letter, and the 19 other signatories urged TikTok to implement the recommendations included in ISD’s report.