Critics fear Facebook losing misinformation fight

Facebook’s program to hire third-party fact-checkers to crack down on misinformation on the platform has been ramping up, with partners adding staff and expanding their work.

But the program still faces skepticism from activists and tech industry critics who say the company and its partners are not providing the resources needed to address the scope of the problem on a platform with more than 2 billion users.

Launched in December 2016 after intense criticism of how Facebook handled false or misleading content in the last presidential election, the third-party fact-checker program shifts responsibility for verifying the accuracy or truthfulness of content from Facebook to independent, verified outside organizations.

Facebook does not fact-check content itself. However, it does send posts that its algorithm flags as potentially false to its partners. The partners can review the content flagged by Facebook or search for and identify false content themselves.

The hard numbers: Six partners evaluate U.S. content under the program. All of those The Hill reached said they had recently added staff to their efforts or were in the process of doing so. Together, Facebook’s six partners employ 26 full-time staff and fact-check roughly 200 pieces of content per month.

Is it enough? Experts who spoke to The Hill said those changes were insufficient to make a serious dent in the fake accounts and disinformation they say are rampant on Facebook.

“The volume seems inadequate given the scale of the challenge that Facebook faces,” Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights, said.