THAT SEEMS REGRETTABLE
YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform’s algorithm, according to the nonprofit Mozilla Foundation.
The findings, released Wednesday, revealed many instances of YouTube recommending videos that users had marked as “regrettable” — a broad category including misinformation, violence and hate speech.
The 10-month investigation drew on crowdsourced data gathered through browser extensions the foundation released for its own Firefox browser and for Chrome, which let users flag potentially problematic content.