YouTube updates child content policy

YouTube on Monday rolled out a series of changes to its content policies aimed at protecting children on the platform. The move is an effort to appease federal regulators, who fined the company millions last year over alleged privacy violations.

First introduced last September, the changes took full effect Monday. The Google-owned company will now restrict the collection of data from people who watch videos made for children, whether or not the viewers are children themselves.

YouTube will also stop running targeted ads on content made for children.

The decision about which content falls under the new rules will rest primarily with content creators. As of Monday, creators must designate during the upload process whether a video is made for children.

“We also use machine learning to help us identify this content, and creators can update a designation made by our systems if they believe it is incorrect,” the company said in a blog post Monday, clarifying that YouTube can label a video as made for kids even if its creator has not designated it that way.

“We will only override a creator designation if abuse or error is detected.”