Google just changed the parental controls on YouTube Kids; here's what it switched
Google announced a new change to parental controls for YouTube Kids that allows parents to have more control over which videos their children see on the app.

As TechCrunch reported, the change will allow parents to show only videos that have been hand-picked by YouTube as suitable for children, rather than ones chosen through the company's algorithm.

Parents will see a new option on their child's profile to display approved content only. Selecting the option will disable the YouTube search function and show only videos that YouTube's employees have personally recommended.

But parents can customize even more. They can pick YouTube original videos or ones created by family-friendly brands, like PBS Kids and Sesame Workshop, according to The Verge.

Parents can also filter by categories, like arts and crafts, video games and educational videos.

More updates will arrive soon, including the option for parents to pick specific videos or channels that their child can watch, according to TechCrunch.

A report earlier in April said that YouTube planned to release a new white-listed version of the YouTube Kids app, which wouldn't rely on the algorithm at all.

YouTube Kids has faced several controversies over the last year. A Business Insider report found that YouTube Kids suggested videos to children that included conspiracy theories and fake news, including ones about how the Earth is flat and how the U.S. faked all the moon landings.

Another suggested that the planet is ruled by reptile-human hybrids, according to Business Insider.

YouTube came under fire last year, too, when parents noticed that videos on YouTube Kids contained sexual and mature content. Specifically, users would upload videos that appeared to feature cartoon characters and family-friendly material but actually contained violent scenes.

For example, one video showed cartoon characters from Peppa Pig breaking other characters' legs.

However, YouTube vowed to fix the problem by hiring 10,000 reviewers to monitor all of the videos on the YouTube Kids app.