
YouTube Kids lets parents override algorithm that served disturbing videos


But the default is still the automated system criticized for letting disturbing content through its filters

The YouTube Kids app is engineered to automatically exclude content that's not appropriate for kids, and to recommend videos based on what children have watched before. That hasn't always worked to parents' liking, especially when videos with profanity, violence or sexual themes slip through the filters. (AnnaTamila/Shutterstock)

YouTube is overhauling its kid-focused video app to give parents the option of letting humans, not computer algorithms, select what shows their children can watch.

The updates that begin rolling out Thursday are a response to complaints that the YouTube Kids app has repeatedly failed to filter out disturbing content.

Google-owned YouTube launched the toddler-oriented app in 2015. It has described it as a "safer" experience than the regular YouTube video-sharing service for finding Peppa Pig episodes or watching user-generated videos of people unboxing toys, teaching guitar lessons or experimenting with science.

The updates to YouTube Kids allow parents to switch off the automated system and choose a contained selection of children's programming such as Sesame Street and PBS Kids. But the automated system remains the default. (Adrian Bradshaw/EPA)

To meet U.S. child privacy rules, Google says it bans kids under 13 from using its core video service. But those terms are largely ignored by tens of millions of children and their families who don't bother downloading the under-13 app.

Both the grown-up video service and the YouTube Kids app have been criticized by child advocates for their commercialism and for the failures of a screening system that relies on artificial intelligence. The app is engineered to automatically exclude content that's not appropriate for kids, and to recommend videos based on what children have watched before. That hasn't always worked to parents' liking, especially when videos with profanity, violence or sexual themes slip through the filters.

Off switch for algorithm

The updates allow parents to switch off the automated system and choose a contained selection of children's programming such as Sesame Street and PBS Kids. But the automated system remains the default.

"For parents who like the current version of YouTube Kids and want a wider selection of content, it's still available," said James Beser, the app's product director, in a blog post Wednesday. "While no system is perfect, we continue to fine-tune, rigorously test and improve our filters for this more-open version of our app."

Robert Kyncl, YouTube's chief business officer, speaks about YouTube Kids in 2015, the year Google-owned YouTube launched the toddler-oriented app. (Danny Moloshok/Associated Press)

Beser also encouraged parents to block videos and flag them for review if they believe the videos don't belong on the app. But the practice of addressing problem videos after children have already been exposed to them has bothered child advocates, who want the more controlled option to be the default.

"Anything that gives parents the ability to select programming that has been vetted in some fashion by people is an improvement, but I also think not every parent is going to do this," said Josh Golin, director of the Boston-based Campaign for a Commercial-Free Childhood. "Giving parents more control doesn't absolve YouTube of the responsibility of keeping the bad content out of YouTube Kids."

He said Google should aim to build an even cleaner and safer kids' app, then pull all the kid-oriented content off regular YouTube, where most kids are actually watching, and onto that app.

Golin's group recently asked the Federal Trade Commission to investigate whether YouTube's data collection and advertising practices violate federal child privacy rules. He said advocates plan to meet with FTC officials next week.