TECH

YouTube Kids app doesn't filter raunchy videos, complaint says

Elizabeth Weise
USA TODAY
A screen shot from a video about the YouTube Kids app, which a coalition of consumer groups wants the FTC to investigate for inappropriate content.

Inappropriate videos found on Google's YouTube Kids app, which is aimed at preschoolers, have prompted a coalition of consumer and child advocacy groups to expand a complaint they filed with the Federal Trade Commission last month.

Some of the videos available through the app's supposedly filtered search function included an ad for protective glasses that featured a nail gun being fired into the eye of a mannequin head, videos on how to juggle chainsaws and knives, and instructions on how to make toxic chlorine gas.

There were also ads for alcoholic beverages and videos that include graphic discussions of pornography, jokes about pedophilia and drug use, and explicit sexual language, the coalition said.

"We have discovered that Google's deceptive practices toward parents are even more widespread and pervasive than we documented in our initial request for investigation," the coalition wrote in a letter sent Tuesday to Donald Clark, FTC secretary.

The app, launched in February, is available on both the iPhone and Android platforms.

It features short videos that are algorithmically filtered for child-friendliness, according to Google, which also has a team that manually samples the videos to make sure they're child-appropriate.

In a statement Tuesday, YouTube said, "We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously. Anyone can flag a video and these videos are manually reviewed 24/7 and any videos that don't belong in the app are removed."

Google has said the app will "screen out the videos that make parents nervous."

However, the coalition found numerous videos available through the app's search function that "would not meet anyone's definition of 'family friendly,'" the letter said.

While the FTC took the initial complaint seriously, the follow-up provides additional evidence that Google is engaging in deceptive acts, said Jeff Chester, executive director of the Center for Digital Democracy in Washington, D.C., a member of the coalition.

"What we found is alarming, disturbing and egregious. Children can access very harmful content. Google has been reckless, unfair to children and has deceived the public with its claims of a kid-safe YouTube," he said.

The coalition notes that the app includes a voice-enabled search function so that it can be easily used by preschool children who can't yet read or write.

On the YouTube Kids page, Google notes that because about 300 hours of video are uploaded to YouTube every minute, "it's nearly impossible to have 100% accuracy." For that reason it also gives parents the option to "turn off search for a more restricted experience."

The group posted a video containing excerpts of some of the videos it found through the app which it considered inappropriate.

"Federal law prevents companies from making deceptive claims that mislead consumers," said Aaron Mackey, the coalition's attorney at Georgetown Law's Institute for Public Representation, said in a statement. "Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfill its promise."
