The problem lies in the automated filter YouTube uses to route content to its kids app.
Animators are using popular children’s characters in “compromising positions” that are disturbing for kids. Such videos are automatically filtered off YouTube’s main site, but some of the inappropriate animated content slips past those filters, either by mistake or because Internet trolls have found ways to fool the YouTube Kids algorithms.
When asked about the inappropriate content, the Google-owned company took full responsibility and assured parents that its highest priority is providing a safe, child-friendly outlet that parents and users can trust.
“The YouTube Kids team is made up of parents who care deeply about this, so it’s extremely important for us to get this right, and we act quickly when videos are brought to our attention,” a YouTube spokeswoman said in a statement. “We use a combination of machine learning, algorithms and community flagging to determine content in the app as well as which content runs ads. We agree this content is unacceptable and are committed to making the app better every day.”
In addition to YouTube vetting content more carefully, parents are encouraged to take advantage of the controls within the app, where they can block specific videos and channels and even turn off the search function.
YouTube also asks parents to be part of the solution by flagging inappropriate videos, which are sent to a trained review team that removes them as it sees fit.
As always, the best way to protect your kids online is to monitor their usage and what they’re watching. Experts recommend keeping electronic devices in an open area while your children are using them, and watching content together so you know exactly what they’re being exposed to.