The Brief: Recent reports of videos depicting and encouraging self-harm on the YouTube Kids app, along with predatory comments left on videos made by children, raise the question of how inappropriate content can affect kids on YouTube.
Harmful And Violent Content
On her blog, PediMom, Dr. Free N. Hess, a pediatrician and mom from Florida, reported multiple instances in which she found highly disturbing content on the YouTube Kids app. Dr. Hess, who blogs about parenting, internet safety, and children’s mental health, wrote:
“My research has led me into a horrifying world where people create cartoons glorifying dangerous topics and scenarios such as self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse, and gun violence which includes a simulated school shooting.”
The Washington Post reported this story, citing PediMom’s post. The videos included directions on how to self-harm or commit suicide. Several combined seemingly kid-friendly footage with disturbing, adults-only content spliced in; some kid-style videos had clips of Filthy Frank’s graphic descriptions and depictions of self-harm and suicide inserted into them.
In her efforts to protect children from harmful online content, Dr. Hess has created and spread hashtags such as #YouTubeWakeUp, #ProtectOurKids, and #ParentsDemandAction. She has also reported such videos and is rallying parents to do the same and to demand better safeguards from YouTube.
This week, AT&T and several other brands suspended their advertising on YouTube after groups of child predators were discovered making inappropriate sexual comments on kids’ videos. This move by advertisers highlights the issue of pedophilic and lewd content infiltrating YouTube comments sections, even targeting children.
Sensitive Content On YouTube
This is not the first time inappropriate content geared towards children has appeared on YouTube. Over the past several years, there have been many reports of seemingly kid-oriented and cartoon videos that actually include sex, violence, and other elements unsuitable for children. Some of these more disturbing videos seem to intentionally hide creepy content among positive child-oriented imagery. Others may be parody videos meant for adult viewers. Regardless of why these videos were created, they can harm unsuspecting young viewers. It can be traumatizing and confusing for children to consume media with dark messages hidden among their favorite games and cartoon characters.
YouTube Kids, a site and app created by YouTube that is supposed to feature content suitable for kids 8 years old and younger, filters out a significant amount of the adult content found on YouTube. The fact that highly disturbing content, including some related to suicide and self-harm, has appeared there indicates that YouTube Kids is not always safe for its intended vulnerable audience.
YouTube’s Policies & Actions
YouTube’s algorithms and reporting systems are designed to filter out sensitive and harmful content, keeping unsuitable content away from children. However, findings like those of Dr. Hess reveal that these systems are inadequate. With 300 hours of video uploaded to YouTube every minute, YouTube’s gatekeepers have a vast amount of content to monitor. Some of these issues have been exacerbated by YouTube’s algorithms suggesting “similar videos” without recognizing that some content “similar” to kids’ videos can be completely unsuitable for children.
The videos that PediMom reported were taken down, but it is unclear exactly how YouTube will effectively monitor and censor harmful content geared toward minors. Every quarter, Google reports on how many channels and videos have been removed from YouTube and for what reasons. As parents, educators, and consumers call for safer restrictions on children’s content, the site must grapple with this and other issues of inappropriate content on the platform.
Kids, Toxic Online Content, & Mental Health
Reports such as these highlight how difficult it is to ensure that kids and teenagers safely navigate the internet. Basic internet safety and privacy guidelines can help, but it is also important to consider the relationship between internet use and mental health, especially for minors. With the vast amount of harmful content related to suicide and self-harm online, including Momo and Blue Whale stories, it is key to educate parents and children about issues of suicide prevention and mental healthcare. The National Suicide Prevention Lifeline 1-800-273-8255 is a resource for those who need help, including children.
Especially for younger children, watching media with them or sticking to approved content can best ensure that they do not view images such as those discovered by Dr. Hess. For older children and teenagers, discussions about their mental health and how it relates to online content are a step towards a healthy digital environment.