The Brief: Reports of harassment and predatory behavior targeting minors on TikTok raise questions about how the app is responding to these allegations and what users can do to protect themselves.
Any social media platform will have issues with inappropriate content, harassment, and more, and TikTok, the vastly popular meme-generating video-sharing app, is no exception. The application's popularity among young people makes it an especially volatile place when it comes to user privacy, safety, and protection from online predators.
ByteDance, TikTok’s parent company, has over one billion users worldwide, and addressing the issue of predation and other inappropriate behavior on its platforms will be no easy task.
Internet predators can exhibit a variety of different behaviors, but the common thread is that they take advantage of the anonymity of online interactions to have inappropriate contact with minors and other vulnerable individuals. On TikTok, reports of inappropriate interactions between adults and children include solicitations of nude images, explicit messages, grooming, threats, demands for personal information, and more. Sometimes, adults remix videos of children with TikTok’s duet feature in ways that can be distinctly creepy if not explicitly inappropriate.
A report by the BBC investigates the issue of sexually suggestive comments being made on the videos of children as young as nine. These public comments were mostly deleted after they were flagged, but many of the people who made these remarks were not banned from the platform. TikTok’s community guidelines officially forbid posts or direct messages that “harass underage users” and state that if such content “sexually exploits, targets, or endangers children,” the company may report cases to law enforcement.
Recently, TikTok users have been creating their own networks and systems to police predators and expose harassers. As the app falls short in its moderation of predatory messaging and behavior, some young advocates are working together to hold alleged predators with large followings accountable for their actions.
In a BuzzFeed News article titled “TikTok Has A Predator Problem. A Network Of Young Women Is Fighting Back,” Ryan Broderick explores how young women are collecting information on TikTok predators to expose them to the public and demand that they be banned from the app. Reports of adults, often men, reaching out to kids, often girls, and exhibiting inappropriate behavior indicate that not enough is being done to address the issue of predators on TikTok.
TikTok has systems in place to detect, report, and take down inappropriate content, but many young users and their parents have reported that these are ineffective or insufficient.
Earlier this year, ByteDance was fined a record $5.7 million by the FTC for allegedly violating children’s privacy. As a result, TikTok added a more restricted version of the app for kids under 13, in which they can view but not post content. In May, downloads of TikTok in India were blocked for two weeks after a court ruled that the app could expose children to sexual predators, pornographic content, and cyberbullying. TikTok appealed, citing its crackdowns on inappropriate content as effective safety measures, and the Indian government gave Apple’s App Store and Google’s Play Store permission to offer the app for download again.
Pressure from parents, TikTok users, influencers, and news outlets may encourage TikTok to implement more effective methods to protect young users. In the meantime, since many children will continue to use the app, there are specific measures that can be taken to enhance safety for the children and teenagers who use it.
In addition to general internet safety and privacy practices, it’s a good idea for parents to discuss with their children how they use TikTok and how to be aware of potentially predatory adults. Inappropriate comments and advances should always be reported, both through the application’s reporting feature and directly to parents/guardians.
Cyberbullying and harassment can have significant impacts on kids’ mental health. Because of this, it’s important for parents and children to have open conversations about mental health generally and about how social media affects their wellbeing.
Kids under the age of 13 are not allowed on TikTok, but it is possible to lie about one’s age when signing up for the app. Common Sense Media’s guide recommends TikTok for ages 16 and up. It also suggests ensuring that children’s accounts are set to private and that comments be turned off or limited to ‘friends only.’
The issues with TikTok, while troubling, are not unique to this one app. It’s crucial to stay up-to-date with new apps and social media platforms in order to ensure awareness and safety for young users. For more information on apps similar to TikTok, check out StayHipp’s app guide.