Get your children off of TikTok.
That’s the only sane thing to do in light of a recent report by The Wall Street Journal that uncovered the startling way the app exposes children to pornography, violence, drugs, alcohol and other horrifying content.
The news outlet set up several “bots,” or automated accounts, disguised as children ages 13 to 15 to test what content would be curated for them — and watched as the app automatically loaded video after video of gruesome and perverse content.
One bot that simulated a 13-year-old simply had to search the adult content site “OnlyFans” and play a few of the pornographic videos for the algorithm to start sending more smut.
At first, turning on the “For You” feed loaded some of the social media site’s popular and harmless videos.
But soon those gave way to scenarios with deviant fantasies and twisted, sexualized caregiver relationships — and it only got worse from there.
All the user had to do was watch a few of the sexually explicit videos for more to load, eventually leading to even sicker material, such as sadomasochistic content and torture porn, sometimes referred to as "Kinktok" on the platform.
Some of the youngest bots were shown promotional videos for pornography websites with paid content and sex shops, but it wasn’t just sex that TikTok was selling to minors.
Besides the endless feed of sexual perversion, the app served up hundreds of videos promoting drug use, including milder drugs such as alcohol and marijuana as well as harder substances like prescription narcotics, psychedelics, cocaine and meth.
There also was content that favorably portrayed eating disorders, drunkenness and driving while intoxicated — all in a format the app’s creators know will get kids hooked on the steady stream of content using its powerful artificial intelligence models.
"All the problems we have seen on YouTube are due to engagement-based algorithms, and on TikTok it's exactly the same, but it's worse," former YouTube engineer Guillaume Chaslot told The Wall Street Journal, which published its investigation last month. "TikTok's algorithm can learn much faster."
Chaslot should know — he was partly responsible for creating YouTube’s powerful algorithm as an engineer for the platform and now uses that experience to push for companies to be upfront about their use of artificial intelligence.
Through its algorithm, TikTok learns user preferences based on how long a person lingers over content before moving on to the next video and adjusts what’s offered accordingly.
TikTok told the WSJ that it does not separate content aimed at adult users from content deemed safe for children. A TikTok spokesperson said the app is looking at technology that would filter out adult content for users who are underage.
Any misdirected content is a problem on its own, but the situation gets worse when you factor in that kids between the ages of 4 and 15 spend an average of 80 minutes a day watching TikTok, according to a study by Qustodio.
Though the study comes from a company that helps parents limit screen time for kids, all one needs to do is observe schoolchildren staring at their phones or copying popular TikTok dances to get a sense of how popular and influential the platform is.
The app clearly is efficient at controlling online behavior, and some of that activity carries offline, with damaging or even deadly results that have lawmakers scrambling.
“Tik Tok challenges are posing real threats to the safety of educators in #Roc and beyond,” U.S. Rep. Joe Morelle, a Democrat from New York, tweeted earlier this month about a challenge that dared kids to assault educators.
“Encouraging students to ‘slap a teacher’ isn’t funny—it’s assault, and it’s illegal. I’m writing to @tiktok_us and calling on them to stop the spread of this dangerous content immediately.”
Even at its best, TikTok is a waste of time, loading short videos in an endless succession that keep eyes glued to smartphones instead of gazing at the wonders of the real world or even just the people in the room.
At its worst, it’s exposing America’s children to a dark, sick world that is destructive to young minds and souls — and it’s all happening right under parents’ noses.
Many parents feel they must allow their children to have smartphones and use platforms such as TikTok so their children don’t feel left out or ostracized.
That has never been the soundest strategy in any era, but now the stakes are too high to simply give in to whatever the “cool kids” are doing.
After all, children grow up and move beyond their adolescent and teen peer groups to the wider world, where they find acceptance for who they are rather than for simply conforming.
But allowing children to engage with TikTok when they're too young to resist the powerful algorithm leads them to consume darker and more deviant content that will stay with them for life.
The children who are staring at glowing screens filled with filth will have families of their own someday; what will their hearts and minds be like after decades of consuming this material?
The only solution is to do what our parents would have done less than a generation ago — be the adult and take away the thing your kids want.
It might make you unpopular with your kids and might lead to them feeling left out or isolated. But as my late father would say: If your kids don’t like you, it means you’re doing your job.
And someday, they might even thank you for caring enough to be disliked.