Many accounts on TikTok have become portals to some of the most dangerous and disturbing content on the internet. As private as they are, nearly anyone can join.
Alexandra S. Levine, Forbes Staff
Nov 14, 2022, 06:30am EST
The following article contains descriptions and discussions of graphic social media content, including child sexual abuse material and adult pornography.
Don’t be shy, girl.
Come and join my post in private.
LET’S HAVE SOME FUN.
The posts are easy to find on TikTok. They typically read like advertisements and come from seemingly innocuous accounts.
But often, they're portals to illegal child sexual abuse material quite literally hidden in plain sight—posted in private accounts using a setting that makes it visible only to the person logged in. From the outside, there’s nothing to see; on the inside, there are graphic videos of minors stripping naked, masturbating, and engaging in other exploitative acts. Getting in is as simple as asking a stranger on TikTok for the password.
TikTok’s security policies explicitly prohibit users from sharing their login credentials with others. But a Forbes investigation found that’s precisely what’s happening. The reporting, which followed guidance from a legal expert, uncovered how seamlessly underage victims of sexual exploitation and predators can meet and share illegal images on one of the biggest social media platforms on the planet. The sheer volume of post-in-private accounts that Forbes identified—and the frequency with which new ones pop up as quickly as old ones are banned—highlight a major blind spot where moderation is falling short and TikTok is struggling to enforce its own guidelines, despite a “zero tolerance” policy for child sexual abuse material.
The problem of closed social media spaces becoming breeding grounds for illegal or violative activity is not unique to TikTok; groups enabling child predation have also been found on Facebook, for example. (Its parent, Meta, declined to comment.) But TikTok’s soaring popularity with young Americans—more than half of U.S. minors now use the app at least once a day—has made the pervasiveness of the issue alarming enough to pique the interest of state and federal authorities.
“There's quite literally accounts that are full of child abuse and exploitation material on their platform, and it's slipping through their AI,” said creator Seara Adair, a child sexual abuse survivor who has built a following on TikTok by drawing attention over the past year to exploitation of kids happening on the app. “Not only does it happen on their platform, but quite often it leads to other platforms—where it becomes even more dangerous.”
Adair first discovered the “posting-in-private” issue in March, when someone who was logged into the private TikTok account @My.Privvs.R.Open made public a video of a pre-teen “completely naked and doing inappropriate things” and tagged Adair. Adair immediately used TikTok’s reporting tools to flag the video for “pornography and nudity.” Later that day, she received an in-app alert saying “we didn’t find any violations.”
The next day, Adair posted the first of several TikTok videos calling attention to illicit private accounts like the one she’d encountered. That video went so viral that it landed in the feed of a sibling of an Assistant U.S. Attorney for the Southern District of Texas. After catching wind of it, the prosecutor reached out to Adair to pursue the matter further. (The attorney told Adair they could not comment for this story.)
Adair also tipped off the Department of Homeland Security. The department did not respond to a Forbes inquiry about whether a formal TikTok probe is underway, but Special Agent Waylon Hinkle reached out to Adair to collect more information and told her via email on March 31 that “we are working on it.” (TikTok would not say whether it has engaged specifically with Homeland Security or state prosecutors.)
TikTok has “zero tolerance for child sexual abuse material and this abhorrent behavior which is strictly prohibited on our platform,” spokesperson Mahsau Cullinane said in an email. “When we become aware of any content, we immediately remove it, ban accounts, and make reports to [the National Center for Missing & Exploited Children].” The company also said that all videos posted to the platform—both public and private, including those viewable only to the person inside the account—are subject to TikTok’s AI moderation and in some cases, additional human review. Direct messages may also be monitored. Accounts found to be attempting to obtain or distribute child sexual abuse material are removed, according to TikTok.
“TikTok’s moderation often fails to properly action ‘post-in-private’ content that users flag. Many of these reports come back ‘no violation.’” (TikTok user)
The app offers tools that can be used to flag accounts, posts and direct messages containing violative material. Forbes used those tools to report a number of videos and accounts promoting and recruiting for post-in-private groups; all came back “no violation.” When Forbes then flagged several of these apparent oversights to TikTok over email, the company confirmed the content was violative and removed it immediately.
There is much more on this story at Forbes.