The Department of Homeland Security has opened a TikTok CSAM investigation, after child sexual abuse material was posted both publicly and privately on the video-sharing network.
Additionally, the platform is being heavily used by abusers for grooming – the practice of befriending a child online with the intention of later abusing them, either online or offline …
The Financial Times reports that TikTok moderators have been unable to keep up with the volume of videos being posted, meaning that abusive material has been posted to the public feed.
Additionally, abusers have been exploiting a privacy feature offered by TikTok.
The US Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the case. The Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators, said one person with knowledge of the case.
One pattern that the Financial Times verified with law enforcement and child safety groups involved content being procured and traded through private accounts, with the password shared among victims and other predators. Key code words are used in public videos, usernames and biographies, but the illegal content is uploaded using the app’s “Only Me” function, where videos are visible only to those logged into the profile.
Seara Adair, a child safety campaigner, reported this trend to US law enforcement after first flagging the content on TikTok.
TikTok is also accused of failing to be as proactive as other social networks when it comes to detecting and preventing grooming attempts.
“It is a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division, calling it the “platform of choice” for the behaviour […]
Burke claimed that international companies such as TikTok were less motivated to work with US law enforcement. “We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” she added.
The use of the platform by predators is especially worrying given its predominantly teenage user base.
TikTok said that it did work with law enforcement “as necessary.”
“TikTok has zero-tolerance for child sexual abuse material,” the company said. “When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.”