
TikTok CSAM investigation underway by Dept. of Homeland Security; privacy feature exploited

The Department of Homeland Security has opened a TikTok CSAM investigation after child sexual abuse material was posted both publicly and privately on the video-sharing network.

Additionally, the platform is being heavily used by abusers for grooming – the practice of befriending a child online with the intention of later abusing them, either online or offline …

The Financial Times reports that TikTok moderators have been unable to keep up with the volume of videos being posted, meaning that abusive material has been posted to the public feed.

Additionally, abusers have been exploiting a privacy feature offered by TikTok.

The US Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the case. The Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators, said one person with knowledge of the case.

One pattern that the Financial Times verified with law enforcement and child safety groups was content being procured and traded through private accounts, with passwords shared among victims and other predators. Code words are used in public videos, usernames, and biographies, but the illegal content itself is uploaded using the app’s “Only Me” function, which makes videos visible only to those logged into the profile.

Seara Adair, a child safety campaigner, reported this trend to US law enforcement after first flagging the content on TikTok.

TikTok is also accused of failing to be as proactive as other social networks when it comes to detecting and preventing grooming attempts.

“It is a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division, calling it the “platform of choice” for the behaviour […]

Burke claimed that international companies such as TikTok were less motivated when working with US law enforcement. “We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” she added.

The use of the platform by predators is especially worrying given the predominantly teenage demographic.

TikTok said that it did work with law enforcement “as necessary.”

“TikTok has zero-tolerance for child sexual abuse material,” the company said. “When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.”

Photo: Verne Ho/Unsplash




Author: Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!
