
40%+ of children under 13 using TikTok, Facebook, and more; high rates of online sexual abuse

A research study into the use of social media platforms by minors found that more than 40% of children under 13 were using them, despite accounts supposedly being limited to teenagers and adults. The research covered TikTok, Facebook, Instagram, and Snapchat.

The study also found that a full third of minors had experienced an “online sexual interaction,” which included being asked for, or receiving, nude photos …

Adding to the concern, children are much more likely to simply block those sending inappropriate messages than they are to report them to a parent or caregiver.

Child-protection nonprofit Thorn carried out the research among 2,000 children in the US.

Young people use many of the same widely popular platforms as adults, often in spite of age limitations put in place by the platform. They are drawn to opportunities to meet new people, generate content and build a following, and explore without fear of judgement.

While the internet offers boundless opportunities to connect and discover, it also creates new opportunities for risk and harm. Nearly half of participants (48%) said they had been made to feel uncomfortable, been bullied, or had a sexual interaction online.

While the most common experiences reported involved bullying or generally being made to feel uncomfortable (38%), 1 in 3 participants reported having had an online sexual interaction.

Response options coded as an “online sexual interaction” in analysis included: being asked for a nude image or video, being asked to go “on cam” with a nude or sexually explicit stream, being sent a nude photo or video, or being sent sexually explicit messages.

The most common online sexual interactions that participants reported involved receiving sexual messages (such as a “sext,” 21%), receiving a nude photo or video of the sender (18%), or being asked for a nude photo or video (18%).


The worst platforms for sexually abusive messages were Instagram and Snapchat, with 16% of minors on each platform reporting sexual interactions.

Minors who did find themselves on the receiving end of such messages were more than twice as likely to block the sender as to tell a parent or caregiver. Sixty-six percent said they blocked the person, 46% reported it on the platform, and only 29% told a parent or caregiver.

Thorn recommends that children be better warned of the risks, and specifically told that online sexual contact with a minor is illegal and should always be reported. Platforms should ensure reporting tools link to sources of help and support, and blocking should raise similar red flags to reporting. Blocking tools also need to be improved to prevent re-contact after blocking, which is currently common.

Photo by Gaelle Marcel on Unsplash

