Meta has announced that teen Instagram and Facebook accounts will soon automatically block a wide range of harmful content, including posts about self-harm, graphic violence, and eating disorders. The move comes in response to lawsuits from the majority of US states …
Meta being sued by more than 40 US states
Instagram and Facebook parent company Meta is facing lawsuits from more than 40 US states, which allege that the company knew its platforms were harmful to teenagers but failed to do anything about it.
The controversy dates back to 2021, when internal research came to light showing that the company knew about the harmful effects of its apps.

That research described a number of ways in which Instagram harms as many as 20% of the teenage girls who use the app: according to Facebook's own findings, it can increase anxieties about physical attractiveness, social image, and money, and even raise suicide risk. The research was later made public.
While the company argued that the findings were taken out of context and that its apps do more good than harm, US states were unconvinced. Hundreds of lawsuits have since been filed against social media companies, including suits brought against Meta by more than 40 US state attorneys general.
The companies tried to block these lawsuits on First Amendment and Section 230 grounds, but a court ruled late last year that the cases can proceed.
Teen Instagram and Facebook accounts to be protected
The WSJ reports that new protections are being rolled out for accounts used by teenagers.
Meta plans to automatically restrict teen Instagram and Facebook accounts from harmful content including videos and posts about self-harm, graphic violence and eating disorders. The changes are expected to roll out in the coming weeks.
This marks the biggest change the tech giant has made to ensure younger users have a more age-appropriate experience on its social-media sites.
Meta previously offered these protections as optional settings, known as Sensitive Content Control on Instagram and Reduce on Facebook, which teens could switch off. The settings will now be applied automatically, with no opt-out, to anyone under 18. Additionally, sexually explicit content will be blocked for users under 16.
The changes take effect immediately for newly created teen accounts and will begin rolling out to existing accounts this week, with all teen accounts expected to be covered within a few weeks.