Facebook published two articles today: one about how it creates a safer experience for young people on Instagram, and another about how the company verifies whether someone meets the minimum age required to be on its platforms.
In recent months, some organizations have criticized Facebook for trying to create an Instagram for kids. In a letter to Mark Zuckerberg in April, the Campaign for a Commercial-Free Childhood, an international coalition of 35 children’s and consumer groups, wrote that “a children’s version of Instagram (..) could hook even younger users on endless routines of photo-scrolling and body-image shame.”
“The true audience for a kids’ version of Instagram will be much younger children who do not currently have accounts on the platform. While collecting valuable family data and cultivating a new generation of Instagram users may be good for Facebook’s bottom line, it will likely increase the use of the app by young children who are particularly vulnerable to the platform’s manipulative and exploitative features.”
Even so, Facebook still thinks it’s better to build a tool for these kids than to prohibit them from being online. The company wrote in a blog post today:
“We’re also looking at ways we can reduce the incentive for people under the age of 13 to lie about their age. The reality is that they’re already online, and with no foolproof way to stop people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians. This includes a new Instagram experience for tweens. We believe that encouraging them to use an experience that is age appropriate and managed by parents is the right path. It’s going to take a village to make this experience compelling enough so that this age group wants to use it, but we’re determined to get it right.”
What Facebook is doing to improve Instagram for people under 18 (but over 13)
Facebook is focusing on three main pillars to help give young people a safer, more private experience on Instagram:
- Defaulting young people into private accounts
- Making it harder for potentially suspicious accounts to find young people
- Limiting the options advertisers have to reach young people with ads
For example, kids under 16 (or under 18 in certain countries) will be defaulted into a private account when they join Instagram. As a result, other users won’t be able to comment on their content or see it at all in places like Explore or hashtag pages.
In the US, Australia, France, the UK, and Japan, Instagram has also developed new technology to help the company find accounts that have shown potentially suspicious behavior and stop them from interacting with young people’s accounts. By “potentially suspicious behavior,” Instagram means, for example, accounts belonging to adults that have recently been blocked or reported by a young person.
Facebook is also changing how advertisers can impact children:
“Starting in a few weeks, we’ll only allow advertisers to target ads to people under 18 (or older in certain countries) based on their age, gender and location. This means that previously available targeting options, like those based on interests or on their activity on other apps and websites, will no longer be available to advertisers. These changes will be global and apply to Instagram, Facebook and Messenger.”
Facebook also offers some insight into how it determines whether an underage person is using its platforms: with the help of AI, by working with industry partners and experts, and by relying on other users to report people under 13 on Facebook and Instagram.