Apps used by children – defined as any app “likely to be used by children,” even if they are not the target audience – must meet new UK privacy standards from today.

The Children’s Code (more formally known as the Age Appropriate Design Code) officially came into force last year, but developers and online service providers were allowed a grace period to bring their apps into compliance. That grace period expired today.

The code requires a number of measures to be implemented, beginning with a high default level of privacy for the app.

The code is not a new law, but instead represents the UK privacy watchdog’s view of how GDPR applies to apps and online services used by children.

The code […] sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services. It follows a thorough consultation process that included speaking with parents, children, schools, children’s campaign groups, developers, tech and gaming companies and online service providers.

Such conversations helped shape our code into effective, proportionate and achievable provisions.

Organisations should conform to the code and demonstrate that their services use children’s data fairly and in compliance with data protection law.

The code is a set of 15 flexible standards – they do not ban or specifically prescribe – that provide built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child are the primary consideration when designing and developing online services.

Settings must be “high privacy” by default (unless there’s a compelling reason not to); only the minimum amount of personal data should be collected and retained; children’s data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data or to weaken or turn off their privacy settings. The code also addresses issues of parental control and profiling.

Any app deemed not to comply will be subjected to a full privacy audit by the Information Commissioner’s Office (ICO), which can impose high fines for breaking GDPR protections.

Since apps used by children include mainstream ones like YouTube and Facebook, the requirements will apply to a sizeable proportion of apps in both Apple and Google app stores.

Ironically, Apple’s latest child-protection move – scanning iPhones for CSAM – has resulted in a huge privacy row that remains unresolved.

Photo: Sajad Nori/Unsplash


About the Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!
