A new report says that the British government will be asking Apple and Google to block the taking and sharing of nude photographs unless the user has been verified as an adult.
Additionally, the report says that the government wants iOS and Android to prevent nude images from even being displayed on the device unless the user has been verified as an adult …
Age verification proposals
We’ve been reporting growing support for the idea that app stores, rather than individual developers, should be held legally responsible for verifying the age of users. In the US, that has led to a new proposed law, the App Store Accountability Act.
Instead of users having to prove their age to a whole bunch of individual developers each time they download an app with a minimum age requirement, the idea is that we would do so just once, to either Apple or Google, and the company would then age-gate apps as appropriate.
Apple has so far been lobbying against this, but we argued last week that this would actually be the best solution.
Apple and Google to be asked to block nude photos
In the latest development, the Financial Times reports that the British government is to ask both Apple and Google to protect children from the taking and viewing of nude photos.
Ministers want the likes of Apple and Google to incorporate nudity-detection algorithms into their device operating systems to prevent users taking photos or sharing images of genitalia unless they are verified as adults […]
[Additionally,] the Home Office wants to see operating systems that prevent any nudity being displayed on screen unless the user has verified they are an adult through methods such as biometric checks or official ID.
The report says that an announcement is set to be made in the next few days. The wording suggests that this will be a request rather than a legal requirement, at least for now.
Apple currently offers some protections within the Messages app. If a child in an iCloud Family group receives a sexually explicit image, it will initially be blurred and a warning message displayed. If the child taps the View Photo button, they will see a pop-up telling them why the message is considered sensitive and asking them to confirm that they wish to view it. The pop-up also explains that the parent set as admin for the group will be notified.
9to5Mac’s Take
This is the most controversial age verification proposal we’ve yet seen. The idea of Apple and Google essentially spying on both photos and any material viewed in any app seems a complete non-starter from a privacy perspective, even if it is carried out on-device.
At the same time, there is a growing problem with sex offenders grooming children to share CSAM – in particular, impersonating teenagers to encourage genuine teens to share compromising photos and video. The typical pattern here is an escalation where victims are blackmailed with existing images into sending ever more explicit ones. A number of teen suicides have been linked to this.
If nothing else, the proposal may usefully prompt discussion of the problem, and help establish whether more realistic and reasonable solutions can be found.