Facebook has been working on ways to prevent the posting of so-called “revenge porn” – aka non-consensual sharing of explicit photos by former partners and others – and sextortion attempts, especially those against teenagers.
Its first tool, back in 2017, was so scary that few were willing to use it: You had to upload nudes to Facebook in order to have them tagged for blocking! But a new tool, aimed initially at teenagers, will allow them to create digital fingerprints right on their own device …
Revenge porn and sextortion
Revenge porn – better referred to as non-consensual porn – has become a growing problem. Where once an abusive ex might have shown a photo in person to a few friends, it’s now simple for an abuser to upload photos to social media, enabling them to be seen by potentially hundreds of people.
Teenagers are also increasingly becoming victims of “sextortion” – where an adult tricks them into revealing themselves on webcam, records the footage, and then uses the threat of exposure either to blackmail victims into further acts or to demand money.
Teenage males are the most common victims of these “honey-traps”: a scammer, typically posing on a dating app as an attractive girl of the same age and using photos of a model, persuades victims to perform acts on webcam, and the recording can then be used to extort them.
Previous approaches too scary or unreliable
Facebook’s first approach to the problem was to allow people to upload their own nudes to the service, which would then create a hashed digital fingerprint. That fingerprint was then used to block anyone else from attempting to upload the same photo.
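To give a rough sense of how this kind of fingerprinting works, here is a minimal sketch of a perceptual “average hash” – a toy stand-in for the far more robust algorithms real systems use (such as Microsoft’s PhotoDNA or Meta’s PDQ), since a useful fingerprint has to survive resizing and re-compression:

```python
# Toy perceptual "average hash" fingerprint. Assumes Pillow is
# installed (pip install Pillow); real systems use more robust
# algorithms such as PhotoDNA or PDQ.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    # Shrink to a size x size grayscale thumbnail, then set one bit
    # per pixel depending on whether it is brighter than the mean.
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits; a small distance means "same image".
    return bin(a ^ b).count("1")

def is_blocked(upload_hash: int, reported: list[int], threshold: int = 5) -> bool:
    # An upload is blocked if its fingerprint is close to any
    # fingerprint previously reported by a victim.
    return any(hamming(upload_hash, h) <= threshold for h in reported)
```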
Unsurprisingly, few people were willing to take the risk of uploading compromising photos to Facebook, even though it only required you to send the image to yourself.
A second version used AI to try to identify problematic images, but this proved unreliable.
Latest tool works entirely on your own device
CNN reports that Facebook and Instagram owner Meta is now partnering with the National Center for Missing and Exploited Children (NCMEC) on a far better approach.
Take It Down, which is operated by NCMEC, will allow minors for the first time to anonymously attach a hash – or digital fingerprint – to intimate images or videos directly from their own devices, without having to upload them to the new platform.
To create a hash of an explicit image, a teen can visit the website TakeItDown.NCMEC.org to install software onto their device. The anonymized number, not the image, will then be stored in a database linked to Meta, so that if the photo is ever posted to Facebook or Instagram, it can be matched against the stored fingerprint, reviewed, and potentially removed.
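The key design point is the division of labor: hashing happens on the teen’s own device, and only the fingerprint ever travels to the shared database. Here is a minimal sketch of that flow, with all names hypothetical and a plain SHA-256 standing in for the perceptual hash a real deployment would use:

```python
import hashlib

# Stand-ins for the NCMEC hash database and a platform's review queue;
# all names here are hypothetical, for illustration only.
take_it_down_db: set[str] = set()
review_queue: list[str] = []

def fingerprint(image_bytes: bytes) -> str:
    # Plain SHA-256 for simplicity; a real deployment would use a
    # perceptual hash so that re-encoded copies still match.
    return hashlib.sha256(image_bytes).hexdigest()

def submit_from_device(image_bytes: bytes) -> None:
    # Runs on the teen's own device: only the anonymized number is
    # sent to the database, never the image itself.
    take_it_down_db.add(fingerprint(image_bytes))

def on_upload(image_bytes: bytes) -> str:
    # Runs on a participating platform (e.g. Facebook or Instagram):
    # each new upload is hashed and checked against reported fingerprints.
    h = fingerprint(image_bytes)
    if h in take_it_down_db:
        review_queue.append(h)
        return "held for review"  # matches are reviewed, then potentially removed
    return "published"
```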
Similar approach to Apple’s abandoned CSAM scanning
If this sounds familiar, that’s because it’s very similar to the approach Apple planned to use to scan for known Child Sexual Abuse Material (CSAM).
In that case, NCMEC would have provided Apple with digital hashes of known abusive material; the user’s iPhone would create digital hashes of their photos and run an on-device scan comparing the two sets of fingerprints. If multiple matches were found, the photos concerned would be uploaded for manual review.
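To make the matching step concrete, here is a heavily simplified sketch of the counting logic only; Apple’s actual design used blinded hash sets and encrypted “safety vouchers” rather than plaintext comparisons, and the names and values below are illustrative:

```python
# Illustrative only: Apple's real protocol used blinded hashes and
# encrypted "safety vouchers", not a plaintext set lookup.
KNOWN_CSAM_HASHES = {"hash-a", "hash-b"}  # hypothetical NCMEC-provided set
MATCH_THRESHOLD = 30  # Apple said roughly 30 matches would trigger review

def should_flag_for_review(photo_hashes: list[str]) -> bool:
    # Count how many of the user's photo fingerprints match the known
    # set; only a count at or above the threshold triggers manual review.
    matches = sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```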
However, four concerns were raised, including the ability of authoritarian governments to force Apple to check for hashes of protest posters and the like. Apple was able to address some of the fears, but not all, and the company subsequently confirmed that it had abandoned the plan.
The system developed by Meta and NCMEC should not raise similar concerns, as users freely choose to use the tool, and only hashes of the specific images they select are ever submitted.
Photo: Melanie Wasser/Unsplash