One of the challenges Apple has faced in making its services truly personal and proactive is its focus on privacy. While Google unashamedly collects masses of data about its users, even going so far as to scan emails for boarding passes and restaurant reservations in order to provide automated reminders, Apple has been extremely conservative in the amount of data it collects.
We even learned a few months ago that any collection of customer data requires sign-off from three ‘privacy tsars’ and that getting permission can take a year.
A feature Apple mentioned yesterday almost in passing seems to offer the best of both worlds – collecting data while still protecting user privacy – but a leading cryptography expert has questioned whether Apple’s approach is really safe …
All Apple said in its press release about iOS 10 was that it was ‘increasing security and privacy with powerful technologies like Differential Privacy.’ Software engineering SVP Craig Federighi said only a little more about it during the keynote.
We believe you should have great features and great privacy. Differential privacy is a research topic in the areas of statistics and data analytics that uses hashing, subsampling and noise injection to enable […] crowdsourced learning while keeping the data of individual users completely private. Apple has been doing some super-important work in this area to enable differential privacy to be deployed at scale.
Apple provided Wired with a little more info.
Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.
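Apple hasn’t published the details of its implementation, but the general idea behind ‘noise injection’ can be illustrated with randomized response, a textbook local differential privacy technique: each device randomly perturbs its own report before sending it, so no individual report can be trusted, yet the aggregate statistics still converge on the truth. The Swift sketch below is purely illustrative and makes no claim about Apple’s actual mechanism; the feature being measured and the numbers are invented.

```swift
// A minimal sketch of "noise injection" using randomized response, a
// textbook local differential privacy technique. This is NOT Apple's
// actual implementation; the feature being measured is hypothetical.

/// Each device lies about its own answer half the time, so any single
/// report is plausibly deniable.
func randomizedResponse(trulyUsedFeature: Bool) -> Bool {
    if Bool.random() {
        return trulyUsedFeature   // answer honestly
    } else {
        return Bool.random()      // answer with a coin flip
    }
}

/// The server never sees true answers, only noisy reports, but because
/// E[observed] = 0.5 * trueRate + 0.25, it can still recover the
/// population-level rate: trueRate = 2 * observed - 0.5.
func estimateTrueRate(from reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * observed - 0.5
}

// Simulate 100,000 users, 30% of whom really use the (made-up) feature.
let reports = (0..<100_000).map { _ in
    randomizedResponse(trulyUsedFeature: Double.random(in: 0..<1) < 0.3)
}
print("Estimated usage rate:", estimateTrueRate(from: reports))
```

Run over a large number of simulated users, the recovered rate lands close to 30% even though every individual report is deniable – which is the sense in which ‘general patterns begin to emerge’ from noisy data.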
However, Gizmodo noted that Matthew Green, a cryptography professor at Johns Hopkins University, was tweeting skeptically about it, describing the approach as untested.
Most people go from theory to practice, then to widespread deployment. With Differential Privacy it seems Apple cut out the middle step.
— Matthew Green (@matthew_d_green) June 13, 2016
Queried by Gizmodo, Green said that existing implementations of Differential Privacy had needed to compromise on privacy in order to obtain accurate data.
“The question is, what kind of data, and what kind of measurements are they applying it to, and what are they doing with it,” Green told Gizmodo. “It’s a really neat idea, but I’ve never really seen it deployed. It ends up being a tradeoff between accuracy of the data you are collecting and privacy.
“The accuracy goes down as the privacy goes up, and the tradeoffs I’ve seen have never been all that great,” Green continued. “[Again] I’ve never really heard of anyone deploying it in a real product before. So if Apple is doing this they’ve got a custom implementation, and they made all the decisions themselves.”
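To put rough numbers on the tradeoff Green describes, the same toy randomized-response mechanism can be dialed: if each user answers honestly with probability p (lower p means stronger privacy), the aggregate estimate stays unbiased but its error grows roughly like 1/p, so stronger privacy demands many more users for the same accuracy. Again, this illustrates the general principle, not whatever parameters Apple has actually chosen.

```swift
// Same toy randomized-response model, now with a tunable honesty probability p.
// Lower p means stronger privacy for each user but a noisier estimate,
// which is the accuracy-vs-privacy tradeoff Green describes.
func simulateEstimate(trueRate: Double, honestyProbability p: Double, users n: Int) -> Double {
    var yesCount = 0
    for _ in 0..<n {
        let truth = Double.random(in: 0..<1) < trueRate
        let report = Double.random(in: 0..<1) < p ? truth : Bool.random()
        if report { yesCount += 1 }
    }
    let observed = Double(yesCount) / Double(n)
    // Unbiased correction, since E[observed] = p * trueRate + (1 - p) * 0.5.
    // The estimate's standard error scales like 1 / (p * sqrt(n)).
    return (observed - (1 - p) * 0.5) / p
}

// With the same 100,000 users, the estimate drifts further from the true
// 30% as p shrinks (i.e. as privacy increases).
for p in [0.9, 0.5, 0.1] {
    print("p = \(p): estimated rate =", simulateEstimate(trueRate: 0.3, honestyProbability: p, users: 100_000))
}
```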
Apple did appear to have an expert on its side during the keynote, displaying a slide in which Aaron Roth, an associate professor of computer science at the University of Pennsylvania, called it ‘visionary’ and said that it ‘positions Apple as the clear privacy leader among technology companies today.’ But Federighi admitted that Roth had been given only a ‘quick peek’ at the technology, and Roth told Wired that he ‘[couldn’t] comment on anything specific that Apple’s doing with differential privacy’ though he did think the company was ‘doing it right.’
Apple has a lot invested in its reputation for protecting user privacy, so what was a small side note in yesterday’s presentation could turn out to be an important issue to follow.
Photo: Slashgear