
Chinese surveillance and a post-Roe world may mean Apple needs to go even further on privacy

The scale and reach of Chinese surveillance of its own citizens is well documented, but a new piece shows that the country’s government is now trying to use this vast trove of data to predict crimes and protests before they happen.

The Supreme Court ruling on abortion is also raising fresh concerns about the way that personal data may be used to prosecute women. We’re living in a world where Apple’s decision to make privacy a major focus looks increasingly prescient – but even the Cupertino company may now need to do more …

Chinese surveillance: The story so far

The sheer scale of Chinese surveillance of its citizens is mind-boggling. It’s estimated that there are more than a billion surveillance cameras in use around the world – and around half of them are in China.

In 2016, a new cybersecurity law required cloud companies to store the personal data of Chinese citizens within the country, on servers run by state-owned companies which are widely believed to be fully accessible to the government. Apple was forced to comply.

In 2020, Chinese police officers were given ‘smart helmets’ which can check forehead temperatures, touted as a means of spotting COVID infections. But it was later discovered that these helmets had far more extensive capabilities.

A law enforcement officer wearing the helmet could do any of the following: measure the temperature of a specific individual; measure the temperatures of people passing by in larger crowds; scan a person’s QR code for personal data; recognize license plates; spot people in the dark; or recognize people using facial recognition.

The country uses both offline and online surveillance to assign ‘social credit’ scores. Negative scores can be accumulated for everything from jaywalking to failing to visit parents regularly. Those with poor scores can be prevented from travelling, attending college, obtaining better jobs – and subjected to public shaming.

‘How China is policing the future’

A New York Times piece reveals how China is now attempting to use this data to predict the future – aiming to detect everything from someone leaving home to attend a protest to criminals getting together to plan or carry out a robbery.

The latest generation of technology digs through the vast amounts of data collected on their daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. They target potential troublemakers in the eyes of the Chinese government — not only those with a criminal past but also vulnerable groups, including ethnic minorities, migrant workers and those with a history of mental illness […]

the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people’s lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression […]

Three people with a criminal record check into the same hotel in southeast China. An automated system is designed to alert the police.

A man with a history of political protest buys a train ticket to Beijing. The system could flag the activity as suspicious and tell the police to investigate.

A woman with mental illness in Fujian leaves her home. A camera installed by her house records her movements so the police can track her […]

The police could set the system to send a warning […] when four people with a history of protest enter the same park.

Data being ‘weaponized’ in the US after Roe overturned

When the Supreme Court ruling was leaked back in May, pro-choice and privacy campaigners warned that app data could be used to prosecute women who have had abortions.

For example, in states that make it a crime to help an abortion-seeker such as Texas and Oklahoma, data from women’s period-tracking or pregnancy apps could end up being subpoenaed as evidence against the person who helped them, said Danielle Citron, a law professor at the University of Virginia and author of the forthcoming book “The Fight for Privacy.” “Let’s say you got your period, stopped your period and then got your period again in a short time,” Citron said. “It’s [potential] evidence of your own criminality, or your doctor’s criminality.”

Now that the ruling is official, there are calls for tech companies to respond by protecting user data – but also suggestions that they won’t, because the use and sale of personal data is big business.

We all know we live in an era of surveillance unparalleled in human history, but it’s hard to comprehend how broad and deep that surveillance network is. If companies stepped up to the plate to curtail data collection, it’d probably be eye-opening to a large swath of the public.

That data is also wildly valuable to companies. The user data trade was a $29 billion market last year.

In essence, companies’ silence says they think the data collection and sale process is so opaque, users won’t raise a fuss about it. And data itself is more valuable to the companies than helping people have access to life-saving health care.

It also tells users that they’re on their own if they want to safely seek an abortion free of prosecution. Doing so will be a monumental challenge for those people.

Apple looks increasingly prescient on privacy

Apple made an early decision to make customer privacy a key priority, going to some extreme lengths to do so.

Any collection of Apple customer data requires sign-off from a committee of three “privacy czars” and a top executive, according to four former employees who worked on a variety of products that went through privacy vetting […] The trio of experts […] are both admired and feared.

One former employee said that debates over whether or not customer data can be used to improve a service usually take at least a month to settle, and some privacy issues are debated for more than a year before a final decision is reached. Key privacy issues are escalated all the way to Tim Cook.

A refusal to compromise on privacy killed one of Apple’s products, says the piece, while others needed to be substantially reworked to achieve privacy sign-off.

Apple engineers say the rules have resulted in fewer app features and slower development.

Even paying that price hasn’t always worked out for the company. Ironically, it was Apple’s attempt to do CSAM scanning in a more privacy-focused way than other companies which mired it in so much controversy. Siri has also been lambasted for being dumber than other intelligent assistants, in part because Apple’s privacy rules give the service much less access to personal data. And, of course, there was the furore when Apple refused to create a backdoor into iPhones at the request of the FBI.

Apple doesn’t have a perfect record on privacy. When Europe’s tough GDPR privacy law came into effect in 2018, the Cupertino company was forced to introduce new protections in order to comply. The company ran into trouble on ‘Siri grading’ back in 2019. There are other examples, but there’s no question that the company has gone further than any other tech giant to protect user privacy.

China surveillance techniques being used in the US

There was a time when the average non-tech Apple customer might have seen privacy as a somewhat academic issue. Nice to have, but not something that ordinary people needed to worry too much about. That attitude is now rapidly changing.

Machine learning in particular is enabling unprecedented new surveillance capabilities. The sheer volume of data collected was once a limitation as well as an enabler, giving governments more information than they could possibly analyze; AI systems now make it practical to sift through all of it.

China may represent an extreme, but other countries – including the US – are taking steps in this direction. The Brookings Institution noted last year that China is not alone in trying to predict crimes before they happen.

[Across the US], predictive policing systems digitally redline certain neighborhoods as “hotspots” for crime, with some systems generating lists of people they think are likely to become perpetrators. These designations subject impacted communities to increased police presence and surveillance that follows people from their homes to schools to work.

The typical targets are Black and brown youth, who may also be secretly added to gang databases or asked by school officials to sign “contracts” that prohibit them from engaging in behavior that “could be interpreted as gang-affiliated.” For communities whose lived experience includes being treated as inherently suspicious by police and teachers, increased surveillance can feel like a tool for social control rather than a means of public safety.

The foundation of these practices are what police departments and technology companies call data-driven policing, intelligence-led policing, data-informed community-focused policing or precision policing. While data has always been used to solve crime, these tools go a step further, relying on a fraught premise: that mining information from the past can assist in predicting and preventing future crimes.

As the scholar Andrew Guthrie Ferguson has said, “Big-data technology lets police become aggressively more proactive.” But this data can be biased, unreliable, or simply false. Unquestioned reliance on data can hypercharge discriminatory harms from over-policing and the school-to-prison pipeline.

Apple may need to go even further on privacy

The US Supreme Court seems set to overturn other landmark rulings, which could fundamentally change the rights of citizens. Personal data which was once innocuous may become incriminating, as in the case of period-tracking apps.

Apple was at the forefront of protecting personal data, and may need to go even further in the face of increasing threats to civil liberties.

Apple already requires apps to have privacy labels, revealing what categories of data each app collects. But as the threats grow, the company may need to respond with tougher protections.

Apple-certified safe data storage is one possibility that occurs to me. Apps which collect sensitive data could apply for Apple validation of their personal data storage. This might require all data to use end-to-end encryption, for example, so that not even developers have access to it.

Any law enforcement agency – or even private citizen – serving demands for access to personal data could then simply be told by developers and Apple alike that they do not have any way to obtain it.
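To make that more concrete, here’s a minimal sketch in Swift using CryptoKit of what the client side of such end-to-end encrypted storage could look like. The names `CycleEntry` and `LocalVault` are purely hypothetical, and a real app would generate the key once and keep it in the keychain (ideally protected by the Secure Enclave) rather than in memory:

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of end-to-end encrypted app storage.
// A real app would generate the key once, keep it in the keychain
// (protected by the Secure Enclave), and never send it to the developer.
struct CycleEntry: Codable {
    let date: Date
    let note: String
}

enum LocalVault {
    static let key = SymmetricKey(size: .bits256)

    // Encrypt an entry before it is saved or synced; servers only ever see this blob.
    static func seal(_ entry: CycleEntry) throws -> Data {
        let plaintext = try JSONEncoder().encode(entry)
        let sealedBox = try AES.GCM.seal(plaintext, using: key)
        return sealedBox.combined! // nonce + ciphertext + auth tag as one opaque blob
    }

    // Decryption can only happen on a device that holds the key.
    static func open(_ blob: Data) throws -> CycleEntry {
        let sealedBox = try AES.GCM.SealedBox(combined: blob)
        let plaintext = try AES.GCM.open(sealedBox, using: key)
        return try JSONDecoder().decode(CycleEntry.self, from: plaintext)
    }
}
```

Under a model like this, any server sync only ever handles opaque ciphertext, so a developer served with a subpoena genuinely has nothing readable to hand over.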

What are your thoughts? Please take our poll, and share your views in the comments. Given the sensitive nature of the topic, please ensure that all comments are respectful of opposing views, arguing your own case rather than insulting those who disagree with you.

Image: Mohamed Hassan/PxHere

