
Cops unofficially using face recognition apps and accessing medical records

Hundreds of thousands of police officers are unofficially using face recognition apps and gaining access to a wide range of databases, from credit checks to medical records, according to a new report published today.

The report says that companies aiming to sell these apps to police departments are first offering them to individual cops, hoping that this will then create the demand for official purchases. This approach means there is no oversight to ensure civil rights are protected …

TNW’s headline doesn’t pull any punches: “Lying, corrupt, anti-American cops are running amok with AI.”

Any cop, regardless of affiliation or status, has access to dozens (if not hundreds) of third-party AI systems [in the form of] an Android or iPhone app that officers and agents can use without their supervisors even knowing.

A cop installs software from a company such as Clearview AI on their personal smartphone. This allows them to take a picture of anyone and surface their identity. The cop then runs the identity through an app from a company such as Palantir, which surfaces a cornucopia of information on the individual.

So, without a warrant, officer Friendly now has access to your phone carrier, ISP, and email records. They have access to your medical and mental health records, military service history, court records, legal records, travel history, and your property records. And it’s as easy to use as Netflix or Spotify.

Best of all, at least for the corrupt cops using these systems unethically, there’s absolutely no oversight whatsoever. Cops are often offered these systems directly from the vendors as “trials” so they can try them before they decide whether to ask their departments to adopt them at scale.

The reason officers use these systems is that they make their jobs much easier. They allow police officers to skip the warrant process and act as judges themselves.

The piece also argues that even official departmental use of such apps often appears to contravene constitutional privacy rights.

Face recognition apps have been found to be racially biased, with a greater likelihood of misidentifying Black people, leading to higher rates of false arrest.

It’s certainly an area that needs close scrutiny. The challenge is that such oversight would need to start with the House of Representatives and the Senate, many of whose members don’t appear to have much understanding of technology.

Photo: Fred Moon/Unsplash





