Instagram and Facebook receive your personal data from thousands of companies, according to a study carried out by Consumer Reports.
Separately, the company is the largest reporter of potential child sexual abuse material (CSAM), but there is a legal problem with the way many of these reports are submitted …
Instagram and Facebook collect your data on a huge scale
Consumer Reports enlisted more than 700 volunteers to determine the sources of the personal data that parent company Meta uses to serve personalized ads.
The Markup says the study found that Meta collected data from an average of 2,230 companies per participant.
The Markup helped Consumer Reports recruit participants for the study. Participants downloaded an archive of the last three years of their data from their Facebook settings, then provided it to Consumer Reports […]
Consumer Reports found that a total of 186,892 companies sent data about them to the social network. On average, each participant in the study had their data sent to Facebook by 2,230 companies. That number varied significantly, with some panelists’ data listing over 7,000 companies providing their data.
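The analysis behind those numbers is simple to reproduce. Here's a minimal sketch, in TypeScript, of counting the distinct companies in one participant's Facebook export; the file name and JSON shape are assumptions for illustration, since Facebook's archive format changes over time and isn't documented.

```typescript
// Sketch: count distinct companies in a Facebook "off-Facebook activity"
// export. The file name and field names are assumptions, not Facebook's
// documented schema.
import { readFileSync } from "node:fs";

interface OffFacebookActivity {
  name: string;      // company name, e.g. "LiveRamp"
  events: unknown[]; // the individual interactions that company reported
}

const archive = JSON.parse(
  readFileSync("./your_off-facebook_activity.json", "utf8"),
) as { off_facebook_activity: OffFacebookActivity[] };

// One entry per company that sent this user's activity to Facebook.
const companies = new Set(archive.off_facebook_activity.map((c) => c.name));
console.log(`${companies.size} companies sent data about this user`);
```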
Unsurprisingly, data brokers were the most common source of personal data hoovered up by the social media giant, but Home Depot and Amazon also made the top 10.
One company appeared in 96% of participants’ data: the San Francisco-based data broker LiveRamp. But the companies sharing your online activity to Facebook aren’t just little-known data brokers. Retailers like Home Depot, Walmart and Macy’s all were in the top 100 most frequently-seen companies in the study. Credit reporting and consumer data companies such as Experian and TransUnion’s Neustar also made the list, as did Amazon, Etsy and PayPal.
The most common type of data gathered is the list of websites you visit, collected using cookies or tracking pixels, which can be used to build a profile of your interests and activities.
For example, if you visit a bunch of tech sites, that can be used to serve you gadget ads, and if you search Amazon for bathroom fittings, that can trigger ads either for that specific product category or for broader ones like home improvement.
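To make the mechanism concrete, here's a minimal sketch of how a tracking pixel works; the endpoint and parameter names are hypothetical, but the pattern (a tiny invisible image request carrying the event details, tied back to you by a cookie or hashed identifier on the tracker's server) is how real pixels such as the Meta Pixel operate.

```typescript
// Sketch of a client-side tracking beacon. The endpoint is hypothetical;
// the technique is the standard one: request a 1x1 image and smuggle the
// event data out in the URL's query string.
function firePixel(event: string): void {
  const params = new URLSearchParams({
    ev: event,                   // e.g. "PageView" or "Search"
    url: window.location.href,   // the page being viewed
    ts: Date.now().toString(),   // timestamp, used to sequence visits
  });
  const beacon = new Image(1, 1); // invisible 1x1 image
  beacon.src = `https://tracker.example.com/px?${params.toString()}`;
}

firePixel("PageView");
```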
Meta claims to be transparent about its data collection and usage, and to offer choices to users:
We offer a number of transparency tools to help people understand the information that businesses choose to share with us, and manage how it’s used.
But the Electronic Privacy Information Center says that it’s nonsense to suggest consumers understand the scope and scale of this tracking.
This type of tracking which occurs entirely outside of the user’s view is just so far outside of what people expect when they use the internet […] they don’t expect Meta to know what stores they walk into or what news articles they’re reading or every site they visit online.
Meta CSAM reports have a legal problem
Large platforms have a legal obligation to report suspected CSAM to the National Center for Missing & Exploited Children (NCMEC), and Meta files more of these reports than any other company. That’s a good thing, but much of the suspected abuse material is detected by AI, and that poses a legal problem, reports The Guardian.
Social media companies, Meta included, use AI to detect and report suspicious material on their sites and employ human moderators to review some of the flagged content before sending it to law enforcement. However, US law enforcement agencies can only open AI-generated reports of child sexual abuse material (CSAM) by serving a search warrant to the company that sent them. Petitioning a judge for a warrant and waiting to receive one can add days or even weeks to the investigation process […]
Due to US privacy protections under the fourth amendment, which prohibits unreasonable searches and seizures by the government, neither law enforcement officers nor NCMEC – which receives federal funding – are permitted to open reports of potential abuse without a search warrant unless the contents of a report have been reviewed first by a person at the social media company.
The resulting delays may mean that abuse continues in the meantime, or that evidence is lost.
“This is frustrating,” said one California-based assistant US attorney. “By the time you have an account identified and have a warrant, there may be nothing there.”
Meta said it understood the concern, but that only AI systems can keep up with the sheer volume of material which needs to be checked.
Our image-matching system finds copies of known child exploitation at a scale that would be impossible to do manually.
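For context, here's a minimal sketch of hash-based image matching, the general technique behind claims like this. It is illustrative only: production systems use perceptual hashes (Meta has open-sourced one called PDQ) so that resized or re-encoded copies still match, whereas the exact SHA-256 comparison below just keeps the example self-contained.

```typescript
// Sketch of matching uploads against a database of hashes of known images.
// Real systems use perceptual hashes (e.g. Meta's open-source PDQ), not the
// exact SHA-256 shown here; the hash value below is a placeholder.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// Hypothetical database of hashes of previously identified images.
const knownHashes: Set<string> = new Set([
  "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]);

// Hash an uploaded file and check it against the database. Set lookups are
// constant time, which is what lets a check like this run on every upload.
function matchesKnownImage(path: string): boolean {
  const digest = createHash("sha256").update(readFileSync(path)).digest("hex");
  return knownHashes.has(digest);
}

console.log(matchesKnownImage("./upload.jpg"));
```

The legal wrinkle described above arises when a match like this is filed without any human at the company viewing the content first: investigators then need a warrant just to open the report.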