Apple has faced heavy criticism since announcing a new system that will scan users' photos for CSAM (child sexual abuse material). It's not only regular iOS users who are worried; Apple's own employees have voiced concerns as well.
A new report from Reuters says multiple Apple employees have expressed concerns about the new CSAM system in an internal Slack channel. Some of these employees, who asked not to be identified, fear that the feature could be exploited by authoritarian governments to censor people.
One worker said "the volume and duration of the new debate is surprising," referring to the flood of messages sent after the feature was announced last week.
As Reuters reports:

> Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
A group of employees created a dedicated thread to discuss the new feature, though some argued that Slack was not the right place for such discussions. According to the report, other employees defended CSAM detection, believing it will help "crack down on illegal material."
Apple employees and Slack
Apple began adopting Slack more widely after its offices around the world were closed due to the COVID-19 pandemic and many employees were forced to work from home. Despite Apple's strong culture of secrecy, what employees say in the company's internal Slack channels has increasingly made its way into public view.
Apple employees have used other Slack channels to discuss topics such as the company's denial of new work-from-home requests and pay equity surveys. Apple has asked employees not to use Slack to discuss labor issues or other sensitive topics, but that does not seem to have stopped some people from voicing their frustration with the company.
Besides using internal platforms, some employees have also taken to Twitter to share their opinions on the company's stance.
Read also:
- Apple announces new protections for child safety: iMessage features, iCloud Photo scanning, more
- Apple confirms CSAM detection only applies to photos, defends its method against other solutions
- Apple CSAM FAQ addresses misconceptions and concerns about photo scanning
- Apple Privacy exec details system to detect CSAM in new interview