Following the announcement of a new system that will scan users' photographs for CSAM (child sexual abuse material), Apple has received considerable backlash. Not just everyday iOS users but also Apple employees are concerned about the move.
Multiple Apple employees have expressed worries about the new CSAM system in an internal Slack channel, according to a recent report from Reuters. Some of these employees, who asked not to be identified, are concerned that authoritarian governments could exploit the tool to suppress their citizens.
Since the feature was announced last week, employees have flooded the internal Slack channel with more than 800 messages about CSAM detection, workers who asked not to be identified told Reuters. Many expressed worries that repressive governments could abuse the feature to hunt for other material for censorship or arrests, according to workers who saw the days-long thread. One worker remarked that "the amount and duration of the current argument is surprising."
Some employees felt that Slack was not the appropriate place for such discussions, so a group of them started a dedicated thread to debate the new feature. Other employees argued in favor of CSAM detection, believing it would help "crack down on unlawful stuff," according to the reports.
Apple employees have used other Slack channels to discuss topics such as the company declining new work-from-home requests and even pay-equity surveys. Apple has asked its staff to refrain from using Slack to discuss labor matters or other sensitive topics, but that does not appear to have deterred some from voicing their dissatisfaction with the company.