After Apple’s CSAM Program is Delayed, Protests Emerge to Cancel It Altogether

After Apple delayed its file scanning (CSAM detection) program, protesters came out to urge Apple to cancel the plans altogether.

Last month, Apple announced plans to scan files uploaded to the cloud and messages flowing through its iMessage service. Apple contends that the goal is to detect images and videos of known child abuse. Of course, there are inherent security issues with that approach.

This happened against the backdrop of Apple's marketing claim that what happens on your iPhone stays on your iPhone. For years, the US government has been pushing a war on encryption, essentially demanding that backdoors be built into encrypted communications. Large tech companies, Apple included, pushed back against this idea and increasingly positioned themselves as proponents of privacy. That is how Apple built its reputation in recent years as a protector of people's privacy.

So, when Apple announced that it would be scanning people's files in the background, it shocked a lot of people, who saw the move as a complete 180 for the company. The announcement sparked petitions and international condemnation, adding to the backlash Apple was already facing over its plans.

Earlier this month, facing mounting criticism, Apple said that it would delay its scanning plans. From CNBC:

After objections about privacy rights, Apple said Friday it will delay its plan to scan users’ photo libraries for images of child exploitation.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple shares were down slightly Friday morning.

Apple immediately stirred controversy after announcing its system for checking users’ devices for illegal child sex abuse material. Critics pointed out that the system, which can check images stored in an iCloud account against a database of known “CSAM” imagery, was at odds with Apple’s messaging around its customers’ privacy.

From the beginning, critics have said that this file scanning effort is far more invasive than what other companies have implemented. Apple tried to deflect this criticism by saying that its system merely detects images and videos that match a known database.
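To illustrate the general idea of database matching that Apple describes, here is a deliberately simplified sketch in Python. It uses exact SHA-256 hashes and a hypothetical hash set; Apple's actual system uses a perceptual hashing scheme (NeuralHash) with blinded on-device matching, which is considerably more complex than this toy example.

```python
import hashlib

# Hypothetical database of known-image hashes (illustrative values only,
# not real CSAM hashes; in practice the database is provided by child
# safety organizations and is not exposed in plaintext).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_database(b"example-known-image-bytes"))  # True
print(matches_database(b"an-ordinary-photo"))          # False
```

The key point critics raise is not the matching itself but the infrastructure: once a device can compare local files against an externally supplied database, the contents of that database can change.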

The Electronic Frontier Foundation (EFF) said that this was not good enough. They called on Apple to abandon its plans altogether:

Apple announced today that it would “take additional time over the coming months to collect input and make improvements” to a program that will weaken privacy and security on iPhones and other products. EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely.

The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship. These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a new mass surveillance system to spy on citizens. They also put already vulnerable kids at risk, especially LGBTQ youth, and create serious potential for danger to children in abusive households.

The responses to Apple’s plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children. This week, EFF’s petition to Apple demanding they abandon their plans reached 25,000 signatures. This is in addition to other petitions by groups such as Fight for the Future and OpenMedia, totaling well over 50,000 signatures. The enormous coalition that has spoken out will continue to demand that user phones—both their messages and their photos—be protected, and that the company maintain its promise to provide real privacy to its users.

In response, the EFF joined Fight for the Future to organize protests against Apple. According to Fight for the Future, protesters did show up at Apple stores:

Protesters with banners and signs are gathering outside Apple stores in major cities across the US, demanding the company commit to never implementing its misguided on-device photo and message scanning proposal

The night before Apple’s much-hyped iPhone 13 rollout tomorrow, the company is facing protests at its retail stores across the US. Organized by Fight for the Future, the Electronic Frontier Foundation, and a network of volunteers, the protests are demanding that Apple permanently shelve their dangerous proposal to install photo and message scanning malware on millions of people’s devices. The company already announced it was delaying the misguided proposal after widespread backlash from security experts and human rights experts. Protesters are calling on them to publicly commit to never implementing it.

The post contains numerous pictures of different locations across the US where protesters demanded that Apple abandon its scanning proposal.

At this point, Apple hasn't said that it will abandon its plan altogether, but the opposition has at least won a delay. That is some headway, and the push to get Apple to back down is still ongoing.

Drew Wilson on Twitter: @icecube85 and Facebook.


