Apple’s Surveillance Plans Spark Petitions and International Condemnation

Apple’s plan to scan people’s personal files is drawing negative attention from around the world. Many are asking Apple to halt its plans.

Earlier this month, Apple announced plans to begin scanning people’s files, checking their devices for images and videos of child exploitation. The level of surveillance being planned, however, has sparked concerns that such backdoor technology could be exploited by governments around the world.

For many advocates, the concern is magnified by the fact that Apple has long billed itself as a company that respects privacy. In 2019, Apple was among a number of companies that called on the American government to implement an American version of Europe’s GDPR privacy laws. Apple was also among the companies targeted by lawmakers’ scorn over its efforts to strengthen encryption, at a time when some lawmakers were demanding that effective encryption be banned in the country. Moments like these raised Apple’s profile in the world of privacy and security in recent years.

With this latest effort, Apple appears to be undoing its reputation as a company that respects user privacy. Instead, it is increasingly being seen by some as a threat to users’ personal privacy. As one can imagine, advocates of personal privacy are not going to politely ignore this. In fact, digital rights organizations from around the world have signed an open letter calling on Tim Cook, CEO of Apple, to reverse this decision. From the open letter:

Apple announced that it is deploying a machine learning algorithm to scan images in its text messaging service, Messages, to detect sexually explicit material sent to or from people identified as children on family accounts. This surveillance capability will be built right into Apple devices. When the algorithm detects a sexually explicit image, it warns the user that the image may be sensitive. It also sends a notice to the organiser of a family account whenever a user under age 13 chooses to send or to receive the image.

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.

Apple also announced that it would build into the operating system of its products a hash database of CSAM images provided by the National Center for Missing and Exploited Children in the United States and other child safety organisations. It will scan against that database every photo its users upload to iCloud. When a preset threshold number of matches is met, it will disable the account and report the user and those images to authorities. Many users routinely upload the photos they take to iCloud. For these users, image surveillance is not something they can opt out of; it will be built into their iPhone or other Apple device, and into their iCloud account.

We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.
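To make the second mechanism the letter describes more concrete, here is a minimal, illustrative Python sketch of hash-based matching against a database with a reporting threshold. This is not Apple’s implementation; Apple reportedly uses a perceptual “NeuralHash” combined with cryptographic threshold techniques, while the hash function, database contents, function names, and threshold value below are hypothetical stand-ins.

```python
import hashlib

# Hypothetical database of hashes of known CSAM images. Real systems use
# perceptual hashes supplied by child-safety organisations so that resized
# or re-encoded copies still match; SHA-256 is only a stand-in here.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical preset threshold: the account is only flagged once this
# many uploads match the database, as the open letter describes.
REPORT_THRESHOLD = 30

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest identifying an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_icloud_uploads(uploads: list[bytes]) -> bool:
    """Count uploads whose hash appears in the database and report
    whether the match count meets the preset threshold."""
    matches = sum(1 for img in uploads if hash_image(img) in KNOWN_IMAGE_HASHES)
    return matches >= REPORT_THRESHOLD

# Example: an account with no matching uploads is not flagged.
print(scan_icloud_uploads([b"holiday photo"]))  # False
```

The threshold is the detail doing the work in this design: a single match reveals nothing to anyone, but once the count is met, the account is flagged and reported, which is why critics focus on how easily that threshold and database could be changed after the fact.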

The Electronic Frontier Foundation (EFF) also encouraged its supporters to sign a petition calling on Apple to abandon its plans. From the EFF:

Mass surveillance is not an acceptable crime-fighting strategy, no matter how well-intentioned the spying. If you’re upset about Apple’s recent announcement that the next version of iOS will install surveillance software in every iPhone, we need you to speak out about it.

Apple plans to install two scanning systems on all of its phones. One system will scan photos uploaded to iCloud and compare them to a database of child abuse images maintained by various entities, including the National Center for Missing and Exploited Children (NCMEC), a quasi-governmental agency created by Congress to help law enforcement investigate crimes against children. The other system, which operates when parents opt into it, will examine iMessages sent by minors and compare them to an algorithm that looks for any type of “sexually explicit” material. If an explicit image is detected, the phone will notify the user and possibly the user’s parent, depending on age.

These combined systems are a danger to our privacy and security. The iPhone scanning harms privacy for all iCloud photo users, continuously scanning user photos to compare them to a secret government-created database of child abuse images. The parental notification scanner uses on-device machine learning to scan messages, then informs a third party, which breaks the promise of end-to-end encryption.

Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users.
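To illustrate the notification flow the EFF describes in the second system, below is a heavily simplified Python sketch: an on-device classifier scores an image, the user is warned, and the family organiser may be notified for younger users. The classifier stub, score threshold, and function names are hypothetical; Apple’s actual on-device model and rules are not public in this form.

```python
EXPLICIT_SCORE_THRESHOLD = 0.9  # hypothetical classifier cutoff
PARENT_NOTIFY_AGE = 13          # per the announcement, notices cover users under 13

def classify_explicit(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model that returns the probability
    an image is sexually explicit. Returns a dummy score here."""
    return 0.0  # a real system would run a trained image classifier

def notify_family_organiser() -> None:
    """The step critics object to: a third party is informed about
    message content, which breaks the end-to-end encryption promise."""
    print("Notice sent to the family account organiser.")

def handle_message_image(image_bytes: bytes, user_age: int) -> None:
    """Scan an incoming or outgoing image and apply the warning rules."""
    score = classify_explicit(image_bytes)
    if score >= EXPLICIT_SCORE_THRESHOLD:
        print("Warning: this image may be sensitive.")  # shown to the user
        if user_age < PARENT_NOTIFY_AGE:
            notify_family_organiser()

handle_message_image(b"example image bytes", user_age=12)
```

Even in this toy form, the structural point the critics make is visible: the scan and the parental notification happen outside the sender-to-recipient channel, so confidentiality no longer depends solely on the two people in the conversation.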

American users can sign the petition on this page.

Meanwhile, in Canada, digital rights organization OpenMedia is collecting signatures from Canadians who want their voices heard on this issue. From OpenMedia:

After years of rejecting government pressure to break encryption, and promising us that data on our iOS devices is secure, Apple has caved to the pressure. In iOS 15, Apple is introducing a mandatory backdoor for government agencies and Apple to access all the images we store on our phones that are backed up to iCloud — once certain conditions are met.

This is something Apple said they would never do. While Apple is making those conditions stringent to start, once the technical backdoor is in place, it isn’t a question of if the government will force them to loosen their conditions of access — it is a question of when.

We need to send a clear message to Apple that they must stick to their promise to us and never give access to the data on our phones to anyone. Sign the petition telling Apple to uphold their privacy promise by protecting the data on our phones!

Their petition is also available on the same page.

The condemnation has also been heard in Europe. European Digital Rights (EDRi) offered the following comment:

EDRi and other civil society groups have been warning of the risks of these proposed technological ‘solutions’. Despite the laudable goal to protect children, these changes in fact introduce measures that make everyone less safe by creating a ‘backdoor’ into our private lives. With Apple blessing these privacy-invasive technologies and the adoption of the recent interim CSAM legislation, we are deeply concerned that these practices will be normalised and promoted further by other companies and by policy-makers. We call on Apple to abandon these proposed changes and to step up against corporate and government surveillance.

One question in all of this is whether Apple will actually back down. We haven’t heard much from Apple since the scandal broke. What is likely happening is that the company is gauging the public reaction and performing something along the lines of a risk assessment. Another question is whether this story will simply fade away in a few weeks’ time. Sadly, there is precedent suggesting that this is a possible outcome.

Back in April, Facebook suffered a massive breach that exposed the data of 533 million users. A subsequent follow-up showed that Facebook’s strategy was simply to lay low and hope the controversy blew over. As frustrating as it is, Facebook largely got its way. There was apparently litigation against Facebook, but the story did die down after several other stories broke. We’ll have to see whether that pattern repeats itself in Apple’s case.

Drew Wilson on Twitter: @icecube85 and Facebook.
