Apple’s Plan to Scan Files Uploaded to iCloud Sparks Security Concerns

Apple’s plan to combat child exploitation by scanning every user’s photos is sparking concerns over privacy and security.

To get this out of the way right at the top: child exploitation is a serious problem, and no one is arguing that nothing should be done to combat it. The question is, what kind of method goes too far? What method actually does more harm than good? That’s a big part of the debate over Apple’s latest announcement concerning images uploaded to its cloud storage service.

As a general rule, online service providers take a hands-off approach to their customers as long as the bills are paid and nothing illegal is being done with their services. There are very good reasons for this. One big reason is that it maximizes the amount of innovation that can happen online. Another is that it encourages free speech. Those two reasons alone help promote a healthier, better Internet. For instance, if a certain government is doing something nefarious and a user has information that exposes it, the Internet can be a great place to publish that information while maintaining a certain amount of anonymity.

That, of course, leads to another key pillar of a better Internet: security. Would you honestly do online banking knowing that your information was not secure? Absolutely not. Would you purchase something online with a credit card knowing someone could obtain that card number and PIN? Heck no. Knowing that your information is secure online is a very big thing. After all, the amount of money that electronically changes hands every day is staggering, and there’s really no way that would be possible without strong security. Little surprise that online and tech security is such a huge industry in the first place.

So, when there is a move to undermine that security, it’s going to get attention. Whether it is an unauthorized third party breaking into something, a government wanting to clamp down on the effectiveness of security, or a company itself doing something to make its services less secure, people notice. The scope often determines how much attention the move gets.

So, when Apple said that it’s going to make changes to one of its larger services in a way that affects security, it got a lot of attention. From Forbes:

Apple Inc (AAPL.O) on Monday said that iPhone users’ entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service.

The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users’ phones, tablets and computers for millions of illegal pictures.

In a posting to its website on Sunday, Apple said it would fight any such attempts, which can occur in secret courts.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple wrote. “We will continue to refuse them in the future.”

In the briefing on Monday, Apple officials said the company’s system, which will roll out this fall with the release of its iOS 15 operating system, will check existing files on a user’s device if users have those photos synched to the company’s storage servers.

Julie Cordua, chief executive of Thorn, a group that has developed technology to help law enforcement officials detect sex trafficking, said about half of child sexual abuse material is formatted as video.

This easily caught the attention of digital rights organizations, many of which are expressing anger and disappointment over the decision. The Electronic Frontier Foundation (EFF) said that what Apple is implementing amounts to a backdoor in its security. From the EFF:

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.

There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.

When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what until this development is one of the preeminent encrypted messengers.
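To make the matching step the EFF describes concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple’s implementation: Apple’s actual system reportedly uses a perceptual hash (“NeuralHash”) combined with cryptographic matching protocols, whereas this sketch uses a plain SHA-256 digest, and the `KNOWN_HASHES` set and function names are hypothetical stand-ins.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal
# images (in reality maintained by an organization such as NCMEC).
# The value below is simply the SHA-256 digest of the empty byte
# string, used here so the example is self-contained and testable.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def flag_on_upload(data: bytes) -> bool:
    """Client-side check: flag a file before upload if its hash
    appears in the known-hash set."""
    return file_hash(data) in KNOWN_HASHES
```

The limitation of an exact cryptographic hash like this is also why it isn’t used in practice: changing a single pixel changes the digest entirely. Perceptual hashes tolerate such changes, but that tolerance is precisely what opens the door to false or near matches, which is part of the security community’s concern.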

Another fear is that with Apple now saying it can scan people’s devices for illicit material, pressure will only increase from governments hoping to target a host of other kinds of content. Sometimes it’s terrorism. Other times, governments want a new edge against political opponents. Some governments around the world are known to target general pornography. What gets targeted varies, and the content being targeted is always evolving.

It’s partly why security advocates push for systems that simply cannot be eavesdropped on. If there is a weakness in the security, it can be abused, and not only by those it was built for: malicious third parties can exploit it too. Weakening security in any way weakens the security, full stop. There is no “let’s weaken security, but only allow the good guys to use this exploit.” From a technological point of view, such a thing is not possible in any practical sense. A security hole is just a security hole.

Going back to the Apple example, if Apple can peer into people’s devices, then it’s theoretically possible that a malicious party can do so as well. Whether through a sophisticated hack, a successful spearphishing campaign, or an inside job, it opens the possibility that such a system can be infiltrated. So it really is no surprise that, for many, the security concerns with this idea have outweighed the possibility of combating child exploitation. What’s more, this is inevitably going to undermine the trust Apple has built up over the years.

As a result, Apple is really at a crossroads here. Does it maintain its status as a trusted company when it comes to user security, or does it continue down a path that has the security community screaming, “What are you doing, Apple?” Probably the only silver lining is that cloud storage competitors are licking their chops over this. Someone is going to be able to exploit this move by pointing out that they don’t scan your files because they believe in personal privacy. There could very well be a windfall incoming. At any rate, Apple is taking a hit to its image here.

Drew Wilson on Twitter: @icecube85 and Facebook.


