EFFecting Digital Freedom

by Jason Kelley

When We Fight, We Win

In September, EFF announced that we were planning to put our ten-year-old browser extension, HTTPS Everywhere, into maintenance mode.  This wasn't giving up - this was a victory.  As we had hoped when we launched the project, HTTPS, which encrypts website traffic, has become essentially ubiquitous.  HTTPS is, actually, everywhere, protecting against eavesdropping on and tampering with both the contents of a site and the information you send to it.  That means a network observer can't learn the content of the information flowing in either direction - for instance, the text of email messages you send or receive through a webmail site, the products you browse or purchase on an e-commerce site, or the particular articles you read on a reference site.

Since we started offering HTTPS Everywhere, the battle to encrypt the web has advanced by leaps and bounds.  The vast majority of the web is now encrypted.  In 2017, half of all web traffic was encrypted - by 2019, nearly 90 percent was.  And the relatively recent addition of a native HTTPS-only mode in Firefox, Chrome, Edge, and Safari has made our extension, which ensures that users benefit from the protection of HTTPS wherever possible, redundant.  This is a win for all users and a clear signal that encryption matters.

But despite the overwhelming recognition that encryption benefits users and its wholesale adoption by tech companies, encryption is still, incredibly, under attack.  Technologists, privacy activists, and everyday users were shocked when one of the largest providers of end-to-end encrypted messaging, Apple, announced in August that it would be adding two scanning features to its devices.  The first feature would search for "explicit" photos sent to or by young people via Messages, on Family plans that enabled it.  Separately (and using different technology), Apple is also planning to scan, on every device that uses its iCloud photo backup service, for matches against a database of known child sexual exploitation material.

Both of Apple's plans are what we call "client-side scanning."  Proponents maintain that if the device scans an encrypted message before or after it's been delivered, then end-to-end encryption - where only the sender and the recipient have the keys to unlock the message - remains unbroken.  This is wrong.  These privacy-invasive proposals work like this: every time you send a message, software that comes with your messaging app first checks it, either by hash matching or by applying an on-device machine learning classifier.  "Hash matching" checks the message against a database of "hashes," or unique digital fingerprints, usually of images or videos, while machine learning uses software trained to recognize similar images.
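To make hash matching concrete, here is a rough Python sketch of what a client-side check could look like.  It is a deliberate simplification, not Apple's code: the function names and the placeholder database are invented, and Apple's proposal reportedly relies on a perceptual hash (one that survives resizing and re-encoding) rather than the exact cryptographic hash used below.

    import hashlib

    # Placeholder database of "known bad" fingerprints shipped with the app.
    # In a real deployment, the user cannot inspect or audit this list.
    BLOCKED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder value
    }

    def fingerprint(image_bytes):
        # Reduce an image to a fixed-size digital fingerprint ("hash").
        return hashlib.sha256(image_bytes).hexdigest()

    def ok_to_send(image_bytes):
        # The check runs on your own device, before encryption in transit
        # ever comes into play.
        return fingerprint(image_bytes) not in BLOCKED_HASHES

The crucial point is where the check runs: on your device, before the message is ever protected in transit, so "end-to-end" no longer describes what the software on your own phone is doing.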

In either case, if it finds a match, it may refuse to send your message, notify the recipient, or even forward it to a third party, possibly without your knowledge.  Apple's plan is to scan the photos sent by or to some young people via Messages for "explicit" content, and potentially notify parents when such a photo is sent or received.  Apple's other plan is to use client-side hash matching to scan iCloud photo libraries; if enough photos match the hashed database of known child exploitation material, a manual review will take place, and Apple may eventually send the information to the National Center for Missing & Exploited Children (NCMEC).
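As a hypothetical sketch of that threshold-then-review flow - the names, data structures, and threshold value here are illustrative assumptions, not Apple's implementation:

    MATCH_THRESHOLD = 30   # illustrative value; the real threshold is Apple's to choose

    match_counts = {}      # per-account count of photos matching the hashed database

    def handle_icloud_upload(account_id, photo_hash, blocked_hashes, review_queue):
        # Only photos that match the database are counted.  Once an account
        # crosses the threshold, it is queued for manual review, after which
        # a report may go on to NCMEC.
        if photo_hash in blocked_hashes:
            match_counts[account_id] = match_counts.get(account_id, 0) + 1
            if match_counts[account_id] >= MATCH_THRESHOLD:
                review_queue.append(account_id)

Notice that nothing in a design like this constrains what goes into the blocked-hash database - the same machinery works unchanged for whatever content someone with leverage over that database demands.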

Though these plans differ in intent and technology, they are both extremely dangerous.  A backdoor by any other name is still a backdoor.  Client-side scanning, like many proposals to infiltrate secure messaging before it, would render the user privacy and security guarantees of encryption hollow.  Adding a backdoor that is accessible only to the good guys is impossible, and it fundamentally breaks the promise of encryption.  It would offer not only an incentive but a ready-built system for authoritarian governments to scan for additional content from all users - such as protest imagery or even LGBT content in countries where it is outlawed.  Creating a system to scan photos uploaded to the iCloud service, or photos sent to other users, isn't a slippery slope; it's a fully built system just waiting for external pressure to make the slightest change.

As we have for decades, EFF and our allies jumped into action after Apple's announcement, delivering a petition with nearly 60,000 signatures to Apple in under a month, leading protests at Apple stores, and even flying a banner over Apple's headquarters during its September iPhone launch event.  We also wrote nearly a dozen different explanations of why the plan was dangerous, many of which have been translated into multiple languages and quoted by hundreds of news sites.  Due to this massive criticism, which came not only from EFF and our allies but also from human rights groups, heads of state, and users like you, Apple announced it would delay the launch of these features.  This isn't sufficient, of course - the company must wholly commit to protecting encryption and user privacy - but for now, encryption continues to thrive.  When cryptographer and cybersecurity expert Bruce Schneier was asked, during the press conference where we delivered our petitions, whether backdoors to Apple's iPhones were inevitable, he was blunt: "Client-side scanning is the solution du jour.  If you go back through the decades, there were different ones: there was key escrow, weakening algorithms... an update mechanism to push hacked updates.  In every time period, a different solution becomes the one that is favored.  Inevitably, in the past, none of those have been implemented.  So, I don't think this is inevitable either.  So far, we're doing well here in not getting a backdoor into our devices."

Encryption has become standard across the web, and the number of users who rely on encrypted messaging and encryption generally has grown.  Even as governments and law enforcement have pushed tech companies for decades to find backdoors into encryption and encrypted devices, smartphones have become both more important and more secure.  They are in the pockets of elected officials and heads of state, CEOs and power plant operators, judges and police officers.  We rely on encrypted messaging in ways that many don't even realize.

Apple famously denied the FBI's request to add a backdoor into the device of a suspected terrorist in 2016, which is part of what makes their client-side scanning plan such a slap in the face.  We believe the company will find its way to the correct, secure decision in this case as well.  There's no guarantee that we'll win the war, but so far, we've won the battles - and EFF will continue to fight to protect the growing number of users who rely on safe, secure communications, whether that's on the web, on phones, or anywhere else.  The contours of the fight may have shifted, but when we fight, we win.
