The Powers That Want to Be

There is nothing new here.

For as long as they've existed, governments have wanted more information and intelligence on what their citizens are up to.  There likely will never be a government that isn't fixated on this, even if they don't start off that way.  It's an inevitable side effect of power, one that exists in corporate structures as well.  And as long as we remember this, we should be able to deal with it in a variety of creative ways.  But we all too often forget this danger of authority, especially in times of crisis or perceived crisis.

That is what we saw back in February when the FBI came to Apple with a seemingly strange request.  In order to gain access to the phone of a deceased mass shooter, they would need the company to write code to bypass its own security - specifically, the feature that wipes the data off the phone after a certain number of invalid passcode attempts.  Apple, to its credit, challenged this request and has since been mischaracterized as aiding and abetting criminals, terrorists, etc.  Par for the course for those who don't leap up to play ball.
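To make the feature in question concrete, here is a minimal sketch of how a wipe-after-too-many-failures policy works.  This is purely illustrative - the class name and structure are our own invention, not Apple's actual implementation, though the ten-attempt limit matches the optional "Erase Data" setting on iOS devices:

```python
# Illustrative sketch (NOT Apple's actual code) of a policy that wipes
# protected data after too many invalid passcode attempts.
MAX_ATTEMPTS = 10  # iOS's optional "Erase Data" setting allows ten tries

class PasscodeGuard:
    def __init__(self, passcode, data):
        self._passcode = passcode
        self._data = data
        self._failures = 0

    def unlock(self, attempt):
        """Return the protected data on success; None on failure.

        After MAX_ATTEMPTS consecutive failures, the data is destroyed.
        """
        if self._data is None:
            raise RuntimeError("device wiped")
        if attempt == self._passcode:
            self._failures = 0
            return self._data
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._data = None  # irreversible wipe
        return None
```

What the court order effectively demanded was firmware with the failure counter and wipe removed, so that passcodes could be brute-forced at leisure - which is why the request amounted to Apple defeating its own security.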

Now it's one thing when the NSA spies on everybody and spends all of its time trying to crack our encryption.  Sure, it's a betrayal and it goes against everything that our country stands for, but it's pretty much the kind of thing we've always expected them to do.  Looking back at our own publication from as far back as the 1980s reveals that very suspicion from many of our readers and writers.  But what we have with the Apple case is something very different.  Here we see a company being expected ("asked" is just too weak a word) to bypass its own security to help in an investigation.  Laughably, they were told that this would only be used one time and wouldn't become a routine method of gaining access to any phone the feds felt like investigating.  When has a tool ever been invented and then immediately destroyed?  Everyone knows that once in place, it would always be in place.  And even if it were completely wiped out of existence after being used this one time, the precedent would remain, meaning that another request would simply mandate the tool be reinvented.  A bit slower, perhaps, but just as destructive to our privacy and Apple's reputation.

That is why this fight is so important.  We doubt anyone would have a problem handing over a PIN or password that would help in the investigation of someone who obviously was up to no good.  If there is a hint of co-conspirators somewhere, that certainly should be investigated.  But to do this in the way the government desires is the equivalent of kicking in everyone's doors to investigate a single criminal.  Better analogies might be expecting homeowners to hand over copies of their house keys, or all of us to make a list of our passwords so investigators can have a look around whenever they feel the need to.  Subverting its own code is the digital equivalent of this for Apple.  And the fact that we're even having this dialogue is worrisome in many ways.

First off, how many companies have received similar requests without challenging them?  How hard would it be for governments to demand such access and also forbid the recipient from disclosing the requests?  We've already seen this happen on the transactional level with National Security Letters (NSLs), but this scenario takes that a step further with the expectation that companies will not only turn over information, but also construct the means that allow this to happen in the first place.  And what assurance do we have that some other government, or even another agency within the same government, won't make additional or even more intrusive requests?

The best-case scenario would have occurred if Apple were able to say with assurance that they couldn't fulfill this request because it simply wasn't possible.  But that's not what happened.  Apple said that it was indeed doable, but would violate its trust with its customers and basically destroy its own security.  In other words, Apple engineers here or anywhere in the world might have already succeeded in doing this.  We're told they didn't, but that requires us to trust what we're being told without any real evidence.  That's a problem.

One could make the argument that since we know Apple's security can be thwarted by its own engineers, it's as good as cracked already and they might as well just humor the authorities.  Of course, that greatly devalues the millions of devices out there that are supposedly protected.  But how secure are they really, now that we know they can be subverted by people with the right amount of access?

The answer, obviously, is to have technology in place where the actual users, not the companies, control the security - unfortunately still something of a rarity.  If, for example, you send someone a PGP-encrypted message, the government can't just go to an Internet Service Provider and demand that they decrypt it.  The provider simply can't do this because only the recipient has access to their private key and only they know the passcode.  Of course, the user can do something stupid by storing their private key publicly and keeping the passcode in a file called "PGP-passcode" in an account that authorities can access.  But that puts the onus entirely on the end user.  If they take the right steps, they will be secure from prying eyes.  It doesn't preclude the NSA and their ilk from turning their attention to cracking this code, but it gives the user as good a chance as any against this sort of thing.  And this is something we all should expect as a basic right; just because there are criminals in our midst does not mean we should give up any of our own privacy.
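The principle at work here can be shown with a toy example.  This uses textbook RSA with absurdly small numbers purely for illustration - real PGP uses keys thousands of bits long and a hybrid scheme with symmetric session keys - but it demonstrates the point: anyone, including an ISP, can encrypt with the public key, yet only the holder of the private key can decrypt:

```python
# Toy textbook RSA, for illustration only.  Never use numbers this small
# (or unpadded RSA at all) for real encryption.
p, q = 61, 53                 # two (absurdly small) secret primes
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120, used to derive the private key
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (needs Python 3.8+): 2753

def encrypt(m, public_key):
    """Anyone with the public key can do this - including your ISP."""
    e, n = public_key
    return pow(m, e, n)

def decrypt(c, private_key):
    """Only the private key's owner can do this."""
    d, n = private_key
    return pow(c, d, n)

message = 65                               # a message encoded as a number < n
ciphertext = encrypt(message, (e, n))      # no secrets needed to encrypt
plaintext = decrypt(ciphertext, (d, n))    # recovers 65 - but only with d
```

An ISP served with a demand to decrypt holds only `(e, n)`; without `d`, which never leaves the user's hands, there is nothing it can turn over.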

Part of the myth that gets floated in situations like this is that authorities are being crippled by technology in their struggle to track down the bad guys.  Encryption, anonymity, rapid transfers of data - they all keep criminals ahead of the law.  This is, for the most part, utter nonsense.  Consider how technology has changed over recent years and decades.  We have devices that track our every movement on street corners, in our vehicles, and even in our pockets.  Often we willingly embrace these intrusions as marvels and conveniences.  Other times we don't even know they're there.  There are gadgets in our homes that listen and watch, reporting everything from our physical motions to the television channels we watch to the room temperature we choose.  It's estimated that the average person is on camera around 75 times a day, and as many as 300 times a day in cities like London.  License plates can be scanned at a rate of thousands per minute, which keeps a pretty good record of who's in what part of town.  Add social media to the mix and you'll quickly see how much of our lives is open to scrutiny and analysis.

Police and investigators will always claim that being shut out of one source of information makes it impossible for them to do their job.  But it wasn't that long ago that none of the above conveniences were available to them and yet they still managed.  Yes, there are challenges to law enforcement that technology presents, but these are offset by advantages that make their job easier.  If they could read our brain waves and know our every thought, they would object strenuously to losing that ability, saying that there would be no way to stop crime without it.  It's simply not true.

The Apple case should serve as a warning to any of us who truly value our privacy.  The information we store on our phones is not as secure as we might think, whether due to technological weakness or the force of authority.  Until we are confident that the security we employ to protect the data on our phones is strong and ultimately under our control, and not that of a large company somewhere, we must assume that anything we store could fall into the wrong hands.  That includes pictures, texts, contacts, access gained through any apps, plus a whole lot more.  Consider also that most people opt to back up their phone's data to the cloud so that it doesn't get lost if the phone does, which is smart on one level but opens up yet more vulnerabilities on another.  We would be wise not to store the entirety of our lives on these or any insecure devices, even if we feel we have "nothing to hide."  We all value our privacy, and that's one thing technology can help to strengthen as long as we make it a priority.
