Thursday, September 25, 2014

Keep Your Government & Corporate Hands Off My Passwords

You may have heard that Apple has implemented a robust new security feature with its latest mobile operating system: the phone's data is now encrypted by default, and Apple retains no record of your passcode or other "backdoor." As a result, Apple cannot "unlock" your phone, even if it has physical possession of the phone, and even if it is served with a lawful warrant or subpoena. It's simply "technically infeasible" for Apple to comply. Law enforcement might as well send the iPhone to Google, which probably is just as likely to have a record of your iPhone passcode somewhere in its vast treasure trove of data about you.

There's nothing nefarious or even new about this, as this has been the standard for encrypted hard drives since forever. I'm writing this post on a 2010 MacBook Pro (with upgraded RAM and SSD drive, I might add!), and its hard drive is encrypted using Apple's standard File Vault utility. Apple offers to keep a copy of your recovery key, but it's not required. If you care about having a truly secure computer, especially as a lawyer, you decline the offer like I did.
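The reason this works, and the reason declining the recovery-key offer matters, is that the encryption key is derived from your passcode on the device itself, so the vendor never possesses anything it could hand over. Here's a minimal sketch of the idea, assuming nothing about Apple's actual implementation (the function and parameters are illustrative, not Apple's):

```python
# Illustrative sketch, NOT Apple's actual scheme: why "no backdoor"
# encryption leaves the vendor with nothing to hand over. The device
# derives the encryption key from the user's passcode plus a per-device
# salt; the vendor never sees the passcode or the derived key.
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A deliberately slow key-derivation function (PBKDF2 here) makes
    # brute-forcing the passcode expensive even for someone holding
    # the encrypted data and the salt.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)           # stored on the device; not secret
key = derive_key("1234", salt)  # exists only when the user enters the passcode

# A wrong passcode yields a completely different key -- there are no
# partial matches and no way to work backward from key to passcode.
assert derive_key("1235", salt) != key
```

Keeping a "recovery key" with Apple would just mean escrowing a second copy of `key` (or something that regenerates it) on Apple's servers, which is exactly the backdoor the new default declines to create.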

But because lots of people have iPhones, important people have noticed the change in Apple's default iPhone security settings, and some of them are freaking out. Most notably, Professor Orin Kerr—a well-respected and influential 4th Amendment scholar who blogs at Volokh Conspiracy—called Apple's move a "dangerous game" that would "thwart" lawful warrants and probably lead to reactionary legislation far worse for privacy interests and civil liberties than simply letting Apple store a copy of your passcode.

I had a lot of thoughts in response to Prof. Kerr's post, but I'm a terrible blogger so the vast majority of them have already been ably expressed by others:
  • Julian Sanchez detailed all the ways in which Apple's move is nothing new, so presents no shift in the overall "equilibrium" between privacy and law enforcement interests, and certainly is not in derogation of the public interest.
  • Matthew Green at Slate did the same, pointing out in particular how any ability we give the US government can be used equally by less friendly governments.
  • Windypundit explained how any backdoor Apple can exploit for the government is a backdoor bad guys can exploit. 
  • Kerr himself has admirably walked back from his original overreaction ("very troubling") to a more scholarly investigatory mode ("need more information to decide" and "where do you draw the line?").
I recommend you read all these responses. But there are a few things I think have been left unsaid.

Kerr's Sense of "Public Interest" is ... Very Troubling

Kerr's original reaction was based on his inability to imagine how Apple's change (encryption by default plus no backdoor) could possibly be in the "public interest." This only reveals either his impoverished imagination or his perverted sense of the "public interest." Others (above) have adequately exposed his lack of imagination, but the deeper problem, I think, is that his sense of the public interest essentially boils down to "law enforcement interests." The fact that Apple's change will make tens of millions of Americans more secure in their papers, effects, documents, photos, etc., apparently doesn't register for Kerr as something that could possibly count as in the public interest. That's really weird. Maybe this is a cheap shot, but Kerr's mindset makes it hard for me to imagine how the 4th Amendment's warrant requirement would meet his definition of the "public interest" if it were up for debate today.

Indeed, Kerr's initial response to Apple's move was suspicion because he thinks anything that makes warrants—the "gold standard" of privacy protection—less effective is presumed illegitimate. Apple's move therefore could not possibly be in the "public interest" because it would make it harder for law enforcement and counter-terrorist officials to crack cases, and Apple's old way already protected people from government snooping without a warrant.

But the existence of the 4th Amendment warrant requirement proves that there is indeed a "public interest" in respecting people's privacy: making millions and millions of Americans more secure in their possessions adds up to a nearly insurmountable public interest. Warrants are the minimum constitutional requirement for an invasion of privacy. It does not follow that there is no freestanding public interest in allowing people to maximize the security of their own possessions.

Consider, for example, a law that imposed criminal penalties for the destruction of any electronic documents. From Kerr's perspective, this would seem to be obviously beneficial to the public interest. After all, allowing people to destroy documents inevitably makes it less likely that future crimes will be solved. All of those documents would still be protected from government snooping without a valid warrant, and since nowadays there's no practical limit to the number of documents a person can store, there's no legitimate reason to destroy an electronic document.

Perhaps there's some basis for finding such a law unconstitutional, but my belief is that most people confronted with such a proposal would recoil in terror at such an intrusion on their privacy and autonomy. Such a law, which would decrease the "private interests" of millions, has a huge bar to clear to be considered in the "public interest" overall because for the most part the public interest is just the sum of private interests. And I think this analysis applies almost directly to the question of any policy to make mobile phones less secure than is technically feasible (which is what Kerr's conception of the public interest would require).

If Backdoors are in the Public Interest, Why Require Private Companies to Possess Them?

Now let's consider what would seem to be the natural response if you accept Kerr's premise that Apple's move is in derogation of the public interest: legislation to fix it. That's how we usually advance public interests. He proposes a simple amendment to a 90s law that essentially required cell-phone makers to let law enforcement tap them. Kerr thinks we'll see a movement to change that law just to require smartphone manufacturers to keep a backdoor or a copy of your decryption key so they can crack open a phone's data if served with a lawful warrant.

But even if Kerr is right that it's in the public interest for law enforcement to have this capability (he's not, of course), it's unclear to me why the answer is that people should be forced by government mandate to trust private, profit-maximizing companies with their secrets. I'm aware of no analogous legislation, and I think it would be quite radical.

Instead, if we really think it's in the public interest for all smartphones to be crackable by government, any "key escrow" should be in public hands. In other words, the legislation should require smartphone passwords to be registered with the FBI or some other government agency. Maybe even the Supreme Court. Or maybe the legislation could require mobile operating systems to have a backdoor that only the government itself is allowed to access. The same rules would apply: e.g., law enforcement could only access this publicly held database of passwords with a lawful warrant.

Now, the black helicopter brigade will scream and moan—"Are you crazy!? Trusting the government with our secrets??" But this is a modest proposal. Would you rather trust a private corporation like Apple, or the public-spirited civil servants in the good ole United States government?

And of course it would be made a serious crime for anyone to access this data without a warrant or for any improper purpose. To some extent we have no choice but to trust the people in power, and wouldn't we rather this information be in the hands of public servants rather than private corporations, if we're going to force it to be in someone's hands? This would also alleviate the concerns about bad-guy foreign governments being able to serve warrants on Apple; they'd have no rights to the information held secure by Uncle Sam in its Fort Knox bunker.

Obviously, I'm trying to illustrate the absurdity of the proposed legislation. It strikes me as absurd to legislate that people register their passwords with the government. But it's obviously more absurd to require that they register their passwords with private companies. Isn't it?


    1. My personal data, including my password, is not a public good. The real issue is that we haven't established that the producer of data, the individual, should own it. I generate my location data as I walk around with my phone, and it should belong to me, not Google. By that approach, Apple seems to be doing the right thing: asking the user what they want to do with their own property, in this case the encryption key. But we need a comprehensive understanding of data ownership, or each privacy issue will be answered in a vacuum and the "policy" will be inconsistent as a whole.

      1. Interesting point.

        I think part of what's going on here is the fact that most of us are, by default, pretty loose with how we protect this "property." We happily give Google pretty much whatever it wants, on the assumption that Google doesn't really care about us as individuals, only as consumers. Or perhaps on the assumption that Google has so much information that our daily walk is just a drop in the ocean.

        So you can understand a kind of envy from law enforcement - people will just give this stuff away, but when there's a serious crime to be investigated, all of a sudden it's the most private stuff in the world.

        I have an iPhone, and have upgraded to the latest operating system, which comes with the "health app." One of the features that is now evident is that it really does track your movements to a precise degree. It knows how much I walk and run, down to increments of .00001 miles. It is a little creepy, and a little bit awesome. I don't really go anywhere interesting (drop kids off at school, walk to work, go to court, come back, walk home, watch tv), but you can see how this data is very enticing to law enforcement. If there were no way to protect it from the government, I would turn it off.
