Apple’s Position on Crypto
There has been a lot of tooth-gnashing lately about a conflict between the Obama administration and Apple, related to Apple’s stance that its customers are entitled to use strong encryption to protect the data on their devices. Apple has designed security and encryption into its iOS devices at a fundamental, opt-out level.
When you set up a new iPhone, you are prompted to set a passcode, and the device is encrypted and locked with it. If you forget the passcode, you cannot take the device to Apple and have them unlock it. This is not because Apple is worried the device might be stolen, or that something else nefarious is going on; it is because Apple has designed the system to completely protect your device and your data from unwanted eyes.
The FBI, on the other hand, wants a back door into your device, so that if you are suspected of doing things they dislike, they can access your phone (which is very likely a complete record of everything digital you do, or have done).
What the government wants
A lot has been written about this, and I won’t cover it all. I understand that the FBI needs to enforce laws, and that prosecutors need access to evidence. I also understand that technology is always ahead of the law. We now have a situation whereby law enforcement could effectively tap your phone, your Facebook account, your text messages, your SnapChat, and your email, simply by getting access to your device. The law does not effectively deal with this shortcut to data, either.
This all got more interesting yesterday, as a Federal Court ordered Apple to help the FBI circumvent the security of a terrorist’s device. I’m a big fan of Apple’s response, but if you want a bit of background, check Matthew Panzarino’s great piece at TechCrunch.
What this is really about
In my opinion, this is a debate about the velocity with which the FBI can get access to the data on the phone. Almost (more on that word in a second) nothing on the phone is inaccessible to law enforcement through other means. For example, I took a look at my iPhone this evening, and other than the standard Apple stuff, I have a few social networks installed (TweetBot, Facebook, Path, LinkedIn, Peach, Instagram, and Ello). I have a few messaging apps installed, as well (Meerkat, Periscope, Facebook Messenger, SnapChat, and WhatsApp). All of this data is accessible if the FBI were to serve a subpoena on any single one of these companies.
My ISP can provide the logs of my server traffic, and a decent engineer should be able to filter that data and reconstruct a record of it, including data protected by SSL/TLS encryption, which the Edward Snowden leaks suggested can be broken in close to real time, and which is certainly not secure against a motivated investigator with the FBI’s resources.
In short, the FBI can get access to everything my phone has done, without my phone.
Except for one thing
The FBI cannot get access, retroactively, to the blue-bubble text messages I send from Apple’s Messages app. These messages are strongly encrypted, on my phone, using a public-key scheme that ensures only the intended recipient can read them. Apple’s design for Messages is, in my opinion, fairly secure. Messages are encrypted in a manner that prevents them from being read in transit by anyone without the proper keys, which are stored on the device, not on Apple’s servers. Effectively, you cannot read these messages without the device.
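To make the public-key idea concrete, here is a toy sketch in Python using textbook RSA with deliberately tiny numbers. This is an illustration only, not Apple’s actual protocol; iMessage uses vetted, production-grade algorithms and per-device keys. The point it demonstrates is the one above: anyone can encrypt with the public key, but only the holder of the private key (the device) can decrypt.

```python
# Toy RSA with tiny textbook primes. Real systems use keys thousands
# of bits long; these values exist only to show the mechanism.
p, q = 61, 53            # two small primes (secret)
n = p * q                # 3233: the public modulus
phi = (p - 1) * (q - 1)  # 3120: used to derive the private key
e = 17                   # public exponent (part of the public key)
d = pow(e, -1, phi)      # private exponent (stays on the device)

message = 65                        # a message encoded as a number < n
ciphertext = pow(message, e, n)     # sender encrypts with (e, n) only
recovered = pow(ciphertext, d, n)   # only the holder of d can reverse it

print(ciphertext)  # 2790: unreadable without d
print(recovered)   # 65: the original message
```

Without `d`, recovering the message requires factoring `n`, which is trivial here but computationally infeasible at real key sizes. That asymmetry is why intercepting traffic in transit, or subpoenaing the server that routed it, does not yield the plaintext.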
Apple does, however, have the “metadata.” They have to route messages to their recipients, so they do know when, and to whom, messages are sent. I suspect the FBI already has that information.
So this is the crux
This battle is about the speed of the FBI’s access, pitted against a very solid argument about our 4th and 5th Amendment rights, the security of our papers and letters, and the right to avoid self-incrimination. Current case law suggests, though Supreme Court precedent is unsettled, that you can refuse to provide law enforcement with data that could potentially incriminate you.
If you’ve ever gotten a virus on your computer, or clicked a link to an inappropriate website, or downloaded something that was not what it represented itself to be, you have opened yourself to potential liability. Might your computer or phone have something on it that you have no idea about, but which might lead someone to believe you’re doing something illegal?
Probably not. But remember what Warren Buffett said: “If a cop follows you for 500 miles, you’re going to get a ticket.”