[ Updated below, but I’m leaving the text here as I originally wrote it. ]
By now, just about everyone has seen the open letter from Apple about device encryption and privacy. A lot of people are impressed that a company with so much to lose would stand up for its customers. Eh, maybe.
I have two somewhat conflicting thoughts on the whole matter:
1)
If Apple had designed security on the iPhone properly, it would not even be possible for them to do what the government is asking. In essence, the government plan is for Apple to develop a new version of iOS that they can “upgrade” the phone to, which would bypass (or make it easier to bypass) the security on the device. Of course, it should not be possible to upgrade the OS of a phone without the consent of a verified user, so this is a bug they baked in from the beginning, for their own benefit, of course, not the government’s.
Essentially, though they have not yet written the “app” that takes advantage of this backdoor, they have already created the backdoor itself in a sense. The letter is therefore deceptive as written.
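To make the design flaw concrete, here is a minimal sketch in Python of the missing check. Everything here is hypothetical, and a real system would use asymmetric signatures rather than an HMAC; the point is only that a locked device should demand proof of user consent before accepting new firmware, so the vendor’s signature alone is never sufficient:

```python
import hmac
import hashlib

# Hypothetical sketch: names and keys are invented for illustration.
VENDOR_KEY = b"vendor-signing-key"  # stands in for Apple's signing key

def vendor_mac(image: bytes) -> bytes:
    # "Sign" the firmware image (an HMAC keeps the sketch
    # self-contained; a real system would use asymmetric signatures).
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def install_update(image: bytes, signature: bytes, user_consented: bool):
    # Check 1: the image really comes from the vendor.
    if not hmac.compare_digest(vendor_mac(image), signature):
        raise PermissionError("untrusted image")
    # Check 2, the one at issue: a verified user approved this update.
    # Without this gate, the vendor alone can strip security features.
    if not user_consented:
        raise PermissionError("user did not consent to this update")
    print("installing update")

image = b"new firmware"
install_update(image, vendor_mac(image), user_consented=True)
```

With the second check in place, the scenario in the government’s request simply doesn’t work: no one, Apple included, can push a security-weakening build onto a phone whose owner hasn’t unlocked it.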
2)
The US government can get a warrant to search anything. Anything. Any. Thing. This is how it has been since the beginning of government. They can’t search without a warrant, and they can’t (well, shouldn’t be able to) pursue wholesale data mining of every single person, but they can get a warrant to break open any locked box and see what’s inside.
Why should data be different?
I think the most common argument around this subject is that the government cannot be trusted with such power. That is, yes, the government may have a reasonable right to access encrypted data in certain circumstances (like decrypting a known terrorist’s phone!), but the tools that allow that also give them the power to access data under less clear-cut circumstances.
The argument then falls into slippery-slope territory, a domain in which I’m generally unimpressed. In fact, I would dismiss it entirely if the US government hadn’t already engaged in widespread abuse of similar powers.
Nevertheless, I think the argument that the government should not have backdoors to people’s data is one of practical controls rather than fundamental rights to be free from search.
I have recommendations to address both thoughts:
- Apple, like all manufacturers, should implement security properly, so that neither they nor any other entity possesses a secret backdoor.
- Phones should have a known backdoor: a one-time password algorithm seeded at the time of manufacture, with the seed stored and managed by a third party, such as the EFF. Any attempt to access this password, whether granted or denied, would be logged and viewable as a public record. (A sketch of the idea follows this list.)
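Here is a rough Python sketch of that second recommendation. Everything in it is my own invention for illustration: the class name, the HOTP-style code derivation, and the log format are all assumptions, meant only to show that an escrow agent can issue one-time codes and keep a public audit trail with very little machinery:

```python
import hmac
import hashlib
import time

# Hypothetical escrow agent for the known-backdoor proposal. A
# per-device seed is deposited at manufacture; every unlock request,
# granted or denied, lands in an append-only public log.
class EscrowAgent:
    def __init__(self):
        self.seeds = {}        # device serial -> factory seed
        self.public_log = []   # append-only, publicly viewable

    def enroll(self, serial, seed):
        self.seeds[serial] = seed

    def request_unlock(self, serial, warrant_id, granted):
        # The request is logged whether or not a code is issued.
        self.public_log.append({
            "time": time.time(),
            "serial": serial,
            "warrant": warrant_id,
            "granted": granted,
        })
        if not granted:
            return None
        # HOTP-style one-time code: each code depends on the escrowed
        # seed and a counter, so no code is ever valid twice.
        counter = str(len(self.public_log)).encode()
        mac = hmac.new(self.seeds[serial], counter, hashlib.sha256)
        return mac.hexdigest()[:8]

agent = EscrowAgent()
agent.enroll("SN-12345", b"seed-burned-in-at-the-factory")
code = agent.request_unlock("SN-12345", "warrant-001", granted=True)
print(code)              # the one-time unlock code
print(agent.public_log)  # the public record of the request
```

The design choice that matters is that logging happens before the grant/deny decision is even consulted, so there is no code path where a request touches the seed without leaving a public trace.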
I don’t have a plan for sealed and secret warrants.
[ Update 2/17 11:30 CA time ]
So, the Internet has gone further and explained a bit more about what Apple is talking about and what the government has asked for. It seems that, basically, the government wants to be able to brute-force the device, and wants Apple to make a few changes to make that possible:
- that the device won’t self-wipe after too many incorrect passwords
- that the device will not enforce extra time-delay between attempts
- that the attempts can be conducted electronically, via the port, rather than manually on the touch screen
I guess this is somehow different from Apple being able to hack their own devices, but to me, it’s still basically the same situation. They can update the OS and remove security features. That the final attack is brute force rather than a backdoor is hardly relevant.
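Some quick arithmetic shows why those three changes are the whole ballgame. The per-attempt rate below is an assumption (hardware key derivation on the order of 80 ms per guess); adjust it and the shape of the result holds:

```python
# Back-of-the-envelope brute-force times once the wipe and delays
# are gone and guesses arrive through the port.
ATTEMPT_SECONDS = 0.08  # assumed time per passcode guess

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_case = keyspace * ATTEMPT_SECONDS
    print(f"{digits}-digit passcode: {keyspace:,} combinations, "
          f"worst case ~{worst_case / 3600:.1f} hours")
```

A four-digit code falls in minutes, and even six digits is about a day of work. With the delays and the ten-attempt wipe in place, the same search is effectively impossible, which is exactly what those features are for.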
So I’m standing behind my assessment that the Apple security is borked by design.