Apple Open Letter… eh

[ Updated below, but I’m leaving the text here as I originally wrote it. ]

 

By now, just about everyone has seen the open letter from Apple about device encryption and privacy. A lot of people are impressed that a company with so much to lose would stand up for its customers. Eh, maybe.

I have two somewhat conflicting thoughts on the whole matter:

1)

If Apple had designed security on the iPhone properly, it would not even be possible for them to do what the government is asking. In essence, the government's plan is for Apple to develop a new version of iOS that they can "upgrade" the phone to, one which would bypass (or make it easier to bypass) the security on the device. Of course, it should not be possible to upgrade the OS of a phone without the consent of a verified user, so this is a bug they baked in from the beginning (for their benefit, of course, not the government's).

Essentially, though they have not yet written the “app” that takes advantage of this backdoor, they have already created it in a sense. The letter is therefore deceptive as written.
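
To make the first point concrete, here is a minimal sketch of the update policy I'm arguing Apple should have shipped: a new OS image is installed only if it is both signed by the vendor and approved by a verified user on the device. This is not Apple's actual update path; every function and check here is a hypothetical stand-in.

```python
# Hypothetical sketch: OS updates require vendor signature AND user consent.
# Stand-ins only; this does not reflect Apple's real update or signing process.

import hashlib
import hmac

def vendor_signature_valid(image: bytes, signature: bytes, vendor_key: bytes) -> bool:
    # Stand-in for real public-key signature verification of the OS image.
    expected = hmac.new(vendor_key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

def user_approved(passcode_attempt: str, stored_passcode_hash: bytes) -> bool:
    # Stand-in for on-device passcode verification (e.g. inside a secure enclave).
    attempt_hash = hashlib.sha256(passcode_attempt.encode()).digest()
    return hmac.compare_digest(attempt_hash, stored_passcode_hash)

def may_install_update(image, signature, vendor_key, passcode_attempt, stored_hash) -> bool:
    # The point: the vendor's signature alone is not sufficient to push an update.
    return (vendor_signature_valid(image, signature, vendor_key)
            and user_approved(passcode_attempt, stored_hash))
```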

2)

The US government can get a warrant to search anything. Anything. Any. Thing. This is how it has been since the beginning of government. They can't go out and do so without a warrant. They can't (well, shouldn't be able to) pursue wholesale data mining of every single person, but they can get a warrant to break any locked box and see what's inside.

Why should data be different?

I think the most common argument around this subject is that the government cannot be trusted with such power. That is, yes, the government may have a reasonable right to access encrypted data in certain circumstances (like decrypting known terrorists' phones!), but the tools that allow that also give them the power to access data under less clear-cut circumstances as well.

The argument then falls into slippery-slope territory, a domain in which I'm generally unimpressed. In fact, I would dismiss it entirely if the US government hadn't already engaged in widespread abuse of similar powers.

Nevertheless, I think the argument that the government should not have backdoors to people’s data is one of practical controls rather than fundamental rights to be free from search.

 

I have recommendations to address both thoughts:

  1. Apple, like all manufacturers, should implement security properly, so that neither they nor any other entity possesses a secret backdoor.
  2. Phones should have a known backdoor: a one-time password algorithm seeded at the time of manufacture, with the seed stored and managed by a third party, such as the EFF. Any attempt to access this password, whether granted or denied, would be logged and viewable as a public record. (A rough sketch of what I mean follows.)
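
For what it's worth, here is a minimal sketch of recommendation 2, loosely modeled on the HOTP construction (RFC 4226): a per-device seed generated at manufacture and handed to the escrow agent, with every access request logged whether or not it is granted. The names and the logging mechanism are hypothetical; this is an illustration, not a worked-out design.

```python
# Hypothetical sketch of a seeded, escrowed one-time-password backdoor.
# Loosely follows the HOTP construction (RFC 4226); not a real design.

import hashlib
import hmac
import secrets
import struct
import time

def make_device_seed() -> bytes:
    # Generated once at manufacture and handed to the escrow agent (e.g. the EFF).
    return secrets.token_bytes(32)

def one_time_password(seed: bytes, counter: int) -> str:
    # HOTP-style: HMAC the counter with the seed, then truncate to a short code.
    digest = hmac.new(seed, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10**8:08d}"

# In the proposal, this log would be an independently auditable public record.
PUBLIC_LOG = []

def escrow_request(device_id: str, seed: bytes, counter: int, granted: bool):
    # Every request is recorded, whether granted or denied.
    PUBLIC_LOG.append({"device": device_id, "time": time.time(), "granted": granted})
    return one_time_password(seed, counter) if granted else None
```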

I don’t have a plan for sealed and secret warrants.

 

[ Update 2/17 11:30 CA time ]

So, the Internet has gone further and explained a bit more about what Apple is talking about and what the government has asked for. It seems that, basically, the government wants to be able to brute-force the device, and wants Apple to make a few changes to make that possible:

  1. that the device won’t self-wipe after too many incorrect passwords
  2. that the device will not enforce extra time-delay between attempts
  3. that the attempts can be conducted electronically, via the port, rather than manually on the touch screen

I guess this is somehow different from Apple being able to hack their own devices, but to me, it's still basically the same situation. They can update the OS and remove security features. That the final attack is brute force rather than a backdoor is hardly relevant, as the rough arithmetic below suggests.
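
To put numbers on that: the estimate below assumes roughly 80 ms per passcode attempt, which is about the figure Apple's iOS security documentation gives for passcode key derivation (treat it as an assumption here). With the self-wipe and delay protections removed, the worst case looks like this:

```python
# Back-of-the-envelope brute-force times once the wipe and delay
# protections are removed. The per-attempt cost is an assumption
# (~80 ms, roughly the key-derivation figure Apple has published).

SECONDS_PER_ATTEMPT = 0.08

for digits in (4, 6):
    attempts = 10 ** digits              # all numeric passcodes of this length
    worst_case_hours = attempts * SECONDS_PER_ATTEMPT / 3600
    print(f"{digits}-digit passcode: {attempts:,} attempts, "
          f"~{worst_case_hours:.1f} hours worst case")

# 4-digit: ~0.2 hours (about 13 minutes); 6-digit: ~22 hours.
```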

So I’m standing behind my assessment that the Apple security is borked by design.

3 thoughts on “Apple Open Letter… eh”

  1. Also, I ding Apple for publishing a letter set in “Myriad.” Sans-serif is fine for headlines and small snippets, but a block of text like that, ick. Form over function, I guess. The Apple Way.

  2. Totally agree on the font. On the main issue, I think we are fine with criminal warrants, and I think that the government has expanded FISA beyond what its framers had intended to authorize. I’m waiting for some sort of showing that 70 years of spying on the public has kept us safe from anything.

    1. But this is not a FISA case, so I don't get Cook's position. He doesn't want to give such a tool to the FBI, not because of this case, but because he fears that they will use it more broadly.

      OK, then Apple should hack the phone themselves and give the FBI the contents.

      Though I agree that the unwarranted spying is probably not helping, I don’t quite understand how it becomes Apple’s right/responsibility to be involved. In the end, we need a government that behaves lawfully. Apple should not be our bulwark against a lawless state.
