A judge just ordered Apple to help brute-force San Bernardino shooter's iPhone—and break its promise on privacy


The legal showdown between Apple and the government that we all knew was coming is here. On Tuesday, California judge Sheri Pym ordered Apple to provide “reasonable technical assistance” in unlocking a deceased criminal’s encrypted iPhone 5c.

The order would allow the FBI to go through an iPhone left behind by San Bernardino shooter Syed Farook, who, along with his wife, killed 14 people at his workplace in December. The iPhone, provided by his county employer and found in a family Lexus according to NBC News, has a passcode on it. And that should mean, per Apple’s new aggressive stance on privacy, that the phone can only be decrypted by someone who knows that passcode.

Update: Apple issued a “customer letter” late Tuesday night in response to the order. “The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone,” Apple CEO Tim Cook writes in the letter. “We oppose this order, which has implications far beyond the legal case at hand.”

It’s an intriguing order, as Apple has been saying for the last year that it can’t technologically do this, a stance that has garnered much criticism from government officials. FBI director James Comey said Apple was putting its customers “beyond the law,” comparing it to a car dealer that sells “cars with trunks that couldn’t ever be opened by law enforcement.”

“We refuse to add a ‘backdoor’ into any of our products because that undermines the protections we’ve built in,” Apple explains on its privacy page. “And we can’t unlock your device for anyone because you hold the key—your unique password.”

But Judge Pym’s order indicates that the FBI has a workaround for Apple’s “unbreakable” iPhone encryption. As predicted by the Washington Post’s Ellen Nakashima, the judge ordered Apple to provide the FBI with special software that could be loaded onto the phone to override the feature that auto-erases it after 10 incorrect passcode attempts. The FBI’s plan is then to try passcode combinations repeatedly until it hits the right one. So Apple is being ordered to help brute-force an iPhone, and undermine the protections it deliberately built into its operating system to prevent exactly this.
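To see why the auto-erase limit is the linchpin, consider a minimal sketch of the search the FBI is describing. With the 10-try wipe disabled, a four-digit passcode space is only 10,000 guesses; the only remaining brake is the per-attempt delay imposed by iOS’s passcode key derivation (Apple has documented this as roughly 80 ms per try). The `check_passcode` function here is a stand-in, not any real Apple interface:

```python
# Illustrative sketch only: with auto-erase disabled, a 4-digit
# passcode falls to exhaustive search in minutes.

ATTEMPT_DELAY_S = 0.08  # iOS key derivation costs roughly 80 ms per guess


def brute_force(check_passcode):
    """Try every 4-digit passcode in order until one is accepted."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if check_passcode(guess):
            return guess
    return None


# Worst case: 10,000 guesses at 80 ms each is about 13 minutes.
worst_case_minutes = 10_000 * ATTEMPT_DELAY_S / 60
print(f"worst case: {worst_case_minutes:.1f} minutes")

# Demo against a stand-in checker (a hypothetical secret, for illustration).
secret = "7294"
found = brute_force(lambda g: g == secret)
print(f"recovered: {found}")
```

The arithmetic is the point: the wipe-after-10 feature, not the encryption math, is what makes a short passcode survivable, which is why the order targets that feature specifically.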

The judge, cognizant of the privacy battle ahead, asks that the special software be coded with a “unique identifier” so that it “would only load and execute on [Farook’s iPhone].” Assuming that’s possible, it would mean that the FBI wouldn’t get a tool it could use again any time it wanted to get into someone’s encrypted iPhone.
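How a “unique identifier” check might work can be sketched abstractly: the custom firmware would carry the target device’s hardware identifier baked in at build time, and refuse to execute anywhere else. Every name below is invented for illustration; nothing here reflects Apple’s actual firmware interfaces:

```python
# Hypothetical sketch of device-locked firmware. The identifier and
# function names are invented; they stand in for whatever unique
# hardware ID (serial, ECID, etc.) Apple might bind the build to.

TARGET_DEVICE_ID = "TARGET-0001"  # baked into this one build


def read_hardware_id():
    # Stand-in for querying the device's unique hardware identifier.
    return "TARGET-0001"


def boot_payload():
    """Run the auto-erase override only on the intended device."""
    if read_hardware_id() != TARGET_DEVICE_ID:
        raise RuntimeError("refusing to run: device ID mismatch")
    return "payload armed"  # ...override logic would go here...
```

Whether such a check is a real safeguard is exactly the open question: since Apple alone holds the signing keys, a re-signed build with a different identifier would work on a different phone, which is why critics see the tool as reusable in principle even if each build is single-use.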

Apple has five days to appeal the order if it considers the request “unreasonably burdensome.”

The company has not yet responded to a media request about the feasibility of this order, but iPhone forensics engineer Jonathan Zdziarski said it should be possible to comply.

“Apple has signing keys so they can design their own firmware to load onto the phone and change whatever they want,” explained Zdziarski, who expressed surprise that the NSA, who would potentially be involved in a terrorism case, couldn’t just break the iPhone’s encryption. “Either the government’s capabilities are more limited than we thought or this is a legal test case to see how the courts and how Apple will respond.”

Other information security experts were less certain that it would be this easy for Apple to engineer the workaround. Columbia professor Steven Bellovin called the technological ask “very hard,” while University of Pennsylvania professor Matt Blaze called it “risky” unless Apple already has such a program sitting on a shelf; both said the bespoke technological solution might result in the phone being erased completely. Even the order says Apple isn’t responsible for making sure the user data can be copied.

Some tech experts say this would only work because the iPhone is an older model, a 5c; newer models have enhanced security built into the phones that might cause their memory to be erased if an attempt were made to override their auto-erase feature, writes Dan Guido of infosec start-up Trail of Bits.

According to NBC News, law enforcement already tried its other iPhone encryption workaround: getting the data backed up to Apple’s iCloud, which isn’t subject to the only-a-passcode-can-unlock-it encryption. Via NBC News:

Although investigators have been able to obtain several backup versions of Farook’s iCloud data, the most recent version they’ve been able to access dates from about a month and a half before the shooting. They said this showed Farook “may have disabled the feature to hide evidence.”

Privacy experts are aghast at the idea of a company being forced to write software to circumvent its own technological data protections.

It’s definitely a test case for the claims that Apple CEO Tim Cook has made about how far Apple’s willing to go to protect customers from having their digital evidence and trails used against them. Given that, it seems almost certain that Apple will attempt to fight the request by appealing the order.
