Monday, April 18, 2016

How Apple Could Make the iPhone 7 Completely Warrant-Proof

Apple's face-off with the government over providing access to the iPhone used by one of the San Bernardino shooters may be over, but the re-awakening of another case in Brooklyn, New York, proves that the two parties are destined to dance again, perhaps many times over, unless Apple can find a way out of the dispute. It might.



The FBI won a court order in Riverside, California, compelling Apple to go so far as to write a new OS for the phone used by Syed Farook in order to break into the device. The bureau has since abandoned that effort, but the Justice Department has now petitioned a federal court in Brooklyn, New York, to force Apple to help it access the contents of an iPhone owned by a convicted drug trafficker.



Apple has been put at a disadvantage in these jousts by having to admit that enabling access to these phones was indeed technically possible. Apple's basic position is that while it has the technology and expertise to hack into encrypted data on an iPhone, it's the wrong thing to do. Creating even a single hack, Apple says, could endanger the security of millions of iPhones because there's always a chance that the hack might somehow be released into the wild.



Apple has indeed helped the government access encrypted data on iPhones, but when the FBI demanded in open court that it create a special operating system to hamstring the security features on the Farook phone, Apple drew a line in the sand.



In the future, Apple may not be put in this position. It could design the iPhone 7 and change iOS in ways that would make the answer to the question "can you break into this criminal's iPhone?" a truthful "No."



But how? That's where things get a little intense.



When you power down an iPhone or put it to sleep, the device automatically encrypts its contents, and the data can be unlocked only with a decryption key. That key is generated on the phone when the user enters their passcode. Prior to iOS 8, the decryption key existed both on the user's device and with Apple. Since iOS 8, it exists only on the user's device.
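
To make the mechanics concrete, here is a minimal Python sketch of how a passcode-derived key might work, assuming a PBKDF2-style derivation tangled with a device-unique secret. The DEVICE_UID name and the iteration count are illustrative assumptions, not Apple's actual design.

```python
import hashlib
import os

# Hypothetical stand-in for a device-unique secret fused into the
# hardware at manufacture (an illustrative assumption, not Apple's API).
DEVICE_UID = os.urandom(32)

def derive_decryption_key(passcode: str) -> bytes:
    """Derive the data-decryption key from the user's passcode, tangled
    with the device-unique secret so the key never exists off the phone."""
    return hashlib.pbkdf2_hmac(
        "sha256",           # underlying hash
        passcode.encode(),  # the user's passcode
        DEVICE_UID,         # salt: the hardware-bound secret
        100_000,            # deliberately slow, to resist guessing
    )

key = derive_decryption_key("1234")  # only this phone can compute this
```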



So Syed Farook's iPhone 5c, which ran iOS 9, limited the number of times the FBI could guess Farook's passcode to 10. After the tenth incorrect guess the phone would destroy the decryption key: data gone, game over. The OS also imposed progressively longer time delays between login attempts, so that guessing the passcode with a computer would take years.
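
Those two protections are easy to model in miniature. Here's a toy sketch; the delay schedule is illustrative, not Apple's exact one.

```python
import time

class PasscodeGate:
    """Toy model of the two iOS protections: escalating delays between
    attempts, and key destruction after ten misses."""

    MAX_ATTEMPTS = 10
    DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # seconds

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.key_destroyed = False

    def try_passcode(self, guess: str) -> bool:
        if self.key_destroyed:
            raise RuntimeError("decryption key destroyed; data gone")
        time.sleep(self.DELAYS[self._failures])  # enforced lockout
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.key_destroyed = True  # the tenth miss is game over
        return False
```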



By building these features into the OS, Apple removed itself almost entirely from the business of securing and accessing the data on a user's iPhone. But, as we learned from the San Bernardino case, it didn't remove itself completely. Apple retains the ability to create a different version of the OS that, when loaded onto the phone, shuts all those security measures off. A computer program can then quickly run through thousands of possible passcode combinations until it guesses the correct one and unlocks the device.
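
To see why that works, here is a hedged sketch of that brute-force step, assuming a 4-digit passcode and a PBKDF2-hashed target; the salt and iteration count are stand-ins, not real parameters.

```python
import hashlib
from itertools import product

SALT = b"hardware-bound-salt"  # stand-in for the device-unique salt
ITERATIONS = 10_000            # lowered so the demo runs quickly
TARGET = hashlib.pbkdf2_hmac("sha256", b"7391", SALT, ITERATIONS)

def brute_force_4digit():
    # With the delays and the 10-guess wipe switched off, a 4-digit
    # passcode falls to exhaustive search: at most 10,000 tries.
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits).encode()
        if hashlib.pbkdf2_hmac("sha256", guess, SALT, ITERATIONS) == TARGET:
            return guess.decode()
    return None

print(brute_force_4digit())  # prints 7391
```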



Could Apple remove itself completely from the security chain, so that it couldn't do anything to help open an iPhone, even if the device contained, say, the immediate location of a hidden dirty bomb?



That's very likely, but there are big trade-offs involved, security experts tell me. The more Apple removes itself from the secure relationship between user and phone, the more it gives away its ability to access the system when something needs fixing.



Cooper Levenson attorney and security expert Peter Fu explained it like this: If you're trying to get rid of all the rats in the sewer system, you can cement over all the drains so no rats can get in. But if the rats get in some other way, you have a big problem getting down into the sewers to remove them.



Apple could almost certainly alter iOS to prevent the user, Apple, or anybody else from downgrading to an earlier OS version with weaker security measures.
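
One plausible shape for such a downgrade block, sketched below: the phone keeps a version floor that only ratchets forward and refuses any image beneath it. The HMAC-based signing here is a simplified stand-in for Apple's real code-signing scheme.

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-secret"  # hypothetical, for illustration

# A version floor kept in tamper-resistant storage. It only ratchets
# forward, so once security is tightened, weaker images are refused.
minimum_allowed_version = 9

def sign_image(image: bytes, version: int) -> bytes:
    msg = image + version.to_bytes(4, "big")
    return hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()

def install_update(image: bytes, version: int, signature: bytes) -> None:
    global minimum_allowed_version
    if not hmac.compare_digest(signature, sign_image(image, version)):
        raise ValueError("unsigned or tampered image rejected")
    if version < minimum_allowed_version:
        raise ValueError("downgrade to a weaker OS rejected")
    minimum_allowed_version = version  # ratchet the floor forward
    # ...flash the image here...

install_update(b"ios10", 10, sign_image(b"ios10", 10))  # accepted
# install_update(b"ios8", 8, sign_image(b"ios8", 8))    # raises ValueError
```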



"It can remove all of its administrative privileges altogether," Fu says. This seems extreme, but then "no one would have guessed that Apple would have written itself out of the ability to recover lost passcodes," Fu says.



Nobody knows exactly what would happen if Apple went to these lengths. There would almost certainly be unforeseen and unintended consequences.



One consequence, as in the "rats" analogy, is that Apple would be sealing off its own access to its OS, eliminating its ability to fix security holes that come to light as time goes on.



"Who says there's not another Heartbleed?" Fu says. "We are removing our ability to combat those kind of vulnerabilities."



Better to leave the drains open. "In order to make a secure system, you need to make it a little less secure," Fu says.



Also, Apple may have a few of its own secret methods of bypassing security and accessing user data, methods that are used only in Cupertino. It may not want to give those up.



The Hardware Approach



The other general method of hacking into a locked iPhone is by compromising the hardware.



The reason the FBI gave for backing off the San Bernardino court order is that a third party had emerged with a technique for breaking into the Farook iPhone without Apple's help. Some in the security community speculated that the third party supplied a method that calls for the removal and duplication ("mirroring") of the phone's NAND memory module, where the encryption and decryption keys are stored. With the contents of the NAND mirrored onto another module, a fresh copy of the encryption keys can be swapped in every time the 10-guess limit is reached, effectively extending the number of allowed passcode guesses to infinity.
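
The speculated attack is simple to model. In this sketch, the "NAND" is just a snapshot of the phone's mutable state, and restoring the snapshot resets the failure counter; all the names here are illustrative.

```python
import copy

class NandState:
    """Stand-in for the mutable state on the removable NAND module:
    the keys plus the count of failed passcode attempts."""
    def __init__(self, passcode: str):
        self.passcode = passcode
        self.failures = 0
        self.wiped = False

def try_guess(state: NandState, guess: str) -> bool:
    if state.wiped:
        return False
    if guess == state.passcode:
        return True
    state.failures += 1
    if state.failures >= 10:
        state.wiped = True  # keys destroyed on the tenth miss
    return False

state = NandState("7391")
backup = copy.deepcopy(state)  # "mirror" the module before starting

candidates = (f"{n:04d}" for n in range(10_000))
for guess in candidates:
    if try_guess(state, guess):
        print("found:", guess)
        break
    if state.wiped:
        state = copy.deepcopy(backup)  # swap in a fresh copy, keep going
```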



The most obvious way for Apple to thwart this kind of attack would be to relocate those encryption keys to a more secure place on the phone, someplace that can't be physically removed from the device. One possibility is the Secure Enclave in the CPU, where the unique transaction codes for mobile payments are stored.



But this approach, security people say, comes with serious drawbacks. The reason the crypto-keys are kept on the NAND module is so that they can be quickly accessed when the user enters a passcode or passes the fingerprint scan. After all, we unlock our phones thirty times a day or more. Storing the keys on the CPU would slow down the act of unlocking the phone. And the CPU itself might have to grow bigger to handle the increased power requirement, which in turn could compromise the svelte design of future iPhones, Fu says. Moreover, making such a radical change to the CPU would take time.



"Apple needs time to re-spin hardware, so don't expect any Secure Enclave updates until the iPhone 7s," says security researcher Dan Guido.



A more practical idea, perhaps, is to keep the crypto-keys on the NAND module and make the module much harder to remove from the circuit board of the device. This might be done by attaching the module to the circuit board with heat glue, Fu says. Attempting to remove it would break the module or the circuit board. Either way, the phone becomes inactive.



Pushed To The Defensive



Over the past couple of years Apple has been steadily tightening the security on its phones and making it more difficult for law enforcement to execute search warrants that extend to the contents of the devices. The matter came to a head when the FBI won its court order requiring Apple to help hack the Farook phone.



When the FBI called "time out" in that case, I believed Apple might continue to strengthen the security of its phones while still leaving a tiny backdoor open, in case the government someday presented a profound need to break into a terrorist's device.



My opinion changed when the Justice Department recently revived a criminal case in Brooklyn, New York, again trying to compel Apple on the strength of the antiquated All Writs Act. The Brooklyn case involves a convicted drug peddler, while the San Bernardino matter involved mass murder by people with connections to international terrorist groups. The government's need to pull information from a locked iPhone is even weaker in the Brooklyn case than it was in San Bernardino.



The Justice Department's revival of the matter proved that the stand-off over the Farook phone was no one-off, and that the government was looking to establish a precedent for using the All Writs Act to quickly gain access to private iPhone contents in a broad array of future cases.



Now that this card has been played, it seems very likely that Apple will do everything it can to further remove itself from the secure relationship between user and phone. If the government has a warrant to search a locked iPhone, and the user, like Farook, isn't around to enter the passcode, Apple will simply, truthfully, be unable to help.


