On December 2, 2015, the married couple Syed Rizwan Farook and Tashfeen Malik shot and killed fourteen people at a training event of the San Bernardino County Department of Public Health in San Bernardino, California, where Farook was employed. A further twenty-two people were injured in what was the deadliest mass shooting in the United States since 2012, and the deadliest in California since 1984. The entire shooting took less than four minutes.
The shooters fled the scene in a rented car, leaving behind explosive devices intended to target emergency responders. Fortunately, these bombs never exploded.
About four hours after the shooting, police located and stopped Farook and Malik in their rental car. In the ensuing exchange of gunfire, both shooters died at the scene.
According to the FBI, Farook and Malik had sent each other Facebook messages in which they both committed to violent jihad. Their Facebook profiles also declared allegiance to the ISIS leader, Abu Bakr al-Baghdadi. In light of these findings, on December 6, 2015, President Obama declared the shooting a terrorist attack, the deadliest on US soil since 9/11.
According to the media, Farook and Malik had both thoroughly destroyed their personal phones prior to the attack, making it impossible to retrieve any information from the devices.
Farook’s employer, however, had issued him an iPhone 5C, which Farook did not destroy before his death. The iPhone was running iOS 9, was secured with a numeric passcode, and had been backed up regularly to Apple’s iCloud service. While the information on the iPhone itself is encrypted, the backups in the cloud are not. The investigators could have triggered an automatic iCloud backup simply by returning the iPhone to one of the Wi-Fi networks it had previously accessed. But this option became moot when an investigator reset the password of the iCloud account, thereby disabling the automatic backups.
Why the iPhone’s Passcode Is Impossible to Crack
While a simple numeric passcode would be easy for any computer to guess, three restrictions in iOS 9 prevent devices from being cracked:
- There is an 80 ms delay between each passcode attempt. While this delay would slow down an attack, it would in theory still take only 800 seconds to go through all 10,000 possible combinations of a four-digit code; without the artificial delay, it would take less than a second. The lag only becomes significant with longer passcodes (see the sketch after this list).
- The passcode must be entered by hand. Even if it takes two seconds to manually enter an incorrect passcode and read the error message, exhausting a four-digit code would still take only about five and a half hours.
- The kicker: the device becomes unrecoverable after ten failed attempts. While the first two restrictions only matter for complex passcodes, for which millions or billions of possible combinations exist, this third barrier makes it entirely infeasible to unlock the phone by simply guessing the passcode.
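To make these numbers concrete, here is a quick back-of-the-envelope sketch in Python. The 80 ms delay and the two-second manual-entry estimate come straight from the list above; everything else is simple arithmetic.

```python
# Worst-case time to exhaust every numeric passcode of a given length,
# under the 80 ms software delay and a rough 2 s manual-entry time.

DELAY_S = 0.08   # enforced delay between automated attempts (iOS 9)
MANUAL_S = 2.0   # rough estimate for typing one passcode by hand

def worst_case_hours(digits: int, seconds_per_attempt: float) -> float:
    """Hours needed to try all 10**digits possible numeric passcodes."""
    return (10 ** digits) * seconds_per_attempt / 3600

for digits in (4, 6, 8):
    automated = worst_case_hours(digits, DELAY_S)
    by_hand = worst_case_hours(digits, MANUAL_S)
    print(f"{digits} digits: {automated:10.1f} h automated, {by_hand:10.1f} h by hand")
```

For a four-digit code this reproduces the figures above: roughly 800 seconds with the delay alone, and about five and a half hours by hand. At six digits the delay already costs almost a full day, and at eight digits around three months.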
To circumvent these barriers, the FBI would have to write its own version of the iOS firmware, load it onto the phone, and then guess passcodes automatically. Even then, the technique could fail if the phone were protected with a strong passcode, as it would take today’s computers too long to crack.
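How strong is strong enough? A rough sketch: even assuming custom firmware removes the wipe-after-ten-attempts limit and the need for manual entry, and each guess still takes on the order of 80 ms on the device (an assumption carried over from the delay above, not a figure from the case), the search space for longer, mixed-character passcodes grows out of reach very quickly.

```python
# Worst-case brute-force time as the passcode alphabet and length grow,
# assuming (hypothetically) 80 ms per on-device guess.

DELAY_S = 0.08
SECONDS_PER_YEAR = 365 * 24 * 3600

ALPHABETS = {
    "digits only": 10,
    "lowercase letters": 26,
    "letters and digits": 62,
}

for name, size in ALPHABETS.items():
    for length in (6, 8):
        attempts = size ** length
        years = attempts * DELAY_S / SECONDS_PER_YEAR
        print(f"{name:>18}, length {length}: ~{years:12,.1f} years")
```

A six-character alphanumeric passcode already takes well over a century to exhaust at that rate, and an eight-character one over half a million years, which is why the firmware approach only pays off against short numeric codes.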
Perhaps the FBI does not have the technical expertise to create such a hacking tool. But it is very likely that other agencies, such as the NSA, do. We don’t know whether the FBI has asked the NSA for assistance, or whether the NSA has already developed software capable of defeating the three restrictions mentioned above, but we do know that it could, just as we know Apple could.
The FBI Wants Apple to Help Hack Apple Devices
When the FBI asked Apple to voluntarily help create software to remove the three restrictions, Apple said no. So, a few days before the warrant to search the phone expired, the FBI sought the assistance of the United States District Court for the Central District of California. On February 16, the court ordered Apple to comply with the FBI’s request.
The original order asks Apple to provide the FBI with firmware that bypasses the restrictions, although it does give Apple permission to design firmware that functions solely on Farook’s phone, as identified by its unique device identifier (UDID), which works like a serial number. The order also allows Apple to conduct this “recovery” on its own premises, and to charge the government for “providing this service”.
Apple refused to comply, and their response, signed by CEO Tim Cook, has since been praised and shared countless times all over the Internet. In it, Apple stresses that they have already shared all the data they can share (which presumably includes the iCloud backups of the iPhone in question) and that they have done everything within the law to help the FBI.
Apple also stresses that it would be technologically impossible to create this kind of software in such a way that it could only be used once. Any firmware capable of cracking Farook’s phone would work on any other iOS device. The federal government has already sought Apple’s help to unlock phones in 12 other cases, and the newly created software would no doubt be requested again the next time the FBI wants to access a phone. Once a legal precedent is set, it would be very difficult to refuse future demands.
The FBI Wants You to Take Their Side
It seems the FBI chose Farook’s case to create a precedent, not because of its relevance to national security, but because Farook’s label as a “terrorist” makes it more likely that public opinion will side with the FBI.
We are witnessing not just a court battle but also a battle for public opinion. And a lot is at stake.
The FBI is determined to set a legal precedent in which it successfully unlocks an encrypted phone using modified firmware, because it seeks to use dozens, if not hundreds, of unlocked phones as evidence in criminal trials. Seeking the NSA’s help to unlock a device might be useful in an investigation, but when it comes time for a public trial, the NSA would not be willing to share details of how it obtained the evidence, and the evidence would have to be excluded. And without evidence, nobody can be convicted of a crime.
In this case, the FBI argues that Farook may have used the phone to communicate with his colleagues. Imagine the NSA had already hacked the device and found that one of those colleagues had prior knowledge of, or even involvement in, the attack. That finding alone would not necessarily be enough to convict the colleague, because the state attorney or the FBI would not be able to explain to the court how they knew (i.e., through the NSA), and the evidence would not be admissible.
Who Should You Trust: Apple or the FBI?
This FBI vs. Apple court case puts Apple in an extremely difficult situation, because their decision affects the security of all Apple devices, not just Farook’s iPhone. While it’s likely that Apple would be willing to assist in gathering evidence in this particular case, providing the FBI with an altered version of their software would have grave consequences.
As with other technology and data, it is not unreasonable to assume that this passcode-guessing software would quickly spread: first among various agencies of the US government, then to foreign governments, then to criminal organizations, and later end up as an open-source tool on GitHub.
This issue is less a question of data privacy (a philosophical right) than of data security (a technical problem), although the two are interlinked. Apple, and Tim Cook in particular, has long been vocal about our need for privacy, and Apple’s products have also surpassed those of its competitors as best in class when it comes to built-in device security.
Apple, along with other manufacturers, wants us to store all our personal information on devices in our pockets, in our homes, and on our wrists. To do that, the tech giants need to convince us their devices are safe. So far, Apple has been very successful at establishing trust with their customers and convincing people that Apple products are secure. If this relationship of trust were damaged, especially in such a public way, Apple’s market position might be severely harmed.
In contrast to Apple, the United States government and its various agencies no longer enjoy the reputation they once had, especially in the eyes of individuals and organizations abroad. Apple, then, is a reliable supplier to international companies and foreign governments: they are willing to stand up to the FBI, and they work tirelessly to maintain the security of their products, even against some of the most sophisticated potential adversaries.
Are the FBI’s Demands Unconstitutional?
Beyond the question of security, the court case raises another important issue: conscription. Forcing individuals and private companies to hand over information they possess is hugely different from forcing them to perform actions they find morally questionable.
The FBI is asking Apple and their engineers to create a tool they believe should not exist. Forcing Apple to build the firmware quite likely violates the Thirteenth Amendment to the United States Constitution, which prohibits involuntary servitude. In the past, the Supreme Court has granted exemptions to this amendment only in cases of war.
Apple, however, argues that the court order violates their First Amendment rights: that code is free speech. Apple says their code embodies their values, which are protected by the First Amendment. Forcing them to change the code would alter those values, and it would be unconstitutional for the government to compel that.
The Government Has Always Hated Encryption
A similar argument has been applied successfully in the past. In 1993, Phil Zimmermann, the inventor of the encryption program PGP, was investigated for “munitions export without a license” after distributing PGP around the world; at the time, encryption was legally classified as a weapon. Zimmermann and his supporters challenged the regulation by printing the source code in a hardcopy book and distributing it around the world. They argued that, as a book, the code constituted protected speech and could hardly be considered a weapon. The investigation against Zimmermann was eventually dropped.
While the Zimmermann case was unfolding, the NSA was busy promoting its Clipper Chip, a device for encrypting telephone conversations. The Clipper Chip contained a backdoor that would have given the agency access to all phone conversations.
The agency abandoned the Clipper Chip project after significant backlash from the Electronic Privacy Information Center and the Electronic Frontier Foundation. Fears that the NSA would not be able to force foreign companies to include the chip in their products, and that those companies could then gain a competitive edge in international markets, also prompted the NSA to drop it.
This consideration will also play a role in the current FBI vs. Apple case, with many fearing that US technology companies could lose out on contracts with foreign governments, companies, and individuals.
The Clipper Chip was later found to be insecure and would likely have been breached quickly by foreign intelligence agencies and large criminal organizations.
Is This the Return of the Crypto Wars?
The controversies around the Clipper Chip and other government attempts to weaken the security of our everyday communications and devices gave rise to the term Crypto Wars. The wars were famously declared “won” after the UK government also shelved its plans to restrict access to strong encryption.
Today, encryption is widely accessible and used across the Internet. Most reputable websites encrypt their traffic with HTTPS (as indicated by the green lock in the address bar), operating systems encrypt their hard drives by default, and messaging systems like Signal, Telegram, and WhatsApp all encrypt chat traffic in transit.
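As a small illustration of how routine this has become, the sketch below uses nothing but Python’s standard library to open a TLS connection and print the negotiated protocol and the server’s certificate subject; the hostname is an arbitrary example.

```python
# Minimal HTTPS handshake: connect to a site over TLS and inspect the session.
import socket
import ssl

hostname = "www.apple.com"  # any HTTPS-enabled site works here

context = ssl.create_default_context()  # verifies the certificate chain
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Protocol:", tls.version())  # e.g. 'TLSv1.2'
        subject = tls.getpeercert()["subject"]
        print("Issued to:", dict(pair[0] for pair in subject))
```

If the certificate cannot be verified, `wrap_socket` raises an `ssl.SSLError` instead of silently connecting; that failure mode is exactly the guarantee the green lock represents.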
With the FBI’s attack on Apple’s device security and encryption, we may have entered the second round of the Crypto Wars. The Clipper Chip and the FBI’s current demands are similar in that both would remove vital security features for the sake of government access in the name of national security.
May history repeat itself and encryption win once more.