Category Archives: Risk management


Shhh… What the FBI Already Knew About Apple

From Ars Technica:

Encryption isn’t at stake, the FBI knows Apple already has the desired key

The FBI knows it can’t bypass the encryption; it just wants to try more than 10 PINs.
by Peter Bright – Feb 19, 2016

Apple has been served with a court order at the FBI’s request, demanding that it assist the government agency with unlocking an iPhone 5C that was used by Syed Rizwan Farook. Farook and his wife, Tashfeen Malik, killed 14 and injured 24 in an attack in San Bernardino, California on December 2, 2015.

In response, Apple CEO Tim Cook said that the FBI was demanding the equivalent of a backdoor and that complying with the FBI’s demand would undermine the security of all iPhones.

Whether you call it a “backdoor” or not, it’s important to recognize that the ordered changes to the iPhone operating system would not circumvent the core of the iPhone’s encryption. The court isn’t asking Apple to defeat the encryption in any way. Nor does the court require Apple to create a vulnerability that would jeopardize the security of any other phone. Rather, it’s asking Apple to do the one thing that Apple alone can do: use the iPhone’s built-in method of installing firmware written by Apple.

The FBI wants to search the iPhone 5C and has been granted permission to do so by the device’s owner, the San Bernardino County Department of Public Health (Farook’s employer). To perform this search, the FBI needs the device’s PIN. Without it, the government has no way of decrypting the iPhone’s storage and hence no way of examining any data stored on the device.

The encryption used by the iPhone to protect its storage is a multi-tiered system. At its core are two keys, one embedded in the hardware and the second derived from the PIN. The hardware key is used to generate a file system key that is in turn used to encrypt the file system metadata. That metadata includes an encryption key for each individual file. That per-file key is encrypted using (indirectly) an encryption key that is derived from a combination of the hardware key and the PIN key. As such, without the PIN key, it’s impossible to decrypt those per-file keys and hence impossible to decrypt files stored on the iPhone.
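The layered derivation described above can be sketched in miniature. This is purely illustrative: HMAC-SHA256 stands in for Apple's actual key-derivation functions, and all key values here are invented. The point is only the dependency chain, in which the per-file keys sit behind a key that mixes the hardware key with the PIN-derived key.

```python
import hashlib
import hmac

def derive(key: bytes, context: bytes) -> bytes:
    """Illustrative one-way derivation step (HMAC-SHA256 as a stand-in)."""
    return hmac.new(key, context, hashlib.sha256).digest()

# Hypothetical inputs -- a real iPhone keeps the hardware key inside the chip,
# and derives the PIN key with a slow, hardware-entangled function.
hardware_key = b"\x13" * 32
pin_key = derive(b"1234", b"pin")

# Hardware key -> file system key (which encrypts metadata, incl. per-file keys)
fs_key = derive(hardware_key, b"filesystem")

# Per-file keys are wrapped under a key mixing the hardware key and the PIN key
class_key = derive(hardware_key + pin_key, b"class")

# Without pin_key, class_key -- and hence every per-file key -- is unrecoverable.
```

A different PIN yields a completely different `class_key`, which is why there is no way to reach the files without the correct PIN.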

There are some minor nuances over where and how the hardware key is stored and where the different encryption operations are performed. There are variations between different iPhone models, but the broad design is true of all iPhone models running iOS 9.

It’s important to note here that the cryptography aspect is robust. The FBI is not asking for, and Apple almost surely could not provide, any kind of bypass or backdoor for the cryptographic parts of the system. There is no “master key” that can decrypt the files or otherwise break the dependence on the PIN key. The cryptography appears to be secure.

In practice, encryption isn’t usually defeated by cryptographic attacks anyway. Instead, it’s defeated by attacking something around the encryption: taking advantage of humans’ preference for picking bad passwords, tricking people into entering their passwords and then stealing them, that kind of thing. Accordingly, the FBI is asking for Apple’s assistance with the scheme’s weak spot—not the encryption itself but Apple-coded limits to the PIN input system.

PINs, especially four-digit PINs, are highly susceptible to brute-force attacks. With four digits and hence only 10,000 possible combinations, it’s straightforward to simply try every number in sequence until you hit the right one. To combat this, the iPhone uses three specific techniques.
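To see how small that search space is, here is a toy brute-force loop against a hypothetical PIN check with no delays and no wipe; with only 10,000 candidates, it completes almost instantly.

```python
from itertools import product

def brute_force_pin(check):
    """Try all 10,000 four-digit PINs in order until `check` accepts one."""
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if check(pin):
            return pin
    return None

# Toy target: a device whose PIN is 7301 (no delays, no wipe).
secret = "7301"
found = brute_force_pin(lambda pin: pin == secret)
assert found == "7301"
```

The three countermeasures that follow exist precisely to make this loop impractical on a real device.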

The first is that the iPhone imposes delays between PIN attempts. While the first four attempts can be entered back-to-back, the iPhone will force you to wait one minute before the fifth attempt, five minutes before the sixth, 15 minutes before the seventh and eighth, and a full hour before the ninth.
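That escalating schedule can be modeled directly. A small sketch follows; the behavior after the ninth attempt is not specified above, so an hour per subsequent attempt is assumed here.

```python
# Escalating delays from the schedule above, keyed by the upcoming attempt number.
DELAY_BEFORE_ATTEMPT = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}  # minutes

def delay_minutes(attempt_number: int) -> int:
    """Delay imposed before the given attempt (1-indexed).

    Attempts after the ninth are assumed to cost an hour each.
    """
    if attempt_number <= 4:
        return 0
    return DELAY_BEFORE_ATTEMPT.get(attempt_number, 60)

total = sum(delay_minutes(n) for n in range(1, 10))
# 1 + 5 + 15 + 15 + 60 = 96 minutes just to reach the ninth attempt
```

Even before the wipe threshold, the delays alone turn a seconds-long search into hours.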

The second technique is that the iPhone can be configured to wipe the device after ten failed PIN attempts. When this option is turned on, the phone will discard its file system key after 10 bad PINs, rendering all the file system metadata (including the per-file keys) permanently inaccessible.

The third and final technique is that the computation used to derive the PIN key from the PIN itself is slow, taking approximately 80 milliseconds.
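The effect of a deliberately slow derivation is easy to demonstrate. PBKDF2 below is only a stand-in for Apple's actual function, which also entangles the hardware key; the salt and iteration count are invented for illustration.

```python
import hashlib
import time

def slow_pin_key(pin: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """PBKDF2 as a stand-in for the iPhone's slow PIN-key derivation."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iterations)

salt = b"per-device-salt"  # hypothetical; iOS mixes in the hardware key instead
start = time.perf_counter()
key = slow_pin_key("1234", salt)
elapsed = time.perf_counter() - start

# At ~80 ms per guess, all 10,000 four-digit PINs take roughly 13 minutes:
print(f"one derivation: {elapsed * 1000:.0f} ms; "
      f"full sweep: ~{10000 * 0.08 / 60:.0f} min")
```

Note that 80 ms per guess is only a meaningful brake for short PINs; against a long passphrase, the key-stretching is what makes exhaustive search hopeless.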

It’s the first two of these mechanisms that the FBI is asking for assistance with. While the 80 millisecond delay is in some sense unavoidable (a faster system might be able to perform the key derivation more quickly, but it’s not as if the iPhone hardware is readily upgradeable), both the escalating long delays and device-wiping functionality are arbitrary software decisions. The FBI is asking for Apple to create a custom iPhone firmware that removes the escalating delays and omits the device wipe. As a bonus, the FBI is also asking for a way to enter PINs other than typing them in one after the other on the touchscreen. Thus, the FBI wants Apple to make a special version of iOS that is amenable to brute-force attacks on its PIN.

As long as the phone uses a PIN, this would ultimately let the FBI unlock it. If it’s locked with a secure password, unlocking the phone may well prove intractable even with the special firmware.

Such a firmware would not seem to be generally useful for attacking other iPhones, though. The FBI’s request is that the special firmware be tied to the specific device. Every iPhone contains a multitude of unique identifiers that are baked into its hardware (the serial number, the cellular radio IMEI, and the Wi-Fi and Bluetooth MAC), and the court order explicitly states that the custom firmware must be tied to the San Bernardino phone’s unique identifier, such that it can only run on that specific phone.

Assuming that this can be done (and done robustly), it means that even if the custom firmware were given to nation-states or even published on the Internet, it would not serve as a general-purpose way of performing brute-force PIN attacks. It would be useless on any device other than the San Bernardino device. To make such leakage less likely, the court order does allow for the possibility that the custom firmware might be used only at an Apple location, with the FBI having remote access to the passcode recovery system.
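Conceptually, the device binding is a simple check baked into the firmware. The identifier and boot logic below are hypothetical; the real mechanism would live in Apple's boot chain.

```python
import hashlib

# Hypothetical fingerprint of the target device's baked-in identifiers.
TARGET_DEVICE_ID = hashlib.sha256(b"serial+IMEI+MACs").hexdigest()

def firmware_main(device_unique_id: str) -> str:
    """Custom firmware that refuses to run on any other device."""
    if device_unique_id != TARGET_DEVICE_ID:
        return "refuse to boot"
    return "enable unrestricted PIN entry"

assert firmware_main(TARGET_DEVICE_ID) == "enable unrestricted PIN entry"
assert firmware_main("some-other-phone") == "refuse to boot"
```

On its own, of course, such a check is only as strong as the mechanism that prevents anyone from editing the embedded identifier.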

Such an approach is consistent with the way Apple already performs lock screen bypasses on devices running old versions of iOS; law enforcement sends the device to Apple, Apple does the data extraction using tools the company has explicitly created to perform the extraction, and law enforcement receives a FireWire or USB drive with the data. Apple’s custom tools never leave Cupertino.

Hypothetically, if the special firmware were to leak, what exactly would prevent people from making it work with a different unique identifier—or even with any unique identifier? This concern strikes at the very heart of the matter, and it’s why Apple is involved at all.


The FBI does not really need Apple to write a custom firmware that lets you brute force the iPhone PIN without risk of wiping the device or suffering lengthy timeouts. It’s much easier for Apple to write this code, of course, because Apple knows all about the iPhone, but there’s no doubt that the FBI could pay some enterprising reverse engineers and hackers to develop the software itself. The problem for the FBI is not so much the development of the software; it is getting that software to run on the iPhone.

The iPhone requires that its firmware have a digital signature that authentically demonstrates that the firmware was developed by Apple and has not been subsequently modified. The FBI does not have (and is not asking for) access to Apple’s signing key. It is instead asking for Apple to use its signing key to sign the custom firmware so that the iPhone will accept it and run it. It is this signature requirement that means the FBI cannot create the software itself.

It’s this same requirement that also means that iPhone users would be safe even if the special firmware leaked. Changing the embedded unique identifier within the special firmware would break the signature and thus cause targeted iPhones to reject the firmware. This is why complying with the court demand would not jeopardize the security of any other phones. The cryptographic safeguards don’t allow it.
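The signature check is what ties everything together: any change to the firmware image, including swapping the embedded device identifier, invalidates the signature. The sketch below uses HMAC as a stand-in; real iPhones verify an asymmetric signature against an Apple key baked into the boot ROM.

```python
import hashlib
import hmac

APPLE_SIGNING_KEY = b"stand-in secret"  # stand-in; Apple's real key never leaves Apple

def sign(firmware: bytes) -> bytes:
    """Sign a firmware image (HMAC stands in for Apple's real signature)."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def iphone_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The phone runs only firmware whose signature verifies."""
    return hmac.compare_digest(sign(firmware), signature)

firmware = b"unlock-tool bound to device_id=SB-PHONE"
signature = sign(firmware)
assert iphone_will_boot(firmware, signature)

# Swapping in a different device ID invalidates the signature:
tampered = firmware.replace(b"SB-PHONE", b"OTHER-PHONE")
assert not iphone_will_boot(tampered, signature)
```

This is why retargeting a leaked copy of the firmware would fail: the modified image would no longer carry a valid Apple signature, and the phone would refuse to run it.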

The security of these digital signatures is being taken for granted by the FBI; once again, the cryptography underpinning the system is sound, and the government is not asking for it to be bypassed or backdoored or otherwise attacked.

The FBI’s request does, however, put into sharp relief the parts that aren’t sound. The PIN lockouts and device-wiping measures are all “just software.” They’re not dependent on any particular mathematical feature of the algorithms, nor proven by years of analysis of the underlying mathematics. And as “just software,” Apple has every ability to override them.

One could imagine ways in which iPhones were made a little more resilient against this kind of thing, but they’re not straightforward. The court order suggests the use of the iPhone’s “DFU” mode. This is an extremely low-level mode designed for last-ditch recovery of the device. In this mode, the screen is not even activated or enabled; the phone has to be connected to a computer via USB to transfer a new firmware image. One could imagine ways in which even this mode could be PIN protected, perhaps even making it destroy the file system key if a correct PIN is not available, but this is tricky. One of the points of DFU mode is its simplicity. It does one thing as a fail-safe emergency measure. Making it more complex would jeopardize its ability to serve its fundamental purpose.

Overall, the FBI’s request could be seen as a testament to just how good encryption is. The FBI can’t attack the iPhone’s encryption directly, and it can’t bypass the firmware signature mechanism. There’s no existing backdoor to the crypto.

But what the iPhone does have is software lockouts, and the security of those lockouts is entirely up to Apple. Apple’s signing key gives the company wide power over the software-level protections built in to iOS. The FBI knows this, and that is why it’s demanding the company’s assistance.


Shhh… James Clapper on the Internet of Things – to Spy on You

In the future, intelligence services might use the IoT for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials.

That’s the future for the Internet of Things, according to US Director of National Intelligence James Clapper in his written testimony to the Senate Armed Services Committee.


Shhh… New US Bill to Stop State-Level Decryption

California Congressman Ted Lieu has introduced the “Ensuring National Constitutional Rights for Your Private Telecommunications Act (ENCRYPT) of 2016” – find out more from WIRED.


Shhh… Poles Protest Against Internet Surveillance

Poland’s online surveillance laws are among the most invasive in Europe. Polish law requires telecom companies to retain metadata on their users for a fixed period and allows nine different law enforcement agencies (an exceptionally large number) to demand it. According to the digital rights group Panoptykon Foundation, nearly two million requests for user data are made by the government yearly, whereas in most EU countries the figure is less than half that.

Find out more from the Krakow Post.


Shhh… Skype to Hide Your IP Address (Finally)

Here’s the Skype announcement:

Skype is fully committed to delivering as safe and secure of an experience as possible to our customers. We have recently introduced the ability to hide a Skype user’s IP address and we’ve set this as a default status in the latest versions of Skype.

Starting with this update to Skype and moving forward, your IP address will be kept hidden from Skype users. This measure will help prevent individuals from obtaining a Skype ID and resolving it to an IP address.

You can find this update in the latest versions of Skype on desktop and mobile* devices, which you can download here. We also recommend you update Skype across your devices to ensure you benefit from the best experience possible.

*Android, and coming soon as default on iOS.


Shhh… BlackBerry Denies Dutch Police Crack Encryption

BlackBerry has claimed on its corporate blog that its phones are “as safe as they have always been” after reports that Dutch police are capable of accessing encrypted BlackBerry messages.


Shhh… The 25 Worst Passwords of 2015

Here’s the complete list of the 25 worst passwords of 2015, with each password’s 2014 ranking in brackets:

1. 123456 (Unchanged)
2. password (Unchanged)
3. 12345678 (Up 1)
4. qwerty (Up 1)
5. 12345 (Down 2)
6. 123456789 (Unchanged)
7. football (Up 3)
8. 1234 (Down 1)
9. 1234567 (Up 2)
10. baseball (Down 2)
11. welcome (New)
12. 1234567890 (New)
13. abc123 (Up 1)
14. 111111 (Up 1)
15. 1qaz2wsx (New)
16. dragon (Down 7)
17. master (Up 2)
18. monkey (Down 6)
19. letmein (Down 6)
20. login (New)
21. princess (New)
22. qwertyuiop (New)
23. solo (New)
24. passw0rd (New)
25. starwars (New)


Shhh… France “Non” to Encryption Backdoor Bill

Check out the following ZDNET article:

Encryption backdoors by law? France says ‘non’

A proposed amendment to France’s Digital Republic Bill, suggesting mandatory hardware backdoors to bypass encryption, has been rejected by the government.

By Liam Tung | January 18, 2016

The French government has rejected a proposed bill that would have required hardware makers to design products that give authorities access to stored data, even if it is encrypted.

The draft bill, proposed by a right-leaning politician in the wake of the Paris terrorist attacks, would have required all tech companies to insert backdoors into devices, on the grounds that encryption should not impede a police investigation.

The proposal, brought by Republican politician Nathalie Kosciusko-Morizet, came as an amendment to the Digital Republic Bill, France’s proposed legal framework for open data, net neutrality, and data protection in the context of cloud computing.

“France must take the lead by requiring equipment manufacturers to consider the imperative of access of police and gendarmes, under the supervision of a judge and only in the context of a judicial inquiry, to these materials,” the draft amendment read.

The failed bid to introduce mandatory backdoors marked one more effort to legislate against encryption in a debate that’s been reignited by the Paris terror attacks, after speculation the attackers used encryption to coordinate the assaults.

It came alongside a proposal in New York to ban the sale of any smartphone using encryption that cannot be bypassed by its manufacturer.

Critics of such proposals have repeatedly pointed out that secret backdoors cannot be kept exclusively open to law enforcement without the risk that they’ll be found and exploited by criminals or other governments.

That was the argument taken up by France’s deputy minister for digital affairs, Axelle Lemaire, who was quoted by French site Numerama as calling the proposal “vulnerability by design”. With the Digital Republic Bill, the government hopes to enable privacy by design.

“With a backdoor, personal data is not protected at all,” Lemaire said. “Even if the intention is laudable, it also opens the door to players who have less laudable intentions, not to mention the potential for economic damage to the credibility of companies planning these flaws.”

A case in point, she said, was the recently discovered backdoor in Juniper’s ScreenOS, thought to have been inserted in 2012, giving the attacker a free hand to decrypt data passing through its equipment.

She also pointed to the recent announcement by the Netherlands that it would not legislate against the development, availability and use of encryption due to its importance to businesses, such as online banking, and personal privacy.

While acknowledging that the Paris attackers possibly did use encryption, the Netherlands government said, “A technical input into an encryption product that can be seen by the prosecution authorities would allow encrypted files in digital systems to be vulnerable, e.g. to criminals, terrorists and foreign intelligence services.”

Kosciusko-Morizet defended her proposal on the grounds that police should be able to inspect computers the way they can search a home.


Shhh… No Safe Haven – Cops Can Access Encrypted PGP Blackberry

So much for those useful video instructions: the cops are now able to break into the supposedly safe havens, i.e. decrypt PGP BlackBerrys.