In August 2015, a British journalist and cameraman were travelling in Turkey, making a documentary for Vice News. As is often the case, they were working with a local agent, a “fixer” who was responsible for getting them access to the locations and subjects they wanted to include in their documentary. All three were arrested and, in September 2015, charged with deliberately aiding an armed organisation. The primary justification for the charge was reported to be the presence of sophisticated encryption software on the devices of the fixer, of a type alleged to be commonly used by terrorists.
A few months later, in California, the San Bernardino County Department of Public Health was holding a training event and Christmas party for around 80 staff. During the event, an employee of the department and his wife carried out a terrorist attack, killing 14 people and injuring a further 22. Both of them destroyed their personal phones, and were later killed in a running battle with police. The health department employee’s work phone, an Apple iPhone 5C, was recovered intact by the FBI, who believed that the phone contained information which might assist their investigation of the incident.
The FBI asked Apple to give them access to the device, which was encrypted and locked with a 4-digit PIN. The phone was set to erase its contents if an incorrect PIN was entered 10 times. Apple told the FBI that its devices are so secure that even Apple itself is not able to gain access to them. The FBI then asked Apple to develop software which would enable the FBI to gain access and, when Apple refused, commenced legal proceedings. These were eventually discontinued when the FBI said that it had found a third party willing and able to break into the device, which was subsequently found to contain no information relevant to the incident.
Incidents like these, of which there are very many examples in the media, illustrate the tension at the heart of the debate around encryption, and the central role that privacy through encryption has assumed in most of our lives. There will be those who imagine that they never use encryption, but anyone who has visited a secured website employing TLS (formerly SSL) protection has done so. Many mainstream communication channels, such as WhatsApp, are end-to-end encrypted. Devices, not only from Apple but also from other major manufacturers, are either sold with encryption turned on by default or can be configured to protect any data stored or transmitted, using freely available, ubiquitous encryption software.
Two sides of the same coin
The driver behind this dramatic upsurge in the use of encryption in the everyday lives of ordinary citizens is the desire for privacy, coupled with a growing awareness of the danger presented by engaging in the digital world (particularly online) without such protections in place. Encryption is what gives confidence to shoppers wanting to make purchases online, or the users of internet banking or government portals. But, crucially, it has other purposes which give rise to the tensions identified above. It can be used by dissidents, activists and journalists, looking to co-ordinate activities and gather information on the world’s most repressive regimes. And it can also be used by terrorists, cyber-criminals and other bad actors to conceal their identities or hide details of their plans from investigating authorities.
The reason for focusing on 2015 in the examples given above is that this was the year in which the UN Human Rights Council published a report of its Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye. This report focused largely on the extent to which encryption and similar technologies are essential for the protection of free opinion and speech, as well as other fundamental freedoms.
In the context of the Snowden revelations about mass state surveillance by the NSA and others, the report considered carefully the arguments for reducing that protection in the interests of state security. Its conclusion was that none of the arguments made could justify such a weakening of privacy through encryption. But the same arguments considered and dismissed in that report continue to be made today, even as the commercial and regulatory pressure on businesses to provide enhanced privacy to their users continues to increase.
Understanding the balancing exercise
In order to make sense of the balancing exercise that has to be negotiated between national security and privacy interests, it is important to understand the options that are being proposed. It would be naïve to suggest that any form of encryption is impossible to crack, but when considering the need to obtain actionable intelligence, the critical factor is time. The strength of encryption is measured in bits (essentially the size of the key that needs to be entered to unlock the data). If you imagine a 4-digit combination padlock, you can readily appreciate that it would take 10,000 attempts to try every combination between 0000 and 9999, although you might find the one that works much sooner than that. The DES encryption standard, which was pioneered in 1976, used 56-bit keys and can now be broken routinely by supercomputers in very little time at all. AES using 128-bit keys is readily available commercially, and it would take the most powerful computer currently known to exist longer than the current age of the universe to test all possible combinations. 256-bit AES encryption is believed to be uncrackable (subject to developments in the field of quantum computing).
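To give a feel for those orders of magnitude, the arithmetic can be set out in a few lines. The sketch below is a minimal illustration in Python; the assumed guess rate of 10^17 keys per second is a purely hypothetical figure chosen for the comparison, not a measurement of any real system.

```python
# Rough brute-force arithmetic for the keyspaces mentioned above.
# The guess rate is an illustrative assumption, not a measured figure.

AGE_OF_UNIVERSE_SECONDS = 4.35e17   # roughly 13.8 billion years
GUESSES_PER_SECOND = 1e17           # assumed rate for a very powerful attacker

keyspaces = {
    "4-digit padlock": 10 ** 4,     # every combination from 0000 to 9999
    "DES (56-bit key)": 2 ** 56,
    "AES (128-bit key)": 2 ** 128,
    "AES (256-bit key)": 2 ** 256,
}

for name, size in keyspaces.items():
    worst_case_seconds = size / GUESSES_PER_SECOND
    multiples_of_universe = worst_case_seconds / AGE_OF_UNIVERSE_SECONDS
    print(f"{name}: {size:.3e} keys, "
          f"~{worst_case_seconds:.3e} seconds to try them all "
          f"({multiples_of_universe:.3e} x age of the universe)")
```

On those assumptions the padlock and a 56-bit DES key fall in under a second, while exhausting a 128-bit keyspace would still take thousands of times the current age of the universe, which is the intuition behind the figures quoted above.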
Consequently, intelligence agencies are keen to identify shortcuts that allow them to unlock encrypted data without having to resort to brute force attacks. As detailed in the UNHRC report from 2015, three main alternatives are proposed. These are: the deliberate weakening of commercially available encryption standards; the creation of so-called “back doors”, which would allow approved state actors to use something like a skeleton key to unlock the encryption; or an arrangement called “key escrow”. This last option involves the key that unlocks the data being held (perhaps by a trusted third party) in such a way that it can also be accessed, in appropriate circumstances, by those same approved state actors.
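To make the escrow idea concrete, the following is a minimal sketch in Python using the third-party cryptography package’s Fernet recipe. It is purely illustrative and is not based on any specific proposal: the message is protected with a data key, and that data key is then wrapped separately for the user and for a hypothetical escrow agent, so that either key holder can recover the plaintext.

```python
# Toy illustration of key escrow using the third-party "cryptography" package
# (pip install cryptography). This is a sketch of the concept only, not a real
# escrow scheme: in practice the escrow copy would be held by a trusted third
# party under strict legal and technical controls.
from cryptography.fernet import Fernet

# Keys held by the user and by the (hypothetical) escrow agent.
user_key = Fernet.generate_key()
escrow_key = Fernet.generate_key()

# A fresh data key protects the message itself.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"confidential message")

# The data key is then "wrapped" twice: once for the user, once for escrow.
wrapped_for_user = Fernet(user_key).encrypt(data_key)
wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)

# The user decrypts as normal...
recovered = Fernet(Fernet(user_key).decrypt(wrapped_for_user)).decrypt(ciphertext)

# ...but whoever holds the escrow key can do exactly the same.
escrow_recovered = Fernet(Fernet(escrow_key).decrypt(wrapped_for_escrow)).decrypt(ciphertext)

assert recovered == escrow_recovered == b"confidential message"
```

The point of the illustration is that the escrow copy is an additional route to the plaintext: whoever holds, or steals, the escrow key gains the same access as the legitimate user.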
There are practical difficulties with each of these approaches, which it is not necessary to go into for the purposes of this article. But each of the three, it is generally accepted, involves weakening the privacy protection that encryption affords – not only against state actors, but also against everyone else. Weakened encryption will be just as easy to crack for cyber-criminals as it would be for intelligence agencies. Discovery of the back door key to a method of encryption would expose all of the communications sent using that encryption, not just the specific message being investigated. And trusted third parties could be vulnerable to breaches just like any other organisation, but if compromised would expose all of their customers to risk.
The question then becomes: to what extent is that weakening of privacy (which, as the 2015 report argues, also means a weakening of the associated freedoms of speech and opinion) something that can be justified in the interests of national security? This is relevant because the rights referred to are seldom absolute. Under art 17 of the International Covenant on Civil and Political Rights, interference with these rights is only permissible to the extent that it is lawful and not arbitrary. The European Convention on Human Rights (art 8(2)) requires such interferences to be lawful and “necessary in a democratic society”. Where those conditions are fulfilled, it follows that it can be permissible to diminish or even remove those rights, to the extent necessary.
For a time after 2015, it seemed that the balance of this argument was shifting in a number of countries in favour of legislating for national security interests at the expense of privacy. Recent developments, however, have demonstrated that the debate is still very much alive. The GDPR (and associated domestic legislation across Europe) has provided a strong incentive for data controllers to implement encryption on an end-to-end basis across all of their personal data estates. The use of encryption assists with the requirement to impose adequate technical and organisational safeguards, and with the risk assessment necessary to determine other safeguards. It also plays a significant part in determining whether there has been a loss of control of data following a data breach, and whether the potential impact on the rights of affected data subjects is of a level that requires a report to be made to supervisory authorities.
Even more recently, on 15 January 2020, the Advocate General delivered Opinions in Case C-623/17 Privacy International, Joined Cases C-511/18 La Quadrature du Net and Others and C-512/18 French Data Network and Others, and Case C-520/18 Ordre des barreaux francophones et germanophone and Others, all of which concerned legislation requiring commercial entities to retain electronic records for state surveillance purposes. Those Opinions underscored that the PECR does apply to such activities, notwithstanding the national security considerations. It seems that encryption will continue to represent a significant battleground between security and privacy, and that the tensions between those two competing objectives are no closer to being resolved.
Further reading
Human Rights Council: Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
Wired: The Same Old Encryption Debate Has a New Target: Facebook
The Guardian, Edward Snowden: Without encryption, we will lose all privacy. This is our new battleground
Will Richmond-Coggan is a director at Freeths LLP, specialising in data protection, privacy, media and reputation law. Email William.Richmond-Coggan@freeths.co.uk. Twitter @Tech_Litig8or.