When is a government back door into your social media account a back door and when is it “access to communications”?
That is the question experts are struggling to figure out after the government on Friday further detailed legislation, to be introduced this year, that will oblige tech companies and telcos to assist law enforcement in accessing encrypted information.
“What we seek to do with other leading economies in the world is to ensure that [tech companies] assist the law, to enable our law enforcement agencies to have access to these communications so that they can keep us safe”, Prime Minister Malcolm Turnbull said at a press conference.
As it has done in the past, the government stressed that its methods would not involve asking for a “back door” into devices like the iPhone or services like WhatsApp and Telegram.
But it’s tough to say whether that’s accurate, given the vagueness of the government’s wording and the open question of how, exactly, it defines a back door.
Why back doors are taboo
A back door is generally taken to mean any feature of a computer system that allows for unauthorised or undetected access. A back door can be an administrative tool left in place by programmers or device-makers, or something installed by hackers as a means of attack.
In the case of encryption, a back door would mean a way to decrypt information without the proper key, i.e. without being one of the people authorised to see the data. The scary thing about this is that if a method is created for one person or government to do this, there’s not much stopping anyone else from using it.
“If you start to weaken encryption, whether that be for lawful interception or any other reason, you start to put at risk the very premise of why we have encryption in the first place, which is to keep messages private”, says security researcher Troy Hunt.
“Yes, you might give our government access, you may also give unfriendly governments access, you may also give criminals access”.
Take a service like Telegram, for example, which is used by a huge number of people but has frequently been cited as an app used by terrorists and other criminals to plan attacks and crimes. Its “secret chats” use end-to-end encryption, meaning that if law enforcement intercepts a message in transit between the two parties, it cannot be read. Even if the government takes the garbled message to Telegram and asks it to decrypt it, the company won’t be able to.
Asking Telegram to change its systems so it could decrypt on request “is not really feasible”, Hunt says. “Not if you still want end-to-end encryption”.
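Hunt’s point is that in an end-to-end system the service in the middle never holds the key, so an intercepted message is useless on its own. A minimal sketch of the idea, using a toy one-time-pad cipher (real apps like Telegram use far more sophisticated protocols; this only illustrates why ciphertext without the key is unreadable):

```python
# Toy illustration of the end-to-end principle: only the two parties
# hold the key, so an eavesdropper (or the service itself) sees only
# garbled bytes. A one-time pad is used here purely for demonstration.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    return bytes(m ^ k for m, k in zip(message, key))

decrypt = encrypt  # XOR is its own inverse

# The sender and recipient share this key; the service does not.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)

# Intercepted in transit, the message is unreadable...
assert ciphertext != message
# ...but the recipient, who holds the key, recovers it exactly.
assert decrypt(ciphertext, key) == message
```

Handing the government (or the service) a way to read the ciphertext would mean giving it the key or weakening the cipher, and either change affects every user, not just the target of an investigation.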
So what is the government proposing?
The information given at the press conference was very general, but the recurring theme was that the government wanted the same laws that apply to the collection of physical information to apply to the collection of data, and that means the ability to decrypt.
“It has always been accepted that in appropriate cases, under warrant, there can be lawful surveillance of private communications,” Attorney-General George Brandis says.
The use of the word “surveillance” seems to indicate a desire to intercept communications in a way that would require a back door, but Hunt says that’s not necessarily the case.
“[End-to-end encryption systems] have been designed in such a fashion … that there’s absolute privacy between the two parties, at those two ends,” he says.
“The most feasible way of being able to get law enforcement the data they need is not to break encryption but rather to move the discussion to one of those ends.”
This means that law enforcement would need to get direct access (through physical acquisition or digital monitoring) to a device being used by the target of their investigation, rather than plucking the communication from the internet as it travelled between users.
If the device itself was encrypted, as many are, authorities would presumably require the device’s manufacturer to have a way to decrypt it.
And supposing tech companies are co-operative, there are other ways to get at information without creating back doors.
“If there is data which is residing on servers that the likes of Apple or Microsoft or whoever has access to, then that’s quite a different discussion to back-dooring encryption,” Hunt says.
The Silicon Valley problem
Mr Turnbull noted the “very libertarian culture” of many US tech companies, a culture that has informed the design of technologies intended specifically to keep governments from snooping on encrypted messages.
“The reality is, however, that these encrypted messaging applications and voice applications are being used, obviously by all of us, but they’re also being used by people who seek to do us harm. They’re being used by terrorists, they’re being used by drug traffickers, they’re being used by paedophile rings.”
At the recent G20 summit in Hamburg, Mr Turnbull addressed world leaders by saying: “There should be no ungoverned space on the internet. We need more assistance to ensure that higher and higher levels of encryption are not being used to conceal terrorists and criminals.”
But it’s not clear that many in Silicon Valley are keen (or even able) to provide such assistance, and it’s hard to blame them, some argue, given prior events in their home country.
“We have these NSA precedents from the Snowden leaks where we saw illegal behaviour, we saw massive overreach and we saw things happening which really should not have been happening in the first place,” Hunt says.
“The US government has done law enforcement a lot of harm and also done US tech companies a lot of harm as well.”
Laurie Patton, executive director of not-for-profit advocacy group Internet Australia, says the government’s assurances about back doors don’t inspire confidence.
“Based on our experience with the flawed data retention scheme, we are very concerned,” he says, pointing out that encryption systems protect everything from modern banking systems to local and global business.
“The risk is that in a well-intentioned effort to deal with a serious issue we inadvertently create an even bigger problem.”
Finding a balance
“The reason this is causing such debate is that there are no easy answers,” Hunt says.
While Apple and many other companies and individuals oppose government decryption of any data, it’s clear some access is necessary when highly dangerous activity is being planned and discussed in a way that’s undetectable.
At the same time, fears of government overreach are not unfounded, and the privacy and security of people using encrypted phones and messaging is enormously important.
“It’s important that we have transparency around what constitutes a lawful request. We need to have confidence as a society that the government is actually working within the boundaries which we expect them to”, Hunt says.
“Having said that, of course, we don’t want the government to burn what their procedures and processes are for accessing data. There’s a tricky balance here around how we get law enforcement to be transparent about what sort of data they can request and what data they can access, while at the same time not putting at risk their ability to do that against adversaries who we really, really want to have monitored by law enforcement.”