Get off the Internet

In my last post I summarised why undermining (all) our security with intrusive laws is unhelpful. But what if it comes to pass? The best advice (and what I would try to do) is to get off the Internet.

OK, that’s a little overblown, but I would stop using the Internet for important things: banking, bills, anything to do with money.

I’m not concerned about social media, although some things might be embarrassing taken out of context; I’ve always assumed they could become public. However, there are good reasons to be concerned about privacy.

What is most dangerous is the undermining of digital security: without secure data you can’t have private data.

From here to dystopia

The big fuss at the moment is the Apple vs FBI case, but no one seems to be looking at how it plays out later.

The FBI’s request is exactly the kind of thing required by the UK’s Investigatory Powers Bill (IP Bill):

The bill authorises agencies to compel “telecommunications providers” to assist them in effecting a hacking warrant, unless “not reasonably practicable”. Apple has pointed out that the term “telecommunications provider” is so broadly defined as to expand the government’s “reach beyond UK borders to … any service provider with a connection to UK customers”.

There’s a great cartoon from Stuart Carlson that perfectly illustrates the situation:

Tim Cook prepares to open the back door of a giant iPhone, with various people lining up behind him, they have evil grins and are rubbing their hands in glee.

Hackers and repressive regimes queuing up behind the FBI to get in the back door.

So if the IP Bill passes and there is similar legislation in the US, what happens next?

  • To start with, not much. There will be a lot of arguing about the details and planning.
  • Back doors get put into US and UK companies’ products. For online services that may not be a big issue, as they have access behind the scenes anyway, unless the Government wants live (i.e. external) access. For hardware and applications you download, it leaves a crack in their armour that anyone can attack.
  • The number of ‘hacks’ we hear about in the news doesn’t just continue, it accelerates year over year. More people suffer from identity theft, and more money is stolen from accounts.
  • Anyone wanting to avoid detection will use non-US/UK devices and applications with end-to-end encryption, so criminals will be unaffected. After all, you can’t ban maths.
  • Countries like China will demand the same sort of access, but with no limits on what they do with the information.
  • Governments get more desperate and declare ‘cyber war’, implementing increasingly intrusive and draconian measures.

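The “you can’t ban maths” point in the list above is easy to make concrete: encryption is, at bottom, arithmetic that anyone can implement in a few lines. Here is a minimal sketch using a one-time pad — impractical for real use, but unbreakable when the key is truly random, as long as the message, and never reused (the function names are mine, purely for illustration):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a fresh random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(k ^ c for k, c in zip(key, ciphertext))

key, ct = otp_encrypt(b"meet at the usual place")
assert otp_decrypt(key, ct) == b"meet at the usual place"
```

Without the key, the ciphertext is indistinguishable from random noise, and no legislation can change that. Real systems use ciphers like AES plus key exchange rather than pads, but the underlying point stands: the maths is public and trivial to reproduce.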
I’m not alone in these conclusions; the New York Times speculates about future headlines.

If you aren’t worried about security on the Internet now, you should be. Here are just a few examples that have already happened, without the extra insecurity of back doors:

Even with the current state of security, a recent study by the US Department of Commerce found that half of American Internet users are “deterred” from engaging in online transactions because of fears over privacy and security breaches.

Making something secure is already hard. If our general security on the Internet is undermined, it will get worse, and the only rational response is to minimise its impact and use it less.

That undermines the utility of the Internet, and the UK’s digital economy (of which I’m a part – that is my bias).

When considering even a theoretically secure ‘back-door’, you are stepping into the dual truisms of security: “Cryptography is harder than it looks”, and “Complexity is the worst enemy of security”.

If anyone tells you that [the vendor] can just ‘tweak’ the system a little bit to add key escrow or to man-in-the-middle specific users, they need to spend a few days watching the authentication dance between [the client device/software] and the umpteen servers it talks to just to log into the network. I’m frankly amazed that any of it works at all, and you couldn’t pay me enough to tamper with any of it.

For the case of Apple’s encryption specifically:

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys — to the contrary, that’s the whole point of a Hardware Security Module. Rather, we assert that holding on to keys in a safe way such that any other party (i.e. law enforcement or Apple itself) can also access them repeatedly without high potential for catastrophic loss is impossible with today’s technology.

One year later, and the company the FBI paid to get into the iPhone got hacked, so if that ‘secret’ method wasn’t out then, it probably is now.

We should also consider the direct damage done by Government-backed hacking, such as when GCHQ hacked a Belgian telecoms company and prevented a key infrastructure company from delivering email and updating its software. Or the back door left in our companies’ Juniper firewalls, which took researchers only six hours to reverse-engineer and access. These have directly harmed our own security.

A related point is that the NSA (and presumably GCHQ) hoards security flaws it can use offensively, to the detriment of our own defence. When the NSA gets hacked and others find its trove of hacking tools, things we have been relying on for years are suddenly vulnerable. These have also directly harmed our own security.

Avoiding the dystopia

We need people to understand that the Apple vs FBI case is just one example of the bigger picture, but it is an important one and Apple is right to make a stand [1].

The framing of the problem should not be about the organisations involved, but National Security vs Everyone’s Personal Security. We give up security for everyone in order to break into the devices of a few.

We need to answer the question: Do people in the UK/US have the right to use encryption? (I.e. encryption using a key the Government does not have access to, and therefore might not be able to break.)

Given that criminals (including terrorists) can use encryption now, and will be able to whatever the laws are here, I think the answer should be “yes”. Answering “no” just means our security is weakened for no benefit.

Bruce Schneier said it well:

Ubiquitous encryption protects us much more from bulk surveillance than from targeted surveillance. For a variety of technical reasons, computer security is extraordinarily weak. If a sufficiently skilled, funded, and motivated attacker wants in to your computer, they’re in. If they’re not, it’s because you’re not high enough on their priority list to bother with. Widespread encryption forces the listener – whether a foreign government, criminal, or terrorist – to target. And this hurts repressive governments much more than it hurts terrorists and criminals.

The Investigatory Powers Bill forces companies to undermine their own security, so it answers ‘no’ to the encryption question. It should not be rushed through; it needs to address the issues raised above in order not to undermine our security.

This question is one example of how we need to improve our ‘defence’, as I outlined before. The UK and US security services are very attack-focused (e.g. finding security vulnerabilities to use), but our countries have the most digital infrastructure and are therefore the biggest targets for digital attacks. We need more focus on defence, such as using good encryption by default, for everyone.

For example, where is our (opt-in) version of the Estonian ID card with your private/public key pair for electronic signatures? Or the proposed blockchain for medical records?

UPDATE: And since I wrote this, John Oliver said it better (if you are in the US, or can pretend to be):

Harvard also published a well-rounded report on the ‘going dark’ framing, which is an in-depth but understandable read.

[1] In the specific Apple vs FBI case (PDF), I don’t think it would be a big issue for Apple to do what was asked, as it is specific to one device. The FBI/Government have cleverly picked the ideal case from a publicity point of view.

However, it is the thin end of the wedge: the precedent would be set for Apple to be required to repeat the process until the only practical way of dealing with all the requests is a universal back door. According to former FBI and NSA employees, it is all about the precedent:

former officers with the FBI and NSA acknowledged that U.S. intelligence agencies have technology that has been used in past intelligence-gathering operations to break into locked phones. The question, they added, was whether it was worthwhile for the FBI to deploy that technology, rather than setting a precedent through the courts.

There’s a good overview of the case so far, and in a similar case the judgement went against the Government.