The NHSX contact tracing app has nothing to do with contact tracing

Instead the UK government are working with Faculty to create a comprehensive social graph

Georgia Iacovou
5 min read · May 7, 2020

Today NHSX, the digital arm of the NHS, are piloting their contact tracing app on the Isle of Wight. Before we dive into the privacy concerns I would like to point out that the app itself is shoddy: it will not work properly as a contact tracing app, and therefore it probably won't work very well as a secret spying app either. Once again, the sheer ineptitude of this government stops them from even being evil effectively. Classic.

Let’s start with the basics: this app doesn’t even meet the standard required to feature in the NHS app store. According to the Health Service Journal, NHS Digital haven’t been able to get their hands on a solid codebase to test it out because “they keep changing it all over the place”. Doesn’t sound fishy at all.

Next we have the fact that the Information Commissioner, Elizabeth Denham, has yet to receive a data protection impact assessment: the very thing the ICO needs in order to understand the potential data privacy risks of a new project before it can be launched.

🤫 I just want to be totally clear on that: this app allows the government to surveil UK citizens in a way that’s never been done before, and they are deciding to forego a crucial step in protecting people’s privacy? Their desperation to launch this thing is grossly tangible.

This all means that the current piloting of the app on the Isle of Wight is completely unlawful. Love it when the government break the law 👏

Of course the government are making flimsy statements about how the app was built with privacy in mind; either they’re super thick or they think we are. They’re saying the data collected will be anonymous, but what they expertly fail to understand is that when you collect data at this scale, it’s really hard to keep it truly anonymous. Perhaps they haven’t heard of Facebook or Google, two tech giants whose core business model is to make money from inferred data.

How does the app ‘maintain’ privacy?

First of all, upon initial opening of the app, you are told to enter your postcode, which — you know, I’m not a data scientist so please correct me if I’m wrong here — really fucking narrows it down. Right from the get-go you’ve made the jump from ‘someone in the whole of the UK’ to ‘someone in SE15’.
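To put a rough number on that jump, here’s a back-of-envelope calculation. The population figures are ballpark estimates I’ve picked for illustration, not official counts:

```python
# Rough back-of-envelope: how much does a postcode district shrink the crowd?
# Both figures below are ballpark estimates, purely for illustration.

uk_population = 67_000_000
people_per_district = 22_000   # very roughly, a district like SE15

shrink_factor = uk_population / people_per_district
print(f"A postcode district narrows 'anyone in the UK' down ~{shrink_factor:,.0f}x")
```

Pair that with a persistent device ID and a contact pattern, and a pool of 22,000 candidates collapses to one pretty fast.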

Then, you are assigned a random number, or a ‘big random number’ as they put it, and that is essentially how they ‘anonymise’ your passive interactions with other devices. It’s just one random number coming into close proximity to another.

Then of course, as soon as a user is symptomatic, they press a button and the last 28 days’ worth of data about them is sent off to a central server for the government to gawk at. Obviously there are very valid reasons for this, but it really throws that user’s anonymity straight out of the window. The randomly assigned number is associated with your device forever; put that together with the other data the app has gathered and you have a really good fingerprint ID for that person, and everyone else who presses the symptom button.
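To make the fingerprinting point concrete, here’s a toy sketch. The record layout and IDs are entirely invented, not the real NHSX data model; the point is only that a never-rotating ID, plus a postcode district, plus a contact history, is a fingerprint:

```python
# Toy illustration of re-identification risk. The record layout below is
# invented for this sketch; it is NOT the real NHSX data model.
from collections import Counter

# Hypothetical uploads: (persistent_id, postcode_district, ids_seen_nearby)
uploads = [
    ("id-4f9a", "SE15", ["id-7c2b", "id-9d1e", "id-7c2b", "id-7c2b"]),
    ("id-7c2b", "SE15", ["id-4f9a", "id-9d1e"]),
]

def fingerprint(record):
    """No name is ever collected, yet the record pins down one person's life."""
    pid, district, seen = record
    return {
        "id": pid,                                   # never rotates
        "lives_in": district,                        # ~22,000 people, not 67M
        "closest_contact": Counter(seen).most_common(1)[0],  # who you see most
    }

for record in uploads:
    print(fingerprint(record))
```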


It’s very clear that the government have chosen not to use the Google and Apple API — which is a much more decentralised approach — because it would get in the way of what they really want: to build a social graph of the UK population. The data collected from this app will contain information about who you regularly associate with, where you live, and a rough picture of your everyday life. This information can be abused, and I find it hard to believe that Matt Hancock is not aware of this, when he tells us that we have a ‘duty’ to use it.
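And for what it’s worth, a ‘social graph’ is not exotic machinery. Given a pile of proximity events, a few lines of code turn it into a who-knows-whom map. A hypothetical sketch (the event format is invented, not the app’s actual schema):

```python
# Hypothetical sketch: raw proximity events -> a weighted social graph.
from collections import Counter, defaultdict

# Each event: two device IDs that came within Bluetooth range (made up).
events = [
    ("alice-id", "bob-id"),
    ("alice-id", "bob-id"),
    ("bob-id", "carol-id"),
]

graph = defaultdict(Counter)
for a, b in events:
    graph[a][b] += 1   # edge weight = how often the pair meets,
    graph[b][a] += 1   # i.e. exactly 'who you regularly associate with'

print(dict(graph["alice-id"]))  # alice's strongest tie is bob
```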

Now, the API built by Google and Apple also creates a randomly generated user ID, but it creates a new one every day. It’s almost as if Apple and Google have built apps before? The API is of course not without its vulnerabilities, and the second phase of this project is to introduce this functionality at the operating system level. This makes sense, because it allows for wider adoption, which is one of the main challenges of contact tracing solutions.
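The rotation idea is simple enough to sketch. This is a simplified illustration of the principle, not the actual Exposure Notification cryptography (which uses proper key derivation): a fresh random key each day means broadcasts from different days can’t be stitched into one long-term identity.

```python
# Simplified sketch of daily-rotating IDs, the idea behind the Apple/Google
# approach. The real protocol uses HKDF/AES; this only shows why rotation
# defeats long-term tracking.
import hashlib
import os

def new_daily_key() -> bytes:
    """Fresh randomness every day; nothing links today to yesterday."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """A short-lived broadcast ID derived from today's key."""
    return hashlib.sha256(day_key + interval.to_bytes(2, "big")).digest()[:16]

monday, tuesday = new_daily_key(), new_daily_key()

# Same day, same interval -> reproducible, so exposure matching still works...
assert rolling_id(monday, 0) == rolling_id(monday, 0)
# ...but across days the IDs share nothing an observer can link together.
assert rolling_id(monday, 0) != rolling_id(tuesday, 0)
print("daily rotation breaks long-term linkability")
```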

However this feeds into the overarching privacy concerns that have been cropping up all over the place throughout this pandemic; these emergency surveillance projects, such as allowing Bluetooth broadcasting at the operating system level, need to be implemented now, sure, but the exit strategy for these systems needs to be baked in from the very beginning. Otherwise, general surveillance becomes the norm, and in ten years we’ll have another Edward Snowden.

Now let’s build that social graph

NHSX have been working with Faculty, who have a website that screams ‘I eat government contracts for breakfast’. Faculty are just another Cambridge Analytica, or Palantir. They say they use AI to do things like ‘stop online propaganda’. But of course, that’s just code for ‘replace existing online propaganda with your own, probably much worse, propaganda’. Their TLD is .ai, which is indicative of how much more they care about money than they do about helping people.

So why do they need to work with Faculty? Because data is useless if you don’t know what to do with it. The government are at least clever enough to know that. So, in turn, they are also clever enough to give Faculty our tax money to take the data they collect, and aggregate it. Faculty is probably bloated with machine learning algorithms and prediction models — enough to give Matt Hancock a permanent semi.

Using Faculty’s resources means that the government, in theory, can learn a great deal about how the virus spreads. But it also means they can learn a great deal about our daily lives. What’s stopping them from using the data however they want?

The refusal to use an API that would allow the app to work effectively now makes sense — because they do not want the app to work effectively. They just want to use this pandemic as an excuse to create a centralised bank of social information on every UK citizen.
