I’ve been writing about data privacy for over a year. Here’s what I’ve learned

Privacy is too big to understand. But do all of us really need to understand it?

Georgia Iacovou
6 min read · Aug 18, 2020
Image by me

It’s boring, abstract, and honestly ‘privacy’ isn’t even the right word. After a solid year immersed in the data privacy space, I’ve noticed some common themes in rhetoric and attitudes, as well as problems that still lack solutions. I’ve summarised them into five main points:

1. No one cares about privacy

Here’s something that privacy advocates desperately need to understand: privacy is unimportant and expensive. Sure, apps like TikTok are built with intrusive SDKs and have the most aggressive recommendation algorithms out there, BUT: the people using TikTok don’t care. They just want some fun social media to briefly distract them from the much more pressing issues that underpin their lives. Such as earning a living wage during a global pandemic. Or, duh, the environmental crisis.

The GDPR is unsexy and tedious; no one has time to think about the fact that their ‘free’ period tracking app actually makes money by selling data via ad networks like Facebook. If everything you do produces data which can be exploited, then how can you do anything? Frankly, I just want to get on with my day without having to make ludicrous adjustments to my behaviour in response to abstract and invisible threats to my privacy.

The thing is, someone has to care. So that leads us gracefully to number two:

2. Privacy solutions aimed at consumers don’t work; aim them at startup CEOs instead

If you’re thinking about how to ‘solve privacy’ then you should know that you and your organisation cannot do it alone. More importantly, you cannot solve it with fancy little apps or browser extensions that end users will set up and ignore — these are just band-aid solutions, and they need to be put straight in the bin. Why exactly?

☝️ It’s unfair to make privacy everyone else’s problem. Why should end users venture several layers deep into their privacy settings just to stop yet another company from tracking their online behaviours?

✌️ Solutions built directly for the consumer just slow things down; tools like Tapmydata and Delphia are dressed up to look like progress, but all they do is gamify privacy and distract people with gimmicks.

🤟 Finally, products or services that force consumers to change their behaviour just won’t fly; that’s why people ignore cookie banners instead of engaging with them. My good friend Josh Balfour has been building products for years, and he understands this.

“I will never use a product that makes me change my behaviour. I only use products that make my existing behaviours cheaper and faster.”

That’s what consumer tech is all about: making life more convenient. So why push a bunch of debilitating privacy tools on consumers? They’re the exact group of people who don’t want to change. Do you know who has a permanent semi for change, the future, and new technologies? Startup founders. So build privacy tools for them instead.

The only way we can really make any progress is by fundamentally changing the way emerging technologies are built: e.g. instead of trying to get users to download an extra app that will ‘protect’ them from their existing apps, just make the existing apps better. That means developing privacy tools aimed at new and existing businesses, and at software engineers.

Then, we can finally stop shouting into a void of consumers who don’t have the time or energy to care about this. It’s about holding the right people accountable.

Finally discovering that Google Maps has been tracking your most visited places, and turning that setting off. Photo from Unsplash, edited by me.

3. Telling users that they ‘own’ data is completely unhelpful

Realising how grossly unpopular he’d become following the Cambridge Analytica scandal, Mark Zuckerberg clumsily announced that Facebook were ‘pivoting to privacy’. He did so via this verbose blog post last year.

With this blog post he attempted to crystallise the harmful narrative that users ‘own’ their data. That’s completely wrong: you do not own any of the data that you passively produce by interacting with products and services. You do not own your route to work. You do not own your delivery slot preferences. You do not own your browsing history.

That is because, crudely, two parties are required to produce data: you, and whatever app you’re using. It’s quite obvious that you don’t own this data — if you did you wouldn’t have to ask to see or delete it via a subject access request.

Get the word ‘ownership’ out of your mind, and instead start thinking more about the word ‘access’. The next section will really help with this…

4. ‘Data privacy’ is a misleading term. We’re really talking about data governance.

🤔Why is ‘data privacy’ misleading? Keeping your data ‘private’ means that no one is allowed to see it — like your diary when you were 14. Data that no one else can access is a nonsense idea; the data may as well not exist if you’re just going to hide it completely. Data is only actually useful when others can access it.

🤔What is data governance? It’s what people actually mean when they talk about privacy. Really, people just want control over their data: who else can access it, and what it’s used for. For example, if you ‘hid’ all your data from YouTube, the service just wouldn’t be as good, because it needs that data to power its recommendations.

Of course, how we govern data goes so much further than just grabbing and holding the attention of consumers. This excellent TechCrunch piece explains that a collection of data relating to one individual is sort of like a single vote in an election: on its own, it’s almost nothing, but when put together with other data, it can change the world.

Just think how much data Facebook have sitting on their servers (it’s a lot), and how much use we’re actually getting out of it (not a lot). As a Facebook user, you only have access to your own data, which on its own is useless. But Facebook, as an entity, has access to data relating to all Facebook users, which is not only useful but valuable. This is obvious, because while you were Instagramming your lunch, Facebook have used data produced by you and billions of others to develop AI-enhanced farming machines or acquire Giphy.

While companies like Facebook keep their data closed, we are more or less at their mercy — they can govern it however they like. Imagine what we could achieve if data like this was opened up, and governed by a more neutral institution, perhaps even one that is not driven by profit? What an idea! You can read more about this in my piece about data trusts.

5. The way we monetise the web is one of the biggest threats to online privacy

It’s quite simple: if a product or service is free to use, the money must be made some other way. Or, to put it in a sentence I’ve both typed and read a million times: when the product is free, you are the product. You’re no longer charged for using apps and services, because they take the one valuable thing that you have in abundance: data. And, annoyingly, they take it without asking. How else do you expect to read Guardian articles all day without paying them a single penny?

Don’t blame The Guardian… it’s not their fault. Ad networks (such as Facebook’s and Google’s) made it possible for websites to make money without having to ask their users for it. It’s hard to say no to that. However, the result is an internet bloated with clickbait, and content optimised not for quality but for ad impressions.

I’m ending this article on web monetisation because it’s almost impossible to avoid being tracked across the web — but it’s not like you’re going to stop browsing any time soon. This illustrates my point perfectly: privacy is too big to understand, and therefore difficult to care about.

The main thing you should take away from this article is that data privacy is not something we should all strive to care about. That responsibility should be placed on those building and maintaining the technologies that will govern our lives in the near and distant future. Just because an individual doesn’t ‘care’ about their privacy online does not mean they don’t deserve it.
