
Behind Big Tech's big privacy heist: Deliberate obfuscation

You opted out, but you didn't uncheck the box on page 24, so your data's ours...

Opinion "We value your privacy," say the pop-ups. Better believe it. That privacy, or rather taking it away, is worth half a trillion dollars a year to big tech and the rest of the digital advertising industry. That's around a third of a percent of global GDP, give or take wars and plagues. 

You might expect such riches to be jealously guarded. Look at what those who "value your privacy" are doing to stop laws protecting it, what happens when a good law gets through, and what they try to do to close it down afterwards.

The best result for big tech is if laws are absent or useless. The latest survey of big tech lobbying in the US reveals a flotilla of nearly 500 salespeople/lawyers touring the US state legislatures, trying to draw up tech-friendly language to insert into privacy bills, water them down through persuasion, or just keep them off the books.

With Apple, Google, Meta, and Microsoft frequently drawing on the same specialist covens, it's fair to say that deadly rivals in the market find plenty of common cause in the corridors.

You can see why they bother. A law that's good for us is very bad for them: four years after the EU's GDPR got through, it's drawing blood. The top six fines to date are shared between Amazon, Meta and Google, with the AWS parent taking the number one spot with an $844 million (€746 million) fine for tracking users without consent and not providing opt-outs. (It has already stopped making those payments, we note.) Meta picked up the second and fifth spots for WhatsApp and Facebook: $241 million (€225 million) for a poor privacy policy and lack of transparency, and $64 million (€60 million) for pushing cookies down users' throats.

Google got bronze, fourth and sixth place, with a grand total of $161 million (€200 million) for a mixture of all of the above. So that's just shy of $1.3 billion (€1.2 billion), which sounds like a lot but is a quarter of a percent of global ad tech revenues. That leaves plenty left over for lobby lunches.

There's plenty left over, too, to evolve defences. Last week, Meta announced a complete rewrite of its Data Policy. It makes major claims for clarity, in response to those fines for lack of same.

Does it actually clear things up for users? Not even close, says researcher Wolfie Christl in a surgically scathing Twitter thread.

The new policy is near silent on "personal data", a term it largely abandons in favour of the ambiguous "information". It elides "improving our service" with "serving more ads", and sweeps aside whole categories of concern, such as what data it gets back about you from its partners, vendors and "third parties", and what happens to it.

One special treat is a 10,000-word section on how it justifies what it does, whatever that is, under Europe's GDPR and data protection law in other jurisdictions. It tangles its semantic web so deftly that Professor Lilian Edwards, Chair of Law, Innovation & Society at Newcastle and self-confessed GDPR nerd, tweeted in exasperation: "This is insane. [...] if I can't face going through this [...] How could ordinary punters?"

Which is the point. All those fines for misleading and confusing us were based on old policies, exhaustively analysed and compared against evidence. With a completely rewritten policy, that process has to start all over again, only this time with a document custom-designed to keep the lawyers going for as long as possible. It's cynical, but it is effective.

The underlying problem is structural. If the companies won't say what data is being collected and how it is shared between their divisions and their partners, vendors and third parties, then those relationships must change.

Companies must cease being monolithic black boxes and make each function separately answerable. If the privacy policy defeats professors of law, then demand a regulatory right of approval. Send it back. Enforce independent sign-off before it's allowed to go through. You want dollars? Make sense.

Other industries have been through processes like this. You can't buy a car that doesn't come as the product of a huge legal and compliance framework that specifies safety down to the component level, imposes strict controls on the consumables it uses, and has a hundred years of consumer protection baked in. 

Look at the ad tech industry. At worst it allows its algorithms to radicalize people into murderous psychopathy, and at best it operates a global surveillance machine that nobody would ever allow if asked. Why then should it lay claim to lenient treatment? Imagine if, in the 1980s, any regime had demanded every citizen install omniscient monitoring devices in their homes and carry one around in their pockets. It would have been seen as the ultimate authoritarian nightmare. Yet here we are.

So, let's make it easy to check. With the business structure visible, the data on which it runs should be as well labelled and checkable as any pharmaceutical feedstock, the processes as amenable to inspection, and the output policed for purity. That would be a lot of work. That would be the price of doing business.

We only agreed to the ad tech armageddon because it happened bit by bit, and we refused to believe where it would lead. How much less did we imagine the regulatory environment we'd need to rein it in? Fortunately, we don't need much imagination or belief now that it's here. We can see what it's doing.

A well-regulated market protects all, including the powerful, from the consequences of their own temptations. A market where we are free to use the products without fear means one where enterprise can flourish within the rules, and innovate without interference. After all, we value their privacy too. They just have to earn it. ®
