European Data Protection Supervisor

'Accept and continue': billions are clocking into digital sweat factories without realising it


Monday, 30 April, 2018

The digital information ecosystem farms people for their attention, ideas and data in exchange for so-called 'free' services. Unlike their analogue equivalents, these sweatshops of the connected world extract more than one's labour, and while clocking into the online factory is effortless, it is often impossible to clock off.

I am reminded of this state of affairs by the recent stream of messages from online service providers about changes to their terms and conditions and privacy policies. The messages vary of course and some explicitly cite the GDPR - fully applicable in less than a month's time - as the reason for the change. Failure to accept the new terms by 25 May, we are told, will mean we can no longer use these services. 

For most people outside the esoteric data protection bubble this represents first contact with the new dispensation of digital rights and obligations in the EU. If this encounter seems a take-it-or-leave-it proposition - with perhaps a hint of menace - then it is a travesty of at least the spirit of the new regulation, which aims to restore a sense of trust and control over what happens to our online lives.

The most recent scandal has served to expose a broken and unbalanced ecosystem reliant on unscrupulous personal data collection and micro-targeting for whatever purposes promise to generate clicks and revenues.  In such a distorted environment everyone must now participate, instilling the paradoxical sense of being more and more monitored and yet less and less known and respected by the small number of remote tech powers. 

As the state of things digital becomes gradually clearer, there are already noises suggesting that if you object to being tracked in exchange for the 'free' services on which so many of our lives now depend, then the only alternative is to pay. But the fundamental right to privacy and related freedoms like free speech and non-discrimination apply to all; they cannot be the exclusive privilege of those who can afford to pay.

The positive takeaway from all of this is not simply that data protection has suddenly become trendy. Regard for online privacy is now firmly a part of the PR toolkit of any organisation which cares about its customers and reputation. The big risk, however, is a growing gulf between hyperbole and reality, where controllers learn to talk a good game while continuing with the same old harmful habits which the EU legislator has been attempting to stamp out with the GDPR and other ongoing reforms, notably the ePrivacy Regulation.

So for instance we have seen a spate of articles alleging that stricter data protection will favour big companies. There is no doubt that controllers tracking people with whom they have no relationship are rightly going to have to adjust their behaviour. But the broader reality is that accountability and obligations in the GDPR are scalable, with data protection authorities empowered to enforce the law with rigour proportionate to the scale of the violation. Controllers responsible for personal data processing on a massive scale, involving data of the most sensitive nature, face by far the biggest challenge in demonstrating the lawfulness and indeed ethical grounds for what they have been doing over the last decade or two. 

The GDPR is, essentially, about the accountability of controllers and safeguards for individuals, including giving them more control over what happens to their data. Its greater goal is to protect individuals, not companies.

Too often privacy policies have seemed to be designed to provide legal cover for the companies themselves in the case of harm to a customer: non-negotiable, incredibly long, complicated, full of legal jargon which nobody reads (except to expose the unfairness of this practice).  Furthermore, the policies have tended to give an illusion of user control - while in reality you cannot see or control what the company does with information about you. 

Companies whose business model depends on tracking are now asking their customers to say whether they agree to, for example, the use of sensitive data and data from outside sources. Just as with the notorious cookie pop-ups, people feel pushed towards clicking 'I accept' because the only apparent alternative on offer is complicated, time-consuming and risks excluding them from digital society.

We and other DPAs are therefore worried that even the biggest companies may not yet understand that with the GDPR these manipulative approaches must change. They must change, for instance, to satisfy Article 7(4) of the GDPR, under which, in assessing whether consent is freely given, utmost account must be taken of whether the provision of a service is made conditional on consent to the processing of personal data not necessary for the performance of that contract.

Brilliant lawyers will always be able to fashion ingenious arguments to justify almost any practice. But with personal data processing we need to move to a different model. The old approach is broken and unsustainable - that will be, in my view, the abiding lesson of the Facebook/Cambridge Analytica case.

DPAs are taking action, with a new social media subgroup meeting for the first time in mid-May. 

We must all be vigilant about attempts to game the system. 

Controllers will be concerned about compliance, and the most thoughtful and diligent controllers will aim to turn responsible data processing into a competitive advantage - that will be one of the main messages of the EDPS Opinion on privacy by design to be published next week. But what individuals and regulators expect is a change of culture.

Data protection emerged in the 1970s and 80s as a response to the automation of processing operations and new forms of communication. Massive digitisation and machine learning are demanding new and smarter policy responses: stronger enforcement, but also empowerment through tools like meaningful consent, ethics and accountability, and a fairer allocation of the digital dividend.

Tomorrow, on Labour Day, let's hope that, just like the de-humanising working conditions of the sweatshops of the 19th century, the abuses so prevalent in the early phase of digitisation will soon be consigned to the history books.