Data protection: Reflections on the EU entering the digital age

Cathleen Berger
Apr 14, 2016 · 5 min read
Nervous Systems Exhibition, Berlin (c) Cathleen Berger

After almost five years, it’s finally happened. Today the European Parliament (EP) signed off on the EU’s new data protection package without rejections.

Negotiated in countless sessions between the Commission, the Council, and the Parliament, it includes the General Data Protection Regulation and the Data Protection Directive for the police and criminal justice sector, replacing legislation written when computers were running Windows 95 and global internet penetration stood at around 1%.

From a human rights perspective, this is a big deal. The European human rights regime (and EU data protection laws in particular) is often seen by other countries as a reference point for adequate and strong legal protection mechanisms — meaning the norms established in Brussels are likely to have global ramifications. It is therefore crucial that this regime is fit for purpose in the digital age, strengthens the rights of the individual, and enshrines the principle that individual freedoms and security are mutually reinforcing — not antagonistic.

How does it measure up? From a first review of the legislation we’ve found a lot of good things, some potentially good things, and some questions which need to be answered. Let’s go through them.

The good

No more data protection ‘shopping’

By harmonising data protection laws across the EU, the package will hopefully stop businesses from trying to avoid regulation by housing their operations in the least demanding jurisdictions.

One stop shops

In cases relating to any online service or product, EU citizens can now reach out to their national data protection authority rather than navigating complex jurisdictional systems across countries (as happened notably in the Schrems vs. Facebook case). This also means citizens can seek clarification, access and — if necessary — litigation in their own language through one point of contact.

Tougher rules for global companies

Any global company “offering their goods or services” in the EU will now have to respect and adhere to EU data protection law (Art. 3 Para 2, GDPR). This is crucial in a digital environment dominated by US players like Google and Facebook, which can now be held accountable for their practices in the EU. This will have interesting knock-on implications for discussions around Safe Harbour II — or ‘Privacy Shield’ — with Articles 41–43 of the GDPR particularly relevant.

Data minimisation

Only data which is deemed absolutely essential can now be collected, stored and processed (Art. 5 Para 1c, GDPR). This will hopefully spell the end of onerous (and invasive) registration forms for services like online newspapers.

More control over personal data

Article 18 (GDPR) gives EU citizens more control over and insight into their own personal data. This will likely make it easier to migrate your data to other providers — helping to break up social media silos for instance. In other words, your Facebook photos won’t be lost if you move to a more privacy-protecting (or trendier) platform.

Good news for crypto

The package recognises encryption as an important feature of security (Art. 30 GDPR) — a rare and welcome positive step amid an increasingly polarised and securitised debate.

The ‘could-be-good’

Data ownership and informed consent

The sections about “control of one’s own personal data” leave quite a bit of room for interpretation, particularly in relation to data ownership and questions of informed consent (Art. 7 GDPR). When does a citizen actually know what they are agreeing to, for example? And to what extent is opting out (e.g. of social media) a realistic option if you don’t want to be excluded from societal developments?

The aspect of control is also relevant with regard to the Right to be Forgotten (Art. 17 GDPR). While there are strong arguments in favour of having the ability to delete false or defamatory information, the effects of this right on the web as a digital, global archive have yet to be assessed in detail and raise concerns about fragmentation.

Privacy as default

New clauses, such as “privacy by design” on a technical level (Art. 23 GDPR) or the requirement to carry out a data protection impact assessment (Art. 33 GDPR), will have to be monitored closely to be meaningful, as the package itself does not include implementation guidelines. These principles will be incorporated by each member state individually, meaning follow-up regulations need to be monitored at the national level. One example where this will be relevant is the storage and processing of genetic, biometric and other health-related data (cf. Art. 8 Directive).

Definition and justification of data usage

There are certain grey areas around the precise interpretation of “necessary and proportionate” (Art. 14–15 Directive) — for example, regarding the use of CCTV in public spaces. Legal guarantees that these videos will only be used for clearly defined law enforcement purposes will have to be adopted at a national level. Human rights defenders will need to pay close attention.

Remaining questions

The package perhaps invites as many questions as it offers answers. Among the things I’d like to know:

  • How will it be ensured that users accept Terms of Service on the basis of informed consent? And how can we make sure that opting out remains/becomes a realistic option?
  • Which funds will be available for the development of technologies that protect “privacy by design” — and will there be a commitment to free and open-source software?
  • How can privacy by design be protected if anonymity and encryption are not explicitly recognised as enablers to privacy and freedom of expression?
  • Are there any mechanisms in the legislative package allowing citizens to challenge national legislation which, for example, outlaws encryption or anonymity (e.g. proposals by Hungary, France, and the UK)?
  • What is the status of data retention after the ECJ judgement — will the legislative package clarify boundaries and allow for litigation if national bills introduce their own non-compliant data retention models (cf. Art. 4b Directive)?
  • What is the EU’s position on tracking by private companies (e.g. through cookies)? Can citizens meaningfully control their data when their behaviour is being monitored in this way? And are attempts by publishers to force users to turn off adblockers therefore an infringement of rights?

I’m going to be posing some of these questions directly to the EP rapporteur on the EU data protection package, Jan-Philipp Albrecht, on Thursday, April 28th, at his Q&A at the “My Data, My Choice” event at Humboldt University, Berlin (HIIG).

If you have any other questions you’d like me to raise, feel free to use the comment section below or get in touch on Twitter (@_cberger_) — I’d love to hear your thoughts.


Cathleen Berger

Strategy expert, focusing on the intersection of technology, human rights, global governance, and sustainability