It’s been a busy year at the EDPS. With the GDPR becoming applicable in May, the international conference in October and the introduction of Regulation 2018/1725 in December, you would be forgiven for overlooking some of this year’s less high-profile activity.
With this in mind, January 2019’s Newsletter provides you with a retrospective look over the past twelve months. Here are ten things you might have missed:
Erasmus+ is the European Commission’s programme for education, training, youth and sport. The Education, Audiovisual and Culture Executive Agency (EACEA) runs the programme for the European Commission's Directorate General for Education and Culture (DG EAC), alongside national agencies in the Member States.
As part of the Erasmus+ programme, EACEA runs an online tool known as Online Linguistic Support (OLS), used to check if an individual’s language skills have improved during their stay abroad. Language skills are tested both before leaving and upon return home, and the OLS also provides online language classes. The tests are mandatory for any individual receiving funding through Erasmus+, but any consequences of failing to take the test, such as a reduction in an individual’s mobility grant, are decided at national level.
Any processing of personal data proposed by an EU institution or body and considered to pose a specific risk to the rights and freedoms of the individuals concerned is subject to prior checking by the EDPS. However, while EACEA provides the online tool, it does not carry out an evaluation of the data collected by the tool, a task which is done at national level. We therefore informed EACEA that no prior check was necessary in this case.
The case raised an interesting question, relevant to many European systems, platforms and tools. The EDPS sees the distribution of tasks between EACEA and DG EAC on the one hand, and national agencies on the other, as a case of joint controllership, meaning that both EU and national authorities are responsible for determining the means and purposes of the processing of personal data. We therefore recommended clarifying the responsibilities of the different controllers involved, so that it is possible for individuals to address the right organisation immediately, depending on their needs. For example, in the case of the OLS, requests for access to personal data relating to test results should be addressed to the national agency of an individual’s home country, while the security of the OLS central system remains the responsibility of EACEA.
Regular attacks on EU citizens inside the European Union remind us of the challenges faced by national law enforcement authorities in ensuring our security. The European Agency for Law Enforcement Cooperation (Europol) supports national authorities in their fight against serious cross-border crime and terrorism. However, a secure and open Europe depends not only on ensuring enhanced operational effectiveness in the fight against cross-border crime, but also on protecting fundamental rights and freedoms.
On 1 May 2017, the EDPS took over responsibility for the supervision of personal data processing carried out by Europol as part of its operational activities. One key challenge is to make sure that Europol strikes the right balance between security and privacy. However, as Europol relies heavily on information provided by national law enforcement authorities to perform its tasks, meeting this challenge is only possible if we work closely with national supervisory authorities.
EDPS collaboration with national authorities is facilitated by the Europol Cooperation Board. The Board acts as an advisory body on Europol matters. For example, it plays a key role in ensuring that citizens are able to exercise their rights in relation to the processing of personal data by Europol. It also provides clear guidelines to national law enforcement authorities on the data protection rules that apply to the data they send to Europol, particularly where vulnerable individuals are concerned.
On 29 May 2018, we attended the third meeting of the Europol Cooperation Board, at which we were able to share information relating to the supervisory activities undertaken since the last meeting, in November 2017. We also discussed the work programme for the next two years. We look forward to strengthening our cooperation within this essential network of national authority representatives as we work towards achieving the joint aim of a secure and open Europe.
On 21 March 2018, we adopted an Opinion on the processing of personal data for social media monitoring at the European Central Bank (ECB). The ECB intends to use an external contractor to monitor and track discussions about ECB-related topics on different social media channels. Its aim is to gain a better understanding of how internet users perceive the ECB and to improve the ECB’s communication and reputation.
Specifically, the ECB intends to collect information on what is being said about it, topics related to its activities, the tone used and how far the information is spread. The external contractor will conduct the monitoring and analysis of the aggregated data on different groups of users, while the ECB will analyse this information and draft reports.
As some internet users, who are not public figures, may be indirectly identifiable by their quotes, their likes or their native language, we provided the ECB with some specific recommendations aimed at ensuring that the rights of individuals are respected. In particular, we focused on the need to ensure the quality of the data collected and processed, provided recommendations on the content of the contract with the external contractor and advised the ECB on individuals’ right of access to their own data. We also provided them with advice on the information they must provide to internet users and the security measures the contractor must adopt.
As part of the registration process for an international conference organised by one of the EU institutions, individuals were required to submit a scanned copy of their passport or identity card, in order to verify their identity. The EDPS received a complaint relating to this requirement, to which we responded on 10 April 2018.
In our investigation, we found that the EU institution could have used a less intrusive means of verifying the identity of participants, for example by checking passports or ID cards at the entrance to the conference and comparing them with the information submitted online. Moreover, in certain Member States it is illegal to photocopy passports unless the law provides a justification for doing so. The EU institution also failed to formally notify their Data Protection Officer (DPO) of the collection of scanned copies of individuals’ ID, as is required under Regulation 45/2001, which sets out the data protection rules for the EU institutions. We therefore concluded that requesting scanned copies of participant ID in this case was disproportionate and not in compliance with the legal requirements laid out in Regulation 45/2001.
We also responded to concerns about the transfer of the personal data collected to the authorities of the host Member State, based on the premise that participants had consented to this. To qualify as a valid legal basis for the transfer of data, consent must be freely given. As participants were not able to register for this conference unless they gave their consent to share personal information with the host Member State authorities, their consent was not freely given and so consent cannot, in this case, be considered a valid legal basis for the transfer of data.
Back in August, we issued an Opinion on the Commission’s Proposal for a Regulation seeking to strengthen the security of identity cards and other documents issued to EU citizens and their families who exercise their right to free movement within the EU. The goal is to improve the security features of EU citizens’ identity cards and non-EU family members’ residence cards.
Our Opinion states that the Proposal does not sufficiently justify the need to process two separate types of biometric data (facial images and fingerprints), as the stated purposes could be achieved using a less intrusive approach.
The Proposal would have an impact on up to 370 million EU citizens, potentially subjecting 85% of the EU population to mandatory fingerprinting requirements. Taking into account this wide scope and the sensitive nature of the data involved, the necessity of the measures proposed must be clearly demonstrated. Moreover, explicit safeguards must be established to ensure that implementing the Proposal at national level does not lead to the setting up of national fingerprint databases.
While the storage of fingerprint images does enhance the way in which EU databases communicate and exchange information - known as interoperability - it also increases the amount of biometric data that is processed, which inflates the risk of impersonation in the case of a personal data breach. Accordingly, the EDPS recommends significantly limiting the fingerprint data stored in the chip of residence documents, to include only a subset of the characteristics extracted from the fingerprint image.
Finally, the EDPS advocates setting the minimum age for collecting children’s fingerprints under the Proposal at 14 years. This is in line with the approach taken in other instruments of EU law.
On 4 June 2018 we received a request from the European Parliament's Committee on Internal Market and Consumer Protection (IMCO) to comment on the European Commission proposal for a Regulation on a framework for the free flow of non-personal data in the European Union. The request cited concerns about certain amendments relating to the relationship between the proposal and the General Data Protection Regulation (GDPR). A consultation request from a responsible European Parliament Committee is a new development and we see it as yet further proof that the co-legislators are increasingly interested in our input in the course of the legislative process.
The legislative proposal in question was published on 13 September 2017. It would allow for the free flow of non-personal data within the EU internal market, making the porting of data easier for professional users and enabling users to more easily change between cloud service providers. The proposal also contains provisions relating to the availability of data for competent authorities.
We sent our Comments to IMCO on 11 June 2018, just four days after receiving the request.
One of the main issues we highlighted, and which was already present in the initial proposal from the Commission, is that the proposal would apply to data that is not personal data under the GDPR. The problem with this negative definition is that it is likely to be very difficult to apply in practice, since the definition of personal data is very broad and context-dependent. It also automatically creates a tension with the GDPR and results in legal uncertainty as to which legal framework should apply in a given situation. Moreover, IMCO introduced amendments which, instead of clarifying the relationship between the proposal and the GDPR, risk blurring it even further.
We recommended that the proposal should clearly state that the GDPR fully applies to all personal data, irrespective of whether personal data are inextricably linked or not with non-personal data.
On 15 February 2018, the EDPS published an Opinion on a proposal for a recast of the Brussels IIa Regulation, which we presented at the Council on 1 March 2018. This Council Regulation concerns jurisdiction and the recognition and enforcement of decisions in matrimonial matters and matters of parental responsibility, including international child abduction. The Opinion was formally requested by the Council.
The recast of the Brussels IIa Regulation establishes uniform jurisdiction rules for divorce, separation and annulment of marriage, as well as for disputes about parental responsibility in cross-border situations. Its overall objective is to remove the remaining obstacles to the free movement of judicial decisions, in line with the principle of mutual recognition, and to better protect the interests of the child by simplifying the procedures involved and improving efficiency. The new rules also aim to avoid the creation of a new EU IT system, by improving cooperation between the central authorities involved in exchanging information within and across Member States.
Our Opinion outlines specific recommendations to ensure that any processing of personal data is done lawfully and that suitable and specific safeguards are put in place to protect the fundamental rights and interests of the individuals concerned. We also recommended that clauses explaining the specific purposes for which data can be processed, and the individuals this concerns, be inserted into the text, as well as explicit references to the need to respect the principles of data quality and minimisation.
In addition, we stressed the importance of specifying that any reference to the national law of a Member State should not lead to increased limitations on an individual’s right to information at national level. This is important in order to ensure that data is processed fairly and consistently across the EU. We also recommended establishing a principle in the Regulation providing individuals with the right of access to any information transmitted to the requesting authority of a Member State. To deal with cases where restrictions to an individual’s rights of access and rectification are considered necessary, a clear and specific provision laying down the scope of these restrictions must be included.
The public debate on the misuse of personal data for tracking and profiling has received unprecedented attention in recent weeks. One element of this debate concerns the role of technology in our society, in particular whether companies should be able to take advantage of it exclusively as a means to increase their profits, or whether they should be obliged to use it to further the interests of individuals and the common good.
The principle of privacy by design may help to establish the human perspective as the main driver for technological development. Privacy by design involves planning for the integration of personal data protection into new technological systems and processes from the initial design stage of a project, as well as throughout its whole lifecycle. Privacy by default is a complementary principle, which involves integrating privacy protection into all technological services and products as a default setting. Both principles are cited in the General Data Protection Regulation (GDPR) as essential obligations in ensuring accountability, which requires those responsible for collecting and processing personal data to implement appropriate technical and organisational methods to ensure and demonstrate data protection compliance.
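The privacy by default principle described above can be made concrete in a few lines of code. The sketch below is purely illustrative, not drawn from any EDPS guidance: the setting names and retention period are hypothetical, and the point is simply that the most protective options are pre-selected, so a user must actively opt in to any wider processing.

```python
from dataclasses import dataclass

@dataclass
class UserSettings:
    """Hypothetical account settings applying privacy by default.

    Every field defaults to the most data-protective choice; sharing
    or tracking only happens if the user explicitly opts in.
    """
    share_profile_publicly: bool = False   # no public exposure by default
    allow_analytics_tracking: bool = False # no tracking by default
    retain_history_days: int = 30          # minimal retention period (illustrative)

# A newly created user starts from the protective defaults.
settings = UserSettings()
print(settings.share_profile_publicly)  # False
```

The complementary privacy by design principle would then shape the surrounding system, for example by ensuring that data older than the retention period is actually deleted rather than merely hidden.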
The EDPS has played an active part in attempts to further the dialogue between policy makers, regulators, industry, academia and civil society on how new technologies can be designed to benefit the individual and society. The 2018 IPEN workshop, which will take place in Barcelona on 15 June 2018, will focus on initiatives and case studies relating to privacy engineering and the use of privacy enhancing technologies, while the 40th International Conference of Data Protection and Privacy Commissioners, which will take place in Brussels during the week of 22 October 2018, will address digital ethics in general, helping to identify the way forward for privacy by design.
Our preliminary Opinion on Privacy by Design, published on 31 May 2018, sets out the groundwork for this dialogue and builds on our previous work in this area. We welcome any feedback and hope it will foster productive debate moving forward.
Blockchain has become a powerful buzzword in the world of technology and financial innovation. The technology currently underpins Bitcoin and other so-called crypto-currencies, and has sparked the development of Distributed Ledger Technology (DLT). Distributed ledgers such as blockchains are databases with many replicas under the shared control of distinct, often autonomous participants.
Blockchain was originally developed to secure online transactions using sophisticated cryptography instead of trusted intermediaries. EU industries and legislators are now assessing the viability of using the technology in a range of areas, from finance to e-government, and even in personal healthcare. However, it is vital to ensure that these assessments consider the data protection implications of such distributed databases.
Whenever blockchain technology is used to process personal data, the relevant data protection law applies. The processing of personal data must respect the data protection principles outlined in the General Data Protection Regulation.
The EDPS has been following the evolution of blockchain for the past two years. So far, we have identified challenges to data protection principles in areas such as storage limitation, controllership and individuals’ rights. Over the course of 2018, we plan to increase our efforts to monitor this fast-evolving technology, in order to adequately advise the EU legislator on the possible risks and safeguards involved in its application.
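The tension between blockchain and principles such as storage limitation comes from the technology's basic construction: each block's hash covers its content and the hash of the previous block, so altering or erasing an earlier record breaks every later link. A minimal sketch, written here purely for illustration and not drawn from any particular blockchain implementation:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash covers its content and its predecessor's hash."""
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    return block

# A two-block chain: the second block commits to the first one's hash.
genesis = make_block("first record", prev_hash="0" * 64)
second = make_block("second record", prev_hash=genesis["hash"])

# Erasing or editing the earlier record changes its hash, so the link
# stored in the later block no longer matches - the very property that
# makes honouring a 'right to erasure' request difficult on-chain.
genesis["data"] = "erased"
recomputed = hashlib.sha256(
    json.dumps({"data": genesis["data"], "prev_hash": genesis["prev_hash"]},
               sort_keys=True).encode()
).hexdigest()
assert recomputed != second["prev_hash"]  # tampering is detectable
```

This immutability is exactly what makes distributed ledgers trustworthy without intermediaries, but it also illustrates why data protection implications must be assessed before personal data is recorded on such a ledger.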
The 2018 EDPS-IPEN Workshop took place in Barcelona on 15 June 2018 with the support of the Polytechnic University of Catalonia (UPC). This annual workshop has taken place at different locations across Europe since 2014. It aims to bring together privacy experts and engineers from public authorities, industry, academia and civil society to discuss relevant challenges and developments for the technological implementation of data protection and privacy.
EDPS Assistant Supervisor Wojciech Wiewiórowski gave the opening keynote speech, in which he emphasised the need to develop practical solutions for privacy engineering. The role of privacy engineering is now more important than ever, since, under the General Data Protection Regulation (GDPR), data protection by design and by default are now enforceable legal obligations. Ensuring that privacy and data protection are incorporated into all new technologies from the development phase is a crucial step in ensuring that we are able to protect personal data in the digital age.
The main aim of the workshop was to assess the state of play for privacy engineering and privacy-enhancing technologies (PETs) in the wake of the GDPR and to follow up on the outcome of last year’s trans-Atlantic workshop. Some IPEN participants provided updates on ongoing initiatives, such as the IPEN wiki on privacy-related standardisation initiatives and the PETs maturity repository.
Beyond current legal obligations, the relationship between ethics and technological developments was also a topic of discussion. We challenged those present to try to answer the question of whether privacy engineering can help solve the ethical problems posed by artificial intelligence (AI). We also asked them to think into the future, from data protection by design to human rights by design. The idea of introducing a Human Rights, Ethical and Social Impact Assessment (HRESIA) was presented as a possible way forward from Privacy Impact Assessments.
The workshop was also an opportunity for businesses to present and demonstrate solutions which combine innovation and data protection. Companies including SAP, Jolla, Qwant and Brave shared best practices on how to give users more control over their personal data, implementing the spirit of the GDPR to the full. Academics from European and American universities reported on recent research results, not only at a theoretical level but also through the presentation of practical tools that help to detect privacy compliance issues and could support regulatory authorities or controllers seeking to demonstrate full accountability.
The workshop marked an encouraging start to the GDPR era and we look forward to continuing this valuable interdisciplinary dialogue over the months and years to come.
The workshop was web-streamed and the presentations remain available on the UPC website.