The Council of the European Union released its version of the Proposed Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation or ‘GDPR’) on 11 June 2015, as mentioned by Alison in an earlier post. This ‘general approach’ text was adopted on 15 June 2015 by the Justice and Home Affairs Council.
On 14 August, the Conference of the Data Protection Commissioners of the Federal German Government and the Federal States (Länder) issued a comment on the Council’s version entitled ‘Key data protection points for the trilogue on the General Data Protection Regulation’. Unsurprisingly, the German Data Protection Commissioners are a bit worried.
The German Commissioners set out 14 points of concern. I will mention six of them.
- The scope of the GDPR as delineated by the Council is problematic for two reasons:
- “The council has [unduly] expanded the household exemption in Article 2(2)(d) of the Regulation by crossing out the words “without any gainful interest” and “exclusively” in the Commission’s proposal”. The concern here is that processing activities that are not exclusively undertaken for personal or household purposes could thus be exempted.
- “The scope of the Regulation must not be further restricted in favour of the proposed Directive on data processing by the police and judicial authorities”. In other words, the proposed Directive on data processing by the police and judicial authorities (also currently being negotiated as part of the EU data protection reform package) should not extend to “the safeguarding against and the prevention of threats to public security”.
- The very definition of personal data is also problematic, according to the Commissioners. In particular, the new version of Recital 24 proposed by the Council states that “Identification numbers, location data, online identifiers or other specific factors as such should not be considered as personal data if they do not identify an individual or make an individual identifiable”. The Commissioners note that this exclusion is too broad and is contrary to the case law of the Court of Justice of the European Union (CJEU) (see its judgment in Scarlet v Sabam).
The underlying problem here, giving rise to a point of contention, resides in understanding what the last part of the sentence, “if they do not identify an individual or make an individual identifiable”, really means. What about the Browser-Generated Information (BGI) at stake in the Google v Vidal-Hall case, as discussed by Alison here? Would elements of this information class still be considered personal data under the new regime? The answer appears to be a resounding ‘yes’, as BGI combines the addresses of the websites the browser is displaying with online identifiers or location data, the very purpose being the profiling of users. Such an approach would mean that what really matters is the purpose of the processing: if the purpose of the processing is to profile users (be it by identifying them or singling them out), then identification numbers, location data, online identifiers and the like should be considered personal data. Conversely, by the same logic, when the same data is processed for other purposes, and in particular for network security purposes (e.g. to detect malware), it would not be considered personal data.
The question, however, is whether such an approach is really sustainable. What if this data, not being considered personal data, is then publicly released? Are we sure that we do not want to impose any duty at all upon the subsequent processors of this ‘non-personal’ data? [Is it enough to state that if subsequent processors of the data use it to identify or single out individuals, the data becomes personal data again?] The foregoing may be all the more worrying because, as the German Commissioners note, the phrase ‘singling out’ is not included in Recital 23, which expands upon the notion of identifiability.
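The difference between identifying a user and merely singling one out can be made concrete with a short sketch. Everything below (the log format, the cookie IDs, the URLs) is invented for illustration and comes from neither the draft texts nor the Commissioners' comment; it simply shows how browser-generated information lets a controller build a profile of one specific browser without ever learning a name.

```python
from collections import defaultdict

# Hypothetical browser-generated information (BGI) log: no name, email or
# other direct identifier appears anywhere, only an online identifier
# (a cookie ID) paired with the pages that browser has displayed.
bgi_log = [
    {"cookie_id": "a1f3", "url": "news.example/politics"},
    {"cookie_id": "a1f3", "url": "shop.example/running-shoes"},
    {"cookie_id": "9bc7", "url": "news.example/sport"},
]

# Group the visited pages by cookie ID to build per-browser profiles.
profiles = defaultdict(list)
for event in bgi_log:
    profiles[event["cookie_id"]].append(event["url"])

# Cookie "a1f3" can now be targeted on the basis of its browsing history,
# even though the person behind it is never identified by name. This is
# the 'singling out' the Commissioners want Recital 23 to capture.
interests_of_a1f3 = profiles["a1f3"]
```

The same log processed for a different purpose, say counting requests per domain to spot malware traffic, never needs the per-cookie grouping at all, which is why the purpose of the processing does so much work in the Council's approach.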
- The data minimization principle should be included in Article 5 of the GDPR as it is a key data quality principle. [Such an inclusion is all the more warranted because data minimization – as a system requirement – is still mentioned in draft Article 23 of the GDPR on data-protection-by-design!].
- The German Commissioners advocate the deletion of the Council’s proposed Article 6(4) – see Alison’s post here for more detail. Indeed, the Council has extended the list of legal bases that can be used to justify the further processing of personal data for a purpose that is incompatible (and not “not compatible”, as the European Commission used to say) with the original purpose: in the new version of Article 6(4), the legitimate interest ground could now be used. [I ask, maybe naively: is it true that the ground of the legitimate interest of the data controller is always less protective than the ground of unambiguous – or even explicit – consent?]
- As regards the rights of data subjects, and in particular in relation to profiling, the German Commissioners think that the Council’s draft Article 20 about ‘profiling measures’ and their compatibility with data protection rules is too weak. To be more specific, I will mention two (of the four) points that, according to the German Commissioners, are missing:
- “Instead of being limited to automated decision making, an approach should be chosen which covers all profiling or measures based on it”.
- “Exceptions from the prohibition on profiling need to be clearly defined. Due to their highly sensitive nature, special categories of personal data should not be allowed for use in profiles”.
Actually, Recital 23c) of the Council’s general approach text is not crystal clear. It reads as follows: “in order to create incentives for applying pseudonymisation when processing personal data, measures of pseudonymisation whilst allowing general analysis should be possible within the same controller when the controller has taken technical and organisational measures necessary to ensure that the provisions of this Regulation are implemented, taking into account the respective data processing and ensuring that additional information for attributing the personal data to a specific data subject is kept separately”. What does “whilst allowing general analysis” really mean? [Could it mean that, if data is pseudonymised, the principle of purpose limitation does not apply, or that some big-data re-use practices involving personal data are permissible? In this respect, we might remember the text of the German proposal: “where further processing takes place by using measures of pseudonymisation, it should not be considered as incompatible with the purpose for which the data have been initially collected as long as the data subject is not identified or identifiable (Art. 6(3a) (f))”. Or is the Council simply saying that analysis of personal data is still possible even if the data is pseudonymised… which may be, in a way, a truism.]
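The technical measure the Recital gestures at, replacing direct identifiers with pseudonyms while the “additional information for attributing the personal data to a specific data subject is kept separately”, can be sketched quite simply. This is a minimal illustration, not anything prescribed by the draft text: the field names are invented, and the keyed-hash approach is just one common way of pseudonymising; the point is only that the attribution key lives apart from the data set.

```python
import hmac
import hashlib

# Hypothetical secret key. Under Recital 23c)'s logic this "additional
# information" would be held separately from the pseudonymised data set,
# so that the data alone does not attribute records to a data subject.
ATTRIBUTION_KEY = b"held-separately-by-the-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The mapping is deterministic, so records about the same individual
    still link together and 'general analysis' remains possible; without
    the separately kept key, the pseudonym cannot be reversed by lookup.
    """
    return hmac.new(ATTRIBUTION_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# An invented record: the email is stripped out before storage and only
# its pseudonym is retained alongside the analytic payload.
record = {"email": "alice@example.com", "page_visited": "/news"}
stored = {"user": pseudonymise(record["email"]),
          "page_visited": record["page_visited"]}
```

Note that this is exactly why pseudonymised data is still widely treated as personal data: the controller, holding the key, can re-run the function over known identifiers and attribute the records again.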
- “Goals relating to guaranteeing technical and organisational data protection must be anchored in the Regulation”, the German Commissioners also say. In other words, the list of system requirements found in Article 30 (security of processing) is not rich enough and should also include “non-linkability, transparency and the ability to intervene” [I wonder here to what extent Article 30 really adds to Article 23 of the draft GDPR on ‘data protection by design and by default’?].