content regulation / Copyright / Data protection / filtering / freedom of expression / Human rights / illegal content / Internet intermediaries / notice-and-action / notice-and-take down / online platforms

Data Protection and Copyright: Could Art. 29 WP guidance on automated decision-making “help” with filters?

automated decision making

In its own way, the pan-EU Article 29 Data Protection Working Party (Art. 29 WP) has been very active in the past few months. One of the most awaited pieces of advice released by Art. 29 WP this month covers automated individual decision-making and profiling for the purposes of Regulation 2016/679 (Opinion WP 251).

Why is Opinion WP 251 worth reading? Because the aforementioned incoming General Data Protection Regulation 2016/679 (GDPR) expressly targets profiling activities, regardless of whether they are undertaken as part of an automated decision-making process. Profiling is broadly defined, as confirmed by Art. 29 WP and in accordance with the definition in Article 4(4) GDPR (“‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”). One important additional clarification by Art. 29 WP in Opinion WP 251 is that profiling should include processing that does not produce inferences, as well as processing that “merely” consists in “automated analysis to identify correlations.”

Another reason why Opinion WP 251 is particularly fascinating to read is that it sits at odds [for lack of a better term, as neither of these instruments is binding] with the recent European Commission Communication on tackling illegal content online, which I wrote about recently in posts here and here.

Readers surely remember the “vision” the European Commission set out on 28 September this year in its Communication: it expressed the view that – after all that had been said since September 2016, and the release of the proposal for a Directive on Copyright in the Digital Single Market – online platforms should be invited to implement filters. And this was just great, said the Commission, because even with filters, proactive online platforms would remain “passive” in the sense of Court of Justice of the European Union (CJEU) case law.

The Commission, however, would [probably] have benefitted from having a chat with Art. 29 WP before publicly revealing its political agenda in relation to online content regulation.

Why? Well, because asking private actors, through a soft-law instrument that is not entirely internally consistent, to implement “re-upload” filters [as I have argued in my previous post, it is not even clear how these “re-upload” filters differ in kind from “upload” filters] raises some concerns.

Why is it that the Communication on tackling illegal content could be seen as problematic?

Let’s read out [loud if possible] Opinion WP 251!

(Solely) automated decision-making “is the ability to make decisions by technological means without human involvement.” Note as well that “automated decisions can be based on any type of data.” [Some have tried to argue that processing fingerprints or hashes of protected works is not processing personal data, but really, is this tenable?]
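To make that last point concrete, here is a minimal, purely illustrative sketch of how a hash-based upload filter might operate. Every name in it (BLOCKED_FINGERPRINTS, UploadAttempt, filter_upload) is hypothetical and not drawn from any real platform’s systems; real filters use perceptual fingerprints rather than plain cryptographic hashes, but the data protection question is the same. Even if one insisted that the fingerprint of a protected work is not itself personal data, the filtering decision is inevitably attached to an identifiable uploader.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical blocklist of fingerprints of protected works.
BLOCKED_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

@dataclass
class UploadAttempt:
    account_id: str     # identifies the uploader
    ip_address: str     # IP addresses can themselves be personal data
    payload: bytes      # the content being uploaded
    timestamp: datetime

def filter_upload(attempt: UploadAttempt) -> bool:
    """Solely automated decision: block the upload if its fingerprint
    matches a blocklisted work. No human is involved at any point."""
    fingerprint = hashlib.sha256(attempt.payload).hexdigest()
    blocked = fingerprint in BLOCKED_FINGERPRINTS
    # Whatever one thinks of the fingerprint itself, the decision record
    # links it to an identified account, an IP address and a time of upload.
    print(f"{attempt.timestamp.isoformat()} account={attempt.account_id} "
          f"ip={attempt.ip_address} fingerprint={fingerprint[:12]} blocked={blocked}")
    return blocked

# Example: a (hypothetical) user 'A' tries to upload some content.
attempt = UploadAttempt("user-A", "203.0.113.7", b"some uploaded content",
                        datetime.now(timezone.utc))
filter_upload(attempt)
```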

Art. 29 WP explains very clearly that the principle under review (see Article 22 GDPR) is a prohibition of fully automated individual decision-making. There are exceptions to this principle, but in any case “there should be measures in place to safeguard the data subject’s rights and freedoms and legitimate interests.” Within the list of safeguards one finds: the right to be informed, the right to obtain human intervention, and the right to challenge the decision.

To be more precise, the prohibition of fully automated individual decision-making only applies “when a decision based solely on automated processing, including profiling, has a legal effect on or similarly significantly affects someone.”

The fundamental challenge for Art. 29 WP is therefore to define what “legal effects” or “similarly significant” effects upon individuals cover, and to provide guidance to data controllers on determining whether such effects might arise.

 “A legal effect suggests a processing activity that has an impact on someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action,” writes Art. 29 WP.

So let’s randomly pick a legal right! The right to freedom of expression, a fundamental right protected by Article 11 of the Charter of Fundamental Rights of the European Union. Let’s assume an online platform prevents individual ‘A’ from uploading a specific item of (legal) content as a result of the implementation of content recognition technologies, i.e. upload filters. This processing activity has a direct impact upon individual A’s right to freedom of expression.

Following this line of reasoning, for the upload filter to be lawful an exception to the prohibition of solely automated individual decisions would need to be applicable.

Article 22(2) GDPR recognises three exceptions. Because these are exceptions, they should be interpreted narrowly (not forgetting, as well, that the list of exceptions appears to be exhaustive). What are these three exceptions? An automated individual decision-making process is permitted when it is:

“(a) necessary for the performance of or entering into a contract;

(b) authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or

(c) based on the data subject’s explicit consent.”

Once exception ground (c) is excluded [it is hard to see how uploaders’ explicit consent to this kind of filtering could meaningfully be obtained], we are left with (a) and (b). Could it be that upload filters should be seen as “necessary for the performance of or entering into a contract”? Art. 29 WP offers some guidance as to when ground (a) could be used. This is the case when the automated decision-making process at stake:

  • “potentially allows for greater consistency or fairness in the decision making process (e.g. it might reduce the potential for human error, discrimination and abuse of power);
  • reduces the risk of customers failing to meet payments for goods or services (for example by using credit referencing); or
  • enables them to deliver decisions within a shorter time frame and improves the efficiency of the process. Routine human involvement may sometimes also be impractical or impossible due to the sheer quantity of data being processed.”

The third consideration is the most relevant for upload filters. However, Art. 29 WP is adamant in adding that “[r]egardless of the above, these considerations alone are not always sufficient to show that this type of processing is necessary under Article 22(2)(a) for entering into, or the performance of, a contract. As described in the WP29 Opinion on legitimate interest, necessity should be interpreted narrowly.”

Could upload filters be deemed necessary for online platforms to perform their contracts with users (who regularly or occasionally upload content)?

This is not exactly obvious. Online platforms have always refused (at least contractually) to subject themselves to any obligation as regards content regulation.

In any case, a strict scrutiny test would need to be applied in this context to determine whether other less privacy-intrusive methods could be adopted. Shouldn’t notice-and-action procedures be considered as less intrusive methods?
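For contrast, here is an equally hypothetical sketch of a notice-and-action workflow (the Notice class and handle_notice function are invented for illustration). The point of the contrast: personal data is only processed for the single item identified in a notice, and a human reviewer takes the final decision, so the processing is both narrower in scope than systematic filtering and not solely automated.

```python
from dataclasses import dataclass

@dataclass
class Notice:
    content_id: str   # the specific item complained about
    complainant: str  # who sent the notice
    ground: str       # e.g. "alleged copyright infringement"

def handle_notice(notice: Notice, content_store: dict) -> str:
    """Notice-and-action sketch: only the notified item (and its uploader)
    is looked at, instead of fingerprinting every upload from every user."""
    item = content_store.get(notice.content_id)
    if item is None:
        return "notice rejected: unknown content"
    # The outcome is decided by a human reviewer, not by the algorithm,
    # so this is not a decision "based solely on automated processing".
    return (f"queued for human review: {notice.content_id} "
            f"uploaded by {item['uploader']} ({notice.ground})")

# Example usage with a hypothetical content store.
store = {"vid-001": {"uploader": "A", "title": "holiday video"}}
print(handle_notice(Notice("vid-001", "rights-holder X",
                           "alleged copyright infringement"), store))
```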

If ground (a) cannot be used either, we are then left with ground (b): the processing is “authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests.”

Could the Commission Communication be considered a Union law of this type? Art. 29 WP does not really expand on this requirement, but wouldn’t it be odd to argue that non-binding guidance, which is unclear as to what exactly it requires from online platforms, should count as a law “which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”?

Dear European Commission, could you enlighten us on this particular point?

Sophie Stalla-Bourdillon