
D7.4: Implications of profiling practices on democracy

Identity, the human person and the legal person

What should Data Protection regulations protect?


Data Protection

Directive 95/46/EC is entitled the directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data. This means that data protection does not aim at protecting data per se, but at protecting individuals. Art. 1 explicitly states that one of the objectives of the directive is the protection of privacy. So, like the other rights and obligations analysed above, data protection regulations aim to protect privacy. And, like some of those other rights and obligations, they aim to protect more than privacy alone. At the same time, the directive aims to facilitate the greatest possible free movement of data. Data protection legislation thus entails at once (a) the constitution of an overall legal competence to collect, store and process personal data and (b) a set of restrictions upon which this general legal competence is conditional. This double instrumentality of data protection legislation is characteristic of the attribution of legal competence under the rule of law: when legal competence is created, it is at the same time restricted. In that sense the attribution of legal competence in the field of data protection is at once a tool of transparency and a tool of opacity. From the perspective of the rule of law it would be problematic to separate tools of transparency from tools of opacity, because this would imply granting an unconditional legal competence.


Whereas privacy is a good to be protected, data protection is not itself the object or good that is being protected, but the tool used to protect a number of objects or goods (for instance autonomy, security, privacy and equality) in relation to the recording and processing of (personal) data. While data protection is both a tool of transparency and a tool of opacity, its default position allows the collection, storage and processing of data (thus making citizens and their relations transparent to the data user, for example a service provider). This default position of allowing access to information is conditional: a set of ‘fair information practice’ principles has to be applied, one of which is transparency of the collection, storage and use of data towards the data subject. But data protection also involves tools of opacity, as it prohibits the collection or use of certain data and/or collection or use in certain circumstances. It is thus a tool of transparency in that it allows data collectors and users to make data subjects transparent, on the condition that the process of collection, storage and processing is itself made transparent; it is a tool of opacity in that it prohibits data collectors and users from making data subjects transparent.


Data protection dates back to the 1970s, when the first attempts were made to regulate the collection, storage, exchange and use of personal data. Today’s data protection legislation is generally based on a set of principles first developed in the 1974 US Privacy Act, later expressed in the (non-binding) OECD Guidelines of 1980, CoE Convention 108 of 1981, numerous national statutes on data protection and Directive 95/46/EC itself. These principles can be summarised as:


  1. the collection limitation principle, stating that collection of personal data should not be unlimited;  

  2. the data quality principle, stating that personal data should be correct, complete and up-to-date;  

  3. the purpose specification principle, stating that the purpose for which personal data are collected must be specified, and that they may only be used for that purpose;  

  4. the use limitation principle, stating that disclosure or use for other purposes is only allowed subject to the consent of the data subject or on the basis of the authority of the law;  

  5. the transparency principle, stating that the data subject should be able to know about the collection and storage of personal data, its purpose and the identity of the data controller;  

  6. the individual participation principle, stating that a data subject has the right to erase, rectify, complete or amend her data; and finally  

  7. the accountability principle, stating that the data controller should be accountable for complying with these principles. 
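Purely as an illustration, the interplay of these principles can be sketched as a set of checks a data controller might run before processing; the class and field names below are invented for this sketch and are not drawn from the directive or the OECD Guidelines:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRequest:
    """Hypothetical model of a request to process personal data."""
    purpose: str                          # purpose for which processing is requested
    declared_purposes: set = field(default_factory=set)  # purpose specification
    data_is_accurate: bool = True         # data quality principle
    subject_informed: bool = False        # transparency principle
    subject_consented: bool = False       # use limitation: consent
    legal_basis: bool = False             # use limitation: authority of the law

def may_process(req: ProcessingRequest) -> bool:
    """Allow processing only if the principles sketched above are satisfied."""
    if req.purpose not in req.declared_purposes:
        # Purpose specification: data may only be used for the purpose
        # for which they were collected; use limitation: other purposes
        # require consent or a legal basis.
        if not (req.subject_consented or req.legal_basis):
            return False
    if not req.data_is_accurate:
        return False  # data quality: correct, complete, up-to-date
    if not req.subject_informed:
        return False  # transparency towards the data subject
    return True
```

The default position mirrors the text: processing for the declared purpose is allowed, but only conditionally; anything outside it falls back to consent or law.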


Now, first of all, for the European domain it is important to stress that D95/46/EC is only applicable in relation to community law. This means that the processing of data by the Member States in the areas of criminal law or public security does not fall within the scope of the directive (as art. 3, paragraph 2 states explicitly), as these areas are not – yet – part of community law. This raises many questions from the perspective of the rule of law. Montesquieu stressed time and again that the way criminal procedure is organised determines to a large extent whether one lives in a free society. If the protection of personal data in the directive is based on a default position of access, conditioned by a coherent set of restrictions, the exclusion of data processing in the area of criminal law makes one wonder which restrictions do apply in the sphere of criminal procedure.


Second, the directive only concerns personal data, that is, data relating to an identified or identifiable person. It should by now be obvious that in the case of profiling this limitation to personal data seriously hinders adequate protection of individuals with regard to the processing of data that are not – yet – considered personal data. Individuals need protection regarding the knowledge that is constructed through profiling techniques on the basis of their own and other data, whether personal or not. The transparency aimed at by data protection regimes is of the utmost importance in the case of this new type of knowledge.


Third, the consent of the data subject is taken quite seriously in the directive. Art. 2(h) states: ‘“the data subject’s consent” shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed’. In the reality of constant data exchange, however, such consent is utterly improbable, because (1) the data subject is, most of the time, not aware of data being recorded, stored and processed, and (2) even where some awareness is present, the number of decisions to be taken would paralyse the data subject; consent would only be feasible via an identity management device (IMD) or digital persona that serves as a proxy.
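The idea of an IMD answering consent requests on the data subject’s behalf can be sketched as follows; the policy format and the interface are invented for illustration and do not correspond to any existing system:

```python
# Minimal sketch of an identity-management proxy that answers consent
# requests on the data subject's behalf, based on a pre-set policy.
# All names here are hypothetical.

class ConsentProxy:
    def __init__(self, policy):
        # policy maps (data_category, purpose) -> True/False,
        # expressing the subject's standing preferences
        self.policy = policy

    def decide(self, data_category, purpose):
        # Default to refusal: consent must be specific and informed,
        # so any request not covered by the policy is denied.
        return self.policy.get((data_category, purpose), False)

proxy = ConsentProxy({
    ("email", "service_delivery"): True,
    ("location", "advertising"): False,
})
```

The deny-by-default rule reflects the directive’s demand that consent be a specific and informed indication of wishes: silence in the policy is never read as agreement.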


Fourth, the directive grants higher protection to a set of personal data usually referred to as sensitive personal data. Art. 8(1) states: ‘Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life’. The idea behind such special protection is that in specific circumstances knowledge of such data can give rise to unjustified discrimination, obstructing both democracy and the rule of law. However, profiles can be constructed out of sets of insignificant data (not of a sensitive nature and perhaps not even personal) and still yield a type of knowledge that can be used to discriminate between citizens, customers, clients, employees, patients etc. Contrary to the suggestions of Gutwirth and De Hert, I think that data protection legislation is built on old ways of thinking about data, personal data and their possible abuse, without an eye for the new type of knowledge that is generated by data processing. As a result, I argue that data protection legislation might be adequate if we were dealing only with data. However, as we are dealing with patterns of correlated data, data protection regulations appear inadequate: first, because profiles can be constructed out of anonymised data to which data protection regulations do not apply; second, because group profiles do not necessarily apply to identifiable persons but may, even so, affect the autonomy, privacy, security and equality of European citizens.
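A toy sketch can make the point about insignificant data concrete: none of the attributes below is sensitive on its own, yet a group profile correlating them can still sort people into a group that is treated differently, without ever identifying anyone. The records and the profiling rule are entirely invented:

```python
# Toy illustration: individually insignificant, non-sensitive attributes
# combined into a group profile. No record identifies a person, yet
# membership in the profiled group may affect how someone is treated.
# All data are invented.

records = [
    {"postcode": "1011", "shops_online": True,  "night_owl": True},
    {"postcode": "1011", "shops_online": True,  "night_owl": False},
    {"postcode": "9999", "shops_online": False, "night_owl": True},
]

def in_target_group(record):
    # A group profile mined from trivial data: the rule needs no
    # identity, no sensitive attribute, and no personal data at all.
    return record["postcode"] == "1011" and record["shops_online"]

targeted = [r for r in records if in_target_group(r)]
```

Because no record is linked to an identifiable person, the directive’s safeguards need not apply; yet anyone matching the profile could be priced, screened or excluded accordingly.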



fidis-wp7-del7.4.implication_profiling_practices_03.sxw
Denis Royer