D7.4: Implications of profiling practices on democracy


The existing legal framework of privacy and data protection

Today, privacy protection and data protection co-exist and overlap. Under general legal provisions such as Article 8 of the ECHR, the protection of privacy spans a wide and very diverse range of issues. Nevertheless, legislators have enacted specific legislation to deal with a number of particular issues falling within the scope of these privacy provisions, such as telecommunication tapping, the use of surveillance cameras, the use of DNA, ID-cards and, last but not least, data protection. In these cases, the specific legislation provides more elaborate and specific rules and protections. This does not, however, imply that these issues fall outside the scope of constitutional privacy provisions such as Article 8 of the ECHR: specific legislation must be compatible with these provisions and, if not, can be contested on that basis.

 

Data protection is a strong example of such specific legislation. Admittedly, the organs of the ECHR have on several occasions recalled that data protection can fall within the scope of Article 8 of the ECHR. But they have also held that not all aspects of the processing of personal data are protected by the Convention, because they considered that not all personal data are worthy of privacy protection. The Court thus draws a distinction between privacy-sensitive and privacy-insensitive personal data, a distinction which is not made in data protection law. Moreover, the case-by-case approach of the Strasbourg Court to data protection issues could never have produced a consistent body of rules capable of dealing satisfactorily with the numerous problems spawned by the large-scale automatic processing of personal data. Privacy case law could not have developed into something comparable to the elaborate list of rights and duties created by data protection law (which, incidentally, nicely illustrates our point about the difference between privacy and data protection).

Of course, the foregoing is one of the reasons why the body of data protection law, with its many statutory expressions, has been developing since the late 1970s. Compared to privacy protection (as in Article 8 of the ECHR), data protection brings about an essential shift. Its application no longer depends on the complex and uncertain question of which privacy rights might be at stake in an individual case; it simply applies whenever personal data are processed. The complex and subjective question 'is this a privacy issue?' is thus replaced by the more neutral and objective question 'are personal data being processed?'.

 

Ergo: when ‘personal data’ are ‘processed’, data protection applies. It is interesting to consider this rule in the light of the very broad definitions of the legal terms ‘personal data’ and ‘processing’ in the Data Protection Directive. Personal data are ‘any information relating to an identified or identifiable natural person.’ There are no limits as to content or technology: phone numbers, license plates, social security numbers, images, voices, genetic information and fingerprints are explicitly mentioned as personal data. A person is identifiable as soon as identification is possible by means that the responsible data controller may reasonably be assumed to use. Processing includes both automated and manual processing. The Directive covers manual processing only if it constitutes a filing system with a minimal structure; this condition does not apply to automated processing, because programs have the capacity to merge and intertwine masses of loose data within seconds. The term ‘processing’ must likewise be given a comprehensive interpretation: the Directive imposes upon the Member States a definition covering any operation performed upon personal data. The rules cover every stage of processing: ‘collection, recording, organisation, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.’ It is worth highlighting that even the mere collection of personal data is covered by the Directive.

 

The conclusion is easily drawn: data protection regulations apply in principle to the automated collection and processing of personal data. This implies that a number of rules are applicable from the default transparency perspective and that the processing of data must satisfy the prescribed conditions. However, as we argued above, some data protection rules are proscriptive opacity rules: sensitive data may not be processed at all, unless exceptions apply. This could, for example, entail that video surveillance falls under this regime, because images of a person may contain racial, ethnic, health-related or religious information. Another opacity rule is that the secret collection and processing of personal data is in principle forbidden: the processing must be open.

 

 
