
D7.4: Implications of profiling practices on democracy

Reply by Martin Meints (ICPP)
Title: Examples and scenarios of the application of current data protection legislation


Profiling and data protection law

The data protection principles summarised in section  characterise requirements which are set not only by the OECD guidelines and several collections of Fair Information Practices, but also by the European Directive 95/46/EC. This means that they have been enacted into national law by most of the member states of the European Union and thereby have a large area of application. I agree with the view of S. Gutwirth and P. de Hert that these principles apply to profiling ‘at least in principle’ as far as personal data is concerned. Obviously there are limitations to the application of these principles with respect to profiling technologies. They are described by M. Hildebrandt and will be further analysed in this reply, with particular reference to their reasons and scope.

Within the Information Society, the management of information is the central issue. Profiling is a core technology which operates by distilling usable information from large amounts of structured raw data. From my practical perspective, the answers to the following questions are vital to categorise profiling and to understand the described limitations of data protection legislation and its implementation in the EU: 

  1. What source of data is being used?  

    1. Are the data personal or not? 

    2. If personal, is the collection and processing allowed by law or by effective consent of the data subject given? 

  2. Who is using the technology, is it a public body or a private organisation? 

  3. Is there a defined purpose for the collection and processing of data? Which purpose? 

  4. What kind of profiling is being used: personal or group profiling? 

    1. What type of profiling method has been used? 

    2. If personal profiling:  

      1. Are the data subjects informed prior to the profiling? 

      2. Do they get access to their own personal data? 

      3. Can they obtain knowledge of the logic involved in the automatic processing of data concerning the data subject? 

      4. Can the profiling lead to automated decisions?

    3. If group profiling: is it distributive or non-distributive? 

  5. Is the profiling directed towards a hypothesis (if so, which one)? 

  6. Who is using the results, and for what purpose? Does this match the answers to the questions on purpose asked above? 

In the field of profiling, the following limitations with respect to traditional data protection law can be identified:

  • Limitation: quality and reliability of data
    A typical problem with profiling also occurs with other technologies, e.g. biometrics: probabilistic and sometimes fuzzy results of a complicated analysis carried out by an expert (or, in the case of biometrics, by a system) are often used to make binary yes/no decisions. In the profiling area, those decisions are often made by someone who is not familiar with the way the data was collected and processed, and who therefore has no feeling for how reliable the result of the profiling might be. This is especially true when methods are used whose results are neither predictable nor revisable, such as neural networks. As with False Acceptance and False Rejection Rates in biometrics, this will inevitably result in some wrong decisions based on profiling techniques.     
    Not only is the accuracy of the data itself questionable in many cases, but so is the accuracy of its linkage to a specific individual in the case of personal profiling. This can be a severe problem if a personal profile is created on the basis of data which is only assumed to belong to that person but in fact belongs to others: the profile assigned to the person is then mixed with, or even entirely built from, other individuals’ data. Decisions concerning that person which are influenced by or based on such a profile are, of course, equally questionable. Data processing entities should therefore ensure the accuracy of data and document possible risks of decreased accuracy. Note that individuals who do not wish to be profiled may try to render the profiles useless by lowering their quality through disinformation. This limitation is obviously a problem for the data quality principle (No. 2) stated in section .

  • Limitation: the problem of ensuring use limitation according to the stated purpose.

  • Limitation: information and participation of users (data subjects, from the perspective of profiling) are not common, nor is adherence to the individual participation principle (No. 6). Information about the profiling method is often considered a trade secret and is therefore not disseminated to users. Furthermore, some companies deny that the information is personally identifiable and refrain from informing users, yet nevertheless use the profiles for decisions on individuals which may affect the users’ privacy in any event.
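The first limitation above, the collapsing of a probabilistic profiling result into a binary yes/no decision, can be illustrated with a minimal sketch. The threshold value and function name are illustrative assumptions, not taken from any real profiling system:

```python
# Minimal illustrative sketch: a profiling system produces a probability
# score, but the decision maker only sees a collapsed yes/no outcome.
# The threshold of 0.8 is an arbitrary assumption for illustration.

THRESHOLD = 0.8

def binary_decision(score: float, threshold: float = THRESHOLD) -> bool:
    """Collapse a probabilistic profiling score into a yes/no decision."""
    return score >= threshold

# A score of 0.79 and a score of 0.01 both become "no"; the decision
# maker loses any feeling for how close the case was to the cut-off.
for score in (0.79, 0.81, 0.99):
    print(score, binary_decision(score))
```

Analogous to False Acceptance and False Rejection Rates in biometrics, cases just below or above the threshold will inevitably include wrong decisions, and the binary outcome hides how uncertain each case was.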


Moreover, anyone who deals with data mining should check whether the use of so-called privacy-preserving data mining techniques is appropriate: these techniques aim to minimise the use of personal data while preserving the quality of the results.
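As a hedged sketch of one such technique, the following example generalises quasi-identifying attributes (exact age to an age band, postal code to a prefix) before mining, in the spirit of k-anonymity. The field names, sample records and the value of k are illustrative assumptions:

```python
# Hypothetical sketch of one privacy-preserving data mining idea:
# generalise quasi-identifiers before mining, so that individual
# records are no longer uniquely linkable to a person.
from collections import Counter

# Illustrative sample records; field names are assumptions.
records = [
    {"age": 34, "zip": "24103", "diagnosis": "A"},
    {"age": 36, "zip": "24105", "diagnosis": "B"},
    {"age": 35, "zip": "24107", "diagnosis": "A"},
]

def generalise(rec):
    """Replace exact values with coarser classes (suppress detail)."""
    decade = rec["age"] // 10 * 10
    return {
        "age_band": f"{decade}-{decade + 9}",
        "zip_prefix": rec["zip"][:3] + "**",
        "diagnosis": rec["diagnosis"],  # the attribute actually mined
    }

anonymised = [generalise(r) for r in records]

# A simple k-anonymity condition on the quasi-identifiers: every
# (age_band, zip_prefix) combination must occur at least k times.
k = 3
groups = Counter((r["age_band"], r["zip_prefix"]) for r in anonymised)
assert all(count >= k for count in groups.values())
```

The mining step then operates only on the generalised records; the trade-off is that coarser classes reduce linkability at the cost of some analytical precision.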

In the following section we look at some examples and scenarios in which the application of data protection legislation is discussed.  


