Risk profiling

Risk profiling – acting proactively on information inferred from aggregative data

After our exploration of the interconnection of police databases in section , we now turn to another new development in forensic profiling: so-called risk profiling. At least two aspects of forensic risk profiling make it profoundly different from other methods of criminal investigation, i.e. from classical forms of data matching – even very advanced data matching such as searching several interconnected databases for a certain shoe print. Two central characteristics of forensic risk profiling are that (a) it involves hypothetical information derived from aggregative data and (b) it has a pro-active character. Two other issues that are sometimes at stake in forensic risk profiling are (c) automated decision making and (d) the opacity of the reasoning involved.

 

(a) Hypothetical constructions derived from aggregative data 

The first aspect which makes forensic risk profiling stand out against more classical methods of investigation is that it does not limit itself to uniquely individual information (e.g. the fingerprint of one particular individual) but makes use of statistical information derived from huge databases (e.g. the profile of the average terrorist, inferred from a certain pattern of correlations). Of course, forensic risk profiling might still aim at the identification of one unique individual, but the data (that is, the patterns of data) to which a particular individual is compared are not a unique marker (e.g. his fingerprint) but a hypothetical construction derived from information about various people (i.e., a database). The fact that forensic risk profiling uses a construction derived from the data gathered from more than one individual makes it likely that the data mining algorithm will become wrapped up in opaqueness: not only could disclosure of the algorithm lead to an infringement of the privacy of those people whose personal data were used in its construction, but disclosure could also make the algorithm valueless (screening passengers for certain criteria might not be so useful if everybody knows exactly which characteristics are sought), or it could possibly breach the intellectual property rights of the body which constructed the algorithm – sometimes at high cost. A forensic profile might sometimes turn out to be as secret as a ‘secret recipe’, surrounded by technologically inspired terminology like “data mining”, “data harvesting”, “crunching raw data”, “data processing” and “DNA banking”.
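
Purely as an illustration of the contrast drawn above, the following minimal sketch (in Python, with invented attributes and weights that are not taken from any real system) contrasts matching against one individual’s unique marker with scoring an individual against a pattern inferred from the data of many people.

# Classical matching: the comparison is with a unique, individual marker.
def matches_unique_marker(trace: str, suspect_marker: str) -> bool:
    return trace == suspect_marker

# Risk profiling: the "profile" is an aggregate construction, e.g. weights derived
# from correlations in a large database of many individuals (values invented here).
aggregate_profile = {
    "paid_cash": 0.40,
    "one_way_ticket": 0.35,
    "booked_last_minute": 0.25,
}

def risk_score(person: dict) -> float:
    """Sum the weights of the profile attributes this person exhibits."""
    return sum(weight for attr, weight in aggregate_profile.items() if person.get(attr))

passenger = {"paid_cash": True, "one_way_ticket": True, "booked_last_minute": False}
print(matches_unique_marker("print-4711", "print-4711"))  # True: unique, individual evidence
print(risk_score(passenger))                              # 0.75: hypothetical, aggregate-derived score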

 

(b) The pro-active character of forensic risk profiling – keeping it outside the investigative process 

A second peculiarity of forensic risk profiling is that it can be used in a pro-active and hypothetical way. Instead of looking for an individual matching the traces left at a crime scene, forensic data mining can be used to prevent a crime (e.g. not admitting a potential terrorist to an airplane) or to raise a hypothesis about the characteristics to look for (e.g. “the robber is likely to drive a red car, because a significant majority of the robbers in our database did”). This pro-active or hypothetical character of forensic risk profiling dissociates it from the investigative process directed at a potential trial. The profile used to detect high-risk airplane passengers is not meant to be used as evidence in a criminal trial, but to prevent the high-risk passenger from entering the plane without further screening. The passenger who is told that he cannot board will often not even be aware that he was subjected to forensic profiling and will simply assume that he apparently looked suspicious. This means that forensic risk profiling will often stay outside, and far from, the scrutiny and transparency of a trial, because it is not so much part of a criminal process as it is a hypothetical result in itself: “Data matching and data mining give no process as the law understands that term. There is no notice, no opportunity to be heard, no confrontation with evidence, no reason given – only a result. Under any theory of due process, decisions based solely and irrevocably on the results of data matching or data mining are deficient, where they affect substantial interests”.
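
The hypothetical, pro-active use described above can be sketched in a few lines of toy code (entirely invented records, not drawn from any real database): the output is an investigative lead derived from the frequency of an attribute among past cases, not evidence tied to a specific crime scene.

from collections import Counter

past_robberies = [               # hypothetical historical records
    {"car_colour": "red"},
    {"car_colour": "red"},
    {"car_colour": "blue"},
    {"car_colour": "red"},
]

colour_counts = Counter(case["car_colour"] for case in past_robberies)
most_common_colour, count = colour_counts.most_common(1)[0]
share = count / len(past_robberies)

# The output is a hypothesis about whom to look for, not evidence about any individual,
# and it never enters a trial.
print(f"Hypothesis: the offender likely drives a {most_common_colour} car "
      f"({share:.0%} of past cases)")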

 

(c) Automated individual decisions 

Sometimes risk profiling is also embedded in an automated decision system. A notable example is the surveillance of flight passengers, where “the profile constitutes the basis for decisions on fly/no-fly, arrest, detain for questioning and so on”. The problem of decisions with substantial effects on individuals being based on automated software was already discussed during the drafting of the Data Protection Directive. The issue has been revived by the use of profiling techniques for law enforcement purposes (i.e., third pillar activities).

The Framework Decision on data protection in the third pillar of the EU (i.e., concerning Police and Judicial Co-operation in Criminal Matters) dedicates an article to automated individual decisions taken by law enforcement authorities. Article 8 of the decision stipulates that a decision which produces an adverse legal effect for the data subject or seriously affects him, and which is based solely on automated data processing for the purposes of assessing individual aspects of the data subject, shall be permitted only when the legitimate interests of the data subject are safeguarded by law.

This article echoes article 15 of Data Protection Directive 95/46/EC (adopted within the first or ‘Community’ pillar), which had tried to tackle the issue of increased automation in the decision-making process, mainly with regard to organizational decisions. Article 15 aims at protecting “the interest of the data subject in participating in the making of decisions which are of importance to him.” The use of extensive data profiles of individuals by powerful public and private institutions risks “depriv[ing] the individual of the capacity to influence decision-making processes within those institutions, should decisions be taken on the sole basis of his “data shadows””. The problem of the lack of transparency (see above) was already at the centre of the debate.

A second fear expressed in the debates surrounding Data Protection Directive 95/46/EC was that of an “automatic acceptance of the validity of the decisions reached and a concomitant reduction in the investigatory and decisional responsibilities of humans”. According to the European Commission, “the result produced by the machine, using more and more sophisticated software, and even expert systems, has an apparently objective and incontrovertible character to which a human decision-maker may attach too much weight, thus abdicating his own responsibilities” (COM(90)314 final - SYN 287, 13 September 1990, p. 29). As highlighted by Bygrave, “the increasing automatisation of decision-making processes engenders automatic acceptance of the validity of the decisions reached and a concomitant reduction in the investigatory and decisional responsibilities of humans. Thus, there is an implicit assumption that the reasoning linking the premises and conclusions for these predictive judgments will be grounded in reality.”

 

(d) Opacity of the reasoning 

A related issue which is sometimes at stake in forensic risk profiling is the opacity of the reasoning process. The problem arises with regard to the opacity of the algorithms used, which are not necessarily connected to truisms about human behaviour. Furthermore, as stressed by Steinbock, contrary to human judgement, “computer analysis has no way to evaluate the probable accuracy of the data on which it relies”. Finally, computer reasoning is also more difficult to evaluate than human assessment. Whereas the wording of article 15 of the Data Protection Directive enables the data subject to be informed of the logic underlying the processing, article 8 of the Framework Decision does not even refer to this difficult issue. Once again, the needs of law enforcement activities will have to be balanced against individuals’ rights, and alternative solutions may have to be found when such information must be kept secret. Prior safeguards, such as a strict assessment of the conditions required for the legitimacy of such processing, may be needed. The intervention of independent authorities may be required as well.
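
The opacity problem can again be sketched with a small, entirely hypothetical example (invented weights and threshold, not modelled on any real screening system): the person affected receives only a result, while the logic linking the input data to the decision – and the accuracy of those data – never leaves the system.

_HIDDEN_WEIGHTS = {"born_1979": 0.3, "paid_cash": 0.5, "one_way_ticket": 0.2}  # not disclosed
_THRESHOLD = 0.6                                                               # not disclosed

def screen(passenger: dict) -> str:
    score = sum(w for attr, w in _HIDDEN_WEIGHTS.items() if passenger.get(attr))
    # Only the outcome leaves the system; no reasons, no confrontation with the evidence.
    return "secondary screening" if score >= _THRESHOLD else "cleared"

print(screen({"born_1979": True, "paid_cash": True}))  # "secondary screening" - a result, not a process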

 

Keeping Risk Profiling fair: Due Process?

Due Process: a transparent trial giving the individual a fair opportunity for defence. 

The fears about the present-day avalanche of data technologies are mainly framed in terms of loss of privacy. Terms like “Big Brother” and “Surveillance Society” abound and sometimes even risk becoming worn-out mantras. Yet, the consequences of the collection, storage and processing of personal data are not limited to a simple dissolution of the respect for private and family life (as described in, e.g., art. 8 of the European Convention on Human Rights), nor do they necessarily imply a loss of privacy at all.

What is sometimes overlooked in debates on data technologies which focus solely on their intrusiveness into the private sphere is the information asymmetry they create. Even if a citizen does not perceive the fact that the government knows his year of birth as an intrusion into his private sphere, the use of this piece of information in a profiling practice of whose existence he is not aware, and which would not be accessible to him even if he were aware of it (e.g. a classified algorithm predicting the likelihood that somebody who was born in 1979 is a terrorist), puts him in a situation of information asymmetry. Such information asymmetry can undermine some of the principles on which constitutional democracies are built – i.e., undermine the empowerment of the individual citizen against the force of the State. Especially within the domain of criminal investigation and adjudication, one of the great achievements of Western constitutional thought is to have come up with safeguards which give the individual a fair chance to contest the allegations made by a State which tends to have more resources than the individual and therefore to be more powerful.

It is especially the notion of due process which is at the core of this empowerment of the individual citizen towards the State (in particular expressed by the idea of equality of arms): it is seen as the adequate shield against overly intrusive State power. In Europe, due process is identified with the right to a fair trial as described in art. 6 ECHR (which encompasses, e.g., the right to a public hearing before an independent and impartial tribunal within a reasonable time, the right to adversarial proceedings and the presumption of innocence). In the US, the principle of ‘due process’ is mainly to be found in two provisions of the Bill of Rights, i.e. in the 5th and 14th amendments to the US Constitution and, in the context of criminal investigation or prosecution, the 4th amendment (the right to be secure against unreasonable searches and seizures). Even though the words in which the notion of due process is framed differ, both in Europe and in the US due process is a right which should prevent citizens from ending up in Kafkaesque, opaque situations – where a suspect is not told what the accusation is, where he has no legal assistance or any other means of protecting himself, where the rules of the process are unclear, where the process is held in a language he does not understand, where the judges are bribed or seem to act completely irrationally and arbitrarily, etc. Central to due process is thus the idea of a criminal process culminating in a trial where all the cards are put on the table – a transparent situation – in order to give the individual citizen a fair opportunity to refute the allegations made against him.

 

Why Due Process is at odds with Risk Profiling 

The pivotal ideas of due process (an investigative process aiming at the production of evidence at a trial, in a transparent way, where the individual can have his say about it) are endangered when forensic risk technologies are applied which do not aim at a trial, at transparency, or at addressing one specific individual, and which provide individuals neither notice nor an opportunity to be heard. Thus, with respect to forensic risk profiling technologies, due process might no longer turn out to be sufficiently protective, because such technologies can be considered as either jeopardising due process or simply falling outside the reach of the right to due process.

However, it is quite difficult to realise that the venerable notion of due process might fall short in the case of some of the modern forensic data technologies.  

This can be illustrated by looking at one of the most well-known applications of forensic risk profiling: the use of Passenger Name Record (PNR) data in combination with risk assessment profiles for the surveillance of flight passengers. As highlighted by the European Data Protection Supervisor, “suspected persons could be selected according to concrete elements of suspicion included in their PNR data as well as on the basis of “patterns” or an abstract profile. The main concern of the EDPS relates to the fact that decisions on individuals will be taken on the basis of patterns and criteria established using the data of passengers in general. Thus decisions on one individual might be taken, using as a reference (at least partially), patterns derived from the data of other individuals. It is thus in relation to an abstract context that decisions will be taken, which can greatly affect data subjects. It is extremely difficult for individuals to defend themselves against such decisions” (Opinion on the draft Proposal for a Council Framework Decision, EDPS, 20 December 2007). In order to make this technology – which is highly intrusive into individuals’ privacy – legitimate, the processing of PNR data should be in accordance with the criteria developed by the jurisprudence of the European Court of Human Rights. To comply with the general data protection principles, the processing should be transparent to the data subject and adequate legal remedies should be implemented to protect the data subject against arbitrary decisions.

Thus, in this vein, the first pillar Data Protection Directive foresees several mechanisms to ensure transparent processing. These are intended to empower citizens and give them the possibility to control the processing carried out on their data, first through an obligation of prior information about the processing, and further by granting them rights of access, rectification and deletion. In particular, when the data subject exercises his right of access with regard to automated decisions, the controller of the processing should inform him about the logic of the processing, in addition to the source of the data and the persons to whom they have been communicated.

However, transposing this first pillar logic into the third pillar by simply demanding that forensic risk profiling be made transparent would run counter to a large part of law enforcement practice. Risk assessment of flight passengers requires a certain opacity – and the same goes for many third pillar activities. Transparency could put at stake the privacy of the individuals whose data are in a database, create the possibility that ‘contestation’ by an informed data subject turns out to be distortion in disguise, and risk making an algorithm valueless, since profiling techniques are frequently applied for investigation and prevention (e.g. surveillance, investigative stops and frisks, searches) which can (or even must) be carried out without informing the profiled person. So what about due process then? The present situation is that forensic risk profiling is a technique which is clearly outside the reach of the classical notion of due process, as well as outside the reach of any other form of democratic and constitutional control.

Is it possible to create alternative safeguards which protect and empower the individual citizen without the need to dissolve the opaque and aggregative nature of forensic profiling? In order to answer that question we need to take a closer look at the existing legislation and especially at the proposed Framework Decisions.

 

Critical analysis of the existing safeguards with regard to risk profiling

As was shown above, a major issue arising from the use of profiling techniques resides in the (lack of) transparency of the processing. The European Data Protection Supervisor recalls the jurisprudence of the European Court of Human Rights, according to which domestic law must be sufficiently precise to indicate to citizens in what circumstances and on what terms the public authorities are empowered to file information on their private life and make use of it. The information “should be accessible to the person concerned and foreseeable as to its effects, a rule being “foreseeable” if it is formulated with sufficient precision to enable any individual – if need be with appropriate advice – to regulate his conduct”.

However, the Framework Decision on data protection – which is supposed to clarify further when transparency is required – has met with a considerable amount of criticism since it was first put forward by the Commission on 4 October 2005. The European Data Protection Supervisor published two critical Opinions on the proposal (First Opinion of the EDPS on the proposed Framework Decision; Second Opinion of the EDPS on the proposed Framework Decision) in which he voiced his concern that (Third Opinion of the EDPS on the proposed Framework Decision):

“…developments in the negotiations were leading towards a level of protection of personal data not only below the standards laid down in Directive 95/46/EC, but also incompatible with the more generally formulated Council of Europe Convention No 108”. 

After the proposed Council Framework Decision was revised by the German Presidency of the Council of the European Union on 13 March 2007 (Council document 7315/07 of 13 March 2007), the European Data Protection Supervisor published a third Opinion on 23 June 2007, wherein he noted his appreciation for the German attempt but also stated that he was disappointed about the content, which had become, according to him:

“…a lowest common denominator approach that would hinder the fundamental rights of EU citizens as well as hamper the efficiency of law enforcement” (Third Opinion of the EDPS on the proposed Framework Decision) 

 

Part of the reason why the proposal of 13 March 2007 failed to fulfil expectations was probably the decision-making procedure in the Council, which requires unanimity and leads to this “lowest common denominator”. Another likely reason is the political climate (“war on terrorism”), which puts more stress on crime control than on due process.

However, another cause may have been that the drafters of the first Framework Decision draft of 4 October 2005 seem to have overlooked the fact that classical legal protection such as that offered by the idea of due process (which presumes an investigative process aiming at the production of evidence at a trial, in a transparent way, where the individual can have his say about it) will offer no protection in the case of forensic risk profiling. Although due process is a right to be cherished in many a context, it seems to be inadequate in protecting the individual citizen against the information asymmetries arising from forensic risk profiling technologies.

The extent to which the proposed Framework Decision assumes transparency and the possibility for the individual to stand up for his rights varies slightly between the different drafts (first draft of 4 October 2005; revised draft of 13 March 2007; revised second draft of 23 October 2007; latest draft of 11 December 2007).

The paradox contained within the text of the first draft is that on the one hand it acknowledges the existence of profiling technologies of which the data subject is unaware and which are opaque to him (e.g. art. 20 (2), “Right of information where the data have not been obtained from the data subject or have been obtained from him without his knowledge”), but on the other hand it still seems to assume an informed data subject standing up for his rights. But how can a data subject who does not know that he was profiled know whether his rights were infringed? And even if the data subject is aware of the fact that he was subjected to a forensic risk profiling practice – what is the use of knowing that your data were processed if you do not know how? Who is going to determine whether the grounds for an exception to the right of information (art. 19 (2) and art. 20 (2)) are present if nobody who is affected by the profiling practice is aware of the existence of the practice? Who is going to claim the right to be informed about data collection and processing if there is no awareness of its existence?

In the revised German draft and in the latest draft the paradoxical articles 19 and 20 have disappeared: instead of solving the underlying paradox, the new texts apparently try to avoid it by reducing the data subject’s right to information. Article 16 (information for the data subject) of the latest draft of the proposed Framework Decision reads:

 

Article 16: Information for the data subject 

1. Member States shall ensure that the data subject is informed regarding the collection or processing of personal data by their competent authorities, in accordance with national law. 

2. When personal data have been transmitted or made available between Member States, each Member State may, in accordance with the provisions of its national law referred to in paragraph 1, ask that the other Member State does not inform the data subject. In such case the latter Member State shall not inform the data subject without the prior consent of the other Member State. 

 

In article 17 of the same draft of December 2007 we find that the data subject has a right of access ‘on request’ – but how can one make a request about something one is not aware of? If the responsible authority considers that there are legitimate grounds to deny the data subject access to the data which were processed, the data subject shall be advised – according to art. 17 (3) – “that he may appeal to the competent national supervisory authority, a judicial authority or to a court”. Thus, the legality of a profiling practice will only be considered by a competent authority or court if the data subject – who will normally be unaware of the existence of the practice – makes a request. Viewed from this perspective, article 20 of the latest draft (“…, the data subject must have the right to seek judicial remedy for any breach of the rights guaranteed to him by the applicable national law”) risks becoming a right that is difficult to exercise in practice.

 

 
