
D3.6: Study on ID Documents


Critical Observations

This part contains some critical observations. They relate, first, to the choice as such of biometrics as a unique identification and verification tool. Next, we discuss the underlying legal framework in which the Regulations (including the ICAO standards) came into existence. We then apply the principles of proportionality, purpose specification/finality and individual participation to some provisions of the Regulations. Finally, we say a few words on the issue of central storage of the biometrical data.

 

The Choice for Biometrics

 

Concerns about the body as a passport 

The storing of digital photos and fingerprints for visa applications affects data subjects from more than 130 countries in the world. The mandatory taking and storing of fingerprints of asylum seekers and aliens affects hundreds of people every day. The fundamental issue in Regulation 2252/2004 concerns the duty to sample fingerprints of over 450 million people and the possibility, which is not excluded, of central storage of these fingerprints, together with digital photographs and other data.

Article 29 Data Protection Working Party - Europe’s data protection observatory - has published a critical opinion on EU Regulation 2252/2004. It had already issued an opinion on the use of biometrics and an opinion on the inclusion of biometrics in residence permits and visas (VIS).

The expectations attached to biometrics are probably overestimated. Indeed, biometrics could ‘cause us to place too much trust in the effectiveness of electronic solutions’. Biometrics is based on probabilities: false positives and false negatives are unavoidable. PIN codes, by contrast, are not based on probabilities (Van Kralingen et al. 1997, 14). If only one percent of a targeted group of 100,000 people a day suffers from a false negative, 1,000 people would be ‘automatically’ (but wrongfully) stopped every day. The fact that biometrical data, even fingerprints, change over time confirms this risk (Wayman 2006, 14).
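
The arithmetic behind this example can be made explicit in a few lines. The following is a minimal sketch using the illustrative figures from the paragraph above; the rates are not measured error rates of any deployed system.

    # Expected number of genuine travellers wrongly stopped per day,
    # given a false negative (false rejection) rate. The figures are
    # the illustrative ones from the text, not measured error rates.
    travellers_per_day = 100_000
    false_negative_rate = 0.01  # 1% of genuine travellers rejected

    wrongly_stopped = travellers_per_day * false_negative_rate
    print(f"Expected wrongful stops per day: {wrongly_stopped:.0f}")  # -> 1000

At realistic border-crossing volumes, even small error rates thus translate into large absolute numbers of wrongfully stopped travellers.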

Furthermore, most of what we know about the reliability, accuracy and efficiency of biometrics is provided by the vendors of biometrics themselves (OECD 2004, 32). The ‘Biometrics summary table’ in the OECD Report of 2004 (see below) shows that biometric technology, as it stands, is not perfect. Moreover, no biometric technology seems to comply with all data protection principles (data quality, transparency, data security) and enjoy user acceptance at the same time. Whereas the accuracy of fingerprint scanning is rated only ‘possibly’ very high, its user acceptance is medium to low. Whereas the accuracy of facial recognition is medium to high, its stability and transparency are low.

 


Table: Biometrics summary table (OECD 2004) [table not reproduced]

The table ‘Candidates and preferred technologies’ from the same Report confirms this.

Table: Candidates and preferred technologies (OECD 2004) [table not reproduced]

Biometrics do not exclude identity theft or forgery. Although biometrics prevent so-called ‘identity substitution’ to a certain degree, they cannot prevent the fraudulent issuance of a genuine passport. In addition, the best fakes still seem to be intercepted by inspectors on the basis of, among other things, the holder’s behaviour.

These conclusions mean that the ‘necessity’ criterion - which must be satisfied before an interference with an individual’s private sphere is allowed - can still play a role in a later discussion on the legitimacy of the Regulations. Although we do not expect the European Court of Human Rights to decide that one or more databases (or parts thereof) or biometric identification systems as such constitute unnecessary infringements of people’s private life, the tension is clear and should bring policy-makers in the European Member States to greater care.

 

Biometrics, privacy and sensitive data  

Two questions with legal consequences arise: are biometrics sensitive data in the sense of data protection law, and does the obligation to be subjected to biometrical identification conflict with people’s feelings of (bodily) dignity?

The issue whether biometrical data are sensitive data remains important. Article 8 of the Data Protection Directive prohibits, in principle, the processing of sensitive data. The use of biometrics can involve the processing of sensitive data in the sense of Article 8. Biometrical data of disabled people may relate to their medical condition, and correlations could for example be made between papillary patterns and diseases such as leukaemia and breast cancer. Face recognition can reveal racial or ethnic origin. The processing of biometrical data may thus reveal - more or less immediately - sensitive information about an individual. This goes far beyond the purpose for which biometrical identification is supposed to be used.

It seems clear that the taking of fingerprints and photos may involve the processing of sensitive data. It is, however, not clear whether the algorithms and machine-readable templates that contain the information are also to be considered sensitive personal data (Hes et al. 1999, 42).

Possibly, case law (of the European Court of Human Rights) will make a distinction between different biometrics (e.g. non-sensitive fingerprint vs. sensitive facial image).

The taking, measuring and processing of biometrical data may also harm a person’s feeling or experience of dignity. The fact that people feel uncomfortable with close observation (they are obliged to look into a lens, to put their fingers on readers used by other people, etc.) has already been noted as a possible intrusion on dignity (Wayman 2006, 15).

That the taking of facial images touches on this concern may also be derived from the fact that, for example, the Quality Assurance (QA) software used to examine the properties of the applicant’s photo for a passport or travel document can reject a photo and explain why (Friedrich & Seidel 2006, 5). Exceptions to the photo requirements are possible for handicapped citizens and for certain religious reasons, but these exceptions may at the same time confront people with their status as an exception, and force them to reveal their religion.

The use of ‘your’ body as an identification tool for others might likewise infringe what is called our informational privacy. Anton Alterman puts it as follows: “The degree to which the body is objectified by the process, suggest[s] that biometric identification alienates a part of the embodied self. The body becomes an object whose identity is instantly determinable by purely mechanical means, and subject to external controls on that basis; while those means themselves are removed from the control of the subject. The representations are infinitely reproducible by their owner, but are not even accessible to the subject whose body they represent. The embodied person now bears, more or less, a label with a bar code, and is in this respect alienated from her own body as well as from the technology used to recognize it. If having an iris scan on file is not quite like being incarcerated in the world at large, being made known to mechanical systems wherever they may be is still a tangible loss of privacy that is not precisely paralleled by any other kind of information technology.” (Alterman 2003, 146) [Emphasis added]

 

The use of two biometric identifiers 

The introduction of two biometric identifiers in EU passports and travel documents raises much concern, and the impact of this choice is great. Whereas the Council calls for the use of two biometric identifiers, the U.S. and the ICAO require only one, namely a digital photograph. Nowhere in the Regulation is the need for two biometric identifiers argued for as a proportionate measure.

The inclusion of a fingerprint biometric is unprecedented; the U.S., moreover, has no intention of implementing fingerprints in its passports. The Regulation states that these choices are ‘in accordance with the principle of proportionality’ and do ‘not go beyond what is necessary in order to achieve the objectives pursued, in accordance with the third paragraph of Article 5 of the Treaty’. This Article of the EC Treaty contains the proportionality principle of EC law. It illustrates the vagueness of such data protection principles that some members of the Parliament proposed to limit the passports to only one identifier with exactly the same reference to the principle of proportionality. Steve Peers reaches a similar conclusion (Peers 2004, 3).

Article 4.1 of the Council Regulation states that no information in machine-readable form shall be included in a passport or travel document unless foreseen in the Regulation, or unless it is mentioned in the passport or travel document by the issuing Member State. The latter possibility raises concern: allowing Member States to include machine-readable information other than that described in the Regulation itself, subject only to the requirement that it be mentioned on the passport or travel document, opens the door to diverging interpretations of the principle of proportionality.

 

The Underlying Legal Framework (Reconsidered)

Unlike directives, EU Regulations enter into force immediately and integrally in the Member States on the date indicated in the Regulation. EU Regulations achieve the highest level of harmonisation, since Member States cannot deviate from them.

 

The legitimacy of EU Regulation 2252/2004 

The procedure followed by the Council to adopt the Regulations has met strong criticism. It is said to have exploited the democratic deficit of the European Union to an unheard-of extreme.

Firstly, more than seventy civil society organisations from the EU and abroad, nine national or regional Data Protection Commissioners and more than two hundred concerned citizens signed an open letter by Privacy International, Statewatch and European Digital Rights opposing this proposal and the procedural ‘blackmail’ of the Parliament by the Council, which asked the Parliament to vote on an amended Regulation that was only sent to it at the last minute.

A second critique concerns the legal basis for the European Union to act. The idea of common standards for ID cards set out in EU measures is new. Commentators hold that new and separate legislation is required, since the EU does not at present have the power to adopt measures on ID cards, although it would gain such powers if the Constitution were ratified (Peers 2004).

This lack of competence of the EU can be found in Article 18(3) of the Treaty Establishing the European Community. Article 18 (as amended by the Treaty of Nice) states:

 

  1. Every citizen of the Union shall have the right to move and reside freely within the territory of the Member States, subject to the limitations and conditions laid down in this Treaty and by the measures adopted to give it effect. 

  2. If action by the Community should prove necessary to attain this objective and this Treaty has not provided the necessary powers, the Council may adopt provisions with a view to facilitating the exercise of the rights referred to in paragraph 1. The Council shall act in accordance with the procedure referred to in Article 251. 

  3. Paragraph 2 shall not apply to provisions on passports, identity cards, residence permits or any other such document or to provisions on social security or social protection. (emphasis added)

 

A third critique concerns the speed and the impact of the proposals. Sensitive information may be transferred to other countries when verification is required at border controls. With traditional passports, personal data in the passports were shown to the border authorities but not, as such, processed; hence, the Directive did not apply. With automated verification, the Directive does apply. Whenever the EU tolerates the new passports being scanned in countries without adequate data protection, it knowingly permits violations of its own principles.

This risky attitude seems to be of a more general nature. Immediately after the parliamentary vote, doubts were expressed about the willingness of the Council to take any of the amendments voted by the Parliament into consideration. The amendment to make Recital 7 more stringent, for instance, was not followed. A proposed amendment of the Parliament to prohibit the establishment of a ‘central database of European Union passports and travel documents containing all EU passport holders’ biometric and other data’ was likewise not upheld, nor were the amendments to involve the data protection experts of the Article 29 Working Party in the follow-up of the Regulation and in the more precise choice of standards.

 

The legitimacy of the ICAO standards 

As we have seen above, ICAO has a major impact on the technical aspects of the biometrics deployed for passports and travel documents. Document 9303 is the guideline legally implemented by EU legislation. ICAO, however, does not require encryption of the data on the RFID chip (the storage medium).

The reference in Regulation 2252/2004 to Document 9303 and the mandatory use of the RFID chip has met criticism: ICAO is not a legislative body representing all members of society (including citizens), and the standards established by ICAO might be considered the result of a non-transparent procedure. Although this is the case for many standardisation bodies, the observation must be made here because the ICAO standards will probably have huge impacts - good or ‘bad’ - in the long term.

The (non-binding) Report of the Parliament reaches the same conclusion where it states that ‘Document No 9303 should not be referred to in an EU regulation, since it is constantly being amended by means of a process which lacks transparency and democratic legitimacy.’

 

Compliance with Quality Principles Taken from Privacy Law and Data Protection Law

Although all the (draft) Regulations expressly recognise the application of the Data Protection Directive, and although these (draft) Regulations contain provisions that elaborate the principles of data protection law, it can be argued that some articles of the (draft) Regulations infringe qualitative principles such as the proportionality principle contained in Data Protection Directive 95/46 and in privacy case law. We will give some examples relating to the principles of confidentiality, proportionality, finality/purpose specification and individual participation.

 

The proportionality principle 

Some provisions of the Proposal for the VIS Regulation show that the personal data processed are inadequate, irrelevant or excessive in relation to the purpose for which they are collected. For example, the requirement to hand over one’s surname at birth (Article 6.4.a) can infringe the right to a private life. The obligation to provide one’s nationality at birth (Article 6.4.b) is of no relevance and may lead to discrimination. The possibility to link individuals to a group (Article 5.4) opens the door to profiling; the notion of ‘group’ is unclear and should be precisely defined. The Proposal justifies the inclusion of such data merely by stating that ‘these data are required for the assessment of the application and for checks on the visa of the applicant (…)’.

Article 6.4.f requires the name and address of the person issuing an invitation or liable to pay the applicant’s costs of living during the stay. Why should this (possibly very) sensitive information - the issuer or liable person may, for example, be a hospital, a politician or a priest - be given and stored in the database? Here, the Proposal’s justification is that ‘the inclusion of data on persons and companies issuing invitations will help to identify those persons and companies which make fraudulent invitations. This constitutes important information in the fight against fraud, illegal immigration, human trafficking and the related criminal organisations which often operate in an international scale’.

The retention period of a maximum of five years for each application file (Article 20) is also subject to critique in specific situations, such as visa applications that have been refused or visas that have been issued for less than three months.

As already indicated, the inclusion of two (unencrypted) biometric identifiers in the EU passport is also a candidate for an infringement of the proportionality principle.

 

The purpose specification and finality principle

According to the purpose specification principle, personal data may be processed as long as the processing meets specified, explicit and legitimate purposes. This principle is the most important touchstone of data protection law because it provides the criteria for deciding on the legitimacy of a processing operation and on the use and quality of the personal data processed.

Many purposes foreseen in the VIS Regulation may be criticised: they go far beyond the achievement of a common visa policy. For example, the purpose of preventing threats to the internal security of any of the Member States is already pursued by means of the SIS and other tools available for police cooperation.

The purposes of facilitating checks at external border checkpoints and within the territory of the Member States, of assisting in the identification and return of illegal immigrants, and of facilitating the application of Regulation (EC) No 343/2003 are excessive in light of the legal basis of the Proposal, namely Article 62.2.b(ii) and Article 66 of the Treaty establishing the European Community (TEC).

There currently exists a Proposal for a Council Decision concerning access to the VIS by Europol for the purposes of preventing, detecting and investigating terrorism and other serious criminal offences. This Proposal reflects a more general trend of allowing various law enforcement authorities to access all available databases for their work on terrorism and other serious criminality.

The EDPS stated very clearly in an Opinion on the VIS Proposal: “One must bear in mind that the VIS is an information system developed in view of the application of the European visa policy and not as a law enforcement tool. Routine access would indeed represent a serious violation of the principle of purpose limitation. It would entail a disproportionate intrusion in the privacy of travellers who agreed to their data being processed in order to obtain a visa, and expect their data to be collected, consulted and transmitted, only for that purpose”. The EDPS proposes that access to the VIS by law enforcement authorities should only be granted ‘in specific circumstances on a case-by-case basis and must be accompanied by strict safeguards’. The condition in the Proposal’s Article 5, for example, that the consultation of VIS data ‘will contribute to the prevention, detection or investigation of [a specific] offence’ does not suffice for the EDPS: consultation must ‘substantially’ contribute.

The other purposes mentioned throughout the Proposal (examination of applications, consultation between authorities, reporting and statistics, identification) also derive from the main purpose of improving the common visa policy. Since the VIS seems set to become a common search and research tool for all authorised public authorities within the EU, privacy and data protection safeguards should indisputably be introduced in a clearly defined and accessible way before the VIS is effectively deployed.

The European passport requirements for biometrics can also lead to unexpected and unforeseen purposes. Once biometrical data and corresponding information (for example: person X was identified at Frankfurt Airport through facial recognition) are available, the risk of their use for purposes other than those for which they were collected will undeniably remain present. Obviously, enhanced interoperability of systems will increase this risk. It is also realistic that the data subject will never become aware of such illegitimate uses and/or processing of data.

This is not just a scenario: an intriguing example has already occurred in America. During the Super Bowl final in Tampa (Florida) in January 2001, the police deployed intelligent video cameras with facial recognition technology to scan the faces of all 100,000 spectators present in the stadium. The faces of these spectators were compared with the facial templates of wanted criminals and terrorists stored in a database.

Personal data collected through machine-readable passports and travel documents could be used for purposes other than those legally permitted, such as profiling. It is not clear whether and when profiling falls under the rights and obligations of the Data Protection Directive; privacy and anti-discrimination law may also apply (Hildebrandt & Backhouse, 2005; Hildebrandt & Gutwirth, 2005; Schreurs et al., 2005). The Directive may allow statistical processing or profiling of personal data once the data are made anonymous.

But the results thereof (the profiles) can afterwards be applied to data subjects without their knowing that the profiles are applied to them: this could, for example, result in people being individually stopped at a border control or individually checked because they fall under a certain profile. How is it guaranteed that the data subject is informed that such automated individual decisions are applied to him? How can it be guaranteed that the data subject can exercise the right to obtain from the controller knowledge of the logic involved in such automatic processing operations? Will all authorised agents acting upon these automated decisions ‘know’ the logic involved and be able to communicate it to the data subject?

Another risk is that the machine-readable systems (and not only the personal data) may be used by other persons and for other purposes than foreseen. In 2005, the 27th International Conference of Data Protection and Privacy Commissioners adopted a Resolution in which it expressed its awareness ‘of the fact that the private sector is also increasingly processing biometric data mostly on a voluntary basis’. The Conference called for ‘1. effective safeguards to be implemented at an early stage to limit the risks inherent to the nature of biometrics; 2. the strict distinction between biometric data collected and stored for public purposes (e.g. border control) on the basis of legal obligations and for contractual purposes on the basis of consent; 3. the technical restriction of the use of biometrics in passports and identity cards to verification purposes comparing the data in the document with the data provided by the holder when presenting the document’ (emphases added).

It should be added here that the strict separation between public and private authorities is blurring, also in law enforcement practices (such as airport security). This creates the risk that personal data primarily processed for identification purposes may afterwards be used for private purposes, such as the billing of outstanding debts for the use of a service, evidence in private litigation (divorce, employment) or direct marketing. The latter practice is forbidden without the prior consent of the data subject, but it is widespread and hardly ever sanctioned. The law does not guarantee that - roughly put - ‘these data or databases can and will never be used for any private purpose, whatever later legislation may say’.

 

The individual participation principle 

The individual participation principle is also confronted with problems. While there is no difference from traditional passports (without biometric identifiers) when a person wants to access or rectify personal data that are visibly written on the passport, this may not be the case for the biometric identifiers in the passport. How can one check and verify that the biometrical data are still accurate? How does an individual know, for example, that the biometric identifiers in his passport still work before or during his travel? What if the storage medium is destroyed? What if his physical data no longer match the biometrical data and access is unexpectedly refused? How are these (literally) unforeseen problems to be settled? Fallback procedures should be available as safeguards for the introduction of biometrics, since biometrics are neither accessible to all nor completely accurate. It remains unclear, however, how these fallback procedures can be worked out.
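
What such a fallback procedure might look like is left open by the Regulations. The sketch below shows one conceivable flow, purely as an illustration; the names, the threshold and the branching conditions are hypothetical and are not taken from the Regulations or the ICAO specifications.

    from enum import Enum, auto
    from typing import Optional

    class Outcome(Enum):
        AUTOMATED_PASS = auto()
        MANUAL_INSPECTION = auto()   # fallback: a human officer decides

    def border_check(chip_readable: bool,
                     match_score: Optional[float],
                     threshold: float = 0.8) -> Outcome:
        """Hypothetical verification flow with explicit fallback branches."""
        if not chip_readable:          # storage medium damaged or destroyed
            return Outcome.MANUAL_INSPECTION
        if match_score is None:        # sensor failure or non-enrollable traveller
            return Outcome.MANUAL_INSPECTION
        if match_score < threshold:    # possible false negative (changed features)
            return Outcome.MANUAL_INSPECTION
        return Outcome.AUTOMATED_PASS

The point of the sketch is that every automated branch needs a defined human alternative; it says nothing about how the data subject could verify the accuracy of the stored templates themselves.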

 

Central Biometrical Database(s)?

 

What is the issue?

The issue whether one central European database or several databases in the Member States will be deployed is still open. Council Regulation 2252/2004 does not require central storage, but leaves this important issue open to the Member States.

The European Parliament, in its Report on the Commission’s proposal for the Council Regulation, proposed that the creation of a central database of EU passports and travel documents containing the biometric and other data of all EU passport holders should be forbidden. The Council ultimately did not take this proposal of the Parliament into account.

Article 29 Data Protection Working Party states in its Opinion on Council Regulation 2252/2004 that “there is a risk that the setting up of a centralized database containing personal data and in particular biometric data of all (European) citizens could infringe against the basic principle of proportionality; intensifies the dangers of abuse and function creep; raises possibilities of using biometric identifiers as ‘access keys’ to various databases, thereby interconnecting data sets.” 

Centralised databases are also an important feature in the ICAO reports. ICAO calls for central databases that allow additional security confirmation checks, but does not go so far as to effectively require such systems. ICAO thus permits some flexibility, and some states may interpret the ICAO standards as requiring centralised databases.

 

Which are some of the possible impacts of databases containing biometrical data?  

First of all, a central database makes it possible to determine that the biometrics of an applicant are enrolled only once (Wayman 2006, 15). If the biometrics are not stored in a central database, or the database is not connected with other databases, a fraudulent person might use the same (but stolen) biometrics of a former bona fide applicant a second time. Or the other way around: a bona fide applicant can enrol without knowing that his biometrics are already circulating in the system (OECD 2004, 26).
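
A minimal sketch may clarify why this duplicate detection presupposes a central (or interconnected) database: every new enrolment has to be compared against all templates already stored (a so-called 1:N search). The names, the similarity measure and the threshold below are hypothetical.

    from typing import Callable, Dict

    Template = bytes  # stand-in for whatever template format is used

    def is_duplicate(new_template: Template,
                     enrolled: Dict[str, Template],
                     similarity: Callable[[Template, Template], float],
                     threshold: float = 0.9) -> bool:
        """1:N search at enrolment time: compare the applicant's template
        against every enrolled template. Without a central database there
        is no population of templates to search, so a second (fraudulent
        or accidental) enrolment cannot be detected this way."""
        return any(similarity(new_template, stored) >= threshold
                   for stored in enrolled.values())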

But central databases also bring risks.

When biometrics are stored in a database, the database allows for identification: state authorities can determine one’s identity independently, without the data subject being aware of it.

All terrorist attacks in recent years have been committed by people who were already suspects in one way or another. These people were thus already in a database, yet this did not help to prevent the attacks. The question can be asked whether putting everybody in a database would then prevent these kinds of serious crimes. Why would putting millions of innocent people in a database help to prevent crime, when the possibility to access the data of thousands of suspected people in a much smaller database did not even prevent serious crimes?

Another reason why creating a database of ‘innocent’ biometrics raises questions is the simple fact that terrorists are clever enough to stay out of the database. Moreover, terrorists often come from third countries and merely need to cross the border. It will not be difficult to stay out of the database for someone with bad intentions.

Having one’s biometrics stored in a central database also gives rise to security problems that are the mere consequence of one’s compulsory presence in the database.

Identities can be stolen without any physical perception of the theft (Hoepman & Jacobs, 2006). Whereas in the case of a token one needs to copy the biometric template from the token itself, a central database merely needs to be accessed by authorised or unauthorised persons - and studies show that computer crimes often originate from the inside. Another issue concerns profiling (see further).

In addition, databases containing biometrics can offer big business opportunities, even for governments. This would, however, irrevocably lead to function creep and infringe the finality and proportionality principles. Still, the exercise must be made. Biometrics are principally used to verify the link between an individual and the document containing the biometrics (Van Kralingen et al. 1997, 9). When biometrics are stored in a database, they can also be used to identify an individual (Jacobs, 2006). Even more: the identification can be established without the need for a passport. Once stored in a database for the purpose of issuing passports, these biometrical data can be used to offer services to public and private institutions that want to identify people (customers) with merely a fingerprint reader or face recogniser (‘the body as passport’). If access rights to these databases are granted to private institutions, governments are building a major asset for the future: they could charge ‘fees’ or ‘costs’ each time the database is consulted by third parties for identification (Van Kralingen et al. 1997, 4).

For the same reason, a biometrical database can offer huge possibilities for law enforcement agencies. The traditional approach in law enforcement was to identify suspects first and then to search for them among the citizens. Biometric databases invert this: they allow law enforcement to start from all citizens and then search for suspects. In other words, every search for a possible perpetrator of a crime can start with the presumption that all persons present in the database are suspects.

The possibility of combining at least two verification methods should be encouraged. When biometrical data are measured against the template in a database, a second verification tool such as a password seems appropriate, allowing individuals to secure their own personal data (Van Kralingen et al. 1997, 20).
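
The following is a minimal sketch of such a combination, assuming a hypothetical match function for the biometric comparison; the password factor is stored only as a salted hash, so the database never holds the password itself.

    import hashlib
    import hmac

    def hash_password(password: str, salt: bytes) -> bytes:
        # Salted, slow hash: the database stores only this value.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    def two_factor_verify(live_template, stored_template, match,
                          password: str, salt: bytes, stored_hash: bytes,
                          threshold: float = 0.8) -> bool:
        """Access requires BOTH a biometric match against the stored
        template AND knowledge of a secret, so the probabilistic
        biometric factor is backed by a deterministic one."""
        biometric_ok = match(live_template, stored_template) >= threshold
        password_ok = hmac.compare_digest(hash_password(password, salt),
                                          stored_hash)
        return biometric_ok and password_ok

The design point is that the biometric factor alone can neither be revoked nor kept secret; the password gives the data subject a factor that remains under his own control.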

 

 
