
D3.10: Biometrics in identity management


Privacy problems

An analysis of the privacy problems of biometric systems will in principle start from the issues described in the opinion of the Article 29 Data Protection Working Party (Article 29 Working Party) on biometrics of 1 August 2003. The privacy problems put forward in that opinion are not discussed at length in this document, since this was already done in FIDIS deliverable 3.2: A study on PKI and biometrics. The overview of the privacy problems given in that working document of the Article 29 Working Party nevertheless remains valid as a starting point for the present report. For a proper understanding, the main privacy concerns outlined in that document are reproduced below, in brief, in a schematic overview.

 

Privacy risk | Storage | Qualifying factors | Data protection principle | Suggested remedy in WP 80 to counter risk
Identification | Central storage | Size of database; type of biometrics used | Proportionality (Art. 7) | —
Biometrics contain sensitive information (health, race) | Central (or local) storage | — | Prohibition to process sensitive data (Art. 8); data minimisation (Art. 7) | No images; use of templates which exclude such information
Secret capture and/or surveillance | Central storage | Especially vulnerable are low-intrusiveness biometrics (e.g., face, voice), but also fingerprint, … | Fair collection and processing (Art. 6 (a)) | Local storage under control of the data subject
Incompatible re-use ('function creep') | Central storage | — | Special risks to rights and freedoms (Art. 20) | Prior checking with the DPA
Theft | Central (or local) storage | — | Appropriate technical and organisational security measures (Art. 17) | Appropriate security measures, including revocability of templates and impossibility to reconstruct biometric raw data from the template
Use as unique identifier for connecting databases | Central storage | Use by governments | Conditions to be determined (Art. 8 § 7); right to object (Art. 14 (a)) | Mathematical manipulations
FAR/FRR | Central or local storage | Type of biometrics used | Prohibition of automated decisions (Art. 15) | Re-affirmation of outcome; appropriate back-up procedures

Table 7: Overview of privacy risks of biometrics as stated in WP 80

 

Almost all of the privacy concerns described in fact relate to the biometric Type I, II and III models (for the models, see above, section ), and the use of biometrics in these models requires special attention for the reasons set out above. The risks also most often relate to the place of storage of the biometrics: where the biometric characteristics are stored in a central place, the risk increases. The Article 29 Working Party has already warned that setting up a centralised database containing personal data, and in particular biometric data, of all (European) citizens 'could' infringe the proportionality principle.

There are, however, additional concerns which the use of biometrics raises. The Article 29 Data Protection Working Party already stated in its opinion that the document was only a 'working document', which it intended to revisit in the light of the experiences of data protection authorities and of technological developments linked to biometric applications.

Some of these additional privacy concerns and issues under the framework of the data protection legislation are identified and further described below.

Data quality 

One of the privacy problems which needs further investigation relates to the data quality of specific biometric data, such as face scan and hand geometry, for specific groups of people. One such group comprises children and teenagers, for whom there is likely to be a higher risk of processing inaccurate or outdated biometric data.

One of the basic principles of the data protection legislation as set forth in Directive 95/46/EC is data quality. The principle requires that personal data must be 'accurate, and, where necessary, kept up to date'; furthermore, 'every reasonable step must be taken to ensure that data which are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they are further processed, are erased or rectified' [emphasis added]. This requirement with regard to data quality poses a problem for specific forms of biometric data which relate to a human characteristic that changes over time, for example as the individual grows older. The reference biometric data relating to the hand geometry or the face of younger persons, for example pupils of a school, may at a certain point no longer be of good quality, as the characteristics change and these changes are not reflected in the reference data.

This problem has been recognised in relation to the use of the facial image of children in identity documents in a study performed for the Ministry of the Interior in the Netherlands in 2005. The report stated that 'it is very likely that facial recognition of children of twelve years or younger, on the basis of a reference image that is some years old, is problematic. The reason is the significant changes in the proportions of the characteristic points in the face during growth. These changes take place after a complex process that is to a large extent determined by the sex and genetic background'. Furthermore, the report stated that the problem also exists for children older than twelve, and that additional research on this topic is needed. The data quality of the reference biometric data of younger persons (for example, under eighteen) is therefore a concern, not only from a practical point of view, but also under the data protection legislation, which imposes requirements on the quality of the data, in particular that the data shall be accurate. Inaccurate data would lead to increased False Acceptance Rates (FAR) and False Rejection Rates (FRR) and would render the whole biometric application unreliable. FAR and FRR also pose risks for the data subjects, who may either have somebody else identified in their place for the service or be wrongly rejected.

While biometric applications are sometimes promoted in schools or other environments involving children for convenience and other purposes (for example, the administration of meals), this privacy concern should not be neglected, and the efficiency of the whole biometric application could be called into question if no appropriate (technical) measures are implemented to solve this problem. In the absence of technical measures ensuring that the reference data do not become outdated too soon, the inaccurate data shall be kept up to date. This means that the reference data need to be replaced at regular intervals with new reference data through a new enrolment of the data subject, as illustrated in the sketch below. If this is not possible, the data shall no longer be used and are to be erased. The administrative and operational requirements for such replacement and the consequences of this principle are most important for the biometric Type I government controlled ID model; the importance remains but decreases in the biometric Type II access model, the Type III mixed model and the Type IV convenience model.
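As a purely illustrative sketch in Python, such a replacement policy could be operationalised as a periodic check of the age of the reference data. The validity periods, age groups and function names below are assumptions made for the illustration, not values prescribed by the Directive or by the cited study:

from datetime import date, timedelta

# Illustrative validity periods only; these are assumptions for the
# sketch, not legally prescribed values.
VALIDITY_BY_AGE_GROUP = {
    "under_12": timedelta(days=365),      # re-enrol yearly
    "12_to_17": timedelta(days=2 * 365),  # re-enrol every two years
    "adult":    timedelta(days=5 * 365),  # re-enrol every five years
}

def age_group(birth_date, today):
    years = (today - birth_date).days // 365
    if years < 12:
        return "under_12"
    if years < 18:
        return "12_to_17"
    return "adult"

def needs_re_enrolment(enrolment_date, birth_date, today):
    # Reference data count as outdated once the assumed validity period
    # for the data subject's age group has lapsed.
    validity = VALIDITY_BY_AGE_GROUP[age_group(birth_date, today)]
    return today - enrolment_date > validity

# Example: a pupil enrolled at age ten, three years ago, must be re-enrolled.
print(needs_re_enrolment(date(2004, 9, 1), date(1994, 5, 4), date(2007, 9, 1)))  # True

Once the check flags the reference data as outdated, the operational consequence under the data quality principle is a new enrolment or, failing that, erasure of the data.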

Data quality of face recognition in general

In preparation for the introduction of the e-passport in Europe, two relevant tests of face recognition systems using biometric face reference data standardised by the International Civil Aviation Organisation (ICAO) were carried out by the German Federal Office for Information Security (BSI) and the Ministry of the Interior of the Netherlands. The results were summarised and analysed in FIDIS deliverable 3.6 (v. 1.1, pp. 25-26). In general, 2D face recognition accuracy could not compete with fingerprinting and iris scan, and the use of ICAO-compatible reference data seemed to decrease the quality compared to vendor-proprietary templates. As a consequence, most European countries currently do not intend to use 2D face recognition for border control purposes yet, among them Germany; instead, fingerprinting is planned to be used.

In March 2007, the results of the "Face Recognition Vendor Test 2006" were published by the (U.S.) National Institute of Standards and Technology (NIST). One of the remarkable conclusions of this report is that face recognition can now compete well with the quality of fingerprinting and iris scan. Looking into the details of the test set-up, it has to be pointed out that no ICAO-compliant reference data format was used, so these results cannot be compared directly with the previously mentioned test results from 2005. In fact, the best results were achieved with high and very high resolution pictures (a resolution of 4 megapixels and higher) or 3D face recognition. Low resolution, use of compression (JPEG picture format) and non-standardised lighting conditions in particular led to results that were not impressive (an FRR higher than 5% at FAR = 0.1%).

In general, it can be concluded that the quality problems of face geometry compared to fingerprinting and iris scan still remain when ICAO-compatible reference data are used.
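The error figures cited above refer to a standard way of reporting biometric accuracy: the decision threshold is fixed so that the false acceptance rate (FAR) equals a target value (here 0.1%), and the false rejection rate (FRR) is then measured at that threshold. A minimal Python sketch of this computation, assuming similarity scores where a higher score means a better match, and using synthetic score distributions purely for illustration (not data from the cited tests):

import numpy as np

def frr_at_fixed_far(genuine_scores, impostor_scores, target_far=0.001):
    # Choose the threshold so that only a fraction target_far of
    # impostor comparisons score above it.
    threshold = np.quantile(np.asarray(impostor_scores), 1.0 - target_far)
    # The FRR is the fraction of genuine comparisons rejected at
    # that threshold.
    frr = np.mean(np.asarray(genuine_scores) < threshold)
    return threshold, frr

# Synthetic, purely illustrative scores: genuine comparisons tend to
# score higher than impostor comparisons.
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 10_000)
impostor = rng.normal(0.4, 0.1, 100_000)
threshold, frr = frr_at_fixed_far(genuine, impostor)
print(f"threshold = {threshold:.3f}, FRR at FAR = 0.1%: {frr:.2%}")

Lowering the target FAR pushes the threshold up and the FRR with it, which is why a system evaluated at FAR = 0.1% can look much worse than the same system evaluated at a laxer operating point.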

Recognition (matching) decisions are intrinsically inaccurate to some extent

As described above, the personal data processed should be accurate. A fundamental element of biometric systems is the recognition (matching) decision and the threshold employed (see section 3.1.2). Because of the inherent statistical nature of a biometric system, a decision of a biometric system merely expresses a degree of correlation between the submitted biometric sample and the reference biometric data. Each type of biometric system has an FRR and a FAR to a higher or lower degree, depending on the threshold. It is the system designer or the operator (owner) who sets the acceptance threshold and hence the error rates, often adapted to the requirements of a specific application. In a low security application, e.g., the registration of meals of pupils, one could decide to reduce the FRR, which will have the effect of increasing the FAR. This trade-off, and the fact that a match is never a complete match (but only a probability), implies that the decisions that biometric systems make about an individual, and the data relating thereto, are never 100% correct or certain. One could question whether this is in conformity with the requirement of the Directive that data relating to individuals shall be accurate. Biometric matching decisions fail by definition to fulfil this requirement, and individuals may always be subject to false decisions which affect them. To avoid a negative impact of the failure rates of biometric systems on the data subject, appropriate back-up procedures have been suggested (e.g., by Clarke). Consequently, authentication should never be based solely on biometrics using one feature only.
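To make this trade-off concrete, the following sketch reduces the verification decision to its core, a threshold comparison. The threshold values and the score are invented for illustration and correspond to no particular product:

def verify(similarity_score, threshold):
    # A biometric match is a probabilistic judgement: the system only
    # asserts that sample and reference correlate above a threshold,
    # never that they are identical.
    return similarity_score >= threshold

# The operator chooses the operating point per application; the values
# below are purely illustrative.
HIGH_SECURITY_THRESHOLD = 0.90  # fewer false acceptances, more false rejections
LOW_SECURITY_THRESHOLD = 0.60   # e.g. meal registration: fewer false
                                # rejections, at the price of a higher FAR

score = 0.72  # degree of correlation between the sample and the reference
print(verify(score, HIGH_SECURITY_THRESHOLD))  # False: possible false rejection
print(verify(score, LOW_SECURITY_THRESHOLD))   # True: possible false acceptance

Whatever threshold is chosen, both error rates remain strictly positive; this is the sense in which the matching decision is intrinsically inaccurate.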

Right to object 

If biometrics are used by a private owner for access control purposes, e.g., to a place open to the public such as a dance club or a soccer stadium, the public interest (securing (public) order), or the interests of the controller, is often invoked as outweighing the interests of the individuals. Directive 95/46/EC states that especially if the processing is based on these grounds, the data subjects should have the right to 'object at any time on compelling legitimate grounds relating to his particular situation' to the processing, unless the national legislation states otherwise. Legitimate grounds could be the contention that biometric data include sensitive data, difficulties to enrol, or religious belief. The principle of the right of individuals to object to the processing of data shall be taken into account in the discussions about biometrics, in the sense that alternatives to the use of a biometric system will always have to be provided. This will be especially true for biometric Type II access models operated by a private organisation. For biometric Type II access models operated by a government, e.g., for e-government services, the use of alternatives will be less desirable in view of a streamlined organisational model in which the biometric verification or identification functionality is used as much as possible. The use of alternative means for biometric Type I government controlled ID models will be even more difficult. Biometric systems will therefore probably never include all individuals to whom they might be directed (e.g., controlling access of passengers to specific zones in airports). Appropriate specifications as to when somebody is entitled to object to enrolment in a biometric system would therefore be needed.

Unauthorised access to biometric data stored on RFID chip

Biometric data stored on a contactless chip need to be sufficiently secured in order to prevent unwanted disclosure of the data contained therein. This security is often lacking or insufficient. FIDIS and other authors have strongly advocated the use of appropriate security measures to prevent tracking and eavesdropping of the personal biometric data stored on media involving new technologies such as RFID. This issue has become very important because the use of a contactless chip was agreed for the issuance of the so-called e-passports, following the ICAO specification for Machine Readable Travel Documents of May 2004, and was confirmed and mandated in Regulation (EC) No 2252/2004. The vulnerability of the biometric data stored on the RFID chip has been demonstrated by documented attacks on the e-passport in several countries, including Germany, the Netherlands, the United Kingdom and Belgium.
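One reason the protection can be insufficient is illustrated by the Basic Access Control (BAC) mechanism of the e-passport, whose keys are derived from fields printed in the machine-readable zone (MRZ). The following back-of-the-envelope Python sketch, under simplifying assumptions about the field ranges, bounds the entropy of that key material from above:

import math

# Rough upper bound on the entropy of the BAC key material, which is
# derived from MRZ fields. The ranges below are simplifying assumptions
# made for this sketch.
document_numbers = 10 ** 9  # assume 9 numeric characters
birth_dates = 100 * 365     # roughly a century of plausible birth dates
expiry_dates = 10 * 365     # roughly a ten-year validity window

key_space = document_numbers * birth_dates * expiry_dates
print(f"upper bound: {math.log2(key_space):.1f} bits")  # about 57 bits

# Sequentially issued document numbers correlate with the expiry date,
# so the effective search space collapses far below this bound; this is
# what the documented attacks on eavesdropped e-passport traffic exploited.

Even the optimistic upper bound falls well short of the strength expected of a modern cryptographic key, which explains why additional security measures have been advocated.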

In addition to the privacy threats and ethical concerns described above, biometrics in principle raise concerns of linkability, disclosure of additional health information, and unobserved verification or identification.

 
