
D3.2: A study on PKI and biometrics

Biometrics and cryptography: Current Privacy Research Applied to Biometrics

 

Privacy criteria

International discussion about biometrics has led to a number of questions that can be used to evaluate biometric systems with regard to privacy. Main input for this discussion came from the Information and Privacy Commissioner of Ontario, Dr. Ann Cavoukian [BÄU99], the Federal Privacy Commissioner of Australia, Malcolm Crompton, Roger Clarke [CLA02], and R. Hes, T. F. M. Hooghiemstra and J. J. Borking [HES99]. Further work was done by the European Art. 29 Data Protection Working Party (WP29), resulting in a working document on biometrics. The following questions for evaluating a biometric system with regard to privacy are derived from the results of this discussion:

 

  • Is the biometric system transparent and participation voluntary? 

  • What personal information is collected? 

  • Are the biometric and personal data separately stored? 

  • Is bodily privacy being protected (health information/behavioural traits)? 

  • Is covert collection possible? 

  • What data can be linked to “track” people?

  • Biometrics can be a universal and globally unique identifier – are they easily and directly linkable to a person? 

  • Where is the biometric data held?  

  • What is the performance – false acceptance rate (FAR) and false rejection rate (FRR)? 

  • How is it secured?  

  • What accountability, transparency, and audit arrangements are in place?

 

Directive 95/46/EC constitutes the general legal framework for the processing of personal data. As stated above, the term “personal data” is defined as meaning “any information relating to an identified or identifiable natural person (…); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental (…) identity”.  

Directive 95/46/EC does not mention biometric data as such. However, as biometric data relates in most cases to an identified or identifiable person, it is generally considered that the provisions of the Directive 95/46/EC apply to the processing of biometric data.

For the purposes of this report, the general principles and guidelines will not be discussed; it is assumed that the data protection principles are generally known. Specific guidelines for the processing of biometric data were provided in August 2003 by the Art. 29 Data Protection Working Party (WP29), established by Directive 95/46/EC, in a working document on biometrics.

Some national Data Protection Authorities of EU member states, including those of France, Spain, the Netherlands, Germany, Italy and Greece, have also issued opinions on the use of biometric systems in given situations. The recommendations of WP29 that are specific to the use and processing of biometric data in general are summarised hereunder. In addition to the general data protection principles, a biometric system shall take these additional requirements and recommendations into account. The controller of a data processing operation is in principle responsible and liable for the obligations imposed by the Directive and the national data protection laws.

 

First of all, ‘raw’ biometric data should not be stored. Raw biometric data may reveal information about a person’s health or race; from a data protection point of view, this is sensitive information. In addition, this information is in principle not necessary for the verification or identification purposes of the biometric application. Templates, by contrast, generally do not reveal this information about a person’s health or race. Therefore, and based upon the principle of proportionality, it is generally recommended not to store the raw biometric data, but only the templates. It is clear that the security measures for a database with raw biometric data, if any such database exists and is required for the envisaged purposes and obtained upon a valid legal ground, shall be of the highest level.
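The template-only storage recommendation can be illustrated with a minimal sketch. The feature extractor here is a hypothetical stand-in (a hash used purely as a placeholder; unlike a real template it does not support fuzzy matching); the point is only that the raw sample is never persisted:

```python
import hashlib

def extract_features(raw_sample: bytes) -> bytes:
    # Stand-in for a real feature extractor (hypothetical): real systems
    # derive a compact template, e.g. fingerprint minutiae, from which
    # the raw image cannot be reconstructed.
    return hashlib.sha256(raw_sample).digest()

def enrol(raw_sample: bytes, store: dict, user_id: str) -> None:
    # Only the derived template is persisted; the raw sample, which may
    # reveal sensitive health or racial information, is discarded.
    store[user_id] = extract_features(raw_sample)

store: dict = {}
enrol(b"raw fingerprint scan bytes", store, "subject-001")
```

After enrolment the store holds only the derived template, never the raw sample.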

 

Furthermore, one should take into account that in most systems biometric templates are personal data. WP29 states that while ‘measures of biometric identification or their digital translation in a template form (are) in most cases (…) personal data’, ‘in cases where biometric data, like a template, are stored in a way that no reasonable means can be used by the controller or by any other person to identify the data subject, those data should not be qualified as personal data’. The question remains what should be considered ‘reasonable means’ in the current state of practice and in the evolving techniques of identification. To the extent that a database contains only digital templates, e.g. of a voice or fingerprint, or of the 2D or 3D picture of a face, without a name and without any other reference, one should examine to what extent the available techniques would permit reconstructing a full image (e.g. of the voice, fingerprint or face) from the templates, or would enable the comparison of the biometric templates with existing templates combined with personal data, or with other personal data held in other databases. If the answer to one of these questions were yes, one might conclude that the persons could indeed be identified, directly or indirectly, and that the data contained in the biometric template are personal data to which the data protection rules apply. If the answer were no, one should conclude, in conformity with the definition in Directive 95/46/EC, that these biometric data are not personal data. One has to conclude, though, that there is presently still some discussion about when and under which conditions templates should be considered personal data (e.g., when is the effort unreasonable) and what different criteria could be used.

 

The template should also preclude the processing of data that are not necessary, based upon the principle that the processing of personal data should be proportional and not excessive. Biometric data often contain more information than is necessary for the identification or verification functions of the biometric system. The WP29 therefore recommends that templates be technically constructed in a way that precludes the processing of data that are not necessary. Unnecessary data should be destroyed as soon as possible.

 

Furthermore, central storage of biometric data is to be avoided if the identification/verification applications can be carried out without it, based upon the principle that the processing should be proportional and not excessive. This means that biometric systems should be designed in such a way that excessive identification techniques are avoided. Such excessive identification may take place if the biometrics are stored in a central database rather than in an object exclusively available to the user, e.g. a microchip card (smart card), where possible. There are several reasons for this aversion to central storage. Users no longer have control over their biometric data if the data are stored in a central database. In addition, the more data are stored in a central database, the more public and private institutions will attempt to gain access to those data for their own purposes. Furthermore, if the data are stored in a central database, users do not know when their data are evaluated or used. Users therefore have in principle more control over their data if the biometric data are stored locally, e.g. on a card in their possession, which they themselves have to submit to the system knowingly and willingly.

 

Furthermore, one should avoid the use of a unique identifier by manipulating the templates. The Directive states that the Member States shall determine the conditions under which an ‘identifier of general application’ may be processed (article 8.7 of the Directive). Fingerprints could in principle be considered such an ‘identifier of general application’, and other biometric data could serve as one as well. As stated above, data subjects should in principle retain control over their personal data, and they should be able to object to the processing of their data. For this reason, WP29 finds it desirable to avoid as much as possible that biometric data could be used as a unique identifier to link databases containing personal data. In this view, it is for example recommended that templates and their digital representations be processed with mathematical manipulations (encryption, etc.) using different parameters for every biometric product or system. This should prevent the combination of personal data from several databases through the comparison of templates.
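The recommendation to process templates with different parameters per system can be sketched as follows. A keyed HMAC is used here only to illustrate the unlinkability principle; real ‘cancelable biometrics’ schemes use similarity-preserving transforms so that fuzzy matching still works, and the key names below are hypothetical:

```python
import hashlib
import hmac

def system_template(raw_template: bytes, system_key: bytes) -> bytes:
    # Transform the template with a system-specific secret parameter, so
    # the same biometric yields unlinkable values in different databases.
    return hmac.new(system_key, raw_template, hashlib.sha256).digest()

template = b"feature-vector-of-subject"  # placeholder template bytes
in_system_a = system_template(template, b"parameters-of-system-A")
in_system_b = system_template(template, b"parameters-of-system-B")
# in_system_a differs from in_system_b, so the two databases cannot be
# joined by comparing their stored templates.
```

Within one system the transform is deterministic, so verification against the stored value still succeeds.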

 

One should also segregate the biometric information from other personal data. Biometric templates shall be secured and encrypted during enrolment and storage. In addition, each controller shall review whether it is necessary to attach, in the same database system, other individual information (such as a name, an address, an id-number, …) to the encrypted templates, by which it is directly clear (upon decryption) to whom the encrypted biometric templates belong. Any additional individual information directly linked to the biometric data should be avoided if it is not necessary for the purposes of the processing. In order to enhance security, it is therefore strongly recommended also to use encoded data, i.e. biometric data with a reference code instead of the name of the individual, and to store the name of the individual and the corresponding code in a separate place, if possible even with a separate authority/company. Please note that encoded data are also considered personal data as long as identification of the data subject remains possible. Only data which are absolutely anonymous do not fall within the scope of the Directive (the identity of the person behind the data cannot be traced back because, e.g., the encryption code was destroyed).
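This segregation via a reference code can be sketched as follows (a minimal illustration; the store names are hypothetical, and the ‘encrypted template’ is a placeholder byte string rather than real ciphertext):

```python
import secrets

biometric_store: dict = {}  # code -> encrypted template (biometric system)
identity_store: dict = {}   # code -> name; held in a separate place,
                            # ideally by a separate authority/company

def enrol(name: str, encrypted_template: bytes) -> str:
    # The reference code is random, so the biometric store alone reveals
    # nothing about whom a template belongs to.
    code = secrets.token_hex(8)
    biometric_store[code] = encrypted_template
    identity_store[code] = name
    return code

code = enrol("Alice Example", b"<encrypted template bytes>")
```

An attacker who obtains only the biometric store sees templates keyed by meaningless codes; linking a template to a name additionally requires the separately held identity store.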

 

The controller of personal data shall take all appropriate technical and organisational security measures to protect personal data against accidental or unlawful destruction, accidental loss, alteration, and unauthorised disclosure or access. This applies in particular where the processing involves biometric data. Integrity and confidentiality are of utmost importance to any biometric application. In addition, special attention shall be given to the transmission of biometric data over a network, whether private or public. If biometric data are sent over the Internet, the highest security measures shall be taken. The security measures should be designed right from the start, as part of a formal process, and should result in what is commonly designated the ‘security architecture’ of the system. Several organisations are developing international standards relevant to the security of biometric systems. The elements of such a security architecture shall include authentication, access control, data confidentiality, data integrity and non-repudiation. The security measures shall be implemented from the beginning of the processing of the personal data, as of the ‘enrolment’ phase by the data subjects. In addition to the ‘usual’ security architecture, a biometric system requires enhanced security measures in order to minimise the risks and to guarantee privacy to the maximum extent possible. For example, in the BioSec project, in order to prevent replay attacks, it is advised that keys be generated for each session with random or pseudo-random numbers by both the host and the smart card/token, and that, in order to avoid token or host emulation, mutual authentication of both the host and the token be used. Furthermore, also in BioSec, the transfer of the individual’s template to the smart card/token in the enrolment phase shall be done exclusively in a controlled environment.
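The per-session key generation and mutual authentication described above can be sketched with a simple challenge-response pattern. This is an illustrative assumption, not the BioSec protocol itself (whose details are not specified here); the pre-shared key and function names are hypothetical:

```python
import hashlib
import hmac
import secrets

# Pre-shared secret between host and token (assumption for this sketch).
SHARED_KEY = secrets.token_bytes(32)

def session_key(host_nonce: bytes, token_nonce: bytes) -> bytes:
    # Both sides contribute fresh randomness, so a recorded session
    # transcript cannot be replayed: the key differs every session.
    return hmac.new(SHARED_KEY, host_nonce + token_nonce,
                    hashlib.sha256).digest()

def prove_identity(challenge: bytes) -> bytes:
    # Challenge-response: a correct answer proves knowledge of the shared
    # key without revealing it. Performed by BOTH host and token on each
    # other's challenges, this yields mutual authentication and prevents
    # host or token emulation.
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

k1 = session_key(secrets.token_bytes(16), secrets.token_bytes(16))
k2 = session_key(secrets.token_bytes(16), secrets.token_bytes(16))
# k1 and k2 differ (with overwhelming probability): no session key reuse.
```

In practice the verifier would compare responses with a constant-time check (e.g. `hmac.compare_digest`) rather than plain equality.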

In addition to the design of an overall ‘security architecture’ (see above), biometric systems require the design and development of a so-called ‘privacy architecture’. The aim of the ‘privacy architecture’ is to minimise the impact of the biometric system upon the privacy of the data subjects. The requirements and recommendations above serve to enhance the privacy of the data subjects during the processing of their personal data, and take the special nature and privacy concerns of biometric data into account. In addition to a ‘privacy architecture’, some authors argue that the design of each new biometric system should include a ‘Privacy Impact Assessment’ (PIA) to identify the privacy risks. Such a PIA should be understood as an analysis of how the system deals with the risks posed by the biometric data, taking the applicable legislation, recommendations and good practices into account. The ‘privacy architecture’ and the ‘privacy impact assessment’ should interact and accept input from each other.

 

 

fidis-wp3-del3.2.study_on_PKI_and_biometrics_03.sxw
Denis Royer