
D3.6: Study on ID Documents


The Technical Perspective

Biometrics both has and causes security problems; both kinds are investigated in the following.

 

Security Problems of Biometrics

As with all decision problems, two kinds of failures occur in biometric authentication/identification: 

 

  • Persons are wrongly not authenticated or wrongly not identified. 

  • Persons are wrongly authenticated or wrongly identified. 

 

This leads to the dilemma of (biometric) pattern recognition (Jain, Hong, Pankanti 2000): If the similarity test is strict, people will rarely be wrongly accepted or wrongly identified, but wrong non-acceptance and non-identification will happen more often. If the similarity test is less strict, people will rarely be wrongly rejected or left unidentified, but wrong acceptance and wrong identification will happen more often.

Practical experience has shown that only the frequency of one error type can be kept small, and the price to be paid for that is that the frequency of the other error type increases.

A biometric technique is more secure than another for a certain application area if both error types occur more rarely. The strictness of the similarity tests used in biometrics can be adapted slightly to different application areas. But if one of the two error rates is to be pushed down to the level achieved today by well-managed authentication and identification systems based on people's knowledge (e.g., a passphrase) or possession (e.g., a chip card), today's biometric techniques can only deliver an unacceptably high rate for the other error type.
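This trade-off can be made concrete with a small sketch: moving the acceptance threshold of the similarity test lowers one error rate only at the price of raising the other. The similarity scores below are invented purely for illustration and do not describe any real biometric system.

    # Illustration of the trade-off: a stricter similarity threshold reduces
    # false acceptances but increases false rejections, and vice versa.
    # All similarity scores are invented purely for illustration.

    def error_rates(genuine_scores, impostor_scores, threshold):
        """Accept a comparison attempt if its similarity score >= threshold."""
        frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)     # wrong rejections
        far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)  # wrong acceptances
        return far, frr

    genuine  = [0.91, 0.85, 0.78, 0.72, 0.66, 0.58]   # same person, varying capture quality
    impostor = [0.62, 0.55, 0.49, 0.41, 0.33, 0.27]   # different persons

    for threshold in (0.4, 0.6, 0.8):
        far, frr = error_rates(genuine, impostor, threshold)
        print(f"threshold={threshold:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
    # strict (0.8): FAR = 0.00 but FRR = 0.67; lax (0.4): FRR = 0.00 but FAR = 0.67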

For more than two decades we have heard announcements that biometric research will change this within two years, or within four years at the latest. In the meantime one may begin to doubt whether such a biometric technique exists at all, if the additional features promised by advocates of biometrics are to be provided as well: 

 

  • user-friendliness, which limits the quality of the data available for pattern recognition, and 

  • acceptable costs despite possible attackers who profit from technical progress as well (see below). 

 

In addition to this decision problem, which is an inherent security problem of biometrics, any implementation of biometric authentication/identification has to make sure that the biometric data come from the person present at the time of verification and are neither replayed in time nor relayed in space (Schneier 1999). This may be harder than it sounds, but it is a problem common to all authentication/identification mechanisms.
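One common countermeasure against replay and relay is to bind every biometric capture to a fresh, verifier-chosen challenge and to require a prompt answer. The following minimal sketch only illustrates this idea; the shared sensor key, the helper names and the one-second time bound are assumptions, not a description of any concrete system, and a real sensor would use asymmetric signatures and attested hardware.

    # Minimal sketch: bind each biometric capture to a fresh verifier-chosen
    # challenge (defeats replay in time) and require a prompt answer (limits
    # relay in space). Key, names and time bound are illustrative assumptions.
    import hmac, hashlib, os, time

    SENSOR_KEY = os.urandom(32)        # secret provisioned into the trusted sensor

    def sensor_capture(challenge: bytes) -> dict:
        """Trusted sensor: capture a sample and authenticate it together with the challenge."""
        sample = b"feature-vector-bytes"                  # placeholder for real biometric features
        tag = hmac.new(SENSOR_KEY, challenge + sample, hashlib.sha256).digest()
        return {"sample": sample, "challenge": challenge, "tag": tag}

    def verify_once() -> bool:
        challenge = os.urandom(16)                        # fresh nonce for this verification
        issued_at = time.monotonic()
        response = sensor_capture(challenge)
        fresh = response["challenge"] == challenge and (time.monotonic() - issued_at) < 1.0
        authentic = hmac.compare_digest(
            response["tag"],
            hmac.new(SENSOR_KEY, challenge + response["sample"], hashlib.sha256).digest())
        return fresh and authentic

    print(verify_once())   # True for an honest, prompt sensor; a replayed old response fails the nonce check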

 

Biometrics not only has the security problems sketched above; its use also causes security problems. Examples are given in the following. 

 

  • Devaluation of classic forensic techniques: Widespread use of biometrics can devalue classic forensic techniques, as sketched here for the example of fingerprints as a means to trace people and provide evidence: 

    • Databases of fingerprints (especially of biometric raw data) greatly ease the fabrication of finger replicas, and thus the planting of someone else's fingerprints at a crime scene.

    • If fingerprint biometrics is used to secure assets of great value, an "industry" fabricating finger replicas will quite probably arise.

    • As infrastructures, e.g. for border control, cannot be upgraded as fast as the single machines (in the hands of attackers) used to fabricate finger replicas, an overall loss of security is to be expected. 

  • Chopping off and taking away body parts (the safety problem of biometrics): Press reports claimed that a finger of the driver of an S-class Mercedes was cut off in order to steal his car. Whether this story is true or not, it exemplifies a problem we might call the safety problem of biometrics when unobserved sensors are used:

    • Even a temporary (or merely assumed) improvement of "security" by biometrics is not necessarily an advance, but endangers the physical integrity of persons.

    • If checking that the biometrically measured body part is still alive really works, kidnapping and blackmail will replace the theft of body parts. 

  • Wanted multiple identities could be uncovered as well: The naive dream of politicians concerned with public safety, to recognise or even identify people unambiguously by biometrics, will become a nightmare unless we ignore the fact that accepted and often useful multiple identities, e.g. for agents of secret services, undercover agents and persons in witness-protection programmes, do and must exist in our societies. The effects of a widespread use of biometrics would be:

    • To help uncover agents of secret services, each country will set up person-related biometric databases at least for all “foreign” citizens.

    • To help uncover undercover agents and persons in witness-protection programmes, organised crime in particular will set up person-related biometric databases.

 

Privacy Problems caused by Biometrics

Biometrics not only causes security problems, but privacy problems as well: 

 

  • Each biometric measurement contains potentially sensitive personal data; e.g. a retina scan reveals information on alcohol consumption during the last two days, and it is under discussion whether fingerprints reveal data on homosexuality (Forastieri 2002; Hall, Kimura 1997). 

  • Some biometric measurements (passive biometrics) might take place without the data subject ever becoming aware of them, e.g. recognition of the shape of the face. 

 

In practice, the security problems of biometrics will exacerbate its privacy problems: employing several kinds of biometrics in parallel to cope with the insecurity of each single kind multiplies the privacy problems (cf. the mosaic theory of data protection). Note also that data protection by erasing personal data does not work on the Internet, since all copies would have to be erased. Therefore even the possibility of gathering personal data has to be avoided, which means: no biometric measurement.

 

Conclusion

Precisely because biometrics has security problems of its own and can additionally cause security and privacy problems, one has to ask how biometrics should be used and where it should not be used at all. 

 

  • Between data subject and his/her device: Even biometric techniques that often accept people erroneously but rarely reject them erroneously can be used between a human being and his/her personal devices. This holds even if they are too insecure to be used in other applications, or would cause severe privacy or security problems there:

    • Combining authentication by possession and/or knowledge with biometrics improves the security of authentication. 

    • No devaluation of classic forensic techniques, since the biometric measurements never leave the person's device and persons are not conditioned to divulge biometric features to "foreign" devices (a minimal match-on-device sketch follows after these lists).

    • No privacy problems caused by biometrics, since each person (hopefully) is and stays in control of his/her devices. 

    • The safety problem remains unchanged. But if a possibility to switch off biometrics completely and permanently after a successful biometric authentication is provided, and this is well known to everybody, then biometrics does not endanger the physical integrity of persons who are willing to cooperate with determined attackers. Depending on the application context, compromises between no possibility at all to disable biometrics and the possibility to disable it completely and permanently might be appropriate. 

  • How not at all? Regrettably, it is to be expected that attempts will be made to employ biometrics in other ways:

    • Active biometrics in passports and/or towards "foreign" devices is at least noticed by the person concerned. This should help him/her to avoid active biometrics.

    • Passive biometrics by "foreign" devices regrettably cannot be prevented by the persons themselves. Therefore, at least covertly employed passive biometrics should be forbidden by law.

What does this mean in a world where several countries with different legal systems and security interests (and usually with no regard for foreigners' privacy) admit foreigners only if the foreigner's country has issued a passport with machine-readable and verifiable digital biometric data, or the foreigner holds a visa containing such data?

  • Visas including biometrics or passports including biometrics? Visas including biometrics endanger privacy much less than passports including biometrics do.

    • Foreign countries will try to build up person-related biometric databases of visitors. We should not make this easier for them by conditioning our citizens to accept biometrics, nor should we make it cheaper for them by making our passports machine readable. 

    • Organised crime will try to build up person-related biometric databases. We should not make this easier for them by establishing the delivery of biometric data to "foreign" machines as common practice, nor should we help them by making our passports machine readable without keeping the passport holder in control (cf. the insecurity of RFID chips against unauthorised reading, http://dud.inf.tu-dresden.de/literatur/Duesseldorf2005.10.27Biometrics.pdf).

    • Since biometric identification is far from perfect, different measurements, and thereby different values of the biometric characteristics, are less suited to becoming a universal personal identifier than a digital reference value kept constant for ten years in your passport. Of course this only holds if these varying values of biometric characteristics are not always "accompanied" by a constant universal personal identifier such as your passport number.
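The "between data subject and his/her device" option discussed above essentially amounts to match-on-device: the reference template is stored only on the personal device, the comparison happens locally, and only an accept/reject decision is observable from outside. The following minimal sketch uses an invented toy similarity measure and threshold; it is meant only to make the data flow concrete, not to describe any real product.

    # Minimal match-on-device sketch: the biometric reference template never
    # leaves the personal device; outside parties only ever see an accept/
    # reject decision. Similarity measure and threshold are invented.

    class PersonalDevice:
        def __init__(self, enrolled_template):
            self._template = enrolled_template            # stored only locally, never exported

        @staticmethod
        def _similarity(a, b):
            # toy similarity: 1 minus the mean absolute difference of the feature values
            return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

        def unlock(self, fresh_sample, threshold=0.9):
            """Local comparison; only the boolean decision is observable from outside."""
            return self._similarity(fresh_sample, self._template) >= threshold

    device = PersonalDevice(enrolled_template=[0.2, 0.4, 0.6, 0.8])
    print(device.unlock([0.21, 0.41, 0.59, 0.79]))   # True: close match, device unlocks
    print(device.unlock([0.90, 0.10, 0.70, 0.30]))   # False: a different person is rejected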

 

Like the use of any security mechanism, the use of biometrics requires circumspection and possibly utmost caution. In any case, in democratic countries (such as those of the EU) the widespread use of biometrics in passports needs a qualified and many-sided debate. After a discussion of how to balance domestic security and privacy, an investigation of authentication and identification infrastructures able to implement this balance should begin:

 

  • Balancing surveillance and privacy should not only happen within single applications (e.g. telephony, e-mail, payment systems, remote video monitoring), but across applications. 

  • Genome databases will possibly undermine the security of biometrics measuring inherited physiological characteristics. 

  • Genome databases and ubiquitous computing (= pervasive computing = computers in all physical things connected to a network) will undermine privacy primarily in the physical world. 

  • Privacy spaces in the digital world are possible (and probably needed) and should be established, instead of trying to gather and store traffic data over long periods at high cost and for (very) limited use (in the sense of balancing across applications). 

 

 
