The role of forensics in identity
Brief background
The aim of forensic research is to support investigatory and judicial processes by finding traces in otherwise apparently unpromising raw material, from which it is possible to build a picture of events and activities. Locard’s Principle lies at the foundation of what forensic scientists do: “Every contact leaves a trace”. Forensics and identity are therefore inherently linked, because the aim is typically to identify a person or persons, or to link a person with an activity at a scene. As forensic techniques improve, however, the knowledge of how to defeat them keeps in step. The investigator therefore has to keep in mind that what the evidence points to may not in fact be correct, and that a broader picture is necessary.
Set in a world not too dissimilar to today, this scenario explores how biometric identification can be spoofed so that someone is implicated in a crime, and how advances in forensic techniques could subsequently prove their innocence.
Scenes
Scene 1: A rude awakening
The digital readout on the clock flashes to 03:05 – the night is very still, and the Craggs are sound asleep. While the people may be resting, the house is very much awake. Such uninterrupted time is ideal for dedicated number crunching – a time when all the data collated during the day can be sorted, cleaned and processed to yield new information to update and augment the profiles currently used in the system. That is, however, until the system flags a new primary task – the security system’s proximity sensors have detected anomalous movement in the vicinity of the front door. Because of the house’s countryside location and the local wildlife, such an event is not unusual, and the system monitors a variety of sensors to establish whether an event is of real importance. As the threat level flicks from amber to red, it appears that in this case it very much is. In line with David’s preferences, the lights in the bedroom are switched on dimly, and a computer-generated voice tries to wake him from his slumber with a warning. He comes round in time to hear an almighty crash at the front door, a thunder of feet pounding through the house, and the sound of men shouting down the hallways.
Scene 2: Ello, ello, ello …
By late morning, things had started to become somewhat clearer. The hasty arrest of David’s wife Li-lian for ‘data theft’, and the immediate confiscation of their laptop computers and primary house server during the police raid, had shed precious little light on the situation. In fact, little was revealed during the associated chaos until Li-lian’s interview with the detective in charge of the case some hours later. It transpired that someone had gained high-level access to the computer system of the hotel where Li-lian worked and had stolen personal details, including banking and credit card numbers, from its customer database. A partial print and DNA left at the scene had been cross-referenced with the UK’s national ID card and national DNA databases, and had placed Li-lian in the top ten of likely matches. Since Li-lian did not have security clearance for the main server room where the breach occurred, finding her partial fingerprint and DNA there appeared to be quite damning evidence. The only problem was that not only did Li-lian emphatically deny any knowledge of the crime, she also appeared to have an alibi for the time it occurred …
Scene 3: Good old fashioned high-tech forensic police work
It was certainly true that Li-lian did not fit the profile of a cyber-criminal, and this had cast doubt over the case from the beginning of the investigation. However, identity theft was big business, and the police had taken a rapidly growing interest in it over the last few years. As such, it was now procedure to confiscate personal computer equipment for searching before anything could be removed or deleted. Of concern was the fact that no evidence could be found on the computers, and that the profiling agent on Li-lian’s home server indicated that she was in fact at home with her family at the time of the attack – something her husband readily confirmed. This left something of a conundrum: someone had managed to defeat the iris scanner on the door to the server room to gain access, had stolen personal data, and had then left the fingerprint of someone else. As all leads began to look cold, there came a stroke of luck. The details of the crime had, as usual, been entered into the local police station’s database. While databases across the country were not explicitly linked, the UK police force now used a system called LinKSeE, an artificially intelligent data-mining program which distributes software agents across the isolated police databases to hunt for patterns and correlations and generate new, potentially useful knowledge. In this case, the system had noted a case six months previously, in a different police jurisdiction, with a very similar modus operandi. Not only was the target again a hotel and the method of attack identical, but the system had cross-referenced the employee lists from both hotels and had come up with a match.
Scene 4: A rude awakening, take 2
At 07:00 in the morning, the police swooped on the home of their new suspect. The man had been employed as a cleaner at both hotels at the time of the attacks, and it seemed clear that he was key to the data theft crimes. Indeed, the lifestyle revealed by analysis of his bank records, and the out-of-place Mercedes on his driveway, also pointed to someone not surviving on a cleaner’s wage. In a makeshift workshop in the house the police found what they were looking for: materials for lifting fingerprints and constructing gelatine copies used to plant fake prints at the scene, and samples of Li-lian’s hair containing her DNA. On a computer they also found high-resolution holiday photos of the hotel’s head of security, downloaded from the internet, from which printed copies of his iris could be made to spoof the hotel security systems. Certainly enough evidence to exonerate Li-lian of the crime.
Discussion
Data matching is the traditional, retrospective form of offender profiling: linking individuals with personal identifying data. Alongside it sits the proactive practice of ‘data mining’ or ‘risk profiling’ – finding patterns and correlations in large, and sometimes separate, databases.
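The distinction can be illustrated with a minimal sketch in Python; the records, field names and identifiers below are entirely hypothetical and stand in for the kind of case data the text describes. Data matching retrospectively looks a known identifier up against stored records, while data mining searches across the records themselves for recurring patterns.

```python
# Minimal illustration of data matching vs. data mining (hypothetical records).
from collections import Counter

records = [
    {"case": "A101", "print_id": "fp-77ac", "method": "server-room access"},
    {"case": "B202", "print_id": "fp-13b9", "method": "phishing"},
    {"case": "C303", "print_id": "fp-77ac", "method": "server-room access"},
]

# Data matching: retrospectively link one known identifier to stored cases.
def match_identifier(print_id):
    return [r["case"] for r in records if r["print_id"] == print_id]

# Data mining / risk profiling: look for patterns across all records.
def common_methods(min_count=2):
    counts = Counter(r["method"] for r in records)
    return {method: n for method, n in counts.items() if n >= min_count}

print(match_identifier("fp-77ac"))   # ['A101', 'C303']
print(common_methods())              # {'server-room access': 2}
```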
Although the information extracted by data-matching techniques can be used as evidence in court, forensic science requires a statement of the technology’s limitations and thus of how strong or weak that evidence is on its own. It is therefore important to also consider other available evidence. With many systems there is a possibility of incorrect association of a user with a device, of deliberate tampering with the system, or of system error through incorrect usage or technical faults. A classic example is that fingerprints can be spoofed, and indeed other biometric features can be copied, even without the owner of that feature knowing it. Such issues are compounded by enormous databases such as national fingerprint or DNA databases, since false hits are inevitable, especially from partial prints or because of errors in the data entered. Already in the UK such a DNA database exists, in which samples are retained from people charged with an offence even if they are subsequently acquitted, and an obligatory fingerprint database seems likely as part of the UK’s national ID card scheme. Sensible measures therefore need to be employed by investigators during the examination process to factor in the integrity of the data, and thus of the results, since this could provide an alternative hypothesis. Additionally, manufacturers’ claims regarding biometric security devices should always be verified. If a manufacturer claims a device has liveness detection (a test that determines that the sample presented to the device comes from a living person and is not, for example, a paper copy), this should be checked, as such claims are not always true.
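The scale of the false-hit problem can be made concrete with a rough back-of-the-envelope calculation; the false match rate and database size below are illustrative assumptions, not figures for any real national database.

```python
# Rough arithmetic for false hits when one partial print is searched against a
# large database. Both figures below are illustrative assumptions only.

false_match_rate = 1e-4      # chance a partial print wrongly matches a single record
database_size = 5_000_000    # records in a hypothetical national database

expected_false_hits = false_match_rate * database_size
p_at_least_one = 1 - (1 - false_match_rate) ** database_size

print(f"Expected false hits: {expected_false_hits:.0f}")       # 500
print(f"P(at least one false hit): {p_at_least_one:.6f}")      # effectively 1.0
```

Even with a comfortable-sounding per-comparison error rate, a single query against millions of records would return hundreds of spurious candidates, which is why a database hit alone should never be treated as conclusive.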
A common problem in police investigations is ‘linkage blindness’: there may well be data relevant to an ongoing case, but if it is not known to the investigators – because, for example, it is stored in another police database – then it is of no use. Emerging technologies could help alleviate this by deploying ‘intelligent’ software agents that find links in distributed data which would take a human operator impossibly long to find. The main danger with such technology is the impact of false hits. This has implications on two levels: firstly, the police do not want to waste time investigating what is simply coincidence; secondly, there could well be a negative impact on people who are falsely implicated in a crime. Technology will undoubtedly be helpful, but it is certainly not going to replace good old-fashioned police work.
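As a hedged illustration of the idea behind such linking agents – assuming drastically simplified case records and a made-up similarity rule, and not reflecting how any real police system (or the fictional LinKSeE above) actually works – the sketch below scans cases held in separate databases for overlapping modus operandi and cross-references the associated staff lists, flagging shared names as leads for a human investigator.

```python
# Sketch of a cross-database linking agent (simplified, hypothetical data).
# Cases from separate police databases are compared by modus operandi tags;
# when two cases look similar, their associated staff lists are cross-referenced.

db_north = [
    {"case": "N-0412", "mo": {"hotel", "iris-spoof", "data-theft"},
     "staff": {"j.smith", "a.kumar", "p.novak"}},
]
db_south = [
    {"case": "S-0978", "mo": {"hotel", "iris-spoof", "data-theft"},
     "staff": {"l.cragg", "p.novak", "m.ortiz"}},
    {"case": "S-1033", "mo": {"burglary"}, "staff": set()},
]

def mo_similarity(a, b):
    """Jaccard overlap of modus operandi tags."""
    return len(a & b) / len(a | b) if a | b else 0.0

def find_links(dbs, threshold=0.6):
    """Flag cross-database case pairs with similar MO and any shared people."""
    links = []
    for i, db_a in enumerate(dbs):
        for db_b in dbs[i + 1:]:
            for ca in db_a:
                for cb in db_b:
                    if mo_similarity(ca["mo"], cb["mo"]) >= threshold:
                        shared = ca["staff"] & cb["staff"]
                        links.append((ca["case"], cb["case"], shared))
    return links

# A flagged link is a lead for investigators, not proof: false hits are expected.
for case_a, case_b, shared in find_links([db_north, db_south]):
    print(case_a, "<->", case_b, "shared people:", shared or "none")
```

Whether a flagged link is worth pursuing still depends on human judgement; as noted above, false hits are an inherent cost of mining across large, noisy data sets.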
The use of biometrics has been extensively covered in D3.2: ‘A study on PKI and biometrics’, and D3.10: ‘Biometrics in identity management’. Biometrics in a forensic context, including methods of spoofing, has been covered in D6.1: ‘Forensic Implications of Identity Management Systems’, while profiling in a forensic context, including methods to address linkage blindness, has been explored in D6.7: ‘Forensic Profiling’. Biometry and security in existing and future ID documents is covered by D3.6: ‘Study on ID Documents’.