When elaborating the concept of ‘law by design’, we may need more detailed definitions of concepts like privacy, transparency, and profiles. To articulate the legal norms that protect values like privacy, autonomy, and non-discrimination, we will need semantics that can be translated into machine-readable data. At the same time we must avoid static, a-contextual definitions, because core features of AmI are flexibility, mobility, and contextual adaptation. For this reason we may turn to Helen Nissenbaum’s concept of ‘contextual integrity’ (Nissenbaum 2004), which can serve as an example of how to rethink legal notions in order to make them robust in the face of emerging technological infrastructures.
An important consequence of turning the offline world online will be a further transformation of the borders between the public and the private. By introducing the notion of ‘contextual integrity’, Nissenbaum avoids the need to separate the private sphere from the public sphere, and we would argue that an AmI space will require safeguards against a violation of ‘privacy in public’. The vision of AmI implies a series of enabling technologies, like sensors and RFID, that will be used as surveillance technologies capable of making people transparent in their public behaviour, thanks to profiling techniques. Both government agencies and commercial service providers (as well as public-private service providers in the sphere of, e.g., health care) will develop extensive monitoring infrastructures to be able to deliver real-time adaptations of the environment. For this reason Nissenbaum’s dynamic concept of ‘contextual integrity’ is promising, compared to the notion of privacy (which is too easily restricted to the private sphere). To prevent a violation of a person’s contextual integrity, Nissenbaum argues for the relevance of two types of norms:
norms of the appropriateness of a specific information flow, and
norms of flow or distribution of information.
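The interplay of these two norm types can be illustrated with a small sketch. The code below is a hypothetical illustration, not part of Nissenbaum’s or FIDIS’s work: a context bundles a norm of appropriateness (may this type of information surface in this context at all?) with a norm of distribution (may it flow to this recipient?), and contextual integrity holds only when both are satisfied. All names (`Flow`, `Context`, the healthcare example) are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Flow:
    sender: str
    recipient: str
    subject: str
    attribute: str  # e.g. "diagnosis", "location"

@dataclass
class Context:
    name: str
    appropriate: Callable[[Flow], bool]    # norm of appropriateness
    distributable: Callable[[Flow], bool]  # norm of flow/distribution

    def respects_integrity(self, flow: Flow) -> bool:
        # Contextual integrity requires BOTH norm types to be met.
        return self.appropriate(flow) and self.distributable(flow)

# Invented example: in a healthcare context a diagnosis is appropriate
# information, but may only be distributed to the treating physician.
healthcare = Context(
    name="healthcare",
    appropriate=lambda f: f.attribute in {"diagnosis", "medication"},
    distributable=lambda f: f.recipient == "treating_physician",
)

ok = healthcare.respects_integrity(
    Flow("patient", "treating_physician", "patient", "diagnosis"))
leak = healthcare.respects_integrity(
    Flow("hospital", "insurer", "patient", "diagnosis"))
# ok is True; leak is False, since the insurer is not an
# appropriate recipient in this context.
```

The point of the sketch is that the same attribute (a diagnosis) is neither private nor public as such: its flow violates integrity only relative to the norms of the context in which it circulates.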
By articulating such norms as legal norms within the technological infrastructure of AmI, we could perhaps find ways to make the opposing paradigms of proactive computing and personal autonomy compatible. The violation of privacy would depend on the context and refer especially to a potential disruption of the balance of power (knowledge) that could develop between individual citizens and the service providers that process data and apply profiles. This would require an intelligent mix of sensitivity to context and foreseeability, combining dynamic interpretation with the robustness of legal certainty. In law and legal theory, such a combination is not very surprising: fundamental concepts like privacy are essentially underdetermined, having an open texture in need of permanent fine-tuning to changing circumstances.
The concept of appropriateness is relevant for fair information-processing principles like purpose specification and use limitation, providing the rationale for the interpretation of such principles. For instance, instead of demanding that all purposes are always specified and the use of data is always limited to the declared purpose, norms of appropriateness would require the purpose to be appropriate, taking into account the specific features of the context within which the data are exchanged, with special focus on the consequences of processing the data in terms of profiling and categorisation.
The concept of distribution is especially relevant in the case of transparency rights, again providing a rationale for the application of such rights, taking into account the reciprocity between data subject and data controller.
Nissenbaum’s concept of contextual integrity thus seems promising as a means to create a middle ground between the opposing paradigms of data minimisation (as a means of user control) and data maximisation (as a means to achieve a smart environment):
the flow of information is not unlimited (not every exchange of data or profiles is appropriate), and
the transparency of consumer-citizens is countered by transparency of profiles (the flow of information is reciprocal, generating a fair distribution of knowledge and information).
In fact, Nissenbaum has made her concept of ‘contextual integrity’ operational by formalising it in a framework of temporal logic, thus translating norms for the exchange of data into the technological architecture (Barth, Datta et al. 2006). She thereby provides an interesting example of how to inscribe a normative framework into a technological infrastructure, which may clarify how AmL should be designed.
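The shape of such a formalisation can be sketched schematically. In the framework of Barth, Datta et al. (2006), as we read it, a policy constrains every transmission of an attribute $t$ about a subject $q$ in a message $m$ from sender $p_1$ to recipient $p_2$: the flow must satisfy at least one positive (permitting) norm and all negative (forbidding) norms, where the norms $\varphi$ may contain temporal operators referring to past or future flows (e.g., previously obtained consent). The following is a schematic rendering, not the authors’ exact syntax:

```latex
\[
\forall\, p_1, p_2, q, t, m:\quad
\mathit{send}\bigl(p_1, p_2, t(q, m)\bigr)
\;\rightarrow\;
\Bigl(\bigvee_{i} \varphi^{+}_{i}\Bigr)
\;\wedge\;
\Bigl(\bigwedge_{j} \varphi^{-}_{j}\Bigr)
\]
```

In this reading, the positive norms play the role of norms of appropriateness and permitted distribution, while the negative norms rule out flows that would violate the integrity of the context, whatever else is permitted.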