
 

Converging technologies, society & privacy

 

Eleni Kosta, Diana Bowman & Bert-Jaap Koops 

Authors' backgrounds: Eleni Kosta is a legal researcher at the Interdisciplinary Centre for Law and Information & Communication Technology (ICRI) at the Katholieke Universiteit Leuven, and primarily works in the field of privacy and identity management, specialising in new technologies. Diana Bowman is a research fellow at the Institute for Energy and Environmental Law at the Katholieke Universiteit Leuven, working in the area of nanotechnology and regulation. Bert-Jaap Koops is professor of regulation & technology and Academic Director of the Tilburg Institute for Law, Technology, and Society (TILT) of Tilburg University, with a research interest in law & technology and other topics of technology regulation, such as information security, identity, and the regulation of bio- and nanotechnologies.

The emergence of new technologies provides the potential for vast and varied applications, bringing with it both promise and peril. As Daniela Cerqui reminds us, technology has both good and bad implications, but its value does not depend merely on the uses to which it is put: technology is never neutral. Technology mirrors social and cultural values, if only because technology developers do not operate in a vacuum, but in a broader social and cultural context.  

This dynamic between technology development and the socio-cultural context provides a relevant perspective for analysing a topical issue: the value of privacy in the upcoming era of converging technologies. Starting from the promises and perils of nanotechnologies, we will argue that the miniaturisation of technology, particularly through the combination of nanotechnologies, information and communication technology (ICT), and bio-implants, is likely to diminish the level of privacy protection in society, unless action is taken to actively embed privacy protection in technology. Cerqui argues that after the ‘transplantation revolution’, direct and unbounded sharing of information between a human being and their environment is possible, but only if we discard privacy. We will show that this is not necessarily the case: we can choose to retain at least a certain amount of privacy even in the era of converging technologies, provided that it is embedded in the design stage of technology.

While the evolution of new technologies and applications, including the Internet and the World Wide Web, has fundamentally changed the way in which we, for example, communicate, work and even engage in everyday consumer transactions, the borderless and ubiquitous nature of the ‘Internet Revolution’ has also challenged society’s ability to protect individual privacy. The notion of privacy is not static: it changes and adapts according to the needs of society and the progress of technology. Data about the individual can be collected through various channels (see, for example, [Froomkin (2000)]), such as cameras in public and private places, mobile phones, access cards, security controls or smart dust, to name just a few. The threats against privacy arising from Jeremy Bentham’s conception of the ‘Panopticon’, or those described in the Orwellian society of ‘Nineteen Eighty-Four’, seem minimal compared to the mass threats against privacy posed by emerging technologies. At least in those societies, individuals knew that they could be watched at any point in time. Baird and Vogt add to this that while “advances in information technology are already now augmenting concerns about privacy, […] these will be severely heightened by likely developments in nanotechnology” [Baird and Vogt (2004)].

 

Nanotechnologies and privacy 

The nanotechnology ‘revolution’ has already begun and, as Cerqui highlights in relation to all technologies, the coming ‘nano-age’ is likely to have both positive and negative impacts on society. Benefits of this heterogeneous family of technologies are anticipated across all industrial sectors, including, for example, the electronics, health, manufacturing and energy sectors. At the same time, concerns have been raised over the potential health and safety implications of engineered nanoparticles, and over the impact of the emerging technology on privacy.

While nanotechnology is still a relatively young field of science, commentators have identified a range of potential products, particularly within the field of ‘nano-electronics’, that may further challenge the current regulatory frameworks charged with protecting our privacy. These include the development of smart dust, ‘ubiquitous miniature sensors floating around in the air’ designed to detect information and communicate it to other machines; micro- or nano-scale Radio Frequency Identification (RFID) tags that could be incorporated into an extensive range of products for inventory control and product-information storage purposes [Rodrigues (2006)]; and the creation of nano-scale surveillance devices. In relation to the latter, Moor and Weckert suggest that nanotechnology would make it ‘extremely easy to put a nanoscale transmitter in a room or onto someone’s clothing so that he or she will have no idea that the device is present or that he or she is being monitored and tracked… implanting tracking mechanisms within someone’s body would also become easier with nanotech devices’ [Moor and Weckert (2004)].

While the use of nano-scale monitoring and tracking equipment may pose a number of challenges to the protection of privacy, commentators such as Mehta have also recognised that these developments may offer significant societal benefits within the fields of health care, environmental remediation and national security. Within the health-care sector, for example, long-lasting nano-scale sensors designed to detect viruses, pathogens, and cancer cells may be embedded in the body to provide continuous health monitoring. However, the collection and use of sensitive health information, including genetic information, by such nano-sensors may give rise to a number of privacy implications, as well as to other fundamental issues such as employment and genetic discrimination. While these privacy and discrimination issues are far from unique to the advent of nanotechnology, they may well be amplified by the technology’s invisible and unobtrusive nature.

 

Privacy in the era of converging technologies 

Cerqui highlights the prospect of ICT convergence with the cognitive sciences, noting for instance that “information technologies are also getting closer to the human body with each new breakthrough”. This is part of a longer-term development (10–20 years) of nanotechnologies’ convergence with other technology platforms, specifically biotechnology and biomedicine, ICT and the cognitive sciences [Roco and Bainbridge (2002)]. This anticipated convergence has been labelled ‘Nano-Bio-Info-Cogno’, or NBIC. It is often associated with the idea of human enhancement or improvement through, for example, neuro-implants, cognitive enhancement, and brain-to-machine communication [Roco and Bainbridge (2002a)]. Gordijn [Gordijn (2006)] argues that, should NBIC technologies be realised, they will, like other technological developments, bring with them both promise and peril, highlighting in particular the issue of infringement on privacy. Regardless of the trajectory of NBIC, it appears likely that privacy will remain a central concern in the coming ‘converging-technologies age’.

What role is there for privacy in this upcoming age? Technology is in many instances considered privacy-destroying and is often perceived as a natural enemy of privacy. No one can question the fact that an individual’s privacy is threatened by the massive and easy collection of information realised through the use of emerging technologies. Notwithstanding the fact that new technologies have increased the amount of data collected, some believe that actually sorting and processing those data becomes ever more cumbersome and costly. In contrast to the human processing that took place in the past, processing carried out by machines, followed by human intervention for verification and checking, can prove so expensive that many will not go through with it, “the costs of control [thus yielding] a certain kind of freedom”. This statement should of course be countered by the argument that the linking of available data or databases, data-mining, and profiling do allow increasing scrutiny of groups and individuals, which may give law enforcement agencies and private companies substantially more power over the individual. However, technology can work not only to the detriment of privacy but also for its protection. Privacy-Enhancing Technologies, such as P3P, may assist individuals in securing themselves against technology-enabled violations and allow them to “enable upstream control of privacy rights as well as individual control.”
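To make the idea of such ‘upstream control’ concrete, the following minimal sketch shows how a P3P-style user agent could compare a site’s declared data-handling policy with a user’s stated preferences before any data are sent. It illustrates the general mechanism only; the policy fields, preference names and values are hypothetical and do not follow the actual P3P specification.

```python
# Minimal, hypothetical sketch of a P3P-style user agent: the site declares
# what data it collects and why, the user states what they accept, and the
# agent warns before any data are sent. Illustration only, not the real P3P.

site_policy = {                      # hypothetical machine-readable policy
    "purposes": {"shop-delivery", "marketing"},
    "retention": "indefinite",
}

user_preferences = {                 # hypothetical user settings
    "allowed_purposes": {"shop-delivery"},
    "max_retention": "transaction-only",
}

def acceptable(policy: dict, prefs: dict) -> bool:
    """Accept only if every declared purpose is allowed and retention is limited."""
    return (policy["purposes"] <= prefs["allowed_purposes"]
            and policy["retention"] == prefs["max_retention"])

if not acceptable(site_policy, user_preferences):
    print("Warning: this site's policy conflicts with your privacy preferences.")
```

The point of such a tool is simply that the negotiation happens before disclosure, shifting control to the moment the data would otherwise leave the user’s hands.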

While considering the threats against privacy posed by the broad use of nanotechnologies, the French Data Protection Authority (CNIL) highlighted that the European Data Protection Authorities should ensure that the wave of broad use of nanotechnologies “does not constitute a threat for the consistency of data protection principles and provisions, which are more relevant than ever”. CNIL seems, however, to overlook that new technologies create new expectations of privacy compared to existing ones. It has been observed that when people interact in the online world, they effectively transfer their offline expectations to this new environment. As Leenes and Koops note, “in the case of new, privacy invasive technologies, the user will have false expectation of privacy. And hence the use of such technologies will be considered a greater violation of privacy” [Leenes and Koops (2006)]. The reverse, however, is also true: as technology is slowly embedded in society, often to the detriment of privacy, reasonable expectations of privacy are lowered, resulting in a slow erosion of privacy through the mere development of technology. This process is hard to counter, but a conscious effort of society to stimulate and use PETs, notably in the design stage of technology, may provide a way to retain privacy after all. In order to continue to protect individual privacy, it is important that society anticipates the trajectory of developments within this field and takes a proactive approach to the implementation of relevant privacy safeguards. And while it is sometimes argued that privacy is rapidly becoming outdated, with younger generations caring less and less for keeping things private in the era of mobile communications, online communities, and Big Brother TV programmes, it is not a foregone conclusion that privacy will not be considered an important value in future society. As De Hert and Gutwirth demonstrate, a democratic society needs both tools of transparency and tools of opacity, and privacy is a key tool of opacity to foster [Gutwirth and De Hert (2005)].

The vulnerability of privacy in the face of all these technological threats is beyond doubt. Perhaps, in the society of micro- and nano-technologies, of RFID chips and ICT implants, privacy has to be redefined. Machine-to-machine communication can lead the way to privacy-enhanced technology tools and new ways of communicating, in which individuals’ expectations of privacy will differ from those they held in earlier ages. But however changed, we believe that these expectations will not disappear outright.

The important fact to bear in mind is that the relationship between technology and privacy is not intrinsically antagonistic. Although technology poses an abundance of new threats against privacy, it can also function as a means to protect privacy. Privacy-Enhancing Technologies have not yet been widely developed or implemented, but that does not mean the promise of PETs will never come true. The best news is that, slowly, technologies are emerging that incorporate privacy protection while at the same time allowing for other interests, such as security, ease of use, or efficiency.

As an example of these promising emerging technologies, we refer to information-security protocols being developed that reconcile information-sharing with confidentiality-keeping [Teepe (2006)]. These protocols allow interested parties, such as the police and national-security agencies, to have access to personal data and to compare these data with other databases, without actually being able to know the content of these data. This enables information-sharing and data-mining without infringing privacy up-front. Only after a match shows up, e.g., when a passenger on a flight to the United States features on the blacklist of people not allowed to enter the country, are the data of that particular passenger revealed to the proper authorities. As Wouter Teepe argues, not all privacy problems related to anti-terrorism measures can be solved by such protocols, but some of them can. If ‘privacy by design’ is taken seriously, a new range of opportunities opens up. It is just a matter of deciding “what kind of society we want to live in, in the future”.
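As a rough illustration of how such a protocol might work in the passenger-screening example, consider the sketch below, in which passenger identifiers are compared against a watchlist in blinded (keyed-hash) form, so that only matching records are ever disclosed. This is not Teepe’s actual protocol but a simplified rendering of the underlying idea; the parties, key and data are hypothetical, and a real protocol would also have to address issues such as guessing attacks on small identifier spaces.

```python
# Simplified sketch of match-then-reveal information sharing (not Teepe's
# actual protocol): identifiers are compared as keyed hashes, so nothing
# about non-matching passengers is disclosed. All names, the key and the
# records below are hypothetical.
import hmac
import hashlib

SHARED_KEY = b"key agreed between airline and agency"   # hypothetical

def blind(identifier: str) -> str:
    """Keyed hash that hides the identifier from anyone who only sees the token."""
    return hmac.new(SHARED_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Airline side: records stay local; only blinded tokens are handed over.
passengers = {"M. Smith": {"flight": "BR123"}, "J. Doe": {"flight": "BR123"}}
blinded_passengers = {blind(name): name for name in passengers}

# Agency side: the watchlist is blinded with the same key and compared
# token by token, without the agency seeing any non-matching passenger.
watchlist = {"J. Doe", "A. Example"}
blinded_watchlist = {blind(name) for name in watchlist}
matching_tokens = blinded_watchlist & set(blinded_passengers)

# Only after a match are that passenger's data revealed to the authorities.
revealed = {blinded_passengers[token]: passengers[blinded_passengers[token]]
            for token in matching_tokens}
print(revealed)   # {'J. Doe': {'flight': 'BR123'}}
```

The design choice that matters here is the ordering: comparison happens on data that neither party can read, and disclosure is the exception triggered by a match rather than the default.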

With these thoughts in mind, we would like to believe that, despite all gloomy prognoses, ‘all is not yet lost’.

 

 
