Reply 3: An anthropological approach of technology and society: an overview

Martin Meints 

Author's background: Martin Meints is a scientific researcher and security auditor at the Independent Centre for Privacy Protection (ICPP) of Schleswig-Holstein, Germany.

Cerqui introduces the theories of technological determinism and technological neutralism. In her analysis she concludes that technology is never neutral, but also not deterministic. Analysing the relationship between modern western societies and technology, she argues that the adaptation and use of technologies is an important part of these societies, and that evading technologies therefore becomes difficult for their individual members. She further concludes that the future information society using ICT implants will lead to a society that is transparent to any flow of information. Since, in her view, IT is transparent by default, we can either opt for a free flow of information or for privacy.

Society knows different types of research, the two most relevant of which can be classified as (a) clearly target-oriented research and (b) fundamental research. While target-oriented research aims at results with a clear purpose of use or even a market in mind, fundamental research aims at methods, mechanisms or components that can be used in many ways and for many purposes. Research projects may seem to belong clearly to one of these classes, but history shows that they can change class or split into two or more tracks. One example is research in nuclear fission: it started as fundamental research and led to nuclear power stations and atomic bombs. The risk of radioactive contamination is well known, taken into consideration and “managed”. Another example is the development of Thalidomide, sold as Contergan, a pharmaceutical substance originally intended as a painkiller. Unfortunately Contergan caused birth defects, notably in Germany in the early 1960s, and was consequently removed from the market as a painkiller. Today Contergan is back on the market: it has since proved to be one of the most efficient medicines against leprosy, despite the well-known adverse effects.

Both examples show that technology is neither deterministic nor neutral. In addition, they show that society can choose how a technology is implemented and used, taking adverse effects into consideration. Technology and risk assessment, and as a result modification of the implementation or of the technology itself, and in some cases even the decision not to use the technology at all, are well possible (see e.g. regulations for product safety and the prohibition of certain drugs). Regulations on the implementation and use of technology in many cases come along with enforcement, typically by police and the justice system. In the past, society has proved well able to manage technologies and their use.

Cerqui points out that access to information is the keyword for understanding the driving force of the information society. She also points out that the speed with which one is able to access information is important. But the quality of the information is also a relevant aspect, not mentioned as such by Cerqui, though the ability to access information in a timely manner is clearly an aspect of quality. Another relevant aspect of quality is context-dependence: information not related to the context one is interested in is just noise that consumes brain resources without any benefit, or may even lead to wrong decisions. Though context-related technologies such as data mining have made impressive progress, context detection, context linking and context verification still require considerable know-how and effort.
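
To make the noise argument concrete, the following minimal Python sketch scores items against a context of interest using naive keyword overlap. Real context detection, as used in data mining, is far more involved; all names and example data here are purely illustrative.

    # Naive illustration of context-dependent information quality: items
    # whose terms do not overlap with the current context of interest are
    # noise. Real context detection is far harder than keyword overlap.

    def relevance(item: str, context: set) -> float:
        terms = set(item.lower().split())
        return len(terms & context) / len(terms)

    context_of_interest = {"implant", "privacy", "data"}

    items = ["New privacy rules for implant data",
             "Celebrity gossip of the week"]

    for item in items:
        score = relevance(item, context_of_interest)
        print(f"{score:.2f}  {item}")   # low score = noise for this context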

Another important aspect is that IT is not transparent by default. First of all, IT is very complex, and its complexity continues to increase. As a result, even experts do not understand all aspects of today’s integrated IT systems. In this context, the trustworthiness of hardware and software modules and components, based on standardised evaluation procedures carried out e.g. by trusted third parties, becomes important. Secondly, research in IT security (today also called information security) already deals, among other things, with the confidentiality of information processed by IT systems. As a result, in most cases there is no free flow of information in IT systems today. Information is typically used in communicational contexts. It may well move from one context to another, but there are areas, e.g. in military research, planning and operations, where information is efficiently kept secret. In addition, the transfer of personal data from one context (e.g. the original purpose for which the data were collected and processed) to another is regulated in Europe by data protection legislation. Legislation and market forces (e.g. trade secrets), together with information security mechanisms, provide for important domains of opacity.
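
As an illustration of the purpose-binding principle mentioned above, the following Python sketch checks whether personal data may move to a new context. The class and function names are hypothetical and do not correspond to any real data protection API; this is a minimal sketch of the idea, not an implementation of the legislation.

    # Minimal sketch of purpose binding: personal data collected for one
    # purpose may only be transferred to a context whose purpose is
    # compatible with the original one. All names are illustrative.

    from dataclasses import dataclass

    @dataclass
    class PersonalData:
        subject: str
        payload: str
        collection_purpose: str          # purpose the data were collected for
        compatible_purposes: frozenset   # purposes the subject consented to

    def may_transfer(data: PersonalData, target_purpose: str) -> bool:
        """Allow a context change only if the target purpose is covered."""
        return (target_purpose == data.collection_purpose
                or target_purpose in data.compatible_purposes)

    record = PersonalData("alice", "blood pressure readings",
                          collection_purpose="medical treatment",
                          compatible_purposes=frozenset({"medical research"}))

    print(may_transfer(record, "medical research"))  # True: compatible context
    print(may_transfer(record, "marketing"))         # False: opaque by default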

Opacity in many cases is a result of information asymmetry. Information asymmetry may of course have negative effects, such as ‘lemon markets’ (Akerlof 1970) and the formation of monopolies. In cases where society is not willing to accept these monopolies, regulatory approaches soon follow, as can currently be observed in the European market for electric energy.
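
Akerlof’s mechanism can be illustrated with a small numerical sketch: if buyers cannot distinguish good cars from ‘lemons’, they offer only the average expected value, which drives the good cars out of the market. The numbers below are invented for illustration.

    # Toy illustration of Akerlof's 'market for lemons' (Akerlof 1970).

    good_value, lemon_value = 3000, 1000   # buyers' value of a good car / a lemon
    seller_reserve_good = 2500             # good-car owners won't sell below this
    share_good = 0.5                       # buyers' prior: half the cars are good

    # Under information asymmetry buyers offer only the expected value:
    offer = share_good * good_value + (1 - share_good) * lemon_value
    print(offer)                           # 2000.0

    # The offer is below what good-car owners accept, so they withdraw,
    # only lemons remain, and the market unravels.
    print(offer >= seller_reserve_good)    # False -> good cars leave the market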

It is difficult to predict what the results of ICT implants such as body-to-system or brain-to-brain communication may look like. The human nervous system is not yet fully understood, and scientists have only a rough idea of how the human brain works. A free and transparent flow of information will depend on the type of ICT implant and its position, which determine the way the brain is accessed. In addition, the brain will adapt to the implant and to the resulting flow of information. And it is quite possible that certain types of ICT implants will show severe adverse effects, in a way similar to what we know from pharmaceuticals.

From my personal point of view, a positive and constructive use of ICT implants seems possible today. We will probably see limitations on the purposes of implantation, the types of implants and the places where implants are allowed, in order to further guarantee the existence of domains of opacity. Filters controlled and configured by the bearer of the implant might be a future Privacy Enhancing Technology (PET) in this context.
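
Such a bearer-controlled filter could, for instance, work along the following lines. This Python sketch is purely hypothetical (no such implant API exists); it only illustrates the idea of opacity by default, with releases configured by the bearer alone.

    # Hypothetical sketch of a bearer-controlled filter as a PET for ICT
    # implants: data leave the implant only for contexts the bearer opened.

    class BearerFilter:
        def __init__(self):
            self.released_contexts = set()   # contexts the bearer has opened

        def allow(self, context: str) -> None:
            """The bearer, and only the bearer, opens a flow of information."""
            self.released_contexts.add(context)

        def outbound(self, context: str, reading: str):
            """Opaque by default: release data only for allowed contexts."""
            if context in self.released_contexts:
                return reading
            return None                      # the domain of opacity is preserved

    f = BearerFilter()
    f.allow("emergency medicine")
    print(f.outbound("emergency medicine", "pulse=58"))  # released
    print(f.outbound("advertising", "pulse=58"))         # None: blocked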

The most relevant aspect, to me, seems to be where the borders between the domains of transparency and opacity lie and who controls them. Who acts in which informational context on which side of the border is primarily a question of power, as outlined by Hildebrandt and Gutwirth [Hildebrandt, Gutwirth and de Hert, 2005] in their analysis of constitutional democracies.

An anthropological approach of technology and society: a final riposte

Daniela Cerqui 

It is true that technology is a wide phenomenon. It is also true that an important part of anthropology is dedicated to the study of the relationship human beings develop with technology, considered as a natural human attribute. Several famous anthropologists and palaeontologists have studied the history of humankind and have shown that there have been no human beings without technology (see for instance [Leroi-Gourhan, 1964, 1965]). Therefore, general propositions about technology are necessary to understand its universal aspect and what it means for us as humans. This then has to be considered in comparison with specific manifestations: technologies can be understood in their cultural dimension. The main goal of a committed anthropological approach applied to technology is to make people aware that it is never neutral and that there are values embedded in it. It seems that such a statement can lead to a few misunderstandings.

First, this does not mean that the user is not responsible for his/her use, as suggested by Köpsell in reply 2. On the contrary, our legal system is based on the responsibility of the user, and that is a good pragmatic approach. But one should be sensitive to the different levels of analysis, as there is also a deeper level, which is the one we forget when focusing on the user. Generally speaking, responsibility does not start once the device is produced. Thinking that technology is the only feasible approach to solving our problems is a cultural vision of the world. Reply 2 is a very typical example of a discourse that does not stand back from its own values at all: the difference between a stone used as a weapon and a technological weapon is that the second one was built by people according to a clear project. There is also one big common denominator: in both cases we should not just think about how to use the object, but about which values lie behind this use.

The aim of anthropology is not to promote ‘good’ technologies, but to show that there is always a dark side included in the package. If we want the bright side, we have to cope with the dark one. And as long as we consider it simply as a side effect, we cannot face it properly. Wise political reflection has to take that into account.

For instance, the idea that technology can work not only to the detriment of privacy but also to its protection is a very good starting point for acting from an anthropological standpoint. Such a position admits the two faces of the coin. Describing privacy as a dynamic notion implies that privacy might have to be redefined as new expectations appear, and it can definitely lead us to think about how to balance good and bad impacts.

The more information circulates, the better we understand that it needs boundaries. It is true, as argued by Meints in reply 3, that IT is not transparent in the way it works. Nevertheless, my point is not about the way it works, but about what it is supposed to bring to the user, who is supposed to have unlimited access to data. In such a context, making machines ‘user-friendly’ is one more way of making sure you do not lose time understanding how they work and can directly become an efficient user.

Information security is a very good illustration of how we sometimes try to impose limits, after the fact, on a system that was first built with insufficient boundaries. It does not make sense to create new ways of keeping information secret every time we realise that there is a new ‘sensitive’ field in which unlimited access may be a problem. That is a reactive way of acting, based on a misunderstanding of the fundamental logic. Focusing on these boundaries just gives us the illusion that we master the flow and the way it is oriented, but we do not, unless we develop an anticipative reflex, taking into account the various plausible scenarios for the future.

That implies interdisciplinary collaboration, involving people open-minded enough to embrace other points of view. But respect for other scientific fields may well be an unreachable goal for some (disagreement is one thing, contempt is another!), so the road is still long.

 

 

 
