
D11.3: Economic aspects of mobility and identity


 

Business Compliance to European Data Protection Legislation

The data protection directive aims to provide a working balance between the needs of the data subjects and those of the data controllers by facilitating and encouraging the free flow of personal data while at the same time strictly safeguarding the privacy of the individual. Within this perspective, we can perceive data protection as a technical term relating to specific information management practices, or as the preferred stance of those who would see data protection primarily as an aspect of business regulation. In contrast, privacy is more likely to be considered as a fundamental human right and accorded specific protection under human rights conventions or constitutions. It is however possible to discuss privacy issues in the terminology of risk and risk assessment, concepts which are, perhaps, more familiar in a business environment. In particular, three risk factors can be identified which could be considered to be elements of privacy:  

  1. risk of injustice: this can occur due to a significant inaccuracy in personal data, unjust inference, ‘function creep’ (the gradual use of data for purposes other than those for which it was collected) or reversal of the presumption of innocence, as seen in data mining, where the correlation of information from disparate sources may produce an impression that is greater than (or different from) the sum of the parts.

  2. risk of excessive and unjustified surveillance: the data controller could exert dubious control over the collection of personal information, for example by collecting data without the data subject’s consent. This risk could even extend to the active discouragement of the means to remedy it, such as the use of encryption and anonymising software or hardware.

  3. risk to data subject’s dignity: this can occur as a result of exposure or embarrassment due to an absence of transparency in information procedures, physical intrusion into private spaces, unnecessary identification or absence of anonymity, or unnecessary or unjustified disclosure of information without consent.

All of the above have echoes of data protection issues and, in the technical sense, data protection measures may be considered as risk management devices which need to balance the risk to the individual from unnecessary invasion of privacy with the measures necessary to control that risk.

A brief taxonomy of the various compliance costs that have an impact on the business sector must precede any analysis of the specific issues relating to the concept of mobility. These costs, as indicated by the Final Report on the Economic Evaluation of the Data Protection Directive, fall under the following categories:

  1. costs linked to learning about the requirements of the Directive 

  2. costs in adjusting the internal organisation to comply with the Directive 

  3. running cost of compliance 

  4. quantity and costs of human resources involved in compliance 

  5. cost of external advice and support 

 

In addition, the aforementioned costs can be divided into three categories: financial costs, subjective compliance costs and administrative burdens. 

Financial costs are the result of a concrete and direct obligation to transfer a sum of money to the government or the competent Data Protection Authority. This would include for example the one-off fee for notification that can be imposed on data controllers by the national Data Protection Authority, under article 18 of the data protection directive.

Subjective compliance costs cover expenses that businesses must incur in order to comply with specific substantive obligations imposed by legislation and regulation. The investment in new technological measures to ensure the protection of personal data is a typical example of these costs. 

Finally, administrative burdens cover the costs of business compliance with the information obligations resulting from legislation. The requirement to notify the national Data Protection Authority, for example, would fall under this category. 

From a different perspective, compliance costs can fall into two categories. One-off costs cover expenses and activities such as the gathering of knowledge about the requirements of the data protection directive, the initial training of the staff who will later handle the data processing, the initial notification to the competent national authority, the investment in new technology to ensure a secure environment in which data will be processed, etc. Running costs, on the other hand, include the notification to the competent authorities regarding processing operations, the authorisation and notification of transfers of data to third countries, the training of staff to handle new processing techniques, etc., as well as the handling of data subjects’ requests for information, correction, etc. (being a legitimate exercise of their data protection rights). 
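The one-off/running distinction above can be sketched as a simple cost model. All figures and category names below are hypothetical, for illustration only; the deliverable does not quantify these costs.

```python
# Hypothetical compliance-cost model. The split mirrors the one-off /
# running distinction described in the text; the amounts are invented.

ONE_OFF = {
    "learning_requirements": 5_000,    # gathering knowledge about the directive
    "initial_staff_training": 8_000,
    "initial_notification": 500,       # first notification to the authority
    "security_technology": 20_000,     # investment in a secure environment
}

RUNNING_PER_YEAR = {
    "notifications_and_authorisations": 1_000,  # incl. third-country transfers
    "ongoing_staff_training": 3_000,
    "data_subject_requests": 2_500,    # information, correction, etc.
}

def total_compliance_cost(years: int) -> int:
    """One-off costs are paid once; running costs recur every year."""
    return sum(ONE_OFF.values()) + years * sum(RUNNING_PER_YEAR.values())
```

Under these assumed figures, a three-year horizon costs `total_compliance_cost(3)`, i.e. the one-off outlay plus three years of recurring expenses.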

In order to comply with the data protection principles laid down in the data protection directive, data controllers should store only the bare minimum of data that suffices for the running of their services. By adopting a ‘data avoidance policy’ (that is, by implementing an infrastructure oriented towards collecting, processing and using either no personal data or as little as possible), data controllers can greatly minimise their compliance costs. For this purpose, it is advised that privacy issues, and in particular the processing of personal data (with its further implications for identity management), be taken into account at the earliest stage of the organisation of the data controller’s infrastructure (‘privacy by design’).
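A ‘data avoidance policy’ can be sketched in a few lines: the service declares the bare minimum of fields it needs and discards everything else before storage. The field names and the record below are hypothetical.

```python
# Illustrative 'data avoidance' sketch: only fields the service
# genuinely needs are ever stored; everything else is dropped at intake.

REQUIRED_FIELDS = {"customer_id", "email"}  # the assumed bare minimum

def minimise(record: dict) -> dict:
    """Keep only the fields needed to run the service; drop the rest."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

submitted = {
    "customer_id": "c-1042",
    "email": "user@example.org",
    "date_of_birth": "1980-03-14",   # not needed -> never stored
    "location": "52.52,13.40",       # not needed -> never stored
}

stored = minimise(submitted)
```

Because the surplus fields never reach storage, they never enter the scope of the compliance obligations discussed above.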

Besides its use as a means of compliance with data protection legislation, anonymity can also be perceived as a tool available to a data subject to shield his identity from those with whom he interacts. Current interest in this topic has primarily focused on anonymity in the context of the Internet, and the need to balance privacy through anonymity against the needs of society to be able to identify individuals engaged in particular activities. However, to the extent that electronic communication networks are converging by offering the same types of services, the relevant issues that have already emerged on the Internet are bound to arise on other electronic communications networks as well. 

More specifically, a feature of all electronic communications networks is their potential to generate a huge quantity of traffic data, that is, data processed for the purpose of the conveyance of a communication on an electronic communications network or for the billing thereof. The possibilities for interactive use of the networks increase the amount of traffic data yet further. It follows that the choices of the user of the network create a ‘clickstream’ of traffic data, which can be perceived as a ‘digital trace’, the monitoring of which enables the profiling of the user’s online behaviour. For these reasons, such data also fall under the protective scope of the legislative framework of data protection: traffic data must be erased or made anonymous when no longer needed for the purpose of the transmission or for billing. However, this obligation to erase or anonymise traffic data does not conflict with such procedures on the Internet as the caching of IP addresses in the domain name system, the caching of bindings between IP addresses and physical addresses, or the use of log-in information to control the right of access to networks or services.
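The erase-or-anonymise obligation can be illustrated with a small sketch: once a record is no longer needed for billing, its IP address is coarsened by zeroing the last octet (one common anonymisation technique). The 90-day retention period and the record layout are assumptions for illustration, not values from the directive.

```python
# Sketch of an erase-or-anonymise routine for traffic data.
# Retention period and record shape are hypothetical assumptions.

from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # assumed period during which billing needs the data

def anonymise_ip(ip: str) -> str:
    """Coarsen an IPv4 address by zeroing its last octet."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

def scrub(records: list, now: datetime) -> list:
    """Anonymise traffic records no longer needed for billing purposes."""
    out = []
    for rec in records:
        if now - rec["timestamp"] > RETENTION:
            rec = {**rec, "ip": anonymise_ip(rec["ip"])}  # no longer billable
        out.append(rec)
    return out
```

A production system would of course also have to cover IPv6, other identifying fields, and outright erasure where anonymisation does not suffice; the sketch only shows the retention-driven trigger.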

Moreover, the processing of traffic data by the provider of publicly available electronic communications services for the marketing of electronic communications services or for the provision of value added services may only be allowed if the subscriber has agreed to this on the basis of accurate and full information, given by the provider, about the types of further processing it intends to perform and about the subscriber’s right to give or to withdraw his consent to such processing. Especially in the area of digital mobile networks, location data giving the geographic position of the terminal equipment of the mobile user fall under the definition of traffic data and are therefore regulated by article 6 of the ePrivacy directive. In addition, however, digital mobile networks may have the capacity to process location data which are more precise than is necessary for the transmission of communications and which are used for the provision of value added services such as individualised traffic information and guidance to drivers. The processing of such data for value added services should be allowed only where the subscribers have given their consent. Even in these cases, however, the subscribers should have simple means to temporarily deny the processing of location data, free of charge.
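The consent regime described above — purpose-specific consent, the right to withdraw it, and a simple, free, temporary opt-out for location data — can be sketched as a small registry. The class and method names are hypothetical, not drawn from any real system.

```python
# Hypothetical consent registry sketching the directive's requirements:
# per-purpose consent, withdrawal, and a temporary processing opt-out.

class ConsentRegistry:
    def __init__(self):
        self._consent = {}        # subscriber -> set of consented purposes
        self._suspended = set()   # subscribers temporarily denying processing

    def grant(self, subscriber: str, purpose: str) -> None:
        """Record informed consent for one purpose of further processing."""
        self._consent.setdefault(subscriber, set()).add(purpose)

    def withdraw(self, subscriber: str, purpose: str) -> None:
        """The subscriber may withdraw consent at any time."""
        self._consent.get(subscriber, set()).discard(purpose)

    def suspend(self, subscriber: str) -> None:
        """Temporarily deny all processing (the free opt-out)."""
        self._suspended.add(subscriber)

    def resume(self, subscriber: str) -> None:
        self._suspended.discard(subscriber)

    def may_process(self, subscriber: str, purpose: str) -> bool:
        """Processing is allowed only with consent and no active opt-out."""
        return (subscriber not in self._suspended
                and purpose in self._consent.get(subscriber, set()))
```

The point of the sketch is that the provider checks `may_process` before every value added use of location data, so a suspension or withdrawal takes effect immediately.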

From a more general perspective, in order to tackle the various privacy and data protection issues that arise in respect of mobile users in a – more or less – ubiquitous computing infrastructure, businesses need to adopt a privacy-driven security and data protection model which respects and protects the privacy of the user/data subject, e.g. the user’s identity, his preferences and the management of his profile. One way to develop such a privacy model is to rely on virtual identities and zero-knowledge authentication, which severs the link between the identity of the principal and the requested action. A virtual identity provides anonymity to the users, which in turn calls for the implementation of a security model that ensures accountability for potential abusers of the network. A viable solution would be the adoption of a trust-based security architecture, in which autonomous entities establish trust in other entities based on collected evidence such as reputation, recommendation and records of past experience (see, for example, the reputation system adopted by eBay).
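An eBay-style reputation mechanism of the kind mentioned above can be sketched as follows: entities accumulate positive and negative ratings, and peers trust an entity only once its aggregate score clears a threshold. The threshold and the minimum number of ratings are assumed parameters, not values from any real system.

```python
# Minimal trust-based reputation sketch: trust is established from
# collected evidence (ratings). Threshold values are assumptions.

from collections import defaultdict

class ReputationSystem:
    def __init__(self, threshold: float = 0.8, min_ratings: int = 3):
        self.ratings = defaultdict(list)  # entity -> list of True/False votes
        self.threshold = threshold        # assumed minimum positive share
        self.min_ratings = min_ratings    # assumed minimum evidence base

    def rate(self, entity: str, positive: bool) -> None:
        """Record one piece of evidence (past experience) about an entity."""
        self.ratings[entity].append(positive)

    def score(self, entity: str) -> float:
        """Fraction of positive ratings; 0.0 with no evidence."""
        votes = self.ratings[entity]
        return sum(votes) / len(votes) if votes else 0.0

    def is_trusted(self, entity: str) -> bool:
        """Trust requires enough evidence and a high enough score."""
        votes = self.ratings[entity]
        return len(votes) >= self.min_ratings and self.score(entity) >= self.threshold
```

Because trust is computed from behaviour rather than from a disclosed identity, such a scheme can hold the holder of a virtual identity accountable without de-anonymising him.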

However, business compliance with data protection legislation should not be perceived as passive behaviour, but as a field where initiatives to protect the privacy of the data subject are welcomed and endorsed. For example, in order to help individuals control the flow of their personal information when they interact in a networked environment, a new ‘breed’ of technologies, so-called Privacy-Enhancing Technologies (or PETs), has been developed over the last couple of years. Their purpose is to restore the balance of power between the individual who wants to retain privacy and the many other actors in the online environment who want to gather personal data. Rather than relying on the state or some industry association to deal with possible privacy threats on a collective level, these technologies are designed to support action which confers protection on the interested individual alone. This is a more realistic approach, recognising that electronic communications have massively increased the scope of surveillance and that a solution should therefore remedy the situation on the same technological level: a technological remedy for a technological threat. For these reasons, and based on the ‘data avoidance’ principle, data controllers who are active in the field of electronic communications services should encourage their customers to use these technologies on a personal level. After all, minimising the personal data that traverse the network minimises the liability risk for the data controller in case of a breach in the security infrastructure.

 
