D7.4: Implications of profiling practices on democracy


Current profiling in the public sector

Looking into the current work of privacy commissioners, we find certain areas where data protection legislation can be applied efficiently. This is especially true when profiling is performed by public bodies.

Examples of the use of profiling techniques in the public sector include official statistics (including the population census), academic research, and the health sector.

In some areas the application of the instruments of data protection legislation described above is limited by law. One important example is law enforcement by the police, e.g., the collection and processing of data in Germany after 9/11 to identify potential terrorists at German universities by means of a dragnet investigation (‘Rasterfahndung’). In this case the approach was not successful, and in some federal states (Bundesländer) the collected data have meanwhile been completely deleted. In addition, secret services certainly use profiling techniques as well.

A trend we observe is that the basic principles defined earlier in this deliverable are in some cases weakened by legislation through the definition of exceptions, e.g.:

  • The collection limitation principle (No. 1) is restricted by numerous exceptions under which the collection of data is allowed or is planned to be allowed, for example the currently discussed retention of internet and telecommunication data. 

  • The purpose definition principle is limited by the definition of very general purposes which allow for an extended use of the collected data (such as the extended access of public authorities to personal data of bank customers in Germany). 

  • The use limitation principle is constrained by the centralisation of hitherto decentralised procedures and the corresponding data in large databases (such as those planned for the German JobCard procedure). 

This trend has to be observed carefully, as it may shift the balance of separated powers in a democracy by strengthening the executive. 

 

Current profiling in the private sector

In the private sector, the application of data protection legislation to profiling practice is, at least in Germany, somewhat limited. Trade secrets are often used as an argument for not informing clients or the data protection commissioners about details of the methods used. It is frequently stated, in a way that is easy to make and hard to verify, that no personalised data is being collected and processed. While this statement will be true and verifiable in many cases, an independent judgement on the profiling practice from the data protection perspective is often not possible in these cases. One example is the use of scoring systems for risk assessment in the German banking sector (see also FIDIS deliverable 7.2): in the beginning it was simply stated that no personal data was processed and stored. This was obviously not correct; the scoring value resulting from the group profiling was assigned to a specific person and was thus personal data. In addition, even the fact that individuals asserted their privacy right of access to their personal data would have had a negative effect on the calculated scoring values. So we observe a lack of transparency here. 
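To illustrate why such scoring values constitute personal data, the following schematic sketch may help (the attributes, weights and calculation are invented for illustration and do not represent any bank's actual scoring method): the score is computed exclusively from group-level characteristics, yet it becomes an item of personal data the moment it is stored in the record of an identified client.

# Schematic illustration only: hypothetical group attributes and weights.
# A score derived purely from group profiles still becomes personal data
# once it is attached to a named client.

GROUP_WEIGHTS = {              # group-level statistics, no individual data involved
    "postcode_10115": -12,
    "age_band_18_25": -8,
    "occupation_student": -5,
}

def group_score(attributes):
    """Compute a risk score from group memberships only."""
    return 100 + sum(GROUP_WEIGHTS.get(a, 0) for a in attributes)

# The result is assigned to an identified person and thereby becomes
# personal data about that person, even though the inputs were group profiles.
client_record = {
    "name": "Example Client",
    "score": group_score(["postcode_10115", "age_band_18_25"]),
}
print(client_record)           # {'name': 'Example Client', 'score': 80}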

Another limiting factor is the resources available to monitor and enforce data protection legislation in the private sector. This, however, is a general question of law enforcement rather than one specific to data protection legislation. 

In general we still observe imbalances between organisations (in these cases, enterprises) and their members or clients resulting from the use of profiling technologies. Similar imbalances have accompanied numerous technologies in the past, for example industrialisation in general. Pressure from strong organisations on an informed society will probably lead to countermeasures such as changed legislation, its appropriate enforcement, the organisation of members and clients (e.g. in trade unions and consumer organisations), and the application of comparable methods by these groups. Society will balance these effects, resulting in a compromise based on data protection legislation; in the long run, democracy and the rule of law will not suffer any severe damage.  

A final question in this context is: how can we promote this balancing process? One answer is transparency: bringing independent information in the right form to the right recipients and enabling a common discussion that leads to an informed consensus.  

 

Future scenario of AmI

For Ambient Intelligence (AmI), a number of different scenarios are being discussed. Some of them, for example a scenario in which the individual controls the ambient intelligence environment in their own apartment, can easily be brought into line with current data protection legislation. Many AmI scenarios, however, will raise new questions depending on their details, especially if the environment is run by different AmI providers, uses passive authentication, and collects and processes personal data all the time. Such a system could generate a huge mass of personal data (e.g. personal profiles) that would be transferred to the different providers and stored and processed there. For such a scenario, some conditions have to be met: 

  • The business case: will it be profitable to collect and process all these personal data? 

  • Will effective control of the implementation of data protection legislation really remain an illusion? 

If those conditions are met and society really wants such scenarios to be implemented, we can identify several topics where we will face problems in applying current data protection legislation in the way we apply it today: 

  • Passive authentication against the AmI provider: who authenticates to whom for what? 

  • How can control of, or consent to, data transfers and processing be handled? Here, more than one solution is conceivable for balancing transparency and opacity, for example:  

    • An easier, let us call it ‘implicit’, consent based on a new understanding of consent going far beyond today’s data protection legislation (note: this understanding would contradict the current understanding of consent);

    • Powerful tools, e.g. personal identity managers that negotiate privacy policies which are in addition (legally?) enforceable (see the sketch after this list); 

    • Combined solutions including changes in data protection legislation and technical approaches such as personal identity management. 
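To make the second option slightly more concrete, the following sketch (using invented policy fields and preferences, not any existing standard or product) shows the kind of automated check a personal identity manager could perform before personal data are released to an AmI provider; whether the negotiated result would also be legally enforceable remains, as noted above, an open question.

# Minimal sketch with hypothetical policy fields: a personal identity manager
# compares a provider's declared data-handling policy with the user's own
# preferences and only agrees to disclosure if the policy is no broader.

from dataclasses import dataclass

@dataclass
class Policy:
    purposes: set          # declared purposes, e.g. {"service", "marketing"}
    retention_days: int    # how long the provider keeps the data
    third_party_sharing: bool

def acceptable(user_pref: Policy, provider_policy: Policy) -> bool:
    """Release data only if the provider's policy stays within the user's preferences."""
    return (provider_policy.purposes <= user_pref.purposes
            and provider_policy.retention_days <= user_pref.retention_days
            and (user_pref.third_party_sharing or not provider_policy.third_party_sharing))

user = Policy(purposes={"service"}, retention_days=30, third_party_sharing=False)
provider = Policy(purposes={"service", "marketing"}, retention_days=365, third_party_sharing=True)

print(acceptable(user, provider))  # False -> the identity manager would refuse or renegotiate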

 

 
