
D2.2: Set of use cases and scenarios


Tradeoffs

 

Privacy has a cost. Privacy-enhancing and anonymity techniques impose an overhead on the system in terms of communication, computing power, performance and complexity. The first technical tradeoffs must therefore be struck between security, robustness, degree of anonymity [12,13] and availability on the one hand, and functionality, performance, overhead and usability on the other. 
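To make the notion of "degree of anonymity" concrete, the following sketch computes the information-theoretic metric commonly used in the anonymity literature (assuming this is what [12,13] refer to): given the attacker's probability distribution over the N candidate users, the degree is the Shannon entropy of that distribution normalised by its maximum, ranging from 0 (the user is fully identified) to 1 (perfect anonymity within the set).

```python
# Illustrative sketch (not part of the deliverable): entropy-based degree of
# anonymity, assuming the information-theoretic metric of [12,13].
import math

def degree_of_anonymity(probs):
    """probs: the attacker's probability for each candidate user (sums to 1).

    Returns H(X) / log2(N), i.e. the attacker's uncertainty normalised by
    the maximum uncertainty achievable with N candidates.
    """
    n = len(probs)
    if n <= 1:
        return 0.0  # a single candidate can never be anonymous
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return entropy / math.log2(n)

# Uniform suspicion over 8 users: perfect anonymity within the set.
assert degree_of_anonymity([1/8] * 8) == 1.0
# The attacker is certain about one user: no anonymity at all.
assert degree_of_anonymity([1.0, 0, 0, 0]) == 0.0
```

The normalisation makes anonymity levels comparable across systems with different numbers of users, which is exactly what a designer needs when weighing anonymity against the performance and overhead costs discussed above.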

Regarding management and storage overhead, it is less clear which of the two options is more efficient. In the first option, data management is centralized and the amount of data gathered in databases is larger. In the second, storage is more distributed, as users take care of storing their own data, and information is managed mostly on the user side. The particular design choices made when implementing these systems determine the management and storage costs of each scenario.  

Accountability mechanisms may prevent fraud and abuse of the system. However, the task of law enforcement agencies may become harder when it comes to tracing communications that do not violate the system’s rules. For example, the online monitoring and tracing of (off-line) criminal suspects, i.e., those who commit crimes outside the Internet, may become harder with privacy-enhancing mechanisms in place. A tradeoff between legal and technical requirements must be found in order to provide law enforcement with tools to fight crime.  

From a conceptual point of view, the central concepts involved are trust, privacy, individual freedom, security and control. The relationships between these concepts are complex. For example, privacy and security are not necessarily contradictory, as a lack of privacy constitutes a security hole in a broad sense. Freedom and control, however, are directly opposed, and the balance struck between the two defines where the line is drawn. A difficult, borderline case would be that of someone expressing an opinion unacceptable to the public powers: what would happen, for instance, to someone who posts pictures of torture applied by an army? Would that person be identified? On the one hand, public institutions having the ability to identify inconvenient citizens is dangerous; on the other hand, such tools are needed in order to prevent crime.  

The key concept is trust. Users must trust that the entities capable of identifying them will only do so in accordance with legal guarantees. Trust in any single entity can be strengthened by distributing it, so that several entities need to collaborate in order to identify a subject (e.g., the agreement of three judges is required). Verifiability of the identification processes also enhances trust. It is worth noting that trust is a highly subjective concept, which depends not only on objective technical or legal factors, but also on cultural, psychological and social factors. 
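The "three judges must agree" idea can be realised cryptographically with threshold secret sharing. The sketch below (an illustration, not a mechanism specified in this deliverable) uses Shamir secret sharing: a key needed to de-anonymise a user is split among several authorities, and only a quorum of them, acting together, can reconstruct it.

```python
# Illustrative sketch: distributing identification power among several
# authorities via Shamir secret sharing. Any `threshold` of the shares
# reconstruct the escrowed key; fewer reveal nothing useful about it.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is done mod PRIME

def split_secret(secret, n_shares, threshold):
    """Split `secret` into n_shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Example: an identity-escrow key shared among 5 authorities; any 3 suffice
# (e.g., the agreement of three judges).
key = 123456789
shares = split_secret(key, n_shares=5, threshold=3)
assert reconstruct(shares[:3]) == key   # three collaborating authorities succeed
assert reconstruct(shares[2:5]) == key  # any three work
assert reconstruct(shares[:2]) != key   # two alone cannot recover the key
```

Because no single authority holds the key, users need not trust any one of them individually, only that a quorum will not collude outside the legal process, which is precisely the trust-distribution argument made above.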

 
