 Social aspects of interoperability in identity management


Understanding interoperability

Andrew Wallwork and John Baptista, LSE 


“The shift from the total integrated approach to interoperability development is not only a technical change, but reflects organisational, economical and social trends/requirements of the society. To successfully tackle this very complex and highly detailed endeavour, it is necessary to develop research involving knowledge and competencies of all domains concerned.” 

(Chen, 2003)


Establishing interoperable systems is a complex operation that goes far beyond the technical interconnection of databases and systems. Interoperability emerges from the need to communicate data across different domains for a specific purpose. Transferring the data may represent a technical challenge because of different protocols, standards, and so forth. However, the key challenge lies in the purpose, use and changes consequent on transferring that data. Changes in data ownership and custodianship have an effect on power structures, roles and responsibilities, and on risk. These issues go well beyond the technical dimension into the formal and social spheres. We discuss these different dimensions in this section. We will also strive to develop a holistic conceptual understanding of this phenomenon which can support the work of the FIDIS consortium. 



Interoperability – to define, or not to define? 


According to Harvey et al (1999), it is broadly accepted that ‘interoperability’ has emerged as a new paradigm, which facilitates a more efficient use of information resources through the linkage of heterogeneous ICTs into synergistic units (1999: 213). The term has been associated with “electronic networks [and] the electronic delivery of government information and services” (Moen, 1994: 368).


However, interoperability still lacks a dictionary definition. Many authors (e.g. Harvey et al, 1999; Ouksel & Sheth, 1999; Choi & Whinston, 2000; Brodeur et al, 2003; Kinder, 2003) simply avoid offering a definition at all, and among the papers that do attempt to give a meaning, there is a surprisingly varied selection to choose from. In this chapter, we aim to develop a common understanding of the term in order to establish solid conceptual ground from which to build future work within FIDIS.


For Miller et al (2001), (information) interoperability is “the ability of processes and systems to effectively exchange and use information services” (2001: 259), although their study seeks to address the shortcomings of this definition. A similar but richer definition sees it as “the ability of different types of computers, networks, operating systems, and applications, to exchange information in a useful and meaningful manner” (2000: 129).


Woodall offers a technical-level definition of interoperability as  


“The condition achieved among communications-electronics systems or items of communications-electronics equipment when information or services can be exchanged directly and satisfactorily between them and/or their users” (Woodall, 2000: 310).


Woodall is motivated by the undeniable, exponential increase in system complexities and components, and their related coding and data processing requirements (Ibid).  Thus, he puts interoperability into a technical context, which can be approached and hopefully resolved through technical and technological means.


In stark contrast, Landsbergen & Wolken (2001) argue that interoperability is “more than getting bits and bytes to flow properly” (2001: 206); for them, it is about “people talking and sharing information” (Ibid).


In fact, Miller et al admit that interoperability can fail even if the associated processes are properly exchanging logical units of data. They distinguish between the ability to exchange a service (compatibility) and the ability to use the service (interoperability) – “compatibility is a requirement for interoperability but not a sufficiency” (2001: 267). Others highlight interconnectivity as a term related to interoperability, yet similarly guard against complete assimilation, proposing that “achieving interconnectivity is a necessary preliminary step towards interoperability” (1999: 94) but that it cannot complete the ‘big picture’.


So, to define or not to define? Perhaps the notion of interoperability can serve as an umbrella, beneath which many disparate but complementary definitions can exist, according to perspective or layer of abstraction.


This section has tried to address the problem of first coming to a simple definition, or at least of negotiating one that is mutually acceptable. It points to semantic discordances and difficulties that can be linked to a body of work concentrating on semantic interoperability – a concept to which we will return later. As Landsbergen & Wolken insist, we “must make this conceptual leap before any real progress in improving interoperability can take place” (2001: 212; emphasis added).


Technical to Social and back again 

“Technological systems are socially produced.”


Technology alone may appear compatible, and standards and policy may enable interoperability, yet there is some dynamic missing in this ‘bigger picture’ – people.  Landsbergen & Wolken (2001) hint at social interoperability in their definition and research, and requested additional “support mechanisms to understand the range of economic, political, technical and organisational issues involved with information sharing” (2001: 213).  


Historically, we can find these elements in advice offered by Kraemer & King (1986), relating to fundamental, innate problems of IT management within the environment of public administration.  Crucially, we need to consider these elements in context and in practice:


“Computing fits within existing organisational life and exerts subtle influences. This does not mean, however, that computing is an activity that is easily managed.” (Kraemer and King, 1986: 494).


Choi & Whinston (2000) are supportive of this ‘bigger picture’ in their research, firstly by stressing that “Technological standards at the infrastructure level are relatively easier to reach than those at the applications and business process levels” (2000: 38).  Of course, they do not suggest the technical-formal elements are trivial or easy; they are merely easier than those at the applications and business process levels.  Moreover, they go on to describe cultural and practical differences as contributing to some of the many pitfalls in establishing standards in the application layer, and ultimately in ensuring interoperability (2000: 40).


The failure of interoperability projects has not been confined to the technical realm; it also stems from political – informal – friction among public agencies (Ibid).  Undeniably, as Homburg & Bekkers note, “e-Government initiatives can be characterised as political” (2002: 8).


In the following section, we propose a framework comprising Technical, Formal (policy and standards), and Informal (TFI) notions to engender a holistic understanding of interoperability functions and as a useful tool for analysing EU interoperability projects, providing a direction for future research and practice.


Using the TFI model to understand interoperability 

The following framework helps to explain the interrelationship between abstracted layers of interoperability, leading ultimately to the conclusion that the technical requires the formal, and the formal requires the informal.  Stamper et al succinctly illustrate this interrelation of abstracted layers, explaining that


“Informal norms are fundamental, because formal norms can only operate by virtue of the informal norms needed to interpret them, while technical norms can play no role…unless embedded within a system of formal norm.” (2000: 19).


Metaphorically, this can be viewed as a ‘Russian doll’ effect, where the informal is the outer shell containing the formal, which in turn contains the technical.  Inside out, the technical cannot be removed from the toy without consideration for (unwrapping) the outer layers:


Figure : The embedding of computer systems in the formal and informal organisation (Stamper et al, 2000:19)



Relating abstracted norms to the dynamics of interoperability assumes the capacity to infer an affiliation between the two. Following Stamper et al (2000: 22), signs – i.e. semiotics, “the study of signs, signals and symbols, esp. in language and communication” (Chambers Dictionary, 1999) – provide the link in the chain connecting norms and interoperability.


Much interoperability literature explores ‘semantic interoperability’, whereby semantics is defined as “the area of linguistics dealing with symbols (comput) – (loosely) differences in, and shades of, meaning esp. of words” (Chambers Dictionary, 1999). From this we can infer an approach that treats “meaning as a relationship between signs and human behaviour” (Stamper et al, 2000: 23).


Chen & Doumeingts (2003) postulate that semantics runs through all layers of an organisation, and so the adapted framework below suggests that a similarity exists between the three ‘layers of interoperability’ and the TFI framework. Chen & Doumeingts’ model incorporates semantics across the three layers in a similar way to the TFI.


Figure : Adapted framework showing ‘interoperability on all layers of an enterprise’ (Chen & Doumeingts, 2003: 154)


Consistent with the general interoperability research discussed above, much of the literature on semantic interoperability focuses on the technical domain (Harvey et al, 1999: 228).  One such protagonist – Sheth (1996) [71] – approached semantic differences with an engineering orientation, working on the concept of semantic proximity and demanding “declarative language to articulate definitions of objects, and very strong ontological definitions” (Ibid).  Yet by 1997, working with Ouksel, an approach supporting a more general notion of semantics had transpired, one which relates the “content and representation of information resources to entities and concepts in the real world” (Beech 1997 [7]; Meersman 1997 [54]; Sheth 1997 [72]).  That is, “the limited forms of operational and axiomatic semantics of a particular representational or language framework are not sufficient” (Ibid).  


For Bunge (1974), semantics is “concerned not only with linguistic items, but also, and primarily, with the constructs such items stand for and their eventual relation to the real world” (in Lee & Siegel, 1996: 151).  Accordingly, this gives credence to the proposition of a TFI framework incorporating the addition of cross-sectional semantics (see Figure 6).  Thus, uniting the findings of Stamper et al (2000) and Chen & Doumeingts (2003) with current research at the London School of Economics applying a TFI model to interoperability, a potentially useful ‘tool for thinking’ materialises (see below):



Figure : Adapted TFI framework to include the dynamics of semantics


Metaphorically, this diagram can offer further value by incorporating the potential for ‘seepage’ between the different domains of the TFI.  This embodies the impossibility of navigating differences in meaning to ensure absolute conformity – if this is possible at all – between disparate and dispersed social groups.  A complex interoperability project may resemble more of a melting pot than an assemblage of distinguishable layers of abstracted meaning. Furthermore, individuals construct different parameters according to their internal biases, norms and assumptions, and continually translate and interpret associated meanings – solidifying the argument for the presence of semantics at each level of the TFI.


Evidently, the scope of the interoperability literature is vast, with its range of contrasting theories, frameworks and conceptualisations. In the next section, we present a debate based on the dichotomous approaches of ‘bottom-up’ versus ‘top-down’ interoperability.  


‘Bottom-Up’ versus ‘Top-Down’ in interoperability development 

A ‘top-down’ approach argues that at the peak of a hierarchy, a powerful administrative body “prescribes the interoperability methods and resources to be applied by all actors on lower levels” (Klischewski, 2003: 22).  In contrast, a ‘bottom-up’ approach suggests that at the base of the hierarchy, administrative bodies propose and “share interoperability methods and resources from their point of view; centralised prescription is only accepted where there is consensus on all lower levels” (Ibid).  


There is no agreement on how interoperability should emerge. Stamper et al hint at the failure of a top-down approach, pointing out that often “technical experts are called in to interpret [the] formal layer with extra expense and a fair chance of misinterpretation” (2000: 19),  i.e. the Russian doll effect; ‘technical requires formal, formal requires informal’. Experience confirms


“that standardisation and the adoption of standards are very difficult processes, even…where the subject matter [appears] clear-cut and of limited complexity…Social issues make things complicated” (Klischewski, 2003: 23).  


In which case does Kinder’s (2003) proposal convey more verisimilitude?


“Big system, top-down approaches to interoperability are much more likely to fail than planned incremental bottom-up change” (2003: 154). 


Especially in relation to web technology, most standardisation efforts assume an open process. This enables community members to contribute their specifications and requirements, within the network, to an accepted interoperability solution.  For example, “local administrations publish their service interface using bilateral or mutual service to reconcile the technical, syntactic and semantic differences as much as possible” (Klischewski, 2003: 23).  In essence, this represents a ‘bottom-up’ approach to interoperability (Ibid).


However, despite being useful conceptualisations encouraging two very opposing perspectives on interoperability, when taken alone, neither methodology can succeed, particularly when considering the e-Government community of Europe.  For sure, ‘top-down’ is a useful conception, as we must have some sort of foundation (Stamper et al, 2000: 20), but often top-down standardisation efforts have been “countered by mistrust and ignorance from the local level” (Klischewski, 2003: 23), whereas “grass-root [bottom-up] initiatives mostly failed to reach the impact they had hoped for” (Ibid).  Consequently, to accomplish interoperability Klischewski proposes to combine,


“the more technical ‘top down’ approach to ensure that agents of other system components make sense of the resources encountered, as well as the social network orientated ‘bottom up’ approach to ensure that semantics processed in the systems effectively relate to the world view of the local actors in charge” (Klischewski, 2003: 24). 


From this last quote from Klischewski (2003), one can begin to see overlapping and quite striking similarities between the TFI model and semantics. The formal element of the TFI is necessarily implied in the framework’s consideration of standards and standardisation.



Cases of Interoperability in identity systems in Europe 

Threlfall (2003) describes how “the transferability of state pension rights was enlarged…in 1998 and became ‘portable’ through freedom of cross-border payments” (2003: 130). Interestingly, until the 1992 Treaty on European Union, free-moving pensioners were not at liberty to burden their host country’s health system.  However, such restrictive health entitlements made the “maintenance of such compartmentalised health-care non-viable” (ibid) if not impossible in critical cases. By 1997, all community free movers were granted medical benefits, thus “freedom of choice of residence for pensioners has therefore been widely enhanced, subject to the constraints of an individual’s means” (ibid).  This brief case is unavoidably associated with the messy, convoluted matter of interoperability and identity – and is moreover devoid of any reference to technical concerns. Hence, the e-Pensions domain will face political, organisational and social challenges, as well as having to build the foundations of an interconnected, interoperable technical platform.  Not only does this support Klischewski’s (2003) call for simultaneous middle-up-down considerations, but the case also appears amenable to interpretation using the TFI framework.


A similar discussion by Threlfall (2003) within the health care domain offers supplementary evidence to consider interoperability in Europe also as an Identity issue, as well as one which incorporates the abstraction of interoperability across the full spectrum of the TFI framework.  The European Commission aims at improving the EU’s healthcare system without direct interference in each country’s delivery of health services (2003: 130-131).  Nonetheless, in 1998,


“Twin phenomena of ‘patient mobility’ (Wavell, 1998 [77]) and a ‘Europe of Patients’ (European Commission, 1999 [25]) had been created de jure, so that from the point of view of the patient’s healthcare, they were living in the EU as in one country” (Ibid).


Again, for the domain of e-Health, we are confronted with a plethora of interrelated technical, formal and informal elements.  For example, a European Health Card will replace form E111 by 2005, entailing much work on technical interoperability and the creation and revision of formal standards. Lastly, to exemplify an informal (behavioural) concern, “implications [may ensue] arising from patients circumventing waiting lists by going to another member state” (Ibid). e-Health also relates to identity, requiring consideration of all levels of the TFI to enhance the likelihood of success.  


Overcoming purely technical hurdles will do little to reassure communities of the merits of a potential information system which may threaten privacy and trust and undermine cultural beliefs, i.e. a feeling of “but that’s not the way we do it round here”.  For Wimmer (2002), identity considerations are crucial because “citizens feel vulnerable when using e-Government systems…they want to have security solutions, which provide subjective trust” (2002: 1). Here, the issue of privacy surfaces, as identity data exchange is a very sensitive subject (Homburg & Bekkers, 2002: 4-8). Further, privacy concerns become politically charged in practice, as information exchange and standardisation across boundaries may “reflect, legitimate and re-produce the discourses of powerful groups, validate their ways of steering and thinking, and give tangible force for their influence on organizational life” (Bellamy, 1998 [4]).  


The two examples of pensions and health have brought further evidence to suggest interoperability goes well beyond the technical, and that within the EU, identity is a term that also needs to be given value and meaning.  The following section presents an overview and summary of the current EU interoperability context, discussing the challenges and proposing directions for future research and practice.



The Great Interoperability Challenge – A Discussion 

The eEurope Action Plan 2005 called on the European Commission “to issue an agreed interoperability framework to support the delivery of pan-European eGovernment services to citizens and enterprises” (IDABC, 2005).  More than just e-Pensions and e-Health, this plan of action encompasses an abundance of services including harmonising tax, social security systems, educational systems, jurisdiction for divorce and family law, driving risks, and benefit and welfare regimes across Europe – all of which currently remain in their infancy (Kinder, 2003; Threlfall, 2003).  In addition, the establishment of a common Visa Information System is slowly becoming reality, although “there is currently no interoperability between existing national visa systems in Europe or the possibility to check reliably whether an applicant for a visa has applied under another identity” (BTT, 2003: 1 [10]).  Nevertheless, this too is a highly charged, emotive interoperability-of-identity project – a branch located on the biometric technology tree.


Many authors (Moen, 1994; Prokopiadou, 2000; Homburg & Bekkers, 2002) view the complexities of developing an integrated social dimension for e-Government applications (in practice) as the broadest, most difficult challenge.  Owing to the multilevel, hierarchical nature of local, national and international public administrations, government procedures for the production and dissemination of information are considered overcomplicated, rigid, fragmented and dispersed (Moen, 1994; Prokopiadou, 2000; Homburg & Bekkers, 2002; Virginadis).  Szulanski calls this ‘internal stickiness’: a resistance by local public administrations to adopting new ideas from outside (Szulanski, 1996, in Kinder, 2003: 143).  In addition, Choi & Whinston warn that the time needed to reach consensus among public administrations may prove too lengthy to support rapidly changing technologies and practices (2000: 40).


Within the broad complexities of public authorities lie three concepts: technical challenges relating to data homogeneity and system interoperability for proper and efficient metadata exchange (Prokopiadou, 2004: 189); formal concerns lying within the policy realm of the creation, communication and diffusion of commonly accepted standards (Moen, 1994: 358); and informal elements encircling these two with politics, culture and behaviour (Choi & Whinston, 2000: 41).  Isolated success within each of the three domains is not a guarantee of complete interoperability. For example, referring back to e-Health and privacy, beyond infrastructure, systems and standards for practice, citizens require a genuine feeling of trust and control before a government can overcome the social, political, cultural and legal barriers to interoperability (Homburg & Bekkers, 2002; Landsbergen, 2001).


Challenges to interoperability have been identified from an analysis of the holistic notion of interoperability and identity, using the TFI model.  Further, semantics are integral to every level of abstraction and to the individual and contextual characteristics of citizens and communities – whether relating to the creation and exchange of metadata and communication protocols; to establishing commonly agreed standards and policy across different national, legal and language borders; or to the flexible and dynamic meanings of interoperability and identity, and the associated understandings of their technical and formal structures.


Holistic understanding of interoperability 


“Within informal cultures, openness and trust are necessary to assimilate cross-boundary norms to guarantee adequate communication and control” (Stamper et al, 2000: 22; emphasis added).


Only through sound understanding of interoperability as a holistic notion applying at varying levels of abstraction, can we hope to achieve a seamless transition to successful interoperability in practice. As a forewarning, Mulley et al (1999) construct a prophetic but disturbing conundrum,


“Enhanced interoperability… may be a catalyst for closer links between nation states, integrating and consolidating the EU and achieving a more equitable distribution of wealth.” (1999: 97)


Hopefully, steps being made towards multidisciplinary interoperability research will help avoid the problems outlined by Mulley et al and instead lead to a substantial reorganisation of the research activities and cooperation in Europe (Chen & Doumeingts, 2003: 162).


Ultimately, however, Kinder (2003) offers the following comment,


“the usefulness to users of interoperable public service systems increases in proportion to the extent to which users cannot detect where one organisation’s system begins and another ends” (2003: 156).


Thus, the litmus test for successful pan-European interoperability endeavours rests in addressing all levels – technical, formal and social – in a seamless and integrated manner. 


In the next section we focus on the social/informal dimension of interoperability in IMS. 

