Introduction

According to the IEEE, interoperability is defined as:

'The ability of two or more systems or components to exchange information and use the information that has been exchanged.'

(Source: IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries, 1990.)

Interoperability defined

Interoperability is understood in different ways depending on the specification under consideration:


  1. Technical or Functional interoperability = the ability of two systems to exchange information (the message).
    If the specification is about a communication protocol (e.g. a transfer protocol, or an interface such as the eHealth Hub webservices) and about the behaviour of processors of this protocol, then interoperability is understood as the ability of two implementations of this specification (i.e. processors of this protocol) to communicate properly. In the case of an interface, it is the ability of a user entity to communicate with an implementation (or processor) of the interface.
  2. Semantic or Non-Functional interoperability = the ability of those systems to understand and use the information (the meaning) held in the message.
    If the specification is of an artefact (e.g. a business document, a process definition, a policy contract, a Vitalink medication scheme), then interoperability is understood as the ability to process this artefact with consistent results, using different platforms or processors. In such a case, interoperability is often described as portability from the artefact perspective (the artefact is portable across platforms), while the platforms or processors are qualified as interoperable.

Clear understanding of the specification

When you are developing an implementation, make sure that you have a clear understanding of the specification of the protocol and the artefact. The specification should be precise enough to ensure that your implementation is interoperable with others, or at least very close to it. If you have questions, do not knowingly make assumptions but contact the writer of the specification. Interoperability is primarily a matter of a correct and consensual interpretation of this specification.


Bearing in mind interoperability in general and interoperability in compliance with Vitalink in particular, you will find useful guidelines below:



Vitalink Cookbooks

Helpful documentation exists specifically for implementing the Vitalink services in your applications. For more detailed information, we refer to the Vitalink documentation available on a dedicated Confluence site.


Responsibility


Be aware that it is not always clear which aspects of interoperability fall under the specification writer's responsibility and which fall under the implementation developer's responsibility. Interoperability problems too often arise when each party is over-reliant on the other party to ensure interoperability.

Guidelines

Defined behavior after errors

Define the behavior of your implementation in case of errors. 

The following types of errors should be taken into consideration (a sketch of such a defined error policy follows the list):

  • the errors from the "Errors" section of the specification, mainly concerning the protocol
  • errors after reception of faulty artefacts from another implementation
  • errors when processing correct artefacts by your implementation
  • errors during generation of faulty artefacts by your implementation
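
The sketch below shows one possible way to make this explicit: a single place that maps the four error categories above onto a defined, documented behaviour instead of ad-hoc handling. It is only an illustration in Java; the class name, category names and log messages are hypothetical and not taken from any Vitalink or eHealth specification.

import java.util.logging.Level;
import java.util.logging.Logger;

/**
 * Minimal sketch (all names hypothetical): one place that defines how the
 * implementation behaves for each of the error categories listed above.
 */
public class ErrorPolicy {

    /** Error categories corresponding to the bullet list above. */
    public enum Category {
        PROTOCOL_ERROR,           // errors from the "Errors" section of the specification
        FAULTY_ARTEFACT_RECEIVED, // faulty artefact received from another implementation
        PROCESSING_ERROR,         // error while processing a correct artefact
        FAULTY_ARTEFACT_PRODUCED  // faulty artefact generated by this implementation
    }

    private static final Logger LOG = Logger.getLogger(ErrorPolicy.class.getName());

    /** Defined, documented behaviour per category. */
    public void handle(Category category, Exception cause) {
        switch (category) {
            case PROTOCOL_ERROR:
                LOG.log(Level.WARNING, "Protocol error, aborting the exchange", cause);
                break;
            case FAULTY_ARTEFACT_RECEIVED:
                LOG.log(Level.WARNING, "Faulty artefact received, rejecting it and notifying the user", cause);
                break;
            case PROCESSING_ERROR:
                LOG.log(Level.SEVERE, "Internal processing error on a correct artefact", cause);
                break;
            case FAULTY_ARTEFACT_PRODUCED:
                LOG.log(Level.SEVERE, "Refusing to send an artefact that failed local validation", cause);
                break;
        }
    }
}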

Used version of the specification

Choose the version of the specification that will be used for the implementation, document that choice and make this information accessible to the end users. Not doing so will cause misunderstandings and interoperability issues over time. Providing the user with clear information about the version and revision numbers of both the implementation and the implemented specification helps to avoid this.

Be aware that it is very unlikely that everyone migrates their implementation to a new version of the specification at the same time.
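
As a minimal illustration of the guideline above, the sketch below exposes the implementation version together with the version of the specification it implements, so both can be shown to the end user. The class name and version values are hypothetical.

/**
 * Minimal sketch (all values hypothetical): expose the implementation version
 * together with the version of the specification it implements.
 */
public final class VersionInfo {

    public static final String IMPLEMENTATION_VERSION = "2.3.1"; // hypothetical
    public static final String SPECIFICATION_VERSION  = "3.4";   // hypothetical specification version

    /** Human-readable summary, e.g. for an "About" dialog or a log header. */
    public static String summary() {
        return "Implementation " + IMPLEMENTATION_VERSION
             + " (implements specification version " + SPECIFICATION_VERSION + ")";
    }

    public static void main(String[] args) {
        System.out.println(summary());
    }
}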

Compatibility

When migrating your implementation to a newer version of the specification, make sure that you have identified the non-backward-compatible features of the specification. This is needed to be able to correctly consume artefacts produced by implementations that still use the older version of the specification.
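
A minimal sketch of this idea, assuming the artefact carries (or can be given) an explicit specification version: incoming artefacts are routed to a parser that understands their version, so artefacts produced under the older version can still be consumed. All names and version strings are hypothetical.

/**
 * Minimal sketch (all names hypothetical): identify the specification version
 * of an incoming artefact and route it to a parser that understands it.
 */
public class ArtefactReader {

    /** Parsed result; the real artefact model is out of scope here. */
    public record Artefact(String specVersion, String content) { }

    public Artefact read(String specVersion, String rawContent) {
        switch (specVersion) {
            case "1.0":
                return parseLegacy(rawContent);  // older version: apply documented compatibility rules
            case "2.0":
                return parseCurrent(rawContent); // current version
            default:
                throw new IllegalArgumentException("Unsupported specification version: " + specVersion);
        }
    }

    private Artefact parseLegacy(String rawContent) {
        // e.g. map renamed or removed fields onto the current internal model
        return new Artefact("1.0", rawContent);
    }

    private Artefact parseCurrent(String rawContent) {
        return new Artefact("2.0", rawContent);
    }
}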

Processing of optional features

The optional character of a specified feature, when it concerns an artefact that may be produced and consumed, is a common source of confusion and interoperability failure. An artefact MAY implement a feature. This clearly means that a compliant device producing such an artefact MAY omit this feature. However, any compliant device consuming such an artefact MUST be able to process this feature, should it be present (unless specified otherwise).
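
The sketch below illustrates this MAY/MUST asymmetry with a hypothetical artefact: the producer may omit the optional feature, while the consumer handles both its presence and its absence without failing. The class, record and field names are illustrative only.

import java.util.Optional;

/**
 * Minimal sketch (all names hypothetical): a producer MAY omit an optional
 * feature, but a consumer MUST be able to process it when it is present.
 */
public class OptionalFeatureExample {

    /** Artefact with a mandatory part and an optional feature. */
    public record Artefact(String mandatoryPart, Optional<String> optionalFeature) { }

    /** Producer: omitting the optional feature is allowed. */
    public Artefact produceWithoutFeature(String mandatoryPart) {
        return new Artefact(mandatoryPart, Optional.empty());
    }

    /** Consumer: handles both the presence and the absence of the feature. */
    public void consume(Artefact artefact) {
        System.out.println("Processing: " + artefact.mandatoryPart());
        // Present -> process it; absent -> proceed without it; never fail merely because it is present.
        artefact.optionalFeature().ifPresent(f -> System.out.println("Optional feature: " + f));
    }
}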

Composition of two specifications

Specifications refer to each other and compose with each other. For example, a Vitalink layer will operate on top of the eHealth Kmehr artefact specification and the Hub webservices protocol specification, which are defined outside of Vitalink. On some aspects, the specification writer sometimes assumes a "black-box" or perfectly modular composition. This implies that some corner cases (error escalation, mismatched features) are not explicitly documented; the details are left to the implementers. The degree of variability introduced by such compositions is then underestimated. As a result, implementers may interpret differently how the composition is supposed to operate. This can be avoided by not making any assumptions in case of doubt and by contacting the writer of the specification.
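
One such corner case is error escalation across composed layers. The sketch below is a generic illustration, not the actual Vitalink or eHealth API: it shows an upper layer that documents exactly how a lower-layer failure escalates, instead of silently assuming a perfectly modular composition. All interfaces, classes and messages are hypothetical.

/**
 * Minimal sketch (all names hypothetical): make explicit how errors from a
 * lower layer escalate through the layer built on top of it.
 */
public class LayeredClient {

    /** Error of the upper layer, wrapping the lower-layer cause. */
    public static class UpperLayerException extends Exception {
        public UpperLayerException(String message, Throwable cause) { super(message, cause); }
    }

    /** Hypothetical lower-layer operation that may fail with its own errors. */
    interface LowerLayer {
        String call(String request) throws Exception;
    }

    private final LowerLayer lowerLayer;

    public LayeredClient(LowerLayer lowerLayer) {
        this.lowerLayer = lowerLayer;
    }

    /** The upper layer documents what happens when the lower layer fails. */
    public String exchange(String request) throws UpperLayerException {
        try {
            return lowerLayer.call(request);
        } catch (Exception lowerLayerError) {
            // Documented escalation: wrap, keep the original cause, do not swallow it.
            throw new UpperLayerException("Lower-layer call failed for request: " + request, lowerLayerError);
        }
    }
}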

Requirements

This paragraph should be moved to the requirements documentation.

Interop1 - Transparency in exporting of local data

Verify that it is visible for the user which locally edited data will be shared.
This includes explicitly visualising any removal of central data.

Interop2 - Transparency in importing of central data

Verify that it is visible for the user which displayed data comes from the shared central source.
This includes explicitly visualising any removal of local data.
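
One possible way to satisfy Interop1 and Interop2 is to tag every record with its origin so the user interface can always show which data is local and which comes from the shared central source. The sketch below is only an illustration; the record, enum and label texts are hypothetical.

/**
 * Minimal sketch (all names hypothetical): tag every displayed item with its
 * origin so the UI can show local versus central data (Interop1, Interop2).
 */
public class ProvenanceExample {

    public enum Origin { LOCAL, CENTRAL }

    /** A displayed item together with its provenance. */
    public record DisplayedItem(String description, Origin origin) { }

    /** Label used by the UI so the origin is always visible to the user. */
    public static String label(DisplayedItem item) {
        return switch (item.origin()) {
            case LOCAL   -> "[local] " + item.description();
            case CENTRAL -> "[central source] " + item.description();
        };
    }

    public static void main(String[] args) {
        System.out.println(label(new DisplayedItem("Medication X, 1 tablet daily", Origin.CENTRAL)));
    }
}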

Interop3 - Exhaustive data mapping

Verify that all the data from the central source is visible for the user.

Interop4 - Erroneous data mapping

Verify that the user is consulted when incoming data from the central source cannot be processed.

Interop5 - Input validation

Verify that the validation of user input is also applied to incoming data from the central source. If the validation fails, the data should be handled as erroneous data.
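
A minimal sketch of Interop5, assuming a simple string-valued field: the same validation rule is reused for user input and for incoming central data, and central data that fails it is routed to the erroneous-data handling. The field, rule and messages are hypothetical.

import java.util.function.Predicate;

/**
 * Minimal sketch (all names hypothetical): one validation rule applied to both
 * user input and central data (Interop5).
 */
public class SharedValidation {

    /** Single validation rule reused for both input paths. */
    private final Predicate<String> dosageIsValid = value -> value != null && !value.isBlank();

    public void acceptUserInput(String dosage) {
        if (!dosageIsValid.test(dosage)) {
            throw new IllegalArgumentException("Invalid user input: " + dosage);
        }
        // ... store locally
    }

    public void acceptCentralData(String dosage) {
        if (!dosageIsValid.test(dosage)) {
            handleErroneousData(dosage); // same rule, same outcome as for user input
            return;
        }
        // ... import into the local application
    }

    private void handleErroneousData(String dosage) {
        System.err.println("Central data rejected by validation: " + dosage);
    }
}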

Interop6 - Transformation design

Verify that your transformation of central data is well designed and documented. The design includes the restrictions that the central data must satisfy in order to serve as input for the transformation.
All data that is not suitable for the transformation should be handled as erroneous data.
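
As a minimal illustration of Interop6, the sketch below checks a documented restriction explicitly before the transformation runs, and rejects data outside it so that it can be handled as erroneous data. The restriction and the transformation are invented for the example.

/**
 * Minimal sketch (all names hypothetical): documented restrictions are checked
 * before the transformation of central data (Interop6).
 */
public class CentralDataTransformer {

    /** Documented restriction (hypothetical): frequency is 1..24 times per day. */
    private boolean satisfiesRestrictions(int timesPerDay) {
        return timesPerDay >= 1 && timesPerDay <= 24;
    }

    /** Transformation into the local model, only for data within the restrictions. */
    public String transform(int timesPerDay) {
        if (!satisfiesRestrictions(timesPerDay)) {
            throw new IllegalArgumentException("Central data outside documented restrictions: " + timesPerDay);
        }
        return "Take " + timesPerDay + " time(s) per day";
    }
}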

Interop7 - Server processing

Verify that the processing of erroneous data on a server (where there is no immediate user interaction) is logged, monitored and eventually presented to the user.
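
The sketch below illustrates Interop7 under simple assumptions: on a server there is no user to consult, so erroneous central data is logged with enough context to be monitored and flagged for later presentation to the user. The parsing step and the review queue are stand-ins; all names are hypothetical.

import java.util.logging.Level;
import java.util.logging.Logger;

/**
 * Minimal sketch (all names hypothetical): server-side handling of erroneous
 * central data without immediate user interaction (Interop7).
 */
public class ServerSideErrorHandling {

    private static final Logger LOG = Logger.getLogger(ServerSideErrorHandling.class.getName());

    public void processCentralData(String recordId, String payload) {
        try {
            int timesPerDay = Integer.parseInt(payload); // stand-in for real parsing of the payload
            importValue(recordId, timesPerDay);
        } catch (NumberFormatException e) {
            // Logged and kept for monitoring; the user is informed at a later interaction.
            LOG.log(Level.WARNING, "Could not process central record " + recordId, e);
            markForUserReview(recordId);
        }
    }

    private void importValue(String recordId, int timesPerDay) {
        // ... store into the local application (out of scope here)
    }

    private void markForUserReview(String recordId) {
        // e.g. persist the record id in a "to review" list shown in the client application
    }
}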

Interop8 - Logging

Verify that enough information about manipulations of central and local data is logged to enable later analysis of possible problems.
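
A minimal sketch of Interop8, assuming a simple structured log line: every manipulation is recorded with who, when, which source and which record, so problems can be analysed afterwards. The field names and example values are hypothetical.

import java.time.Instant;
import java.util.logging.Logger;

/**
 * Minimal sketch (all names hypothetical): log every manipulation of central
 * or local data with enough context for later analysis (Interop8).
 */
public class AuditLog {

    private static final Logger LOG = Logger.getLogger(AuditLog.class.getName());

    public enum Source { LOCAL, CENTRAL }

    public void logManipulation(String userId, Source source, String action, String recordId) {
        LOG.info(() -> String.format("%s | user=%s | source=%s | action=%s | record=%s",
                Instant.now(), userId, source, action, recordId));
    }

    public static void main(String[] args) {
        new AuditLog().logManipulation("nurse-123", Source.CENTRAL, "UPDATE_MEDICATION_SCHEME", "rec-42");
    }
}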