All interfaces have the potential to present data integrity issues, due to their nature:

  1. They are often custom (category 5) software components, carrying elevated inherent risk in the absence of a well-planned specification, design, review, and verification process
  2. They are typically designed and monitored by IT personnel, who may have no direct interest in, or understanding of, the data, and who may manage many such interfaces without knowing which are mission critical and which are trivial
  3. They occasionally fail to execute properly
  4. They may map and/or transform data moving between systems
  5. They may unintentionally truncate or obliterate data, and without robust design and review the loss goes undetected
  6. Because they work between systems, ownership can be unclear
  7. Unless designed otherwise, they cannot distinguish source data from migrated data (the original data source is lost)
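Risk 7 can be mitigated by stamping records with provenance as they cross the interface. The sketch below is a minimal illustration, not a prescribed design; the field names (`_source_system`, `_transferred_at`) are hypothetical.

```python
from datetime import datetime, timezone

def tag_with_provenance(record: dict, source_system: str) -> dict:
    """Return a copy of the record annotated with its origin.

    Without such metadata, the receiving system cannot tell migrated
    data from data entered directly (risk 7 above).
    """
    tagged = dict(record)
    tagged["_source_system"] = source_system  # hypothetical field names
    tagged["_transferred_at"] = datetime.now(timezone.utc).isoformat()
    return tagged

rec = tag_with_provenance({"batch": "B-001", "result": 98.7}, "LIMS")
```

The receiving system can then filter, audit, or display records by origin instead of treating all data as native.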

Interfaces typically pass data between business teams, each using that data for a different function (e.g., marketing, execution, distribution, release, and testing). Fortunately, most interface failures are detectable: someone expects data that does not arrive, the incident is reported, an investigation ensues, and the problem is fixed; data flows and the task moves forward.  But some scenarios have a darker side, where old or out-of-date data could lead to an improper business decision. For instance, test results are released to the Laboratory Information Management System (LIMS) for batch disposition, but an error is discovered and a test result is “recalled” to the lab for an update and re-release. Does your LIMS have business rules to prevent batch release until that test result is updated?  If not, there is a potential for the batch to be released based on out-of-date information.  In another scenario, process automation data that would be useful for investigations fails to transfer to the historian for several weeks; because it is not directly reviewed as part of batch release, the failure goes undetected. You probably know of other scenarios where old or missing data has created business issues.
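The batch-release rule described above can be sketched as a simple gate: disposition is blocked while any associated test result is recalled or otherwise not in its final state. This is an illustrative sketch only; the status values (`RELEASED`, `RECALLED`) are assumed, not taken from any particular LIMS.

```python
def batch_ready_for_release(results: list) -> bool:
    """A batch may be dispositioned only when every associated test
    result is current: none recalled to the lab for update, none
    awaiting re-release. Status vocabulary is assumed for illustration.
    """
    return bool(results) and all(r["status"] == "RELEASED" for r in results)

results = [
    {"test": "assay", "status": "RELEASED"},
    {"test": "purity", "status": "RECALLED"},  # sent back for update
]
batch_ready_for_release(results)  # False: release must wait for re-release
```

A rule like this ensures the recalled result in the scenario above holds the batch until the corrected value arrives.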

Data transformation within interfaces requires developers who understand how both the inbound and outbound business units interpret the data.  There should be business rules to control when data is available to various users, and to manage any inherent risks.  For example, Manufacturing wants to see all test results entered into LIMS, unaware that test results are not trustworthy until they have been reviewed (status = COMPLETED). If a LIMS interface transfers all testing data, not just the COMPLETED data, product decisions could be made using test results that may change if errors are discovered later during review.
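The fix in this example is a status filter at the interface boundary: only reviewed results cross. A minimal sketch, using the COMPLETED status from the text (the record layout is assumed):

```python
def results_for_transfer(results: list) -> list:
    """Only reviewed results (status == 'COMPLETED') cross the interface;
    unreviewed values may still change during review and must not
    drive product decisions downstream.
    """
    return [r for r in results if r["status"] == "COMPLETED"]

raw = [
    {"test": "assay", "value": 99.1, "status": "COMPLETED"},
    {"test": "purity", "value": 0.2, "status": "ENTERED"},  # not yet reviewed
]
sent = results_for_transfer(raw)  # only the reviewed assay result is sent
```

Whether such a filter belongs in the interface or in the receiving system's display rules is a design decision the business units should make together.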

So what to do? Some suggestions:

  1. Start with risk management: evaluate data integrity risks based on the data the interface moves
  2. Make failure evident: send interface failure notices to someone (perhaps a businessperson) who will respond based on the criticality of the data involved
  3. Develop a business notification process so failures do not create improper decisions
  4. Include verification that the impacted interface(s) are working as a key part of fixing any failure condition; there might be no error message because the interface never executed
  5. Lastly, perform a periodic review of your interfaces, noting failures, response times, validation status, and criticality. This visibility creates an environment where robust interfaces are expected, and resources are devoted to improving them.
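Suggestion 4 deserves emphasis: a silent interface raises no error message, so the check must be driven by elapsed time since the last successful run, not by failure notices. A minimal sketch of such a check, with hypothetical interface names and tolerances:

```python
from datetime import datetime, timedelta, timezone

def check_interfaces(last_run: dict, max_age: dict, now=None) -> list:
    """Flag interfaces that have not executed within their expected window.

    last_run: interface name -> datetime of last successful run (or None)
    max_age:  interface name -> maximum tolerated gap between runs
    Returns the names of overdue interfaces, for escalation to the
    business owner of the affected data.
    """
    now = now or datetime.now(timezone.utc)
    overdue = []
    for name, limit in max_age.items():
        ran = last_run.get(name)
        if ran is None or now - ran > limit:
            overdue.append(name)  # never ran, or silent too long
    return overdue

# Hypothetical example: a historian feed that went quiet weeks ago.
now = datetime(2024, 1, 8, tzinfo=timezone.utc)
last_run = {"lims_to_mes": datetime(2024, 1, 7, tzinfo=timezone.utc),
            "pa_to_historian": datetime(2023, 12, 20, tzinfo=timezone.utc)}
max_age = {"lims_to_mes": timedelta(days=1),
           "pa_to_historian": timedelta(days=1)}
check_interfaces(last_run, max_age, now)  # ['pa_to_historian']
```

A check like this would have caught the historian scenario described earlier, where the gap went unnoticed for weeks because nothing downstream depended on the data day to day.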

The net result is correct data available for business personnel to make decisions. That is in everyone’s best interest.

by:  Mark E. Newton, Associate Sr. Consultant, Global Quality Laboratories for Eli Lilly and Company
