Is Your Healthcare Data Quality Facilitating Optimal Patient Care? Part 1

Healthcare IT leaders must stay on top of technology trends to ensure their organizations offer the best health services.  Healthcare delivery continues to evolve at a rapid pace, with shifts from EHR implementation to population health to risk-based contracts.  Amid this fast-moving change, Healthcare organizations often gloss over the instrumental step of evaluating the quality of the data that serves as the foundation of their strategic initiatives.  With the adoption of population health-focused tools and methodologies, an integrated analytics platform and a secure, high-quality data asset are critical for success under these payment models.

The majority of Healthcare analytics platforms rely heavily on claims data.  Claims data, while highly structured, lack the rich context afforded by clinical data.  In addition, the small number of analytics programs that do leverage clinical data usually depend on vendor-supplied integration messages such as Continuity of Care Documents (CCDs).  While CCDs are attractive because they offer a compact and convenient way to integrate clinical data, they also impose limitations on both design and implementation, and those limitations make them inadequate for population health and performance analytics.
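
To make that trade-off concrete, the sketch below pulls coded observations out of a CCD using only Python’s standard library.  It is a minimal illustration, not a production parser: the file name and helper function are hypothetical, and it assumes each vendor populates codes, display names, and units consistently, which in practice is exactly the consistency that is often missing.

    # Minimal sketch: extract coded observations from a CCD (HL7 CDA XML).
    # Assumptions: a local file "ccd_sample.xml" and the standard CDA namespace;
    # section and entry layouts vary by vendor, which is the limitation noted above.
    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}

    def extract_coded_observations(ccd_path):
        """Walk every section of a CCD and collect its coded observation values."""
        observations = []
        root = ET.parse(ccd_path).getroot()
        for section in root.iter("{urn:hl7-org:v3}section"):
            title = section.findtext("hl7:title", default="(untitled)", namespaces=NS)
            for obs in section.iter("{urn:hl7-org:v3}observation"):
                code = obs.find("hl7:code", NS)
                value = obs.find("hl7:value", NS)
                observations.append({
                    "section": title,
                    "code": code.get("code") if code is not None else None,
                    "display": code.get("displayName") if code is not None else None,
                    "value": value.get("value") if value is not None else None,
                    "unit": value.get("unit") if value is not None else None,
                })
        return observations

    if __name__ == "__main__":
        for row in extract_coded_observations("ccd_sample.xml"):
            print(row)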

When evaluating data quality, one must look downstream to see which processes positively or negatively affect it.  Downstream processes such as data transport, aggregation, EHR configuration, normalization, and reporting can all degrade data quality, whether through omission or commission.  How can an organization recognize a negative effect on data quality?  How does it properly define a data quality gap?  These are complicated questions because of the wide range of data quality issues a Healthcare organization might encounter.  Issues that might be encountered in the EHR include:

  • Quick, broad, generic diagnosis codes entered out of habit, rather than the actionable, more specific codes appropriate to the patient’s needs and treatment.
  • Variable entry of standard codes, such as National Drug Code (NDC) identifiers, derailing bulk analysis (see the sketch after this list).
  • Incorrect patient identifiers, such as a missing Social Security number, misspelled name, incorrect sex, or transposed date of birth.
  • Radiology reports missing their accompanying images, leaving insufficient information to correctly diagnose or confirm a patient’s injury or ailment.
  • Blood pressure, and other standard numeric measurements, written as free text in encounter notes rather than in the appropriate structured fields.
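
As a concrete illustration, here is a minimal sketch of how two of these gaps, variable NDC formatting and vital signs buried in free text, might be flagged automatically.  The field names ("ndc_code", "note_text", "bp_systolic") and the accepted NDC layouts are assumptions that would need to match your own EHR extract; this is a starting point for a data-quality audit, not a complete rule set.

    # Minimal sketch of two automated data-quality checks.
    # Field names below are hypothetical and must be mapped to your EHR extract.
    import re

    # Accept common hyphenated NDC layouts (4-4-2, 5-3-2, 5-4-1) or a plain
    # 10/11-digit string; anything else is flagged for normalization.
    NDC_PATTERN = re.compile(
        r"^(\d{4}-\d{4}-\d{2}|\d{5}-\d{3}-\d{2}|\d{5}-\d{4}-\d{1}|\d{10,11})$"
    )

    # Catch blood-pressure readings typed into free text, e.g. "BP 142/91".
    BP_IN_TEXT = re.compile(
        r"\b(?:BP|blood pressure)\D{0,5}(\d{2,3})\s*/\s*(\d{2,3})", re.IGNORECASE
    )

    def flag_quality_gaps(record):
        """Return a list of data-quality flags for a single encounter record."""
        flags = []
        ndc = record.get("ndc_code", "")
        if ndc and not NDC_PATTERN.match(ndc):
            flags.append(f"non-standard NDC format: {ndc!r}")
        note = record.get("note_text", "")
        if BP_IN_TEXT.search(note) and not record.get("bp_systolic"):
            flags.append("blood pressure found in note text but missing from structured vitals")
        return flags

    # Example: a record with a malformed NDC and a BP reading buried in the note.
    print(flag_quality_gaps({
        "ndc_code": "00071015527x",
        "note_text": "Pt reports headache. BP 142/91 at triage.",
    }))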

Each of the above cases involves very different causes and, as such, very different data elements and outside standards.  Each case may result in one or more types of gaps.  Some gaps result from standard reporting configurations that fail to convey essential information.  Others arise from clinical practices that may stem from an organization’s workflow, EHR configuration, or even varying user personalities.  Concerns about the quality of Healthcare data generated in the clinical environment threaten to derail efforts to derive public and organizational value from Healthcare data sets.

The importance of quality clinical data has been highlighted, and common data quality problems have been identified.  Now the question is, “How can the user gain confidence in his or her EHR data?”

Check back with us at Gordian Dynamics later this week for a follow-up post that will highlight how users can improve data quality and integrated analytics, leading to increased confidence in EHR data and improved care.