Improving the health of at-risk, vulnerable and chronically sick populations remains a massive challenge for healthcare organizations’ financial administrators. Social, cultural, hereditary, economic and geographical issues not only stand in the way of traditional approaches to care, but also complicate efforts at controlling medical costs.
Managing complex populations properly requires comprehensive, reliable data on patient health and demographics. However, many players in the healthcare ecosystem have only a limited or imprecise understanding of the populations in their care.
In the face of dramatic transformation of standards for healthcare delivery – from fee-for-service to value-based payment, independent providers to clinically integrated networks, and payer-based risk to risk-sharing contracts – the importance of population health management (PHM) has become apparent to all stakeholders in the healthcare marketplace. PHM can be defined as a shift in focus, beyond the individual patient and specific healthcare events, to an entire population of patients whose care coordination needs change over time. Successful PHM is not dictated by a provider’s ability to manage a single population, but by how well that provider manages multiple populations across differing levels and timescales of risk.
Why data quality?
Data quality directly affects a healthcare organization’s ability to engage in effective population health management. Data quality gaps can result from a variety of operational inefficiencies, such as the following (a brief validation sketch appears after the list):
- Erroneous patient identifiers, including a missing Social Security number, misspelled name, incorrect sex, or transposed date of birth;
- A standard numerical metric, such as blood pressure, written in text in encounter notes rather than in appropriate structured fields;
- Generic diagnosis codes entered quickly or out of habit instead of the more specific, actionable diagnosis codes appropriate to the patient;
- Crucial radiology images absent from reports, leaving insufficient information to consult on or verify a diagnosis;
- Inconsistent entry of standard codes, such as National Drug Codes (NDCs), derailing bulk analysis.
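As a rough illustration, checks like the ones below can flag several of these gaps in incoming records before they reach downstream reporting. This is a minimal sketch in Python; the flat record layout and field names (ssn, date_of_birth, systolic_bp, encounter_note, ndc) are assumptions made for the example, not any particular EHR’s schema.

```python
import re
from datetime import date, datetime

def validate_record(rec: dict) -> list[str]:
    """Return a list of data quality flags for a single (hypothetical) patient record."""
    flags = []

    # Erroneous or missing patient identifiers.
    if not rec.get("ssn"):
        flags.append("missing Social Security number")
    if not rec.get("last_name", "").strip():
        flags.append("missing or blank last name")

    # A transposed or future date of birth is a common identifier error.
    try:
        dob = datetime.strptime(rec.get("date_of_birth", ""), "%Y-%m-%d").date()
        if dob > date.today():
            flags.append("date of birth is in the future")
    except ValueError:
        flags.append("missing or unparseable date of birth")

    # Vital signs belong in structured fields, not free-text encounter notes.
    if rec.get("systolic_bp") is None and re.search(
        r"\b\d{2,3}\s*/\s*\d{2,3}\b", rec.get("encounter_note", "")
    ):
        flags.append("blood pressure appears only in narrative text")

    # NDC values are typically 10 or 11 digits, with or without hyphens.
    ndc = rec.get("ndc", "")
    if ndc and not re.fullmatch(r"\d{4,5}-?\d{3,4}-?\d{1,2}", ndc):
        flags.append("NDC does not match the expected format")

    return flags
```

In practice, rules of this kind would run inside an interface engine or ETL step, with each flag traced back to the source system that produced the record.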
Getting the complete picture
The impact of poor data quality is far from theoretical: for one large physician and hospital network, the positive results of an analytics program went unrewarded because of data quality issues.
In this case, to meet the reporting requirements of this network of over 2,000 providers in the Northeast United States, each practice EHR system transmitted a nightly data feed to a third-party analytics tool that would, in turn, calculate a set of quality measures for reporting. The data feeds were formatted according to vendor-defined Continuity of Care Document (CCD) specifications and were the only source of clinical data for the analytics tool. While analysts identified inconsistencies in the reported measures, they were unable to pinpoint the source of the data quality issues because of the number of systems involved in data storage, transmission, and analysis.
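For feeds like these, even a lightweight automated inspection of the CCD files can show where clinical content arrives only as free text rather than in coded, structured form. The sketch below assumes a locally stored CCD file and the standard HL7 CDA XML namespace; it simply lists sections that contain no machine-readable entry elements, and is illustrative only rather than part of the network’s actual tooling.

```python
import xml.etree.ElementTree as ET

# CCD documents are CDA XML and use the HL7 v3 namespace.
CDA_NS = {"cda": "urn:hl7-org:v3"}

def narrative_only_sections(ccd_path: str) -> list[str]:
    """Return the titles of CCD sections that carry narrative text
    but no coded <entry> elements."""
    root = ET.parse(ccd_path).getroot()
    gaps = []
    for section in root.iter("{urn:hl7-org:v3}section"):
        if section.find("cda:entry", CDA_NS) is None:
            title = section.findtext("cda:title", default="(untitled)", namespaces=CDA_NS)
            gaps.append(title)
    return gaps

# Example usage with a hypothetical file name:
# print("Narrative-only sections:", narrative_only_sections("sample_ccd.xml"))
```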
Based on an analysis of data flows, data types, and gaps, the organization and its consultants identified the sources of flaws in their existing clinical measure reporting infrastructure, classified along three axes (a small classification sketch follows the list):
- Capture, representing the accurate acquisition of the desired data elements into storage;
- Structure, representing the storage of the information in a format appropriate to its use; and
- Transport, representing the means by which data are transmitted and reported.
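A compact way to work with this classification is to tag each identified issue with one of the three axes and the data element it affects, then tally the combinations to see which fixes would pay off most broadly. The sketch below is a hypothetical illustration; the issue list, element names, and practice identifiers are invented for the example.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Axis(Enum):
    CAPTURE = "capture"      # the element was never acquired accurately
    STRUCTURE = "structure"  # the element is stored in a format unsuited to its use
    TRANSPORT = "transport"  # the element is lost or altered in transmission

@dataclass(frozen=True)
class DataQualityIssue:
    element: str   # e.g., "blood_pressure", "diagnosis_code"
    axis: Axis
    source: str    # the practice or system where the issue was observed

def prioritize(issues: list[DataQualityIssue]):
    """Rank (element, axis) combinations by how often they occur."""
    counts = Counter((i.element, i.axis) for i in issues)
    return counts.most_common()

# Invented example data for illustration.
issues = [
    DataQualityIssue("blood_pressure", Axis.STRUCTURE, "practice_a"),
    DataQualityIssue("diagnosis_code", Axis.CAPTURE, "practice_b"),
    DataQualityIssue("blood_pressure", Axis.TRANSPORT, "practice_c"),
    DataQualityIssue("lab_result", Axis.TRANSPORT, "practice_a"),
]

for (element, axis), n in prioritize(issues):
    print(f"{element:15s} {axis.value:10s} {n}")
```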
By applying these axes to a data model aligned with the most common representations of EHR data, the organization identified several clusters of data integrity issues. The analysis showed that a targeted effort to improve transport mechanisms would have the broadest and most profound impact on these providers’ ability to understand and treat their sickest patients, and that implementing those changes would make external reporting far more accurately reflect the organization’s performance on crucial, nationally recognized standards.
Data quality ready for population health management
Improving data quality can seem daunting, but it is critical for successful PHM. By categorizing points of failure and identifying the data element types most often involved in those failures, an organization can determine not only when, where and to what degree data gaps occur, but also what can be done to resolve them. In that way, it moves one step closer to the goal of using advanced health information technology to drive clinical transformation. And by combining higher-quality data with knowledge of the underlying populations, healthcare organizations can improve both the quality of care they deliver and the incentive revenue they earn.