Data Usage Facilitated by Interoperability is the Key to Value-Based Care

In a conversation between The Clinician and Greg Robinson of ICHOM, Greg offers his insights on the challenge of using data for value-based care and on improving interoperability capabilities.

A Chief Technology Officer by title, Greg’s role straddles the line between CTO and Chief Strategy Officer as he, together with his colleagues and the ICHOM board, navigates ICHOM’s mission to move the needle on value-based healthcare on a global scale.

The Clinician is proud to be the official technology and implementation partner for ICHOM in Asia Pacific and the Middle East.

What interoperability makes possible for data

The ongoing global pursuit of value-based healthcare has meant moving away from fee-for-service and towards bundled payments and other types of programs. The ultimate objective in value-based healthcare is finding the right ways to reimburse providers based on patient outcomes rather than volume. Interoperability is crucial to the value-based healthcare model.

Fundamentally, what interoperability makes possible rests on its facilitation of large-pool data. Large-pool data allows providers to be better informed about gold-standard paths to treatment, and enables patients to get connected while empowering them to better understand the outcomes they should expect.

For many years, the main objective was getting the industry to adopt electronic health records (EHRs). With adoption came all these data, from both claims systems and EHR systems, and that influx of data coincided with the broader question technology was asking about big data: how do we actually use these data? Interoperability is that next step.

The problem of data-rich, information-poor health systems

We’ve got systems that generate a lot of data, and we have the tools behind big data, but we still have systems that don’t talk to each other. For example, two hospitals that are both on Epic—even sister hospitals within the same health system—could effectively be spinning up two different versions of the data because of how highly configurable these systems are.

As an example of a successful interoperability project, consider the CMS Blue Button. The efforts of Aneesh Chopra [CTO under President Obama in the United States] in promoting and pushing interoperability resulted in the CMS Blue Button initiative. With that initiative came the opening up of Medicare data to the quasi-public realm, as patients could enable access to their underlying data. And with that we saw a concentration of developers, of digital health companies, and of individuals who wanted to form companies—everybody came with an idea, because now there was data.

When it comes to successful interoperability projects and successful integrations of new technologies with existing systems, the factors at play are funding and regulation.


The role of regulation

You see a lot of interoperability successes stemming from regulators, because that’s where the concentration of money and the volume of willing participants gather. On both the patient side and the organisation side, the successes are coming out of what regulators are requiring.

In the building of these ecosystems we’re starting to see now—whether health system-initiated, as in the case of Jefferson Health, or consortium-initiated, like Graphite Health out of Intermountain—people have recognised there is value in these data. The result is greater adoption of interoperability, which in turn increases the ability not only to monetise the data, but to produce meaningful outcomes that sustain that monetisation.


The role of funding

From the patient perspective, or even the provider perspective, the power of interoperability also lies in all of the enabling technology and the massive amounts of private equity and VC funding that follow it.

[That funding] is starting to funnel into the notion that once we have data, we can empower patients with apps that help them understand how to monitor their health and make sense of their billing experience. The CMS Blue Button, for example, takes this notion that we can connect patients with their own Medicare claims data, and with data coming out of large learning collaboratives or consortia and large programs like All of Us.

We see a lot of success in these projects in areas where the regulators have said “it’s time” and then enabled more extensible use of the data. That brings a flood of successful projects, and money from the financial sector comes in to fund greater and better ideas.


Small data pools and limited learning

Any time you try something small or simple, like a learning collaborative, you could have all the interoperability you want: you could say we’re going to predefine the data elements and scrub the data we get from 12–15 different institutions, or a hundred, or even a thousand.

But once you’re sitting on the data, there are still rules and constraints around its use. And, of course, in certain areas those constraints serve to reduce or eliminate misuse driven by avarice.

Even when we strip away the limitations on data use that exist purely to guard against greed, we’re left with the challenge of serious privacy issues, as in: because of GDPR, you can’t release a single data element outside of certain countries.

We’re sitting on a small planet; 8 billion people sounds like a lot, but, quite frankly, it’s not. There’s a lot of data, there are a lot of challenges, and there are a lot of ongoing pressures in the healthcare world. Take COVID: we’re exiting a massive global event from which we’ve learned very little, because we don’t have strong enough data rules in place.

We have all these data, and with the amazing advances in both computing power and data science techniques we have the ability to use them, but we’re still constrained by small data pools and limited learning. And that’s a shame.

So how do we make the data more interoperable, and how do we establish standards that will stick within the industry?

Doing so is a crucial step in driving better knowledge around payment reform measures and policy, and in empowering providers with an improved understanding of how to treat at the point of care.



The interoperable, anonymised hybrid architecture of The Clinician’s ZEDOC platform means it can be scaled up securely across entire hospital and health systems.

Contact The Clinician to learn more
