Okay, good afternoon, everyone. We'll go ahead and get started. Welcome to the next installment of our Digital Quality Expert Webinar Series here at NCQA. We are incredibly excited today. We have some great partners joining us here from Velox, who are going to really dive into how to assess health plan readiness and get ready for the transition to digital quality. So I want to go ahead and introduce our speakers today. We have Michael Klotz and Dr. Michael Barr. Both are co-founders of Velox. Michael Klotz has been in the digital quality industry for many, many years, has worked with data and clinical data exchange, and is going to bring that insight to us today. Dr. Michael Barr has also been in the HEDIS and quality measurement industry for many years. He served at NCQA at one point in time and has continued to drive forward the progression toward HEDIS and quality measurement. So very excited to hear from them today, and I will go ahead and hand it over.

Great. Thank you, Allison. You have the first slide up already, so thank you, and thank NCQA for this opportunity to speak with all of you, and thank all of you for attending. This is my first webinar with NCQA since leaving the organization in 2021. Before that, I was on the other side of the screen, representing NCQA during the very early phases of the transition to digital measures. So it's exciting to see how far we've come in the past few years. The idea of digital quality surfaced around 2017, when NCQA held its first Digital Quality Summit. The first Future of HEDIS webinar, laying the foundation for the digitalization of HEDIS measures, was in early 2019. So here we are, about seven years later, talking about finalizing the transition to digital HEDIS over the next few years. And it's not just NCQA. CMS is on board, as are the other measure developers. So today, we'll start with a brief recap of digital quality: what's changing and why.
Then most of the presentation will focus on assessing payer readiness, with specific attention to planning and executing the transition. So why digital quality? What's the core premise of digital quality? What's behind the motivation to move forward? Well, here's my take. To improve the quality of care, manage population health, and achieve the quality and cost outcomes desired for our patients and members, especially under value-based payment models, we need more and better measures that cover more aspects of clinical care, provide timely guidance for payers, accountable organizations, clinicians, and their teams, and support the differentiation of high performers for the purposes of value-based payment models. Now, the move toward value-based payment models is accelerating, but these models have not consistently delivered definitive positive outcomes on quality, cost, access, or patient experience. And I said more measures. I did say it. However, not more measures in non-digital format. That would further complicate the already challenging processes of implementation, data collection, and reporting. So I mean more digital measures, because digital quality measures have the potential to cover more clinically relevant issues while reducing this burden. And how will they do that? Well, they're written in Clinical Quality Language (CQL) using Fast Healthcare Interoperability Resources (FHIR), and the measure logic is executable. And they can handle more complex clinical indicators. They're also configurable to meet national reporting requirements and population health needs, for example, to align with NCQA's allowable adjustments. These measures use clinical data in a structured, standards-based format, making them easier to validate for use. So this will reduce variability in implementation effort and lower the cost of execution and validation. Payment models and more downside risk alone won't achieve value-based care objectives. Technology and AI can't do it.
Asking clinical and administrative teams to work harder won't either. We need innovative payment models, and CMS is driving that train; you've seen a lot of them come out recently. They go beyond primary care, enhance operational readiness, smartly apply technology to support clinical teams and patients, and use higher-quality data to support better digital quality measures to get there. So what's changing with digital quality, and how do we advance going forward? The development, distribution, and reporting are better, faster, and easier. Measure authorities like NCQA write the measures using FHIR and CQL. And as I mentioned a few moments ago, no more interpretation or translation of PDFs and narrative specifications. The digital measures are delivered to vendors, who ingest them in that format, facilitating the implementation, execution, and validation. Since everyone starts with the digital specification, that should be faster and less costly. And then finally, reporting leverages standards-based software and FHIR. Now, FHIR is both an application programming interface and a data formatting standard. In the context of digital quality measures, the data format aspect of the standard is what applies. And importantly, we don't expect any changes for reporting to NCQA via the Interactive Data Submission System, or IDSS. Now, before we get into the readiness planning and execution part of the presentation, it's important to highlight when you need to be ready. But here's the punchline: the time to start the process is now. Michael Klotz will provide details in a few moments, but here's an overview. It's clear NCQA is on this path. The portfolio of digital measures is growing, and NCQA is signaling that all measures will be digital by measurement year 2029. And NCQA will sunset the hybrid sampling methodology measure by measure; Michael has an upcoming slide with that schedule.
And depending on the pace of market change, the hybrid methodology will be retired and replaced with measures based on full-population data collection in 2030. Sampling will go away. Digital quality solutions are entering the space, and we expect the options to grow. Now, an additional, optional, but important part of the process for the transition is comparative testing, previously referred to as parallel or concurrent testing. This is a process that organizations may use to evaluate the accuracy of digital HEDIS measure results by comparing them with traditional HEDIS results for the same measurement year. This used to be a requirement, but in March, NCQA published guidance that it no longer requires organizations to complete the one-year comparative testing. Nevertheless, organizations may want to avail themselves of this option to help build confidence that what they're doing is working, understand the differences between traditional and digital results, and validate the data mappings, logic, and workflows. So if you're interested, go to NCQA's DQM readiness reporting guide, and the team will put some links in the chat for you in a moment or so. I would highly suggest you read those if you haven't done so already. So now that I've covered some of the basics, let me turn it over to Michael Klotz. He's going to take you through our recommendations for assessing payer readiness and for planning and executing the transition. Michael?

Thank you. Effectively, what we're doing here is we define four major capability categories: data completeness; data readiness (and we'll explain the difference); software capabilities; and then quality, or HEDIS, reporting, which pertains to what Michael mentioned just before around the comparative testing. So let's get started with data completeness. Data completeness is effectively asking: do you have all the data to report HEDIS properly?
In other words, as the hybrid methodology phases out, first and foremost you want to make sure your rates don't drop. Now, you may have other benchmarks for that. You may actually have your rates go up, or have the desire to have your rates go up, or other goals around that. But the bottom line is, you need to be very prepared and aware of what happens to your rates when the hybrid methodology goes away. So as it phases out, the main driver, and we'll look at some real data on this in a minute, is obviously the size of your eligible population, which is also a function of the membership in a given plan. As the sample gets replaced with the eligible population, the larger your membership, the larger the impact on your rates, and therefore the more data you may need to procure to make up for those rate drops. It varies by line of business. And MRR, medical record review, in our assessment will typically not scale sufficiently. That is for a number of reasons. Vendors will be taxed with more demand; I think we're starting to see that, and I think there are practical limitations there. And provider abrasion, which is already bad, may escalate as well. Bottom line, even if that is not the issue, budget may be. We'll show some numbers on what the likely impact of going more toward medical record review, as opposed to other ways of collecting clinical data, might be. Okay. The broader theme, which we're not going to get into today but which is a core theme of what we work with our customers on, is what we call the clinical data imperative. Effectively, what we're observing, and what we see confirmed even by folks like Gartner, is that the demand for clinical data is increasing dramatically. The volume is going up, but the data also needs to be of better quality. It needs to be more real-time, and it needs to be structured and ideally standards-based.
With budgets being tight and potentially getting tighter, what that means is that the unit cost for clinical data needs to drop very, very significantly. And there are ways to do that, but it needs to be a deliberate, strategic effort on behalf of payers. Okay, so we've developed a methodology to model the impact of the hybrid methodology going away for a measure, and then basically figuring out the impact on the rate and, potentially, how much more data you would need to collect to meet the rate that you previously had with hybrid. I'm not going to explain the whole waterfall, but basically what we're doing here is deconstructing the final hybrid rate. We take out the hybrid hits, we take out the supplemental hits, and then we take the same data and build up the same rate based on the full eligible population. In this example, we have a plan with about 140,000 members, and we modeled, modeled because we are basically rolling up across measures, so it's not just a simple addition, which would overstate the EP, a combined eligible population across all the measures that are phasing out. What we're getting to in this case, a Medicare Advantage plan, is a rate shortfall of about 25 percent. And in this case, and again, this varies, the data shortage is 13,000 members, which is obviously a big portion of the eligible population. So that is just one example, but it's something that we've applied across a number of plans, and we'll show you a statistical analysis of this data across 147 plans in a second. Okay. So here's actually that graph. We did a regression analysis by line of business. You see the Ns are meaningful at 29, 31, 28, and 59.
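The waterfall deconstruction just described can be sketched in a few lines of code; note that the function and all figures below are illustrative, not the webinar's actual plan data.

```python
# Hypothetical sketch of the hybrid-rate waterfall: remove hybrid and
# supplemental hits from the sampled rate, then rebuild the rate over the
# full eligible population. All numbers are made up for illustration.

def full_population_rate(sample_hits, hybrid_hits, supplemental_hits,
                         sample_size, eligible_population):
    """Project the administrative-only hit rate onto the full EP."""
    admin_hits = sample_hits - hybrid_hits - supplemental_hits
    projected_hits = admin_hits / sample_size * eligible_population
    return projected_hits / eligible_population

hybrid_rate = 329 / 411                   # final hybrid rate on a 411-member sample
full_rate = full_population_rate(
    sample_hits=329, hybrid_hits=60, supplemental_hits=25,
    sample_size=411, eligible_population=12_000)
rate_shortfall = hybrid_rate - full_rate  # gap to make up with additional data
```

The shortfall is then what drives the "data shortage" estimate: how many more members need clinical data from non-hybrid sources to restore the rate.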
And we see a pretty consistent, strong correlation between the size of the eligible population and the data shortage that we predict from it. So if you build out the next couple of boxes here, we can actually see the data. What we're saying is that typically, and on average, for 188 members per 1,000 eligible population, you would need additional clinical data. And if we model that into a medical record review, where we typically use 2.3 charts per member and about $30 per chart, and obviously use your own numbers if you can, that would add up to about $12,500 for every 1,000 additional members in the eligible population. And this correlation is really strong: the R-squared is about 0.96 on the eligible population, and on membership the correlation is still 0.7. So there's clearly a strong correlation between the size of the eligible population and the size of the membership and what you can expect to see in terms of rate drop and, even more importantly, what additional data you may need to get to rate parity, as we call it. So how to approach, assess, and fix that? The key is to track the hybrid phase-out schedule, which you see on the right. This is the latest from NCQA. That is already underway, so you may already be seeing impacts, or ideally have addressed this. But if you haven't, we highly encourage you to model the impact and then obviously plan and mitigate. Deploy alternatives to chart review, for the reasons I mentioned before, and monitor for new measures. We don't expect a whole lot of new measures by 2030, but just in case, obviously monitor for that. What you see at the bottom is basically modeled out from the previous example.
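A quick sketch of that chart-review cost arithmetic, using the averages cited in the talk (188 members per 1,000 eligible population, 2.3 charts per member, $30 per chart); the raw product comes out near $13,000, in the same ballpark as the roughly $12,500 figure cited.

```python
# Back-of-the-envelope MRR cost per 1,000 members of eligible population,
# using the averages from the talk; substitute your own numbers where possible.

def mrr_cost_per_1000_ep(members_needing_data=188, charts_per_member=2.3,
                         cost_per_chart=30.0):
    """Estimated chart-review cost per 1,000 eligible population."""
    return members_needing_data * charts_per_member * cost_per_chart

cost = mrr_cost_per_1000_ep()   # 188 * 2.3 * $30 = $12,972 per 1,000 EP
```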
We have a readiness score, on a scale from 1 to 10, for how ready this plan is for the digital quality transition, or for the hybrid phase-out across all the measures. We see again the membership and the eligible population. We've modeled that if you were to do MRR, that would cost another $800,000, approximately. If you use NLP, where you don't do manual chart abstraction but you still need to get the charts, that still costs significant money. However, if you go all in on structured data, or what we used to call supplemental data, then you may actually save money, because you're doing away with MRR and the more expensive methods. And the way we're modeling FHIR from the source, and we'll define that in a minute, you may actually have some additional savings from that. But again, the key here is to be aware of what the impact will be, model it properly, and plan accordingly. Now, on to data readiness. Data completeness is: do you have the data you need? Data readiness is: do you have the data in the FHIR format when it matters, so you can actually run digital quality measures? If you feel that you're not quite ready, you're definitely not alone. Most payers we talk to are basically saying, we are just starting to figure this out; we're not ready, or we're not quite even ready to have a solid plan. The reasons for that are common and, I think, very addressable. It's a lack of clarity on requirements around data and also about how to implement digital quality around the software, which we'll talk about in a minute. And there's also the confusion we often see about FHIR's definition. Dr. Barr mentioned this earlier: FHIR is used for both the API aspect of the standard and the data format aspect of the standard. Now, while APIs are very important and useful in the context of digital quality, we really only focus on the data format definition of the standard.
And then there's, of course, terminology and concepts, which are very useful and important, but can be confusing as well. A FHIR repository is something you may have heard of, or you may even have one; just-in-time FHIR and FHIR from the source we'll explain a little later as well. So when you look at the data flow, on the left of the Sankey chart we have basically all the external sources of your clinical data, which then come in-house. In-house, we obviously also add admin data, and then the data flows to your digital quality software, which eventually will replace your traditional software. The point here is, and maybe you can build this out, Allison, that while there's a lot of misperception about data potentially having to come from the source as FHIR, that is not the case. That is ideal for a number of reasons, but not necessary. What's really necessary is that the data is FHIR-formatted just before it actually gets consumed by the digital quality software, which would be that orange step in the process here. So we recommend that you look at readiness from the software backwards, and we'll explain what that means. Basically, the idea is that if your software vendor already has that capability, or if you have an incumbent software vendor who may do that for you, then you may not have to change a whole lot on your data, or you may have to do more. But as you prepare, think of it going backwards: do I have one hundred percent FHIR when the data enters the digital quality software, and how much can I get further upstream, if you will? So that's really the point here. The other thing that's important to consider is that you also do this in the context of your enterprise clinical data operations.
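As a concrete illustration of "FHIR-formatted just before consumption," here is a minimal sketch of mapping a proprietary lab-result row into a FHIR R4 Observation resource right before it is handed to the digital quality software; the internal field names are invented for this example.

```python
# Illustrative sketch: normalizing an internal record into a minimal FHIR R4
# Observation just before the dQM engine consumes it. The internal record's
# field names (member_id, loinc_code, etc.) are hypothetical.

def to_fhir_observation(internal_record):
    """Map a proprietary lab-result row to a minimal FHIR Observation."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": internal_record["loinc_code"],
            }]
        },
        "subject": {"reference": f"Patient/{internal_record['member_id']}"},
        "effectiveDateTime": internal_record["service_date"],
        "valueQuantity": {
            "value": internal_record["result_value"],
            "unit": internal_record["result_unit"],
        },
    }

row = {"member_id": "M123", "loinc_code": "4548-4",   # 4548-4 = hemoglobin A1c
       "service_date": "2024-03-01", "result_value": 6.8, "result_unit": "%"}
obs = to_fhir_observation(row)
```

The point of the "software backwards" approach is that this conversion can live anywhere upstream, in the vendor's intake layer, a FHIR repository, or a just-in-time step, as long as the engine receives FHIR.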
Inventorying your data flows, like you may already do for your audit, and figuring out how your external and internal sources contribute to the data that eventually goes into the digital quality software, is essential for that assessment. The enterprise clinical data operations aspect is important because these investments should be considered as part of a broader clinical data strategy in the enterprise. Okay. So, planning and tracking. Start at the end, like we said, where all data needs to be FHIR. Convert to FHIR along the way, and do that as you have capabilities, or as it makes business sense, depending on what else is going on in the enterprise. You may have a FHIR repository where FHIR bundles are ready to use. And in some cases, you may even have FHIR from the source. This is, again, ideal but not required. The demand-side FHIR capability, as we call it, where plans actually have the ability to get data in FHIR format from their data partners, is extremely strategically valuable in the long term. It is not necessary, but it will be very important going forward, for a number of reasons, and obviously advantageous to digital quality. What you see on the right is one way you could dashboard that, where you keep track of what the coverage is in terms of FHIR availability, from external data sources like we saw earlier to internal ones. And again, make the distinction between availability of FHIR from a repository versus just-in-time capabilities, and do that for both clinical and administrative data. Now, software readiness. As Dr. Barr mentioned, more and more software vendors are starting to offer software packages for digital quality. And at the core of each digital quality software package is what's called the CQL engine, or CQL runtime engine.
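The FHIR-availability dashboard idea could be prototyped very simply by rolling up coverage by record volume across sources; the source names, volumes, and readiness flags below are hypothetical.

```python
# Hypothetical sketch of a FHIR-readiness rollup: for each data source,
# track whether its feed is available as FHIR (from a repository or via
# just-in-time conversion) and compute coverage weighted by record volume.

sources = [
    {"name": "HIE feed",        "records": 500_000, "fhir_ready": True},
    {"name": "Lab vendor CCDs", "records": 200_000, "fhir_ready": False},
    {"name": "Claims (admin)",  "records": 900_000, "fhir_ready": True},
]

total = sum(s["records"] for s in sources)
fhir_covered = sum(s["records"] for s in sources if s["fhir_ready"])
coverage_pct = 100.0 * fhir_covered / total   # share of records already FHIR-ready
```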
And obviously the core requirement is that they implement the FHIR and CQL standards, so that measures like digital HEDIS, which are built on those standards, can run on that software. But the longer arc, where the benefit of digital quality will really compound, is when other measures and other measure programs run on this exact same software using the exact same standard, and, by the way, use the same FHIR-formatted data. The software, however, is more than the CQL engine. So what we recommend is: assess and compare offerings from vendors, work with your incumbent vendors, see what their roadmaps are, and so on and so forth. But maybe create an inventory of that. Here's a quick assessment of the vendor landscape today. We would call them incumbents and new vendors, if you will. The incumbents are what we call traditional HEDIS vendors. Now, there are a lot of certified vendors out there; we actually pulled a very recent list, and you may notice that only a small number of these vendors have the capability to fully report HEDIS, and you probably have a good idea of who they are. With digital quality, and especially driven by the efforts of the digital quality implementers community, we see a lot of new players come into the space as well, which we categorize as new vendors, which have digital HEDIS capabilities already. Most started with building out an engine, which is sort of at the heart of that, and building offerings around it. There are currently 14 listed, and there's a link for that which we can put in the chat; it's at the bottom of the slide as well. And the bottom line is, no vendor is completely ready. That may be concerning, but I would say it's not. Not yet, right? We see the speed at which things are moving forward.
And since all the measures are effectively ready from NCQA, vendors now can really make a full effort to make sure they all run properly, start building complete solutions around them, and work with their customers on that. So that is a quick overview of the landscape. When evaluating software vendors, think about whether you want to stick with the vendor you currently use, if they have an offering, versus potentially switching to a new one; there are obviously trade-offs between the two. Key criteria: find out the release plan, that is, what is ready when for their software-supported measures. Some of the vendors say, oh, we have almost all measures ready, but typically vendors are starting to give counts now. Then the key capabilities from that wheel on the previous slide; again, just make sure that you do apples-to-apples comparisons when you look at vendors and software. What's the implementation readiness and the timeline for the implementation? And then also, what's the deployment model, right? Some vendors may only provide cloud-only offerings; others may offer on-prem as well. Your preference may matter there too. Again, you could dashboard this by asking, going back to the previous conversation about data readiness: do they have just-in-time FHIR capabilities, meaning can you give them data more or less the way you hand data off to your current vendor, and they will then take care of the FHIR part? Or would you potentially need to take care of that outside the software? And then track measure availability and potentially other aspects of that. So that is planning and tracking for software readiness. And finally, quality reporting readiness. So Dr.
Barr mentioned earlier that it is probably a good idea, although not mandatory, to do a side-by-side test of your existing software and what it outputs for reporting against the software that will replace it, whether it comes from the same or a different vendor. So the comparative testing, in our opinion, is essential, and I think everybody would want to do that ahead of reporting digital-only and potentially turning off the old software. To accomplish that, the output files should match, and the conditions that need to be met are: the input data needs to be the same, first and foremost, and properly formatted; the measure logic is properly executed, meaning both engines calculate the measures the same way; and the output files are properly rolled up and assembled in the IDSS format. You can potentially do that with both the initial and the preliminary files, but certainly you would want to run it on the final submission files. Again, make sure you validate the XML file that comes out of the digital quality software: is it properly formatted, and are the data types and values compliant with the requirements? Run comparative testing: do the rates match, does the EP match? Then potentially drill into more detail to do root cause analysis and remediate. That may be an iteration that you have to go through. And a dashboard on that could look something like what's on the right, where we would load the traditional HEDIS file and the digital HEDIS file and basically compare and point out where there is a delta. Okay, so back to the timeline. We've moved things down a little bit. And as Dr. Barr said, we highly recommend that plans, if you haven't started yet, get started.
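The rate-comparison step of comparative testing can be sketched as a simple per-measure diff; the measure IDs, rates, and tolerance below are illustrative, not from any actual submission.

```python
# Sketch of a comparative-testing check: compare per-measure rates from the
# traditional and digital pipelines and flag any measure whose delta exceeds
# a tolerance. Measure IDs and rates are hypothetical.

def compare_rates(traditional, digital, tolerance=0.001):
    """Return {measure: (trad_rate, dig_rate)} for measures that disagree."""
    mismatches = {}
    for measure, trad_rate in traditional.items():
        dig_rate = digital.get(measure)
        if dig_rate is None or abs(trad_rate - dig_rate) > tolerance:
            mismatches[measure] = (trad_rate, dig_rate)
    return mismatches

traditional = {"CBP": 0.612, "BCS-E": 0.744}
digital     = {"CBP": 0.612, "BCS-E": 0.739}
deltas = compare_rates(traditional, digital)   # flags only the disagreeing measure
```

Any flagged measure then feeds the root-cause loop described above: check the input data, the measure logic, and the IDSS rollup, remediate, and rerun.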
The reason is not just that things are underway, like the hybrid measures phasing out, but that there are a lot of milestones that need to be met ahead of the final transition to digital in 2030, and there are also a number of dependencies. We illustrate that by popping a few milestones in there. These may not be comprehensive, but they should give you a sense that some milestones have already passed for some of the hybrid measures phasing out. Some of the other red ones are hard deadlines, whereas the yellow ones are key milestones along the way, or dependencies that should be met. I'm not going to get into what we think each of those is, but obviously each one represents a meaningful milestone. So with that, let's wrap it up. To recap the four categories: Data completeness, model and address data gaps, and minimize MRR. Data readiness, inventory your FHIR capabilities and take the reverse approach, from the software backwards, in determining when data will become FHIR compliant. Software capabilities, survey and RFI/RFP vendors, start that process with some of the criteria we're recommending, and inventory their capabilities and deployment models. In terms of reporting, which is covered by comparative testing: highly recommended, and analyze and remediate until the outputs match. On the business metrics side, we don't really have time to get into that today, but it's key that you keep other business metrics in mind, which you probably do, whether it's HEDIS reporting goals that go beyond just keeping rate parity on hybrid measures, or making sure your Star measures and your Star reporting are up to your goals. And the other one is return on investment. Obviously, the transition to digital quality will require some investment, but it also has opportunities to potentially save some money, especially on the clinical data side. We know that budget scrutiny is increasing at health plans today.
Some plans have pretty stringent rules about approving any investments, but modeling the ROI properly and making sure that you can also measure whether it's being attained is key. And I think there are ways to do that; we've worked with our customers already to accomplish good, meaningful modeling and tracking of ROI. Timeline, again: plan backwards from 2030. Make sure you don't just look at that final everything-needs-to-be-digital deadline, but at all the interim milestones, whether they are dependencies or hard deadlines, including the hybrid phase-out.

Wonderful, thank you so much, Dr. Barr and Michael. I think it was great the way that you laid out such a practical approach to the things we need to do to think about getting started. I think so frequently when we talk about the transition to digital measurement, we talk about the broad, the big, the industry changes, but I felt like you were able to share some really clear and concise steps that organizations can take in order to get prepared for that transition. So let's go ahead and move into some questions. And please keep adding questions to the chat, folks; we are seeing a few come in. We're going to go through some moderated questions, and then we will jump over to the ones in the chat. So the first question is: what are some topics that payers do not need to be overly concerned about as they're getting started?

I like that question, as you know. I do believe there is, or has been, a lot of emphasis on the standards and technologies around digital quality, specifically FHIR and CQL, and obviously the FHIR data standard.
And while those are obviously important and very much inside the implementations, I believe plans should primarily focus on the concepts, to ensure they understand in broad terms what needs to happen around digital quality, but not necessarily worry about being a CQL expert or even a FHIR expert. Right? Make sure you work with your vendors and ask them what the requirements and the capabilities of their software are. And then work backwards and say, okay, that basically means we don't have to worry about FHIR at all, or we may have to prepare data before it actually gets to the software, based on that analysis which I showed earlier. But all in all, I think the important thing is to understand the concepts and the big topics around the transition and not get too worried about the details, especially the standards details and the technical details. At the same time, I would encourage everybody to really start embracing what I would call shared concepts and language around digital quality. And I will plug an NCQA course here, Digital Quality Transition 101, as I call it; Allison, you probably know the formal title. I had a chance to contribute to it. I don't make any money off of this, but I still highly recommend it, because I think it is a great baseline for those concepts and that terminology. In addition, once you identify those gaps in knowledge and put aside the things you don't need to worry about, vet the vendors or the consultants you bring in to help, to make sure you're getting the right guidance. I think that's an obvious thing, but it's not always done well, especially if you're not really sure what you need, and going through what Michael suggested gets you there. So we're happy to help, of course. Yeah. You know, an analogy I use sometimes, and people who've been around me too much are probably sick and tired of hearing it.
But since we talk a lot about the CQL engine, or in certain contexts a CQL engine comes up a lot, I say, think of it like buying a car, right? If you bought a traditional car, i.e., traditional HEDIS, you don't necessarily worry too much about what's inside. You just kind of know it works, right? Because your vendor takes care of the details, including building the measures and getting them certified, which is not necessary with digital measures. But also, don't think you have to assemble your own car. Just understand how the new car is different from the old car. That's maybe a good way to think about it.

Yep, absolutely. And I think that part of HEDIS's success in the past has been the partnership between all of the vendors who the payer works with. And you can continue that same effort: work with your experts, work with your vendors with whom you already have those established relationships, and go on the journey together. So our next thing to talk about is: most payers have pretty sophisticated clinical data capabilities. Should they think about expanding those, or what is the approach to take with FHIR when working with clinical data?

I would start by saying that, yes, we all want to use the term clinical data going forward, and ECDS basically defines that data quite well. Traditionally, payers usually talk about supplemental data capabilities. So can you leverage what you've done with supplemental data, and/or should you think about expanding or taking a different approach with FHIR? I don't think it's an either-or choice. I believe that ultimately going to FHIR will have significant advantages, including financial advantages. In the meantime, though, I think the most important thing is to have a very clear inventory of the data flows you have.
Make sure you can replace medical record review, and all the additional data you may need to fill the gaps on the hybrid measures, with structured data. Now, ideally that's FHIR. The second-best option, I think, is other standards like ADT, CCD, or C-CDA, which you typically would get from some vendors, from an HIE, or even via TEFCA. And the third is the lot of custom files floating around. Those are usually the most delayed and the most cumbersome to set up, maintain, and process. So I would say structured data in any form is the desired first goal, with an eye toward getting FHIR from the source, i.e., from your data partners, wherever possible. I think that's an important, prudent approach to the topic. I'll just add to what I shared in my comments: the need for clinical data is only going to increase. The amount of clinical data you're going to need for existing measures and future measures, as the digital measures incorporate more clinical logic and depend upon more clinical information, is going to be greater. And that's not just the volume of data, it's the quality of the data. We didn't talk too much about that, but on one of Michael's slides we have something about improving the quality. So we help find the data, and then there's a process for improving its quality. That more-data, better-data equation is really critical for the future. Population-level data to satisfy the population needs. Administrative data you already get really well, right? Nothing really changes for that. It's the clinical data that you're going to need. So, Michael gave the technical information. I'm not the technical person on the call; I'm trying to give you the perspective of a clinician. I'm excited about this because I think we can get to better measures. And as I mentioned, I think for all the value-based payment models, we need better measures. Those will come eventually.
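To make the "structured, standards-based data is easier to validate and process" point concrete, here is a minimal Python sketch that pulls a systolic blood pressure out of a FHIR R4 Observation. The sample resource, its values, and the `extract_systolic` helper are invented for illustration; real payloads from an EHR or HIE are larger and need fuller validation.

```python
import json

# A minimal, hand-written example of a FHIR R4 Observation for systolic
# blood pressure (LOINC 8480-6). Illustrative only -- real payloads from
# an EHR or HIE carry many more elements.
observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
  "subject": {"reference": "Patient/example"},
  "effectiveDateTime": "2024-03-01",
  "valueQuantity": {"value": 128, "unit": "mm[Hg]"}
}
"""

def extract_systolic(resource: dict):
    """Return (date, value) if this Observation is a systolic BP, else None."""
    if resource.get("resourceType") != "Observation":
        return None
    codes = {c.get("code") for c in resource.get("code", {}).get("coding", [])}
    if "8480-6" not in codes:
        return None
    qty = resource.get("valueQuantity", {})
    return resource.get("effectiveDateTime"), qty.get("value")

obs = json.loads(observation_json)
print(extract_systolic(obs))  # ('2024-03-01', 128)
```

Because every FHIR-conformant source puts the code, date, and value in the same place, one extractor like this covers all of them; a custom flat file from each provider would need its own mapping and its own upkeep.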
Right now, we're working on the corpus of measures that are in the portfolio, because you're used to those. But expect more. NCQA has already started to build different measures on the same core concepts, right? The follow-up aspect of things, that's a start. Eventually, we're going to get to, for example, the blood pressure measures: average blood pressure over time, intensification and de-intensification of treatment based upon the clinical information. You can't get that from administrative data. So just be aware that the clinical data paradigm is only going to grow, and the need for it will only increase. So over to you, Alison. Excellent. So how should payers approach the challenge of needing more and better clinical data while also dealing with the budgets and resources we all live within day to day? That is effectively a continuation of the theme we started, and it goes back to what we call the clinical data imperative. But I think, as much as it's a challenge, it's also an opportunity, right? And again, that's where the longer arc toward FHIR is gonna be very, very meaningful, because the idea is that standardized, structured clinical data is a lot easier to handle if it comes in the same format from all sources and is used in different ways. On top of that, I think there's very meaningful regulation from CMS and ONC, or ASTP as it has been known since about a week ago, I think, where they're very much focused on certifying the APIs from the data sources, primarily from EMRs, with an emphasis on enforcement. So there's a standard called USCDI v3, which already has a good amount of the data elements that are mandated as of January first of this year. Compliance will take a while for most data sources, but this is very meaningful.
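The kind of clinical-data-dependent measure logic Dr. Barr described, such as average blood pressure over time, can be sketched in a few lines once the readings are available as structured data. The readings below are hypothetical, and this is an illustration of the concept, not an actual NCQA measure specification.

```python
from datetime import date
from statistics import mean

# Hypothetical systolic readings for one member over a measurement year.
# Illustration of measure logic that requires clinical data; NOT an
# actual NCQA digital measure specification.
readings = [
    (date(2024, 1, 15), 142),
    (date(2024, 4, 2), 138),
    (date(2024, 7, 20), 131),
    (date(2024, 11, 5), 127),
]

def average_systolic(readings, year):
    """Mean systolic value for the given measurement year, or None."""
    vals = [v for d, v in readings if d.year == year]
    return mean(vals) if vals else None

avg = average_systolic(readings, 2024)
print(round(avg, 1))  # 134.5
```

Administrative claims would only tell you that blood-pressure visits occurred; the trend toward control across the year, which is what this sketch computes, is visible only in the clinical values themselves.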
And then it's also a lot easier to standardize code sets on that, to measure and assess the quality of that data, and to improve it. It also allows clients to work with their data partners and implement some good best practices. We've previously often seen things like: providers in our network need to provide us the charts we need, for example, for hybrid measures and for risk adjustment. We have seen initial success from some of our customers working with their data partners to say, hey, we want basically the same for access to your FHIR APIs. And it's a win-win, because access to a FHIR API is much more secure and a lot less abrasive. So there are a number of considerations that I think will be great opportunities around this and will let plans thread that needle, tricky as it is, between needing all this better standardized data and actually making do with budgets that are gonna continue to be pretty tight as far as we can tell. I mean, this is not a place where you wanna be behind the curve. You wanna think ahead. As Michael showed in his slides, using the old methodologies is gonna cost more in the long run than making the change now. As a first step, again, as Michael alluded to, find your best clinical data sources now, optimize those, and continue to look for more. And get ready. It's coming. Lay the foundation now. Make sure it works. You have an opportunity to run in parallel with the comparative testing that NCQA has now made optional. Use that opportunity, see how things are going, but definitely put some pressure on improving your clinical data acquisition. It'll pay dividends in the long run. Yeah, I was just gonna reiterate that we realize getting budget for these efforts is not trivial these days, obviously, but I think there are some emerging tools, and we can certainly help with that, around modeling ROI and making it meaningful.
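For a flavor of what "access to your FHIR APIs" looks like in practice, here is a small Python sketch that builds a standard FHIR search query for a member's blood-pressure observations. The endpoint and patient ID are made up, and a real integration would also need SMART on FHIR / OAuth2 authorization, which is omitted here.

```python
from urllib.parse import urlencode

# Hypothetical provider FHIR R4 endpoint -- a real one comes from the
# data partner, along with authorization credentials (omitted here).
BASE = "https://fhir.example-provider.org/R4"

def observation_search_url(patient_id: str, loinc_code: str) -> str:
    """Build a FHIR Observation search for one patient and one LOINC code."""
    params = {
        "patient": patient_id,
        "code": f"http://loinc.org|{loinc_code}",  # system|code token search
        "_sort": "-date",                          # newest first
    }
    return f"{BASE}/Observation?{urlencode(params)}"

# 85354-9 is the LOINC code for a blood pressure panel.
url = observation_search_url("12345", "85354-9")
print(url)
```

The same search syntax works against any conformant FHIR server, which is the contrast with chart chasing: one query pattern instead of a bespoke retrieval process per provider.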
Or, as one of our esteemed colleagues in clinical data would say, "I spend half my day pitching and pitching just to get my budget." I think you can standardize some things, reduce that time, and actually have people spend more time with their clinical data, as they should. So there are lots of changes, obviously, but if approached properly, I think it's more opportunity than pain, honestly. And I think you've definitely highlighted throughout our time today that this is a journey, a step-by-step process. It's incremental, so you make those investments as you move along the journey to get to the point where we need to be. So for our folks listening, this is your last chance: go ahead and ask any of those questions and put them in the Q&A. We do have a couple. We have a few questions around XML files and IDSS, so I wanted to address those from the top. The IDSS experience and process as of today is not changing. Your transition to being fully digital is about the data that's going into the measure, the execution of the measure, and the outcomes that come out of the measure. That then goes into the standard IDSS and reporting that is done today. We do suspect that as we move into digital and have health plans who are starting to report digital data, just like everything else, that will continue to grow and change as we move forward. But that piece of it doesn't change as of today. I'll give my editorial on this. I don't think there need to be a lot of changes in IDSS. It's already XML, which is not a bad standard. There are some things around the edges that could be improved upon, but that is not something that needs fixing right now. So considering all the things that are happening, I don't think we need to worry about IDSS at this point. Yep. Absolutely. We have a couple of great questions that came in from those listening to us on LinkedIn.
So the first one is: what is the biggest gap you're seeing today between organizations who think they're data-ready and those who actually are? I don't know that a lot of plans actually think they're ready. I think it's almost the opposite: plans feel overwhelmed and think they won't even get there. And I think that's not warranted, but you obviously need to get started with planning and starting to execute. A lot of it also depends on what capabilities you deploy, separate from your software vendor or with your software vendor. So there are a lot of variables that can make it really hard or rather easy, and I don't think there's a clear answer. But nobody's taking it too lightly, and that's probably a good thing. Great. Another great question from LinkedIn is: are you seeing plans work with two vendors depending on capabilities? So, for example, one that operates a traditional HEDIS implementation while another operates a more mature digital engine. To date, we've primarily seen this across use cases. So plans may use one vendor for gaps in care or for PQA measures and another for HEDIS. There are very few examples where plans are actually picking a digital vendor deliberately at this point. But we know of a few that are considering a different vendor from their incumbent, and there are lots of plans who are hoping their incumbent vendor gives them a nice path to digital. But I think we'll probably see a lot of movement in that area and a lot more clarity. We gave you some numbers just so you can see what the universe out there is, and some criteria on what to ask about and what to look for. I think if we meet a year from now, there will be a lot more clarity around all that. So for our last question, around cost: I know we've talked about cost quite a bit today, and this may have been answered, but I think it's worth some additional conversation.
Once organizations are fully digital, what type of costs do we think will come down around maintenance and overall engagement with HEDIS measures? One promise of digital HEDIS is that the software itself may be a lot less expensive to run, because there is more variability in how it's deployed, how many vendors are in the mix, and how they approach deployment models, for example in the cloud. So there's the software and running the software itself. But I think the primary potential is really around standardizing and simplifying data, both in acquiring the data and getting standard formats in the door by using standard APIs, some of which we talked about earlier, and also in optimizing internal operations. Because right now, lots of plans have lots of one-off systems, people doing manual steps, keeping mappings updated, operating FTP transfers, and all kinds of one-off things that not only need to be handled but also kept an eye on and managed, making sure nothing gets dropped and nothing goes wrong. So I think standardization, streamlining, and optimization around standards and best practices is huge. And not least because the same data in the FHIR standard can ideally be leveraged for ROI across other use cases, right? If you have comprehensive real-time data for risk adjustment, for VBC, and for care coordination, that is the same data that is also used for quality reporting, gap analysis, and gap closure. That's really where we're excited about the benefits: not just for digital quality, where the share of the cost will obviously come down, but across the enterprise. So we've gone through our questions. In our last couple of minutes, I'll give you each an opportunity to wrap up. What are the things you really want folks to take away today, that last great light-bulb item before we wrap up?
First, get started now, all right? It's one thing to go deep into the policy and the changes, but you need to get started: start looking at your clinical data acquisition strategies and your optimization strategies, improving what you already have, and then going to that next level. It's going to happen fast. I know we're talking about years now, but it takes a lot of time to implement, test, and revise. It's also not just a technical issue; it's a cultural and operational issue. So make sure to consider all those aspects as you start down this road. And yes, there are gonna be some increased costs up front, but in the long run we expect this overall to be less costly and better across multiple different use cases once you've built that internal infrastructure and those improvement processes. Over to you, Michael. Go car shopping. No, I'm somewhat serious about that. Again, I think it's time to start, and the offerings are getting to a point where that starts to make sense. Just start having conversations, but also potentially start thinking about which capabilities you have with your current vendor, which capabilities you wanna keep, or which things you might even wanna bring in-house. I don't think there's one right answer, but there's no need to do everything from scratch here, right? And the other thing is, I really believe that standards like CQL and FHIR will transform payer operations for the better. So embracing those, without having to be an expert at them, and understanding the concepts and planning strategically around them is super important. And again, it shouldn't be a big "we'll need to invest millions of dollars for three years before we know if any of this is actually gonna work" kind of investment. It's more that you can take very meaningful, very incremental steps and measure the outcomes and the ROI within months, in many cases on the data side.
All before you escalate your spend and accelerate that effort. So there are a lot of incremental, meaningful steps, and I'm not talking about a pilot; I'm really talking about incremental steps in production that will immediately show how well it's going, so you can adjust, or say, yep, that turned out the way we expected, so we're going to do more of that. Great, wonderful. Thank you both, Dr. Barr and Michael, for taking your time today. I always continue to learn more, and I appreciate the insight. For those of you who joined us, thank you for giving us an hour of your time. I know how busy everybody is. And as always, feel free to reach out to NCQA at any time with questions around digital, HEDIS, and all of these things. Our goal is to really focus on what we can do to help organizations prepare for this transition. So thank you again to everyone for your time, and have a great rest of your day. Thank you, guys. Bye. Thank you. Thank you. Bye.
Digital Quality Expert Series: Assessing Health Plan Readiness for Digital Quality Transition with Velox Health
Drawing on a comprehensive framework, standardized KPIs, and real-world data from over 140 health plans, this session offers practical guidance and benchmarking strategies to help organizations move toward an all-digital future while continuing to optimize quality outcomes and manage tight budgets.
You’ll learn how to:
- Identify key milestones for transitioning from traditional and hybrid methodologies to digital quality measurement.
- Assess organizational readiness using meaningful KPIs across lines of business.
- Understand the clinical data gap—and the most cost-effective ways to capture additional data.
- Evaluate software readiness and vendor capabilities to support digital quality.