Department of Health and Human Services National Committee on Vital and Health Statistics
Subcommittee on Standards
Predictability Roadmap Recommendations
December 13, 2018
Omni Shoreham Hotel Washington, D.C.
P R O C E E D I N G S 8:30 a.m.
Agenda Item: Welcome
COUSSOULE: Good morning everybody. Welcome to day two of our National Committee on Vital and Health Statistics Subcommittee on Standards. We appreciate your time and efforts yesterday, as we’ve indicated a few times, and look forward to picking up where we left off yesterday and continuing our very good and productive discussion today.
Official business first, we need to do roll call.
HINES: Good morning, members. Let’s begin by introducing yourselves, as we always do. Alix, you want to begin?
GOSS: Sure. Good morning. I am Alix Goss with Imprato. I am a member of the Full Committee, co-chair for the Subcommittee for Standards and I have no conflicts.
COUSSOULE: I am Nick Coussoule with BlueCross BlueShield of Tennessee. Co-chair of the Subcommittee on Standards, member of the Executive Committee, member of the Privacy, Confidentiality, Security Subcommittee, and I have no conflicts.
LANDEN: Rich Landen, member of the Full Committee, member of the Standards Subcommittee. No conflicts.
KLOSS: Linda Kloss, member of the Full Committee, member of the Standards Subcommittee, co-chair of Privacy, Confidentiality and Security Subcommittee and no conflicts.
MAYS: Good morning. Vickie Mays, University of California Los Angeles. I am a member of the Full Committee, Pop, Privacy and the Review Committee for Standards, and I have no conflicts.
CORNELIUS: Lee Cornelius, University of Georgia. Member of the Full Committee, the Population Health Subcommittee, and I have no conflicts.
STRICKLAND: Debra Strickland, XeoHealth, member of the Full Committee, member of the Standards Subcommittee, member of Population Health. No conflicts.
STEAD: Bill Stead, Vanderbilt University Medical Center, chair of the Full Committee, no conflicts.
HINES: Do we have any members on the phone this morning?
MONSON: Good morning, Jacki Monson, Sutter Health, member of the Full Committee, member of the Subcommittee Privacy, Security, Confidentiality, and no conflicts.
HINES: Denise Love stepped out, so we will move to staff.
HINES: Thank you. Just to remind our audience that are remote, we will be having a period of public comment at the end of today’s agenda. And if you were not on yesterday, there are two ways you can send public comments to the email box email@example.com or if you’re on the WebEx platform, you can use the Dashboard. And please remember to include your name, your title and your organization. In the room, we’ll have the mic open for the members of the audience.
And just a couple reminders around the mics.
Remember to turn your mic on and off when you speak. There’s just that one button with the person talking. You don’t need to use any of the other buttons on the phone and please speak directly into the mic so that the audio and the transcriptionist have everything as clear as possible.
COUSSOULE: Excellent. We will start off today getting right into our next session and our illustrious colleague Ms. Deb Strickland will lead us in this effort so I’ll turn it over to Deb.
Agenda Item: Discussion of Outcome Goal 3 – Regulatory levers – 6 recommendations & calls to action
STRICKLAND: Thank you very much, Nick. And I want to thank everyone here for yesterday’s robust discussion on a number of topics, some of which will converge today and it certainly does make my job a little bit harder because we sort of had a plan on how our recommendations were going to go but we have a couple monkey wrenches in the mix now. So we certainly will have some exciting conversation as we move forward. Go ahead.
LOVE: Denise Love, National Association of Health Data Organizations, member of the Full Committee, member of the Standards Subcommittee, no conflicts.
STRICKLAND: Great, thank you very much. So Outcome Three is our regulatory process and timelines. So I’ll read the five for those on the Webinar.
To improve the regulatory process and timelines at HHS to enable timely adoption, testing, and implementation of updated or new standards and operating rules.
So what we’ve talked about already is that we would impress upon HHS to take one year. We also understand that that’s a hard ask but if we were to get that, that would be a good start.
But then we had a monkey wrench where Chris suggested that we actually get out of the regulatory process. So we try to do this on our own and we try to advance the standards as innovations are happening, which personally I think is a better way to go. It’s less of a process and, as we’ve all discussed, this has really become a black hole in a number of our processes.
So now that we have sort of that monkey wrench in the system, we also have another twist. So at this point we’re talking about how are we going to make these transactions move through the process faster, be more predictable, be complete, and be tested and ready to go.
But we also have to consider where we are today and today we have a standards body that has 7030 in the works but is not yet on our doorstep.
We don’t have them ready to go, and at the point that they are ready to go, if we have to go through the process that we currently have in place, it has to go through the DSMO. It has to go to NCVHS and then it has to go to the Secretary until it pops out the other side, and we don’t have a timeframe for that at this point.
If, let’s just say, it was the year that we’re asking for then after that year then generally we have about 24 months to implement. However, knowing what I know, having been involved in the standards process for 7030, the changes for 7030 are not small. It’s been a decade since we’ve had another version and so the changes are a big lift.
So I think that we also need to consider, and maybe X12 can shed some light on, how they envision 7030 going. Was this going to go as a suite as we’ve done in the past? Or were they going to propose a standard approach to implementing this next version?
So we have a number of things that I’d like you all to comment on, one of which being what do we think about going outside the regulatory process. If we go outside the regulatory process, does that mean we strengthen up that DSMO to make that the HHS replacement? And if so, what does that look like? And also I would love to hear some comments about the 7030 dilemma.
So, I open it up to the group.
KLOSS: So, continuing our fashion of yesterday, your cards will be used to indicate that you have a comment and, surprisingly, Margaret put hers up and down already. So it will be Margaret followed by April.
WEIKER: Thank you. I am Margaret Weiker with NCPDP, for those that are joining us today. In regard to going outside of the regulatory process, today the NCPDP SCRIPT Standard, which is the e-prescribing standard, is not HIPAA mandated. However, it is mandated for Medicare Part D, so it’s included in their rules, and when we ask to have it updated they add it to their yearly rule, or one of their many yearly rules. But it’s only mandated for Medicare Part D.
But what the industry has done is implemented it throughout. It’s not just oh, I have a prescription, the person is on Medicare and they have a Part D plan or a Medicare Advantage plan so I’m going to use the SCRIPT Standard to send that transaction electronically. Oh wait, they’re on a commercial plan or a Medicaid plan, I’m not going to send that SCRIPT elect – that’s not how it works. It’s like it’s been implemented and it is therefore a de facto standard in the industry to use.
So I would encourage the Committee to look at how can we do this smarter, faster, better. And if that means lessening some of the regulatory or figuring out a way that we can move this forward quicker, that would be beneficial to the entire industry. And like I said yesterday, I remember a point in time pre-HIPAA, where people would implement the standard because it was the right thing to do, and not have it mandated.
LANDEN: I am sorry, Margaret, can I interrupt and ask you to describe a little bit more about how it is mandated for Part D?
WEIKER: They include it in one of their Medicare Part D rules, it’s one of the many things they do, and what they say is, the industry had adopted this.
There’s been changes and the industry now wants to move to a new version. Here are the transactions, and they list the transactions, because SCRIPT has more than just one transaction. So they list the transactions that you should be doing. They then give a date of when you must be compliant. They have said there’s no transition because it’s not backward compatible. But they list it.
They don’t do the impact analysis that you would see in a HIPAA rule because they don’t have to do it for where they are putting it in their Part D rule. But it looks a lot like it would look for HIPAA except it is only for Part D. There’s no detailed impact analysis around just that e-prescribing piece. There is some impact but it’s kind of all mushed together in their rules.
KLOSS: Margaret, building on Rich’s question, how analogous is this Part D effort to what we see with the promoting interoperability, Meaningful Use type EHR standards? Is it sort of similar because it’s just for Medicare and Medicaid, so they advance their regulations related to their own internal policy, in contrast to it being a national standard?
WEIKER: In regard to Meaningful Use, there were incentives where certain physicians could go and purchase a certified product, and it had to be certified to get the incentives. And there was a Medicare program, and then there was a Medicaid one too in regard to those incentives. So that was a little bit different versus here’s a rule and if you’re going to do a Medicare Part D new prescription you must use this standard, versus here’s Meaningful Use and here’s the criteria that your software should meet and it should be a certified system. And then you have to do reporting and all of the rest around that in order to get that incentive. So that’s a little bit different.
KLOSS: Thank you. That really gives us three different models to be considering.
TODD: I wanted to reiterate comments from yesterday that would encourage the Committee to recommend HHS move updates and new standards outside of the regulatory process. They could adopt a process that would accept recommendations from NCVHS with a timeframe associated with that and what I would put up as another example of how this works today is the core code combinations for CARCs and RARCs. There is a process that is specified but the requirements are maintained outside of the rule.
KLOSS: Got a lot of head nodding on that comment. All right, Danny.
SAWYER: Good morning. It sounds like what I was hearing was maybe trying to do this even before 7030, or 7030 being done outside of the regulatory process. My thought and recommendation is that we take the time to plan and check on that before we implement the Predictability Roadmap and some of the recommendations that have been discussed in this group.
And then the Predictability Roadmap be worked on so that it can take effect after 7030 has been named and implemented, or is in the process of being implemented, so that we can continue on in faster, quicker, smaller increments after that. And part of that is, if we get outside of the regulatory process, some of us depend on the regulations in order to get budget or to execute anything.
So at least in the current environment, without a regulation, without something on paper that tells me I need to do it, it will be very hard in the Department of Defense to justify spending the money on something that we’re not required to do.
NARCISI: Thank you. This is Jean Narcisi from the ADA. And I mentioned yesterday that there’s a huge barrier to just moving anything forward, and I believe that it is the regulatory process that’s the huge barrier. I think that the office that’s responsible for these is totally understaffed. Several recommendations have gone forward from the NCVHS Committee and also from WEDI, and maybe that’s the black hole because nothing comes out of there.
So I mentioned yesterday when we were talking about the DSMO that the DSMO has fulfilled what they were assigned to do, but if you get this out of the regulatory process, there needs to be some kind of a coordinating body that oversees this, and that’s the group that should have the authority to roll these standards out.
Now, that will take a lot of coordination with all of the standards developing organizations, NCVHS and also CMS, but this body should have the authority to review these standards, come up with cost benefit analysis, look at the business needs, and just determine do you need a policy, a guideline, a new standard, an operating rule.
Maybe you need to look outside of the box. Maybe not all of the information flows with just the X12 transactions, maybe HL7 has some standards like FHIR that could help to get this information where it needs to be.
But, my other thought is and the ADA believes that there needs to be a neutral organization over this Committee or group or whatever you’re going to call it, and we believe that WEDI could be that group to take on that work.
SPECTOR: Just some observations listening to the conversation. Going back to what Margaret was saying, there are discrete areas where the industry has been able to come together and have agreement without having something per se mandated on them. NPI is one. It was mandated for use in Medicare but pretty much across the board, the entire industry on the commercial side has adopted that and used that. The 1500, going back, of course, nobody wants to talk about paper, but that’s an example where Medicare adopted it and pretty much the rest of the industry fell in line with adopting that.
And so I think there’s some of these areas again that are discrete where there’s common interest for the industry to come together. The problem is in the situations where we don’t have agreement and so there are potentially multiple solutions for something, attachments. I think the reason why we haven’t seen attachments in the 10 years, 13 years since it was – since the proposed rule actually came out but we’re talking 22 years now – is that there just seems to not be this clear agreement on what is the best solution to do attachments. And now we’ve got these different ideas and we can’t get one industry approach to coalesce around that.
And that’s where, if we’re outside the regulatory process on something like that, we are going to need to have a strong, I don’t know if that’s external – or that DSMO replacement entity that can do this, but we’re going to need the pilots. We’re going to need the cost benefit analysis. We’re going to need all stakeholders being able to weigh in and have their voice heard about what is the right approach, because we need that industry consensus to come together on those issues.
TENNANT: Rob Tennant with MGMA. And I’ll agree with the majority of what Nancy said. A couple of things though. If we move ahead outside of the regulatory process, we have to be very careful. I agree, there may be certain narrow instances where that might work, but I’ll hold this up and say if somebody in 2006 had said that you would buy a phone that cost $1,000, you would have laughed at them. Now we don’t blink twice – okay, you blink twice but you still buy it.
The reason why we’ve moved to this technology is simple. There’s an ROI. There’s value, at least a perceived value. That’s I think one of the challenges that we faced here. So we look at 7030 and we scratch our heads and we say, it’s a better standard but is there really value. So we as trade associations, we’ve got to sell it to our members. Vendors have to sell it to their board of directors. Health plans the same. And there’s really no value established so the momentum never really builds.
The exception I would argue – I would argue with Nancy, is the 275. There’s a case where the industry got together and we met now about a year and a half ago, we have payers, providers, vendors, all of the major players all at the table in front of CMS saying we all agree on this standard and we want it. And it still didn’t work.
So there’s something missing. And I think part of the problem with some of the standards is we say, well, it’s not perfect and there’s something down the road that might be better. And not to cast aspersions on FHIR, but that’s been sort of the elusive gold ball down the road. And so we’re saying, well, let’s not put this standard out because we’re expecting this new one to be revolutionary. So we’re sort of hamstringing ourselves.
So I think at the same time as we want to be aggressive, we also have to be careful of moving outside of the regulatory process. And I would say I would be all in favor of moving outside the regulatory process if we all lived north of the border in Canada where there’s a single payer. We don’t.
We have a thousand-plus health plans, all of them with their own way of doing business. And so we have to be careful, because a large group practice will have dozens of contracts with health plans, and if each of them does things in a different way, it’s going to make things, if possible, even worse. So again, cautious with moving ahead outside the regulatory process, but absolutely agree that the process has to be improved.
KLOSS: Okay, so we have Gary, then Gail, then Brad, then Melanie, and Margaret, I just need to clarify, are you back in the queue? So Gary, you’re up.
BEATTY: Gary Beatty with X12. In a lot of respects when we look at the adoption of regulatory process that we have today, it has created the proverbial elephant in the room which we all call 7030 these days. Because everybody knows it takes so long to get through this process, everybody’s trying to get more and more things, that oh there’s something else we need to bring in to the implementation guides and so forth. So we are in this continuous process of trying to meet the industry’s needs, knowing that if we don’t, once we do get done with the 7030 it might take 10, 12, 15, however long it takes to get through the regulatory and adoption process.
As I mentioned yesterday, X12 is moving to an annual publication process for all of our work products, and so we’re going to go to an annual cadence for doing so. But if we’re stuck in whatever adoption and regulatory process, it takes years and years and years to do that. If we can’t get it to a regular, shorter timeframe, we’re still going to be trying to eat the elephant even in the future, even if the standards communities have made their processes shorter.
We have to make our process within the standards community faster to be able to meet the needs of the industry, but we also need to make sure whatever process we put in, whether it’s within the regulatory process or outside the regulatory process, has a regular cadence as well, so that we can continue to meet and adopt those standards that will meet the industry’s needs.
KOCHER: Thank you. Couple of things. First of all, some of the ideas that are being floated around are not new. In fact, I’m pretty sure Alix was on this side of the table the last time we had that conversation, as was Margaret and a few others.
So the one thing I would encourage the Subcommittee to do is to go back and look at some of the things that were brought forward before and look at the barriers to why you weren’t able to advance them at that time so that you can think about that on the front end as you try to think through the creative ways perhaps that maybe those barriers can be overcome this time around.
Secondarily to that, obviously less regulation I’m going to say is a good thing. I think most of us around the table would be okay with that but I really agree with Rob, I would get highly concerned about the variability. Contrary to what some folks believe, payers do care about their provider customers and I don’t want to see them having to do 15 different implementations. That doesn’t help them. It doesn’t give them time with their patients.
At the end of the day, this is all about the patient and the ways that payers and providers can work together. There’s also the unintended consequences of budget impacts, costs to your install base if you drastically make changes.
So I think we need to think through those things and acknowledge them on the front end as we look at the ways we can make this better. I’m not saying that we shouldn’t pursue them but we have to go into it with all of those things on the table and thinking through all of them so we don’t have those unintended consequences at the end of the day.
GNAGY: Quick question – how realistic is it to expect that we can affect the speed of the regulatory process? I’m curious. So – based upon the eyes and faces, if that’s not realistic – and I agree with Rob, I can go to my programmers and say, here’s 6020 or 6040 or here’s something that we can go with, why don’t we look at this? Well, is it mandated? No. Well, we’re not doing it, we don’t have time. We don’t have the resources. We don’t have the ability to go through there.
And yet when 7030 comes out, it’s likely I’ll have to hire more programmers. And part of that issue is I don’t have anybody left that remembers 5010. Seriously. And the guys that I can get right now, they don’t have a clue as to what that was. Paper what? Modem, huh? And it’s difficult to get them to understand how come it’s so much more difficult to be a programmer in the medical industry or an IT person in the medical industry when other industries have far fewer barriers to getting things accomplished.
So I agree, if it’s not regulated I’m not going to be able to go to my boss and say, here’s your ROI on doing this early. But what I might be able to do is scare them and say here’s what I’m going to ask for in three years when 7030 comes out. Maybe you should start trickling a little bit to me now. So those are the things that I have to deal with.
And I really think that if we’re talking about predictability, if we’re talking about looking into the future and seeing what we can do, we need the information that we’re all talking about here. We need the people that are in the trenches doing the work, we need the information from the community and right now we’re still way up here – we’ve built HIPAA and then here, let’s see how it works after we throw it down and the little guys are like, really.
So I really think that we need to address – and I don’t know how from the regulation side, but from the standards side we need the involvement. We need the information at a much lower level than we’re getting it right now, because you’re expecting us to now go back to the people that we represent, get the information, and bring it back to you. How long is that going to take? And how accurate is that when I’m consolidating all of this information to then pass to you a year later at the next meeting?
You need to get that information from the people and be able to store it because you can get the use cases. You can get all of the information at that level, whether that’s extensibility – and as soon as I say that word, I see everybody go ehh and I’ve heard the Wild West and I’ve heard all this. Well, what are we living in now? I mean the Wild Universe?
If you structure what we think is going to be outside of the standard, if we give them a sandbox to play in, they’ll play in it. If you say here is the only sandbox you can play in and it has walls and a roof, they’re not going to want to play in it.
So I really think that all of the predictability aspects require us to get good data from the level that we need it. Not aggregated data up to this level and then pushed up. And certainly not people at the top making the decisions and then disseminating it down and going, okay, what happened? What did we do?
Because that’s what HIPAA feels like. It feels like they threw it at us and then we have to deal with it. Now, we’re finally saying oh, well, let’s bring all the little guys in. I still think we’re too big. We need the little, little – we need everybody that’s actively doing these things because again disseminating data up through the ranks and then saying, okay, here’s the particular instances, you still don’t have what you need.
You don’t have the use cases. You don’t have the details. You don’t have the outcomes of those. Whereas if you’re getting all this information in a structured environment, you have it. And then when you are ready for predictability, you have all of that information and you’ve got all of the experience that you need in the data.
WILBANKS: I was thinking about 7030. When we did all of those changes, we had reasons for all of them. We had business cases. We had projected ROIs. But in many cases, that’s now five years old. And in some cases, I will bet that the reasons now have changed. It’s no longer what we need. The initial ROI we predicted probably needs to be updated.
In general, if we keep going through this regulatory process, we’re going to continue to have issues with determining what the ROI is, because now we have to take a second look at 7030, go through all the changes and say, is this ROI still valid? Has the industry done something different? Have we moved on?
So from a regulatory standpoint we need to figure out, I guess from HHS’s point of view, how we get through the process in a way where, for the features we put into the guide, we have the ROI. We know what it is. And we’ve gone through it quickly enough that it remains valid and the industry hasn’t changed.
COMBS-DYER: I pass.
WEIKER: I would like to ditto the comment that April made about the model that’s used in regard to the operating rule authoring entity and how those changes get adopted. That’s something that also should be looked at.
And then I agree with Gail about going back – I’m pretty sure though that’s been done – but to go back to what was originally proposed for some changes to the process.
KLOSS: I’m sorry, Margaret, can I clarify?
Are you talking about the 2006 White Paper?
WEIKER: Yes, the 2006 White Paper, Alix. I don’t remember the actual name of it though.
WEIKER: Oh, is that what it was called? So today, NCPDP can produce four versions a year but we decided to only do two. And we’ve had changes that have come through here that have been recommended but it has to go through that process. And the bottleneck is not NCPDP. The bottleneck is not the DSMO. The bottleneck is not NCVHS.
Until something changes with the regulatory process, whether it’s adding resources or whatever it may take to fix it, that really needs to be done. Because I have a change request that came through here – I know I sound like a broken record because you all have heard this 20 times or more – and it’s a change for one field from not used to situational. The entire industry agrees with that, but yet we can’t get that regulation out the door, and it has been six years.
We have states that are now putting in regulation and legislation around the opioid crisis that this country is in but yet they need that field to do what they want to do in regard to their programs because no two states are going to do it the same way. So we have to say there’s no way to do it? We’re waiting for HHS to issue a regulation.
Now, I have had people go can we just implement it anyway? And I said, I can’t tell you to do that. NCPDP cannot advise you there. You need to decide that. But I have states that say, okay, I can’t use this field. How do you suggest? So now we’re coming up with these workarounds that cost the industry a bunch of money, that they then in turn will have to rip out at some point, to implement these things.
Then we look at the electronic prior authorization request that was put forward to this group. And that’s been four years now and we still don’t have that.
And yet you look at the strategy. You look at all of the efforts WEDI is doing around prior auth and other organizations, but yet here’s something that the industry has agreed to, and the industry has adopted those standards and is using them. Though some may say they’re out of HIPAA compliance, they can’t use what was mandated in HIPAA because it doesn’t meet their business requirements, because it just wasn’t done back in 5010.
So until we get something changed at that point, whether it’s we go outside, we look at new avenues, we double the resources. We do whatever it takes, this group – and we can come up with the most predictable process but if it goes and it just sits there for four years, for six years, for 10 years, the predictability is gone. It serves no one. So, that’s my speech for this morning.
KLOSS: Thank you, Margaret. I know that’s a very passionate topic, and hopefully we will be able to thread the needle and find a way to address those barriers. I’m going to now call upon Danny.
SAWYER: I will try to also have a passionate topic. I believe, recommend, and suggest that the language in the law and regulations can be changed, and probably even within the next couple of years. But you have to have the will. Somebody’s got to have the will. You’re going to have to have probably an advocate or a champion, and whether that’s the Secretary of Health and Human Services or somebody else, somebody’s got to be the champion of this.
You also used to have – I know HMS used to have a day on the Hill where people would go and advocate different topics to their congressman and their senators.
I don’t know if that still happens. I haven’t been for a while.
KLOSS: Yes, in states and federal levels.
SAWYER: Great. We used to go to Richmond and talk to our house delegates there as well. So if there’s the will, and if you want to get an advocate or multiple advocates behind it, yes, the regulations can change, and they can change – okay, for Washington DC, rather quickly, within a couple of years maybe – to have some language in there where we can make that happen.
But I believe too, there’s got to be a plan. You are going to have to have – we’re going to have to have something not just to go and talk about a concept but actually what is it that needs to be done and maybe even recommended text. Something needs to be pretty firm where they can get traction on a specific idea.
So if we’re talking about changing regulations, there are ways to do that. But you’ve got to have the will to do it. Thanks.
KLOSS: Thank you. Just to be clear, in the queue I have Liora next, followed by Laurie Darst, Brad and Joe.
ALSCHULER: All right, so just listening to this, I have to think that what I’m hearing is that what people expect and will demand from a non-regulatory process is pretty much identical to what they would expect and demand from the regulatory process. It’s consensus. I mean the regulatory process has to reflect a consensus and be sure to have that assurance before it moves ahead.
It has to understand the ROI before it moves ahead and it has to lay out a roadmap that will be clear and feasible for industry that allows planning, long-range planning, mid-term planning and short-term planning for everybody who has to make the corollary investment. So it does seem to come down to a question of who is best equipped to manage this process knowing that the demands will be the same.
So looking back at some of yesterday’s comments, understanding that whoever is overseeing this, it needs to be a process that involves continual tests, piloting at every stage, so the determination of ROI and feasibility is not an add-on after the fact but is measured and reviewed continually, allowing updates before it becomes that proposal for final consensus.
And whether it’s a regulatory or nonregulatory body, there should be incentives for early adopters to participate in those pilots so that we can get the appropriate range of stakeholders engaged and be sure that we build into the design of those pilots a measure of ROI and the data that’s going to be needed for decision making and for rolling out a roadmap.
My final thought is that that roadmap has to consider that even in the standards world, where by definition we want everybody to do the same thing, there has to be room for competition in the sense of understanding that it’s a heterogeneous environment and people have a working system – sometimes called a legacy system, but a system that works is a good thing. They have to be allowed to maintain and continue to leverage the value of their investment.
But those who haven’t made that investment yet, who are coming into the game at this point 10, 20, 30 years later, they need to be able to adopt the standards that their programmers will be able to pick up and work with.
And so we need to make sure that our roadmap builds in equivalence and we’ve done this – where semantically from an information perspective, we have bidirectional trans – lossless bidirectional transforms between existing things like CDA and new things like FHIR.
And they allow this concurrency where those who have sunk costs are not losing their investment and those who would benefit by a quick start with new technology can do so. And it doesn’t create the situation where – you’ve heard many times – where now I have to accept all these different things. If it’s done right, with foresight for equivalence, then these can be compatible and we can build that kind of a roadmap.
KLOSS: Could I possibly clarify your point about the equivalence from a data semantic level? So if we, as industry, have defined the data that we need in the various transactions then it’s about making sure we’ve got the data harmonization happening, not necessarily the standard structure that we need to focus on.
And so if we, as a nation, adopted and recognize a claim or an encounter or an eligibility request, those things are already out in the ecosystem. It’s about how do we leverage that knowing that we’ve got the legacy and the emerging but you focus to that level of the data and let the – sort of the structure kind of soften in the regulatorily adopted ecosystem.
ALSCHULER: There are many precedents for this. If you think of it – actually the transition from paper to the first electronic transactions, they had to have the same information because there was concurrency. Some people were still on paper. So the end systems were expecting equivalent information.
And we have precedents of that in the standards world too. One of the first things we did when we brought XML into HL7 was build an XML version of version two. Now, nobody adopted it because they were already using version two in the same way.
With FHIR, we’ve built an equivalent to CDA – it does the common core from CDA that’s widely implemented, but does the exact equivalent in FHIR. So you can really have concurrent implementation and adoption without creating the problem of now I have to receive so many things, because you have lossless bidirectional transforms that you can use.
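The lossless bidirectional transform idea referenced here can be sketched as a toy example. All field names below are invented for illustration; this is not actual HL7 CDA or FHIR tooling, just a minimal sketch of the round-trip property being described:

```python
# Toy sketch of a lossless bidirectional transform between two
# representations of the same data. Field names are invented for
# illustration; a real CDA<->FHIR mapping is far richer than this.

CDA_TO_FHIR = {
    "patientName": "name",
    "birthTime": "birthDate",
    "administrativeGenderCode": "gender",
}
# Deriving the reverse map from the forward map is what guarantees
# the round trip loses nothing.
FHIR_TO_CDA = {fhir: cda for cda, fhir in CDA_TO_FHIR.items()}

def to_fhir(cda_doc):
    """Rename every CDA-style field to its FHIR-style equivalent."""
    return {CDA_TO_FHIR[k]: v for k, v in cda_doc.items()}

def to_cda(fhir_resource):
    """Inverse transform: FHIR-style fields back to CDA-style."""
    return {FHIR_TO_CDA[k]: v for k, v in fhir_resource.items()}

cda = {
    "patientName": "Jane Doe",
    "birthTime": "1980-01-01",
    "administrativeGenderCode": "F",
}
# Senders can stay on the legacy format while receivers adopt the new
# one, because converting there and back reproduces the original.
assert to_cda(to_fhir(cda)) == cda
```

Because each direction is a pure field-for-field rename, neither side loses information, which is the property that lets legacy and new implementations run concurrently.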
KLOSS: And it’s really good to think about the fact that the UB or the 1500 became the 837 – like there is a thread through all of our data needs. Good point.
Laurie Darst I think is next.
DARST: Speaking on behalf of WEDI, I can’t make a recommendation whether we should bypass the regulatory process or not since we didn’t talk about that. But what I can share with you is, again, we had multiple stakeholders. There were the payers, the providers, the vendors sitting at our policy advisory group, and the level of discussion and frustration, candidly, on this issue – as we talked about this, it wasn’t X12 or HL7 or FHIR or CORE and NCPDP that was the stumbling block for us moving forward.
It was this delay that we have with regulations coming out. And you could just tell in the room the frustration of this process. All this work is being done by all these different standard setting organizations and it goes nowhere and we can’t implement it. So I just want to reiterate from a multi stakeholder point of view, this is a huge piece. This is probably one of the biggest frustrations we talked about.
KLOSS: Brad, then Joe, then Rob, Mikkili and April. That’s who we have in the queue.
GNAGY: So what I am hearing is that there’s not really a hard roadblock to getting to a lot of the places that we want to get, that the speed limit is just zero miles an hour. So if change in the regulatory process or modifying it or getting to the point where it’s not an issue seems way down the road – we keep talking about things way down the road –
And one of the common themes I’ve heard from some of the technical people is timely data, good data, getting the information. Right now we’re not getting timely data. We’re going through the learning process. We’re obtaining data. We’re observing the information. We’re making decisions. We’re taking actions. And then we’re starting the learning loop again of obtaining the new data but that cycle takes us a year.
And so we need to implement something where we’ve got your standard – and maybe this is more of a question – if you’ve got this floor that we were talking about yesterday, and you have something outside of the floor that’s maybe not finite and structured to a specific degree, but says okay, you can include these other alternatives, you need to give them a path to get around their blocks.
If you don’t give them a path to get around their blocks, they’re going to choose their own path.
So let’s designate some paths – and I use extensibility because it can be built in – but I know that’s not a simple process. I mean obviously if I said okay, X12 go extend – you can’t come back tomorrow and it’s done.
So you need to give the people doing this an avenue so that then you know where they’re going. If you give them an avenue, they are going to go down that path and then they’re going to take the path of least resistance and find ways of doing it and that’s where you can get this information, get good data, get timely data that is absolutely relevant for predictability and then you start storing data.
And as you’re storing that data then you get a picture of what that data looks like over your time periods. And then you can continue through that process of learning and observing and changing and making your actions. And it becomes much more relevant instead of just saying here’s our floor, if you’re outside of this we don’t have a recommendation. I think we need to give them some avenues.
And I’ve actually heard, last night at a social event, that there were some things in HIPAA that allow you to say, okay, if I can’t do this I’m going to go down this process. I’m going to make this request. I’m going to do this. I’d never heard of it. And it’s likely that my lead programmer heard it and he just didn’t care.
He goes, yes, I didn’t want to do it so I didn’t tell you about it. And he’s a whole different beast. He is the super old – he’s been programming for 40 years.
He’s brilliant and even goes, I don’t want to do any of that stuff. In fact, he says – he goes, 7030? I will retire before that happens. And that will kill me, I’ll be honest with you.
BELL: I wonder if adoption could be sped up, because right now, for approval, we look for constant reassurance rather than looking for whether there’s any dissent.
It seems like Margaret’s example of everybody wants this but we can’t get confirmation at every level in any timely fashion because everyone’s so afraid that there might be something wrong. There might be somebody that isn’t on board with it or whatever.
I don’t like the term floor. Actually, our organization uses the word baseline which I think is a much better term for that. So that this is the minimum standard or the baseline and if you can improve on that, you should be able to improve on that, and that becomes your intellectual property.
And finally, 7030 scares the hell out of me. And I’m wary that we’re not looking at the implementation of that. We can have predictable standards, but if we can’t implement them in a predictable timeframe – I can’t tell you the number of people that have said, oh, I don’t really care about 7030 anymore because I will retire before – it’s not –
And then there’s certain transactions, or one in particular, that’s not backwards compatible. So there’s definitely some challenges if you don’t have regulations in place to say yes, this is the baseline and the next baseline when we up the baseline has to be – there has to be a transition from the previous baseline.
TENNANT: I wanted to circle back to a very intelligent comment made by Danny. I didn’t want it to get lost in the shuffle. We’ve talked about the process being broken. Well, the process, in part at least, was developed through statute and that might be a pathway for us to at least to improve the process somewhat.
For those who follow Congress, you know there are four committees of jurisdiction. There’s Finance and Health in the Senate and Energy and Commerce and Ways and Means in the House. And two things stand out about those committees. They tend to be more bipartisan than most and they have a great interest in health IT. There’s been hearings galore on both the Senate side and the House side.
So there is an appetite potentially if the industry could go to Congress with an idea about how to improve things. Congress is many things, but most of all it is reactive, it’s not proactive. There has to be a consensus developed on changing the process. And I think NCVHS could play a role by identifying some of the barriers that are statutorily imposed on the industry that could potentially be mitigated through legislation.
Understanding that the mandate in general is to report to the Secretary, it might be up to a group like WEDI to coalesce the industry around a narrow set of changes to the law to make that process better. NCVHS could certainly help in identifying some of those barriers and I would encourage that.
But I think Danny is absolutely right. You go to somebody – the incoming chair Frank Pallone has a great interest in this. Lamar Alexander, folks like that who could be a champion in Danny’s terms, that’s one way to address this because if we’re tinkering at the edges, ultimately we’re going to be hamstrung again by the law itself.
MIKKILI: We believe that this is voluntary adoption, and voluntary adoption, as Danny mentioned, is driven by great ROI and justification, and these are very subjective based on business requirements. So today, as Nancy also mentioned, about attachments.
If there is voluntary guidance or suggestions with these standards, it would help us to justify the case. But if we have to go ahead and voluntarily implement the new version, then that’s going to be costly. And then, in that case, we have to also maintain the old version and the new version. CAQH CORE does non-substantive requirements.
Those are voluntary so that could be also considered for the industry to adopt but overall we think it’s a good option to have a voluntary adoption.
KLOSS: April then Arthur and then I’m going to turn it back to Deb to start moving us through the slides.
TODD: I just wanted to provide some clarification on my comments earlier and some specific recommendations of how this process could work. I wanted to specify that the recommendations were not to bypass the regulatory process but to use the flexibility that’s already in statute to make this process work more effectively and that’s within 1101.
So the recommendation is to not put specific requirements in regulation but to specify a process. That process could be adopting recommendations from NCVHS. Already within 1104 there is a requirement for expedited interim final rulemaking. That is a 90-day process with 60-day comment. The rules become final unless there is significant public comment. That process does not require that specific language is in the final rule. There can be a cross reference in the narrative.
KLOSS: Thus that could be a tool, April, for a body – we could elevate the – we have the process work and then the final blessing to get everyone to move can come out just as an IFC.
TODD: Correct.
KLOSS: Arthur.
ROOSA: I wanted to ditto what April just said. She got there just before I was able to. I want to also just raise the issue that we’re talking about this idea of – in Joe’s words – a baseline. We talked about floor yesterday. One of my comments yesterday was to the issue that we need to establish that, and that, in my mind, is what is required of everybody to accept as a way of transferring electronic data.
Beyond that I said you could do whatever you wanted. I didn’t really mean it quite that openly. But it was really about that we could build on top of that baseline to build in what are our current and evolving business needs. But whatever they happen to be, that any trading partner could expect, that if they transmitted data in the format of what is a standard baseline that it would be accepted.
A concern that I’m having from what I’m hearing is that there is a feeling that flexibility or extensibility of format is a good thing, and it is a good thing only if providers, payers, vendors and whomever can depend on the fact that, if they don’t or wish not to support that extensibility or flexibility, they are still able to send the baseline standard transaction and have that accepted. Now, over time, that standard needs to evolve.
And we’re talking about that in terms of what is a regulatory thing. Is there some other agency or organization outside regulatory that can get it done? I agree with Rob that you sort of have to force this, that it has to be regulatory. We’re not talking about a system that has even a thousand participants. We’re talking about a system that has close to a million participants, so the only way to get that universal is to have it in regulation.
STRICKLAND: Okay, great. Thank you for your comments. This has been very informative. I know the topic has gone from one end to the other but I think that we’ve got a lot of very important feedback.
So the next slide, what we were talking about is basically the standards organizations and the operating rule entities publishing incremental updates to their standards and operating rules and making them available for recommendations to NCVHS on a schedule that’s not greater than two years. And NCVHS aligning its calendar with the SDOs’ and the Operating Rule Entities’ updates and reviews, and delivering its recommendations to HHS within six months.
And HHS would adopt the NCVHS recommendations on a regular schedule.
Now, all of the conversations that we’ve had, we’re taking into account that things may change and process may go in different ways. One of the things that I would point out here is we’ve talked a number of times about things like testing, certification and pilots. But if we’re asking the standards organization and operating rule entities to operate under a two-year –
So with that being said, if you look at the fact that 7030 has 432 change requests in it – 432 change requests in 7030, pre and post public comment. So say we started from a point where 7030 moves on – and this is X12’s decision – I could only see that X12 could allow an open door for change requests for a given period before they’d have to close that door to get them into the process, get them into the workgroups, get the narratives around it, all the language, get it tested, piloted and certified, however long that takes.
Now my other point is what does a pilot mean? So in past years, for 5010 for example, there was an entity that did sort of a backwards compatibility sort of look at the transactions and that was a paid service. But when we look at 7030, we don’t really have that ability and so when you look at the family of transactions that we’re going to send through here, you really need an end-to-end pilot.
Like you’ll need a member database, a provider database, adjudication system, all the way out to your remittance.
So who and what determines what a full pilot is? And who in the industry could do such a pilot? Like what payers have an exact replica of their adjudication process so that they can test these transactions in a timely fashion, get us back results that say yes, this works or no this doesn’t. We had a colossal failure over here or something and then go back and try to fix that.
So there’s a number of things that I’ll put on the table, one of which is how and who should define the pilot criteria. And what is the feasibility of these standards organizations developing new standards within a two-year timeframe. Thoughts?
KLOSS: Surprisingly Margaret’s tent is already up.
WEIKER: Should we go on?
KLOSS: We are having a little bit of a technical issue but I don’t believe that that’s the same as our internet broadcasting. That is separate for the people that were dialing in like our other committee members. So I think you can go ahead, Margaret. Thank you.
WEIKER: Thank you, Alix. I’m going to first address this recommendation that’s being displayed. NCPDP meets the publishing requirements of this recommendation, as I’ve stated previously. NCPDP’s standards development process allows new versions of existing standards as well as new standards to be published twice per year. The determination to move forward with an existing or a new standard through the rule-making process is managed through NCPDP’s regular review process, which includes input from NCPDP membership’s evaluation of the business needs and cost benefit analysis.
If the intention of the predictability timeline is to support recommending standards and operating rules to NCVHS, on a schedule that is not greater than two years, NCPDP would support an annual response from the SDOs to NCVHS to move or not to move to a new version or an updated version of a standard format. NCPDP does not support an approach that would allow HHS to mandate the SDO move to a new version of a standard or a new standard every year or every two years.
JAFFE: At HL7 there’s a publicly available standards development roadmap which details the development of all of our family of standards as well as a broad view of sunsetting of standards that are no longer under development. Moreover, we test our standards at least quarterly as part of the implementation process. For example, in January, we have invited the payer community to test the standards as part of the HL7 workgroup meeting.
Quarterly we invite other private sector and public sector stakeholders to participate in this process.
Lastly, FHIR has a maturity model in which the individual resources are identified from zero to five based upon their early evaluation up to level five which is that it’s been implemented in multiple countries in at least two different instances. So we would be eager to identify a testing process that could be formalized across SDOs.
KLOSS: Chuck, could you just clarify for me, when you say that you test them regularly, did you say annually or quarterly that you bring folks together to test your standards?
JAFFE: Yes, that is correct.
KLOSS: So you do that in like a connect-a-thon form? Or is it broad-based production – can you describe the testing a little bit more?
JAFFE: At each of the workgroup meetings and then at a fourth time in the year, we have a connect-a-thon like process. There are as many as 50 different entities that bring their specific use cases and needs to the table to evaluate whether the process and the specifications meet those needs.
CAMPBELL: Just to add onto that, having been a participant in that. I think sometimes people hear connect-a-thon and think of kind of like a hack-a-thon where people are just like, oh, let’s try this out. And a lot of the testing that goes into that is actually very rigorously tracked. You sign up for the cases that you intend to test.
There are people who come and actually validate that the test was successful. All of that is put into a big database that then can be used to generate a compliance statement. So there is a lot of rigor to it that I think isn’t always applied in the name.
KLOSS: I appreciate the clarification. In the queue I have Laurie Darst, Martin and Gary.
DARST: From a WEDI perspective, from a policy advisor group, we strongly support this recommendation, with the caveat that there needs to be a business need to move forward and that a cost analysis is also done against this. There’s always a cost with implementation and we want to make sure that the costs are justified with the business need.
WILBANKS: To get back to, I think, the original question Debbie asked – are there actors within the industry who would be willing to set up test environments for 7030 or for future releases? I think the answer to that is yes, with a caveat. While those test environments do exist, setting them up, maintaining them, keeping them up-to-date – that is an investment, and the industry I think would need some assurance that we would be getting continual feedback and continuing updates so that that test environment could be maintained, so those environments could be kept.
Because right now, speaking for myself, there are two ways the industry could do this: either set up a true second test bed for 7030 and the new releases, or lock down our current test environment for these, which means stopping our other development work so that we could test 7030. And that is a big ask, I think, within the industry to say, okay, let’s set up another environment or let’s stop your firm’s development so we can check out this mandate.
STRICKLAND: Right. And Martin, to your point, you would have to isolate your test environment, implement the entire 7030 suite and then test it. Now, also understanding if it doesn’t work, backing out all the changes for 7030 so that you can go back to your life and have your test environment.
WILBANKS: Actually, sort of a funny thing happened about two weeks ago. I was talking to my new business analyst and he came across a hunk of code and said, what is this sitting in our production environment. Should I take it out. I said, oh no, don’t do that.
That’s the original switch we built so that we could test 4010 and 5010 and have those pointing to two different environments. It took us a long time to build that code so don’t do anything with it.
We left it there so yes, just take out the 4010, we’re going to put in 7030. But yes, there is an investment. If it’s a new environment you would have to stand it up and then pull everything back out.
BEATTY: Gary Beatty, X12. To answer the point that you guys have mentioned a couple of times already, X12 is moving to an annual publication process, so we are comfortable that, as we develop and evolve that process, we will be able to meet that two-year timeline. As we look at our development process, within X12 we have several layers of quality assurance as we go through the process, both within our insurance subcommittee but also, as an X12 overall organization, we have a QA process from a technical perspective.
As we do go over our change request process, we also build the business requirements and business case for the modifications as we go in the future. Certainly, as we get into a regular cadence and we start looking at changes on smaller more incremental bites, it will be easier for the industry to digest those smaller volumes of changes on a more frequent basis.
I wanted also to address one of the comments that was made earlier about what we call our metadata and Alix had mentioned earlier that when we started our whole process, we started with the claim forms for example that we have in the industry, and even from an X12 perspective, a lot of our transactions all started with some paper base, whether it were purchase orders or invoices or health care claims and so forth.
And the data that was on those paper forms ended up in the transactions’ metadata, and that whole concept has continued on even within X12. It’s that we have our syntax – that’s our traditional X12 syntax – but X12 is also looking beyond that. We’ve developed XML based upon a couple of different schemas, one of which is actually currently out for a pilot project right now using the health care claim attachments, which are basically metadata equivalents of the 277 and 275 transactions. We’re looking at JSON. We’re looking at other technologies as well.
But the interesting thing, when you kind of talk about what’s in these transactions, is that regardless of what the syntax is, it’s all the same data and it’s all the same information being conveyed, just with different flavors.
So the metadata, if you will, is all syntactically valid across all of them. So, again, it’s really the metadata that’s important, not necessarily the syntax of the data.
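The "same data, different syntax" point can be illustrated with a toy sketch. The delimited segment layout below is a simplified invention in the spirit of X12, not a real 837 structure, and the field names are made up for illustration:

```python
import json

# One set of claim facts rendered in two wire syntaxes. The delimited
# segment is only in the spirit of X12; a real 837 loop is far richer.
claim = {"claim_id": "CLM01", "charge": 125.0, "last": "DOE", "first": "JANE"}

def to_x12ish(c):
    # Flat, asterisk-delimited segment terminated by a tilde.
    return f"CLM*{c['claim_id']}*{c['charge']:.2f}*{c['last']}*{c['first']}~"

def to_json(c):
    # The same facts as a JSON object.
    return json.dumps(c)

# Different syntaxes, identical information content.
assert to_x12ish(claim) == "CLM*CLM01*125.00*DOE*JANE~"
assert json.loads(to_json(claim)) == claim
```

The metadata – claim id, charge, patient name – is the stable part; the serialization is interchangeable, which is the point being made here about X12 syntax, XML and JSON flavors.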
KLOSS: Gary, can you elaborate a little bit on X12’s process related to pilots. So let’s just take 7030 as an example since it seems to be the next flavor we’re anticipating.
BEATTY: Right. The pilot that we’re conducting right now is with what we call our RS Schema which is a reduced structure schema. So it’s not the X12 syntax itself. It’s really the XML flavor.
KLOSS: I am really interested in the X12 syntax world in testing your standards. I think Chuck’s HL7 experience is sort of prompting my curiosity as to where X12 is in its own testing, within that two-year cycle. Is that something that you envision happening, and what are those methodologies?
BEATTY: Right now we do not conduct point to point testing. All of our testing is done internally as part of our QA process to make sure that, if you will, the nuts and bolts of the syntax are all met. As Martin said, there’s an investment that needs to be made within the industry to actually facilitate organizations to be able to set up test environments and I think somebody earlier referred to that as a test bed type of environment.
We haven’t made that investment yet to be able to do that as an outreach type of organization. Usually we depend upon the people who use the standards to actually go through and prove their case.
KLOSS: Thank you. Gail.
KOCHER: Two things. One, I just want to remind us all that, depending upon the particular standard we’re talking about, there is a difference between the base standard and an implementation guide, although I think collectively we’ve been calling them standards during our discussion. So that may impact some of the thought processes depending upon where that falls.
I also wanted to ditto Laurie Darst. The concept of implementing on a routine basis updates is great and I get that the IT guys are all like yes, that’s cool. But every time we do that, that takes business resources away, whether it’s people, coding time, et cetera, and if there isn’t a real business need and ROI from that, that is detracting from other things that companies need or want to do that perhaps may be more strategic or business related. So we really do need to balance the business need against just making a change for the sake of making a change.
KLOSS: Nancy, then Brad.
SPECTOR: Ditto Gail and Laurie, and to sort of piggyback. When Gail talks about it takes dollars away from other businesses, business projects, business on the physician side, it takes dollars away from patient care delivery. So I just want to underscore that at the end of the day we are supposed to be a health care system delivering care to patients which again, we all are as well and so we want to make sure we’ve got the right dollars going to the right priorities.
KLOSS: Thank you for regrounding us on what this is really all about, Nancy. That’s a really important point. Brad.
KLOSS: Let us do a little time management. It’s just about 10 minutes to 10:00. Why don’t we think about when you want to take a break, Deb.
STRICKLAND: I think we can skip the next slide, because it’s basically HHS should adopt incremental updates of the standards and operating rules, and we kind of had this dialogue already.
KLOSS: Surprisingly, tents are already out.
WEIKER: I was like – don’t be skipping one of these, because I have some comments. Imagine that. In regard to recommendation number 10, NCPDP agrees HHS should adopt incremental updates on a reliable, predictable schedule. A process that abbreviates the regulatory requirements for incremental updates would be supported by NCPDP. SDOs need an incremental update process to appropriately respond to new legal and regulatory changes as well as unanticipated industry business needs that require immediate attention. For example, quantity prescribed.
KLOSS: Just six years in the making on that one. So Deb, anybody else have comments before we move on to the next slide? Or do you want to go to break.
STRICKLAND: No. I think we can start to talk about the next slide. I don’t know that we’ll get through it.
So the reason I wanted to sort of look at this slide – and we’ve talked about this through yesterday and today – is, when we talk about the baseline, what I’d like to get feedback on from this group is what do you think a baseline would be? The baseline standard or the baseline TR3 implementation guide? What are thoughts on what the baseline should be?
WEIKER: NCPDP doesn’t have a standard and then a separate document for an implementation guide. We used to do that but it’s all one now. So when we talk about an implementation guide or we say standard, it’s one and the same, so we don’t split them like X12, which has an underlying base and then builds an implementation guide on top of it. For NCPDP, it’s just one document.
So, in regard to this, I think this is almost a two-day meeting in and of itself, to talk about what we mean by a baseline. And then when we say, oh, well now everybody should be here but we’re going to adopt this but you don’t have – all the little nuances around this really need to be fleshed out.
KLOSS: Would you include the backward compatibility topic as a part of that, Margaret?
WEIKER: I think you have to.
KLOSS: Arthur, then Liora.
ROOSA: I think actually the two documents should be the same across the board. Just my thought.
As far as the backward compatibility, I disagree. I don’t think that that should be part of this discussion. My thinking of a baseline is a standard which is basically a configuration of how you are transmitting data and what that data means, consistently applied across the industry. Again, if any trading partner within the industry transfers data to any other using that standard, it will be accepted. There are lots of reasons for having additional flexibility and extensibility and other ways of getting that data, or getting more comprehensive data, from partner to partner, but I believe that the flexibility that was originally built into the 4010 is one of the problems that the industry has run into. I remember, going back then, that our shop basically said that we went from supporting a thousand different formats to supporting one format in a thousand different ways.
And we really need to – I think it’s important that the industry comes together and says, okay, this is – and I think, again, regulation that this is now the standard that we have and everybody can expect that they can transfer data and it will be accepted and processed correctly under the standard. And then, on top of that, is the chance for innovation and growth in terms of what we do.
ALSCHULER: I agree that this is a topic that is not easily defined in a paragraph. And from my understanding, primarily within HL7 and a bit of acquaintance with the other SDOs, it’s done differently.
In fact, within HL7 we’ve had different concepts of this, how this is done in version two, in CDA and in FHIR. So I would encourage some concerted attention to what this means because there certainly – my observation is that having a base standard that is highly stable over a long period of time is a good thing as long as there is the capacity to continue to build upon that.
CDA is in release two which was passed in 2005.
We’re doing an incremental update of 2.1 now, maybe it should have been done a little earlier but it was very important I think to industry to have that base. But CDA is an architecture by intent that established a base and anticipated that there would be implementation guides built upon that with variation. But that’s a concept that evolved further with FHIR and really is critical to get agreement on what’s best.
KLOSS: Pat and then Nancy, Brad and Gary and – April, I can’t tell is it Rob or April – then Rob.
WALLER: Pat Waller, Cambia Health. So I do think the baseline is a long topic. When I think about the baseline and incremental adjustments over the baseline, I start to think about the DDE requirement and how you measure back to DDE so that you have equivalency. You don’t disadvantage people one way or the other between wanting to use the HIPAA standard versus DDE, and the alignment for content. So I think that needs to be considered as well.
SPECTOR: I think I am going to be echoing some of what has been said, and even some of the things that were said yesterday. Definitely we need to better define floor. It raises many questions. Is this about variation of an adopted standard, allowing people to add elements that aren’t in the adopted standard, versus is it saying, no, anything that’s adopted has to be maintained? But the floor in a sense is that sweet spot, whatever you want to call it – that is the floor, and then if you want to innovate with other pieces that aren’t part of that, then that’s the innovation area.
Going back to, I think Gail, you raised this. If you are allowing this variation above a floor, is that noncompliance or is that just variation? And that becomes a big question. There’s the issue about once you start having these variations, from the physician provider side, you’ve got to support all of this.
But even so, thinking about it on the payer side, when you – coordination of benefits, you’re going to have to be supporting this as well. And how much do we want to open the industry up for all of these mini one-offs and needing to support all of that.
And again, just going back to the physician perspective on this. Physicians aren’t always in the position where they can negotiate strongly with the payers they are working with and there are times where they are essentially pushed into having to support something that is not administratively efficient for them.
GNAGY: I am going to ditto Liora. The baseline, the floor – I mean the foundation. What is the key principle of all those things? It’s stability over time. And that is absolutely key. Backwards compatibility should be a consideration, but the longevity of the floor, the stability of it, is key.
BEATTY: Gary Beatty, X12. I would like to ditto Margaret’s comments as well. X12 is a little bit different because we are a cross-industry standards organization. So we look at what’s the floor and the ceiling.
From our perspective and getting back to the comments I had made earlier about the metadata, the metadata really is the standard. And that standard can be represented in a lot of different syntaxes.
Even when you look at our current environment, where we have the implementation guides and TR3s, we also have the DDE exception, where organizations can use direct data entry to conduct the transactions. The measuring stick for compliance there is: is it compliant with the metadata that’s in the implementation guides. So that’s really, if you will, where the floor starts.
So again, back to the contention that the data is really the foundation of what’s being conducted. And that includes not only the data but the semantic relationships of the data and the business requirements that are specified within the various implementation guides –
KLOSS: Chris. I apologize. It is Rob. I didn’t look correctly at my list.
TENNANT: I wanted to ditto Art’s comments regarding the variability of this one standard. I’ll have to admit, I look back at a PowerPoint presentation I gave to our members as 4010 was rolling out, and in it I have a broad, big font bullet point and I’ll ask Joe to cover his ears, I said the clearinghouse use would probably decrease. Boy, was I wrong. Joe continues to drive a Ferrari.
We’ve moved to 5010 and clearinghouse use has only escalated. There’s just way too much variation in these standards. We can move to 7030 and there may be some great ROI and technical advances, or we can move to FHIR, but ultimately provider practices are simply going to drive all of these transactions through a clearinghouse, so why are we even bothering to change the standard? They are still going to use old versions, whatever they have in their practice.
So we have way too much variability. Imagine if other industries had this type of variability – if you moved state to state and the roads were different and they used different street signs.
So I think there’s an opportunity here to try to collapse the floor and the ceiling so they are a little closer together, so there’s less variability. Does a health plan really need that extra data element that requires a provider to go through a clearinghouse? I don’t know, it’s debatable.
We had the same issue a few years ago with the issue of health plan credentialing. Every health plan had its own form to fill out to be credentialed as a provider. And eventually CAQH said, well hold on, really there’s not that much variation. Maybe we can all use the same form. And we got Proview and life is much simpler for providers.
So maybe we need to look at that type of approach for these standards as well.
KLOSS: And Chris.
MUIR: I support the whole concept for setting a floor or a baseline, whatever term you would like to use. And then I think that there’s a couple things.
First of all, I think as long as everyone has to be able to, as a minimum, use that floor to transact, have that ability to support that. It allows the industry to try out other things. It allows for that innovation and it kind of telegraphs where the industry is going. And so in that way it will provide some predictability. You can see the different standards that are starting to become more and more popular and so I like that.
I have a couple notes. Backwards compatibility – I think it’s an important consideration but I don’t think it’s the end all. I think sometimes we make leaps that are really good but you really can’t go back.
KLOSS: April, then Melanie.
TODD: I just want to make a general comment that what we have in our response to this recommendation is that we support the concept of a floor, and that the standards and the existing operating rules serve as that floor. Where there is advancement beyond the floor that does not require any modification or effort by a provider, we would support that as well. For example, under many of the CORE operating rules, our participants are going above that floor in terms of connectivity and response time requirements – for example, instead of a 20-second response, they are doing a three-second response. That is encouraged.
But for things that do require a provider to do something different, we would recommend that this recommendation include a requirement that if there is going to be something that is tested or something that is new that there needs to be an agreement between the trading partners that they will agree to that so that providers are not having to support multiple implementations.
KLOSS: My perception is that next is Melanie, then Joe and then Chuck but there are a few more tent cards up so I just want to make sure.
COMBS-DYER: The CMS Center for Program Integrity supports the floor concept for the reasons outlined by Chris. And I would just like to point out that at least if I was reading it correctly, this is a recommendation that you guys are proposing for 2021 to 2024. It seems like it could happen a lot sooner than that.
COUSSOULE: Let me make a comment on the timeframes. When we first laid out the – kind of the swim lanes, if you will – it was as much about a thinking process of how long it might take to get in place as opposed to when it would start. So just keep that in mind. It wasn’t saying, well, if it says 2021, we don’t start anything until then. It was more that, as we noodled through some semblance of practical reality on some of these things, we thought that was when we might be able to effect it as opposed to when we might start it.
KLOSS: Amen. Joe.
BELL: First I do not drive a Ferrari. (Laughter)
BELL: So far at these associations I’ve been called a doctor and accused of driving a Ferrari; neither is true. Backwards compatibility. The reason that is an important topic is that traditionally we’ve always had to support two different standards or transactions at the same time – dual use – as different people adopted different timeframes. The problem with not having backwards compatibility is that, I believe, a lot of people choose to use clearinghouse services, and clearinghouses actually talk to other clearinghouses. In the ecosystem, these transactions pass through a variety of entities.
If you don’t have backwards compatibility, the level of transaction you have to send is basically the lowest common denominator in this food chain. Sometimes the provider has no idea it’s going through two or three different people. It just comes down to: if there’s no ability to step up or step down, then you basically are driven by whoever is driving the lowest common denominator, and sometimes you don’t even know what that is.
KLOSS: Chuck and then Gail and I believe then we will go to break.
JAFFE: I wanted to emphasize what Joe had said and take it from a different perspective. Backward compatibility drives value and diminishes cost. It ensures the end user and the implementor that there will be a continuity of technical capabilities that cannot be achieved without it.
KLOSS: Gail, then Suzanne then we’ll go to break.
KOCHER: I just wanted to reiterate my comment from yesterday that we have a little bit of concern with using the term floor and baseline. Unfortunately, I haven’t thought of a better term for that yet. If I do, I’ll let you know.
But we want to make sure that it doesn’t create the capability of more variability because we all want to have less variability because that’s what is going to drive down the costs. The other thing, I think when we have that two-day conversation that Margaret indicated, and I think she’s right about that, we need to make sure we’re also talking about data content. It’s not just about are we doing version A and B or versions A and one. It’s about what is the data content and how does that impact this floor or baseline or whatever term we end up using to define that.
LESTINA: Basically I am going to echo Gail’s comments. Just very quickly, I would also echo several of the other commenters’ requests that we carefully define floor and baseline, and also where the line exists between innovation and, really, the ability for noncompliance.
KLOSS: Supporting the installed base versus those who want to innovate, and having a trajectory between the two while managing variability and unintended consequences. With that light thought, I’m going to propose that we go to break, and we will be back in 15 minutes, which will put us at – how about 10:25, folks?
KLOSS: I realized as we went to break that as I closed out that section I missed a tent card going up. So I would like to give Michael Herd the opportunity to make some further comments. And Mark also would like to do that. So Michael and then Mark.
HERD: Great. Thank you. I appreciate it.
Michael Herd from NACHA. First I wanted to echo some other comments about making sure we’re clear about what it means to be a floor or baseline that we’re moving away from. And I wanted to offer a cautionary tale in this regard, because the final rule in 2012 that adopted the EFT standard used language that came about as close as possible to saying that the standard’s not really a standard – that entities were free to use other forms of EFT. And I think that’s had a pretty negative impact on providers. Many took it as a green light to use things like virtual credit cards to make EFT payments, and we have been struggling with that issue ever since. If you recall some of the conversation yesterday about requesting guidance from HHS, it’s been on that topic primarily, and it has been unresolved ever since. So again, that kind of a story to emphasize: let’s make sure we know what we mean. Are we moving away from a standard, where a standard is not a standard? Or are we doing something that is complementary to a standard that is in place?
GINGRICH: Just to add a little bit taking the examples that Margaret had used earlier. For me, I do believe in innovation and a way for standards to emerge but say for instance, with NCPDP, we have an incremental publishing – twice a year we’ll publish versions of SCRIPT and the standards within NCPDP. If we as an industry – say there’s the baseline that’s out there – if we as an industry believe, for instance, that prior auth for medical or for prescription benefits is something we need as an industry, it’s defined, it’s out there, we should be able to, not just pilot but be able to roll that out if the industry agrees that that’s important to us.
And at some time in the future, like now, we’ve recommended that as a standard for drug prior auth and it is getting rolled. So it’s out there for recommendation. We’d love to have that as the baseline. Not to say that another version of that comes out – and again, it’s a new emerging standard within NCPDP. We may recommend another version. We as the industry should be able to roll that out.
The same thing with these recommendations around telecom. A field has to be changed. Well, the industry totally agrees with that on all sides of the network, we should be able to roll that out without some kind of mandate. So that’s – for me, that baseline definitely everyone has to meet that level but when you have business value moving forward with a new version you should be able to do that as an industry.
KLOSS: You made a comment about no mandate. I’m struggling with the sense of our industry needing, like, a hammer or a gun going off saying okay, now, everybody, let’s go here. We still have to effect that kind of change. Any thoughts on that, Mark?
GINGRICH: Right now we do have a mandate.
It’s in the final rule moving to the next version of SCRIPT, 2017071, and per the regulation, we as an industry are supposed to move to that new version of the standard on January 1, 2020. Millions of transactions flow daily – ten-plus million scripts flying daily – and we’re going to flip a switch for the entire industry and flip it over on one day. That doesn’t make a whole lot of sense, and we’re figuring out how to rationalize that. A lot of it is through broad testing and rollout. And we, as the network in the middle of that, will facilitate.
We put that version of SCRIPT forward – I don’t have the math in front of me, I had it for the last meeting – but the last version of SCRIPT that we had out there is like 11 years old. The version we’re putting in is already two years old and it’s being mandated in 2020. It’s July 2017, so by the time it’s required to go in, it’s already two and a half years old, and we’ve moved way beyond that. It hasn’t been tested.
So that’s the other thing. We should be able to determine the value upfront, before this is going in, and by the time that goes in it’s already two and a half years old and the last version – I’m trying to think 120, I had the numbers before, I think it was 2008 or 2010 is when 10.6 was actually defined. So it’s just crazy.
So for me, if 10.6 is the baseline, we should have been able to move forward with 10.8, 11.whatever, as an industry if we were ready to do that and we just haven’t been able to do that. So we’ve piled up all this technical debt because we were not permitted to move forward with something that we, as an industry, believe we should be able to move forward with. I’ll stop there.
KLOSS: Arthur, I think you are up sir.
ROOSA: I am guessing that we are going to need more time on defining what we’re talking about with a baseline or floor. My thinking is mostly along the idea of being consistent. That there is – I’m beginning to feel that some of these thoughts about a baseline is this is where we are and that’s where we’re going to stay and we can build on top of that but we’ll always be able to exchange data at this level.
And my thought is that that’s not true. That baseline can, and should – in fact needs to – evolve. But it’s consistent; all players are working from that same book, so that, again, any one provider, payer, vendor in the country can exchange data reliably with any other provider, vendor, payer.
The pharmacy world has had some success – well, more than some success. They’ve had success in what they’ve done. Their model is a bit different. They are more sort of like in the retail area. A lot of their data interchange is in the sense of well, this is what I want and oh, this is what I want to provide you with and that type of interchange.
In the physician world, that data interchange is a patient saying this is what I want. The provider is saying this is what I want to provide you with and then the data interchange is to a payer that says, oh, by the way, I want you to pay for it. And the payer has a different incentive and different motivations than the interaction between the pharmacy and the physicians.
So there’s more of a need, in my estimation, there’s more of a need for actual regulation in that area. But the baseline, as I see it, is something which is established within regulations which has consistency in terms of both structure and content that all trading partners can rely on.
BANKS: We all look at this through our own lens and terminology is very variable as we find everybody looks at this in a different light. When we looked at this, we thought adoption of a floor or baseline was quite simple and it’s not. But that was how we looked at it.
And that 5010 is the baseline, right? And the compliance on 5010 is what’s in place right now.
And I think the other standard-setting bodies are probably a better example than X12 because they publish incremental and X12 has not published incremental since 5010. So if we take Margaret’s example of NCPDP, where the published standard, whatever it is right now, and they put a new one out next year, whatever is published and final would be the next incremental version that you would still have to remain compliant with if you were going to go above the floor.
And I can only speak from an X12 perspective because I’m more familiar, is if X12 had a 6030 that was published or had a more incremental version, to move people to use that is going to be very difficult because it’s very costly. There has to be a great business need to test it but that’s what it does. It tests it.
But to me, in my simplified mind, there would still be compliance because you have to be compliant with the implementation guide and standard. And so I thought this was kind of a clear way of being able to encourage innovation – give you incentive to go to the standards bodies, put in implementations, if you’re going to be able to use it in a year.
Because now we can meet client needs quicker, faster and we keep having that testing field because when you look at this, and one thing we didn’t talk about was the levers. If everybody was getting the data they need to do their job in a streamlined workflow we wouldn’t be here today. Is that true?
So I see the mandate as the capital getter. The only reason in health care the mandate is important is it’s the way to get more money. And when we have a huge span – just because of the regulatory and the issues, we’ve got 5010 to 7030. That amount of money, trying to get it in order just to put one standard in is going to be very challenging.
And now what’s compliance? Well, compliance is the arm – if it’s truly implemented – and enforcement is to get that consistency so that there’s that lever. You’ve got to be compliant or you are going to pay a penalty that’s going to be more painful than being noncompliant. But that arm has to be there.
But now what is truly usage. Usage is none of those. Usage is who is willing to do it. And so if you have all the Buka(?) that choose to go up one version, how many practice management systems, how many clearinghouses are also going to move to get that enhanced value if there’s a business need?
It’s not about the mandate. It’s about how many people are willing to use it and test it. It’s the market. And so I think when I really love your comments because it’s how do you get the data where it needs to be in the right place.
And one thing we didn’t talk about – and you know I always talk about is the convergence or the harmonization of administrative and clinical data. We’re talking about claims. We’re talking about – for X12, we always defer to you guys. I don’t know why but it’s the way it is. All the organizations are working on different pieces of it.
And the question is, what is the streamlined workflow? And I would argue there’s only one streamlined workflow and that’s across administrative and clinical data. We’re able to move data with metadata, with the different syntaxes. We’re able to put data in all different places within the workflow between the consumer, the payer, all the intermediaries and the provider. This is not just payer-provider anymore.
And why in the world is the eligibility not going to the patient? I would love to know if someone is asking eligibility on a service. That could be good fraud detection. But I digress.
So the point is we’re making this so complicated, and maybe we need this body – that we’re not sure we should have a body – really looking at where the future of information exchange is, recognizing that for the billions of claims that get transferred, it has to be tight. That’s how you are cost effective in exchanging the information for a claim.
But when you’re looking across the board, what does the provider need, what does the consumer need? How is metadata – is there variability in the syntax that needs to be thought of. Again, not today, because we’re working in today’s environment but who is thinking about tomorrow. And I don’t think it’s just HL7 and FHIR and block chain.
These conversations are important to us. Are these HIPAA handcuffs going to be on us now for five or 10 years?
Brad’s programmers, where are they? If they’re not going to program in X12, what are they programming? What’s the next best thing? What is the cost efficient –
Again, not a conversation for today, but if we’re going to talk about this, we’ve got to start separating between today and what we’re working on – our move to 7030, which is frightening for everybody – and what’s past that.
And so incremental – when I look at this for X12, I’m looking at 7030 as a floor, assuming it comes mandated. And then if there’d be a 7030 – I don’t know how the mechanisms go – or 7040 that’s published, which then has all the feedback, that would be your optional upgrade.
By the time regulatory upgrades the floor, it’ll probably be five years and then you continue that process. Now you have a testing end loop process. But we’ve got to remember we don’t want to stifle clinical information exchange and the metadata needs that possibly need syntax in a different format. But that’s way over my head.
COUSSOULE: I am going to take the chair’s prerogative for just a second here. I want to get us to the next slide but I want to tee up a little bit with a question. We’ve spent a lot of time and heard a lot of usage of the terms industry getting on board and business value getting on board.
There’s an interesting one for me when they say industry does that mean willing participants or everybody. Big difference there when you start talking about moving a standard or moving a base or whatever the terminology you use.
Secondly, you say business value; business value is very independent, because what works for me might not be the same for you. And so it is interesting, the thinking process about how we move forward with those two kinds of terms, which are almost incongruous from the standpoint of: if it’s some, it’s voluntary; if it’s all, it’s not. Can you get everybody there if it’s not voluntary? Or if it is voluntary? So I think there’s an interesting –
I’ve asked to advance to the next slide and address recommendation 12. So I didn’t mean to interrupt, Deb.
KLOSS: I am going to interrupt again because we do have a couple people with tent cards up. Now they’re going down. But I do believe our esteemed chair of the Full Committee would like to –
STEAD: I am a generalist in this space and as I listen to this conversation and the different elephants in the room, one of the questions I just would pose is in the administrative transaction space, is part of the problem that we’re trying to put too much in a transaction? Because we’re still largely coming from a world that was paper based and you either got it in the transaction or you didn’t.
And I’ve heard – it sounds to me like some of the conversation around 7030 has that same business of how do we cut it off and move on knowing that we could maybe have a quicker next step. I think the clinical standards world has begun to move in a very different direction toward, if you will, smaller service-based use case kind of work.
And one possibility would be that we could marry the two approaches and we could have a much more limited and probably, therefore, much more standard, administrative transaction that might be so basic it wouldn’t need as frequent updates.
We would then be able to obtain the other information you need in a more flexible, service-based approach which could be built up from these use case-based combined definition testing processes that get something so that when you’re making it the standard, you’re simply promoting something we all already know works.
And so I just put that out as a thought. I don’t know if it’s conceivable as part of the way we might come at this or not. But just hearing these conversations that we’re talking across these two different approaches and maybe I should just put that piece of the elephant on the table. Thank you.
ALSCHULER: It is a pleasure to build upon that. I was just drawing a spectrum of universal and local because ever since I’ve been in standards work I’ve seen it as this tension because all implementation is local. ALL implementation is local. And every standard has to be universal to some degree.
So there’s just an inherent tension in what you’re trying to do every time you create a standard. And what you’ve pointed out is that some use cases are inherently easier to universalize and the compromise is less damaging. You can get a very very tight agreement.
We’re using the same metadata set to move FHIR documents as was developed for medical documentation messages 30 years ago. It has not changed very much. All of the syntax and environment around it has changed so some use cases can be inherently tight so I think that’s really a tremendously important point.
First, that sometimes you can get really strict and you need to be really strict and then other times you will have to live with the fact that every implementation is going to have some different needs. And if you try to impose the same degree of stringency, you will not get a useful consensus even if you do get compromise because then you just push the variability outside the standard.
And there are various ways standards organizations deal with that, and you’ve suggested that that be taken into consideration upfront. Looking at it from an administrative-clinical perspective, as Tammy raised, I think is very useful, and looking at the leakage between them – the attempt to meet the absence of clinical data, which is increasingly needed, through mechanisms and standards that were built for tighter use cases – is probably a key consideration.
KLOSS: The thing about raising your tent cards is great, but it doesn’t go into the transcript. So I’ve seen dittos from Brad, Mark and Mike. Would you like to make some comments?
BARLOW: No, I wanted to ditto.
KLOSS: I tried to make sure it was in the transcript. Please others that may have comments about the last couple of insights and the elephant in the room? Deb, would you like to take us into recommendation 12.
STRICKLAND: So moving on to recommendation 12: HHS should enable voluntary use of new or updated standards prior to their adoption through the rulemaking process. Testing new standards to enable their voluntary use may be explored by testing alternatives under 162.940, Exceptions from standards to permit testing of proposed modifications. The purpose of this recommendation is basically to enable innovation.
So under today’s regulatory environment, to legally use a new standard, a covered entity must request an exception under 162.940, which must be granted by the Secretary. This is the case for modifications to an existing standard as well.
So in a 2009 proposal, a group suggested that standards under development be announced through a notice in the Federal Register. They suggested the notice enable early adopters to use the new version once it became available but before it was officially adopted through the notice and final rule process. Any comments on this recommendation?
GNAGY: That just sounded complicated. And I can tell you, programmers don’t like complication. If you don’t make it easy for them to get to this information – to innovate, to bypass the current constraints – they’re not going to do it or they are going to do it in their own way.
GINGRICH: We believe in the voluntary use for updated versions of standards, like I said before. But don’t believe that needs to be regulated.
TENNANT: I would take the opposite approach.
I think we’re certainly in favor of opportunities to test new standards but it absolutely needs to be regulated. We believe that there should be a process in place for application. So a health plan, for example, would make an application but the term willing trading partner would have to be carefully defined so it couldn’t be done through simple contracts. So a health plan couldn’t force providers to use a different standard based on a contract provision.
We’d like to see there be a partnership. So a health plan, provider and vendor would go in concert to HHS and say, here’s what we plan to do. And part of that contract should be that there be data metrics – a report issued after completion of what would be essentially a pilot – so the industry could gain knowledge from the pilot.
But again, it has to be regulated. At the same time that a health plan, for example, would be trying this new standard, they should also be required to support the existing standard for other providers in their network.
SPECTOR: I wanted to ditto that officially.

KLOSS: Nancy is dittoing Pat.
WALLER: So Cambia supports the piloting of new standards. The process has been there for years. I’m not sure that it’s ever been utilized and so we’re sitting around worrying about how to get to 7030 faster. We actually have a method to get there.
Additionally, I would recommend that HHS publish anything that’s been approved through the process, the parties and what they’re testing so that if someone wants to join that process they can. So this is the avenue, if we want to move a little faster than we are today, and NCPDP could have moved faster using this. But we have to make sure the administration actually is on board with us, approving these and supporting so that we can move faster.
WEIKER: NCPDP agrees with this recommendation. One of the things about this particular section, 162.940, is that a lot of people don’t know about it, so we might need to do some marketing about it. What’s the process? What’s the procedure? How long should it take between submission and getting an answer back from the Secretary?
There is a reporting requirement when you get this exemption to test. You have to give the results of the test, because part of it is you want to do X to prove Y, kind of thing. And this was invoked out in Medi-Cal, and it went back to the use of the UPN versus actually using, like, a J code or a HCPCS or something. It was to use the UPN.
And MediCal did ask for this. They did it. They got the report. We thought they were done and had ripped it out of X12 and then they’re like no, no, no, we want to continue doing it, it was a success. So we had to put it back in.
But I think this needs to be brought to light more – of this is out there, it’s available, but a lot of people don’t want to use it or don’t even know about it to determine to use it or not.
ROOSA: We would support this particular recommendation. I do want to raise the comment again, going back to Margaret’s comment about sounding like a broken record. By the way, it occurs to me that there are not many people around now that have actually ever heard a broken record.
WEIKER: Wait a minute, are you calling me old?
ROOSA: There is the issue of innovation versus noncompliance and how do you tell the difference, and the absolute need that there be an established baseline at which you can reliably depend that that transmission will be accepted by your trading partner. And if, in fact, somebody applying formally through this section wishes to test a new standard, not all of their trading partners should be required to do that.
Rather, they make individual agreements with those folks that are willing, and as Rob said, you need to define what willing is. Sometimes willing can be, well, if you don’t do it, we’re not going to honor your contract anymore. And then, oh, I’m a willing partner. Define that clearly. But it should not be forced upon any of their trading partners, and I believe that’s very important.
KLOSS: So Laurie Darst and Suzanne Lestina.
DARST: I ditto that.
KLOSS: All right. Just to clarify because cards are going up and down. Eric, Gail, Laurie Darst, Melanie, Gary, Janet.
DARST: You can take Laurie.
KLOSS: Eric.
HEFLIN: So I think the goal here is to provide two objectives, interoperability in a very stable predictable manner, as well as to also allow for innovation. And those can be diametrically opposed goals but I think if we link – and my recommendation is actually in fact to link recommendation 11 and 12 together and make them a single unified recommendation which is that simply organizations must implement a baseline. They must be validated, must be certified, it must be tested. And therefore everybody has that baseline of interoperability we can depend upon. And then on top of that we also do recommendation 12 as stated right now, that innovation should be allowed.
And I do have a friendly question for Rob of why he does think innovation does need regulation. It’s perhaps rhetorical or not. And I’m not making a value judgment. I’m just really curious why.
We do have examples of this, the Health Exchange actually had 2011 specifications in 2010, and we adopted the approach by simply requiring that everybody implement 2010. And then they could move into 2011 specifications at will. And then over time the 2010 specifications atrophied and people saw value in the 2011 specifications and now we’re all upgraded to a new version.
Other examples exist, like wifi was mentioned yesterday. There are compelling reasons why people are just naturally moving towards the more recent versions.
And then most importantly I have a comment for Arthur – that if you google the term record lathe you’ll see records are actually new again and they are on the comeback. And so people probably will start seeing broken records once again so – thank you.
KLOSS: Since there was a comment pointed out which I think got Rob’s card up, and Nancy’s – are you getting in the queue, Nancy? Or are you jumping on the Eric comment?
SPECTOR: It is sort of in response –
KLOSS: That’s okay. So Rob and Nancy and then we’ll go to Gail.
TENNANT: I certainly appreciate that. We’re all in favor of innovation. But let me give an example, I think Mike had brought it up earlier, that we have fought to get a standard for electronics funds transfer, to be able to move money from the payer to the provider seamlessly, get rid of mail, get rid of the traipsing over to the bank every Friday.
But what happened was innovation in the form of a virtual credit card, which is a 16-digit one-time-use number faxed to the provider’s office for payment, and they were expected to use their credit card terminal and incur a four percent fee on that. Wonderful innovation for the health plans. Terrible innovation for the providers. We need regulation to prevent bad actors from doing the wrong thing.
HERD: Ditto Rob.
KLOSS: Thank you Michael Herd. On the same point? If not, I’m going to defer to Gail next.
KOCHER: We fully support recommendation 12.
We do support innovation and adoption, but this exception process is really critical because in some cases we’re talking about mandates that are currently on the books, where covered entities can be held to compliance and incur fees. That is different. We have to think about that slightly differently than new data exchanges that are not currently regulated, where the only thing that stops entities from doing that is willingness to come together in a voluntary fashion.
So that’s why I think recommendation 12 needs to stand on its own, because it’s really critical for the currently regulated transactions and the covered entities that fall under them. What we don’t want, though, is this process to suddenly turn into having a thousand different implementations that put trading partners, especially providers, back into variability. Somewhere along the line we could lose interoperability and administrative simplification if we’re not more thoughtful about how we proceed and look at doing innovation.
KLOSS: Currently in the queue, we’re going to start with Melanie, then Gary, Janet, Martin, then Nancy.
COMBS-DYER: The CMS Center for Program Integrity agrees that HHS should enable voluntary use of new or updated standards prior to their adoption through the rule making process. We really believe that this recommendation supports the country’s move towards interoperability, especially as it relates to coverage requirement discovery, ehealth record exchange, authorization support and the like.
The same comment on this one as on the earlier one – is it right to put this in the 2024 bucket, could it be moved up earlier – but I note that perhaps you’re looking at end dates rather than start dates. Thank you.
BEATTY: Gary Beatty, X12. Just to ditto Margaret’s comments. I agree with this recommendation. This process has been in place. I also agree that it could be made more readily available, known, and promoted through the industry.
And I think we also have to separate this process from what – we’ve been using the innovation word a few times in this discussion – that we don’t need regulations to do innovation. When X12 started to do its pilot for using the RS Schema which is a representation for our metadata, we didn’t do a regulation or look for a regulation to make that happen. We just went out and started to do that.
We want to do that to be able to promote innovation within the industry. We don’t need regulation to make that happen. So we agree with the concept but not applying it to innovation.
CAMPBELL: I keep hitting the wrong button on this thing. I just wanted to say that I think that we’d all agree that trading partners shouldn’t be able to force other trading partners into an approach that they can’t otherwise support. Kind of the challenge that I think we’re all trying to protect against.
I guess my question that’s more a rhetorical question is whether the process of applying for that testing exemption as well as the overhead that goes along with it is really the right way to achieve that. Or if we could do something in regulation to outlaw the bad outcome without necessarily adding more overhead to the good actors who want to test together.
WILBANKS: Well, to go back to Gail’s point, not every variation within this is the same size. But within the industry, we’ve made certain assumptions regarding the level of variation we are willing to accept.
So if we are going to go down this route, there needs to be a resetting of expectations as to how much variation we are going to allow within our standards before we actually reject it.
Right now, we look at a lot of the standards and we say okay, yes, it’s not compliant but it can go through my system, so I’m just going to ignore the fact that you sent me a bad qualifier. Because that’s okay. It’s not going to break my system. We really need to understand what that level is before we start going down this route.
Otherwise, sort of to Gail’s point, you’re going to have a situation where we’re going to support thousands of variations and build all that code to support it.
KLOSS: Nancy, then Brad.
SPECTOR: I just wanted to go back to a little of what Rob was saying there in terms of you were using the word regulation but I would say just registering the pilots. It would be what we would want to see – and largely that’s to understand what is being done and what the business needs are that are being addressed by these pilots, test cases, whatever you want to call them. And then understanding what the outcomes are from that.
And one real-world example to bring up, in terms of something that was being done that was little known, was the use of an unusable field in the 277CA claim acknowledgment transaction. There was a business need for it, but the fact that it was being used – a field that shouldn’t have been used – didn’t come out until we were going through the 7030 comment period, and now it has required reworking within that transaction during the comment period.
And if there had been sort of this registration of this use of the field, we would have known about that, there would have been a business need, and the workgroup working on that transaction could have addressed that in the development process instead of at the last minute during the comment period.
GNAGY: I will take my developer hat and stick it back on for a minute. So from our perspective as a developer, when we look – honestly, I go to my director of programming and if I mention 7030 he walks out of the room. There’s this perception of an instant lack of value. In fact, not only a lack of value but a desire to ignore it until, like Rob says, you’re forced to do it.
And so you don’t want that same perception in innovation. The perception that the innovators are being mined and herded tends to break that, so I strongly disagree with regulation of innovation. But the point is valid that Rob makes, and that someone else made about the credit cards. The end result was the payer was paying an extra four percent when they didn’t have to – the provider, sorry. And that’s ridiculous.
So agree that there’s a problem but regulating at the level of the innovators is not going to make them want to innovate and so finding a way to address it at the point of implementation or prior to the point of implementation, something along those lines, I think we should consider.
But I agree with Gary for sure.
COUSSOULE: Just one thought, being a devil’s advocate a little here. Would people in this room generally consider a regulation like a tax on innovation?
I don’t mean monetary but just the overhead involved. Frankly, when you tax something you get less of it, right, as is pretty well known across the industry.
So, if in fact what we’re trying to do is spur innovation, the more constraints we put on it the less of it we’re going to get.
GNAGY: (Off mic)
COUSSOULE: But my point – I’m not talking about a tax from a monetary perspective. I’m really just talking about making it more difficult. And so I toss that out as just a thinking process for people in the room. The more overhead and the more challenge you put into something, the less you’re going to get of it, and is that really what we’re trying to incent? You can tell whether you agree with that or not. It’s just a thinking process.
KLOSS: Rob, then Eric.
TENNANT: As per usual, Nancy did a better job of articulating my thoughts than I did. So, I would use the term registration and oversight rather than regulation. We certainly don’t want this process to have to go through the belly of the snake, taking years to come out but we do want oversight.
And with all due respect, taxes actually produce roads and schools and improvements to society, so it’s not all bad. But I think what we want here is – and I would use the word cooperation. And is there an example we can look to. I think there are two. The first is something that we’re all now well versed on and that is DaVinci where we have the key players, vendors, some providers and payers coming together to look at use cases, look at a new standard and in a very transparent way.
The other example, ironically, is the 275. The only HIPAA transaction ever piloted. Piloted back in 2005 at Montefiore Medical Center in the Bronx, with Empire Blue Cross Blue Shield, I think it was NextGen, and WEDI was the convener. And again, what was ironic about it was it was resoundingly successful. Both sides saved money. I went to several presentations from participants and they raved about the value of the transaction. In fact, they kept using it after the pilot period was over. Of course, that signaled the death knell for the transaction, apparently.
So there is some precedent and the difference there is to our point, which is cooperation. It’s a partnership between payer, provider, vendor going to CMS and saying we want to test this as a partnership. And I think that’s our point. Not more regulation. Just there has to be oversight and agreement.
GNAGY: Much better put in terms of the discussion that we’re having here. But what we have to consider – a programmer is a programmer and a lot of people when they think about a programmer they think of a very logical, very nerdy guy sitting behind a desk doing these things. Innovators tend to be very creative and we have to consider that those people are a little bit different and even the programmers coming into the industry, and not just programmers but IT people coming into the industry, the innovators are creative. And people create when they’re comfortable. They create environments that they’re used to.
And so I agree for sure there needs to be some oversight and registration, and a community will have that – will have a register for the community. This is who I am, this is who I work for, this is what I do. And then you give them the tools to say this is what we’d like to do, this is what we think we’re going to do, this is what the costs are going to be, this is what we think the benefits are going to be, this is how we’re going to approach it. You give them these tools.
But the appearance of regulation and oversight will deter creativity. The absence of it, I think we can all agree is not really viable so have it but it needs to be set up in a way that the innovators can feel comfortable, can be in that environment and can work well in that environment.
GINGRICH: I totally agree with the transparency, and it’s a little bit of a ditto of Brad. Whether or not a convener is an SDO or a WEDI-type organization, I think that’s important because the communication and transparency are important. I’m not sure, though, that that has to be done through regulatory means.
SPECTOR: I was just thinking and I’m not that well versed in it. Others in the room probably are.
ONC has what they call the Proving Ground. And my understanding from just looking at it, not in a while, it’s a database where you go in and look at projects that are happening and I think that would be something that would be of help or use in this situation.
Because some of it is you might be thinking in your little place in the world, oh, I’d like to look at X, Y, Z, and this would be a way to look – is someone else doing that? Can I partner up with somebody who is already doing that? Versus reinventing the wheel. Then it’s also a place to go to see what is happening and get to understand experiences, outcomes, how all of that’s –
COMBS-DYER: Ditto Nancy. I think the ONC Proving Ground is a great idea.
KLOSS: Mike Herd.
HERD: Mike Herd from NACHA. Just a little insight from a different industry perspective. Banking regulators have adopted innovation frameworks, for lack of a better word. And they are not regulations, per se, but they are giving a viewpoint on how they would view innovations in the financial services industry and I think it has a positive effect from the innovator’s perspective to have some certainty about how their work and how their products would be viewed by regulators.
KLOSS: Deb, I believe we’re back to you to advance us to the next slide.
STRICKLAND: Great. So the next slide is a call to action slide. It’s Call to Action D: HHS should fund a cost benefit analysis of HIPAA standards and operating rules to demonstrate their Return on Investment. HHS may consider collaborating with or supporting any existing industry initiatives pertaining to such cost benefit studies to increase data contribution by covered entities and trading partners.
So it’s difficult to adequately measure the value of implementation of a new rule and changes without a quality study. If we want to make sure that there is visibility into the value, we need data that’s supportive. Ideally, such a wide-scale analysis should come from HHS, since HHS is required to include a cost benefit analysis in its regulations to adopt a standard or operating rule. So there appears to be an incentive for the Secretary to fund these quality studies. There are entities that create and capture some of the information, so that experience and data could be leveraged. What are thoughts on this?
WEIKER: NCPDP is opposed to this action if – there’s an if – if it would elongate the process and increase administrative costs for the implementation of the standards. We do support a cost benefit analysis being done on the standards and operating rules but if it elongates the process or increases the administrative cost of implementing those, then we’re not for this.
KLOSS: April, then Minil, then Rob and then Laurie Darst.
TODD: So I would just reiterate one of the comments that I had yesterday which is I think if we wait until this part in the process to do a cost benefit or ROI, we’ve missed the boat and may be wasting a lot of time. It needs to be done earlier on. We very much support the need for cost benefit and ROI and would welcome HHS supporting efforts that are done earlier on by the SDOs and operating rules and, in particular, would support broader industry support of the index that is starting to track some of this information.
KLOSS: Arthur and Mark’s ditto and Laurie’s ditto of April’s comments. Okay, Minil.
MIKKILI: We strongly support the cost benefit analysis and we think this should be moved into the 2019-2022 timeframe.
TENNANT: I will ditto that as well. I think it’s critical to get it done. But let me make an argument that this slide bleeds into the previous one, because how are you going to encourage people to test new standards?
Well, one of the ways is to say we’ve looked at the ROI and there are some real benefits here, so we can start to do maybe larger-scale tests to start taking advantage of these standards prior to them being mandated by the federal government. So enormously important.
The challenge I think is the first part, getting funding. That may be something we’d have to again go through statute to get changed but at a minimum it’s got to be done and I think groups are willing to take on some of the financial risks, especially large payers, if they see there being a benefit to their business models.
SAWYER: Thank you. This one is saying cost benefit analysis for HIPAA standards, but as we get into the predictability piece and maybe doing smaller increments more often, are we doing a cost benefit analysis on the entire standard or implementation guide again, or just on the incremental change?
And two, in terms of cost benefit analyses, I’ve seen some that are off by 1,500 percent or more, so just know that a cost benefit analysis can be done but isn’t always very accurate.
KLOSS: Suzanne, then Gary.
LESTINA: Mine was the same thing. I actually had that question – is this overall where we are today or is this incremental going forward. And I just wanted that clarification as well.
KLOSS: I’m not sure that we have that clarification for you at this point in time. Gary.
BEATTY: Hi, Gary, X12. I just wanted to ditto Margaret’s and April’s comments and that we can support this and we also support the index and we’re working with CAQH CORE to augment and work with that index to help represent industry.
KLOSS: Danny, are you back up in the queue?
Deb, I think that’s run the table.
STRICKLAND: Great. So we’re on to the next slide. So this is Call to Action H: HHS should continue to publish a universal dictionary of clinical, administrative and financial standards that are or will be available for use, for example, the ONC Interoperability Standards Advisory.
So there’s growing agreement about the convergence of clinical and administrative data, so it’s time to create a path to integrating the standards. Incorporation of administrative and financial standards into the Interoperability Standards Advisory represents a small step towards this convergence.
Another strategy to keep the conversation moving is collaboration between HHS’s federal advisory committee, NCVHS, and ONC’s federal advisory committee, HITAC, with regard to objectives in the 21st Century Cures Act. Any comments on this?
WEIKER: NCPDP agrees with this.
KLOSS: Tammy.
BANKS: We agree with this, but we would also add that it be the mandated and nonmandated standards, and include the actual usage – like HL7 has the one-to-five scale – so we know when nonmandated standards are reaching that level of maturity, again to enhance future usage.
KOCHER: We agree with this recommendation, but we think it should be the ONC ISA because it already includes all of these, at least the ones that we’ve looked at – I can’t say I’ve looked at the HL7 side – but it does include the X12 suite, including the currently non-HIPAA-mandated transactions. I think Lorraine might get some credit for that.
KLOSS: That is why I am patting her on the back. Thanks to Lorraine we have it in there.
KOCHER: So we’d rather not see something else created. Let’s leverage something that already exists and put everything in one place.
KLOSS: That seems to be a very good theme. There is lots of – everyone’s like let’s not make it more complicated, let’s leverage what’s already working. And also when we have such a challenge with the widespread education, it will make it an easier lift to involve the entire village or community, as Brad would say.
MIKKILI: We agree with this recommendation and think this should also be moved to the 2019-2020 timeframe.
CAMPBELL: I agree with supporting or using the ISA for these purposes. That said, I think that there’s a certain lack of transparency into what goes into the ISA and how those determinations are made. There’s a public comment process, but sometimes you look at it and your experiences in terms of the adoption of standards are entirely opposite of what the ISA actually says. I think that HL7 actually seems to have a better process for determining that and that there are probably opportunities for improvement in the ISA process.
KLOSS: Eric, I think you want to say something.
HEFLIN: To Janet’s point. I actually was involved in the ISA process the last couple of years – not the most recent, but the prior two years before that. And I just want to point out the meetings are actually public, comments are always allowed in each meeting, and the process actually has been defined. It’s very similar to the CERIN(?) process, including the definitions of the elements, if you know what the criteria are and what use cases are associated with a given CERIN, and so on. I’m just kind of curious what elaboration – or what was kind of broken with the process.
CAMPBELL: Perhaps it is because I have participated in one of the very, very early iterations and it’s gotten better since then. But I think it’s the process after the publication and the public comments, how those comments are then reconciled into the next one. I know that we’ve made the same comments multiple years in a row when there doesn’t seem to be a good sort of feedback cycle for addressing this.
KLOSS: Danny, Jean then Chris.
SAWYER: On the use of the existing advisory – and I haven’t looked at that for a while – I guess the question is whether that also includes emerging standards and standards for trial use. So with some of the discussion that we’re talking about too, if we’ve got things that are out there for trial, then we’d want that in there as well. Thanks.
KLOSS: It has been clarified that yes, it does. Jean.
NARCISI: I just wanted to get on record that the ADA supports the use of an ISA also.
KLOSS: Thank you. Chris.
MUIR: First of all, I just want to – being responsible for the ISA, I just want to say I appreciate all the accolades and stuff. I also want to say we are on a continuing process improvement and so I would love to talk to you and find out some of your concerns so we can improve that process. And that’s really all I wanted to say.
KLOSS: Thank you. It appears we’ve run the table. Deb.
STRICKLAND: Go on to Nick. That was the end of my sections.
Agenda Item: Prioritizing Recommendations and Identifying Interdependencies
COUSSOULE: Okay. Trying to get my thoughts together, sorry about the delay. You see my lips moving and no sound is coming out, that’s the challenge.
I want to thank everybody again today for your robust and challenging input. It is exciting to hear, frankly, so many smart people talking about different topics and not always agreeing, because you get to better answers when we do that in an open and professional and visible way. So we really want to thank everybody for that.
There’s I guess one other question I want to pose to the people around the table. One is whether you have any additional recommendations or things you believe we have missed, or just want to weigh in with an additional thought before we formally close it up for the day – after public comments – but I would like to ante that up for people in the room here. I think we’ve got public comments that are supposed to start in about half an hour.
But it will give us a little time to get your feedback on really three things. One, are there additional recommendations that we might have missed? Two is do you believe there is an order or a sequencing of these that would matter. I think we’ve covered a lot of that today but if you have something particular that you’d like to stress, we would love to hear that. And then I think we’ve gone into the unintended consequences a good bit so if you can kind of coalesce your thoughts around those kind of topics for feedback for the committee members here, we would love to hear that before we close up and go to public comments.
KLOSS: I see Brad’s card up as well as Eric’s.
GNAGY: So you will find my extra line at the very end of the document that I sent in. It says one thing that we have not formally considered or discussed – and again, this is not a for-here discussion but another extended discussion, and most of us in the room, or at least a lot of us, will potentially have experience in this – and that is the advent, or the implementation or current implementations going on, of machine learning and AI, and how those can directly affect not just standards but the creation of standards, the update of standards. That is significant.
And I don’t think I could cap exclamation point that even – if you’re in the AI space, and I’ll use AI loosely for anybody who is going to bust me on the actual definition – if you’re in that space and you’ve seen results, it’s remarkable, significantly remarkable.
And it can be used if you have the experience. And we need to get the experience, and the experience I’m talking about is data. And that’s the data at the low level. And all of my going on about community is really to the point that the technical industry has spent years and years and years gathering data. We’ve gone through – the past 10 or 15 years have been analytics and let’s show the data and let’s see the data.
Well, now let’s take some actions, and that’s what machine learning and AI is really doing. And it’s going to come to fruition soon. And it’s going to be heavily impactful. So I think not in this discussion, but there definitely needs to be a very high-level consideration. And then a detailed consideration of how that could be implemented.
HEFLIN: I think we are dealing with national scale problems and potential solutions and value. We’ve had a lot of false starts and problems, especially in this particular domain of health care. But I think what’s really important for us, at kind of a philosophical level, is to understand it’s not necessarily the failures that really are going to guide us, it’s the limited pockets of success, both in our industry and other industries, we can actually learn from.
And I thought a lot about what we can model after. Are there other industries similar to ours in some ways that actually have overcome problems like this and been successful at least at national or even larger scales? And yesterday I think Chuck Jaffe was mentioning wifi, and I think actually that might be something worth considering modeling after.
They have essentially very minimal, almost nonexistent, regulations about that. They basically say you cannot do transmissions outside of this frequency range in terms of the wifi spectrum, you can’t have this much power, you cannot interfere, you must accept interference from others, and pretty much that’s it – the rest of the spectrum is up to you all to figure out.
And then IEEE, the standards body, came along with subject matter expertise and created the various versions of 802.11, which actually were not perfect – almost every version of them has some issues. But they were adequate to meet most of the needs of the time.
And then, and this is where I think we have a big missing element, they had commitment from industry to actually implement that. So the Wi-Fi Alliance, which I believe is formally a trade association, then adopted the work of IEEE, the standards body, but part of that process was the members essentially had to commit to implementing that which the standards body created.
That’s where the gap is, and so whatever we need to do in terms of regulation, I caution: let’s make sure it’s filling that gap – incentivizing those that are not committed to actually commit to adopting those standards – not stifling innovation. Thank you.
BANKS: I just want to thank the Subcommittee on Standards on really putting a lot of thoughtful consideration into these recommendations and holding these hearings. Obviously, this is a review that occurs – what, every five years. But again, I just want to say thank you because I think you guys have really put a lot of work into it and everybody around the table has put a lot of time and effort in standards and we’re only having these conversations because of the success that we had. So I think as an industry we’ve got to give kudos to the standard setting bodies for all the information exchange that’s occurred.
But I do also want to reiterate the support for the Subcommittee in this work. This is the second time we’ve had this conversation. Everybody here, as Joe said, does good work, and they are here because they want to do more good work, and so we do want to make a meaningful difference in making sure the information gets to whom it needs to get to, in the way that it needs to. So please reach out to us if you do need champions, or tell us what we can do to truly make change, because there are a lot of recommendations that were made before for which there was no change, for whatever reasons, and we’re here to support the effort to make that change. Reach out to us.
KLOSS: To that effect, I know we still have Rob in the queue and we have some public comments, but can we advance to the timeline slide please. So I think it's appropriate – so the game plan is for us to synthesize the phenomenal amount of feedback we got from all of you and, balancing the different perspectives, to try to find the recommendations that are really going to help us continue to evolve in the short term while also looking towards the longer-term perspectives that we need to achieve.
And so we’re going to have a lot of work to do between now and the end of January because our goal is to bring some form of recommendations, call to action and measurement framework forward to the full Committee at our February 6th and 7th meeting. I’m not sure what that will exactly look like because there’s been a lot for us to digest over the last 48 hours, but we will be taking that on as a Subcommittee in earnest starting next Thursday which is the last day that we’re seeking public comments on the framework that we’ve put out.
So those of you at the table were given a December 7th date to submit your testimony, but we had a December 20th date more publicly broadcast, and so if you have any further comments after your reflection on the last two days, please know that we would like to receive those by next Thursday so the Subcommittee can accomplish our work over the following six weeks.
And I think Nick would like us to go to the next slide please. There we go.
With that said, I want to go back to running the table and know that Rob’s tent card is up and that we do have some public comments that were submitted electronically and we also may have some folks that are sitting behind me who have public comments. So we will also make sure to give them time to make commentary. Plus anyone else whose tent card may pop up shortly.
TENNANT: I just wanted to echo Tammy’s thank you to the Subcommittee. Fantastic work in developing these recommendations. And this has been a long process.
If you think about it, it really started with Secretary Louis Sullivan back in the early 90s where he said, listen, there’s got to be a better way to move health data back and forth between providers and payers.
And I wanted to echo Nancy's comment. At the bottom of this is the fact that we need to deliver patient care effectively and efficiently. And there have been a lot of promises made over the years. 4010 was going to solve all of our problems. Okay, maybe not, but 5010 – that was going to be the elixir. ICD-10 was going to result in all kinds of improvements to quality. Those improvements never came to fruition, and the same goes for the quality payment programs and Meaningful Use.
The one thing that has increased is physician frustration, burden and burnout. And it’s being recognized in this administration and it’s not all due to EHR use. A large part of frustration is built around administrative issues, dealing with health plans, trying to get paid for your work. And so this is an important component of how we can address those concerns. So I encourage you to reach out to the provider community if you have more issues as you develop your final recommendations.
KLOSS: Thank you. Chris.
MUIR: So again, I am taking the perspective of a fed and thinking back over some of the comments that I heard, especially around the regulatory process. I know that there are the challenges of just developing rules and how long that takes. But what I'm going to comment on is when I hear about some of the ideas that have been presented from the NCVHS and this Committee, and trying to move them forward to CMS, when it's taking years and decades, I would submit to you that that's not a regulatory problem. And I'm not really sure what the problem is, but recommendations come from federal advisory committees, and then HHS has to decide on them and then move them forward. And it sounds like some of them, especially the ones that are taking years, get stuck at that level. It doesn't even make it to the regulatory process, I guess is what I'm trying to say.
And so I don't know if it's that the industry has to do a better job selling the ideas or demonstrating the return on investment, but it just strikes me – and I'm not directly involved with the regulatory side of it – that there is an issue there, and I think it's overly simplistic to say it's a regulatory problem. It's like it's not quite making it even to that process within the internal gears of government.
GOODYEAR: Thank you. First of all I want to ditto what Rob had to say with a capital D. But it really goes beyond frustration and burnout on the physician provider side, far beyond that. Those issues are incredibly important but implementation interferes with the continuity of care even if transiently, and I said that yesterday. And interruption of continuity of care can adversely affect outcomes, significantly adversely affect outcomes and can really adversely affect the patient experience. So ditto to Rob. Thank you.
KLOSS: I think we are going to start with Rebecca providing the electronically submitted public comments.
Agenda Item: Public Comment.
HINES: Just a reminder for those on WebEx, this is your opportunity to submit a comment on the WebEx dashboard or to the NCVHS mailbox and I know the WebEx has that slide up right now with that information.
So it looks like we've received three comments via the WebEx dashboard this morning. The first one from Sam Rubenstein regarding goal three: We believe that this is a critical issue and timeliness is essential. We must not forget that a good number of us are dependent on the vendors that distribute our EHRs, and administrative systems must also adopt this within their own applications. This in many instances can be problematic for some organizations. It's also very important not to forget that as we discuss these topics, the world marches on and out of necessity people develop and utilize non-standard processes to accomplish their business goals. This carries cost, complexity and additional challenges. Vendor solutions and their capabilities determine for many organizations what they can and cannot do.
From Stanley Nachimson: The pilot test process has been used twice, once for California Medicaid to test UPNs, once for the use of the ABC code set for alternative medicine.
And also from Stanley Nachimson: Regarding the cost benefit analysis, it will be important to have information on the cost and benefit for individual providers, payers and vendors as well as for the industry as a whole. Just doing an overall industry analysis does not provide actionable information.
And we have a ditto in the room. So I don't see any additional comments at this time coming through the electronic means, so we'll go back to the old-fashioned in-person. Is there anyone in the room? Please state your name and your affiliation before you offer your comment.
BURCKHARDT: I am Laurie Burckhardt with WPS Health Solutions. I am a payer out of the State of Wisconsin. We are a relatively small private payer. Our largest volume of claims is as a Medicare fee-for-service contractor for J5 and J8. We are also a prime contractor for the active duty military, excuse me, the retired military, on a TRICARE for Life Program. And then we are a subcontractor for the VA, a subcontractor for the East Region Contract for our active duty military and their families, and then we are also a subcontractor for overseas active duty military. On the private side, we're small.
On the other side, we’re relatively large.
One of the things – yes, I am a payer, but as a payer I'm only as successful as the provider community. One of the things I caution the Committee about as you look at this: if you are a large provider, you have the electronic health records. You have the integrated systems.
I'm going to speak to the providers that are small. We have a significant number of provider communities that do not have computers. Back in the day, we would give them computers – even on the Medicare side, we would give providers computers and modems. Believe it or not, if somebody asked, we still have over, I believe, 1500 providers still connecting via modem, a 9600 baud modem.
We are working on getting them off of that, but we have the very small providers that we give free software to so they can at least bill electronically – because, from an administrative perspective, paper is a nightmare for a payer. It is very costly. We will do whatever we can to get providers to be able to submit electronically.
And portals are not the answer. We need to get the administrative transactions working.
One of the things I heard this last day and a half is the blurriness of administrative and clinical data. I understand there's a growing trend for those integrations, but to point to these last couple comments, and especially to Rob's last comment: in order to be successful in the future, we need to fix the current.
And we cannot forget about those small providers.
They are not going to have electronic health records. Or if they do have electronic health records, they are not going to be certified health records. So how does that impact the clinical data. How do we continue to move forward for the big and the small providers?
One of the other things I wanted to bring up is testing. What do we mean by testing? Testing means something different to different people. When we did 5010, in order for a payer to do testing – our test environment is not a full-blown production environment. We don't have all the patients and the providers loaded. Our test environment has other coding because we have other innovation – we have plans, business changes that have to happen.
So you almost have to have a separate system that's your full current production, a copy of your production system, because as a provider they don't just want to know, can you take my transaction? They want to know, can you take my transaction into your system, and can you pay me, and pay me the correct amount in a timely manner? So from a claims perspective, that's what the provider wants to know when we test going to a new version. Testing isn't just, can I take in a good transaction? Yes, I can do that. It's, can I actually pay it appropriately – and timely?
One of the things for you as a Committee – and I forgot the goal, I did not write the goals down in my notes – education and collaboration are key components going forward. Both of those take money, and so somehow, in order for us as an industry to be successful, we need to have the appropriate funding.
I don’t know enough about ONC but they seem to have a lot of money. I want them to share it with us. And I kind of simplified that a little bit but if they can do it, we should be able to do it. So maybe even from the small providers, we figure out is there some sort of funding that we can get them to a baseline, a floor, so that at least everybody is working off the same thing.
Because as an industry, there's never going to be a floor unless you bring everybody to that floor; you can't start all transactions at the same floor otherwise. The expectations are not the same because, again, as you heard, there's all the complexity. Even overseas, I'm receiving X12 transactions from a German clearinghouse. Guess what?
They don't do ICD-10. They'll give me the description of the ICD-10 code that I have to match to an ICD-10 code. They don't do procedure codes. So I have to release my edits for them.
Again, you get into some of the complexities of the mental health side when they are seeing active duty military family members, the children. And let me tell you, it just breaks your heart when you see these claims because there’s a lot of them and we need to make sure that the patient is not impacted by whatever this Committee does.
And I think I'll get off my soapbox.
KLOSS: Other public comments?
GENSEMER: My name is Tara Gensemer and I'm with the Pennsylvania Medical Society, but I'm not speaking on behalf of the Medical Society. I'm speaking on behalf of my early-30s, solo practitioner, provider, business manager hat. That being said, I remember back in the day, manually entering all the remittance advices – you know, I spent most of the day in my glasses. But the day we bought into our EMR, and the first time I auto-posted that electronic remittance advice, it brought a little tear to my eye.
So in that role, I grew obviously and took over most of the health IT roles within this little registers office which was a lot of different hats. And along with that came Meaningful Use, came MIPS, came a lot of other options that we could add a la carte to the EMR. And every time you added an option, it was another $125, another $300 per provider.
And my fear is that – although predictability is great, and it's great for budgeting for a small provider – even though there's a base, if we implement the APIs, which I think is fantastic for making seamless transactions, there will be more a la carte services for the small provider. And when the ultimate goal is health care delivery, we can't bankrupt the small providers on this.
So I just ask that that’s a consideration as we move forward. Thank you.
KLOSS: Several dittos on that public comment, and a couple more. We will have some tough balancing acts to do as we move forward, between the small, medium and large providers, between the install base and the innovators. Are there any other public comments electronically? My esteemed co-chair, do you have anything to say?
Agenda Item: Closing Remarks
COUSSOULE: Again, I want to thank everybody in here for participating and hopefully folks that are on the WebEx got something out of it as well. And our audience for both being patient and listening to us in the back. I also want to thank Lorraine, in particular, for all the work that she’s done.
KLOSS: We could not have done it without her.
COUSSOULE: We really leaned on Lorraine in some ways, and I do want to make sure that is recognized as well. We recognize there's lots of work to do. We don't think this is an ending by any stretch. We do have lots of written comments that were submitted, and we look forward to more if you all would like to provide them. Information on how to do that is there. And it can be things you've already thought of or even things that came out of today. So we are perpetually looking for that kind of feedback to help guide us to what we'd like to recommend.
So with that said, I think we are officially done for the day. Thank you all very much.
(Whereupon the hearing adjourned at 11:55 a.m.)