Department of Health and Human Services

National Committee on

 Vital and Health Statistics

Subcommittee on Standards

CIO Forum

May 17, 2018

Bureau of Labor Statistics

Postal Square Building
2 Massachusetts Avenue, Northeast
Washington, D.C. 

  

P R O C E E D I N G S    9:00 A.M.

Agenda Item:  Welcome

HINES: Good morning and welcome to the National Committee on Vital and Health Statistics CIO Forum, organized by the Standards Subcommittee. We are delighted that you can be here. My name is Rebecca Hines. I am the executive secretary for the Committee and the designated federal officer and I am here to make sure that everything goes according to the rules and also to make sure that we have a productive day.

I really want to thank you for taking the time to lend your expertise to this effort. This is a federal advisory committee to the HHS Secretary. We are only as good as what we learn from people in the real world who are in the thick of the health care system. By giving us your time, you make it possible for us to be fully informed as we pull everything together and eventually come up with some recommendations to HHS.

If you are not totally familiar with the Committee, it is the advisory body on national health information policy to the Secretary of HHS, which includes standards and the administration of the privacy and security provisions of HIPAA.

With that, I would like to do roll call. So if our wonderful members could introduce yourselves: your name, the organization you are with, your affiliation with the Committee, and any conflicts.

STRICKLAND: Debra Strickland, XeoHealth. I am a member of the full Committee, Standards Subcommittee and no conflicts.

LOVE: Denise Love, National Association of Health Data Organizations, and co-leader of the All Payor Claims Database Council. I am a member of the Full Committee, member of the Standards Subcommittee, no conflicts.

LANDEN: Rich Landen, member of the full Committee, the Standards Subcommittee, no conflicts.

GOSS: Alix Goss. I am with DynaVet Solutions. I am a member of the full Committee, a member of the Executive Committee, and I am co-chair with Nick of the Standards Subcommittee, and I have no conflicts.

COUSSOULE: My name is Nick Coussoule. I am with BlueCross BlueShield of Tennessee. Co-chair of the Standard Subcommittee, member of the Executive Committee, as well as the Subcommittee on Privacy, Confidentiality and Security, but I have no conflicts.

KLOSS: Good morning. I am Linda Kloss. I am a health information management consultant. I am a member of the full Committee. I chair the Privacy, Confidentiality and Security Subcommittee. I am a member of the Standards Subcommittee, and I have no conflicts.

CORNELIUS: Lewellyn Cornelius, at the University of Georgia. A member of the full Committee and the Population Health Subcommittee, no conflicts.

HINES: Great. I don’t know if you want to do the audience or just get their names from the sign-in sheet?

STEAD: Yes.

HINES: Just for time purposes, we do have some staff. Suzie Burke-Bebee with the Department’s Office of the Assistant Secretary for Planning and Evaluation. The team out front, Geanelle Herring with CMS. Lorraine, please introduce yourself.

DOO: Lorraine Doo with the Centers for Medicare and Medicaid Services and lead staff to the Standards Subcommittee.

HINES: Wonderful. I think I will turn it over to our two co-chairs to get us going onto the agenda.

Agenda Item:  Review of Agenda and Introduction to the Predictability Roadmap

COUSSOULE: Perfect. I want to reiterate what Rebecca said before: thank you all for your time, preparation, and attention today. We oftentimes say it is all about me; in this case it is really about you. We are thankful and appreciative of your time and, most importantly, your expertise to engage us in the next steps in our Forum today.

I am going to walk you through the agenda for the day. Let me start with a little bit of background. We are conducting this Forum because, as many of you know, we have been embarking on a journey to develop what we call a “predictability roadmap”. Alix is going to walk you through the history of that in a bit more detail.

Part of what we really want to accomplish today is getting input from the folks who use the results of the standards and operating rules and who deal with the challenges on a regular basis.

We have people from across the health care ecosystem here today. We really want to hear what works well today, what might not work so well, and where the opportunities are, but also to hear from you what is coming in your world: how you see the world changing, the challenges ahead, and what we might need to consider as part of that.

We will walk through today a couple of different things. If you look at the agenda, again, Alix will walk you through a little bit of the history of the Roadmap and that will set some context. Many of you are probably aware of that, but just to level set everybody as to where we are at.

Then we will basically walk around the room and give you all an opportunity to introduce yourselves briefly and give us some initial comments. We laid out some questions that you all saw in the invitation, and we are appreciative of that. We will go through a couple of sections of those so we get through everybody.

Then we will walk through each of five general themes that we have uncovered so far in the work that we have done within the Predictability Roadmap. The five themes are governance, standards adoption process, federal regulatory process, data harmonization, and then third parties as covered entities.

Each of those sections will be moderated by one member of our subcommittee. The purpose of the moderation is really to make sure we can tee up additional questions and challenges. My hope is that it basically becomes a rather large panel discussion, where we hear input, challenges, and opportunities from you all in regard to each of those different topics.

We will wrap it up with public comments – if we have any at the end of the day, and then some next steps. We will get into a little bit more about the next steps when Alix walks you through the Roadmap.

We will have breaks during the day so when you all get tired of talking and we get tired of listening, we will take a break.

(Logistics)

With that, the one other thing I will mention is that it is oftentimes a little difficult, certainly with this many people, when everybody wants to be heard. One of the ways we handle that, pretty typical for our committee: when you put up your tent card like this, it means you want to speak and be heard, and we will try to make sure we are catching everybody to give everyone an opportunity to speak.

We do have a very busy day and we will, in some cases, try to move quickly through some topics. If we get, I don’t want to say stuck, but wound around different topics, we will try to trigger different thought processes just to make sure we get coverage throughout the day.

We have also given everybody an opportunity to tell us what you are doing; I know we have some presentations that you have made as part of this. It is not a requirement, but we will also take any written feedback that you have today, tomorrow, or the next day as part of this exercise. So please feel free, if something is not covered today or you want to go into more detail or give more input, to send it along; we are very happy to accept any kind of written information from you on an ongoing basis.

With that, I will then turn it over to Alix, my co-chair, to walk you through a little bit of the history of the Roadmap and our goals for today.

GOSS: Could we have the slides, please.

COUSSOULE: Sorry, just logistics. We agreed on something in advance and I have already broken the rules. Just to walk you back through a few slides. Our goals for the day, to reiterate what I walked you through a little bit, are: one, to learn about your experience and expertise with the standards and operating rules, which is one of the primary things; two, as I also indicated, to share accomplishments or things that you have done to deal with the challenges or opportunities inherent in that, and the challenges in using the standards as the business models in our industry have changed; and three, ideas to improve the process and predictability of advancing those standards.

Again, I think Alix will do a good job of level-setting and, again, we intend for this to be very much a roundtable discussion.

GOSS: We will make these slides available, along with the agenda, on the web after today. Can we go to the next slide?

The Predictability Roadmap is actually something very near and dear to my heart. I have been working in the health care field for 30 years. I have had the pleasure of being a standards developer and working in Medicare operations and trying to get HIPAA transactions to actually work.

When I first worked in Medicare, when I was very much younger, we had six versions of an electronic remittance advice between the X12 versions and the National Standard Format, or NSF. That was really hard as a payor, but also hard for the providers and the clearinghouses, and everyone, to understand which version to use, what the differences are, how do I use this, what is the data, what is the quality of the data?

So way back in the mid-nineties, it was really clear that we needed to do something to garner administrative efficiencies with how we exchanged information and how we enabled business to happen around the delivery and financing of health care.

Hard to believe, but it has been 21 years: HIPAA is now legal. It can go out drinking and maybe should. We are hoping to get your commentary today on where we need to go with HIPAA, because administrative simplification and its objectives are still worthy and we are still trying to achieve them, but technology, business models, and financing models have all been changing, and the care that can be delivered is evolving with all those underpinning technologies and methodologies.

We need to take a look at where we are now. We have been hearing from the industry over the years that we need to tackle this – and I will get to that slide in a second – and that we need to continue to improve it. We have seen under the ACA the addition of operating rules to help further business’ ability to achieve administrative simplification.

This is a wee bit of a snapshot of some of the history of the Roadmap. Pre-2012, we were already hearing from the industry, from the first round of HIPAA implementation, which was 4010 and 4010A1, that the process was not working really well and that we needed to make some improvements to it.

As we evolved in our own interactions reviewing the HIPAA-adopted transactions and code sets, and kept hearing from the industry, it became really clear that we were very far from the mark even though we had garnered some efficiencies.

As a result of the Review Committee hearings, a number of testimonies from some of you, and others, it was clear that we needed a way to enable the development, maintenance, adoption and implementation of standards to improve.

We have also heard a lot of common themes from the industry. I am not going to be surprised today if you bring up points that you may have said before, that we have all heard either at this table or at other hearings, about things that really need to change. So, today is an opportunity to call out the pink elephant in the room. We really want you to push the envelope. As the end users, as the people who bear the burden, the pain, and the joy of using electronic transactions and datasets and exchanging with your trading partners, this is your bully pulpit. Please use it to tell us how we can fix the system. Even if you are not sure how to do it, what you think needs to be done is what we really want to hear today.

In advance of this meeting, we undertook some foundational work to make sure that we were on solid ground in understanding the current business practices of the standards development organizations. When I say standards organizations today, I am really going to be talking about those within the HIPAA realm: the CAQH CORE Operating Rules, X12, NCPDP, and HL7.

We have published a comprehensive overview of the development procedures, organizational compositions, and workgroup structures. That is available on our website should you be interested.

We have also conducted a day-long visioning workshop with those organizations and federal partners and interested stakeholders, interestingly enough, on HIPAA’s 21st birthday on the day of the solar eclipse. We also produced a summary report of the workshop. The consolidation of those ideas has led to the five themes that we are going to be focused on today.

We have also met with HHS to understand the opportunities and limits of the regulatory process. We know that, from industry collaboration through to the point where we can actually get the regulatory process to work, there are large blocks of time that prevent the benefits of improvements from being received by the industry.

It was very important to us to do that foundational work before we scheduled the CIO Forum to really hear the end users’ perspectives. What we want to do is take your feedback along with all of the other body of work that we have done, consolidate those findings into a draft set of recommendations. We anticipate having a hearing, which is a much more formal process with testimony, to get feedback on those draft recommendations, which then will allow us to collaborate with our other NCVHS members so that we can finalize recommendations that would result in a letter being sent to the Secretary. Those recommendations could span industry changes, NCVHS changes, DSMO changes, HHS changes. It could be very broad. So, we are coming into this with eyes wide open and really interested in hearing your feedback today.

I am going to pause and allow my colleague to introduce herself for the record.

MONSON: For the record, I have been on a cab adventure.

Good morning. Jackie Monson, Sutter Health, member of the National Committee and member of the Privacy, Security, and Confidentiality Subcommittee. No conflicts.

GOSS: Thanks for joining us. So, next slide please. Just to make sure we are all on the same page about what has been adopted and what is in scope for today’s hearing, we are going to advance to the next slide. These are the adopted standards, in very fine print. It is not an eye test. This reflects the medical, pharmacy, and electronic funds transfer standards, also including Medicaid pharmacy subrogation. We are also talking about the operating rules for CAQH CORE. We are pretty sure that you are aware of all of this, so we are not going to spend time on it. If you have any questions, please let us know.

I think the next point is we would like to stop talking and give you a chance to take the floor. So, we would like to start with our participant introductions. Lorraine, do you want to assist here?

COUSSOULE: Before, any initial questions from a level-setting perspective that we could answer before we kind of get rolling with each of you talking about yourselves?

GNAGY: Real quick. If discussion comes up in here about work that is already underway or already in development, for example, 7030 and the changes currently being undertaken, I just want to make sure that we have somebody here who can validate and/or explain or detail what is currently going on from your end.

COUSSOULE: We can talk through, from a personal perspective, things that are working through what I will call the governmental process. We will ask Lorraine to say what she can or can’t about what is formally going on. We will try our hardest to answer any question that we can. If we can’t, we will tell you that we can’t, and we will try to dig up the information as best we can.

Please ask those questions. The worst case is we will tell you we don’t know or we will try to figure it out.

Agenda Item:  Participant Introductions with Brief Summary of Standards Initiatives

DOO: So, we have slides from some of you. What we have done is we are going in alpha order by first name. So, you all don’t know which of you is on the hook first, but we do. Those of you who know your names will know who is going first. I have the list. Andrew Burchett, you are on the hook first. You will know if you sent slides or not.

BURCHETT: I did send slides.

DOO: Lovely. I have to tell you a story. The reason I had to bring these buttons is that I have only been able to use them once, for a Jeopardy game that I did. I had to bring them. My apologies, but you will be hearing them today.

BURCHETT: I am Andrew Burchett. I am the CIO of UHIN, which stands for Utah Health Information Network. For those of you who don’t know what that is, UHIN is probably one of the unique animals in the room. We operate a clearinghouse. We also operate an HIE. It is a state-designated health information exchange for the state of Utah. A majority of our customer base is in Utah.

We actually just hit 25 years. UHIN was founded by the community. We were actually exchanging electronic data between providers and payors back in 1993, before HIPAA existed, before standards existed. We were convened by the community. Our board includes providers; payors, both commercial and governmental, as well as some national payors; physicians; hospitals; and all of the various association groups. So, we have a lot of key stakeholders at the table who are guiding UHIN in our operating principles, our business products, various things like that.

Our primary goal has always been to reduce administrative costs and overhead for our community. We are actually not-for-profit, which is another unique thing as a clearinghouse competing in the for-profit space with the various for-profit entities out there.

Lastly, we are a nationally recognized standards development organization. We convene a standards subcommittee out of our board. We probably have more than 25 or so people from these various groups represented there as we review standards.

So, some current initiatives we have underway, as many of you do, include the 7030 review. That is actually separate from my group. I manage IT and development on the healthcare delivery side; that is what I oversee. The standards development side is different from my group and is managed under a different executive at UHIN. I will just mention that.

We basically just get hit with the standards when it is time to implement. With 4010 to 5010, my team was hit with that and we had to implement various changes. We are actually still doing 4010-to-5010 step-up for customers and various fun stuff there. IT gets asked to do a lot of weird things because we don’t have backwards compatibility in the industry, which is a negative. As an IT guy by trade, it doesn’t make sense to us: with an API, we could publish various editions and versions and be cross-compatible. We have demonstrated that with the 4010-to-5010 step-up. Going from 4010 to 7030? Probably not going to happen.
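The “step-up” translation described here, where an intermediary upgrades an older transaction version so the receiver only ever sees the current one, can be sketched roughly as follows. This is a minimal illustration; the field names and version registry are hypothetical, not actual X12 4010/5010 segment structures.

```python
# Hypothetical sketch of a version "step-up" translator: a gateway accepts
# an older message version and upgrades it so downstream systems only ever
# see the current version. Field names are illustrative only.

def step_up_4010_to_5010(msg: dict) -> dict:
    """Upgrade a hypothetical 4010-style message to 5010-style."""
    upgraded = dict(msg)
    upgraded["version"] = "5010"
    # Newer versions often add fields; supply a default where the older
    # version carried no equivalent data.
    upgraded.setdefault("billing_provider_npi", None)
    return upgraded

# Registry of known upgrade paths between versions.
TRANSLATORS = {("4010", "5010"): step_up_4010_to_5010}

def normalize(msg: dict, target: str = "5010") -> dict:
    """Route an inbound message through the right translator, if any."""
    version = msg.get("version")
    if version == target:
        return msg
    translator = TRANSLATORS.get((version, target))
    if translator is None:
        raise ValueError(f"no step-up path from {version} to {target}")
    return translator(msg)

legacy = {"version": "4010", "claim_id": "C123"}
print(normalize(legacy)["version"])  # prints "5010"
```

The design point is that each new version only requires adding one translator to the registry, rather than forcing every trading partner to migrate simultaneously.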

Another initiative we have going on is a blockchain pilot.

The third initiative is the Da Vinci Project pilot, which mainly surrounds HL7. For any of you who are involved in HL7 and have heard of the Da Vinci Project, we are involved in a pilot on attachments and preauthorizations, which is a transaction set that X12 also does. HL7 has been, from a standards perspective, less restrictive. You see different adoption across our various payers and providers. That is another initiative we are involved in.

Some positives. Obviously, administrative cost savings and automation. These transaction sets being adopted allows us to automate more. The fact that we don’t have the attachment standard out yet makes it very hard to automate that and very hard to get customers to move. Surprisingly, a lot of our governmental payors have adopted those standards and are already exchanging that data, which is kind of backwards: a lot of our commercial payors and providers aren’t adopting them.

Which leads into the negatives. Timeframes are very long and unpredictable. With the rapid changes in the healthcare landscape from a technological perspective, there are a lot of things outside healthcare itself that are adding to the cost. We see that as a negative. If timeframes were more predictable, things like budgeting, planning, et cetera, would be easier.

So, that is all I have.

DOO: Thank you very much. Next on our list is Brad Gnagy. Do you have slides, too?

GNAGY: I do.

(Sends Slides)

DOO: Brian Matzuura, are you ready?

MATZUURA: Certainly. I am not going to go through every one of these boxes, by the way. You don’t have to give me a ding.

So, I am with Kaiser Permanente. I am an executive director with two roles. One role is managing all of the inbound and outbound electronic transactions for Kaiser: all of the claims, all of the eligibility, the membership, pharmacy, et cetera. My other job is leading our PMO for all of the digital work that occurs at Kaiser Permanente. That is probably one of our fastest-growing, highest-demand areas. People really want to use the internet for care; it provides variation in how we can deliver care and where we can deliver it, and lets us provide care to our members in a setting that is more convenient for them. That is not what we are here to talk about, but I wanted to share it because a little bit of what I am going to say is really about better integration and aligning.

I won’t go through each one of these either, but suffice it to say that overall we really try to keep on top of the standards and the regulations, and are pretty much, I think, up to speed and have implemented things. I will put them in three categories. There are things that we have completed where there is demand and there is use. There are things that we have completed where there is limited demand and limited use. Then there are things, such as the attachments, that we support and whose workgroups we participate in; like all of us, we are just really waiting for what the requirements will be.

Finally, the thing I do want to point out is one of the things that we have recently been able to achieve: we have actually converted to the new Medicare Beneficiary Identifier (MBI). Now, we can take in both. We have done a lot of testing on that. So, we are just waiting for the rest of the industry and for the timeframe to kick off so we can all move in the same direction.

So, I guess the positives and the negatives. The positive is that where we have the rules and the requirements all laid out, there is a clearer path of what to do. The negative, I think, is that it is really hard to get everybody onboard. With CAQH and some of the processes, people come onboard at different times. The demand is sporadic for us, and it is incremental. Those are probably the two biggest things.

I will use ICD-10 as an example because I think that worked out incredibly well. When we were doing ICD-10, which I led for Kaiser Permanente as well, we had a very clear roadmap with dependencies. If a delay occurred, when we talked about it we could then ask: what are the other things that would have downstream impacts? What are the implications? So, at least when we gave input and you all made the final decision, we went in understanding the downstream impacts. Those are, from our perspective, really the positives and the negatives.
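The “clear roadmap with dependencies” credited here for ICD-10’s smooth rollout can be sketched as a small dependency graph: if one milestone slips, the graph immediately shows which downstream items are affected. The milestone names below are hypothetical, chosen only to illustrate the technique.

```python
# A tiny sketch of a roadmap modeled as a dependency graph, so that a slip
# in one milestone immediately reveals its downstream impacts.
# Milestone names are hypothetical illustrations.

DEPENDS_ON = {
    "code_mapping": [],
    "internal_testing": ["code_mapping"],
    "partner_testing": ["internal_testing"],
    "go_live": ["partner_testing"],
}

def downstream_of(milestone: str) -> set:
    """All milestones that directly or indirectly depend on `milestone`."""
    impacted = set()
    changed = True
    while changed:  # keep sweeping until no new impacts are found
        changed = False
        for item, deps in DEPENDS_ON.items():
            if item not in impacted and (
                milestone in deps or impacted.intersection(deps)
            ):
                impacted.add(item)
                changed = True
    return impacted

# If code_mapping slips, internal_testing, partner_testing, and go_live
# are all affected; a slip in go_live affects nothing downstream.
```

Even a toy model like this supports the conversation Matzuura describes: when a date moves, the set of impacted milestones falls out mechanically rather than by guesswork.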

I won’t read through this either. We are really supportive of the work that you all are doing. We want to help. We support standards, and we support helping to reduce the cost of care. One of the areas we are really looking at, maybe because of our model, is how we start to align more and more with the clinical side of the house. What we are finding, especially in my other job around digital, is that appointment setting and the clinical setting, whether the patient wants video visits, phone calls, or in-office visits, should give patients options, but that data also has to integrate with their medical record and history. I think once we can combine those two, it is going to be incredibly powerful.

DOO: Thank you, Brian, that was great. Brad, you are up, sir.

GNAGY: Good morning. My name is Brad Gnagy. I work with Healthpac Computer Systems. We are a practice management software vendor and developer. I also work with the HATA, the Healthcare Administrative and Technology Association. I have been in the industry for about 17 years now.

Again, we are a practice management vendor. First of all, let me thank you guys for this massive undertaking. You are essentially producing standards for massive numbers of not only people, but industries. I can imagine the headache, not to mention the fact that you now asked everybody who deals with it to come in here and give you more headaches. So, thank you.

You can really whittle this down, from our perspective, into several categories. One is pace. Obviously, the weight of this undertaking is having an effect on the overall pace, because the pace of technology today is substantially faster than it was even four or five years ago. It is going to continue to accelerate, and that is going to become a metric that standards have to recognize: the pace of industry, the pace of technology, and the pace of growth within those.

Just to start off with some of the challenges that we are going through right now, my primary one here is lack of payor adoption of, and compliance with, the standard transactions. A lot of this requires costly one-offs that we end up having to do from a development standpoint, and it causes a lack of provider adoption due to limited value. So, that is one of ours, dovetailing off of what Andrew was saying: lack of adoption, and standards that don’t tend to keep up with the pace of technology and information as it grows today.

One of the things that we are really looking into, from the Healthpac standpoint, is the ability for a standard to have a built-in, high level of extensibility. That adds yet another set of difficulties, but internally at Healthpac we have a number of scenarios in which we have to utilize our own standards, internally as well as with some of our partners, vendors, and even some of our clients. At the lower end of the spectrum in terms of company size, we also have a little bit of an advantage in that we deal on a very, very personal level with our clients. We have a bit of a family environment where we also allow for customization. That is good and bad, as most of the smiling people know.

So, we take our standards, our internal systems, and we extend them. We build in extensibility. Again, back to some of the things Andrew was saying, the backwards-compatibility effect of extensibility within standards is huge. If you build that extensibility in, with instructions, information, and details on how it is to be used, then your progression through versions of the standard has built-in backwards compatibility. If you build your customization, your needs, within this extensibility option, then you are able to report back to the standards bodies, report back to you here: here are the things that we have seen, here is what we have done within the standard to extend it and customize it for the needs of x. You get that information back as a metric, and you have the people who build those extensions telling you what they are doing and how they are doing it.

It builds that backwards compatibility into the system. As you produce new versions, if I am not ready to progress to them – which we are, but if you are not – you are able to keep your extensible options at level A or version A as the standard grows.
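The extensibility pattern described above can be sketched as a message format with a declared extensions area: newer senders add data there, and older receivers that don’t recognize an extension simply skip it rather than reject the message. This is a hypothetical illustration of the pattern, not any actual HIPAA transaction structure.

```python
# Sketch of an extensible message format: a core payload plus a declared
# "extensions" area. Unknown extensions are skipped, not rejected, which
# is what gives older receivers forward/backwards compatibility.
# All field names are hypothetical.

KNOWN_EXTENSIONS = {"telehealth_flag"}  # what this receiver understands

def parse_message(msg: dict) -> dict:
    """Parse the core payload, keeping known extensions, skipping the rest."""
    core = {k: v for k, v in msg.items() if k != "extensions"}
    recognized = {}
    skipped = []
    for name, value in msg.get("extensions", {}).items():
        if name in KNOWN_EXTENSIONS:
            recognized[name] = value
        else:
            skipped.append(name)  # unknown extension: ignore, don't fail
    core["extensions"] = recognized
    core["skipped_extensions"] = skipped
    return core

newer_message = {
    "claim_id": "C9",
    "extensions": {"telehealth_flag": True, "future_field": "x"},
}
result = parse_message(newer_message)
# The recognized extension is kept; the unknown one is recorded and skipped.
```

The skipped-extension list also gives the feedback channel the speaker describes: implementers can report which extensions they see in the wild, informing the next version of the standard.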

So, our big things are the pace. We find that not only our clients’ needs, but our vendor needs and our internal needs tend to go in different directions and at different paces. We have to always come back and rely on the standard, which then goes even slower. Those are some of the things that we are trying to get out there today.

The third is the lack of a solid business case and customer demand to support the implementations of certain mandated standards. We really like to go back to what is the business case, what is the use case for this sort of implementation, and try to gear our development based on what the business needs are.

So, our big three things are the pace, having standards be extensible, and then having those business cases where it makes sense for us to press.

DOO: Chris Muir, you have the floor. Did you have slides?

MUIR: Honestly, I will start. If you happen to bring them up in time – I sent a couple of stock slides anyway. Maybe you have all seen them.

As I indicated, my name is Chris Muir. I work for the Office of the National Coordinator for Health Information Technology within the Department of Health and Human Services. I am probably a little bit of a different animal than the rest of your guests. I am actually a federal employee. I represent a regulator. We have a certification program where we certify electronic health records. We curate standards for nationwide interoperability. Very similar mission. I don’t know if I will be able to offer a lot on HIPAA standards and transactions, but maybe I can give some input as a fellow federal agency.

My role within ONC is Standards Division Director. We curate standards and publish them in an online system called the Interoperability Standards Advisory. As we publish the standards, we seek public input. We try to provide helpful information regarding the standards. We also hope that federal regulators and others will look to these standards as they develop their programs and look for standards to meet interoperability needs.

ONC – I basically covered that. We do a lot of other things. We interact with people who are implementing the standards. We try to improve upon the standards. We measure standards adoption and other things.

If you go to the next slide, ONC’s priorities right now are – there are two things: interoperability and reducing provider burden. We are using policies and standards to do both of those things. We are trying to have an impact on patients, on providers, and the overall healthcare market.

I will have to say, not on the slide, but just to give you a little bit more insight: as the HITECH Act was the policy imperative for ONC through Meaningful Use and EHR adoption, with the passage of 21st Century Cures, that is the new policy driver for ONC. Some of what that includes is regulating to prevent data blocking and, through our certification program, using open APIs to ensure that there is accessibility to data. The ultimate goal there, among other things: of course, we want people to have access to their own data. We want providers to have access to data where and when needed. And we want federal regulators like CMS and others to be able to reach down to the doctors and pull the information they need without too much stress on the providers, and without the providers having to do a lot of extra work to get the data to the regulators and so forth.

We have a new federal advisory committee. It is called HITAC. Among other things that they are responsible for, they are responsible to identify priority use cases for interoperability and then to identify the standards and implementation guides to accomplish those use cases. Part of my job is to support that work as well.

Last, but not least, and I am sure you all have heard about it, we are also working on the Trusted Exchange Framework and Common Agreement. TEFCA you might have heard of. Part of my job is to enable or to ensure that there are standards and implementation guides to help implement the TEFCA.

In a nutshell, that is me and my organization. I will just say one more thing since the thing hasn’t gone off yet. I am very interested in the Predictability Roadmap and how we might do something like that for ONC. I am really anxious to learn more.

GOSS: We have a great glidepath to support that work in that 21st Century Cures calls out NCVHS' role to support HITAC. We have been working on defining a collaborative scope between the two organizations. I think it kind of fits with Brian's earlier point about the administrative and clinical coming together, that convergence aspect. I think that we have a lot of interesting discussions in our future.

DOO: Thank you.

NICHOLSON: I'm Dave Nicholson. I am an executive vice president with AdvantEdge. You might have it listed as PMI. My company was recently purchased by AdvantEdge. AdvantEdge is a revenue cycle billing company. We have over 800 employees worldwide, 75 certified coders. We process about $3 billion annually in client charges. We have been around for 50 years. I am also part of the government relations group of HBMA.

In the first bullet point, the real-time requirements to get a claim paid: back in October of 2016, my company hosted the NGS group of Medicare at our facility in Baltimore to present the day in the life of a claim. Some of the people in this room were actually at that meeting. I think we opened their eyes to the concept of a workaround. My old company had hundreds of workarounds. My new company has thousands. Our desire is to get the claim moving and paid. What happens if the claim doesn't get paid? Our providers don't get paid.

For example, when we receive an ERA, it may not have adequate information to tell us what the problem is. A CARC or RARC may exist, but it doesn’t give us the details, so we have to go to the payor’s website and get more information.

Another example might be that the ERA says the claim got paid, but we don't know where it was paid because it doesn't say anywhere. For instance, if it was an ERISA claim, they don't have to tell the provider where it was paid. Now, we have to chase the patients. We have to chase the checks.

One of the nice benefits, if there were a little more detail in some of the CARC and RARC codes, would be that processes like automatic claim denial handling could be developed. You would get a stream of data in. You could figure out where it has to go. You could automatically respond to it.
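
The automated denial handling described above can be sketched in a few lines. This is a minimal illustration, not any payor's actual logic: the CARC codes below are real, but the mapping of codes to follow-up actions is an assumption for illustration.

```python
from typing import Optional

# Real CARC codes mapped to hypothetical follow-up actions (the mapping is
# illustrative, not any payor's actual policy).
DENIAL_ACTIONS = {
    "16": "request_missing_information",   # CARC 16: claim lacks information
    "29": "write_off_timely_filing",       # CARC 29: filing time limit expired
    "197": "resubmit_with_authorization",  # CARC 197: precert/auth absent
}

def route_denial(carc: str, rarc: Optional[str] = None) -> str:
    """Return a work queue for a denied 835 line; fall back to manual review."""
    action = DENIAL_ACTIONS.get(carc)
    if action is None:
        # Without enough detail in the codes, staff end up on the payor's website.
        return "manual_review"
    # A RARC, when present, could refine the action further; ignored in this sketch.
    return action

print(route_denial("197"))  # resubmit_with_authorization
print(route_denial("A1"))   # manual_review
```

With richer, more consistent CARC/RARC detail, the `manual_review` fallback, which today means a trip to the payor's website, would be hit far less often.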

Frequency of updates. The development and implementation cycles of many products in the revenue cycle, EMR, and practice management spaces take many months to design, implement, and test. How can a standards body release final changes days before the go-live date? An example was the whole set of MIPS and advanced payment model rules and requirements. With all of these changes, are there economic benefits? A change occurs and then another change occurs six months later. What is the economic benefit?

The final area I would like to discuss is areas of improvement. In the privacy area, there are penalties if people transgress. Providers pay penalties for self-reporting of transgressions. In the administrative simplification area, there are no penalties. There are no penalties for noncompliance with the standard. We don't need 900 different ways to obtain an 835 from a payor. Some payors have tried to evade the rules with an 835 by offering virtual credit cards. Other payors are now charging two percent to receive an EFT. To add insult to injury, they won't give you an ERA for it. They make you go to the payor's website to get it. We need help in that regard.

The moral of the story is if payors don’t adhere to the standards or try these new tricks, we could use help to get them to apply the rules. Those are my remarks. I beat your buzzer again.

DOO: Sure did. Record setting. Thank you very much. We are actually going to go to Dr. Goodyear.

GOODYEAR: I am Jim Goodyear. I am a general surgeon from suburban Philadelphia. I thank the committee for giving me the opportunity to present the perspective of an end user at the physician level. I have been in practice for 30 years. During that time, I have seen the transition from paper claims and remittances, doing it all manually, to electronic transactions.

Initially, my practice made a significant outlay, both financial and in time, to computerize our practice and develop an electronic management system. It resulted in improved efficiency in handling day-to-day standard transactions.

Certainly, the benefits of EDI transactions have far outweighed the shortfalls. We experience reduced accounts receivable days, which allowed for more streamlined source of revenue and permitted our business office to budget finances in a more predictable way. EDI has reduced the duplicative work of manual entry, which in the past has created an environment of increased errors and significantly interfered with our patient care workflow.

Although 837 transactions have reduced the cost of claims significantly, when an 835 remittance arrives at our practice there is often a lack of consistency between payors. Billing offices have repeatedly reported that, when a denial is reviewed, the reason codes and the remark codes do not always match the explanation of adjustments or match the reasons for denial. This is problematic at the practice level.

It is not uncommon for a modifier to be moved to another procedure code or for the modifier to be amended to something entirely different. Additionally, a copay for an E&M code can be moved to another procedural code without significant explanation.

The claim status function has been helpful. Claims reports can be monitored daily and cross-referenced with claim status reports to identify issues that need to be addressed within a timely filing limit. However, the method of processing the results of the user reports remains fairly manual. Again, a significant burden for our offices.

The eligibility function has been helpful when verifying if a patient is enrolled in a plan. Depending on the plan, it may also name other plans in which the patient may participate. On the other hand, the eligibility function has not been as advantageous as we had hoped. Deductibles, coinsurance, and copayment data is not often drilled down enough for it to be valuable and helpful to our business offices.

Where this is particularly true is when we are referencing coordination of benefits. CAQH said it best in their administrative white paper when they stated that transaction standards are only effective if payors and providers have good information about all of the forms of coverage involved so that transactions can be sent to the correct health plans. Often, there is not enough data to facilitate coordination of benefits, since billing offices need more than just the health plan name to comply. We feel it would be significantly beneficial if health plans could share the patient identification number of any additional plans of which they are aware, thereby alleviating the guesswork in the billing department.
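
The coordination-of-benefits gap described here can be made concrete with a small sketch. The data shape is a hypothetical simplification, not the actual X12 271 structure: it assumes an eligibility response that names other plans, sometimes without the member ID needed to bill them.

```python
def cob_gaps(other_coverage):
    """Names of additional plans that cannot be billed for lack of a member ID."""
    return [plan["plan_name"] for plan in other_coverage
            if not plan.get("member_id")]

# Hypothetical other-coverage section of an eligibility response.
eligibility_response = [
    {"plan_name": "Acme Secondary PPO", "member_id": "ZZ12345"},
    {"plan_name": "Beta Medigap"},  # plan is named, but no member ID: guesswork
]
print(cob_gaps(eligibility_response))  # ['Beta Medigap']
```

If every plan carried the patient's identification number for the other coverage it knows about, the list returned here would be empty and the billing department's guesswork would disappear.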

Authorization requests are where physician practices, both in Pennsylvania and at the national level, would like to see more robust innovation and implementation. Parenthetically, this is a high priority issue with both the Pennsylvania Medical Society and the AMA. We hope to see more efficient and user-friendly approaches to prior authorization for procedural care. For example, in an age where we can attach a consolidated clinical document architecture, CCDA, to a direct secure message for a referral to another provider, can we not also integrate this technology, along with our EMR technology, with our payors in the prior auth space? Large and small physician groups hire additional costly staff to work prior authorizations. Most requests continue to be handled by fax, telephone, and even email.

This is an administrative burden. We ask that payors be held to the same standard and shared responsibility as our providers. The workload, as it currently exists, is becoming increasingly burdensome, time consuming, unsustainable, and, most importantly from my perspective, it interrupts the continued quality delivery of health care. We like to put the patient first in this environment and think that that is most important. Processes need to be streamlined and accountability shared between all entities. This could allow for burden reduction simply by allowing the attachment of a CCDA as documentation and would ease communication barriers between end users and payors.

Our fear is that with the advancement of application programming interfaces to complete the prior authorization task, end users will only have to bear more expense. So, we ask that you consider using existing infrastructure of the practice management system EHR to enable the prior authorization request, encouraging interoperability, transparency, and the ability to manage data in one central location. We also ask that you consider suggesting that health plans reduce their prior authorization requirements and limit that application only to outliers.

With the expectations and regulations put upon provider practices, they are focusing their resources on referrals and prior authorization rather than on direct patient care. Overwhelmed billing departments are outsourcing these transactions due to workflow and lack of confidence. Depending on the type of billing agreement a practice has with their vendor, these costs can range from a few hundred to a few thousand dollars per month per provider. Overhead has not decreased. Dollars have not been saved at the practice level. Funds have only been reapportioned. They have been reappropriated to technology support, vendors, security risk analysis, and upgrades to hardware and software. Additional changes to electronic standards and operations have the potential to disrupt these workflows and have a significant financial impact on both small and large practices.

I ask you to be mindful of these costs as decisions are made to advance innovation through technology and setting up a standard for frequency of these updates. I want to thank you again for allowing me to present to the committee today.

DOO: Thank you very much.

HEFLIN: Thank you. I actually want to dovetail on Dr. Goodyear’s comments. One of the things I am going to talk about today is actually the API he described.

Essentially, the Sequoia Project, whom I am representing today, actually is a curator of a national scale application program interface that allows the clinical data to be accessible to EMRs and it is in production today. It is actually in widescale production today.

My comments are mostly a little bit of educational material about Sequoia. Because we are a non-profit, we don't do much marketing, so many organizations haven't really heard of us. I will also talk about how this can dovetail into some of the objectives stated for this meeting.

By way of background, the Sequoia Project is a 501(c)(3). We are a nonprofit. We are not a trade association. We are organized for the public good. So, our only reason for existing is really to facilitate healthcare data interoperability. We actually manage three, and soon four, essentially separate initiatives, which I will talk about very briefly today: eHealth Exchange, Carequality, the RSNA Image Share Validation Program, and then a project called PULSE.

eHealth Exchange, which you may have never even heard of because we are kind of a parallel world, really, to the billing and payment world, is the United States' largest public-private health data sharing network. Again, you may not have heard of us because we are a non-profit, so we are not out there marketing or selling or anything. We have a presence in all 50 states. Four federal agencies are connected and live in production. We have 70 percent of all U.S. hospitals connected and live today and 59 regional health information exchanges.

We also support a broad number of use cases, including treatment, care coordination, this is, by the way, using industry standards, social determinants, Social Security benefits determination workflows, where SSA uses our API, essentially, to determine evidence existence in electronic format rather than going to paper. We have an immunization use case, release of information for consumer-based access, syndromic surveillance use cases, encounter alerts, release of information for life insurance, again, this is basically consumer access, PDMP, prescription drug monitoring, lab reporting, and image share. So, that is our first initiative, which is called eHealth Exchange. That was really our bedrock. By the way, it was actually started by HHS and ONC about 12 years ago. I have been involved since that point.

Our second initiative is called Carequality. So, the eHealth Exchange I just mentioned, one way of thinking about it is it’s a network. Carequality is the only organization to my knowledge in the United States whose mission is to connect those various networks together. The analogy we use is what if you have a cellphone from provider A and you couldn’t call somebody on a cellphone from provider B. That is actually the reality today for healthcare networks.

So, CommonWell has a network, eHealth Exchange is a network. I am sure Surescripts has a network. EPIC has a network and so on. So, Carequality's mission is to provide a legal trust framework and a technical framework allowing those to interoperate. It has been very successful in doing that. As of today, we are exchanging around 2.4 million successful clinical patient documents in production, in structured data format.

The third initiative is the RSNA Image Share Validation Program. RSNA came to us a few years ago. They are the world's largest, I think, trade association for imaging. They came to us and said every other industrialized country on the planet has a single national strategy for radiological image data sharing but the U.S.A. They wanted to work with us to fix that problem because, as you are probably aware, radiological images are some of the most expensive tests and, very importantly, radiological procedures, whether interventional or diagnostic, can actually harm patients almost by definition. So, it is very important not to duplicate these tests and to share them efficiently. Our third initiative is designed to help facilitate that.

Our final one, this is actually brand new, is actually – so, the PULSE system, which stands for Patient Universal Lookup System for Emergencies, is a new ONC grantee-based project designed to have another single national strategy to help displaced people in the event of a natural or a manmade disaster. It is actually deployed now. It was deployed – it was activated for California for the recent wildfires. I was actually the technical lead for the deployment and activation of that project. The purpose is essentially to give any authorized physician or authorized individual, such as a pharmacist and so on, access to patient data in the event of an emergency.

During Katrina, one thing we found out, for example, is that the single biggest healthcare requirement for a displaced citizen is to determine their current medications, then current problems, then allergies. This provides a single, unified approach to accomplish that by plugging into Carequality and eHealth Exchange.

Everything we do is based on industry standards. I have listed them here. I won’t go into them in detail today.

Everything we do is based on open industry standards. Where they don’t exist, we help create those. Bottom line is the reason that I am here today is I would like to actually offer kind of a – I am not saying an olive branch because we are not in any way competing, but I would like to find a bridge. Really, the payment world right now is not talking to the healthcare data exchange world. We have built, as of today, a network that touches 75 percent or 70 percent of all hospitals. Why are we not using that for billing purposes? There is very valuable clinical data. One of the things we have been passionately pushing for is to make sure that data is interoperable. It is based on structured data. It is not unstructured content. All of that data is available through secure use to your communities today. I would like to find a way to bridge that gap so we can actually work together and provide benefits for our mutual citizens and patients. Thank you.

DOO: We are back on track. Janet.

CAMPBELL: I don’t have any slides. I will make it easy for folks. So, my name is Janet Campbell. I am a software developer. I work with EPIC. We do electronic health records, which I think people now know. What people maybe don’t know is the breadth of the groups that we work with. We work with large academics, small clinics, huge organizations, ACOs, CINs, retail clinics. We work with HMOs.

We are starting to expand more into nontraditional settings of care, so dentistry, specialty pharmacy, specialty clinics, et cetera. We are moving now more into the population health space and, in fact, working directly with providers on those efforts. So, we really see the electronic health record as being very beneficial in all of these spaces, especially when it comes to data exchange among all of these different settings of care.

Groups that use our software work in all 50 states and 11 countries. We also have a lot of experience working with a lot of different standards and competing standards. I would say that overall we are a strong supporter of standards-based development. It is certainly easier to work with when it works right. Of course, it is something that our organizations demand that we do, so we support it.

We are strong supporters of HL7 and IHE. We have also found a lot of benefit in newer standards initiatives, if that is the best way to put it, like Argonaut and the Da Vinci Project, which are focused on actually implementing and testing the use of these standards and learning in small pilots before they go to being fully codified. Of course, we are strong participants in Carequality and the eHealth Exchange. In fact, I was thinking, and Eric mentioned this as well, but from Dr. Goodyear's comments, the Social Security Administration through the eHealth Exchange does allow for that CCDA exchange to support disability claims, which has resulted in huge cost savings for organizations that use it.

I would say when it comes to standards, probably one of the most complicated things for us is when standards are contradictory, of course. And not just contradictory in terms of how they are implemented, but even in terms of their methodology. Stan Huff from Intermountain has a great example of whether you would represent cancer of the right lung as one concept, cancer of the right lung, or three concepts, cancer, right, lung. It is that type of mismatch between data that makes it difficult to implement from a technical perspective.
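
Stan Huff's lung-cancer example can be sketched in a toy form, using hypothetical placeholder codes rather than real terminology identifiers:

```python
# One pre-coordinated concept: the whole clinical fact as a single code.
precoordinated = {"code": "C-RL-001", "display": "cancer of the right lung"}

# Versus three post-coordinated concepts composed together.
postcoordinated = {
    "finding":    {"code": "C-001", "display": "cancer"},
    "laterality": {"code": "L-001", "display": "right"},
    "site":       {"code": "S-001", "display": "lung"},
}

# Naive comparison fails even though both mean the same thing; matching them
# requires a terminology service that can decompose or compose concepts.
print(precoordinated == postcoordinated)  # False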

Above and beyond that, what concerns us more is when standards development doesn’t necessarily consider the upstream consequences, by which I mean if we ask clinicians to record more of this data that results in usability challenges almost 100 percent of the time. We can do our best to be able to ease that burden, but if the data is not in the system, we have to ask somebody for it. Typically, that is someone in the healthcare delivery system. That is difficult. It is always difficult for folks. So, we really try to urge standards development to consider how the data gets into the system in the first place, not just how we spit it back out.

In terms of timeframes, one thing that we have to keep in mind is with so many organizations – not every organization is using the same version of the software. Not every organization has the same plan for upgrades. That either means that we are pushing changes back to multiple versions of the software or they are needing to change their upgrade plans in order to get to a version that supports it. We see that as – the predictability is huge there. The more we know in advance and the more organizations know in advance, the more they can budget and plan for that implementation.

We have actually worked with ONC on the standards advisory work. We think that is useful. It is very helpful. It has the right approach. I would say that it could be more predictable and more dictatorial, and also maybe more focused on research in the industry to understand what standards are available and how in use they really are. There is a difference between a small, localized pilot used at a few groups and something being used widely across state lines. That distinction isn't always captured in the versions of the standards advisory.

I mentioned, in fact, we have been trying to spread out into working more with payors directly. Part of that actually is to see if we can start to eliminate claims in some scenarios and other types of things to get around the standards problem. So, I think it is a huge issue for groups. Anything we can do to make that easier would be great.

DOO: Great. Thank you very much. On that note, we are going to take a break. It is time for a break. We are going to start with Joe Bell when we get back. You know you are on cue.

CAMPBELL: Can I grab ten seconds of my time back because I just got an email?

DOO: Yes.

CAMPBELL: The SSA example, the SSA just did a study – I just thought it was really interesting. The SSA just did a study looking at that electronic disability thing and found a 70 percent reduction in uncompensated cost per claimant and a 42 percent overall reduction of uncompensated care costs. So, the standards can work. I think that is really powerful.

DOO: Great. Thank you. You had a minute left. So, ten minutes.

(Break)

DOO: Mr. Bell, you are up.

BELL: I would just like to say we are honored to be a part of this conversation. My name is Joe Bell. I am the Chair of the Cooperative Exchange, the National Clearinghouse Association. We represent over 29 clearinghouses. We are the guys who are caught in the middle. We process both administrative and clinical transactions. We represent over 85 percent of the clearinghouse industry. Our submitting provider organizations number over 75,000. We have over 7,000 payer connections, over 1,000 IT vendor connections, and process over $1.1 trillion in transactions.

We are basically trying to explain our experience, the industry impact, and the recommendations that we have. Feel free to read them, but, for the sake of time: we basically find that the standards have a problem because of the timeframe it takes to come out with them; they lag behind new technologies by quite a significant number of years. So, we need to build agility into the process. We also need to have business requirements clearly defined and provable ROIs to make it happen.

We recommend a governing entity with a skillset and financial backing to enable a methodology/process to assist in coordinating business and clinical needs across SDOs so that the industry can implement emerging technology standards and operating rules proactively to expedite stakeholders’ existing and future business needs with their respective workflows. So, if you define the rules and make sure the rails on the tracks are there, we promise we will build the train.

Some of our industry initiatives — being the middle man, we are actually supporting our member companies with quite a few emerging-technology initiatives, such as FHIR, RESTful APIs, privacy and security, cognitive computing, and blockchain. We also have some pilot programs that are industry collaborations while waiting for regulations — attachments, prior authorizations, facilitating administrative and clinical information exchange, and a FHIR application for interoperability.

We need to define the ROI as we do these pilot programs because we have to have that justification in order to be able to build that train I was talking about.

DOO: Thank you very much.

GOSS: While we are going to the next person, it’s my perception that if you have given us slides we will be posting those on the website, so, if you are not okay with that, please make sure you tell us.

DOO: Next, we have Keri Grizer.

GRIZER: Thank you for the invitation. I am glad to be here. My name is Keri Grizer. I represent Optum 360, which is part of the United Health Group. We are the technology branch. I am here also representing HATA that Brad mentioned as well.

I just want to talk you through a few of the challenges and opportunities. I won't speak to all of them. There are a lot of common themes that I have heard, so I have been nodding my head and tapping along as several of you made your points with your experience. But a few that I wanted to highlight: we are a revenue cycle management company — that is what I'm here to represent as far as our products and solutions — and we are trying to satisfy clients' needs in a timely fashion. Obviously, standards help support that, but we often have to go beyond them.

We look to those standards when we are trying to meet needs timely, but those standards often lack a little bit of the flexibility we need to be innovative, so we're looking for ways to be more transparent and meet those business needs within the standard transactions. Something that comes to mind would be payer-specific billing. Someone mentioned that the CARC and RARC codes aren't specific enough. So even within standards we are trying to comply, but often we need to think a little bit outside the box, so we are looking for some ways to be flexible within the standards to support that innovation.

There are still manual processes in place. I laughed about the 4010 to 5010 conversion. You know, we still deal with paper claims to some degree. Many of you are probably rolling your eyes, but we still live in a world sometimes where that still exists, and it is painful because, in the spirit of electronic transactions, we obviously don’t want to have to spend any of our time and resources and costs to support paper.

There are lots of opportunities. One that we try to do is, within the solutions that we offer and that we collaborate with other vendors on, to be able to leverage existing end-user workflows. There is nothing more painful for a provider or provider organization, big, small or medium, than to come in with new solutions and you have to totally redo your workflow. So we really try to have a focus on surrounding their existing workflows and enhancing them, but being able to leverage the standards to do that.

We are also trying to exchange information prior to claim submission, so claim statusing comes to mind: being more innovative and flexible with those transactions so that we can get more frontend messaging. The bottom line is that we want to reduce denials. A denial-free universe is maybe not in our lifetime, but certainly whatever we can do to bring more actionable feedback earlier in that EDI stream for the users and for the providers will prevent, or at least reduce, denials downstream through these standard transactions.

The other opportunity I see, piggybacking off of that, comes from the Secretary of Health and Human Services and his statement back in March about transparency. There is a lot of data and a lot of great ideas in this room alone, let alone in the industry, but that transparency to meet those business needs is really critical.

I just wanted to talk briefly about some of the initiatives we have going on within Optum and United surrounding pricing transparency in particular. PreCheck MyScript is just an example of a provider-facing solution that allows the provider to look up the prescription, maybe offer alternatives, and be able to see the cost and communicate it to the patient.

What patients want is understanding. You all have been patients. Certainly, everyone in this room has had medical services. You just want to know upfront what it's going to cost you. That is so critical, and so often we lose sight of that. This is just an example of one innovation within our industry where providers can check prescription pricing while the patient is still there and guide them appropriately so there aren't any surprises on the back end.

Another initiative we have done is patient estimation, specifically with Quest, enabling the phlebotomist in a lab setting to order the lab. We are leveraging the predetermination transaction, which many of you may not be familiar with. It is a flavor of the 837 that is not commonly supported by practice management systems; they tend to use the claim adjudication standard. But we are using that predetermination claim to generate the eligibility request to the payers and leverage clinical editing and contract-allowed amounts and, within three to four seconds, return what the patient is responsible for owing at the time of that lab service, again, so there aren't any surprises and we can communicate and be transparent at the point of service.

This initiative has been going on for the last year, but it has been live for about the last six months, and so far we have processed over 2 million transactions with a 95 percent success rate, leveraging the 835 to communicate back the patient responsibility amount. We are looking to be able to realize a 15 percent increase in getting the patient to agree to authorize the credit card for the amount that we have calculated, and are currently live with 25 payers with, again, about a three to four-second response time.
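
The patient-responsibility arithmetic behind such an estimate can be sketched minimally, assuming a simple benefit design of remaining deductible plus coinsurance on the balance; the amounts and the function itself are illustrative, not Optum's implementation:

```python
def estimate_patient_responsibility(allowed_amount: float,
                                    deductible_remaining: float,
                                    coinsurance_rate: float) -> float:
    """Apply the remaining deductible first, then coinsurance on the balance."""
    deductible_applied = min(allowed_amount, deductible_remaining)
    balance = allowed_amount - deductible_applied
    return round(deductible_applied + balance * coinsurance_rate, 2)

# Lab panel with a $120 contract-allowed amount, $50 of deductible left,
# and 20 percent coinsurance:
print(estimate_patient_responsibility(120.00, 50.00, 0.20))  # 64.0
```

In the live workflow described above, the allowed amount and benefit figures would come back from the payer's adjudication of the predetermination claim, not from hard-coded inputs.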

So, just a little bit about some of the initiatives that we have going on to support that real-time pricing transparency that certainly is a priority of Health and Human Services. Thank you very much.

DOO: Thank you very much. Very interesting.

Dr. Larson, you are up now.

LARSON: I have probably more slides than I need so I won’t go through them. I will just talk to them but you can put them up if you want.

I am Dr. Kevin Larson. I’m an internist and a medical informaticist and work at the Centers for Medicare and Medicaid Services in the Office of the Administrator, and part of my role there is to help coordinate health IT across the agency. You can imagine there are a lot of different pieces and parts to that, so I will highlight a couple of them here.

One of the projects in my slides has been mentioned here already, which is the Da Vinci project. This is the claims attachment project looking to test claims attachments using APIs. I won’t go into that too much other than to say we are very interested in how we can make this happen much more smoothly and more quickly. A number of you have been participating in testing of that, thank you very much.

Another is our MyHealthEData, which is patient access to data. It is a big initiative, a big priority for our administrator and for our secretary: how do we make sure patients have their own data, and there is the point of transparency again. There has been a Blue Button initiative that many people participated in, with an ASCII file where patients can download their records from both providers and some payers. We are moving to make that an API and have it be much more open and much more available to patients.
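
The API version of Blue Button serves claims data as FHIR ExplanationOfBenefit resources. A toy, offline sketch of pulling a couple of fields from one such resource follows; the sample payload is abridged and hypothetical, not an actual CMS response:

```python
# Abridged, hypothetical ExplanationOfBenefit-shaped resource.
sample_eob = {
    "resourceType": "ExplanationOfBenefit",
    "billablePeriod": {"start": "2018-01-04", "end": "2018-01-04"},
    "payment": {"amount": {"value": 135.17, "code": "USD"}},
}

def summarize_eob(eob: dict) -> str:
    """One-line summary of a claim: service date and amount paid."""
    period = eob["billablePeriod"]
    amount = eob["payment"]["amount"]
    return f"{period['start']}: paid {amount['value']} {amount['code']}"

print(summarize_eob(sample_eob))  # 2018-01-04: paid 135.17 USD
```

The point of moving from an ASCII download to an API is exactly this: a third-party application a patient authorizes can read such resources directly instead of asking the patient to ferry files around.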

In talking about the place we are at here, I will focus a little on some of the projects that I have been more deeply involved with because I can speak with a little more expertise. My work previously was with the Office of the National Coordinator with Chris, and there I helped oversee the electronic clinical quality measurement system for ONC and CMS, especially the standards and certification and testing for that.

That was an interesting space because we had mandates for the time by which things had to be ready for national scale, and those mandates didn’t really give us much wiggle room to get standards that were just right and to do all the testing that we wanted and needed, so we said, how can we really think differently? We built a kind of coalition where we worked to do agile development of our measure specifications, our tooling, and our standards, and had the whole industry try to work in an agile way with constant small bits of iterative feedback.

I won’t say that we were fully successful in that, but we got far further far faster than if we tried to do what we had historically done, which was to do large-scale waterfall batch processing for each thing, which is to say let the standards people take a couple of years, sit in a bunch of rooms, build a standard, hand it to us. Then let the specification people take a couple of years, build the specifications, then have the software people take a couple of years, build some software. Oh, it doesn’t work! The standards don’t work. Surprise, surprise! We didn’t actually test the standards.

So I think one of the big opportunities here is how we can all work more closely together in these much smaller-scale, lower-risk, agile iterative feedback loops. How do we take lessons from Agile and de-risk this whole process by building much smaller chunks of change with a much lower bar of testing and shorter feedback cycles, all the way through?

One of my insights through all of this is there is not really much of a business model for people to do that kind of testing that the industry wants. What we heard over and over again is, hey, we would love a standard that has already been adopted by a whole bunch of the industry, and then we are ready to take it. We called around, okay, who wants to test and try that. Oh, we don’t have time for that; we’re too busy implementing and validating the old standard. Let somebody else do that. But please tell us when 30 percent of the market is already there and then we will jump onboard.

I think we all aspire to new and better, so the question is how do we all have a part in that and how do we de-risk it by having it be in smaller chunks so that none of us has to take the giant risk and we can all take our piece and test and try that so that we can build these loops. So, what have we done?

We actually built a Web resource — a combination of ONC, CMS, AHRQ and the National Library of Medicine together — and combined a whole set of technical resources and inputs into one place. This is kind of like the meaningful use website for nerds. If you want the implementation guide and the direct connection to the HL7 standard document, and you want to give feedback on any of that, there’s a single ticketing system. You can put in a ticket and we manage and triage it between the agency, the standards body, SNOMED CT, the National Library of Medicine — all those things are managed together. You don’t need to know, as a technical software developer or implementer, which of those things is the solution. We handle that.
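The single-intake triage idea can be sketched as a simple keyword router: one queue in, routed to whichever organization owns the fix. The route table and owner names below are invented for illustration; the real system spans multiple agencies and SDOs:

```python
# Invented sketch of a single-intake ticket triage: one queue in, routed to
# the organization that owns the fix. Route keywords and owner names are
# hypothetical, not the actual ONC/CMS tooling.

from dataclasses import dataclass

ROUTES = {
    "value set": "NLM/VSAC",
    "snomed": "SNOMED CT / NLM",
    "cql": "HL7 Clinical Quality Workgroup",
    "measure": "CMS measure steward",
}

@dataclass
class Ticket:
    text: str
    queue: str = "unrouted"

def triage(ticket: Ticket) -> Ticket:
    """Route a ticket to its owning organization by keyword, so the
    implementer never needs to know which body resolves it."""
    lowered = ticket.text.lower()
    for keyword, owner in ROUTES.items():
        if keyword in lowered:
            ticket.queue = owner
            break
    return ticket

print(triage(Ticket("The value set OID in the spec doesn't resolve")).queue)  # -> NLM/VSAC
```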

Then we coordinate that across this ecosystem of different players, and then we queue things up. For example, for HL7 we created a queue for the clinical workgroup, and as they meet we have a ticket queue that has come in through this and we don’t change their process but we say, hey, here’s the ticket queue of the things we prioritized from the inputs we have from the technical implementers of this. Can we work with you, can we be on the committee and help to resolve that? We often will also, through various government contracts, support some of the people who are on those committees to off-load some of the work of the committee where we do some of the technical activity.

So this helped us get from a really unstable, immature standards space to a much more mature space where people don’t throw rocks at us nearly as much when we go to meetings. Thank you.

DOO: Thank you, Kevin. Very good.

Next we have Liz Johnson. I think it’s amazing that I feel like I’m sitting in a room with people that understand what I deal with every day. It’s terrific.

(Laughter)

JOHNSON: Here is some stuff, not particularly important except for the fact that I have been — you know when somebody says I’ve been at it 17 years, I’m like, you could be my son. I have been at it 40, literally, so I have done a lot of work over the years with standards. I have some critical roles with CHIME, and I did put a slide in at the end about CHIME. They are really a great opportunity for all of us to be able to access CIOs nationwide, which really then gives us the information we need to figure out how to not repeat each other’s errors.

I would have to say to Keri and Kevin, first of all, yes, we are definitely still in a paper world. We do paper claims every day. I don’t consider Fax electronic but we use it every single day. I think that, as we talk about how do we move to a more digital world and we talk about how can standards work with it — and then I have to say to Kevin, no, I do not want to be your testing partner —

(Laughter)

We are just going to get it all straight right off the bat. So I think we really do need semantic interoperability and fancy words, but the reality of it is we do not talk to each other well.

While I think Cures and TEFCA and all of those acronyms are out there and I think they are real, and I think we really want to get there — and it was interesting listening as we even talked about some of our vendor partners — and Tenet has the opportunity to work with all of them — we still are not talking to each other very well and we still haven’t given our patients the ability, much less our physicians — that’s why I was so interested to hear from our surgeon — that we really don’t pass information back and forth in a very easy way to understand. I always love it when the standard says human readable. For any of you who don’t know what that means, please print out a human-readable form and then try to explain it to a patient. So I think we all get it, going there.

I think we need to really talk about value, and I have heard it several times. We are a for-profit. Tenet is a for-profit, big organization, crosses 43 states, 63,000 employees, etc. The reality of it is we have the same problems. It doesn’t matter how big or small you are, the problems are the same. Scalability doesn’t even matter because the reality of it is I have more resources than a smaller place so it’s all the same. But we are not doing it, and it’s really hard to get people to buy into the cost because the value can’t be proven yet, and we need help with that.

I think that standardized attachments are critical. What we did, just so you know — and I will be honest. I don’t know that it stays in this room, but I think it is safe. We still print out everything and send it with our claim because it’s so much faster to print it, scan it and send it all at one time rather than waiting for the next request for the next document, so we still do that today.

I am amazed because I was involved in 1996 when HIPAA came out, and I was so excited about the administrative implications. I think we have made progress, I really do, but I think there’s much more progress to make. I think if you really think about interoperability and really think about the patient you will get to the right place.

I will give you just some other quick comments about things we deal with every day that are really hard. Everybody has talked about timing to implement. It doesn’t matter which standards you’re talking about. I think two years may be too long for each segment, but I think the minute we get a standard that people agree on we’re looking at a minimum of 15 to 24 months to roll it across our whole organization, get it from our vendors, test it, and yes, of course, we want it to all be right in advance. It is not. Then we start rolling and then we have to do all the workflow assessments and changes.

As we think about that and we try to be innovative it’s really hard to talk about timeframes that feel like that when somebody in their garage can create an API in five minutes and put it out there for us to use. Somehow we have got to come to a balance.

We are still having a lot of trouble with tax IDs, and I will tell you what that means in our world. We have numbers that identify our entities. They each have a tax ID, but if you have one CCN — for those who know what the language is — and it’s five hospitals or five clinics or five whatevers, the tax IDs are separate but we get one remittance, so we end up with all that work behind the scenes.

We do electronic payments but we have a lot of people that want to use virtual credit cards right now. I love the comment that was made earlier — those get faxed. We’re kind of going, okay, that works. Everybody that has a portal that’s a payer — and I love you all — they all have their own X12 standards. They take the X12 standards and create their own template, so we have hundreds of templates that we have to respond to for X12. They are 100 percent okay. They have followed the standard. This is not a complaint about them, but we have to interpret.

I think it all comes down to we want to be a good participant. We want the patients and clinicians to be able to get to their charges, their billing, their data and understand it, but we are not there. What we need — and that’s why this is exciting for us and for me personally — is we have come some of the way on the clinical data being exchanged — and luckily today you don’t have to listen to me talk about C-CDAs — but there’s more standardization and exchange of data than we have ever had. That is phenomenal. But the patients don’t understand it, and the clinicians can’t always use it, and our payers don’t always work on the same page we do.

Thank you for listening to me. I feel much better now. Thank you.

COUSSOULE: We didn’t talk about the catharsis involved in the data.

(Laughter)

DOO: Next, Mandi Bishop from Gartner.

BISHOP: I am Mandi Bishop. I recently joined Gartner. We are an IT analyst firm that has been around so long that our stock ticker is IT.

I am in our CIO research and advisory division, so I spend all day every day having exactly these conversations with CIOs from your organizations as well as others, and I act as human ETL. I am translating the provider plan into payer implementation ideas, so, thinking about payer strategy. I have to talk to them about what the administrative inefficiencies for providers mean both from an RCM standpoint as well as from a payment integrity standpoint for the payers.

I also am on the Board of Directors for the Society for Participatory Medicine and the National Health IT Collaborative for the Underserved. As part of those advisory services I have to talk about and kind of remind us all — and I haven’t heard it yet that we’ve mentioned the patient — that every single one of these inefficiencies, every single one of these administrative mistakes, every single one of the claims that are denied as a result of our lack of standards and the lack of interoperability has a material financial and potential health impact on every single one of those patients.

Medical debt is now the leading cause of bankruptcy for seniors. A lot of that can be traced back to the 90 percent of claim denials that are preventable, which is in large part due to the lack of these standards, the lack of claims attachment standards, the prior authorization challenges, the black box processes that we all have in place for medical necessity — all of the items that you have talked about. Ditto. Yes, I hear you. I hear this every day. We can do better, and we must do better, both for our businesses and for those patients and members that we serve.

I am going to be the fastest and the shortest so that I don’t hear that —

(Laughter)

DOO: Thank you, Mandi. Margaret Schuler.

SCHULER: Thank you for having me here. I am super-excited to be here. I am very passionate about this topic. I’m from Ohio Health. I represent a provider. We are located in Ohio — I wanted to make sure you were all awake.

(Laughter)

We are a not-for-profit organization. We have over three million visits. We have employed physicians, we have 10 hospitals in our health system and we have over 300 physician practices. We have urgent care, home care, hospice, et cetera, so we are a large health system and so I am very familiar with all the standards.

We collect about $3.8 billion in cash a year — just to give you a sense of the magnitude of who we are as a provider.

Who am I? I am Assistant Vice President of Revenue Cycle at Ohio Health. I entered the industry at the birth of HIPAA, so it’s near and dear to my heart, administrative simplification. I am hopeful that before I retire we’re going to really see administrative simplification.

We are under tremendous pressure on the provider side to reduce our costs. Our reimbursement is getting cut every single day. You read about that in every publication. I am overhead; my team is overhead. I have 1500 FTEs that report to me. My budget is six figures, and if we had true administrative simplification I could cut that in half overnight.

I have humans touching and moving claims, information, all the way from the front end to the back end. My role is I oversee patient access, coding, HIM and all the business office transactions and cash posting. I have been in Revenue Cycle for 20-plus years on the provider side. Again, very passionate. We can reduce costs.

I have a provider’s perspective, and I love what Liz said. I love my health plans, but my health plans, when I talk to them about these standards they say, Margaret, we hear you, but we have put all our money into our portal. We are meeting the minimum standards in the industry, but our money is in the portal, so you have to go to our portal.

Well, me going to the portal means going to 200 portals, and they are all completely different. They have different sign-ons, they have different requirements to look up patient information. So, what we are having to do now, because standards are not adopted, is go and screen-scrape electronically from payer portals, which is not what we need to be doing. This is old technology. I’m using bots now to screen-scrape payer portal information.
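As a rough illustration of the screen-scraping fallback described here, the following pulls a claim status out of portal HTML with the standard library. The markup and class names are invented; every payer portal differs, which is exactly why the bot templates multiply:

```python
# Invented example of the bot workaround: pulling a claim status out of
# portal HTML. Markup and class names are made up for illustration.

from html.parser import HTMLParser

class ClaimStatusScraper(HTMLParser):
    """Grab the text of the first <td class="claim-status"> cell."""
    def __init__(self):
        super().__init__()
        self._in_status = False
        self.status = None

    def handle_starttag(self, tag, attrs):
        if tag == "td" and ("class", "claim-status") in attrs:
            self._in_status = True

    def handle_data(self, data):
        if self._in_status and self.status is None:
            self.status = data.strip()
            self._in_status = False

scraper = ClaimStatusScraper()
scraper.feed('<tr><td class="claim-id">12345</td>'
             '<td class="claim-status">Denied - authorization missing</td></tr>')
print(scraper.status)  # -> Denied - authorization missing
```

A separate scraper like this has to be maintained per portal, with different logins and layouts, which is the maintenance burden the speaker is describing.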

The transactions that I do for my friends at the health plans are empty or so generic that I have to go back to the portal to get good information. I will give you an example. Claim status — if we can get this baby humming, we can cut costs. That is why I’m hopeful that we are not going to wait until 2030 to do this. There are standards already available out there, and if we can just get the adoption and the content in these standards we can cut costs.

So, going back to the next slide, the 277, the claim status, is not working. Again, it’s empty. Again, we have to get the proprietary health plan information off the portal. We are right now working with a vendor to help us with the emulation and that screen-scraping.
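The "empty 277" problem can be made concrete with a toy check on the STC segment's status category. The category codes and segment layout are simplified assumptions drawn from the X12 277, not any specific payer's behavior:

```python
# Toy check for the "empty 277" problem: a claim-status response whose STC
# segment carries only a generic acknowledgement category forces staff back
# to the payer portal. Category codes and layout are simplified assumptions.

GENERIC_CATEGORIES = {"A0", "A1"}  # acknowledgement only: no adjudication detail

def needs_portal_lookup(stc_segment: str) -> bool:
    # Simplified STC*<category>:<status>:<entity>*<date>*...
    status_info = stc_segment.split("*")[1]
    category = status_info.split(":")[0]
    return category in GENERIC_CATEGORIES

print(needs_portal_lookup("STC*A1:19:PR*20180517"))   # -> True (generic ack)
print(needs_portal_lookup("STC*F2:104:PR*20180517"))  # -> False (finalized detail)
```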

The other thing Dr. Goodyear mentioned — authorizations, the number one issue in the industry. I am very active with HFMA, the Healthcare Financial Management Association. That’s how I got invited here. I go around the nation and speak about denials. It is a problem. It’s completely manual. We want to give the health plans whatever they want to pay the claim, but it’s extremely labor intensive. The physician practices are supposed to initiate the authorization. They have to go to 100, 200 different portals to figure out if it even needs to be pre-certified, and then by the time we figure all that out they need to get their patient in for care, and we get a denial.

It is a huge issue that I cannot stress enough in the industry that’s causing — just the rework and the cost.

The other items I have up here — I will make a plea for just the basic transactions that are already available. If we can get these working rather than investing in — we can keep with the research and development for new transactions, but if we can just commit to getting these transactions actually working, I think we could make significant progress in the industry around this space.

Again, thank you for having us here, and thank you for having the provider’s voice at the table. I am so passionate about this space that I actually joined the X12 committee eight years ago. I only lasted a year —

(Laughter)

— because they were talking about standards that were five years out. But there are very few providers represented in this space just because we don’t have the time or the availability of resources to commit to being part of the standards development, so I really appreciate that you’re hearing the voice of the provider. Many times, we are the ones that end up dealing with the downstream effects of the lack of standardization, so thank you very much.

DOO: Thank you very much. Mark.

GINGRICH: I am Mark Gingrich, CIO of Surescripts. We were founded back in 2001 by a combination of the PBM industry as well as the pharmacy industry, and merged in 2008. I have been around for the whole ride — first employee, first CEO of RxHub — so I have been through this whole standards creation as well as taking standards through the standards process and the NCVHS hearings and all the rest.

We have come a long way since 2001. We are now one of the largest health information networks in the country. I think our annual report says we processed nearly 14 billion standards-based transactions in 2017. A significant number of those were Script transactions, but we are also involved with X12 and eligibility, as well as HL7 for clinical exchange and medication history, so we have done a lot in that area.

Besides involvement in, you might say, the traditional SDOs, we are a member of the Argonaut Project — and I don’t need to say a whole lot about that since others have — as well as Da Vinci and the Sequoia Project and Carequality, so we are involved in all of these. From my standpoint, our company’s standpoint, we appreciate working in those worlds because we’re not just defining standards, we are also using them. And the whole point is to implement and use and evolve the standard to get it to the right level for mass use, so we are definitely involved with that. We are involved in some other standards activities as well.

We appreciate the role that NCVHS plays and regulators play with the standards; however, I will just pile on — the pace of the process is not keeping up with the pace of business and the business needs to increase patient safety, reduce costs and improve the quality of care.

One case in point is what we’re going through right now with NCPDP Script. We are going through our third iteration of a final rule. We have gone from 8.1 — I don’t have the dates written down, but I know the durations. That became a standard, and three years later we took Script 10.6 through, so the timeframe from when the standard became ANSI-accredited was three years, and until we had it in a rule and were driving the use of it was a four-year timeframe.

Now we finally have a new rule in place — well, we are working with it because there’s a transition, and, to some people’s point, it does take time to transition from one version to another. But the rule goes into effect January 1 of 2020, and the sunset date for 10.6 is January 1, 2020.

So, we have some work to do from a transition standpoint. When you look at the time and duration from when the last version went through, you’re talking from 10.6 to 2017071, which is the one that will become effective in 2020 and is already two and one-half years old. The duration between that and the previous version is nine years, so you’re talking — it is not evolving fast enough for the industry.

So, anything we can do to allow us — and you talk about the pace and predictability. For me, what’s happening — and a term we use internally for the engineering work we do is if we are not keeping up with the pace of technology and improvements, we’re incurring technical debt. So the more frequent, smaller chunks of business value we can have out there the less technical debt we’re building up as an industry, which makes it easier to go to the next version versus harder.

I think I will end with that. Everything else I think has been said multiple times. I do look for ways that, again, the industry, say through NCPDP, can take a little more ownership in how we get a new version out there, because I think at least in that industry we are pretty good at evolving the standard and doing it with a high level of consensus and without a lot of customization. We are pretty good at getting it out. I will end with that.

DOO: Thank you. I demonstrated a little bit of an alphabetical challenge earlier, so I missed a J, and I apologize. We do have Jon from Walgreens, and he does not have slides. I’m sorry I missed it on my list, so if you would grace us with your comments that would be great. Then we will go back to our regularly scheduled program.

ARENDS: It’s quite all right. Thank you. I was a late add. Thank you very much. My name is Jon Arends. By training I am a pharmacist. I represent today Walgreens, Walgreen Co., a subsidiary of Walgreens Boots Alliance, a global company. Specifically for Walgreens, we have 8,000 pharmacy locations as well as about 1,300 acquired locations that will be coming into our network or are in progress coming into our network.

We have 24,000 pharmacists and over 200,000 pharmacy technicians that actually are our end users and use these standards on a daily basis. Specifically, we work mostly with NCPDP standards, looking at Script and Telecom, our two main ones, but as Mark mentioned, we do a little bit of eligibility as well, under Surescripts. My role is I specifically work on electronic prescribing and pharmacy automation.

The positives I see from standards-based organizations and standards in general in our organization — we really see that the electronic standards help us improve our quality outcomes, and one thing that we really hold sacrosanct at Walgreens is patient safety. We have seen, as we have moved from paper-based to electronic, that we get an increase in patient safety there, and so we are advocates in general for electronic transmission of information.

On top of that, we have also put in a lot of automation, and I have heard other people mention that. As we start to take the human element out of some of the routine tasks that we have in the pharmacy, we are finding that we have even better patient safety outcomes and adherence outcomes for our patients, and we allow then our staff to refocus on the actual interactions that require people, that actually do more than the rote day-to-day stuff. They are the ones that are talking to the patients, making sure they understand how to take their medications. And that is where we really would like to spend the majority of our time and have these standards allow us to automate some of the more routine tasks.

Also, like I said, it improves our efficiency. As we have talked around lowering the cost of healthcare and reimbursement pressures that we’re all facing, one of the things that we have to do is get more efficient. So the add-ons that come with each iteration of standards allow us to do more and more things in an automated fashion, which allows us to then still compete in the marketplace while bringing down healthcare costs globally.

A little bit about me specifically — I am in the pharmacy systems team, but really that’s a subset of operations, so I deal with our questions. We have physicians that reach out to us and they don’t understand how their EHRs work and they ask us questions about the EHRs. We have our pharmacists who ask us questions like, well, this prescriber doesn’t know how to find me; can you help. We are really on the front lines of triaging where the things don’t quite work properly.

We take all of that feedback and we go back to the standards organizations and go back to our network operators and say here are our pain points; how can we address them.

I am active, as I said, in NCPDP in the standards development role. I also have an industry-facing role where we talk about clinical quality improvement efforts with our network provider. And, more specifically, my biggest role though is our interface with our IT team, where I end up writing all the requirements. They are not familiar with the standards. I am familiar with the standards; I know enough about technology to be able — who said it? The human ETL — to translate what the standards are trying to say and incorporate that into how operationally we want to actually adopt and implement the different standards.

I will go quickly into some unique things about Walgreens. I would say the reason why I’m going into them is every organization will have its own local or parochial things that are going on that prevent or augment the way that they’re reacting to requirements. I wouldn’t say that ours are necessarily global, but one example, an anecdote, will give you an example of what other people are kind of dealing with and why it gets a little bit like herding cats to get us all over the finish line of the standard.

We are in a situation where we have a 21-year-old system — ours is like HIPAA; we introduced our dispensing system 21 years ago, and it is not flexible. Although we love it, we want to move on. The challenge that we are facing is that, ideally, in the best-case scenario, we would just not put any money into our legacy system and focus completely on the new one. But then, when we start getting mandates, we have to say, well, we actually have to co-develop this in our new platform and in our old platform because of specific requirements and timelines, and that ends up not allowing us to be most efficient and flexible because what we’re doing on our old system — we call it throw-away work. If we could just not meet the timeline, or have a longer transition period, we wouldn’t have to do this duplicative work and we could be even better with what we go out the gate with.

As we’re talking about Script, that’s the big thing on our radar right now, and we are having to do a minimal implementation in Script in our old system to meet requirements, but it’s also taking away resources so that we can’t do a more robust implementation in our new system out of the gate because our new system is not available right away.

Well, that was five minutes.

DOO: Thanks very much. Now we have Mike Barlow.

BARLOW: Thanks. Being this far down in the agenda, I can go very quickly and skip right to some points. Some quick background on Palmetto GBA. Palmetto GBA is a Medicare administrative contractor to CMS. We process over 300 million claims a year, institutional and provider. We also have one of the largest gateways because we are also an IT administrative contractor. Our gateway actually processes over 3 billion transactions a year of different types, not just these claims, so we are real familiar with the claims standards.

I would echo most of the others’ comments. We believe the standards promote efficiencies, but they need to be able to allow that window of innovation and the window of opportunity, because if you live just on the standards cycle you lose the opportunity and you build all this — I like the term — technical debt, and you cannot keep up with what’s available.

In the time since the X12 standard that’s currently in place came out, I as the payer now have the ability to actually pre-process the claim and tell a provider, before it gets into my claim system, that if this claim keeps going the way it is, it’s going to be denied. That is a powerful message. It’s not a lot of claims, but all the work that I have to do behind the scenes — taking phone calls, working appeals — is mostly associated with claims processing errors. While the standard focuses on the structural elements of the claim, you cannot put standards around my processing rules and my requirements for when a modifier or a diagnosis code is necessary. But I can put that message into that same informational stream, provide it to the provider, and give both education and the opportunity to fix the claim.

The idea is to make sure that the standards committee and the standards development takes into account that what you think you’re doing today, you have to make sure you leave that window of opportunity.

We did a pilot with Optum called — at the time we implemented it, it was called ACE, or Advanced Clinical Editing, and it put all the edits associated with all of the policies that we have in place, all the correct coding initiatives, all those things that everybody is supposed to know that not everybody gets right every time on every claim, and put them up front. We quickly saw the capabilities of this system and took it a step further.

In fact, Optum now calls it the Advanced Communication Engine, because not only can I send back messaging about the status of that particular claim and whether it is likely to deny, but it gives the provider the immediate opportunity to turn right around and resubmit it. Talk about payment cycles — submit a claim this morning, get a message, fix it this morning and send it in, and you still get your check in 14 days. The old way, it was about a 23-day cycle for that correction, and most of the time you had to call me because you didn’t understand the CARC or RARC code.

We changed that. Now we have gone further. We have put credentialing messaging back. When the provider’s credentials are about to expire and stop their payments, we’re sending them a message that says hey, guys, you might want to make sure your credentialing department is up to speed. We send them messages about their behavior. Providers always want to know we’re looking at them.

We can actually send messages back. We can do real-time data analytics, so all the tools that are available and the technology today we can put into play and send out to that one little window of opportunity in the X12 transaction called the STC free text field.
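The pattern of running payer edits up front and pushing advice back through the STC free-text field can be sketched as follows. The edit rules, codes and segment layout are invented examples, not Palmetto GBA's actual ACE logic:

```python
# Hypothetical sketch of the front-end edit pattern: run payer-specific
# rules before adjudication and return the advice in an STC free-text
# element. The rules, codes and segment layout are invented examples.

def run_edits(claim: dict) -> list:
    """Return human-readable warnings for edits the claim would fail."""
    messages = []
    if claim.get("procedure") == "99213" and "25" not in claim.get("modifiers", []):
        messages.append("Likely denial: modifier 25 expected with same-day E/M")
    if not claim.get("diagnosis"):
        messages.append("Likely denial: diagnosis code missing")
    return messages

def stc_free_text(message: str) -> str:
    # The advice rides in the free-text element of a 277-style STC segment.
    return f"STC*A7:562*20180517******{message}~"

claim = {"procedure": "99213", "modifiers": [], "diagnosis": "M54.5"}
for message in run_edits(claim):
    print(stc_free_text(message))
```

The point of the sketch is the loop: payer rules stay proprietary and changeable, while the message travels inside the existing standard transaction.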

It is imperative that the standard allows that type of innovation to continue when you’re looking to go down the road, because without the ability to do that messaging or without the ability to work within the standards that exist we wouldn’t have that capability.

My last slide just shows you the administrative impact in the first year of this implementation. Providers actually complain — if I have to take ACE down for a day, they complain why didn’t I get any messages today. So they are very used to this. It’s an informational flow. It’s in their workflow. It is not changing anything about their life.

I am glad to be here. I am certainly honored to be amongst this group, but I wanted to just make the point I am here primarily to represent the payer perspective: the standards need to make sure we leave room for the opportunity to develop these types of tools and put them in place, because you can’t standardize my payer rules and XYZ’s payer rules. All of our payer rules are different. If anybody has worked with Medicare, the Medicare rules are the most complex of all, and yet I have been able to encode that into a standard response back to the provider on a day-to-day basis.

That’s my five minutes, thank you.

DOO: Thank you. Pat Waller from Cambia, you have the floor.

WALLER: I am Pat Waller from Cambia Health Solutions. I am a senior staff IT consultant. They didn’t know what to call me so they called me that. Cambia Health Solutions is a company that’s creating simple, people-focused solutions, putting the patient in the middle so that it’s really tailored to them. We have been around for over 100 years. We founded one of the first, or the first, health plan, Harris County Medical, 100 years ago. We serve about 70 million people across the country.

A little bit about the current state and HIPAA. For us, HIPAA works reasonably well for many of the transactions — claims, 835s — and the more business-flow transactions. The standards are slow to change. They don’t support business change, they don’t support innovation. We are struggling as we move toward APIs, and there are significant constraints in how to mix not only the standards but some of the HIPAA rules around how you can automate workflows across FHIR, HL7 and X12.

I am constantly working with my business and technical folks on how to solve problems, so there needs to be some latitude, some ability, without going through a bunch of paperwork with CMS, to adjust and create standards or processes that are more beneficial to the industry. Prior authorization is one; it’s really broken and we are trying to fix it, and I keep running into: well, I need to do this from a HIPAA perspective, and then I can switch over and do it in the API, and I have to go back to X12 for this. It just complicates it and makes the delivery extremely challenging.

We are, like many of you, engaged in Da Vinci. We are also part of the CARIN Alliance. We are out there trying to help people have access to their data. We are also trying to make sure that there’s broad dataflow that is used across the industry, not only between provider and patient but also the complete continuum. HIPAA enables part of that and HIPAA also makes some challenges to that.

I don’t need to go into any more detail. We will get to some more, I’m sure.

DOO: There will be lots of opportunity for finer points to be made. Okay, thank you.

Ray Sczudlo.

SCZUDLO: Thank you very much for inviting b.well to participate in this panel. I think you will find, as I go through this quickly, that we are coming at this from a very different perspective from a lot of you, but we depend on all of you to let us see the data.

The b.well platform puts consumers at the heart of their healthcare, and we want it to be the one mobile device for a consumer to access all of their healthcare data and transform an overly complex and expensive process into a simple, on-demand experience.

The platform is designed to manage all aspects of a consumer’s health and provide a mobile, on-demand experience. We operate as a concierge on behalf of consumers to collect all of their data, such as clinical, insurance, family medical history, wearables and sensors and even genomics, to provide the consumer a simplified and meaningful view of their own health. We do three key things with all of that data.

Using their data, we provide a personalized care guide tailored to a consumer’s unique health needs and the health needs of their family. We need to move beyond the general population health approach of get more sleep, don’t stress out, eat less sugar, into a world where we deliver real-time messages to consumers based on the data that we have, such as you are hypertensive, you are not on medication; let’s get you to a doctor.

We then match them, using clinical data and self-assessments, to low-cost, high-quality services that they need when they need it, making healthcare more on-demand and convenient. And, most important, we connect them to their healthcare teams, which could include a loved one, physicians, coaches, really anyone they wish to share their data with, and we allow our customers to do this at whatever level they choose and with whomever they choose to build continuity and support for those who need it most.

You see that we interact with a number of portals. There are doctors, there are insurance portals, there are consumer applications, there’s employer wellness and health savings accounts. If you look at the center, the consumer: without the b.well application they would have to deal with each and every one of those separately, and it becomes too complicated, too time-consuming, and, therefore, they won’t do it. We have to address the way that we engage the customer. The more digital we have become, the more fragmented and difficult it is.

We have a log-in to insurance carrier portals to access basic financials and the provider network. Why is this important? Employers are taking on more and more risk of healthcare in the U.S., and a typical employer would have to piece together seven to 21 different programs designed to reduce cost and increase engagement, and through the b.well platform they can do that with a single sign-on. None of the apps talk to each other and we enable the customer to do that.

You can see how b.well is at the center of aggregating all of this data. There’s health insurance data, clinical data, partner data and there is other self-reported data that’s important. We connect to a number of data sources across the spectrum of healthcare through various methods to deliver a personalized healthcare experience. In short, we are a data aggregation and analytics company at our core, and we have designed a comprehensive approach leveraging the recent regulations and state-of-the-art technology to deliver a more effective healthcare solution.

The new health economy is evolving. Consumers are paying more for healthcare than ever before. Employers are increasingly becoming payers and need to manage risk, and payers and providers are trying to move to pay-for-value and need to connect to the population in real time, so we can enable that. We are designed to serve the healthcare industry as the tectonic shift occurs, but the industry still has lots to do in terms of regulatory support.

The key issues that we are faced with are data harmonization — we need to have clean standardized data across all platforms and good interoperability. We need to make clear everywhere that a patient’s right of access is not the same as a HIPAA authorization. A patient has a right to that, and we are very concerned about data-blocking and inability to access the various portals that are out there.

Our other concerns, which are not necessarily part of this committee’s purview but also impede the right of a customer, consumer or patient to manage their own health: they need to be able to correct their medical records. That is a right granted to them under HIPAA, but there’s no real meaningful mechanism to do it. They need to be able to get labs and obtain results, and they need to be able to easily access telemedicine services if we are going to cut costs.

I will stop there.

DOO: Thank you. Ms. Wilson, you have the floor.

WILSON: First of all, thank you for putting me at the end. This is very nice because I concur with all the comments that have been made, so mine is very short. I am Sherry Wilson, I’m the Executive Vice President and CCO of Jopari Solutions. We are a technology solution vendor and national clearinghouse. We process healthcare transactions and payment solutions across all lines of healthcare business.

We also represent over 75 percent of the property and casualty market either directly or through our channel partners. We are also processing over four million attachments a month, which includes structured and unstructured. I think, Janet, to your point, with the structured document — it’s probably less than 5 percent of that four million — even though we have a standard, the challenge that I think you alluded to is the ability to — you have disparate systems that the providers have and trying to get the proper data in order to populate these transactions is a challenge. And seeing a standard CDA is uncommon. But great opportunity.

The other thing is we are actively engaged in the national standards organizations. I think you all have pretty much said all of these things, but there are just two points I want to bring out. A lot of us belong to the SDOs; often we are all the same people. These are voluntary organizations with limited funding, resources and governance, resulting in an inability to be agile, as you all talked about, to really meet our business needs. And the real shame in this is that our inability to be agile has created missed opportunities for optimizing our business processes.

The other point I want to bring up here is the lack of an industry standards roadmap. Chris from ONC — I do want to acknowledge the ONC IT roadmap and the trusted framework, great. But it is, again, about coordination across industries, whether private or government, working together. When I am in a board room or sitting with my CFO — and I’m sure a lot of you have the same challenge — they’re asking me what the direction is, which new technology or standard is the better solution to achieve interoperability or an automated workflow process. This is very hard, because I should be able to say that clearly. Then the next question comes: what is the ROI? Where are the pilot programs? Who is engaged in these projects? It becomes very hard. So that’s a real challenge that we face today.

Yes, those regulations were supposed to come out but they’re not here yet, and we don’t want to invest several million dollars without, again, a real clear direction. So I am very excited at what you all are doing, because that’s going to help us, I think, within our own corporation.

I wanted to share with you — and, Mike, this kind of goes to your point — in supporting emerging technology and missed business opportunities, we are looking at APIs, as most of us are, and at data integration and normalization of data. How can we leverage existing IT technology, resources and connectivity to have one end-to-end workflow process across all lines of business? In property and casualty we’re using the exact same transactions, the HIPAA transactions, as we are using in the other areas of business.

Also, the ability to apply edits prior to adjudication. In property and casualty the states have adopted the acknowledgement transaction and attachment standards since 2008, and that is critical because it has allowed us to apply front-end edits up front. We know what the rules are. We have been able to increase first-time claim submission by 95 percent, and are able to increase payment, usually within 15 days, by 75 percent. But those acknowledgements are key to making that happen.

Also, the artificial intelligence — I think cognitive computing. We are into enterprise data mining, and data analytics is critical. And I think we all are struggling in our initiatives to enhance stakeholder self-service access to information, to get it at the right time to the right people. It is critical. And then FHIR and Blockchain.

Very quickly: lacking regulations, to be able to move forward and take business opportunities we are really engaged in industry collaboration. Liz, you were talking about all the different things, but it is really about coming together with partners around the same cause.

Kevin, I think you will like this one. An electronic attachment collaboration, again in the absence of regulations, is moving forward, doing both unsolicited and solicited attachments. And CMS is really leading the way in this initiative.

We have been working with NGS. They have been doing unsolicited attachments with the X12 275 for over a year, and as of May 21st they are now opening up Medicare Part A and B. We are also working on the solicited flow, with the X12 277 request for additional information and the 275 as the response.
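To make the attachment flow concrete, here is a hedged Python sketch of wrapping a CDA document for transport in a 275. It shows only a BDS-style binary-data segment (filter code, byte count, base64 payload); the segment name, element order, and the sample document are assumptions for illustration, and a real 275 also needs the ISA/GS/ST envelope and the loops defined in the implementation guide.

```python
import base64

def build_bds(cda_xml: bytes) -> str:
    """Wrap an attachment payload in a BDS-style segment (sketch)."""
    encoded = base64.b64encode(cda_xml).decode("ascii")
    # First element says the payload is base64-encoded; second is the
    # length of the encoded data; third is the data itself.
    return f"BDS*B64*{len(encoded)}*{encoded}~"

# Placeholder document, not a real CDA.
cda = b"<ClinicalDocument>...</ClinicalDocument>"
print(build_bds(cda))
```

The design point is that the structured or unstructured document travels opaquely inside the transaction, which is why disparate provider systems can still feed the same standard envelope.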

But why are we doing this? The Blue Cross plan we’re working with is doing the same thing; they go into production in June. That infrastructure is great, but you still need those partners. You still have to look at the ROI across all the stakeholders to get the providers engaged as well.

With that: prior authorization, streamlining that application, getting information up front, asking how much you really need; more data analytics; and then FHIR applications for interoperability. Our people knew nothing about FHIR, and within three days we were able to build a FHIR server. It was a huge thing, and I was trying to sell it internally; I really got their attention by seeing the ease of it, compared with the probably eight months an implementation of a new standard would take us.

And last is Blockchain and collaboration. Again, partnership. We’re looking at it internally, but then trying to find partners and collaborations that we can work together with to achieve those goals.

So, very grateful to be here today. I am really looking forward to the panel discussions. Thank you.

DOO: Thanks very much. And thank you to everyone. That wraps up our introductions. You are an amazing group and I think it’s going to be very exciting this afternoon.

Alix or Nick, do you want to make any comments?

COUSSOULE: It has been very helpful so far. We are slightly behind, but I think we would like to get our first panel discussion in prior to breaking for lunch. I think we might have better odds of getting lunch without so many people in line if we push the timing a little bit. So hopefully that will work for everybody.

GOSS: Can I just get a show of hands for who has to leave before 4:00, just for our time management. Thank you.

When we get into the panel discussion it’s much more of a free-for-all, so we want to make sure everybody is using their tent cards to help us with that. We will have a setup by each subcommittee member for each of the panel topics where we will just give you a quick backdrop and a couple priming questions. I suspect we won’t have any problem with discussion.

The goal here is for us to not talk that much. It is more for us to set the stage and let you guys give us great input that we can leverage moving forward.

Agenda Item:  Theme 1: Governance

LANDEN: First off, I want to thank you all. You have already heard a lot about all five of our themes. I heard at least four of you talk specifically about things that fall into this governance pocket. We will take governance first, governance with a small “g”. I’ll talk about that a little bit and then we’ll get into the standards adoption, the regulatory process, data harmonization, which was another common theme I have heard, and then other entities involved in the ecosystem.

The Designated Standards Maintenance Organization, more commonly referred to as the DSMO, is a creature of federal regulation, not the HIPAA legislation, but the purpose there was to bring the primary cast of characters together and maintain the EDI implementation materials that the final HHS CMS rules specified. It also handles requests for new code sets that will ultimately be named as HIPAA code sets, and included in that process is a step for us as NCVHS to hear the recommendations from the DSMO on an annual basis.

Remember, we’re talking almost 20 years ago. It wasn’t the legislation, which is 22 years old, but the regulations, which are slightly younger. The DSMO was mandated by regulation to review the updates and adopt standards primarily coming out of X12, HL7 and NCPDP, and it kind of set the ground rules for how industry and end users would input into this process. At the time, the DSMO was envisioned as the primary access point for entering change requests or update requests. That turned out not to be true. In hindsight, what we see is that most of the change requests go directly to the SDOs and only later are looked at by the DSMO organization.

I also want to point out that the operating rules do not go through the DSMO process. Operating rules came along much later, different authorizing regulations and legislation, so operating rules are not part of the DSMO process.

Here you see who the DSMO is, and MOU, of course, is Memorandum of Understanding. We have the three SDOs coming down the left side of the graph, and the data content committees, which I don’t think we have referenced before: the Dental Content Committee, the National Uniform Billing Committee, and the National Uniform Claim Committee. All three of these committees had their origins back in the paper world. They were longstanding institutions or organizations in their own right and had a primary and fundamental role in industry in establishing claim standards in the paper world at the time the DSMO was created, and they were obviously morphing into the electronic world.

Finally, at the six o’clock position, the Department of Health and Human Services, usually meaning CMS, but because the regulations come from the Department we described it as the parent organization.

When change requests come in to the DSMO, most times they do not affect every entity in here, so each organization here can opt in or opt out to the DSMO discussion of that particular change request. That is particularly important for code sets that might be relevant to the NUCC or NUBC committees because one is inpatient and the other is primarily outpatient.

This slide shows a more extensive process flow where the input moves from left to right. We start with industry demand submission to the DSMO, then the DSMO review process with each SDO opting in or opting out, then changes being made to the standards by the SDOs; so control of the standards modification rests entirely within the SDO. But then there is a feedback loop: it flows back through the DSMO process to verify that the change hit the intended target. Then the DSMO makes its decision and, assuming it is favorable, forwards its recommendation to NCVHS.

NCVHS then holds the industry hearings, public comment hearings, and deliberates and then makes its recommendations to Department of Health and Human Services, to the Secretary.

To get us started, we, on the NCVHS Standards Subcommittee, have crafted this problem statement. It’s a starter set; it is not meant to preclude anything. I have heard a lot of thoughts that fit into this problem statement. I have heard some other very, very compelling possibilities that don’t quite fit, so I want to encourage you. Anything relative to governance or how the whole system works is germane for conversation, so please do not be shy. It is up to us on the Standards Subcommittee and the full NCVHS to analyze that. And, as Alix and Nick said earlier, what we intend to do is we will do the analysis, we will put out our thinking and then ask for another round. Did we get it right, did we interpret what we heard correctly? So that will be back to you.

What we are starting with is the current coordinating body — meaning at this point the DSMO — is charged with the oversight of standards revision priorities. Then we go on to say, but we suspect it may be operating with too narrow a charter or lacking authority and resources to be effective. By that we mean some of the things we have heard — that there are gaps, there are things about the existing standards that some of you view as deficiencies that need to be corrected. So the basic question gets to how is it working. Is the DSMO sufficient? Do we need to think about changing it or not changing it? You know, it’s wide open, replace it with something else. Do we need to add parties to it? As I mentioned, operating rules are not there.

So, what do we need in order to have effective governance for this whole process of standards development and eventual rule promulgation by HHS? That is what I mean by governance with a small “g”.

Let me throw it open and start with the first question. How does the process work today from your vantage point? Again, if you want to comment, please raise your tent.

HEFLIN: From my perspective as a software architect, it’s working as designed, and it looks to me like it has got a 10-year iteration cycle.

(Laughter)

Back to Kevin’s comments, back 20 years ago, that was a best practice. It ain’t today. Best practice today is agile, quick steps, focus on high-prioritized, clearly defined — basically a force-ranked list of prioritized, hopefully business and clinical use cases driving that month’s or that quarter’s iterative cycle, not that century’s iterative cycle. So, again, I think it’s designed to have literally a 10-year cycle.

It looks overly bureaucratic. It looks like it’s completely waterfall. You don’t know the output until literally years after the inputs were put into the system. It’s a model that is not going to succeed in today’s world. That is my first comment.

Second, it’s definitely too narrow. I have been in healthcare IT for over 30 years. I have been involved with deploying systems as a volunteer — I am not trying to make money off this; I’m trying to help the common good — for 12 years at the national level. I have never had a single conversation or outreach from them towards any initiative I have ever been involved with. I am on the Board of Directors; never heard of them. They have never engaged with me. I didn’t know about their existence until now. Now I know they exist.

I would actually like to bridge that gap because we have an entirely different world here of clinical data exchange, which is working successfully for high priority use cases in production today. It’s an API. And structured data that can be used for the purposes being described today that is not being tapped into. There are no portals in our solution; it’s an API. You can directly write an application program to talk to it and pull structured data out, and almost every EMR vendor worldwide today supports these standards already.
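As a small illustration of what pulling structured data through a FHIR-style API looks like to an application programmer, the sketch below parses a searchset Bundle of Patient resources. The bundle JSON is a hand-made sample, not output from any particular server or the network being described.

```python
import json

# Hand-made sample of a FHIR searchset Bundle (illustrative only).
sample_bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Patient", "id": "p1",
                  "name": [{"family": "Rivera", "given": ["Ana"]}]}}
  ]
}
""")

def patient_names(bundle):
    """Yield 'Family, Given' strings for each Patient resource in a Bundle."""
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") == "Patient":
            name = (res.get("name") or [{}])[0]
            yield f"{name.get('family', '')}, {' '.join(name.get('given', []))}"

print(list(patient_names(sample_bundle)))  # ['Rivera, Ana']
```

In a live exchange the same parsing logic would run over a Bundle returned by an HTTP search request against the server's base URL; no portal is involved.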

So the standards body, IAG, of which I am a board member, actually curates this API that’s being used really planet-wide, nationally, and that I think should be within the scope, as well as TEFCA, the recognized coordinating entity.

GOSS: Are you saying that you think that DSMO should have scope involving TEFCA?

HEFLIN: Including the TEFCA RCE, because the RCE, if that concept ultimately becomes reality, will be coordinating kind of the clinical data side of exchange. Let’s get that talking to the administrative side. We have got an opportunity we’re wasting.

Department of Justice has their own standard — same dataset, different purpose, different standard, not interoperable. Again, another example. Emergency departments have a standard called NEMSIS — same content, different standard. Let’s bring all these together. Thank you.

JOHNSON: I have a question. Who is on it and how did it get there? I think all your comments were right on. I saw the slide. Is it representatives from all those people?

LANDEN: Health Level 7 delegates a representative. NCPDP, X12. The Dental Content Committee, again, is its own organization comprised of a balanced mix of dental provider representatives and dental payer representatives. The Dental Content Committee sends one rep. I believe it is its chair.

National Uniform Billing Committee, again, that’s the hospital. It’s basically a balanced committee of payers and providers. They appoint a representative to the DSMO. And then ditto with the National Uniform Claim Committee, which is the outpatient billing side — again, payers and providers. That is who constitutes the SDO representatives and the Content Committee representatives on the DSMO.

JOHNSON: Obviously, what I’m looking for is: 12 members, chaired by whom? The reason I say that is that I think the comments that were just made are right on, but frankly, the structure itself makes a difference in our ability to be nimble and agile and respond, and that’s what I’m looking for.

LANDEN: Yes. And again, I stated that this was done almost 20 years ago, and at that time the world was different. HIPAA was brand new; we were still going through the first iterations. Communications had not been clearly established. There was uncertainty running rampant as to which — These are all longstanding organizations; they all have their own charters, they all had their own missions and they were not necessarily aligned.

So this was a way to put kind of a control valve on the process for some alignment as we felt our way forward and began to develop an understanding of how these flows would work under HIPAA and to begin to build trust among these entities and a comfort level so that they knew who was supposed to be doing what on whose behalf.

JOHNSON: Obviously, kind of self-evident — but I will say it out loud for the record — is that probably you need to look at the constituencies of the group and look at the current state. I think you have already said it. It was built 20 years ago. Is it still the right constituencies, and is it formulated and re-populated in a regular way so that it represents current state? Then we could get more agile.

LANDEN: That is exactly the question we’re asking you.

(Laughter)

JOHNSON: So you want to know who should be on it.

LANDEN: No, not exactly in that way. What we are thinking is the role is a 20-year old role. We have heard some good comments. What about this is right? Is this something that needs to be evolved? If so, what are the specifics? What is missing or who is missing?

KLOSS: What is the right governance model.

LANDEN: Or, alternatively, should it be replaced with something completely different? Again, we are not committing to going one way or the other. We just want to hear what the thinking is out there and the reasons why.

GOSS: I have Janet, then Sherry and then Kevin. Kevin, Janet and Sherry.

LARSEN: A couple of thoughts. One is that the devil is always in the details with standards and implementation. What I have found — and I’ve been on policy standards committees for a long time now, since the early 2000s — is that I am amazed at how often it’s still really qualitative and opinion-based. We are in a data-driven, quantitative world, and typically, at least in what I have seen, it’s a bunch of experts sitting around a table saying, well, I think this. And, well, I think that. And at my place we like this, so let’s vote for this.

It seems like an industry or a governance process that could really lend itself to some more data-informed advanced analytics tools that really help us make our decisions and drive our way forward with the data that we all collectively have, using kind of data tooling, testing and modeling in ways that, at least when I have been involved, I haven’t seen a lot of. Maybe it does exist. But a lot of the standards committees I have been on are very expert opinion-oriented.

We often have places we couldn’t see. The world is getting so complex and the interconnections are so challenging that without that kind of modeling and testing visibility it’s no surprise to me that we have troubles and holes and defects because people are not using the tools that we’re actually governing.

So, that is one call to action for us. How do we do that? What is the tooling, modeling, statistical approach that we put into this whole process?

As I look at this as someone who had to come into government and then start to coordinate across here, what I see missing are the content standards groups: the SNOMED CT people, the AMA with CPT, ICD-9, ICD-10. I heard a lot of people want to start having data element interoperability, and if we are going to drive to that level of data element we’re missing all of the data element content people from this.

We found, in the quality measurement space, not a lot of clarity about which types of data elements belong to which type of organization, and how to go about that. I mean, the exact same kind of parallel process but integrated, because you need the data element standard, then you need the metadata standard, then you need to put it in some kind of content, and you need some semantics. Figuring out how to put all those pieces together, iterate them together, and make sure that all of the approvals are sequenced so that we can actually get a complete specification that has both the data element detail and the transport standard detail is not straightforward in the current framework of SDOs.

CAMPBELL: I have a comment and also a question. I was thinking when Eric was talking about some of the work that the Sequoia project and others have done, I think what typifies that is that, for the most part, it’s extra-regulatory. It relies on regulated standards to a certain extent but it’s a system above and beyond the standards themselves. It is the system that runs things. If you look at something like Argonaut, that is entirely extra-regulatory; nothing that’s being done in that space is regulated at all.

And it makes me wonder if it’s not a matter of this being an incorrect body but simply of it being insufficient, and of there not being other structures to support whatever standards are regulated. If this body moved faster, perhaps the support provided by these outside networks could spring up as well. But I think that keeping governance outside of the regulatory process (I would prefer to keep everything outside of the regulatory process) has been shown to work in other areas.

My question was — I also wasn’t aware of this until today. Could you give an example of something that has gone through that step process, like a request that has made it through?

GOSS: Are you talking about a request like I want to update the data element XYZ and how that comes out of a guide, or something more global that says we have done a lot of work; we now have Kumbaya as an industry around wanting to upgrade, let’s just say, 4010 to 5010, and we want to advance that as a change request?

CAMPBELL: Would that be an example of either one?

GOSS: As Elaine is whispering to me appropriately, there is a really great current example that I could be using. We held a hearing on March 26th for the National Council for Prescription Drug Programs to upgrade the NCPDP adopted, recognized standards or implementation guides, and that went through the full process. They got requests, they managed those requests, they produced new IGs, they brought it forth to the DSMO, they agreed, they brought it to us, we held a hearing — and I am going to hold off on the rest of the process until we get to Denise’s part.

COUSSOULE: Can I ask just one follow-up question, and this topic came up several times today. If, in fact, we were to push for an extra-regulatory model, one of the challenges that came up in a couple of the conversations of a couple speakers this morning was what kind of enforcement mechanisms are in place. If you don’t have a regulatory defined model, then by definition you don’t have kind of a big brother on the enforcement side. What is the good and bad?

CAMPBELL: And the question is, I guess, do you need it.

COUSSOULE: That’s part of my question. What is the good and bad of that, those differentials? If everything is totally voluntary, then you’re going to get take-up and not take-up, so where are the risks in that kind of model?

GOSS: I think where we are at is I think Sherry is up, and I think we have Brad and then Many and then Chris and then Dave.

WILSON: I like the circle. What I do know from our experience is that the operating rules and the SDOs do not always align, as we have heard today, with our stakeholder business needs. Even though I am a participant in these different organizations, I like the thought of the right people with content, but also this group here, the functional implementation. Often we’re looking at and focused on standards, which we need for interoperability, but then we have to go back and implement and integrate these things. And we have payers, providers, IT vendors, and leveraging their existing resources, their investment in IT, is a huge economic challenge for us.

But I look at having the ability to take a look back and say, if we had more people that fed into some of those arrows with the functional, operational and implementation expertise along with the content, it could really help us bridge that gap between the standards and our business needs.

GNAGY: The need for outside or more involvement here from more or different or diverse stakeholders would be great.

BISHOP: That was a perfect segue because I have kind of a comment, which is that we already have a de facto extra-regulatory environment for standards adoption. The lack thereof has caused us as an industry to create all of this surround code and external work-arounds that we now use which further complicate the process. So we have created our own set of standards in absence of this process working the way that it should, to the original point, although it does, to Eric’s point, work as designed.

Piggybacking off of Sherry and Brad, the inclusion of vendors would assist in developing the data. They can bring data to the table about the impact this lack of standards has on what's working and what's not working. Sometimes, from an operational standpoint, the health systems and the payers don't have the mechanisms to track across the industry to be able to benchmark and bring that data and analytics to the table, to help you understand collectively where we could be going and where what's already implemented is failing.

Although adding another cook to the kitchen may not be effective without formalizing a governance model, having them meet more often, and having specific agendas and specific prioritization of what they are trying to achieve.

MUIR: Just to respond to the question about what are some of the tools you can use in regard to governance, maybe without relying on regulatory action, I have a couple of comments.

One thing that ONC does is they have cooperative agreements with standard development organizations where we actually pay to have certain things done through active cooperation with the government. We have a cooperative agreement with HL7, and in the past we have had a cooperative agreement with NCPDP, as examples.

The other thing is the Argonaut project, some people may or may not remember, that was started in response to ONC threatening regulation —

(Laughter)

And so there are levers to get things done. It doesn’t always have to be through direct —

LANDEN: Thank you for that. I don’t want to lose sight of the fact that the legislative authority is HIPAA, and legislatively the authorization requires HHS to do final rule promulgation. So, if we’re thinking in terms of an extra-regulatory approach — and I think we do need to consider that, absolutely — that’s a lot more work and a lot longer timeframe than some of the other things that we will be talking about.

NICHOLSON: In response to your issue on regulations enforcement, we are kind of at the downstream process. We create work-arounds because people don’t adhere to the needs that we have. If you go to an extra-regulatory process and you have no way of making people adhere to that, we are going to have one million work-arounds. We are not going to have 100 or 1,000. So there needs to be some mechanism for enforcement. I’m not saying it’s working well today, but there needs to be some mechanism for enforcement.

BELL: A couple of points. Number one, I think part of it is that what's driving the standard is not clearly defined or is not widely acknowledged. If you can take some of these innovations and use them as the test cases in pilot programs to say this is what works well, and we already tried this, so if you want to hit your head against the wall again, be my guest, but I think that's a big part of it. A lot of this process is changing something that was broken from the get-go, and trying to — That is not innovation. That is maintenance.

The second thing is I think we have to be concerned about the financial capability of some of these organizations that we have traditionally relied on. As the industry has evolved, we found that if you go to these standards meetings you will see that it’s the same people that are volunteering or have volunteered. We have all got the “I’ve been doing this for 20 years” or “25 years ago I used to” and it’s the “I used to’s” that we have to worry about because I don’t see a lot of people taking the gap as people retire or as they move in their jobs or as companies start to say what’s my return on investment on your participation in this given organization.

Some of these organizations used to make their money by selling books. The book stores are not in existence anymore, so how do we make sure that we have fiscally responsible capabilities in organizations that can spearhead this process?

GINGRICH: This goes back to Nick’s original question of the good or bad with having a regulation as the hammer, and also to Janet’s question about the need for it.

For me, it comes in handy at some point, and it’s more potentially setting the floor of a standard versus setting the ceiling and the floor of the standard, which sort of puts boundaries around it. I think from a business use case and business value standpoint, we are pretty good at going after — that’s what we do with Argonaut. That’s what we do with Da Vinci. It’s focusing on use cases, refining a standard to meet that need and then executing that and getting use out of it to actually evolve a standard.

So I think it comes in handy as potentially the lower bar club if, say, 80 to 90 percent of the industry has adopted; it's getting the rest to that level where it comes in handy. Again, you have got to watch out for the support aspects because they will eat you alive.

I think there is good but it’s how it is approached.

GOSS: I’ve heard voluntary adoption. I’ve heard we like federal hammers, and I’m a little bit confused about the commentary on that one. We can build consensus around what’s the right approach because, quite frankly, you guys are a lot of leaders. You understand the game, you understand how to play, participate and evolve.

As we get into Denise’s section later when we talk about the regulatory process, I really wanted to cue up for you the thinking about how your organization would feel in a regulated versus non-regulated world and how we bring along everybody with no outliers.

LANDEN: Let me just add that besides the regulatory versus non-regulatory I have also heard very conflicting things about we need base level, minimum standards, and I have heard people talk about, well, if we don’t have all the complex detail then there’s a lot of manual processing. So I see those two as colliding, too.

GOSS: I have just been reminded by Nick that we have about five minutes and then we are going to break.

COUSSOULE: Obviously, we are out of time on certain topics but we have four different themes this afternoon and I think there is a decent amount of overlap among them, so by no means do we want to ignore anything, but we can try to get through some things pretty quickly and then we can break for lunch.

SCHULER: Again, I am a provider, remember that. Without trying to get into my own personal political values, what I'm seeing right now, because there is no hammer and there is a conflict of interest, is that I am not going to tattle-tale on our health plans for not sending electronic transactions or completing the content, and vice versa, because we need each other. Health plans need providers and providers need health plans.

Naturally, there is a conflict of interest if we try to sing Kumbaya and work together there. Some people say, well, just take care of it through your contracts with your health plans. Well, health plans are typically national, many of them, and across states, so we can’t set our own regulations through our contracts. It just doesn’t work.

I am also starting to see states jumping in. Our state organizations have heard providers complain to them over and over about the lack of standards, so now the states are jumping in and starting to set regulations on response times on authorizations, and different standards. And that’s getting even more complicated. So, not that I am necessarily in favor of big brother all the time, but if we don’t have some monitoring and regulations people are just not going to adopt because there’s not always an ROI.

My final comment is meaningful use, value-based healthcare is typically based — Unfortunately, it’s all about the almighty dollar and the return on investment. The industry has been incentivized to adopt meaningful use, right? Or there’s a stick behind it. HIPAA violations are out there, so we’re going to the nth degree to ensure PHI is protected.

I do think there have to be some regulations out there with a stick behind them to get everyone following the standards, because if people can pick and choose, they are going to pick and choose. Again, that is why people have picked to invest in portals rather than choosing to invest in the standards. Unfortunately, I do think we need a regulatory body to oversee adoption of the standards, giving enough timeframe to adopt them but push them through, and by a certain date you have to be compliant. That’s the only way we are going to be able to advance the industry in this, I think.

ARENDS: I want to echo what I have heard from Mark and Margaret. I do feel like there’s a role for regulation. I don’t think it can be completely outside of the government.

To the ROI question, to the competing priorities, even if there is an ROI, sometimes just competing priorities means that you don’t get something done that does have a good ROI. And so having that backing of, well, this is a compliance matter lets us cut through a lot of red tape internally that we want to do things but then there are other areas of the organization that necessarily have different viewpoints.

I like what Mark was saying about setting a floor but not necessarily the ceiling. I feel like there can be a happy medium between keeping everyone moving forward and not limiting what's out there and what has already been approved, letting people start acting on it if they want to get out ahead and really take advantage of that early mover benefit, so that they may be able to get with trading partners and both agree to adopt something new that's above the minimum.

CAMPBELL: I just want to clarify, for what it’s worth, I actually am on the same page as Mark. I do think there is a role for regulation, but I think it’s there to lift up the non-compliers and to set that date like Margaret said as well where everybody does have to catch up. That’s kind of where I see it working most effectively.

JOHNSON: I want to pile on a little bit — same thing. Be careful, though. I do think we just need a minimum. The problem with innovation — I don't want to stop it. I just want to be really clear that when you say that as providers, I can tell you that we are subject to the innovation as well. I am just being honest with you. We get a federal reg, we have to meet the federal reg. We are good with that; we need the stick.

But the minute that our payers, our vendors, our partners start doing innovation, we are also obligated. That is the same problem: where a regulation says "and," the vendor has to do both, and if a regulation says "or," the vendor still has to do both. We have to pay attention to that or we get in trouble.

GNAGY: Just to throw my lot in with the floor, and not the ceiling, in terms of the innovation and regulation like you were just discussing. Agreed, but it all kind of goes back to whether the standard allows for a level of extensibility and fluidity; the standard itself can have an effect on what we're talking about here. Because in terms of regulations, if innovation outstrips them — and let's all be honest here. Who can accurately predict the speed of innovation in five years, much less in one year? How are you going to predict that? You can't.

So, building something into the standard itself will help with all of these arguments, just with the ability of that standard to flow like you’re talking about, to flow with innovation and be able to be regulated at the same time. If you can’t allow for that flow you’re regulating people into a corner.

HEFLIN: I fully agree. Regulations I think have a role, but they are a blunt instrument. They are slow, onerous and burdensome to get done, and often they are written by non-technologists, so even with the best intent they get it wrong. We then have to iterate and repeat the entire process. So I think they should only be high-level, set policy and direction, and then defer to industry bodies, SDOs and the community of interested parties in order to actually move forward on that.

I think there are actually a couple of good examples out there as well. Sequoia worked with Intel in one state, and as an example, regulation was not required to move the entire state. In about three months Intel — I don’t want to speak for them or misrepresent, but my perspective is that they are a large employer with an in-house health system for treating employees, and they basically in one state told all providers in the state you are not getting paid unless you use — essentially in this case it was the health exchange but it could be other things as well — to actually exchange clinical data with us. In three months, by December 31st of that year, they moved the entire state into one unified approach.

So my suggestion would be regulations are a blunt instrument. They set direction, and then some financial incentives could actually act with the industry’s standards bodies to actually move things very quickly including support for innovation.

LARSEN: A quick question on the role of governance and other governmental entities. We talked a lot about how we regulate industry, but in our role in IT, a lot of our job is to pay attention to all the government requirements and the government implementation guides. And I will tell you we don’t do a good job of being consistent across the federal government.

What CMS does in a program is not aligned to what HRSA does in a program, is not aligned to what VA does in a program, is not aligned to what DOD does in a program. There is an incredible opportunity for us to send a market signal that we have adopted a consistent standard across a large set of federal and potentially other governmental entities to say we have adopted this; we are clear; it’s getting executed in our role in interaction without a regulation, just a governance of federal actors.

BISHOP: I just have a quick question with respect to what’s working and what could be working, and this is very tactical. Are those on the committee who have the authority to enact legislation, to set these regulations — is there a sufficient level of staff? Is there a sufficient number of people right now who are able to execute this process?

LANDEN: Do you have all the resources you need?

(Laughter)

BISHOP: Good response.

COUSSOULE: Thank you all. We will take a lunch break and try to start back up right on time at 1:15. I know it’s a relatively short break but please try to be back in 45 minutes.

(Lunch Recess)

 

AFTERNOON SESSION

GOSS: I want to welcome everybody back. We are live and broadcasting on the web. This has been a phenomenal morning. The feedback so far has just been – I do not know that I could ask for this to be going any better and that is really a result of the quality input and remarks that you guys prepared this morning and our productive discussion around governance. The next one we are going to tackle is standards adoption. We are going to turn it over to Deb Strickland, subcommittee member, to talk about this section.

Agenda Item: Theme 2: Standards Adoption Process

STRICKLAND: In this section, we are just going to review what the current process is for getting updates into the X12 EDI standards. What ends up happening is that the industry will identify changes that they need to the EDI standard, and those will go to the SDO. And then the standards organization will process those and send them over to the DSMO for a review and recommendation.

Now, this next box, the cost benefit, was a great idea, but it did not end up happening. Things did travel through the DSMO; they did have opt-ins and opt-outs. But the actual cost-benefit analysis did not happen there, and it has not been happening there. What was envisioned would have been a great idea, but it did not come to pass.

Once that is adopted by the standards organization, it would then go to NCVHS and then the standard would go to HHS as a recommendation to put forward that standard.

Now, it is important to note that as you can see at the bottom there, operating rules do not follow this process. Rich had mentioned this as well. They have their own authorizing authority. They go directly to NCVHS and they do not have to go through the DSMO.

What we come up with here as our problem statement is that, obviously, the frequency of updates to the standards and operating rules is not aligned with the industry's business and technical changes. It does not enable all of the covered entities, trading partners or business associates to take advantage of innovations and solutions and be able to move forward with things that would advance their business.

Some of the questions that we have around this are: How do the current cycles for updating the standards support the industry's operational and technical transformation? How are these standards working for you, basically? Do they meet your business need? Do they meet what you need for technology and innovation? We will start there with that question.

Anyone have any comments on that?

HEFLIN: We actually have had a deep involvement for many years. I have actually, personally, been on the board of IHE International. I am on the current board of IHE USA, a member of HL7, and actually an author of many standards as well that are used worldwide today.

Part of the reason I do that is because it is a voluntary activity. It is actually painful for me to do this. I am going through the process of debating one paragraph of text over the course of ten weeks' worth of meetings. That makes my brain hurt. It is actually a sacrifice. But it is one that I and many others make because we want to move things forward. Most of us in these SDOs – we volunteer for it. We actually have to fight in many cases with our employers – not my current one, but at prior employers I had to fight to justify my participation in that painful process. Who would want to go through that?

I think part of the answer, back to the earlier comment, is there really ought to be some type of funding available from somewhere, maybe public, maybe private, I do not want to speak to that necessarily, to actually bring in SMEs and those that are actually interested in doing this. They are willing to sit through this painful process for the greater good, not for hidden agendas, to actually get these standards deployed.

One concrete example is that Sequoia has long recognized the need for a national directory standard. I actually wrote one called HPD. It is an IHE standard and it completely failed. It is based on SOAP. It was XML. It was basically kind of an old school approach. It achieved almost no adoption. I immediately pivoted and started working with organizations like HL7 and FHIR and the Argonauts, and I was kind of the co-author and co-lead of the Argonaut FHIR-based project, because I did not really care which standard it is; I want a standard for us. To my knowledge, Sequoia actually deployed the very first version of the Argonaut FHIR health care directory – I think it was Valentine's Day last year.

We have actually recognized the need. The standard did not exist. We jumped in at great personal and corporate cost for a little non-profit like we are, and actually pushed that standard through the point where now it meets our needs. It is designed to meet the needs of direct trust for secure email, for HL7 V2, for FHIR as well as IHE interfaces, all in one single unified approach. The bottom line of the standards – it is very painful for us to actually get them pushed out there.

And then the other issue is what happens once they are done, down the road, because the standards process often, almost always, ends at the point where the standard is ready for flexible deployments in a variety of situations. The challenge is that flexibility, like Wes Rishel used to say, is the enemy of interoperability. Every time you have a flexible data element, a flexible interpretation, a defect in the standard, what is going to happen? You are going to have two different organizations, not due to malice, but just due to natural interpretations, do the same thing two different ways.

The way Sequoia has addressed that, probably imperfectly, but it does work, is we have a validation program, which is much more precise. We have a validation tool, which we paid a lot of money to have developed and brought to the market at our own expense for the public good, and we have implementation guides that factor out as much flexibility and optionality as we can from the base standards, with the objective of making them truly interoperable while still meeting the use cases, which drive all this.

STRICKLAND: Any other comments on that? As we were going around in the initial discussions, I heard a lot of APIs, get rid of claims, maybe claims can be done in different ways. Obviously, it sounds like the EDI standards are not working and maybe we are pushing ahead to do some different innovation things.

WALLER: We look at APIs and how to do the HIPAA interactions via APIs. Almost every day I get asked how we can accomplish more real time interactions for various things. While you can make it work with X12 by using it as the payload, that is a challenge. I do have some anxiety that X12 will leave us in the old world and we will never get out of it. We need to stare at it and ask: is X12 the future, or is it something we need to do until we get to the future and we create a bridge to the future? It will be very expensive to get everyone off of X12. I get that. But I do not see us moving data as quickly or as much, especially on the administrative side, until we do something different.

LARSEN: Here is some naiveté I have about some of these standards. Do we routinely measure them as an industry? Do we know where they are adopted and how well they are used? I am used to looking at Surescripts for their adoption of the SCRIPT standards and having really good visibility into how much adoption there is, where the adoption is, and where the opportunities are. For a lot of the other standards across the country, I do not know where they are adopted and, beyond that, where they are effectively being used versus where there are a lot of errors or a lot of trouble with the standard that is in place.

CAMPBELL: That is an interesting point, and I think one of the reasons you know it about Surescripts is because Surescripts is across the nation and can just give you those numbers, whereas these standards are more distributed and it is more difficult.

One point that I wanted to make, and this is maybe just a small technological thing, but I think it could potentially derail discussions: when we say APIs, do we know what we mean by that? I think people here are now saying APIs to mean real time web-based communication between two different systems. If that is what we are going to mean by it, that is fine. You could even consider X12 to be an API in some sense, and it is not web based, but you could do others.

The point that I was going to make – answering the question up there about standards activity. One of the things that I think has been beneficial about Argonauts is that it is in some sense almost ahead of the standards process. There is not a lot of balloting that goes on. There are phone calls and everybody kind of decides and then a decision is made, and we move on. I think that is one of the important things that small pilots can do is not deliberate forever, but just choose something and see if it works and then go forward. I think that is really valuable.

The standards process comes along afterwards after we have proven that it works, codifies it and then regulation comes along after that. There is a lot more runway up to it.

BELL: I may be jumping the gun a little bit, but we talked about the elephant in the room. Right now, a big part of it is the 7030: when is it going to happen, is it going to happen. It has been a long time since anything has happened. Part of it is just throwing out potential solutions, like doing phased approaches to the transaction sets. I am not saying it all has to move at once.

For example, right now in 6020, although they are not regulated, you have the 278, the 277 RFI and the 275, which are all needed for the attachment standards. There is no reason why those could not be adopted and mandated, and then we take a look at where we sit and what next set of transactions makes sense, so the industry does not have to just uproot everything.

The second point is that the standards organizations are, once again, volunteer organizations, and as people move on, where is the continuity built into the system to make sure that things are seen through and that the same people are looking at the same things?

As far as APIs go, APIs do not need to be real time. It is just a matter of communicating machine to machine.

STRICKLAND: Understood. We should define what we mean. Some people are using it to be real time.

BELL: I would say that APIs are machine to machine. Then we start talking about the more technical JSON versus XML versus X12. I am not sure the standards need to regulate the language as much as they need to regulate the – like I said, tell me the gauge of the rails. Let me build the train.

GNAGY: Stepping back just a touch, going back to theme one and how it integrates in here with theme two – at the end here, you are talking about process timing, incremental updates – sort of what Joe was talking about. We had an earlier discussion of adoption of potential agile processes, things of that nature.

This whole group is here because there are issues with the technology that we are trying to produce – the standards – whether it is the governance or any of our five different topics here. And there are some potential other standards out there, some other processes out there, and people are using them. Is the goal to continue to push this so X12 rises above them all, and do we all want to start adopting it?

If that is the tack in all of these, part of what we have to do is not just address the timing of these releases or the speed at which we can release them. I keep going back to my one big point that I do want to take out of here and make sure everybody knows: the standard itself can help us, by functionally modifying the standards so that you have real extensibility, extensibility with direction that allows people to use it and to have backwards compatibility. Then you can start implementing things like agile or small iterative releases of the formats, so that developers do not have to sit back and wait for this huge chunk of change to come flying out: here is your 7030, and now here are all the changes that you have to address.

Whereas if we are getting smaller chunks every six months or something along those lines, we address those, and we also have a built-in function that allows for backwards compatibility from chunk to chunk. Mike may be able to blow through a chunk of development on his side, but I can tell you, my team is much smaller; the time it would take me to get through that chunk is going to be significantly longer. I may still be fixing the previous version by the time you are ready to move on again. Having that element in which the standard itself allows for this fluidity of use and backwards compatibility I think is just key.

COUSSOULE: If I could just respond to one of the questions he asked earlier: what is our objective in this one? To be honest with you, we really do not have an outcome we are trying to get you all to point to. We each have in our minds some things that might be better or worse. But part of what we are trying to get to is options and impacts, because it may be that there are multiple recommendations that would come out.

Some that can be effective right now and some that might be much bigger and hairier and would not be doable right now. Part of what we are trying to get at is all of that kind of input. By the way, this could actually help tomorrow, versus this is a very different way of thinking about things, which we may want to get to anyway, but it is not going to necessarily help you tomorrow. I think we do want to look at both of those, or at least that continuum.

GNAGY: Yes, and that was more posed to you and not the group – do we want to – thank you.

MUIR: Thank you. Just a couple of different scattered thoughts, so I apologize. The first one: ONC has had some experience with standards adoption. It was really tied to a couple of different things. One was aligning our program with CMS. Of course, we were somewhat forced to with meaningful use. But going forward, we are still having conversations with CMS on linking our certification program and some of the standards with their programs. That is one way to get standards adopted: tie them to programs that actually have financial incentives involved.

The second thing – I do not know if there are certifications out there that certify the products that your centers are involved with, but that is something to perhaps consider.

And then the third thing is something that Kevin Larsen mentioned a couple of times. He is not here, unfortunately, and it is probably not a coincidence that he used to work at ONC. When you are doing regulatory work, there is a real challenge in understanding where the adoption levels are and things like that. There is a real lack of situational awareness of the adoption levels of standards. That is something that we would be very interested in working with you on, for example, because it is a problem across the board on standards in health and health care. We are constantly looking for ways in which we can enhance our insights into the adoption levels of the standards. Those are my scattered comments.

SCHULER: A couple of thoughts, from a provider perspective from Ohio, as I shared with you earlier. To answer the question, do you participate: I did join the X12 committee because I am so passionate about this space. I think I was one of two providers there. I was the only end user. Everyone else was very technical: clearinghouses, a lot of vendors and a lot of health plans. I am not playing victim, but it was hard for me to justify to Ohio Health spending that kind of money to develop a standard that was going to come out five years from now. And then getting on all the phone calls, as I think Eric outlined, to debate two data elements and ballot and email back and forth. It is a huge time commitment. I appreciate all those people that are on those committees.

But it is just for the providers and the end users to have a voice and I do not have the solution, but it seems like that is a gap because when the standards come out, I think we could offer a valid voice to say does it really work or not.

The second thing and I have already talked about this, but just to hammer it home a little more is the adoption rate. I am going to use the example, the 278, which I think is one of the most basic standards. I think you guys can correct me. I think there are maybe four data elements in the thing. It is about notifying the plan when a member has been admitted to a bed. It is very basic. And we used to – I hate to even say this – print out face sheets and fax them to our health plans. And then the health plans would key that in on their end. And then the 278 came out. It was only adopted by a handful of payers. There are still only a handful of health plans that use the 278.
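To make the "maybe four data elements" concrete, here is an illustrative sketch of the kind of structured admission notice Schuler describes. The field names and the delimited layout are hypothetical, for illustration only; they are not the actual X12 278 segment syntax:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of the handful of data elements an admission
# notification carries (NOT the real X12 278 segment layout).
@dataclass
class AdmissionNotice:
    member_id: str      # who was admitted
    facility_id: str    # where
    admit_date: date    # when
    service_type: str   # e.g. inpatient bed

    def to_message(self) -> str:
        # Flatten to a delimited string, loosely in the spirit of EDI.
        return "|".join([self.member_id, self.facility_id,
                         self.admit_date.isoformat(), self.service_type])

notice = AdmissionNotice("M12345", "OH-GEN-01", date(2018, 5, 17), "inpatient")
print(notice.to_message())  # M12345|OH-GEN-01|2018-05-17|inpatient
```

The point of the sketch: once the notice is structured, the receiving plan can ingest it directly instead of re-keying a faxed face sheet, which is where the wrong-date denials Schuler mentions come from.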

We have secure email and I email an ADT file of all the admissions to health plans. Unfortunately, then they have to print that out and key it in. It helped me, but it did not help them.

Again, when I have the discussions with the health plans, it is like that is not a priority. The adoption – again, I do not have the answer, but we have to figure out how do we get this adoption because there are hard dollar savings of basic 278 because when you have humans keying it in and if dates are wrong then the data is wrong and then it is a denial and then there is appeal involved. Of the basic standards out there, I am not seeing things widely adopted. That should be so easy.

I am hopeful – again about having a voice here – I really do think there is some low-hanging fruit that we could really make happen this year. And how do we get that in the industry to adopt some of these basic standards? I am an end user. It should not take a lot of technology and investment to make it happen. Those are my couple thoughts on standards adoption.

ARENDS: Jon from Walgreens. To address the question of the frequency and the cycle, I would say at least for SCRIPT – I am going to talk about SCRIPT NCPDP. It is infrequent and irregular. Because of that, I would say it is harder for me to do long-term planning. We end up in a hurry up and wait pattern over and over again. For years, I kept hearing it is coming soon. It is going to be named soon. Every year I would put in a budget request. I would put in a placeholder into our IT because any moment we could get a rule. I just want to say that there is an opportunity cost for that because I have a placeholder. Other things are not necessarily getting done because I might have a big compliance ask coming down the pike.

Then when the mandate does finally come through, first there is, is it real. You cried wolf many times. Is ICD-10 – is there going to be a deferment, all these sort of things that people then start to bank on rather than if it was more predictable then those questions do not come up.

And then it leads to a breakneck pace. We have been waiting a long time. We have been wanting this adopted and then you have 12 months or 18 months to adopt it and then now we are looking like that is too fast. Meanwhile we have been saying we need it, we need it. The industry looks schizophrenic, I think.

Also, while we are waiting for things, I will also hold other projects. I know I have some fixes I need to do. I do not want to go in and do a small little fix because I am going to be touching and redesigning the whole section. Issues persist longer than they would need to if the standard would have come out earlier and we would all just adopt the new fix.

STRICKLAND: What I am hearing is more of a predictable, reliable schedule. Like some people are saying six months. Maybe it is a year. Whatever it is it just has to be predictable and reliable and not deferred or postponed or whatever.

ARENDS: Correct. With SCRIPT, I knew I could start on the really old stuff, but I never knew how far and to what version I needed to go to because as time went on, we kept balloting and improving newer versions. I could never say here is the line in the sand and we need to at least get ready for at least this much.

There was still even talk about even if we get the interim rule, if we have a new ballot in between then, we will ask for the newer version. It makes sense. We want to get as much as we can into it. We do not have the opportunity very often, but it causes even more heartburn on our side.

STRICKLAND: It even prevents you from doing pre-work that you in theory should be able to do to prepare for the one that you know. For example, the X12 guys, out for public comment. If you were a betting person and you thought that maybe 7030 was going to happen, you could actually start thinking about what database changes do I have to make, what interface changes, what web changes, all of those things. But then if it never happens, it was all for naught.

ARENDS: And there is only so much I can do on the business side because – I am constrained on the business side, but they are more constrained on the IT side. Without a project, they are not going to be looking at any of this.

MATZUURA: This is Bryan Matzuura. Mine is a lot more high level, I guess. You do not want to hear about – the last two topics and what we talked about. I think all these details are also really good. But then I kept thinking. If you do what you always did, you are going to get what you always got – a 20-year-old process that also needs revamping. I think this is something that we should probably think about in parallel with all these really good ideas about how do we fix stuff today, but where are we going in the long run.

To me, from an adoption perspective, we have to really get better at understanding why one set of standards has been adopted broadly while another set languishes.

The other is really I am a fan of Agile, but also what I have learned over the last two years, incredibly painfully, is that Agile is not just a change in your mindset and your behaviors. You actually have to plan a lot more for Agile or you have an Agile mess. You could do it really fast and you could do it in small increments.

To me, as we look at the decision-making bodies, it is not just the process, but you have to look at – this is what we did for us in the Agile, start looking at the guidelines and what are the criteria. What are the criteria that will help us to make decisions, to measure, and to prioritize? That is hard. They tend to have to be across the board both for the governance, for your standards, for some of the other things. That is going to take a while, but I think it is also worth thinking about.

Part of the criteria is what I think a lot of people talked about and they mentioned over and over. What is the benefit? What is the benefit to me? Is there a return? Is there going to be enforcement or no enforcement? You do not need it looking the same for every standard and for every answer. But we need a smart board like you guys who could help make those decisions or bring in all the right people to figure out where does this one fall into.

My take is let's think about these from two perspectives. Not only how do we revamp the process, but how do we look at the process internally and get some criteria so that it is clearer how we make decisions, and so that the criteria are consistent with the values of the people who are going to implement the standards.

BELL: Building on that point, I am wondering if we need to figure out a way to make it easier to adopt. Are we looking at technology and delivery methods and such so that we can consistently adopt standards? I hate to say it – a standard for the adoption of standards. For example, like Git and things like that, to where we have a universal dictionary of standards, but that brings in the problem of intellectual property from the various organizations. If it was out in a public domain that you could go and grab and update, even if it is not gospel, it is very easy to grab the latest version for your betas or your pilots or whatever and put them in your environment. We talk about Agile and we have had multiple people talk about Agile, where five years ago, Agile meant how quick can you get out of the way of a problem.

I think we need to really look at what is the technology that is available today to make adoption easier because that is going to be a huge return on an investment with very low hanging fruit.

STRICKLAND: Mark.

GINGRICH: Mine was just a little add-on to what Jon said. At NCPDP at the maintenance and control level, I think they have a policy, a consensus-based policy that I believe was adopted across telecom and SCRIPT, and the policy is around trying to move forward with a new version every three to five years. And, again, you get to the three-year point. There is a review within NCPDP on whether or not we are ready, as a consensus, to recommend it for adoption. That is sort of the case.

For me, it is consistency. It is cadence. Internally, in our software engineering and product development life cycle, it is whatever the cadence is. Ours is software releases, major releases five times a year, whatever that cadence is as an industry, it is not what we do from an engineering standpoint. But some level of cadence gives everyone a chance to plan ahead and get ready for the updates.

STRICKLAND: Janet.

CAMPBELL: I also wanted to just underline what Jon said especially around deferments. I think deferments are really problematic in a number of ways. It is teaching people that they do not actually have to execute on the regulation because they are rewarded for not doing so, which is incredibly challenging.

It also means that if you are one of the people who jump on it, you are then supporting the old way and the new way and that is harder to keep that going and that introduces more complexity for everyone involved.

I thought it was interesting that you mentioned the 7030 in terms of forecasting ahead of time. I do not know if I would give it a 70 percent chance. I would not act on a 70 percent chance, and even right now, post-regulation, it is like 60-40, right? So that is not necessarily getting the wheels into action.

I would say that I would rather have a longer and more thoughtful process that was reliable than a more rapid process that was not.

GNAGY: Just real quick. I would agree with that. The goal here is not to put in a bad process. The second thing, with what Jon is saying: everybody has their secret sauce. The secret sauce is going to be handled differently. The standards are going to have something to do with that. The standards are going to be involved with that. But again, if we are talking about the technology – and John, you mentioned that too – the technology of the standard itself is key, because we can dress this thing up, and if X12 dies, we are just putting a very pretty corpse in the grave. If we are not dealing with the technology and making sure that it is going to function the way we want and do the things that we want for all of our customizations that we need for the future – if we can make the engine run well but we are not able to sell the product or get the product out there, it is not going to be useful.

COUSSOULE: Just to follow up on what you were saying a second ago, is there is an interesting question that I would propose in regards to pace and cadence, which is to some degree, what is fast enough and what changes in that segment of cadence? We could say every five years, but every five years is some big thing. We could say every week and every week is some really little thing.

How do we figure out what potentially right cadences are for the kinds of changes that would be made within that? I would pose that as a question.

GNAGY: I would say use what we have right now that people are complaining about and go backwards from there. Obviously, if we are being very, very slow right now – I am not being smart with saying it. We know that the period of time we are taking right now is way too slow. We need to start the cadence or address the option of that cadence at the lower level in my opinion, than the higher level.

GRIZER: Just asking and just throwing this out there. Could we have some sort of oversight across all of the different standards? I feel like we are talking about all these different standards, but is there an oversight across all of them that could keep up with or keep a pulse on emerging trends and just ensure collaboration across all of them.

JOHNSON: That is really a great segue to what I was going to ask. I do not know that the organization we are just talking about really does that. That is my first problem.

And the answer to your question. The reality of it is this is self-evident, but we need to say it in the process. How big is the change? I am not being flip whatsoever. The reality of it is, we were having a discussion at lunch and some of us remember when UBs(?) got replaced by 837s, and some of us don't. And there was a God-awful stench in the air about that would never happen. It did happen.

But if we went back to our organizations tomorrow and said let's get rid of the 837, I do not know what your CFO would say. I know what mine would say. But we have to somehow figure out – what you cannot do to us, what we will not be tolerant of, is little tiny changes every single day. You will drive us nuts.

Somehow, we have to get a balance. I am not saying it is easy, but that is why we need pilots and things where we can test the impact on a workflow process because that is where we seem to miss. We get standards people enthusiastic about an ability to capture something or change something, but they do not really see it all the way to the end. And having sat on HL7’s board for several years, the intent is absolutely honorable. The understanding of the impact of the end user is very limited.

SCHULER: This is a little bit into the weeds, but from a cadence perspective, from the end user perspective it appears the shell of the standard comes out, but then the content is years later. As you put your notes together, bring those together.

GOSS: Could you clarify what you mean? Are you talking about implementation guides with operating rules? I just want to make sure I was clear.

SCHULER: Yes. Implementation. Back to my claims status example. The shell is available, the 276 and 277. People are using it, but the content is not value add.

GOSS: I would like to just be clear that when the implementation guide comes out, it does have a lot of functionality, and the operating rules constrain that down. The shell has content. The operating rules, from my perch, actually just constrain that to a very discrete set of agreements that are negotiated at a national level for trading partner usage and efficiencies.

SCHULER: It is limited is what you are saying.

GOSS: You have to have both.

SCHULER: We got the standard done, but is it value add? Is it meeting all the needs?

GOSS: I think it goes to the end user perspective. Does it really bring the value to my day-to-day business that I need and really tying those together?

SCHULER: Really tying those two together. Again, it goes back to the adoption too. You can pick and choose sometimes what is populated and what is not in the standards. Again, having that enforcement around what is populated in these standards is critical.

BARLOW: I echo a lot of what everybody said. The main thing I would do, just to go back to the very first question you asked – from a payer perspective, think about the current cycle of the standards and how they implement change to an operation like mine and to the provider community. What is missing in the current cycle is that you do not have a beta or test or try-me-first step to test it before you make it final. That is what is missing in the current cycle.

If I had to give you one suggestion – I think what is missing is, because of the time cycle that you inherit from the regulatory process, you need to introduce an interim cycle so that you can get to that next step with a lot more credibility. Get people to adopt it based upon proving the point and showing the value add, making the business case. Because if I am doing anything in my operation to change anything in the IT world, the first question I get asked is what is the value. What is the use case and how many users are going to be impacted? It is a great thing. I can make a diagram and it looks beautiful. But if only three people use it, it dies on the vine in a heartbeat.

If you can add one thing into the current cycle, it is this test and beta cycle construct, and also multiple platform scenarios, because you really do need to be looking forward to the technology. I am not saying move away from – when you are considering a standard now, you have to start looking at – there are multiple ways just about any of this information can be exchanged today. We need to make sure that the standard is focused on what is being exchanged and what needs to be processed and not how to exchange it.

GOSS: Deb, what I would like to ask is that we take – if anybody has any points that has not been made yet, that we have those so we can go on to the next themes. If there are any concepts we have not heard yet that are —

CAMPBELL: I can do it in one sentence. As a part of that having – it goes to your question of resources, having testing tools already set up and able to be used is very helpful.

STRICKLAND: That is what I was going to ask Mike. One of the problems inherent in piloting and doing a test is telling a payer, hey, implement this ahead of any regulations. Oftentimes we hear I am not touching that until there is a reg on it, because we do not know what is going to happen, or you do not know when it is going to happen, or something else is going to come along.

Having parallel systems – in theory, let's talk current day. You would have to have your 5010 running end to end and then you have to have 7030 running, and to prove that that was going to work along with your partners, along with clearinghouses, along with everybody else in the system, to make sure that that pilot was working, make sure that everything was operating. As a payer, would that be an investment that your company would make?

BARLOW: If I see value in where you are going with the change, I will adopt it because I can build it into my business processes in advance. That is exactly what I did when I implemented the tools that we did in 2014. We went to CMS. We saw the potential. We got approval to do the pilot. It opened the door. I did not even see the potential in the long range. I saw the immediate outcome. But in testing it, I found new horizons, new challenges, and new limitations. As a payer – at least, most businesses are progressive and looking for an edge. As long as the standard is introducing to me a new way or a better way, I am going to reach out as far as I can within my capabilities to try to adopt it primarily because I want – if it is going to come any way, why wouldn't I want to be the one to make sure that it is done – if it needs to be tweaked, let me test it. I will help you test it and tweak it. When you just lay it on the whole community at the same time then you get everybody's problems all conjoined and comingled.

WILSON: Just real quick to – Mike, kind of go along and kind of what Jon said too. Two points. One, when the 278 was regulated and it is a HIPAA transaction, as a national clearinghouse association, we did a survey. I will tell you. It is a regulation and we built it. Some 85 percent of the clearinghouses built the 278.

In our survey that we testified with NCVHS, I think we had less than 2 percent of traffic even being used on it. Again, that was a lot of money to go meet a regulation, lots of money and a lot of effort, where resources could have gone elsewhere if we had had a pilot program.

But I think one of the things we are learning like with the CMS project with NGS, we have gone to the current version of the 275, 277 request for additional information, the 6020. It is very downward or upward compatible, which does make a difference, Sean, in what you are saying so you have less risk in doing it. Across stakeholders in the pilot, we have learned a lot and again where is the ROI. We learned about the unsolicited is very specific to provider type versus the request. It is much broader and more volume and more usage.

But I think that we have talked about the pilot program. Do we see whether it is funding for pilots? We have the collaboration and Argonaut and all these different efforts. But are there also when we are looking at some of these new things, are there – what is the pilot? What is the definition of success of a pilot? Is this something where again there is funding to help to be able to do this? There is concern as we move forward to look at some of the 7030s. I do not want to be in a 278 building it again and nobody coming. It is a big concern. That costs us our corporate line.

SCHULER: Based on what Sherry said, again, a take-away – and I think it was talked a lot about the payers and providers. If we could have – I will volunteer Ohio to be part of that once a transaction is picked. If we had health plans and providers together committed to being a part of those pilots and betas, I think we need that. We can finish up working out the kinks and then roll it out to the industry and then show the ROI, possibly to get better adoption in other areas, and put the white papers together and the case studies that people can take back to their funding groups to source it.

GINGRICH: I will have to check with our CEO, but I would like to volunteer to host that pilot. On the health exchange today – what I am hearing from discussions is we have data that payers need desperately, because they have huge costs internally to re-key stuff that is available right now in production, electronically, in structured form on the exchange and through Carequality.

We have a trusted legal framework. We have a testing tool. We have a piloting process. We are in production operations. We have iterative development and consensus-based governance as well. I would like to just volunteer: if you want to do a pilot, we will help host that and apply all I just stated to help with the pilot, including the clinical data.

STRICKLAND: Thank you very much for all your input. It has been very informative. It is definitely something we need all of the thoughts and will help in adjudicating and making any decisions.

Agenda Item: Theme 3: Federal Regulatory Process

LOVE: I will go right into the really exciting part, the regulatory. I have a disclaimer here. I am not a federal employee; I am a special government employee. I have never written a federal regulation, but I have responded to plenty of them, the NPRMs. I stayed at the Hyatt last night so I am good.

As I look at the numbers up there, I will go through them very quickly on the regulatory process because, yes, it is daunting. It reminds me of the day I sailed on the USS Cole. A ship that size cannot be steered fast. They have to have the horizon, the point they want to be at in three hours. They cannot turn on a dime. I feel that that is kind of the way with the federal regulatory process. But I want to say that I think some of that is by design.

It also is an opportunity for transparency, having been a part of the responding to the rule making process and it does bring in broad inputs. These are the pluses. It levels the playing field, but there are some negatives. I am just going to say it upfront and we have heard it today. It is rigid and sometimes written to the lowest common denominator.

On these numbers, I will go fairly quickly. There has to be a trigger. Each rule starts with a statute law, executive order, judicial decision, advisory committee recommendation, or compelling stakeholder input.

Then it moves into Box 2. It is written based on those recommendations or statutes. Most rules are proposed to alert the public of the government's intent to do something or its intent to require something, and to solicit comments.

The third box, what goes into it and I always am impressed with the amount of background work that goes into it. I think it could capture some of that pilot stuff. What is the significance? What is the population that it impacts?

That preamble is quite extensive in the rule making and the reading of it. I learned a lot about what is in their heads when they are writing the rule. The provisions and the text are usually brief. A lot of it provides the what and why of the rule and that is where I really think testing might inform some of that.

There is a unified agenda schedule that controls the publication of rules. They are usually spring and fall. Timing is sometimes a real beast to work with. The proposed rule does ask for comments. And the final rule may differ from the proposed rule based on those comments. Most rules allow for a 60-day comment period. I have seen 90. Rarely, I think, 30, and some of the federal people might want to correct me if I am wrong.

The operating rules require that the stakeholders comment on both the policy and the standards and operating rules. It even adds more intricacy and technical engagement.

One of the boxes here is preparation of an interim final rule, but that is not common. It is a difficult standard to meet and it does not allow for comments through the typical proposed rule making process.

And then the OMB, the dear OMB, it governs a lot in this. They have to look at the rule and the cost-benefit analysis and other regulatory compliance. And then the ultimate I guess, end point, if it passes through all these gates and all these boxes and the USS Cole is on the right direction, I guess, it results in the publication of interim or final rule.

I am sure that there is not much discussion here. In our deliberations, some of the comments are listed here. The DSMO presents the recommendation to upgrade adopted transactions or code sets to NCVHS. NCVHS reviews the requests, conducts hearings on these proposals, reviews the testimony, and makes ultimate recommendations to HHS, which may respond and may not. HHS reviews NCVHS' recommendations and really it is HHS that determines the next steps and whether rule making proceeds.

GOSS: Just for clarity purposes, our process happens – this slide should have probably been flipped with the one before, not your issue, but my issue. It funnels through us. Then it goes into that lovely nine-step process she already reviewed.

LOVE: We have already done some things and then it goes into the boxes.

The problem statement: the federal process for adoption of standards and operating rules is lengthy. It has unpredictable duration. It contains numerous checks and balances, which sometimes is a good thing and sometimes gets in the way, and which arguably duplicate similar processes within the standards development organizations. In this discussion, we would like to hear your thoughts on the regulatory process and how it may advance or hinder achieving adoption.

GOSS: I think they have comments because while you have been talking, they all put their —

BISHOP: Just a quick comment. I am just making sure that I understand and clarify. The DSMO process that we talked about first thing this morning, happens prior to this process.

GOSS: It would go SDO does their stuff. DSMO is involved. It comes to us, as NCVHS. We say yay or nay, based upon industry engagement. Then it goes into the nine-step process and the feds.

BISHOP: Fantastic. And then it goes through the legislative process.

GOSS: That can be anywhere from a 4- to 10- or 12-year process.

LANDEN: The DSMO may or may not touch it at the very outset – the DSMO may be where the request originated. The DSMO may or may not be involved in the middle. The DSMO must get involved after the SDO and before NCVHS.

DOO: Can I make a friendly amendment? Just because it is really – and Rich said this at the – because everyone has been really cooperative. That the operating rules do not go through the DSMO so just to add to the complexity. It is important. The operating rules will bring forward those new rules directly to NCVHS at a hearing and say we have a new operating rule for your consideration. It does not go to the DSMO. That is a separate process. It is only the standards that will go to the DSMO and then the operating rules will come to NCVHS. It is like they are two bodies. That is just to help you be confused.

BISHOP: That is great. Just a comment and then going to others. I think half the people in the room are wearing Apple watches. We think about the fact that that was introduced in 2016. Half the people in this room have that. If that is the ceiling for being able to develop at the rate of technology and we are talking about a potential – anywhere from 4 to 12 years, how does the regulatory process advance or hinder? I would say that the advancement part can maybe be taken out of the question. But how we get faster reasonably understanding that the federal government is not going to be Apple. That level of innovation and adoption. We had this incredible conversation about adoption. We are never going to get to Apple standards, but there has got to be a happy medium and how much of the bureaucratic red tape can be cut.

LOVE: That is on the table here. I think we want to hear some ideas and some thoughts.

WILSON: Good lead in. Thank you. Just as a use case. Two points. We look at the attachment regulations. It was back in 2005. I did a comparison from 2005 to 2018 just on proposed regulations, 2010 and 2013. When I went back and looked, the rationale for attachment regulations, the ROI and the business needs were exactly as they are today.

We look at – the states have moved forward with attachment regulations. They move forward because back in 2005, they knew they were going to get adopted so they went ahead and moved forward with that.

Today, going back – I think, John, you may have said it. Regulations – coming or not. Attachment regulations. I know that we are looking for an NPRM in August. It may be delayed. But I have states right now – we also are members of the IAIABC. We have people here in the back from the international workers' comp standards organization. These are providers, all the same providers, same clearinghouses, same vendors, using the HIPAA transactions.

The states have waited. We are trying to hold them off saying they are going to be actually really coming soon and they are at the point that you know what. We are going to go ahead and start moving forward if they do not come soon. Now, we are going to have a bifurcated system for our stakeholders. That is a concern when we are expecting something and then it does not happen.

The other point I would like to make is about NCVHS. Over the years, I have had the opportunity to testify. We have had industry guidance from standard organizations from things such as this. I think you guys have made great recommendations, very solid. And sometimes I question – it is very frustrating to know we all came in and spent time on these issues. You wrote great letters of recommendations to HHS. And then we see nothing getting done on the back end. It makes you think oh my gosh. We are coming to another NCVHS testimony. Are we going to be in the same situation? We are so supportive of the value brought to our industry. There seems to be a gap between your recommendations and what the procedures are behind the scenes with HHS.

GINGRICH: It is more of a question. With the standards that are brought forward with recommendations from an SDO and getting into this process, how often have those recommendations – I do not know if we have any data on this. How often have those recommendations changed between what is recommended and comes in versus the outcome two to four to six years later? I am just curious if we have numbers.

GOSS: I am going to do this in two parts, Mark, because that is a really great question. I think part of the issue is also that NCVHS membership is done in terms, with a maximum of two terms of four years each. We roll in cycles. Although I used to sit on that side of the table.

My sense is that there is tremendous industry agreement about – even if they do not all like it, there is usually agreement about we need to take this step before it ever even gets to NCVHS. When we usually get a final rule, it is very much aligned with what the industry has said or at least pushes an envelope on something in the proposed rule so that industry can then weigh in on maybe a twist of it.

Susie, you may want to also weigh in on some of this. Maybe not.

One of the things that I am really curious about hearing, and I know that some of my peers up here also want to hear about, is a little bit more about specifics, dates. What is the right cadence? We agree with your commentary. We know these are the issues. But are there any really specific strawman dates – Joe, I do not know if yours was already up and I missed it.

BELL: In answer to your question, I think it depends on the size of the regulation. It seems like we are trying to create something that has already been created. If you look at the software development life cycle, it is the same thing as the standards development life cycle. I do not think we have ever defined who is designing it, who is QA'ing it, who is analyzing it, and who is deploying it. It is just a cycle, the same way that we develop software or launch products in the industry today. Every one of our companies has similar cycles. I just do not think we have defined all those roles to say this is your responsibility to do these tasks.

GOSS: If we do define that, who do you think pays those people to do that? I have been a standards developer. I was the chair of X12N. I have lived through 4010A1 and 5010. I would like to say I was a redhead back then. It is a volunteer basis to pay and play.

BELL: It really comes down to, I think, there needs to be some funding that happens. Part of that funding – I think the QA is really a part you want to make sure is accurate, that somebody is looking at the ROI and the business case in this stuff. It just follows a checklist. That checklist may vary, but the steps are the same. You will eventually be able to tell, based on an evaluation of how big a change it is, who it affects, et cetera, and that will give you a predictable timeline: okay, this is how long the checklist will take.

WILSON: In your model, do you put regulation after the release point when you are conceptualizing it or not at all?

BELL: Regulation should be after the release point.

PARTICIPANT: Not in post-development.

BELL: I agree, but I think if you go back to the point: if we make it easier to deploy by using technology, so it is more cost effective to deploy, and we have a standard checklist of the process – making sure we have our checks and balances in place because our checklist is clearly defined – then you will have a much more effective process.

GOSS: That means that the industry should drive that process and come and say we have now decreed we are ready. We have done our checklist. We are good to go. Please bless it so we know the official guardrails that we have to build the train to.

BELL: Exactly.

GOSS: I am not seeing everybody’s heads shaking the same way on that one.

GNAGY: What Joe is basically explaining here is that we are not creating rockets – and I am using that poorly because we can create rockets pretty easily now – but it is not rocket science. We are talking about software – we are talking about an iterative process, whatever we decide to go with in terms of the organization of the processes, how they are enacted. It boils down to – you were saying, or somebody else was saying, that you do not want little iterations. You do not want these little bits and pieces so you constantly have to be developing. But what if you do not constantly have to be developing? What if, when I release a version, you do not have to do anything? Your version still works. Your piece still works. You can then, after a release or even prior to a release, understand what that release consists of and then go back to your – John, you were saying earlier, I am throwing everything I can at the wall and praying something sticks, so if I get hit with a reg, I have my money in the bank.

You guys are going to hear me say this probably a couple more times. If the solution – if your standard is built so that your problems do not matter because you do not have to change. Your problems matter, but they are not relevant because you do not have to change. You see everything as it is coming out. And, again, he said something that was fantastic. What if the system is set up so that it has made it so it is more cost effective to release the updates? That would be fantastic. You get these updates.

But the whole process or the whole product you are releasing is built in such a way or has the extensibility that whatever you do, whatever you decide, John decides, whatever I decide, I have a mechanism where I can produce that. Even if an update comes out in two weeks, it is completely what I did in a different way and maybe even better. This extensibility builds in a construct in which everybody can be explained of what we are doing. What is the customization? Here is a very simple construct now in which you have the ability to now have these discussions and say here is our customization. Here is what we did with the extensibility options. And then those can be reported on to then build your list of priorities. Again, we are talking about this process. If we get the process down well, we are still going to have to deal with the bureaucracy, not getting rid of that.
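The extensibility model Gnagy describes – a core record that keeps working while each partner adds customizations in a reserved slot – can be sketched, purely as an illustration. The field names and the namespaced extension convention here are hypothetical, not drawn from any actual standard:

```python
def read_member_id(record):
    """A core reader touches only the fields it knows about; anything in
    the extensions slot passes through untouched, so a partner's
    customization never forces the reader to change."""
    return record["member_id"]

# Hypothetical record: core schema plus a namespaced extension area
# where local additions live without altering the core fields.
record = {
    "member_id": "A123",
    "extensions": {
        "x-workers-comp/jurisdiction": "OH",
    },
}

assert read_member_id(record) == "A123"
# A reader that predates an extension simply ignores it.
assert record["extensions"].get("x-unknown/flag") is None
```

Because unknown extensions are ignored rather than rejected, a new release or a partner customization does not break existing consumers, which is the property being argued for above.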

It is unlikely we are going to go fast. I do not see – even if this gets going, there is not going to be a massive amount of speed. If you put the standard in a position where the speed does not impact the ability to use it and to extend it then you are able to put out this product and then build this great process behind it that eventually gets faster and faster.

JOHNSON: I am just glad I am sitting down here where the popcorn cannot reach me. What I hear I love, but I am a little bit cynical because what I am hearing a little bit of is plug and play. I have not had plug and play work really well. I work in a very complex environment where we have hundreds and hundreds of software pieces that have to be touched. I did not hear much about workflow. The technology cannot drive the workflow. It has to be the other way around. I am not at all sure – so forgive me – who pays for all of this, because if you are saying adopt the standard, I cannot quite connect how we – I think we are right. We cannot skip the regulatory process. I do not like it. I do not like it for anything – everything we do goes through this process and we are all familiar with it, and some do go much faster than four years and some seem to take 20. I am not quite sure how that interplay works. But I do know that it costs money to do that.

I am really challenged with accept the standard and move forward before it has been regulated. The only reason I say that, and that is why I am struggling, is I love what you are saying about put it in, and I cannot tell whether that happens after it actually becomes law or before. Pilot-wise I get it, but I felt like we were saying do it now and hope it becomes regulation. I will never be able to sell that because I would have to change thousands of interfaces, hundreds of connections, workflow. I am not trying to exaggerate it. It is just a reality. I may not be understanding what is being suggested, but I would be very wary when we talk about plug and play. We have to find the happy medium. We have to move faster. We have to be more nimble and agile.

Just so you know, we are already having trouble with agile development because people are very comfortable with – you hear about it. You talk about it. You discuss it to death. It then gets out in a fashion. Then you put it in, and then we test it, and then we talk about it some more, and then we decide to go to production. Agile means it happened overnight and we did not ask you. You get an agile phone thing. How many of you take Apple – whatever phone you use; I use Apple. How many of you take the iOS update the day it comes out? There is one brave soul. Not me. I wait because I at least can figure out how many things are broken and wait for the first patch. I used to take it after the first patch. Just be thinking about that as you – I do not want to hold us back. I want us to move forward, but you kind of get where I am going.

LOVE: Before I go to Andrew and Janet, are there any comments on plug and play versus not plug and play?

BELL: My vision is not plug and play. My vision is called versioning. It is versions. I come from a technical background. Anybody who has ever dealt with Cisco routers knows they have 18 million versions of their software: some are general availability, some are hot fixes, some are beta versions. If we can figure out a way – and this is where it gets into the issue of intellectual property – to distribute these versions to our development people at the earliest possible timeframe so that they can do development work, pilots, et cetera, they will basically be QA'ing these versions and telling us this is wrong, this is not working, et cetera. It puts them in a position, when the general availability release arrives, to actually deploy it. It is actually in their best interest to keep track of this, and it is going to let us know what is coming.
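Bell's versioning vision – general availability, hot fix, and beta releases, with backward compatibility within a release line – might be sketched roughly like this. The channel labels and the "same major version is compatible" rule are illustrative assumptions, not any SDO's actual policy:

```python
# Release channels, loosely following the Cisco analogy above
# (these labels are illustrative, not from any standards body).
CHANNELS = {
    "ga": "general availability",
    "hotfix": "urgent patch",
    "beta": "early distribution for development and pilot QA",
}

def parse_version(text):
    """Turn a dotted version string like '7.0.3' into a comparable tuple."""
    return tuple(int(part) for part in text.split("."))

def can_process(mine, incoming):
    """Treat releases sharing a major version as backward compatible:
    a 7.1 message still parses on a 7.0 system, while 8.x requires a
    planned migration."""
    return mine[0] == incoming[0]

mine = parse_version("7.0.3")
assert can_process(mine, parse_version("7.1.0"))      # minor update: no rework
assert not can_process(mine, parse_version("8.0.0"))  # major update: migrate
```

The point of the sketch is that when only major-version changes force rework, incremental releases can flow continuously to developers without each one triggering a forklift upgrade.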

LOVE: Duly noted and I have some other questions, but I want to go on to Andrew.

BURCHETT: Mine has to do with all of this over here. My list is just growing. To complement Joe and Brad, that code versioning you speak of, in software development, we use that a lot so long-term release versus a patch. If you put a patch in because you want functionality, you know that you have to go back and implement the long-term supported addition.

I think of an example we have with some government payers where they want some functionality. That does not exist today in X12. To Ohio’s point, the provider portal – because its functionality does not exist in X12, other systems have to handle it to get that data back to the provider.

The code versioning is a good example. I think we need incremental releases. I think regulation though does not belong on the other side. I think it is more of a sunset date type regulation where a standard comes out. It might be this is a short-term patch type standard. You can implement it. With regulation to sunset – bigger standards like a 5010 sunset date to allow for like 7030 to coexist. I could tell you. Our development team is very used to working in that sense where we can accept various versions of things and we do that in all of our APIs today.

But one thing I did want to mention is kind of the mindfulness of provider burden. UHIN being a not-for-profit, we give out some free software to providers that cannot submit electronically and so on, and I know there are a lot of other clearinghouses that do that. But this cross compatibility also allows you to limit that provider burden, rather than these forklift upgrade type things like 5010 was, or ICD-10. Mandi next to me – or maybe it was Denise – mentioned that when 5010 happened, we had providers driving to our corporate office to ask us to print their 835s because their systems just could not handle it.

We actually have sunset 4010 connections to various EHRs that were still running 4010. We said we cannot do this anymore. We have provider groups that have – and also I think some specialty like MCOs and ACO type, managed care type groups that do like a flat file type thing. Step this up to this for us. Can you convert this? We are starting to sunset some of those and give them this backwards compatibility because we do not want to support it every time a small change happens.

The last thing I will leave you with is there are some good examples out there of backwards compatibility. HL7 is a good one. As we operate an HIE, we exchange data using HL7 V2, V3, or FHIR resources, just depending on the type of use case that is out there. That works really well. It works really well that these standards are constantly updated and released and we can consume them. FHIR is a little bit dicey.
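The coexistence described here – one system consuming HL7 V2 messages and FHIR resources side by side, depending on the use case – can be sketched as follows. The format detection is deliberately crude and purely illustrative; real HL7 and FHIR processing is far richer:

```python
import json

def detect_format(raw):
    """Crude sniffing for illustration only: HL7 v2 messages begin with an
    MSH segment; FHIR resources are JSON objects carrying resourceType."""
    if raw.startswith("MSH|"):
        return "hl7v2"
    try:
        body = json.loads(raw)
    except ValueError:
        return "unknown"
    return "fhir" if isinstance(body, dict) and "resourceType" in body else "unknown"

# One handler per format lets old and new standards coexist on the same wire.
HANDLERS = {
    "hl7v2": lambda raw: {"source": "hl7v2", "fields": raw.split("|")},
    "fhir": lambda raw: {"source": "fhir", "resource": json.loads(raw)},
}

def route(raw):
    fmt = detect_format(raw)
    if fmt not in HANDLERS:
        raise ValueError("unsupported message format")
    return HANDLERS[fmt](raw)
```

A gateway built this way can accept a legacy connection and a newer one simultaneously, which is the backwards-compatibility property being praised above.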

LOVE: Do you have any other points because I am getting the ax here. I want to hear Ray and then Janet and Margaret.

SCZUDIO: Thank you. Let me just take a step back from this because one of the things – and I am kind of here with the consumer hat on even though we are a provider to consumers. But I guess one of the questions when you are developing standards is who are the stakeholders and are all the important stakeholders at the table. What is the ultimate goal of a standard? Is it interoperability? Is it actionable information from all this data that you have particularly? Is it for the benefit of the patient or the consumer? That, I think, is going to help not in the technical procedures that you all are talking about, which are very important, but really as to what your ultimate goal is. I think that would be a helpful point to keep in your development process.

LOVE: I think Janet was first and then – Margaret, go ahead.

SCHULER: It is volunteers that have made this a mission really to help the industry. I applaud everyone that has done that. But that is not going to be a fast process. The funding unfortunately I think is at the core issue here and the stakeholders are payers and providers, a big part of the stakeholders. Either the government pays or payers and providers come together and pay. Someone has to pay at the end of the day for this. I do not have the answer, but I hear the problem.

The thing is, if we wait four to ten years for cycles to come out – we cannot wait that long. There are just naturally going to be workarounds everywhere, like we are doing today. Either we have to figure out how to fund this – and do we privatize the standardization and pay a private group that is funded somehow to help develop standards, get input from all the appropriate stakeholders, someone that has the wherewithal to go through all the cycles that we are all already doing? We do not need to reinvent that. But the funding is critical. I think we have to nail it down if we are going to look to this standards process to help the industry. If not, we are just all going to keep doing workarounds. That is just the reality, unfortunately.

That was it. Privatization. Do we look at who funds it? Do payers and providers come together and pool money together somehow? Have our own tax. Or does the government pay for it? That is not always a solution either.

COUSSOULE: That is basically a tax, too.

SCHULER: It is a tax either way.

LOVE: Janet and then Jon because I am trying to go in order. Go for it, Janet.

CAMPBELL: I just was thinking. The concept of extensibility sounds good, but it scares me in practice because it is kind of an avenue for those workarounds, and almost by definition it is a non-standard way of accomplishing the same thing. When you are working with a bunch of people doing that, then it gets a lot harder to maintain across the entire space. I like the concept, but the execution worries me.

Alexander, you asked what the cadence should look like, what cadence we should aim for here. EHRA, the Electronic Health Record Association, has said that we need at least 18 months from the publication of the final rule to the actual enforcement of it – at least that much runway for a large regulation. In this case, I would probably go with two years, just because we are coming off of a cycle of four years and that seems pretty ambitious.

The other reason I would say two years is because this is not the only standards scheme in town. There are other standards-based initiatives that are hitting providers at the same time, and so making sure that we are cognizant of that I think is really important.

GOSS: I suspect that some of the collaboration we are doing with ONC needs to really factor in how do the bookends of administrative financial and clinical data all work together. And I would like to see us have that larger vision of where we are headed while we try to do some stepwise fixes along the way. At least, that is my personal view.

CAMPBELL: We have quality measures.

GOSS: Absolutely. Value-based care especially.

ARENDS: Jon from Walgreens. I am going to take this in a little bit different direction than we have gone previously. Going through the SCRIPT update with NCPDP, it became a realization that I think a lot of the things that we do not like about the rule that we got could be traced back to legislation. If we start talking about how we can fix the regulatory framework that we have, we may also need to look and say where our hands are tied because of legislation, and start having potential action plans to deal with some of that.

Some examples that we have – we are having a January 1 implementation date, which is in the middle of code freezes. It is probably the worst possible date to do a changeover. And the response that we got was, you said it was a significant change, and legislatively, if it is a significant change, it has to go January 1. We are not going to ask for an insignificant change, and stuff like that. Perhaps we can start working with our trade associations to look at legislative things that we can do. But I think we need to know that and have it all detailed out so we can start reacting to some of it.

LOVE: Brad and then Deborah.

GNAGY: I would be interested to know real quick, and this may not be able to be answered right now: is there any historical precedent for funding of the type that you are discussing? I would be interested to know —

SCHULER: No. I should not say that. I do not know.

GNAGY: I do not know either. What does that look like or has it looked like in the past?

GOSS: Funding for the SDLC we are talking about? I just want to clarify what kind of funding you are asking so we can do some more homework on it.

GNAGY: What Margaret was explaining – and you can probably explain it better – is funding of the organization of stakeholders, gathered stakeholders for oversight. For example, the way that we deal with it is I have a committee of my users. My stakeholders are my users. Any sort of change request goes through my users before it ever gets to me. That is my committee of stakeholders that looks at the requested changes. They use my software. They come back to me and go, yes, we think this should be an across-the-board change.

SCHULER: And when I say no precedent, meaning should we having clearinghouses, payers and providers because those are the main – there are more stakeholders than that, but those three. We all have groups within ourselves that we all pay into, but do we have a multi-disciplinary team that pays in?

GNAGY: That is what I would be interested in. Is there precedent in other industries?

GOSS: It depends upon which slice of the health IT pie. You would want to look at multiple models and multiple decades, which we should take a look at.

GNAGY: But that was a good comment.

And then the second thing real quick on extensibility, I appreciate the hesitance, but we do it now. It is very effective. Obviously, you can do it poorly. If I just say here is your text block, go to town. Probably, a bad idea.

The number of avenues, the number of issues, the number of problems that we have brought up here today and have potentially put a marker on to say probably not going to change that very quickly or very effectively, the bureaucracy, the regulatory process, anything that we want to make changes of, it is going to be slow. Why not have a standard that flows with you? And the people that do not want to use it are scared to use it – not scared. I am so trying to be politically correct and I am the wrong person to do it. Because extensibility can be very sketchy if done poorly for sure. I would agree with that 100 percent.

STRICKLAND: I just wanted to make sure that I circled back just because we went in a different way than I thought this was going to go, but that is okay. That is awesome.

One point I do want to make is that once we send a letter to HHS with a recommendation, the Secretary receives that letter, and then it is a black hole until something happens or a regulation pops out at some point. We do not have any transparency. We do not understand what is going on inside that box. There is no timeframe. There is no "you must respond to us within 60 days" – at least an initial gut reaction. Is it something you are thinking about? Is it not? Are you already putting it in the garbage? What is happening to it?

I think that it would be helpful to have more transparency about what is happening there and is that the right place to be.

LARSEN: This is Kevin. I can explain it a little bit. Different agencies have different policy timelines, and they are actually set. At CMS, we typically do annual rulemaking in all of our fee-for-service rules, the payment updates. That is Medicare. But other parts of CMS have no regular cadence of rulemaking, and other groups sometimes do not even know when their next regulatory vehicle is going to come; it is actually up to political leadership to decide whether we are going to do a rule or not. We do not know, as the federal government, when the next window of opportunity is for even potentially putting that standard into a regulation. I think this is a great time to have a conversation back and forth.

But I will say that from the federal side, it is not even clear necessarily to us which are the windows for which different standards.

COUSSOULE: On that note, we are going to do a break for about ten minutes. Please be back here at three on the dot and then we will get rolling again into our next theme. Thank you all very much.

GOSS: We will be shifting to data harmonization.

(Break)

COUSSOULE: I will be leaving after the next segment, a couple of us will, to catch a plane. Please do not take this personally.

But before I do that, I just want to thank everybody again for your time today and engagement in this process. I apologize. I will have to scoot out just a little bit before we finish, but I do want to thank everybody again for their time.

Agenda Item: Theme 4: Data Harmonization

KLOSS: We have been talking about process and now we are going to get into content. And happily, you introduced the topic first thing out this morning. A couple of you mentioned codes and contradictory standards and ICD and data harmonization and all the content topics we would like to probe with you now.

This slide reminds us that, obviously, throughout health care terms can vary depending on your context. Acute myocardial infarction may be coded one way in a provider's office, and there may be multiple definitions guided by different terminologies and classifications in the electronic health record, in the research office, or in the patient vernacular.

Data harmonization came up as a theme when we had the thinking session last August. In that context, it was more along the lines of simple things, not just complex as how you categorize heart conditions, but how you code sex and things that are pretty basic descriptive demographic terms that might be carried in a transaction. We know that this topic spans the universe. We do not have time if we stayed all weekend to do all of it.

Before I launch in with some strategic questions that we would like your insights on, I just want to say that NCVHS also has the charge looking at content standards. In our way of working, we kind of separate code sets from administrative standards. But we know that that is archaic. It needs to come back together and that we need to find a way to bridge some artificial barriers that exist because transaction standards here, code sets in HIPAA, content standards in the EHR kind of overseen by ONC.

To that end, another initiative of NCVHS is to look at the whole health terminology and vocabulary landscape. We are actually doing a hearing in July with all the code set developers and stakeholders to get our hands around issues of development, dissemination, maintenance, coordination, and governance – sound familiar. But in that context, health terminologies and vocabularies.

We are preparing right now an environmental scan that will be available in the summer. One of the contributions it will make is just lay out all of the kinds of vocabularies and terminologies that are in use, not only those that are named standards under HIPAA, but those that are commonly used for other purposes.

We do not really want to go into that big world in the next 30 minutes. But we do absolutely understand that the issue of data harmonization is well beyond the challenges in the administrative transaction world.

We understand that there is that whole health terminology and vocabulary standards. We know there is a lot of rich work going on not only with what NCVHS is doing, but through the interoperability standards, through the US CDI, through meaningful use, quality metrics, and patient registries. We have a lot of groups working on how to harmonize data standards. We know we are having this conversation in a much bigger universe.

However, in this context of administrative simplification and the standards process, the lack of data cohesion was acknowledged to be one of the issues jeopardizing interoperability and adoption of standards or making adoption more complex and some of those linkages more difficult to accomplish either because data dictionaries are incompatible or data elements are defined differently across SDO work products and therefore standards.

I think in this world, we would like you to answer three questions. One, can you describe ways in which data harmonization issues and the adoption of these standards has either been helpful or it has been a barrier? We would love to capture some examples if you have them, areas where you have to do work arounds in communicating with a health plan because of terminology and vocabulary differences.

Our second question is, how would you describe the impact of the current status of data harmonization on operating costs, integrity, and quality of information? What is the impact of these issues? How big a problem is this in the context of overall adoption?

And then we would like a little bit of a reflection from you, which you started earlier this morning, on whether we should persist in distinguishing between clinical and administrative data content standards, and if not, any ideas. We will take whatever blue-sky ideas you might have for how we need to begin to bring those together. I guess we do have a little bias there that we do need to look at better bridging.

Let’s take the first two questions because they are kind of operational. What is the impact? What is a barrier? What are challenges? Give us examples because they would be very helpful to us as we craft recommendations. On those first two points, how data harmonization has been an issue and what its impact might be operationally.

NICHOLSON: I think about this issue with CARCs and RARCs – claim adjustment reason codes and remittance advice remark codes. It drives us crazy when a general definition comes back in them: missing, incomplete, invalid – telephone number, address, anything like that. Because what do you do with that? You do not know whether it is missing or incomplete. You have to then make a phone call, or your system has to figure out something to do, and you typically call the payer or go to the payer's website. It is hard, because you are drilling down to get more and more definition, but that is what we need.

KLOSS: In that case, it is a matter of whatever codes are being sent just not differentiating between whether it is missing or incomplete.

NICHOLSON: It has three or four different pieces of that and you do not know which piece you are responding to.

I do have one comment. It actually goes back to the last segment. We talked about the adoption of the Apple Watch. Who wants to be the beta site? Who wants to be on the roads with the beta testing of autonomous vehicles?

BISHOP: That was a good point. I was laughing. I think all of us are, because they are testing in so many markets across the United States. We have no way of knowing when we – we are the guinea pigs, whether we are in the car or not, which is how all of us are.

It is actually an interesting segue. That is how all of us are with respect to the impact of this lack of data harmonization. With the rush to adopt meaningful use stage two and to test meaningful use stage two, all of the patients involved in the US health care system effectively became lab rats for the ability to effectively exchange clinical information and for that coordination of care based on the data that was coming out of the EHRs and in many cases, EHR vendors that were putting those releases in every two-week cycles so that they could keep up with what they anticipated coming from the testing facilities from ONC.

This is the single largest – Gartner does a tremendous number of surveys. This challenge of data harmonization and the lack of data exchange between payers and providers is the number one source of frustration for both payers and providers. It is the number one source of distress. I think it is 68 percent of payers that say this is their number one priority to resolve, and more than half of providers.

I do not have a suggestion for how to simplify, other than to say that clinical data and administrative data do serve very different purposes. But being able to address a concept, and to have a definition of a dollar be a dollar across all of our systems, is an imperative. This is a patient safety issue, as we know – the oft-cited statistic that medical errors are now the leading cause of death. There are very real impacts. I know that it is an oft-cited statistic and there are lots of skews to how that statistic was rendered.

But there are very real impacts to our continuing lack of semantic interoperability. How we get there and how we continue to establish standards that are meaningful across the spectrum, across all the HL7, all the quality measures, what we are doing from a business standpoint and which ones of the code sets align to that, why there is a difference between a concept that is represented in SNOMED versus LOINC. How do we resolve the business use cases that are driving the difference? What are the stakeholder motivations for aligning to those particular code sets? How do we resolve that business stakeholder discord?

KLOSS: Mandi, would you be willing to share those survey findings on how big an issue this is in the minds of – that would be very helpful and add some good hard data to our findings.

BISHOP: Sure.

JOHNSON: First, I would say, Mandi — I cannot imagine doing a better job articulating the issue than what you just did. I would say to you that the only place where I would make a change to what she said is we have exactly the same purpose for administrative data and for clinical data. It is about the patient.

That is why, when you saw the slide that I submitted, it was about semantic interoperability. That is what it is about. It is not an easy thing to overcome, and people are going to give on both sides, but it is incredible to me that we still do not have agreement on heart attack, myocardial infarction. It does not matter. We have got to get there, whatever it takes. We can give you as many use cases as you want.

We all know it, but we have not really forced it. And unfortunately, the way the government runs – even today as we submit quality data, some people want it patient identified. Some people do not want it patient identified. Some people want four different fields from four other fields. Every one of us creates incredible amounts of queries to map data to give reports that we do not know what the you know what they do with – heck is what I was thinking. We do not really know where the data goes to improve care, but we keep doing it because we have no choice. On top of that, we do not even have similar definitions. It is a passion that I will never give up until we get it fixed. I would hope that you would join us in that passion.

LARSEN: For better or worse, I have a lot of experience here. In the clinical quality measure space, you are going to actually compare outcomes of myocardial infarction and pay differentially based on those outcomes. You have to have confidence in the industry that you have actually defined it in the same way and defined the outcome in the same way. We have spent a lot of time and energy here. I will not say that we hit a home run by any stretch of the imagination, but a lot of experience –

A couple of suggestions. One thing that we found necessary is to not just establish the terminologies and some agreement there, but a data element standard. And the data element standard is a different standard than your transaction standard and a different standard than your terminology. It actually defines the meta-data and where that goes.

For example, I am going to send you an address. You might want to know if this is a current address or a past address. If I am going to send you a phone number, you might want to know if this is a cell phone or a landline or my mother's phone. Those are meta-data elements that need the same amount of agreement and conversation, so that when you are sending a phone number around, you know – given what you have decided as an industry are the right kinds of tags that go with that phone number – that you are using it in the way it was intended. There is a rabbit hole underneath all that, because you can spend forever in that rabbit hole. Where a lot of the data element activities have fallen short is that they want a whole lot of context, and the context is expensive and hard to keep up and maintain. My counsel is: get some context, but be parsimonious about the context. Be as tight and short and sweet as you can.
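[Editor's note: Larsen's notion of a data element standard – a value plus a small, agreed set of metadata tags – can be sketched in code. This is a hypothetical illustration only; the field names and the tag value set are invented, not anything adopted by a standards body.]

```python
from dataclasses import dataclass

# The agreed value set for the "use" tag -- deliberately parsimonious,
# per Larsen's counsel: get some context, but keep it tight.
PHONE_USES = {"home", "work", "mobile"}

@dataclass
class PhoneElement:
    """A data element: the phone number itself plus its metadata tags."""
    number: str
    use: str        # which kind of phone this is (must come from PHONE_USES)
    current: bool   # whether this is the person's current number

    def __post_init__(self):
        # A receiver can only use the number as intended if the tag
        # comes from the value set everyone agreed on.
        if self.use not in PHONE_USES:
            raise ValueError(f"unrecognized phone use: {self.use!r}")

# Sender and receiver now share one interpretation of the tagged number.
elem = PhoneElement(number="555-0100", use="mobile", current=True)
```

The design point is that the tag vocabulary is closed and validated, so the metadata travels with the value rather than living in out-of-band agreements.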

Also, we worked in the meaningful use program, Chris and I, to define some demographic standards, and ONC has done a fair bit of work and actually has certification and some clarity about what we think those demographic standards are. That is ready to go. That is something we have worked on to have in all the electronic health records. A bunch of the patient match and the clinical exchange is based on those tighter standards around demographics and how they work and how they are tested. It is there.

An example. You said – sex. It turns out that was not easy. I did that for ONC for stage three. Do you know how many genders Facebook has? Fifty-five acceptable genders in Facebook. The federal government at the time had two. We needed something between 2 and 55. It took us six months to get consensus on the number of genders we should have as a federal recommendation. We did not make anybody ask the questions. We just said what the meaningful use certification standard for the number of genders should be and what they should look like. I think we ended up with four or five or something. That is consensus-based work. It is hard work. But pick a really tight use case like patient match that is important and everybody cares about. There is a lot of pre-work that has already been done, a lot of things you can build off of, and then you need to build a repeatable process. That is what we learned. This is not a thing you can do once and be done with in a meeting. This is a repeatable process that you build out and you build out and you build out.

KLOSS: To link back to our discussion earlier about governance, the governance of the SDO process and the standards that are being readied for adoption should ideally have that continuity in any underlying data decisions being made.

LARSEN: Potentially – again, you need a more sophisticated standards person than me. But what I would say we found is that this was often the interplay between multiple standards. You need governance by use case, in a way, because you needed to decide on a transport standard, and which data element standard you would use, and then which terminology was used within that, and then which value set was used in that terminology, in that meta-data standard, in that transport standard. There are these layers. A different use case might define a different value set even if all of the other standards are the same.

The terminology standards groups do not create a value set. We had to stand up a Value Set Authority Center at the National Library of Medicine, and a governance process to figure out how those value sets get created and maintained and overseen, so that as we upgrade ICD and new codes are added, we can be sure that we are getting those codes entered into our value sets and into our definitions.
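[Editor's note: the maintenance problem Larsen describes – a value set defined against a code system must be re-expanded when the code system is updated – can be sketched as follows. This is a hypothetical illustration; the value-set definition and dictionaries are stand-ins for real terminology services, and the ICD-10 descriptions are abbreviated.]

```python
# A value set is *defined* intensionally (by a rule over concepts), then
# *expanded* against a particular release of a code system. When a new
# release adds codes, governance ensures the expansion is recomputed.

def expand(value_set_def, code_system):
    """Expand an intensional value-set definition against one code-system release."""
    return {code for code, concept in code_system.items()
            if value_set_def["predicate"](concept)}

mi_value_set = {
    "name": "acute-mi",
    "predicate": lambda concept: "myocardial infarction" in concept,
}

code_system_v1 = {"I21.0": "acute myocardial infarction of anterior wall"}
# A later release of the code system adds a new code...
code_system_v2 = dict(code_system_v1,
                      **{"I21.4": "non-ST elevation myocardial infarction"})

old = expand(mi_value_set, code_system_v1)   # {"I21.0"}
# ...and re-expanding against the new release picks it up automatically.
new = expand(mi_value_set, code_system_v2)   # {"I21.0", "I21.4"}
```

This is why the expansion has to be an ongoing, governed process rather than a one-time snapshot: the definition is stable, but its membership changes with every code-system release.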

HEFLIN: I was thinking, as Kevin was talking, about one way of looking at this at a more abstract level – I am a software architect by discipline and a scientist. Really, if you look at the highest level, what we are really talking about here is building one of the world's largest, most complex IT systems ever created. Clinical data by itself – just the clinical portion of this, which is just a piece of the puzzle – is probably one of the most complex data sets known to humankind. And that is just one small part of what we are really trying to take on here today.

My fundamental assertion is that, as a world and as a country, we have drastically oversimplified. We have an approach that does not recognize the absolute complexity of the system we are trying to build. I live in a world of clinical data exchange, typically for treatment. Most in this room live in the world of payment. We have to be talking. We need some overarching grand architecture that takes into account not only standards development, which is necessary but not sufficient. We need an architected solution that includes operations, because the standard is not done when the standard is actually released. Someone has to develop it. Someone has to deploy it. Someone has to operate it. What about the validation testing program associated with that? What if you validated somebody last week to the old standard, and now a new standard comes out and you evaluate somebody new? What is equitable? That all has to be taken into account.

My hypothesis builds on what Kevin was just telling us, which really spurred this, because what Kevin is talking about, in one way, is what is called dependency management. We have a system of code systems, as an example, where you have clinical content depending on code systems, and potentially conflicting or different code systems. It is not a snapshot we can just take and say, we are taking a snapshot of these code systems and we are done. Those change constantly – on a weekly, monthly, yearly basis – as new medications are developed and so on. Procedures are developed and things expire.

We have not only just the world’s most complex system, we need to also effectively manage all the dependencies, which essentially spiders out to a massive web of dependencies that we have to successfully manage in order to make this work and even that is not enough.

We also need to have a validation program that actually confirms it. We need to have a trusted exchange framework and environment upon which this can operate, beyond just clinical or beyond just payment. I think all of this is insufficient.

We need a new level that does not exist today of a national architecture in terms of an operational deployed production system that iterates over time.

KLOSS: Jon.

ARENDS: One thing I would like to call out – the problem statement does specifically look at interoperability. While interoperability in and of itself is a laudable goal, I think the concern with poor data harmonization goes beyond interoperability. It is one thing if you get a claim and you do not really understand what the reject reason is and have to go digging for it. Yes, that is inefficient. But I think there is also a bigger risk of patient safety that is tied to the harmonization. I do not think we should lose focus.

Quick example. In NCPDP SCRIPT, we have a medication where one field is free text and then there are two machine-readable identifiers that come through that are supposed to help explain the free text. We have a medication name, and then you can send an RxNorm value and you can send an NDC.

Within those three identifiers, we often see mismatches. It is coming from a single entity. They are not even aligning internally before sending it to us. When we start looking at drugs, the patient can get the wrong drug. That is a big problem. I think focusing on interoperability is narrowing the scope too much.
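[Editor's note: the internal-consistency check Arends implies – the free-text name, the RxNorm code, and the NDC should all resolve to the same drug – can be sketched as below. This is a hypothetical illustration: the lookup tables stand in for real terminology services, the NDC is fabricated, and the check is far cruder than production validation would be.]

```python
# Toy terminology tables. RxCUI 197361 is used here illustratively for
# "lisinopril 10 mg oral tablet"; the NDC is fabricated for the sketch.
RXNORM = {"197361": "lisinopril 10 mg oral tablet"}
NDC_TO_RXNORM = {"00000-0000-00": "197361"}

def identifiers_agree(drug_text, rxnorm_cui, ndc):
    """Return True only if all three identifiers point at the same concept."""
    # First, the two machine-readable codes must map to each other.
    if NDC_TO_RXNORM.get(ndc) != rxnorm_cui:
        return False
    # Then a crude sanity check that the free text matches the coded concept:
    # the first word of the text should appear in the normalized drug name.
    name = RXNORM.get(rxnorm_cui, "")
    return drug_text.lower().split()[0] in name

# A consistent message passes; a mismatched free-text name is flagged.
ok = identifiers_agree("Lisinopril 10 MG", "197361", "00000-0000-00")
bad = identifiers_agree("Metformin 500 MG", "197361", "00000-0000-00")
```

A receiver running a check like this on inbound prescriptions could reject or quarantine messages whose three identifiers disagree, rather than silently trusting one of them.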

MUIR: I just wanted to build on what Kevin was talking about and the comment that you made, Linda. Some of the work that ONC did was really trying to ferret out or reduce the optionality of standards. Some of the standards have a lot of optionality, and it was really about constraining that optionality.

Some of what we have done as well is we tried to create – I guess I would just say enhance – the standards that are out there. We have learned through the scar tissue that you have to get the SDOs involved on the front end of that instead of developing something and then throwing it over the fence to them. They do not like that. We tried that. It has not worked so well.

I really like the idea of having kind of the situation you have now with having a governance of SDOs. But it is a challenge to make that a very streamlined process. I really like that idea and I think we can somehow expand upon it and enhance it.

LARSEN: I was just going to make a plug. Right now, CMS Medicare rule, IPPS, the Inpatient Prospective Payment System rule, is out. There is an open request for information about interoperability. It is a big priority for this administration for our administrator, for the secretary, for ONC. We are looking for any and all thoughts and ideas that people have to promote interoperability.

I will just give you a little inside scoop. There may be more than one opportunity in our rules this year to answer that same question, but it is actually open and out right now. We really want that input.

KLOSS: As we compile the record of today's meeting, I think it is fair to say you underscored that problem statement and you are suggesting it be expanded beyond a focus on interoperability. Whether or not this committee takes that on under this particular project, we need to be expanding the vision of this; we probably are oversimplifying what it is going to take to really get the data standardization to be more robust and ongoing.

LARSEN: I will just say, to piggyback on what Eric said, we need to start. Do not think that because it is hard we should not start. We should start. But build in the empirical testing part of it as much as you can: the earlier you can test a spec and get real input and feedback, the better off you are, and the more cycles, the more iteration you can have, the faster it gets good and the faster people want to use it. That would be my counsel. Think about this as a process and tooling exercise and not just as a consensus exercise.

Agenda Item: Theme 5: Third Parties as Covered Entities

GOSS: With your permission, I will move on to the next one. Third party entities. Who in the room considers themselves a covered entity? Who considers themselves something like a business associate? Who doesn’t consider yourself either one of those?

Covered entities have a very specific definition within HIPAA. My sense is that I probably do not need to spend a few minutes going over some of this background stuff. What I will really get into – we have a very long list of folks who are not covered entities. Property and casualty, medical transcription, utilization review, et cetera. These are just a sampling here. It is a pretty extensive list. I want to get to the next slide.

The problem statement is related to covered entities, which include providers, health plans, and health care clearinghouses. Vendors and other business associates are not covered entities despite their role in the conduct of the adopted standards. The federal government is limited in its authority over non-covered entities because of what is in the law and in the regulations. This impacts the use of standards in a variety of ways, from cost to actual utilization.

We have two questions we would like you to ponder today. Do you think the list of entities on the prior slide should become covered entities? If so, why, and how would this help industry use the standards and operating rules more effectively? And if third parties who are not currently covered entities were to come under the umbrella of HIPAA, there could be implications for their compliance with the privacy and security rules. What barriers would that impose for those organizations? Ray, would you like to —

SCZUDIO: That would be a no.

(Laughter)

We have to step back and think about what we are doing and why we are doing it. Clearly, if an individual, through the course of treatment or other activities, winds up divulging their information, or their medical information is created through a collaboration with a covered entity, a provider, and so forth, that patient – and again, that is the center of what we are talking about here – deserves, as is recognized in HIPAA, to have that information treated confidentially. It should not be shared by the entity that has that information, the covered entity, unless there are procedures in place. That is a person, an entity, not the patient, sharing information. For that, the HIPAA rules, however complex they may be, I think are doing a job for the patient.

But the flip side of that – and this is only now starting to get recognized – is that it is the patient's data. The patient should be able to control that data, and indeed the patient now has a right to get that data for themselves, even by electronic means. They can enforce that.

To have an entity such as my company, B.Well, which is acting as a proxy, a designated rep for that patient to help the patient manage their data, is a good thing. And that patient right of access – not a HIPAA authorization – should be recognized and should not be unduly impeded or bought.

Now, if you were to make anyone on that long list of non-covered entities a covered entity, it turns them into not a tool for the patient, but it pulls them into an unbelievably complex set of rules where it is not necessarily applicable because again it is a facilitator for the patient managing the patient’s data, not for passing it back and forth among other parties.

SCHULER: Do you mind going back to the list?

GOSS: Could we go back up because some of these are not just consumers? There are property and casualty, workers’ comp, employers. I probably should have picked a few more applicable ones.

SCHULER: Again, a provider from Ohio. There is a handful on here that I wish were covered, because they create exceptions, again, from the provider claim-processing side.

GOSS: Who would they be?

SCHULER: Worker’s comp, my friends at the VA.

DOO: They are covered entities. They were either a provider or a health plan.

SCHULER: Worker’s comp when I look at the list. Let me stick with the list. For example, they do not follow many of the ANSI standards, which we would prefer that they do for worker’s comp.

GOSS: She had to leave although we do have a representative who is here to support questions on her behalf. We are going to ask her to introduce herself too.

SCHULER: Auto insurance, third party. Auto insurance is tough. It is very expensive to deal with third-party liability companies. And, again, with them being exempt – if they were in. Anyone involved in processing claims information, to state it generically. It would be great if they were part of the covered entities and had to follow the standards.

GOSS: Can you introduce yourself before we keep going down the line?

GREENE: My name is Tina Greene. I am with Mitchell International. I am a chair at the IAIABC, which is an international worker's compensation organization. We agree as far as the transactions are concerned and the code sets and the work that has been done. There is a worker's compensation standard. It aligns with the HIPAA mandate; for all of the transactions, we have done extensive work with X12. There is a standard. States are beginning to adopt those standards. There are some differences, so there is a companion guide. But your practice management system vendor should be able to put those in there for you.

SCHULER: It goes back to that adoption.

GREENE: Right. It is being done in all 50 states, but as far as a state mandating it, that is beginning to roll out now. We have about 15 states that are doing that. But it is available and it is a standard.

NICHOLSON: ERISA plans, a bane of our existence because they are a federal program. They do not follow state rules and all the things that go along with that. I do not know if they could be included here. Are they exempt from this? I do not know. That is my addition to this list. ERISA plans.

PARTICIPANT: ERISA plans are subject to HIPAA.

LOVE: They are subject to HIPAA, that is why —

NICHOLSON: There are some of the issues that they avoid. I am not sure if they are subject to all of them or not.

GOSS: Are you referring to the all payer claims database topic? That is out of scope. Sorry, Denise.

LOVE: Could we fix it though today?

GOSS: Jon?

ARENDS: As a covered entity, I do think a lot of these organizations should be covered as well. I am going to go back and give an example, an anecdote: I do not think patients understand the differences between what is a covered entity and what is not a covered entity. As their data flows from one organization to another, that transfer is not always clear to them.

An example that I see on a regular basis is that Walgreens will contribute dispense history information to medication history products. At least on a monthly basis, I get complaints into our privacy office that say, why does my specialist know about my XX medication? And it comes back to Walgreens because the patient reasons: I have that medication filled at Walgreens, so they must have told someone. Well, we did, but we contributed it to the medication history product, which is firmly within the treatment auspices of HIPAA, but it still causes angst and concern amongst our patients.

I think they will not understand that there are differences in their protections when they start saying, yeah, I want my data to go to some other place.

CAMPBELL: Definitely good questions. I think one thing to keep in mind is that as far as I understand it, I am not a lawyer, but as business associates, we are still subject to a lot of the same privacy and security rules under HIPAA. For business associates complying with that would probably not be an additional lift although I would assume that being a covered entity would more than – that there are other things besides that that go along with it.

I would say, with the recent pending legislation around 42 CFR Part 2 and bringing that under the provisions of HIPAA, I would actually be really concerned about having all of these as covered entities. From my perspective, it is a lot easier when it is primarily a health care provider and those other two entities that have access to that drug and substance abuse type information. Having all of these with that access would make me rethink that.

GOSS: Building on what Jon said with the whole treatment, payment, and operations and ability to share within those covered entities, I do not see the SAMHSA Part 2 rules coming under the full breadth of it. But I think your point is really well made. There is a nuance. That goes back to the patient’s perspectives and the impacts to them.

Brad.

GNAGY: I will tell you my perspective as the director of technology for HealthPac Computer Systems. No, we do not want to be a covered entity at all. As a reasonable human being, there are obviously good points on either side of the spectrum. There are decent reasons for providers to want software vendors to be covered, as well as, like you said, situations that merit it or would potentially cause somebody to really need to be in that situation. As a software vendor, yes, there are differences between being a business associate and being a covered entity under HIPAA. Yes, there is going to be a business expense because of that, adhering to the same rules as a covered entity.

In our situation, in reality, is there a huge swing there? Probably not. Other than the fact that we will probably have to rewrite every one of our policies, or at least go back in and make sure. I will have to do things that I do not want to do, except for every two years when I am forced to do them.

GOSS: There is a financial and cost impact to what you are seeing from a specific business perspective, and that may outweigh the benefit of the ecosystem being bound together in a way that helps achieve the return from administrative simplification.

GNAGY: Correct, and I would be really interested – because, again, I am not immersed in the broad spectrum of this industry like everybody in here together is – I would really be interested in seeing the whys and why nots of this question. Why do you want me to be a covered entity because I am giving you software? What is the reasoning behind that? Now you are different. Why do you want me to be a covered entity because you have purchased software from me? Is it because your data is in my data center? Yours isn't, because you are in a different model than he is. Where are those lines drawn, and what is the rationale behind making me a covered entity because I am providing software?

I do not disagree with what he just said necessarily. However, just because I am not a covered entity does not mean that I am not required because I am a business associate to make sure that – I have a list. Everybody knows that list.

GOSS: I guess Dr. Goodyear said something, but I missed it. I did not hear what he said.

GOODYEAR: I said yes. It just levels the playing field. He has this data. We are required as a provider with patient interface to make sure that that data stays secure and follows the standards. I think anybody that has access to that patient data should be held to the same standard as we are for the exact same reasons. It levels the playing field. Yes, I think he should be a covered entity.

GOSS: Simply because of the head shaking over on this side of the table, I feel like you have a follow-up comment.

ARENDS: I think the distinction between creation of data and housing of data is arbitrary. It looks like the HIPAA rules are looking at who is creating the data. If you are creating the data, you are a covered entity. You are seeing a patient. You are doing some sort of activity. But there is that same level of risk of losing control of the data and having adverse outcomes because of losing that control, and everyone should be under —

GNAGY: I am not disagreeing with the point, but where does it say anything about if you are the creator of data that is why you are a covered entity?

ARENDS: That is what we are getting to is that the providers are the covered entities.

GNAGY: I would not associate that with creation at all.

ARENDS: You were just arguing that because it is stored in your data center you are not really obligated.

GNAGY: I am not arguing the point. I am arguing the element that you said. You said that HIPAA is looking at who is creating the data. Where does it say anything about creation of data – about you being a covered entity because you created data?

ARENDS: The list of the covered entities are the creators of data, not necessarily that it is enshrined in the —

GOSS: Creators or exchangers. Maybe I should have gone over the covered entity slide. In the spirit of moving along, Eric, I think you were up next, and then Deb and then Dave.

HEFLIN: The answer for this might already be out there, in a way. I skipped introducing myself just for time's sake. In addition to being the CEO of the Sequoia Project, I am also the CTO of the Texas Health Services Authority, which is a statewide entity fostering HIE within Texas.

One thing I will mention about Texas law is that basically almost everybody is a covered entity. For anybody who wants to make a note of this, Chapter 182 defines a covered entity including – I will not read all of this, but essentially any person or company that, for commercial, financial, or monetary gain – whether for profit, pro bono, or for a fee – engages in whole or in part, with any kind of constructive knowledge, in assembling, collecting, analyzing, using, or transmitting; it includes business associates, payers, information or computer management entities, et cetera.

Basically, if you would like to see an experiment about defining covered entity extremely broadly, I would encourage you to take a look of what is going on in Texas. I do not have an opinion about this, but just that that experiment is currently in process.

GNAGY: What was that called?

HEFLIN: It is Chapter 182 of the Texas Health and Safety Code, which defines the Texas definition of a covered entity, which is much more inclusive.

GOSS: Deb.

STRICKLAND: I am very passionate about this, and that is in support of my provider friends, because the practice management systems are the last leg of the entire continuum of claims. The claim comes in. It goes to the payer. The payer sends the 835 back. And the provider only gets what that practice management system decides to send them, whether it be a 3050 version or a 5010 version. Well, it is up to them, because they are not a covered entity. They do not have to do what everybody else has to do.

I invested ten years of my life developing an 835, only to find out that some vendors were passing 3051 versions back to the providers, which angered me because we put tons of value in there.

But I also question when people say the 835 does not have enough data or that they are not sending me rich 835s because I then start questioning the practice management system and what they are doing. Sometimes what ends up happening is it is an add-on feature or it is an upgrade. Providers on a strict budget sometimes do not take the upgrade, but they do not also realize what that means as far as how much data they are dropping on the floor. It is just something to keep in mind that in the entire process of paying a claim, that is the data that is important. If they are not covered entities, they do not have to do it.

SCZUDIO: Just to follow up with Jon – in the example that you gave about giving the medication information out and so forth, help me out here. How would the receiver of that data being a covered entity make any difference in responding to your customer's complaint?

ARENDS: The point I was trying to make is that the patients do not have a thorough understanding of the differences. Because of that, I feel like everyone needs to be on the same playing field that deals with patient, medical, protected health information.

NICHOLSON: I cannot remember what was on the slide before this, but – go back to the next slide. I want to see the list. I do not see banks on that. I have had some recent cases where there have been some HIPAA privacy violations involving providers working with banks, not having BAA agreements and things like that.

Originally, I was told that a BAA agreement – this is by a bank. Banks five years ago would not sign BAA agreements. That has seemed to change, but I do not know where it is written down.

DOO: I do not have it on me, but there is something in our regulation when we adopted the electronic funds transfer and ERA standards that deal specifically with financial institutions because the banks ended up being partly involved. It is not an exemption or an exception. But it deals specifically with financial institutions. I am sorry that I do not have it on the tip of my tongue.

NICHOLSON: Are they business associates or are they —

DOO: They are not business associates.

NICHOLSON: Do we need BAA agreements with them?

DOO: No.

NICHOLSON: I have lawyers that would disagree with you.

DOO: Let me get you the correct information, Dave.

NICHOLSON: Shouldn’t they really be on this list?

DOO: There is specific language in the regulation – we need to get you the right information and the exact language so we do not misrepresent anything.

GOSS: We will also consider that in our internal deliberations. So thank you for bringing that up.

GRIZER: I did want to bring up really quickly HATA, the organization that I know Brad and I are here to represent as far as practice management system vendors. They are coming out with a certification program for practice management vendors to make sure that they are sending standardized transactions. That is something that provider organizations could reference. Are they certified within that certification program to be sending the standards?

GOSS: In essence, it is the industry saying even though we are not already regulated, we want to step up and show – good housekeeping seal of approval.

GRIZER: Correct.

GOSS: Brad, did you want to add to that?

GNAGY: Yes. I will go back, because I do not think software vendors and lots of other practice management systems should be covered entities. However, I do not know if I am opposed to it. Right now, I guarantee you, I do way more for your data security than you do – if you are my client, I am doing way more. I am covering you for that purpose even though I am not a covered entity. I am not against it one way or another. If we need to go there, then we need to go there. I do not think it is a huge impact on us. Dave, would you agree? You are shaking your head.

NICHOLSON: The only thing I would say is about what we probably could not do. I think about encryption at rest and how expensive that is.

GNAGY: If you are on my network, all of that is covered, not only that, but backup history, disaster recovery, all the things that you guys would pay thousands a month for potentially. We are paying for with your fees.

Going back to the experience that Deborah had with the practice management system, I have a much better practice management software.

GOSS: But that is also getting back to the level of education of the providers and the tools that they have. If we had that good housekeeping seal of approval, or if vendors were covered entities, then it would make the procurement job much easier for providers, so they would know that they are getting the right kind of tools for what they are procuring.

GNAGY: And not all the providers are going to know to ask the questions. There is a lot in there. Honestly, it may be just semantics whether or not we are called a covered entity. We all have to abide by certain rules. I think that we would be good with that. The whole thing is, it is a free market. That is why we did what we did. That is why we invested the money. That is why we made the 835 and let the others do the conversion or whatever, because we want the business, because we are good. That is the point. If it is out there, we will do it. The people that do not want to do it – fine. We will make the money. They will not.

We will make sure that we get ourselves out there so that people who do not know to ask those questions – that is one of the reasons for Chat, the certification, is so that we have something right there that says you are a member of HATA, you are a member of – remember this. We have done our homework. Here is the top level. Now go talk to these people and see if you actually think that they are on the up and up and they have the other benefits that you want and the other little bits involved that you need.

GOSS: Kevin.

LARSEN: I am not an expert on this, so these might be old hat. In the clinical realm, there are a lot of aggregators of clinical data, like clinical data registries and quality measure reporting companies – actually, they are often not-for-profits that are regionally associated – and they get PHI for the whole city. They then measure, for example, hypertension across a market. I do not know whether they are covered entities or BAAs, but if you are reopening this discussion, it is probably worth putting things like registries and clinical quality measure aggregators on that list.

I guess the third part of this is that, in this modern time of the Internet of Things, we should try to talk about what we would make an intentional decision about, rather than just an interpretation of your device or your Alexa in your home that is somehow getting PHI. For the purposes of a new, cool app, can I tie in a Fitbit? Those are open questions that should be discussed if this is opened for conversation.

BURCHETT: I was just going to comment that they are not covered entities, which includes entities like HIEs, which is unfortunate in some instances.

HEFLIN: HIEs are covered entities in Texas, just FYI.

(Laughter)

BURCHETT: UHIN was smart to get ourselves protected under state law for certain things, but that is the extent of what we are able to do.

GOSS: You are or you are not considered a covered entity in Utah? You are sort of a hub for the payers and the provider community.

BURCHETT: We are not considered a covered entity, but under state law if something were to happen with the data, UHIN is protected financially.

GOSS: You have sovereign immunity, almost like a state agency kind of thing.

PARTICIPANT: You are a covered entity as a clearinghouse?

BURCHETT: As a clearinghouse, we are a covered entity, but all of that data is completely separate.

GOSS: Mandi.

BISHOP: I just want to make sure that I am clear on this. HIPAA is a federal standard, but there is deviation between the states.

GOSS: Because states can be more restrictive than this floor of HIPAA.

BISHOP: I am just making sure. He is allowed to do – you are enshrined. There is a whole – okay.

GOSS: Officially, we are going to let the federal regulator clarify this.

DOO: Another friendly amendment. I like friendly amendments. In part, some states do this because, as Alexandra has said, they can be more restrictive or prescriptive. In part, some of the states do it because we either have not done enough or have not been nimble enough and fast enough, and in part simply because those states have had collaborations – their own groups that have said we need to do more with our standards. They have gotten their covered entities together who have said we want to adopt, for example, a standard that does more. We want to use this standard that you have not adopted yet. We are going to enable this. There is some worker's comp and P&C activity going on in certain states. They have enabled that. Minnesota, for example. Sorry, Ohio, but Minnesota is particularly progressive in that. Like Texas, they have named other people as covered entities. And for some things where enforcement has not been as stringent, there are laws going into effect that have enabled certain enforcement actions to happen.

GNAGY: Just two words. Just to go along with that, I would be interested and I am going to look into this immediately actually. If I am selling to a client in Texas and I am in Georgia and yet my data center is in North Carolina, how do I rank in Texas because a Texas client owns my software?

BURCHETT: It applies just like GDPR.

GNAGY: Say that one more time.

BURCHETT: The Texas law applies.

GNAGY: It would apply to me, a business in Georgia, having it over there.

GOSS: If you have European citizens, I think, is where you were sliding to.

BURCHETT: I think there is a lot of JDs in the room.

GOSS: We are talking about the new European General Data Protection Regulation – just making sure we are on the same page with that.

BISHOP: I was an analyst, and I want to make sure that I have this right in my head. These are areas in which I am definitely not an expert, but I want to understand. We spent the day talking at the federal level about how to become more consistent with our standards, how to develop standards collaboratively, how to have an effective way to implement standards. Are we now saying – or am I now realizing at the end of the day – that the states have an opportunity to set their own standards in addition? I am not talking about the covered entities here. I am talking specifically about the data exchange standards.

HEFLIN: I am not an attorney, but my understanding is that states can be more restrictive than HIPAA, but it is rather challenging to be less restrictive. I think there is an example whereby someone can get a safe harbor.

LARSEN: ONC actually did a report, a survey of state privacy and security and then put that in and compiled that into a report of various state laws. I think it is two years old maybe.

GOSS: I think we want to be really careful here because there are different rules when it comes to transactions, which is what our focus has been on today. Then what happens when you slide into privacy and security, because you have to add in the nuance. A health care provider is only a covered entity if they elect to do an electronic transaction; they get a freebie pass. Health plans, not so much. Clearinghouses, not so much. You have to not only look at the law and the regulations related to HIPAA; you have to factor in that Omnibus rule as well, which then starts to really – I wish Linda were here for this – bring more of everybody into the same paddock for privacy and security rules.

BISHOP: Are we bringing all of the states to the table so that they can now come to an agreement about the standards that we are setting at the federal level and make sure that we can pass data between the states?

GOSS: Quite frankly, my sense, having been in the field of EDI for more years than I want to count, is that most of the states – the progressive ones – have influenced what happened in HIPAA and have been driving standards development, and I am not aware of disconnects in the transactional aspects of HIPAA. I am looking at some of the national big guys, and they are agreeing with me.

MUIR: I was just going to say that that is the whole idea of setting the floor. That is really what we are trying to get at. You set the floor and then allow people to do some things above that.

CAMPBELL: On extensibility, that would be fine – maybe not in this particular domain, but states do set different requirements, and that is the problem, I think, with setting a floor and not a ceiling: you then have to support all of them as you go across state lines.

SCHULER: Ohio has set standards. The State of Ohio has a standard on response time for payers on authorizations.

GNAGY: They are raising the floor, not the ceiling. If the floor is here by federal, it is here by the state.

HEFLIN: This is kind of amusing. I hope I do not embarrass Ohio. I actually architected OHIP, Ohio Health Information Exchange. When we were deploying that, one of the things we found out is the law included codified value sets that were incorrect. The law was actually un-implementable because they tried to go too deep into the technology. I do not know if that has been changed or not, but it was actually an un-implementable state requirement at that time about ten years ago.

GOSS: We are now at the point where we want to make sure that we have an opportunity for those listening on the web and those who have been patiently sitting in the room all day have an opportunity to come up and offer public comment. Ms. Margaret.

Agenda Item: Public Comment

WEIKER: I am going to start at the beginning. I am Margaret Weiker. I am director of Standards Development at NCPDP and have been involved in standards organizations and in this process for more years than even Alix has. I have also served in the past as chair of X12N. I am going to start at the beginning and talk about governance. I am going to start with governance and the DSMO.

Not many in this room can say they were there when this first started, when we actually met at the Baltimore Hilton to work out the MOU between the organizations and basically were told by people in CMS that we were not going to leave that room until we hammered it out.

When we look at why that was done, why the DSMO was created, and what it hoped to achieve, I think it met those goals, because before, X12 did not have a really good automated way to request changes to their standards. You would go to a meeting and you could say, I need shoe size on this type of claim and here is why. People would go, that makes sense, and they would add it. But then we would lose why it was done, what the business reason was, and what we hoped it would solve. That would get lost because that person would not come back, or the people that were making the changes did not document it well, or for whatever reason.

Part of the DSMO was to get a process, an automated way whereby anybody – not just me going to X12, but anybody, whether they were a member of a standards organization or not – could request a change.

Initially, we were overwhelmed with changes. I know the NCVHS committee, the full committee met many times. I was the first chair of the DSMO. I was flying to Washington about every other week. We would go through every change request and what the outcome was.

And then as processes developed, procedures developed, and technology was enhanced, X12 started developing a process where anybody could go in and enter a change. You did not have to go to a meeting. NCPDP had long had a process where anybody could enter a change, with a form that anybody could pull down and fill out.

When I go back and think about whether we really need the DSMO at this point in time for what it is doing, my answer would be no. Look at the recent DSMO requests – we have rarely gotten a DSMO request in the last three or four years, and I entered two for NCPDP. When I look at – and I do not mean to offend anybody in the room or anybody listening – the value that provided, I find none. I look at it as costing me time, because the change is entered, then it goes into the batch, and then you have ten business days to opt in, then 30 days to review it, and then a potential 90 days to review it. Basically, it added time to a timeline and, for me, did not provide any value to NCPDP, its stakeholders, or its members. Is some type of governance process potentially valuable, depending on how it is structured, what its purpose is, and what its goals are? Perhaps. But I do not think it is the DSMO.

Also, in regard to the DSMO, the members of the DSMO do not just sit around and come up with things. It is an input process. The DSMO does not meet unless a change request is submitted, or occasionally we will meet to write our annual letter to the NCVHS committee. We have also met a couple of times to write comments on something being proposed by CMS that impacted the organizations that make up the DSMO. But rarely do we meet unless we have a change request. It is always a matter of an input coming in to trigger the DSMO process.

And then when you looked at the one slide that had the organizations, each organization has many different types of stakeholders. Those organizations are supposed to represent their constituents, so to speak, or their stakeholders. When you look at NCPDP, we have providers. We have health plans. We have PBMs. We have knowledge management companies. We have HINs, clearinghouses, switches, and on and on. That is who we are to represent.

With the content committees, though, it is a little bit different because they are more focused on the old paper forms. You did dental. You did institutional. You did medical, or what was the HCFA 1500. Those are a little bit different, but the organizations that represent those also have a multi-stakeholder constituency. Those are my DSMO comments.

Standards adoption. Frequency. Mark said this earlier with Surescripts. NCPDP through our consensus building process determined that every three years we would ask the membership should we move forward to a new standard, whether that standard be SCRIPT or formulary and benefit or telecommunication or batch, et cetera. We have many other standards that are not governed by any type of legislation or regulation that are used in the industry today. When we ask this question, we are asking it in regard to those that have been adopted either through regulation or legislation, not should you move forward to the post-adjudication standard. It is those that have adopted or regulated.

If the members say, you know, we do not see any value in moving forward, there is not a lot of ROI – that is fine. We will mark down the reason why and then wait another three years. Something may come up from a regulatory point of view, like Part D, which may cause us to adopt something quicker than we thought we would. But we have always built a lot of flexibility into the NCPDP standards to allow the use of code values to support some of the changes that we are seeing in regulation and legislation. That is the frequency.

NCPDP in the past – we could produce a standard every quarter. Nobody was implementing a standard every quarter. We were burning through versions. We decided we are going to stop doing that because nobody is implementing it and go to two ballots, two versions a year. That has seemed to work. It allows people to get their change request in. It allows a lot of discussion because it is not like we have to hurry up and approve it because the ballot is going to be done. It allows some time to bring forward a concept to have people think about it, look at it, go back and do research or whatever it may be and then bring it to the next meeting without having the issue of we have to meet the timeline to meet the ballot. That is some of the things we have done.

Also, everybody needs to keep in mind that the standards organizations are ANSI accredited. There are processes and procedures we have to follow in order to be ANSI accredited. While we may want to whip out a standard in a two-month period or something, there are processes and procedures we have to go through in order to maintain our ANSI accreditation. In addition, when you go back and look at HIPAA, it was to name standards from an ANSI-accredited organization if they existed for that business function. That is the standards adoption.

Moving to the regulatory process, most people know how I feel about this. I am going to try to be as nice as I can about this, but those in the room – NCPDP has put forward many requests to the appropriate entities to adopt versions of our standards whether that is under HIPAA, whether that is under MMA.

When we were showing the slides and going through the flow, it is like NCVHS writes a letter recommending that this be adopted, and then it goes into a black hole. I do not know what the status is. I do not know if it is even going to be looked at. We have things that have been waiting out there for ten years. We create a letter. NCVHS creates a letter. It takes years – six years – to change a field from not used to situational. It is not because the standards cannot get it done; it is because of the regulatory process.

Something has to change here because technology is changing. Business is changing. We talk about agility. But when I bring forward something and it sits for six years, that is not going to cut it. We have to figure out a better way to do this.

Speaking of bringing letters forward – and I think I have said this as well – NCVHS has put forward adopting acknowledgments, adopting – the first thing that came to mind – attachments. Thank you. Another thing brought forward many times: the letters associated with moving forward with SCRIPT, moving forward with telecom, moving forward with the enhancements to telecom to change that field. There are probably many more out there. But it is, again, a letter sent and then what? You never hear about it. There is no follow-up. The comments around transparency and accountability I agree with 100 percent.

Then there is naming versions. I know – as an SDO, we talk about this quite a bit. Do they have to name the version? Can't we just say adopt the SCRIPT standard and not name a version, or name a version and say you must be on this version by this date, and have more of a transition process? Name a version, but do not name a publication date associated with it, because that would allow me to change that version if there were some little glitch or something wrong with it. But when you name a publication date associated with the version, for me to update that, it is a new version in my system and in my ANSI-accredited consensus-building process.

There are things like that that maybe we could talk about or figure out how we may be able to do some stuff from a regulatory process to where we could name SCRIPT standard version 2017 071 but leave off the publication date. I could still give you a copy of that to post because I know you have to put on display. But as people implement that, if there were changes, I could go in and make a change. It would go through the process. I am not just willy-nilly going to make a change. It would still go through my process, but I would update the publication date of that version versus moving to another version and then picking up everything else that was put in those new versions.

Data harmonization. We have had many attempts at this from an SDO point of view. We have had USHIK. I know that X12, NCPDP, and HL7 put their data there. This was done by AHRQ. I do not know if that is even still up or if you can even access that website anymore. I probably should have looked.

GOSS: Yes, but I do not think you can get to the data. It tells you to go to the SDO. I believe USHIK is up. It just does not let you get to the data itself, because it is copyrighted intellectual property.

WEIKER: We went through a lot of work to put that up, to basically crosswalk the data elements between the standards. That work was done. I am sure it needs updating because it has been many years, but that is a thought around the harmonization. I am not sure we can distinguish clinical and administrative data anymore.

I think so much of it is intertwined. So much of it flows from one system to another system. It was like the whole process with the UDI. That was being captured in a different system than the administrative system, which sends the claim. It had to get out of this system over here to get into this system over there. The slide had something about distinguishing or merging the administrative and clinical data. They are not separate anymore. Maybe at one point they were, but they are not anymore.

That would conclude my public comments.

GOSS: I have checked. We do not have anybody on the web who has asked for public comment, but is there anyone else in the room who would like to make a public comment?

WEBER: Hi. I am Erin Weber. I am the director of CAQH CORE. We are the operating rule authoring entity. I just want to observe and give some history around CORE, because we were founded as a voluntary initiative back in 2005, when the industry came together and said there are some challenges with this process. There were actually two phases of operating rules developed by the industry, with over 100 participating organizations, leading up to the Affordable Care Act. Those rules were implemented on a voluntary basis. There was a certification program around them. And then the Affordable Care Act came along, and we became the formal author and the rules became mandated, at least the Phase 1 through 3 rules. There is a model of the industry coming together.

I do not want anyone to think that the industry has not tried to work on this because a lot of people have spent a lot of time developing the rules and the vision was always that the operating rules would then be incorporated into the standards as the standards went through their longer development process.

We are not ANSI accredited so we do not have quite the rigorous requirements around that. We are not part of the DSMO because of that. That allows us to be a little bit more nimble as does the voluntary process. We are always interested to hear from others on how we can continue to support the industry from that voluntary perspective as well.

GOSS: Anyone else? Pat.

WALLER: One of the things that I wanted to spend a little bit of time talking about is the challenge that I see on a go-forward basis with X12 and the ANSI standards that we have been using for years and years. Unless there is some dramatic change to how they are used going forward, I do not see us making any great innovation strides.

We should start planning – maybe a plan to replace or significantly improve them in terms of real-time, API-enabled interactions, so that we can do better interactions across the care continuum, as well as payment and operations, because without that there are many things that we cannot do. If we still have claims that we want to pay, I could modernize that by opening up an API to Epic and having that claim adjudicated in real time. Now, I could do that with X12, but it is messy and not very friendly.

GOSS: Because you did say ANSI standards, I think you are very specific to X12 here.

WALLER: Additionally, we could significantly streamline prior authorizations by using inquiries into eligibility. We could then do a coverage inquiry outside of that standard as well, and if we still wanted to do the 278, we could actually submit the PA. The purpose of that would be to have a real interaction that says whether or not the person and the service actually require prior authorization. Today, it is much more difficult to do that if you are actually adhering to the standards in HIPAA. You have to trick the world.

PARTICIPANT: I will just say that I think the telecom standard for pharmacy actually really works, and it does a lot of that already. And it does it quite well. I have used it and it is fantastic, and it is old. It tells you right at the point of service: is this covered and what is the co-pay. I think there are models.

GOSS: Ray.

SCZUDIO: Just quickly, and I do not want to go too far off topic, but one thing that I had mentioned right at the beginning of my first presentation that we have not touched on, and I am wondering what the process is for introducing a topic that is relevant. The thing that I have particularly in mind is a customer's, consumer's, and individual's right to correct a medical record. Are there any electronic standards for that, and how well are those publicized and promulgated?

GOSS: I would like to put Chris Muir on the spot for that one, but I am not sure that I should, because I think we have left the land of practice management and administrative transactions and gone to how we actually correct something within the electronic health record environment.

PARTICIPANT: FHIR-based writes for that would probably be easiest.

GOSS: I think, though, you have a question about – you want to look into something else and there is another topic. What you are asking is how do I get something on the radar so that we can get some national attention to it. It is often just writing a letter or an email.

Rich and Brad.

GNAGY: I will go super quickly. Sentiments like Pat's are not uncommon among people considering – if things do not get better quickly, then we are going to build our own wheel and make it better. I do not disagree.

I did have a question and maybe I just misheard. You had said you were looking for X12 standards.

WALLER: No, not at all. I can do most of what I want to do with X12, but I am creating a Frankenstein in order to do it. In certain things, I have to use X12, and then I can hop over to FHIR and I can go back to X12. I might have to do another FHIR thing and then back to X12. You are creating this workflow that is just terrible.

GNAGY: Technical workflow?

WALLER: Technical workflow. From an end user perspective, it is probably okay, other than that the response time of some of the X12 mechanisms is typically a little bit slower.

GNAGY: I specifically wanted to address a point in your question just to make sure, for clarity's sake. You had mentioned that you were looking for them to address pieces of the API.

WALLER: The trick about it is you can make X12 work as the payload through API, but it is really a kluge data structure to do that with.

GNAGY: When you say API, do you – that is the whole point here.

WALLER: You can wrap it in anything. You can use that payload. Yes, but it is not RESTful.

GNAGY: That is what I was trying to understand. What would the committee have to do with the implementation of the actual interface?

WALLER: I would be looking to them to be promoting more of a RESTful, FHIR-based approach to replace or extend some of the current HIPAA standards – so have a dual approach or, over time, replace them. Unless X12 gets different, it is very hard to use and be quick and flexible.
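The contrast Waller is drawing – an opaque X12 EDI payload tunneled through an API envelope versus a native, RESTful FHIR exchange – can be sketched as follows. This is a minimal illustration, not a production interface: all identifiers, segment content, and resource references are hypothetical, and the FHIR R4 CoverageEligibilityRequest resource stands in for the X12 270 eligibility inquiry.

```python
import json

# 1) X12 tunneled through an API: the 270 eligibility inquiry travels as an
#    opaque EDI string inside a JSON envelope (segments abbreviated and
#    hypothetical). The receiver must hand it to an EDI parser before any
#    field can be inspected.
x12_envelope = {
    "payloadType": "X12_270_Request",
    "payload": "ISA*00*...~GS*HS*...~ST*270*0001~BHT*0022*13*10001234*20180517~...",
}

# 2) A RESTful FHIR approach: the same eligibility question expressed as a
#    structured resource, readable and validatable field by field as plain JSON.
fhir_request = {
    "resourceType": "CoverageEligibilityRequest",
    "status": "active",
    "purpose": ["benefits"],
    "patient": {"reference": "Patient/example"},
    "created": "2018-05-17",
    "insurer": {"reference": "Organization/example-payer"},
    "insurance": [{"coverage": {"reference": "Coverage/example"}}],
}

# The FHIR body can be inspected directly; the EDI payload cannot.
print(json.dumps(fhir_request, indent=2))
```

The workflow friction described above comes from mixing the two: each hop between the X12 envelope style and the FHIR style requires a translation layer.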

GOSS: Thank you. Rich.

LANDEN: I have a question I would like to put on the table. We have talked around it several times over the course of the day, but Margaret Weiker in her public comments nailed it on the head. That question is simply is there any longer any meaningful distinction between clinical data and administrative data. I ask that because under HIPAA administrative is a clear category. CMS clearly has responsibility. When we move later to HITECH, clinical data is defined separately from administrative data. ONC has primary authority over that with a lot of overlap with CMS. Is that distinction meaningful anymore? Will it be meaningful five years from now? Or is that something we as NCVHS need to start thinking about?

I guess the question boils down to a simple yes/no binary. Show of hands. How many think there is a meaningful distinction between administrative and clinical data – and think from your organization's perspective. Does your organization think there is a meaningful distinction between administrative data and clinical data? Those thinking yes, there is a meaningful distinction, please raise your hands. Those thinking no, there is no meaningful difference?

LARSEN: Just a quick topic that I would love to put on the table for a future discussion. We have standards for electronic and digital signatures, but I do not know if we have standards for when to use them. How do you define by use case when we allow an electronic signature, a digital signature as a physician that signs a lot of stuff and a lot of things still require a wet signature? Somebody somewhere helping make some recommendations around that would be great.

Agenda Item: Wrap Up and Next Steps

GOSS: Time may tell. Any other parting words before I go to the next slide? Just a recap, first and foremost, this has been a phenomenal day. We were not sure what the mix and match of people would bring as far as conversation. I do not think I would change one thing about it. You guys were phenomenal. It was more than I think I could have even imagined we would have been able to garner as far as input today. Your level of expertise, your thoughtfulness in preparing your slides and introductions and your contribution during thematic discussions was wonderful.

What we will do, if we go back to the next slide, is take all of this feedback. We are going to synthesize it along with all the other work that we have been gathering over the last year, produce another set of recommendations, and hopefully be able to schedule a hearing on those recommendations. I do not know how long it is going to take us to sift through all of this stuff, because I think we are going to have to do some really creative big-picture, long-term thinking. But I do know that the full committee is already queued up for that, especially as it relates to our 13th Report to Congress on HIPAA.

Stay tuned. We would love to have you back when we do our formal hearing on the recommendations. After those are vetted with stakeholder feedback, we finalize them through the process within NCVHS. It typically would be a Standards Subcommittee hearing, and then we would go to the full Committee before we finally submit it. We will keep things on our website and also hopefully stay in touch with all of you. You know where to find us. Thank you again for your day and your time, and may you have safe and quick travels home.

(Whereupon, at 4:30 p.m., the meeting adjourned.)