DEPARTMENT OF HEALTH AND HUMAN SERVICES

National Committee on Vital and Health Statistics (NCVHS)

Subcommittee on Privacy, Confidentiality, and Security

March 21, 2019

National Center for Health Statistics

3311 Toledo Road

Hyattsville, MD 20782


P R O C E E D I N G S       (9:10 a.m.)


Agenda Item: Welcome and Introductions

KLOSS: It is ten past nine and I think we can begin, audio issues notwithstanding. Welcome to this special working session of the Privacy, Confidentiality and Security Subcommittee of NCVHS. We are grateful for the members of the Committee who are here and the members of the Subcommittee and for our invited experts.

We will begin by taking the advisory committee roll call. I will begin. I am Linda Kloss, chair of the Subcommittee, member of the Full Committee, member of the Standards Subcommittee and no conflicts.

STEAD: I am Bill Stead, chair of the Full Committee, no conflicts.

GOSS: Good morning. I am Alix Goss. I am co-chair of the Standards Subcommittee and Review Committee, member of the Full Committee and I have no conflicts.

LANDEN: Rich Landen, member of the Full Committee, member of the Standards Subcommittee, no conflicts.

STRICKLAND: Debra Strickland, member of the Full Committee, member of the Standards Subcommittee and member of the Population Health Committee, no conflicts.

KLOSS: Nick or Jacki, if you are hearing me, could you introduce yourself for roll call? Nick. I am in.

I can see the presentation and I am on the audio conference, but we cannot hear you. Can we take that as a sign-in? And Jacki? That is call to order and roll call.

I want to introduce our invited guests and actually have them help introduce themselves a little bit. Leslie Francis. I am starting at this end of the table. I am so happy to see you sitting there.

FRANCIS: I am Leslie Francis. I am professor of law and philosophy at the University of Utah where I direct a Center on Law and Biomedical Science. I am a former member of the Full Committee and former co-chair of the Privacy Subcommittee. I do not have any conflicts. It is great to be here.

EVANS: I am Barbara Evans. I am a professor of law at the University of Houston. I have a joint appointment as professor of electrical and computer engineering also at University of Houston. I too am a former member of the Committee. I am very happy to be here. Do we disclose any holdings of stock or just those that are related to this?


KLOSS: Just anything that might be a conflict.

EVANS: I do not think I have any holdings that would be perceived as a conflict. Thank you.

ROTHSTEIN: I am Mark Rothstein from the University of Louisville School of Medicine. It was my honor to be a member of the Committee from 1999 to 2008 and to chair the predecessor version of this Subcommittee. It is great to see everyone here. I have no conflicts.

MILAM: Hi. Sallie Milam. I was a member of the Full Committee and a member of the Population Health and Privacy Subcommittees and I am thrilled to be back with everyone today. I am now with the Network for Public Health Law. When I was with you all before, I was the chief privacy officer for the State of West Virginia. I retired from the state and I joined the network January 2. I am with the national office out in Minnesota and I am detailed to the Mid-States Region working on data and cross-sector data sharing, but I work remotely out of West Virginia.

KLOSS: Thank you so much. We have also invited Melissa Goldstein, associate professor at the Milken Institute School of Public Health at George Washington University, and I am sure she is making her way in the rain, but she will be here in person. We are very grateful for our invited guests.

Did someone just join us? I think we are good now to go.

I have asked our chair, Dr. Stead, to welcome this group and put this in the context of what NCVHS is working on now, particularly for our invited guests who are not up to their elbows in it. We would appreciate you kicking us off, Bill.

STEAD: Thank you, Linda. Welcome back all of our family of people that have been interested in this space for a long time.

If you look broadly at what NCVHS has been focused on, we are really dealing with the challenges of the dramatic scale-up of technical capability on one hand. On the other, there is increasing awareness of the opportunity to change, or begin to change, the health outcomes of the country by working in collective effort, multi-sector partnerships at the community level, with engagement of health care and mental health in those partnerships in addition to what they do separately. There is also the potential now to have an approach to data standards that in essence allows the data to be collected as part of its original need, its real use in helping patients, clinicians, and people trying to manage those different programs, and to impute the other classifications, et cetera, that you need.

We are at a sea change, a shift to a very different scale of thinking about data and opportunities, and about health and opportunities. All of that raises, and requires, a similar shift in how we think about meaningful protection of privacy, architecting security in from the ground up in a way that allows individuals to respond differently to the different purposes people might have for the data.

I think all of the things we have been discussing are up against the challenge of a historic set of ways of working, historic frameworks that are by nature much more siloed. There is a common theme across all of our work. The time for this is right. There is a level of interest in this country and globally around privacy.

KLOSS: We have been approaching this topic as a work plan over the last two years. I am just going to get everybody on the same page before we go forward. This is kind of the summary of what the Committee’s Beyond HIPAA Initiative scope of work is. Let me underscore that this did not just start with the Committee’s work in 2017. This Committee has been doing this kind of work for decades.

But our goal here is to build on NCVHS’ past work and the work of other government and private initiatives to consider a health data privacy and security framework for 21st-century health information challenges. We have had four driving goals: identify and describe the changing environment and the risks to the privacy and security of confidential health information, including highlighting promising policies, practices, and technologies; lay out integrative models for how best to protect individuals’ privacy and secure health data uses outside of HIPAA protections; formulate recommendations to the Secretary; and prepare a report for health data stewards. This is essentially the scope of work that we have been working off of now for two years. Our invited guests will have seen some of the work products and we will highlight those in just a moment.

I think we have Nick’s audio now ready to introduce himself as part of the roll call. Is that right, Nick Coussoule?

COUSSOULE: Yes, Linda. It is Nick.

KLOSS: Do you want to introduce yourself?

COUSSOULE: My name is Nick Coussoule. I am with Blue Cross Blue Shield of Tennessee. I am a member of the Full Committee, co-chair of the Standards Subcommittee, member of the Privacy, Confidentiality, and Security Subcommittee and I have no conflicts.

KLOSS: Jacki, are you able to speak now?

MONSON: I am. Jacki Monson, Sutter Health, a member of the Full Committee and a member of the Privacy, Confidentiality, and Security Subcommittee, and I have no conflicts.

KLOSS: Very good. We have had a little rocky start between rain and weather. Here comes Melissa.

Terrific. We will finish our introductions just as soon as she is seated. I know you ran into rain issues. Nice to see you. Can we ask you to introduce yourself?

GOLDSTEIN: Good morning. I am Melissa Goldstein. I am a professor at George Washington University. It is nice to see you all. I have been downstairs for a while, but there was a little bit of confusion. I am sorry to join you late.

MAYS: Vickie Mays, University of California Los Angeles, member of the Full Committee, member of this Subcommittee, and of Population Health. I have no conflicts.

KLOSS: Very good. Bill just introduced where we are at and how this fits with the rest of the Committee’s work. I was just giving a brief recap of what we have been doing in this Beyond HIPAA space. This has been our work plan charge. This is where we have come on our journey to date.

We began in 2017 with project scoping, initial hearings, and some of you participated in that. We did an extensive environmental scan which was completed at the end of 2017 and I think went up on the website in early 2018.

Again, some of you participated and provided input. We had extensive testimony that preceded that, but we are proud of that work product as an environmental scan because it does pull together a lot of what the current state is.

We spent 2018 exploring this space. Again, our focus is Beyond HIPAA. We thought that the areas we could get our arms around most easily were some examples or exemplars at the intersection of the regulated HIPAA world and the world beyond HIPAA that is largely unregulated, as health data moves out of HIPAA’s jurisdiction, out of the covered entity and business associate protections, or into other uses allowed by HIPAA or through other regulations.

We dabbled in three of those areas in 2018. We looked at registries. We looked at some apps. We looked at personal health technologies where there was exchange of information between the regulated and unregulated world.

Out of that came the design of a model and we asked some of you to react to that and give us input. It was very helpful.

We knew it was time to convene a group to now discuss model framing, or a contemporary framework, or whatever we come out of the day calling it. I want to go into this emphasizing that we do not know where we are going to end the day. We just know we are open to the dialogue and wherever it leads us. That is why it is so important that we have this more intimate meeting.

We worked hard at framing this meeting, and we went all the way from thinking of having a big, large roundtable or a more formal set of hearings, and we came down to, no, we feel that with the environmental scan and the pre-work that we have done, we are ready to just bring ourselves together, supplement the expertise of the committee with some invited guests, and roll up our sleeves and get down to work. We are calling it a working session and we have a plan for how we think the day will flow productively, but we come into it being very open to the collective wisdom and what will come out of our debate.

Our goals then are really to come out of this with enough insight to do two things: to actually prepare a report that describes key stewardship principles for the currently unregulated health information world, and it will build on past work. We are not, again, starting from scratch. We have gone back and taken a look at the 2007-2008 report and we have brought forward framework work that was done in the 2012-2013 era; again, building on past work, but putting forward a report that lays out a stewardship framework for this currently unregulated world.

Also, number three, in the course of our discussions we are going to identify some actions that can be taken by the Secretary. We also want to spin off a letter. This is not a light lift, but we feel like we are putting a stake in the ground, and we are committed to having enough insight to prepare a report and also having enough concrete short-term action opportunities that we can put in the letter. That is our goal.

In distributing the material that went out ahead, we actually gave you an outline for the report. Our plan is to do the dialogue, the discussion, the deliberation today and then tomorrow be able to roll up our sleeves and actually fill in the work on the outline for the report or switch it around however we might want.

That is all we have to get accomplished by 5 o’clock this afternoon. Are you ready? Everybody good? All right. We have thought a lot about this. We think we are ready to do this. Let’s not dance around anymore. Let’s just do it.

Back to the agenda. Today’s agenda has three chunks. We are going to first discuss major themes that will frame the report and recommendations to the Secretary. We put out a first cut, again, just so we do not have to start with a blank sheet of paper. Then we are going to spend the bulk of our day on guiding principles, and then this afternoon we will look at what we have learned and see if we cannot discuss what might constitute short-term recommendations. From 9 to 3:15 today, we are really working on the guts of the report, and from 3:30 to 5, we are looking at what we have identified through the day that might be put in a recommendation. Our first topic out of the box is themes – and then, if we scroll down, from 5 to 5:15 we will have public comment.


Agenda Item: Identifying Major Themes

KLOSS: Our first discussion is a themes discussion. I am just going to launch us right in, and we are going to get to work, and we are going to do some things through breakouts in the interest of time.

I would like to just start by seeing if we cannot add to this list. The major themes in health information privacy kind of come out of – Bill teed this up to some extent – the fact that we are at this critical moment where the information is digital. There is an opportunity to do a lot with it. You can see where I am going with this. Why are we concerned about this? How do we craft an overarching statement of why this topic and why now? That has changed since the committee addressed this in 2008. There is an urgency today that did not exist a decade ago. I would like us to just spend 5 minutes, 10 minutes, 15 minutes talking about that before we get into specific themes. What would you add to this?

FRANCIS: One thing I would add to this is that it is so easy now to take data that was originally collected for health care and move it somewhere where it has no HIPAA protections at all. Maybe I would put it this way: the ease of data moving between regulatory worlds or among regulatory worlds. The HITECH Act really was a sea change in some of that, although it was growing before.

MILAM: I think what we see today existed to some extent in 2008, but look at the balance. We have greater risk today, and there is less benefit to the individual. The benefit is going to the private corporation that is monetizing the data, aggregating it, and repackaging it. An individual may agree to those actions, those disclosures, to have access to whatever app, but really it is incidental. There is no derived benefit from that. There is no balance anymore.

GOSS: Building on what Sallie just noted, it is very similar to the thoughts that I would like to share related to the speed of change, and the cultural impacts are having unintended consequences. People are freeing and sharing their data, in part along generational lines, and I think we do not fully understand what that is going to mean for the person moving forward, in the downstream activity. In addition to the balance aspect, I do not know that as a society we fully understand what the balance should be.

LANDEN: I have two comments. One is just to build on the earlier comment about data that has its origins within the HIPAA protections and moves outside of HIPAA. Specifically under HITECH, there is a huge movement, supported by rulemaking from ONC, that allows a patient to access his or her electronic health record using whatever app the patient chooses. There are no real protections that those app developers and sponsors will have security, will have privacy, will have confidentiality, nor are there protections outside of HIPAA for the reuse of that data once the patient authorizes that app access to his or her electronic health record.

The other bullet I would suggest is in Column 2, where it says there are no major policy breakthroughs. I would like to suggest we think about adding the European Union General Data Protection Regulation, which I think is another sea change in privacy, and California’s.

ROTHSTEIN: There are so many temptations to address issues, but I just want to mention one now, and that is that not only are people getting access to their information from medical records, but it is going in the reverse. We never had this before, where patients were allowed to upload data that they have generated or that they have put forward. One example is direct-to-consumer genetic testing. Some health care providers will allow patients to upload their entire report from companies like 23andMe. It is part of the All of Us Research Program from the department, where the idea is that all the information that is given back to the individual can, at the individual’s option, be submitted to their health record.

One of the interesting points about this for NCVHS is that this can serve as the link between what has already been done in the clear jurisdiction of NCVHS and the department and this sort of nebulous other world. It would certainly in my view be appropriate for your report to consider what shape the information has to be in before a covered entity is required to put it into the record. Are they required at all to put it into the record? How is it aggregated with non-health information that the individual may seek to upload as part of the environmental data and so forth?

Now we have already established the jurisdiction, and with that hook, I think that the department can set rules that would have even wider implications. If we are setting rules on how it must be structured, the security provisions and so forth, before it can be put in a health record, then downstream we are having an indirect effect on the people who are facilitating that, so the app developers and so forth.

EVANS: I have the same sentiments that have been expressed, but I think one of the big themes here is the entire edifice of HIPAA and how much vitality it has going forward. I refer to the fact that it is built around the notion of stewardship and trusted intermediaries. I think five years out we need to be aware that one possibility is that people grow so frustrated that they just do not do business with entities that do not give them blockchain technology protection, which creates smart contracts that give a set of rules, with no intermediary who can deviate from those rules.

I think we need to devote at least a small part of what we think about to considering, if that happens, how do you protect those national priority uses, like public health uses of data, that HIPAA was designed to allow, and make sure they do not fall by the wayside? How will we manage the meshing of an increasing number of apps and under-pressure providers that may have to provide a heightened level of protection that is kind of locked into software in a blockchain system? How will that mesh with this old system of HIPAA? I am just concerned the technology could be changing out from under us, such that we need at least to have a visioning exercise of what that world could look like in five years.

GOLDSTEIN: I agree with many of the comments that have already been made. I am trying to think about how to formulate this notion I have that genetic data is becoming increasingly usable, although we are still making baby steps, and certainly the perception that genetic tests will be useful to the individual far outpaces how usable they actually are at the moment. But that perception is driving a patient and consumer attractiveness to direct-to-consumer ideas, and to either knowing more information about yourself without a particular purpose or looking for information about yourself for a particular purpose. This is coupled with the decreasing utility of the concept of de-identification, and perhaps its obsolescence, especially in the realm of genetic data. We are not quite there yet, but there is a precipice at which people, not me, will be able to decode genetic data, and we leave it around all the time.

I feel that things have changed with 23andMe and that people do not necessarily know about the changes and the dangers, despite the notices, despite the check boxes. I think very few people actually know what 23andMe has done – and I do not mean to just pick on them; there are multitudes of them, and they are just in the news the most – that they have made deals on the outside to sell the data for research purposes, and that many people have agreed to this, perhaps knowing, perhaps not knowing. It feels like the necessity of something protective is accelerating.


MAYS: I think that some of my comments just got made, but I want to add some issues to that. I think in particular about predictive analytics, which people are really turning towards; in the background, they often track people’s behavior within a health context. And then decisions get made about what you may get or have access to, et cetera. I do not think that people are aware of it, nor is that data being collected on behalf of the patient. It is actually being collected much more on behalf of the organization, for cost containment, et cetera. Having people be better aware of how their behavior is being monitored I think is important.

The other issue is in terms of the genetics data. I think in particular what we have to realize is that you just cannot tell people, particularly those who have low literacy levels, et cetera, to check a box and sign off. We have to figure out what education is necessary. Here, it is not just the impact of you giving your data, but how your data will be used for your family and others who are related to you, and whether there is a question of a broader kind of consent to do that. In this land now of individuals opening doors to other people, and other people having no idea, we have to ask whether we are talking about different kinds of consent.


STEAD: As I listen, I want to second Melissa’s comment that the concept of de-identification is becoming or has become obsolete.

The other thing that I think is important to think about is the inclusion in GDPR of the right for individuals to request that their data be removed. None of our frameworks really anticipates that. And what that gets at is – I think a lot of our previous work has said that the stewardship principles and the reasons we might release information were purpose-specific to a degree. When a person’s situation changes, then their assumptions about whether they want their data accessible or not change.

A lot of people think about that as people getting more comfortable with everything being out there as they get more serious conditions and they want treatment.

But I think the general principle is it can go either way.

I think one of the big things we really have to grapple with, as we think about what the United States should do, is this idea that not only should a person be aware, which is sort of what I see being implemented in the response to GDPR, but, if I read it right, that they actually have the right to withdraw it.

KLOSS: Do you have any opening comments, Nick or Jacki?


COUSSOULE: This is Nick. Just a couple of highlights and additional things. We talked about, or Bill mentioned, the re-identification risk, and the privacy protection there is effectively non-existent. The expectation and the structure around creating that veneer of privacy is a little different. The rules as they exist today say that if you do these things, the data is secure, when we all recognize in a practical nature, given the sheer explosion of data and the ability to match disparate data, that it does not really work anymore.

The second thing that I would like to add is that there are really some evolving expectations of privacy given the sheer volume of social media and what I will call the general population’s willingness to share an enormous amount of information. I think that that is changing culturally a lot, but I think there is a big distinction between sharing of data that somebody wants to share on a Facebook versus medical information that is a whole lot more private to a whole lot more people. Even though the general notion is that people are more willing to share and have a somewhat lower expectation of privacy, I do not believe that really applies in a lot of the situations that we are talking about here.

KLOSS: Thank you. Jacki, did you have anything you wanted to add?


MONSON: I think I will add – of course, I have lots of opinions, but I will add one thing, which is that at the organization I work for, we have CCPA, GDPR and all of the HIPAA regulations and state laws. I think our biggest challenge is that most of these regulatory schemes do not – each other, and they do not – from HIPAA, et cetera. Part of what we are grappling with is just how you – with all of it, and how do you meet the needs of your patients at the end of the day and still protect their privacy, still secure their information. Just like GDPR, CCPA has the ability to remove information, and in a health care organization we do not want to remove their medical information, which might be needed in an emergency situation to treat them. I think that is the biggest challenge that health care organizations in California are facing today.

KLOSS: Thank you. Thanks for that warm up. Let’s move to the next slide please.

Before we look at the draft themes, I thought it was just kind of interesting to look at what the major themes were in the 2007 work. Ten years ago, we were talking about the potential benefits of health data uses. That did not come up in our discussion just now; we assume that the benefits are extensive. We did talk ten years ago about potential risks and harms from uses. I think they have come into a little sharper focus. There were themes about HIPAA and the importance of data stewardship, and most importantly, a framework focused on issues regarding specific uses, which are certainly not irrelevant, but they are certainly a narrower view of the world of issues than was just described. I think it does underscore how much has changed in one decade. They introduced the concept of use of health data involving monetary exchange, and we know that is now pretty much a business model. I thought it was just interesting to reflect. I think that contrast can certainly be one of the things that we bring out in a preface to our report.

What I would like us to do now and we are a little behind schedule, but we are able to catch up just fine. We have one hour to take a look at the draft themes that we circulated. I had in mind that we would break out into three groups. I may be thinking more like two now. But my reason for wanting to break out is just to get more ideas fleshed out faster and then come back together. And the timings on this are wrong. I think we actually might do two groups instead of three.

But I wanted you to look critically in smaller groups at these six draft themes and discard, combine, refine, reframe, add whatever you want to do, scratch, throw them out, start over, but for each theme, if you could then add a key bullet point or two that you think absolutely needs to be captured as we lay out themes in the report. Does that make sense? Do you think it will help to be less formal and not use microphones and just take 30 minutes and wrap your mind around these in small groups?

STEAD: Two things. One small – Theme One I do not think has been reworded to reflect the wording that finally emerged in the report to Congress. We may want to align it just so we do not get back into any of those discussions over again.

One thing we might do is almost have just the list of things we just talked about. I tried to capture bullets. I think I saw other people doing the same things. I can almost imagine being able to check and tune. I was just wondering if – I would be glad to email my list.

KLOSS: We probably cannot use the thumb drive. But Rachel has notes. Can her notes get displayed?

FRANCIS: Were you thinking of taking these six themes and –

KLOSS: Hashing them out.


FRANCIS: Were you going to give all of them to each group? My concern is, for example, one – if you divide it up so one group gets one, two, three, and the other group gets four, five, six, one and five look a whole lot alike just to take one example. In some ways, I think it makes more sense to do it together. For example, part of what was behind my comment about how easily information now moves out of the HIPAA world was the flipside of Mark’s comment about how easily information moves into the HIPAA world. Each of those moves is a possible regulatory hook.

That is how the whole business associate model got generated. That could be an overarching what are the available frameworks whether HIPAA or some others, GDPR that could be a separate topic that I am not sure gets captured.

KLOSS: That will be the work and the bulk of our day because we are going to work on recommended frameworks.

FRANCIS: I was thinking the scoping question was what are some of the options.

KLOSS: Is it your preference to stay together to work on these themes or to have us display the notes we have taken and then just do a little —

FRANCIS: Or if we break up to have each group take a look at all of the themes.

KLOSS: That is my goal, to have each of you take a look at all of them, because I think you have to look at them as a whole. I do think we will move a little further faster if we just say this side of the table and this side of the table. Just take 30 minutes and pull yourselves into a circle and digest this one page in light of the primer discussion that we have just had.

GOSS: Linda, I would like to just ask if Nick Coussoule could say a few words so that we can check whether our technology problem fixing has resolved your reverb.

COUSSOULE: As I am speaking now, I am not getting the same reverb I was before.

GOSS: Awesome. Thank you to the technology staff for fixing that problem.

KLOSS: We hear you loud and clear, Nick. Thanks.


GOSS: I apologize, Linda. Since I was off getting some technology assistance, could you just recap where we are going?

KLOSS: We are going to take 30 minutes and we are going to have this side of the table and that side of the table work in small groups. We are going to look at this set of six themes in light of the discussion we have just had. We are going to use the flip charts and just make any notes you wish to bring back. Then we are going to have 30 minutes where we are each going to report out. I think that will give us extensive input into the overarching themes. We are just picking your brain here.

(Pause/Breakout)


Agenda Item: Development of Guiding Principles

KLOSS: I think we will resume. Thank you. I know that was a burst of energy. When you do meetings like this, you just know things are going to get messy before they come into sharper focus. We have made some good progress.

I would like to ask Alix to report out on this group’s insights. We are not necessarily going to beat this until we reach full consensus, but let’s see where we are at after we have both report outs.

GOSS: Thank you. I apologize. I am going to keep my back to you a little bit so I can turn around and read our flip chart papers. We had some very robust discussion, and I think it will be good for us to capture the key points from each of the groups and then be able to tease those out a little bit more effectively throughout the day.

I think our task was to agree or to come to some consensus around items one through six and whether we were in agreement and if we thought something needed to be changed or something was missing. I am going to walk through the six in four parts.

In regards to items one, two, three, and five of the major themes list, our perspective was that those were sort of redundant. They all say sort of the same thing. We knew in 1997 that the jurisdiction HHS needed was not there, and the theme of knowing that barrier to holistically regulating privacy has been repeated; so what now, and how do we really address that?

The second part, or the next item that I would like to go over, is number three, which we think is the key to the path forward. It is not a top-down solution approach that we need. It is really about empowering consumers to take charge, and working from the base up, since we have had 25 years of experience with top-down not working.

The third part I would like to cover is related to item number four, in that with the emerging models that are being pushed down, such as social determinants of health and precision medicine, not all of us enjoy the same protections from our existing mechanisms. Equality is something I think we need to weave into number four.

In addition, for number four, we think that we should be extending what is there already related to the aspects of purpose of data and accommodating how consumers’ views evolve over time.

The fourth part that I would like to go to is directly to number six. Voluntary stewardship in theory could help, but it is sort of a fantasy model that cannot protect privacy due to stewards’ incentives. Further, a thought around fair information privacy practices and principles – I am struggling with how this was written; Barbara, you might want to chime in here – only new FIPPs, legally required, that enable individual control in all relevant settings can ensure and improve privacy.

EVANS: I think what we are trying to express is that voluntary top-down stewardship – we all want it, but it cannot really be effective. To protect consumer privacy, we need FIPPs, and Mark emphasized that they need to be legally enforceable practices that will empower people to enforce privacy, including the ability to sequester their data. We are not for data removal and do not really think GDPR is for it either. We think it is just giving the right to sequester data so that the downstream recipient cannot keep passing it on and on without control. We are thinking individuals need to be much more empowered in this environment, and that includes, as Alix importantly mentioned, education of consumers. They do not understand the implications of the decisions they are making.

FRANCIS: Maybe another sort of related way to put that is that purpose limitation is the FIPP that has been inadequately explored, and that it is critical – that may be the critical FIPP – rather than starting out with the identified data/de-identified data line the way all the frameworks earlier on have done.


PARTICIPANT: That would conclude Group 1’s report out.


FRANCIS: Maybe an even better way to put it is purpose control.


GOLDSTEIN: Do you think that – use limitation is kind of the partner of purpose specification. You say you are going to do it for this purpose, and then you actually only do it for that purpose. Do you think that use limitation has been properly used?

FRANCIS: I think it has been under-explored what that might look like. There are really interesting questions about in what contexts and for what kinds of uses use limitation is appropriate; public health versus commercial is a really good example of that. Limitations for public health purposes maybe ought to look very different. That is why I have been really sort of yelling over there about not even thinking about the de-identified/identified line, and also being really clear about the difference between sequestration, or de-listing as the Europeans call it, and erasure.

MILAM: Building on those ideas, as I think about the accounting of disclosures requirement under HIPAA and under other frameworks, it is really underutilized by consumers, probably because they get huge amounts of information in return and they do not really know what it means. If there were clear buckets of use and purpose and disclosure that could then be tied to the accounting of disclosures, kind of like a map, where you only have the available choices, that might turn it into something useful for the consumer. Tie that to education, and then people would be able to exercise a right that had some meaning to it.

ROTHSTEIN: I appreciate the sentiment, but I am not sure that would help in the majority of cases, because there is another point that got lost somewhere up there. Each year in the United States there are at least 25 million compelled disclosures of health information that third parties have the legal right – there is nothing illegal about it – to extract from individuals, getting access to their health records through authorizations. Setting aside health insurance, think about all the other forms of insurance, employment, commercial transactions, mortgages. There is a big long list. The entity says you do not have to do this, but if you would like us to consider you for life insurance, we need access to this. Now, there is a legitimate reason why they would have it.

My concern with that is what they get is too much. They get the entire record that they do not need. But I do not know how we answer that question sitting around this table because it is not really a privacy issue. It is a substantive issue. How are you going to underwrite insurance in different categories? How are you going to fund nursing home care, et cetera? That is a lot more complicated.

MILAM: The points you are making, Mark, are about situations where an individual decides to purchase a product, and the organization they are purchasing it from requires certain information that they decide to turn over, and they sign an authorization to do it, which is really very different from a lot of the other types of disclosures that occur. It is one generated by the consumer.

ROTHSTEIN: My point was simply that giving consumers rights, which might apply in one category of disclosures or situations, is not going to have any bearing on a huge area that many people just do not pay any attention to.

FRANCIS: On the other side of that, what you might get with an accounting of disclosures is not treatment, payment, or health care operations and what that currently encompasses, and how data can currently get passed on. It is a wonderful example, I think, because it shows how there is both overprotection and underprotection in a variety of different realms. It might matter to the consumer just as much that the TPO analytics get sold in certain kinds of ways as it might matter that the analytics that the life insurers produce get sold in certain kinds of ways.

KLOSS: I am going to have us report out the other group and then —

GOSS: Linda, before you do that, I just want to make a correction to the record. Due to time constraints, we missed Mark’s point. We actually had five major themes, and that has now been reflected on the flip charts. I apologize, Mark. We missed that point, which we had all agreed to.

KLOSS: I will try my best. We had a really terrific discussion, as it sounds like the other group did as well. We started by talking about number one, and what we want to do is go back to the way we described the regulated and unregulated worlds in the recently published Report to Congress, because we worked really hard on an opening paragraph and we want it to gel with the way we describe this theme.

But we went from there to saying that what the themes should do is – I will just drive this and then put it in a more logical way. But we wanted a clearer description of the world at the time that HIPAA was passed and the world now. And then have the themes pull out the problems that emerge from that profound evolution.

The patchwork item we want to keep, but we want it to be focused on laws, and to distinguish the world of health information more broadly from the laws. We do not want to jumble the two up like we do now in – the world as defined by the HIPAA law. We want to describe the world independent of the laws and handle the law as a separate theme. And then the future trajectory of all of the technologies; and of course in here, you will see as I flip, we preserved the consumer theme.

We did start teasing out things that we think are going to be useful as we move to framework. I think that was the discussion we were just starting to have from Group 1’s report out. Things like purpose limitation and things like that that may be very useful when we get to framework if that makes sense. We were trying to keep ourselves on theme, but put some placeholders on things that we want to remember when we move immediately to frameworks. This is taking that process of thinking and framing it back into themes.

The world in 1996 and the framework that we used in getting to the HIPAA rules and regulations: we had, as was pointed out, FIPPs, ethical frameworks, social values at the time, and that was embodied in the HIPAA law and the regulations that emerged at the time. And then a description of the world in 2019, and then what comes out of that is the —

STEAD: We can note up here that we used part of FIPPs, not all of it, for example. But in essence, whatever that framework was, what is the framework now that we will use to develop what we are trying to do? Understanding that, what are the gaps and problems that have resulted from that change, and then this in the end can guide how we address it. But the patchwork of laws seems to be a fundamentally different theme.

KLOSS: And we clarified that it is HIPAA and other federal law, state-to-state variation and then within state variation. It just kind of calls out how much of a patchwork it is and it is not just preemption. It is deep and it goes down to how different states handle different elements of immunization records, for example.

And then the consumer – we did not spend too much on that, but the consumer information rights and awareness. We would certainly agree on how that is going to drive the path forward. And then future technologies: not so much describing what we have today, but where it might be going, blockchain and AI and whatever, all the things we cannot imagine, personal patches in our bodies, whatever.

I think we had consensus that we need this kind of – this report is going to need to lay out these themes carefully and that as you have done, the themes do kind of lead you into the framework discussion.

Is there any pushback of Group B on Group A or A on B?


ROTHSTEIN: I just have one comment. I think the way you have outlined it is fine. It is important to draw a distinction between the technical conditions 20 years ago and today. The only thing that I would add is that the reason we need to do more today is not solely technological. It is from the fact that the good old days were not very good. And the HIPAA privacy rule even considering the conditions and the time at which it was developed was a deeply flawed regulation for lots of historical reasons that people around the table know.

Congress tied the hands of the department and I just think that we would be remiss not to mention that there were lots of big problems from the day it went into effect until today.

GOLDSTEIN: I want to go back a little bit to the compelled choice discussion that Mark and Sallie were having. I mentioned in our group the coupon idea from pharmaceutical companies. The big example now is insulin. Insulin may have always been the big example. If you need insulin and you cannot afford it: give us every single bit of information about yourself, and we will give you a vial of NovoLog for $25 instead of a vial of NovoLog for $800.

To me, some situations where you are “voluntarily” giving your consent to giving your information – such as this, such as long-term health insurance where you absolutely need it, such as life insurance perhaps when you are a single mom of four kids – are not extravagant. They are necessities in people’s lives. It is very difficult for me to call that real choice. I understand that we are giving up information in – at best, it is an understood trade. We are giving you our information because we want the insulin for $25, or we want the life insurance if you will insure us at all, or we want the long-term disability insurance. But I cannot call that choice. I do not view that as choice. I view that as trying to get the necessities that we need to live, and I would not call that a purposeful giving of information in the same way that I would perhaps say, yes, I want you to put this information in a registry because I want to be included in research, because I have people in my past who have breast cancer and I want to for altruistic reasons. These are different types of decisions, and I think we have to acknowledge, if we are building frameworks, that we are trying to protect people, because there are going to be people who give information in a “coerced” fashion.

KLOSS: You have given us some great insights in how to reshape these themes. And certainly what comes out of writing this is going to be much more sophisticated than our first one through six draft. It is exactly what we wanted this discussion to take us to.

I want to scroll forward please to – we are not going to go through each one of these themes because I think we have reshaped it. We have re-sequenced and that is great.

What we are going to turn our attention to now – and I think we can take an hour and then we will break for lunch – but I really want us to get into this principles and model work before we take lunch, because I know it will just kind of keep churning as you take a break. We are a tough taskmaster.

What we are going to do is first discuss guiding principles and then refine current model assumptions and then move – maybe we can get number one done before lunch. But model assumptions and then we are going to focus on private sector action and federal action. Does that make sense?

I think we have started this discussion of guiding principles. If you will just – the next slide. These were the guiding principles that were laid out in the 2007 framework. We may just want to scratch this and start from scratch, but it may prompt us to get going.

Think about a framework that we are putting forward today going forward. Let’s see if we cannot tease out guiding principles. I think in the interest of time and keeping us all on the same page, we will just work together instead of going back into our breakouts. How about that?

MILAM: These are principles to create a framework of principles?

KLOSS: Correct. I will ask Jacki and Nick just to pipe in when you have something to say.

COUSSOULE: Linda, this is Nick. Can I just make one comment? I think the discussions previously tended to focus on the issues that exist today from an individual’s perspective about what their data may be used for, how it may be used. I think the principles are really important to focus us not only on the issues that exist today, but also the opportunities.

We can tend to look back, and I was not involved in this when the law was first passed, so I cannot comment with some of the other participants’ knowledge and history of the challenges that were inherent and known when it was first established. I think the dynamics have also changed in regard to the capabilities that can be enabled by some of the challenges, and whether the framework helps or hinders those opportunities. I think the principles really matter to look not only backward but also forward. I think they cover that a bit. I just wanted to make that point.

LANDEN: In number 3, I think that could use some updating. I think in 2007, it was still – we needed to encourage use of the relatively new ability to access electronic data. That is less true now, but it is still true.

But what I would like to add to that is facilitate, or actually promote, interoperability, as that is now a current high-priority federal process. I think it would be advantageous to those receiving our eventual recommendations and framework if we voiced clear support for that policy of promoting interoperability and electronic information.

KLOSS: Maybe what we should do is start at one and work through. Would that be all right? I am afraid we are going to – because I heard strong sentiment from Group A that the consumer was a driver here. That is kind of what the first one is about, but it is pretty passive compared to how we are thinking about it and talking about it today. Could we suggest a way to anchor these guiding principles more directly on individuals’ preferences and rights?

I think as we step down through these, we will bring them forward to 2020 and then see what may be missing. That might move us along rather than moving from topic to topic. Is that comment about number one?

EVANS: We really need to think about this stewardship word, which has become a euphemism for letting data holders control your data, and letting regulators and IRBs and others control your data, on the assumption that they will be benevolent stewards, which they have not been. I think that word elicits skepticism now.

GOLDSTEIN: I am also wary of personal responsibility arguments. I guess it kind of makes sense given my last comment as well. You have the responsibility to take care of everything you do and all of your information – even for those of us who are highly educated and theoretically competent, it is almost impossible.

KLOSS: Let’s talk about the stewardship word. Do we want to use it?


GOSS: I would just bring in other aspects of the NCVHS deliberations within the Standard Subcommittee. The word of stewardship versus governance is something that we are struggling with as well. I think that this is a very good point, Barbara, that you bring up. What is it that we really are trying to – rules of the road?

KLOSS: What word would you use?


EVANS: Governance by the people whose data are used. Is there a word for that?

PARTICIPANT: Of the people, by the people – I know it sounds ridiculous. Could someone help here?

MILAM: Patients’ rights or individual rights. There are frameworks around that.


EVANS: Data citizenship is sometimes used, but it sounds sort of funny. But governance could be a good word if we are clear who is doing the governing. That it is shared. It is shared responsibility for privacy or something. I am never good with titles so somebody —

GOSS: But it is responsibility.


EVANS: It is guiding principles for responsible use and protection of data.

KLOSS: I think that raises just a really — we may not have the answer, but it is just a really good —

COUSSOULE: I think if you said guiding principles for use and protection, that is the right title – guiding principles for use and protection. That is really what we are talking about, right: protecting the data, but allowing appropriate use.


KLOSS: Is that a working title we can live with — guiding principles for use and protection of health data?

EVANS: Yes.


KLOSS: Thank you. Let’s look at Number 1: maintain or strengthen individuals’ health information privacy. Or should that be rights – individuals’ rights regarding –

MILAM: Since they are so degraded, do we want to remove “maintain” and just leave it at “strengthen”?

BERNSTEIN: A phrase you sometimes hear in privacy circles is that they should not erode. There is both strengthen, and that is somewhat what the first part is trying to say: maintain, but certainly do not erode, which is moving in the wrong direction.

KLOSS: And we would say individuals’ – what phrase? Individuals’ rights.

GOSS: I am finding myself pondering the phrase do no harm.

KLOSS: Are you okay with that then?


GOLDSTEIN: Where will the word right go?


KLOSS: I think it has to come after individuals. Are you talking about legal rights or human rights?


EVANS: Legally enforceable rights. Privacy means so many different things to different people. We are not going to always keep people from using our data and there are many reasons why it is socially beneficial to a lot of uses, but we would like some controls on downstream repurposing of data and data security, minimum standards that would be more stringent. It is sort of a thing of the purpose for which it is used. There are no rules after it leaves the HIPAA environment. I know this has been a theme in some work that your group has done that the business associate agreements need to be a lot better than they are today.

GOLDSTEIN: I would not limit it to rights.


KLOSS: We will come back to figuring out what the right word is. I just want to say that all of what we are talking about here is in that kind of Beyond HIPAA mindset. Individuals’ say, individuals’ —

GOLDSTEIN: Legal rights are, very frankly, few and far between. You do not even have a legal right under HIPAA, a personal right of action. They do not currently exist. Human rights might encompass everything that Barbara just mentioned, but they might not. The US has even signed on to a lot of those human rights ideas.

I like the way that it is phrased more than – I would see rights as limiting as opposed to broadening, only in that I am afraid that is how people would interpret it, I think.

MILAM: People may value an option even if it is not a right. What I have been sitting here pondering is do we want to offer a benefit to the consumer through their selection of a privacy option as opposed to just – or have the individual perceive a benefit or feel a benefit as opposed to maintaining a certain status level of privacy.

ROTHSTEIN: I would respectfully recommend that we let the 2007 report rest in peace. Instead of our trying to update it, let’s develop – I should not speak for you. I recommend that you develop your own principles through the discussion that we are going to have through the rest of the day and then at the end capture what this committee felt. The 2007 report was drafted under markedly different conditions technologically as well as otherwise. It is innately troublesome and let’s just ditch it and do our own thing.

KLOSS: I think maybe it is constraining to us more than helpful. Let’s just blank out this piece of paper and now it says guiding principles for use and protection of health data rights. If we can get the principles down then the framework should flow from the principles. At least that was my logic.


BERNSTEIN: Instead of taking detailed notes about what everybody is saying, we will just put a screen up there. As you brainstorm, we can work on the language if you want. We are now calling it guiding principles —

SEEGER: For use and protection of data.


KLOSS: How about just taking this technical time while we are doing this for a little quiet time? Give it five minutes. What principles do you want to see? Then we can share our ideas.

EVANS: Do we have to work within the existing HIPAA statute?

KLOSS: No. Beyond HIPAA.


EVANS: Wish list.


KLOSS: Who wants to start with one? Let’s just go around. Sallie, do you want to give us one?

MILAM: I was thinking about the existing principle about improving the health of Americans through the health care delivery system. I think that we have a lot more people living here than Americans. I also think we do not improve our health and well-being through the health care delivery system alone.

I propose to change this to: recommended protections enable increases in health equity. That encompasses all of that. In my mind, it would encompass very good privacy and security protections, better uses of data, better surveillance and prevention, health promotion.

GOSS: A theme that you are going to hear from me is that I think we need to start education in grade school so people understand their data and how it is used. I think it needs to be part of our curriculum, so that people have some basic understanding of how to navigate it, and so we can create a fabric within our country, a more cohesive fabric of privacy protections. I think that, as participants in daily life, we all need to be more knowledgeable and skilled in understanding our roles in navigating it.

FRANCIS: A step back from that, I would emphasize the FIPP of transparency, a requirement that data holders be open about what data they hold and what uses they are putting it to. I would also want to highlight the fair information practice of purpose specification and use limitation, which does not mean you cannot use it, but what it means is that there needs to be selection – who gets to pick and in what circumstances that is appropriate to enable things like number one.

EVANS: This is a follow-on to Sallie’s point. In 1997, HHS in its report to Congress under HIPAA listed national priority data uses that needed to be protected even if it meant not letting people have a right to consent to them. I think we are at a point where we need to update that list and put some quid pro quos on it. The original design of HIPAA was that, yes, data could be used without your consent, but there would be these constraints downstream. Then in the rulemaking, HHS said we do not have jurisdiction to impose those constraints, but we are going to let the uses go on anyway. We need to redo that analysis. It has been over 20 years. Look at the process. Who is going to decide what counts as a priority use? Is there a way to get public input on that in a meaningful way? People may not want their data used for some of these things that were deemed socially beneficial. We need to rethink that.

What are we going to allow and what responsibilities will attach to the use of unconsented data and purpose specification that they are only to be used for that and not passed on? I think that is a critical point here.

BERNSTEIN: Barbara, can you help me encapsulate that?

EVANS: I am sorry. I am so verbose.

BERNSTEIN: What is the essence of –

EVANS: Redo the list of national priority uses identified in HHS’ 1997 report to Congress. Are those still good unconsented uses?

ROTHSTEIN: You could also tie it into Section 512, public use –

 

EVANS: See if those are still the right uses, and this time we need to think about what responsibilities we should impose on users who receive data pursuant to them. We imposed almost none last time. If data has such social benefits, we have to attach meaningful responsibility to unconsented data use: update the list and put some quid pro quos on it.

ROTHSTEIN: When I think about health information beyond the regulated setting, what occurs to me is: don’t treat health information like we treat other information in the computer age, with click-through consent, terms of use, and statements that are 50 pages long that nobody reads. If we apply the same standard, then people will in effect be forced to sign away or give away health information, and I think we need to act in a way that is going to prevent that. Part of it goes to the first point that Leslie made about openness and transparency. But there is more to it.

I think we need to encourage and require the development of meaningful, understandable, and transparent consent requirements for the disclosure and use of health information.

GOLDSTEIN: This is on a wish list although I do not know whether it is realistic at all and if it is not, I want to blame it on jet lag. Collection limitation.

If you do not need my genome, do not ask for my genome. If you are not planning to use my genome right now, do not ask for it. Ask for it later. I think in other circles, people would be up in arms about this idea because people want everything just in case they can use it later, but I would really like to respect the idea of collection limitation.

If it is not envisioned by what you are planning to do with this data, do not collect the information.

PARTICIPANT: Is that related to purpose specification?

GOLDSTEIN: It is another one of the FIPPs.

PARTICIPANT: It is minimum necessary collection.

MILAM: I have a clarifying question. We are talking about Beyond HIPAA. You are not suggesting it is for treatment. You are suggesting it for areas outside of health care delivery. Or are you suggesting it for treatment?

GOLDSTEIN: I will be honest. If you do not need it for the treatment of my condition at this particular point in time – I understand that you might ask for it later, but do not ask for it at this particular point in time. Once it is there, there is always a risk. It might be a small risk. It might be a large risk, depending on the particular protections you have set up in your security. But when you ask for information that you do not need now, it then exists, and there is a risk of breach.

MAYS: Can I ask a question? Are you thinking about research?

GOLDSTEIN: I am thinking about research as well. I am particularly focused on the genome, but it is not just the genome. I think the genome is the most vivid example, in both research and clinical practice, of information that you may not need at this point. Later on, you may decide that you do need it. But in general, if you do not need information, do not ask for it. Do not ask me for everything upfront if you are not planning to use it and it is not part of your purpose specification.

EVANS: One concern I have about that is – and I very much like the idea, but it assumes we are only doing hypothesis-testing analytics where we know what we are looking for. We are in a time when there is enormous benefit in hypothesis-free analytics where you —

GOLDSTEIN: And it also assumes an ongoing relationship. I understand all of that, with both your clinical practitioner and with – and in some situations it will not be possible, like in the ER possibly. We are talking about a wish list; I wanted to throw it out there. I think there is some way we could phrase an idea about collection that respects the principle without perfecting it, perhaps.

EVANS: I think this was suggested in one of the letters you put out after I was no longer working with you. There is a place for hypothesis-free analytics where you do broad collection. You could let the collection happen pursuant to a business associate agreement that requires destruction at the end of the study, so that you let the science happen but you do not let the data continue to sit there after its purpose has been fulfilled. I would allow broad collection, but not have it as uncontrolled as it is today.

MAYS: I just want to comment, because as a researcher this is very concerning, for the simple fact that sometimes when we collect data, we know there are technologies about to come. We know that some assessment will occur. A lot of things are future based. If at every point we could only collect what was specific to that hypothesis, it would probably cut research in half, and the breakthrough innovations would not be there.

The issue that you talked about, the destruction – I may be able to use analytics that are not hypothesis driven to help me identify some things, and then I would work on that for a while. At what point is the destruction to occur, when fields differ in the extent to which they have breakthroughs? If all of us had to follow the same rule, it would be devastating in certain areas. In other areas, okay, we will destroy the data. That is the end of that.

Think about it: one of the things that pushed HIV research through was that there were blood samples from gay men in the San Francisco area – because of STDs, blood samples had been taken. That allowed us to study some of the origins of the disease much more quickly. We had no idea. There is a lot of data like that where you go back. It has implications for registries and biobanks.

 

KLOSS: I am going to be a time manager. Your point is made. Do you have another point to add? Let’s hear it.

MAYS: I would like the principle of collecting data as granular as possible, but with attention to greater protection the more granular it gets. We do not do people justice anymore if we do not know things like age, race, and variation within race – all those kinds of factors. We have been collecting at such a broad level that it is not turning out to be beneficial. But granular data requires a lot more in terms of protection. The point is to get the benefit, but the data needs to be protected.

STEAD: I really just want clarity – this is a little bit like the question raised earlier about the degree to which we are opening up TPO. Are we basically opening up the separation of research conducted under individual informed consent, with data collection tied to that informed consent? I know nothing is perfect in this world, but that is one of our more balanced efforts at real consent. If we are opening that up, you really do create a completely new can of worms that relates to the work around data reproducibility, science reproducibility, and a lot of other things that are really about our increasing understanding of what it takes to do rigorous research when the data has in fact been captured for that purpose. I think we need to think hard before opening that. It seems to me that flies in the face of 21st Century Cures and a lot of other things we are trying to do. Our life may be simpler if we say we are not going there. If you want to go there, that is your business.

LANDEN: I want to go back to the point that I raised earlier. In our framework, I would like one of our girders to state very explicitly that the framework supports interoperability and promotes the use of electronic health data. But I would like to couple that statement with patient or individual controls. First, and I think this is where Alix was going earlier, there has to be some sort of informed consent by the individual for the primary use. My second control would be for any secondary or tertiary uses. There need to be additional controls the individual can exercise to say yes or no to downstream uses that were not part of the original reason the patient consented to the release and use of that information.

FRANCIS: Could you scroll back for a second so I can see the first couple of ones? I want to make a generalization, which is that one theme seems to be that the distinctions that drove the original structure of HIPAA – many of them no longer hold. Just scroll all the way up to one.

Think about the initial distinctions that drove HIPAA that there is a distinction between identifiable and de-identified. That there is a distinction between research and treatment. That there is a distinction between public health and research. That there is a distinction between a HIPAA-covered entity with respect to the data that they hold and the protections that apply when something gets outside of HIPAA. Those are all a bunch of distinctions.

Part of what needs to be going on right now is the question of whether it is really problematic to keep working in a world in which we assume those distinctions, and whether there are some principles. I tend to go back to something like: you have to have some knowledge of what is going on, and that requires that there be some purpose specification and some sticking to it, which can vary a lot, just for starters. The question is whether – other people may disagree with those. I suspect there is a lot of disagreement about collection limitation as one of those overarching ones.

The way to think about moving forward is how do we get beyond – what is it possible to do to get us beyond some of the distinctions that are just not workable?

KLOSS: I have one I would like to throw out before I continue around. I would like to make sure that there are some mechanisms in place to mitigate risk of harm and consequences of harm. You lawyers can state that better. Redress is a very nice word. We have no – there just are not any —

EVANS: The little discussion we just had about minimum necessary collection, or destruction, or the legitimacy of science gets back to this point that we need to revisit our allowed uses under 512 of HIPAA and how we get there. It may be that at the end of that conversation, we conclude that that list of uses was a pretty good set. If that is the conclusion of a wide national conversation about what unconsented uses are permissible, it comes down to this thing you just said about how we make them as safe as possible. That may be where we end up. We need to strengthen the protections but allow the uses to occur, and we need transparency. People need to understand that this will be happening with their data. They think HIPAA does not allow it; we should make clear to them that it does. That may be where it ends up, but we need a conversation.

BERNSTEIN: Is the conversation the point here? That is, the principle is that this should be developed?

EVANS: We had a conversation of four people, and you can see that it is a difficult problem. Can we do a better job of how we enunciate this – what uses are allowed, with what consequences, protections, and transparency?

ROTHSTEIN: This is designed to make Melissa feel better, because some people might view this as so outrageous that it will pale whatever you said. There is a sentiment I hear that we ought to encourage the collection of relevant, accurate, valuable, interoperable health information that can be used for good clinical purposes, for public health purposes, and even for research. What is the privacy price that we are willing to pay for that? Is there any way that we can protect people better? I would suggest that this goes to the nuts and bolts of the privacy rule as it is currently written.

I want to put in a plug that got lost earlier, which we talked about in our small group, and that is revisiting the issue of sequestration or segmentation of sensitive information. NCVHS sent a detailed letter to the Secretary in 2008 outlining why this was important. I think it is just as important today as it was then. I would urge the present committee to look at that.

The other thing is —

 

BERNSTEIN: Can I interject just for a second to make a comment about that particular thing? I do not know if it got discussed already this morning. You all probably know that the Office of the National Coordinator for Health IT has an NPRM on the street, and that it includes requirements in certification to have the ability to do sequestration.

ROTHSTEIN: They did not consult me on that, but I am happy to hear it. I do not know what the timetable is.

BERNSTEIN: I am not an expert on this, but my understanding is it is not always required to be turned on. It is optional. But they have to have the capability.

Somebody will correct me if I mess this up. There is the capability to do it, but often these extra modules cost extra money, or they are not turned on, or for whatever reason the provider does not want them in their system. But there are requirements in the certification rules for these kinds of things – people should look at that.

ROTHSTEIN: The second point that I want to make is that one of the really excellent contributions of the old privacy rule was the concept of minimum necessary, which I think your comments go to, but it does not apply to treatment.

I would like to propose that another principle be adopted: information in the least identifiable form consistent with the use. Take, for example, payment. The typical patient who has health insurance – their health insurance claim, with their name plastered on every page, is touched by 100 to 150 people at the hospital and at the typical health insurance company. Everybody that I talk to says they do not need that information. You have a ten-digit identification number. You have all sorts of ways to figure out whether patient 1234 is entitled to health insurance and whether the claim is consistent with prior payments. You can look it up if you have some legitimate reason. Why does that information have to be routinely included in health insurance claims? And the answer that I get repeatedly is: no patients ask for that – obviously, they do not know about it – the providers do not ask for it, the government does not require it, so why should we change until somebody makes us change?

Another example would be in health care operations. You can do a lot of reviews for licensing purposes, for quality assurance purposes – everybody is familiar with how broad that is. If you are doing those sorts of reviews, you do not need to know who the patient is. Actually, you need to know who the doctor is, but you do not necessarily need to know who the patient is. If you do, you can find them. That is the beauty of electronic health record systems: you can press a button, assuming it is built in, and that field is removed.

KLOSS: How would you press that point in Beyond HIPAA? How would you carry that principle forward to Beyond HIPAA?

ROTHSTEIN: I think it is a principle that ought to apply beyond HIPAA.

COUSSOULE: — really good point. We could argue until the cows come home about any given use case, but I do not think that matters. The idea of least identifiable consistent with the use is a really strong point.

Being in a health plan, understanding some of the challenges and issues, I may quibble or argue with certain aspects of an individual use case, but in general, I would be totally in agreement with that point.

MILAM: My comments are from a little bit ago.

With respect to something Leslie was talking about – my comments are actually from before that, but they are in line with your comment, Leslie – I would like to see a placeholder in the framework for whether NCVHS develops separate frameworks for these different scenarios or one framework with the applicable ethics, because I think ethics help us weigh that balance.

An example would be public health. The common good differentiates a disclosure for public health from all other disclosures. Research ethics are different from public health ethics, which are different from treatment. Identifying the right lens to look at that balance is really important. That is why it is still important to be clear about the use case, and it still requires some differentiation around the different purposes so that we understand exactly what is in that balance.

And then, Linda, you made a comment; I just want to offer my perspective. It is not that nothing exists today around mitigation for the risk of harm and consequences. Every state has a breach law. There is state-level regulation and, in many cases, the ability for individuals to have redress. I just want to offer that a lot is happening at the state level.

 

KLOSS: Do we want to offer any principle about technology or – we have not touched on that.

GOLDSTEIN: I do want to caution that public health has a common good purpose that might not exist in other places. But there is also room for privacy in public health, which is the entire point of the existence of public health law, which includes individual rights. Public health does not mean a free pass.

MILAM: I am not suggesting that. In fact, I think the entire framework should be applied to public health and public health agrees as well.

GOLDSTEIN: I think it is a nuanced – it has to be nuanced phraseology.

BERNSTEIN: The framework – whether it is an integrated framework or a public health framework – every element of a privacy framework should address every situation, including public health. As elements of the framework are identified, they will be tied to the appropriate ethical principles that apply.

PARTICIPANT: Which also ties into uses.

 

PARTICIPANT: Say that again. The frameworks should address all situations and –

PARTICIPANT: — ethics to the uses.

KLOSS: And technology.

 

GOSS: Could you elaborate on what you were thinking?

 

KLOSS: I would think that we would want some guiding principle that says any technology that handles health information is based on some common standard for how it carries forward privacy —

ROTHSTEIN: What about privacy by design?

 

KLOSS: It seems to me, going forward – and I know the committee wrestled with this in the past too – that one’s privacy choices should get carried with the data. They are not system specific or application specific; they are attached to the data. That is privacy by design. It just seems to me that we have to bring that into this framework, because we know from looking at apps and personal health devices and other issues that they just do not have foundational standards. I think this is an important link to what our Standards Subcommittee does. I do not know how to say that.

MILAM: Maybe it has to do with the implementation of the framework project by project – through your privacy impact assessment, all of your purchasing decisions, everything. Privacy by design is where you build it; your privacy impact assessment is where you buy it.

KLOSS: It always seemed to me that if the ground rules were clearer, then covered entities, for example, could make decisions about who they are going to share data with because they would have some criteria: we do not share with people who are not using these standards or these practices. And it is not only technology; it is practices.

STRICKLAND: In our group, we were talking about future technology we may not know of now and how to protect the data it may consume – drones flying behind you, you entering certain facilities, or whatever. It is future thinking: from where we were 10 or 20 years ago, what could we have imagined about where we are right now? How do we protect information we do not even quite know of yet, and put a cover over it so people are protected from technology not yet known? That is where we were going.

GOSS: I just want to clarify. Items 20, 21, and 22 are all related to item 19, privacy by design of technology solutions. For me, it was really the privacy by design of technical solutions. I still do not know if we got the technology aspect into number 19.

KLOSS: But I think we are going to have to do some combining here.

Any other suggestions for the good of the cause at this stage?

 

BERNSTEIN: I just wanted to think out loud about whether – we seem to be focusing a lot on things that are actually within the health care system and not things that are beyond it like the use cases that we talked about, your Fitbit, registries, CPAP machines, devices, other kinds of things that are not within the covered space at the moment. I just wanted us to think explicitly for a minute about whether there was something else that was not covered here that you would want to say about those —

GOSS: I think it is really interesting – not currently covered – thinking about the reality of today’s value-based care arrangements. We are certainly going to have primary care physicians referring patients into community services so that they can address food issues, transportation, and other things that are affecting outcomes, and now there are information flow aspects.

Buying a Fitbit might be one. You get a physical activity coach, and now that Fitbit is actually transmitting data back. It is going to be used within that coaching and reported back to the clinician, who then has to do quality measurement reporting.

This is to substantiate the risk-based contract model.

 

KLOSS: I think Leslie’s notion of rethinking these distinctions is really important if we extend it further. Just as stewardship is a troubling word, I have come to the conclusion that health information is troubling. We cannot make these kinds of neat distinctions anymore – that it is health information when it is here and something else when it is there. I think these principles need to be almost agnostic to that.

FRANCIS: This comment goes right along those lines. It goes back to Bob Gellman’s report to you all, which I thought was really interesting, on the question of what is health information. Now, there is one little hook going back to Mark’s point. There is some information that actually goes back and forth with medical records.

It goes into it and it goes out of it. And some of that is under business associate agreements right now. A great deal of it is not.

My Fitbit might be tethered to my doctor’s medical record or my doctor might say get a Fitbit and the Fitbit offers the capacity for me to download from my doctor’s record and I do or maybe I just have a separate thing on my iPhone.

I might feel very differently about the separate thing on my iPhone than the thing that gets the info for my doctor or the thing that transmits the info for my doctor. I do not know whether any of those distinctions make any sense, but there is a legal structure that was used for business associate agreements that is a potential hook to some of those.

Any time information leaves a medical record, there could be some required restrictions on what happens to it. But I think that whole question of whether that way of expanding things makes sense ought to be part of what is up for grabs.

BERNSTEIN: Are you talking about tracking the provenance of data – if it derived from a medical record at some point in its history, then it has some rules associated with it?

FRANCIS: That is a possible strategy. I do not know whether it is a desirable one. I know a fair amount about the registry world, and it is fascinating in that way, because some registries are entirely consumer driven. Others are essentially created by treating providers; the data goes into them and then often is out of the HIPAA space, but people do not understand that. There may not be a business associate agreement. In either case, the data might go to drug companies, but people may think very differently about that.

STEAD: I am just not tracking. I thought this entire list was a wish list that was Beyond HIPAA. Maya’s comment – you lost me. I think this whole list is Beyond HIPAA, which plays into Leslie’s comments around distinctions that may no longer make sense. We are trying to get a list of principles, one of which is whether the distinctions make sense. Am I tracking wrong?

PARTICIPANT: There were some examples that were within HIPAA.

BERNSTEIN: I just wanted to explicitly draw our attention to anything that might be —

KLOSS: I think we probably are at a good point for lunch break. We have until 1:30. Then we are going to do model work based on these principles. We will organize the principles a little bit over lunch and then come back and dive into model work.

(Luncheon recess.)

 

AFTERNOON SESSION

 

Agenda Item: Development of Guiding Principles (continued)

KLOSS: We are just going to take a few minutes to go over a first-cut cleanup of the guiding principles and make sure we have not changed the meaning. We did combine some. We started by trying to make these more active, declarative statements. But, again, we are more interested in making sure we have captured the breadth of content than in getting the words exactly right. And then we are going to go on to the models. We boiled it down to three slides. I will just give you a minute to look at this. We start out with Sallie’s suggestion that protections increase health equity. Second, consumers understand how to exercise their information rights and how information is used. Data holders disclose what information they hold and how it is used and shared; that was Leslie’s suggestion of transparency. Purpose specification, collection and use limitation – that came up in several ways, and we mushed it together into one. Unconsented uses and disclosures are limited and clearly specified. We do not want that practice to be extended; that was the notion of limited, and I think that captures your thinking. Doesn’t it, Barbara? Okay.

 

Consent requirements for disclosure are meaningful and understandable. And interestingly, that whole first set was quite consumer oriented. That is what came out. Now: collection of data as granular as possible, but more protection as data gets more granular.

We did not work too hard on describing that, but it is a very interesting notion, and I think it deserves a little more exploration. It suggests something like data quality: you have different standards depending on how close the data is to a data subject – how clear it is that it is a data subject’s detailed information versus a 100-million-record database – or maybe it scales according to how hard or easy it is to re-identify. There seems to be a granule of interesting possibilities there if you think about it. It is not one size fits all.

It does depend on the risk of re-disclosure.

 

Rich, this is yours. You talked about interoperability. In some ways, that is a term that is within HIPAA – not really. But at any rate, we changed it a little bit: the data is protected at rest and in motion.

But maybe that is changing it too much.

 

LANDEN: I like the addition of at rest and in motion, and how you captured exercising control over downstream use. This is fine. But it really does not say anything about promoting interoperability and the use of electronic health data. I do not know if that was intentionally omitted.

And the reason I keep coming back to promoting interoperability is because it is federal policy. Thinking ahead – this may be overthinking it – to packaging the report for somebody to take up: if we have that plank in our platform, that will certainly help the uptake. If we do not have it, that may be a negative.

 

GOSS: Are you getting at the semantic aspect of it?

 

KLOSS: It is a cornerstone word. I think of interoperability as a HIPAA word.

 

LANDEN: Actually, it is more HITECH.

 

KLOSS: No, but you know what I am saying. It is pertinent to covered entities and business associates.

LANDEN: No. Not at all.

 

KLOSS: Okay. Then we have to put it in.

 

STEAD: Rich’s point is that it is front and center in national policy. It is not at all being framed as a HIPAA-demarcated world, because we are supposed to make the data accessible to patients through whatever apps they want. It is the ultimate Beyond HIPAA.

KLOSS: We have it back in.

 

PARTICIPANT: Does that work for you, the edit in Bullet 2?

 

KLOSS: Individuals have the ability to exercise control over downstream uses for data not included in original consent. That is a really big one. Is that what we mean?

The mechanisms for redress and mitigation of risk of harm. Individuals have the ability to sequester and segment data for uses outside of TPO.

EVANS: One qualification. Individuals do not really have control over uses within the HIPAA umbrella. I do not think we want to suggest that they have more control over downstream uses than they have within it. If we could get to something like – have protections —

GOSS: We are on the last bullet, Barbara. I am struggling with the inclusion of TPO there. I was thinking, could we take the outside of TPO out of it, because I did not hear Mark’s comments in our discussion as being related to HIPAA.

EVANS: But I am talking about the third bullet too. They really cannot necessarily exercise control. We want them to have as much protection as they have within the HIPAA framework.

PARTICIPANT: It is not exercise control.

 

EVANS: But we do not want to soften it too much because a lot of people would like them to have control.

 

LANDEN: I am not sure whether I would agree with taking that out, because we are looking forward. We are not looking at what-ifs. I think the point is we are suggesting that there be controls put in place.

EVANS: Both within and without the HIPAA space.

 

LANDEN: Irrespective of – this was not talking about within HIPAA.

EVANS: Downstream is what is making me think we are talking about outside HIPAA, unless I misunderstood that. They have the ability to exercise control over all uses —

KLOSS: We were talking about —

 

LANDEN: It would be downstream from where the data was initially captured. A patient or an individual consents to the capture of data at some point. By downstream, we mean not the use to which the patient has explicitly consented, but any other use.

EVANS: I understand that. Thank you.

 

KLOSS: Information should be collected in the least identifiable form consistent with the use. Every element of a framework should address all situations. We did not work on rephrasing that, but we want a comprehensive framework that does not divide the pie, but unifies it.

MILAM: Every framework should have an element pertinent to ethics that is appropriate to the different situations. I do not think I see the ethics part.

KLOSS: Every element of the framework should include ethics. Ethics should be an element of every framework.

Then, incorporate privacy by design in technology. Privacy choices are attached to the data, not the custodian. We retain tracking the provenance of data originating in medical records.

MAYS: I just want to make sure I understand the one that says privacy should be incorporated as part of the design of technology. What is the protection element there?

MILAM: What I was referring to there is not that – it does not operate like a part of the FIPPs. The issue is, say it is a research project and you want to apply a privacy framework. You look at your risk calibration, weighing benefits against risks, and you apply the ethical principles that are inherent in a research project. Public health has different ethical principles, as does treatment. I am recommending that whatever lens we use always has an ethical component that recognizes the scenario the data is being used in.

MAYS: I think I was talking about the next one: incorporate privacy in the design of technology. I am trying to get a sense of what the protection is, because you say to design it in. It is like a lot of things that people read and check off as okay while having no idea what it means.

LANDEN: Privacy by design is a concept, and I believe it is now an ISO standard as well. What it means is that when you architect a process, from the beginning of the architecture you make decisions about what kinds of privacy techniques are going to be used, so that as you design the system, it is capable of being supported by those privacy systems or privacy programs.

ROTHSTEIN: An example would be like in the recommendation to have the ability to sequester information. When you are designing a system, you have to put that in.

KLOSS: So good work guys.

 

I am going to move us to the next slide, in transition from our principles to model assumptions. The Subcommittee worked on drafting a model, which we are not going to focus on too much – we are going to start from scratch – but certain assumptions were built in, and I would like to discuss these and see if you want to extend them, change them, or whatever you recommend here.

Our fundamental assumption was that we need deliberate action. Again, we are looking at the Beyond HIPAA space, and we need both governmental and private sector action; either alone will be insufficient. Concur?

LANDEN: Concur. I wonder whether we need to break out both federal and state or territorial government.

KLOSS: Governmental to me was public and private sector action. We can do that. Public and private sector action is needed. Either alone will be insufficient.

LANDEN: And actually it went to the necessity for both federal and state level.

KLOSS: And I think governmental is federal and state.

LANDEN: We have captured it in other points.

 

PARTICIPANT: You want territories and tribes. It is a little different with sovereignty.

KLOSS: The second has been – we made the assumption, as the Subcommittee worked on this, that while our focus is on the world Beyond HIPAA, those responsible for HIPAA compliance – covered entities and business associates – can be a first line of defense against inappropriate use, poor privacy practices, et cetera. I know we got some comment here that asking people to go beyond in their stewardship was going to be difficult to accomplish, but we just saw them as a real way to —

GOLDSTEIN: I am not sure whether that is actually true. There might be some non-HIPAA covered entities that actually are better paradigms of what we would look for than HIPAA covered entities.

I also would not call it the unregulated world, because it is regulated: the FTC has a tiny role, the FDA has a tiny role. There are little bits and pieces of regulation around that exist even beyond HIPAA.

PARTICIPANT: States regulate too in some areas.

 

GOLDSTEIN: And also, I think we have been discussing to a certain extent the fact that HIPAA was a compromise. HIPAA was never the paradigm of what we might want. Beyond HIPAA could mean that: beyond the HIPAA that we have.

KLOSS: Do we just scratch this one? Let’s look at the others and maybe it is not useful. Private sector actions are a combination of improved risk avoidance procedures, practices, standards and contracting.

BERNSTEIN: What does improved risk avoidance mean as opposed to any other risk avoidance? Some private sector actors are crappy; they do a bad job. Some do a good job.

 

KLOSS: We probably do not need the word improved.

 

BERNSTEIN: Some of them have risk avoidance procedures and some of them just do not care.

KLOSS: Maybe based on what we said earlier.

Private sector —

 

BERNSTEIN: I think it is either a patchwork or a range. There is a wide variety of compliance and self-regulatory action in the private sector, depending on the particular business, the culture, the sophistication, all of those things.

KLOSS: We will cut the word improved. Is it worth keeping this description? It is okay if we end up going through this list and saying the only one we really agree on strongly is number one. These are just here to provoke some thinking about what the scope of this model is.

BERNSTEIN: My idea was just that there is a lot of variability in these areas, which is not really captured there.

PARTICIPANT: What do we mean by private sector?

 

Are we referring to anything that is not the federal government?

PARTICIPANT: That is not governmental.

PARTICIPANT: NGOs, not-for-profits.

 

LANDEN: To that point, I think government has two divisions. One is the legislative and regulatory, which we clearly do not mean here. But when state governments have operational programs that are data collectors and data users, I do not think we mean to exclude them from this bullet point. Wouldn’t government actors like state Medicaid programs – wouldn’t they be the —

BERNSTEIN: That is the administrative state. They are sort of regulatory bodies or executing regulations and laws.

 

LANDEN: Number five seems to go more toward the legislative and rule-making function of government. But I do not see anywhere that covers government-run programs like Medicaid.

 

BERNSTEIN: Do you mean its own compliance, government’s own compliance or its own —

 

LANDEN: Government’s own compliance with privacy outside of HIPAA.

My other question, on Number 3, is that I am not sure what kind of standards we are advocating. Are we talking ethical standards, some sort of technological standards, or what?

KLOSS: All of the above. Is this helpful, or should we follow the Mark rule and blow it up? That is fine. We have consensus, though, that any useful framework for improving the protection of data beyond HIPAA has to apply to governmental and nongovernmental, public and private, commercial and non-profit. It has to be pretty comprehensive.

EVANS: I am not so sure. We have the federal Privacy Act, which governs the privacy of data in federal governmental databases, and HIPAA does not reach that. I am also concerned that HIPAA covered entities are mostly private sector entities. A private hospital is HIPAA covered, and it is private sector. I just want to make sure we do not conflate the two. I am not saying you are doing it, but as a reader, I was momentarily confused about whether the private sector wording meant the non-HIPAA private sector; we have a HIPAA private sector and a non-HIPAA private sector. Are we adequately clear here?

KLOSS: Clearly not. We just scrap this page? Okay. That is fine. We are scrapping. I was just trying to scope the model. We can scrap this too. But this is the Subcommittee’s work product. Again, the title is wrong because we have changed the title. But we made the assumption that stepping up the privacy protections beyond HIPAA was sort of a continuum of actions. If you look at the two concentric circles off to the left, mechanisms, public and private, we made the assumption that there were certain things that HIPAA covered entities and business associates could do in adopting protections that go beyond strict regulatory compliance. For example, they could require that they do not release EHR data to registries without a business associate agreement. There are just sensible things that they can do. That is where we came to the – there is a role for covered entities to play in improving protection in this Beyond HIPAA space.

Second, at the far right, what the legislative branch can do is enact new data protections, either federal or state, or hopefully unify existing protections, which I think is what we heard in our themes this morning. And then there are other kinds of actions that can be taken by data holders, data processors, and all kinds of federal and private entities to be better stewards. Some of that might be governed by regulation, and some of it might be business practices and business requirements. We saw a continuum.

And what we thought we would do with the afternoon then is focus first on either the private side or the public side. I had in mind that we focus first on thinking of what this framework should look like in the private sector, but you might say it is going to be easier to start with the public and then work our way back. Or we do not differentiate. But I thought we would focus first on private, but maybe we need the governmental levers to think this through.

GOLDSTEIN: I have been playing in my head with this idea of separating users, so you are separating HIPAA covered entities and business associates from all other data users and data holders, and you are only applying compliance risk to HIPAA covered entities and business associates. But I feel like I might also want to add use and disclosure risk there, instead of just compliance with HIPAA. But then I got myself into a circle, which I need the HIPAA experts to help me with. If you are a HIPAA covered entity, does that mean that all data you have, any kind of data, has to comply with HIPAA? What about that wall? Isn’t there a wall idea somewhere – that you might get data from somewhere else, and then you might have – so is it the entity, or do we then think of you as two different entities? I am trying to add the use and disclosure risk to the left side as well, but I am not sure, because of what happens if you are a covered entity.

SEEGER: The HIPAA umbrella is going to cover protected health information. Let’s think about a scenario where you have a group health plan and you hold information about your employees in HR. That information is separate and distinct from —

 

GOLDSTEIN: And if it is aggregate data, it is not PHI either. We could add the use and disclosure risk to the left side as well.

GOSS: I almost feel it is an inherent part of it, because part of your compliance is that you need to do an assessment – what is protected health information – and make sure you have the right protections in place, unless I am looking at this too narrowly. From a practical perspective, I have to account for my use and disclosure activities and ensure that I am doing a scalable, appropriate set of mitigation strategies based on my assessment.

PARTICIPANT: Is that what that word compliance means?

 

GOSS: To me it does.

 

PARTICIPANT: Or does it mean compliance with HIPAA?

 

GOSS: That is compliance with HIPAA.

 

FRANCIS: Can I say something at this point? I think part of the challenge of Beyond HIPAA that I was trying to emphasize is whether the right lines are drawn even within HIPAA. The de-identified line is one of them.

Another one is that once it goes out of HIPAA, it is binary. You have HIPAA land and you have non-HIPAA land. If it goes to a business associate, it is still in HIPAA land. If it goes to public health or law enforcement, then depending on the public health entity, it may be completely outside of HIPAA land. Part of what I thought we agreed on this morning was to re-look at those separations. At least in my own sense, within HIPAA land there are a lot of issues about transparency, for one, but there are other issues too, particularly with what happens with de-identified data.

KLOSS: What this diagram fundamentally does is separate the world into two parts, those covered by HIPAA and those not. Are you suggesting that we have to find a different way of describing the universe and where the information originates?

FRANCIS: It may be. That is one strategy for dealing with it. But it may be there is some overarching principles that we think need to apply everywhere. We have a jurisdictional hook for HIPAA, but we really should be looking at these universal principles and at least getting them into place in HIPAA and wherever. I would start there with dropping the identifiable/de-identified line. That is just one example.

STEAD: I want to go back and advocate for where we came out this morning, because I think getting this picture clear shows why we constructed HIPAA the way we did, with frameworks that we can largely just use more robustly given the world we are in now. It makes the change less discontinuous, on one hand. It makes a difference.

I would basically use something like this, one, to help bring everybody along so it does not take them the months to years it has taken us to make this journey. That then becomes, I think, the overarching piece in your principles.

I think this then becomes the framework you are trying to manage to, as closely as politics and society and everybody else are ready for. This is a nice way to frame what we do as long as HIPAA as it stands is the law of the land.

If we are really going to go back to this thing, then in essence I believe we are saying that HIPAA was a law constructed for the way these things made sense then, and we now have this. We would probably construct more than one new law – one might be more general, one might relate specifically to health. I do not know where it would come out. But this would let us know what to do as long as we are constrained by the current HIPAA law. I think when we made this, we were assuming a world in which we did not change HIPAA: we figured out how to up-regulate the things that people already covered by HIPAA can do, and how to do things in these other places, so we move in a better direction.

I see this as still useful, but not as the overarching framework if we are really going to get away from that current separation, which I am sensing the group thinks no longer holds. We at least need to decide whether it still applies by going through the checklist – whether it still makes sense to let the same things be used without consent and so forth. We have sort of outlined how we would answer this.

GOLDSTEIN: A way to frame it might be where we have been and where we are going maybe. It is not binary. It could be where we were, but we kind of still are there with just focusing on HIPAA compliance as opposed to where we should be going, where we want to go.

KLOSS: If we were to take the guidelines and this notion of the world in 2020, or even 2025 or 2030, can we do a little clean slate exercise? If we were designing this today, forgetting the constraints of HIPAA or any existing law, how would we do it?

FRANCIS: FIPPs are a good starting point.

 

STEAD: I am the generalist here again so I do not know what I am talking about, but you all do. These things exist. FIPPs exist. You can basically take FIPPs and you can say in HIPAA given the world we had to deal with, we dealt with this piece, but we ignored all the rest of them.

Here we are today. FIPPs itself has probably not changed a ton. Maybe it has. But my sense is it has been reasonably consistent. Now, given this, we now need to pay attention to these other pieces we ignored. That then becomes a very continuous journey, which I think makes it easier to bring people along.

KLOSS: But if we were to do that – and that was the exercise that we went through in 2012 when we looked at the principles for community use of health data – HIPAA makes use of many of the fair information principles. But what we are talking about here is that they need to extend to entities they do not extend to now. It is not that the principles need to change or that we need different principles; we need to apply them differently. Am I correct?

STEAD: I do not hear that we applied purpose very well. We applied it at such a high level – it was what we could do then – but it actually has not let people make more granular choices. I hear that we applied a portion of it to a portion of the ecosystem. I think I am hearing that we need to apply more of it, and apply it more broadly.

 

KLOSS: I was just going to suggest we pull up the list of FIPPs and talk about it.

BERNSTEIN: I was going to make another remark if I may if there is no one else who —

FRANCIS: I can email you the basic set that Bob Gellman used in the report he did for you two years ago. It is just a nice little box.

BERNSTEIN: Before we get there, I was going to raise a maybe radical notion: that we look slightly further back than the stewardship report, to the tenure of Mark Rothstein. The Full Committee, Mark may remember, sent out a letter in 2006 with a recommendation that now looks quite broad: it said that all health information should have a baseline of protection anywhere it is. I think it is a June 2006 letter. You can read the recommendation if you want. At the time, it seemed quite broad and even radical that the committee should have said that.

But that is kind of where you are going now: revisiting a basic recommendation this committee made even further back, although I think without a particular notion of how it might happen. It was aspirational at the time, and now we are at a time when maybe we need to think about it in a way that is not merely aspirational.

 

KLOSS: We do need to go that broad.

 

ROTHSTEIN: My recollection is that it was one of the easier sells for the whole committee when we got that in —

BERNSTEIN: Mark’s tenure was, let’s say, more controversial than Linda’s or Leslie’s or Barbara’s.

ROTHSTEIN: Indeed. While we are in a lull for a minute, I wanted to support what Bill said. It is not that we want to merely extend the current privacy rule to currently uncovered entities; it is that we think the privacy rule needs to be improved and updated for covered entities and the currently uncovered entities alike. I believe that when we identify a problem, we should be creative and willing to make recommendations wherever they may take us. But we still want to be on the same planet as the people who are going to read this. You reach a point of diminishing returns when you try to reinvent all the assumptions.

My example would be that if you take the GDPR and compare it to the privacy rule, they are exactly opposite on several key assumptions. In the US, things that you do with information are presumptively legal unless we say they are illegal; the GDPR says the opposite. We have sectoral privacy regulation in the US; in the European Union, the GDPR is one set of rules that is supposed to apply across the board.

We have a quite different view of consent here where we do not look too closely at asking people to check a box or opt out or something. The GDPR on consent is much more stringent.

It is valuable to look at new options, but we have to be somewhat tied to the realistic assessment of what we can do, what is likely to be well received and try to strike the balance.

KLOSS: I think that characterizes well what the Subcommittee was doing in building on what we have. But I think we can certainly explore starting with a new set of – a new framework.

 

BERNSTEIN: Can I read to you what the committee wrote at that time? In June 2006, the committee made recommendations that say the following: HHS should work with other federal agencies and the Congress to ensure that privacy and confidentiality rules apply to all individuals and entities that create, compile, store, transmit, or use personal health information in any form and in any setting, including employers, insurers, financial institutions, commercial data providers, application service providers, and schools.

I think to Rich’s point, the next recommendation says HHS should explore ways to preserve some degree of state variation, maybe yours, in health privacy law without losing systemic interoperability and essential protections for privacy and confidentiality.

At the time, we were talking about the NHIN, the National Health Information Network. It says HHS should harmonize the rules governing the NHIN with the HIPAA privacy rule as well as other relevant federal regulations including those regulating substance abuse treatments. It was pretty comprehensive.

EVANS: I am concerned though that it is not as comprehensive because it is all underpinned by the HIPAA statutory definition of health information, which includes information created or received in a health care context.

We have all this data. That definition is at 42 USC 1320d(4), and under it, I am not sure that stuff being generated outside the HIPAA covered context is health information.

BERNSTEIN: I do not think that is correct actually in this case. You are picking up on the use of the term personal health information, which is particularly used because it is not protected health information, which is the statutory language.

EVANS: But health information is a statutory term, 42 USC 1320d(4). I am worried –

 

BERNSTEIN: Health information by itself?

 

EVANS: Yes. The term health information means any information, whether oral or recorded in any form or medium, that, A, is created or received by a health care provider, health plan, public health authority, employer, life insurer, and so forth, and relates to the past, present, and future —

BERNSTEIN: It is very hard to get away from very general language. I think in the context of HIPAA, that is one thing to define. But I do not think that was the intent of this letter at all.

EVANS: I am just concerned that there is a lot of health information that is not within that definition. We have to capture it too.

KLOSS: We were going to kind of probe a little bit more on this – how things have changed and we thought if we reflected some on the Code of Fair Information Practices, we would say how far we went with HIPAA and how far we need to go now. Let’s try this for a couple and see if this is a productive exercise.

PARTICIPANT: Where did this come from?

 

KLOSS: This is out of our environmental scan.

The principle of openness.

 

FRANCIS: I think within HIPAA it is inadequately recognized, and certainly outside of HIPAA. It and the two interior ones – we do not know all the major kinds of things that health care systems do, particularly with not only identifiable but also de-identified data. Even when they transfer information out, people do not have a clue what happens to it afterwards. I am not even sure the original covered entity does.

 

KLOSS: I think this has changed over the decades.

 

BERNSTEIN: I think the idea of openness in the original Code of Fair Information Practice was that there should be no secret record-keeping systems. You have to publish what you are collecting. Even the CIA publishes the record systems that fall under the definition, mostly about its own employees. You should not keep databases that people do not know exist.

KLOSS: Apply that to the World Beyond HIPAA now. How should it be addressed in this big, wild, wild west?

 

GOLDSTEIN: Within HIPAA, when you go to a doctor’s office and they post their privacy practices, it tells you the kinds of places they might send your information. Do you have any idea where they send it or how they use the information?

 

BERNSTEIN: This is not about sending it. This is about keeping records that people know exist in the world. Just the idea that there should be no —

GOLDSTEIN: That is what it was about to you, not to me, and that is not what it should be either. When I talk about transparency and openness, I talk about transparency of where my data is, where it came from, where it is going, what you are doing with it, and who you are sending it to. The accounting of disclosures only applies to a very narrow set of what will actually be included in the accounting, and it is pretty lame as it is. There is a lot that we do not know. In the world where we are going, do you know where your Fitbit data is? Do you know where it is going? The CIA does not allow you to wear your Fitbit into the actual building. That should tell you something.

BERNSTEIN: Forgive me. The difference is that this formulation of the code – which has many formulations; Bob Gellman has actually compiled many of them – does not separate out – I do not see one that says notice, which would cover the kinds of things you are talking about. In some formulations, there is a difference between what we tell individuals when we collect information about them – all the things you are asking: what is going to happen to the data, how it is kept and so forth – and the principle of no secret record keeping systems.

GOLDSTEIN: Which formulation is this? I cannot see the citation. Can you scroll down? Where did this formulation come from? There are tons of them.

BERNSTEIN: This one came from one of our papers.

KLOSS: Let’s kind of keep high level. Is it helpful to probe into this and think about how we would apply these in the World Beyond HIPAA? Let’s kind of try to stay in the Beyond HIPAA world just for now. I do not doubt that we need to go back and recommend some changes. I am trying to think about this framework for all of the health information.

BERNSTEIN: For this principle, I think of databases held by private data aggregators, for example – things that I do not know are out there about me. With my Fitbit, I could probably guess that they have some database, but maybe it would be nice to know exactly what they are keeping and how they are keeping it and so forth. But there are many databases out there that either got information about me in some secondary way or collected things not directly from me that I might not know are out there. It would be nice to have some way of finding that stuff out. I am not saying that what Melissa is asking about is not also important. It is just a different principle than the one I was thinking about here.

KLOSS: If I look at that whole list of eight, I could conclude that with regard to any of my data, I know nothing. I do not know who has it, what they are using it for, how much they have, what the quality of the data is.

Isn’t that the world we are in? How secure it is? Nobody is accountable.

BERNSTEIN: One way of thinking – of trying to get at the particular things we do not know is to think about who has my data who I am not in – what is the legal word –

PARTICIPANT: Do you know where your facial recognition data is? It is a question.

BERNSTEIN: Privity. That is the word I am looking for. Thank you. Some entity with which I do not have a direct relationship.

GOLDSTEIN: But it is also people you do have a direct relationship —

BERNSTEIN: And people I do, but it makes the question more concrete for me if I think about even further removed. There is stuff out there. I have no chance of getting to know that it is out there.

KLOSS: If I could do one thing to improve the current situation, what would it be?

EVANS: I think this is a good list. It is broader than the old historical list. These are great protections and obviously they should apply inside and outside of what we call the HIPAA-covered space. Right now, people do not have that at all. I agree with you that the HIPAA privacy rule provides some of this in a very important way. But I like this list.

Can we scroll on down and see what is after eight? Is that the end of it?

KLOSS: But as has been pointed out, there are other models. But they are roughly this list.

GOLDSTEIN: Because purpose specification is not here unless that is what is meant by disclosure limitation.

BERNSTEIN: If you look at use limitation, it incorporates only for the purposes specified at the time. The idea of purpose specification is there, but they formulate it in a different way. You can find them all in there, but they are not stated in the way that you may have thought of them originally.

KLOSS: I sometimes use the diagram of the rights in GDPR as a very comprehensive set of principles.

BERNSTEIN: I guess the first one has combined the two things we were talking about, the idea of no secret data keeping systems and what happens to the data, its purposes and uses and so forth.

MILAM: I think the challenge is in the application. Think about Fitbit data and some of the apps, and think of the use cases that are really subject to very little regulation. They could probably make a pretty good case for addressing all of these principles in their design. The problem is it is not done in a meaningful way.

As a consumer, you get your Fitbit in the box and you pull it out. You are trying to turn it on. The privacy notice is probably some very small language when you set up your account online. There is no meaningful choice and you do not really appreciate what the impact is going to be because it is not written in language that you are used to reading or that you understand. There is no impact statement. They may tell you exactly where they are sharing it, but you may not understand what that means to you and who will get access to it.

I think we should take a framework like this that is really tried and true. There are really hundreds of frameworks out there, but they all come back to this framework. Some differ more than others.

But I think where we might get the best traction is to identify how this framework needs to be tweaked for the different use cases, as I have been indicating, because in a treatment scenario you have that one-on-one discussion. In each scenario, they are different. But when you are talking about apps and you are talking about Fitbits and you are talking about monetizing the data, there is never really any consumer-facing piece of it. It is all a click-through agreement.

FRANCIS: May I read you the actual language on Fitbit just to think about this for a minute? We transfer information to our corporate affiliates, service providers and other partners who process it for us based on our instructions and in compliance with this policy and any other appropriate confidentiality and security measures.

These partners provide us with services globally including for customer support, information technology, payments, sales, marketing, data analysis, research and surveys. That is totally uninformative except we may do anything —

EVANS: What do you mean by Number 6?

Obviously, HIPAA allows all sorts of disclosures without the consent of the data subject. It can be taken unconsented subject to a privacy board waiver, subject to minimum necessary. Are all those non-consensual uses included in other legal authority? Is that how that works?

BERNSTEIN: The principle is that information collected for one purpose should not be used for another purpose without going back and getting consent for that next purpose. That you have limited disclosures you can make to what you told the person upfront.

KLOSS: Let’s just go back now and try to describe what is desirable. Let’s not even go where it is possible. Let’s just go to what is desirable and then we can always scale back and put a pathway from where we are today to where we need to get to.

If we were to apply Principle Number 1, then I should be able to get an accounting of what databases my health data has landed in.

PARTICIPANT: That should be true whether it landed there in originally identifiable or currently de-identified form.

KLOSS: I agree.

ROTHSTEIN: But now you are getting into the problem that Bob Gellman captured very nicely in his background paper, and that is: Target has a list of what I buy, and I might buy greeting cards, bananas, and tools. But among those, I have bought band-aids and I maybe had a prescription filled there. Now what? What is health information?

KLOSS: Could one develop a tiered definition of health information? I could say my medical information – information that originates through treatment – is one tier. Another tier is information that can be inferred through what I buy at Walgreens. Is there an opportunity to say – at least I would want data from my providers or from encounters I have had with the health care system to be able to – I should know whether it has been released to databases that I am not aware of. I do not care as much about buying bandages, but maybe I care about – is that a possibility?

Could we tackle this problem of defining health information, but not making a great big circle, but by making tiers?

MAYS: I think the problem is that what impacts you the worst as we move along further in the use of technology is there are enough, not enough, but there are protections in your medical data. It is the number of things that are given to Google. It is how the Google analytics are then used by US News and World Report, which we think is wonderful, but they are not the only ones. Then it gets used by people in health plans. What we really are needing to deal with is –

I think when you started reading what health information is, it is like we need to fill it in to say "and the things that interact with it" and try to have a broader view of those things and what needs to be done in terms of those things.

Right now, there are many apps. There are like over a million apps that feed back into Google. Google then sells that to various entities that then are almost pretty close to being able to identify what you do and the implications of that in terms of your health. To me, that is more concerning than medical information.

GOSS: I started to put my card up, but I think Vickie is making a really good point. It is this definition aspect of personally identifiable information that Barbara started us down, with now the electronic health information aspects of 21st Century Cures and what we are seeing in the proposed rules. It has sea legs, and then it is going to go have usage like you are talking about, and we do not understand all those implications quite yet.

KLOSS: I am going to go back to the – what would – blank slate. How would you design it? Going forward, you cannot put the genie back in the bottle.

MAYS: I would talk about it almost the way Alix just said. We are talking about medical records, electronic health records. It needs to be also electronic health transmissions, and health is very broad. It is well-being. It is all the things that we do that can indicate any behavior that has implications for health. I would change our language.

BERNSTEIN: I was going to say something about just inferences, where you can make inferences about people’s health.

STEAD: I think you are relearning why HIPAA elected to define things the way it did because in essence, without being able to define this, what it said was if it is in this space, it is this because we cannot define it. I do not think we are any brighter now about how you would define it.

The only thing I can imagine the clean slate would be is a broad – some form of broad protection that in essence provides a floor that applies to anything that is personally identifiable, whatever level our society would accept. You could over time imagine that that floor could be advanced as societal need changed.

Then I think in essence you take a risk-based approach to creating things that are much more narrowly defined that are probably managed against a purpose-specific risk scale. If I just guessed where this would be in 2050, I cannot imagine anything that would not be something like that. We are all sort of having to do that as we are having to deal with what is confidential data within our various enterprises whether they are health enterprises or not. We have increasingly, I think, gotten to where we are really willing to absolutely lock down some set of stuff that we view as confidential and not let you deal with it with anything but a locked down system. Those two extremes turn out to be actually pretty easy to deal with. I have at least discovered everything in the middle is hard.

KLOSS: I could add one more plank to that and that is not only a risk-based approach, but some type of enforcement that is more uniform based on assessment of potential harm, not real harm because I think that is a problem too.

If you are selling data that could suppress the value of the homes in my community even if it does not fully happen, if there was a high potentiality that that could happen, but maybe that is covered by fraud or other existing laws. I do not know. I think there needs to be some consequence, some enforcement.

FRANCIS: This is probably way off what anybody can do. The current enforcement scheme is all data use agreement, which is contract law and not useful.

KLOSS: What Bill has suggested – and I am kind of seeing a metaphor here of a floor, a plank or something. I am just looking for something visual. Some clearer limitations on personally identifiable health information no matter where it exists. Certainly a clarification of the fact that de-identification is obsolete. We need to redefine what personally identifiable means today. A risk-based approach. Some enforcement.

GOLDSTEIN: I am thinking about the AI and machine learning field. I think that it is time for us to think about use limitations, honestly, because if you were trying to figure out algorithmically my likelihood of success in a treatment or failure in a treatment or longevity or how long I am going to live, you are going to put in my health status and all of my numbers and perhaps my genome, but you are also going to put in epigenetic data, what I eat, where I live, the air I breathe, my stress level, all of these things. You are going to put all of that into the algorithm to make the decision. We know all about algorithms – how wonderful they can be and how negative they can be. I describe them to my students as the new version of actuarial decisions by insurers. Like, what is the likelihood that X is going to happen. You are going to put all of this in there.

It also brings me back to when you mentioned equity. You are going to have social determinants of health. You are going to have equity coming out of it. You are going to have everything. Collection is not going to work by itself. Openness is not going to work by itself.

There have to be, honestly, some things that people are not allowed to do with your data. This would be use limitations. I do not know what the list is. We could brainstorm on that type of list. We are going more and more into AI and more and more into AI in the health sphere and people are fascinated by this. And the All of Us Research Program, which used to be precision medicine – it is all about epigenetic data as well, if it ever does anything.

Regardless of whether that program ever does anything, we are fascinated by AI. I do not think that is going to go away.

KLOSS: And what you suggest then does – would it be desirable to have greater transparency of the algorithms?

GOLDSTEIN: Yes, transparency of the algorithm, but you could give me an algorithm. I would not know what it means. That is not going to help everybody.

What are we going after? We want people to have control as much as possible. We want benevolence – how do we prescribe benevolence, how do we find benevolence? We want people not to do bad things to us with our data. We want to be protected from that. In this society, we do not have a safety net that is going to protect us if they do bad things. We basically have to prevent whoever evil doer A is from doing it. How do we do that if we are not going to protect you?

Look at the way the Europeans do it with GDPR. Most of those countries do have a safety net. They are protected from it happening, number one, and they are protected if it does happen, number two. We are not protected if it does happen, number two. Our focus is on number one. How do we find it? How do we find the benevolence? We do not want people to do bad things to you with your stuff. We would like to limit the amount of our stuff that those people get. That is kind of the basics.

EVANS: Mark said something in our breakout session this morning. I did not think much of it at the time, but it is really important – we did not highlight it, but he said not all of these things are privacy issues. Issues like whether we should use data for insurance underwriting are kind of societal issues. If we try to make everything be a privacy issue that you guys are supposed to solve, I think you go down a rabbit hole where your task becomes unmanageable. I very much agree with the concern about the use of data to train models for predicting. But that seems like a societal decision. Do we want to pursue that technology or not? I do not know if privacy policy is the place to make a decision about whether we are going to have AI in our society or not. If I did not say it correctly, please chime in. We kind of just said that and moved on.

But a lot of these things may not be privacy issues that we can solve with privacy law.

COUSSOULE: This is Nick. A couple of comments. I do not know how we would even contemplate including or not something like AI or machine learning or different algorithms in any statement you would make.

Frankly, that is just new technology that is available now. It is not changing. What it is doing is making more and more data that nobody would ever have thought relevant part of the equation that gets evaluated.

I think the only way to – or one of the only ways to deal with this is to really look at the use cases and the harm and try to prohibit that. I am not sure you can look at the technologies or alternatives because we could get the ones that are available today, but we would not know what is going to happen tomorrow —

BERNSTEIN: While Melissa was talking, it occurred to me. She asked a question not about privacy, but about preventing people from doing bad things to us. It occurred to me to think about GINA because that is not a privacy rule. That is an anti-discrimination rule. And what we want – we kind of assumed with GINA that the information is going to be out there. The law says you should not discriminate against people using that data.

It occurred to me that if we focus on the bad things that we do not want people to do rather than on whether or not they can get access to data, there might be an avenue for you to think about how you want the world to look, which was your question, Linda.

KLOSS: I was going to ask if we could go back over these four planks, a broad floor prohibiting or allowing people to restrict the use of their personally identifiable information as they wish. What mechanisms would we use for that? Let’s go back and say what would we recommend exist that does not now exist. Again, going back to FIPPs, some knowledge of where my information is going and who is using it. Some consent or authorization process that is better than now. Consents or authorizations that are not meaningless click throughs.

Could we kind of peel back each of those planks and see what we add to it so that we could think about what we do about this current world we are in?

STRICKLAND: I think that we have a disconnect. We have providers who – when you go to the doctors —

KLOSS: Thank you, Melissa. Will we see you tomorrow?

PARTICIPANT: My job is done. I say provocative things –

KLOSS: Will we see you tomorrow? We will have it all worked out by then.

STRICKLAND: You go to the doctors for a specific given ailment or even your PCP or what not. Then you sign this generic HIPAA form, which is much like your Fitbit disclaimer that you read. It has things like I can share your data for medical necessity, for billing, with my partners, which could be ABC Inc., which could be the guy who talks to Fitbit, who talks to Google. You do not have any idea. The consumer is signing this HIPAA form because that is just what we do. Nine out of ten times you do not even get a piece of paper anymore. You get this thing that is like this and they tell you to sign it, which is not compliant in my opinion either. But you do not really have the right. Are you refused service if you say, no, I do not want you to share my information with ABC Company because I do not know the confines of what ABC Company is going to do with that data and I do not know your contractual relationship with ABC Company that perhaps says that they can do whatever they want with that data. It is not private at all at that point.

At that point of signing that piece of paper, you do not know that the information is now not protected or leaving the protection of that office. Everybody thinks it is this beautiful umbrella that is protecting you, but in reality, it is actually something that is releasing your data and you are signing that release of that data.

The solution to that, I would think, would be that those types of forms would have to be somewhat regulated to say you cannot just have this generic share-it-with-all-of-my-world-and-partners-and-whatever for that particular thing, for, say, your rotator cuff surgery. I can only release information about your rotator cuff surgery, not what medicine I prescribed to you, not what pharmacy you go to, not any of that stuff that someone else could use for other reasons that are beyond the need. It is a tough thing to lock down.

FRANCIS: This kind of builds on what you were saying. One way to start trying this, if we want to, is to look at the sorts of things that might be done with data, which is different – that is the use side rather than my agreement.

One way to do frameworks is to start by saying it is all about individual consent. Another way to do frameworks is to say it is all about either making known or putting some constraints on the uses. If we start there, we have something in place, although I do not think they thought about it in quite the right way: HIPAA distinguished a bunch of different kinds of uses, just big buckets of uses. It called one of them treatment, payment, and health care operations. Those got lumped together, but I think today we have realized they are in some ways quite different. They have come to mean things that go far beyond what I think people thought they meant at the time.

Another big bucket is fund raising by the facility. Another big bucket is marketing. I think at the time people thought about marketing as my data gets sold to somebody who calls me up and tries to sell me ish-kabibble drug, but marketing is now something really different. It is the folks in this catchment area. Their doctors have started prescribing the generic so we are going to get on the nightly news the fancy new whatever the drug is or something like that. Marketing has changed.

Another big bucket is public health. I think there is a lot to think about whether public health has started using – doing more sophisticated and different kinds of things or whether people really thought through what public health does when that large bucket was created.

And the last one is law enforcement. With law enforcement, the idea was somebody showed up in the emergency room with a wound and we got the gun or the shooter or whatever, but it was sort of law enforcement on that level. Now, we have fifth degree relatives from 23andMe – it is not 23andMe that was the one, but various recreational genetics services. What law enforcement means has changed.

But I think those were actually pretty decent buckets to start out with. I think some of those uses have different kinds of benefits and probably require different kinds of controls.

We also have some public attitude data, some public opinion data – not very good public opinion data, but we have some from things like Lee Rainie's work at, I think, Pew – that covers things like people being more concerned about commercial uses of information than other kinds.

KLOSS: The model that the Subcommittee was working with was kind of neither a data holder – not based on use and not really based on data holder, but sort of – we should settle somewhere.

FRANCIS: I was suggesting that there are different ways to come at this, and a lot of the current frameworks have come at this either as who is the data holder or who is the data subject. Another frame to come at this with is what is the data use. And actually, even for something like public health, that is a different question. That is the use, not the holder, because there is some interesting public health work that gets done by entities that are not classic public health departments. I think that is just an aspect of any framing – we have some pieces of it that we started out with in the original frameworks that could be built on.

KLOSS: HIPAA is sort of a holder –

FRANCIS: HIPAA built on some of the – HIPAA is a holder approach, but HIPAA distinguished among uses. If you were going to broaden out in some way beyond HIPAA, maybe it is different if Fitbit sells your data to big pharma than if Fitbit uses your data to let all the runners in the area know that cougars have been spotted and do not go out on the trail alone. That is a possible use of the geolocation data in my Fitbit. There might be very different – I am just suggesting a way to think about differences in use as heightening or lowering questions about who holds it and whether or not I have consented.

At my university now, I no longer have the authority to consent to whether or not I get a shelter in place notice from the university. I get it if I am in the university system.

KLOSS: GDPR is based on kind of the individual rights focus.

FRANCIS: There are some pretty big exceptions built into GDPR, which are not based either on the individual or on the holder.

KLOSS: Should we take a little break? How are you feeling?

GOSS: But I do think that the last comment from Leslie about the use is something to explore more. Yesterday a number of us from the Full Committee with a standards focus attended the HITAC, ONC's federal advisory committee, and Deb and I were particularly having a conversation about that event and sort of the challenges we heard with the EHR and practice management systems. It is sort of, how could we solve this covered entity kind of aspect that is dynamic that we are in the middle of? It is very analogous to the use. If you are engaging in the activity, you have rules applied to you.

And then I was starting to think about the simplicity of – my favorite example, Suzy Q trying to understand what is happening with her data. When you are starting to talk about it in terms of use – okay, if you are in a university system, you do not need to consent. We do not need an opt-in. You are just going to receive a shelter-in-place kind of notice. Most people are going to go, yes.

Good. It has to resonate in some kind of simple terms even though it is a very complicated set of matters. But I am starting to feel like this lens shift has some merit to further explore.

ROTHSTEIN: Barbara and I and two colleagues have – after two years of work completed an article on genetic privacy that is called – I guess after the colon, applications, implications, and limitations. I just want to read the last paragraph. You can substitute health for genetic or something if you want. Our overview of the law of genetic privacy and by the way, we considered it in every conceivable way clinically, public health, research and all the third-party – our overview of the law of genetic privacy has been quite sobering. Although some opportunities exist to increase individual control over disclosures that may affect them, these situations are limited. Thus it may be time to shift attention from attempting to control access to genetic information to considering the more challenging question of how these data can be used and under what conditions, explicitly addressing tradeoffs between individual and social goods in numerous applications. The first step to meaningful protection of genetic privacy may be the societal recognition that health privacy including genetic privacy is now largely a mirage.

KLOSS: That is the article that you sent me, Barbara, right?

EVANS: No, it is not.

KLOSS: It is not the 70-page.

EVANS: This is the 61-page. Now, this will be out in the Journal of —

ROTHSTEIN: This is going to be published in the Journal of Law and Biosciences whenever it comes out.

KLOSS: It is largely a mirage. Is that what – individual control.

EVANS: It is just saying that this notion of individual control is, and always has been, largely a mirage because – this gets back to the thing. The social benefits of using the data are in many instances very compelling, such that we as a society made a decision to allow public health, health equity, access to data. That may need to be updated. But the buckets they came up with were pretty good buckets, to tell you the truth. I am not sure we want to revise them. But I love your idea of nuancing them more. They have evolved.

KLOSS: That is very helpful. That is very helpful to focus on uses, focus on based on our earlier discussion, identify higher risk uses and have some explicit mechanisms for redress. Doing that would be pretty huge actually or framing that as a set of recommendations.

I think we better have a little break.

GOSS: Let me just do one quick check in. Nick, I want to confirm that maybe we are heading in a direction that goes back to some comments you made earlier. Do you want to explore that a little bit with us or validate whether or not you are hearing the same thing?

COUSSOULE: I think so. As I am listening to most of the conversation, the transition that was just referenced I think makes sense. I am just interested.

KLOSS: I am going to draw a picture of it. It lends itself actually to a picture. That is why I wanted to give us a break and a stretch now. I am going to put this up and let us refine it a little bit.

(Break)

Agenda Item: Development of Draft Recommendations

KLOSS: We are trying to describe the whole elephant here. And then I think what we will find is that we need to come back and apply it to that more continuum model, that pathway that we have started and look at how we chunk out changes.

We are looking at a broad floor of protection for personally identifiable information. First, we started with the idea that there are three ways to envision this: protections at the individual level, protections on use, and protections or some mechanisms for holders. Our insight was that where HIPAA has excelled is this intersection of data holders – covered entities, business associates – and specific uses, TPO and the other uses that we discussed. And what we are adding to this or strengthening is the protective levers for individuals. A broad floor of protections for personally identifiable information needs to include some kind of use-based consent or authorization process, sequestration, segmentation. As we talked about earlier, the protections or those specifications follow the data. We will come back and look at our guiding principles and see if we capture —

STEAD: A key piece of that is you will notice that up at the top we have added Beyond HIPAA and Beyond Health because this thing will only work as a floor if it is in fact applying to personally identifiable information. Health gets at a different combination of uses and users that sit on top of that. But if we are going to have a truly broad floor, it will be whatever this country can agree to that is in fact our version of GDPR.

KLOSS: Then I had de-identified data as a particular flavor of data. We took that out because that is a notion that is of less value. But in addition to these individual protections, we need the data use agreements, the use and redisclosure limitations, and that does cover the territory.

STEAD: And those really apply at the intersections of these various combinations.

GOSS: I already texted it to Nick.

KLOSS: This model needs some kind of risk model with regard to uses. We talked about that. All uses are not – we have the basic cluster of uses in HIPAA, but we have to add commercial uses and other uses and some way of identifying risks of harm. Enforcement mechanisms and then redress. The enforcement can be on the holders, and redress or compensation is for harmed individuals or organizations, I suppose. It includes nondiscrimination. I do not know if that gets us anywhere.

I think we talked also about some discussion about tiers of health information.

Is there any plank or layer that is missing here?

MAYS: I have a question. For the risk model, is the risk about the information or is the risk also going to identify people who have cases that are most likely to put them at risk?

KLOSS: I do not know. I was thinking of it as which uses could lead to inferences or harm.

STEAD: I think once you get above the broad thing, which I think applies to individuals with possibly some exclusions related to – it is a broad floor. It does not – but what it does, in fact, is establish something that other things – it can be advanced.

And then I think you have use. I think use and holder – you will end up with a set of uses that have different criteria for what is acceptable, dependent on use. The reason you have different uses is the criteria for what is acceptable, which I presume relates to the balance.

MAYS: What I was thinking about is there are categories of people that need greater protection. It is more risky if it is – it is more risky if you have a disease that stigmatizes – that kind of thing. I just want to make sure somewhere – it is great to do that, but I do not want to forget that we have people for whom we typically write in some protection, where someone has to act on their behalf. In technology, it is really important because kids get onto these sites. We need to make sure that there is some protection.

KLOSS: What is missing?

GOSS: It is very difficult to tell at this point. What I am feeling is that we were trying to put up a graphical representation of six hours of free-flowing thoughts and honing of principles and of what is the challenge we are trying to address. I can tell you even struggled with it a little bit just by the way you crafted some of it out on the board. What we are trying to do, I think, is put them all in relation to one another so that we can then validate some of our thoughts moving forward for the report and subsequent letter that we might want to create.

ROTHSTEIN: I think it is a nice starting point. I think it is easy to say that personally identifiable information and so forth should have the highest standard or the per se coverage of the rules. But you are still going to have the very difficult task of saying what is sufficient risk to generate or to kick in the higher standard. What is sensitive information that might kick in a higher standard? What is a vulnerable individual who might require additional protection? This is a start, but there are certain questions and certain issues that are unavoidable. Unless you are going to say every kind of information is protected to the hilt, which nobody is going to say, you are going to have the difficult task of line drawing and developing criteria that will enable you to draw those lines. It really almost does not matter what the model is or the scheme. That is why in this article that we wrote, we said we cannot do that. We are going to forget about that side of it and we are just going to focus on what the uses are.

KLOSS: You chose not to focus on the individuals and all of the potential —

ROTHSTEIN: We chose not to try to focus on the problem of preventing disclosure because we thought it was virtually impossible to do that given the way that people disclose information themselves, the way the information of a health nature or even of a genetic nature is all around us. We thought that the most feasible way of getting a handle on that is to focus in on the uses. How were we going to let people use that information in insurance or in education or in immigration or whatever?

STEAD: I am not sure we are not talking past each other. Let's try to figure that out. The EU has in essence established some broad rules with some exceptions, and those rules, I think, other than as modified by the exceptions, are intended to be generally applicable. They are not very – they do not try to prevent disclosure. They do try to establish certain of the hooks you need for the fair information practices.

ROTHSTEIN: There is a difficulty baked into the GDPR as well and that centers around sensitive information. That is one of the areas that is left to the individual countries to figure out. They are in various stages of meeting and trying to draw the lines that we are talking about.

STEAD: What I am trying to propose is that the floor might not address sensitive information. The floor would be whatever principles we could agree would be broadly applicable. Then sitting on top of that floor, I think, is use, or a use and holder combination – that is where one would provide stronger controls and where you would deal with things like sensitive information. I would not try. I was trying to figure out a way to have something that lets us put in place some of the things we sort of agree should apply to everything.

ROTHSTEIN: Conceptually, I agree with that.

What I am saying is that is not sufficient and that you are going to still have to wrestle with where you are going to draw the lines and on what basis throughout all those areas.

STEAD: I am not attempting to imply it is sufficient. I am trying to by definition keep it from being sufficient because then it would be attempting to become something we could not do broadly. The idea would be if we could put the things that we all sort of agree to, which will make it easier to build out the things above it because you will be sitting on some foundation of practices that cut across and that apply no matter where things go so they then make it easier to hook together or manage the more restricted things that sit on top of it. I am thinking the floor would not be making an attempt to be sufficient.

It would be what we could agree to applying broadly, which might be a floor of nothing. And then we could advance – but the concept of the floor I think is still useful. And then I think to put – my guess is the things that sit on top of that are use and holder specific, and maybe Vickie's point is you are actually making combinations of individual uses and then combinations of holders and individuals within it – maybe that is how it would play out. Those would be narrow, but more robust. Useful or useless?

FRANCIS: I have a suggestion of two things that would go in the floor for everything. One is accountability. You have to have somebody who is a responsible person – someone identified as accountable. And transparency. You have to have some broad disclosure of what is being done.

MILAM: I would like to add to that. I think there are some others that are universal: security, security for privacy, and individual rights and notice. Individual rights would be the right to access your record, your right to request an amendment, your right to complain.

MONSON: Linda, this is Jacki. I have a comment whenever appropriate.

KLOSS: Thank you, Jacki. Now is a perfect time.

MONSON: I think most of the conversation that I have been listening to is focused on privacy and one of our biggest issues or challenges is actually the security of the data and cybersecurity and all the people who want that data. I think any model that we have without contemplating security is going to be challenging for us to actually deploy and we are not actually solving all the problems.

KLOSS: Thank you. It makes sense. We see security as certainly one of the guiding principles. Maybe it needs to be added actually. The Fair Information Practices had security in them.

BERNSTEIN: I was going to respond to Sallie's comment about access and amendment. Those are, as you probably know, not without controversy in some corners. It is either difficult or burdensome. There is no relationship between the party whom the record is about and the entity.

How would they – we even have those arguments among business associates and whatever. When we were working on a business associate rule, they were talking about how they could possibly do it.

MILAM: That is true, but when you are looking for models and you look at players that have entered the space more recently, they have been far more innovative than older players who adopted the rules midstream when they came out, and that is true across all the different sectors of health.

GOSS: I have been thinking about where all this is leading, one of the goals being to develop recommendations to define a contemporary framework of data stewardship for the secretary, including a pathway for improving private and public sector governance. I feel like there is this timing disconnect – maybe it is a philosophical and a timing disconnect – with efforts regarding the information sharing needed under today's value-based care environment, and then what that means really when you think about OCR's RFI and the proposed rule coming out. The work that we do is very important and is often leveraged in ways beyond just secretarial administration or recommendations.

I am struggling with – when you think about the final rule – the proposed rules from ONC, which I have not finished; CMS, I have. We have a lot of the framework for privacy and security being underpinnings to those obligations in a Medicare/Medicaid certified electronic health record environment. I am trying to put all these various pieces together and thinking about this table, how it fits, what are we really going to try – is it making sense to anybody else? I see all these various pieces and wonder whether they are going to come together. Our timing for doing a letter to the secretary. What does it mean that OCR is already off probably synthesizing that recent RFI information? We have the proposed rules from ONC and the conditions for certification.

BERNSTEIN: It is not too bad. The NPRM closing date for comments is May 3. It is a long and complex document. I am imagining they are going to get a lot of comments. It will take them some time.

GOSS: For just the ONC rule they were talking about.

BERNSTEIN: Just the ONC rule. It will take some time to process them and go through them. I do not know what kind of comments came in on the RFI or whether we have numbers for any of that. But I am imagining there were lots of them.

KLOSS: May I just say I have suspended those kinds of questions for today? We have just invested a lot in thinking about Beyond HIPAA and we have groups now holding hearings on consumer protection. I think it is just really important that we try our best to make sure whatever we can agree on moves the ball forward, even if it turns out to be not earth shattering. If we absolutely agree that these things are foundational – I think the guiding principles we came up with are terrific. We will be able to use those. I think our themes are strong.

We have a goal of doing a report from this deliberation. Maybe it is a five-page report. Maybe it turns out to be 20.

GOSS: Part of my thinking though and I respect that you desire to suspend it, but if you are thinking about broad floor kind of discussions and what needs to go into it, there is stuff in movement right now that could be influencing that.

KLOSS: I think everything adds – it is all additive. It will be some new insight. But I think it is kind of a breakthrough here that we are saying the broad floor is beyond HIPAA and beyond health. We had not gone there before because we are kind of saying we do not know how to define health anymore. I just do not think we can – I think by tomorrow afternoon we can come back to how does all this fit.

GOSS: I am glad I said it today because unfortunately I will not be here tomorrow afternoon.

KLOSS: What we want to be doing now – and thank you for bringing that up – is to keep differentiating between this report we hope to create out of all the good work that has been done; we have had some really great insights here.

And then turn our attention to what, from what we have talked about, could be actionable by the secretary today. Rich said something out of the box this morning that went onto my secretary letter list, and that was the importance of addressing privacy and security in this interoperability thrust.

GOSS: I am struggling with that a little bit because I do not see just because we have new ways of exchanging information that we have removed any of the obligations for complying with appropriate privacy and security provisions that are in place today.

KLOSS: Have we added any? Because now we are talking about using apps for access to information and what are the requirements for authentication and other kinds of privacy and security mechanisms in those apps. I think our weighing in on some of that can add a little bit of oomph to what gets decided down the road.

MILAM: I have a question for Maya or Rachel or whoever is appropriate. That issue of connecting personal apps to an electronic health record to receive extracted data or possibly to share data, bidirectional data sharing – is that regulated by the FTC as a personal health record or is that beyond it?

SEEGER: We have put out extensive guidance in collaboration with FTC, ONC, and FDA. We, meaning OCR. It is designed for app developers and mHealth developers to determine when HIPAA, the FTC Act and other rules apply.

And then OCR also offers a portal for app developers that provides certain scenarios that get a little deeper into this. But assume that an application is being developed that would allow patients access to their information in the EHR through an app, a portal, that type of thing. If the developer is working in tandem with the covered entity, you would assume that they would be a business associate. And that application could also fall under the FTC Act for consumer protections.

But there are other applications on the market that are being developed, whether it is an EKG monitor, a glucose monitor, all of these new applications that you can use with your smartphone or that are a separate device that links onto your smartphone. Many of those are available in the open market and do not have a HIPAA component, but are health data and may or may not be covered under the FTC Act.

That was much of what we were talking about during the first two hearings that we held here. We had FTC come in. We had NIST come in. We had other folks come in, talking about matching data sets or data brokers who might – this information might be sold. There are no protections for that information. It is, I believe, referenced in the environmental scan.

But the FTC did a report on big data and made some very bold recommendations maybe five years ago about potential protections for consumers, like opting out and transparency, so they understood how their information might be used in these applications, and also calls to Congress to create a stronger framework of protections for big data.

BERNSTEIN: Can I just elaborate on that a little? We had a conversation, I think among us, about the NPRM from ONC, and I have heard some talk among the privacy community. In the proposal where they are promoting the idea of downloading your own data and getting access to apps that will give you access to your own data, it will then be on your phone. It is one thing to have your data in your possession on paper, but if it is on your phone, anyone with access to your phone can then get it – Google, Facebook, pharma, anyone – because it is now in your possession. You are not covered by HIPAA. That means that potentially pharmaceutical companies can get access to data that they would not now be able to get on their own, and it is possible that they could use it for drug pricing and other things that are now priorities of the secretary. It is not clear what ways that data might be used that the consumer is not particularly sophisticated about or aware of. I imagine we are going to get comments of that kind and maybe other things that I have not thought of, but maybe this group wants to think about that too.

KLOSS: Doesn’t that squarely fall into the broad floor of protections?

BERNSTEIN: If there were such protections, yes. It is possible it would be covered by FTC law. It is possible that it is unfair or deceptive under – but it is not. I am not an expert in their law and their regulation.

STEAD: I think if we can write a relatively short first part of the report that really makes clear this concept of the two worlds – not the '96 world and the 2019 world – the concept of the framework and the fact that it was applied appropriately given what society was ready to do in '96 and that it can now be applied. I think we need to get that statement made really clearly so that anybody that wants to can understand it. That changes the way we are having the conversation.

I think the step below that is this idea that we need a broad floor that begins to protect all personally identifiable information, unrelated to whether it is health, unrelated to HIPAA, and that has to be whatever society at this juncture can accept – and hopefully we can put in a couple of examples of what we think society would accept. Then to say that you would build on top of that with specific policies and laws that are related to combinations of individual user and holder – there are principles in there. That becomes – I think that sort of becomes the equivalent of the 2007. This is the way forward.

Then I think from that and maybe we can begin to get at whether some of those – I think then having set that far reaching view in place, here are some recommendations that the secretary should do tomorrow that are going to be informed by this journey we have been on, but that will be much more tuned I think to what can we do now to help covered entities and business associates be more effective in what they are trying to do. I think there are some specific things there and there are probably some first steps outside of there.

One of the first steps – one way we could come at that boundary is that when ONC is – if we are requiring that people make stuff available to an application, it seems to me that one thing we could propose is that the application needs to be labeled as to whether it is managed under the rules that would allow a covered entity to use it under HIPAA or whether it is being managed under different rules. I do not know the right word. But some way so that if some app developer is actually willing to adopt the logical principles for protecting that information, then that should become a gold star. Whether we build it into all the laws or not at that juncture while we are building – that may be somebody – I think the recommendations are going to largely come out of the original diagram we had.

They will reflect all this.

KLOSS: I would think that would give them a business edge. I see evidence of the release-of-information industry working very hard to introduce apps now for patient access to medical records, but how they do it makes a huge difference. How well do they build in authentication – two-level authentication and other things? That should be the gold standard for those apps. In fact, done right, that can be better than the current practice.

ROTHSTEIN: Following up on that, I am the co-PI of an NIH grant that is studying the role of health apps in research by unregulated researchers who are not subject to the Common Rule or any research regulations other than, theoretically, state laws in a few states. As part of our project, on September 12 in New York City we are having a workshop for health app developers, and we are going to review not only what is required of them, because that is de minimis, but the economic advantages for them of having privacy and security measures built into the apps. We are also going to be reviewing the kind of off-the-shelf technology that is available so they can download components of it – it is not going to cost them anything to do, it is the right thing to do, and it is the right thing from the business standpoint and so forth. Check with me on September 13. I will let you know how that —

KLOSS: And I think that is another intersection where the current covered entities, who work in privacy and security daily, can require certain things – they set the purchasing spec standards. If they build in those requirements – you will not connect with our EHR unless you have these eight things in place – then I think that moves the market in a capitalist way.

GOSS: I want to make sure that we are keeping ourselves aware of the work that the CARIN Alliance is doing. I am not sure if folks are aware of them. CARIN, C-A-R-I-N. A little bit about them if you do not know: it is a bipartisan, multi-sector collaborative working to advance consumer-directed exchange of health information, convened in 2016 by just a few folks we might know, such as David Blumenthal, David Brailer, Aneesh Chopra, and Mike Leavitt.

They are focused on three categories: consumers, HIPAA-covered entities and non-covered entities. Their vision is to rapidly advance the ability for consumers and their authorized caregivers to easily get, use and share their digital health information when, where and how they want to achieve their goals. They have a set of guiding principles on their website, along with strategic priorities related to trust barriers, technical barriers, adoption and policy barriers. It seems to me that there is work going on in this space and they might be an effective ally to engage.

MAYS: — two comments. One is that the problem with the apps is that we also have to adhere to the Google and App Store rules. Right now, I am spending about $1200 to redo my app because there were some things that changed. If we cannot get Google to not put certain things in the contract, then we have a problem. We also have to target Google and Android.

The second issue is I want to make sure we do not focus so much on apps that we lose something that is really becoming more prevalent right now, and that is wireless transmission. In developing actual post-surgery activities, they are sending people home but developing ways to transmit a lot of their data. Some of that data is coming directly to the physicians, some to the hospital. There are a lot of different places where the data is being used, even in terms of sending it to the researcher. We want to make sure that we also focus on the wireless development because that is the next big frontier.

KLOSS: If we were going to address the letter to the secretary today, what might we recommend in addition to what we just talked about – that we want to make sure the new requirements being stood up for apps and interoperability use gold standard privacy and security practices? Something like that. I am not writing the recommendations now, but that was the discussion we just had. What else is kind of a here and now? This is your opportunity to think about recommendations that you have for improving HIPAA today.

FRANCIS: I will try a category. One category would be what are the things to think about when individuals okay transfers of information from their health records, which they do a lot – sometimes because they have to, because they need a job or employment or something like that, sometimes because they get encouraged to do it by things like Blue Button or today's equivalent thereof. What kinds of downstream future use constraints should there be?

We are starting out with your EHR. There are a bunch of contexts in which you are either – we can anticipate some contexts in which people will authorize a transfer out of the EHR, whether it is to an employer, insurer, registry, wherever. What kinds, if any, of downstream future use protections should there be when that happens? What should those look like?

And the reason I started out with that is I think that is an area where consumers are – people are completely confused because people think if it is in the medical record —

BERNSTEIN: And you are asking about information that originates in the medical record but goes elsewhere.

FRANCIS: Right.

STRICKLAND: One of the things that I would say – I am thinking of a situation I recently had, which was switching from one PCP to another. I would want my medical records to be passed from the first PCP's health record to the new PCP's health record, and then I would want that first PCP to not be able to use my data in any way or share my data anymore. Restrict the first one from any further rights to that data once I have them transfer it.

KLOSS: Because you have changed providers.

STRICKLAND: Right.

GOSS: Does that include their ability to do the necessary reporting they are obligated under payer contracts related to quality measures and —

STRICKLAND: I would think it would just be to the extent of their need to report things on me. You cannot say they have to keep my data for ten years. Once the payers are paid and all the bills are settled and all the dust has settled, you destroy or get rid of that data. There is no need for you to hold onto that data. There is no need for you to give it to a pharmaceutical company so that they can see the wealth of drugs that I have ever been on or whatever. There should be privacy there, and I should be confident that the data from my first PCP is gone as far as they are concerned, and that my current PCP has all of the information they need in order to continue any treatment services or medication needs that I have going forward.

KLOSS: I would think that, from a life cycle management standpoint, there would be a requirement that data be retained as a business record for the statute of limitations period.

BERNSTEIN: — liability requirements – state law.

KLOSS: I do not think that could work.

PARTICIPANT: — how long they are.

KLOSS: You could maybe take the initiative and specify some restrictions on use, but as a business record, I think statute of limitations and record retention requirements would trump your wishes.

STEAD: I think what you might modify is that you should be able to request sequestration from further disclosure for purposes other than TPO.

COUSSOULE: One complexity in that kind of model is that if an individual indicates to a given provider or practice that they do not want their data shared, but the data does go, for instance, to a health plan by design to pay claims, how does that individual's request to the originating entity flow through to everybody else who might get that data for legitimate purposes?

And the reason I bring this up is that I think that level of complexity as information flows through the system is really challenging, and I am not sure it is practical. I would get back a little bit into how the data gets used if and when it gets shared and how I prevent that from happening – not just how I tell somebody not to share it and then make sure that everybody else who could conceivably be associated with that data does not share it either.

MILAM: I think there is a lot of value in further exploring Leslie's framing question, because it really is the most restrictive situation: should there be downstream constraints on future sharing of data that was previously shared with patient authorization or otherwise legally? That is my paraphrasing of it.

When I think about that situation, the only place I can think of today where we have that is with Part 2 data – 42 CFR Part 2, substance use disorder data. We have that kind of restriction that follows the data. There is a notice, a prohibition on redisclosure, that is supposed to follow the data as well. It is because that data can be so damaging – more damaging, our policymakers have determined, than any other kind of data. When you make a privacy decision, you look at what the possible risks could be given the individuals and the kinds of data, and what kind of controls you need. The decision was made in this instance that you need very strong controls. You need them to follow the data.

When you think about other kinds of scenarios, do they all warrant that level of control and resource and effort and limitation on sharing? Maybe that is a good frame for us to think about. But that is really all the way at one end. Then you have the whole continuum.

FRANCIS: We do have an employer model. For example, if an employer gets data with respect to an ADA accommodation request, there are at least requirements that it be kept segregated or there are restrictions on things that can be done with it. We have some models there.

What is not true is that we have any models that I can think of for the space where I download it into my app or where I give it to a registry.

STEAD: I would advocate that we try to include the first two recommendations that we put in the Report to Congress for the executive branch. The first one is to develop and promulgate guidance requiring covered entities and business associates to take reasonable steps to protect identifiable and de-identified health information when it is released to entities that are not covered by HIPAA or other privacy law. We list the reasonable steps.

I think the other one is to work with other federal agencies and states to develop and promulgate reasonable privacy and security guidance that applies to all individuals and entities that hold or process health information but are not subject to HIPAA. This guidance should reinforce the obligation to uphold fair information and privacy principles. It seems to me those might be two things that are reasonably constrained. It could be done quickly. It could be a baby step toward handling your left-hand column and a guidance step in the current Beyond HIPAA space.

KLOSS: Thank you. We will look back at those and include them. What about what we put up here on our continuum model – the idea that we could step up the protections beyond HIPAA on covered entities? One of them was requiring data sharing and use agreements before releasing any PHI.

And then another was specifically to strengthen risk management practices and de-identification policies when releasing data sets. I know we said earlier that we have set de-identification aside, but as part of the process, we know from prior hearings that release and forget is the state of the art on de-identified data sets.

STEAD: I think the fact that we are saying de-identification is no longer a useful safe harbor means that, at a minimum, we should issue guidance that would move people toward best practice while giving us time to let the legal and regulatory framework catch up.

FRANCIS: One of the problems is that it is not just about the initial releaser and good practices. It is what happens afterwards. There has to be something about uses, because I can say to the initial person I disclose it to: when I release this data set, you cannot use it to re-identify. But then let's say that data ends up being part of a larger set of data. It is not clear how the initial restriction is going to be able to follow it, if you see what I mean.

STEAD: At one point along our journey, we talked about having the data use or business associate agreement, as the case might be, actually require regular attestation that they are operating within the agreement – that is probably the simplest thing. It would be a start. There are obviously things that could be more invasive.

ROTHSTEIN: For the record, I would restate my support for sequestration or segmentation and also the principle of least identifiable form.

But I have another suggestion, and that concerns authorizations for limited disclosure of information. For example, suppose I am injured in a work-related accident and the workers' comp insurer wants to see exactly what was found when they examined me or in surgery or whatever, and I sign an authorization that says it is for disclosure of information regarding my broken left arm or something. It is common for the custodians of the health records to send the entire file because it is just easier, cheaper, faster, and nobody seems to care. They just press a button and everything goes. When everything goes, there is a lot of irrelevant, sensitive information in many people's records. I think it would certainly be appropriate for the secretary to begin rulemaking to tighten that up so that these legally authorized disclosures are limited to the minimum necessary and are consistent with the authorization that was signed.

BERNSTEIN: Can I ask a question about that?

The entities you are talking about, the requesters in most of the examples you have given, are various kinds of financial entities – mortgages, insurance, non-health insurance and so forth. Those are generally not entities regulated by HHS. I do not know whether the secretary has the authority to say you – a life insurance company, which is normally regulated by the states – should do —

ROTHSTEIN: I am not saying that the requester has to tailor the request even though that would be nice.

All that I am saying is that when they get a request that is limited to a certain amount of information, the covered entity should comply with that request and not release the entire file.

BERNSTEIN: But is that only when, say, it is long-term care insurance or whatever, and the life insurer requests a limited amount? Should any request have to meet certain standards – a life insurance request or a long-term care request or a mortgage request?

ROTHSTEIN: I think that eventually I would like to get to that point where the requester can – but that is down the road.

What I think can be done now is to tell the covered entities when you get a limited request, you need to honor it and not do what is fast and cheap and easy by sending everything.

KLOSS: I just had a kind of aha moment here. We are in an era where there is certainly consideration of ways to reduce the unnecessary or nonproductive burden of day-to-day HIPAA administration. All of these examples we have given are areas where we are at the intersection of HIPAA and the access and disclosure of information for various purposes and various entities.

What we could do is theme this letter as a set of access and disclosure recommendations in light of the stepped-up importance of interoperability and the changes afoot there.

PARTICIPANT: Including data provenance as a part of USCDI.

KLOSS: And then we could also make reference to this as part of our Beyond HIPAA work, but here are the near-term items, because all of these fit into access and disclosure. We can go back and pull some of our recommendations on minimum necessary that have yet to be acted on. We have that recommendation, Mark, because you were on that panel. That ties it together thematically, and then it is not a potpourri of unrelated recommendations.

FRANCIS: Do you think you ought to add reuse?

KLOSS: I think that should be part of the disclosure.

FRANCIS: Access, disclosure and reuse.

KLOSS: And data sharing agreements. I would really like to make a recommendation that we ought not to be releasing EHR information to a data registry that is not a business associate.

MILAM: What do you mean by registries exactly?

KLOSS: We did a whole paper on that.

MILAM: Are you including public health registries? I would have huge concerns. I think so many people would —

KLOSS: The disease registries.

GOSS: But not like the syndromic surveillance registries.

MILAM: Cancer registries? That would be a huge loss. I think it is still a matter of national priority and more of a priority now than ever.

KLOSS: But a lot of the cancer registries have BAAs in place. Do they not?

MILAM: I think some covered entities muscled those into place, but they are not acting – no registry in public health is acting on behalf of a hospital. They are acting on behalf of themselves and the mission they have been given in their own jurisdiction. Where those agreements are in place, they are inappropriate.

PARTICIPANT: Say that again.

MILAM: If a public health agency has a business associate agreement in place with a covered health care provider, I do not see how that is appropriate. They are not acting on behalf of the hospital or the physician.

GOSS: To add to your point, under state law, or through some kind of federal funding stream, they typically have the responsibility as a jurisdiction to receive certain information from a public health perspective. It is not the same as a BAA under HIPAA.

KLOSS: We can scope it.

FRANCIS: There are different kinds of registries. You are talking essentially about public health, but there are a lot of disease-based ones, some of which are run by drug companies.

MILAM: Maybe you want to carve out public health.

FRANCIS: That is why I keep going back to my bucket of purposes or maybe users, but not all – here is an interesting thing. Suppose a state cancer registry keeps itself in business by selling data?

MILAM: I cannot imagine that happening.

PARTICIPANT: Is there a state that you are aware of that is reselling their public health – registry data?

FRANCIS: They are not identifiable data. I do not actually know the answer to that question.

MILAM: That is a pretty significant claim that we would want to have some evidence behind before we put it on the table. Let's take a step back.

FRANCIS: I would not advocate putting that on the table, but I would advocate thinking about it – there have been examples of public health agencies that have sold de-identified data.

MILAM: There are examples of all kinds of government agencies selling de-identified data.

BERNSTEIN: They sell identified data to make money.

GOSS: De-identified data is a huge portion of state agency efforts to inform public policy across the nation.

KLOSS: We will look at that. But we did do a paper, and we did do research, and we identified that there certainly is a niche of disease registries that operate with business associate agreements and work hard to make sure that their practices conform.

MILAM: It could be that they are not – are they not state agencies?

KLOSS: No.

MILAM: They are state agencies?

KLOSS: No, they are private.

MILAM: They are private. Okay. Perhaps they are in those situations acting on behalf of a provider.

KLOSS: They could be a nonprofit entity that operates a disease registry on behalf of a professional group, for example. That is fine. That is great. There are great practices and great benefits, but some have BAAs and some do not. There is variability in practice.

MILAM: Without a grant of public health authority under HIPAA, a business associate agreement is most probably the only vehicle through which the nonprofits could legally get that information from the providers. But when you are looking at pure public health –

KLOSS: I am not looking at pure public health, but I am saying that there are those that exist without a business associate agreement, and one would think that would be a minimum.

MONSON: This is Jacki. Beyond the mandatory public health registries, most of the registries that Sutter participates in – because the provider is interested in getting data back, or for whatever the reason is – are unregulated, and I would say that we battle with every single one to try to get any kind of agreement in place. Many of them do not view themselves as a business associate, so it is very challenging.

KLOSS: That would be helpful to you.

MONSON: Absolutely.

KLOSS: We did note that in the research we did last – and most of those – they are in best practice consortiums that are recommending that kind of mechanism be in place. But that does not mean it is yet the standard of operation.

MILAM: If it is a nonprofit and they do not have that grant of public health authority, then I would imagine they would have to have a business associate agreement, or a patient authorization, to get it legally.

FRANCIS: That is how they get it. Patient authorization.

PARTICIPANT: That is totally outside of HIPAA.

SEEGER: Let's say a patient is looking to get more information, looking to participate with this nonprofit, and asks to have their information shared. Oftentimes it might not be transparent that a pharma company is involved with this registry. We are not talking about a registry that is operated by a state agency or –

PARTICIPANT: You are talking about private registries.

KLOSS: Any other suggestions? I think this makes really good sense to me, and it fits well on our Beyond HIPAA path, but these are things that do not need a lot more study, could be done now, and fit with the larger agenda of interoperability.

All right, my friends. It is quarter to five, and it has been a long day. We are 15 minutes early. Should we just keep talking or should we – I think we are ready. I was looking for a nod. Before we do that, just to note tomorrow's pickup – we can talk about dinner first, but tomorrow we are getting picked up at 7:30. Is that right? 7:45.

Agenda Item: Public Comment

HINES: We are a little bit early on public comment. If anyone would like to make a comment, we will be looking at the WebEx. You can use the WebEx broadcast dashboard or you can send an email to the address on the screen, ncvhsmail@cdc.gov. Thus far, we have not received any comments.

Linda, what I suggest we do is give people a minute if they do want to submit something. If you want to continue teeing up tomorrow morning, let people know what you will be focusing on, and in another minute we will ask one more time.

KLOSS: Tomorrow we are rolling up our sleeves and going back to this draft report to work on it. We will open the morning by taking our themes work from this morning and trying to refine it. Then we will go as far as we can working on the recommendations letter and the report.

Our goal is to have documents for the June NCVHS meeting. Tomorrow is a real roll-up-your-sleeves work session, but we are going to use all of the input from today and all of the previous work and hearings and so forth. It will be around the table with computers, and we will be in a working session. But no question, these things will get better as we do another iteration.

HINES: Before we adjourn for today, anyone have a public comment? Please send it to the WebEx dashboard or to the email on the screen.

KLOSS: While we are waiting, I want to just really thank – I did not get a chance to thank Barbara before she left. Melissa will be back tomorrow. Mark, are you heading home this evening? We really thank you. I hope you will look back and think that this was a day worth your time.

ROTHSTEIN: It certainly was and I thank you for inviting me. It is great to see my friends from over the years and feel free to call me to help in any way I can.

KLOSS: That would be great. Sallie, it is just wonderful seeing you. Are you coming to dinner tonight?

BERNSTEIN: Can we take a photo before more of you leave?

KLOSS: I am glad you are going to dinner.

Mark, are you able to go to dinner?

BERNSTEIN: Marjorie Greenberg will join us at dinner tonight.

KLOSS: Oh my gosh. How wonderful.

HINES: As far as I can tell, we do not have any public comment at this moment. For anyone in the public who would like to send something, you can do so today or tomorrow to the email address showing on the WebEx or to the WebEx dashboard. Thank you.

(Whereupon, the meeting was adjourned at 5:00 p.m.)