DEPARTMENT OF HEALTH AND HUMAN SERVICES

National Committee on Vital and Health Statistics (NCVHS)

Subcommittee on Privacy, Confidentiality, and Security

March 22, 2019

National Center for Health Statistics

3311 Toledo Road

Hyattsville, MD 20782

 

P R O C E E D I N G S (8:38 a.m.)

 

Agenda Item: Welcome

MS. KLOSS: Welcome back. I know we’ll have Alix and Deb and Melissa and Maya joining us. But I would like to call day two of our meeting to order. I’m Linda Kloss, chairman of the Privacy, Confidentiality, and Security Subcommittee, member of the full committee, and no conflicts.
MR. LANDEN: Rich Landen, member of the full committee, member of the standards subcommittee. No conflicts.
DR. STEAD: Bill Stead, Vanderbilt University, chair of the full committee, no conflicts.
DR. MAYS: Vickie Mays, University of California, Los Angeles, member of the full committee, pop, and privacy, and no conflicts.
MS. KLOSS: Do we have any members of the subcommittee or the full committee on the phone today?
MR. COUSSOULE: This is Nick Coussoule. I’m with BlueCross BlueShield of Tennessee, member of the full committee, co-chair of the standards subcommittee, member of the Privacy, Security and Confidentiality subcommittee, and I have no conflicts.

MS. MONSON: Member of the full committee, member of the subcommittee privacy, confidentiality and security, and no conflicts.
DR. FRANCIS: I’m Leslie Francis. I’m at the University of Utah, and I’m a former member of the full committee and co-chair of privacy, confidentiality and as it was then called, security. And I’m here as a guest, and I don’t have any conflicts.
MS. SEEGER: Rachel Seeger, HHS Office for Civil Rights, lead staff to the subcommittee.
MS. JACKSON: Debbie Jackson, DFO, and full committee, NCVHS.
MS. KLOSS: Thanks. I would like to ask if you would take out the agenda for today, and let’s just talk through how we are going to proceed. I have a suggestion, but I’d like some discussion about that. Our goal for today in many ways is to go back over what we did yesterday and make it better. So I think we came out of yesterday with draft one thinking, and our overarching goal for today is to come out of this with draft two thinking. We just have to move all of the chunks of our work to the next step. Is that what — is that agreeable? Okay.
When we did the agenda, we thought we would start by reviewing themes and identifying potential recommendations, and begin to draft recommendations that would go in the letter to the Secretary, one of our two products. The way we had conceived of it, we’d work on the letter this morning and then return to the report this afternoon. I’d like to suggest that we flip that. Rachel and I talked about this a little bit last night. Our rationale is that the hardest work, frankly, is the report and that we’d be best served if we used everybody’s brainpower this morning on the toughest lift.
We left with a pretty good idea of how we wanted to frame the recommendations, the recommendation letter, and I have a slide that encapsulates that, so when we get to that part, I think we can review that. But if you’re all right with it, I think we start walking back through the report territory from yesterday and do our best to bring that to draft two.
Any comments, suggestions? Makes sense? Okay. Then we’re going to work first to consolidate group A and group B, the theme work, and try to crystallize that.
We’ll just pause for a moment and continue introductions.
MS. GOSS: Good morning. Alix Goss, member of the full committee, member of the standards subcommittee. No conflicts.

MS. STRICKLAND: Debra Strickland, member of the full committee, member of the standards subcommittee, and member of pop health, no conflicts.
MS. BERNSTEIN: Maya Bernstein. I work for the Office of the Assistant Secretary for Planning and Evaluation. I’m staff to the subcommittee.
Agenda Item: Observations and Development of Draft Recommendations
MS. KLOSS: Thank you, and we have Nick and Jackie on the phone. I think the only person that’s joining us today that’s not yet in the room is Melissa, and she said that she’d be a little late.
We were reviewing the agenda, and essentially, our proposal to you is that we flip it. Our plan had been to return to working on the recommendations for the letter to the Secretary in the morning, and then go back to the report in the afternoon. When we conferred last night at dinner, after one cocktail and one glass of wine, we decided that we’d better use our good brainpower early in the morning, and return to working on the report.
I think we came away with — at least I came away with — a pretty clear vision of how the letter might come together, and I’ve got a slide on that that we’ll look at later. But the toughest lift is getting the report message correct, so I think we’re just going to start slogging through it with a goal of bringing first-draft work to second-draft level, by the time we adjourn today. Then the subcommittee will take it from there.
That means that we’re going to pull the flip charts from our early morning, we’re going to go back to yesterday morning, to our themes discussion, and I’d like us to work on consolidating the A group and the B group into a set of themes that kind of jibes with where we — this model, where we kind of ended up.
(Cross-talk.)

We’re going to look at those to make sure that the principles support the theme. And as we talk about how to consolidate, we can capture that. Then we’ll roll into the principles next.
DR. FRANCIS: Linda, I am not a member of the committee, but this is a thought, just for all of you to consider as you discuss this. One sometimes very helpful way to organize is historically, and Bill had a kind of excellent way of formulating how HIPAA was for its time, and perhaps just doing it chronologically could organize for you.

MS. KLOSS: Leslie has suggested that it would be useful, because we’re going to have to get to that at any rate, to tell the HIPAA was right for its time story, and then the 2019 world story. We could organize chronologically.
Go a little further with that, Leslie.

DR. FRANCIS: Well, I think Bill actually could do that. I thought he had a lovely way of doing it yesterday, but one way — the way I think about it is, a critical theme is just how the world has changed. What did the world look like then? What are some of the major ways that it’s changed? All the distinctions that no longer hold, or the ways that different buckets have changed. But I’ll leave that to you all.
MS. KLOSS: We said that when we were going to start out with the themes, the first thing we were going to do is pick up the description of the patchwork from the report to Congress, and pick up those words and revisit that.
DR. STEAD: That’s what we said before. Melissa at least began to break through to me that the importance of the fact that HIPAA reflected the fair information practices of 1996, the societal values of 1996, the ethical frameworks of 1996, and it then developed law and regulation that worked with those. Where she was trying to push us, which really to me seemed like a breakthrough, but I’m not sure that you and I have — my sense is we’ve not yet connected, and I don’t want to overdrive this thing, I’m just trying to re-walk it through.
Then in 2019 — the other thing in that thing could be technology or whatever — but then in 2019, there have been advances in each of those areas, and along that journey, a patchwork has developed. I think the advantage to getting back that way is it really does present the way that you accommodate that ecosystem in law and regulation, and that what we now need to do is, with the updated view of each of them, figure out the right way to update the law and regulation.
That, to me, was a helpful way of thinking through it. Then, with those two lenses, you sort of get to the patchwork. You don’t start with the patchwork. Be that as it may.
Rachel’s got her card up.

MS. SEEGER: So, I do think it’s important for us to cast where we were and where we are today. HIPAA, the privacy and security rules were the backbone to provide assurances to the transaction and code set standards and the hope that we were going to move from paper to electronic, right? We were moving from the HCFA 1500 to the 837, et al, and if you recall Gramm-Leach-Bliley was happening at the same time that we were envisioning HIPAA, where we were moving the financial transactions from a time when we used passbook — we all had a paper book that we would go to the bank with, and really there was a time when we couldn’t even do interstate banking.
So Gramm-Leach-Bliley set up a matrix for financial institutions to move forward, and now we’re able to use our ATMs in other countries and get an immediate exchange of information. Unfortunately the healthcare industry has not kept pace with that. It’s been much more complex, and we don’t have interoperability of systems yet.
Here comes Alix.

MS. GOSS: And very diverse incentives and motives between the financial industry and healthcare. But keep going.
MS. SEEGER: Right. So, looking at the slide that we have up, the themes discussion — on one hand, on one side, it says there have been no major policy breakthroughs in the last decade. What’s missing here is the omnibus rule. We do have HITECH up there. In the context of the privacy, security, and now breach notification requirements, HITECH is missing here, and that really was a major policy breakthrough, but we have this whole unregulated space of health information that this subcommittee has been noodling on for the past two years.
Where I’m going with this — is this report going to be — we just had an RFI that OCR released on modifications to HIPAA. ONC and CMS have a proposed rule out right now that addresses a number of things that we’re kind of talking about in this ecosystem, as well as interoperability. Is the intent for us to provide the Secretary with recommendations on how to transform HIPAA?
Or is it to address this unregulated, beyond HIPAA, space where — HIPAA-covered entities and business associates maybe are taken care of. But we have this other space of emerging tech developers and others who, when they’re playing in HIPAA, they have to play in HIPAA, and when they’re playing in FTC space, they have to play in FTC space. But a lot of them don’t know what space is which, and we have emerging laws like in California, which are much more sweeping than HIPAA’s little space.
I’m just trying to kind of set where we’re at before we kind of move into themes. I think we can easily write where we’ve been in the past 22 years. But I think what we need the brain trust to work on is where we’re going in the next 20 years and try to envision that pathway that Bill was talking about. But HIPAA was designed to be technology-neutral and to be flexible and scalable, all of these things that are so eloquently stated in the security rule by our colleagues at CMS. Really, it was trying to envision back in 1999, when many of us started to work on this rule, when it was a twinkle in Kennedy and Kassebaum’s eye, and moving forward into regulatory space, it was really meant to be this agile framework for technology, and not specific.
MS. KLOSS: So, I agree that I think our goal is to describe what the framework should be for going forward and for this area that’s not protected by HIPAA. But the only reason to go back to HIPAA is to draw the lessons forward, that this was the situation at the time, and I think that we could use those levers, the fair information practices, societal norms, and ethics, the things that we talked about yesterday, and compare and contrast. I don’t think we’re revisiting HIPAA. Is there consensus on that?
Nods around the table.

DR. FRANCIS: Could I just say — obviously, you don’t want your report to revisit HIPAA, but I think there’s a question that you have to have about whether the current structure of HIPAA is always a good fit, or not a good fit, for how to deal with non-HIPAA-covered entities. So whether there might need to be some looking back at HIPAA or what the implications are of the current HIPAA structure for how you can deal with non-HIPAA-covered entities. You can’t forget it, but yeah, if other people are playing in that space, why bother right now?

MS. BERNSTEIN: Sorry, I am distracted. I have Melissa on the phone, who is not up to coming today, but I suggested maybe she could call in, and she may do that.
MS. KLOSS: Okay, I am feeling a little stuck.

Help me, guys.

DR. STEAD: I think we are a little stuck, and I think maybe, from a process point of view, you ought to sketch out where you think we are, what you think we ought to do, and maybe we can edit from it. We’re not — I came out of the day yesterday at a very different place than I came in, and I’m a little bit afraid from this conversation that this is normal Bill Stead, sort of figuring out where the next mountain range is going to be, and I can sort of see how that actually might make a clearer journey.
But I don’t want to take us down that rabbit hole. I’m sensing you and Rachel are much closer to where we came in, and I’m good with any — but I think we would be really helped if as a group we somehow got something, we figured out a process, that got something we can look at and edit, rather than continuing to explore. This, today, is a close-down day, not an open-up day, from a process point of view.
MS. SEEGER: I think it would be helpful to go back to where Leslie was suggesting, that HIPAA might not be the right structure for the non-HIPAA space. Maybe —

I’m just throwing this out there to all of you to get some discussion going — what works in HIPAA as a construct and kind of going back to, Leslie went through the permitted uses yesterday. We had a discussion on that. There’s TPO. There’s consent, which we talked about. It’s up here, trying to go back to some of these pieces that are up here.
We had a discussion, at least in our group, about some risks for consumers in disclosures of genetic information to potential underwriting, insurance, mortgage, long-term care insurance, and others, for relatives of individuals. So are there things that work in HIPAA? And what doesn’t work in HIPAA?
MS. GOSS: When you ask that question, Rachel, are you talking about like the specific obligations of activities under HIPAA, or more of the philosophical guideline, framework, of trying to make it scalable to organizations? If I think about, not just mobile health, but I think about all these downstream uses of data once it’s — after it’s been captured in an EHR, or shared among covered entities, it’s sort of beyond that circle of activities is what we’ve really been focused on. And trying to figure out how we can protect citizens’ information and ensure that they’re educated on how it’s being used, they have an opportunity to weigh in on it.

So we’ve got all these folks that are kind of bubbling up in the ecosystem that are not technically a covered entity, so what does that mean to us? If we look at HIPAA as what can we take and extend out, because we’re not going to change the law. We could possibly ask to change the regs. But your point about the omnibus was a pretty major upgrade and expanded to successfully address the new health information exchange ecosystem and the protections under breaches, et cetera, and strengthened the business associate requirements. But we’ve never gone back to the crux of, you have to play.
So if we go to this idea of what works with HIPAA, what are the permitted uses — what I’m trying to do is tease out a little bit more to make sure I’m getting your point. That maybe then we can say, these are the frameworks or aspects, attributes, that we need to carry forward in trying to address the beyond-HIPAA scope, to get to that better future state.
MS. SEEGER: If we were to brainstorm around what those attributes are, you talked a little about the philosophical — scalable, flexible, technology-neutral, protective, consumer rights. I’m just trying to think big about what those philosophical pieces are. Are those important?

MS. GOSS: I heard them popping up a lot yesterday. What are you thinking, Linda?
MS. KLOSS: I am just wondering if we might not — maybe I was too linear in thinking it would be useful to start again with themes and go to principles. So maybe we ought to just go to where we were with the model at the end of the day, and talk, and try to flesh that out as much as we can. It might be helpful to put up the model from the subcommittee on the slide and try to see whether there are some things in there that help us fill this out. Because if we’re going to write a report, it’s got to have — the crux of it is going to be laying out this model. I think we can do the what doesn’t work anymore and what needs to be changed stuff. I think we have enough of that background stuff to write.
But what is it we’re proposing? We always envisioned this as kind of that diagram of the 21st century public health system, the overlapping concentric circles.
I think we just need to look at that area outside of HIPAA, to set HIPAA aside for now. Look at beyond HIPAA, and as we added yesterday, even beyond health, as it relates to data.
I think yesterday we scrapped this, and we said it’s not — Melissa? Good morning.

MS. GOLDSTEIN: I’m sorry I can’t make it in person, but I’ll be on the phone. I am Melissa Goldstein, I’m a professor at George Washington University and the Milken Institute School of Public Health.
MS. KLOSS: Thank you for joining us, and we wish you were in the room with us. We’re struggling to kind of get cranked up here this morning, or at least maybe I am. We’ve decided that we’re going to focus on the model that we were working on yesterday at the close of the day and try to get that to the point that we’re comfortable that we’re reflecting the essential controls for the beyond-HIPAA space, if I can phrase the challenge.
I guess the best thing is to get in your mind’s eye that flipchart where we were building the planks. On the webinar is the model that we came into this meeting with, that’s the work product of our subcommittee. You’ll recall the public-private continuum, and we’ve changed the title, as you’ll recall yesterday. It’s uses, use-oriented, and our model accounts for individual uses and holders of data with a broad floor of protections for personally identifiable information, health and beyond, because we know the definition of health information needs to be broadened when we’re in the beyond-HIPAA space. So that’s kind of one of those —

PARTICIPANT: Do you want me to start to build this on a slide?
MS. KLOSS: Yes, that would be great. Is there anything more we want to say about this? Are we really saying here that any model that’s going to work in beyond-HIPAA has to look beyond health? Because we can no longer cut neatly into this is the medical record and this is everything else?
MS. GOSS: Let me just clarify, when you say beyond health, we’re really thinking about beyond medical care, and we’re thinking about all the socioeconomic, the epi — what was the great word that Melissa used yesterday? Epigenetics.
DR. STEAD: I think what we are saying is that in today’s world, defining health information is actually considerably harder than it was in 1996, and we couldn’t do it in 1996 and we can’t do it in 2019, and therefore, when you’re looking at beyond HIPAA, you’re really talking about individually identifiable information. Period, stop, end of message. No matter, without attempting to narrow it to health.

MS. GOSS: You use the phrase, individually identifiable information —

MR. COUSSOULE: I’m in total agreement with that.

I think the challenge is if we frame it up as kind of adjustments to the law, I think that’s the wrong way to look at it. If we frame it up as individual information that we call health information, it has broadened and its uses have changed to the extent that what the law was originally trying to address — we could argue how well it did or didn’t, but it doesn’t even apply to a lot of the things today that you would consider health information or PII.
MS. KLOSS: Okay. I think that is a very important consensus point. We could further say that even if we could define it for 2019, the definition will be irrelevant by 2025.
MS. GOSS: We need to be careful on the individually identifiable information, because that, to me, triggers me back to certain regulations. I get it; I understand the challenges in defining it.
MS. KLOSS: Personally identifiable information.

This is the foundation of data protection, a data protection approach, isn’t it?
MS. GOSS: I think what I was trying to get at was the aspect of, I may have medical information that’s individually identifiable, but there are other kinds of community-based information that context — affect my health. I was trying to link those concepts together, because that may not be individually identifiable, because it’s my zip code, yes, but it’s also the air quality in my zip code. So if we’re thinking about the kinds of data that’s going to be transported by noncovered entities in trying to produce increased outcomes of an individual, we’ve got to factor that in, and I was trying to figure out how that fits in with the definition that you were using.
MS. KLOSS: That is where we jump to uses, I believe. Because, remember, we had that plank for —
PARTICIPANT: Let me see if I can help Alix or not, I don’t know.
MR. COUSSOULE: There is a distinction between generally public information applied to any individual who might be part of it versus something specific to an individual. So I get that there’s things like socioeconomic factors that may have applicability to whichever individual happens to belong to that either group or circumstance, but that’s different than something very specific to me.
DR. STEAD: The way I would think about it is to the degree air quality relates to a geographic location, and that is not personally identifiable information. The association of Alix with a particular geographic location is personally identifiable information. What you then use that to connect to is different. The fundamental principle is that — which I think the environmental scan made clear, is there is no useful definition of health information. So if we’re trying to put anything in as a foundation, it’s really related to things which can be, that are identified with an individual.
MS. GOSS: Nick and Bill, that helps. The broad floor is truly the individual. Thank you.
MS. KLOSS: The other assumption, remember, that we made quite strongly is that that plank of differentiating between individually identifiable data and deidentified data is no longer useful. So we took that out. And that we move from individually — the way we conceive of the buckets is by uses.
DR. STEAD: Those are two big somethings, whether we call them principles or findings — maybe they’re findings. Maybe those are two findings, I don’t know.
MS. KLOSS: We can’t define health information, then you can’t deidentify it. You can’t define it, you can’t deidentify it.
Okay. That’s important. Rich?

MR. LANDEN: I am still struggling a little bit with the scope. HIPAA, I understand. Health data, I understand that there is no understanding of that. But where I’m struggling is FIPs, the Federal Trade Commission’s fair information practices, and once we get totally outside of HIPAA, as my understanding from yesterday is FIPs is kind of the rules of the road, so what are we trying to do relative to FIPs? The question is, as we leave the center of HIPAA and go more and more outside, don’t we just transition to FIPs? So where is our, where’s the boundary here? Or is there none?
MS. BERNSTEIN: Forgive me, I can’t help, since you said that they’re the Federal Trade Commission’s, to point out that the Code of Fair Information Practice started at HEW in 1973, and they’re just the thing on which the Privacy Act was based, the thing on which — it was an advisory committee like us, a very, very influential one.
They have been the basis of all kinds of privacy law for many years, universally. The Europeans — the Privacy Act embodied all of them in some way or another, a comprehensive law, over federal data.
And then the OECD took them up later. The Europeans would say we kind of didn’t do anything with them for a while. And now they are those things that form the foundation of many different privacy laws, including HIPAA, although HIPAA doesn’t incorporate all of them. I don’t want to give the idea that it’s just the Federal Trade Commission.
MS. KLOSS: The jurisdiction of the Federal Trade Commission is fairly narrow.

MS. BERNSTEIN: In some way it’s narrow, and in some way it’s very broad. That is, it sort of covers everything that’s not specifically assigned to someone else in the commercial arena, and so they don’t do banking exactly, because that’s regulated by banking. They don’t do health except they do everything that’s not inside HIPAA. They do anything that’s an unfair or deceptive trade practice, and in the last 10, 15 years, I guess you’d say, have expanded their jurisdiction to look at the arena of privacy and security as something where you might have an unfair or deceptive trade practice. Those are two distinct things — what’s unfair and what’s deceptive. You can be very straightforward about your policy and tell people what it is, but still be —
MR. LANDEN: Part of where I’m going with that is a lot of what we’re talking about is app developers, it’s the genetic analysis things, a lot of the data flows that we’re talking about do fall within the FTC scope.
MS. BERNSTEIN: That’s right. Because they’re not specifically regulated by someone else, the only sort of thing left is whether they’re out there in the market with something that’s unfair or deceptive under the authorities of the FTC.

MS. KLOSS: But taking the unfair or deceptive as their jurisdiction doesn’t inform this universe about what are the practices, the ethics, the application of —
MS. BERNSTEIN: One good example to think about is if you have some kind of commercial product that collects individually identifiable information, and you very straightforwardly state that your policy is we don’t really care about security of that data, that’s not deceptive, but it might be unfair. Right? So there are things out there that we care about that are in the realm of what they’re doing that — what am I trying to say?
MS. KLOSS: I guess what I was trying to say is because there is that jurisdiction doesn’t mean that the beyond-HIPAA space is protected. That’s what I was getting at. It plays a part —
MS. BERNSTEIN: No, that’s right. And also, they have a very particular type of jurisdiction, which is not like what we have at HHS. HHS is an agency that can regulate, it can issue regulations. Sort of, we call that quasi-legislative. Whereas the FTC takes individual cases. They can only take a particular case when a thing actually arises in the market, and look and see if that case, if they can take an action against a particular player, and their power comes from making public what happened to that case afterwards so that others in the market will not behave the same way. But they have limited resources, and they can only take a certain number of cases, and so forth. They don’t regulate. They can’t, across the board, touch a whole piece of market.
DR. FRANCIS: Another point to bear in mind is that some of the actors you’re thinking about are nonprofits, who probably don’t play in the FTC space, although that — a big open question.
MS. KLOSS: That jurisdiction is kind of part of the patchwork, but it’s not a comprehensive —
MS. MONSON: I think a way for us to think about scope and circumscribe the actions of this committee to the area of privacy if that is what we should do, which I was getting the feeling that that is the purpose, is to focus on the idea of privacy impact assessments and risk assessments, which I think the government, at least the OMB, was trying to do. I’m not sure what is happening now, but because one of the phraseologies that they were using is health in all things, the recognition that housing and transportation and all of these things affect people.
So if we can somehow think about the idea of what is this particular collection of this data holder, what is the risk to a person in the area of privacy, then what do we do about it there, when we’re focusing on the uses? So, we’re focusing on allowable uses, and uses that the person was transparent, or the collector was transparent about, and then what are the actual risks or harms that might come about, potential harms, and how do we focus on narrowing that and protecting individuals from those harms? Does that make sense?
MS. KLOSS: It does. Our emerging model that we wrote up yesterday and are building on the screen now included the need for mechanisms to assess potential harm, and you called it risk assessment model? Is that what you were suggesting?
MS. MONSON: Yes, basically. I think so. Because we can’t — it’s not the job of this committee to protect everyone from everything. I’m not sure — I don’t know whose job it is. But in terms of circumscribing this particular scope, so that we’re not overwhelmed by it, maybe that’s a good focus.
MS. KLOSS: And that’s tied, as we were thinking about it yesterday, the risk assessment model and understanding of potential harms, is tied to the uses. It might be logical, as we develop this model, to go back to this use bucket and kind of list out those chunks of uses. And we did that work yesterday, actually, we have, we went through the list of allowable exceptions to HIPAA. That whole set of uses. But I’m sure we need to expand on that for the commercial and the private space. That wasn’t covered in HIPAA. Does that make sense? That this is what we need to blow up a little bit.
MS. GOSS: Since we’ve talked to a broad floor, we can now move up to the uses, from looking at the screen while we’re in graphic design class. It kind of gives us also a process step if we kind of got consensus around what broad floor was really dealing with, and go to uses, and then we can go to enforcements, and then redress. Maybe that starts to give you all of the pieces.
MS. KLOSS: Right. So can we find a logical way to create a description of uses? We can start with the list that we pulled up yesterday, which were the allowable exceptions to HIPAA. Public health, law enforcement, but marketing was in the context of the covered entities’ use of data.
DR. FRANCIS: What they were were the buckets of how HIPAA distinguishes among uses. Some of them were permitted, some of them were prohibited. The way I meant to raise those, the question is whether for — since those are buckets that exist in HIPAA, how do they look when you move them out of HIPAA? Is there a bucket for emergency uses, say, that looks different than a bucket for marketing, whatever?
MS. KLOSS: Maybe we could put the uses over on the right there.

MS. GOSS: I think she’s still trying to put all the flipchart paper together before —
DR. STEAD: While she’s doing that, I think maybe a third finding here, in the list of findings, is that use of — set the stage for this — are that uses are the key way to structure whatever we build above the broad floor. That it’s uses, not the type of information. Since we can’t define health information, the key way to manage the risk framework is around uses. I don’t know if it will help you as you play with your other things.
Then possible narrowing or subdivision of a use by either holder and/or individual, type of individual, that those fit, that they are ways you can constrain scope or have more than one set of answers for a use. Because are the restrictions on a use different, depending on the holder and the individual, or not? I think that’s the way those use — individuals’ uses and holders’ work together in your framework.
MS. KLOSS: Can we take an example and just talk through that? That relationship? Let’s take public health.

MS. GOSS: Okay. So when we’re talking about public health, it’s a pretty broad bucket to me, spanning from the epidemiology, syndromic surveillance aspect, to immunizations, to — I’m trying to think about what it is — or are we talking about public health from the perspective of obligations around electronic health record-related information? I know that takes me to a very HIPAA place, but we’re talking about what happens outside of it. I do think also public health has the added conundrum of certain exemptions in states.
MS. BERNSTEIN: I think it is too complicated to start with.
MS. GOSS: What would you propose, maybe?
MS. BERNSTEIN: Law enforcement.
MS. GOSS: Law enforcement. What aspect of law enforcement? Department of corrections? Inmate care?
MS. BERNSTEIN: No. So, we have an exception in HIPAA for disclosures that are allowable to law enforcement when they meet certain criteria. Pretty broad. Because we understand the need for law enforcement to do certain things. In the private sector, I think there are some — I’m not sure I can think about this out loud in the right way — when law enforcement comes knocking on your door and says we want records because we’re investigating a crime, what happens at a hospital versus what happens at a pharmaceutical company, or some other noncovered entity?
Some bank, or your telecom company, or your phone company. The response to that might be very different.

MS. GOSS: So, the ability to get publicly available data and do discovery, connections that enable you to actually then go —
PARTICIPANT: (Off mic.)

MS. BERNSTEIN: Right. So they are investigating a crime —
MS. MONSON: One of the things, I think, related to law enforcement — how it’s evolved and how it’s changed — that we might want to contemplate, for example, is that now police officers have bodycams that they wear. So they often will walk into our hospital; they don’t have a warrant, they don’t have a court-ordered requirement to have that.
They’re told by their police departments to wear that camera, and now all of a sudden they have a camera, and they’re taking pictures of our patients, recording them, along with whatever their regular business is.
From a safety standpoint they feel they have to keep that camera on while they’re in our facility, because they’re working, and the balance from a healthcare provider and patient privacy standpoint is the concern around it.
So I think just another example of how things have evolved a lot since HIPAA.
PARTICIPANT: That’s a great example.

MS. STRICKLAND: That is one example that we can probably dig into, and also the whole iPhone thing, where someone committed a crime and they went into the history of a phone and dug for records from that perspective. Now that information on an individual was accessed by law enforcement.
MS. BERNSTEIN: Or the recent stories about going to the recreational genetic market — Ancestry, or 23andMe, or whatever it is, to find information in order to find your fifth cousin who committed some crime, is different than the way that we think about genetic information in a healthcare context.
MS. STRICKLAND: Right. So, I think, to that point, a lot of the people who did submit their genetics to these things had no idea that the rest of their family was going to be checked out as well. So there was either not that disclosure, or that disclosure was so fuzzy that they were not aware of the gravity of that situation, so now Jack the Ripper and whoever else — all of a sudden, we’re going to have all these people. But who would have thought that they were going to take your piece of genetics and run it through 20, 30 years’ worth of data? Who thought —
MS. BERNSTEIN: It is the kind of thing FTC might think about, maybe. Is this unfair not to be — not to make a very clear disclosure and have people understand, because it’s a commercial product now.
MS. STRICKLAND: I think people know now.

MS. GOSS: Isn’t another example like with the sperm banks and people kind of wanting to find out who their dad was, and then that woman, the cease and desist. I feel like that might be another example.
MS. KLOSS: Could we think about uses — again, because we’re trying to make this distinction — we could put the allowable uses under HIPAA in one column and illustrate a couple of these examples that even those that were pretty straightforward, I mean, they were the product of a lot of compromise, but at the time, they seemed like a logical reasonable — but even the circumstances relating to those have changed.
But the whole chunk that really stands out there as the big elephant is the beyond uses — all of the commercial uses — and I think that we kind of have to get to that. So maybe we say, yeah, we know what all those allowable exceptions are. We can list them and give a couple of examples of how even what seemed pretty straightforward is no longer straightforward. Jackie’s example is great.
And then try to capture a way of chunking uses that are just not in the HIPAA world. Let’s go there, because I think that’s what our greater challenge is.
So how do we —

PARTICIPANT: Do you have an example?

MS. KLOSS: Well, I do think that the genetics —

MS. SEEGER: The use cases, I think, the exemplars that we were talking about, genetics is certainly one that we really didn’t get into and explore. But registries, applications, we talked about geofencing a little bit.
PARTICIPANT: Can you remind us what we said about that in a couple of sentences, about geofencing?
MS. SEEGER: So this was a news article that we shared with the subcommittee. Some of you may not have received this. But the example was there is an actual business model where folks are using a text-based application outside of hospital emergency rooms. So they’re seeing people who are texting and whatnot who are in an emergency room. It’s like ambulance chasing. And they are hitting people up with texts who are in the emergency room about insurance and legal services and other types of services, using geolocation of people who are in the ER, in a waiting room.
So I will tell you that there have been plenty of examples within the Medicare space when Medicare Part D was rolling out, back in the dark ages, before this type of technology was around, where certain health plans were staking out outside of Social Security offices to try to enroll Medicare beneficiaries who had just come on to Medicare to market them on Medicare Part D. We really did get telephone calls at CMS from SSA to try to shut this behavior down.
So what you’re seeing now is the same type of business model, but using technology to market services to individuals who really are in a vulnerable position in the hospital, using geofencing.
MS. KLOSS: So here are some of the things we talked about in our subcommittee: apps — we didn’t talk about genetics. We talked about registries. We talked about the geofencing technologies, the Internet of Things — we had testimony on that — and just general product marketing using social media or other techniques for targeted marketing.
Are there other things here that are big use buckets that —
MS. BERNSTEIN: I’m reminded of the example of the Target consumer whose father was upset to find his daughter receiving marketing materials for babies and pregnancy and whatever, and he went and screamed bloody murder until he discovered, you know, through some analytics on purchasing, Target could figure out that his daughter was pregnant before he could, and it turns out she was in fact pregnant. So the kind of analytics just on retail business, the combination of what you buy, how much lotion you’re buying and whether or how much Mylanta you’re buying, the inferences people can make about retail purchases.

MS. KLOSS: Maybe retail analytics is a better way to —

MS. STRICKLAND: Cookie tracking type stuff, like when you enter into apps and they say I’m going to track your cookies. So everything you do on that, so if you’re on Amazon and you go and look for something and the next pages you go on, forget it, that’s all that’s advertised on that page is the last five things you looked at, and that’s what exactly happens with probably that Target situation.
So it’s really tracking your internet use through cookies or various means, and that ends up being —
MS. BERNSTEIN: I think those are two different things, could be related, but I think those are two different things. What I’m talking about is inferences people make by the combination of products that you buy at your grocery store, how many cigarettes you buy, how much meat you eat so they can tell something about your diet, how much Mylanta you’re consuming every month for example, as opposed to I went to purchase a particular thing and now all I see is ads for that particular thing until I actually buy it.
MS. STRICKLAND: So when you went to Target, you used your Target card that said this is Deb and she’s going to buy these things. So loyalty cards versus cookies — those are two different things. I mean, the loyalty card is hooked to your purchase. When this first started, people didn’t expect these ads to keep popping up, and people would be like, hey, isn’t it funny that this creepy thing is now showing me ads for the things I just looked up.
So people didn’t expect that, but now people do, because —
MS. BERNSTEIN: I think it is more — it’s similar. I think it’s more like when Amazon says I see that you’ve read these 10 books, you might like this other book, based on your likes or dislikes. Your preferences, right? I can do some analytics and decide you’re a type of person that likes mystery reads or magazines or — but now that Amazon sells everything, not just books, we can do analytics —
MS. SEEGER: We talked about this a bit during the reidentification and deidentification hearing that the subcommittee had, and we shared some work that was done by Bloomberg on a practice that’s called match backs, where organizations buy datasets from data brokers of deidentified health information, and buy datasets of loyalty cards, and they’re able to reidentify datasets, mostly in pharmaceutical information, and then sell those datasets back to pharmaceutical companies.
So typically the deidentified dataset, if I remember the article correctly, is from a PBM, a pharmacy benefit manager. These lists of reidentified information are being sold on the marketplace, and they even include people who have run in a charity race for diabetes nonprofits, where those individuals will receive marketing from pharmaceutical companies of diabetes supplies or pharmaceuticals, based on their experiences.
MS. BERNSTEIN: So I realize I might be the one that talked about — we were talking about marketing. I tend to think of marketing as not a particular harm to the average person. You can ignore it, you can turn it off, whatever, if the ad appears on your screen. You can either block your ads or just ignore it.
There is a small population for whom that might be a harm who are not competent to take care of marketing in that way. But I’m also, harking back to your imploring us to think about the kinds of harms that maybe some of the ones that Mark Rothstein was talking about yesterday are more salient harms, whether I can — things that affect, really affect my day-to-day life in an important way, like whether I can get certain kinds of — or Melissa was talking about things that you need to get your life together.
If you’re a single mom with kids, you need life insurance and that’s not — it’s sort of not optional, and if you can’t get it because of this kind of stuff, that’s a real problem, much more than the kind of harm that we get from being annoyed by marketing, in a typical case, or long-term care insurance or a mortgage or a car loan or things that — decisions about us that really affect our choices and options and abilities, whatever.
I mean, I think there’s a certain way in which marketing also could rise to that level, if you are just not presented with opportunities that some populations are. So many of us have heard from Latanya Sweeney before, and she talks about work that she’s done where you go out for a Google search, and if they see that your name is Latanya, and that that’s a typically African American name, you get different kinds of things coming up on your screen than you get if your name is, you know, something that’s a typically Caucasian name.
That could be a problem, too, that you just don’t get educational opportunities or opportunities for types of loans or opportunities for cheaper rates on things or whatever it is. So I’m trying to sort of characterize the kinds of levels of harm, maybe, and whether we can focus our attention on the most egregious ones and then another level of them. I don’t know if that’s helpful at all.
MS. KLOSS: I don’t know yet either. I think we’re just talking. I think it would be useful to take one of these and explore this relationship between — so in this beyond HIPAA enhancement of privacy, what’s this intersection between the individual, the use, and the holder? So maybe we take registries, because we worked at that more than any other of the uses. Currently the individuals, they have — let’s take this as a disease registry, not a registry that an individual self-enrolls in, but one for which their data is released without their knowledge, and it’s used as a registry for analytical purposes, but it’s also maybe redisclosed for marketing purposes, or combined with other datasets for marketing purposes. That kind of brings the three dimensions, the three triangles together.
DR. FRANCIS: You didn’t do anything with wellness programs, did you? Because that is another example of where — one thing that could — I don’t think you want, I’m an outsider, so take this with a grain of salt, but I don’t think you want to take on the whole world of the ways that data are being used, but one piece of it — there’s the uses, there’s the data holder, either now or originally, and then there’s the consumer. Those were the three pieces, and one of the things that is interesting about registries is that the data come from HIPAA land into non-HIPAA land. So they come from a HIPAA holder to a non-HIPAA holder.
Wellness programs look like that, too, and then they get combined with other kinds of things and could be used in ways that even if they don’t go back to you to send to you a particular diabetes medication pitch, they could end up with the kind of analytics that mean that the employer starts managing the workforce differently so all the employees with diabetes get disadvantaged, like we know employers did start managing workplace subgroups differently when they got the analytics about numbers of employees going off birth control.
MS. KLOSS: I just had a flash, let me throw it out, see if it’s a worthy flash to go down. One of our assumptions — and it may be a finding — but one of our assumptions was that a logical starting point, because like Leslie said, you can’t put our arms around the whole universe of all the ways data are being generated or used, is to look at this intersection and work at strengthening those kinds of protections.
So just think about this as two kind of recommendations that might come out of this, that here’s this universe of uses that are based on information that’s released out of the HIPAA world. Some releases are handled well, and some are handled poorly. But it’s that data.
And then for the broader universe of uses, this whole set of things that are so rapidly changing that you may focus on the harm, an assessment of harms. That’s how you’d set priorities for the whole range of uses that aren’t those coming out of, at the intersection.
Are you tracking with me? Does that make any sense? So then we are getting back to some of the controls in GINA for example, nondiscrimination or preventing discrimination, those things that are employment risk, insurance risk, things that really impact one’s ability to prosper in life.
MS. GOSS: Is that another way to segue to the box of enforcement on the model? Or is there a step between uses and enforcement — are we missing this harms flash that you’re having?
MS. KLOSS: I think it is one way to cut the uses, to categorize the uses.
MS. GOSS: For those on the phone, to take the finger — so Linda, you were pointing at the potential harm aspect in the box on the left-hand side. So it’s maybe about evolving that a little bit more, because the red boxes are sort of the influencers or attributes or the aspects of the core framework in the middle.

MS. KLOSS: Yeah, the risk and harm assessment. I mean, we can — that’s why even in the California data protection law, they just take health information and say HIPAA still is in effect. It’s kind of a de facto thing, this direct identifiable patient information is high risk, and it’s over here and it has this whole set of mechanisms. Might that work as a way to tease out some recommendations that are more specific?
MS. GOSS: It sort of feels to me like it’s, you know, this idea of the broader-based uses that we might define, looking at the risk aspects and the potential harm aspects becomes another sort of bucket logically. It does build.
MS. KLOSS: Each has a different risk profile. So how do we get from — how do you get from uses to needing broader enforcement, or is this where Rich’s comment comes in? Well, we have enforcement for the beyond HIPAA space when specific circumstances or specific cases can be brought to the Federal Trade Commission, but those protections are maybe few and far between. What gets us more toward universality for high-risk things?

Rachel?

MS. SEEGER: So we have this construct, right, but there’s nothing within the landscape that requires policies, procedures, a framework for data protection in — we have some legal models, but there is this space that is slightly unregulated. We’ve talked a little bit, too, about medical devices. We have apps here, and maybe we can also add devices.
Some of these are approved by the FDA but don’t fall under HIPAA, and many are manufactured outside of the United States. When these products are conceived in R&D — our friends at NIST have told us, at the hearings that we had on beyond HIPAA, that, for instance, in the work they’re doing on infusion pumps, privacy and security come after the thing has moved to market and gone through basically FDA approval. Then they think about security and they slap on Norton, and many of these devices are connecting to the network and really don’t have ample security baked into them.
MS. BERNSTEIN: Data protection security, because what they talk about as security is not the security that we’re talking about, right? They care about safety of the patient’s medical care, but they don’t care about safety of the patient’s data.
MS. SEEGER: For FDA approval, right. That’s what they need. But cyber, for instance, is not — cybersecurity assurances are not a requirement.

MS. BERNSTEIN: Except if somebody can hack in and change the dose of something.
MS. SEEGER: Well, then it becomes HIPAA, if it’s in a medical setting.
MS. BERNSTEIN: But the manufacturer doesn’t — isn’t covered by HIPAA and isn’t required to — unless it affects patient safety as opposed to data safety, so I am totally agreeing with you. I just want to — when you use the word security, in their context they talk about security in a different way than we do. And they’re not regulating that.
MS. KLOSS: So where we’re going with this is that, in addition to — if I’m hearing you right — risk and harm analysis mechanisms, we need the policy and technology mechanisms.
MS. SEEGER: That is where the framework comes in, and that’s one of the things we had been discussing, what would our recommendations be for a framework? Policies and procedures? Way back when we had the hearing on deidentification, we had Micah Altman who gave us this kind of brilliant flow for lifecycle management of data.
MS. KLOSS: But that is here, right? Here as in obligations of holders, some kind of baseline set of obligations for holders.

MS. SEEGER: I think coming down, out of that — we have those two triangles in the middle, center, and on the right, uses and holders. Coming down out of that is where you want to start to make recommendations for a framework for assessing risk of harm, assessing threats, ensuring that individuals would be protected from discrimination, from reuse of data in a way that individuals haven’t — but there’s no transparency for.
MS. BERNSTEIN: So some of the things you said, I was thinking that we should try to tease out what kind of harms we actually care about, to be more concrete about the conversation, and some of the things you said are definitely harms we care about. Discrimination, risk of physical harm, maybe financial harm, some kind of threats. But some things are not a statement about harm, like the use of data in a way that is not transparent isn’t by itself a harm. It might lead to a harm that you might not be able to detect.
Right, the reason we care about that is because we want people to be able to participate so that they can protect themselves or so that they can — so what is it that you’re trying to protect themselves from? I think if we got concrete about that, we might —
MS. KLOSS: But I don’t think we have to create that. I think we as a subcommittee looked at the NIST work that categorizes or describes or lists the type of harms, and I think that we could reference that work, because we’re not going to do a definitive harm or risk assessment model in this little report. But that work fits in very well, because it even relates to harms of reputation and loss of autonomy.
They’re described in ways that are high enough level that they’d fit very well in here to describe, to illustrate this point.
MS. STRICKLAND: Right, the transparency — I think we were talking about that with the privacy. So you understand where your data is. I sign up for one site, and all of a sudden stuff is over here because it was a partner company, and you had no idea.
MS. KLOSS: I think those principles are related to the individual, and that’s the way the General Data Protection Regulation is written. It describes the patient.
MS. GOSS: So the principles, is that another green box underneath individuals? So you would have the principles of the rights and for the individuals. So then you go up from there. You go over to uses, the holders, and then it all comes back to a framework, with all of the red boxes and the non-colored boxes, giving the deeper description and flavor of what we’re getting at?

MS. KLOSS: Yes, that is where the principles belong. Are they FIPs?
So is anybody kind of getting excited about this? Are we capturing all the — I don’t see a lot of excitement. I am starting to get — my heart’s starting to pound a little bit.
MS. STRICKLAND: One area that Rachel had mentioned earlier, these data brokers. There’s a lot of harm, I think, that can be done with data brokers, depending on their motivation and use of data. I’m not sure where they fit, but they’re a user, a seller. They use the data to gain money and they’re a holder. So they hold the data. Hold it for a use, which is to make money, profit, which could be for good reasons or for bad. So that’s an interesting concept.
MS. SEEGER: So our friends at the FTC, in addition to their big data report, did an entire — worked, just as you all are working, on a report on data brokers, and they made many recommendations. And then there was turnover with the FTC commissioners, and I don’t know what’s really come out of that, but they made very solid recommendations to Congress about transparency, about potential harm, discrimination. Much of our thinking has been built off of that work, and that led to the White House doing a report on big data, as well as the FTC’s follow-up report on big data. We’ve certainly been using all of that work to think about where we are today in this space.
MS. KLOSS: But we haven’t advanced any public policies to address those recommendations.
Vickie?

DR. MAYS: One of the other things, when you talk about data brokers and we’re talking about potential harm — it would be good if we also have something that’s like missed opportunities, or the positive side. If you look at where both HHS and the White House are, it’s like data is economics, and they really want to expand the use, and that’s really what we’re starting to talk about in the beyond health. We might need to identify — like for example, in the data access and use group, we have all these entrepreneurs, we have the U.S. News and World Report people, and then we have at the community level the people who want to use the data to do something.
So I guess I’m trying to get a positive — when you said excitement, I was like, well, I looked, why would I get excited about what could be but needs protection. So if we — I don’t know where it goes.
MS. KLOSS: I think that’s an important high level position of this paper. I’m going to call it the paper.
That we’re not discouraging use. We’re supporting use.
Just suggesting the need for a framework that better protects information.
DR. MAYS: I guess what I am thinking about is actually kind of calling out so that there are more people who will think about doing this, because we’re talking about beyond health. So beyond HIPAA and beyond health, and there’s just categories of people who are playing in this field now, and I don’t think we’re speaking to them.
I think we are still staying in the very kind of healthcare use and not kind of the more creative health that’s combined with other things. So find them a place actually in our diagram.
MS. JACKSON: Datapalooza is next week? Ten years — it started in the back pocket of HHS, and it’s exploded, and I don’t know how many — Rebecca and I are going, and just knowing that that kind of explosion is out there and has taken off with all of the tools and things that have gone on in the last 10 years I think is pretty exciting.
MS. KLOSS: Absolutely, but I don’t know how — maybe that needs to be in our principles set, that we’re not suggesting putting barriers to creative uses, but we are suggesting the need for some kind of crash barriers against harmful uses. I think that’s kind of foundationally what we’re doing, because right now it is indeed the wild, wild west.
MR. LANDEN: I think we had some of those principles about encouraging use, promoting interoperability, and rather than crash barriers, maybe think of it in terms of guard rails.
MS. KLOSS: We will go back and look at the principles against this picture and see if we’ve carried them out.
MS. BERNSTEIN: I think of that as what Vickie’s talking about — data sharing in a privacy-appropriate environment. I don’t think anybody’s trying to hide or corral data, or whatever; data wants to be free, all that kind of stuff. But I think that we have to be mindful of both the joy and economic development and innovation and creativity that we can get from new uses of data, and also that there are nefarious uses of data that we want to protect against. So I do think that if we make sure to put that in a principle up front, your work will be better appreciated or better received.
MS. KLOSS: Rachel, could we put a couple of examples here, because that whole thing in my mind is kind of a — I know we were talking about specifics and policy mechanisms. If you could just put that — just put that in parens.

Are we at a stopping point? Do we want to take a break and then look at the principles?

(Break.)

Agenda Item: Draft Report Outline

MS. KLOSS: We will reconvene. I’d like to do a little time check on people’s travel. What time are you —

MS. GOSS: I have to be in a conference call at 9 o’clock.

MS. KLOSS: Deb, you’re here all day? You’re here, Rich. I think we just lost Nick. He had to go to another meeting.
Jackie, how’s your schedule?

MS. MONSON: I have about another 45 minutes, and then I’ll be dropping off for the day.
MS. KLOSS: Okay. Melissa? Melissa had just come back from New Zealand, and she came to our meeting yesterday. So perhaps the travel hit her.
MS. KLOSS: All right. Here is the flow of work that we suggest. First, that we look at this emerging model against the guiding principles and make sure we haven’t overlooked something. Then come back to the model and just talk through those green boxes. Which of the FIPs are related to individuals and which are framework items?
Because there are some that fall into both buckets. I was — I kind of said most of those are individuals, but they’re not all.
And then I think we probably can just make sure we’ve got the overarching messages that we want to include in this report, and then move our attention to the letter to the Secretary, and we left the day yesterday with some really good concrete ideas. I have a slide that summarizes those so we can work on that. And then we’ll see where we’re at and whether it’s lunchtime or after lunch, we’ll just play it by ear. Some of us have until midafternoon, but I’m sure there might be earlier flights that — I don’t know, but I’d like to take advantage of the brain trust for as long as I can, because immediately next week we’re at the computer starting to write these things. So the more — the further along we get, the better, right?

Okay, these are the principles that we — and there are three slides of them. So let’s go to the top and see if we think we’ve captured them in the model.
DR. STEAD: I think we need to make sure we appreciate the set of principles before we’re trying to say have we covered them in the box. That’s all I’m saying.
PARTICIPANT: The second one on that slide, the one about ethics, sort of seems like one of these things just isn’t like the others.

PARTICIPANT: And it’s not on the model. It popped out at me, too.
PARTICIPANT: I agree it’s different. I thought ethics was actually going to be a piece of the framework, but I may be missing something.
MS. KLOSS: That maybe is where it belongs.
PARTICIPANT: Sort of what that thing says.
MS. KLOSS: One of the thoughts I had going through these is it may be possible to organize the principles by our three boxes.
DR. STEAD: Let’s look at slide three, if we can, just so we know what the principles are.
MS. SEEGER: That’s two. This is one, two, which we already looked at, and this is three.
DR. STEAD: Back up to two a second. Got it.

Okay.

MS. GOSS: So Linda, is it really that if we go back to the third slide where we have this, the one that — okay, ethics should be an element. I think it’s ethics — expectation of ethical behaviors, I think, is what we are getting at in that statement. We wordsmithed this a bunch of times yesterday, but I still think — we were joking that it sort of doesn’t look like the rest of it, but we expect people to be good actors. I think that is a reasonable principle.

PARTICIPANT: What do you mean by that?

DR. STEAD: How about saying incorporate expectation of ethics?
DR. MAYS: You want ethics in lots of things. So it isn’t how the actor behaves, but it’s like even how you think, design, the whole bit.
MS. GOSS: So would you be okay then with just incorporate expectations of ethics to be more of that broader bucket, knowing that we’re wanting it to be in the fiber of all that we do? I like it.
MR. LANDEN: I’m struggling a little bit with the final bullet there, track provenance of data originating in medical records. I’m not quite sure what it says. My guess is that if data originate in the medical record, then that fact should follow those data to all downstream uses.
MS. GOSS: How do you think it syncs with or is different from the bullet above? Privacy choices attach to the data, not the custodian, which to me is sort of getting at the provenance aspect. So is it really that the final bullet is more of a detailed aspect or description of what we mean by the second to last bullet? I’m trying to figure out if there’s truly a difference that we don’t want to miss.
MR. LANDEN: Part of my question of that final bullet, are we trying to say that if this data, datum, originated in a medical record it should remain HIPAA protected, even when it leaves the HIPAA sphere? What are we trying to —
MS. SEEGER: That I think is more methodological, and it gets back to some of the discussions we’ve had in the past about the lifecycle of data, and this one in particular kind of links to that notion of data brokers buying and selling data in the deidentified format. We don’t — I don’t know if the technology — I am sure the technology exists to do this.
DR. STEAD: Maybe the broader statement is we have privacy choices attached to the data, not the custodian.
Provenance attaches and follows the data. Then you would be getting — does that get at the point of this discussion?
MR. LANDEN: Yeah, I think that is generic, without getting us down the rabbit hole, because if the issue is tracking provenance, then you have to have not just that it originated in a medical record but that it didn’t originate in a medical record.
MS. KLOSS: So let’s go back to the first slide.

MS. SEEGER: So I think that some of these might be principles, and some might be methods. What we were calling a framework, but methods to get to — that would be under the framework.

MS. KLOSS: Right. I think that the green box there, some of these are going to allow us to flesh that out.

MS. SEEGER: Would you like me to start to build a new slide with two green boxes at the top, principles and framework, and then we can try to move these guiding principles on the other slide into buckets?
MS. KLOSS: I actually was wondering if it would end up that we actually had three green boxes, one for each of the triangles. Are there some principles that apply to uses, irrespective of the holder?
MS. SEEGER: Can we discuss each one of the principles, and then maybe color-code them or — color-code them. We can do that later.

MS. KLOSS: So protections increase health equity. They make things fairer and expand access and all of the other things that we define and describe as health equity. Where have we captured that?
We could look at it as a harm that a protection restricts. I don’t know. One could argue that how an individual exercises their rights could either help them or hurt them. But we’ve covered that in the second bullet. So never mind.

So consumers understand how to exercise information rights and how information is used, is definitely a blue color code. So I’m not sure about the first one. Let’s not get — let’s come back to it. Let’s not get hung up, right?
DR. FRANCIS: You might want a color code green, which is for — it cuts across everything. It’s a framework, right? And that would be the first one.
DR. STEAD: My guess is we’re getting in a rabbit hole. I think to get a good list of principles is a giant progress from where we were yesterday. They are very different than what we had. I think trying to match them up is going to get to be —
MS. KLOSS: Okay. I want to make sure we’ve got everything covered that we need to cover. Do we want to put some — hello?
MS. GOLDSTEIN: Hi, can you guys hear me? It’s Melissa. I was cut off for a while. I don’t know what happened. But if you don’t mind, I want to go back a little bit to the discussions of provenance and ethics.
First, for ethics: when we talk about people acting ethically, that’s very different from an ethical analysis using ethical principles and theories as frameworks or models. So I would actually not focus on ethical behavior. That’s what I was talking about yesterday with benevolence. It’s a different concept than actually including an ethical analysis of what people are doing, and data holders, and what we are doing with the data. So the phraseology we used yesterday about incorporating ethics is different than expectation of ethics. Expectation of ethics focuses on individual behavior, which might be like a government ethics office with conflicts of interest, but that’s not an ethical analysis using ethical principles and theories. So you can focus on the behavior if that is what you are trying to focus on, but there is a distinction there that I wanted to make.
The other idea is in provenance. I think it is important to distinguish when we are talking about provenance between citizen science, patient contributed information, or information from third parties about people, and information that comes from a clinical environment or research environment from the researchers or clinicians specifically. So if you’re going to talk about provenance at all, my thought is that that should be part of the conversation. That doesn’t mean that you have to focus on provenance, but it is a very important distinguishing factor that people will use in that discussion.
MS. KLOSS: Have we captured the — I don’t know if you’re online, but we now have — are showing expectation of ethics.

MS. GOLDSTEIN: So, to me that means ethical behavior, and that’s not what I would focus on here, because everybody has a — well, what we had yesterday was incorporating ethical, ethics, into the framework. Those are different ideas. Ethical analysis into the framework.
If you want ethical behavior, that’s different, but it’s a much more colloquial societal term, ethical behavior, than true ethical analysis.
DR. STEAD: Incorporate ethical analysis may solve the problem. Do we have to have “into the framework”? Where we get in trouble is that the framework, as currently on the diagram, is one piece of our larger framework.
(Cross-talk.)

If we simply say incorporate ethical analysis, you’ve got it.
PARTICIPANT: And then right now we’re showing provenance is written as provenance is attached to data.
MR. LANDEN: Melissa, I don’t think I really followed your thought process there around provenance. Can you talk a little bit more?
MS. GOLDSTEIN: So information that I might put into my medical record as a patient may not be actually, quote unquote, trusted by the provider as much as notes that the provider takes, right? But there’s more of a movement that patients should be able to contribute their own data, what I ate, how much I exercised, the blood pressure test that I took when I was at home, those types of things. There’s often the idea of citizen science, which is I think a broader idea about people in general collecting information about their environment and the environment at large and contributing it, sort of that idea of epigenetics and what goes into making decisions and algorithms and how we get to actual prediction and predictive analytics.
Provenance means all of those things. Where did particular data — to me, provenance means all of those things. Where did data in say a dossier that you have about an individual, where did it come from? Do we give certain importance to data from a medical record that we might not give to data contributed by the patient? Or data contributed by environmental factors, like prevalence of flu in the area at the particular time. So provenance could mean many different things to many different people. I agree that it should be one of these principles. I do believe in that. I just — you know, that there are different ways that people think about it. That’s all I was saying.
MR. LANDEN: I absolutely agree with you. The fact that there’s data in a medical record doesn’t mean that that medical record was the original source of that data, nor does it imply that it is the clinician’s professional judgment or it’s an accurate representation of whatever the patient stated. So yeah, provenance has to be kind of the whole history and flow, because in an interoperable world, what’s in clinician B’s electronic health record may have actually originated from clinician A, gone through an insurance company, and then into clinician B’s records. So that whole trail. Think in terms of chain of evidence. That’s what provenance should be describing.
MS. BERNSTEIN: I was wondering if there’s a way also for us to capture the idea that Vickie was pointing out before, that one of our principles, even though it may be in conflict with some of these principles, is that we don’t want to impede innovation, creativity, and so forth. Some statement like that. I used not impeded, but you can state it in the positive, too, I guess. Support for innovation and creativity.
DR. STEAD: Given the number of times we’re going to go back through this, I think we can call the principles done would be my suggestion, and move on. We’ll have many more times to revisit them.
MS. KLOSS: And so then if we go back to our model, I think what we want to do is try to identify critical elements under individuals — the green boxes, if we could.
DR. STEAD: Are the principles under individuals the fair information use principles?
DR. FRANCIS: Can I ask a question, which is why you’ve got principles under individuals and policies and procedures under the other? It seems to me that you want a framework that is for all of them.
MS. KLOSS: I think we put it up there when it popped in my head that when you look at the GDPR, the individual rights as each of them is delineated all reflect FIPs.
DR. FRANCIS: So do obligations of data.

MS. KLOSS: So we need to unify these two green boxes and call them — what will we call this green box?
DR. FRANCIS: The European word for holders is controllers, I think.
MS. BERNSTEIN: Sometimes it is called custodian.
MS. KLOSS: I think we want to call it practices. That’s broader than policies and procedures, and it can include FIPs. Then one box —
DR. FRANCIS: I think you want one box for all three. But different ones, different pieces of it — I mean, I don’t know if this is right, but I think once you start saying there’s no clear way to separate out health data by the holder, then you — I think you have to start saying part of what you need to think about for an overarching framework is issues that start with the individual. Part of what you have to think about is issues that start with the holder, and part of what you have to think about are issues that start with the use.
DR. STEAD: Can I make a comment? If we are going to end up with a unified box down here in some way, I think broad floor is largely reflecting to individuals. I think that broad floor points correctly to individuals. I think the uses comes — uses actually comes down to holder, uses and holders, and enforcement cuts across all of it. That could work.
MS. SEEGER: I think you have holders under uses. It is the same thing. You have it up there twice.

DR. STEAD: So you are saying you kill the uses there.

MS. KLOSS: Let’s let Rachel play with the diagram.

DR. STEAD: I think we’re gaining ground.

MS. KLOSS: Maybe the risk and harm analysis is the next plank, though, because that may be what guides enforcement and then redress. I’m throwing that out as an idea.

Remember the people on the phone can’t hear you if you don’t use your mike.
DR. STEAD: I like the idea of adding risk and harm. I’m now a — well, or maybe it’s risk and benefit analysis that actually would be the missing plank. Because it will inform both the broad floor and the uses, and it will inform what we enforce and redress. So I like that.
MS. KLOSS: Risk and benefit analysis.

DR. STEAD: I would actually prefer to make it benefit and risk analysis, like we did in ICD.
MS. KLOSS: That echoes Vickie’s point. Move it into the empty spot. Move the benefit. Make it white and put it right here.
And harm analysis is part of redress. Harm analysis/compensation.
MS. BERNSTEIN: I have to say I’m still bothered by the two bottom arrows. Because I don’t think principles are only related to individuals.
Get rid of the other box, okay, and then I’m happier with both of those arrows.
MS. KLOSS: And then consent, sequestration, are principles and practices.
PARTICIPANT: We have multiple colors are confusing to me, but other than that, I’m good.

PARTICIPANT: I think that maybe for now we don’t need to. We are going to have a number of things that drop into principles and practices, and that, as you had started out thinking earlier, maybe will lead to a second diagram. I don’t know that we need to develop that today, but we have things like lifecycle management of information — we have a lot of —

DR. STEAD: So do we hurt ourselves by having the red boxes which are a subset?
MS. SEEGER: I think that there are mechanisms under principles and practices, and I have — I think that you have a question; do you want to start another slide with the green box at the top, principles and practices, and then start to get some of these levers underneath there, that you all have identified in the work we did on principles in these red boxes.
PARTICIPANT: Which is work we can do after the meeting. We don’t need to do that.
MS. KLOSS: And the tiers, we have kind of captured — we are going to capture that as the findings.
DR. STEAD: So then the left-hand column will come off ultimately. We don’t worry about it now. This will be the base model, okay. You could just at this moment make the green box principles, comma, practices, comma and mechanisms, or it could — and then you could have the next slide break that green box up. You can do that after. You don’t need all of us watching you.
MS. BERNSTEIN: I think the shape of this whole thing will come together after we delve into the discussion further.
MS. KLOSS: Save the principles, and we’ve got all of the input into the themes. We understand the overarching description of the world in 1996 versus 2016.
We have all of that input to come together, and if you will, can you pull up that one sheet that was the proposed report outline? Now we have changed the — we have changed the name of everything, maybe because we aren’t calling it a stewardship framework anymore.
(Cross-talk.)

MS. KLOSS: Let’s just do a reality check. We said we don’t know, but I think we’ve changed a lot. So just a reaction to this. We call the principles — we called them principles, we just didn’t call this stewardship. So we have to work on the title. Let’s not worry about that. After the fact.
DR. STEAD: My own gut is the title and the purpose of the report have all changed, and we have come a long, long —
MS. KLOSS: We have. That is why I wanted to —
DR. STEAD: The intended audience may be the same.

MS. KLOSS: Yeah, it’s quite different. But this will be helpful guidance. How far do you go in a paper like this that is really for policymakers? We’re not going to get down to initial specific recommendations. We’re going to say here are the chunks of work that need to be done.
DR. STEAD: I still think that you have a real opportunity, and I don’t — I still like the idea that really follows the point Melissa laid out of the world in 1996, the framework in 1996 and HIPAA as a response to the framework. The world in 2019, the framework in 2019, and the thing we have just described, the new framework, the frozen slide when it gets undone, which will have in it, I guess, some principles, so that actually would be a framework — it wouldn’t be the law. It would be the framework.
Then I think at that juncture you actually go back to the diagram we came in with — so then this is the path forward. The picture we came in with is now the section on immediate next steps, and the thing we can do immediately is to operate within the confines of HIPAA, and probably the confines of health beyond HIPAA. Then these are immediately actionable recommendations to the Secretary, which could be both there and in the letter, or just in the letter. I think you could have a broader set of recommendations to multiple audiences before you get into that — that’s where you’re going to have to test. In many ways, I think what you might do is nail down the recommendations to the Secretary and then say, okay, given what we’ve just done, what other recommendations do you really want to make, and to whom, that would be less immediate.
Then that might tell you whether you have a little block — a thinking process for painting your way through it.
MS. KLOSS: But I didn’t necessarily see this report — I guess this report goes to the Secretary and it has Secretary recommendations, and then it has —

MS. BERNSTEIN: Committee advises the Secretary of HHS.

DR. STEAD: Yes, I think this report goes to the Secretary. We want it — as all of our reports do — to inform the industry, and in this particular case, I think what we’ve just put down would be very helpful in informing the current discussions in Congress.

MS. SEEGER: I think you can incorporate the report in the next report to Congress, but I think Maya made, off-mic, an important reality check that your charge is to make recommendations to the Secretary; we know that you have many, many stakeholders out there who are going to be viewing those recommendations and want to hear what you have to say. I know that ONC is, they’re listening. You just met with them this week and they’re listening to you.
So you have everyone’s ear. The next report to Congress is an opportunity for you to articulate what you’ve done.
MS. BERNSTEIN: You can also — there’s a way in which making a recommendation to the Secretary, you can say things like the Secretary should support legislation of the following kind, or the Secretary should advance legislative recommendations of the following sort. You can sort of wangle your way to those other things that Bill wants to talk about in the guise of speaking to the Secretary. And as Rachel points out, knowing that these reports are public, many audiences are reading them. You don’t have to speak directly to them for them to get the message.
DR. STEAD: I think it will be extraordinarily helpful, though, to have the first piece of the report that lays out the path forward, and then the second piece of the report clearly constrained by the current legal framework. I think for people to actually see them side by side will be very helpful. And our recommendations will relate to the latter, not to the former.

DR. FRANCIS: Would the third section of the report trace that out for a couple of the use cases you’ve focused on?
PARTICIPANT: Yeah, we could. We’ve done the work.

MS. GOLDSTEIN: I think the big data reports that the FTC and the White House put out in the Obama administration had so much power because they were from different bodies within the same administration, focusing on different elements of the same kinds of ideas. I wonder if you all might consider encouraging the Secretary to lead within the administration, across agencies — perhaps to start an interagency focus on this idea — so that there’s not a complete reliance on Congress or hoping that Congress can act, but that within the administration itself, across the federal government, the Secretary actually takes a lead, or supports, or advocates — whatever language you believe is appropriate. Is that part of the mission of this group, Rachel? Maya? I’m not sure, legally, how it’s —
MS. BERNSTEIN: Absolutely. You can recommend pretty much anything to the Secretary that’s within his authority to do, and everything from merely the hortatory to suggesting that the Secretary convene or anything from convene to advance regulation or regulate. You can recommend to the Secretary any of those things which are in the purview of the Secretary to do, and certainly the things that you’re talking about are.
MS. KLOSS: Particularly against the background 20-plus years of HIPAA experience. The department that has the most experience with privacy work is HHS; is that not true?
MS. SEEGER: The Privacy Act is across government, and I know we’ve talked a lot about fair information, but I think that when HIPAA was first hatched, the writers were looking at the Privacy Act as a construct.
MS. BERNSTEIN: I think in different contexts, Homeland Security people, DOJ, have much more experience in the law enforcement context with dealing with privacy and civil liberties sorts of things than we do. But depending on which part of government you look at, there are different experiences there and different laws that apply in those areas that are in addition to the Privacy Act.
You have telecom privacy, you have other kind of financial privacy, you have different kinds of laws.
MR. LANDEN: Historical note: the privacy provisions really didn’t get into the HIPAA legislation itself. It was defaulted to regulation by HHS because Congress could not at that time form a consensus around what HIPAA privacy should be.
MS. BERNSTEIN: That’s exactly right. I remember that. There’s very strange language actually in the original law which says, should Congress not continue to do this work in the next two years, HHS should regulate. I was at the Office of Management and Budget at the time, and those of us who were working on this immediately convened to write regulations. It took us several years, because we knew that Congress was not going to revisit it; it was so difficult to get through.
MS. GOLDSTEIN: I guess my thought is that OMB was a driver in the last administration of coordinating angles on privacy, probably particularly because there was the first privacy person at OMB, who had the charge to do this, Marc Groman. This administration might be structured differently. I don’t know if OMB is as currently focused on this. You all know much more than I do.
But if this Secretary decided that it is something that he wanted to specifically work on and start an interagency group and focus on it and lead on it, I think that would be great, because in terms of action, it doesn’t really matter — well, of course it matters who acts and what legal force it has and what legal angle it has and what strength in law it has. But at least there would be some action, that’s just what I’m thinking.
MS. BERNSTEIN: I think that, in terms of what the focus of the Secretary is, one new element in our world is a new bill called the Foundations for Evidence-Based Policymaking Act, which we’re calling the Evidence Act for short, and that thing has multiple parts of it, but it does, in a historical line of statutes that want to try to share more data and make more data available for linking and producing analytic information on which you can make policy, there’s a push to do those kinds of things.
There is, of course, language in there that says you shouldn’t violate privacy while you’re doing it, and for the first time it requires that before we disclose data from the government, we do a risk analysis. So, without mentioning them, it hearkens to things like both traditional ways of statistical perturbation of data and also relatively new concepts like differential privacy, and we’re looking for other approaches to think about privacy.
Both of those things, I think, are now on the plate of the Secretary, but the focus has generally been on the producing evidence and getting access to data part, and I think the work of this group will be important, because we’re now — the thing just got signed by the President on January 14, so it is six months before its implementation, lots of guidance that OMB will be putting out as major lead agency, creation of evaluation officers, chief data officers, and sorts of things, that will create this potentially new infrastructure for data sharing.
I think it’s timely for the work that the committee is doing, while government is thinking about that across the board, in every agency, not just HHS — to also be thinking about, okay, in that kind of environment, how should we be thinking about privacy and what, if any, new practices, principles, whatever we should put in place, in that kind of an environment?
MS. KLOSS: What do you see as the relevance of this work that we’ve done to 21st Century Cures?
Anything?

MS. BERNSTEIN: That act is broad; it has many, many different parts, and there are multiple privacy parts beyond the ones this group is probably thinking about. The ones you are most likely thinking about are in the work coming out of ONC that we’ve been mentioning — that NPRM is implementing what has been assigned to ONC in the 21st Century Cures Act, regarding things like information blocking and so forth. We’ve mentioned that the comment closing period is May 3. I’m not aware that the committee as a committee is going to comment on that, but certainly in your individual professional or other affiliations and capacities, you may want to look at that.
It’s a quite long and complex rule, but you may want to look at that proposed rule, comment on it from your other affiliation or from an individual point of view, and weigh in with your expertise. I know that the department is looking for as broad a set of comments as it can get and will welcome any expertise on that NPRM. That’s, at the moment, the part of the Cures Act that you might be most interested in.
MS. KLOSS: I just asked at an inappropriate time about the relevance of the work we’re doing on —
MS. BERNSTEIN: They have been working very closely with OCR on that.
MS. KLOSS: Why don’t we move ahead to the slide on the recommendations to the Secretary.
MS. BERNSTEIN: Can I just ask quickly, do we have an electronic — I couldn’t find it in my inbox. Do we have an electronic version of the report outline? Did you send it, or Rebecca? Just so I can find it.
MS. KLOSS: We may be rethinking this, too, but when we talked about the letter yesterday, we were thinking of it being really that short-term opportunities and we’re probably going to have a set of those that come out of this report.

(Cross-talk.)

MS. KLOSS: Let’s not worry about it. We may throw the whole thing out. We said yesterday that we would link this recommendation letter to the Secretary, backgrounding it in the current policy priorities of interoperability, ease of patient access to health records, work on regulatory reform. My parens there, what do you refer to that internally, regulatory simplification? What — okay. Reg reform. Including eliminating or updating nonproductive requirements, removing barriers, et cetera.
So against that background, privacy and security risks regarding access and disclosure of data to entities not covered by HIPAA — so, releasing information outside of the HIPAA protections. Then we talked about recommendations based on raising the overall level of practice, and I suspect that these are those opportunistic short-term recommendations that are going to come out of this report. Reasserting minimum necessary, sequestration, principle of least identifiable form, authorization for limited disclosure of information — that was that discussion that we had about not releasing more than is called for. Provenance. Private registries as business associates. Stepped-up focus on consumer education. These were the short list that we talked about yesterday.

I expect all of these will probably surface in some way through the report. Does it still make sense to background it in the current policy framework — short-term opportunities to tighten and improve, raising the level of practice when covered entities and business associates are releasing information out of the HIPAA protections? That would be essentially the focus of what we’d address to the Secretary out of this work.
In addition to perhaps looking at the broader recommendation that Melissa has offered: that there’s an opportunity for the Secretary to lead within the administration, through an interagency initiative, in standing up this framework. On the right track? I’m seeing nods.
MS. SEEGER: Is there anything not in there that needs to be in there?
MS. KLOSS: I expect they’ll emerge as we write the report, or how to combine things. I think they’ll get refined. I just wanted to frame that background, what we were saying, that we’re going to make it very practical, based on things that are on the plate.
Bill, are you still envisioning a report and a separate letter, or the report with the recommendations? That’s a shift for me.
DR. STEAD: I am envisioning a letter with recommendations, just as you’ve talked through. I am envisioning a report, which may or may not take longer than the letter. All I was saying is the report could include both recommendations that weren’t quite as immediate, some of the things we talked about, working across agencies, which would begin to get at what the Secretary can do to work toward the broader framework. It might include those and the immediate action ones that we included in the letter. I would view that as a formatting decision once we got to the report.
MS. KLOSS: So we could kind of do what we did with the terminology and vocabularies. Do near term, longer term. Okay. That would work.
Twenty minutes to 12. What’s your pleasure? Have we done everything we can do? Shall we break for lunch, and then when we reconvene, flesh out the —
DR. STEAD: It seems to me what we want to flesh out next are the immediate recommendations to the Secretary. Then, to the degree we have time, it would seem to me we could flesh out recommendations that would have a slightly longer lens, that we would want to include in the report. If we could get that done, we’d probably be in a reasonable state.
MS. SEEGER: I am wondering if we could take a quick break, or maybe go, have folks go to lunch, and put some of these documents on a flash drive. Get our friends at NCHS to print them out for the folks in the room so we’re not toggling back and forth and we have paper in front of us, and also get them out to Nick and Melissa.
Jacki has moved on with her day, California time. She did send a note to all of us, congratulating us on the progress we made this morning. I just feel that we have a number of different work products that we’ve developed that we all need to look at in moving forward to the next steps, and if we keep bringing them up, it’s going to be very hard for us to develop recommendations.
MS. KLOSS: That’s a good idea. So let’s do our lunch break early.
MS. SEEGER: I will take a few minutes to identify what documents we want to have out in front of us, and make sure we can get copies of those.
MS. KLOSS: Okay. So we should plan to reconvene at 12:45.

(Luncheon Break.)

AFTERNOON SESSION

MS. KLOSS: Welcome back to the final afternoon of our two-day meeting to consider the report and recommendations on Health Information Privacy Beyond HIPAA. We won’t do another roll call, but in the time remaining in our meeting, we are going to drill down and focus on crafting recommendations based on our work to date.
To help us do so, we, around the table, have hardcopies of output. We have been on a journey. We have considered and revised a set of themes that distinguish between the world as we knew it in 1996 when HIPAA was conceived and the world today and going forward in a digital information environment. We have themes that elucidate not only what has changed, but what parts of HIPAA just are no longer – could not be extended beyond their current scope.
We have identified principles for thinking about the beyond-HIPAA space. I think we have been through those twice now, so we have pretty sharp consensus on those. We made a decision before lunch not to do more dabbling; any further dabbling would come in the writing of the report.
We created a model that we know needs some additional tweaking in the green-space principles and practices. Through the course of our discussion, we have identified a lot of what these should be in managing privacy beyond HIPAA. Depending on time, we will come back and fill some of those in.
We also know that some of these concepts identified in red can be rolled up either in principles and practices or in our considerations of the need for benefit and risk analysis enforcement, harm analysis processes, and – but I think that we are pretty comfortable that this is a model that is forward looking. Obviously, the report will describe all of the elements of it.
What we want to do in the time remaining is to revisit the Beyond HIPAA Continuum that the Subcommittee has worked on over the past years using registries and apps and other exemplars at the intersection of the regulated and unregulated world. Because we identified some short-term opportunities that might bridge us between the model – the future-focusing model and what levers might be either used today or be considered going forward.
So, I think we wanted to review this and look at how we frame recommendations for the Secretary as part of our charge. I think it might be best if we go back one slide. I think we might want to just revisit this and see how – whether we want to make some changes to the model – back one slide, please – of this one.

I think yesterday we did agree that we would retitle this. So, we will make that change. We are now really looking at “Use and Protection of Health Data”.
We will get away from using stewardship, which is a difficult to explain concept and get away from continuum here – or keep that word? Keep it here. Yes.
We had some discussion yesterday that maybe this – the blue line, the compliance risk, changing to use and disclosure risk. Maybe those aren’t the right descriptions of risk. We know that those who are in the far left and are covered entities or business associates are very focused on compliance to regulation. But what we may change to conform is – yes, to go back to the benefit and risk analysis phrase that we put forth in the longer-term model.
DR. STEAD: I think the question is whether – so, I think appropriately, this slide is use and protection of health data continuum. It is anchored in the distinction between covered entities and business associates and data user is not covered. So, this is anchored in the current framework. I think in the current framework, the compliance is the framework of the covered entities. I would leave that row the same – at least I advocate for that – for this model, which is the model of the current framework.
MS. KLOSS: The transitional – this is, yes, today view.

DR. STEAD: Yes. This is today’s view. Our immediate recommendations are going to –
MS. KLOSS: Right, and our short-term recommendations are going to focus on use and disclosure mechanisms and then improving that. I do agree that that works.
Our hypothesis on this is that there are some private sector, i.e. covered entity and business associate, actions that can be taken in the current state that can improve the release of information, essentially, management and release of information, beyond the protections of HIPAA.
Each of these – each of these red bullets can certainly be part of a set of recommendations if we wanted to use this as a basis for our short-term recommendations.
PARTICIPANT: Are you going to leave data stewardship in the middle column? Are you going to try to call it something else?
MS. GOLDSTEIN: Is there a way to make it clear – I may not be up to date with this particular slide – that there is a focus on compliance now that we would like to – that we want to be beyond compliance with the current law and that we are searching for a better way – a more comprehensive way?

So, I am not – the slide that I am looking at doesn’t make clear that this is the current versus what we are – at least in that blue arrow up top that lists compliance risk and use and disclosure risk.
MS. KLOSS: That is a good comment.

DR. STEAD: I think we need to be careful. When we came into yesterday, this slide was actually our short-term and long-term. It was the whole plan.
We now have a fresh, new long-term. I would narrow this slide, not increase it. I would consider dropping the middle column and focus on the things to do in the left-hand column that are things that should be done within the – within the constraints of HIPAA as it exists. Then put in the right-hand column the things that could be done, some of which do not require new legislation – they could be regulatory – that will enhance – you know, begin to address use and disclosure risk and transparency.
So, for example, for the Secretary and ONC to begin to grant a good practices seal of approval to an app that a patient might select to use to access their information through promoting interoperability, which would basically mean if that app provider would, in essence, commit to what we are suggesting, law would ultimately make them have to commit. That would actually be something we could do now in the right-hand column. That would fit with – that would support a current priority policy of both Congress and the Department.
So, I just – I would favor narrowing this now that we have got – I mean we have the great advantage. We have the picture of the future. This end doesn’t have to carry along all of that baggage, which confuses this picture.

MS. KLOSS: I think maybe we need to incorporate something that says short-term opportunities.

DR. MAYS: Can I just ask in terms of dropping the middle, the red recommendations – are you saying even to not pull those over?
DR. STEAD: I think that we want to do something in the near-term in the right-hand column. Maybe it needs to be middle, but I would still prefer to get down to two columns.
That is regulatory or sub-regulatory that does not require enactment.
MS. BERNSTEIN: Or voluntary? Because this is –

DR. STEAD: Certainly, the public part can be voluntary. I think this idea of – I think what I just tried to talk through about the way to hook this to interoperability, which you talked about last evening, I think that is something that is within the existing regulatory purview to do if they want to do.

Whether the app developers elected to go after that carrot would be voluntary. I think the idea that in blue, in the bottom, we show that the industry should be moving in that direction voluntarily, I think that is good. I am just trying to simplify this if possible.
MS. KLOSS: I like going to two.

DR. STEAD: We need something that allows for both enactment and non-enactment.
MS. BERNSTEIN: Adoption.

DR. STEAD: Regulatory or –
MS. BERNSTEIN: That might be regulatory. It might not be regulatory. Right? You can self-adopt, self – industry self-regulatory standards kind of thing, which are not governmental.
MS. KLOSS: The first column is headed by adopt protections beyond regulatory compliance. I think all three of these titles need adjusting.
DR. FRANCIS: How about just new data protections?

Drop enaction, adoption, whatever.

MS. KLOSS: I think that is a really good observation. We can simplify this. Move to two columns. It reflects short-term regulatory, sub-regulatory, or voluntary steps that can be taken to improve the protection of data.

I do think there are some organizations – Jackie has shared that with us in the past – that they take steps beyond what is strictly required for HIPAA. Vanderbilt has been doing that for decades.
DR. MAYS: I want to make sure that in the middle there is like organizations could elect to voluntary blah, blah, blah. We are losing that or we are keeping that?
MS. KLOSS: Probably losing it.

DR. MAYS: Wow. And we are going to lose SDOs could strengthen –
MS. KLOSS: No, I think that can be done now.

DR. MAYS: Okay, the first one I don’t think we have to repeat. Greater understanding, consumers, blah, blah, blah. I think we have said that in other places.
But data holders, the second one?

MS. KLOSS: I don’t know. I think we can leave that for the other model. I don’t know. How much emphasis do we want to put on the voluntary?
DR. MAYS: I think given that we have the other one where we talk about holders and all of this, to have a balance with having something that is very specific to that group helps us to really fill out this – I don’t know where it is going to go now, but we have that. Let’s talk to them I would say. That is why I thought, also, organizations and SDOs – the only one I thought to really dump was the first one.

MS. KLOSS: Because we have it in the third column.

DR. MAYS: Yes. But I thought the other three helped to speak to pieces that were on here. So, that is why I was trying to ask can we keep those?
MS. KLOSS: I guess the one that I think is least likely to succeed is the organizations could elect to voluntarily certify data holders, applications, and device manufacturers – I mean it is pretty non-specific as to who does the – who develops the – it is complex.
DR. STEAD: I think the lens you might want to try to apply to these decisions is when would putting something on this slide detract from the long-term slide. Things that were sort of the best we could think of in the current – within the current legal framework, which is sort of – we now have a much better way to deal with that in the long-term framework.
What I at least would advocate we try to do is distill this down to the most important types of things that the different players – I think the idea that we are still trying to get action, both at the public – both private and public, I think that is correct. I just would try to make this really show what we really think will be helpful in the short-term while we are working to put the long-term in place.
DR. MAYS: I would suggest that the organization issue be taken out of here, but I think some kind of – discussion is the wrong word, but some implying that this would be good to do and it is a problem to solve, in terms of how to do it, would be good. Because you want – I would think somewhere in the long haul you want somebody to start taking this up. I don’t want to lose it, but I think it should be a bullet as much as a point that is discussed.
MS. KLOSS: I think – so what I see is all five bullets in the first column are highly relevant.
MS. STRICKLAND: Could you explain the SDO one?

MS. KLOSS: This was our discussion about making data management and privacy and security a more – you know, the privacy by design. That was where we were kind of coming from. That that should be a consideration in any standards development.
So, for example, in how you – well, that is it.

Yes. I don’t know to what extent that ever – those questions get asked and discussed.
MS. STRICKLAND: As far as EDI standards, you can’t put privacy and security in them. That is why I was asking. That is not something that can be intertwined necessarily. It is something that has to be applied after.

Now, you could do things like first name/last name. Like if we want to start making data the minimum necessary, you could start saying first name/last name not necessary. If an ID coming across tells you everything you need to know about that member, that kind of stuff, it could be a mindful exercise to that point. Rules around how to keep the data private, how to secure the data as it moves, that kind of stuff is outside and beyond us.
Other things like devices, like when you are creating medical wearable things, yes. I can understand that should be mindful when they are designing that product that they should be understanding what levels of privacy and security come into play.
MS. KLOSS: I believe that is what our context was here. As Rachel said earlier, the apps where they tack on the security –
MS. STRICKLAND: The device manufacturers and so forth are also what we are calling SDOs? I wouldn’t think – I have never heard it that way. That is why I was asking the question.
MR. LANDEN: Let me chime in on that. I agree with Deb when we are down at the transaction level and the implementation guide level, but I am not so sure there might not be a role for privacy and security up at the communications level. So, not X12N, but what is it – C, X12C.
Again, it is a new thought, but that is kind of what we are doing. I am not sure what the HL7 and NCPDP counterparts might be, but it is kind of like the SDO writ large should at least give some thinking about how their transactions fit into the current world of privacy and security.
MS. KLOSS: I think that is what we were intending, Rich.
MS. BERNSTEIN: It occurs to me that SDOs here might not mean the ones that we traditionally deal with on this committee. They could be international ISO standards for how to develop apps. It could be any other kind of standards development organization that has the ability to adopt privacy standards for its industry or its group of companies or whatever it is developing. We don’t have to think of ourselves in the narrow space that we normally do for this purpose.
MR. LANDEN: It gets more into the NIST realm.

MS. BERNSTEIN: Could be, but NIST is only a player in the – it is one partner in developing standards. Granted, the federal government has a big footprint. If we were to adopt such standards, when they adopt them, they are only for the federal government to follow. Like other people do follow them because the federal government has a big footprint. Once the federal government is following it, others may also tend to do that.
But we don’t generally dictate standards. These are still voluntary standard-setting organizations in which the federal government only participates as a partner – as a player.

PARTICIPANT: That is why they are shown as private.

MR. LANDEN: It is all of the above. In some aspects, you are absolutely correct. In others, like HIPAA, no, it is definitely regulated.
MS. BERNSTEIN: That is a regulation, not a standard developed in industry – voluntary standards. That is a governmental imposition of regulation where there is a statutory authority to do so.
MR. LANDEN: But if you look more in the ONC space, it comes up there at both poles. Like the apps are totally – ONC doesn’t have any regulations around privacy and security standards on apps yet. On the other side, the EHRs, absolutely, privacy and security standards are baked into the certification requirements.
MS. BERNSTEIN: Right, but they are certifications. They are not –

MS. SEEGER: The EHR is certified, but it is not a legal construct. ISO is a good example. I always think of lightbulbs, which have a standard for the little device up at the top for screwing it into the socket. Or a plug – the three-prong plug has a standard in the United States that is different than in Europe, but ISO creates a standard that everyone must follow and everyone does so voluntarily so they can all play in the space of putting their plugs in the wall.
MS. BERNSTEIN: Yes. They don’t have to. You could have your own proprietary plug, but your business isn’t going to be successful.
MS. SEEGER: That is really true for EHRs. They want to be certified so they can play in the space. CMS and ONC created – you know, there was Meaningful Use. There were incentive payments that were tied to using a certified product. It is not necessarily the same type of legal construct that we are talking about.
MS. KLOSS: I think this was written in the spirit that Rich described it. That it shouldn’t be an afterthought.
I think I hear general consensus that the last column gets struck. The middle column perhaps goes through one more cleanup, but these are actions that can be taken based in the current environment that could move things along – that could improve protection.
MS. BERNSTEIN: Some of these things over on the right column – forgive me, I missed the very beginning of this discussion after lunch – are being discussed now.
MS. KLOSS: Right. They are going to be part of our model for sure.
DR. FRANCIS: There is one question I have about that, which is I am not sure what you mean by more robust protections for deidentification.
Given what I thought was agreement yesterday that the whole distinction is crumbling, it might be that something like reconsider the line and whether there might need to be protections that don’t rely on the line.
MS. KLOSS: I think as we are looking at this now, this is the transitional steps that are possible to be taken now without major change or new laws. When we wrote this, we were thinking of some of the recommendations that we had made in the de-identification letter. Practices that could be improved. Even as simple as organizations logging what deidentified datasets go out and what their characteristics are and how they were deidentified. Even that practice isn’t being followed now.

We were thinking pretty foundational management practices around – then employing certainly more statistical methods than always relying on Safe Harbor.
DR. FRANCIS: Right. Although, what that suggests is maybe a better definition of the Safe Harbor characteristics and not all the – when it says for deidentification rather than for approaching deidentified data or something like that.
MS. KLOSS: When we wrote this, we were saying just please implement the recommendations we made in the deidentification letter, which called for more research into deidentification methods and other things that one would hope would be going on.
Have we beaten this one? Moving on?

Okay, against that background, let’s scroll to the next page and see if we can identify these short-term opportunities. I think they are really – you can see how much work has been done, yet kind of how scattered it still is.
We have got some recommendations in the report to Congress that are relevant to our short-term action plan that could be reinforced here. We have the recommendations – the statements in the chart we just looked at that could be reframed as recommendations. And then we have the new ideas or the reinforced ideas that came out of our discussion yesterday. I think we have plenty to work with.
I think we also try when we write letters to be pretty pointed – not try to boil the ocean. What is the most important message here, in terms of a set of recommendations, given the current environment, which is focusing on interoperability, ease of patient access. I think what we are saying is that while you are doing that, we have got to amp up access and disclosure management practices.
MS. BERNSTEIN: What is the question about regulatory reform?
MS. KLOSS: That was – I didn’t know what word to use. How do you talk about regulatory simplification as part of the RFI?
MS. BERNSTEIN: Regulatory reform.

MS. KLOSS: We said that before lunch.

MS. BERNSTEIN: That is the correct phrase.

MS. KLOSS: That was my question to the two of you.

MS. BERNSTEIN: Not rocket science. We’re good.

You got it right.

MS. KLOSS: Good. So back to what do we want to highlight in the – could we specifically – well, and we have got some overlap here. I think there were a set of new recommendations to call for release of only the information that is authorized and nothing more, reinforcing minimum necessary. I do think that remains an important issue.
I don’t know what the Secretary can do about that. It is kind of practice in the field. The regulations are there. Publish the minimum necessary regulations. What would we want the Secretary to do there?
MS. BERNSTEIN: Some people might say enforcement.

Not everyone would say that.

MS. SEEGER: OCR enforces the minimum necessary standard in the construct of HIPAA. I think what we are talking about is the space outside of HIPAA. So how do we make recommendations to the Secretary as a committee if we think that stronger minimum necessary standards across the board are important for others who are playing in the space outside of what is currently regulated?
MS. BERNSTEIN: I think in the environment that we were discussing just before lunch of the new Evidence Act, where there is a push to share more information and link data, that you have to be especially creative about how – what do you mean by minimum necessary and how that gets implemented.
For some of the reasons that Vicki was raising and other people were raising about research needs or development or, you know, not stifling innovation and creativity, I think you have to weave those two things together or figure out a way to sell, for lack of a more eloquent word, the idea that it is valuable to do minimum – what is the value in minimum necessary? What are we getting from that when we also have these other values of innovation and creativity and data sharing and so forth?
MS. KLOSS: Maybe we don’t even want to address it in this letter.
MS. BERNSTEIN: I don’t want to stifle your –

MS. KLOSS: I know. I am just throwing it out. I am just kind of saying we can’t put out a letter with 20 recommendations. We have to home in on what is most important in the context of this work that we are doing.
DR. FRANCIS: So this goes to uses. The two uses that Mark raised were to employers and to –
MS. KLOSS: Workers’ comp.

DR. FRANCIS: Various kinds of insurers – worker’s comp, other sorts of applications for insurance. One possibility would be to develop for consumers the possibility of requesting minimum necessary disclosures in – and then list the context like employment. So then a consumer would have the option of saying –
MS. GOLDSTEIN: I agree that it does need mentioning. Even in the revised common rule conception of broad consent, you are not giving broad consent to anything, anywhere, anytime, any kind of research. There has to be a specification of the type of research that might be considered in broad consent.
So, we are still thinking about minimum necessary. Perhaps there is a reflection of collection limitations, which is the FIPPs principle, which I brought up yesterday.
I think it is an important idea. To the extent that it is practicable, I can’t answer that question because it is much easier for people to share everything than to be more specific. I think it goes hand in hand with the idea of data segmentation or sequestration and the idea that we are going to use what we say we are going to use.
So, I think it bears mentioning.

There are many ways to do that. Leslie’s proposal is a good one.
MS. KLOSS: Do we have consensus that that should be one of the items addressed? Okay.
What about the – let’s go through this list and then I am going to go back to the continuum. Principle of Least Identifiable Form. That was a new one for me. Bridge too far in the short-run?
MS. SEEGER: I think that does play into minimum necessary a bit and what Melissa was just saying with respect to – she was talking about collection limits, but also kind of limits on data. I can hear Sallie’s words echoing in the room about the importance of public health. Vicki’s concerns about having data available for research.
We talked a little bit about some of HIPAA’s permitted uses and disclosures, right, without an individual’s authorization, one of those being for public health. HIPAA is very prescriptive with respect to research. What we have found through the work we have done in this space is that it oftentimes becomes the Wild West. The data gets out there and is used for purposes that were not intended, purposes for marketing, purposes for underwriting, purposes that can harm an individual.
I think that this principle of least identifiable form, maybe that is where we were going in terms of trying to limit some of the data elements that get out there.
Mark was talking specifically about data that is in a health record that might not – can be used – isn’t necessary to treat an individual. There is information that is in there that one provider might not need in order to treat someone. You want to have it available, of course, but we are not talking about the HIPAA space.
MS. KLOSS: So, do we make that an example in minimum necessary? Probably.
MS. SEEGER: That might be a way to keep them both, but put it underneath.

MS. KLOSS: It is a good example because often we think of minimum necessary where we don’t need to share information from five years ago, but, you know, you kind of have a little mental switch when you say, well, we don’t really need place of birth for this purpose or some other information that makes it possible to link – makes it easier to link data and re-identify. That is why we are talking about it.
Provenance – I think we have beat that horse. Private Registries.
MS. BERNSTEIN: Are we skipping over the authorization or are we just keeping that?
MS. KLOSS: I think that is all part of it. I think that my notes show that was related to just disclosing what is asked for.
Private registries as business associates. That is on both lists. That is on our transitional list here.
MS. BERNSTEIN: The question was raised yesterday, but are they actually carrying out something that would – that makes – that allows them to fit the definition of business associate, carrying out something that would otherwise be done by a provider or plan, one of those functions. It is not clear that they are doing treatment or operations or whatever. They are doing something else. That makes it challenging, I think.

MS. KLOSS: So, then it is a data use –

MS. BERNSTEIN: Unless we change the rule to explicitly just call them out. One could do that, but I think it is not – it is hard because it doesn’t really fit the model.
MS. SEEGER: When we originally started to scope this work out, what we were doing by looking at these exemplars like private registries, were to provide – I mean I know that folks have tried to move away from this notion of stewardship. But what we were trying to do is develop recommendations for these buckets of – these spaces that aren’t currently within the scope of HIPAA where individuals are sharing PHI or what would be PHI under HIPAA.

There is not, perhaps, transparency on how that data might be used downstream once the individual has ponied it up. Going back to our main slide with – the slide in front of this one, here.
MS. KLOSS: We had that in there.

MS. SEEGER: Right. We were trying to develop this – some recommendations to organizations on how to better check data.
So, this is an area that is currently outside of the scope. Individuals are – some people are very free with sharing their information and not understanding how it might be used.
This is booming, especially for people who are with a chronic disease, a rare disease. Just a couple weeks ago, NIH held Rare Disease Day. The entire focus of that day was on registries. People who are in this space are all really looking to share information with other patients, with who the doctors are that treat this, who the researchers are that treat this. They are not business associates.
MS. BERNSTEIN: They are also not interested in maybe the full – how much are they interested in fully informing those patients that might be in those registries about everything that they are doing?
MS. KLOSS: Our registry report showed that there is a certain slice that may not be business associates, but if I were the privacy officer of Sutter Health, as Jackie says, I wouldn’t be releasing datasets without some data use agreement.
MS. SEEGER: So as I understand it, there are recommendations that are being made by folks who are working in this space about what a registry – a private registry should entail. There are folks who are registry managers. It is almost like a compliance officer type of position, but, typically, somebody who might have experience with an IRB, who might have experience with being a privacy officer, an HIM professional. That person wears lots of different hats.
MS. KLOSS: Is it on the list or off the list?

DR. STEAD: I am wondering if we are having two challenges in this conversation. We need to figure out what granularity we are going to try to make the recommendations at. We are clearly going to make some recommendations that, in essence, are around new guidance to covered entities. One of the things I think we need guidance to do is to – so, it reflects some of these other things we are starting.
Given that de-identification is problematic, given that the people that they are disclosing the information to may not be covered by HIPAA – maybe that is one. Two, the fact that de-identification is problematic. That they need to do certain things that strengthen the protection. That can be by strengthening their business associate agreement, if appropriate, or if the person they are transmitting to is outside, it is a data use agreement.
To make those – one or the other form should be used to limit use to the agreed upon purpose, to prohibit aggregating it with other information that would increase re-identification risk.
MS. KLOSS: And prohibit redisclosure.

DR. STEAD: Which would be for – and to have some – and to include a provision for an attestation that they will carry those same requirements to the next tier now.
When people are releasing PHI, either de-identified or not, they – we should give them guidance that they should include these additional protections. It seems to me that is a pretty basic thing. It is doable within current law. The reasoning for it is explained by our framework.
There may be a recommendation where some of these things could become the specific examples rather than the recommendations. You still have a handful of recommendations, but you’ve got – this is a specific example to do this way.
MS. KLOSS: That helps.

DR. STEAD: Because anything we do at the recommendation level will be done – will have multiple instances of how it is done. Is that helpful?
MS. KLOSS: Well, it is helpful because our goal is to raise awareness of a problem and an opportunity to do better.
DR. STEAD: Get the Department to do something to actually help people do it better.

MS. KLOSS: The last bullet point on this list is step up focus on consumer education. Here, we can acknowledge that there has been a lot done in terms of helping people understand their information rights and how to get access to information. But has enough been done to help people understand how then to be good stewards of their own information once they get it? Is that not our –
MS. STRICKLAND: I don’t know that that is our purview, but it is definitely an important issue. I think people, as we mentioned before, don’t understand where their data could go and the complexity of letting that data go without reading all of the fine print. We swoosh past six pages of terms and conditions without really knowing what and who they are going to share the data with.
I think that this becomes a more important issue for seniors and the folks of that nature, even young people who may not understand the impact that sharing certain data could have. I certainly would not shy away from education around this and other issues because I think it is always important if you capture one or a hundred people who may not make a mistake, I think it is worth it.
DR. MAYS: I think we have done a lot, but in certain areas. Like, for example, when we think about wearables and mobile apps, we haven’t done a lot there. I think we have a long ways to go.

The problem is – I like what Deb was saying – that we have specific populations that have more problems with this lack of knowledge than others. We really have to think, I think, about the elderly. We really have to think about the young adults who – I think it is about from 13 to probably like in their 20s that – I know this technology.
They don’t read it. They have little understanding. Later, in some ways, have problems.
So, I don’t know if we want to call it out. This just seems like mom and apple pie. I don’t know how to put a little more teeth into it.
MS. HINES: Maybe, like Bill was saying, have an example of why it is so important. You know, the thing about you have an app on your phone. You think you are just downloading your data. Google and Facebook and everybody else is selling it without your knowledge. Some very pithy couple of sentences to –
MS. SEEGER: Some of the examples came up from Alyssa and Leslie and others around sharing of genetic information that individuals might get from 23andMe. I don’t want to pick on that. There are a million of them – Ancestry, whomever. What can happen down the road with that information, especially when sharing it on social media?
DR. MAYS: Do we want to say something like step up focus on consumer education in vulnerable populations?

MS. HINES: I actually – sorry, Vickie – would just use the word risk, like to communicate risk. Something like get the risk – step up focus to emphasize risk or something like that. I think people – like everyone is talking about around it, people don’t appreciate the risk.
DR. MAYS: I’m good with that.

MS. KLOSS: I would say in the Beyond HIPAA space, you know, I would think the Secretary might step in and say, well, that really is not what we are charged with doing. I think in this current environment where the emphasis is on getting patients to have their information and promoting interoperability, that issue becomes more central.

MS. HINES: It is becoming a health literacy issue.

MS. KLOSS: All right. I think we have consensus that stays in. Leslie?

DR. FRANCIS: I was going to say one way – there are different ways to think about that, but the most limited way to think about it is any time people – any time there is the functionality to authorize downloading from an EHR, that should come with certain kinds of transparency and other provisions that could include data use agreements.

So if it is with apps, if apps are going to have the functionality to get the data from an EHR – now, that doesn’t deal with Ancestry because that is not EHR data, but it is a model. It is a start.
DR. MAYS: You can almost do that with most data. You have to go through and at the very end there are little check boxes. You can use this to do this, this, this, and then you can leave blank what you don’t want them to use.
That would really have transparency. Even though you still – a lot of people – I just want to hurry up and get through this – may check it, you really will have been alerted to its possible uses.
If those kinds of things – I have been trying to figure out what to do in terms of these terms and agreements. If something like that could be put in the terms and agreement, where you can’t check yes until you do that – it is going to be at the bottom. People will jump there. But at least you know.
MS. STRICKLAND: I would agree with that concept. I don’t know how you get that to be prescribed or as part of a rule or law or something. It speaks to our transparency. It speaks to privacy. It speaks to disclosing data, security. It really does solve a lot of the things that we are asking for.

If I am downloading an app that is going to use data, tell me who you are going to share it with, tell me what you are going to use it for, and let me decide if that is okay with me. If there is five different entities and if I don’t know what ABC Inc. does, I’m going to say no. I don’t know what they do or go research them and figure out. But it gives the user the ability to control their data.
MS. KLOSS: If the app won’t tell you, you might want to do something different.
I think one of the ways to frame that might be that – you know, it behooves everyone in this space to be prepared for what is inevitably going to be greater demand by consumers for transparency. It is already starting. Just to get on top of that.
MS. STRICKLAND: Do we want to say – we are saying step up focus on consumer education or I think we were going to turn this into risk. Do we also want to sort of give the app industry a heads up that we are going to be demanding transparency and functionality and features that allow the consumer to be – so, if we are talking about consumer education, that is one thing. You also need to talk to the developers in some way to say, hey, this isn’t okay anymore. This little button with one check box at the bottom isn’t going to cut it anymore. We are going to need to be more explanatory with who you are releasing data to and so forth, just to give them a heads up.
Some will be good actors. Some will be like, okay, if this is what is going to come down the pike eventually, no big deal to do it now. Put it in my terms and conditions, whatever. They could get ahead of it.
MR. LANDEN: This might be a good point on which we can have a discussion with our counterparts over at ONC and HITAC. ONC does fund competitions for app development. I am not sure what, if anything, they have done about this kind of end user agreement concept between the individual whose information it is and the app developer, but it could be a very fruitful discussion.
MS. KLOSS: Yes, that is really good. In the ideal world, we want this kind of transparency to be a gold standard for the industry. I think we mentioned that in some way yesterday.
DR. MAYS: Can we have a separate – rather than this all coming under those step up, can we have a transparency one? Just like we have sequestration, can we have a transparency one? Then I think – yes.
MS. KLOSS: Okay. Good.

MS. SEEGER: We should probably elaborate a little about what you meant by that at this point in the conversation.
MS. KLOSS: As we rolled up the discussion on new guidance with regard to business associate and data use, we rolled several things under that – minimum necessary and so forth. We can roll several of these themes under – we can bring transparency into the consumer recommendation.
We don’t have to write the recommendation, but I think that is a good idea.
MS. SEEGER: I think it would be very helpful for us to go back and look at the FTC’s big data report, which speaks to the need for greater consumer transparency. One of the recommendations I will bring up again that they made was an opportunity for consumers to opt out of additional disclosures of their data and the need for consumer education around why you would want to potentially opt out.
MS. STRICKLAND: Did they also go into having them disclose who they might be sharing the data with? That is kind of the thing. It says you can opt out, but opt out of what? Maybe that is the education part, but if you don’t know where they were going to go with it, you don’t know whether you should opt out. It is the same thing as hitting that button that says I agree. If you say I disagree, you don’t do – but you don’t – you didn’t read the six pages.

MS. SEEGER: A right to an accounting of disclosures?
MS. KLOSS: But most of the organizations couldn’t tell you who they might disclose it to because they don’t know what the next business opportunity is.
MS. STRICKLAND: I think that speaks to – if they did do the list to say these are the known people I might share with, then if some other business opportunity comes along, you have to have your people – the users separated by who you can have it as an opportunity and who you can’t. It does get more complex. The reality is so does our lives. We get to protect our data. If we don’t want it flying all over the place, the next business opportunity that runs around shouldn’t mean that every single user that ever said they will share it with some other company should share it with this new company because they don’t know and they didn’t say they would want to.
I know it gets to be a bigger box and a bigger box and a bigger box and a bigger box, but the reality is what we are trying to do is to protect the consumer, provide transparency, and provide security.
MS. KLOSS: Any other big buckets that we haven’t talked about? Do we have enough to draft this puppy? I think so. You will have ten reviews.

MS. HINES: So, Linda, are you going to take a stab based on all of the incredible amount of notes you have taken and then we will just do a whole bunch of iterations and privacy subcommittee calls?
MS. KLOSS: Exactly.
MS. HINES: Okay.
MS. KLOSS: We are just going to move to scheduling PCS calls between now and whenever your deadline is.
MS. HINES: Middle to late May.

MS. KLOSS: I think these two reports – I think these two deliverables kind of need to be done a little bit iteratively. My inclination would be to start on the model report and at least get a first –
MS. HINES: No point in writing the letter until you have the model firmed up.
MS. KLOSS: Then we can start doing a track of the letter and then come back and just keep moving together.
MS. HINES: To our Standards friends, just like we did when we had some non-Standards people at the December hearing, do you all want to be looped into those calls this spring? Okay.
MS. KLOSS: Thank you.

MS. HINES: Ad hoc, temporary members of the PCS. I know, just what you wanted.

MS. KLOSS: We would really appreciate that because you well know –
MS. BERNSTEIN: So, Geneva could make a note to herself that when she invites people to the meetings that are going to be following on this group meeting to discuss this report to include the other Standards people who were here?

MS. KLOSS: Alix, Rich, and then –

MS. BERNSTEIN: I mean I guess we can always invite the whole committee to participate if you want. Do you just want the people who actually had participation in this conversation and heard the conversation?
MS. HINES: Yes, that is how we did the December hearing.

MS. BERNSTEIN: So, Geneva, if you can make yourself a note when you make those phone calls to include Alix and Debra and Rich and Bill.
MS. KLOSS: So, should we look ahead at the work plan? It is now the 22nd of March. I would think we would need to – what, every other week?
MS. HINES: Every other week? Okay. And are you thinking you need two weeks to get some work done, so two weeks from this week? Three weeks from this week? What is your start?

MS. KLOSS: I think we need to start – I think the first call needs to be the –
MS. HINES: Week of April 1st or April 8th?
MS. KLOSS: 1st.
MS. HINES: So, you would like a call the week of April 1st?

MS. SEEGER: You want to take this week off?
MS. KLOSS: I want to do some work.
MS. SEEGER: So, the first call of the subcommittee could be the first week in April.
MS. HINES: And then the week of the 15th and then the week of the 29th.
MS. SEEGER: Yes, exactly.
MS. HINES: Got it.
MS. SEEGER: And then keep – and then maybe we would want to meet almost every week before the June meeting.
MS. KLOSS: I could work in person, here, in Washington, on the 10th of May – 8th, sorry, 8th.
MS. SEEGER: I have a roundtable and you are welcome to come and camp out in my office with me and Maya.
MS. KLOSS: And then we have one more meeting, really, before we have to turn over.
MS. HINES: So we have Executive Subcommittee calls on – obviously, April 4th is too early. So, the next one is May 2nd, which is before your trip to D.C. So, if we could maybe give folks some decent draft by then because that is the last official meeting before the full committee meets.
MS. BERNSTEIN: When do you turn into a pumpkin?

MS. KLOSS: 31st. Midnight. A cold and dark day for this committee. Oh, there isn’t a 31st.
MS. HINES: Linda, if you have nothing else to do on June 5th and would have time to call in, we would welcome your consult.
MS. KLOSS: I will be a citizen. I will hold my tongue until public comment.
MS. HINES: Speaking of which, we did actually get a public comment today. I sent the email around. Just so you are aware. It is just a very simple thank you for the interesting discussion and thanks for the plug of ONC’s NPRM. We definitely encourage folks to read it and comment – from Lana Moriarty.

MS. SEEGER: She is my counterpart. She is in ONC on consumer engagement. She has been listening and very engaged. Also encouraged everyone to comment on their proposed rule.
MS. BERNSTEIN: I am so glad she was able to do that. That is excellent.

MS. KLOSS: So before we wrap, could we kind of just go around and discuss what we might have done better? I will start and just say I knew this was going to be a challenging meeting just because the topic is so challenging. It has always been hard for us to stay in the Beyond HIPAA space. We are so anchored. It is just sometimes hard just to get out of it and look more broadly.
She kept flashing this at me – Beyond HIPAA, Beyond HIPAA.
I feel like I, personally, could have been a little better organized for this, but I think we came out of it in a good place. Thanks for your patience and slogging through with me.
Leslie, it has been a real pleasure having you here. I just ask for any of your observations.
DR. FRANCIS: I just wanted to thank you for including me among the cats to herd. It has been a lot of fun to be back. Please do feel free to call on me as you think would be helpful and, of course, appropriate given the FACA requirements.
MS. KLOSS: I am hoping that we can, as we refine the calendar, maybe have one of the draft iterations go out to our five experts and get some response like you were good enough to do on the earlier model. That was very helpful.

MS. HINES: We also have to do a meeting summary. Maybe when that – that needs to go out, as well. We should probably run it by them. That would be another way to get everybody –
MS. KLOSS: How does that get done?

MS. HINES: Rachel volunteered. She has nothing else to do.
MS. STRICKLAND: I think some of our experts wanted to review the minutes before we were able to distribute them.
MS. KLOSS: The minutes? The meeting summary?

MS. STRICKLAND: I think Barbara was concerned with some entries into the minutes. She just wanted to take a look at them.
MS. BERNSTEIN: I got an email from Sallie asking to add a correction to something she said yesterday. We can add that into the summary, as well.
MS. KLOSS: I was just going to run the table.

MR. LANDEN: Back to your question, I think topics like this, where we are very much pursuing ideas, are hard to manage. Certainly, it was never anticipated to be a journey from A to B to C. Despite what felt a lot at times like struggle and revisiting issues, we came at subjects at different times from different perspectives with different outcomes.

I think, overall, the effect was absolutely phenomenally well done, albeit painful. So, when I hear you say you weren’t prepared, I think, oh, god, was she well prepared. It is just a topic that did not lend itself to a methodic process. Kudos there. Well done. I could not have done anywhere near as good a job. Thank you.
My other point is to thank all of the invited panelists. I think they displayed just a wealth of knowledge. Their ability to share and synthesize and talk us through issues that we, as individuals, were not nearly as familiar with was just extremely and extraordinarily helpful.
Similar kudos to the staff for what they brought, Rachel, Maya, others.
MS. STRICKLAND: I think this went as well as it could. You know, sometimes you can over-prepare or try to and then you are just sort of herding people into a direction. I don’t think we did that.
I think everybody was free to sort of share their feelings and thoughts. That is how we came up with the principles list. I think that was a really good, solid body of work. I think that it led to a good model.
I think it went really well. I don’t have any criticisms.

DR. STEAD: First, I obviously endorse the absence of criticism. I think we got much further than we thought we would. I came in somewhat skeptical that we could get a report, as we have done. It is not done, but we understand what it needs to say.
I guess one question, sort of a process question back to the previous comment, since the report now appears quite doable, is there any reason to do the work of a separate meeting summary from the report? I think we will have the transcript of the meeting.
MS. HINES: We will have the transcript. That will be posted on the website. If they can figure out whether the audio is of decent enough quality, that will be on the website. So, yes, we have in the past had the meeting summary and the report. That is a great suggestion, Bill, to save extra work.
DR. STEAD: I think that could maybe help.

MS. BERNSTEIN: Rachel has all that extra time on her hands that she needs to fill.
DR. STEAD: We appreciate that.

MS. HINES: It is just a little bit of a finagle. The only thing is – and I can help you with this, Rachel – is just having a page somewhere in the report that just really has the date of the meeting, who was here, and just have all of that detail.

Great suggestion. Kudos, Bill.

DR. STEAD: I guess one thing I will just continue to do is encourage us to narrow the recommendations as we write them to the near-term to the degree we can. I felt we were still wandering out, but I think you could easily do that. I will obviously help.
Awesome outcome. Thank you.

DR. MAYS: I think I would echo what others said. It is like when I came into the meeting, I thought it was going to be difficult. It actually – it is like when you were talking about joy. There were points at which, whoa, the joy came back. It was good.
I think we got even further than I thought we were going to. I applaud you. I don’t sometimes have the patience you have. You stayed very focused to lead us. I sometimes am like, okay, we just have to do this another time. It is not working. You just kept pulling and talking and what have you. I think that was a good skill.
I am always willing to roll up my sleeves and work with you because you do prepare quite well for the things that we work on. I appreciate being on your team for that reason.
I think the other thing is that the space right now for this report – there is an important gap out there. I think to the extent that we have clarity in the recommendations – I think it is important that people know who we are talking to. Otherwise, the recommendations that are kind of like, oh, somebody could look and say I already do that. It is really the people that aren’t doing it. It is like when – when Mark was talking, the fact that he gets an NIH grant to actually deal with people who are in the unregulated space really tells you that there are groups that – other than our usual suspects that we should try and reach out to.
To the extent that we can either identify sometimes those populations or make sure that we are talking their language, I think it would make the report just very useful.

MS. KLOSS: Melissa are you still hanging in there with us?

MS. GOLDSTEIN: I am.

MS. KLOSS: You are going to get some kind of star. Thank you. It has been a pleasure working with you again after all of these years.
MS. GOLDSTEIN: Thank you for letting me – I had it on speak. Thank you all for letting me join you. I have enjoyed being with you and the level of conversation as always. I do think it is a very important space, especially because so often when we are talking about, for instance, a consumer privacy bill of rights or legislation or even the California law, people stay away from healthcare and say it is already heavily regulated and HIPAA takes care of that, which leaves this area that HIPAA doesn’t take care of. I think it is a very important statement.
If you all do decide to skip the meeting summary, I think it is important somehow to work in Barbara’s statement that she wanted to rework and Sallie’s that she emailed. I don’t know how to do that if you don’t do a meeting summary. Maybe just an addendum to the transcript or something. I am not sure.
I appreciate your time and letting me join you.

MS. KLOSS: Thank you very much. We hope you will be able to review a draft of this as we get further along, probably around mid-May.
MS. BERNSTEIN: I can read what Sallie said into the transcript if that helps and that takes care of that.
She wrote here – so this is Sallie Milam writing to us yesterday. As I was thinking about our discussion yesterday, I realized that I may have been unclear. I was hoping that one of you would add a statement to the record in my name.
She says, “When I stated that a private sector registry needed either a grant of public health authority or a business associate agreement with the covered entity where it is acting on behalf of the covered entity, I believe we were speaking to the use case of disclosure without patient authorization.”
She wanted to make clear the context of her comments. Was there another one like that?
MS. KLOSS: You referenced Barbara’s?

MS. BERNSTEIN: Oh, Barbara had a comment as well?
MS. STRICKLAND: Yes. She didn’t say exactly what, but she said she would catch up with whoever was doing the minutes and rectify that.
MS. KLOSS: Finally, I want to thank Rachel so much for pulling the agenda for this meeting together and so much of the planning and outreach to our invited guests and being such a good partner. I want to thank her and Maya for being such good support to the subcommittee this year. I want to thank them especially for my new brooch, which I have been wearing proudly all day. It doesn’t sit so well on a sweater, but I am going to wear it very proudly.
MS. SEEGER: Thank you, Linda. I know Maya will also want to say some words, but it is really bittersweet. The past few years of working with you have taught me so much about facilitation, listening. Your leadership skills are incomparable. Your ability to synthesize information – always we are coming into meetings prepared, where you have thoughtfully pulled together work so that we can roll up our sleeves and be ready.

The thought of you leaving us is just really too hard to bear. So, I will leave it at that, but you have been a bright light to us staff on the subcommittee. It has been a sincere pleasure working with you and getting to know you personally and professionally.
The last thing I wanted to add was to thank staff. We know that the administrative difficulties and telecom and others with this two-day meeting have had – pushed you to the limit. You have been creative in finding solutions for us in a very hot room. We appreciate everything that you have done.
Geneva, thank you so much for your support, as always, to the Subcommittee, and to Rebecca. You are both always here for us. We appreciate you lending – and Debbie – so sorry – but really always being there to lend a hand and support to us. Rebecca has kept us on task the entire time for this meeting. I appreciate you extending your help to us.

MS. HINES: Thank you, Rachel. I just want to say it is just hitting me, Linda, that this is your last official committee meeting – subcommittee meeting. I just want to echo that you remind me of – sorry to relate to the current media – the jokes, the memes about Gayle King sitting there perfectly in equanimity while someone is standing over her with the loud tone of voice. It sort of reminds me no matter what is going on, you are just calm and cool, you know, bring it on and I will get us back on point, and always forward-looking.
I just realized in writing a little note for setting up our meetings that Frank Pasquale will be on our calls. To some degree, this could be the opportunity to hand the baton to the future generation of the Privacy leadership of the Committee. I am hoping that this product and the activity to achieve this letter and this report will actually be a way to really bring him in. Maybe we can engage him – hey, Frank, if you are listening at some point to the audio of this – as a way of really taking the baton.
I always have to look forward.

MS. KLOSS: We will do our best to bring him up to speed. God help us. I am not sure how we do that to replicate this experience.
MS. HINES: He can download it on his phone and listen to it in all of his spare time.
MS. KLOSS: We are adjourned.

(Whereupon, the meeting was adjourned at 2:20 p.m.)