Department of Health and Human Services

NATIONAL COMMITTEE ON VITAL AND HEALTH STATISTICS

Subcommittee on Privacy and Confidentiality

February 23-24, 2005

Hubert H. Humphrey Building
Washington, D.C.

Meeting Minutes


The National Committee on Vital and Health Statistics Subcommittee on Privacy and Confidentiality was convened on February 23-24, 2005 at the Hubert H. Humphrey Building in Washington, D.C. The meeting was open to the public. Present:

Committee members

  • Mark A. Rothstein, J.D., Chair
  • Simon P. Cohn, M.D. (by conference call)
  • Richard K. Harding, M.D.
  • Robert H. Hungate
  • Harry Reynolds

Absent

  • John P. Houston, J.D.

Staff and Liaisons

  • Maya Bernstein, Lead Staff
  • Amy Chapper, J.D., CMS
  • Jodi Daniel, HHS
  • Beverly Dozier-Peeples, J.D., CDC
  • J. Michael Fitzmaurice, Ph.D., AHRQ
  • Kathleen H. Fyffe, ASPE
  • Marjorie S. Greenberg, NCHS/CDC (February 24th only)
  • Debbie Jackson, NCHS/CDC
  • Evelyn Kappeler, OPHS
  • Lora A. Kutkat, M.S., M.P.H., NIH
  • Susan McAndrew, OS/OCR
  • Helga Rippen, M.D., Ph.D., ASPE/OS
  • Marietta Squire, NCHS/CDC
  • Steven J. Steindel, Ph.D., CDC
  • Sarah Wattenberg, SAMHSA
  • Patricia Watts, Dept. of Veterans Affairs

Others

  • Wendy P. Angst, CapMed, Bio-Imaging Technologies, Inc.
  • Danielle Belopilisky, National Journal’s Tech Daily
  • Sue Blevins, Institute for Health Freedom
  • Michael J. DeCarlo, BlueCross BlueShield Assn.
  • Joyce Dubow, AARP
  • Jose Escalante, Nat’l Council for Community Behavioral Healthcare
  • Linda F. Golodner, NCL
  • Robin Kaigh, private citizen
  • Rebecca Kirch, American Cancer Society
  • Bartha Maria Knoppers, University of Montreal
  • Robert Levine, Juvenile Diabetes Research Foundation
  • Len Lichtenfeld, American Cancer Society
  • Bernard Lo, UCSF
  • Marilyn Zigmund Luke, AHIP
  • Janet T. Martino, MedicAlert
  • A. Thomas McLellan, Treatment Research Institute
  • Thomas Murray, The Hastings Center
  • Joy Pritts, Georgetown University Health Policy Institute
  • Linda Rosenberg, Nat’l Council for Community Behavioral Healthcare
  • Kathryn Serkes, Assn. of American Physicians and Surgeons
  • Laura E. Vartain, Wexler and Walker Public Policy Associates
  • Alan Westin, Ph.D., J.D., Columbia University
  • Kristin Wolgemuth Fitzgerald, Fitzgerald Consulting, Inc.

EXECUTIVE SUMMARY

ACTIONS

  1. The Subcommittee passed a motion to endorse the stated revisions of Recommended Actions 10.1 and 10.2 of the document entitled Observations and Recommendations Relative to Privacy of E-Prescribing. Subcommittee members were unanimously in favor of the proposed amendments.
  2. Ms. Kathleen Fyffe and Ms. Maya Bernstein will summarize testimony for the first round of hearings, to be circulated to Subcommittee members for input.
  3. A poll will be circulated for the third set of privacy hearings (to be held in April, May, or June 2005).

All official NCVHS documents are posted on the NCVHS website.

Opening remarks by Mr. Rothstein The United States is committed to a system of electronic health records (EHR) within the next decade. Electronic records are expected to reduce costs and medical errors; improve quality by providing greater access; and increase safety, research, and public health benefits. Substantial implementation problems must be addressed. Realizing the benefits of EHRs while protecting privacy and confidentiality poses great challenges to bioethics and health policy. The Subcommittee on Privacy and Confidentiality recognizes the difficulty and importance of these issues.

Panel I: What is Health Privacy and Why is it Important?

Introduction to Health Privacy Thomas Murray, Ph.D.

Dr. Murray used Alan Westin’s working definition of privacy: the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others. He provided background on the state of electronic medical records (EMRs) and their usefulness to patient care; population health; public health; health services research and quality improvement efforts; population-based research (e.g., genetic research linkages); and registry research focusing on particular outcomes (e.g., cancer). He delineated barriers to the use of EMRs including patient privacy and lack of public trust, and he noted the importance of distinguishing between: control over content and control over access; purposes of access; and the relationship between the patient and the entity seeking information. People need assurance that the technology works reliably and well, especially with sensitive information. People might be reluctant to seek treatment if they fear that the EMR could hurt them later in life. Medical records (including EMRs) are composed of disparate information. Identifying sub-categories within the record would limit the number of major information categories and make patient choices more manageable.

Comparative Notions of Health Privacy Bartha Maria Knoppers, Ph.D.

Recognizing an increase in international trials, Dr. Knoppers described the European Union’s approach to dealing with personal/health data with an ultimate goal of portability and equivalent protection, rights, and treatment of medical data. A legal directive passed in 1995 incorporated the principles of the OECD Guidelines on the protection of privacy in trans-border flows. These principles, not limited by any one technology, are at work today (see detailed summary or transcript for explicit information on health data principles and application in France, as an example). How to evaluate adequate levels of protection for portability between countries and key coding were other topics discussed, as was Canada’s approach to personal health information protection. Health privacy is seen as a fundamental human right, a subjective right, and a constitutional right in different countries. In addition, the importance of data use and transferability was raised relative to quality of research. Barriers to research exist within current ethics and legal requirements; “semantic” inoperability; and incomparable data. A common language is needed because validation and gains in statistical significance cannot happen without an understanding of how privacy is protected. The notion of broad consent and authorization is important, as is the need for accessibility to certain minimal data on populations (rather than on individuals) for public health purposes in order to address conditions such as anthrax, SARS, and avian flu (see detailed summary or transcript for more information).

Discussion The challenge of autonomy in health care is to figure out to what extent patients have control over the content of personal health records and over who has access. Creating a hierarchy within the medical record by broad categories was recommended along with elaboration of broad electronic record principles. Patient empowerment was seen as critical (including the right not to know, a legal “quagmire”) although ironically, a large margin of autonomy is eliminated by a preoccupation with privacy. For example, ethics committees no longer allow individuals to give broad consent for particular longitudinal studies.

Empirical road testing with providers, patients, and the public is needed to determine subcategories for sensitive information such as OB-GYN, genetics, infectious disease, and mental illness. A neglected area of privacy involves professional ethics found in the criminal code or the constitution of some countries. A suggestion was made to address the hierarchy of privacy and patient protection within EMRs by first asking providers and then testing several approaches to translating principles into systemic applications, with input from consumer and patient organizations as well as from other countries. To accomplish the “human moral goals” of the work, it is important to be clear and transparent, to empower trial participants, and to provide them with fewer rather than many choices.

The laws of individual countries determine rights to privacy. While model contracts exist, most countries have legislation covering sensitive health data that does not use the language of limited data sets. There is a role for contracts, but only with minimal content required by law. A discussion about opting-in versus opting-out clarified that privacy law protections do not legally apply if a patient is sufficiently anonymized and unidentifiable. While the purpose of information makes a difference, it was suggested that the only kind of opting-in that can be presumed is for routine tests that provide basic information for quality control. A judgment call must be made about what is reasonable to exclude from medical records when weighed against medical costs and other factors. Information transfer and third party access raise further challenges. In the case of life insurance, health information is an entry point for other economic goods (e.g., mortgage, car, or loan) but it is also a forced disclosure of intimate information. Insurers (and employers) should only get access to limited categories of data. The capacity to limit fields of disclosure must be built into EHRs of the future. Several approaches to protecting genetic information in Europe were delineated (see detailed summary or transcript for further information).

How the Public Views Health Care, Privacy, and Information Technology Applications Alan Westin, Ph.D., J.D.

Dr. Westin directs a new program on information technology, health records, and privacy at the Center for Social and Legal Research. He views the electronic health record initiative as a positive potential step in reshaping the nation’s health care system for purposes of enhancing patient care, reducing medical errors, and reducing high paper handling costs. Reshaping the medical record and flows of health information will impact the first sector (patient care), the second sector (payment and quality assurance), and the third sector (social uses of patient information for employers, licensing, insurance, research, etc.).

From 1978 to 2005, 14 national surveys about health privacy issues were conducted. Results were shared from a recent national survey sponsored by Harris Interactive and the Center for Social and Legal Research on public views of computer application, the effects of HIPAA, and attitudes toward health record computerization [due for release on February 23, 2005] (see detailed summary or transcript for specific results). Health and financial information were found to be the “top two scorers” among privacy concerns; the public views technology as having both enormous benefits and serious problems. By very large majorities, people are concerned about privacy and security implications of electronic collection and use of health information. The “tie-breaker” question of the survey was: “Overall, do you feel that the expected benefits to patients in society outweigh potential risks to privacy or do you feel that the privacy risks outweigh the expected benefits?” The public is deeply divided about this question (48 to 47 percent). People who believe that privacy risks outweigh expected benefits are widely distributed across demographic categories.

Over time, Dr. Westin has developed three or four trend questions that tap fundamental attitudes about privacy. Fifty-six percent of the public scores high in privacy concerns about EHRs. The number of people intensely concerned with privacy in the health area is almost double the number of people concerned about general consumer privacy. This is a public mandate for a privacy design specification for any EHR system. Advocates, managers, and builders of this system need to articulate what laws, rules, practices, technology arrangements, education about privacy, and kinds of positive patient experience are needed to engage large portions of the public. Patient access should allow privacy rights to be exercised anywhere. Regional programs are the “beta sites.” Empirical objective research must highlight impact on patients in many settings. There is much to be shared and learned from other countries.

Dr. Westin recommends the development of an institutionalized privacy-by-design working group (analogous to the ELSI Program of the Human Genome Project) that is active, well-funded, and impressively staffed. He believes that it would be a “privacy-by-design mistake” to think about a patient medical record as one unified document. Rather, he supports a six- or seven-segmented and formatted medical record that the technology stores and retrieves within these segmented parts (see detailed summary or transcript for further information).

Discussion There must be a clear mandate for privacy from the start. Those working on privacy need to convince legislators that an acceptable privacy system exists. Congress and state legislatures should take their cue from how the public feels about the EHR system. A free-standing, high-prestige, well-funded, and well-staffed entity that functions as a privacy impact assessment group should be institutionalized. It is recommended that the Subcommittee examine whether developing EHR systems incorporate interoperability adequately. There is presently a good opportunity to develop more uniformity in medical record formats. The trusted keeper solution is very promising for anonymizing data without a preserved linkage file (see transcript for example). Data security, which enables confidence in a privacy and confidentiality system, belongs in the technology sector rather than in the policy orientation of the privacy sector. The EHR system will be another driver in the creation of a biometric identifier system, which will help quell identity theft.

Panel II: Privacy in Health Care and in Society

Privacy and Health Care Bernard Lo, M.D.

Dr. Lo described how an integrated health care system in Northern California used EHRs to respond quickly to the recent Vioxx recall. Prozac provides another example of EMR use in determining a drug’s benefits versus risks (vis-à-vis depression and suicide in children and adolescents). Database research can inform patients and doctors of concerns, uncertainty, and controversy about the use of certain drugs as part of quality control in prescribing. While HIPAA protects medical health records, mental health is often separate or “carved-out” of the medical system. Confidentiality prevents stigma and discrimination while showing respect for the individual but it is not an absolute ethical goal. Policy goals include the protection of confidentiality and the ability to access information for clinical care, public health, and research. Patient authorization must be sorted out as well as the use of individual health data for public health and research purposes. IRB members and researchers need guidance about what kind of outcomes database research is permissible under HIPAA. Recommendations include: 1) public education about the value and limitations of database research; 2) consideration of some outcomes research as critical to public health, which should therefore require compassionate notification rather than consent or authorization as the point of entry into that data; and 3) a focus on confidentiality as well as privacy (see detailed summary or transcript for further information).

Patient Interest in Health Information Technology Joy Pritts, J.D.

A list of benefits and risks of EHRs was presented. Benefits include: improved quality of care; legibility of records, which reduces errors; improved accessibility; elimination of duplicative tests; and streamlining of administrative processes. Risks include: concerns about loss of control; questions about the adequacy of security and protection; accessibility; stigma attached to certain medical conditions and associated fear of job and health insurance loss; police access to personal information; and identity theft. A 2004 survey by HIPAA Advisory showed that many institutions are not following up or monitoring privacy policies. In addition, there is a segment of the younger population that provides inaccurate information that can subvert the system. Congress needs to resolve the fact that HIPAA does not directly cover everyone who has access to health information and that penalties only apply to covered entities. More practically, notices of privacy practice need improvement.

A national database provokes a “very adverse reaction” as do unique identifiers (due to their potential for use for unrelated purposes). The risks of EMRs, whose greatest impact is on those who are ill, change depending upon background information. In the clinical community, trust about what can be done with health information depends on how a person has been treated in the past. The curiosity factor in small communities can threaten privacy due to stigma (see transcript for example).

Privacy and Substance Abuse Issues A. Thomas McLellan, Ph.D.

Addiction treatment should be part of the EHR because it is necessary for public health and safety. The management of this disease is important to the treatment of many other chronic illnesses such as diabetes, hypertension, asthma, breast cancer, sleep disorders, and chronic pain. The IOM’s Crossing the Quality Chasm report, which defines principles of patient-centered care, is at the base of the Treatment Research Institute’s approach to integrating addiction and substance abuse treatment and information into EMRs. The patient must be notified of information exchange and must have the right to say no. Special provisions for addiction treatment must be made due to special problems that have arisen from decades of under-funding and segregation, most particularly around the capacity for computerized, integrated information management.

Discussion Outcomes research using large databases is more difficult to do when certain types of information preclude access to comprehensive, integrated information. The notion of the EHR being “consumer-driven” was challenged because the consumer does not have much input about who the record goes to or how it is used. Retrieval rules, more flexible than exclusion rules, fit with the privacy rule by giving patients an option. Many agreed that patient safety and efficacy come before patient preference. Patients cannot expect complete health care if they exercise the right to deny access. How these rules apply to public health and safety concerns was discussed. One way to build trust in an EHR system is to enforce penalties for misuse of information. At the macro level, these issues are difficult to solve until the general issue of U.S. health care access is resolved. Key questions revolve around who has authorization and how to keep health information away from people who can do “harm.” A recent IOM report supports the notion that health information is owned by the patient rather than the health system. A suggestion was made to seek guidance from financial institutions about handling this kind of information. The system should have complete information (to include sensitive areas) that is adequately protected. Dr. McLellan thought the link between substance abuse and illegal activity irrelevant within the confines of health care but others disagreed. Ms. Wattenberg (from SAMHSA) mentioned protection of substance abuse information in health records relative to the creation of the federal Part 2 regulation.

Patients must understand the potential benefits of outcomes research. Some thought that, despite waivers, automatic access to database public health research might create “slippery slopes” in areas such as homeland security. A discussion about public awareness approaches ensued (see detailed summary or transcripts for specific approaches). It was noted that in database research, potential harm is statistical; and that information can be given to researchers in a de-identified format. Decision-makers must be seen as trustworthy, accountable, and transparent about what is being done, why it is important, and what review process is in place. Public health is considering models that allow governors or state departments of public health (DPHs) to declare emergencies with broad powers and due process procedures. Laws must address direct use of information by researchers.

Statements from the Public

Medical Privacy Coalition Kathryn Serkes

Ms. Serkes, representing the Medical Privacy Coalition, presented results of a survey by the Association of American Physicians and Surgeons (AAPS) that cited concerns about privacy and confidentiality (see detailed summary or transcript). The Medical Privacy Coalition objects to the current standard privacy enforcement of public health usurping individual rights. Patients who fear disclosure withhold information. The biggest concern is government access to medical records. Patient confusion about HIPAA (seen as outdated) was mentioned. To avoid privacy intrusions, more patients are using cash-based practices. Patients say they want EMRs but not necessarily a government-run nationalized database of medical records. The Coalition would like to see consent reinstated. Survey results will be emailed to Ms. Marietta Squire.

Institute for Health Freedom Sue Blevins

Ms. Blevins read a prepared statement from Robin Kaigh, an attorney who has tracked medical privacy since 1996, which spoke of the “danger” of wrongful access to or exposure of personal health record information and of the promotion of an opt-in or opt-out system. Representing the Institute for Health Freedom, Ms. Blevins also advocated for personal consent. She read three statements from HHS’s analysis of the federal medical privacy rule about electronic information that support the Institute’s contention that “the electronic health record is a recipe for privacy invasions” (see transcript for quotes). She emphasized a “huge” difference between consent and notification, stating that what occurs at present is “coercive consent.” Unless consent is reinstated and rights to privacy upheld, U.S. citizens will have three options: 1) information is shared without permission; 2) privacy is maintained because patients lie to their providers; and 3) care is refused to maintain privacy. Ms. Blevins will submit copies of the statements as well as a copy of a study noted by HHS.

Subcommittee Discussion

The Subcommittee revised and approved the draft letter hand-out, whose purpose was to provide language for the Subcommittee on Standards and Security to incorporate into their overall statement for submission to the full Committee meeting of March 3-4, 2005. Also revised were Recommended Actions 10.1 and 10.2 within Observations and Recommendations Relative to Privacy of E-Prescribing, drafted by the Subcommittee on Standards and Security (see detailed summary or transcript for further information). It was noted that written testimony of Michelle Bratcher Goodwin about the implications of electronic health records, and health care records in general, on minority populations would be entered into the record.

National Health Information Network

Panel III: Disease and Health Advocacy Groups

American Cancer Society Len Lichtenfeld, M.D.

Discussion centered on the internet and its impact on the American Cancer Society (ACS), its collaborators and constituents. Topics covered ACS’s internet-based information strategy, policies, and specific programs; the National Cancer Database; and electronic health records and related technologies. Dr. Lichtenfeld described how ACS addresses privacy concerns and how privacy issues impact activities and parties served. ACS does not use identifiable information for internal activities but does gather aggregate information to track trends. It collects no information and requires no information about or from visitors to its website Cancer.org. ACS works closely with the American College of Surgeons and their Commission on Cancer (CoC), which operates the National Cancer Database (NCDB). CoC utilizes aggregate data to improve the quality of care for cancer patients. Funding is being sought for a three-phase initiative to implement a plan that improves these data processes. The need for everyone at the local, state, and national levels to work together to address broad legislative, regulatory, legal and ethical concerns related to privacy was emphasized. The country needs clearly understood rules to strengthen valid research and quality improvement as well as guidance that allows for information-sharing across state borders.

Juvenile Diabetes Research Foundation Robert Levine, M.D.

Dr. Levine believes that the biggest risk to people with diabetes, heart disease, cancer, HIV/AIDS, or other chronic diseases and disabilities comes from a failure to share and inadequate use of personal health information and, at times, from placing a higher value on protecting privacy than on a person’s life and health. The health information environment should respect and serve patients, the health system, and the public. A fundamental principle is that medical records information is owned by the individual rather than by third parties who might be health record keepers. Therefore, information users are involved in a voluntary exchange of value or definable direct benefit to the individual. One idea put forth is that individuals should be “paid” (as in social benefit; personal benefit; or financial benefit) for each use of their information. There must be public confidence in extensive and open exchange of sensitive information between trusted parties (see detailed summary and transcript for further information).

National Council for Community Behavioral Healthcare Linda Rosenberg, M.S.W.

At the heart of the privacy issue in mental health is the tension between discrimination against people with mental illness and the importance of an automated system. There is a great divide between what is effective and what people “get” every day. Representing the National Council for Community Behavioral Healthcare, Ms. Rosenberg described the lack of parity between treatment of mental and physical health and presented statistics about people suffering from mental illness. She articulated the goals of a 2004 report issued by the New Freedom Commission on the use of technology to access mental health care and information. Discussion topics included: why an electronic health record is important for the field; obstacles; and what assistance is needed to leverage a national health information system that improves the quality of mental health care and protects patient safety, privacy, and confidentiality. The National Council needs to step up its involvement within the health care information technology industry; work with its members to develop an electronic health record strategic plan; and ask that specific federal funding and technical assistance be earmarked to support behavioral health efforts.

Discussion Prevention was mentioned as a major mission for the country. Resistance to gathering information (sometimes political) about obesity or mental health in school children was raised as a privacy concern. The need for uniform, simple, clear, understandable and reasonably applied regulations was reiterated. Discrimination and fear of stigmatization, exacerbated by electronic retrievability, were highlighted as concerns likely to increase the anxiety of those with mental illness. That some might, as a result, forgo beneficial treatment is a topic recommended for discussion by the Committee. “Ownership” of personal health records was discussed relative to public health and research, with the suggestion that the focus stay with the professional relationship and benefit to the patient rather than with obtaining individual permission for use. A growing discomfort about how well patients are served by the system was identified. Outcomes depend on access to changing information flows, frequency of sampling, and quality of interpretation and communication of dependent relationships. Failure to use data could be construed as negligence. Creating trusting partnerships is far more effective than threatening litigation as a means of improving public health.

The role of EMRs relative to workforce skill was discussed. Other topics included the need for strategies to engage and keep people in treatment. There was general agreement about including behavioral health in EHRs. Discussion of the use of EHRs and the impact of HIPAA on research revolved around whether treatments are working, a question that cannot be answered until portable, standardized datasets and data formatting are developed.

SUBCOMMITTEE DISCUSSION

Future meetings Details of the Subcommittee’s future hearings were delineated. The challenge of balancing public good, patient good, provider good, and research good in order to come to agreement about how these elements work together to change environments was stressed, as was the need for public education. Ms. Fyffe will finalize a composite document of issues to consider. Ms. Marietta Squire will circulate a calendar to finalize dates for future hearings (see transcript for further discussion and hearing details).

Panel IV: Consumer Advocacy Groups

AARP Joyce Dubow

Representing the AARP Public Policy Institute, Ms. Dubow focused on consumer and public education about the National Health Information Network and related privacy issues in relation to adversarial stakeholder relationships. Background information was provided and the merits of health information technology (HIT) were articulated. It was noted that although clear and agreed-upon definitions are lacking, the importance of privacy and confidentiality to consumers cannot be overstated. AARP’s policy on privacy and confidentiality supports compatible procedures that allow health care delivery and research to occur, with the understanding that patients control and know how to access their information. AARP supports actions such as written consent and covered entities that make individually identifiable health information less vulnerable to inappropriate disclosure and misuse. Consumers are not yet fully involved or engaged in this discussion.

National Consumers League Linda Golodner

Ms. Golodner provided a patient-oriented perspective on privacy in relation to HIT. Background was provided about the nation’s oldest consumer organization. The National Consumers League (NCL) urges policy makers in HHS to integrate a series of principles about information access and control; disclosure and accountability; functionality; and governance (see detailed summary or transcript for specific principles).

Discussion A case example was presented to clarify policy differences between AARP and NCL. A discussion of third party use ensued relative to the fundamental question of whether there are ways to balance the need for information required by insurance companies or long-term facilities with the invasion of individual privacy. At present, there is a limit to regulating privacy on access issues where third parties have economic leverage. Clarification was sought on the definition of “personal identifiers” and “de-identification.” If data were 100 percent de-identified, bias would shrink and consumers would be more fully informed of the whole picture. Questions revolved around whether consumers should trust IRBs and around the effectiveness of the HIPAA rule. The possibility of a “two-tiered” health information system was raised (see detailed summary or transcript for further information).

Panel V: Privacy Concerns Related to Personal Health Records

MedicAlert Janet Martino, M.D.

MedicAlert privacy and confidentiality policies and services were described. In collaboration with CapMed, MedicAlert plans to introduce, late in 2005, a personal EHR that is compatible with health care industry standards. MedicAlert recognizes the need to be more broadly interoperable so that EHRs can accept and send data to various organizations in order to expand the scope of information sources and receive information more directly from provider EHRs, payers, and labs. The next-generation architecture will include the HL7 Reference Information Model and functional outline. To accomplish its goals, MedicAlert is enhancing its security measures. Upcoming issues were also delineated (see detailed summary or transcript for further information).

CapMed Wendy Angst, M.H.A.

CapMed’s consumer-centric personal health record (PHR) model looks for the best way to engage consumers in partnering more effectively in their health care. It allows patients to have complete use of their records without having to input information from the physician. The electronic interface between CapMed and EMRs (e.g., NextGen) was articulated. This PHR model aggregates information from electronic health records that export data and interface with home monitoring devices. Four levels of CapMed users were described as were ways that CapMed works with consumers. Research findings (2002) were presented on CapMed’s personal health record users. CapMed’s model is beneficial at the patient/family level; everything is codified appropriately and exchangeable. A patient’s record is mapped in a problem-oriented format with links to education and automatic reminders. Within the application, a person can view, manage, or update a PHR anywhere in the world, allowing for immediate access to emergency data. An examination of privacy in terms of access, sharing, and ownership was encouraged.

Discussion Pitfalls of information filtered by patients prior to review by physicians were discussed. Various interest sets were described (such as patients who want control, providers who want enough information to do their jobs, and third party users). The question of how to make this medical information interoperable was raised. A common data/information structure model is needed. One framework for interoperability (developed by the Computer-Based Patient Record group) was mentioned. MedicAlert intends to begin identifying common information models that allow mapping of different data structures to each other. The patient/provider relationship was seen as critical to gathering reliable data. Consumers are given the power to withhold information but not necessarily the ability to wield this power responsibly. Patient authorization and consent were also discussed (see detailed summary or transcript for further discussion).


DETAILED SUMMARY

DAY ONE: FEBRUARY 23, 2005

CALL TO ORDER, INTRODUCTIONS, REVIEW OF AGENDA, OPENING REMARKS

Opening remarks by Mr. Rothstein The United States is committed to a system of electronic health records (EHR) within the next decade. Electronic records will purportedly reduce costs and improve quality by providing greater access to records from remote locations; accurate and fast information about cognitively or otherwise impaired individuals; cost savings; increased safety; and research and public health benefits (as supported by Senator Bill Frist in The New England Journal of Medicine in January 2005). It should also be noted that a study of medication errors in 2003 (summarized in American Medical News) found that computer entry and other electronic errors far outnumbered medication errors caused by illegible or unclear handwriting. Substantial implementation problems must be addressed. Other concerns about a cradle-to-grave, comprehensive, longitudinal electronic health record center on health privacy and confidentiality issues. Some current privacy protections emanate from the fragmented nature of health records in a paper-based system lacking coordination and integration (see transcript for examples).

Realizing the benefits of electronic health records while protecting privacy and confidentiality poses one of the greatest challenges to bioethics and health policy in recent years. What level of patient control over record contents will be permitted? Too little would signify insufficient privacy and/or a system that is perceived to be overly intrusive; too much control might jeopardize health care quality. Health care providers might want to supplement the EHR or they might be concerned about medical error liability.

The Subcommittee on Privacy and Confidentiality recognizes the difficulty and importance of these issues. To help frame the issues, background and perspective is being sought from some of the world’s leading experts.

Panel I: What is Health Privacy and Why is it Important?

Introduction to Health Privacy Thomas Murray, Ph.D.

Privacy about health information is informational privacy. Dr. Murray will use Alan Westin’s definition of privacy as his working definition: the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.

Background on the state of electronic medical records (EMR) By 2002, 17 percent of U.S. primary care physicians were using electronic medical records, compared to 58 percent in the U.K. and 90 percent in Sweden. E-health comprises numerous categories: 1) medical records (privacy is at issue); 2) communication between providers and patients (privacy is implicated); 3) decision support (individual recommendations might be forthcoming electronically); and 4) knowledge-based management (security, and therefore privacy, are big issues).

The usefulness of the EMR is experienced in patient care; population health; public health; health services research and quality improvement efforts; population-based research (e.g., genetic research linkages); and registry research that focuses on particular outcomes (e.g., cancer). Trustworthy intermediaries play an essential role in well-functioning registries that allow for complete or near-complete data.

Important barriers to the electronic medical record include: 1) many physicians find the learning curve very stressful; 2) time spent answering patient emails is greater than time saved in phone and office visits (confirmed by at least one study); 3) many (older) physicians don’t like to type (hearsay, although confirmed by some studies); 4) cost (a full EMR could cost a practice $50,000/doctor in Year I); 5) incompatible software and the fear of ever-changing software (which efforts to standardize data would mitigate); 6) lack of reimbursement; and 7) patient privacy. Another important challenge is public trust (the Palm Beach incident was noted, in which names of people being treated for HIV were accidentally emailed out), although these incidents can occur outside the realm of technology (see transcript for example).

It is important to distinguish between:

  • Control over content of the EMR and control over access to that content. Giving a patient control over content could result in less than optimal diagnosis or treatment. Patients may not want certain things in their health record at all.
  • Purposes of access, which could include providers, prospective or current employers, insurance companies, a federal agency investigating potential health fraud, or researchers.
  • The relationship between the patient and the entity seeking the information (for example, a PCP would have a significantly different relationship than an ER physician).

Data in electronic health records (EHRs) are distinctive in their persistence, keeping data elements intact from cradle to grave, in contrast to today’s records, which follow us imperfectly (see transcript for example). Electronic data are also ubiquitous in not being limited by time and place, as paper records have been. People will need assurance that the technology works reliably and well.

A hypothetical patient, “Amy,” was introduced. She had enuresis at age five but the pediatrician found no significant physical problems. The problem eventually went away. At age 12, Amy’s father found a pack of cigarettes in her room and the doctor gave her a nicotine patch. Both episodes were recorded in the EMR. At 15, an acquaintance pushed her to try an illegal drug. She became frightened and wanted to talk to her doctor but was afraid to do so because she did not want this discussion on her permanent record. She went to the doctor anyway. Out of college for one year at age 23, Amy became somewhat depressed and was diagnosed with a mild case of clinical depression by her physician. She received medication, which was recorded in her record.

It was pointed out that people, in general, might be reluctant to seek treatment for depression due to fear that the EMR could hurt them later in life. Some information in the record is time-specific, some is inane, and some is very sensitive. The medical record (including the electronic medical record) is composed of disparate information and therefore is not unitary. Identifying sub-categories within that record would be enormously valuable in limiting the number of major information categories. This would be useful because people tend to tune out when confronted with many choices.

Comparative Notions of Health Privacy Bartha Maria Knoppers, Ph.D.

There is an increasing number of international trials, greater data comparability, and a growing flow of data between countries, researchers, and participants who stand to benefit from the trials.

The European Union is currently composed of 25 countries. As such, it deals with the same kinds of difficulties as the U.S. national approach, in which states usually have jurisdiction over privacy and health legislation. A legal directive on the processing of personal data, including health data, was passed in 1995 by the Union. European countries were then forced to harmonize with the directive in their domestic national legislation in order to offer an equivalent level of protection in the other member countries of the European Union. The ultimate goal was portability between countries with equivalent protection, rights, and treatment of medical data. The directive incorporated the principles in the OECD Guidelines on the protection of privacy in trans-border flows. These principles are reflected in the European directive 15 years later and countries outside of the European Union have been equally inspired to adopt these principles. Manifesting power and flexibility, the principles are not limited to any one technology.

The principle dealing with health data is negative (“you may not…” without specific and suitable safeguards) within the construct of a non-limited list that allows for additions. One principle speaks to explicit consent except where laws authorize otherwise, such as for national security, public emergencies, or state surveillance programs. An exception to consent is listed for those who are weak, vulnerable, or incapable, in order to protect their vital interests. A very large exception includes processing for the purposes of preventive medicine, medical diagnosis, the provision of care or treatment, or the management of health care services provided by health professionals subject to professional secrecy. A health care system cannot work without a basic exception to explicit consent by the individual.

Prior to the directive, in 1978, France adopted its own data protection law (“loi”). Modified in August 2004, the new French law confirms the prohibitions as stated in the European Directive about processing health data (Article 8), with the same provisions to protect vulnerable persons and the same article about processing with the possibility of override by the state or system subject to professional secrecy. Two new additions are the exception for processing for research in the health sector and the possibility of making data anonymous without express consent, with authorization of the National Commission for Informatics and Freedom (CNIL). Research is included in the list of exceptions because, for quality assurance in any given health care system, data must exist to determine whether cut-offs for different tests are accurate. Surveillance and incidence data is important, for example, when determining whether HIV is on the rise and what populations are at higher risk. Anonymization is legally and ethically expedient because, in a practical and reasonable context, it eliminates many problems. However, it can work to the detriment of science because once anonymous, data cannot be updated as they can’t be traced. No one will know if something changes clinically because the data remain static in time. After five years, data is good only for controls rather than for the long-term. The individuals needing protection from forward-moving research may be short-changed.

When the research exception was added in France, a consulting or advisory committee under the Research Ministry on the treatment of information in research in the health sector was also added. Operating since August 6, 2004, the committee examines requests for processing health data for research and gives an opinion on research methodology prior to scrutiny by the big body. Certain conditions remain even after an opinion and CNIL authorization is obtained. The data must be coded to allow for identification, which is minimalist protection. Once published, the results create a possibility of opposition that differs from consent and refusal. If someone opposes the use of their data, researchers won’t deliberately seek their consent for research or processing. Failure to respect such opposition is subject to sanctions of the higher body.

The question of how to evaluate an adequate level of protection for portability between countries was raised. One must look at the rules of law in individual countries, to include general and state laws, security measures, and so on. It is very rare for a law to go beyond its own borders. Relative to the U.S., the European Commission has made some decisions about the adequacy of American privacy principles (July 2000), and since then, HIPAA has come into effect. If personal data are collected in the European Union and transferred to the U.S. for pharmaceutical research, member state laws apply to the collection and processing of data prior to the transfer (noting that a country has to meet its own laws before it transfers). Once transferred to the U.S., safe harbor principles apply. In 2000, the Union determined that data used for pharmaceutical research and other purposes should be made anonymous when appropriate. If a European wants to withdraw data that is already in the U.S. for research purposes, the universal Helsinki declaration takes effect. That is, a person or sponsor who chooses to withdraw from research is able to do so. Participants are notified from the start that data collected previous to withdrawal may still be processed.

A frequently asked question is about key coding. If the country to which health data is being transferred cannot access the key coding from the country of origin, can the code be unlocked to identify the person? For example, there is no transfer of personal data if a company in the U.S. does not have a unique key code held in Germany that enables researchers to identify research subjects under special circumstances (i.e., for needed medical care or if toxicity of an individual or small group is discovered). Therefore, the directive would not apply to such data.
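The key-coding arrangement described here can be illustrated with a minimal sketch, assuming a simple tabular export; the field names, code format, and file names below are hypothetical and are not drawn from any system discussed at the hearing. The essential point is that the transferred file carries only a random code, while the key linking codes to identities stays with the originating institution, which alone can re-identify a subject if, for example, follow-up care is needed.

```python
import csv
import secrets

def key_code_records(records):
    """Replace direct identifiers with random codes.

    Returns (coded_records, key_table). The coded records can be
    transferred for research; the key table is retained only by the
    data originator, who alone can re-identify a subject under
    special circumstances (e.g., an individual toxicity finding).
    """
    coded, key_table = [], {}
    for rec in records:
        code = secrets.token_hex(8)      # random, non-derivable pseudonym
        key_table[code] = rec["name"]    # kept only in the country of origin
        coded.append({"code": code,
                      "diagnosis": rec["diagnosis"],
                      "lab_result": rec["lab_result"]})
    return coded, key_table

if __name__ == "__main__":
    source = [{"name": "Jane Doe", "diagnosis": "asthma", "lab_result": 4.2}]
    coded, key_table = key_code_records(source)

    # Only the coded file is transferred abroad for research.
    with open("export_for_research.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["code", "diagnosis", "lab_result"])
        writer.writeheader()
        writer.writerows(coded)

    # The key stays with the originating institution, under its own law.
    with open("key_held_by_originator.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["code", "name"])
        writer.writeheader()
        writer.writerows({"code": c, "name": n} for c, n in key_table.items())
```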

Canada asked the European Commission whether the new Canadian Personal Information Protection and Electronic Documents Act met equivalent protection. Information about body parts or substance donation, health services information, physical and mental information, and samples of “wet data” are subject to this Act. Originally, the Act was not supposed to apply to health but health data research comes with a good deal of public/private sponsorship. To ensure flexibility over time, Canadians attached an annex as part of the law – the Canadian Standards Association code on the protection of data. There is also a Tri-Council policy statement that covers all research involving humans (demographic, medical, historical), which is updated by the Canadian Institutes for Health Research draft Privacy Best Practice Guidelines. Since 1975, privacy and autonomy in health research were seen to hamper research about certain conditions that were not individually-oriented due to the need for individual consent. Statements of ethical conduct from 1998 are increasingly viewed as over-protectionist and paternalistic. Epidemiological research (e.g., for HIV prevalence in newborn screening) has been dropped due to the need for explicit consent. When taken to extremes, privacy and autonomy can harm the individual and thwart the role of the state to protect, prevent, and promote health. The province of Quebec will follow its own internal legislation once it has proven substantial equivalence.

Conclusion In the 25 member countries of the European Union, health privacy is a fundamental human right. In countries following the Napoleonic Code or the Common Law tradition such as the U.K., Australia, or Canada (except for Quebec), privacy is also a subjective right found in the civil codes. Harm does not have to be shown in the case of infringement, breach, economic or other losses, which automatically makes it an “actionable right.” Privacy and reputation are amorphous, ambiguous, and hard to define. In some countries, privacy is also a constitutional right (e.g., Germany) that falls under private laws in different statutes.

Use of data and data transferability is extremely important, especially as related to quality of research. Barriers exist within the current ethics and legal requirements that prohibit the building of research infrastructures and resources with no immediate benefit for the person, even within longitudinal studies. Researchers encounter “semantic inoperability.” Data cannot be compared because they are entered and protected differently. What is called de-identification under HIPAA is found under the international code on pharmacogenomics as meaning double-coding, which is totally different. We can no longer know whether there are equivalent levels of privacy and protection. A common language is needed, maybe not using the same words but using the same concepts to protect privacy because validation and gains in statistical significance cannot happen without understanding how privacy is protected.

The notion of broad consent and authorization is major. The OHRP in the U.S. made a revolutionary statement in August 2004 stipulating that if certain coding conditions were met with data holders and contracts, an explicit consent would no longer be required.

Portability in Europe and Canada portends the use of smart or administrative health cards that allow for exchange of individual medical data between countries.

Public health need for accessibility to minimal data While taking privacy into consideration, it is important to remember the need for accessibility to certain minimal data on populations (rather than on individuals) for public health purposes in order to address conditions such as anthrax, SARS, and avian flu.

Discussion

Autonomy in health care has been a dominant theme in law and bioethics over the past 30 – 35 years. How, then, does patient autonomy (e.g., the right to refuse treatment) become integrated into an electronic health record system? Sensitivity to the interests, as well as the reasonable and informed preferences, of patients is critical, notwithstanding qualifiers such as what drugs a person is already taking. The challenge is to figure out to what extent patients have control over their record content and over who has access. Adding to Dr. Murray’s comments, Dr. Knoppers agreed that creating a hierarchy within the medical record by broad categories is preferable to “no access at all” or total control by the patient. Elaborating broad electronic record principles is also recommended. Empowering the patient is critical, provided that autonomy is exercised in a reasonable and informed manner. The “emerging glory” of this empowerment, seen mainly in Europe, is the right ‘not to know,’ which has found its way into charters and constitutions (although at times, figuring out how much information to give to exercise the ‘not to know’ right is a legal quagmire). Autonomy has reached an apex in terms of individual preferences, exercise, and choice. A large margin of autonomy is being taken away (ironically) by an overriding preoccupation with privacy. For example, ethics committees no longer allow individuals to give broad consent to particular longitudinal studies, saying that it does not adequately protect privacy over time.

Dr. Harding raised the topic of “sensitive information” (i.e., OB-GYN, genetics, infectious disease, mental illness). He wondered what subcategories would exist in an electronic system and how they would be utilized. Dr. Murray emphasized the importance of “road testing” and suggested taking whatever guesses about how to chunk categories as a hypothesis to be tested empirically with providers, patients, and the public. An optimally designed array of choices is needed to enable people to exercise choice and get the results they hope for. Dr. Knoppers pointed out that what is sensitive today (e.g., genetic or psychiatric data) may (hopefully) be considered “normal” ten years from now.

A neglected area of privacy involves the deontological or professional ethics found in a variety of statutes that are legally actionable. In France, the right to professional secrecy is found in the criminal code; in other countries, it is found in the constitution. A very serious obligation of professional secrecy is imposed on the health professional to avoid breach.

Mr. Reynolds wondered about the hierarchy of privacy and patient protection within electronic medical records. Dr. Knoppers’ preference is to begin by asking providers, referring to the framework of professional/medical secrecy principles. She would develop and then test two to three approaches that would translate those principles into different kinds of systemic applications. Consumer and patient organizations need to be involved. It is important to take into consideration information (such as environmental or marital break-ups) that does not “look” medical but is equally important. Other empirical data gathering could include lessons learned from countries such as the U.K. and Sweden.

Dr. Murray hopes that we don’t end up with detailed, legal descriptions that people sign but don’t read because this would not accomplish the “human moral goals” of the work. Being clear and transparent (using simple language) is important, as is empowering trial participants and providing them with fewer rather than massive choices.

Dr. Fitzmaurice wondered if there is a right to privacy for health information relative to health insurance that does not exist for life or car insurance. He also wondered whether the use of contracts is similar in the European Union and in Canada – protecting confidentiality and personal health information when patients have not given authorization for use. Dr. Knoppers said that one would have to look at the laws of individual countries for an answer. Model contracts exist but most states already have legislation covering sensitive health data that does not use the language of limited data sets. In France, for example, a body of data would be used to approve the kind of arrangement needed for a particular type of research. Personal contracts are not usually implemented because a legal framework is already in place, something like a health IRB. A contract approach might work if the researcher/institution holds the data set and only needs certain data. This would be problematic in Europe unless it said “contractual arrangements will be permitted under law if they contain the following elements,” and a law about public content was passed on to legislators. There is a role for contracts, then, but only with minimal content required by law.

Dr. Fitzmaurice asked whether there is a principle such as autonomy or self-determination that applies to whether a person would use opting-in versus opting-out, or whether it is a matter of balancing the benefits against the harm (see transcript for examples). Iceland’s 1998 Health Sector Database Law was later deemed unconstitutional because it presumed the opting-in of Icelanders unless they opposed. They weren’t asked but citizens were supposed to know that they could oppose. Dr. Knoppers personally believes that the only kind of opting-in that can be presumed would be for routine tests that provide basic information for quality control. The privacy law protections would not legally apply if a patient is sufficiently anonymized and unidentifiable. This kind of aggregate information gathering should not require an explicit opt-in. Dr. Murray added that the purpose of the information gathering makes a difference (e.g. the allergies of someone who shows up in an ER), as does the relationship.

Mr. Hungate wondered if the need for “simple choices” was within the context of society or specifically geared to the individual. Dr. Murray emphasized the need for “meaningful” rather than “formal” choices, which, again, requires road-testing of a finite, small set of crucial choices that enable people to exercise autonomy in meaningful ways that serve their interests now and in the future. Mr. Rothstein asked how decisions would be made about how to technically help people keep what they want out of their records. A judgment call must be made about what is reasonable to exclude when weighed against the medical costs, etc. In the access arena, Dr. Steindel thought it important to consider information transfer out of a patient’s record to someone else (e.g., psychiatric information). Mr. Rothstein mentioned third party access to health information, which some consider to be much more problematic than unauthorized access. In the case of life insurance, it is an entry point for other economic goods (e.g., mortgage, car, or loan) but it is also a forced disclosure of intimate information. So, insurers (and employers) should only get access to limited categories of data that they really need for their actuarial tables. There is no practical way to accomplish this in a paper-based system as cost and time factors would be prohibitive. This capacity will only exist if the capacity to limit fields of disclosure is built into EHRs of the future. If limiting data sets could be done such that only relevant data went to the requester, aspects of personal privacy in an electronic system would be improved over the current paper system. A common misperception is that EHRs pose a tremendous privacy threat when they actually represent a great hope or possibility of protecting privacy (a view Mr. Rothstein advocates).
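As a rough sketch of the field-limiting capacity described above, the example below filters an EHR so that each type of requester receives only a defined subset of categories. The requester types, category names, and policy table are hypothetical illustrations, not a proposal made at the hearing; in practice the policy would be derived from law, regulation, and patient choices.

```python
# Minimal sketch of limiting fields of disclosure by requester type.
# The disclosure policy and field names below are illustrative only.

DISCLOSURE_POLICY = {
    "treating_provider": {"demographics", "medications", "allergies",
                          "problem_list", "mental_health", "lab_results"},
    "life_insurer":      {"demographics", "problem_list"},   # limited categories only
    "employer":          {"demographics"},
}

def disclose(record: dict, requester_type: str) -> dict:
    """Return only the fields the requester type is entitled to see."""
    allowed = DISCLOSURE_POLICY.get(requester_type, set())
    return {field: value for field, value in record.items() if field in allowed}

if __name__ == "__main__":
    ehr = {
        "demographics": {"name": "A. Patient", "dob": "1970-01-01"},
        "medications": ["lisinopril"],
        "allergies": ["penicillin"],
        "problem_list": ["hypertension"],
        "mental_health": ["mild depression, 2004"],
        "lab_results": {"a1c": 5.6},
    }
    print(disclose(ehr, "life_insurer"))   # demographics and problem list only
    print(disclose(ehr, "employer"))       # demographics only
```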

A question was asked about whether the European privacy laws protecting genetic information are anti-discriminatory. Ms. Dozier-Peeples wondered what the law looks like and whether it restricts information that insurers can obtain. Dr. Knoppers described several approaches across Europe: 1) No access to data for life insurance purposes, as was established in Belgium in 1992 in the civil code on insurance (note: approaches that separate genetics make it seem as though genetic data is not medical data, which leads to stigmatization and discrimination); 2) Voluntary moratoria, sometimes time-limited, used when insurance companies want to forestall the establishment of a law, as in Holland; 3) A special commission decides which conditions are so genetically certain that a life insurer should have access to them, as in the U.K. (to date this has been done, intermittently, only for Huntington’s disease); 4) The 1997 European Convention on Human Rights and Biomedicine states that no tests shall be done that are not done for medical reasons, and all countries that have signed on must bring their internal laws into conformity. That means that a genetic test cannot be done for insurance purposes.

Dr. Murray’s participation in a task force for the Genome Project on genetic information and insurance has convinced him that it does not make sense to treat genetic information as distinctive or toxic in the larger context of the electronic medical record.

How the Public Views Health Care, Privacy, and Information Technology Applications

Alan Westin, Ph.D., J.D.

Among his many activities, Dr. Westin is the director of a new program on information technology, health records, and privacy at the Center for Social and Legal Research. He views the electronic health record initiative as a very positive potential step in reshaping the nation’s health care system for purposes of enhancing patient care, reducing medical errors, and reducing high paper handling costs. Reshaping the medical record and flows of health information will impact the first sector (patient care), the second sector (payment and quality assurance), and the third sector (social uses of patient information for employers, licensing, insurance, research, etc.).

Harris Interactive and the Center for Social and Legal Research sponsored a national survey on public views of computer applications and the effects of HIPAA, as well as on attitudes toward the computerization of health records (due for release on February 23, 2005). This telephone survey, in the field from February 8 – 13, 2005, had a sample of over 1,000 respondents representing approximately 214 million adults. The margin of error was plus or minus three percent. As a reminder, there have been 14 national surveys between 1978 and 2005 that have dealt entirely with health privacy issues or have had major sections about health privacy. Some “top line” findings from these surveys are well established: health and financial information are always the top two scorers, and Dr. Westin thinks that personal health information would, in many ratings, come out slightly ahead of financial information. By very large majorities, people are concerned about the privacy and security implications of electronic collection and use of health information. This is because the public essentially views technology as a two-edged sword, with enormous benefits and sharp problems. Because of this, when people visit health websites, they are often very concerned about privacy and security, so they don’t share personal data or take full advantage of these sites. Consumers with chronic or genetically-based health conditions are especially concerned about the flows of their health information into the third sector, which is a relevant aspect of the social uses of personal information.

The survey’s first question (first used in the Harris Westin survey on health information privacy of 1992) asked: “do you believe that any of the following have disclosed your personal medical information in a way that you felt was improper?” In 1993, 27 percent of the public (representing 50 million adults) said they believed that one of the five groups [doctor; clinic or hospital; employer; health insurance company; public health agency] had released their personal medical information in a way that the respondent considered improper. In 2005, only 14 percent (representing 30 million people) responded that way, a notable drop given that the 1993 figure was roughly in line with the general victimization perception held by about a quarter of the American public (Dr. Westin noted that in all the surveys he has done, about 25 percent of the public says they believe that their privacy has been invaded by business or the government). The biggest drop was for the health insurance company, from 15 percent in 1993 to eight percent in 2005, now on a par with the clinic or hospital. Was this a result of HIPAA? The 2005 survey asked, “In the past three years, have you ever received one of these HIPAA health privacy notices?” (Note: the text of all survey questions is in the appendix of Dr. Westin’s testimony.) Astonishingly, only 32 percent of the public (representing 68 million adults) say that they cannot remember ever receiving a HIPAA privacy notice; two-thirds (representing 158 million adults) do remember receiving one. One concern in survey research is that people will give a socially acceptable answer, but that is not what happened in this survey.

Those who remembered receiving a privacy notice were asked: “Based on your experiences and what you may have heard, how much have this federal privacy regulation and the privacy notices increased your confidence that your personal medical information is being handled today in what you feel is the proper way?” Sixty-seven percent said it increased their confidence. Yet only 23 percent chose “a great deal” and 44 percent said “only somewhat.”

The intended focus of the survey was the electronic health record or electronic medical record, described as “The federal government has called for medical and health care organizations to work with technology firms to create a nationwide system of patient electronic medical records over the next few years. The goal is to improve the effectiveness of patient care, lessen medical errors, and reduce the costs of paper handling. Have you read or heard anything about this program?” Twenty-nine percent (representing 62 million adults) said that they had read or heard about the program. The survey addressed six concerns that people have about the effects of an EHR system. Would there be: 1) leakage of sensitive health data? 2) more data sharing without the patient’s knowledge? 3) adequate security for health data stored on computers? 4) an increase rather than decrease in medical errors? 5) less willingness to provide necessary information to health care providers due to concern about computerization? 6) a reduction of federal health privacy rules in the name of efficiency?

Roughly two-thirds of the American people say that they are concerned about each of these six issues, with most concern about leakage of sensitive material due to weak data security (70 percent were concerned, with 38 percent very concerned). Concern about more data sharing without the patient’s knowledge was at 69 percent, with 42 percent very worried. Concern about inadequate data security was at 69 percent, with 34 percent very concerned. Concern about increasing rather than decreasing medical errors was at 65 percent, with 29 percent very worried. Sixty-five percent were concerned that worries about computerized records would make people less willing to disclose necessary information to their providers, with 29 percent very worried. Sixty-two percent were worried that the federal health privacy rules would be watered down, with 28 percent very concerned.

The “tie-breaker” question of the survey was: “Overall, do you feel that the expected benefits to patients in society outweigh potential risks to privacy or do you feel that the privacy risks outweigh the expected benefits?” The public is deeply divided about this question (48 to 47 percent). People who believe that privacy risks outweigh expected benefits are widely distributed across demographic categories (demographic information and factor analysis will be published in the Spring of 2005).

Over time, Dr. Westin has created a “segmentation of the public” on privacy issues by developing three or four trend questions that tap fundamental attitudes and show how many people take the strong privacy view on all the trend questions, how many take it on some, and how many do not take it at all. A high-medium-low segmentation of the public has been developed, with accompanying numbers for how many people fall into each category. The demographics of each segment have been studied to determine who is highly concerned about privacy and who is not. A striking finding is that 56 percent of the public scores high in privacy concerns about EHRs (concern on five or six of the six statements). Sixteen percent fall into the medium category (concern on three or four statements); 14 percent chose only one or two concerns; and the remaining 14 percent chose none of the statements.

There is a solid national majority in the high EMR privacy concern camp. This compares with Dr. Westin’s consumer studies, which show that only 35 percent of the public score in this high or “fundamentalist” orientation on consumer privacy issues; the number of people intensely concerned with privacy in the health area is almost double the number concerned about general consumer privacy. This intensity is a little greater than, but consistent with, what Dr. Westin found in the 1993 study, when a national health insurance plan was being promoted. He also wanted to test what he believes to be one of the most critical issues in EHR development – the role of the patient. The question was framed as: “Since most adults now use computers, the new patient electronic medical record system could arrange ways for consumers to track their own personal information in the new system and exercise the privacy rights they were promised. How important do you think it is that individual consumer tools be incorporated in the new patient electronic medical record system from the start?” Eighty-two percent believe that this is important, 45 percent rated it as very important, and only 17 percent did not see it as important. Dr. Westin stipulated that this was a somewhat “socially acceptable” answer but nevertheless an extremely important finding. He interprets it as a public mandate for a privacy design specification for any electronic health record system. From the start, the public is saying, “build in my privacy choices, privacy access, and technology access. Otherwise, I am not going to be confident that this system serves my interest and is good for me and for society.”

Advocates, managers, and builders of this system need to articulate what laws, rules, practices, technology arrangements, education about privacy, and kinds of positive patient experience it will take to persuade the 47 percent of the public who feel that the privacy risks outweigh the expected benefits to change that assessment.

Study Conclusions and Recommendations An electronic medical record or electronic health record system holds enormous promise for patients, health care delivery, breakthrough research, and the interests of society. The system is more likely to proceed now than at any time in the past because 1) medical professionals are now fairly technology conversant; and 2) the technology is much more powerful, with data mining and linkage techniques, increased software power, and a host of tools which hold promise in achieving cost effectiveness and reduction of medical error problems. However, the system will not succeed if public concerns over privacy are not understood and addressed.

There needs to be an institutionalized privacy-by-design working group (analogous to the ELSI Program of the Human Genome Project) that is active, well-funded, and impressively staffed. Such an organization might be government-supported but not government-run – a consortium of government, private sector, consumer and patient advocates, etc. From the start, its charter would focus on the design of privacy into an electronic health record system. Privacy should not be a sub-topic of what is discussed by the Electronic Health Record Standards Board. The Standards Board is very important in terms of interoperability, regional system linkage, and medical record issues. But if privacy is consigned inside that Board, it may not have the “right” kind of pressure or poise. Dr. Westin hopes for the development of an independent Privacy Standards Board that sits alongside the larger Technology and Record Standards Board.

A privacy-by-design working group would:

  1. Apply materials developed over the past two to three decades on privacy risk and threat assessment to the development of an electronic health record system. Auditing and law firms, universities, and many other organizations do this kind of risk assessment.
  2. Identify the kind of system design elements that would enhance rather than defeat privacy interest (i.e., regional systems with linkages and interoperability standards). Across all the technology and organizational design choices, the privacy-by-design workgroup should have the charge to articulate privacy implications of one methodology over another.
  3. Identify anonymization techniques that facilitate research and data trend analysis (a brief illustrative sketch of one such technique follows this list). Privacy does not have to come at the expense of fundamental epidemiological and health systems research. It will be important to conceptualize a segmented medical record in which some parts remain identified for clinical use and other parts, subject to anonymization, are organized differently within the record.
  4. Consider what policies and legal rules to use with EHRs. The Markle Foundation has a project on connectivity that is exploring privacy and security issues. There will be many parallels and competitors to the privacy-by-design workgroup.
  5. Identify and test procedures that would empower individual patients to access systems directly. This would assure people (the 82 percent noted above) that they know what kinds of information are in their records and that they can enforce their privacy rights. As the medical record moves into high computerization, patient access and control functions must be an equal, technology-driven opportunity.
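
Returning to item 3 above, the sketch below illustrates one common anonymization technique: dropping direct identifiers and substituting a keyed pseudonym so that records from separate sources can still be linked for trend analysis. The field names, the secret-key handling, and the choice of HMAC-SHA256 are assumptions made for the example, not anything proposed in the testimony.

```python
# A minimal sketch of keyed pseudonymization: direct identifiers are removed
# and replaced with a stable, non-reversible research ID derived from a secret
# key, so the same person links across data sources without being named.
# The key, field names, and use of SSN as the linking field are assumptions.

import hashlib
import hmac

LINKAGE_KEY = b"held-by-a-trusted-party"   # hypothetical secret
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone"}

def pseudonym(real_identifier: str) -> str:
    """Derive a stable research ID from a real identifier using HMAC-SHA256."""
    return hmac.new(LINKAGE_KEY, real_identifier.encode(), hashlib.sha256).hexdigest()

def anonymize(record: dict) -> dict:
    """Strip direct identifiers and attach the research pseudonym."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    out["research_id"] = pseudonym(record["ssn"])
    return out

if __name__ == "__main__":
    rec = {"name": "J. Doe", "ssn": "000-00-0000", "dx": "E11.9", "year": 2005}
    # Identifiers are removed; the research_id is the same on every run,
    # which is what allows longitudinal linkage without exposing identity.
    print(anonymize(rec))
```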

Currently, 165 million Americans are on-line. The patient should have access to the new EHR system under rules that allow them to exercise their privacy rights wherever they are. Regional programs are the “beta sites” for looking at the privacy design world. Dr. Westin’s empirical studies suggest going to where pioneers are putting new technologies forward and changing the way things are done while balancing rights and responsibilities. Empirical objective research is needed to highlight the impact on patients of what is happening in many settings. There is much to be shared and learned from other countries.

NOTE: Dr. Westin believes that it would be a “privacy-by-design mistake” to think about a patient medical record as one unified document. He supports a six- or seven-segmented and formatted medical record that the technology stores and retrieves within these segmented parts: 1) personal identifiers; 2) medical transaction; 3) prescription history; 4) mental health/psychological or psychiatric (to be kept in encrypted form for security purposes); 5) lifestyle information (e.g., information about sexual life, drugs, alcohol, risk-taking behavior etc.); 6) anonymized data, for research purposes, that would not require patient consent, opt-in or opt-out. Patient data should be collected from the beginning for public health and research purposes (e.g., epidemiological and pharmaceutical utilization research). If the “right” rules exist for patient, provider, and third party access, a medical record could be thought of as a set of records in which rules of privacy, access, consent, and disclosure are customized for the nature, sensitivity, and functions of different types of information.
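
A minimal sketch of how such a segmented record might be represented in software follows, assuming a simple in-memory structure. The segment names echo Dr. Westin’s list, while the rule flags and the toy retrieval policy are invented for illustration only.

```python
# Hedged sketch of a segmented record: the record is a set of typed segments,
# each carrying its own access and consent attributes rather than forming one
# undifferentiated document. Flags and policy here are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    data: dict
    encrypted_at_rest: bool = False      # e.g., the mental health segment
    consent_required: bool = True        # anonymized research data would be False

@dataclass
class SegmentedRecord:
    segments: dict = field(default_factory=dict)

    def add(self, seg: Segment) -> None:
        self.segments[seg.name] = seg

    def readable_by(self, purpose: str) -> list:
        """Return segment names a given purpose may retrieve (toy policy)."""
        if purpose == "research":
            return [s.name for s in self.segments.values() if not s.consent_required]
        return [s.name for s in self.segments.values()]

record = SegmentedRecord()
record.add(Segment("personal_identifiers", {"name": "example"}))
record.add(Segment("prescription_history", {"rx": ["example"]}))
record.add(Segment("mental_health", {"notes": "example"}, encrypted_at_rest=True))
record.add(Segment("anonymized_research", {"labs": ["example"]}, consent_required=False))

# Research access sees only the segment that does not require consent.
print(record.readable_by("research"))
```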

The non-profit Center for Social and Legal Research conducts public surveys on how the public and various health care leadership groups and others feel about the more specific aspects of an EHR system. The Center would like to go beyond HIPAA by: doing empirical studies on how various programs are working as they roll out; and helping to develop legal and policy rules necessary for privacy, confidentiality, subject access, due process, etc. In the Spring of 2005, the Center will publish a white paper that looks broadly at computers, health records, and privacy in the 21st Century. In addition, a quarterly electronic newsletter will go out; and seminars and conferences on program themes will be held. The Center’s website (www.pandav.org) is currently posting top line results of the survey mentioned; a report on how the public views health privacy; and survey findings from 1978 – 2005. Also on the website are today’s testimony and PowerPoint as well as an expanded survey report with demographics and factor analysis.

Discussion

Mr. Rothstein noted that the EHR “train” is “zooming down the track.” Those concerned with privacy are trying to “keep the train on track.” The reality, then, differs from Dr. Westin’s suggestion to work out the privacy element in advance of the system itself. Dr. Westin thinks that privacy issues are so central to success that it behooves those working on privacy to convince legislators that an acceptable privacy system exists. He suggests asking Congress and state legislatures to take their cue from how the public feels about the EHR system. There must be a clear mandate for privacy from the start. Important questions revolve around how this is institutionalized, located, funded, etc. However it is done, there must be a free-standing, high-prestige, well-funded, and well-staffed entity that functions as a privacy impact assessment group. These functions should be institutionalized. Even though people believe that the “train is going,” it is not too late because the “train is gathering steam.” The health care community is privacy-oriented and the technology groups building the software accept the importance and centrality of privacy. There seems to be a greater possibility of creating an institution with the right kind of support in the health arena than in other privacy areas such as homeland security or telemarketing.

Mr. Reynolds mentioned the trouble that the development of HIPAA encountered because everyone had their own way of doing things. He wondered if there was a way to put some segments in place as regionals do their work, to help them base their structures on a foundation that ties multiple regionals together. Dr. Westin, though not a technologist, said that technologists are working hard on interoperability. They understand the need for operating standards that cut across various technology systems, and they are looking at linkage rather than uniformity techniques. He suggested that the Subcommittee examine whether developing EHR systems are incorporating interoperability adequately. He also thinks that the moment is right to develop much more uniformity in medical record formats. Refinement of common language and techniques should move the field, over the next decade, toward a much more uniform system of reporting and formatting.

The trusted keeper solution is very promising for anonymizing data without any preserved linkage file (see transcript for example). There is strong incentive and commercial interest among organizations that could function as trusted agents within a personal health record system. The privacy issue is paramount here.
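
A hedged sketch of the kind of trusted-keeper arrangement described above appears below, under the assumption that a separate keeper holds the only identifier-to-study-ID mapping and the releasing organization retains no linkage file. The class and function names and the random-token scheme are illustrative, not drawn from the testimony.

```python
# Minimal sketch of a "trusted keeper": the keeper alone holds the mapping from
# real identifiers to study IDs; the data holder releases records with study
# IDs and keeps no copy of the linkage. All names here are hypothetical.

import secrets

class TrustedKeeper:
    """Holds the only copy of the identifier-to-study-ID mapping."""
    def __init__(self) -> None:
        self._mapping: dict = {}

    def study_id(self, real_id: str) -> str:
        if real_id not in self._mapping:
            self._mapping[real_id] = secrets.token_hex(8)
        return self._mapping[real_id]

def release_to_researcher(records: list, keeper: TrustedKeeper) -> list:
    """Replace patient identifiers with keeper-issued study IDs before release."""
    return [
        {"study_id": keeper.study_id(r["patient_id"]),
         **{k: v for k, v in r.items() if k != "patient_id"}}
        for r in records
    ]

keeper = TrustedKeeper()
released = release_to_researcher([{"patient_id": "A123", "dx": "example"}], keeper)
# The data holder keeps no record of the A123 -> study ID link; only the
# keeper can re-identify, and only if policy ever permits it.
print(released)
```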

Dr. Steindel wondered how to assure the public that computer systems are being designed that “express the needs” of a national privacy data board. Dr. Westin believes that the security area probably belongs in the technology sector rather than in the policy orientation of the privacy sector. While it does not define privacy or establish confidentiality, data security enables confidence in a privacy and confidentiality system. Some deep reservations exist, however, as exemplified by a recent case in which a private data supplier was breached by a Nigerian ring that set up false accounts and used records for identity theft. Because it is next to impossible to provide 100 percent data security, the question is “how close can we get?” The EHR system will be another driver in the creation of a biometric identifier system, which would put an enormous damper on identity theft by providing more secure methods of authenticating users.

People who believe the benefits outweigh the privacy risks (48 percent) are not hostile to privacy. Both the 47 percent who believe the risks outweigh the benefits and the privacy-responsive segment of the 48 percent must be addressed. Congressional “champions” are needed to move the health privacy campaign forward (see transcript for how the LC Program progressed politically).

The concept of technologically enhanced patient participation is not yet fully worked out. The thinking should begin with “what would it take?” The liability system must also be taken into account.

Panel II: Privacy in Health Care and in Society

Privacy and Health Care Bernard Lo, M.D.

When one of the large manufacturers voluntarily withdrew Vioxx (for arthritis) due to increased cardiac problems, one integrated health care system took advantage of EHRs to respond quickly to changing information. Patients were notified by mail within 24 hours of the recall. The drug was immediately withdrawn from the pharmacies. Every provider was notified. An alert flag was placed on patient electronic records to remind physicians to talk to their patients about how to manage the new information. Kaiser of Northern CA utilized a comprehensive, sophisticated EMR system to institute a quick study of patients on various Cox-2 inhibitors (one that also drew on pre-existing data), producing a fast-track publication in The Lancet.

What kind of database do you use to answer questions quickly, using existing data? Comprehensive data are needed, as well as integrated pharmacy, out-patient visit, hospitalization, laboratory test, and death data. To obtain these data in non-integrated systems, individual identifiers are needed to cross-link the different types of data. Complete follow-up is also needed to capture care received outside of the system. There must be few refusals or drop-outs; otherwise, answers to questions like “does drug X cause more or fewer heart problems than other drugs?” become scientifically invalid through loss of statistical power or selection bias. In an ideal system, other uses of an EMR include quality improvement (for example, alerting physicians when a prescription crosses a recommended dose threshold). A tickler can also be placed in an EMR as an alert for contraindications with other drugs.
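
The dose-threshold “tickler” described above might look something like the toy rule check below. The drug names, the 50 mg limit, and the interaction table are invented placeholders for illustration, not clinical guidance and not part of any system described in the testimony.

```python
# Illustrative prescribing-alert sketch: compare a new prescription against a
# recommended maximum daily dose and a contraindication list, and return any
# alerts. All drugs, thresholds, and interactions below are hypothetical.

MAX_DAILY_DOSE_MG = {"drug_x": 50}                 # hypothetical threshold
CONTRAINDICATED_WITH = {"drug_x": {"drug_y"}}      # hypothetical interaction

def prescribing_alerts(drug: str, daily_dose_mg: float, current_meds: set) -> list:
    alerts = []
    limit = MAX_DAILY_DOSE_MG.get(drug)
    if limit is not None and daily_dose_mg > limit:
        alerts.append(f"{drug}: dose {daily_dose_mg} mg exceeds recommended {limit} mg/day")
    for other in CONTRAINDICATED_WITH.get(drug, set()) & current_meds:
        alerts.append(f"{drug}: contraindicated with {other}")
    return alerts

# Example: an over-threshold dose of drug_x in a patient already on drug_y
# would trigger both a dose alert and an interaction alert.
print(prescribing_alerts("drug_x", 75, {"drug_y"}))
```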

Prozac provides another example of EMR use in determining a drug’s benefits versus risks (noting recent front-page news linking it to suicidal thoughts and behavior in children and adolescents). Database research can inform patients and doctors of concerns, uncertainty, and controversy about the use of certain drugs, as part of quality control in prescribing (noting heightened privacy and confidentiality concerns for sensitive information such as psychiatric, substance abuse, or genetic information). While HIPAA protects medical health records, mental health is often separate or “carved out” of the medical system. For many, even basic medical information is sensitive. Confidentiality prevents stigma and discrimination while showing respect for the individual, but it is not an absolute ethical goal. Dr. Lo suggests two policy goals – to protect confidentiality and to access information for clinical care, public health, and research.

In response to public health risks from SARS, anthrax, and avian flu, the public health system accesses individual EMRs for contact tracing and epidemiological outbreak-source research, relying on compassionate notification that assures privacy rather than on consent. Certain types of outcomes research are as critical to public health as public health research. The technical challenges of doing this research include the need for compatibility between different organizations with different data formats. Identifying links must be protected when merging databases with different types of information. Patient authorization is needed, although it is not clear how well patients are informed, which affects their response. Also, there is clear evidence of selection bias (noting a Canadian database study on stroke, where the sickest patients did not allow their data to be used). Incomplete data are misleading. So, the issue of patient authorization, and how it sometimes serves as a deterrent to outcomes research, must be sorted out.

Dr. Lo thinks there needs to be a distinction between what the regulations say and how they are implemented and interpreted. De-identified data will not suffice to accomplish some database research because identifiers are needed to link different databases. IRB chairs and researchers seem confused about the HIPAA waiver of authorization. Research becomes more difficult when IRBs and privacy boards are stricter than the regulations. Guidance is needed for these boards and for researchers about what kind of outcomes database research is permissible under HIPAA. At the other end, the provider may be reluctant (see transcript for CA example).

Recommendations

  1. Public education about the value and limitations of database research, including trade-offs between confidentiality and patient benefits (as evidenced by the possible usefulness of outcomes research).
  2. Think about some (outcomes) research as being similar to public health, with the ethical issue being notification rather than consent or authorization as the point of entry into that data.
  3. Focus on confidentiality as well as privacy.

Patient Interest in Health Information Technology Joy Pritts, J.D.

Ten years ago, Donna Shalala posed a valid question: “When all is said and done, will our health records be used to heal or to reveal us?” How patients perceive the balance of benefits and risks of electronic health records is the focus of this presentation.

Benefits and risks Benefits include: improved quality of care because doctors can access complete medical records (especially for those with chronic illness); legibility of records, which reduces errors; improved accessibility; elimination of duplicative tests; and streamlining of administrative processes.

Concerns about loss of control of electronic health information abound. Associated risks include questions about the adequacy of security and protection within and outside of the health care system. For example, financial institutions administering health savings accounts are not covered by the privacy rule, nor are they regulated by the banking authorities. Some people wonder whether the “sins” of their youth will “haunt” them in later years. Regarding accessibility, if information is written in code for easy transfer between providers, will the patient be able to access and understand it? HIPAA does not require health information to be “translated” into plain language. An EMR system might also leave behind those with no access to computers (especially within the most vulnerable populations).

Harm from the “downside” of EMRs includes the stigma attached to certain medical conditions; people also fear for their jobs and health insurance. The majority of bankruptcies in the U.S. involve health care costs and debt. Other concerns include police access to personal information. Talk of a national database provokes a “very adverse reaction” from people. Similarly, unique identifiers provoke an adverse reaction because of their potential use for unrelated purposes (as has occurred with social security numbers). Identity theft becomes an increased risk (see transcript for example), especially because people who hack into medical systems are probably not subject to HIPAA civil and criminal provisions. The question of what can be lost with EMRs, which has the greatest impact on those who are ill, changes depending on background (noting trust lost in the African-American community due to Tuskegee, and the vulnerability of famous people such as President Clinton). Trust in the clinical community about what can be done with health information depends on how a person has been treated in the past. The curiosity factor in small communities can threaten privacy, such that people with diseases that are still largely stigmatized have much to lose (see transcript for example).

A 2004 survey by HIPAA Advisory showed that many institutions are not following up on or monitoring privacy policies. In addition, there is a segment of the younger population that provides inaccurate information, which can subvert the system. People want to have some ability to control how their health information is used and by whom. They need to trust that the information is being used properly, and they want accountability. HIPAA has started to address some of these concerns, but it does not cover all the hazards. For example, Congress needs to resolve the fact that HIPAA does not directly cover everyone who has access to health information and that penalties apply only to covered entities. More practically, notices of privacy practices need improvement.

Privacy and Substance Abuse Issues A. Thomas McLellan, Ph.D.

Addiction treatment should be part of the EHR because it is necessary for public health and safety. Addiction is a chronic illness for which more than two million people require treatment. Managing this disease is important to the treatment of many other chronic illnesses such as diabetes, hypertension, asthma, breast cancer, sleep disorders, and chronic pain. Every month, new medications, therapies, and interventions provide more options for substance abuse treatment, for which the same kind of information is needed as for any other illness. The “old” way of treating substance abuse was segregated detoxification. Now, addiction treatment, like that of other illnesses, has an acute care phase (often in hospitals) that often transitions to specialty care or to continued care.

The IOM’s Crossing the Quality Chasm report, which defines principles of patient-centered care, is at the base of the Treatment Research Institute’s approach to integrating addiction and substance abuse treatment and information into EMRs. The patient has to be notified of information exchange and must have the right to say no. Special provisions for addiction treatment must be made because of problems that have arisen from decades of under-funding and segregation, particularly the limited capacity for computerized, integrated information management.

Discussion

Dr. Lo stated that outcomes research using large databases is more difficult to do when certain types of information, singled out for special treatment, preclude access to comprehensive, integrated information. Mr. Rothstein asked if there should be inclusion or retrieval rules (a retrieval algorithm) for sensitive information. Ms. Pritts said that the question was hard to answer because even within the patient community, people think about the issues differently. She challenged the notion of the EHR being “consumer-driven” because, at present, the consumer does not have much input about who the record goes to or how it is used. A retrieval system fits with the privacy rule by giving patients the option. On the other hand, it is hard to know what kind of past information is relevant to current health status and thus difficult to think about expunging parts of the medical record. Dr. Lo believes that retrieval rules are much more flexible than exclusion rules. What happens in medicine generally happens here – decisions depend upon the situation (see transcript for example). The data should be accessible to a treating physician in an emergency or acute situation. Mr. McLellan agreed that the guiding principles are that safety and efficacy come first, with patient preference second. With the right to deny access, a patient cannot expect complete health care. One big question centers on how these rules apply to public health and safety concerns (e.g., a patient with cholera does not want it put in the record). Sensitive items are “way more prevalent than the not sensitive ones.”

Mr. Rothstein’s concern is that the adoption of an inflexible, non-patient controlled system might stimulate proliferation of a “drive-through, no record, no research, and no questions asked health care system” that caters to the privacy conscious at the expense of public health and research. Rather than choosing one method over another, Mr. Reynolds wondered what would happen if there were room for both, with stipulations (“yes, if” and “no, if”). Ms. Pritts pointed out that EHRs are coming, regardless, and that the more people trust the system, the more they will see the benefits. One way to build trust is to enforce penalties on those who misuse their access to information. People especially do not like their health insurance companies to know what is wrong with them because they are afraid of losing their coverage. At the macro level, these issues are difficult to solve until the general issue of how people get health care in the U.S. is resolved. People would not be so afraid if they did not fear losing their jobs, their insurance, or their homes if they get sick. Mr. McLellan does not believe that anyone would say “yes, if” if the system were secure and under control. Key questions revolve around who has authorization and how to keep health information away from people who can do “harm.” A recent IOM report supports the notion that health information is owned by the patient rather than the health system. The individual has the right to decide what occurs during the course of their care. He suggested seeking guidance from financial institutions, which have been “very good” at handling this kind of information. Ms. Pritts mentioned that some patients would not want researchers to ever use their information without explicit permission.

Dr. Harding pointed out that Mr. McLellan is the first person to recommend to the Subcommittee that substance abuse (one of the “sensitive” issues) be made part of the general record. Mr. McLellan specified that there is nothing qualitatively different about venereal disease, mental illness, infectious disease, or substance abuse that warrants preclusion from the general record (and financial segregation). The system should have complete information that is adequately protected. Dr. Lo wondered about the link between substance abuse and illegal activity. Mr. McLellan believes that link to be irrelevant within the confines of health care (though others disagreed). Ms. Wattenberg (from SAMHSA) talked about the protection of substance abuse information in health records relative to the creation of the federal Part 2 regulation. HIPAA does not protect records from law enforcement in the same way that Part 2 does. She mentioned a list of 20 related topics that the substance abuse community needs to think about.

According to Mr. Hungate, patients do not understand that EHRs allow database research, which is more beneficial to them than they realize. Dr. Lo reiterated that patients need to understand the potential benefits of health information uses that go beyond the direct patient-provider exchange. Giving the patient final say should rest on an informed decision, and the current HIPAA process does not accomplish this. There needs to be some effort to help people understand how outcomes research benefits patients and their families while assuring them of adequate protection. Mr. Hungate and Dr. Lo agreed that this would require an independent entity, perhaps a separate Underwriters Laboratories-style body with obligations and a trustworthy vetting process that is transparent and accountable. Mr. McLellan added that recommendations are based on prior care. It is important to communicate information about the data that speaks to the benefits and risks, to help patients make informed decisions.

Mr. Rothstein thinks that Dr. Lo’s provocative suggestion about automatic access to data for research is, despite waivers, fairly “radical,” in that it would set aside the dichotomy between clinical intervention and research that has been recognized since Nuremberg. Is Dr. Lo suggesting a carve-out justified by a public health/research value? If so, concerns might arise around such “slippery slopes” as homeland security. Mr. Rothstein would be more comfortable with an educational model that better informs people about the value of broad participation in research studies and the accompanying security measures. Dr. Lo stated that the values of public health access to data and of protections for individual privacy are both very important. In situations with the potential for immediate, serious, visible harm to identifiable persons, individuals ought not to have the right to withhold information. In database research, potential harm is statistical (i.e., 10,000 people might not have used Cox-2 inhibitors had they known the drugs might lead to heart attacks). Some types of research are so beneficial to the public that, with appropriate safeguards and public understanding, it should be possible to proceed without individual consent. Decision-makers must be seen as trustworthy, accountable, and transparent about what is being done, why it is important, and what review process is in place. Public health is considering models that allow governors or state departments of public health to declare emergencies with broad powers and due process procedures.

Many states with disease registries (e.g., cancer registries) have stringent access and use rules, unlike the federal level of protection, in which the provider bears oversight responsibility. Ms. Pritts believes that it is imperative for the laws to address direct use of information by researchers. Dr. Lo would mandate a prohibition against researchers passing on information to third parties. It is important to remember that only the initial identifier is needed to link data; in other words, information can be turned over to researchers in a de-identified format. Without this, data for a five-year follow-up study would have to be reassembled. Mr. Hungate wondered why a drug on the market (such as Vioxx) couldn’t be accompanied by a statement that says: “this is going to be treated as data in a public health way; here are safeguards; and this is what will be done with the information.” He would make participation in the clinical trial a condition of taking the medication. Dr. Lo believes that the best approach is tailored, with alternatives and education offered. Mr. Rothstein does not think it possible to turn everyone into a research subject, and as such, choices would have to be made on a case-by-case basis. Dr. Lo suggested testing the creation of an entity that could guarantee privacy around a specific example. While the standard should be sharing, Mr. McLellan would not make participation mandatory because the patient should have the ultimate right to refuse. Ms. Pritts noted that there is accountability at the state level for such things as cancer registries because they have been enacted by state legislatures (and can therefore be voted out). She was disturbed at the thought of a “nebulous” entity making public health research decisions.

Statements from the Public

Medical Privacy Coalition Kathryn Serkes

The Medical Privacy Coalition is a non-partisan coalition of 29 groups (see transcript for membership).

A survey of AAPS physicians showed that an overwhelming number of physicians believe that third parties ask for information that violates confidentiality (51 percent cited government agencies and 70 percent cited requests from health plans). Eighty-seven percent of physicians reported that a patient had asked that information be kept out of the record; 78 percent had withheld information from a record due to privacy concerns; and 19 percent admitted to lying to protect a patient. Physicians say that requests to withhold information are increasing. The Medical Privacy Coalition objects to the current standard of how privacy is enforced – that public health usurps individual rights. Patients who fear disclosure withhold information. The biggest concern is government access to medical records. With regard to HIPAA, the Coalition contends that people may or may not have received a HIPAA privacy notice. They do not understand the difference between consent and being advised of information under HIPAA. They often don’t understand what they are signing and they are often deluged with paper work upon an initial doctor’s visit.

The benefit of IT advances will be lost unless patients and physicians can be guaranteed privacy. A new set of rules is needed because HIPAA is outdated. Because of the way HIPAA was written, nothing in the rule allows covered entities to avoid disclosures required by other laws. Because there are only limited restrictions on law enforcement, the Fourth Amendment should apply to this information. Medical records should be as well protected as papers in one’s home (noting fear of criminal prosecution when substance abuse is recorded in a health record). Ms. Serkes noted that patients are increasingly taking advantage of the growing number of cash-based practices (i.e., direct doctor-patient practices) to avoid privacy intrusions. The country is moving toward what the Coalition sees as HIPAA’s original purpose: a national database and a centralized model of electronic medical records.

Recommendation Patients say they want electronic medical records but not necessarily a government-run nationalized database of medical records. The Association of American Physicians and Surgeons is working with a company named WorldDoc that has a personal health record controlled by the patient. Some patients pay an annual fee to use this type of system. The Coalition would like to see the PHR as a first step. Consent must be reinstated.

Discussion

Survey results will be emailed to Marietta Squire. Ms. Serkes said that the survey did not differentiate between information requests submitted for purposes of payment (e.g., from Medicare and Medicaid), public health, or law enforcement.

Mr. Rothstein relayed a recent personal experience at an initial visit to a new physician, in which he received a copy of the HIPAA acknowledgment form but not the HIPAA notice of privacy practices. When he asked if it might be a good idea to pass along a copy of the document that he was being asked to acknowledge, he was told that “nobody reads them” and asked, “Why would you want to read them?”

Institute for Health Freedom Sue Blevins

Ms. Blevins read a prepared statement from Robin Kaigh, a private citizen and attorney who has tracked medical privacy since 1996, which spoke to the “danger” of wrongful access to or exposure of personal health record information; and to the promotion of an opt-in or opt-out system (see transcript for statement).

The Institute for Health Freedom is a Washington, D.C.-based think tank that studies and reports on the freedom of people to make their own health care choices and to maintain health (including genetic) privacy. HIPAA eliminates the right to give or withhold consent before personal health information can be accessed by others. The Office for Civil Rights needs to make clear that individuals do not have control over access to personal health information until this right is restored. The electronic health record is a “recipe” for privacy invasions because it allows an even greater number of people to access a patient’s medical records without permission. To support this position, Ms. Blevins read three quotes from HHS’s analysis of the federal medical privacy rule regarding electronic information (see transcripts for quotes).

Unless consent is reinstated and rights to privacy upheld, U.S. citizens will essentially be left with three options: 1) have their information shared without their permission when they seek care; 2) lie to their health care providers and others to maintain privacy; or 3) forego care to maintain their privacy.

Discussion

Ms. Blevins will submit copies of the statements as well as a copy of the study noted by HHS, which elaborates on the notion that a person who does not believe his privacy will be protected is much less likely to participate fully in his diagnosis and treatment.

Mr. Rothstein supports “consent” only because of its symbolic value as a traditional prerequisite of medical care, although he does not see that it has practical value, especially when a patient does not get to see the doctor unless s/he signs the form. Substantial questions about who gets access to information, how much, and under what circumstances, must be addressed in order to deal with privacy concerns. Ms. Blevins disagreed, stating that she was present at the hearing on behalf of many people who would not come before a government committee. She emphasized the “huge” difference between consent and notification, stating that what occurs at present is “coercive consent.” The patient needs to be able to ask about who gets the information and what happens. She spoke about the many reasons people want privacy (see transcript for example). A small minority really cares about consent and she sees the right to choose about where one’s personal information goes as a “fundamental ethic.”

Subcommittee Discussion

Mr. Rothstein noted that the draft letter hand-out was approved by the Subcommittee as a result of the November 18, 2004 hearing related to e-prescribing issues. The purpose was to draft language for the Subcommittee on Standards and Security to incorporate into their overall statement for submission to the full Committee meeting of March 3-4, 2005. Revisions were primarily editorial. A discussion about further revisions ensued (see transcript for details).

He then noted that the hand-out entitled Observations and Recommendations Relative to Privacy of E-Prescribing (drafted by the Subcommittee on Standards and Security) had two recommended actions, 10.1 and 10.2. Mr. Reynolds said that the full 18-page letter covers new and old testimony as well as information from the Subcommittee on Privacy and Confidentiality.

Mr. Rothstein expressed a concern about Action 10.2, which goes beyond the pilot test phase to say “HHS should use experience gained from the pilot tests to develop and communicate guidance to the industry.” Suggestive of the use of “soft” regulation, he wondered whether HHS would just develop and communicate guidance on handling privacy issues or develop privacy rules. Ms. Friedman interjected that the pilot tests are standards to help inform the implementation of the Medicare Part D benefit, noting that an RFP is being developed for pilot participants. Pilots will be conducted during 2006 and the evaluation will take place in 2007. Final rules will come out in 2008, to be implemented sometime in 2009.

Ms. McAndrew said that the main focus of the pilots is to prove the effectiveness of the foundation standards that are being adopted through rulemaking in 2005, and to prove the effectiveness of the other initial e-prescribing standards that will be formulated into regulations (as the foundation standards are) after evaluation. However, the focus of the pilots and of the evaluation is on standards for the e-prescribing system. Unless one of the standards included in the pilot is a HIPAA privacy-based standard, there is nothing for the pilots to test or evaluate with regard to privacy. The pilot tests will not evaluate other operative laws such as the HIPAA privacy standards, although pilot participants must adhere to HIPAA privacy and security rules. The evaluation outcome will not direct regulatory changes in standards that are not part of these pilot-tested systems. However, experience gained from the pilots can inform the Department and the regulators of the security and privacy standards about what needs to happen in certain contexts. The task then is to determine whether getting to “X” is accomplished via guidance or rule change. Ms. McAndrew thinks that these challenges are outside of the standards being scoped for the pilots and beyond the scope of the pilot tests.

Mr. Rothstein would consider deleting Action 10.2 or revising it to clarify that the Subcommittee on Privacy and Confidentiality does not know what the regulatory options are or what the Subcommittee or the Department might want to pursue with regard to a privacy rule and an e-prescribing rule as a result of the pilot (in contrast to the Subcommittee on Standards and Security, which has determined that this is “guidance”). Dr. Fitzmaurice interpreted the actions to mean: 1) Action 10.1: as long as the pilots are being done, it would be good to learn something about privacy; and 2) Action 10.2: if something useful is learned about privacy, be sure to tell the industry. He did not think that the privacy rules would be changed as a result of the pilots.

A spirited conversation followed about what HHS is looking to do. Ms. Fyffe thought that HHS might come out with more privacy regulations as a result of the pilot tests. Dr. Fitzmaurice thought they were looking for burdens and benefits, while Mr. Rothstein thought the opposite. Ms. McAndrew and Dr. Fitzmaurice agreed that the privacy rule applies as-is within the pilots. Mr. Reynolds noted a continual reference in the document to setting standards that are not already in place, while HIPAA security and privacy standards are in place. As a member of both Subcommittees, he wanted to know whether the implementation of e-prescribing raises other privacy issues that have not been addressed. He noted that previous HIPAA transactions have been claims- and eligibility-related rather than care-related, whereas the pilot process would operate at the point of care. If the pilots do an excellent job, for example, of addressing frequently asked questions or HIPAA transactions and code sets in ways that help the industry understand the privacy rule as it relates to e-prescribing, the industry should know about it.

Mr. Rothstein suggested adding to 10.2: “Communicate guidance to the industry on handling privacy issues or take other regulatory action as necessary.” Dr. Fitzmaurice clarified that the purpose was to inform the industry. He thought that recommending regulatory action to HHS after the pilots would be a different set of recommendations. Mr. Rothstein reiterated that it may be necessary in 2007 to do more than issue guidance to the industry. Others said it would depend on what was learned in the pilots. It was further pointed out that the term “guidance” is open-ended, implying a range of responses from FAQs to regulatory action.

A revision to Action 10.2 was suggested as follows: “HHS should use experience gained from the e-prescribing pilots to develop appropriate actions on handling privacy issues.”

Ms. Wattenberg had an additional concern about Action 10.1, noting that the federal Part 2 substance abuse regulation and DEA requirements place a higher level of restriction on some information than the HIPAA privacy rule. Mr. Reynolds said that these concerns are addressed in the Subcommittee on Standards and Security’s 18-page letter. Mr. Rothstein suggested adding to Action 10.1 that HHS should identify and address any privacy issues within the HIPAA privacy rule and other health records laws, tracking the prior language. Mr. Reynolds and Dr. Fitzmaurice agreed with Mr. Rothstein’s statement.

The Subcommittee passed a motion to endorse the stated revisions of Recommended Actions 10.1 and 10.2 of the document entitled Observations and Recommendations Relative to Privacy of E-Prescribing. Subcommittee members were unanimously in favor of the proposed amendments.

The meeting was adjourned at 4:18 p.m.


DAY TWO: FEBRUARY 24, 2005

CALL TO ORDER, WELCOME, INTRODUCTIONS, REVIEW OF AGENDA

As part of the Panel II hearings on Privacy in Health Care and in Society held on February 23, 2005, Mr. Rothstein noted that the written testimony of Michelle Bratcher Goodwin on the implications of electronic health records, and of health care records in general, for minority populations would be entered into the record (see transcript for written testimony).

National Health Information Network

Panel III: Disease and Health Advocacy Groups

American Cancer Society Len Lichtenfeld, M.D.

Central topics and intent Discussion centered on the internet, its impact on the American Cancer Society (ACS) as well as on collaborators and those served. Central topics included: ACS’s internet-based information strategy and specific programs; the National Cancer Database; and electronic health records and related technologies. The intent was to describe how ACS addresses privacy concerns and to share how privacy issues impact activities and parties served.

Background ACS’s broad-based information strategy, which began in 1996 by reaching out to constituents nationwide, was followed by the development of a call center and a website, Cancer.org, in 1997. Currently, 1.5 million people visit the website and almost 100,000 contact the Call Center each month. When ACS realized that its obligations for privacy extended beyond HIPAA, it developed its own guide for privacy activities entitled Health Information Confidentiality Principles (finalized in September 2000).

Cancer.org has a privacy statement on-site as well as a practical privacy policy. To wit, ACS collects and requires no information about or from visitors to the site. Beyond using cookies to direct people to areas of interest, visitors enjoy free access. Other services that may require more information are available to constituents, but only if they seek those services. The interactive Cancer Survivors Network (launched in July 2000) has been active and quite effective as a support mechanism, with 1.4 million visitors, 54,000 registered active members, and 1,200 – 4,500 new member registrations monthly. The network, which includes message boards, private and secure internal CSN email, and personal web pages, is built on trust and respect. Registration information is limited to a screen name, a password, a valid email address, and a zip code, plus, if desired, personal profile information that helps people find others with similar circumstances in order to share experiences. ACS does not use identifiable information for internal activities, but it does gather aggregate information for tracking and determining trends. While many share personal information openly on the Network, there have been no serious breaches of confidentiality to date.

Two other programs on Cancer.org have significant requirements for disease-specific information: 1) the treatment profiler, which works in cooperation with NexCura, whose privacy policy was developed by the company in cooperation with other well-recognized, not-for-profit voluntary health organizations; and 2) ACS’s clinical trials matching service, which cooperates with EmergingMed under a privacy policy created collaboratively by ACS and EmergingMed. The treatment profiler uses an opt-out strategy, and the matching service uses an opt-in approach, to offer further products or responses based on the data entered. With the opt-out strategy, consumers allow an organization to maintain a database and use the information to advise patients of various opportunities (e.g., medications, treatments, or clinical trials). These messages go beyond the initial intent of entering data into the system by helping to determine the best treatment options for specific types of cancer at particular stages. The information, which is often commercially supported, is generally viewed by patients and families as a useful resource rather than as an unwanted intrusion. In comparison, the opt-in clinical trials matching service keeps data completely under the control of the patient and/or family. Here, the need to proactively protect privacy is paramount. Yet ACS has found that almost all patients who pursue information about clinical trials on the website are willing to provide detailed information about themselves and their disease. ACS has also learned that telephone back-up must complete the information process because many users may be email literate but not internet literate.

ACS works closely with the American College of Surgeons and their Commission on Cancer (CoC), which operates the national cancer database (NCDB). The database is effective in gathering and utilizing aggregate data to improve the quality of care for cancer patients. The CoC has accredited 1,425 cancer programs to date throughout the U.S. Because data is gathered the “old-fashioned” way (e.g., registrars going through charts and relying on sources outside the hospital to provide periodic follow-up information), the delay in obtaining information limits its usefulness. Additional limits occur when data is not gathered for diagnoses and treatments offered outside of traditional hospital settings. Funding is being sought for a three-phase initiative to implement a plan that improves these data processes. At present, NCDB is working to obtain real time information in cooperation with a national hospital system in order to impact the quality of care by providing rapid feedback to physicians. The NCDB systems are HIPAA compliant but the creation of health information networks with data element standards in place would be a big step forward. To function properly at present, NCDB must negotiate privacy agreements with each cancer program, taking into account varying rules about privacy protection from all 50 states (the stricter state laws prevail over federal). They are working to establish a link to the National Death Index.

Summary Dr. Lichtenfeld stressed the need for everyone at the local, state, and national levels to work together to address broad legislative, regulatory, legal and ethical concerns related to privacy. The country needs clearly understood rules to strengthen valid research and quality improvement as well as guidance that allows for information-sharing across state borders. ACS hopes for policies that support their mission to educate the public, collect data to improve the quality of cancer care, and allow the conduct of research using the internet. To cancer patients, “information is crucial, time is of the essence, and accuracy is critical.”

Juvenile Diabetes Research Foundation Robert Levine, M.D.

Background on Juvenile Diabetes Research Foundation (JDRF): JDRF was founded in 1970 by the parents of children with Type I diabetes. Today, it is the world’s largest non-profit, non-governmental contributor to diabetes research, projected to fund over $100 million in 2005.

Failure to share and inadequate use of health information Dr. Levine’s testimony is personal (his wife has had Type I diabetes for 40 years), although his point of view represents concerns of JDRF’s key constituents. He believes that the greatest threat and biggest risk to people with diabetes, heart disease, cancer, HIV/AIDS, or other chronic diseases and disabilities comes from a failure to share and inadequate use of personal health information and, at times, from placing a higher value on protecting privacy than on protecting a person’s life, health, and the health of their families and communities. To make his point, Dr. Levine read from the abstract of a paper presented by the University Health System Consortium Diabetes Benchmarking Project entitled “Quality of Diabetes Care in U.S. Academic Medical Centers: Low Rates of Medical Regimen Change,” published in Diabetes Care in February 2005 (see transcript for details). The study documented underuse of critical information and a corresponding failure to achieve clinical goals.

Referring to Rich Lowry’s article entitled “Civil Libertarians versus Public Health: A Dangerous Impulse” in a recent National Review, Dr. Levine cited dramatic reductions in the incidence of HIV/AIDS in newborns in New York State following the rejection of “privacy obsessions” with the passage of a 1996 bill mandating that all newborns be tested and their mothers informed (see transcript for details).

The first order of a national health information infrastructure is to dynamically support the achievement of optimal health-related quality of life and function for all. It is within this context that privacy and confidentiality issues are best addressed. Referring to the work of the Child and Adolescent Health Measurement Initiative overseen by Dr. David Lansky, Dr. Levine noted that the health information environment should be designed to respect and serve patients in addition to the health system and the public (see transcript for further information). A fundamental principle is that the information in medical records is owned by the individual rather than by third parties who might be health record keepers. Therefore, information users are involved in a voluntary exchange of value or definable direct benefit to the individual. Dr. Levine believes that individuals should be “paid” (in social, personal, or financial benefit) for each use of their information.

The concept of who “owns” the information is important because it drives decision-making and design as it frames relevant privacy protection issues from the perspective of assessable risk and definable and often immediate benefit. It also challenges us to determine who the recipient of the information is. If the hospital “owns” the record, the information is not reliably or proactively used to inform specific individual actions to improve outcomes. If the individual “owns” personal health information no matter where it resides, the information can be transmitted to the patient with a statement of benchmark goals as well as with suggestions for therapeutic enhancement, self-care, and follow-up. Dr. Levine believes that compensation (i.e., direct personal benefit) should be given to individuals contributing health information to epidemiologic research or clinical trials such that they would be notified immediately of relevant findings and informed of whom they might call for more information, counsel, and action. Individual ownership also informs how protection is pursued and how violators are punished, including crafting a severe judicial response to violations that reflects societal revulsion. There must be public confidence and an extensive and open exchange of sensitive digital self-information between trusted parties (see transcript for analogies and stories intended to prove these points).

National Council for Community Behavioral Healthcare Linda Rosenberg, M.S.W.

At the heart of the privacy issue in mental health is the tension between discrimination against people with mental illness and the importance of an automated system. There is a great divide between what is effective and what people “get” every day.

Background The National Council for Community Behavioral Healthcare is the nation’s oldest trade association representing community-based providers of mental health and substance abuse services (which often include services for people with developmental disabilities). Members include community mental health centers, hospitals, state associations of providers, and local behavioral health authorities. Members serve four and one-half million people across the country, employing more than 225,000 staff in a range of settings that include schools, jails, and clinics.

The Federal Center for Mental Health Services estimates that 21.4 percent of adults experience a mental health disorder in any given year and five to seven percent of the population suffers from a serious mental illness. Twenty percent of young people experience a diagnosable disorder in a given year, with five to nine percent of children and adolescents having a serious emotional disorder. A substantial portion of people with serious mental illness are served by the public mental health system because they are uninsured, underinsured, or on Medicaid or Medicare. There continues to be a lack of parity between the treatment of mental and physical health. As such, the public dollars and often the Medicaid dollars underwrite the work of private health insurers.

Recommendations by New Freedom Commission A 2004 report issued by the New Freedom Commission (appointed by the President) had as its sixth goal the use of technology to access mental health care and information. Recommendations were: 1) to use health care technology and tele-health to improve access and coordination of mental health care, especially for Americans in remote areas or within underserved populations; and 2) to develop and implement integrated health record and personal information systems. The Commission noted that a transformed mental health system would consistently use evidence-based, state-of-the-art medications and psychotherapies as standard practice.

Topics of presentation Why an electronic health record is important for the field; obstacles; and what assistance is needed to leverage a national health information system to improve the quality of mental health care and protect patients’ safety, privacy, and confidentiality.

The care of individuals with mental illness generally follows two basic patterns that impact electronic health records. Those with a low to moderate mental health complexity often require one or more brief episodes of care, generally provided by a single provider in a clinic. Those with chronic and persistent disorders (that may include one or more psychiatric crises that involve police, emergency rooms, or psychiatric crisis workers) often require long-term care that includes multiple levels and teams of clinicians. Both groups begin a clinical experience with an initial screening, a face-to-face assessment, and the development of a written care plan. In the majority of mental health organizations, these clinical activities are handwritten on paper. A study at a mental health center in California illustrated a significant loss of documentation and staff hours (and therefore money), which translated into people going unserved (see transcript for details). In addition, when a crisis worker encounters someone having a mental health crisis, little or no clinical information is available even for clients who have received services for years. Often, the results are costly, including unnecessary hospital admissions, less than ideal crisis intervention, and lost ground.

Currently, a “de-skilled” and inexperienced workforce of psychiatric technicians and social workers provides care to the most seriously psychiatrically ill adults and children. Emerging best practices include the development of on-line crisis plans with at-risk clients before a crisis occurs. Electronic crisis episode tracking systems are being established in some communities, as are laptop computers for crisis workers with encrypted cellular modem cards that allow secure access to relevant clinical records and plans. While the difference in outcomes has not yet been studied, organizations using these improved systems experience dramatic changes in their ability to know substantially more about a person’s clinical history, medications, and current care.

Diverse states are showing that the presence of electronic health records positively impacts the ability to provide evidence-based care for the following reasons:

  • The likelihood of a proper diagnosis and correctly identified functional impairment is increased within some electronic health record products that allow for entry of on-line assessment instruments;
  • An automated medical health information record can help to correct ongoing problems (such as an incorrect initial clinical formulation) because many software packages allow the clinician to connect to an individual’s treatment plan, which contains evidence for the diagnosis.

Obstacles It is very difficult to practice evidence-based care with paper and pencil due to its data-free environment, especially in light of the lack of outcomes tracking at the individual, program, or population level. The anecdotal and data-free character of mental health services will not change until an automated system with built-in scales is used.

Vendor mismatch is another obstacle to implementing an electronic record (see transcript for California example). Lack of interest in the software community also hinders an ability to use and benefit from technology. Fifty states have different sets of rules and codes for treatment, diagnosis, and assessment. One vendor in Oregon is now in eight states with over 7,000 data elements in their database. As a result, major information technology vendors shy away from working in mental health, which illustrates the dramatic difference between the availability of in-house information technology resources in hospitals and general medical clinics versus mental health clinics.

Another difficulty is the transition from paper and pencil to an electronic system. A phased-in approach is a must, due to the complexity of the mental health environment and the volume of forms. Running paper and electronic systems in parallel during the transition significantly increases the cost and complexity of implementation.

Enormous discrimination still exists against people with psychiatric disorders despite President Bush’s message that mental disability is an illness.

On privacy There are probably more built-in safeguards in an electronic health record than in a paper record.

The National Council needs to: step up its involvement within the health care information technology industry; work with its members to develop an electronic health record strategic plan; and ask that specific federal funding and technical assistance be earmarked to support behavioral health efforts.

A good example In November 2004, California passed Proposition 63, which levies a one percent tax on personal income above $1 million and dedicates the proceeds to mental health services for adults and children, a portion of which is earmarked for infrastructure development, including technology.

Discussion

Dr. Steindel noted that people are willing to share information after the fact (cancer patients) rather than before the fact (prevention), which impacts diabetes, evidence-based care, and sometimes mental health. He wondered how to influence lifestyle behavior when dealing with groups that want to protect the privacy of their lifestyle behavior. Dr. Lichtenfeld also believes that prevention is a major mission though he thinks that figuring out the elements of prevention is at the heart of the work. This country does a “lousy job” of prevention relative to implementing lifestyle changes (personal) as well as at the public health and transactional (practitioner) levels. One study found that it would take 7.4 hours/day for health care professionals to do “what they should” with prevention. The country needs different systems to remind practitioners about what needs to be done, to make information available to patients, and to deliver a consistent public message. Dr. Lichtenfeld thinks that other countries are further ahead with prevention because of the electronic health record and a national health infrastructure process.

Relative to privacy concerns, resistance to gathering information (sometimes political) about obesity or mental health in school children was mentioned.

With regard to privacy on a website, Dr. Lichtenfeld reiterated that people with password protection willingly give out a great deal of information, with few problems arising. With so many layers of requirements from local, state, federal, and institutional sources, many of his research colleagues hope for the elimination of privacy regulations and advocate for open, accessible data (“with the appropriate protections”). He pointed out that the toughest negotiations have been with federal health care organizations. He emphasized the need for uniform, simple, clear, understandable regulations that can be applied reasonably. Dr. Levine added that health professionals react to the “real” circumstances they are confronted with rather than to written regulations.

A regulatory approach to privacy (as opposed to a professional approach) will lead to a more bureaucratic approach to the compliance checklist, according to Dr. Levine, with little understanding by the asker or the signer of what and why s/he is signing. But a regulatory agency like JCAHO will affirm compliance with these signatures and completed checklists. Further, he believes that people do not understand HIPAA or know how to comply. He stressed the need to get back to the ethos of trust between provider and patient and to build a system that supports that trust.

Mr. Rothstein noted that the morning’s testimony reinforced the notion that patients have a strong interest in research and public health such that the debate is not one of public versus private interest. Patients are interested in their medical information and privacy but are also interested in research and clinical care. Mentioning high suicide rates among physicians (and sometimes law students) due to concern about licenses being pulled in a paper-based system, he wondered if mental health patients might refuse care if their records were not confidential. Ms. Rosenberg thought the issue was really about discrimination against people with mental illness. While certain conditions like depression are now discussed more openly, the anxiety of those with mental illness is increased due to “limited technology in effective treatment” that adds to the fear of stigmatization. While electronic retrievability exacerbates these fears (because the information “never goes away even when you want it to”), no one is saying that things will get worse with the use of an electronic health record system. While Mr. Rothstein hopes for parity in the long-term, his short-term concern is that people might forego beneficial mental health treatment with the use of electronic records, which he named as an issue that the Committee needs to address.

While patient ownership of the health record is “intriguing,” Mr. Rothstein nevertheless finds the concept of “ownership interest” to be problematic. He suggested consideration of clinical application of an already existing research rule, in which IRBs don’t approve research unless re-contact is considered; and subjects must be given an option about being re-contacted. The focus, then, would be on the professional relationship – and on the benefit to the patient rather than on obtaining permission to use the individual’s health record. In his construct, Dr. Levine clarified that the patient would “own” the information no matter what record it is in and the keeper of the records would own the record itself. The patient would therefore have the right to be apprised of the information’s use and relevance (including benchmarks) in order to promote his or her health on a timely basis.

Dr. Lichtenfeld added that the NCDB is working to improve quality of care. As it collects information, NCDB hopes to feed what it learns back to its patients, which, given years of delay in the data, does not happen at present. He reported that each of the 1,425 institutions participating in NCDB is impatient to receive the information. What would happen, he asked, if the person entering the data was required to notify an individual of non-compliance?

Referring to a recent article in The New Yorker by Dr. Gawande (“Complications” Dec. 6, 2004), Mr. Hungate noted a growing discomfort among patients about how well they are served by the system. That there is no apparent response to variations in diabetes management is in part related to the privacy issue (on the provider side, relative to the malpractice threat). He wondered if dealing with patient safety and medical error in a professional, protective way would bring a better measure of result (see transcript for example). Ms. Rosenberg said that in mental health, the issue is not the threat of lawsuits, which happen rarely though they are sometimes instrumental in moving a state’s mental health system forward (as in Alabama). The U.S. has a complex health system that cannot do what Great Britain, as a single-payer system, did in 2000 when it reformed the system with clear standards around evidence-based practices in the treatment of psychiatric illness. Dr. Levine referred to episodes versus continuity of care with the intersection of expert and personal information flows as well as a growing knowledge base. Outcomes, then, depend on access to and immersion in those changing information flows as well as on the frequency of sampling and the quality of interpretation and communication of dependent relationships. One would get better outcomes, then, for a patient with Type I diabetes based on an immersive relationship with the patient and his family and continuous observation, because decision-making is iterative. Dr. Harding added that the failure to use data could be construed as negligence.

Dr. Lichtenfeld disagreed with non-physician colleagues who believe that the way to solve the screening problem in the U.S. is to sue doctors who don’t do it. Stating the importance of developing “an environment,” he and Dr. Levine agreed that creating trusting partnerships is far more effective than threatening litigation as a means of improving public health.

Asked if an EMR would replace workforce skill, Ms. Rosenberg said that it might not replace skill but that it would make apparent the lack of it, which is hidden within a non-transparent system at present. Engaging and keeping people in treatment are major concerns. For the skilled worker, an electronic health record with built-in decision prompts is a “terrific boon.” Dr. Harding wondered how a psychiatric worker could look at a computer while raising sensitive issues with a client. Dr. Levine said that the Institute for Healthcare Improvement recommends using abbreviated versions (the “80% rule”). Dr. Lichtenfeld said that EHRs are enterprise- rather than physician-efficient and that the system will not work until this changes.

There was general agreement about greater use of EHRs for behavioral health in order to ensure well-protected health information. Dr. Fitzmaurice wondered whether this would translate into greater use of electronic health data for research (gathering unbiased, scientific information about what works) in the privacy arena. He asked if others believed in greater use of electronic health record data for research and whether the current research protections of the privacy rules suffice. Dr. Levine supported gathering as much data as possible for use within clinical engagement, with the understanding that protections were as adequate as the value of the exchange. Dr. Lichtenfeld said that the central question of whether treatments are working (and if so, what problems people face in years to come) cannot be answered until a portable, standardized set of datasets and data formatting is developed. NCDB’s data show that only a decided minority feel that HIPAA has adversely impacted their ability to do research, despite a perception that it has. Therefore, one cannot undervalue the impact of HIPAA on the research equation; viewed as onerous, it must be made to seem reasonable, consistent, and understandable.

SUBCOMMITTEE DISCUSSION

Future meetings The Subcommittee’s second hearing, scheduled for March 30-31, 2005 in Chicago, will focus on the interests, issues, and concerns of provider individuals and groups such as the AMA, AHA, ANA, and others about the new national health information network. The third hearing will be a more concrete level of discussion, with presentations about privacy concerns and practices by plans that have already adopted electronic health records. Designers of electronic health records will also make presentations about what is realistic and possible relative to protecting privacy in future systems. The Subcommittee is likely to have something to share that represents multiple viewpoints for the full Committee by its September 2005 meeting (though there may be some early findings to communicate at the June 2005 meeting). Ms. Marietta Squire will circulate a calendar to finalize dates for future hearings (see transcript for further discussion about hearing participants).

Mr. Hungate suggested posing a simple set of questions that would draw on testimony heard to date about what ought to be in the content about privacy – a “what-if” statement of what the structure of privacy might need to be. Mr. Rothstein suggested developing questions after the next round of provider hearings to provide guidance to the third round of testifiers. He also thought that developing a simple set of questions could be a complicated process requiring more input. As such, the task might become the responsibility of some other body.

As a developer, Mr. Reynolds said that most medical records are designed for ownership by providers within systems that deliver the records to providers. This contrasts with the basic premise of these hearing discussions of patient ownership and control over where medical records go. The Subcommittee, he suggested, might want to “paint that basic picture.” Mr. Rothstein wondered about the current privacy assumptions of developers. Dr. Deering framed the issue for developers by suggesting that, in future hearings, providers, vendors, and practitioners ask the following: is it useful to set aside the record as the unit toward which policy is directed, or is it more palatable to deconstruct the record and direct policy and practice toward content? What makes privacy easier from a practice point of view relative to institutional requirements? In keeping with the theme of the February 23rd hearings, Mr. Rothstein thought that providers might take issue with restricted access and have qualms about patients filtering their information. Dr. Harding agreed with Mr. Reynolds that medical records are developed for doctors and insurance companies. Providers will be reluctant to change until information exchange becomes easier, more efficient, and produces better outcomes (noting that this may result in a decrease in patient control). Mr. Rothstein articulated the “incredible transition problems” that lie ahead with the development of a prospective system alongside a legacy system, which may in the short term deter some people from obtaining treatment.

Culling key issues to develop a composite document In developing his tentative list of key issues, Mr. Hungate wondered whether his list jibed with those of others. He mentioned Alan Westin’s suggestion for a standard-setting activity around privacy (perhaps DSMO-related [Designated Standards Maintenance Organization]), which currently does not exist. He suggested the development of a summary of the first hearing for use as input to later hearings. Mr. Rothstein suggested circulating the draft minutes and asking participants to add to them to create a composite document of issues to consider (Ms. Fyffe volunteered to finalize such a document and requested input via email as soon as possible).

Mr. Reynolds stressed the need to balance the public good, patient good, provider good, and research good in order to come to agreement about how these elements might work together to change environments. He hopes that the Committee will have this discussion, to which he will contribute input from the industry perspective. He wondered about negotiation between public and personal ownership and expressed concern about everyone testifying “in their bubble.” To date, no one has suggested how to tie competing elements together, and Mr. Reynolds wonders whether this is even feasible. He hopes to extrapolate from more complicated structures such as institutions that are not “closed environments” (as, for example, the VA is).

Ms. Greenberg pointed out that different points of view were also presented by the Quality Workgroup. Whether privacy concerns revolve around patients or providers, the public good and improving the health care system must be balanced with privacy concerns. Bringing these elements together poses significant challenges. She mentioned the significant need for public education while acknowledging the difficulty of identifying whose responsibility this is. She identified the complexity of the issues in today’s discussion, which is accompanied by confusion, misinterpretation, a lack of good communication of the “public good,” and a lack of clarity about rights and responsibilities. Dr. Deering noted that the Markle Foundation’s Connecting for Health project has made public education a major thrust. She recommended that they not presume adversarial interests when engaging providers. A list of questions might help them look for points of agreement and overlapping interests in the service of strengthening the clinical relationship. She cautioned against making assumptions and noted that many providers in recent NHII hearings said that medical records serve an institution’s administrative, financial, and legal interests rather than those of health care providers. She wondered if providers might not need or want the whole record (legal, financial, and record of care) in order to provide quality care.

Dr. Fitzmaurice noted that the Committee produces recommendations to the Secretary that take into account the “pulse” of people who testify, which, in turn, suggests the “pulse” of the nation. The biggest lever is the HIPAA privacy rule. Testimony to date does not indicate that the privacy rule isn’t doing its job. While it may be a burden to some (especially in research), the HIPAA privacy rule has generally been accepted in its three years of existence. He thinks the Committee’s work is to present hearing results and considerations (such as the privacy rule’s scope and burdens) as the discussion about the national health information infrastructure continues. Referencing testimony from the Subcommittee on Standards and Security about e-prescribing, Mr. Reynolds disagreed. The current system of PPOs and multiple providers means that data passes through many hands with no one knowing where it is going or what the process is. “Covered entity” still has significant holes relative to how data flows from network to network, person to person, and group to group. Dr. Fitzmaurice recognized a larger burden of responsibility for information release in the hands of the individual, beyond the parameters of the covered entity.

TWO ACTION ITEMS

  1. Kathleen Fyffe and Maya Bernstein will summarize testimony for the first round of hearings, to be circulated to Subcommittee members for input.
  2. A poll will be circulated for the third set of privacy hearings (to be held either in April, May, or June 2005).

Panel IV: Consumer Advocacy Groups

AARP Joyce Dubow

Focus Consumer and public education about the National Health Information Network and related privacy issues relative to the purported adversarial relationship among stakeholders. Ms. Dubow noted that the GAO Report indicated that most people don’t have a good understanding of HIPAA.

Background The AARP Public Policy Institute is charged with doing policy research and analysis intended to stimulate and inform the public debate about issues important to midlife and older people. AARP has over 35 million members, with approximately half of its membership under 65. Membership is very diverse, making it difficult to generalize, and yet core values exist for all members. AARP’s interest in health information technology (HIT) is grounded in a conviction that quality problems exist in the U.S. that could be improved with technological interventions. The IOM Quality Chasm Report emphasizes the need for systems change to improve health care quality. AARP has not yet done explicit research about what members think about personal health records (PHRs), although some have asked for access to an electronic tool that can help them manage their health information.

HIT can hasten improvement and presents enormous opportunities. As such, it is a means to an end (the improvement of quality). Electronic records bring decision support to physicians that can accelerate and disseminate knowledge, ensure the provision of appropriate care, and reduce error. Patients become more active partners with access to their health information, which is especially beneficial to people with chronic illness (note references to the Wagner self-management model of chronic care). Access to information also enhances patient/provider communication.

Clear and agreed-upon definitions are lacking (as discussed with David Lansky in a January 2005 Subcommittee meeting relative to PHRs – see transcript for further information). Research produced by the Markle Foundation and Harris Interactive indicates the need for more research to better understand consumer perspectives on privacy. AARP has commissioned an international study that compares how other countries use PHRs (scheduled for completion by December 2005). AARP is pleased that Dr. Brailer (National Coordinator for Health Information Technology) and others are reaching out to consumers to stimulate more interest in these areas.

The importance of privacy and confidentiality to consumers cannot be overstated. A Pew study has shown that a substantial proportion of the population would not trust having private information on the web. They are not convinced that an electronic format is “better protection,” and they see the opportunity for mistakes. It is important to clarify people’s assessment and perceptions of the safety of electronic records.

AARP’s policy on privacy and confidentiality supports compatible procedures that allow health care delivery and research to occur, with the understanding that patients control and know how to access their information. Individuals should have the right to: examine and copy the contents of their health records and know the identities of people who examine their records; and determine who may have access to personally identifiable health information and for what purpose. AARP opposes disclosure of an individual’s medical information except as authorized by the patient for public health reporting as required by law (for enforcement of the financial integrity of publicly-funded programs), or for research, quality assessment and improvement, provided that personal identifiers have been removed whenever possible.

AARP supports actions that make individually identifiable health information less vulnerable to inappropriate disclosure and misuse. Although HIPAA represents a step forward for health privacy, some provisions concern AARP. Written consent should be required before information is shared for treatment, payment, and health care operations. Notification to patients of a provider’s privacy policies is inadequate because it denies the consumer the right to privacy. Covered entities should be required, on request, to account for information disclosures when treatment, payment, and health care operations occur. Patients should be able to learn who has access to their individual health information.

To summarize, AARP believes that privacy and confidentiality cross all issues involved in the very close relationship between health care quality and IT. Consumers are not yet fully involved or engaged in this discussion.

National Consumers League Linda Golodner

Focus To provide a patient-oriented perspective on privacy issues within the context of health information technology.

Background The National Consumers League (NCL) is the nation’s oldest consumer organization. Founded in 1899, this non-profit group addresses issues such as child labor, privacy, food safety, health care, financial services, and fraud. NCL held a conference focusing on privacy in health care in 1990. NCL has recently developed a coalition called SOS Rx, which focuses on improving medication safety by encouraging broader use of electronic prescribing. NCL believes that rapid development of a National Health Information Network (NHIN) is critical to patient safety and health care system efficiency, but not at the expense of privacy that enables patient/provider trust.

NCL urges policy makers in HHS to integrate the following principles:

Information access and control

  • An individual’s ability to access information and exert control: the structure and rules must facilitate a person’s ability to exercise their personal health information rights, at a minimum.
  • An individual’s ability to control who has access and permission to use personal health information over the network (directly or via proxy) without coercion or pressure.
  • An individual’s ability to review accesses made to personal health information. The individual or entity should have a unique digital signature that connects to a standardized profile of those reviewing personal information.
  • No personal health information should be available to a provider or health professional that is not available to the person it describes (except in cases of danger to patient).
  • Unreasonable or unaffordable fees should not impair a person’s ability to review and contribute to their personal health record.
  • An individual’s ability, at their liberty, to add comments and annotations to their personal health record.
  • An individual’s ability to request amendment or correction of their personal health information and receive a timely response.
  • The NHIN must provide a sound method of securing access and authenticating individual patient users that does not require physician or institutional mediation.
  • An individual’s ability to designate (and withdraw designation from) proxies with full authority to manage their personal health information on the network.

Disclosure and accountability

  • An individual’s ability to fully understand existing policies before giving information.
  • Information elements central to network functioning such as identifiers, authorizations and permissions, access histories, and index entries must be presented in easily understood terms and formats to patients, consumers, and authorized users (such as caregivers).
  • An individual’s right to know all the possible ways their information may be used and an ability to choose whether to make personal health data available to various systems. The NHIN must permit a distinction between data storage and data use.
  • Communications with people about policies and uses of their information must be conducted in simple and easily understood language.
  • States should adopt common operating standards for security and patient privacy protection.
  • People must be able to receive complete paper copies of their information available across the national network.

Functionality

  • NHIN must provide the capability for people to reliably and securely move all or portions of their personal health information from one health care entity to another.
  • NHIN should permit aggregation of non-identifiable data in support of quality measurement, provider and institutional performance assessment, prescription drug monitoring, patient safety, public health, and other public interest objectives.
  • Non-identifiable data sets generated from the NHIN should not be used for insurance underwriting or other commercial applications intended to provide preferential pricing or services.
  • Implementation of NHIN must be accompanied by a significant public education program so that people understand the value of the network, its privacy and security protections, how to participate, and their rights and benefits.
  • NHIN must permit patients to transmit information to their health care providers and receive information from them.

Governance

  • Consumers and patient advocates must have significant representation in the governance and advisory structure of all regional and national NHIN authorities, including standard-setting and operational entities.
  • The governance and administration of NHIN must be public, transparent, and accountable.

Closing thoughts NCL suggests viewing these principles in their totality. It urges HHS to leverage the interests of consumer advocacy groups in this arena.

NOTE Ms. Golodner is on the board of the Patient Safety Institute, which is working on some electronic record pilot projects. The board is comprised of three consumers, three medical or health professionals and three institutions. She recommends a presentation by this group at a future hearing in light of its applicability as a model for governance and interoperability. Ms. Fyffe mentioned that a component of this model has to do with the trusted third party who holds information.

Discussion

Mr. Rothstein asked for clarification about the following scenario:

A 55-year-old who wants to purchase long-term care insurance knows that, in long-term care facilities, people who need skilled nursing care cost at least twice as much as people who don’t. The long-term care insurance company wants access to complete medical records. Under the AARP policy, it appears as though nothing would prohibit a long-term care insurer from making the signing of an authorization a condition of applying for the policy. Under the NCL policy, this appears to be impermissible because the decision to share must be made without coercion or pressure.

Ms. Dubow said that AARP subscribes to the principles delineated by Ms. Golodner, which were developed at a Markle Foundation meeting that both organizations attended. There was a debate at that meeting about the extent to which consumers could realistically control their information in the context of purchasing health insurance. In addition, while buying insurance is voluntary, concerns have been raised about those instances in which insurers might invade privacy by visiting websites in order to make determinations about changing premiums after coverage has been purchased. AARP’s policy does not take every situation into account, but it does recognize that consumers cannot always control an insurer’s access to their information if they want a policy. AARP has extensive policies on insurance coverage which try to balance the need to ensure access to coverage with other issues. While AARP still strongly supports community rating, it has been flexible over the years due to its interest in seeing more people gain insurance coverage. There are no black and white answers in this complicated area, and compromise is often called for.

Ms. Golodner thinks that consumers will have to compromise when acquiring insurance. She wondered if there were ways to balance the need for information required by insurance companies or long-term facilities with the invasion of individual privacy.

Mr. Rothstein referred to a discussion on February 23rd about control of the nature of information to be disclosed. Sometimes, what is presented as a health privacy issue is really about the right of access to health insurance or about conditions for medical underwriting. There is a limit to regulating privacy relative to access issues where third parties have economic leverage.

On identification and de-identification Mr. Reynolds asked for clarification about the definition of “personal identifiers” and “de-identification.” Ms. Dubow agreed that this area needs further clarification and study. The place to raise these concerns within AARP is its National Legislative Council, a group of volunteers that proposes public policy positions to the board of directors. AARP wants de-identified information to be used as much as possible but also recognizes that more specifics are sometimes needed to address quality concerns. NCL supports de-identified information very strongly but recognizes consumer choice (see transcript for example). Ms. Golodner suggested that there should be a way for consumers to give blanket permission for use of their information, if they choose. From a quality perspective, Ms. Dubow suggested a need to get adequate performance information at the provider level, which would suggest the need for some de-identified information. But the vast majority of patients do not know what quality information is used for or why it is necessary, nor do they understand how the privacy and confidentiality of their information is preserved. A better understanding of how consumers view these issues and a better public education campaign are needed. While health care quality is the consumer objective, only 55 percent of the population receives recommended care at present. It is in consumers’ interest to see quality improve. Dr. Fitzmaurice added that if data were 100 percent de-identified, bias would shrink due to the inclusion of rare events, and consumers and the general public would be more fully informed of the whole picture. On the other hand, identification and linkage would be useful for follow-up treatment or for quality and patient safety. Should consumers trust IRBs so long as consumer interest is represented? Has the HIPAA privacy rule been a “net plus” for consumers? Ms. Golodner thought IRBs could be trusted with consumer representation. She and Ms. Dubow perceive the HIPAA rule as a net plus because it encourages consumers to think about privacy as it puts the institution or the health professional on guard. Citing great variation, Ms. Dubow questioned the reliability of IRBs. She thinks HIPAA is an important piece of legislation that could be improved.

Asked if AARP was advocating patient authorization for public health reporting, Ms. Dubow stated that the AARP’s policy opposes disclosure except as authorized by the patient for public health reporting.

Reflecting on the “educational task about HIPAA,” Dr. Harding asked if Ms. Dubow and Ms. Golodner had suggestions or resources to help shape education, attitudes, or people capacity. Ms. Golodner emphasized that there was more to do than educating the public or patients about privacy issues and HIPAA. The institution that asks people to sign the form, often via a clerk, has to make sure that its meaning is communicated and understood. Ms. Dubow mentioned that AARP’s Public Policy Institute will soon publish a peer-reviewed paper that examines HIPAA with respect to its use in health care quality and research; and another about patient access to medical records. Although an advocacy campaign on HIPAA is not on AARP’s agenda at present, HIPAA issues will be raised in discussions about health care quality and health information technology.

Mr. Rothstein questioned the creation of an electronic system that might leave many seniors behind, thus developing a “two-tiered” health information system. A recent Kaiser Family Foundation internet study found clear differences between users 50-64 years old and those over 65. As people age, their ability to use the internet might diminish due to visual impairment or simply not wanting to deal with the burden of information. At the same time, there are 90-year-olds who use the computer, so targeted approaches are called for. This is an area for further research. Ms. Golodner said that information must also be available on paper.

Panel V: Privacy Concerns Related to Personal Health Records

MedicAlert Janet Martino, M.D.

Background and philosophy MedicAlert works to protect and save lives, enhance patient safety, and improve quality of care by providing secure data collection and storage as well as accurate and easy transfer of medical information to health and safety professionals 24 hours/day worldwide.

A new service, likely to be offered later in 2005, is the use of personal electronic health records, designed to be compatible with health care industry standards (in collaboration with CapMed). New sources beyond patient-entered data are anticipated. MedicAlert already has an EHR for managing medical information of its members. The CapMed personal health record product will allow members to maintain their own information and records with bi-directional updates to and from the MedicAlert EHR, at the patient’s direction. The records can be made accessible from the MedicAlert website, the personal health record carried on a USB key, or by calling a 24-hour Call Center.

Other MedicAlert services include:

  • Document repository services, to include written advance directives and pre-hospital DNR orders. MedicAlert is an officially recognized provider of pre-hospital DNR orders in ten states, which allows a responding paramedic team to call MedicAlert or to accept the DNR from a bracelet or emblem before providing extraordinary care in the field. Upon arrival at a hospital, another document takes over.
  • Professional training and education, mostly to emergency responders.
  • Enhancement of patient education through links to medical information and access to disease-specific websites.
  • Family notification services in the case of emergencies.

MedicAlert’s privacy and confidentiality policies ensure that all medical data is directly provided by members. When they register for MedicAlert, they give prior written approval for all storage and transfers as a standing order for emergency care (MedicAlert’s primary service). The Emergency Response Team handles all incoming requests for patient information with a 10-step authorization protocol. Website and technology strategies are standard: use of member identifying information, number and password, and secure encrypted information transfer.

MedicAlert recognizes the need to be more broadly interoperable such that the EHR can accept and send data to various organizations in order to expand the scope of information sources and receive information more directly from provider EHRs, payers, and labs. The next-generation architecture will include the HL7 Reference Information Model and functional outline. MedicAlert is working with the OMG HL7 services specification to enable interoperability. It also intends to expand the scope of drug utilization reviews. All this requires management of patient-directed authorizations for information release, which is a complex process.

To accomplish its goals, MedicAlert is enhancing its security measures. Multi-tiered data access authorization is being offered, as are internal network domains with firewalls and internal and external intrusion detection. Audit trails are being used (much of which must be done electronically), as are password-guessing algorithms to test the quality of user passwords. A simulated-hacker system ensures that passwords are holding. Externally, all externally accessed hardware is in a DMZ. There is no unencrypted XML data or login across machines. Data source notations have been added to identify who is entering data and its source. Relative to the movement of data, the concerns are a security layer, an identity and access management layer, and management and technology services that are provided with MedicAlert’s business logic (see transcript for more technical detail).
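
The testimony stayed at this general level; purely to illustrate the kinds of controls mentioned (an audit trail entry and a password-quality test), the Python sketch below shows one possible minimal form. All function and field names here are hypothetical assumptions and do not describe MedicAlert’s actual systems.

import hashlib
import json
import re
import time

# Hypothetical illustration only: an audit-trail entry recorded for a data
# access, and a simple password-quality check of the kind a "password
# guesser" might enforce. Names and fields are invented for this sketch.

def log_access(member_id: str, accessor_role: str, data_elements: list) -> dict:
    """Record who touched which data elements, and when (audit trail)."""
    entry = {
        "timestamp": time.time(),
        "member_id_hash": hashlib.sha256(member_id.encode()).hexdigest(),
        "accessor_role": accessor_role,   # e.g., emergency responder, member
        "data_elements": data_elements,   # e.g., ["medications", "allergies"]
    }
    print(json.dumps(entry))              # in practice, written to a secure store
    return entry

def password_is_strong(password: str) -> bool:
    """Minimal quality test: length plus mixed character classes."""
    return (
        len(password) >= 10
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
    )

if __name__ == "__main__":
    log_access("MA-000123", "emergency_responder", ["medications", "allergies"])
    print(password_is_strong("weakpass"))          # False
    print(password_is_strong("Str0ngerPassword"))  # True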

Upcoming issues (see transcript for details)

  1. Authorization management for the release of information, since much of MedicAlert’s potential business falls outside the direct medical care umbrella. Members must be in control of data sharing between MedicAlert and its partners, to include provider EHRs, payers, employers, pharmacy benefits managers and retail pharmacies, and laboratories.
  2. Authorization of emergency responders for release of information.
  3. Member support and education on the implications of withholding or releasing information under different circumstances.

Regarding the technical requirements of interoperability, a big difficulty is that people tend to overlook the functional requirements needed for adequate levels of security, privacy, and access. In asking what the government can do to enable, enhance, foster, and incentivize work to get the job done, Dr. Martino suggested: 1) a standard definition and list of common roles; 2) articulation of common situations and contexts; and 3) guidelines for policies. A panel initiated by the federal government could have these discussions with all stakeholders. The technology can do the work, but what is missing is the content of the decision engines defining who should see what.

Consumer education and counseling was another topic of concern relative to: implications of withholding or releasing information; and the need for healthcare professionals to counsel patients about their particular situations and objectives. Consumers must be educated about what they should do with their information and where it is critical.

CapMed Wendy Angst, M.H.A.

Background CapMed, founded in the early 1990s, was acquired in November 2004 by a publicly traded company. Its focus has been on patient health records since 1996.

CapMed’s consumer-centric personal health record model looks for the best way to engage consumers in doing a better job of partnering with their health care. This untethered PHR model aggregates information from electronic health records that export data. It has the ability to interface with home monitoring devices such as blood pressure monitors, glucose meters, and electronic scales; and to self-enter information through pick lists and wizards. One half-million PHRs (in the form of a CD-ROM or on-line access) will have been distributed by March 2005 with a patent pending for a portable Personal HealthKey. The largest user group falls in the 51-60 age range, followed by people age 61 – 70. These internet-enabled products are stored on the desktop, with the ability to directly link patients to electronic exchange information.

There are four levels of CapMed users.

  1. Those who want complete control and store their information on their desktop.
  2. Those who want their information portable. The individual, as “chauffeur” of the health record, uses the portable HealthKey device.
  3. Those who want to serve as gatekeeper. The individual controls access via the hub and HealthKey.
  4. Those who store health information in a trusted bank. The individual maintains control while the third party manages and communicates the information (MedicAlert).

CapMed is working to enable consumers to deposit information, withdraw, or transfer information into their personal health records. Consumers will be able to aggregate data from different EMR vendors, add information, and export it back to effect information exchange and import directly into a physician’s system. Any report (e.g., family history, graphing lab values, or medical images) within the health record can be transferred.

To date, only 15 percent of providers have EMRs, although most have computers. Therefore, consumers need to make sure that their information gets to providers at the point of care. PHRs afford better care from a more reliable and efficient system. Ms. Angst noted the parallels of privacy concerns between financial and health information.

Research on CapMed’s personal health record users conducted in 2002 by the Smith School of Business found that 95 percent of those surveyed think it valuable to store health information in one place (see transcript for details). Close to 80 percent would rather keep the information on their PC because they distrust the internet. Most of those managing a chronic condition believe this will decrease the chance of errors by 39 percent. Less than four percent think that their physicians want them to use a PHR. Other PHR research (FAACT) indicates that 91 percent of respondents are concerned about the privacy and security of their health information. In a Harris Interactive study, of the 13 percent who kept electronic records, only one in 13 kept them on-line. In this study, employers and pharmaceutical companies were rated as the least trusted sources while physicians and hospitals were rated as the two most trusted. Cost was noted to be the biggest factor in keeping people from using or endorsing a personal health record; the second biggest concern was privacy. The bulk of study participants strongly agreed that privacy and security concerns would prohibit them from storing or transferring their health information over the internet. As for the type of personal record, most chose a hybrid internet/USB product, which combined the capacity to exchange and access information on-line with hands-on control. The full results of this study will be presented at TEPR in May 2005.

Summary The value of PHR is undisputed. Patients who are more involved in their own health care have better outcomes. Privacy must be examined in terms of access, sharing, and ownership.

Discussion

The CapMed model is designed as a “patient-centric” tool that allows patients to have complete use of their record without having to input any information electronically from the physician. Ms. Angst described the interface between what CapMed does electronically with EMRs – NextGen, as an example (see transcript for details). The information from the record is unchangeable and the audit trail shows what patients receive from physicians. Then, a physician’s record can be built into the patient’s own personal record (which is changeable), with sourcing that indicates what came from the patient and what came from the physician. That way, patients can choose to omit certain parts of the personal record when passing on information (see transcript for example). The patient has a choice about what is sent out from the PHR. Therefore, the only unchangeable document is the actual scan of the physician’s record. Dr. Martino understands the pitfalls of information filtered by patients prior to review by other physicians (as happens in our current system), which leads to provider assessments made from incomplete information.
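
Ms. Angst’s description of source-tagged provider data that cannot be altered, alongside editable patient entries that the patient may choose to withhold, can be pictured with a minimal sketch. This is a hypothetical Python illustration of the general idea only; the class and field names are assumptions, not CapMed’s design.

from dataclasses import dataclass, field

# Hypothetical sketch: provider-sourced entries are read-only, patient
# additions are editable, and every entry carries its provenance so a
# recipient can see what came from whom.

@dataclass(frozen=True)              # frozen: provider entries cannot be changed
class ProviderEntry:
    source: str                      # e.g., "exported from a physician's EMR"
    item: str
    value: str

@dataclass
class PatientEntry:
    item: str
    value: str
    source: str = "patient"

@dataclass
class PersonalHealthRecord:
    provider_entries: list = field(default_factory=list)
    patient_entries: list = field(default_factory=list)

    def export(self, omit_items=frozenset()):
        """Patient chooses what to send; provenance travels with each entry."""
        combined = list(self.provider_entries) + list(self.patient_entries)
        return [e for e in combined if e.item not in omit_items]

phr = PersonalHealthRecord()
phr.provider_entries.append(ProviderEntry("physician EMR export", "diagnosis", "Type 1 diabetes"))
phr.patient_entries.append(PatientEntry("note", "prefers morning appointments"))
print(phr.export(omit_items={"note"}))   # the patient chose to withhold the annotation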

Mr. Rothstein described several interest sets: 1) those who want to control access to their personal health record by others such as physicians, employers, and insurers; 2) those who want enough information to do their jobs; and 3) third-party users such as life insurers. The system does not advance the patient’s privacy interest by editing what goes out, because an intact physician record is likely to go directly to another physician. Ms. Angst clarified that CapMed essentially provides a tool to get the patient involved. To get consumer use, consumer concerns must remain at the forefront. To achieve physician adoption, it must be clear that the system does not get them into “trouble.”

How do you make this medical information interoperable? Dr. Fitzmaurice is excited about CapMed’s model because it is beneficial at the patient/family level. CapMed is designed to create a patient-generated CCR so that everything is codified appropriately and exchangeable. They are able to map and populate the patient’s record at the field level by having appropriate codes and terminology built into the product. Everything is mapped in a problem-oriented format with the ability to link patients to education and automatic reminders. Within the application itself, everything is self-contained on the device so a person can view, manage, or update a personal health record anywhere in the world, which also allows for immediate access to emergency data.

The Government Computer-Based Patient Record group’s work was to create a framework for interoperability that would allow cross-mappings to common information and terminology models. Dr. Martino said that no one has done this yet, and it needs to be done. This is the direction MedicAlert is heading: identifying common information models that allow different data structures to be mapped to each other. Although many believe that XML is the answer to interoperability because every field is tagged, tagging may not match from one place to another. A common data/information structure model is needed. The terminology (field content) also needs to be mapped to a common standard (SNOMED is emerging as one of the strongest). Common models such as the CCR or the HL7 CDA help with provider communication and patient safety. Human beings must remain in the loop for judgment calls, allowing, at times, clinical judgment to override lab reports or patient input to override external data.
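
The mapping problem Dr. Martino described can be illustrated with a short sketch: two systems tag the same concept differently in XML, so a common information model plus a terminology map is needed before the data line up. The tag names, model fields, and code below are illustrative assumptions, not an actual GCPR or MedicAlert specification.

    # Illustrative sketch only: mapping differently tagged XML into a common
    # information model with a shared terminology code. All names are hypothetical.
    import xml.etree.ElementTree as ET

    SYSTEM_A = "<record><bp_systolic>140</bp_systolic></record>"
    SYSTEM_B = "<record><systolicBloodPressure>140</systolicBloodPressure></record>"

    # Structural map: local XML tag -> common-model field name
    TAG_MAP = {"bp_systolic": "systolic_bp", "systolicBloodPressure": "systolic_bp"}
    # Terminology map: common-model field -> standard code (example values)
    CODE_MAP = {"systolic_bp": ("SNOMED CT", "271649006")}

    def to_common_model(xml_text: str) -> dict:
        """Map locally tagged XML into the shared information model."""
        result = {}
        for elem in ET.fromstring(xml_text):
            common_field = TAG_MAP.get(elem.tag)
            if common_field:
                result[common_field] = {"value": elem.text, "code": CODE_MAP[common_field]}
        return result

    # Both systems resolve to the same common-model representation.
    assert to_common_model(SYSTEM_A) == to_common_model(SYSTEM_B)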

Ms. Angst clarified that CapMed does not have authentication for the patient to log onto the hub or to access information. Patients must, however, authenticate their identity for information to be sent from the hub to a different provider, using 128-bit encryption to authenticate and communicate that information. This applies to MedicAlert through CapMed’s hub as part of their exchange. Mr. Hungate’s impression is that people often lump privacy in with security, and that privacy assurance relies on being able to say that those concerns are “cold.” He asked about verification of data within the patient/physician relationship. Dr. Martino emphasized the patient/provider relationship as critical to the gathering of reliable data. Further, consumers are being given the power to withhold information but not the means to wield that power responsibly. As a consumer, Mr. Hungate said he would ideally want to be able to control what is in his personal health record, but he would also want his primary care physician to review the content to ensure accurate representation (see transcript for example of consumer input from a previous Subcommittee hearing).

Regarding patient authorizations and consent, Dr. Deering thinks the following are needed: 1) an equally robust and comprehensive delineation of the entire menu of potential patient/consumer authorizations and consents; and 2) plain-language communication of those consents and their implications, such that first-level permissions appear in a drop-down box that is hot-linked to further information as needed. This would be a great service, because explanation and understanding bring more consents and authorizations. Dr. Deering wondered how this could be built into the system. Dr. Martino said that the more control you give to people, the more their anxiety becomes specific and targeted rather than free-floating. Various forms and pick lists are helpful in this regard. Legal forms, especially, should come up automatically with large print and common language. Technologists need to think about how to implement such forms electronically as a service so that a RHIO, a physician’s office, a PBM, or a lab has access. High-volume data exchange requires at least a semi-automated way of obtaining standing authorization for certain types of data sharing; this would allow patients to give permission once rather than for every new exchange.
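
The standing-authorization idea discussed above might work roughly as follows. This is a sketch only; the names, data categories, recipients, and dates are hypothetical.

    # Illustrative sketch only: a standing authorization recorded once and then
    # checked automatically before each exchange of a given data category.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class StandingAuthorization:
        patient_id: str
        recipient: str          # e.g., a RHIO, physician office, PBM, or lab
        data_category: str      # e.g., "medication_history"
        expires: date
        plain_language: str     # consent text shown to the patient, linked to detail

    def may_release(auths, patient_id, recipient, data_category, today):
        """Allow release only if a matching, unexpired standing authorization exists."""
        return any(a.patient_id == patient_id and a.recipient == recipient
                   and a.data_category == data_category and a.expires >= today
                   for a in auths)

    auths = [StandingAuthorization("p1", "Example PBM", "medication_history",
                                   date(2006, 1, 1),
                                   "You allow your pharmacy benefit manager to see your medication list.")]
    print(may_release(auths, "p1", "Example PBM", "medication_history", date(2005, 2, 24)))  # True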

Mr. Rothstein adjourned the meeting at 3:18 p.m.


To the best of my knowledge, the foregoing summary of minutes is accurate and complete.

/s/ 5/16/2005

_______________________________________

Subcommittee Chairman Date