Note: The transcript and speakers’ slides for this meeting are posted on the NCVHS Web site, http://www.ncvhs.hhs.gov. Use the meeting date to locate them.

Hearing Summary: Use of Administrative and Clinical Electronic Data for Quality Assessment

June 19, 2007

Hubert H. Humphrey Building, Washington, D.C.

EXECUTIVE SUMMARY

The NCVHS Quality Workgroup organized this hearing to learn about the current state of hospital performance measurement as well as state-of-the-art ways of collecting and reporting hospital performance. There is now strong momentum in many sectors to measure and improve health care quality. Public reporting initiatives, some voluntary and some mandatory, are a significant source of the momentum, along with payer incentives. While electronic health records hold promise as a powerful tool and data source for quality assessment and improvement, the transition to electronic systems is slow and the quality-related uses of EHRs are not yet well developed. Thus, administrative data remain the primary source for quality measures, combined with chart review to meet the requirements of accreditation organizations and others. Administrative data have been limited by the inability to classify patients on the basis of severity and risk. In recent years, however, there has been significant progress toward identifying key clinical information that can enhance administrative data for this purpose, and hybrid models drawing on multiple data sources are beginning to appear. Among other things, these models provide guidance for the development of EHRs for quality uses.

The hearing presentations and discussions showed the utility and potential of hybrid data sources for quality assessment and improvement. The Workgroup also learned about programs to collect nursing-sensitive measures to enrich the quality picture with information on the system of care. It was clear that measuring and reporting have immediate benefit within organizations for looking at performance and trending over time and that public reporting, when done collaboratively using accurate data, can be a significant force for change. The Workgroup talked with the presenters about the potential and challenges in each of these areas. The hearing also provided an opportunity to revisit themes and candidate recommendations in a 2004 NCVHS report on quality measurement.[1]

FRAMING THE TESTIMONY

Crystal Kallem, RHIT, and Allison Viola, MBA, American Health Information Management Association (AHIMA); Dave Gans, FACMPE, Medical Group Management Association (MGMA)

The first two presentations framed the day’s discussions by describing the challenges health care organizations and providers face in meeting quality reporting requirements. Ms. Viola introduced Ms. Kallem, who described the mounting and disparate requests for quality measures coming from multiple public and private organizations. The obstacles include a lack of uniform specifications, a shortage of qualified staff, and many technical, organizational, and economic challenges, all heightened by shrinking reimbursements.

Both manual data retrieval in a paper environment and electronic data extraction from EHR systems present challenges because of the variations in performance measurement system designs, definitions, and reporting requirements. The net result is that those responsible for quality reporting must extract different data, using different definitions, and report them in different forms for different payers and others. Expert data abstractors have become a necessary part of the workforce, and they are in short supply.

Mr. Gans commented on the perspective of physicians in the small (one- or two-physician) practices that predominate in the U.S. For them, economic pressures and the absence of specialized staff intensify the challenges of meeting disparate data requirements. Only a few small practices have electronic record systems.

To consider these issues, AHIMA and MGMA convened more than 50 invited experts in a November 2006 conference, with AHRQ support. Based on the participants’ input, AHIMA and MGMA then put forward three recommendations to HHS:

  1. Form a public-private entity to oversee and evaluate policies and procedures for health care performance measurement, to produce a unified and coordinated approach;
  2. Fund research on the quality of data reported for performance measurement; and
  3. Fund research on the costs associated with performance-measurement data collection and reporting.

PERFORMANCE MEASUREMENT AND QUALITY IMPROVEMENT

Denise Remus, PhD, RN, Bay Care Health System

Bay Care Health System in Tampa/St. Petersburg, FL, manages a robust health statistics enterprise that uses both clinical and administrative data. Dr. Remus described the development of the organization’s quality measurement and improvement initiative. She acknowledged data-gathering challenges much like those outlined by the previous presenters. Noting the richness of administrative data in a hospital setting, she said Bay Care expects to continue to use administrative data for quality measurement for a long time to come. She described Bay Care’s extensive educational programs to bring about needed changes in areas flagged by its performance indicators. In addition, Bay Care plans to publicly report its quality measures.

Bay Care is implementing an EHR system for its nine hospitals, at a total cost of $200 million. Dr. Remus observed that an EHR is “only as good as the design”; currently, EHR systems are vendor-driven, inconsistent, and lack guidelines for the multiple uses to which they will be put. Nevertheless, she predicted that electronic systems and tools such as alerts will ultimately improve care delivery.

She offered these suggestions to the Committee: keep enhancing administrative data, which are not going to go away; standardize definitions and expand their detail and reliability; focus on standards for HIT; and expand measures in other clinical areas (e.g., oncology and behavioral health).

Mark Wynn, PhD, CMS: Incentives for Hospital Quality Medicare Demonstrations

Dr. Wynn introduced the theme of incentives into the discussion. He described a CMS pay for performance (P4P) demonstration project, Premier Hospital Quality Incentive Demonstration (HQID), which is testing the impact of quality incentives. Phase one was completed in September 2006, and the second phase will end in 2009. HQID pays bonuses to the top group of hospitals for each condition, and penalizes hospitals performing below a certain level. The project has shown continued improvement: in every quarter of the first year, all hospital categories improved, not just those being rewarded, and improvement continued in the second and third years. Research continues on what incentives are most effective, for example in motivating mediocre performers.

CMS will send a report to Congress on this project around August 2007. The project showed that P4P (which, Dr. Wynn asserted, is inevitable) can work, and that modest dollars can have big impacts. He said the choice of measures and the perception of fairness are especially important. The challenges to be addressed relate to operational, financial, scoring, and measure-selection issues, among others. In addition, there is a need for measures in additional clinical areas, for risk and severity adjustment, for crosscutting measures, for efficiency and patient-reported measures, and for electronic records.

Asked how NCVHS could help, Dr. Wynn echoed other speakers in stressing that government has a special role to play in standardizing measures and reporting systems.

Richard Johannes, MD, MS, Cardinal Health: A Hybrid System That Supports Public Reporting in Pennsylvania

Dr. Johannes discussed research showing the effects of adding selected clinical data to administrative data in order to stratify patients in terms of risk upon hospital admission. This stratification makes it possible to meaningfully compare the performance of different hospitals. Cardinal Health provides the data collection and risk adjustment methodologies for longstanding work of this kind by the State of Pennsylvania. He noted that laboratory data and, to a lesser extent, vital signs are highly objective and powerful predictors. The timing of clinical data is important to admission-based severity stratification.

Cardinal Health recently completed a study testing the relative merits of different types of data used for risk adjustment; the study has been accepted for publication in Medical Care. Dr. Johannes also commented on a recent study, funded by AHRQ and conducted by Abt Associates, that quantifies the benefits and practicality of different types of clinical data for this purpose. (Dr. Elixhauser later discussed the research in greater detail; see below.)

He stressed that clinical data have the advantage of being objective, precise, time-stamped, not easily gamed, and easily verified. Thus, they are credible to clinicians who otherwise might be skeptical about unfavorable quality-measurement results. He noted the particular value of laboratory data, which are becoming widely available in electronic format. In sum, he said, studies such as those described move the discussion from “whether” clinical data will be added to administrative data to “when and how” this will happen.

Anne Elixhauser, PhD, AHRQ: Enhancing Claims Data to Improve Risk Adjustment of Mortality and Patient Safety Indicators

Dr. Elixhauser serves as lead AHRQ staff for an initiative to improve the value of administrative data for reporting quality of care in hospitals. Because of concerns about penalizing providers caring for the sickest patients, the project seeks ways to add clinical detail to administrative data. She noted that CMS is mandating the collection of present-on-admission (POA) indicators for all Medicare patients starting in 2008, and that lab data are available electronically for 80 percent of hospitals.

In an AHRQ-supported study, Dr. Michael Pine and colleagues incrementally added POA information, lab values, more diagnosis fields, improved documentation of diagnostic information, information on vital signs, and finally more clinical information requiring medical record abstraction. The study compared the discriminative ability and cost-effectiveness of different models and found that administrative data can be improved at relatively low cost by adding POA modifiers and numerical lab data on admission, and by changing coding conventions and rules. For example, such changes would allow signs and symptoms such as coma to be coded even when a definitive diagnosis such as stroke has been established.

Based on those findings, AHRQ has released two RFPs aimed at expanding the data capacities of statewide organizations participating in the Healthcare Cost and Utilization Project (HCUP). One RFP is for pilots in two or more states to add clinical information to their administrative data; the other is to support planning efforts in a limited number of states. Participants are required to find ways to electronically transfer the data. The aim is to jumpstart the addition of clinical data elements to statewide hospital data. Dr. Elixhauser noted that these relatively modest goals reflect the wide variations in states’ current data capacities.

Bruce Boissonnault, Niagara Health Quality Coalition: Moving from Administrative Data to EHR: Data as a By-Product of Care

Niagara Health Quality Coalition (NHQC), a beta test site for AHRQ quality indicators, is New York State’s 10-year-old multi-stakeholder collaboration. Membership includes employers, hospitals, physicians, health plans, consumers, government, and others. Mr. Boissonnault noted the importance of moving toward the notion of a national “data highway” that provides not only quality data but also population screening. He stressed that today, health care is one of the few industries that does not provide effective data on whether it is doing a good job. However, we are in a new era in which people (including ordinary citizens) are looking at data more easily, largely because of the Internet. The NHQC Website myHealthFinder.com gets up to three million hits a day, and the information reported there receives prominent media attention because of its credibility. NHQC is recognized for the quality of its research and the independence of its public reports. Moreover, NHQC’s public reporting has been accepted by providers.

Public performance measurement systems, he said, must be judged by the degree to which they positively affect the status quo. He described the New York State Hospital Report Card, in which 100 percent of hospitals participate. The reporting is based on administrative “billing and discharge data.” He noted that administrative data are transparent, difficult to game, sustainable and scalable, among other merits. He cited some of the changes that were made as a result of quality measures, and said a drop in overall statewide mortality is plausibly associated with improved clinical care. There also have been cost savings.

Mr. Boissonnault offered several policy suggestions, asserting that today, US health care policy too often rewards effort when it should reward results. He noted the dearth of publicly accessible billing and discharge databases needed to improve care and urged policy leaders to define a U.S. data highway for health care. He recommended that HHS convene a committee to define broadly what health care databases are needed, and then work backwards to design systems that require providers to populate them. Until EHRs are sufficiently robust to enable performance and population monitoring as a byproduct of care, he recommended moving toward a hybrid data system, which could begin operating almost immediately. However, he said, progress toward the needed data highway will not happen without HHS leadership.

Seth Eisen, MD, Veterans Health Administration: VA EMR: VistA

Dr. Eisen was the first presenter to describe a fully electronic system in operation for clinical care and quality assessment. Having worked with both paper and electronic systems, he said “the difference is really quite remarkable.” Researchers use the VA databases extensively for a wide variety of projects. The EMR is called VistA (Veterans Health Information Systems and Technology Architecture). Dr. Eisen showed screenshots of VistA and described clinical and research uses of the database and the resulting improvements in care (e.g., increased appropriate follow up for colon cancer screening as a result of automated reminders).

While the data systems facilitate medical care, evaluation, and efficiency, Dr. Eisen said a weakness is that the large size of the VA organization inhibits innovation. Other concerns include data security issues and vulnerability to data loss. In addition, he noted the limited ability to search the data because much of the content of the EMR is free text. The VA is developing a research program that will support methodological research to permit sophisticated text searches. For the future, he recommends that EMRs with free-text components include a text-search methodology as an integral part of the record to complement searches of structured fields.

PERFORMANCE MEASUREMENT AND PUBLIC REPORTING

Seth Eisen, MD, VA: National Surgical Quality Improvement Program (NSQIP)

NSQIP is a measurement and reporting initiative developed by surgeons over the last two decades. It uses a standardized methodology, with data collected independently, to measure critical outcomes including surgical morbidity, mortality, length of stay, and complication rates. It started in the VA in response to a Congressional mandate and is now in wide use, with strong support and participation from surgical staffs and facilities. The American College of Surgeons offered it to surgical services nationwide in 2004.

NSQIP is primarily intended for programmatic uses, not for feedback to individual surgeons. The clinical feedback mechanisms include quarterly reports, summary annual reports, and chart audits of low-risk patients with adverse events. NSQIP also provides a data registry available to researchers, which has led to more than 100 scientific publications. Dr. Eisen highlighted some of the research findings. Use of the data has produced drops in both mortality and morbidity rates, especially the former. A recent article in JAMA by Hank Wu and others illustrates the use of NSQIP’s longitudinal data. Regarding cost, he said that while NSQIP costs $38 per major surgical case done in the VA, this is only equal to the cost of one Prolene suture, and it may be more than offset by savings from reduced complication rates.

In conclusion, Dr. Eisen characterized NSQIP as a well-established, well-oiled machine that has been effective in helping surgical programs evaluate their care and, if it proves to be substandard, figure out what the problem is. He noted its widening use nationwide.

Michael Lundberg, Virginia Health Information (VHI): State Reporting for Consumer Health Transparency

Mr. Lundberg observed that consumer health transparency requires a combination of politics, science and public reporting. The mission of VHI, formed in 1993, is to help Virginians make more informed health care purchasing decisions and to enhance the quality of health care delivery. It is a multi-stakeholder organization; the non-consumer members of its board of directors are appointed by their trade associations. He stressed the importance of collaboration as well as good science, keeping people involved, and ongoing communication.

VHI is funded by general state appropriations (12%), provider fees (22%), and product/special contract revenues (66%). It publishes data on procedures or conditions that affect a significant portion of the population, are high in cost, and/or are amenable to risk adjustment. VHI produces a series of consumer publications, developed based on input from consumers and consumer focus groups. VHI publications include information on health maintenance organizations, hospitals, long term care (adult day care, assisted living, home care, continuing care retirement communities, and nursing facilities), and physicians. VHI is revising its Website to serve primarily as a consumer health portal. Currently www.vhi.org also serves the needs of business, researchers, and health care providers.

Mr. Lundberg said VHI is excited about the potential to use POA indicators and lab values to enhance its data. He added that he would like to reduce the lag time from discharge to data release, currently six months. He pointed out that national standards are important, but they will not eliminate the need for local innovation.

Betsy Clough, MPH, Wisconsin Coalition for Healthcare Quality (WCHQ)

Ms. Clough’s presentation highlighted the importance of physician leadership and participation. WCHQ was founded in 2002 by nine health systems, each of which brought a business partner to the table, with the goal of improving the quality and cost-effectiveness of health care for Wisconsin by developing and publicly reporting quality measures. Forty percent of all state physicians and 21 hospitals now participate. The catalysts that spurred the development of WCHQ included internal and market pressures to improve quality and to be transparent. Physician leadership and vision have been key from the outset.

The Coalition developed criteria for measures. Its first report, developed in seven months, met with push-back from physicians who questioned the data, so the Coalition set out to develop a methodology that would represent all physicians and payers. This required “engaging the volunteer army” and established the importance of involving physicians. WCHQ’s public reporting occurs at the physician-group or hospital level.

Ms. Clough reviewed some of the dramatic improvements that have resulted from the initiative, such as in controlling the hemoglobin A1c of diabetic patients. She described the way one health system used WCHQ measures as a framework for dramatically improving its care in several areas. The Coalition also emphasizes having an impact on the entire population of Wisconsin, and it reports population data for several conditions. It also has been working on developing efficiency measures, at the request of provider members. In conclusion, Ms. Clough stressed the importance of having credible, reliable data in order to retain the involvement of all stakeholders.

Ben Yandell, PhD, CQE, Norton Healthcare (Louisville, KY): Our Public Quality Report

Dr. Yandell highlighted the characteristics and benefits of transparency, as illustrated by Norton Healthcare’s activities. He noted that until recently, it was standard nationwide practice to closely guard the secrecy of hospital quality information. In late 2004, Norton voluntarily decided to go public with all the quality indicators it collects—200 initially and 400 today. It released its first public report in 2005 and is committed to publishing an objective and complete evaluation of its hospitals’ performance, good or bad. Dr. Yandell said “anything else is advertising, not transparency.” He added that Norton is an example for larger systems of what can be done.

He described the internal changes that have come about as a result of public reporting, including acceptance of and interest in the data among the hospitals’ physicians. Performance is improving; he showed data from 2003 and 2005 comparing Norton’s performance to national medians from Hospital Compare, with noticeable improvements in the later year for nearly every indicator. (He noted, however, that performance leveled off after that.) He added that one of the challenges of being in the forefront of reporting is “finding something to compare yourself to”; another is “being allowed to tell anybody else that you have something to compare yourself to,” because many databases do not allow publication of their contents.

Norton’s Website combines technical and publicly oriented content. Dr. Yandell stressed that “public reporting is not just for the public”; indeed, he regards Norton’s staff as the primary audience, its medical staff as the secondary audience, and the public as the third audience. Attention and transparency create a sense of urgency that moves the agenda along and brings about improvements. While the field is not ready to build “a comparative shopping guide for consumers,” the public already benefits from the heightened attention to quality.

Finally, he called attention to limitations of Norton’s reporting. It is self-reported, whereas national standardization of public reporting is what is needed. Most risk-adjustment methods adjust each hospital’s data to its individual patient mix and do not allow direct hospital-to-hospital comparisons. In addition, Dr. Yandell proposed that most national indicators are the wrong ones because “they are either zero or 100 percent” instead of allowing finer distinctions. He also echoed other presenters in noting the problems with definitions. In closing, he expressed hope that concerns about administrative burden, cost, and other challenges would not derail quality measurement and reporting, which he asserted have great promise.

Isis Montalvo, RN, MS, MBA, American Nurses Association/National Database on Nursing Quality Indicators (NDNQI): Nursing-Sensitive Measures

NDNQI is a data system that expands quality assessment to include facets of the system of care related to nursing contributions. ANA launched its Patient Safety and Quality Initiative in 1994 and established NDNQI in 1998. At present, more than 1,100 hospitals voluntarily participate, in all 50 states and the District of Columbia. Besides database participation, the NDNQI program includes indicator development, Web-based data submission, pilot testing, education and research, and a variety of reports. There is also an optional satisfaction survey for all RNs. Indicator development and implementation are ongoing. The NDNQI measures—13 at present, with four more planned for 2007—are drawn from multiple sources reflecting the contribution of nurses to patient safety and quality of care at the unit level, where care occurs. Hospitals submit the data electronically, collected from administrative record systems, special studies, chart review, and other sources. NDNQI uses standardized definitions and data-collection guidelines. Many of its indicators were developed through the National Quality Forum (NQF) consensus process.

The NDNQI data are used primarily for internal quality improvement. The other project goal is to use the data to analyze the relationship between aspects of the nursing workforce and nursing-sensitive patient outcomes. Hospitals receive quarterly reports on the indicators, with trend data. Research using NDNQI data has demonstrated significant relationships at the unit level, such as between staffing levels and skill mix and rates of falls and pressure ulcers. In January 2007, NDNQI hosted its first national conference, which provided nurses with practical tools for improving patient care and outcomes. It also published a best-practice exemplar.

Regarding lessons learned, Ms. Montalvo stressed the expertise and staffing levels needed to operate a national database with accurate data. Also, the development and implementation of valid and reliable indicators take a good deal of time and resources.

Sharon Sprenger, RHIA, CPHQ, MPA, The Joint Commission

Ms. Sprenger began by enumerating the many challenges facing hospitals and the Joint Commission in the evolution to electronic data. The challenges include fragmented health information exchanges, privacy issues, data quality issues, the need for national measurement priorities, and the fact that current measure specifications are not designed for the electronic record. Reiterating that the pace of change to electronic data is slow, she cautioned that during the transition, it will be important to be “very patient-centered” and to stay focused on measures that will help improve care.

To illustrate the earlier point that hospitals are confronted with myriad demands for data from multiple organizations, she noted that CMS began with 10 measures following the Medicare Modernization Act, and potentially there will be 32 by 2009. She then described the Joint Commission’s expanding work on data quality, which will be an important issue with the electronic record as well. The Joint Commission uses data in its accreditation process, and it also produces the ORYX performance-measure report, Quality Checks, and an annual report, all of which show how the nation’s hospitals are doing.

Ms. Sprenger then discussed the Joint Commission’s project on nursing-sensitive data. It has a Robert Wood Johnson Foundation grant to test the NQF consensus standards for nursing-sensitive care (15 in all) as an integrated set. Collecting the data is challenging, especially for organizations without electronic systems, because multiple sources are involved (e.g., medical record, human resource/payroll, and survey data). In contrast, she said, a standardized electronic data system would make it possible to move to systems thinking and to understand the relationships among a system’s parts. In conclusion, she observed that it will be important to have efficient systems so that hospitals have time to use their performance data to improve patient care.

Attendance

Workgroup members

  • Justine M. Carr, M.D., Workgroup Chair
  • Simon P. Cohn, M.D., M.P.H., NCVHS Chair
  • Larry Green, M.D.
  • Carol J. McCall, F.S.A., M.A.A.A.
  • William J. Scanlon, Ph.D.
  • Donald M. Steinwachs, Ph.D.
  • Paul Tang, M.D.

Staff

  • Marjorie Greenberg, NCHS/CDC, Executive Secretary
  • MaryBeth Farquhar, lead staff
  • Debbie Jackson, NCHS/CDC

Presenters

  • Allison Viola, MBA, American Health Information Management Association
  • Crystal Kallem, RHIT, American Health Information Management Association
  • Dave Gans, FACMPE, Medical Group Management Association
  • Denise Remus, PhD, RN, Bay Care Health System
  • Mark Wynn, PhD, CMS
  • Richard Johannes, MD, MS, Cardinal Health
  • Anne Elixhauser, PhD, AHRQ
  • Bruce Boissonnault, Niagara Health Quality Coalition
  • Seth Eisen, MD, Veterans Health Administration
  • Michael Lundberg, Virginia Health Information
  • Betsy Clough, MPH, Wisconsin Coalition for Healthcare Quality
  • Ben Yandell, PhD, CQE, Norton Healthcare, Louisville, KY
  • Isis Montalvo, RN, MS, MBA, American Nurses Association/National Database on Nursing Quality Indicators
  • Sharon Sprenger, RHIA, CPHQ, MPA, The Joint Commission

Others

  • Susan Kanaan, consultant

[1] Measuring Health Care Quality: Obstacles and Opportunities. NCVHS, May 2004. http://www.ncvhs.hhs.gov/040531rp.pdf