[This Transcript is Unedited]


National Committee on Vital and Health Statistics

Hearing on the Functional Requirements

for the

Nationwide Health Information Network

July 27, 2006

Hamilton Crowne Plaza Hotel
1001 14th Street, NW
Washington, DC 20005

Proceedings by:
CASET Associates, Ltd.
10201 Lee Highway, suite 180
Fairfax, Virginia 22030
(703) 352-0091


  • Call to Order, Welcome and Introductions – Simon Cohn, M.D., M.P.H.
  • Panel V: Electronic Health Records
    • American Clinical Laboratory Association: Jason DuBois
    • Electronic Health Record Vendor Association: Charlene Underwood
    • Certification Commission for Healthcare Information Technology: Don Mon, Ph.D.
    • HITSP Technical Committee Use Case Co-Chair: Jamie Ferguson
  • Panel VI: Biosurveillance
    • Association of State and Territorial Public Health Organizations: Lawrence Hanrahan, Ph.D., M.S.
    • CDC – Public Health Information Network: Tim Morris
    • HITSP Technical Committee Use Case Co-Chair: Floyd Eisenberg, M.D., M.P.H.
  • Panel VII: Other Health Information Stakeholders
    • Integrating the Healthcare Enterprise: Didi Davis
    • Clinical Research Community: John Apathy
    • Federal Health Architecture: Vish Sankaran/Laura Tillery
    • SureScripts: Kevin Hutchinson
  • Public Testimony
    • Vik Kheterpal, M.D.
    • Gary Dickinson
    • Russell Williams
    • Charlene Underwood
    • Mark Adams, Ph.D.
  • Workgroup Discussion: Simon Cohn, M.D., M.P.H.

P R O C E E D I N G S [9:10 a.m.]

Agenda Item: Call to Order, Welcome, Introductions, and
Recap of Day One Testimony – Dr. Cohn

DR. COHN: Well, good morning, everybody, please be seated. We’re going
to get started here in a minute; have everybody take a breath in our new
expanded room and space here, the air conditioning is working.

Good morning, I want to call this meeting to order, this is the second day
of meetings of the Workgroup on the Nationwide Health Information Network of
the National Committee on Vital and Health Statistics. The national committee
is the public advisory committee to the U.S. Department of Health and Human
Services on national health information policy. I am Simon Cohn, I’m the
associate executive director for health information policy for Kaiser
Permanente and chair of the committee and this workgroup. I want to welcome
committee members, HHS staff, and others here in person, and I want to welcome
those listening in on the internet. I also of course want to remind everyone to
speak clearly and into the microphone so those on the internet are able to hear.

With that let’s have introductions around the table and around the
room. For those on the national committee, and not staff or presenters, I would
ask if you have any conflicts of interest on any of the issues coming before us
today would you so publicly note in your introduction. I want to begin by
observing that I have no conflicts of interest. Mary Jo.

DR. DEERING: Mary Jo Deering from the National Cancer Institute, lead staff
to the ad hoc workgroup on the NHIN and lead staff to the permanent Workgroup
on the NHII.

MR. HOUSTON: I’m John Houston with University of Pittsburgh Medical
Center, I am a member of the committee as well as I guess this workgroup, I
have no conflicts.

DR. WARREN: I’m Judy Warren, University of Kansas School of Nursing,
member of the committee and member of this workgroup, and have no conflicts.

DR. HUFF: Stan Huff, member of the committee, from Intermountain Health Care
and the University of Utah in Salt Lake City. I am a co-chair of HL7 and
sometimes that comes up, if there’s discussion of HL7 I will recuse myself
from the discussion.

DR. STEINDEL: Steve Steindel, Centers for Disease Control and Prevention,
staff to the ad hoc workgroup and liaison to the full committee.

MR. BLAIR: Jeff Blair, member of the committee, director of health
informatics from Lovelace Clinic Foundation, I’m not aware of any conflicts
of interest.

MS. FISCHETTI: Linda Fischetti, Veterans Health Administration, staff to
this workgroup as well as staff to NCVHS NHII and I will follow Dr. Huff’s
lead and disclose that I have two elected positions within HL7 so that if HL7
issues come up I will recuse myself.

DR. MON: Don Mon, vice president for practice leadership at the American
Health Information Management Association.

MS. UNDERWOOD: Charlene Underwood, director of government and industry
relations for Siemens, this morning I’ll be testifying in my role as chair
of HIMSS Electronic Health Record Vendor Association, I have no conflicts of
interest.

MR. DUBOIS: Jason DuBois, I’m the vice president of government
relations for the American Clinical Laboratory Association, and I also have no
conflicts of interest.

MR. FERGUSON: Jamie Ferguson, director of health information technology
strategy at Kaiser Permanente, I’ll be testifying on behalf of the
Electronic Health Records Technical Committee of the Health IT Standards Panel.

MS. AMADIAKAN(?): Margaret Amadiakan, NCVHS contractor.

MS. GREENBERG: Marjorie Greenberg, National Center for Health Statistics,
CDC, and executive secretary to the committee.

MR. LOONSK: John Loonsk, Office of the National Coordinator for Health
Information Technology.

MR. REYNOLDS: Harry Reynolds, Blue Cross and Blue Shield of North Carolina,
member of the full committee and this subcommittee, and no conflicts.

MS. SQUIRE: Marietta Squire, CDC, NCHS, and staff to this workgroup.

MR. BARKER: Robert Barker, interoperability analyst with Next Gen Health
Care Information Systems.

MS. DAVIS: Didi Davis from HIMSS, director of IHE.

MS. JACKSON: Debbie Jackson, National Center for Health Statistics,
committee staff.

MS. VIOLA: Allison Viola, AHIMA.

MR. HOLIDAY: Sam Holiday with Accenture(?).

MS. LESH: Kathy Lesh with the Mitre Corporation.

DR. ADAMS: Mark Adams with Booz Allen Hamilton and I’ll be testifying
on behalf of the caBIG Project.

MS. RUBELL(?): Jordana(?) Rubell from Hogan and Hartson.

DR. EISENBERG: Floyd Eisenberg with Siemens Medical Solutions Health
Services and I will be testifying on behalf of the HITSP Biosurveillance
Technical Committee.

MR. LEO(?): Regon(?) Leo, senior manager, SureScripts.

MR. JARVIS(?): Charles Jarvis, assistant vice president health care industry
and legislative affairs for Next Gen Health Care Information Systems and vice
chair of the EHR VA Communications Committee.

MR. DECARLO: Michael Decarlo with the Blue Cross and Blue Shield
Association.

DR. COHN: Well thank you all for joining us. I guess even though I’m
not going to recuse myself in the spirit of openness I do obviously want to
emphasize that I work for Kaiser Permanente as does Jamie Ferguson, one of our
presenters later on. Jamie, as I understand you’re representing the HITSP
panel today as opposed to Kaiser Permanente —

MR. FERGUSON: That’s correct.

DR. COHN: I also do want to publicly note, as I did yesterday, that I am an
advisory member of the HIMSS board and the EHR Vendor Association is actually
part of the HIMSS organization. This is a non-voting membership activity,
I’m an advisory member, but I did want to again publicly disclose that.

Now as we discussed yesterday the activities of this workgroup began at the
culmination of the very successful first Nationwide Health Information Network
forum sponsored by the Office of the National Coordinator and supported by
NCVHS in late June and obviously continue with the hearings both yesterday and
today. The NCVHS was asked by the Office of the National Coordinator to review
and synthesize the results of the June meeting and by the end of September
define a minimal but inclusive set of functional requirements needed for the
initial definition of an NHIN and its possible architectural forms.

Obviously this work builds on the contributions of key information
stakeholder groups including the Architecture Prototype Consortia, which we
heard from yesterday, HITSP which we began to hear from yesterday and
we’ll hear more from today, CCHIT, and of course the input of the many
stakeholders at the June forum. I again as I said yesterday want to emphasize
that our role in all of this is really not to make architectural or policy
decisions but to clearly designate common requirements as well as requirements
that are specific to particular architectures that we feel are essential.

Now having said that of course we will come across policy issues I’m
sure as we began to yesterday, and I think it’s our role rather than to
decide them but more to note them and advise the Office of the National Coordinator
of policy issues that we’ve observed in the process but once again without
trying to determine the answer to them. Similarly we will obviously also
observe multiple different architectures and indeed some of the functional
requirements may relate to that but once again our role is not to make
decisions on the various architectures.

So let’s turn to the agenda. Obviously this morning we continue with
testimony on the functional requirements. We are asking each presenter to limit
their testimony to 15 minutes, and I’ll probably remind everyone from
panel to panel on that. This will be strictly enforced and really the main
reason for that is so we have time for questions and discussion, which of
course is really the valuable activity of a hearing such as this.

Our first panel focuses on electronic health records and functional
requirements, after the morning break we move to biosurveillance. After the
lunch break we will hear from other health information stakeholders and then
offer an opportunity for public testimony. Then at the end of today’s
session we will spend some time in workgroup discussions discussing next steps
and sort of reviewing I think what we heard over the last day and a half and
reflecting on that.

Now let me also talk a little bit about steps from today and we’ll talk
more about that this afternoon as we begin to tighten this up. I do want to
remind everyone both in the room as well as listening in on the internet that
we are open for written testimony or comment from anyone, both from our
presenters in terms of further written testimony, from anyone in the audience
who wants to provide us written testimony, or anyone listening in. We do
however ask that these additional written comments come to us by July
31st close of business so we’ll have a chance to incorporate
them in our work going forward.

Now our hope is and I think we’ll be deciding this later on today is
that we may very well have a draft spreadsheet that we can begin to distribute
to both presenters and other interested parties by early to mid August to begin
to get early input. I think our view is to have this be a very open process and we
want to elicit additional input before we begin to finalize anything. So
certainly we know who is presenting, we’ll be distributing that to them,
if there are others listening in on the internet or here in person we’d
obviously want to get your email so we can distribute that when it’s
available.

Now we’ll obviously take some comments in the middle of August,
we’re also planning on having an open web call on August 31st
from 11:00 a.m. to 3:00 p.m. eastern time to discuss what we hope will at that
point be a final draft. From there we move to the full NCVHS meeting, where
what will hopefully have moved from a final draft to a final document will be
presented on September 13th and 14th for consideration.

Panel V: Electronic Health Records

With that let’s move into our first panel which once again is on
electronic health records. I believe our first presenter is Jason DuBois from
the American Clinical Laboratory Association. Jason, thank you.

MR. DUBOIS: Good morning. Dr. Cohn, committee members, thank you for the
opportunity to testify today on behalf of the American Clinical Laboratory
Association, which represents national, regional and local laboratories. My
name is Jason DuBois, vice president of government relations for ACLA, and I
appreciate your interest in the NHIN’s use case of lab results and broader
electronic health record needs. ACLA members have an extensive history of
providing the nation’s hospitals and physicians with leading edge health
information technology to streamline the laboratory test requisition process
and speed the delivery of test results.

Today I would like to focus on four specific issues that are pertinent both
to the NHIN’s use case of lab results as well as to broader electronic
health record needs. These are one, clinical data standards for laboratory
results reporting and test requisition, two, identity/information correlation
or a unique patient identifier, three, the cost to large data providers
of servicing a national population, and four, authorization as it relates to
the Clinical Laboratory Improvement Amendments of 1988, or CLIA.

However before delving into these issues I must note that as we discuss the
functional requirements that will frame the development of the NHIN part of the
difficulty in making recommendations is the relative uncertainty of the
architecture for such a system, meaning, are data going to be pushed or pulled,
and will information be centralized or decentralized? The answers to these
questions will ultimately determine the framework and in turn determine what
specific issues laboratories will encounter in sending and receiving
information through the NHIN.

For example, authentication of the requester of data becomes a problem for
laboratories if the architecture of the NHIN is such that the information
resides with the originator of the data, for example a laboratory, as is the
case with the Utah Health Information Network, and if I’m not mistaken I
think you heard testimony yesterday from UHIN. In this scenario providers
request UHIN to query all providers within the network for any information
regarding a patient. Regardless of the architecture that is ultimately
established laboratories have several issues with respect to the proposed NHIN
functional categories that must be addressed for these efforts to be
successful.

First I’d like to address the need for uniform clinical data standards
for laboratory test requisition and result reporting and describe an effort
currently underway within the laboratory community. The creation and adoption
of a national standard for test requisition and results reporting will simplify
the use of electronic health records systems. Such a standard will help
physician practices reap tangible benefits early in EHR implementation and
reduce costs associated with EHR system installation and configuration. In
addition a national standard will reduce the cost and effort currently required
to implement custom interfaces between independent laboratories, really all
laboratories, and ambulatory EHR systems.

Likewise a national standard will also significantly expand the set of
ambulatory EHR implementations that include reporting of electronic laboratory
results and test requisitions. Finally a national standard will also enhance
the utility of electronic laboratory result data by specifying structural and
coding standards that enable greater automated processing including decision
support, quality measurement, and regional sharing. Ultimately these outcomes
will enhance the viability and functionality of the NHIN.

An implementation guide that will serve as the foundation for just such a
uniform national standard is the EHR-Lab Interoperability and Connectivity
Standards, or ELINCS. Currently managed by the California Healthcare Foundation
the ELINCS project represents the concerted efforts of laboratories, IT
vendors, government agencies, physician groups and many others. The
implementation guide is currently under review with the Health Information
Technology Standards Panel. While the California Healthcare Foundation is not a
standards development organization efforts are underway to transfer ELINCS to
an SDO that is a non-proprietary consensus driven organization. Our association
and its members, as well as all of those who participated in the project, have
spent a considerable amount of time helping to devise this standard and it is
our hope that it will ultimately be relied upon as the means for uniform result
reporting and laboratory test requisitions.

I say that because the current version of ELINCS covers only result reporting,
and while I serve on the steering committee for ELINCS, we hope to address
test requisition down the line.

In addition ACLA makes the following recommendations regarding
interoperability standards:

A particular HL7 version, meaning HL7 version 2.3 or a higher version,
should be adopted for use in the NHIN together with an implementation guide for
that version.

LOINC should be used in the NHIN once an implementation guide is developed.

The adoption of either SNOMED-CT or ICD-10 to replace ICD-9 will require a
massive reprogramming effort but one or the other should be selected for use in
the NHIN when the HIPAA transaction and code set standards are revised.

Furthermore to facilitate widespread participation the NHIN should build
upon existing standards that are already commonplace in the industry instead of
imposing new standards from the top down. Accordingly whatever standards are
adopted should be consistent with the HIPAA transaction and code set standards
and should take into account any other standards that are currently being
employed in the health care industry.

Another issue which ACLA believes needs to be resolved is that of a patient
identification strategy. Obviously the most effective patient identification
strategy would be a unique patient identifier. However we recognize that the
goal of patient safety through rigorous patient identification must be balanced
with privacy concerns, so if a unique patient identifier continues to be
politically unacceptable, at a minimum the NHIN must adopt a robust uniform
national algorithm for patient matching using a standard minimum dataset
recognizing that no such algorithm is perfect and that many records simply will
never match due to patient name changes, nicknames, initials, misspellings,
marriages, divorces, and changes of address.

Laboratories face unique challenges in this area because they typically do
not see the patient and do not control the data acquisition. Certain data
elements, such as patient address, date of birth, or even gender may not be
required for certain billing arrangements. Therefore any dataset that is
employed should not include data elements that providers such as clinical
laboratories do not typically have access to, otherwise the system will only
further reduce the odds that records will match. The patient identification
function is one of the most basic foundations of the NHIN concept and the
difficulties associated with a complete matching system must be resolved before
the NHIN infrastructure can be completed.

Cost. While not a specific functional category, and admittedly it is out of
scope, cost is certainly an issue that must be considered when developing the
functional categories for the NHIN. Laboratories and other large data providers
face a potentially costly burden because of the depth and breadth of servicing
a national population and the resulting expectation of the availability of data
and responsiveness with which data are sent. The requirement of certain
functional criteria could translate into tremendous costs for laboratories and
other large data providers within the NHIN. Examples abound, however I will
list a few of these potential requirements which could translate into direct
costs for laboratories.

Creating audit trails and conducting regular risk assessments.

Providing patients with a mechanism to indicate that a visit is confidential
and hence only the ordering clinician is to be notified.

Supporting entry and display of content in language specified by user.

Presenting data via a web portal immediately upon availability of result.

If adopted as “standards” within the NHIN these criteria, while
well intended, will drive up the costs of transmitting and/or storing data for
laboratories and other large data providers such as pharmaceutical or imaging
companies.
A final issue ACLA would like to raise before the committee is the NHIN
functional category of authorization. Within the category laboratories face two
problems, one, their liability for the transmission of laboratory results each
time they are delivered to another entity within the NHIN, and two, ensuring
that an authenticated requester is authorized to view the data, meaning is this
a permitted disclosure and is the extent of the request reasonable.

Pursuant to CLIA when test result information is disclosed by the physician
to a regional health information organization the clinical laboratory is still
responsible for the content and format of that report. In addition, when an EHR
vendor changes the test result report that is provided to the physician the
clinical laboratory is still responsible for the content and format of that
report in accordance with CLIA requirements. Taking this one step further,
laboratories would then be responsible for test result report information in
the NHIN despite the potential for reports to be modified several times over:
for each and every manipulation made to them when an EHR vendor modifies the
content, and for each and every time the content is used, for example a primary
care visit, an office visit to a specialist, a hospital admission, a nursing
home stay, etc. This regulatory burden needs to be addressed in order to facilitate
the exchange of electronic health data for treatment purposes.

ACLA has proposals to address both situations. In order to address the issue
of EHR vendors modifying lab result report content we propose two potential
solutions, one, amend the CLIA interpretive guidelines to clarify that the
laboratory must only ensure that the CLIA-compliant report is received by
either the client or the vendor, or other contractually obligated intermediary
system, or two, seek amendment of the CLIA regulations to clarify that the
results may be sent either to the client or to the intermediary.

Second, to address the issue of information shared with other health care
providers aside from the ordering physician, the clinical laboratory’s
responsibility for the test result should end once the result is provided to
the ordering physician or other vendor. The interpretive guidelines should make
clear that the laboratory is not responsible for subsequent disclosure of test
result information made by the physician.

The other issue within the NHIN functional category of authorization
presents a particular problem for laboratories given their indirect provider
status. Developing a strong authentication and authorization infrastructure is
paramount to ensuring the privacy and confidentiality of patients whose
information is part of the NHIN. The authorization infrastructure would need to
provide an architecture governing the release of protected health information
for purposes for which patient authorization is required, for example
marketing. This authorization infrastructure would need to accommodate the
indirect provider status of clinical laboratories. Without this mechanism,
laboratories, in addition to other legitimate entities, would have no way of
knowing whether requests for PHI are lawful and if so the extent to which the
release of the information is a permitted use. Any authorization infrastructure
would need to be integrated so that user credentials could be evaluated against
the information being requested and the authorizations associated with the
request.

In conclusion, ACLA believes that addressing the issues discussed in this
statement, namely clinical data standards for lab test requisition and result
reporting, a unique patient identifier, the cost to large data providers of
servicing a national population, and the need for changes to existing CLIA
regulations as well as the creation of an authorization infrastructure, is
necessary to facilitate the transmission of lab results as well as to address
broader electronic health record needs in the NHIN.

The questions being asked today are important to the development of a fully
functional NHIN that while providing for the transmission of pertinent patient
information for quality care also addresses the security and privacy needs
required for any system that aims to encompass the entire nation.

Thank you.

DR. COHN: Jason, thank you very much. What we’ll do is let each
panelist present and then we’ll have questions and discussions after.

Our next presenter is Charlene Underwood from the Electronic Health Record
Vendor Association.

MS. UNDERWOOD: My name is Charlene Underwood. Chairman Cohn and committee
members, I appreciate the opportunity to be able to testify before you today
on behalf of the HIMSS Electronic Health Record Vendor Association. In the
audience we also have two other members of that association, Bob Barker, as
well as Charlie Jarvis, who are here to support me in answering any questions
that may get to a technical area in terms of the testimony when we have the
question and answer period.

In front of you you will find two documents, one is a copy of the
presentation and the second is actually the written testimony. I’m going
to walk through the presentation but we’ll be able to reference the
written testimony in addition to that.

Again my name is Charlene Underwood, I’m director of government and
industry affairs for Siemens Medical Solutions and chair of the Electronic
Health Record Vendor Association. The HIMSS Electronic Health Record Vendor
Association is a trade organization representing leading ambulatory and
enterprise electronic health record system vendors that address national
efforts to create interoperable EHRs in hospitals and ambulatory care settings.
Perhaps more importantly we are advocates for our customers, many thousands of
providers who today use our systems with the goal of improving patient care.

Today I’m going to cover these topics, I’m going to give a little
bit of background in a very limited way on Electronic Health Record Vendor
Association, address key NHIN functional requirements, and propose and share a
suggested response to that which we’ve published, and you can find on our
website, the EHR interoperability roadmap, as well as suggested phasing of
meeting the NHIN requirements, and lastly close with what we view to be the
next steps.

The Electronic Health Record Vendor Association is a trade organization that
represents 41 leading ambulatory and enterprise electronic health record
vendors. We have focused significant efforts since our founding in 2004 in two
major areas: first, in helping found a viable certification process with the goal of
helping to reduce the risk of EHR adoption. We also recognize the need and the
value of sharing health care information amidst our very diverse systems and we
are very committed to empowering patients and providers by sharing patient data
accurately and securely among our hospital systems. As you can see by the
members here, and actually the members on the next slide, we were created to
provide a forum for the vendor community to work together where appropriate on
topics that affect EHR adoption across the health setting, and we truly believe
that rapid widespread adoption of EHRs will improve both patient care and the
productivity of the health system.

And to that end we see the need to be able to share data amidst our systems
as one of the key barriers that are really preventing us from achieving the
levels of adoption as well as the levels of productivity that are really
possible within the industry. So as part of our mission we really looked for
the factors that we thought would make interoperability possible and we
recognized a couple of those factors in our planning as we work together
toward an interoperability model.

One was to follow a benefits-based incremental approach, and I think this
echoes the fact that we cannot follow a rip-and-replace model; we have hundreds
of customers who have hundreds of systems, who have significant implementations
and systems today, and they need to be able to build from where they are and
increment things in a benefit-oriented way. To do that however we recognize
that unless we work together in the vendor community and with other
organizations we’re not going to get to the ability to be able to share
data in a very disciplined way so that we can really move to a more plug and
play type of model in terms of sharing information. So to that we support
promoting industry collaboration, the vendors certainly cannot do it alone
without a lot of other support from other stakeholders in the industry, as well
as we’re very supportive of utilizing industry best practices which I
think DiDi will be talking about, and we’re supportive of processes like
Integrating the Healthcare Enterprise. We advocate for a single set of
interoperability standards and we also recognize that there’s going to be
levels of sophistication of interoperability standards as we move toward wider
spread adoption of sharing electronic health records so we advocate for levels
of interoperability as we migrate to increasing sophistication over time and
the ability to share information.

So to that end when we stepped back and looked at the diversity of the
provider community that we serve and the diversity of systems out there that
serve that community, we recognized that we had to define some requirements and
boundaries within which we want to work in a NHIN environment, and I’m
going to cover those requirements for you; there are three of them.

Number one, we advocate for the need to allow the edge system to retain
functionality and autonomy. And again this is going to enable health care
applications at the edge to be able to meet the workflow of their end users as
well as minimize duplication of services that are provided by the core. We
believe that utilizing a thin architecture for the core NHIN and its
sub-networks will leverage the functionality of the edge systems that are
connected to it and is more cost effective than duplicating the edge system
functionality within the core.

Secondly, whereas in the real world we potentially see value in a common
architecture approach to simplify how information is attached, the reality and
the data show that there are a lot of different architectures that we have to
be able to attach to. So we advocate
creating an architecture with the flexibility to support centralized or
distributed data repositories within the NHIN and its sub-network, and this
will help us by minimizing the proliferation of interfaces for different
storage schemes as well as recognize that there’s going to be evolution of
these architectures from more centralized to decentralized or vice versa
architectures over time. So creating the concept of the edge function versus
the core where the core can be evolutionary was a key requirement when we put
together our proposals.

The third requirement really focused on having a thin NHIN core, if you will,
being agnostic, and we believe that limiting the core functions of the NHIN
will lead to more robust edge systems as well as recognize the diversity of
systems that have to attach to the NHIN. And we’re a very strong advocate
of supporting diversity at the edges while supporting homogeneity at the center
in terms of facilitating the data exchange from edge to edge as well as between
edge to core.

So to that end one of the steps that the vendor association took, and this
was in response to Dr. Brailer’s RFI at the beginning of 2005, was to look
at an incremental approach to achieve a National Health Information
Infrastructure. And we created at that point a HIMSS interoperability roadmap
with the goal of supporting interoperable EHRs by the year 2004. And we also
defined it in such a way that the roadmap would be deployed in a phased
approach that would deliver incremental benefits, leverage current technology
that’s available now, and add new function as it emerged and became viable
and cost effective. This roadmap is
based on accepted and tested standards and its organizing principle is
empowering patients and delivering care in a patient centric manner —

MR. HOUSTON: Can I ask one quick question about your slide? Is the stuff in the blue box section at the bottom intended to be core functionality? The part of your slide that’s in blue at the bottom, is that core functionality —

MS. UNDERWOOD: Yes, and I’ll get to that. To that end, this is the blue box part of it: the functionality that we see from the edge systems to the core systems is represented in the blue boxes below. What we did was take the functionality that you’ve defined, the functionality from the NHIN that you provided us, and map it to these boxes as an attachment in the written testimony that I gave you, so we took that step to map your functionality into these boxes.

And again, the functionality that we see as the basic functions that are part of the NHIN is the information that’s needed to support the processes: identifying patients, providing access, as well as some key functional areas that are generic to support data processing within the NHIN, including document sharing, dynamic information management, and transactions. So this is a pretty agnostic layer that is really independent of what the content is, and these are the pieces that are mapped to the NHIN core. It would be our request that perhaps you could look at writing your functional requirements in the context of these services.
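
The content-agnostic service layer described in this testimony, identifying patients, providing access, and sharing documents without ever inspecting their content, can be sketched as follows. This is an illustrative Python sketch only; the class and method names are hypothetical and appear in no NHIN or HIMSS specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class CoreServices:
    """Hypothetical content-agnostic NHIN core layer: it identifies
    patients, provides access, and moves documents, but never parses
    the clinical content it carries."""
    # demographics tuple -> patient identifier
    patient_index: Dict[Tuple, str] = field(default_factory=dict)
    # patient identifier -> opaque shared documents
    documents: Dict[str, List[bytes]] = field(default_factory=dict)

    def identify_patient(self, demographics: Tuple) -> Optional[str]:
        # Patient identification: resolve demographics to a known identifier.
        return self.patient_index.get(demographics)

    def register_document(self, patient_id: str, document: bytes) -> None:
        # Document sharing: store the payload as opaque bytes, content unseen.
        self.documents.setdefault(patient_id, []).append(document)

    def provide_access(self, patient_id: str) -> List[bytes]:
        # Access: return the documents shared for an identified patient.
        return self.documents.get(patient_id, [])
```

Because every method handles documents only as opaque bytes, the same layer can carry a medical summary, a radiology report, or a lab result without change, which is the sense in which the layer is "agnostic."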

So to that end, that’s the functionality at the NHIN level. One of your questions was what is the minimum set of services that has to be provided in the first level, and again we would advocate that clearly the means to identify patients, provide access, and support some level of document sharing are at a minimum going to be required to support the NHIN as it starts, perhaps along with some dynamic information management. We have done the work to map the current use cases coming out of HITSP into these areas, and most of them could be handled with the first two boxes, the green and yellow boxes that you see on the screen, so we can share that at an additional level of detail.

As you look at the other component, and as you think of this, this is kind of the future vision; as all of these are deployed, this is going to look like our interoperability vision of the future. But the view that the vendors have taken is that we will be implementing pieces of this architecture over time. So for instance at the EHR level, the services that are depicted here, medical summaries, radiology, lab results, are domain specific, and these are the pieces that are implemented by each of the vendors at the edge system in a very consistent way. For instance, if we want to share medical summary information, the vendors will do that in a consistent way and then use a document sharing infrastructure to communicate and share that information; the same goes for radiology reports and potentially historic lab results. That set of domains can increase, and then you can use the different types of services for the different types of communication that you have to perform.

When we wrote the roadmap we expected that it would be implemented in phases, so the other thing that we did was look at the current standards as well as the current content standards that were available, and we proposed four levels of implementation that would build on each other logically over time and provide incremental value with each step. We would start more simply, with convenience functions that make sharing information easier, that take away the paper, if you will, and address some of the productivity issues, and move to functions that ultimately, over time, become more supportive of the data that’s captured and enable functions like clinical decision support, sharing knowledge, and doing population based management. All of those steps are included in our roadmap.

DR. COHN: Charlene, could you go back to that slide, I’m not sure that
you actually talked about this one, am I confused —

MS. UNDERWOOD: Did you want me to walk through each of the phases?

DR. COHN: Yes, just to make sure I understand, I apologize —

MR. BLAIR: And Charlene, for my benefit, if there’s a list of functions, could you let me know what they are?

MS. UNDERWOOD: Thank you, Jeff, sorry. Phase I of the functions is really the ability to share patient care status information. What this is about is the ability to capture information from an encounter in a medical summary format, think of the continuity of care record, and share it among our electronic health record systems in a structured form. We actually demonstrated that at the HIMSS interoperability showcase this year; this is the one that we told Dr. Brailer we were going to do in a year, and many of the vendors have stepped up and done that first phase.

The second phase is sharing diagnostic results and therapeutic information. This adds patient created information in emergency summaries, plus lab and prescription information, and the ability to share that information via structured documents and/or transactions. We’re looking at this phase, for instance, in support of the EHR work that’s being done in the HITSP committee, and again, what the standard is will resolve out of the HITSP process.

Phase III includes advanced clinical support and access functions, which include increased access control and notification, exchange of more advanced continuity of care documents, and dynamic queries for medications and allergies with more extensively coded data. As each phase advances we recognize that the standards for codification are going to be improving, and we will be adopting those standards so that we can share and integrate more of the allergy information and the medication information across those systems and enable that information to be imported into our systems; again, it’s going to be evolutionary. But at that point you can start to think about clinical decision support, so the first phase is let’s share the information, and phase three is start to actually share data semantically.

And the last phase, called collaborative care quality, really looks at introducing more work: active quality reporting and health surveillance. It’s the ability, as a byproduct of data capture, to generate surveillance information as well as performance information, to communicate that information via transactions, and to receive notifications back and respond to them, for example notification of an outbreak. Clearly in this phase we could also be looking at supporting clinical trials and other types of research, but that phase becomes more advanced in terms of understanding the data that has to be captured for performance and metric purposes and building that into the workflow; again, it is more invasive into the EHRs, so that such data can be captured as a byproduct of data capture.

So again, we start pretty simply with convenience type functions, move to functions which enable the workflow by providing clinical decision support as semantic interoperability increases, and recognize that there will be advancements in performance measures that can be captured as part of the workflow, so that at the end we are able to inform population management more effectively. That’s kind of the series.

The good news today is this is not brand new; as I mentioned, a significant number of vendors demonstrated Phase I at the HIMSS interoperability showcase, and we’re focusing on introducing advancements for the HIMSS interoperability showcase this year. But in addition we also recognize that health care is not contained to the United States, it is global. This slide shows a picture of the world and where the concepts of the interoperability roadmap are being used: in Canada, for instance, it’s being looked at for adoption by CSC and IBM in some of their RHIO structures, as well as in Denmark and France. So it is a global concept that’s being adopted, and the principles upon which we’re founding this are really being tested globally.

Finally, one of the key points we really want to bring to the table is that vendors have been working on interoperability and have invested significant effort in sharing data within health enterprises, as well as among ourselves, for many years; there are thousands of years of experience here, and we really ask that you leverage the experience of the industry as part of this process.

Secondly, and this is clearly a message from our customers, we must be pragmatic; we’ve got to take a business oriented approach and make sure that each step of the way there’s value to the customers, who are stretched today to do their job, let alone adding new steps into their workflow which could impact them taking care of a patient, which is our first priority. We clearly advocate moving toward a single set of standards and private sector initiatives, because putting us all on the same page in an organized way is going to get us to our vision of interoperable health care much more effectively than not going in that direction. And lastly, we’re fully committed to delivering interoperable EHR components for NHIN demonstration projects to show what works, and we think those will help us move forward in achieving our vision of interoperability.

So on behalf of the EHR VA Mr. Chairman I really want to express my
gratitude for the opportunity to share our perspectives on creating the
National Health Information Network. Thank you.

DR. COHN: Charlene, thank you very much. Obviously as I said we’ll take
questions and comments after our presenters. Don Mon, you are I believe next,
we don’t have copies of your presentation —

DR. MON: There’s not a Power Point presentation, there is a handout —

DR. COHN: So thank you for joining us, I understand you’re representing
CCHIT today as opposed to AHIMA.

DR. MON: Yes, I apologize, I forgot to say that in my introduction.

Good morning, Chairman Cohn, members of the committee, and fellow participants. My name is Don Mon, vice president for practice leadership at the American Health Information Management Association, and industry liaison for the Certification Commission for Healthcare Information Technology. It is in this latter role that I articulate CCHIT’s responses to the questions you’ve posed. Thank you for allowing us this opportunity to provide input on the issues related to the Nationwide Health Information Network.

Please be aware that because of the timing of CCHIT commissioner meetings, the response time for this meeting was not sufficient for us to develop a position and take it to the full commission for formal approval; therefore the statements that I make today represent the initial thinking of CCHIT’s leadership rather than an official position of the full commission.

And also let me preface my remarks by explaining the certification process
and the timeframes within the contract that CCHIT has with the Office of the
National Coordinator with respect to network components. Per the contract CCHIT
will not begin certifying network components before 2008, the third year of its
ONC contract. Parenthetically in the first two years CCHIT will certify
ambulatory and inpatient systems respectively.

To certify network components in 2008, there are a lot of things that have to be done before certification actually begins. The first is that CCHIT must assemble workgroups comprised of industry experts who apply and volunteer for these workgroups. These workgroups will then develop certification criteria for network components throughout 2007 and begin testing components for certification in 2008 or beyond. So essentially what happens is that these workgroups, through an environmental scan, go out to the industry and do as we are doing now: listening to the input that is being gathered from the NHIN forum and going to events or meetings like this to gather that input, so that we can put it on our early detection system about what kinds of things might be coming forward for network components.

Getting back to the workgroups, it’s these workgroups who would normally scan the industry, then engage the collaborative projects, for example coordinating certification with the harmonization of standards by HITSP, and also evaluate the requirements that have been posed in the questions below. So obviously, given the timeframes that I have just laid out, these workgroups don’t exist yet, and therefore we don’t have definitive work on the NHIN requirements. However, we are able to give you some initial impressions from the staff and leadership of CCHIT, with the caution that our input may evolve over time and we may have different input for you once we’ve assembled the workgroups next year.

I’d also like to add that CCHIT is not a standards development organization; we don’t make the standards. The way the process is supposed to run is that the standards development organizations develop the standards, those then get harmonized through the HITSP process, and CCHIT then takes those kinds of requirements, extracts them, and puts them into certification. That doesn’t mean we’re passive about the functional requirements, for example the NHIN requirements: when we developed the ambulatory certification criteria, we found that there were certain things within the criteria that we then had to take back to HITSP and to some of the standards development organizations and suggest that they include, so that we can then follow that chain of something becoming a standard, being harmonized, and then entering certification.

So as to the questions that you’ve posed, the first one was what are the minimum but essential network functional requirements, focusing mostly on the core systems. As there are 1139 functional requirements listed, we chose not to comment on individual requirements, and so we’d like to frame our remarks in certain domain areas. And these are the barest of minimum requirements. The first is security, privacy, and confidentiality; CCHIT is hearing very loud and clear from all stakeholders that this is an area of concern within networks in general, whether it’s the NHIN or state level RHIOs or local RHIOs. When we talk about security, privacy, and confidentiality, it is all of those controls for who has access to the record, who has accessed the record, and what was accessed, and these will be consistent with CCHIT’s security and reliability certification criteria.
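
The controls described here, who has access to the record, who has accessed it, and what was accessed, amount to an authorization check paired with an audit trail. The following is a minimal sketch only, with hypothetical names and a simple permission map standing in for real credentialing and consent policy:

```python
class AccessAudit:
    """Hypothetical sketch of access control plus audit: every attempt
    records who tried to read which record and whether it was allowed."""

    def __init__(self, permissions):
        # permissions: user -> set of record ids that user may read
        self.permissions = permissions
        self.log = []  # (user, record_id, outcome) tuples

    def read(self, user, record_id, records):
        # Deny and log if the user lacks permission for this record.
        if record_id not in self.permissions.get(user, set()):
            self.log.append((user, record_id, "DENIED"))
            raise PermissionError(f"{user} may not read {record_id}")
        # Log the disclosure: who accessed the record and what was accessed.
        self.log.append((user, record_id, "READ"))
        return records[record_id]
```

The point of the sketch is that the audit log captures denied attempts as well as disclosures, which is what lets a network answer "who has accessed the record" after the fact.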

So one of the first recommendations that we would have is that there should
be an analysis between the NHIN functional requirements and CCHIT’s
security and reliability requirements to see where they overlap, we have not
had the opportunity to do that just yet but seeing that we have just received
these functional requirements in late June from NHIN that’s one of the
things that could be done.

The second area is patient identification. In an NHIN environment it will be extremely crucial that patient records are matched correctly before health information is exchanged, and that is even more crucial, as we’ve been hearing as the NHIN discussions have evolved, given the role PHRs are playing in the NHIN. If individual consumers are permitted to access the NHIN through their PHRs, then patient matching is going to be all that much more critical.
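
The matching concern can be illustrated with a deliberately simple sketch: records must agree on normalized demographic traits before any information is exchanged. Real master patient index systems typically use probabilistic scoring over many traits; the function and field names below are hypothetical.

```python
def normalize(name: str) -> str:
    # Normalize case and whitespace so trivial formatting
    # differences do not block an otherwise valid match.
    return " ".join(name.strip().upper().split())

def match_patient(query: dict, registry: list) -> list:
    """Hypothetical deterministic matcher: exact match on normalized
    name plus date of birth. The principle being illustrated is that
    records must match before health information is exchanged."""
    return [
        rec for rec in registry
        if normalize(rec["name"]) == normalize(query["name"])
        and rec["dob"] == query["dob"]
    ]
```

Note that the date of birth disambiguates two patients with the same name; dropping any trait from the match widens the risk of exchanging the wrong person's record.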

And then the third area can be summarized as completeness of the record, though it was much better articulated by Jason DuBois; these are issues such as what data were sent and whether there were subsequent updates to that health information by the data source.

The above are the most essential requirements, we certainly recognize that
there are many more domains that are important to the operations of the NHIN
but as we mentioned we felt that these were the bare essentials.

In terms of the specific edge systems and what functions would be minimum but essential, here again we would suggest an analysis that compares the functional requirements of the NHIN to CCHIT’s functional requirements. We reviewed the list of 1139 functions, and we did see some things there, that lab data should be sent and other data should be sent, for which we have similar kinds of functions within the certification criteria. Obviously the two should be in sync. Because CCHIT is also taking a minimal and then incremental approach, there may be some functions within the NHIN set that aren’t certified, and there may be some things in the CCHIT requirements that aren’t in the NHIN functions. But it would be worthwhile to compare those two sets of functional requirements and see that one doesn’t constrain the other.

We applaud the workgroup that put together the Power Point slides; the first slide that was distributed to us showed the architecture, and the second slide all of the domains. I think all of the domains listed on the second slide were spot on. The one that we don’t question, but don’t think is necessarily minimum but essential, is the persistence of data when it is construed as storing clinical data in a central repository, as there are some RHIOs that are taking an approach in which there is no storage of that clinical data.

And then lastly, as to the question of whether there are other standards that were not identified through the HITSP use cases, we think that HITSP has done very well to review the myriad standards that are relevant to the respective use cases, and at this time we do not have other standards to bring forward for consideration. Our review of the 1139 functional requirements showed that there was nothing unusual with them except for one thing that is piquing our interest, and that again surrounds the involvement of the PHR inside the NHIN. The PHR’s involvement just introduces another level of complexity to the NHIN, and we found that specifically in line 324 the phrase fully qualified PHRs is used. It’s not clear what is meant by fully qualified PHRs and what is done to qualify them. Does that mean a fully qualified PHR is one that is certified? If so, that means CCHIT must now attend to something like that, and that’s an area where, by looking at other projects like the NHIN project, we see something that puts something on our radar screen.

So thank you very much for allowing us to provide this input and we’ll
be glad to take any questions that you have.

DR. COHN: Don, thank you very much. I do want to remind everybody, before Jamie starts, to put your cell phones on mute, since I’m hearing little ringing of various sorts; one of our members just left with that. If you can, put your cell phones on quiet or vibrate.

Now our last presenter is Jamie Ferguson, obviously thank you for coming all
the way from California. I do want to just relate to you a conversation we had
yesterday knowing that Elaine Blechman has already presented some of the
information here. It was unclear to us whether or not there was an intent by
both you as well as Floyd to both review the early parts of these presentations
on the background of HITSP and I guess I was hoping that that was not going to
be —

MR. FERGUSON: My intention is to skip straight to the testimony from our
committee and then in the question period I can take and answer any questions
on HITSP in our process and overall status.

DR. COHN: We appreciate that, thank you very much.

MR. FERGUSON: My name is Jamie Ferguson; I’m here as co-chair of the Electronic Health Records Technical Committee of HITSP, or EHRTC. First I want to thank you for giving our committee the opportunity to comment here. I’d like to start on page eight of the written testimony and take any questions about HITSP later. There were also some Power Point slides that I believe were distributed to the committee yesterday, and there are copies over on the table; I’ll be referring to those but will not be presenting them on the screen.

The Electronic Health Records Technical Committee of HITSP has
conducted a series of open sessions to create HITSP’s work products and
deliverables related to the initial EHR use case. The committee currently has
89 organizational members representing all sectors of health care in the United
States and our committee meetings typically include lively debate on a wide
variety of related issues.

Since the first NHIN forum on June 28th and 29th the
NHIN functional requirements and the relationship of the NHIN framework and
NHIN concepts to the standards and implementation considerations for a use case
have occupied some of our time in these open sessions although it should be
mentioned that those discussions have not been the main body of work for the
committee. This testimony attempts to represent those discussions faithfully,
completely, and concisely.

As context for the following points I’d like to refer to the three
slides on pages 24 through 26 of the Power Point presentation. These slides
represent the functional flow diagram for the EHR laboratory results use case
that was published by HITSP in the Technical Committee presentation to the
HITSP panel, document HITSP 06 number 95 on June 14th. The slides
also represent an example of the adaptation of these functional elements into
interoperability specifications for the scenario of sending an electronic lab
result to authorize providers of care for use in clinical care. The document
HITSP 06 number 95 is an appendix attachment and I’ll now explain the
diagrams in those slides.

As described in the slides, the EHR lab use case can be viewed in two primary scenarios. In the first scenario, a releasable electronic lab result is sent to the ordering clinician, to authorized providers of care, and to a data repository. For interoperability in this scenario, a set of functional interactions must occur that requires standard laboratory terminologies and message formats, and that communicates characteristic message content.

In the second scenario, authorized providers of care may query either a data repository or a record locator service to retrieve lab results or result summaries. It should be possible to enter the queries from within an existing EHR system or through an independent web application without an EHR system. For interoperability in this second scenario, a further set of functional interactions is performed, such as those to create and retrieve a structured document containing the lab results. All of these interactions involve multiple actors, and all the actors described in this work generally are agreed to be edge systems for purposes of the NHIN. If these actors are unspecified edge systems, a wide variety of implementation architectures is enabled in which any of these actors may be combined into edge system entities in practically limitless ways.

From the perspective of the HITSP standards specification linking the NHIN and the current EHR use case, the functional services of the NHIN core should be restricted to a minimal set of transport services, in a manner that, in the electronic realm, is analogous to a commercial package delivery service in the paper realm: the NHIN should deliver electronic packages to authorized receivers in a reliable and secure manner, and this should be done without opening the packages or performing any additional services related to their contents or their containers. Using this analogy, a set of NHIN core functional services related to transport would include verification of the address of the recipient, authentication of the identity of the recipient, and giving the sender an acknowledgement of receipt. Related security policies and security profiles may require specification by the NHIN.
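
The package delivery analogy, verify the address, authenticate the recipient, deliver the payload unopened, and acknowledge receipt, can be sketched as a transport-only core. All names below are hypothetical, and a shared secret stands in for real authentication infrastructure:

```python
import hashlib

class TransportCore:
    """Hypothetical transport-only NHIN core: it verifies the recipient
    address, carries the payload as sealed bytes it never opens,
    authenticates the recipient on pickup, and acknowledges receipt."""

    def __init__(self, directory, credentials):
        self.directory = directory      # known recipient ids -> addresses
        self.credentials = credentials  # recipient id -> shared secret (auth stand-in)
        self.inboxes = {}

    def send(self, sender, recipient, payload: bytes):
        # Verify the address of the recipient before accepting the package.
        if recipient not in self.directory:
            raise ValueError(f"unknown recipient {recipient!r}")
        # Deliver without opening: the payload is queued as opaque bytes.
        self.inboxes.setdefault(recipient, []).append(payload)
        # Give the sender an acknowledgement of receipt.
        return {"from": sender, "to": recipient,
                "digest": hashlib.sha256(payload).hexdigest()}

    def receive(self, recipient, secret):
        # Authenticate the identity of the recipient before releasing anything.
        if self.credentials.get(recipient) != secret:
            raise PermissionError("recipient authentication failed")
        return self.inboxes.pop(recipient, [])
```

The acknowledgement carries only a digest of the payload, so the sender can confirm integrity of delivery while the core itself never interprets the contents, which is exactly the thin-core posture the testimony advocates.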

Specific standards for these NHIN transport related services should be
determined by HITSP in consultation with appropriate designated parties.
Standards and related specifications determined by HITSP for most or all other
functions required by the use case generally are used in functional
interactions between health care systems and health care entities. The
inclusion of standards for these additional functional services in the core of
the NHIN would limit architectural choices, it would have the effect of
dictating selected implementation options in some cases and it would limit
commercial variation by requiring the bundling of some optional functional
services with mandatory functional services in EHR systems.

An example of these effects is found in the terminology services described in the HITSP standard solution to the EHR use case. For interoperability to be achieved, a standardized set of terminologies must be used. There are many ways of achieving this end, and a wide variety of solutions can be found in the commercial systems currently marketed to edge systems. In this example, the inclusion of standards for functional terminology services in the core functions of the NHIN could severely limit market choices and related implementation alternatives.
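
Terminology services of the kind described here ultimately come down to translating local codes into a standard vocabulary at the edge. A minimal sketch, assuming a hypothetical local lab code scheme and an illustrative mapping table (the standard codes shown are stand-ins, not verified assignments):

```python
def translate(local_code: str, code_map: dict):
    """Translate a local code to a (standard_code, display) pair.
    Unmapped codes raise KeyError so they are flagged for review
    rather than silently passed through untranslated."""
    if local_code not in code_map:
        raise KeyError(f"no standard mapping for local code {local_code!r}")
    return code_map[local_code]

# Hypothetical edge-side mapping table; entries are illustrative only.
LOCAL_TO_STANDARD = {
    "GLU": ("2345-7", "Glucose [Mass/volume] in Serum or Plasma"),
    "K": ("2823-3", "Potassium [Moles/volume] in Serum or Plasma"),
}
```

Because many such mapping strategies exist in commercial systems, standardizing the translation *function* in the NHIN core, rather than only the target vocabulary, is what the testimony warns would limit market choices.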

As in the functional diagrams HITSP interoperability specifications for this
use case will cover a variety of functional services in addition to those
related to transport. Among others some functional services for electronic
clinical messages and clinical documents are ones to create, address, send and
receive them, and to execute rules which perform operations on their contents
and their formats. Without favoring any architectural choices for edge systems
it still can be stated that multiple hierarchical tiers, or layers of edge
systems, should be agreed to contain and perform the additional functional
services. Specifying a multiple number of edge system tiers or layers would
enable a wide variety of architectural approaches to be implemented because
these additional services could potentially be placed in any tier or layer in
any configuration and it could enable architectural variation to coexist more
peacefully. This would be true particularly if their definition promotes a
common set of interoperable interaction specifications between edge layers for
the additional functional services.

HITSP should select relevant standards and develop interoperability
specifications for these interactions when the core services are defined.
Architectural neutrality would be promoted further if it were clear that any
edge tier or layer could interact with the core NHIN transport services under
specified conditions. An example of these edge system layers can be found in
states in which a state regional health information organization, or RHIO,
operates in conjunction with both regional and local RHIOs as well as
multi-state delivery systems and networked communities of interest. In this
example it would be helpful to have a set of conditions and specifications for
interactions of these tiers or layers of edge systems with the NHIN core
functional services.

A categorization of these specifications for geographic and non-geographic edge systems could simultaneously allow states to oversee health information exchange programs within their borders, and allow non-geographic networks or communities of interest to use common specifications for interactions with NHIN core services. It should be reemphasized that any such definitions of multiple tiers or layers of edge systems need not place any limit or constraint on architectural options. To enable standardized interactions of edge systems with the NHIN core, HITSP should be engaged to determine the standards and interoperability specifications as soon as policies are proposed that specify any conditions or boundaries for edge system interactions with the NHIN.

In summary the HITSP EHR Technical Committee believes that the functional
services of the NHIN core should be limited to reliable and secure transport of
unaltered electronic messages and electronic clinical documents.

The Technical Committee would like to thank you for the opportunity to
comment here.

DR. COHN: Well Jamie, thank you very much, I want to thank all the
presenters for what I think has been some very interesting testimony. Now are
there questions from the workgroup? Harry?

MR. REYNOLDS: Thanks to everyone, nice job. I had a kind of quick question for each one of you. Jason, as you presented you focused a lot on regulations; I didn’t seem to pull out of there what you thought the base requirements were.

MR. DUBOIS: Well, one of the problems with the exchange of information electronically today is that laboratories have to abide by the Clinical Laboratory Improvement Amendments, or CLIA, which regulate quality control and quality assurance. Among the requirements is that a result report is received by the ordering clinician; the laboratory, under the regulations and under the law, is required to ensure the delivery of the result and that it appears according to different specifications under CLIA, that the management director of the laboratory signs off on certain types of laboratory test results, and there are other requirements under CLIA. Well, one of the problems with the way that information is shared today, at least electronically, is that when CLIA was passed into law and the regulations were written, I don’t think there was a lot of thought given to the exchange of laboratory result information electronically.

With that said, I think some of the problems are that vendors who are serving the physicians or hospitals will actually change the presentation or view of a result, and in some instances laboratories can be considered liable for that, despite the fact that it’s the vendor and not the laboratory that’s changing the view of the results. So they might take down, say, which physician signed off, and they do that because it’s the physician or the hospital that most often wants it that way; they want to see the result stripped down, just okay, show me what the glucose number is, they don’t care who signed off on it or want the full report; on the AP of the breast biopsy, malignant or benign is all they want to know.

So with that said, laboratories can be considered liable for that information once it’s changed. Furthermore, a concern for many of our members is that once it’s shared, let’s say in the RHIO example in Indiana, they’re required to ensure the delivery of that lab data to each and every person who sees the result, not just the ordering physician.

I don’t know if I answered your question —

MR. REYNOLDS: No, you reiterated that you guys aren’t ready to go there —

MR. DUBOIS: We want to go there —

MR. REYNOLDS: And that your relationship tends to be more between the lab
and the practicing physician.

MR. DUBOIS: And we want to, we’re trying to, that’s why we have
suggestions here in the testimony and we’re also working with the Office
of the National Coordinator and CMS to try and fix these, what we see as
impediments to the exchange of lab result information.

MR. REYNOLDS: Charlene, quickly, on profiling, what do you mean by that?

MS. UNDERWOOD: The use of a process such as Integrating the Healthcare Enterprise, where we have actually defined within a profile how we’re going to be using a specific standard, so it’s the profiling —

MR. REYNOLDS: So it’s business rules between —

MS. UNDERWOOD: Business rules, we’re narrowing down those processes so
that they’re common among vendors in terms of how we implement.

MR. REYNOLDS: And some of your focus was on interoperability edge to edge, and as an industry we have tended to find that the more we become edge to edge and one to one, things haven’t gone quite as well in general; I mean whether it’s the payers to the providers, edges to edges, and vendors to vendors, it’s made it more complicated. So I was interested in, you mentioned the things that you thought could be useful in the center —

MS. UNDERWOOD: In the center, and I think that resonated with Jim’s testimony, that we see that, because of the variability of architectures and capabilities, workflow needs to be maintained at the edge, where the center needs to be this common set of transport services, so I think that’s really the same message. And what we’ve seen, actually, as we’ve looked at the HIPAA transaction models, is that where there’s still variability and a lot of content diversity at the edge, over time it does improve and become increasingly standardized so that it’s easier to do that exchange. So it is a process that will evolve as opposed to something that we’ll start out with, but it doesn’t mean not to start, and we think we should start.

MR. REYNOLDS: Don, bare to the minimum, is that a disclaimer or is that a base set of requirements?

DR. MON: That was a disclaimer.

MR. REYNOLDS: Just making sure we understand. Those were my questions.

DR. COHN: I guess I had one and I know Jeff has a couple of questions. I was
actually trying to reconcile and trying to figure out what HITSP, what
you’re recommending, Jason, I was having some trouble understanding, I
recognize the complexity of CLIA and certainly there’s a fair amount of
complexity there. I was just trying to figure out whether or not you’re
supportive of what Jamie was saying around HITSP in relationship to how things
ought to be transported and what ought to be happening, or do you have other
views on this? Do you remember what he said?

MR. DUBOIS: To a certain extent —

DR. COHN: Let me just read it to you, it says in summary HITSP EHRTC
believes that the functional services of the NHIN core should be limited to
reliable and secure transport of unaltered electronic messages and electronic
documents. So I’m just, I mean is that something that —

MR. DUBOIS: I don’t think they’re out of sync with one another, I
think that we’re just, with the CLIA regulations we’re just trying
to, we are trying to facilitate that exchange, and know that big companies like Quest Diagnostics or LabCorp, who are our members, are going to be doing this; unlike a lot of other laboratory providers that do it on a regional scale, you’re talking about companies who, much like the NHIN setup, already exchange this information nationally. So to a certain extent some laboratories
are already there but we’re just, the concern is just making sure that
laboratories aren’t responsible for the final report each and every time
that that data is provided to others within the NHIN infrastructure.

DR. COHN: It looks like both Charlene and Jamie have further comments.

MS. UNDERWOOD: Again, I’m going to be very practical, if I were to get
a copy, let’s just say they sent it out in a document format then when I
would receive that in my system if I would see something in that result that
would be relevant to some other information that I stored in my system I might
want to present that result in the context of some other clinical information.
That’s where the restraint, in terms of me having to present it as it was sent to me under the CLIA regulation, could potentially constrain how I interpret that and advance that knowledge to my end users. Is that kind of it? So I would like it, and I could take it in that same way, but once I get it then I want to do something with it that makes it more useful for the person that I’m serving.

MR. DUBOIS: And one of the changes that we’re requesting is that when
the vendor does receive it that that would be considered receipt of the result
report and that it’s okay, they’re trying to facilitate the way that
the physician will see it but as it stands now we’re still on the hook for
the way that the laboratory result is viewed by the physician despite the fact
that it is being manipulated by an intermediary.

MR. FERGUSON: I’d just like to note that in order to meet the current
CLIA regulations and without changing that policy we’ve been very careful
in our work in the HITSP Technical Committee to make note of standards for
these electronic messages and particularly the lab results documents that can
uniquely identify the custodian of the document being the source lab and so
that’s complicated, for example in the case where the lab that receives
the order then sends it out to a reference lab and it’s actually the
reference lab that has to be identified on the document through the complete
chain. So what we’re doing today is we’re making sure that the
standards that we use for solving this use case can meet the current regulation
and then if the regulation has changed then we would perhaps have to change the
standards to meet the new regulation.

DR. COHN: Jeff?

MR. BLAIR: I’ve kind of been saving this question because it is a
difficult one but I think that this panel is probably more capable of coming up
with constructive ideas to begin to address, an approach, if not a definitive
answer then an approach or some suggestions that could help us to identify the
minimum set of functions for the NHIN. If you had a chance to listen yesterday
there’s a chorus, everyone indicating that we’d like to have the NHIN
be as thin as possible. On the other hand one of the other things to balance
that off is that it avoid rip and replace, in other words it has to be
flexible. And this panel I think has individuals that have dealt with the fact
that you go from version to version to version of standards, and the NHIN is going to have to deal with this problem so that it could support more than one version of a message format standard and, over time, different systems that might have not only different coding systems but different versions of the coding systems.
So I’d love to hear any thoughts you might have as to how the NHIN
could remain as thin as possible but still be flexible to accommodate the fact
that different EHR systems, e-prescribing systems, clinical labs, might either
have different message versions that they support or different coding systems
or different versions of them. Words of wisdom, how can you help us?

MR. FERGUSON: We’ve actually had I think quite a bit of discussion on
this in the EHR Technical Committee and I think to summarize our discussions I
would say that the answer is to minimize the number of standards that are
acceptable and allow the market and allow the edge systems to determine how
they transform their clinical content and their messages and documents and
their terminologies into those target formats and target code sets.

MR. BLAIR: Do I hear you saying that the edge systems must be responsible
for conforming to a single standard of the NHIN?

MR. FERGUSON: In general, yes —

MR. BLAIR: And what happens if over time the NHIN decides to change a level
or a version?

MR. FERGUSON: I think our view is that we want to minimize the number of
different, and I’ll say for example terminology subsets, so that we would
want to minimize the number of SNOMED or LOINC subsets in the case of the lab
results that are acceptable as the standard set for interoperability and so
today in HITSP we’re in the process of selecting and finalizing the first
set of standards so we recognize that those things will change over time, that
a maintenance process will be required but I don’t think there’s a,
we don’t have a pat answer on how to manage that maintenance and upgrade
process other than to minimize the number of standards that are acceptable.
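Mr. Ferguson’s point, that the network accepts a minimal set of target formats and each edge system is responsible for transforming its own messages into them, can be illustrated with a rough sketch; all version names and field layouts below are hypothetical, not actual HL7 structures:

```python
# Sketch: edge-side adapters that each transform a local message
# version into the single target format the network accepts.
# Version names and field layouts are hypothetical.

def from_v1(msg):
    # hypothetical v1 layout: flat fields
    return {"test_code": msg["code"], "value": msg["value"]}

def from_v2(msg):
    # hypothetical v2 layout: fields nested under "observation"
    obs = msg["observation"]
    return {"test_code": obs["code"], "value": obs["value"]}

ADAPTERS = {"v1": from_v1, "v2": from_v2}

def to_target(version, msg):
    """Normalize a local message into the one accepted target format."""
    if version not in ADAPTERS:
        raise ValueError(f"no adapter for message version {version!r}")
    return ADAPTERS[version](msg)

# Two different local versions arrive at the same target format:
a = to_target("v1", {"code": "GLU", "value": 5.4})
b = to_target("v2", {"observation": {"code": "GLU", "value": 5.4}})
print(a == b)  # True
```

Keeping the set of target formats small means each edge maintains one adapter per local version, rather than every pair of systems negotiating with every other.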

DR. COHN: Charlene, you look like you had a comment on this one.

MS. UNDERWOOD: I just wanted to reinforce some of those principles and Bob,
if you want to add, because I know you’ve lived this for a long time, any
points to that that would be fine. But I think I want to reinforce again the
focus to fewer standards, more harmonized standards, so for instance when we
look across the three use cases that are active now, biosurveillance and the
lab as well as the consumer empowerment and I look at it and I see redundancy
of the data across those three different efforts, I’m sitting there
praying that I don’t want one standard for each one, I want one standard
that will enable me to respond to those use cases dependent on what my users
needs are and I could just populate it accordingly.

So I think the focus needs to be as we drive towards standards in the
future, data driven standards as opposed to specific transactions for every
single scenario that we’re going to run into, but I think that’s
really saying the same thing. Likewise at the edge I think there has to be
leadership on the part of the nation to direct us toward more single sets of
codification standards and I think those pieces have to come together.

So that is where policy and process will actually make this happen because
at the edge today there’s incredible variation in data content and quality
by every single one of our users and to do that mapping is going to be at the
detail level and they will not move until there’s some substantiation that
it’s worth making that move, trust me. So those things I think are
important in terms of the leadership.

Bob, did you want to add another point?

MR. BARKER: I’m Robert Barker with the EHR VA and what Jamie was saying
earlier was number one was to reduce the number of standards that are out there
because from a vendor, and I’ll give a specific case, there are two
competing SDOs, ASTM and HL7, in different areas, one is the medical summary
exchange which is the number one on the roadmap we’ve achieved and I have
two sets of developers, one that develops in one standard, one that develops
for the other, both of them are doing the same thing. So when you get into your
comment was how do you handle the changing versions let alone multiple
standards to be used, that’s what we have now is multiple standards and
multiple versions of each of these standards and if we can reduce them to at
least have one standard that has different versions we’ve simplified the
process, we’re using less resources to handle multiple standards and more
toward the content and the ability to use the information in a clinical
setting, so to reduce the number of standards is the first thing that we have
to do.

DR. COHN: Mary Jo was next on the queue, I also have a follow-up on this.

DR. DEERING: Thank you and actually, Jamie, this is for you and I believe
it’s just an editorial clarification of your written testimony, on the
bottom of page nine it gets exactly I think at what was just said and therefore
I’m wondering, there’s a sentence there that says the inclusion of
standards for these additional functional services in the core of the NHIN
would limit architectural choices. Surely you don’t mean the inclusion of the standards, you just mean the inclusion of these additional functional services in the NHIN core —

MR. FERGUSON: That’s correct —

DR. DEERING: So I think that you might want to go in and strike out, we do
want in fact standards —

MR. FERGUSON: Thank you for catching that error.

DR. DEERING: That’s the opposite of what you want, I just wanted to as
a former, spot a typo at a thousand yards.

Anyway, I had a clarification for Don and it’s a process clarification
that you may not be able to answer. In a couple of these instances you
suggested that once you’ve looked at CCHIT’s security and reliability
certification criteria and other, maybe it is just that single document, and
compare it for overlap to the functional requirements, you left it out there as
if one should and I guess my question to you is as part of your process over
time would that have been an exercise that you intended to engage in?

DR. MON: The short answer here, Mary Jo, is yes, having received the
functional requirements in late June when they were published and distributed
at the NHIN forum we just didn’t have the opportunity to do that between
here and then but then as soon as we saw the format and the content of the
functional requirements we saw that there may be some overlap. And so what
we’re suggesting is that either, really within CCHIT we’ll take that
on to review that but I think it will probably be better served, we just
haven’t made the connection yet to work with somebody within the NHIN
project and do this as a collaborative exercise and that’s probably the
better approach to do that. However that hasn’t been fully discussed
within our staff, and so I was a little vague on that, I apologize.

DR. COHN: Can I ask a follow-up question and John you’re obviously next
in queue here. But just to follow-up on what Jeff was asking a little bit to
make sure I understand and I think, I’m actually just struggling a little
bit as we talk about these obviously standards and version changes and
obviously terminology changes and all of this, and the concept that we have
these unaltered packages that are going to be going through the NHIN and
I’m just trying to figure out who’s responsible for what where
because it really does begin to speak of functional requirements. I guess if I
think of an unaltered package I’m expecting that somehow there’s sort
of minimal standards around the package that somehow defines it, routes it,
transports it, verifies it as secure and whatever, with the need at sort of the beginning and the end for people to sort of understand what it is.

Now when you talk about all this, and Charlene you talk about standards and
making sure that everybody is on the right standard at the right moment and
I’m of course thinking back in the old days of ICD changes, CPT changes
and updates and all of this stuff. Are you, and I’m looking at you,
Charlene on this one, you weren’t in any way describing filtering or
standards that are part of core functionality to make sure that all the
standards are implemented inside the package at the point that something is
shot through the NHIN, or are you?

MS. UNDERWOOD: No, I think the intelligence in terms of what’s sent and
what’s received really has to be managed at the edge so we need to know in
the package what it is, how I accept that will be a function of how intelligent
I am at that point in time. So for instance in the use of the medical summary standard, it’s got the flexibility for me to just take the document and look at it, and/or interpret the data, but if I’m not too smart I’ll just take the document. The standards are now advancing so that they’ve got some levels within them that make my ability to embed them more evolutionary, so I can do that depending on how intelligent I am. But the intelligence in terms of how to send them needs to be at the edge, as well as how to import them needs to be at the edge. And then as the edge becomes smarter in terms of storing things in a consistent way we’ll be able to send things semantically accurately across that transport. So those edges are going to get smarter and more consistent, so that with the transport layer in between staying common, the data sharing will evolve over time.

DR. COHN: Okay, just to make sure I understand because I think you’re
agreeing with what I’m sort of thinking here is that, obviously
there’s the secure layer, you need to address and understand what the
address is and some other pieces in this wrapper but it’s really, and
that’s the responsibility in your view of the NHIN and it’s really
sort of the edge system that sort of dings, the contents of this are gibberish
and sending them back to the other edge system as opposed to there being some
sort of a quality review or screen at the NHIN at the beginning saying you
can’t even send this, this is not the right code set or standard or
whatever for the moment, except for sort of minimal surrounding sort of package
things. Is that, am I getting that right?

MR. FERGUSON: That’s exactly right, thank you.
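The division of labor Dr. Cohn and Mr. Ferguson agree on here, a thin core that handles only the envelope while the edges validate content, might be sketched as follows; the field names, registry, and payload format are illustrative assumptions, not anything from the NHIN specifications:

```python
# Sketch: a "thin" core that reads only the envelope (addressing,
# security metadata) and never inspects the payload; content
# validation happens only at the receiving edge.
import json

def core_route(package, registry):
    """Core function: authenticate the sender, resolve the destination,
    and forward the payload as an opaque, unaltered byte string."""
    envelope = package["envelope"]
    if envelope["sender"] not in registry:
        raise PermissionError("unregistered edge system")
    if envelope["destination"] not in registry:
        raise LookupError("unknown destination")
    return envelope["destination"], package["payload"]  # untouched

def edge_receive(payload):
    """Edge function: only here is the content opened and validated;
    the edge, not the core, rejects gibberish back to the sender."""
    try:
        return json.loads(payload)
    except json.JSONDecodeError:
        raise ValueError("content rejected by receiving edge")

registry = {"lab-A", "clinic-B"}
dest, payload = core_route(
    {"envelope": {"sender": "lab-A", "destination": "clinic-B"},
     "payload": b'{"glucose": 5.4}'},
    registry,
)
print(dest, edge_receive(payload))
```

The design choice is that the core never parses the payload, so edge systems can change content standards and versions without touching the transport layer.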

DR. COHN: Okay, thank you. John Paul?

MR. HOUSTON: I had a brief question for Don. You indicated that completeness of the record was one of the three minimum but essential domains you thought needed to be worked. And one of the concerns I have is that completeness in my mind really is a utopia, I don’t think we’re ever going to have complete
information, or there’s going to be a high probability that when you go
out and use NHIN to get information you’re not going to get complete
information but rather I think it’s probably more likely that you need to
know what information you do have versus maybe what holes may exist in terms of
where information didn’t come from and maybe where it was expected.
What’s your thoughts in general on completeness of the record as well as
maybe some of the ways to address that?

DR. MON: I perhaps used that phrase a little bit too glibly because I think
what you just described is the latter, in your latter statements, was what we
were shooting for and it basically corresponds to what Jason was suggesting as
well. We agree that sending health information over the NHIN or even if
it’s just a point to point exchange that you may never get the complete
record. But what is important is to know what you got and that what I sent you
is all I’m going to send you and that’s all I have to send you, and
so having that kind of indication is what is useful at the NHIN level.

Is that a little clearer —

MR. HOUSTON: I just wanted to make sure I understood that, that’s
clearer, thank you.

DR. COHN: Other questions or comments from the committee? Okay, well I want
to thank all of you for what I think has been a very interesting and useful
panel, so with that we are going to take a 15 minute break, we will come back
at five minutes to 10:00, actually five minutes to 11:00, and we’ll
continue on.

[Brief break.]

Agenda Item: Panel VI: Biosurveillance – Association of
State and Territorial Public Health Organizations

DR. COHN: We’re going to get started again if we can, our next panel is
focused on biosurveillance issues and functional requirements and we’re
very pleased to have three presenters, our first is Lawrence Hanrahan
who’s from the Association of State and Territorial Public Health
Organizations, and obviously I want to thank you for joining us. Our second
presenter will be Tim Morris from CDC and then we finish up with Floyd
Eisenberg and obviously thank you from HITSP focused on biosurveillance use
cases. So obviously we’re pleased to have the three of you here, Lawrence
I think you’re going to be leading off so please.

DR. HANRAHAN: Well good morning, everyone, I want to begin by thanking the
committee members for inviting ASTHO to provide a public health perspective on
what technical functions and services public health would need from a
Nationwide Health Information Network.

My name is Larry Hanrahan and I am director of public health informatics for
the Wisconsin Department of Health and Family Services in the Division of
Public Health. I’m here today to testify on behalf of the Association of
State and Territorial Health Officials, I serve on ASTHOs Public Health
Informatics Policy Committee whose mission is to provide state based leadership
and policy recommendations to strengthen the ability of ASTHO members, the
state health officials, and partners to use informatics in making key public
health decisions.

Let me begin by going into a quick background into what services public
health provides in order to give you a better idea of what we will need from
the NHIN.

Public health is not just field agents swooping in after an anthrax
attack or the reporting of infectious disease agents like HIV or E. coli.
Public health is prevention, protection, promotion, response, and services.
Public health agencies use data to identify the most pressing health needs so
that the appropriate services or policy interventions can be made to improve
the nation’s health.

Many of you are well aware of the ten essential public health services and I
won’t go through them all today, however, I would like to stress that
these must all be kept in mind when we talk about what public health needs from
an NHIN.

So in terms of public health services that can benefit from NHIN and what
services they would need let’s start with the clinical services that
public health provides to the millions of patients each year. Public health
provides clinical care just like any other care delivery organization, we run
hospitals and clinics, we have laboratories and pharmacies, and therefore these
public health services would need for NHIN to provide the same functional
requirements and services to them as they would to any other care delivery organization.

Public health however also provides non-clinical services. These can be
broken down into routine and emergency with routine encompassing the reporting
of notifiable diseases as well as surveillance of non-notifiable diseases and
conditions such as injury and newborn screening, and systems that we have even
in the title of your committee like vital statistics. Emergency services
encompass the surveillance and response to natural and manmade events such as
hurricanes or the release of bioterrorism agents.

With these services in mind here are the critical functions and services
that public health would need from NHIN.

Auditing and logging, in terms of a core NHIN function auditing and logging
would be necessary to keep track of which edge systems, whether it be public
health agencies, clinical providers, laboratories, etc., are connecting and
disconnecting from the system.

Additionally, any edge system that is pushing or pulling information to
another system needs to keep a record of what was sent, when, and to whom as
well as records on what information has been received.
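As a rough illustration of the logging described above, assuming a simple made-up schema (the event names and fields are not from any PHIN or NHIN specification):

```python
# Sketch: an audit record of which edge systems connected or
# disconnected, and what each one sent, when, and to whom.
# The schema is illustrative.
from datetime import datetime, timezone

audit_log = []

def log_event(event, system, detail=""):
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "event": event,    # e.g. "connect", "disconnect", "send", "receive"
        "system": system,  # the edge system involved
        "detail": detail,  # e.g. what was sent and to whom
    })

log_event("connect", "state-health-dept")
log_event("send", "hospital-lab", detail="lab result -> state-health-dept")
log_event("disconnect", "state-health-dept")

# Reconstruct one system's activity from the log:
activity = [e["event"] for e in audit_log if e["system"] == "state-health-dept"]
print(activity)  # ['connect', 'disconnect']
```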

Authentication and authorization, many of the proposed functional
requirements identify the need to authenticate and authorize all public health
agencies that would need to connect to NHIN and in previous testimony there has
been mention of an edge system registry as a core function. We completely agree
with this and feel that anyone who will be adding, viewing, or editing data in
the system will need to be authenticated and authorized registered users in
order to ensure legitimacy of the health information. The data analysis that
public health provides back to the clinical sector is only as good as the data
that we receive, therefore authorization and authentication of users is a
critical component of the system.

The function of the core system would be to authenticate and authorize each
edge system that is requesting to connect to the NHIN.

However it would be a function of the edge system, for example the public
health agency, to oversee the authentication and authorization of specific
individuals within the agency that would have access, and at what specific
level of granularity, to the NHIN.
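The two-level model described here, with the core authenticating edge systems against a registry and each edge system authorizing its own individual users, might be sketched as follows; the system names, credentials, and classes are hypothetical:

```python
# Sketch: core authenticates edge systems; each edge system manages
# its own users. All names and credentials are made up.

CORE_REGISTRY = {"wisconsin-dph": "secret-credential"}  # edge-system registry

def core_authenticate(system, credential):
    """Core function: is this edge system registered with valid credentials?"""
    return CORE_REGISTRY.get(system) == credential

class EdgeSystem:
    """Edge function: oversees which of its own users may act through it."""
    def __init__(self, name, credential):
        self.name, self.credential = name, credential
        self.users = {}  # user -> authorized?

    def authorize_user(self, user):
        self.users[user] = True

    def request(self, user):
        if not core_authenticate(self.name, self.credential):
            return "rejected by core"
        if not self.users.get(user):
            return "rejected by edge"
        return "granted"

dph = EdgeSystem("wisconsin-dph", "secret-credential")
dph.authorize_user("epidemiologist-1")
print(dph.request("epidemiologist-1"))  # granted
print(dph.request("stranger"))          # rejected by edge
```

The split keeps the core’s registry small (one entry per edge system) while the agency itself decides which individuals have access and at what granularity.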

Data access and update, public health is going to need to have access to the
data on multiple levels. For example in order to use NHIN to enhance the
disease reporting system we currently have, public health is going to have to
have access to identifiable data in order to ensure proper follow-up. Likewise
in order to use NHIN for public health surveillance we will need to have access
to anonymized data which if need be can be re-identified for follow-up.

Likewise different public health agencies will need access to different
health information depending on geographical and jurisdictional responsibility.
Role based access control that is flexible and scalable will be necessary in
order to manage multiple users with multiple access privileges. We see
establishing a role based access control as a core function.
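A minimal sketch of the kind of role based access control described, assuming illustrative roles, jurisdictions, and granularity levels:

```python
# Sketch: role-based access control flexible enough for multiple
# agencies with different jurisdictions and data granularity.
# Roles, levels, and jurisdictions here are illustrative.

LEVELS = ["anonymized", "identifiable"]  # ordered least to most sensitive

ROLES = {
    "case-investigator":    {"level": "identifiable", "jurisdiction": "WI"},
    "surveillance-analyst": {"level": "anonymized",   "jurisdiction": "WI"},
}

def can_access(role, requested_level, jurisdiction):
    """Grant access only up to the role's level, within its jurisdiction."""
    grant = ROLES.get(role)
    if grant is None or grant["jurisdiction"] != jurisdiction:
        return False
    return LEVELS.index(requested_level) <= LEVELS.index(grant["level"])

print(can_access("case-investigator", "identifiable", "WI"))     # True
print(can_access("surveillance-analyst", "identifiable", "WI"))  # False
print(can_access("surveillance-analyst", "anonymized", "MN"))    # False
```

New roles or jurisdictions are added as table entries rather than code changes, which is what makes the scheme flexible and scalable across many agencies.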

Data mapping and translation, we cannot emphasize enough the need for the
use of common terminology or the capability of mapping in a standardized format
in order to ensure that multiple edge users mean the same thing when they use
the same word.
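One way to picture the mapping capability described is a table that resolves different local codes to one shared standard concept; the entries below are illustrative, LOINC-style placeholders rather than an endorsed code set:

```python
# Sketch: mapping local lab codes to a shared standard vocabulary so
# that two edge systems mean the same thing when they use the same
# word. The mapping entries are illustrative placeholders.

LOCAL_TO_STANDARD = {
    ("hospital-lab", "GLU"):     "2345-7",  # hypothetical LOINC-style target
    ("clinic-lab",   "GLUCOSE"): "2345-7",
}

def normalize(source_system, local_code):
    """Translate a source system's local code into the standard code."""
    try:
        return LOCAL_TO_STANDARD[(source_system, local_code)]
    except KeyError:
        raise ValueError(f"no standard mapping for {local_code!r} from {source_system}")

# Two different local codes resolve to one shared concept:
print(normalize("hospital-lab", "GLU") == normalize("clinic-lab", "GLUCOSE"))  # True
```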

Data quality and data integrity, data quality and integrity should be
primarily a function of the edge system, however there can be a role for the
core as a backup to the system to ensure that data degradation doesn’t
occur in transit.

Data push and pull, data retrieval and transmission capabilities would be an
essential element for public health in utilizing NHIN. A bi-directional data
flow with public health edge systems and other edge systems will be beneficial
for all NHIN users. When following up with notifiable disease submissions
public health will inevitably need to make requests for additional information
from the submitting care delivery organizations. For early detection or other
biosurveillance purposes public health would need to access anonymized
aggregated data that can be queried and will give public health officials real
time illness data.

Public health also wants to give back into this system and one of the ways
we envision giving back is to provide data to other edge users that would be
useful, if not critical, in the treatment of patients. For example we could
give feedback on the patterns of antibiotic or antiviral resistance to the
medical community, we can give population health data on the screening
completeness for environmental conditions like detection of lead elevations in
our children, look at the geographic variation in stage at diagnosis for cancer
and give that back to practitioners, and also give feedback on the course and
spread of outbreaks.

We spend a lot of time collecting data and often adjust for the fact that it
represents a small sample from the population. Having access to a large set of
population data will allow us to prevent disease and disability, instead of
just measuring it. In turn we can provide information as a service back out to
the clinicians in a timely manner so that they can make informed decisions
about a patient’s treatment or prevention measures.

Identity and information correlation, this is something that’s really
key to us, being able to correlate information from multiple entities for one
patient is necessary for information that is shared with public health. In
order to effectively measure incidence and prevalence all the data for a single
case of disease must be linked to ensure that one case of shigellosis is counted
only once and not multiple times because there was information indicating a
case by the ER care delivery organization, the personal physician care delivery
organization, the hospital laboratory, the private laboratory, and even the
state public health testing laboratories.

Likewise if additional information on a patient is entered into any one of
the multiple edge systems, including any public health follow-up, there needs
to be a way to link that information with all the other information on that
case and to that specific patient.
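The de-duplication Dr. Hanrahan describes, counting one shigellosis case once despite reports from several edge systems, can be sketched by linking on a shared patient key; real correlation would need probabilistic record linkage, and all identifiers here are made up:

```python
# Sketch: correlating reports of the same case from multiple edge
# systems so one shigellosis case is counted once, not once per
# reporting source. Patient keys and sources are illustrative.

reports = [
    {"patient": "P123", "condition": "shigellosis", "source": "ER"},
    {"patient": "P123", "condition": "shigellosis", "source": "private-lab"},
    {"patient": "P123", "condition": "shigellosis", "source": "state-lab"},
    {"patient": "P999", "condition": "shigellosis", "source": "ER"},
]

def count_cases(reports):
    """Count distinct (patient, condition) pairs, not raw reports."""
    return len({(r["patient"], r["condition"]) for r in reports})

print(len(reports), "reports ->", count_cases(reports), "cases")  # 4 reports -> 2 cases
```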

Record location, record location as a core function would be to provide
assistance to the edge systems in locating records.

And finally transient data, as a core function transient data within the
core would be for the purpose of delivering the information from one edge
system to another.

So that’s our 15 minutes of fame, I hope we get a shot at a little bit
more and I know we will, I see many friendly faces and colleagues here at the
table, it is an honor for us to be here.

But in conclusion I would like to state that public health offers many
services that will benefit from the bi-directional information flow from the
NHIN ultimately improving public health services and clinical care. As the NHIN
comes to fruition public health requirements should be incorporated in all
aspects of the NHIN not just biosurveillance. ASTHO stands ready to fully
participate in these efforts.

And I thank you again for this opportunity to present this testimony to the committee.

DR. COHN: Lawrence, thank you very much. Now we have copies of your
overhead, I don’t think we have actually a copy of your testimony, is that
something we can —

DR. HANRAHAN: We’re going to be sending that to you.

DR. COHN: Okay, great, we appreciate that very much.

Our next presenter is Tim Morris, Tim as I understand you’re division
director from Division of Shared Services from National Center for Public
Health Informatics. So obviously thank you very much for joining us
representing CDC in these conversations. Please.

MR. MORRIS: Thank you for the introduction and thank you for the opportunity
to provide testimony today. What I’m going to cover is the linkages
between Public Health Information Network, or PHIN, and the Nationwide Health
Information Network, or the NHIN, as we’ve heard in many presentations today.
As was stated my name is Tim Morris, I’m the director of the Division of
Informatics Shared Services in the Center for Public Health Informatics at CDC,
I’ve been involved in work directly related to the Public Health
Information Network since its inception around 2002 and have been project lead
on several projects related to PHIN implementation.

Today I’m going to talk about the PHIN purpose, some functions and
components and show how they relate both directly and indirectly to the
biosurveillance use case and the NHIN functional categories and core
components. I will also show how data and information from both traditional and
non-traditional public health partners can be combined with data from the
clinical sector through connection of PHIN and NHIN to provide a rich source of
information for both public health and clinical sectors.

So again to cover overview of the PHIN purpose, I won’t go into too
much detail but the purpose is to improve the capacity of public health to use
and exchange information electronically by promoting the use of standards,
technical specifications, some of the priority functions, workforce competencies, collaboration and data sharing, and strengthening, which is key, strengthening the use and exchange to be robust and flexible enough in routine times so that it can accommodate an emergency at scale.

When we look at PHIN, it’s been categorized in a few different ways in the
past as it has evolved, most recently the PHIN preparedness requirements
defined functional areas for systems related to emergency preparedness and
response and specific specifications for interoperability amongst these
systems. Today I’m going to provide another categorization of functional
areas that are a bit broader, to include public health monitoring, public health
intervention, prevention, communications and alerting, and then touch on some
of the common services and components which really are the connection between
the Public Health Information Network and the NHIN.

Public health monitoring, I won’t go into too much detail on these but
again inside of this is what a lot of the biosurveillance use case is about, such as
some of the early event detection, situational awareness, but then you get on
to things with notifiable condition reporting, environmental health monitoring,
and many other types of monitoring and surveillance throughout public health.

Public health intervention is pretty much exactly as it states, functions
include public health investigation and response, environmental assessments,
outbreak management, some of the resource utilization again related to response
and directly related to the biosurveillance use case, inventory allocation, and
then response management and follow-up.

Public health communications is a bit more diffuse but certainly there are
some concrete things here with health alerts, decision support, secure
communications and collaboration as well as public communications that would be
targeted at specific audiences while not being extremely secure, health risk
communications, promotion of health practice, training, and you can see each
one of these has different example systems that would be part of this.

Prevention, again public health prevention, and we can’t forget this one
because it is the last word in the title for the CDC. The functions here
include vaccination campaigns, wellness programs, smoking/drug cessation,
violence prevention, and other types of prevention that relate to some of
the interventions but are distinct.

And again, when you look at the different information flows that might occur
across the PHIN, you can look at the different organizations and entities that
might be involved. Some of the traditional partners would be public health
labs and some of the state and local health departments; law enforcement has
become more of a player since 9/11 for a lot of what we do with different
types of testing in criminal investigations. But again there are a lot of
sources that are non-traditional elsewhere but may be traditional to public
health, looking at HIM(?) and RAD(?), or at vector control for something like
West Nile Virus, or the stockpile for pharmaceuticals. There’s a lot of
information that can be overlaid with clinical information from the NHIN to
provide a good picture of situational awareness as well as surveillance during
routine times.

So the PHIN systems and standards, this is where we get to the connection to
the NHIN, and certainly at the top of this listing is the functional
requirements. Again, we do have functional requirements related to
preparedness with the PHIN now, and there’s work going on to broaden those to
the broader categories we discussed before. Again, the guidelines for
workforce competencies: we need staff in the field who are educated in
informatics and information technology so that we can build these systems to
interoperate. Guidelines on policy and operations for information systems:
again, PHIN is a set of applications delivered from CDC but also those partner
or companion systems that are in the state and local health departments and in
other agencies.

And the next few items: looking at vocabulary standards, technical
specifications for message exchange and data transport, and specifications for
directory exchange, which is not one that’s listed in some of the core
components for the NHIN but would be an important one to look at for routing
of messages and other types of things. Technical specifications for alerting:
again, I’ll talk in a bit about what the PHIN or public health can give back
to the clinical sector through notifications and alerts, and there are
standards related to those areas that CDC and the PHIN have been working with
for some time.

Security services, this is certainly covered in the core components for the
NHIN. This is not uniform across the country; certainly at the federal level
we have certain guidelines that we must abide by under federal law. If you
look at PKI, is it implemented consistently, with the same certificate
authorities and other things? Authorization is different, so this is a tough
one, but it’s certainly something that needs to be done. Is it federated, and
how do we implement that? It’s going to be an interesting one to look at, but
certainly a component of this.

So again, PHIN and NHIN, what can be shared: I think this is the edge
connection of the many PHIN systems to the NHIN, again around the standards
and specifications that have been specified for the NHIN. There is a lot of
really good correlation between the standards that have been selected for the
NHIN and what PHIN has selected as well, but there will be some
synchronization needed there as we move forward.

This relates to the NCVHS template, the diagram that we were sent; this is
just a different representation of it where you see the two bubbles with PHIN
and NHIN represented, and the key is that in the center are those core
components for connection where data and information are exchanged. And then
you see the edge systems for each around the outside. There is some overlap
here; certainly there are hospitals and clinics and laboratories in public
health that do clinical care, but this is just to demonstrate some of the
different organizations and systems that might exist.

A key component of this I think is delivery of both clinical information and
possibly combined with public health information, delivery to consumers —

MR. BLAIR: As you’re describing this, since I can’t see I’m
just going to ask a little bit: could you describe to me exactly where, it may
be visible on the picture, the PHIN network and the NHIN network interface, at
what point? Is that shown on your diagram?

MR. MORRIS: It’s shown in two bubbles that have overlap, and I think
what is in the overlap area is the core components —

MR. BLAIR: Could you tell me what those are?

MR. MORRIS: The standards. When you look at the NHIN components from the
NCVHS template, if you look at terminology standards, messaging standards,
transport standards and the semantics around those types of things,
that’s where the overlap and connectivity will be. And again, the edge
systems that would be on the PHIN are operating with a similar set of
standards to those being recommended for the NHIN, so I think there’s a
good correlation there.

I can come back to it in questions if we want to cover that some more. I
apologize for not describing the diagram better.

And related to this I wanted to cover some different types of interaction
scenarios between PHIN and NHIN, some traditional as well as some that are
pretty unique to public health, where we need information combined from many
sources. West Nile Virus is a good example of this, where you have
information sources, which could be animal disease data, vector control data,
and possibly case reports, sourced from the PHIN and its partners, and then
human clinical data from the NHIN. Certainly this has been done in the past,
many of these things are done, but it’s difficult and expensive to do these
interfaces now and a lot of things are on paper. So the better data we
can get from all of these sources, the better we’re going to be at
prevention and guidelines, and hopefully at reducing the amount of disease. So
if you look at this scenario with West Nile, the dissemination of information
could be prevention guidelines, alerts and notifications to both PHIN and
NHIN, and many other things.

The second scenario would be air quality and asthma. Currently the
Environmental Health Tracking Network at CDC is working with the EPA and the
local EPDs to look at air quality data, doing a pilot to overlay that with
asthma data, to look at rates of asthma and clusters and different ways to do
research on the data and provide guidelines. What would be the outputs of
this? Certainly I think the EPA would be interested in evidence of regulatory
control measure success, that if we can make these correlations, these
regulations are helping to increase air quality and at the same time reduce
related illness; community health research in general; and air quality
notifications to clinical care back through the NHIN.

So in closing I’d just like to make a few recommendations and speak a
little to some more generalized pieces. Members of the clinical care and
public health communities have seen the benefit of a connection of information
systems in the two sectors, but I hope I’ve given you a picture of what
this could become in the future as real integration is realized. While some of
the interactions I’ve described occur today, many are manual, and those that
are electronic require large investments of both time and money due to the
lack of standards and interoperability specifications across both the clinical
and public health sectors. I would like to offer a few recommendations
for consideration in the biosurveillance use case and NHIN functional
requirements.

First, certainly for PHIN, and there are a lot of specifications,
implementation guides, and terminology put together for the PHIN, we need to
synchronize those with what is coming out as recommendations for the NHIN and
the functional categories.

Number two, I think we need to extend the biosurveillance use case to include
data flows from PHIN to NHIN. That may not be specifically what it was
initially meant for, but I think there are important flows for emergency
response and for routine times during different types of outbreaks:
information that could be provided back on disease spread in a pandemic, for
targeting countermeasures or interventions to specific areas based on disease
spread, and certainly many other guidelines. So I think those need to be
explored and built out.

As well, we need to extend the biosurveillance use case to include
requirements for routine surveillance needs, if not in the use case itself
then certainly in the data elements that we are going to collect as part of
biosurveillance. I think you need to think about what you need to do with the
data: if you’re going to look at environmental conditions, if you want to
look at overlaying that with different things for animal health or zoonosis,
then I think we need to look at what those data elements are ahead of time,
otherwise we’ll be catching up afterwards.

And again, work with some additional partners, some that we don’t
usually consider in public health, who could get us some of those other
elements and how they might be used, and include them in this conversation as
well, either through the partners that we have on the committees and
workgroups or directly; I’d like to make that recommendation as well.

Thank you for your time and attention and I appreciate the opportunity to
provide testimony today. Thank you.

DR. COHN: Thank you very much. Obviously we’ll take questions and
conversation after we’ve had a chance for three presentations. Our final
presenter is Floyd Eisenberg; obviously, thank you very much for joining us.

DR. EISENBERG: Thank you very much and on behalf of the HITSP
Biosurveillance Technical Committee I thank you for the opportunity to present.
It’s nice we have an interoperable machine —

What I will be doing is starting the presentation in your handout on
slide 28; as requested this morning I will not be repeating the prior HITSP
slides that you’ve seen. In your testimony it starts on page 11 for
biosurveillance. I won’t be reading directly from the testimony;
there’ll be a lot of comments that sound like I’m reading directly, but
I’m not.

The first thing I think is important to state, and I actually appreciate very
much being at this point in the schedule, coming after ASTHO and CDC, is that
as our technical committee began its discussions and deliberations there was a
lot of discussion that this is the beginning of a journey. We’re looking at
starting to set down some infrastructure to meet for now this biosurveillance
use case as defined, but an infrastructure that perhaps, hopefully, will lead
us toward population health. We’ve had feedback from some of the HITSP panel
members, why don’t we call it population health, and the reason is that it is
limited to start with, but we’re being very careful to try to make sure the
standards we choose support that. I won’t be telling you which standards
today, because we’re in the process of finishing up what will be in the
specifications, as I think you heard yesterday, but I will be talking about
some of the significant issues.

As a brief overview, our use case moves from clinical care in multiple
settings through the patient encounter and the lab, and now in addition to lab
results we have orders, we have radiology results and orders to deal with, as
well as resources, mainly the utilization of hospital beds, and getting that
information to the BIS, or Biosurveillance Information System. The challenge
is that there are different levels of interoperability: whether we’re
talking about an EHR, ambulatory or hospital, we’re still dealing with, do I
send something to local public health in one format, regional public health in
another format, and federal in another. So hopefully this will help inform
our interoperability specification, so that no matter which or how many
systems an edge sends to, it can transmit the data in the same format and the
data can be reused.

So that’s our intent and how we’re looking at this, along with the issue
of anonymization or pseudonymization so that information can be re-identified
as necessary. We’re really looking at this in two formats, which we actually
have listed as functional scenarios: message based data submission, which is
more likely to be used from the hospital, in which data will be coming at
different times and utilization is not necessarily going to fit a document
based submission; and submission from an ambulatory setting, which, aligning
best with the work on the EHR, may well be document based, perhaps a document
share and queries from there. So we were looking at two different scenarios of
sending information, as well as resource utilization as another scenario.

In keeping with biosurveillance being a secondary use system, different from
the other technical committees, we wanted to keep in context the clinical
exemplars, the clinical scenarios, of pandemic influenza and bioterrorism,
using anthrax as an example, and through our committee it was agreed to look
at a sexually transmitted disease as routine surveillance, to show through our
specification how it can be used for these purposes, with the primary reason
here being identification for situational awareness.

I do not intend to go through all the details on this slide of the message
based data submission, but just to explain that as we went through the use
case we broke all steps along the way into what we at that time called
building blocks, as did the other technical committees, based on what required
new work, what was existing, and what was a current effort, and then took
those down to the next level, just a reminder of process which I believe you
heard yesterday, to our constructs, which are transaction packages,
transactions and components. In addition, in working with the other technical
committees we have, and this slide is one example, constructs specific to
biosurveillance, constructs that are common to biosurveillance and EHR, and
constructs common to all three.

So we are working with the other technical committees to provide that. The
reason I say this is a draft is that it’s already changed since yesterday, so
one of these boxes, managing consistent time across enterprises, is now a
common one rather than a unique one. But the point is that they are now
developed into constructs for which specifications will be written, and
documents will be available on September 29.

Basically, as far as requirements for the core system, there are specific
requirements. Again, we are as a technical committee in agreement with a thin
core system, basically providing methodology for secure communication channels
and node identification for managing machine credentials, understanding that
biosurveillance is not a human sending data from an organization to an
organization, but machines, devices, or computer systems sending information,
not necessarily a request tied to an individual. So in this case it is not
necessarily the individual who was responsible for the information but where
it came from as far as the system; that’s what is meant by machine. So there
is a requirement for the edge system to have policies on issuing those
credentials.

Also, the ability for the edge systems to request filtering: not for the NHIN
to do any filtering, but in cases of a large scale event where public health
needs to request additional data, or a modification of how data is sent to
public health, there needs to be a method to request that, and that method
needs to be a standard method. The filtering itself is an edge system function
that I’ll get to on the next slide, but the ability to ask for changes is
what we’re looking for here.

The challenge at this point is that, because of the issues of data
transformation at each edge system, and you heard that in the morning
discussion on the EHR, we’re dealing with a minimal dataset, to be able to
manage public health surveillance in the near term, not what’s necessary for
the long term or what we all would like. There will be some challenges if
there’s a request to change the filtering and there’s no standardization on
the new elements, so we understand that’s going to be a challenge, but
we’re dealing with the minimal dataset to begin with.

The other area where we’re making a request of the NHIN is to manage related
networks. For example, there are networks today for emergency responders to
understand hospital utilization and emergency department availability.
We’re being asked through our use case to provide similar but not as
complete information to public health, and if there are multiple methods and
we’re creating new groups to which information needs to be sent, what NHIN
could help with is how to coordinate multiple networks, rather than having
multiple networks that don’t coordinate.

Another example, presented in the slide by CDC this morning, of course was
law enforcement in addition to emergency responders.

The other area for the NHIN here is to support a communication environment
for secondary views of data, whether it’s public health or researchers. We
clearly see this as a biosurveillance use case; in the long term we see this
also as the potential for sending information from the EHR for pay for
performance, CMS, and other measures, and so there needs to be consideration
for how other secondary users of data will manage.

As far as requirements for edge systems on interoperability, they are
basically to support the three functional scenarios, to manage the message,
document, and resource submissions; a methodology for data communication, for
secure point to point communication and shared documents and resources; and
high level, crude filtering so that each edge could filter. In this case, on
the terminology side, we would leave it to the local implementation to decide
whether the sending edge or the receiving edge does the data mapping. There
are benefits and challenges to each solution, but rather than specify the
architecture it was agreed among the committees that we may be better off
allowing the local implementation to decide which edge does that, to see which
works best for that organization; but the filtering occurs at the edge.
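The edge filtering described above, reducing each outbound record to an agreed
minimal dataset and accepting a standard request from public health to change
what is sent, could be sketched roughly as follows. This is a minimal
illustrative sketch, not the HITSP specification; all field names here are
hypothetical.

```python
# Hypothetical sketch of edge-system filtering: the sending edge reduces each
# outbound record to a minimal dataset before transmission, and public health
# can request a change to that set through a standard operation.
# Field names are illustrative, not drawn from any actual specification.

MINIMAL_DATASET = {"pseudonym", "visit_date", "zip3", "chief_complaint", "diagnosis_code"}

def filter_record(record: dict, allowed_fields=MINIMAL_DATASET) -> dict:
    """Drop every field not in the allowed set (high-level, crude filtering)."""
    return {k: v for k, v in record.items() if k in allowed_fields}

def apply_filter_request(allowed_fields: set, added: set, removed: set) -> set:
    """Apply a public health request to modify what the edge sends."""
    return (allowed_fields | added) - removed
```

For example, during a large scale event public health might broaden the set
with `apply_filter_request(MINIMAL_DATASET, {"travel_history"}, set())`;
whether the new element is standardized remains the open challenge the
testimony notes.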

Associated services would include terminology services, security, and again
pseudonymization services. Some of these services are still to be developed;
we will define standards for them, but some services, such as
pseudonymization, may or may not exist in certain locations —

MR. BLAIR: What is that word?

DR. EISENBERG: I’m sorry? Pseudonymization, which through the work with
the committee I understand is what an ISO standard calls it —

MR. BLAIR: What does it mean?

DR. EISENBERG: It means basically providing an alternate ID so that the
service could correlate data on the same patient. The issue raised by, I
forget whether it was CDC or ASTHO, was that the same patient seen in one
ambulatory setting and another and the hospital would all be rolled up into
the same patient, but the receiving end in public health would only know it as
an unknown patient, while still knowing that all that data relates to the one
patient.
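The idea described here, a stable alternate ID that lets the receiver
correlate encounters for one patient without identifying that patient, can be
sketched with a keyed hash. This is a minimal illustration of the concept, not
the ISO standard or any specific BioSense mechanism; it assumes the key is
held only at the data source, so re-identification requires going back to
that source.

```python
import hashlib
import hmac

def pseudonymize(patient_id: str, source_key: bytes) -> str:
    """Derive a stable alternate ID from a real patient identifier.

    The same patient always maps to the same pseudonym, so the receiving
    public health system can roll up encounters across settings; without the
    key (held only at the data source) the pseudonym cannot be reversed.
    """
    return hmac.new(source_key, patient_id.encode(), hashlib.sha256).hexdigest()
```

Under this sketch, the same medical record number seen in an ambulatory
setting and the hospital yields the same pseudonym at the receiving end, while
any reach back to identified information stays with the key holder.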

Another issue is timely data flow management, which in our use case was an
issue; we’ve heard anywhere from transmission every 15 minutes to, at the
least frequent, daily, so management of the data flow in that near real time
as required would be an edge system function.
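The timing requirement above, anywhere from every 15 minutes to daily,
amounts to the edge system batching records and flushing them on a configured
interval. A minimal sketch, with the transmit callback and interval as
assumptions standing in for whatever transport the specification selects:

```python
import time

class BatchedSender:
    """Edge-system sketch: accumulate records, flush on a configured interval."""

    def __init__(self, transmit, interval_seconds: float):
        self.transmit = transmit          # callable taking a list of records
        self.interval = interval_seconds  # e.g. 900 for every 15 minutes
        self.pending = []
        self.last_flush = time.monotonic()

    def submit(self, record) -> None:
        """Queue a record; flush if the configured interval has elapsed."""
        self.pending.append(record)
        if time.monotonic() - self.last_flush >= self.interval:
            self.flush()

    def flush(self) -> None:
        """Transmit everything pending and reset the interval clock."""
        if self.pending:
            self.transmit(self.pending)
            self.pending = []
        self.last_flush = time.monotonic()
```

Standardizing the interval, as the testimony later suggests, would let each
sending edge manage the flow out of a data warehouse or the operational
system, whichever fits that organization.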

And management of large volumes of data: one of the concerns that came up in
a number of our discussions was that an excess of data at either end, and
especially at the public health end, could be overwhelming considering
resources. I think there could be excess at the other end as well, and there
are legal and policy implications that do need to be addressed as far as
having information but not being able to respond to it; that needs to be
carefully evaluated from a legal and policy standpoint, as well as workflow at
the edge. Those are not IT specific issues, but they need to be addressed to
make it work.

In general, the other comment the technical committee wanted to make was that
for syndromic surveillance and situational awareness, aggregate monitoring
does happen today; it happens successfully in isolated sites, in groups and in
regional areas, successfully monitoring bed utilization and the types of
conditions for which patients are seen in emergency departments. So rather
than looking at this as a greenfield, we can learn from that experience;
HITSP can certainly enhance the process through standardization, and we hope
to do that.

The other significant issue is that public health is probably more similar to
clinical health care than it is different. In addition to the aggregate
monitoring there is the need to get back to the individual level for case
identification, contact management, and further epidemiologic information
collection, and that’s where the anonymization and pseudonymization, or
whatever word you put to that, comes in.

Requirements for edge and core systems do inform and are informed by
architecture and I think it needs to be a collaborative discussion over
architecture, architecture cannot be ignored as we look at the methodology for
biosurveillance, just as a comment.

The other comment here is that many of the standards development organization
efforts are working towards some harmonization. We have a number of SDO
representatives on our team, and they are also a moving target; there are some
standards that are very close to approval but not approved, and it’s hard to
say we can’t look at those when we’re trying to come up with solutions, but
our task is to be careful to take only those that are already approved. So at
the end of our deliverable we will be providing a roadmap as well as a list of
exactly what can be done.

So I want to thank the committee for allowing us and the technical committee
to present, thank you very much.

DR. COHN: I want to thank the three of you for what I think has been a very
interesting set of discussions. I know John Paul already indicated to me that
he had wanted to have the first question, so please.

MR. HOUSTON: Thank you. I know Dr. Eisenberg spoke about it a little bit at
the end, but I didn’t really hear much discussion about it in the other
two presentations, and so in the context of biosurveillance: I’ve heard a
common concern from privacy advocates with regard to the release of
identifiable information in the context of biosurveillance, especially without
an authorization. Now I understand that clearly there are cases where there is
mandatory government reporting; we know that exists and it’s necessary.
But I get the sense, and I apologize if I’m going to say this the wrong way,
that people see this almost as an opportunity for a land grab to get
additional information, and again I think in the context of people’s
concerns over public health accessing identifiable information without
authorization, it’s a concern that I’ve heard put in that context.

So I guess my question is, I’d like to hear from you generally what your
thoughts are on this, as well as the role of some type of an authorization
scheme to release identifiable information for biosurveillance purposes.

DR. EISENBERG: Let me respond from the perspective of the HITSP Technical
Committee. Given the timeframe that we’re working with, we are addressing
some of the security issues, but some of them are being handled separately
from our own technical committee. What I can say is that the committee has
been very concerned about the issue of privacy and reuse of data with or
without permission. We in our constructs have been very careful to say
release, and I’m trying to remember exactly how we worded one of these,
it’s not on the slide that I have here, but it’s release of information to
authorized public health officials, with the definition of authorized as being
allowed by policy to get that information at that time. But that implies that
there’s a policy that indicates when that can occur, because in an aggregate
situation we did not feel that individual patient consent was a possibility;
you can’t consent unless you’re asking for consent every time you come in
to a health care setting that your data may be used by public health and they
may call you. Anything else we felt was more of a policy issue.

MR. HOUSTON: But I think it is a concern, though, that individuals want to
say yes, I allow my data to be used for biosurveillance purposes, and that
they would authorize that; that’s at least one very clear message that I’m
hearing from a group of individuals.

DR. COHN: Well, John Paul, I guess the good news is, on the one hand, there
are policy issues there, which I think are legitimate policy issues; I’m
actually reminded that Dr. Loonsk yesterday reminded us that there’s
actually a whole group that’s set up to try and resolve that.

MR. HOUSTON: I understand that, but I didn’t hear any discussion from a
technical perspective from the presenters really about that, the importance of
it, the relative importance of it, as well as how it fits into all of this,
maybe.

DR. EISENBERG: Well, what I can say is, as future use cases are evaluated
for issues such as research, which would not be public health, then consent
does come back into the picture. We were limited to only public health and
situational awareness, and reporting is actually not in our use case, so we
were being very specific that for these purposes it’s not necessary. But
once you get to research and something beyond routine surveillance and
situational awareness, the committee did agree it was a matter of scoping for
the time being, but not to say that it’s not important.

MR. MORRIS: I think that certainly at CDC we get a lot of anonymized data
through case reporting and other means, and in the biosurveillance work
that’s going on currently with BioSense(?), the pseudonymization, or pseudo
anonymization, is in place where it can be linked. Again, back to policy:
should all those patients sign something to say that it’s okay for that data
to flow to that type of system? I don’t know how doable that would be, but
certainly the protections are there and the only keys to that shop are on the
side of the data source. So certainly we can use that, as it was explained,
for aggregation of records for an entire encounter, but it’s only linkable
back, and generally at the CDC level, at the national level, that linking back
or any reach back to identified information would be done at the local level.
Now, where do you get into authentication policy, and how do you distribute
out those types of rights or policies? I think that’s something that needs
to be looked at, how you step that through the hierarchy, because there are
different levels of use and different needs to see more identified data at
different levels.

DR. COHN: John, did you have a comment?

MR. LOONSK: Yeah, I just wanted to comment that clearly some of the areas of
discussion here are very policy oriented, are highly technical, and relate to
existing laws and regulations in the circumstance of both state and local law
and HIPAA, and it may not be fair to ask these testifiers to comment on those
issues fully in that context. We have every reason to believe, and are
confident, that the biosurveillance use case was compliant with federal
regulations in regard to what it suggested needed to be done; it was general
in nature and architecturally indeterminate in many respects. The areas of
system services for filtering and for the linking seem to be very much in the
scope of discussion here in that context, and so perhaps the sweet spot here
relative to functional requirements is to make sure that the functional
services and capabilities are considered in the context of supporting those
policies.

MR. BLAIR: I have a little anxiety asking this question because I’m
just not sure how much of this might have been displayed on the graphs. I’m
actually very, very supportive of trying to use a health information network
capability for clinical research, public health, and biosurveillance. And
I’m saying that first because I’m struggling to sort this out; our mission
is to identify the minimum core set in the NHIN, so maybe you can help me.
When I listened to the testimony on this panel it appeared to me that
there’s aggregation of information, widespread collection of information or
monitoring of information. I’m an old guy, and I remember when we were
looking at electronic health records at one point, and we were looking at them
for patient care, and response time was important and performance was
important, and we had a whole different set of users, and we started to
separate out usage across patients into data warehouses so it wouldn’t
impact performance. I know things have improved, so as I’m hearing these
things I’m almost wondering, since there’s a PHIN, a Public Health
Information Network, in addition to the NHIN, can a lot of this cross
population aggregation of information be done on the PHIN? Because that means
we could keep the NHIN thin in terms of functionality, so we don’t have to
deal with data aggregation on a widespread basis for the NHIN. Maybe there’s
an obvious answer and it was already there; maybe you could help me with those
thoughts.

DR. EISENBERG: Well, let me state from the technical committee side, and
with somewhat of a health care vendor hat as well. It is potentially
problematic to expect a consistent stream of data going across the NHIN as the
operational system functions and as clinical users use it. Our committee was
really thinking more of packets of information being sent at specific time
intervals. What NHIN can potentially help with is the acceptable timeframe:
we have some wanting it sooner or more frequently, some wanting real time, and
some wanting every day, but if we could help standardize what frequency is
really necessary, then the sending edge systems could appropriately manage
that through some type of data warehousing or whatever method they feel is
best. If they feel they can do it out of the operational system, so be it,
but that would be the edge system requirement. I know that —

MR. BLAIR: So you’re not looking at the NHIN to have in its core
minimal thin set of functionalities capability for data aggregation?

DR. EISENBERG: No, we were not.

MR. BLAIR: Thank you. And please continue to elaborate but that was one
aspect of what I was trying to understand.

DR. EISENBERG: Well, that would actually be the subject of an hour long
presentation I’ve given, but I don’t know that I want to try to summarize
all of it here. I’m representing the technical committee, and we haven’t
really fleshed out that whole issue of how the edge system will do it, just
that it will do it.

MR. BLAIR: Okay, thank you.

DR. COHN: Kevin.

DR. VIGILANTE: Thank you. I’m not sure who would be best to respond to
this, maybe none of the folks here, but how do the PHIN and the NHIN relate to
the NBIS, the National Biosurveillance Integration System? And what
implications does the NBIS have as we think about the NHIN?

MR. MORRIS: There are probably others that could answer this better than I, but I think NBIS seeks to look across different sources, much like we talked about in a West Nile scenario but in something more like a bioterrorism or other situation, to look at these different sectors, be it veterinary, plants or clinical human health and other things, and do some bit of early warning itself at a next layer up. I think some of the things that they are struggling with is the subject matter expertise to do the algorithms at these edge systems that aggregate at the lower level; that is where that subject matter expertise exists. So I think it is another layer on top of that, because some of those feed some rolled-up information to that for the early detection, I don’t know.

DR. VIGILANTE: So is there significant collaboration between, when we think
about standards development and the like is there significant overlap and
collaboration with the NBIS enterprise?

MR. LOONSK: This is John Loonsk and I’m taking off my ONC hat for a moment, because I did have something to do with PHIN in the past and wanted to make a clarification, because I think it’s an important one. As the testifiers suggested, they are not suggesting that the NHIN accumulate data, nor does PHIN accumulate data. PHIN has been a framework much like is being discussed with the NHIN, a framework of standards and communications; it is not a data aggregation system per se. There are edge PHIN systems that do, in accordance with state, local and federal law, accumulate data, as public health is enabled to do under HIPAA and in other contexts. And NBIS is a proposed data aggregation system; it could be considered an edge system on the public health network. It does have interactions with other sectors, as Tim Morris has suggested as well, sectors such as agriculture and other domains. So hopefully that clarifies a little bit.

DR. COHN: Floyd, I actually had a question here and I just wanted to maybe delve into it for a minute. I think we’ve seen data filtering as a term that has floated around between edge and core and all of this stuff. Obviously you’re using a term of art described as high level crude filtering, and I guess I’m trying to figure out what exactly you mean by that and why you are making comments like that. Is that something that we should be aware of when we think about functional requirements?

DR. EISENBERG: The reason for the term high level crude filtering was specific to the fact that we’re dealing with a minimal dataset, and we weren’t expecting, in order to participate in biosurveillance in the near term, that all information at the edge had to be fully mapped to some standard terminology to be able to be filtered. That was the reason for the, I guess, less than specific filtering comment: it is dealing with a minimal dataset. There are challenges with that, in that as soon as something more is needed it’s not defined yet, so that dataset would be expected to grow, and over time it wouldn’t just be high level and crude.

DR. COHN: Thanks for the clarification on that. Other questions or comments
from the workgroup? Okay, well I really want to thank our presenters, I think
this has been a very useful additional dimension to our conversation around
functional requirements and certainly an area that we need to keep in mind as
we look at core and edge and trying to sort of identify a set of functional
requirements to obviously meet multiple needs.

Now we are at 12:07 I believe, what we will do is to take a lunch break and
reconvene at 1:00.

[Whereupon at 12:10 p.m. the meeting was recessed, to reconvene at 1:00 p.m., the same afternoon, July 27, 2006.]

P R O C E E D I N G S [1:10 p.m.]

Agenda Item: Panel VII: Other Health Information Stakeholders

DR. COHN: We are going to get started with the afternoon session. Before we
move into our speakers, I actually wanted to give a member a chance to
introduce himself, as well as identify any conflicts of interest.


DR. VIGILANTE: In the next panel, Mark Adams from Booz Allen Hamilton will be speaking, although he will be speaking as a caBIG contractor/employee. I would also like to note that I am employed by Booz Allen Hamilton as well. If any conflicts arise, I will note them and recuse myself.

DR. COHN: Okay. That is Kevin Vigilante —

DR. VIGILANTE: My name is Kevin Vigilante, member of the committee.

DR. COHN: Okay. Thank you.

DR. COHN: This panel really offers a variety of what I think should be very interesting testimony from a number of different perspectives. We are going to start out with Didi Davis and, Didi, as I understand it, you are the director of Integrating the Healthcare Enterprise. So, we are obviously pleased to have you here for testimony.

Now, your agenda says that Kevin Hutchinson will go next. Kevin, I believe, is going to be calling in by phone, and we are going to put him at the end just because I think that will work better in terms of the conversation.

So, following Didi will be John Apathy and then Vish Sankaran from the
Federal Health Architecture. Obviously, we are pleased to have you all joining
us for the conversation.

Now, what I told other panels is that we are trying to hold the actual testimony to about 15 minutes a person, the issue being that, obviously, we have lots of written information from you and we really want to make sure there is enough time for questions and discussion about the information. So, your help on that would be appreciated.

So, with that, Didi, why don’t we let you start and thank you for coming.

MS. DAVIS: Thank you for welcoming me and allowing me to have time to talk.

For anybody who doesn’t have it, all of what I am going to talk about is in a PowerPoint presentation. It is in a packet that is inside of here, and I think there are extra copies in the back as well. There is also written testimony; I will be referring to a few of its pages so that you can have the highlighted notes and won’t have to do a lot of note taking.

I apologize for not making them available ahead of time.

I am, as was introduced, director of Integrating the Healthcare Enterprise. It is an initiative that has actually been around since 1998. So, what I am going to do is fill in a few gaps. I understand there is a lot to be covered, and I will answer the questions that were proposed, but I wanted to also set the tone a little bit about what IHE is, real quickly.

One of the most important things I think that IHE brings to the table is that care providers and vendors work together to actually create implementation guides for standards. So, what Integrating the Healthcare Enterprise really does, and some of the things I will be covering, is play a really important role, in my mind and in IHE’s, with the foundation we have put together for really creating a truly interoperable world at some point in the future.

It is an incremental step approach. IHE does not have everything that you need to build an entirely interoperable world today, but it does have building blocks, and I am going to try to weave that story and show you some of the connectivity for that. But one of the things I want to clarify: a lot of people think it is a standards development organization. It is not. We leverage standards that exist. They are balloted and approved standards from every place, from ASTM to the HL7 standards to the DICOM imaging standards and so forth.

It takes about 16 standards to implement a truly interoperable end-to-end solution if you consider security auditing types of standards and so forth. IHE does provide a common framework for passing that type of information across enterprises, but also within the enterprise, because there are interoperability problems between the departments within the institutions themselves as well.

So, what IHE does is create what we call integration profiles that actually pull those things together, and those same integration profiles can be used not only inside the enterprise but across enterprises and across state regions as well.

IHE here in North America is sponsored by three primary sponsors, HIMSS,
which is the organization I work for; also, the American College of Cardiology
and the Radiological Society of North America. Like I said before, IHE has been
in existence since 1998 and it is an international organization as well.

So, building upon that, we do have chapters of IHE within Europe, Asia and North America, which also includes Canada; there is an IHE Canada as well. There is a Strategic Development Committee that oversees the full go-forward strategic development plan, and strategic road maps are put together that cover a course of three to five years out at a time.

It is a proven adoption process as well. So, IHE is not only an initiative, but also a proven process. It starts with identification of a problem or a use case, similar to what HITSP is doing within their various departments, then identifies standards that may be used to help implement profiles that would help with that interoperability problem. Those profiles become what is called the technical framework. They are folded into the specifications. They are nothing but documents.

It goes a step further, not only creating the implementation guides, but also testing them at what we call a Connectathon. Here in North America there is a Connectathon that happens every January; the last one happened earlier this year. We had over 350 engineers from over a hundred different companies represented there, everything from cardiology systems to radiology systems to the normal electronic health record systems.

That leads to demonstrations at the various annual conferences that the sponsors hold every year: the American College of Cardiology has one in March, HIMSS has theirs in February, RSNA has theirs in November. All this detail is again in your written documentation.

Hopefully, that leads to products with IHE baked into the product. So, vendors have adopted this as a way to go forward with interoperability, and they create what are called integration statements saying that they have a version of software that can do this: it can act as a patient identifier cross-referencing manager, for instance, and figure out how to link all the different identifiers for a patient. Hopefully, that allows you to have more of a plug-and-play type of product, rather than a lot of the point-to-point interfaces that are required today.

There are nine domains within IHE. So, it is not just radiology, although that is where it was born. If you think about the background for it, a bunch of radiologists got together and said, look, I am tired of putting in the patient name in my order entry system and then again in my RIS system, the Radiology Information System, and then in a PACS system. They were having to put it in three different places.

IHE was formed on that basis of saying let’s make this electronic and
seamless. Now it has grown. This is 2006. We have nine domains. You will notice
on the slide that underneath each of the bubbles, such as radiology and
laboratory, it has the number of profiles listed underneath it that are in
existence today.

For instance, radiology, being the oldest, has 17 profiles. Now I will go on to answer a little bit more of the questions that you had proposed, and if you want to, turn to page 14 of the written document, the written testimony; that is where I list a lot of what I am talking about. So, if you want to follow along and use that to make notes, I wanted to bring that to your attention.

Under “Sharing of Data” in my document, IHE feels that we are building upon an infrastructure that can be flexible and that can be extended for an interoperable architecture. Now, what I am going to do is address this by saying that IHE has pieces. I am not going to say we have the end-to-end components. We have pieces that could be considered building blocks to do things like patient data record location, and can do the transport, the security, the audit trails and the authentication that you have in the middle box. Now, in your document on that page 14 is this same blue box that you have here. The integration profiles are listed on a web site, WWW.IHE.NET. All of the actual documents that I reference in this slide are out there and you can bring them up and look; some of them are 20 pages and some of them 300 pages. So, I will warn you, there is a lot of reading out there. But these can be pieced together, and I really want to reemphasize that what we have done with IHE is put together building blocks. Depending upon the work flow that is required within the work space of that physician or clinician, the radiologist, the pharmacist, whoever it is that is working with the patient to help with the patient care and the delivery process, there are the building blocks.

What we have done is structure it on what we call a Document Sharing type of framework, and we call this profile in IHE XDS, Cross-Enterprise Document Sharing. I am not going to get technical with that, but just to get to the point: in addition to that profile, there is a profile called Audit Trail & Node Authentication. So, somebody alluded earlier to the machine-to-machine communication that is necessary for biosurveillance and so forth. That node-to-node authentication thing, yes, I trust you, the computer, I will send you this data, and so forth, has to happen.

There has to be a consistent time when you are looking at documents and you are looking at lab results that are three hours old. You need to know whether it is three hours old your time or three hours old Pacific Coast time.

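[The consistent-time concern described here, whether a lab result is three hours old in the viewer’s time zone or in Pacific time, can be illustrated with timezone-aware timestamps. This is purely an editorial sketch using Python’s standard library; it is not an IHE artifact, and the times chosen are invented for illustration.]

```python
from datetime import datetime, timezone, timedelta

# A lab result stamped on the West Coast (UTC-7 during daylight saving).
pacific = timezone(timedelta(hours=-7))
result_time = datetime(2006, 7, 27, 9, 0, tzinfo=pacific)

# A clinician views it on the East Coast (UTC-4) at 3:00 p.m. local time.
eastern = timezone(timedelta(hours=-4))
viewed_at = datetime(2006, 7, 27, 15, 0, tzinfo=eastern)

# Because both timestamps carry their UTC offsets, the result's true age
# is unambiguous regardless of where it is read: three hours.
print(viewed_at - result_time)  # 3:00:00
```

[Without the offsets, the naive local readings, 9:00 and 15:00, would suggest a six-hour age, which is exactly the ambiguity a consistent-time requirement removes.]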
So, you have got to think about certain things like that as well. These things are grouped together to create that interoperable infrastructure. This has actually been demonstrated at the sponsors’ annual conferences that I mentioned; these pieces are put together to create the interoperable world that you may have seen in some of the annual conference presentations and demonstrations that come out of the process.

But as to the components alluded to here, and you will see, starting on page 14, it goes into this in depth, we even have profiles that allow you to do point-to-point sharing. We have an infrastructure that has registries, so that people can go and query, not knowing anything more than a patient identifier, whether anything exists, or they can go query directly to one institution. So, if we did have a public health information network and it had its pieces together, it could actually pull data from some of these registries or repositories as well.

Now, these documents that are created within the IHE framework are true documents. They are based on the CDA standard within HL7. There is discrete data, XML-based, within each document. So, at this first step we are trying to make something that any EHR can work with, whether it is a smart EHR or a smaller one, you know, maybe a smaller physician practice EHR that doesn’t have all the bells and whistles that an acute care setting might have.

Not every vendor can digest that information yet, but trying to pull all of this together means the information is there, so that one day, when that semantic interoperability is achievable, the data is available and you can pull it.

So, all of these things are grouped together. There are two profiles you will notice listed at the bottom right hand corner under “Other.” I will mention one because I happen to know this number; somebody gave it to me. The request form for data capture is actually building block No. 39 in the HITSP work. HITSP is also leveraging a lot of this work, which leads me into how they are looking at kind of bringing this together.

I am not going to spend a lot of time on this next slide, about allowing the edge systems to speak to one another, because the EHR Vendors Association and Charlene Underwood did a very good job of presenting it. But we see the IHE profiles as truly the building blocks that will allow the edge systems to connect. They are, hopefully, the standard ways of sending that data between the edge systems, electronic health records, pharmacy benefit systems, the payers, wherever this data exists, sharing that data from the edge into the network so that it is discoverable at that point.

The requirement, too, that they talked about was flexibility. I would like to point out that the profile within the IHE process that I alluded to earlier for patient identification, the cross referencing of information, is called PIX, patient identifier cross-referencing. That profile has been investigated not only within the CSC NHIN contracts for being able to talk between record locator services, but also for being leveraged from the edge systems to talk up to the record locator services.

So, that is an example of a profile that IHE has that is flexible, that can be used within a RHIO, a Regional Health Information Organization, or across RHIOs in an NHIN type of implementation. That gives you an example of how the flexibility is built into these profiles.

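[The patient identifier cross-referencing idea behind PIX can be reduced, at its core, to a link table mapping each local identifier to one person. The sketch below is an editorial illustration only: the class, method names and identifier domains are invented, and the actual IHE PIX transactions are HL7 message based, not a Python API.]

```python
# Illustrative sketch of patient identifier cross-referencing: a manager
# links each (identifier domain, local ID) pair to one person, so a query
# with any one identifier returns all known identifiers for that patient.
class PixManager:
    def __init__(self):
        self._person_of = {}   # (domain, local_id) -> person key
        self._ids_of = {}      # person key -> set of (domain, local_id)

    def link(self, person, domain, local_id):
        self._person_of[(domain, local_id)] = person
        self._ids_of.setdefault(person, set()).add((domain, local_id))

    def query(self, domain, local_id):
        # Return every known identifier for the patient, or [] if unknown.
        person = self._person_of.get((domain, local_id))
        return sorted(self._ids_of.get(person, set()))

pix = PixManager()
pix.link("p1", "HOSPITAL_A", "12345")   # MRN at one institution
pix.link("p1", "CLINIC_B", "A-998")     # a different ID at another
print(pix.query("HOSPITAL_A", "12345"))
# [('CLINIC_B', 'A-998'), ('HOSPITAL_A', '12345')]
```

[The point of the design is that an edge system holding only its own local identifier can still discover a patient’s other identifiers, which is what lets record locator services and edge systems interoperate.]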
Again, this needs to happen incrementally. So, as Charlene pointed out earlier in her testimony, this has to happen across phases; there is an interoperability road map that the EHR Vendors Association has put together that builds this out, again using the IHE profiles.

I mentioned earlier the Cross-Enterprise Document Sharing that was shown. This is a very good picture, a depiction of how that is done. As to the scenarios we actually showed, we had 16 scenarios in the Interoperability Showcase at HIMSS. That is the one I am the most familiar with because I helped put it together for the last two years. We had everybody from the VA and the DOD sharing documents, and showed how this scenario could bring in everyone from a soldier down to a cardiologist doing a referral back to the primary care provider after they had finished with the visit and the cardiac type of care.

So, this is discharge summaries, medical summaries. There are emergency referrals. There are content profiles within IHE for eight different types of documents, everything from personal health record document content down to the normal discharge summary type of information that you need to —

MR. BLAIR: Would you mind reading off those eight documents, please?

MS. DAVIS: The eight types of documents: there is a medical summary. There is a discharge summary. There is a personal health record summary. There is a referral to an acute care setting and a referral document to an ambulatory care setting. I think I mentioned the emergency department one already. You put me on the spot, so I am trying to remember the last three.

There is a consents document; that was the new document created this year, a basic privacy and consent document. And I cannot remember the last one. I will have to look it up and let you know, but I will definitely make sure to get you that information. I just can’t think of it right now. It is in this testimony as well, so you can find it there.

So, again, that gives you an idea of what we mean by these document sharing types of processes. Not to say that this will change the world immediately, but it will give more information at the right time, on the right device, for that clinician to provide hopefully better care. Now, the data elements within these documents started off small. For instance, the medical summary document last year held very small amounts of data: allergies, medications, a problem list. That was it. It is starting to grow; for instance, the emergency department referral has a lot more information now.

To showcase this, I put one piece of information in here because I had all of it on a slide, but the written documentation gives you statistics on all the annual conferences that the other sponsors have had. Last year at HIMSS, the Interoperability Showcase had over 3,000 attendees go through the showcase, and there were 26,000 people at HIMSS total. That was better than 10 percent.

Thirty-seven vendors all showed product with this. So, with that, the integration profiles are also being leveraged by HITSP, and CCHIT in turn, hopefully, will at some point in time, in years 2 and 3, leverage those as well.

So, in summary, really the role of IHE is that, hopefully, the IHE integration profiles can provide models for harmonization, because, for instance, the XDS profile uses about six different standards between the security and so forth.

These building blocks can be put together to create an infrastructure that will allow you to not only connect to the NHIN, but also, hopefully, maybe become pieces and parts of it. The implementation guides, the interoperability specifications, are freely available in the public domain. There is no charge for leveraging them, and we encourage many of the NHIN contractors to come to the table.

With that, I am finished.

DR. COHN: Didi, can I just ask you — you went through slide 15 without
comment. Is there anything there that we should —

MS. DAVIS: No. Basically, what is missing in these is clarity. What we did was take some of what you have listed as the functional requirements and list out what we have considered up to this point, and the timing of how we have already put them together. So, if you remember the EHR Vendor Association Road Map, Phase 1, Phase 2, that is what this is referring to. So, we tried to tie this together for you so you will see what is available.

One other thing that I might want to point out real quickly, because it is color coded and it wasn’t very clear for you to see, and I am sorry I am having to go backwards there: these are color coded as well. You will notice the blocks in green are all fully available. They have not only been tested in a trial implementation; they have been tested at a Connectathon by every vendor against at least three or four other vendors.

The ones that were added in 2005 are in orange and the ones that were added
in 2006 are in the light blue. So, that gives you an idea where we are.

DR. COHN: Okay, Didi. Thank you very much.

The next presenter is John Apathy.

MR. APATHY: Dr. Cohn, thank you. I am John Apathy. I am a senior executive within Accenture’s Health & Life Sciences practice. I want to first thank you for allowing me the opportunity to speak on behalf of a wide array of stakeholders from the Clinical Research Community. Just a few of the members of this community are here with me today, and I wanted to real quickly acknowledge them.

Dr. Peter Highnam, special assistant to the director at the National Center
for Research Resources at NIH.

Maggie O’Connor, she is with the Global Drug Development Information
Systems Group at AstraZeneca.

David Isom and Dr. Ross Martin, both within the Regulatory and Healthcare
Informatics Group at Pfizer.

Although our broad group has not had sufficient time to fully discuss and
come to complete consensus on all the detailed plans and recommendations that I
will be describing, we nevertheless welcome the opportunity to present a high
level perspective to NCVHS at this time.

Collectively, this broad stakeholder group is aligned on the promise that
the Nationwide Health Information Network can and will be a key enabler to
improving public health by transforming Clinical Research, expediting the
development of new ways to prevent and treat diseases and improving the lives
of Americans through better, safer disease prevention and treatment options.

Since March 2006, Accenture has been working with a small group of
pharmaceutical companies that has included AstraZeneca, Bristol-Myers Squibb,
Pfizer and Wyeth, to evaluate and explore how Clinical Research could be
improved by leveraging the NHIN and other Health Information Exchanges. This
effort was named the NHIN Slipstream Project. This group explored many
important ways that the exchange of health information could improve patient
health through the research, development and commercialization of new therapies
and determined that the three most important areas of initial focus in the ONC
NHIN process are post-marketing drug safety surveillance, connecting patients
to clinical trials and establishing appropriate care standards through
outcomes, pharmacoeconomic and personalized medicine research. The group has
spent the past few months developing documentation for these use cases and will
continue by determining the data elements necessary to support them and the
value propositions and business cases for each set of stakeholders.

Based on the work done through the NHIN Slipstream Project, our Accenture-led NHIN prototyping consortium included 11 requirements related to Clinical Research that are now among the 1,100 plus requirements submitted to the ONC in May. These 11 requirements, as you see listed here, all begin with the ACN prefix: 01.10, 01.11, 01.12, 01.13 —

DR. COHN: You don’t need to read them.

MR. APATHY: For those who might not be able to see the slide.

Very good. Sorry.

Additionally, as a final step in their development, we intend to make these
use cases available to the NCVHS, the AHIC and other interested parties for
further critical review, comment and use.

On May 9th, FasterCures, the National Institutes of Health’s National Center for Research Resources and the Agency for Healthcare Research and Quality, AHRQ, sponsored a meeting of Clinical Research stakeholders to determine how to include clinical and translational research in the design of the NHIN and other HIEs.

The meeting broke into work groups to identify the highest priority
Clinical Research use cases and there was agreement upon the same two use cases
of connecting patients to trials and enhancing post-marketing surveillance.

After realizing the alignment between the research community and the group
of pharmaceutical companies participating in the Slipstream Project,
FasterCures and Accenture sponsored a meeting of the two groups on June 15th
that was attended by representatives from NCRR, AHRQ, NCI, the Mayo Clinic, Fox
Chase Cancer Center, Memorial Sloan-Kettering Cancer Center, Intel and the four
pharmaceutical companies from the Slipstream Project.

This group met to explore the best way to work together to ensure that the
priorities of the Clinical Research Community are included in the design and
requirements for the NHIN and other HIEs. The group agreed that a key principle
in all the efforts to develop use cases and the NHIN design should be to
consider patient needs and patient care as the first priority. The group has
remained grounded in keeping the patient’s needs as a top priority while
developing the connecting patients to trials, drug safety surveillance and
appropriate care use cases.

We will continue with this patient focus going forward. Participants from
this group of Clinical Research stakeholders attended the open forum on NHIN
requirements held by ONC on June 28th and 29th in order to support the
inclusion of Clinical Research requirements and to participate actively in
discussions and breakouts. During this forum, it became clear to the Clinical
Research group in attendance that the biosurveillance use case that has been
developed as part of the NHIN prototype projects has many similarities to the
drug safety surveillance use case being considered for Clinical Research.

The group believes that many of the architectural designs, policy issues
and data requirements for biosurveillance also apply to drug safety
surveillance, which is another form of public health. The group feels that this
alignment between the two use cases may be a good starting point for Clinical
Research organizations to contribute to the development of the NHIN
infrastructure and to drive standards adoption efforts.

Immediately following the Open Forum, Dr. Brian Kelly from Accenture
represented this group and presented testimony to a subgroup of the NCVHS
asking that Clinical Research be considered in the requirements for and
technical design of the NHIN. Several of you, as members of NCVHS, voiced
support for the inclusion of Clinical Research in the NHIN and urged the
Clinical Research Community to consider two important points: (1) how can Clinical Research contribute to the business model and funding necessary to build out the NHIN and drive the necessary adoption of standards, and (2) how best to work with patient advocacy groups to build an understanding of the importance of health care information sharing for Clinical Research and public safety, which can use this shared information to deliver important new treatments to patients faster and better ensure the safety of approved —

DR. COHN: John, I don’t mean to break in. I know you have a relatively long
presentation and I am pretty convinced that you are going to far exceed your 15
minutes. We obviously were here and took that testimony. I am actually
wondering — and I apologize to sort of break in, but I am looking at your
slide pages 6 and on as being particularly relevant to the testimony today. So,
I guess I would hope that you would use the next couple of pages — if there is
something here critical, tell us about it, but let’s move to the — make sure
we have time for the key issues.

MR. APATHY: I will do that.

I guess one of the key points that we would like to make is that in the coming months the Clinical Research Community plans to share its work on these use cases and utilize all the documentation and the work done on the value proposition and the business model to initiate proof of concept projects and, in other words, gain real life experience to prove out some of the use case utility.

I would also make the point that there is a proposal being discussed, and a desire, to have a private/public collaboration and perhaps the opportunity to have a working group underneath AHIC that would be specifically focused on the Clinical Research concerns.

So that was summarizing those points from the rest of that opening statement.
So, in summary, we strongly believe that the NHIN could support Clinical
Research activities and functions and the time is right to engage in the
discussion about the next steps we need to make. It is in this context that I
would provide the following responses to the questions posed by NCVHS.

Regarding the minimal but essential network functional requirements for the initial roll out of the NHIN (the functions for the nationwide network itself and not for specific edge systems): in order to best enable Clinical Research, including the matching of patients to trials, post-marketing surveillance and other population-based aspects of Clinical Research, the NHIN must support several key components of functionality.

First, the NHIN must allow all pieces of electronic information about the
health of a patient to be designated as belonging to that unique patient. In
order to do so, the NHIN and all systems that connect to the NHIN must be able
to identify that a record or document is associated with that unique patient.

So, a lot of the systems will request information about a patient from one another, and it must be ensured that all the systems are referring to the same individual. In order to conduct the population-based analysis required for several Clinical Research and drug safety use cases, it will be necessary to study complete longitudinal health information across a large sample size of patients.

For these aggregate data uses, it is not important to know who a patient
is, but rather that all pieces of information have been identified as belonging
to a unique patient to make up the complete health record for that patient.
Therefore, robust enterprise master patient identification must be a key
component of the NHIN.
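The master patient identification requirement described above can be sketched in a few lines. This is an illustrative toy, not anything specified in the testimony; the class, the deterministic demographic match key, and all field names are assumptions (real enterprise master patient indexes typically use probabilistic matching over many more attributes):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Demographics:
    """Normalized demographic fields used as a deterministic match key."""
    last_name: str
    first_name: str
    dob: str   # ISO date, e.g. "1960-04-17"
    zip5: str

class MasterPatientIndex:
    """Toy EMPI: assigns one enterprise ID per unique demographic key, so
    records from different source systems link to the same patient."""
    def __init__(self):
        self._ids = {}
        self._next = 1

    def enterprise_id(self, d: Demographics) -> int:
        key = (d.last_name.upper().strip(), d.first_name.upper().strip(),
               d.dob, d.zip5)
        if key not in self._ids:
            self._ids[key] = self._next
            self._next += 1
        return self._ids[key]

mpi = MasterPatientIndex()
a = mpi.enterprise_id(Demographics("Smith", "Jane", "1960-04-17", "20005"))
b = mpi.enterprise_id(Demographics("smith", "JANE", "1960-04-17", "20005"))
assert a == b  # two source-system records resolve to one patient
```

Note that for the aggregate uses the testimony describes, only the linkage matters: the enterprise ID says "same patient" without saying who the patient is.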

Second, the NHIN must contain robust record locator service functionality
that can locate all pieces of health information about a given patient from all
data sources. Many types of data source systems will exist, containing a
variety of data about each unique patient. Upon request, the NHIN must be able
to pull all of these pieces of information together to make up the complete,
longitudinal electronic health record for a given patient. This ability to view
and query against all patient health information is necessary to support
improved Clinical Research, drug and medical device development and safety
surveillance.

Record locator services can be of varying degrees of sophistication. In a
simple form, they can consist essentially of a directory, which catalogs the
locations where patient data exist; more complex implementations contain the
business logic for determining what data is returned for a given query.

How much of the type of business logic that resides in the core as opposed
to edge systems will need to be determined. Most likely, much of the business
logic will reside in the edge.
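The simple, directory-style record locator service described above can be sketched as follows. This is an illustrative toy of the editor's devising, not an NHIN design; the class and system names are hypothetical. The key point it shows is that the core holds only pointers to edge systems, never the clinical data itself:

```python
from collections import defaultdict

class RecordLocatorService:
    """Toy RLS: a directory mapping an enterprise patient ID to the
    edge systems that hold data about that patient. No clinical data
    lives in the core, only pointers."""
    def __init__(self):
        self._locations = defaultdict(set)

    def register(self, patient_id: int, edge_system: str) -> None:
        """An edge system announces that it holds data for a patient."""
        self._locations[patient_id].add(edge_system)

    def locate(self, patient_id: int) -> set:
        """Return every edge system to query for this patient's record."""
        return set(self._locations[patient_id])

rls = RecordLocatorService()
rls.register(42, "community-hospital-ehr")
rls.register(42, "regional-lab")
print(sorted(rls.locate(42)))  # ['community-hospital-ehr', 'regional-lab']
```

Any business logic about what data to return would sit behind the `locate` call, at the core or, as the testimony suggests is more likely, in the edge systems themselves.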

Third, the NHIN must supply the infrastructure to support the exchange of
all types of electronic health information between core and edge. This
infrastructure includes the identification and implementation of messaging,
document exchange and terminology standards to support health information
exchange. These standards must initially support exchange of health documents
between edge systems using standard header information.

I think the prior presentation addressed this in quite good detail. In
order to realize the true benefit of health information exchange for Clinical
Research, particularly drug safety surveillance and population-based analytics,
many types of health data exchanged via the NHIN must be in a structured,
standardized, normalized format that supports data analytics. The NHIN must
define the message and terminology standards it will accept, which should then
drive the adoption of standards by edge systems wishing to connect to the NHIN
to exchange and receive information.

Finally, the NHIN must ensure data security and patient privacy by thoroughly
controlling access to patient health information. In order for patients to
support and consent to the use of their data for Clinical Research and drug
safety surveillance, they must trust that their data is secure when exchanged
between edge systems, will only be accessible to authorized individuals and
will be used appropriately for the delivery of care and for secondary uses
that are in the interest of improved healthcare quality.

The NHIN design must accommodate where and how a patient’s consent to share
his/her data is acquired, recorded, managed and implemented. While the
technical components for the management of a patient’s consent may often
reside in edge systems, the overall NHIN architecture must provide guidelines
for these functions and for how the edge and core systems interact to handle
this critical function.

Regarding specific edge systems or entities, what functions should be
minimum but essential for linking to the network? In order to support the needs
of Clinical Research, edge systems must be capable of several functions when
connecting to the NHIN. First, edge systems must have well-defined processes,
policies and enabling technology to capture, store and report clinical data in
accordance with standards for Clinical Research, data capture and management.

Additionally, edge systems must also anonymize and re-identify patient
health information for various audiences based on agreed security and access
rights. Many aspects of Clinical Research are population-based analytics that
do not require knowing who a patient is, but rather require visibility into
the entire longitudinal health record. The technology and policies must be in
place to support anonymization and aggregation of patient data, and re-linking
to identified patient information upon request by authorized entities.
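The anonymize-and-relink requirement can be sketched with a keyed hash plus an escrowed mapping. This is an editor's illustration under stated assumptions, not the testimony's design: the secret key, the escrow class, and the MRN value are all hypothetical, and a real deployment would add far stronger access control:

```python
import hashlib
import hmac

# Hypothetical key, held by a trusted edge entity, never by researchers.
SECRET_KEY = b"held-by-a-trusted-edge-entity"

def pseudonym(patient_id: str) -> str:
    """Derive a stable research pseudonym from a patient ID with a keyed
    hash; the same patient always maps to the same pseudonym, but without
    the key the pseudonym cannot be reversed."""
    return hmac.new(SECRET_KEY, patient_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

class ReidentificationEscrow:
    """Holds pseudonym -> identity mappings so authorized entities can
    re-link de-identified research data to the patient on request."""
    def __init__(self):
        self._map = {}

    def record(self, patient_id: str) -> str:
        p = pseudonym(patient_id)
        self._map[p] = patient_id
        return p

    def reidentify(self, p: str, authorized: bool) -> str:
        if not authorized:
            raise PermissionError("re-identification requires authorization")
        return self._map[p]

escrow = ReidentificationEscrow()
p = escrow.record("MRN-000123")
assert escrow.reidentify(p, authorized=True) == "MRN-000123"
```

The stable pseudonym is what allows population-level aggregation ("all records for one patient") without revealing who that patient is.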

Second, in order to fully support many Clinical Research uses of health
information, edge systems must be able to semantically normalize health data to
accepted messaging and terminology standards before transmitting messages or
documents through the NHIN. Edge systems must be able to translate terms from
local standards into the formats and terminologies accepted by the NHIN for
information exchange. Health information that is standardized is critical to
the ability to
execute data analytics to derive research insights and detect patterns relevant
to the safety of medications and medical devices.
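The local-to-standard translation step described here is, at its simplest, a crosswalk lookup. The sketch below is illustrative only: the local system names and codes are invented, and the two LOINC codes shown are for illustration of the mapping pattern, not an authoritative crosswalk:

```python
# Hypothetical local-code -> LOINC mapping table. System names and local
# codes are invented; treat the LOINC targets as illustrative.
LOCAL_TO_LOINC = {
    ("LAB_SYS_A", "GLU"):  "2345-7",  # glucose, serum/plasma
    ("LAB_SYS_A", "HGB"):  "718-7",   # hemoglobin, blood
    ("LAB_SYS_B", "GLUC"): "2345-7",  # different local code, same concept
}

def normalize(source_system: str, local_code: str) -> str:
    """Translate a local result code to the terminology the network
    accepts, refusing to pass through anything unmapped."""
    try:
        return LOCAL_TO_LOINC[(source_system, local_code)]
    except KeyError:
        raise ValueError(f"no mapping for {source_system}/{local_code}")

# Two edge systems with different local codes normalize to one concept,
# which is what makes cross-site analytics possible.
assert normalize("LAB_SYS_A", "GLU") == normalize("LAB_SYS_B", "GLUC")
```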

Edge systems must also contain the patient identification services necessary
to link data to a unique patient, and record locator services as described
above, capable of handling requests for data from the NHIN. It is
likely that much of the business logic that will control the data will reside
in edge systems.

Finally, Question 3. The HITSP is identifying standards for the NHIN and
for specific use cases. What additional HITSP considerations are there
regarding Clinical Research?

The use of messaging and data standards is critical to the use of
electronic patient health information for the purposes of Clinical Research.
Many different categories of patient health information are required to fully
enable the Clinical Research use cases, including patient demographics,
allergies, family history, personal health history, clinical diagnoses, basic
clinical observations, labs, procedures and pharmacy data. Additionally,
standards for data exchange for genomic and proteomic data will be required.
Each of these types of data must be standardized into accepted terminologies
and formats to support the data analytics required for Clinical Research and
public safety.

For example, standards such as SNOMED, MedDRA and ICD-9 will be important
for the analysis of medical diagnoses. The LOINC and ICD-9 standards will be
important to analyze laboratory data and results. Procedure data should be
captured in terminologies such as CPT-4 for analysis, and pharmacy data should
be captured using standardized terminologies like RxNorm. These types of
terminology standards, as well as other types of terminology and messaging
standards like HL7 and CDISC, will all be critical to the data analytics that
will be required to support Clinical Research.

So, in summary, our informal consortium and the represented parties of the
broader U.S. based Clinical Research Community believe that the NHIN can be a
transformational capability to improve public health and Clinical Research and
expedite the development of new ways to prevent and treat disease.

In fact, we believe it is an essential next step for the U.S. to remain
competitive in the world’s global economy with respect to biomedical research
and human health and relevant in discovering the next generation of medical
breakthroughs for America’s health. We strongly urge the National Committee for
Vital and Health Statistics to work to ensure that the future NHIN supports
Clinical Research. The patient is waiting.

I want to thank you for the opportunity to testify on this matter.

DR. COHN: John, thank you very much.

I also want to note that as an appendix, you actually have listed out the
various functional requirements. So, thank you very much.

Our next presenter, Vish, and then we will have Kevin Hutchinson finish.
Kevin, are you on the line?


DR. COHN: We have one more presentation and then we will ask you to present.

MR. HUTCHINSON: All right. Thank you.

MR. SANKARAN: Members of the committee, thanks for the opportunity to speak
today on behalf of the Federal Health Architecture Program. My name is Vish
Sankaran and I am the acting program manager for the Federal Health
Architecture Program within the Office of the National Coordinator.

As many of you know, FHA, through its Standards Work Group, Consolidated
Health Informatics, or CHI as we call it, has worked with NCVHS for the past
few years. We thank you for that support.

The Federal Health Architecture’s role is to help create a federal health IT
environment that is interoperable with the private sector and supports the
President’s health IT plan, enabling better care, increased efficiency and
improved population health.

FHA contributes to the President’s health IT plan through three main goals,
by providing, one, input through a coordinated federal voice and collaboration
on national health IT solutions.

Two, implementation guidance to federal agencies through advanced

No. 3, accountability and assistance in measuring the adoption of
interoperability by federal agencies.

All federal agencies with health care activities participate in the Federal
Health Architecture. The FHA partnership includes approximately 350 members
from 20 federal agencies, covering five health domains: access to care;
population health and consumer safety; health care administration; health care
delivery services; and health care research and practitioner education.

In June 2006, FHA informed, involved and compiled input from over a hundred
individual federal stakeholders from 20 agencies in the FHA NHIN functional
review process. The review teams were aligned with the roles shown in the
bottom half of this particular slide.

We are here today to provide an overview of FHA’s contribution to the NHIN
functional review process. FHA has conducted a gap analysis of the functional
requirements and will be submitting three documents for the committee’s
consideration: a copy of our presentation, a summary report on FHA’s review of
the functional requirements and, third, a spreadsheet with detailed comments.
At this point, I would like to turn the discussion over to Laura Tillery
from the Department of Defense, who will provide further details on the FHA
review process.
MS. TILLERY: The process that we went through was quite laborious, I think,
in some fashion. As mentioned, we had five review teams organized according to
functional roles, such as provider, payer, researcher and surveyor. Those
roles were headed by different agencies, and each also had a co-chair, looking
from both a functional and a technical aspect.

The kick-off meeting occurred on the 23rd of May. We had general
collaborative sessions that occurred and then more breakout sessions that
occurred, reviewing the 900 or more requirements. We met and consolidated all
those requirements up to the 27th of June, and from there the representatives
from the federal agencies were able to take the consolidated spreadsheets to
the NHIN Forum as a consolidated view of what the federal comments were across
the different agencies.

Since then we have been able to further distill and collaborate and
summarize many of the comments down into this summary sheet that you have in
front of you, about 20 pages for your benefit.

Just some general observations from our review process. As noted previously,
the functional requirements were based on the Breakthrough Use Cases, and
given that those use cases probably don’t cover all the domains, we think the
requirements need to be expanded. Until they evolve with further use cases
later on, the functional requirements will not encompass the totality of what
the NHIN would be.

We also thought that there was still a need for ongoing refinement of the
edge and core definitions and of the services provided in each. Many of the
functional requirements would have been more helpful if they had been business
oriented, because many of our reviewers were functional people looking at
requirements that were grouped in technical terms.

We also felt there was a mixture within the requirements of system
characteristics, business rules and design specifications. To give a summary
of the results of our review: of the 997 requirements reviewed, we felt that
98 could be considered minimum essential requirements by the federal agencies.

The summary report, on pages 6 through 9, summarizes what those minimum
essential NHIN functional requirements are. I won’t go into detail with them,
but you have them in the report. On pages 9 through 11 you can see where we
saw gaps in the minimum essential requirements, scored against the NHIN
framework that we put forward.

There were some of the requirements that we had in the spreadsheets that
were outside of the minimum essential that we felt were either duplicative or
more policy oriented type of requirements. This is just a summary view of what
they are.

I would like to in the report, though, highlight some of the specific high
level gaps that we noted from the federal agencies. One is the requirements
with regards to the federal mandates for security and privacy addressing items
like FISMA, which is the Federal Information Security Management Act, how that
relates to the NHIN.

Also, the delegation of access control and release of information to third
parties, we felt, was not sufficiently addressed.

Also, Section 508 compliance, which concerns access for people with
disabilities and how they would potentially be able to access information. The
issue of medication and e-prescribing wasn’t addressed because it simply
wasn’t in the use cases, so that was lacking, as was role-based access
control, particularly with regards to credentials for providers and the
identification of providers based on credentials.

Also, cross-organizational auditing and support for community health centers
and safety net providers. Research and clinical trials is mentioned, but we
don’t think it was adequately addressed by the functional requirements; no
further detail was brought out in that area.

Possibly a use case in that area would draw out more of the necessary
functional requirements. Capturing patient consent for release of information.
Also, emergency access was not expressed adequately, such as how you break the
glass in emergency situations, and, particularly, supporting the reporting of
adverse medical events, by both the consumer and the provider, as part of the
NHIN’s work.

These are just some highlights, shown according to the functional roles we
used, because that is the way we conducted our initial analysis. In the
report, though, we mapped them to the NHIN framework, so you can see the full
details there.

I think you will also find in our spreadsheets the individual agencies’
comments in detail, if you would like to go through the 997 items and see the
different comments, but we wanted to summarize them for you.

MR. SANKARAN: In summary, the Federal Health Architecture would like the
committee to consider using the information provided to understand the Federal
perspective and include Federal needs, and also to utilize FHA as a resource
to collect Federal comments for future NCVHS joint activities. We are here to
help you.

Again, thank you for the opportunity to address the committee.

DR. COHN: I want to thank you both very much. I actually really do like
these longer documents. They are very good for airplane reading. Seriously, it
is something where — you know, it is 900, but it is not 10,000 yet. So, I
really appreciate it. Thank you.

MR. SANKARAN: Simon mentioned to me that it is good reading after a glass of
wine.

DR. COHN: Thank you.

Our next presenter is calling in. Kevin Hutchinson, are you still there?


DR. COHN: Thank you so much for being willing to join us. I am sorry we
can’t see you in person, but we do appreciate your willingness to participate
anyway. My understanding is you actually have a power point presentation. I am
going to have somebody go to the front here and run your presentation for you.
So, please.

MR. HUTCHINSON: Okay. Thank you, Mr. Chairman, for the opportunity. I do
apologize for not being able to be there in person today, but a nasty bug has
kept me home today.

I want to quickly go through and highlight some of the things that are
actually happening in the marketplace as we see it today, and weave into that
the actual standards and functional requirements that we see for both the edge
systems, as well as the National Health Information Network.

If you go to slide 2, it is just simply to show who SureScripts is, for
those that are not aware. We do have over 90 percent of the pharmacies in the
United States certified on the network today and actually every pharmacy in the
U.S. that is doing electronic prescribing is, in fact, on our network today.
So, we have a lot of experience in how to bring these two worlds together.

In looking at slide 3, one of the unique aspects of the company and the
organization is that we are one of the few networks in the industry that does
not supply an end-user application. We are purely the network, purely a
routing and validation function between physician software vendors and
pharmacy software. Today we have over 60 different physician software vendors
that are certified on the network today.

Slide 4 is just intended to show you, as e-prescribing has gotten a lot of
attention, even most recently with the IOM report, the types of messages that
are, in fact, being exchanged between various different stakeholders and
community pharmacies: new prescriptions, refill authorizations, medication
history, as well as formulary and eligibility checks.

Today we have physicians, RHIOs and payers live on the network and exchanging
that level of information, and hospitals, for medication reconciliation, are
headed down this path as well. We have not connected into
PHRs because of a primary concern that we still have over the level of process
and standards there are for authentication. I will speak more to that in just
a minute.

If you go to slide 5, it is really intended to directly answer Question No. 2
in your document, about what things should be required of the edge systems.
These are a lot of the things that we as a company are doing today on behalf
of pharmacy and physician vendors in the connectivity world between these two
environments.

Auditing and logging of all transactions processed is very, very key. Where
there has been a question as to where a transaction originated, that
capability is there to audit and log all of the transactions, or even what
medications may have been ordered. There are a number of other things. Some
people think, with respect to the standards that exist today, that everyone is
on those standards.

As we all know, it takes different software vendors a period of time to be
able to get to the next level of standards. So, we also have to do some data
mapping and translation between versions of a standard and any edge system will
have to do this no matter what standards are put out there. Software vendors
will be at various points in time with their implementation of that standard.
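The data mapping and translation between standard versions described here amounts to rewriting a message from one version's vocabulary into another's. The sketch below is the editor's illustration of that pattern; the field names and renames are invented and are not drawn from NCPDP SCRIPT or any real standard:

```python
# Hypothetical field renames between two versions of a messaging
# standard. These names are invented for illustration only.
RENAMES_V1_TO_V2 = {
    "drug_name": "medication_description",
    "qty": "quantity_dispensed",
}

def upgrade_message(msg: dict) -> dict:
    """Translate a v1 message to v2 by renaming fields, passing through
    anything the newer version keeps unchanged."""
    return {RENAMES_V1_TO_V2.get(field, field): value
            for field, value in msg.items()}

old = {"drug_name": "Lisinopril 10 mg", "qty": 30, "refills": 2}
new = upgrade_message(old)
assert new == {"medication_description": "Lisinopril 10 mg",
               "quantity_dispensed": 30, "refills": 2}
```

A network operator sitting between vendors at different version levels ends up maintaining one such translation per version pair, which is exactly the burden the testimony argues a firm migration deadline would reduce.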

I am happy to report, and I am very pleased to see, that NCVHS supported the
move to NCPDP Script 8.1 in December of last year. Of the over 60 different
physician vendors that are on the network, 80 percent of them have now
implemented the NCPDP 8.1 standard and have recertified on the network, and we
expect a hundred percent of them to be on 8.1 by the end of this year.

So, these are some elements that are specific to the edge systems. Now, this
is really targeted toward specific environments, you know. There are going to
be different requirements whether it
environments, you know. There are going to be different requirements whether it
is a pharmacy or whether it is a hospital or whether it is a payer. There will
be some various different requirements based upon the stakeholders that you are
working with.

If you go to slide 6, it is just intended to show you what is happening in
the industry right now: there are a variety of different ways that some of
these RHIOs are being created. The RHIOs could be considered to be edge
systems in themselves. Some are taking a clinical repository approach, some
just the master patient index function, others a full-blown record locator
service, including the use of the master patient index.

We are seeing different architectures that exist as well. These regional
systems, whether it be a regional hospital, a regional lab, connecting into
these RHIOs, makes a lot of sense. Where we are seeing some difficulty is with
national organizations, like a national hospital chain or a national lab chain
or a national pharmacy chain, which have really struggled to connect into a
lot of different architectures.

If you go to slide 7, we are seeing the RHIOs being able to interoperate. I
think most everyone is aware of the demonstration project that is going on
through the Markle Foundation, where the Massachusetts, Indiana and
California RHIOs have interconnected and are able to share information across
those various different environments. That could also be used for, you know,
patients that are traveling across the country, as well as in biosurveillance
and population health initiatives as defined in the region.

There is also some statewide prescription drug monitoring that goes on, and
in many cases we have several different RHIOs in the same state. In states
like Florida or Texas, or some of the larger states, we might see different
networks being created within their own state boundaries.

Slide 8 is probably one of the more important slides for getting my points
across about the National Health Information Network and the edge systems. It
directly addresses your Question No. 1 about the NHIN functions.

As I stated before, you have these national organizations, but they have
local establishments. CVS is an example of a national chain with local stores;
the same with Quest Labs, a national laboratory with local laboratories.

So, they struggle with this idea of this national focus versus the regional
focus and we really do see the National Health Information Network as playing a
role between this national focus and this regional focus, whether it be
hospitals, plans, payers, laboratories or pharmacies. The functions that the
NHIN needs to provide are really data routing and transmission, and
certification of the edge systems to make sure that the edge systems are
adhering to the standards of the NHIN.

Registration and authentication of the edge systems, in other words,
validating those edge systems and the data that is being shared or
transmitted; as well as record location and validation of the data, as I
mentioned before. Also, even though an edge system performs auditing and
logging, I think any system, any network that is transmitting health
information should have auditing and logging capabilities.
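The network-level auditing and logging function described here can be sketched as an append-only trail that can answer "where did this transaction originate?" This is an illustrative toy by the editor; the class, the system names, and the `NewRx` label are assumptions, not SureScripts' or the NHIN's actual design:

```python
import datetime

class AuditLog:
    """Toy append-only audit trail: every transaction routed through the
    network records who sent what to whom, and when."""
    def __init__(self):
        self._entries = []

    def log(self, sender: str, receiver: str, message_type: str) -> None:
        self._entries.append({
            "timestamp": datetime.datetime.now(
                datetime.timezone.utc).isoformat(),
            "sender": sender,
            "receiver": receiver,
            "message_type": message_type,
        })

    def trace_origin(self, message_type: str) -> list:
        """Answer the question raised in the testimony: where did a
        transaction of this type originate?"""
        return [e["sender"] for e in self._entries
                if e["message_type"] == message_type]

audit = AuditLog()
audit.log("clinic-ehr-17", "pharmacy-204", "NewRx")
assert audit.trace_origin("NewRx") == ["clinic-ehr-17"]
```

Keeping such a trail at both the edge and the network level is what lets disputes about a transaction's origin be settled after the fact.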

The real challenge here, and this is first-hand experience: SureScripts is
connecting to several different RHIOs right now, in Massachusetts, Upstate New
York, Indiana and a number of others, all on different architectures, where we
are requiring 8.1 to be the standard to connect in those environments. We are
not going to be able to connect into hundreds of different RHIOs. There really
needs to be an infrastructure, a National Health Information Infrastructure,
where national organizations, where we may represent pharmacy access, can
connect into an infrastructure that is connected into all of those RHIOs.

Today, we are going to continue down this process of connecting directly
into the RHIOs, but we look forward to the day when there are additional
standards or approaches to where we don’t have to have so many different
customized implementations on a RHIO by RHIO basis.

My final slide, slide 9, which directly goes toward Question No. 5, addresses
what I mentioned before about NCPDP Script 8.1. We are very pleased that
physician vendors have moved to 8.1. We did give one year’s notice of the
requirement to be on 8.1, which gave the vendors time to plan that
implementation. I think that is something the National Health Information
Network needs to address as well: when new standards come out, or new versions
of adopted standards come out, establishing the right time line for the
migration to those standards is very important.

Otherwise, you are going to have a lot of data mapping, a lot of translation
of data that occurs if the stake is not in the ground on when those new
standards need to be implemented and put in place. The Script standard, as an
example, will need to evolve if it is going to be a more interoperable
standard. For example, while we may be sharing medication history with
physicians today, from pharmacies to physicians, the standard does not call
for an ability to share medication history information from a pharmacy to a
personal health record.

Now, one could argue we could use the exact same standard that is used to
share it with physicians. The challenge is that there are different work flow
requirements, and data that we have coming from physicians versus data that we
might expect to come from personal health records. So the standards have
actually entered into an interesting area: not only defining the elements that
are part of the standard for the exchange of information, but even
standardizing or dictating the type of work flow for how new prescriptions,
refill authorizations, medication history and other information are to be
shared. We need to make sure that we really limit that as much as possible, so
that the standards are not too restrictive of work flow requirements.

One of my biggest points I would like to make today is around
authentication. I have been on this bandwagon for quite some time about the
fact that we do not have a good policy or good standards for authenticating
patients through personal health records, that they are who they say they are.
We need some level of not only initial identification of patients when they
gain their first ID and password. I know there is work being done on maybe a
federated ID, where the National Health Information Network may establish a
standard that if you have an online relationship already with a bank or
another entity, you might be able to use that established relationship to
obtain your health ID and password.

So, there are a number of different ways to look at it. Also, some have
argued for cell phones or telephones, where there is a confirmation that goes
back, calling back the phone, using an IVR or sending text messages back to
cell phones; but you would need to check with the various phone companies to
make sure that it is, in fact, that person’s cell phone and not a bogus cell
phone that was entered by the user.

So, there are a variety of ways to look at ways to do authentication, but I
do think in the whole scheme of things, both at a RHIO level, even at a
National Health Information Network level, that is one of the biggest holes
that we need to be able to fill.
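The call-back/text-back idea can be sketched as a one-time-code exchange. This is the editor's minimal illustration, not a proposal from the testimony; the class and phone numbers are hypothetical, and it deliberately does not solve the hard problem the speaker identifies, which is proving that the phone number actually belongs to the patient:

```python
import hashlib
import secrets

class PhoneVerification:
    """Sketch of the call-back/text-back idea: send a one-time code to a
    phone number of record and confirm the user can read it back."""
    def __init__(self):
        self._pending = {}  # phone -> hash of the outstanding code

    def send_code(self, phone: str) -> str:
        """Generate a six-digit one-time code. In practice it would be
        delivered via SMS or IVR, never returned to the caller."""
        code = f"{secrets.randbelow(10**6):06d}"
        self._pending[phone] = hashlib.sha256(code.encode()).hexdigest()
        return code

    def verify(self, phone: str, code: str) -> bool:
        """Check the code; pop() makes each code single-use."""
        expected = self._pending.pop(phone, None)
        return expected == hashlib.sha256(code.encode()).hexdigest()

pv = PhoneVerification()
code = pv.send_code("+1-202-555-0100")
assert pv.verify("+1-202-555-0100", code)
assert not pv.verify("+1-202-555-0100", code)  # single use
```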

Finally, and I think I heard one of the previous speakers mention it before,
about the HL7 CDA and CCR standards: we really do need to get those normalized
and finalized as to the types of messages they will support, and the exchange
of that information is critical. We have many, many physicians connected to
the network today. We could easily transmit messages between physicians,
because we are transmitting messages between a physician and a pharmacy today.
However, there will not really be a strong, well-accepted standard in the
industry until we normalize the differences between the CDA and the CCR.

So, Mr. Chairman, I appreciate the time that you have given me today and I
apologize once again for not being able to be there in person.

DR. COHN: Well, Kevin, thank you very much. I am sorry you are under the
weather. Obviously, we have seen you in person enough over the years; it is
good to hear from you again.

MR. HUTCHINSON: Thank you.

DR. COHN: Kevin, I actually want to start with a question. I am sure others
have questions for the speakers. I think it has been a wonderful panel just in
terms of bringing up multiple different points of view.

Now, Kevin, I have often thought — and I think you know of our long
history of involvement in and support of e-prescribing — that e-prescribing
would be the wedge that drives everyone towards electronic health records. Now,
listening to your presentation, I am wondering if e-prescribing is also taking
the lead on the National Health Information Network. I guess I am wondering as
I listened to your presentation — now, obviously, this gets to be sort of a
definitional question, but from your perspective, is what you are doing — are
you really an edge system or are you really one of the fundamental
infrastructure pieces of the National Health Information Network?

MR. HUTCHINSON: Well, it is interesting that you asked that. I do get asked
that quite a bit. You know, what are you evolving into? You know, we are owned
by the pharmacy industry, and my belief is this, and this is no representation
of AHIC opinions or anything else. My personal belief is that you are going to
see — I have a theory that birds of a feather flock together. So, when
pharmacies come together because they have a common cause, they want to solve a
common problem, you are going to have a lot more success in working with
organizations that have similar issues to solve, whether it be hospitals or
labs or pharmacies, notwithstanding the fact that you have a lot of local,
regional health care providers that are not national in focus.

We will continue to stay focused simply on the pharmacy industry business
for the time being and anything associated with improving that prescribing
process — electronic prescribing is a piece of that, but the way we see it is
we are more of an edge system because there is a lot more that needs to be done
with respect to sharing of lab information, sharing of referral information,
being able to do a medical history look-up versus just a medication history
look-up. That is going to require the use of a lot more standards and a lot
more exchange capability than we have in our plans today.

DR. COHN: Okay. Well, I will mull over whether you really answered my
question or not, but I do appreciate your comments. I think certainly no one
would represent that what you are doing is the full NHIN as envisioned, but I
guess I will continue to mull over my question.

Now, Harry, I think you have other questions?

MR. REYNOLDS: First, I would like to commend this panel. Each of you stayed
on the subject. That was nice. I mean, you each had your own little flavor in
there, but you stayed on mark. Thank you. Because we are trying to distill this
down. That is very, very helpful.

A lot of it sounded the same. I mean, the categories that you listed, the
functions that you listed, as I was trying to take notes here, it doesn’t look
a lot different.

Somebody asked a good question the other day and I will steal it and act
like it is a good one for me. What do the five of you think — is there
anything you really disagreed with in what someone else said? Because the way
you have put this together, this is not a bad framework, you know, to start
building thinking. So, I just am interested if there was anything anybody else
said that you didn’t think was part of the base or you thought was off base. I
would love to hear it, just as your opinion.

Good. Silence is unanimous. Thank you.

DR. COHN: It is good that we are getting that, especially on the second day.

MS. DAVIS: To answer the question on the eight documents, they are listed
at the top of page 15 of my written testimony. The one I left out was imaging,
the most important one, imaging. Sorry about that.

MR. BLAIR: Kevin, are you still with us?


MR. BLAIR: This is Jeff. This is a question, to be honest with you, I
wouldn’t want to bring this question up unless you were part of this
conversation because you are familiar with this as an issue that came up as we
went through our e-prescribing standards recommendations. Laura, you gave us a
list of gaps. So, in a sense, I am really asking for feedback from both Laura
and from Kevin on this subject. Yesterday, one of the testifiers representing a
well-respected consortium that is giving us ideas on RHIOs and the NHIN
included in their recommendations — and they really meant it. I wound up
checking to see if they were really serious. And they said they felt that
nonrepudiation was an important core function.

Now, Laura, when you gave us the list of gaps, you did not include
nonrepudiation as a gap. Now, the FHIN(?) includes a form of nonrepudiation.
So, No. 1, Laura, could you tell us why you didn’t include nonrepudiation and,
Kevin, I would like to hear your thoughts on the idea of the NHIN, whether it
should or should not include nonrepudiation.

MS. TILLERY: Just to let you know, it is included as part of the security
and privacy. We considered that as part of the FISMA mandate, the federal
mandate issue — confidentiality, nonrepudiation, authentication,
authorization. They are all elements of that part of the security process. So,
we really do consider that.

MR. BLAIR: Kevin, can we have your thoughts regarding nonrepudiation as a
core function of NHIN?

MR. HUTCHINSON: In theory, I don’t disagree with the idea of
nonrepudiation. I really don’t, because you could then, without any
hesitation, know that this individual is who they say they are. My concern is
the readiness of the technology to be able to implement it at an affordable
level in health care, avoiding error rates that might not be seen in other —
So, I guess my concern with it is that we are making great progress in
health care at this point and, you know, we could stop and wait until the
technology is less error prone and less expensive to implement, to make it
native inside of these applications and other things like that. But my concern
is that that would slow down the momentum of what is going on at this point,
although understanding that we still need to be cognizant of the security and
privacy. So, in theory, I am not against it. We are just still looking for a
better way to implement it, in such a way that it doesn’t slow everything down
or make it cost prohibitive.

MR. BLAIR: Thank you, both of you.

MS. TILLERY: Just one aside, though. One of the thoughts on the federal
side was also the need for multiple levels of security and multiple levels of
authentication. So, we think that that is an evolving functional requirement
over time, that the technology will become mature enough to handle that type
of security process. So, we would agree.

DR. COHN: Okay. Maybe I need to ask you, what do you mean by multiple levels?

MS. TILLERY: That would be the ability, based on your role, but also
based on the type of scenario you are involved in. Like in the military, with
the DOD, we have unclassified, sensitive, confidential, top secret, that type
of thing. So, there would be multiple different levels of security that could
be placed on it.

So, for example, if it is a very sensitive type of passing of information,
you might have a much tighter secure messaging standard than you would if you
are just passing CBC results or something.
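The tiered scheme Ms. Tillery describes can be sketched in a few lines. This is an illustrative sketch only, not anything from the testimony: the tier names echo her DOD example, but the `REQUIREMENTS` table, the `required_protection` function and the specific protections are all hypothetical.

```python
# Hypothetical sketch of scenario-based protection levels, in the spirit
# of the DOD-style tiers described in the testimony. All names invented.

LEVELS = ["unclassified", "sensitive", "confidential", "top_secret"]

# Made-up mapping from classification tier to minimum messaging protections.
REQUIREMENTS = {
    "unclassified": {"encryption": "TLS", "signing": False},
    "sensitive":    {"encryption": "TLS", "signing": True},
    "confidential": {"encryption": "TLS + payload encryption", "signing": True},
    "top_secret":   {"encryption": "TLS + payload encryption", "signing": True},
}

def required_protection(classification: str) -> dict:
    """Return the minimum messaging protections for a classification tier."""
    if classification not in LEVELS:
        raise ValueError(f"unknown classification: {classification}")
    return REQUIREMENTS[classification]

# Routine CBC results travel with baseline protection; a more sensitive
# exchange automatically picks up stronger requirements.
print(required_protection("unclassified"))
print(required_protection("confidential"))
```

The point of the table-driven design is the one she makes: the protection applied is a function of the scenario, not a single network-wide setting.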

DR. COHN: Thank you.

Other questions?

MR. BLAIR: It isn’t a question so much as, Laura, I just really appreciated
your identification of the list. It was an excellent list and it really will be
helpful. Thank you.

DR. COHN: Okay. Well, with that, I am going to give everybody a break
before we move into our final session. Why don’t we take about a ten minute
break and then we will reconvene at about 2:30.

[Brief recess.]

Agenda Item: Public Testimony

DR. COHN: Our next session is really an opportunity for public testimony
and we have a number of people who have identified themselves as wishing to
give testimony. So, I want to thank you all for coming.

We are asking, obviously, in the spirit of this being public testimony if
people would try to limit their remarks to around five minutes and we will try
to ask questions — maybe except for clarification, we will ask questions at
the end after you have all had a chance.

Gary Dickinson — because he has another commitment afterwards — has asked
to go second. So, we will move him up, and then following him will be Russell
Williams, Charlene Underwood and Mark Adams.

Obviously, we want to thank you for being willing to come and talk with us.
So, with that, Vik, I think you are up first. Do you just have testimony? Do
you have a PowerPoint?

DR. KHETERPAL: I actually do have a PowerPoint.

DR. COHN: Okay. Please.

DR. KHETERPAL: And I believe there ought to be handouts as well.

First of all, let me begin by thanking the committee for allowing this time
for five minutes or so to share some thoughts that I personally have. I come to
you today as a clinician. I am trained as a physician, though I haven’t done
patient care in about 16 years. I became an entrepreneur after an NIH
SBIR-funded grant became an EMR product, which is now popularly known as GE
Centricity, and I ran the clinical information systems business for GE for
about three, three and a half years, leaving that about a year and a half ago.

Now I think I come with a slightly different perspective, as a father of
three and the eldest son of aging parents with issues in health care. I
personally have a greater opportunity today than I did 15 years ago to
interact with the very system I trained in, the health system. Some of my
comments are really, frankly, more tainted by that than anything else.

I thank, first of all, Mary Jo for her clarity in the invitation as to
what our remarks ought to be. So, I have four recommendations.

First of all is to emphasize the needs of the local health information
organizations or networks, those that actually form at a local level amongst
providers the ability to interchange information — the little n in the network
of networks. If the NHIN is comprised of a network of networks, the little n
is the efforts at the local level, which are the building blocks of the larger
network.
The first recommendation, which I will go into more detail in a subsequent
page, would be to emphasize their needs a little bit more than I think they are
getting a hearing on.

Second is to potentially take a deeper look at the security of the RLS, the
Record Locator Service itself, and how secure it is. We spend a lot of time
and energy talking about securing the privacy of behavioral health records and
HIV and other sensitive clinical data, but not as much attention — at least
the dialogue doesn’t tend to get into it — goes to all the demographic
information that will be resident in the RLS and the threat that may pose.

The third recommendation — and I think all the use cases, including the
research community testimony from my colleagues just in the last session,
point to this — is the possibility of beginning to distinguish the NHIN
requirements, and the prioritization of them, along the themes of care
delivery, biosurveillance, PHR and research, and I will share some thoughts as
to why I think that may be helpful.

Finally, if there would be some opportunity for the functional requirements
to include validation criteria for record linking effectiveness, since we
believe that is a core enabling technology that really everything else is
built on.

So, the first recommendation is emphasizing the needs of the local health
information network or local health information organization. Broadly
speaking, you know, with the HIT-board and some of the other things that I
think Blackford and others talked about yesterday, there are 500, maybe 600
announced real initiatives around the country. We are aware of the excitement
around that. Yet most of them are taking baby steps. They are startups.

The four contractors that have been working on it have leveraged, in the
example of what Marc Overhage was representing, a lot of great work done at
Regenstrief, a lot of great work done in Massachusetts, great work already
done previously in Mendocino(?). I guess the theme is that pulling all of
these things together is obviously the mission of the NHIN, but most of these
folks would tell us that it took a lot of energy and a lot of technical
know-how to get to regional success, in order for them to be able to
contribute in a meaningful manner at a national level.

So, our thought is that in the dialogue, and particularly in the functional
requirements, if we were to go through them again, as some of our folks have
done, we would ask which of these can be tuned specifically to assist local
communities to adopt this, which of these standards could apply uniquely or
specifically to them, such that they are in a better place to be able to
participate in things like an inter-SNO bridge or something of that nature.

It seems, in my personal view, that there is not a whole lot of attention,
or particularly time allocation, to the needs of that local environment. So,
three specific recommendations. Develop some sort of requirements specifically
for SNOs in the core environment, the core of the system of the NHIN. Maybe
encourage HITSP or something like CCHIT to accelerate the focus on
interoperability for EHRs — look at the 2007 efforts that CCHIT has. That is
sort of the bottom-up approach that will allow the EMRs and other things being
adopted at a very local level to populate the local health information
exchanges, so that they can participate in inter-SNO bridges to effect the
NHIN.

Finally, maybe sponsor work to define intermediate stops along this journey
towards the NHIN, a more incremental approach, to say what might be the
immediate next step if somebody is successful in a locale. What does
reasonable success look like? What are the functional requirements that sort
of graduate an entity from one step to another?

A set of sort of animation slides, which show up as one printout: the
ChoicePoint data loss a while ago, followed by the loss of the records that
CardSystems lost about a year and a half ago. Then something happened up at
Providence, the Seattle-based health system, 365,000 records, and then the
most recent one, of course, in the Beltway, which got lots of attention — the
laptop with, you know, 26 million veterans’ information.

Our takeaway from this is that demographic information is exceptionally
dangerous. While privacy and security issues around the release of clinical
information are obviously important, as I suggested before, the current real
threat seems to be aggregation of demographic information that includes the
stuff that identifies people as who they are. That is where identity theft,
financial risk and other things live and, frankly, that is the risk that has
been realized over the last little bit, over the last few years particularly.

It is a particular concern as you analyze this and project the success of
an NHIN, because most Record Locator Service designs as contemplated are
primarily centralized models. Although folks speak to the ability to have a
federated RLS, frankly, even the four prototypes currently being contemplated
would be centralized. So, our thought is that maybe the recommendation here is
that the core functional requirements should address the need for a blinded,
secure RLS, one that withstands the obvious vulnerability of a large store of
demographic data sets, which, if wildly successful, may comprise a vast
majority of the citizens of this country and all the really confidential
information that goes into effecting identity theft.

There has been some great work done in Australia with linguistic
techniques, such as bi-gramming, that allow this — you know, Shaun Grannis and
Marc Overhage have some exposure to this stuff. It has not been done in
production in the U.S., but we have actually tried this out, so to speak, in a
lab and effected it at a couple of communities, such that you can actually
have a blindfolded Record Locator Service. As the holder of that service, you
can do linking for citizens without ever knowing what the underlying name and
address were. You basically have access to hashes — so this is inherently a
more safe, secure environment.

This is not about encrypting and decrypting stuff. It is about being able
to have a bank vault system where, as with the operator of a bank, my safety
deposit box, frankly, is secure because I hold the other key, and even
disgruntled employees or anybody else really do not have access to it. I think
there are techniques of this nature that ought to be seriously explored as
part of the core requirements of the NHIN.

Very specifically, some examples of that are listed. I won’t go into them
here. They are in your printout as to how n-gramming techniques could effect
this.
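The bi-gramming idea Dr. Kheterpal refers to can be sketched briefly. This is a minimal illustration, not the Australian technique or any production design he describes: the salt, field choices and similarity measure here are all assumptions. Each data source splits a name into two-character grams and hashes them with a shared salt; the RLS stores and compares only the hashes, so it can link similar names without ever seeing them.

```python
import hashlib

# Illustrative sketch of a "blindfolded" Record Locator Service.
# The salt would be agreed among data sources and unknown to the RLS;
# this value, like everything else here, is hypothetical.
SALT = b"shared-secret-salt"

def bigrams(text: str) -> set:
    """Split a normalized string into overlapping two-character grams."""
    t = text.lower().strip()
    return {t[i:i + 2] for i in range(len(t) - 1)}

def blinded(text: str) -> set:
    """Hash each bigram with the shared salt; the RLS stores only these."""
    return {hashlib.sha256(SALT + g.encode()).hexdigest() for g in bigrams(text)}

def dice(a: set, b: set) -> float:
    """Dice coefficient: 2|A∩B| / (|A|+|B|); tolerant of typos and variants."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# The RLS compares blinded sets, never the names themselves.
exact = dice(blinded("jonathan smith"), blinded("jonathan smith"))
score = dice(blinded("jonathan smith"), blinded("jonathon smith"))
print(exact, round(score, 2))  # high similarity despite the spelling variant
```

Because hashing each gram preserves set overlap, fuzzy matching still works on the blinded data, which is the property that makes the "bank vault" operator model possible.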

A third recommendation would be to actually, as I alluded to before,
separate the care delivery use cases from biosurveillance, PHR and research in
the functional requirements. Some of this was borne out, maybe in a very vocal
way, by the UHIN, JAN(?), and I think folks — Jack — spoke to it a little bit.
The idea here is partly about business models; the business model
justification lives at the local level.

From a citizen’s perspective, if you look at the Harris and RAND studies
and some of the other data out there, if there is one reason that citizens of
this country are willing to accept what they believe to be a potentially
higher risk to privacy, it is because it would enhance clinical care.

So, our thought is not necessarily what I think somebody actually suggested
in your recent submission to the Secretary — forget all the other stuff and
just focus on care delivery in the first go-around. I am not suggesting that
we throw all that out, the baby with the bath water, but minimally to actually
look at the functional requirements along these four themes and test the
hypothesis: how would I prioritize this if I wasn’t doing it for
biosurveillance? How would I prioritize this if this was not for PHR? How
would I do this if it was purely for clinical care?

Our suggestion is that you would end up with potentially different
functional groupings and prioritizations as to what should constitute core
functional requirements versus not. And they may clarify a path that is more
actionable than a superset of all wish lists.

The final recommendation, number 4, is, again, potentially too detailed and
potentially not really a functional requirement recommendation at all, which
is to say, you know, if we are going to get to a specificity of one in a
hundred thousand false positives, the NHIN functional requirements should
contemplate how good the sensitivity will be.

All four of the contractors, when I asked that question a few weeks ago at
the forum, said they are not actually officially studying that as part of this
prototype. So, we really don’t have a path right now, through the course of
the collective experience, that says: when we try to achieve on, shall we say,
the fulcrum around which a lot of this Record Locator Service has to live,
what do we actually achieve, since we are not formally studying it? Part of
the reason the study is very difficult, as an entrepreneur working outside in
this space, is that there is no gold standard annotated data set of linked
records that I can subscribe to and say, I have come up with something really
cool or good that works — let me test it.

I don’t know how to go test it. So, either again through HITSP or CCHIT —
it is not really a certification issue — what is needed is the work to
sponsor, to authorize, to fund an initiative to create this gold standard
annotated data set that could be used by all participants who are working
towards this goal.
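To make the sensitivity/specificity point concrete, here is a sketch of how a gold-standard annotated pair set would be used to score a matcher. The tiny data set and the deliberately naive matcher are invented for illustration; nothing here reflects any contractor's actual evaluation.

```python
# Hypothetical sketch: measuring a record-linking algorithm against a
# gold-standard annotated data set, as the testimony recommends.

def evaluate(matcher, labeled_pairs):
    """labeled_pairs: [(record_a, record_b, truly_same)] -> metrics dict."""
    tp = fp = tn = fn = 0
    for a, b, truth in labeled_pairs:
        guess = matcher(a, b)
        if truth and guess: tp += 1
        elif truth and not guess: fn += 1
        elif not truth and guess: fp += 1
        else: tn += 1
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,  # true matches found
        "specificity": tn / (tn + fp) if tn + fp else None,  # non-matches rejected
    }

# A deliberately naive matcher: same last name and same birth year.
def naive_matcher(a, b):
    return a["last"] == b["last"] and a["dob"][:4] == b["dob"][:4]

# A toy annotated pair set standing in for the gold standard.
gold = [
    ({"last": "smith", "dob": "1950-01-02"}, {"last": "smith", "dob": "1950-01-02"}, True),
    ({"last": "smith", "dob": "1950-01-02"}, {"last": "smith", "dob": "1950-07-09"}, False),
    ({"last": "smyth", "dob": "1950-01-02"}, {"last": "smith", "dob": "1950-01-02"}, True),
    ({"last": "jones", "dob": "1981-03-04"}, {"last": "smith", "dob": "1950-01-02"}, False),
]

print(evaluate(naive_matcher, gold))
```

Without a shared annotated set like `gold`, two matchers cannot be compared on the same footing, which is exactly the gap the recommendation identifies.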

That was my last recommendation. Thanks.

DR. COHN: Vik, thank you very much.

We will try to hold all the conversation until we have heard all the
presenters, but very interesting.

Gary, you are on next.

MR. DICKINSON: My name is Gary Dickinson. I am director of health care
standards for CentrifyHealth, which is an emerging PHR vendor. We essentially
took the list of functional categories that you supplied and came up with our
own rankings for those. For many of them, we identified particular things that
we thought were particularly important to consider. We did not do an
exhaustive gap analysis; there was simply not time for that. But we appreciate
the fact that others have.

Basically — just kind of briefly skimming this list, because I have a
number of slides — audit and logging needs to include traceability, which is
end to end, at least in the space of source to receiver. Authentication needs
to ensure that we are identifying edge systems as they communicate within the
network and that we have authentication of the source and authorship of
records and data that are communicated across the network.

Authorization needs to include not only authorization for access to
individual health records, but authorization for access to system functions
and authorization for access to particular data content of the record.
Confidentiality is ensured by many of the other functions identified in the
list, so we won’t belabor that.

Credentialing, we believe, needs to include not only individuals but also
to ensure organizations are appropriately authorized and registered as
appropriate partners — communicating organizations on the network — and then,
of course, communicating systems would need to have the same level of
authorization and registration for their participation in the NHIN.

We assume that each of those would not only have a discrete identifier, but
would also be classified by role.

I am continuing on. I am now on slide 6. I don’t think there is anything
particularly astounding in any of this, so I will move directly on to the
recommendations for standards. We believe that the HL7 — I am now on slide 9 —
HL7 EHR system functional model is a key component of the standards that
should be considered as part of the NHIN. The functional model specifies the
functional characteristics of EHR systems, which include a set of
interoperability functions. It is, obviously, the framework for CCHIT and
their certification of EHR systems.

It is now a normative standard, which is going to ballot here next week at
the membership level. So, hopefully, this will be the final round of ballots
for the functional model as a normative standard. An emerging standard in HL7
— I missed a bullet point here — is the EHR interoperability model, which
specifies interoperability characteristics of EHR records and is complementary
with the EHR system functional model. It is now a draft standard for trial
use, which is also going to ballot this next week.

The bullet that I missed on this slide was that this interoperability model
specifies a common EHR record unit. We will talk about that in an upcoming
slide.

Also among the standards to consider is ISO 21089, trusted end-to-end
information flows, which specifies the persistent health record and key points
in its flow and lifecycle, including origination, amendment, translation,
verification, access/use, transmittal and receipt, anonymization, aliasing,
archiving and so forth.

Basically, what we believe in terms of network functionality is that we
have already defined, through the EHR system functional model and through the
efforts of CCHIT, a set of criteria for the systems at the edge, or at least
for EHR systems at the edge. We believe that the network functionality should
be very complementary with those functions of the edge systems on both sides —
in the simple paradigm we have outlined here, essentially a source system, the
network in the middle, and a receiving system — and ultimately that the
functions of all three, point to point, must be complementary.

So, we believe that there are basically four classes of NHIN interchange.
The first two are traceable to a persistent source of information, which would
typically be the originating system for a particular health record. Those two
are the individual health record traceable to a persistent source, and an
extracted subset of that health record, also traceable to that persistent
source.

We believe that other things would be communicated across the network,
which may not have that characteristic, which may not be traceable. We also
believe that messages are fundamentally transient interchange artifacts and
really do not appropriately play the role for persistent, indelible exchange of
information from source to ultimate recipient.

We think that this can all be simplified in a very straightforward manner.
As has been identified in the HITSP AHIC use cases, the use cases all resolve
down to an action. That action is the occurrence of some event in health care.
That action requires an action record, which is persistent evidence of that
action, and that action record is then interchanged either as a persistent
record in its own right or as an abstracted subset of that record. That yields
an interchange pattern which looks something like this, where the originating
system or the source system originates and retains the persistent record.

That record then may be persisted across multiple hops, if you will, to
additional systems, which also retain that persistent record in its original,
indelible and unaltered form, which can be authenticated in all cases. There
may also be cases where the second or third system may use an extracted subset
of that persistent record. But there really are only two cases in terms of
records traceable back to the source.
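One common way to keep a record verifiably "original, indelible and unaltered" across hops, which Mr. Dickinson's description is consistent with, is to fingerprint it at origination. The sketch below is a hypothetical illustration, not his company's design; the field names and helper functions are invented.

```python
import hashlib
import json

# Hypothetical sketch: the source system hashes the action record once at
# origination; any downstream hop can verify the record is unaltered.

def fingerprint(record: dict) -> str:
    """Hash of a canonical JSON form, taken by the originating system."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

def receive(record: dict, claimed_hash: str) -> bool:
    """Any later hop confirms the record matches its original fingerprint."""
    return fingerprint(record) == claimed_hash

# An invented action record from a source system.
action_record = {"event": "med-order", "source": "system-A", "drug": "drug-X"}
h = fingerprint(action_record)          # retained alongside the record

assert receive(action_record, h)        # hop 2, hop 3, ... all verify
tampered = dict(action_record, drug="drug-Y")
assert not receive(tampered, h)         # any alteration is detectable
print("record verified across hops")
```

Canonicalizing the JSON before hashing matters: two hops that serialize the same record with different key orders would otherwise compute different fingerprints for identical content.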

I would suggest that our standards focus going forward needs to be much
different than our standards focus in the past. In the past, particularly in
HL7, we have focused on what happens at the back end: point-to-point
messaging, or point-to-point interfaces using transient messaging. We really
need to move rapidly to a persistent legal health record, and by definition
that must be something that is created, originated and retained in the source
system at the front end. It is not a back end artifact. It is a front end
artifact, which is passed by the back end interface from system to system,
where systems may choose to extract only the information they care about from
that record, but at least there is a path for that persistent record from its
point of origination to its ultimate point of use.

Again, I believe that our standards focus needs to be front end focused and
that implies, of course, that standards must become internal to applications
and are not simply back end artifacts. I think that is a crucial leap for this
industry to move forward appropriately.

Thank you.

DR. COHN: Gary, thank you very much.

Our next presenter is Russell Williams. Thank you for joining us.

MR. WILLIAMS: Thank you. Good afternoon, everybody. My name is Russell
Williams. I am speaking today on behalf of the American Health Care Association
and the National Association for the Support of Long Term Care.

I am a nursing home operator in Atlanta, Georgia. We operate three
non-profit nursing homes. I like to think that in addition to speaking on
behalf of these two groups, I am also speaking on behalf of all of our patients
that we take care of everyday.

About three years ago I recognized the need to move in the direction of
electronic health documentation for the improvement of the care and service
that we expect our nurses and our nursing assistants to deliver every day. I
recognized that the checklists, the paper systems that they were using at the
end of their shifts, just were not sufficient to address the very detailed
needs of our patients. So, we started exploring all the different vendors that
were out there and we implemented this system.

I am here to tell you that it has been wonderful. It has been a lot of hard
work, but our patients today get better care than they ever have before. Our
CNAs have Palm Pilots that tell them what specific issues they need to address
and when they need to address them. This has improved our satisfaction and the
quality of life of our seniors. To me that is the most important thing.

So, I don’t think that we are unique in our approach to wanting to take
care of our seniors. It is just that we have to give our staff the tools to do
this work. That is simply something that our country doesn’t have right now.
Now, we have done our part, as far as I am concerned, my little organization
in Atlanta. But when we send our patients to the hospital, their information
doesn’t flow. They don’t know if a patient is allergic to a specific
medication, and they are starting all over again, every single time. So, we
are firm believers in interoperability, and we wanted to make a point today
with this illustration of our group. It is important that long term care is
part of the design of this NHIN, not something that we are assimilated into
after it is already made.

That being said, we did take some time to look over these standards, the
functional requirements. I thought I would just go through a few of these
ideas with you right now. At a minimum, adopt a common set of functionality
that allows nursing facilities to exchange information with hospitals and
doctors when transfers occur. This exchange must be handled in multiple ways,
including, in rural areas, by dial-up and by fax.

At a minimum, this connection for the facilities needs to be secure. It is
a common theme I have heard all day. Authentication. Important.

A system must be designed to accommodate a nursing facility environment
where nurses and nursing assistants provide the hands-on care, the bedside
observations, and complete the care status reports. At a minimum, the NHIN
best serves as a pointer system, allowing for the capture of the most recent
information.

And, at a minimum, we are concerned that if there are too many
“shalls” in these functional requirements, implementation will become
more or less cost prohibitive for long term care operators to become involved
in this network. So, we want to make sure that we are not trying to create
something that is a maximum.

Two examples of these maximums. The first was the flagging of new results
in the EHR. “New” is a term that must be defined in great detail when
providing around-the-clock care to long term care residents. Trying to flag
every report and every little change is not possible. It may make more sense
for long term care to submit information only when a resident goes out of the
facility for care, to the hospital or ER.

“Shall” may be too harsh of a requirement in this case.

At a minimum, in the data transfer area, it is important that appropriate
standards be selected and that those standards are disseminated to interested
parties for implementation in the edge systems. In long term care in this
country, we are projected to spend $128 billion this year to care for our
frail elderly in the nursing homes.

In 2015, that cost is projected to grow 60 percent, to $217 billion. The
people working in long term care facilities and the technology folks working
there need to be considered as this is being designed. It is my personal
belief that that projected $217 billion will be drastically reduced if we can
come up with this system and have long term care involved in it, because I
believe that it not only improves patient care, but ultimately it saves money.

As the long term care profession, we enthusiastically offer our HIT
experience to you to ensure that the NHIN becomes a reality. Long term care
will fit into this system in a comprehensive way to improve patient care.

Thank you very much for your time.

DR. COHN: Thank you. Actually it looks like you had another recommendation
specifically also —

MR. WILLIAMS: Actually, we did. I kind of skipped over that. This was
referencing the multiple language support, specified and displayed by the
user. Without qualification, this is untenable in long term care, we believe.
We question whether this can ever work — questions regarding which languages,
and who translates. Translation software with medical spell checkers struggles
currently with English. The cost of compliance, in terms of time and dollars,
we believe is beyond our means. “May” might be a better way of saying
this instead of making it a functional requirement.

DR. COHN: Okay. Thank you very much.

MR. WILLIAMS: Thank you so much.

DR. COHN: Mark Adams — actually I am sorry. Charlene.

MS. UNDERWOOD: I introduced myself this morning, so I will try and keep
this timely because I know you are on a schedule. My name is Charlene
Underwood, and this time I am actually presenting with a Siemens hat on.
Siemens Medical Solutions today is about 13,000 people in the United States.
It is a global company, and we provide medical imaging technologies, health
care information systems, management consulting and support services to nearly
12,000 health care providers.

On the focus of my testimony today — you should have two handouts — I am
not going to impart to you all the detail of my second handout; I have a
high-level piece, which we will walk through fairly quickly. I actually
included a lot of detail because my message here today is fairly simple. We
are not operating in a green field. There has been a lot of commitment and
investment on the part of the vendors in this industry, as well as our
customers, in implementing information systems for many, many years. There is
a lot of experience that can be brought to the table in helping solve and
address the NHIN area. What this testimony will do, and what the documentation
does more so, is really just correlate the parallels between the requirements
for the NHIN and what is actually happening today in trying to integrate and
share information across health care enterprises.

So, when you look at the goal of the NHIN, which is to foster widely
available services that facilitate appropriate, timely and secure exchange of
health information, there are many similarities to trying to do that in health
enterprises today. So, this information, I hope, will provide the opportunity
to leverage the lessons learned — that is what is documented in the detail —
as well as suggest that the NHIN efforts should focus on areas with
theoretical benefit but perhaps limited experience, and collaborate very fully
with the HIT industry in areas where there is clear overlap and experience.

I think you saw that in Kevin’s example, when he explained some of the
experience they have had with SureScripts. To that end, I am going to talk
about lessons learned, but I am going to ask you to put your David Letterman
hat on, because we are actually going to do it in a way to kind of wake up the
afternoon, if you will. We are going to talk about ten reasons — and we are
going to stay on the positive — why building a health information network is
achievable. I am going to share with you some statistics, some evidence of why
we know this.

First of all, importantly, to be successful we have to integrate data
exchange into the workflow. To that end, most of the vendors today have
adopted standards like CCOW(?), which enable data sharing at the desktop. They
have adopted concepts of portals, and in addition we have implemented — and,
again, we can’t forget the financial industry in this picture — processes to
really exchange eligibility information. For example, we provide a service
which supports eligibility, which processes over 12 million eligibility
transactions per month, and these today are fully integrated with customers’
registration, scheduling and admission systems.

So, this capability is available already. So, again, this is one of the key
requirements we believe of supporting this kind of a network, and there is
evidence and experience that you can leverage in defining what is possible.
We do know how to match patients. I will give you two pieces of evidence of
that. No. 1, one of our products is an EMPI, and in it we have over 106
million patients, which represents 37 percent of the population in the
United States. So, we have done this a long time. So, there is a lot of
experience.

It is not just Siemens. A lot of other vendors have had this experience; we
all have to match and manage patients, and there is a lot of experience
there. This brings a lot of value. Now, unfortunately, some of those 37
percent are dead, because patients that go to the hospital die. But those
are the kind of numbers we are using.

One other piece of evidence to that end comes from our eligibility services
— and I think this is, again, another great parallel. We process over 12
million real time transactions per month, and 99.9 percent of those
inquiries for eligibility are successful — again, matching algorithms that
are working today in both the EMPI sphere and the eligibility sphere.
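
The record matching the testimony alludes to can be illustrated with a toy
sketch. This is not Siemens’ actual EMPI algorithm — the field names,
weights and threshold below are all assumptions — but it shows the general
idea of weighted demographic matching that such systems use.

```python
from difflib import SequenceMatcher

def field_score(a, b):
    """Similarity of two field values in [0, 1]."""
    if not a or not b:
        return 0.0
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Assumed field weights -- a real EMPI tunes these statistically.
WEIGHTS = {"last_name": 0.3, "first_name": 0.2, "dob": 0.35, "zip": 0.15}

def match_score(candidate, query):
    """Weighted similarity across demographic fields."""
    return sum(w * field_score(candidate.get(f, ""), query.get(f, ""))
               for f, w in WEIGHTS.items())

def best_match(registry, query, threshold=0.85):
    """Return the best-scoring record above threshold, or None."""
    scored = [(match_score(rec, query), rec) for rec in registry]
    score, rec = max(scored, key=lambda t: t[0])
    return rec if score >= threshold else None

registry = [
    {"id": 1, "last_name": "Smith", "first_name": "John",
     "dob": "1960-04-12", "zip": "84103"},
    {"id": 2, "last_name": "Smyth", "first_name": "Jon",
     "dob": "1960-04-12", "zip": "84103"},
]
query = {"last_name": "Smith", "first_name": "John",
         "dob": "1960-04-12", "zip": "84103"}
hit = best_match(registry, query)
```

Production matchers add blocking, phonetic encodings and probabilistic
weighting, but the structure — score fields, combine, threshold — is the
same.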

We know how to exchange data to and from multiple systems. There has been
incredible investment in the industry in the concept of interfaces; Siemens
itself has libraries of model interfaces. So, there is great experience in
doing this. In the eligibility sphere — and I think you heard this in
Kevin’s presentation — we link to banks, we link to payers, we link to
multiple kinds of hospitals. Again, there is the ability to leverage that
experience.

We know how to provide high speed, available, reliable communications.
Again, just the example of being able to instantaneously process eligibility
transactions for over 12 million inquiries per month on line with end users
gives you an example of the kind of performance that is available, and
performance is going to be a key here in terms of the success of the NHIN.

Likewise, that supports your requirement in terms of supporting high volume,
real time transaction processing and data exchange. Actually, on point 4 —
this one I hesitate to put on the public record — we do have a very secure
network, which holds terabytes of data, has over 500,000 end users that
access the system and is fairly fail-safe. We have not had an intrusion.

So, it is possible to achieve kind of what Vik was saying. It is not that
you don’t want new capabilities, but there is experience in the industry
that you can learn from, and it gives you evidence that you can do this
today.

We do know how to integrate data of varied quality, granularity and
consistency. Again, there have been huge investments in concepts like
vocabulary engines, and our customers have spent a lot of time normalizing
their data to help move in that direction. They have done that over the last
decade. The example here in terms of the HDX services is that if you are
going to do these inquiries, again, you have a high percentage that are
successful inquiries. And I have one other piece here.
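
What a “vocabulary engine” does for data of varied granularity can be
sketched very simply: each site’s local codes are mapped to a shared
standard so results can be compared. The site names and local codes below
are hypothetical; the standard codes shown are real LOINC codes, used here
only for illustration.

```python
# Hypothetical local-to-standard code maps; real systems map to
# vocabularies such as LOINC, with far richer metadata.
SITE_MAPS = {
    "hospital_a": {"GLU": "2345-7", "K": "2823-3"},
    "hospital_b": {"GLUC": "2345-7", "POT": "2823-3"},
}

def normalize(site, local_code):
    """Translate a site's local code to the shared standard code."""
    try:
        return SITE_MAPS[site][local_code]
    except KeyError:
        raise ValueError(f"no mapping for {site}/{local_code}")

# Two sites' differently coded glucose results normalize to one code,
# so their data can be combined despite local naming differences.
a = normalize("hospital_a", "GLU")
b = normalize("hospital_b", "GLUC")
```

The decade of normalization work the testimony describes is essentially
building and maintaining these maps at scale.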

As we process these millions of transactions each month, the acceptance rate
of these transactions by the person receiving them is over 98 percent. So,
again, there is good experience in the financial community that
demonstrates, if we model some of that, there is evidence that you are able
to accomplish this network.

Perhaps one of the most important things is that we know how to collaborate
with other stakeholders to improve interoperability. There has been a
significant investment by the industry — through efforts like IAG and with
other stakeholders — to make this happen. We know how to accommodate local,
regional and national standards and policies. Again, we operate nationally.
We operate globally. We have to be able to do this.

These last two come from our customers. We had a focus group on Tuesday
morning and I said I was testifying, so they asked me to deliver really two
messages. This is perhaps one of the most important ones I can bring to the
table. There is a clear recognition by the providers in our industry that
the financing and delivery of health care is in for a major change. We want
to move to paying for quality.

So, to get there in the interim, as we move toward these new models of
health delivery, they don’t want to build anything into the NHIN that
perpetuates today’s model of health care delivery. So, their recommendation
is keep your first phases simple. Keep it incremental. We have an example in
one of our implementations, which has implemented secure messaging. They
share information among 500 physician practices, saving those physicians
$6,000 per year per physician. Again, secure messaging is open, agnostic to
whatever EHR. It automates manual processing and attacks the simple stuff in
terms of making it convenient, as well as addressing the productivity issue.

So, again, when I look at that implementation, to me it is a start-up NHIN.
It doesn’t do everything that we have talked about, but, again, it is grass
roots. It addresses the local level. It is very convenient. We invite you to
take a look at it.

Lastly, this again came from our customers. They say, well, okay, financing
is the biggest issue, but, you know, I don’t have to build an ROI case for a
CT scanner. Why should I for clinical information? Clinical information, if
you will, is just a tool to improve quality, and we do see it in the future
becoming an expected standard of care.

So, that was kind of their last recommendation. So, I guess funding is a
huge issue there. They view this health information exchange as a public
utility, and we need to look at it as becoming an expected standard of care.

The last couple of slides — and I am not going to spend a lot of time —
model what these architectures look like. I think this parallels the past
presentations of what the enterprise looks like. The top is the patient
centric view and the bottom is all the departments and facilities they get
connected to. That is the work we have been doing for the last 10 to 15
years.

And that is what our customers have been doing to integrate. If you look at
the regional health network, again, you will see a lot of the different
pieces, and you will see consumers attached to that regional network, as
well as perhaps some of the lower hanging fruit — some other facilities that
can’t afford more sophisticated attachments to the NHIN. So, again, you
start to see some of the same capabilities in the regional network.

Then if you move up to the NHIN — again, a network of networks — you are
going to see some of the same capabilities. At that point, consumers, or
perhaps some of the rural facilities, may attach directly to the NHIN. So,
again, they need things in common.

Lastly, I just wanted to convey that we do offer — and I mentioned earlier —
this integrated health EDI network. I gave you some of the statistics, and
other vendors offer these same types of capabilities. I encourage you not to
lose sight of the success that has been gained in supporting EDI and data
exchange in the financial venue, and to see what you can do to leverage that
as part of the NHIN effort.

Thank you very much.

DR. COHN: Charlene, thank you very much.

I think our last presenter is Mark Adams.

DR. ADAMS: Those of you who know me know I am notorious for not taking much
time talking about things. So, I will do my best today.

I am Mark Adams. For those of you who don’t know me, I am the project
manager for the caBIG project out of the National Cancer Institute. I am a
senior associate at Booz Allen Hamilton and I am a bioinformatics person.

I am here really to discuss how we have been implementing within the caBIG
project a multi-site, multi-institutional research network for collaboration
and coordination and how we can both make use of but also perhaps show the way
a bit based on the work that we have done to the NHIN project. I do appreciate
having the opportunity to present to you.

caBIG, or the Cancer Biomedical Informatics Grid, supports coordination and
collaboration among almost a thousand participants spread over more than 50
sites throughout the country and now increasingly internationally. Software
developed as part of this program is open in many different ways. It is open
source, freely available. It is built to open standards, a variety of open
standards, and we are involved in those committees. It is openly developed;
the software is developed within the community, at both commercial and
academic sites. And it is federated. It is open to connection without the
necessity of a single centralized system.

It is federated among often pre-existing systems. This picture illustrates
how diverse the group is, both geographically — we have a map with
participating centers spread out throughout the country — and in a set of
photographs illustrating the variety of types of activity carried out within
the context of the caBIG project. This range of activities is illustrated in
this slide, ranging from clinical trials activities and the collection of
clinical data, through the management and collection of the tissue specimens
derived from that, proceeding through laboratory analysis, both conventional
and high throughput molecular biology, and ultimately the correlation of the
results from that analysis back with the original clinical data collected at
the beginning, closing that traditional bench to bedside loop.

The intent here, obviously, is to create an opportunity whereby data
collected in any part of this system can be correlated, collated and
ultimately fed back for the use of clinicians, patients and clinical
researchers. This, we believe, is relevant to this group. It is our hope
that we would be able to leverage and participate in the Nationwide Health
Information Network, and that it would allow us to carry out this diversity
of activities within the context of that network whenever possible.

caBIG involves a large community, as I illustrated before, but this
community includes not only the folks you might expect — the large
universities, universities like the University of Pittsburgh, M.I.T.,
Harvard and others — but also government agencies, such as the National
Cancer Institute, FDA and others, and then finally companies both large and
small, IBM, Bellows and others. Again, it is a very large community and,
again, somewhat illustrative, I think, of the work that is going on here: it
is an attempt to take a very diverse group of communities and allow them to
communicate with one another in a consistent way.

The approach that we have taken with the caBIG architecture is federated,
again based on our needs. As was discussed yesterday, by making use of
tooling like standardized vocabularies and standardized data elements, an
approach that maps to those vocabularies and data elements is a way of
providing access to a variety of information and existing systems. I think
that is, No. 1, illustrative of what may, in fact, ultimately go on within
the NHIN, but also, in terms of our requirements, we would like to see the
networks which evolve from that be able to support this kind of federated
transfer of meta data.

As an example of how this coordination takes place — again, yesterday, for
those of you who were here, there was quite a bit of discussion around this
meta data modeling, standardized data models and so on — we have been
working closely in supporting and developing an expert group to develop
something called BRIDG. The BRIDG project is an effort to develop a shared
meta model for clinical research, which can bridge a range of existing data
models and provide a mechanism for interoperability among systems built
using those data models.

These data models include some of the standards that we have discussed here
today and yesterday, including HL7, CDISC and others. We do think that it is
this community-of-stakeholders approach, which doesn’t force the group to
select or elect a particular data standard but instead allows it to select a
set of standards which support the range of activities and then create a
meta data model that can provide an interface between those. We believe that
this is a reasonable approach, and it is one based on the idea of shared
semantics rather than the idea, as we talked about here, of rip out and
replace. This semantic foundation, again, as I mentioned, allows messages to
be passed between the variety of different systems.
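
The shared-semantics idea described here — mapping each local schema into
one shared meta model rather than translating every pair of systems
directly — can be sketched in a few lines. The field names and system names
below are hypothetical; the real BRIDG model is vastly richer than this
illustration.

```python
# Hypothetical field mappings from two local schemas into one shared
# meta model; each system maps once, to the shared vocabulary.
TO_SHARED = {
    "system_x": {"pt_id": "subject_id", "trial": "study_id"},
    "system_y": {"subject": "subject_id", "protocol_no": "study_id"},
}

def to_shared(system, record):
    """Lift a local record into the shared meta-model vocabulary."""
    mapping = TO_SHARED[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

def from_shared(system, record):
    """Project a shared record back into a local schema."""
    reverse = {v: k for k, v in TO_SHARED[system].items()}
    return {reverse[k]: v for k, v in record.items() if k in reverse}

# A system_x record travels through the shared model into system_y
# terms without any direct system_x-to-system_y mapping existing.
shared = to_shared("system_x", {"pt_id": "P-77", "trial": "S-01"})
local_y = from_shared("system_y", shared)
```

With N participating standards this needs N maps instead of N squared
pairwise translations, which is the practical argument for a bridge model.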

This BRIDG model — and, of course, this is just for illustrative purposes;
this is actually a picture, and if you could zoom in, you could read it — is
where BRIDG is today. BRIDG, as I mentioned, is based on the HL7 V3 RIM and
on CDISC. It has a variety of contexts ranging from tracking, authoring,
statistical analysis, recording and transmission, protocol design and so on
— all of those functions being reflective of core clinical trials
activities, admittedly, and not necessarily of, obviously, the delivery of
clinical care.

These lead down, or drill down, to a set of functions — on the Nationwide
Health Information Network. One of these slides just highlights them. We
have written testimony, which I will provide in the next couple of days,
which exhaustively goes through the list and analyzes it in light of both
what we have done and what we anticipate we might need. But just to give the
top lines of some of these: we believe that the NHIN should support core
functions which provide the means for operation of biomedical research; we
would hope that it wouldn’t exclude that as a fundamental use case. We
believe that it is of interest to provide for secure and robust transmission
of scientific data, sometimes including fairly high bandwidth data sets.

This has obviously been encountered in these discussions around imaging, but
similarly things like gene expression profiling and so on are actually quite
large and may in the future increasingly become part of clinical care, at
least in cancer. The NHIN platform should allow for the establishment of
systems based on a service-oriented architecture, as was nicely discussed
yesterday. The NHIN should leverage the BRIDG meta model, or something like
it, as a standard means of exchanging relevant clinical trials related data
— and as a means, maybe, of not being forced to elect between two competing
standards, but a way to bridge the standards that are deemed necessary for
success.

Finally, the NHIN can take advantage of the prototype established in caBIG
of services that provide semantic interoperability, even as a demonstration.
So, as you start to think, as was implied yesterday, about the development
of these standardized meta models and so on, we offer the systems that we
have built at least as a prototype demonstrating that that can be done.

This leads me to my final slide. What is the future here? We have some
commitments. We are going to continue to participate in this insofar as you
will continue to let us, but the caBIG program will also leverage wherever
possible the infrastructure that you develop and deploy. caBIG has a rule of
living off the land. We try to make use of things that are developed in the
community whenever possible. We try not to invent things. Members of the
caBIG community will integrate their software; we already have a number of
groups here who are also participating with caBIG, and I think that is
particularly important in the early phases of the project because, partly
due to its nature in biomedical informatics, caBIG tends to be kind of at
the edges, the periphery.

We encourage you to take advantage of those folks who were pioneers to test
early developments and deployments. They have experience doing this. The
NHIN is invited to participate in caBIG’s open process in whatever way is
possible and appropriate. BRIDG is the most obvious place where
representation from this group to that group may be appropriate. The aim is
to foster these close relationships between these groups and to ensure that
overlaps in development are closely coordinated. We are all trying to do the
same thing here.

Finally, NHIN is encouraged to make use of any of the caBIG tools that are
useful for the achieving of your goals. Again, they are all open and readily
available. We would encourage you to look at them whenever possible.

Thank you for the opportunity to testify.

DR. COHN: I want to thank our presenters.

Do people have any questions for the presenters?

MR. HOUSTON: Having been involved in caBIG, it is an interesting project,
and I know that we have had a lot of discussion — some of us had it at lunch
today — regarding how thin the core is, and I know that one of the
discussions in caBIG was about making the core almost nonexistent.

We sort of started talking today and we said it is almost like caBIG is
sort of on the edge of NHIN. Did you sort of perceive that to be where caBIG
falls also?

DR. ADAMS: I would agree with that. You know, it is interesting. caBIG I
would regard as a system that more or less thins the core out to the point
where it almost is nonexistent. There are a few obviously core services that
will have to exist, especially in the context of things like we are talking
about here, resolving a unique patient identifier and so on. But for the most
part, yes, an ideal situation for caBIG would be to live at the edge, provide
these research services but then be able to make use of some of these core
services like unique patient identifier when they are provided by the NHIN.

DR. STEINDEL: You just mentioned that caBIG has established a set of —
focused down on a thin core and established a set of functions. You just
mentioned one. Are there any others? I assume there are.

DR. ADAMS: Yes. Again, this is something we go back and forth on. Given the
nature, as you saw, of one of the key elements of caBIG, its federation, one
of the things that we typically struggle with is which things cannot, as we
believe, be abstracted into a federated set and must necessarily live within
a core. Security management is one. We have been discussing quite a bit how
one actually effectively and truly federates security, particularly
authentication and authorization across multiple systems, without having a
single core accepted registrar.

One way we have gotten around that typically is to say there is obviously a
necessity for core services, certification services, but there may also be
multiple vocabulary services and common data element services; those
currently exist at the NCI, and they are, for all intents and purposes, a
core. Now, it is the intent of the program to provide for federation of the
transmission and provision of standard vocabularies and standardized data
elements, but for now those are the three that we talk about a lot. There
are others.

DR. COHN: Other questions, comments?

We want to thank our presenters. These are obviously brief reports, but I
think certainly help flesh out and get us thinking about other ways of looking
at all of this stuff. So, we want to thank everybody.

Now, I know we are second day mid afternoon, tough time to be having
conversations. I suggest we take a five minute stretch and then let’s get into
the discussions of next steps.

[Brief recess.]

Agenda Item: Workgroup Discussion

DR. COHN: I don’t know whether this is going to be a long conversation or a
short conversation, but we ought to at least open the floor to the members of
the subcommittee and workgroup to sort of see where we think we are at and then
we will sort of talk about next steps and plans.

Certainly from my perspective, I was heartened that there is certainly some
direction and unification of view among many in the industry, which I think
is very helpful to us. It is really not so different from what I think we
thought about in the original draft framework that we put together.

So, I think it is good to know that we need to improve it. We need to
refine it, but we are certainly not far away from where we need to be. At least
that is the view that I came with from the last day and a half. But I am
curious about what other people think.

DR. STEINDEL: Simon, I agree. This was really a very fruitful and
interesting two days’ worth of information that was passed on to us in
discussion and everything. You know, I have seen some things in the
discussions. One of the questions that we have had among the ad hoc
workgroup in framing this thing is how to set some guidance for how we are
going to think about it and set up our reports.

ONC has asked us to stay out of certain areas, which I think is rightful,
because I don’t think anyone is really prepared to make statements at this
time on the architecture of the NHIN. We don’t know enough about what it is
doing and what it should do to make real statements in that area. We are
definitely not charged with going into the policy area and the other areas.
So, I feel that is very important. But without having some structure, I
think it is going to be difficult to frame the report, and I have found two
what I consider to be relatively trivial points of convergence that I have
heard from all the speakers.

I hope this doesn’t sound too trivial, but I think it is going to be helpful
in framing the report. The first one is that the NHIN is a system of
systems. That may sound trivial, but I think it is very important. Everybody
has talked about the concept of multiple things being hooked together. So, I
think we have to conceive of it as a system of systems.

The other thing that people have told us is that these systems that are
going to connect have varying degrees of sophistication. While those seem
trivial, I think they are going to help frame a lot of what we discuss, talk
about and put down. If there is no objection, I would like those to be
accepted as some type of framing mechanism.

MR. BLAIR: Could I suggest something as part of that framing mechanism,
because I think there are going to be more of these — Steve has started us
off — but I think a good way to phrase this is that these are the
assumptions we are putting in. That way we are not saying it has to be this
way. That is why we do not wind up making policy statements or architectural
statements, but rather a set of working assumptions upon which we can
achieve consensus and thereby start to articulate the minimum functions.

DR. STEINDEL: I think that is a good clarification, Jeff.

DR. COHN: Comments? Stan.

DR. HUFF: I agree with those as framing constructs, and I would go on and
say that my assimilation of the testimony and the other things comes down to
that. Actually, the final presenter was probably sort of where I was at, in
the sense that the core got so thin it is almost invisible there in the
middle. The only thing that is maybe flowing in the middle is, in fact, just
secure transactions on the Internet. The whole thing just represents
software components in systems. Those systems could be part of, you know, a
health care organization, a provider organization, a government entity. So,
you are not forcing roles on particular organizations relative to how they
participate in the network.

It is really a set of standard software functions and services that can be
deployed in anybody’s system, and if you adhere to those services and those
interfaces, you can participate in the network. I can see that is troubling
in some way, but —

DR. COHN: Maybe I was looking a little troubled as you were talking about
it. I think I almost agree with you. There is fluidity here. However, there
are things — the thing with the Internet is that you use the Internet, you
connect to it and all of that, but you don’t maintain and host the Internet
at Intermountain Health Care.

So, I think we heard from caBIG that there were actually things that they
did consider to be the core. Not a lot — I would absolutely agree with you
about that — but I do think that there is some distinction there. It is not
a hard line. It is more fluid.

DR. HUFF: There are a couple of things more that I would say. I am going to
have to go to a conference call. That is why I am trying to fit this in.

So, what that means is that a primary role — I wouldn’t say necessarily the
primary role — in creating this is, in fact, creating the standards so that
people know what they have to pay to play in the game. I mean, with the
Internet, again, you don’t own the Internet, but there are defined standards
about how you can connect to it — what kind of hub you need, other kinds of
stuff like that. So, the primary role, I think, of the government or of the
national organization is setting the standards that say how this can be
done.

So, we are helped in that the standards and implementation guides should be
approved by HITSP. I mean, that is what their job is. That is why they were
commissioned. We could say maybe it should be somebody else, but conformance
testing and knowing whether I am actually qualified to connect should be the
responsibility of CCHIT; it is their job to test whether people can plug in
there.

Another sort of founding thing is that we are not talking about policy, and
one of the reasons is that policy is going to change over time. So, these
things — the systems, basically, that participate — should be constructed in
a way that makes it easy to implement policy and data changes, because the
kind of data you are sending, the kind of data that is of interest, the
policies of the government are going to change over time. So, you don’t want
anything in the system that prevents you from quickly implementing policy
changes when those things occur.

The one thing that I think gets back to what you were saying is that
whatever these functions are, if you are going to have data sharing at a
national level, it implies, much as the last speaker said, that the function
has to span the population that is going to use it. So, if it truly is
national, whether it is distributed or otherwise, it means that somewhere —
truly, if we are going to share nationally — I have to be able to look up a
patient that I want to share about, some way. It might be that I have to hit
four distributed servers or I have to do something else, but it has to be
national, and I think that is the part that is not invisible. There has to
be some organization that is as big as the population of people who want to
share, one that supports things like authentication, certification of the
users, and authorization functions.

Those are the things that really have to be bigger than any individual or
else you can’t do it.

DR. COHN: Stan, do you have to leave this moment or —

DR. HUFF: No, I have got five more minutes.

DR. COHN: I think a lot of people were going to respond to you, I think.
John Paul, were you first and then Steve and then —

MR. HOUSTON: I am glad you said that last part, because I think there are
definitely some components that are more than just standards — core
functions that really have to be in place. You are not going to get the
requisite public confidence in the environment unless they exist in a real,
meaningful way.

I think, you know, authorization and authentication are just not edge
functions. I mean, I think at the core there has to be — I guess I am
searching for a word — maybe credentialing is the right word, but, if
nothing else, there has to be this concept that whoever participates in the
NHIN is credentialed in some fashion. That may be a strong word, and all of
that credentialing information may ultimately be pushed out to the edge, but
there is still a substantial core responsibility that coordinates, that
manages, that ensures the integrity of that type of information — that
people are who they purport to be.

I think it also is going to extend to having managed such mundane things as
log information, because I think part of what is going to come out of the
NHIN is that, for people to accept it, they are going to have to have the
ability to say, who looked at my records? Who asked for my records? I can’t
see that being something that we can push out to the edge. It is going to
have to be something that is ultimately at the core in order to coordinate
it. I know that sounds like a lot of work, and problematic. I see Steve sort
of nodding.

This stuff sort of adds up, and I got into the same discussions about caBIG
when I was in the security meetings for it. There are so many things that I
think the only way ultimately to manage them is to make them core functions.
I am not sure how you go about doing that and what organization supports it,
but I don’t think we can sidestep it in terms of the design of the NHIN.

DR. HUFF: I think you can step through each of the things, you know, the
functions, and ask: is this something that each system that participates has
to support, and how is it then aggregated at a national level?

MR. HOUSTON: I don’t think it is even aggregated. Some pieces —

DR. HUFF: Aggregated is the wrong term. How do you make it so it is usable?
If you talk about data logging, I mean, that is already inherent in a lot of
systems, and I am going to keep the local logs. The question then is, how do
I make those logs available if somebody nationally wants to ask how their
data was accessed? It is probably not necessary — well, it is not necessary
— to assume that those logs go out into some national database.

MR. HOUSTON: I agree in that context, but in another case, again, how does
somebody participate in the NHIN? How does somebody become credentialed in
order to —

DR. HUFF: That is something that is absolutely essential.

DR. STEINDEL: That was a good segue to some of the comments I was going to
make.

You know, like Stan, I think there are going to be a lot of local functions
that are already done that have been addressed as needs of the NHIN —
auditing, logging — but what we probably need in those cases is to clearly
enunciate that we need standards so that we can retrieve that data from the
stores of it, whether they exist on a regional, local or even national
basis, so that we can combine the information. Standards for auditing, for
instance, don’t exist, at least not within health care.
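
The gap named here — logs kept locally, no standard way to combine them —
can be illustrated with a small sketch. The record shape below is entirely
hypothetical (no such audit standard existed at the time of this
testimony); the point is that once every store can emit a common shape, the
“who looked at my records” question becomes a simple merge.

```python
from datetime import datetime

# Hypothetical common audit-record shape; a real standard would
# define these fields, their formats and their transport.
def normalize_entry(source, raw):
    return {
        "source": source,
        "when": datetime.fromisoformat(raw["when"]),
        "user": raw["user"],
        "patient": raw["patient"],
        "action": raw["action"],
    }

def who_accessed(stores, patient):
    """Merge locally kept logs into one chronological answer."""
    entries = [normalize_entry(src, e)
               for src, log in stores.items() for e in log
               if e["patient"] == patient]
    return sorted(entries, key=lambda e: e["when"])

# Logs stay at the edge; only the query result is combined.
stores = {
    "clinic": [{"when": "2006-07-26T09:00:00", "user": "dr_b",
                "patient": "P-1", "action": "read"}],
    "hospital": [{"when": "2006-07-25T14:30:00", "user": "nurse_k",
                  "patient": "P-1", "action": "read"},
                 {"when": "2006-07-25T15:00:00", "user": "dr_c",
                  "patient": "P-2", "action": "read"}],
}
trail = who_accessed(stores, "P-1")
```

This is the architecture Dr. Huff describes above: the logs never leave
their local stores, but a common format makes a national answer assemblable
on demand.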

That was just one off-the-top-of-the-head example. The other — and I have a
question here, perhaps for John, perhaps for discussion — is that, you know,
we have been told that we don’t do policy within this group, but we have
heard many times that the testifiers have called for policy in certain
areas. Do we mention among ourselves that many people have said we need a
policy for such and such? I think one example is the one that John just
mentioned, and that was credentialing. There needs to be some type of
national standards and policy for credentialing the people who are going to
ride the NHIN. We don’t have to enunciate what it is or anything, but I
think we should clearly say that something like that is needed.

With regard to that, in the same vein as before, we also need some
standards for how to express that credentialing as people get — move on the
NHIN. Those were just some off-the-top-of-the-head examples. There are a lot more I can
enumerate as we talk.

DR. COHN: Sure. And I commented and I am not sure if it was with John, but
I think in the conversations there was a recognition that we would run over
policy issues. I think there is a difference between identifying a policy
issue, which is something that we do need to capture and record versus trying
to resolve it.

DR. STEINDEL: I can assure you I don’t want to resolve it.

DR. COHN: I guess I should look at John. That was my understanding of our
discussions and early agreements.

MR. LOONSK: As we expressed at the forum, we are not trying to avoid the
policy determinations. There are a variety of different groups that are working
on them and we are trying to get policy implications out of this activity that
can be fed into different groups. We are trying to get results from those
groups that will be fed into subsequent steps of this activity as well.

DR. COHN: And given the time we have, we couldn’t solve these policy issues —
we would have to be here from now through September 15th to get them discussed
adequately.


MR. BLAIR: My objective is to stay focused on our primary mission, which is
to identify the minimal functionality that is needed, minimal but inclusive. I
have some conceptual struggles and maybe in a small group we could come to some
consensus, not that it is right, but just something that enables us to go
forward because I am struggling. Okay?

I will give you the problem I am having. We can readily come to a pretty quick
consensus that the NHIN is a network of networks and that is fine. We heard
that there are either RHIOs or gateways or subnetwork
organizations that could all be part of those networks of networks. When we go
down the idea of saying all we need to do is connect these RHIOs, subnetworks
or gateways, people start to argue with the idea that the NHIN could be very,
very thin.

That looks like it is simple, but then we look at accountability and we
look at security and I start to feel uncomfortable that I can’t just delegate
authentication and authorizations and access controls to all of these other
networks unless — and I don’t know the mechanism. I don’t know the mechanism
of saying well they are actually another part of the NHIN and, therefore, they
are within our scope to say they are core or another mechanism to wind up
saying, well, they are not part — and I don’t care which way we go. So, I am
not arguing for a particular construct. But I need something to be able to do
this with some integrity and consistency because there is a lot of the
functionality that has to be consistent at the outside edges, at least for
security and then I start to run into another problem.

This is why I am saying this as something that I think maybe some folks
after they hear my problem, they can wind up saying, well, we could do it this
way and we could come up with a construct and we can go forward. Here is the
problem I start to run into: each of those subnetwork organizations,
some are e-prescribing and some health information exchange and some are the VA
and some — they have different requirements, even for security.

Some of them have very demanding requirements for security and some are
much less or even caBIG. So, do we start to impose the most stringent levels of
security on all of those folks? So, I am struggling with how to deal with this.
So, if there is someone that has a construct, I don’t care which one it is,
which enables us to deal with this —

DR. COHN: Jeff, maybe I will try to respond. I guess — I mean, part of it
is chain of trust issues, but also part of it is role-based and various other
types of access controls, some of which I saw in functional requirements and
some of which I haven’t. I think I was getting into this consternation
yesterday when we were looking at personal health records, the same sort of
question about how do we handle —

MR. BLAIR: That is another good example.

DR. COHN: — I mean people sort of touching and making sure that things
stay secure and that things are not released inappropriately. That is, I guess,
the lesson — what I was hearing today.

John, you had your hand up on this one. You were trying to respond.

MR. LOONSK: One way that the issue that Jeff has brought up has been
discussed is the lowest common denominator in the security area. Can you assume
that your security is based on the lowest or the weakest link in the chain?
That is an issue that has to be considered.

I did want to back up a little bit and note the differentiation between
what Steve said, which was a system of systems and what Jeff said, which was a
network of networks. I do want to point out that those are different.

MR. BLAIR: Yes. Good point.

MR. LOONSK: And again, just to get back in the framing context to the fact
that I do think the name game is going to be problematic for this activity and
that we have to continuously focus on not the thin versus thick, core versus
not core because many of these issues still have to be resolved in other
processes and contexts, but the totality of functional requirements necessary
to connect up edge systems — where an edge system could actually be a network
or a system per se. It could be an EHR or it could be a network of EHRs as an —

That is not to say that what is connecting that up is necessarily the NHIN
versus another network that is doing that. What it is to say is that we have to
be looking at the broad scope right now and defining the requirements or it will
be very difficult to parse out and get to an end product here in the time frame
we need to do it.

I don’t mean to sound like a broken record. I did say this yesterday just
before lunch time. I did make a point as well that when you look at the — and
when I think people see the NHIN prototypes, they are being called the NHIN
prototypes but are they prototypes of this thin network that people are
conceiving of? No, not really. They are going to be software prototypes of
capabilities for data exchange in a more regional fashion.

You could interpret perhaps the interconnection between them as potentially
being that which people are conceiving of as thin. We still are going through
this prototyping stage. We are doing that on purpose because we think that
there is a lot that still has to be worked out and that we are going to learn
from that process and some of those results are not going to come until after
this step and this work is done.

MR. HOUSTON: Is there a thought that the next step would be to take those
four separate — what do you want to call them, projects — and look to try to
link them, or do some type of evaluation of whether they could be linked
effectively so that they are no longer just simply regional projects but are
becoming more of a prototype for a national infrastructure?

MR. LOONSK: There have been discussions about that as a model. I think we
heard some very strong voices during this testimony about RHIOs that have a lot
of technical infrastructure who are very interested in doing some of those
things themselves.

I think there is another extreme of RHIOs that are not at all about IT
infrastructure and are about just business practices. So, there is a broad
spectrum of variability that is going to exist for some time to come. I think
one model is to think about how those could connect up. Another model is
relating regional exchanges that are coming from different angles and different
perspectives. So, there are a lot of permutations.

MR. BLAIR: Is there anybody who could help me with my dilemma here?

DR. COHN: Maybe Harry — I had thought that I had responded. I guess maybe
I didn’t resolve your concerns. Steve, do you have a —

MR. BLAIR: Say it again. Maybe I missed a resolution there that you were —

DR. COHN: Steve, did you have a —

DR. STEINDEL: Let me jump in a little bit because I don’t know if this is
going to resolve your dilemma, but, you know, I was mentioning that there are
going to be varying degrees of sophistication. We have heard these varying
degrees in the use cases for systems that are going to be using the NHIN, just
today alone. You know, for each one of the different areas — you can
characterize what they said as what you need is essential functions for the
NHIN, and you can characterize it as either a very thin set or all of them. But
they seem to — in each particular set, they did focus on a set of areas that
really were unique to them and may not overlap all over the place.

We got this the first day with each one of the prototypes, depending on
their model. We got it with the PHRs, for instance. And what I think is that
the set of essential functions is those functions that we can require or
should require of anything that connects to the NHIN. Then we have to tease
out of those 20-odd lists of functional areas, and maybe subareas within those
areas, which ones are really not required of everything that will connect to the NHIN.

This kind of addresses what Jeff is talking about because some of the
things that he was — you know, when he is talking about the various degrees of
sophistication of the systems, yes, VA internally might have higher security
requirements than the NHIN and that might be justified. We heard that from the
FHA, that the federal systems have certain more stringent security requirements
than are generally used in the private sector.

Now, we may feel that those security requirements should be in the private
sector. I think that is another discussion, but we also heard that what VA
intends to be is essentially a gateway to the NHIN, that there will be one
connection from the VA into the private sector. So, you can presume that that
might run under certain security requirements. Once there is a connection in
that gateway, it will be handled under the federal security requirements.

I think we might see the same with e-prescribing. We got the hint of that
from Kevin, that they might be willing to serve the same type of function
within the e-prescribing arena.

So, I think, you know, addressing what Jeff is talking about: yes, we are
going to see different ways of doing this within the various systems that
connect, but I think we need to talk about, okay, we do need authentication.
We do need authorization. We do need credentialing. I think that has been
fairly common. The NHIN should establish a set of standards to do that. That is —

MR. BLAIR: I would like to be able to go that path, but help me with
something else. I have got some stumbling blocks to be able to go down that
path. Like, for example, lowest common denominator and we wind up saying
minimum security requirements. But what if you have one of your ED systems or
two of your ED systems that require nonrepudiation? That has to be end to end.
So, it is not like it ends at the border of that subnetwork organization. The
NHIN core has to be able to take that and go all the way through to the other
end. How do we deal with something like that?

MR. LOONSK: I misspoke when I said lowest common denominator. The issue I
was really trying to tease out was weakest link and the concerns in regard to
if you have a shared trust model and everyone has to be participating in it,
how do you avoid being dependent on the weakest link in that chain? But I also
think, to follow up on your other point, that it is inevitable in this kind of
discussion but that we haven’t drilled down far enough to really appreciate
cross organizational auditing or some of the issues you are talking about,
Jeff, whether they be credentialing or role-based access control or a whole
variety — nonrepudiation, a whole variety of issues where when you actually
drill into them and think about them across organizations, there are
infrastructure and services that need to be introduced that are not currently
present. It is not that we are trying to say that makes the NHIN thicker.

What we are trying to say is regardless of where they are, they have to be
in place for it to function. What we are trying to do with the functional
requirements is say so noted. The functional requirements need to be inclusive
of those capabilities. We are not then saying that they are inherent parts of
the NHIN.

They may be edge systems that provide those services, and that is a
reasonable model. We heard a little discussion — it was certainly, I thought,
very helpful to look at SureScripts, a group that is doing a lot of function
that could be seen as parallel, that has a model for how they do some of that.
They were definitely thicker than people were thinking about in terms of NHIN.
But even if that was a model that was replicated, it could still not
necessarily be considered a core NHIN, but another service that is providing
those capabilities to foster it. So, that is why I keep harping on this
concept: if the functional requirements identify the services, the systems,
the capabilities that need to be in place regardless of where they reside, we
will be in the strongest position to move forward with this activity. The way
in which those are provided is really partly strategy, partly industry based,
in terms of working with industry to make sure that it can all be well
coordinated. And I do think that we shouldn’t underestimate the fact that
there are services. There is infrastructure.

I think we are seeing hints of that; you get a better glimpse when you start
to look carefully at how something that is transacted from one organization to
another, where you have these requirements for how it would be exchanged, has
to be manifest. That doesn’t mean it is a thick NHIN. But it
does mean that they have to be attended to in terms of these requirements and
thought about in terms that they exist somewhere.

MR. REYNOLDS: I just think we are jumping ahead too fast, period. We heard a
lot of good stuff. I think we have — because we have had some sidebars, we
have been able to list — Steve said 20, or somebody said 20, and maybe it is
10, maybe it is 8, whatever. I mean, we had enough testimony to come up with a
list of, you know, what might be that core or base
functionality. I think we need to stay on our game and go through those because
we have not discussed any of those yet. We listened for two days. We had a
little bit of chance. But we listened for two days.

I think we need to go through there and then once we go through there, then
I would feel very comfortable dealing with Jeff’s question on nonrepudiation
because once we had looked at what we think core functionality ought to be or
the core things that we are looking at, we would know whether that is or isn’t
involved, can or can’t be taken care of, isn’t a function — because the other
thing, I think the only thing I am hearing as far as the RHIOs are concerned —
and I will play off one of John’s words — everybody wants to decide who is
going to do it. So, you have got a lot of that.

You have got everybody stepping up saying I am it or my way is it. That is
fine. They should be because they should be proud of what they have done, but
we can’t fall into that and we can’t start lining them up as whether they are,
you know, core or edge or anything else. We talk about functionality and then
there may be some people that play the whole gamut. Good. As long as they meet
the standards, as long as they meet what we said we were going to do, as long
as they meet the requirements, you know, fine.

But if we get caught up in that, then I think we will lose sight of the —
we are supposed to be talking about functionality and any time you design or
think about anything, you look at the 80 first and then you put the other 20
against it. We got to be careful because a lot of the presentations were at the
20 or were at the 5. We wanted to hear a cross section. That is not good or
bad. That is just what we asked them to do.

I feel good. I think we got a lot of good information in two days. I think
we got some structure. The last four or five — probably there were six —
presentations really took our questions seriously and took the whole function
seriously and really laid out some ideas. Now, I am
not agreeing or disagreeing with the ideas, but they gave us a nice — they
gave us at least a template to discuss it from.

So, that is where I think we are. So, I am pretty happy with where we are
and I think we are going to have a short list of how to go about it and I
didn’t say we were going to have a short answer. But at least I think we have
got a structure and I think if we try to jump any further than that right now,
everyone around this room could either blow it up or make it simple. Make it
thin or make it fat.

I think we are in good shape. I think we are in a good spot and I think we
just need to go about it.

DR. COHN: Maybe do a little more analysis of all the things we heard.

I think Mary Jo and Steve and then Judy.

MS. DEERING: What I am offering may boil down to not much more than an
editorial point later when we come to present it, but a lot of this discussion has
been requirements in terms of things that do boil down to some aspect of
security and I would like to just offer that what I also took away is another
perspective of minimal requirement, which is for the connectivity itself again.
Not whether you need to do — Steve is shaking his head.

DR. STEINDEL: I agree with you, Mary Jo.

MS. DEERING: And it is a more affirmative approach, shall we say. I mean,
this first approach has been one of constraint and concern and appropriately
so, but the other is affirmative and supportive of a lot of those things that
are at different levels of connectivity and/or that are faced with multiple
gateways of connectivity. I took very seriously Kevin’s slide, you know, this
one, where he contrasted the national versus the RHIOs. Then if you add a lot
of perhaps individual entities that have to be part of the NHIN and who don’t
necessarily see themselves as part of any of those subnetwork organizations,
then I think we also have to keep that image in mind as we move forward.

DR. STEINDEL: Well, I thought I was just going to say I am not going to
follow Harry because he basically expressed most of the thoughts that I was
going to say, but since Mary Jo introduced — I want to say the only reason I
mentioned like security and auditing and — it just happened to pop to my mind.
I think Harry, you know, gave an estimation of probably the scope of what we
were looking at and it is probably somewhere on the order of 20 plus or minus.
I don’t know what high level or how high a level, but things like connectivity
would fall into them when we start discussing outside of the two days of
intense hearing and brain frying.

DR. WARREN: I was going to say I almost would like to say ditto except I
would like to add one more dimension to it of when you asked us for comments, I
was sifting through my head of what the comments were, but I could only think
of the high things that we heard this afternoon. It was hard for me to go back
and remember what we heard this morning or yesterday. I am at the point that I
am sure everybody else is. The brain is not working as well and I almost need a
chance to sit down and analyze what was in each of those presentations and to
do kind of a grid or something so that I can figure out what was there and what
wasn’t there.

That is all I will say. I don’t think I can make any kind of informed
discussion right now because probably tomorrow I would say, oh, God, did I
really say something like that and want to deny it.

DR. COHN: Well, Judy, you actually bring up a good point about whether or
not we — we obviously have a consultant who can help us and rather than
spending a lot of time worrying about it, this is the occasion where we take
the pieces and massage them and then begin to look at them, begin to send them
out for comment and sort of begin to produce things.

John Paul, you had a comment. Marjorie.

MR. HOUSTON: I have more of a question. Can you as articulately as is
possible restate our charge, just to make sure I understand exactly what we are
— listening to all these presentations almost seems like at times it — I lose
sight of what we are trying to do.

DR. COHN: I think it is relatively — actually nothing is straightforward.
Never is anything straightforward, but what I will read to you is basically
what I read this morning, as well as yesterday and it is basically taken from
agreements that we had with the Office of the National Coordinator about what
it is we were supposed to do.

We were asked to review and synthesize the results of the June meeting and
by the end of September to define a minimal but inclusive set of functional
requirements for the initial definition of the NHIN and its possible
architectural forms. What that other piece tends to mean is, well if an
architecture is completely different and it requires different pieces, that we
need to do that.

DR. WARREN: Would you just read that one more time because I thought you
said look at architecture and I thought we weren’t doing that.

DR. COHN: No, no. Basically, I was describing — we basically just need to
be aware that requirements may differ by architecture. We are not making
decisions on architecture. We are not making decisions on policy but we are
obviously observing all of that. You can’t talk about functional requirements
completely without — in a complete vacuum.

MR. HOUSTON: When we say the NHIN, just to definitionally make sure I am —
define NHIN in the context of that statement just so I make sure I understand
where our boundary is.

MR. LOONSK: The terminology is very problematic here and I have seen this
before, but whenever you say “network,” people have a concreteness to
that that is difficult to get past and the “it” in this regard is the
initiative. It is not the thing that some people have been trying to define for
us, thin versus thick. What are the requirements we have to work on to meet the
goal, the vision of what this is. Some of those requirements will exist for
edge systems. And some of those requirements may exist for connected networks.
We are not looking to delineate those right now. We are looking to look at the
broad initiative and make sure we have got the inclusive requirements that can
help move it forward.

DR. COHN: I was trying to think about whether to have Margaret put up that
slide that we had talked about, which I think we know needs to be changed. So,
I wasn’t really having her do that at this point. But if you think about it, it
is sort of, to use the term, sort of edge to edge and it is really about more
than just edge to edge. It includes the edge systems and it needs to basically
be able to do things from one side to the other in an appropriate fashion.

MR. LOONSK: And clearly not requirements for an edge system to carry out
activities internal to its — for example, if you are talking EHR, we are not
talking about functional requirements of providing EHR services to the
clinician using them, except as they relate to connectivity.

DR. COHN: Now, Marjorie, you have been very patient.

MS. GREENBERG: Actually, this isn’t what I was going to comment on but just
the bottom line I kind of came down with as to what this is about is I thought
it was best expressed really by Robert Cothren, one of the four — Northrop
Grumman. But —

PARTICIPANT: He is known as Rim.

DR. COHN: He was supportive of the rim, too.

MS. GREENBERG: I thought I heard John kind of concurring with this that it
isn’t so much what is NHIN and what is edge, but what are the minimal but —
and if I am wrong, you can tell me. That is fine, but minimal but inclusive
requirements for essentially nationwide interoperability for being able to —
as he said, the requirements of interoperability may be addressed by core or
edge systems. But it is basically those minimal requirements for that —

MR. LOONSK: I think the complexity with that could be that some
interoperability is related to intra —

MS. GREENBERG: I realize that. That is why I said nationwide interoperability.
But in any event, be that as it may, I mean, I think it sounds to me like we
are somewhat moving away from what should be here and what should be there. It
really is what does it take to connect securely, but I really agree with what
Judy said. I guess I am struggling less than Jeff. Partly I am not a committee
member and I don’t have to vote, but partly because of the woman, though, who
is sitting next to me because I figure, you know, it is like the kid who
doesn’t worry about anything because mother will worry about it.

MR. BLAIR: Actually, I think some of my concerns at this point have been
resolved by clarifications — so I am satisfied.

MS. GREENBERG: I figured we engaged Margaret to now go through all of this
and I don’t envy her for it, but I know she is woman to the task because it
really does require that now. I want to really congratulate those of you who
put this hearing together. I think it was really a good job. I thought it was
not only very, very interesting but I think we heard from as diverse a group as
one could hope to in the few weeks that you had to put it together. Mary Jo and
everybody else around the table and not in the room, but who participated, I am
very gratified by it, I must say.

It sounded like — I can compare these two days to when I went to the
regional hearings on privacy and I said, oh, my — people were all over the map
and people, one after the other were getting up and saying the absolute
opposite of the person before. Well, it is that kind of an issue but we still
came out with, I thought, a really credible letter.

This wasn’t like that. I think this wasn’t like that at all. Even the
people who seemed to be saying different weren’t saying things that different
it seems, but I still have the feeling that the devil is in the details and
that is why I think it really does require someone, you know, to look through
what people gave us, get down, as you said, below the top level into some of
those more detailed levels and that can’t be — you know, that has to be done
probably by one person or a team of people who just has all the documents and
heard it all and understands it and takes it all into that.

The one thing that I am struggling with a bit because of that, because I
have this sort of little fear that the devil is really in the details, is that
this was the only place where I saw maybe a real difference: some people kind
of feeling like what you really should be focusing on is that EHR use case and
less on the biosurveillance use case or the PHR use case for the — there were
a few people who made that comment. Shake your head all you want, Steve, but
some people did say it.

But, obviously, we can’t do that because there are three use cases and this
will have to meet use cases well beyond the three that are being discussed. So,
what isn’t clear and I think someone asked the question and I don’t know if
really drilling down into all the documents will help clarify this. Somebody
yesterday asked the question what functionalities are unique to the uses other
than the care use, the electronic health records use. I think the minimal but
inclusive can’t just include the health care use obviously. It has got to
include these other uses, but that is the one question I really have is whether
there are some functionalities that when you are trying to think minimal really
wouldn’t be so necessary for the care use case and then if so, is there some
way — but you still need them. So, is there some way to make them not overly
burdensome so we don’t weigh down the whole system.

That is a question that I have and I think it requires really looking at
the details.

DR. COHN: We would actually love to hear the answer from you after you have
analyzed the document.

MS. GREENBERG: I am not going to do that.

DR. COHN: I don’t think that we actually should do the analysis on it only
because I don’t even know how you would do it. We have functional requirements.
These functional requirements are loosely tied to use cases, but not tightly.
So, we would have to go through and, you know, our own judgment determine what
functional requirement related to what use case, which would be sort of
secondary — I mean, we would have to hold a whole hearing on that.

MS. GREENBERG: Well, maybe it came out sounding like — different than it
was. I do think there are some that are related more to one use case than to
another, certain requirements.

DR. COHN: I guess I should ask —

MS. GREENBERG: Maybe it is not true.

MR. LOONSK: I can direct you back to some of the initial discussions we had
because there actually — the requirements that had been submitted from the
consortia actually were parsed by use case. One of the attributes was what use
case they were associated with versus what — but in the spreadsheet, it is —

The other axis that they are all delineated along is entities or — so, we
had functions, and the functions are — you teased out those in terms of
looking at them, but we also had systems or services, entities as they were
called, and that tended to relate more concretely to software, and then also
use case. So, same requirements, but with each of those classifications, where
you can sort them and re-sort them according to those delineations.

That being said, I do think that our — and we have to anticipate that the
NHIN is going to grow organically but that our initial — much of our initial
detailing comes from the breadth of these use cases and all three of them are
very much on tap for —

MS. GREENBERG: I know they are. It just seems to me that some use cases
declare more of one thing than another. I don’t know the extent to which we
have to understand that or worry about it. Maybe not.

MS. AMATAYAKUL: Well, I can only suggest that I have actually done a number
of sorts, and sorts within sorts, but one of the sorts I have not done is by
use case, because I actually didn’t feel like — I thought that would be
biasing. So, I did not sort by use case, not that I couldn’t or won’t in the
future, but it just didn’t seem to make sense as my initial pass through.

MR. LOONSK: The other thing I can add is that as different groups have
looked at them and for example from a standard perspective, the HITSP technical
committees, that when you drill down further into them and someone, I think,
referenced the actions, there is actually a tiering of the hierarchies of the
use cases. When you get down to the detailed level of the actions, they
tend to look — start to look an awful lot alike and what people were doing was
converging, not diverging in terms of the fact that many of the requirements
are, indeed, the same when you start to get down to that level. I think that
that is something that should be encouraged.

MS. GREENBERG: I was just reacting to some people who were indicating that
there might be some functionality required by some of the non-care delivery
ones that you shouldn’t force on people. You know, I don’t think we can do —

DR. STEINDEL: We heard a very good example of that, John, this morning when
Floyd said there was one of what they call IPs in HITSP. That was
biosurveillance, and he said it was changed this morning. It is now joint. So,
as we take further and further looks at these, we find they stretch across
multiple use cases.

MR. LOONSK: There were comments by the Leesburg testimony that suggested
that biosurveillance shared a lot with research needs.

MR. BLAIR: I think what we are trying to do is trying to structure how we
proceed with the analysis and I —

PARTICIPANT: I don’t think so.

MR. BLAIR: You don’t think so?

DR. STEINDEL: I don’t think we have gotten to that point yet.

MR. BLAIR: Okay.

DR. COHN: Did you have a comment to make specifically or —

MR. BLAIR: The only comment is, you know, Harry, I think, and John wound up
addressing some of the dilemma that I was facing, sufficiently for me to
resolve it. Now, the next thing I was thinking of is that there is a lot
that has been done and I know that Margaret has been doing a lot of analysis as
we have gone through this and I know that we heard a lot of testimony. So, I
would really like to ask if Margaret has some suggestions or ideas of how we
might organize to take our next steps.

DR. COHN: Before we get into that, let me let Linda comment.

MS. FISCHETTI: I just wanted to check with the group to see if this would
be useful as well. When we heard the report out from the June 28th/29th
meeting, and when we heard the testimony today, there were a number of people
who volunteered priorities: “I need this first.” Is that something that we
should pick up and glean within the scope of this work, or are we looking at
really just a spectrum of functions with no priority tag on them?

DR. COHN: Maybe this is a time to let Margaret give us some of her —

MS. AMATAYAKUL: I guess actually I would like to ask for the slides or
whatever material the June forum produced because I don’t have that. I have my
personal notes.

MR. LOONSK: They can be shared. There were several products; the breakout
presentations can be shared. They are on the web site now. But we also have
more detailed notes from the sessions that are also shareable.

MS. DEERING: All we got was the combined — originally, it was the combined
slides we put together from the report out. So, that I thought I did give to —

MR. LOONSK: I do also, at the risk of straining the absorption of
information further, want to commend to you the next American Health Information
Community meeting, where there is going to be an update the community has
asked for on the NHIN activity — not the functional requirements per se, but
there is going to be a session where we will have the four consortia and we
will represent some of the discussion at the forum. We are going to basically
present questions to them and have them answer them in that context, and some
of those questions and answers may be helpful to you in your thought process
as well.

We will ask questions, for example, about addressing some of the issues
that have come up here in this discussion. So, I think it will be relevant and
I hope to have a good discussion that will also support you in your work.

It will be recorded and on the web. It is live on the web, but they have
also been historically recorded.

DR. COHN: So that will be yet another piece of — just in case you don’t
have enough input.

MS. GREENBERG: What about the priorities because I remember way back —

MR. LOONSK: I think to a large extent we have had our priority expressed
for us in the articulation of the use cases and that that is the representation
of what the community has said is to be prioritized. We have also recognized
the separate bin of general NHIN infrastructure. We asked for the consortia to
think about that in terms of requirements. We have obviously heard a lot of
testimony from people who have interest in additional functionality.

I think that we need to in this first set make an initial effort at seeing
that systems functionalities are in place to support those activities, but
recognize that the much more detailed functional requirements that would be
associated with additional use cases would by necessity have to come in
subsequent time periods.

I think that there are a number of activities that are going on that are
related to those specific use cases and endeavor to remove barriers. We heard
about, for example, CLIA issues that relate to some lab result reporting. There
are efforts going on to look at those issues and to try to work through them.
So, that is just an example of the multitude of things that are being arrayed
to address these specific use cases because they have been identified as
priorities and we need to keep that focus for this activity as well.

DR. COHN: I tend to agree. I think there are many who have sort of
commented that this is a time for doing some more analysis. I mean, the general
feeling that at the high level, I think that our — I am at the first level
concerned about the high level framing that we are creating and my sense is
that what we have heard is a lot of confirmation. It is not perfect and we will
be modifying it further, but I think that high level framing is probably the
right high level framing.

I think once again we will be further refining it and all this stuff but I
think that is an important piece. Then from there, I think we can analyze
further requirements in a number of different ways, knowing that a lot of them
fall into hierarchies and things like this. So, I think we really are not
talking about 1,100 different requirements or even 900 or 84, but really at the
critical level that we are really concerned about there are a much smaller
number than that.

Now, I certainly think it would be probably interesting, and certainly if it
is available, it might be interesting to at least compare, I mean, their
requirements for the use cases, just to sort of see what they look like. When I
made my comments, it didn’t sound like it was going to be an easy thing to get
to, but it is one of 20 different ways of sorting things. So, I think we can at
least look to that to see if it tells us anything.

Certainly, part of our product is going to, of course, include the 12
inches of material at this point that we are accumulating. So, we won’t have to
worry that any of it is going to be lost.

MS. GREENBERG: If it does show, which it sounds like preliminary analysis
indicates, that one or two of these use cases don’t dramatically swamp the
third one — that the functionality is very similar across the use cases — that
is an important finding also, because I really think that what we heard, at
least from testifiers, is that if you are going to get any support from the
public, it is really only going to be for the care system one. Perhaps I am
sensitive to the fact, but those of us in public
health have been hearing for years that we are the ones who burden the system
with all of our requirements and then when they actually looked at it, it
really wasn’t the case, whether it be the discharge data set or whatever. So, I
think it is an important finding and it is an appropriate one for this
committee if you found that. But you have got to be honest about it. If you
find alternatively that one or the other use case does create a lot of
additional functional requirements, then I think it is important to get that
out. That is all I was saying.

DR. COHN: Okay. So, are you disagreeing with what I was suggesting there or
were you agreeing? You agreed. Good.

MS. GREENBERG: I think that is important to —

DR. COHN: Okay. So, we said we were going to do that.

MS. FISCHETTI: Just as an observation, I believe that our output will look
most like the FHA testimony because, if you notice, the FHA testimony had to
take multiple stakeholders into account. So, I was waiting for the group to
fall out of their chairs when they said 900 requirements, because the previous
testimony had been parsing it for edge and core, whereas this was that global
“this is what you need to get the job done,” with no — that is just an
observation.

DR. STEINDEL: Actually, Linda, I felt we heard that from I would say 80
plus percent of the testifiers. It is just that the FHA stated it the most
bluntly. We heard continually, I think, from all of them that when they started
going through the list of whatever number of — how many are there, John? I —


DR. STEINDEL: Yes, the 20 odd — you know, every time we heard that, you
know, every one of the testifiers said, well, we need this one. We need this
one. We need this one. FHA was just blunt about it. We needed them all. I
didn’t really see much significance there.

MS. AMATAYAKUL: I was going to say that we do have a call soon, August 8, I
think it is, 7th, whatever. It would be helpful for me to have just a brief
discussion as to kind of what our agenda would be for that because I am sort of
looking at trying to do some checks and balances kinds of things, where I am
going to take what I heard in these two days and try to come up, separate from
that list of 900 plus, with what I heard.

Then go back to the list of 900 and try to reconcile: is there anything
that, when you get it bubbled up, is grossly missing? I think I sort of
see the use case in the same manner as really more a check or balance that once
you have got what you think is an initial draft set of functions, then you go
back and you look at it in multiple ways and say did we leave something out or
is there some imbalance there.

So, I don’t see doing some of these cuts until a little bit later, after we
get — I think, as you suggested — something more of a framework or some framing
of what we are looking at. But I do think it would be helpful to kind of think
about: do we want that first cut of what we heard yesterday and today, plus
some modification of our framework picture thing, for the 7th? It would be
helpful for me to know if that is what we are going to try to do on that —

MR. BLAIR: — looking for some guidance from us on that. I think she is
asking a question.

DR. COHN: Well, I was answering in the affirmative, Jeff. What are your —

MR. BLAIR: Well, when you said you were answering in the affirmative, the
affirmative means what in terms of guidance for Margaret?

DR. COHN: I think she was asking a yes or no question and I was answering
yes. At least that is what I thought I was doing.

MR. BLAIR: Ask the question again because I didn’t know it was a yes or no.

DR. COHN: I think it was a yes or no, wasn’t it?

MS. AMATAYAKUL: I think I was — I was asking for guidance, but I was
proposing what I think might be a first step for that first call: that the
agenda be to look at the framework picture and to look at what we cull out from
yesterday’s and today’s testimony as a sort of high level set of functional
requirements — a minimum, but an inclusive one — and that we then proceed to do
checks and balances after that. Because even if we took only the 300 plus core,
or only the little bit less than 300 core “shalls,” in a two hour phone call we
would get, you know, 30 done. We can’t do that, I don’t —

DR. COHN: Those are things we will be looking at off line and —


MR. BLAIR: That is what I was thinking of when I was thinking of how do we
— I took a look at one point at a number of those with Margaret and there were
a lot of them where those requirements needed clarification or definition or
they were ambiguous. Now, I — because we have worked with Margaret a number of
times before — have a lot of trust in her to begin to aggregate with fidelity
and integrity a lot of these to eliminate the redundancies and boil them down
to a high level list. But I think that that is what is going to be needed
before we meet this next time because I don’t think we have a way of dealing
with 300.

MS. GREENBERG: So, he said yes, too, Margaret.

DR. COHN: I think you agreed, Jeff.

Now, the truth is that I don’t want us to — we are running down now and we
have got about another five to ten minutes and we are going to finish. So, this
is not the time for us to be as a large committee working on a project plan.
Project plans occur off line. I think we are providing the input. We will be
obviously working with Margaret off line a fair amount to help refine things
and have ongoing calls. So, I don’t want to spend our sort of remaining couple
of minutes in terms of deep discussions of exactly what it is she is going to
be doing next.

Now, as I say that, Linda, do you have a —


DR. COHN: Okay. So, I am okay on that.

Now, what I wanted to do at this point, since we are getting closer to 5:00
is really talk about the next steps just sort of generally. Now, as I said I
think we have said this before, but (a) we have — the subcommittee has a
number of calls scheduled to sort of go over — they will be working calls.
They will be calls of the workgroup to sort of look at the work products.

I think we have three of them: August 7th, the 16th and the 31st, with
really the big one being on the 31st, when we will be asking for public comment.

Margaret, I guess at this point we had talked about maybe sending out some
drafts of spreadsheets. I don’t sense that we are sending — I mean, we are not
going to be sending those out immediately. I think those have to be probably
sometime in early to mid August. So, this will be an opportunity for people who
presented, who have opinions or otherwise, who want to review things. The
public could at least get an early view, I think, of our analysis as we go
forward, once again, in preparation for that 31st call.

Now I do want to remind everybody that we are open for additional input
from the public in terms of written comment through July 31st, close of
business. Those on the Internet, those here, those who have testified. We
certainly welcome additional written comment based on anything that you have
heard, other comments that have been made through this testimony that may cause
you to think about things differently or want to make additional comments.

I want to make sure everybody realizes that with the close of these
hearings the door is not shut. This is intended to be a very open process,
which will obviously culminate with, I think, sort of final recommendations in
mid-September for the NCVHS.


MS. GREENBERG: Just within the process part of it. Obviously, the end goal
is to get approval of the full committee in September. We have at least a
third, maybe more than a third of the full committee involved already directly.
This is complicated technical stuff in many ways and I think that on the one
hand, many other members of the committee may kind of defer — I mean, will
defer to the ad hoc workgroup because they are not going to have the ability to
go through all this in the same level of detail.

Nonetheless, they are also — need to be factored in probably, you know,
certainly as soon as the broader community is and maybe sooner than that. When
you have something to share with the presenters, the public, et cetera,
obviously, I think that there needs to be a transmittal to the other committee
members so that they are brought along —

DR. COHN: What you are suggesting is that as we move to drafts, we should
make them available. I think we have also talked about, in terms of all this
stuff, that we
may very well probably in early September, I would imagine, after that open
call have a call with the committee members to sort of update them on this. We
could also invite them to the August 31st call also.

MS. GREENBERG: That is what I was thinking, whether it wouldn’t be a good
idea to send a note to the other members saying, you know, we have had these
two days. We went to the forum. We have had these two days of hearings. You
know, we are moving along. This is our plan and we are having this call on
August 31st, which would be a very good opportunity for you to —

DR. COHN: That is a very good idea. Okay.

DR. WARREN: I have a process question to build on this. I think we need
more than just committee notes. If people’s calendars are like mine, it would
be very nice to know that in four weeks we have a conference call.

MS. GREENBERG: No, I think we will try to do that like on Monday. It
doesn’t have to be long, but I just think that —

MS. DEERING: My housekeeping process question has to do with quantities of
people to be involved on the calls, and especially on the four hour call. I
just wanted to put on the table that whereas I had originally planned to host
all the calls and the center web session on our own technology, there may be a
need for RSVPs, just for quantity.

Even if I don’t handle it, even if NCVHS decides that you have got some
other approach, I think the direction that our conversation is going suggests
that we will get to a point where we need RSVPs, simply because the technology
requires it. You know, you have to reserve x number of lines. At least you have
to second guess how many you are going to get. So, I just wanted to leave that
on the table as possibly something to build into the notes to the committee, et
cetera, you know.

MS. GREENBERG: Okay. We will coordinate with Katherine Jones on, you know,
what we need to do.

So, you are thinking that it is going to be something — what is our kind
of tentative date for something that would then be available for people to look
at —

DR. COHN: Is this something that we need to determine right this moment?

MS. GREENBERG: Well, no. I think we have to have a general idea.

DR. STEINDEL: What I heard from Mary Jo just now is especially with respect
to the 31st conference call, she would like kind of a list of what is going on.
I would think we could post something in general now, you know, just generally
stating that we will have an open conference call on August 31st. Please
respond to Katherine Jones or whoever, you know, if you plan to attend. Mary Jo
just needs to know how many —

DR. COHN: But I think you were asking something different.

DR. STEINDEL: I think she is asking something different.

MS. GREENBERG: Is it reasonable to say that the discussion document will be
posted a week before the call or — is that our intention?

DR. COHN: Yes. I guess I was confused only because I think we have been
talking about a draft document out as well as all this stuff. So, I didn’t
exactly know how the timing of that was going to be. But certainly whatever it
is we are going to discuss on the 31st would need to be out by a week before.
But I don’t know that we are necessarily — I am not sure exactly
whether we are posting it as a draft document. We don’t normally post on our
web site draft documents.

MS. GREENBERG: Yes, we do if it is —

DR. STEINDEL: And also, Simon, in this particular case we might have people
that have e-mail addresses to send the document to.

DR. COHN: That is what I am saying. Normally we would send them the
document by e-mail or whatever because we would know who they are, but we can
certainly
also put it on the web site and just note it as a draft document.

DR. STEINDEL: We did that with the e-prescribing letters, too.

DR. COHN: Once again, I don’t know that we need to discuss this at length
during an open session. We have a number of different pieces going on at the
same time.

Obviously, much work to be done during the summer. We know we have a number of
calls in place. We will be having hopefully a draft document for everyone to
review and provide us comment on before the 31st and then hopefully there will
be a second revision by a week before the 31st, based on anybody’s comments at
that point.

Once again, just sort of moving forward.

I want to thank everybody. We are running a couple of minutes late. I think
it has been a very useful set of hearings and I am actually very pleased with
where we are at this point, which is a lot of information I think to evaluate.
I think some views that — our initial views are not so divergent from I think
what we were hearing from the private sector and from our testifiers.

I want to thank everybody —

MR. BLAIR: Especially Mary Jo for the way she pulled together this very
successful agenda.

DR. COHN: And to my vice-chairs, Jeff Blair and Harry Reynolds, without
whom we would obviously not have had these hearings for these two days.

So, thank you both very much.

Okay. With that the meeting is adjourned.

[Whereupon, at 5:05 p.m., the meeting was concluded.]