This Transcript is Unedited

DEPARTMENT OF HEALTH AND HUMAN SERVICES

NATIONAL COMMITTEE ON VITAL AND HEALTH STATISTICS

Subcommittee on Privacy and Confidentiality

January 11, 2005

Room 705A
Hubert H. Humphrey Building
200 Independence Avenue, SW
Washington, D.C. 20201

Proceedings By:
CASET Associates, Ltd.
10201 Lee Highway, Suite 160
Fairfax, Virginia 22030
(703) 352-0091

SUBCOMMITTEE ON PRIVACY AND CONFIDENTIALITY

MEMBERS

  • Mark A. Rothstein, CHAIR
  • Dr. Simon P. Cohn
  • Dr. Richard K. Harding
  • John P. Houston
  • Harry Reynolds

STAFF

  • Kathleen H. Fyffe, Lead Staff
  • Amy Chapper
  • Gail Horlick
  • Evelyn Kappeler
  • Lora Kutkat
  • Catherine Lorraine
  • Susan McAndrew
  • Dr. Helga Rippen
  • Bill Tibbitts
  • Sarah Wattenberg

P R O C E E D I N G S [9:06 a.m.]

AGENDA ITEM: Call to Order, Introductions and Opening Remarks – MARK ROTHSTEIN, Chair

MR. ROTHSTEIN: Good morning. My name is Mark Rothstein, and I
am Director of the Institute for Bioethics, Health Policy and Law at the
University of Louisville School of Medicine and Chair of the Subcommittee on
Privacy and Confidentiality of the National Committee on Vital and Health
Statistics.

The NCVHS is a Federal advisory committee consisting of private
citizens which makes recommendations to the Secretary of HHS on matters of
health information policy. On behalf of the Subcommittee and staff, I want to
welcome you to today’s hearing, which will be divided between Radio Frequency
Identification in the morning and Decedent Archival Health Information in the
afternoon.

Tomorrow’s hearing will address third party access to, and use
of, health information.

We are being broadcast live on the Internet, and I want to
welcome our Internet listeners.

As is our custom, we’ll begin with introductions of the members
of the Subcommittee, staff, witnesses and guests. We invite members of the
Subcommittee to disclose any conflicts of interest they may have on today’s
issues. I will begin by noting that I have no conflicts of interest.

[Introductions. No conflicts of interest noted.]

MR. ROTHSTEIN: Thank you, and welcome to all of you.

This afternoon at 3:15, members of the public may testify for
up to five minutes on issues related to today’s topic or tomorrow’s topic,
because it will be the only public comment period during our two-day hearings.
If you’re interested in testifying, please sign up at the registration table.

Witnesses have been asked to limit their initial remarks to 10
to 15 minutes. After all the witnesses on a panel have testified, we will have
time for questions and answers. Witnesses may submit additional written
testimony to Marietta Squires within two weeks of the hearing.

I would ask that witnesses and guests turn off their cell
phones and other audible electronic devices. You can leave your passive devices
on, if you’d like.

Also, during the hearing, if we all speak clearly into the
microphones, it will make it easier for people listening on the Internet.

So I want to welcome the members of the first panel and I
would ask Dr. Waxman to please begin.

AGENDA ITEM: Panel 1 – Radio
Frequency Identification Presentation – DR. BRUCE WAXMAN

DR. WAXMAN: Good morning, and may it please the Committee. I’m
Bruce Waxman. I’m a Board-certified orthopedic surgeon practicing with the Palm
Beach Orthopaedic Institute in Palm Beach Gardens, Florida, which is a
13-person orthopedic group. I’m also the Chairman and President of SURGICHIP,
Incorporated and the inventor of SURGICHIP.

I received my medical degree from Columbia University and
trained in orthopedic surgery at Harvard University.

I spent two years in the Air Force after being drafted, and I
was Chief of Orthopedics at Tyndall Air Force Base in Panama City, Florida.

From 1976 until the present, I have practiced private
orthopedic surgery, specializing in total hip replacement, total knee
replacement, and arthroscopic surgery. During the last year, I have not
performed surgery, but I’ve maintained an active practice of office orthopedics
with my group.

Shortly after graduation from medical school, I became
acquainted with the problem of wrong-site surgery. I assumed the care of an
elderly gentleman who had a hip fracture and they operated on the wrong side.
After a series of medical complications, the patient eventually died without
ever having his hip pinned.

When I went into private practice, an eager scrub nurse
prepped and draped the wrong knee for surgery. I usually prep and drape my own
patients. Fortunately, I noticed the error and didn’t perform the wrong-site
surgery, but I did not forget the incident.

In 1991, I began to sign “yes” on my operative
sites. Shortly thereafter, the hospital recommended that everybody sign the
operative site.

Still, mistakes were made. One of the surgeons mistook the
first for the second patient that he was going to operate on, and despite the
fact that he operated through a “yes” and on the proper site, he did
the wrong procedure.

This type of error has reached national attention now, not
because it’s common – it is actually very rare – but because it’s so
devastating when it occurs.

The Joint Commission on Accreditation of Healthcare
Organizations in July, 2004, recommended what they call the “universal
protocol,” and that entails having the surgeon sign the site with either a
“yes” or his initials, his or her initials, and then the surgical
team must have a time-out during which the surgical site is reviewed, the name
tag is reviewed on the patient, and the operative consent is reviewed.

And that’s a great idea. Unfortunately, just because you tell
people to do things does not mean they will do it, and sometimes it’s
forgotten. Mistakes are still being made. The Joint Commission is still
reporting approximately five to eight cases per month, or about 60 to 100 cases
per year, in the United States, and they state that it’s underreported. And I
believe they’re right.

Recently, I developed a system to hopefully cut down on the
frequency of these errors. The idea is to put the patient’s name, the
procedure, and the operative site right on the incision. It’s less likely that
people will forget to check if there’s actually a tag, or a marker, right on
the incision, and it’s also less likely that the information will be wrong. In
other words, a hospital chart can be switched by accident by picking up the
wrong chart in transit, whereas if a tag is stuck right on the incision site,
it’s less likely that that will happen.

Initially, I thought of the possibility of just writing out the
information, but that has the limitation first of all that it’s handwriting and
can be misread, and second of all, there’s not room on a reasonable size tag to
write the name of many operative procedures, for instance, arthroscopic
anterior cruciate ligament reconstruction with patellar tendon allograft, a
very common procedure. It can’t fit on a tag, unless you make a very large tag,
which is impractical because on a small area like the hand or the foot, and on
children, the tag would be so big that it would catch on other areas and pull
off and it would also obscure the “yes.”

So I decided that either a bar code or RFID might solve the
problem for me.

Bar code has three limitations that I saw. First, you cannot
actually program a bar code; it’s fixed, and if you want to verify that the
information has been read as correct and put it on the tag, which I thought was
important, that cannot be done with a bar code.

Second of all, bar code is line of sight, which requires that
the bars be on the front of the tag, and therefore you don’t have room to put
the name and the site, which I think is important so that the chip is put on
the right area. Again, you could use a very large tag, but that’s impractical.

Third of all, the bar code is very frequently used in a
hospital and bar code readers are ubiquitous. Even the new three-dimensional
bar code readers are very ubiquitous in a hospital and the RFID readers are
not, and I thought that that would lend itself more toward people trying to
invade the privacy of the patient since they’re very familiar with these
instruments.

So I chose RFID, and invented SURGICHIP. SURGICHIP is a new use
of pre-existing technology to help prevent wrong-site, wrong-person,
wrong-procedure surgery. The chip is encoded by health care personnel with the
patient there to verify that the information going on the chip is correct. They
verify it several times.

The chip is then sent to the operating room. When the patient
returns after the pre-operative visit for the surgery, again the chip is
checked by reading with a hand-held reader, this time to make sure that it’s
accurate, and the patient is wide awake when that’s done.

The chip is then put on, and I encourage the patients to put it
on themselves. The Joint Commission is encouraging the patients to do this, to
take part, because they’re the ones that know where the surgery is. It’s much
less likely that you as a patient are going to put a chip on your elbow when
you know it’s your knee that’s going to be operated on than the nurse who’s
never met you before who could possibly put it in the wrong place.

The chip then stays on, is taken to the operating room with
you, and then is read by the surgeon. If the information on the chip, which the
surgeon has confidence now that you’ve checked and double-checked, if the
surgeon agrees with what’s to be done, he or she will remove the tag and
proceed with the surgery.

The tag is put in the chart and becomes part of the hospital
record. It’s not used on another patient.

I think I’ll show you some slides. This is a picture of the
tag. This shows the tag in place. Again, SURGICHIP is used in addition to the
Joint Commission universal protocol; it’s not used instead of – it’s an
additional safeguard.

That shows you the back of the tag, which has a transponder and
an antenna.

This is the SURGICHIP software that the nurse in the
pre-operative outpatient area uses. It’s password-protected.

This is the SURGICHIP software in use, showing name of the
site, whether it’s right, left, or bilateral; the doctor’s name; the procedure,
if they drop down – it’s actually very easy for the nurse to program it
– the date of surgery.

And this shows just that it prompts you to verify that the
information is correct. The patient actually is right there. Or if you do it by
telephone – if you do it in the office, the patient is there; you read it
to them, and they affirm that it’s correct.

This is the printer that prints out the tag. It prints the name
and the site and it also encodes the chip at the same time.

And this is the reader – it’s a desktop reader –
which verifies that the information is correct.

And this is an older chip in use. That’s my daughter-in-law’s
leg.

And I don’t think this projects too well, but this is a
hand-held reader reading the information that’s on the tag. It gives you the
name of the patient, the doctor’s name, which side it is, the location, and the
procedure.

The SURGICHIP is HIPAA compliant. It’s programmed by a
designated professional in a private environment, usually the outpatient
pre-operative area, sometimes in the emergency room. The entrance into it is
password-protected, as I showed on the picture.

The information is encrypted. It can’t be read other than with
an RFID reader with SURGICHIP software. It would be like trying to read a PDF
document without an Adobe Acrobat reader.

Once the chip is encoded, it is electronically locked so that
it cannot be altered or deleted. This prevents any kind of inadvertent deletion
or changing or malicious changing of the data.
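The encryption and write-once lock described above can be illustrated with a minimal sketch. This is not the vendor’s implementation; the cipher (Fernet), the field layout, and the class and key names are assumptions chosen purely for the example.

```python
# Minimal sketch (not the vendor's implementation): an encrypted, write-once
# payload for a surgical-site tag. Fernet stands in for whatever cipher the
# real product uses; requires the third-party "cryptography" package.
from dataclasses import dataclass, field
from cryptography.fernet import Fernet


@dataclass
class SurgicalTag:
    """A tag that can be programmed exactly once and read back only with the key."""
    _payload: bytes | None = None
    _locked: bool = field(default=False)

    def encode(self, key: bytes, patient: str, site: str, procedure: str) -> None:
        if self._locked:
            raise PermissionError("Tag already encoded; contents cannot be altered or deleted.")
        record = f"{patient}|{site}|{procedure}".encode()
        self._payload = Fernet(key).encrypt(record)   # ciphertext stored on the tag
        self._locked = True                           # electronic lock: no further writes

    def read(self, key: bytes) -> str:
        """Readable only by software holding the matching key (cf. the PDF/Acrobat analogy)."""
        if self._payload is None:
            raise ValueError("Tag has not been encoded yet.")
        return Fernet(key).decrypt(self._payload).decode()


if __name__ == "__main__":
    key = Fernet.generate_key()        # held by the password-protected tagging software
    tag = SurgicalTag()
    tag.encode(key, "Jane Doe", "Left knee", "Arthroscopic ACL reconstruction")
    print(tag.read(key))               # verified again in the pre-op holding area
```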

The tag that’s used, that I showed you, is called a
“passive RFID tag.” I think this will be discussed later. But this
does not have a battery. The read distance is short. The maximum distance this
can be read is five inches, so that it can’t be read from across the room by
somebody or from outside the operating room by somebody who wants to invade the
privacy of the patient.

The screen on the hand-held is very small; it fits in the palm
of the hand. It’s easily shielded. Again, the software in the hand-held
terminal is password-protected. It’s even possible to user-protect the
hand-held terminal if you feel that it’s necessary.

The read-out on the hand-held printer clears in a pre-set
amount of time, the maximum being five minutes. The reason we do that is that
– there are two reasons. One is that you don’t want the information to be
there for the next case and accidentally have somebody read the information
from the previous case. But just as important, you don’t want it to be lying
around so that somebody who’s not supposed to be reading it could read it. So
it clears in a pre-set amount of time, maximum, five minutes.
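The auto-clear behavior just described can be sketched as a simple display timeout. The class and field names are illustrative; only the five-minute upper bound comes from the testimony.

```python
# Minimal sketch of the handheld's auto-clear: the last read is held only for a
# pre-set interval (five minutes at most), after which the display blanks itself.
import time

MAX_HOLD_SECONDS = 5 * 60  # upper bound stated in the testimony


class HandheldDisplay:
    def __init__(self, hold_seconds: int = MAX_HOLD_SECONDS):
        self.hold_seconds = min(hold_seconds, MAX_HOLD_SECONDS)
        self._text = ""
        self._shown_at = 0.0

    def show(self, text: str) -> None:
        self._text, self._shown_at = text, time.monotonic()

    def read_out(self) -> str:
        # Stale information is never returned, so the previous case's data cannot
        # be picked up by the next user or by someone who should not see it.
        if time.monotonic() - self._shown_at > self.hold_seconds:
            self._text = ""
        return self._text
```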

The screen is small. It can be easily covered, or protected or
shielded by the surgeon reading it or by the nurse confirming the data in the
holding area.

The chip itself, as I showed you, has the name of the patient
and the site. It does not have the procedure on the face of the chip. The chip
itself is easily covered, if it’s on the head, by a cap; on the extremities,
abdomen and chest, it’s covered by the blanket that takes the patients into the
operating room; if it’s on the neck or the back, it’s just covered by the body
itself.

If indeed the chip were to be seen, it’s analogous to the name
band, the name on a wrist band, and the “yes” on the surgical site.

This information is actually protected by the need to know
exception. And the need to know exception is borrowed from other areas of privacy
law, but basically what it states is that if the information is so compelling
that the doctor must know it, that supersedes the right to privacy in certain
circumstances. And certainly, knowing the name of the patient and knowing the
operative site are compelling reasons for the doctor to know.

I’ve already stated that the chip is not implanted; it’s put in
the hospital record and it becomes part of the HIPAA-protected medical record.

There is data stored in the computer after SURGICHIP is used;
this is used for research only. This is password-protected and it’s necessary
so that eventually we can track and see if this is working, if the results are
better than just having a time-out.

It’s not intended that the information will be transmitted at
this time. If in the future it becomes necessary to transmit it, the
information can be either de-identified or initialized.

SURGICHIP has been used by me and three of my associates on a very limited basis
in a hospital and in an outpatient center and the results have been wonderful. No
mistakes have been made. The nurses that are involved in the surgical team are
very happy with it and the patients love it. Patients are very concerned about
having wrong-site surgery. It’s getting a lot of press recently and they’re
afraid, and this is a comfort to them. Patients have come up to me and told me
that they’re very happy with it.

In summary, I believe that SURGICHIP enables patients to
benefit from the use of RFID technology without compromise to their rights to
privacy.

MR. ROTHSTEIN: Thank you very much. It was very interesting and
we’ll have questions for you at the end of the panel presentations. Dr. Seelig?

AGENDA ITEM: Presentation – DR.
RICHARD SEELIG

DR. SEELIG: Thank you, Mr. Chairman. Good morning.

My name is Richard Seelig. I’m Vice President of Medical
Applications of the VeriChip Corporation, with offices in Delray Beach,
Florida.

I would like to thank the Subcommittee on Privacy and
Confidentiality for the opportunity to participate in this hearing regarding
the privacy implications of RFID technology, and we commend your efforts to
maintain the sensitivity to the complex interaction between the need to know
critical health care information and the maintenance of privacy of that
information.

Prior to my comments, I’d like to read a paragraph from the
Executive Summary of the Information for Health: A Strategy for Building the
National Health Information Infrastructure Report and Recommendations from the
National Committee on Vital and Health Statistics which was presented on
November 15th of 2001, and with your permission, here’s the
quotation from that summary:

“An overarching principle applies to all elements
mentioned above. It is critically important that the NHII vision and its
embodiment be large enough to accommodate major changes in the future. The NHII
is by its nature dynamic; every one of the elements listed above will evolve,
just as the content of information and knowledge will change. All of the
entities contributing to the NHII therefore must think big – especially
the Federal government in its leadership role.

“In order to coordinate stakeholders appropriately and to
see that everyone can benefit from the evolving information infrastructure, HHS
must craft a national health information policy that is broad and flexible
enough to encourage and channel – rather than inhibit – positive
change.” And that’s the quote.

Your Committee is to be commended for its vision, foresight
and perseverance in moving forward in this important aspect of health care
services.

My responsibility with the VeriChip Corporation includes the
development and implementation of medical applications specific to the VeriChip
technology. I’m a Board-certified surgeon affiliated with Applied Digital
Solutions and its business units since 1999.

Before joining Applied Digital, I practiced in Morris County,
New Jersey, for 20 years.

In addition to my clinical practice, I was a consultant to the
United States Surgical Corporation and to Davis+Geck in the areas of minimally
invasive surgery and new product development.

I received a Bachelor of Science degree from the George
Washington University and an M.D. degree from the University of Medicine and
Dentistry of New Jersey. I continue to hold a clinical assistant professorship at
that institution.

On September 16th, 2001, I implanted myself with
two RFID microchips, initiating the implementation of the VeriChip technology.

To briefly describe our company, Applied Digital is the sole
owner of the VeriChip Corporation. Applied Digital itself, as its mission,
develops through its multiple business units innovative security products for
consumer, commercial and government sectors worldwide. Unique and often
proprietary products within these business units provide security for people,
animals, the food supply, the government and military arena, as well as
commercial assets.

Included in this diversified product line are RFID
applications, end-to-end food safety systems, GPS/Satellite communications, and
telecom and security infrastructure.

And what I’d like to do is just briefly describe my personal
involvement of VeriChip and how it was developed.

At the outset in 1999, I was a consultant for one of Applied
Digital’s business units. I recognized that another business unit’s product
called “Home Again,” which is an implantable identification tag for
pets, and I’m sure many in this room already have their pets’ tags as such,
could have important applications for humans.

The first set of applications, as we just heard, revolved
around the identification of implanted medical devices such as pacemakers and
orthopedic hardware. It was through my clinical experience and in discussion with
colleagues that we recognized the need for improved, rapid acquisition of
accurate, detailed, technical information regarding these devices, which at that
time was really lacking. And that lack of access to information caused delays and
inefficiencies in patient care.

I believed that an implantable, passive, RFID tag available
when needed and linked to Internet-accessible databases could provide a
clinician access to the needed information many months after the procedure was
performed and at any facility throughout the country.

Living in the New York area during the 9-11 attack, I became
aware of rescue workers at Ground Zero writing their badge numbers, their
Social Security numbers, on their skin with Magic Markers. They did this so
that, should they become injured or even worse while working in the pit, they
would be identified and not suffer the same fate of delayed identification as
those who were in the buildings. So there was a need for a more secure form of
personal identification and an approved form of access to that information when
that need came about.

And the experiences of the first days after the fall of the
twin towers inspired me to move the identification project forward at an
accelerated pace.

And of course, incidentally, we are now witnessing the same
horror and the same imperatives in Southeast Asia as a consequence of the
tsunami which struck that region approximately two weeks ago.

To evaluate this hypothesis, five days after the World Trade
Center and Pentagon attacks, I implanted myself with two of the veterinary
chips and began adapting the concept to a human version that eventually became
VeriChip. These microchips were implanted in my right forearm and my right hip,
as I felt there were more questions that needed to be answered than one site
could provide.

And in my prepared remarks, I have referred to the RFID Journal,
which has some very basic questions and answers about RFID technology which
you’re welcome to read at your leisure.

I’d like to move to the VeriChip system description. And the
VeriChip system is not just the implantable microchip, but it’s a number of
different units that interact together.

The VeriChip system contains an implantable RFID micro
transponder which is intended for personal identification, security access,
financial and health information applications in humans.

On October 12th of 2004, the Food and Drug
Administration cleared VeriChip for medical application use in the United
States. On December 10th of 2004, the FDA Center for Devices and
Radiological Health published its guidance for industry, a Class II Special
Controls Guidance Document entitled “Implantable Radiofrequency Transponder
System for Patient Identification and Health Information,” and I believe that’s
now available through the Federal Register.

The patented VeriChip micro transponder is a passive device
that contains an electronic circuit which is activated externally by a
low-powered radio beam – which is RFID – sent by a hand-held, battery-powered
Pocket Reader. The VeriChip is used to store a unique identification number. It
is implanted subcutaneously in the rear of the upper arm by means of a small,
hand-held, preloaded introducer.

The reader scans the arm, then displays the unique ID number of
the implanted VeriChip micro transponder. As RFID is employed to obtain the ID
number, direct line of sight is not required; therefore, it can be read through
clothing. The ID number is used to address a secure database that will provide
the implanted person’s identity and other previously entered information such
as a link to an electronic medical record or a personal health record.
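The lookup path just described – a bare identifier read by the scanner, then used with facility credentials to query a separate secure database – can be illustrated in outline. The record contents, facility names, and credentials below are entirely hypothetical and not the company’s actual system.

```python
# Minimal sketch of ID-to-record lookup: the scanner yields only a 16-digit
# identifier; identity and health information live in a separate database that
# answers only to authorized, credentialed facilities. All values are invented.
RECORDS = {
    "1022300004720001": {
        "name": "Example Patient",
        "contacts": ["next-of-kin phone"],
        "phr_link": "https://example.org/phr/record",  # hypothetical link to a personal health record
    },
}
AUTHORIZED_FACILITIES = {"ED-scanner-17": "facility-secret"}  # scanner registered to one facility


def lookup(chip_id: str, facility: str, credential: str) -> dict:
    if AUTHORIZED_FACILITIES.get(facility) != credential:
        # Without authentication the bare number reveals nothing by itself.
        raise PermissionError("Scanner/facility not authorized for database access.")
    return RECORDS[chip_id]


print(lookup("1022300004720001", "ED-scanner-17", "facility-secret"))
```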

I’ve included additional comments about the details of the
VeriChip in my prepared remarks.

In April of 2004, we were honored to be invited to present to
the Secretary at the Second Annual National Steps for a Healthier U.S. Summit
the application, health care application, of VeriChip.

And to summarize our presentation then, rapid access to
accurate patient information is required for optimal outcomes in medical
emergency situations. Chronic illnesses such as seizure disorders, stroke,
diabetes, COPD, cardiac conditions or Alzheimer’s disease frequently initiate
medical emergencies.

While healthy adults may be able to verbally provide personal
health history, a person with chronic diseases can experience a communication
barrier due to loss of consciousness, impaired speech, or memory loss,
resulting in treatment delays.

The challenge is to reliably obtain important information at a
moment when the ability to impart it is lacking. Current modalities available
to provide information include wallet cards, bracelets, and dial-in telephone
numbers. Frequently, these aids are not in the possession of a patient when the
need for emergency care arises, or the data is incomplete or conflicting.

An alternative to current information methodologies is
VeriChip. A passive, implantable, RFID microchip, VeriChip can provide an
identification gateway from the “chipped” individual to a secure
database containing patient-entered personal information, family contacts, and
health care history.

Advantages of this technology include that it cannot be lost,
stolen, forgotten, altered or copied, and it is always there when it’s needed.
The information is stored on a database, not on the chip, facilitating the
updating or expansion of the data via an Internet-accessible computer.

Chronic disease patients unable to communicate are at a
significant disadvantage in the health care delivery system compared with those
individuals not similarly impaired. By “speaking” for the patient,
the VeriChip technology offers an empowering option to obtain a comparable
level of care by rapidly and accurately furnishing important or even lifesaving
information.

Regarding HIPAA, we believe that the RFID usage in general and
VeriChip in particular will not impact or expand on HIPAA’s covered entities or
business associate categories or their compliance requirements. Further, we
believe that the VeriChip technology is HIPAA friendly because it doesn’t
convey a name or any information or identifier, only a number, and that number
is read by a proprietary scanner which is registered to a particular health
care facility.

On January 4th of 2005, The Record of Bergen County, New Jersey,
published an article regarding the VeriChip and interviewed a number of local
parties, obtaining their information and reaction to its utility.

The first person who was interviewed was Dr. Michael Gerardi,
who is an emergency room physician at Morristown Memorial Hospital, Chief of
the Pediatric Emergency Room, and the former Chairman of the Emergency Medical
Physicians Chapter of New Jersey. His comments regarding HIPAA were that:

“We’re heavily invested in electronic medical records.
We’re concerned with HIPAA, and we do everything we can to maintain security
and to protect information.

“The government created HIPAA to insure privacy in
electronic billing transactions,” Dr. Gerardi said. “Chip implants
will not compromise security because only health care providers will have
scanners for reading them.”

Regarding the NHII initiatives, which we fully support and in which we’ve
been actively participating, we believe we fall into the category of
stakeholders from the information technology industry. In the interests of
time, I’d like to refer you to the November 2001 report dealing with a case
study which actually, by extension, predates our publication of what VeriChip
was all about; but to summarize, the two areas of our participation with the
NHII are privacy and systems application.

And specifically regarding the privacy issues, in November of
2004, our company publicly issued a privacy statement, and I’d like to put that
on the table for our discussion:

“To insure that its attribute” – meaning
VeriChip – “is a benefit and a benefit only, we are making privacy
our priority and our commitment. It is good business and it’s responsible
behavior for a leader in an area of RFID technology.”

The company’s six-point privacy statement is as follows:

“Number 1. VeriChip should be voluntary and voluntary
only. No person, no employer, nor any government should force anyone to get
‘chipped’.

“Number 2. Privacy must be a priority at the highest
levels of our organization, and as such, we will have a privacy officer who,
with privacy experts, will be charged with addressing the day-to-day global
evolution of this technology.

“Number 3. We will immediately address privacy and patient
rights in all consumer, distributor, and medical documents related to VeriChip.

“Number 4. VeriChip subscribers” – as we call
patients – “are able to have their chip removed and discontinued at
any time.

“Number 5. Privacy means different things to different people, so only the
VeriChip customer should designate the groups that may have access to his or
her database information.

“And Number 6. We pledge to thoughtfully, openly and
considerately engage government, privacy groups, the industry, and consumers to
assure that adoption of VeriChip and RFID technology is through education and
unity rather than isolation and division.

“Our Chairman of the Board has publicly stated that we
will work closely with all our key constituencies, including consumers,
distributors, hospitals, to make sure that the rapid adoption of VeriChip is as
broad as possible. With significant opportunities for VeriChip in other areas
such as security, we will strive to apply these standards consistently and
uniformly across all of our target markets.”

Mr. Chairman, the VeriChip Corporation stands ready to assist
the NCVHS in promulgation of recommendations which respect the need for privacy
yet allow current and future technological advances to improve the quality and
effectiveness of medical care, especially in the area of automated patient
identification and seamless access to digitized medical information.

We are confident that under the current doctor-patient
relationship and the public’s view of physicians and hospitals as the most
trusted guardians of their medical information, the current archaic and
inefficient means of managing medical information can be replaced by
information systems currently considered routine in other major business
sectors.

So in conclusion, VeriChip is an FDA-approved, implantable,
passive microchip which is indicated for patient identification and access to a
medical database. An individual voluntarily initiates the process, not a
government entity. Every person has a choice.

Further, the technology is a reversible biometric. It can be
removed as simply as a large splinter, thus eliminating that means of personal
identification and providing the person with greater long-term control of
personal information access, as opposed to a Social Security number,
fingerprint, or other form of identification.

And finally, I’d like to refer back to the article in the
Bergen Record.

Representative Robert E. Andrews, who represents the citizens
of the Haddon Heights region of northern New Jersey, is an advocate for genetic
privacy laws, and said that the chip must be kept strictly voluntary. And he
was quoted as saying:

“If someone chooses to have such an implant, then it should be legal and
encouraged…. But the idea of a hospital implanting a chip without permission is
illegal, and should stay illegal.” The question is not the technology, but how
it is used, Andrews said.

And quoting Dr. Gerardi again, he was quoted as saying that
“This is a very exciting thing. The computer could really make a
difference for care givers worried that their loved ones show up in an
emergency department and no one will know their critical information.

“Everything has its drawbacks and its positives,” he
said. “I refuse to let civil libertarians get in the way of a good idea.
People out there fear information on the chip. They fear Big Brother. I think
that’s nonsense,” Dr. Gerardi said.

“I think life’s more important than the remote possibility
of loss of confidentiality. With the chip, you can markedly improve the
efficiency, accuracy and safety of care to provide people unable to communicate
their health needs.”

And the final comment is from a gentleman whose name is
Nicholas Minicucci. He is the President and founder of the Molly Foundation, which
is a diabetic research and treatment institute located in Hackensack, New
Jersey.

He has a daughter who has had juvenile diabetes for many years and
it was her plight that inspired him to develop this Foundation. He’s a very
passionate man, and this is the final quote of my presentation:

“Tell me about Big Brother when your daughter or loved one
is run over by a truck. I don’t want to hear about it.”

I thank the Committee for your time and attention, and I
welcome your questions.

MR. ROTHSTEIN: Thank you very much, Dr. Seelig. I’m confident
that we will have several questions for you at the end of the panel
presentation. And now, Dr. Tillman.

AGENDA ITEM: Presentation – DR. DONNA-BEA
TILLMAN

DR. TILLMAN: Thank you. I’m going to just make a few very
brief comments about FDA’s role in this matter.

I’d like to start off by saying that the Center for Devices
and Radiological Health is the part of FDA that’s responsible for oversight of
medical devices. And our primary mission is to ensure the safety and
effectiveness of medical devices; that’s what we are mandated to do.

And to do this, we basically take a risk-based approach to the
evaluation of medical devices. The amount of regulatory oversight that we apply
to medical devices basically depends on the potential risk that the device
presents. Higher risk devices receive a higher level of regulatory oversight.

The highest risk devices, things like heart valves, stents,
intraocular lenses and things of that nature, are placed into Class III, and
they require what we call “pre-market approval.” In order to market a
Class III product, the company has to show that the device is safe and
effective, and that is very similar to the sort of more well-known process that
a new drug has to go through.

Moderate risk devices, and this includes many medical products
– monitoring devices, many surgical devices, and most of what you would
see in a hospital – are placed in a Class II. Now most, but not all, Class
II devices require some type of pre-market notification before they can be
marketed. So if you’ve got a Class II device, generally you need to come to FDA
before you can market your device.

In this case, the company has to show that these devices are
substantially equivalent to legally marketed devices, so they basically have to
show that the new device is equivalent to existing products on the market.

Both the VeriChip and the SURGICHIP devices are Class II
devices, although it’s worth noting that subsequent VeriChip-like devices will
be exempt from pre-market review unless there are significant changes made to
the device technology or to the indications for use.

We do have a guidance document that Dr. Seelig referred to and
it is available on our website. I can certainly show you how to find it, if
you’re interested.

Now, the lowest risk devices, and these are things that many
people don’t even know are medical devices, like toothbrushes, manual surgical
instruments, gauze, those are placed in a Class I, and most, but not all, of
these devices do not receive any FDA pre-market review before they go on the
market.

However, all medical devices, with very few exceptions, are
subject to registration and listing – in other words, a company has to
tell FDA that it exists and what products it is selling – as well as quality
systems requirements and adverse event reporting.

Now, one thing that I think is also important to recognize is
the distinction between when these products become medical devices and when
they are not medical devices. Our legal counsel handed down an opinion that
said that when implantable devices like the VeriChip are used for non-medical
purposes – in other words, for security or to basically link to a database
that contains financial or non-medical information – those are not considered
medical devices and FDA does not have any jurisdiction over the products when
they’re used in that way.

When these implantable transponder devices do have medical
indications – in other words, when the chip is linked to a medical database and
when there is actually a medical indication – only then does FDA have any sort
of regulatory oversight over the products, so that’s another important thing, I
think, for the Committee to recognize.

Now, although we’re primarily concerned with safety and
effectiveness in medical devices, there are instances where this would include
privacy issues.

For example, implantable transponder devices like the VeriChip
and devices like the SURGICHIP as well undergo validation testing in order to
show that they basically do what they say they do. They have to show that they
perform in accordance with their labeling and in accordance with how they’re
designed to be used. If these devices are designed to protect confidential
information, then in fact they must be shown to be able to protect confidential
information from disclosure.

So FDA asks companies to address any confidentiality claims
that they make about the device, and the guidance document actually on these
devices has a section entitled “Information Security Procedures” and
under that it talks about confidentiality, and that confidentiality means the
characteristic of data and information being disclosed only to authorized
persons and entities at authorized times and in an authorized manner.

So that is one way that we address confidentiality provisions
for these types of devices.

Finally, I’d like to note one area where FDA and CDRH do have
an express interest to deal with privacy issues, and that’s related to the
provision of confidential patient information collected during clinical trials.

FDA regulations are very clear that the names and other
personally identifying information that’s collected during a clinical trial
should be removed from medical records that are submitted to FDA as part of our
pre-market review process, and that even if a company or an investigator fails
to do so, FDA will not publicly disclose that information.

That ends my remarks.

MR. ROTHSTEIN: Thank you very much, Dr. Tillman. We have
another witness who was scheduled to testify by telephone at 10 o’clock, and
that’s Dr. Halamka, and I’m told that he is not on line yet. So we have a
couple of minutes for questions before Dr. Halamka, and then we will recognize
him when he is on line. So, questions?

Questions, Answers and Comments

MR. HOUSTON: I had a specific question regarding the VeriChip. Does the chip
actually hold any data, I mean other than an identifier, or is the data really
held in a database somewhere else that can then be linked back to the
identifier that’s held on the RFID chip?

DR. SEELIG: You’re correct. The only obtainable information on
the chip itself is a unique identification number. The actual information
itself is maintained separately, apart from the microchip, on a database.

And the intent there is to accomplish two things. Number one
is to maintain privacy in that if a person were, an entity were, to
surreptitiously obtain a scanner, the only information that that scanner would
yield would be an identification number, and without password and other forms
of authentication, there wouldn’t be any access to anything more than a
16-digit number which has no relevance other than that.

And the second reason is that we’re all memory hungry and that
we couldn’t design anything that would contain enough information storage on a
chip to satisfy everyone’s needs. And the subset of that thinking is that if
one wanted to update the information, and that information resided on the chip,
then you’d have to go back to a center to get your chip reprogrammed, if you
will, whereas if that data remained remote from the chip, it would be very
convenient to maintain updates and increase the file and add to it and revise
it, very, very convenient.

MR. HOUSTON: I sort of suspected that was the case, and I
understand the RFID capacities are fairly limited but that there is the
capability to have capacity on an RFID chip in terms of storage space.

Has there been any thought, or other people out in the market,
looking at maybe maintaining some type of limited data set of critical
information or, you know, allergies, meds, existing type of medical conditions,
things like that, on the chip itself, or is that pretty typical – what
your company is doing is what everybody else is thinking of doing?

DR. SEELIG: Well, I think there are about three questions, four
questions, there.

The answer is: It depends. And it depends on whether it’s
implanted or whether it’s a wearable form of RFID. And with a wearable form,
such as a Smart Card, you can have much more memory capacity. That’s issue
number one.

Issue number two is the capability of read-write. VeriChip is
read-only, whereas if you have a wearable chip, you could then expand and add
to it, or delete or modify it. The form factor of VeriChip, as you may know, is
two by 11 millimeters, so it doesn’t allow much area for having this kind of
luxury of space that you can spread out over a credit card sort of format.

MR. HOUSTON: We’re going to look back in 20 years and there’s
going to be like gigabytes on that little one.

DR. SEELIG: As we speak today, no.

MR. ROTHSTEIN: Before we call on Jeff, I want to follow up one
question, if I may. So it is possible now that more information could be on the
chip, but a limited set of information – in other words, you could
conceivably have blood type, chronic disease diagnosis, or something that might
be of value to first responders, but your particular product doesn’t utilize
that, is that correct?

DR. SEELIG: I have two audiences here; I have to directly
address my colleague here from the CDRH as well as you. The current form of
VeriChip does not have the capacity to do that. Should we want to do that, we
would then need to re-initiate the CDRH process because that’s –

MR. ROTHSTEIN: I understand. We’re just trying to get some idea
of the technology.

MR. HOUSTON: Hypothetically.

DR. SEELIG: And hypothetically, the answer is yes.

MR. ROTHSTEIN: Okay. Thank you. Mr. Blair?

MR. BLAIR: Actually, this question might be to both Dr. Seelig
and to Dr. Waxman. I was thinking of it first to Dr. Seelig.

You said that you have a 16-digit identification and that’s
probably more digits than are needed just for a numeric identification, so I’m
wondering: How did you come up with that? What is your methodology for issuing
new numbers and keeping these numbers unique? And does that identification
number contain, besides identification, any intelligent information? And I
guess that really goes for both Dr. Waxman and Dr. Seelig.

DR. SEELIG: Well, thank you for that very thoughtful question.
It’s the first time that’s ever been addressed, I think, and that 16 digits, we
settled on that because we could, and that we figured the larger number we
could pick, the less likelihood there would be of anyone coming up with a
sequencing that would outstrip a capacity or would somehow come up with some
sort of algorithm that could somehow come up with the number. So it’s an
impossibly large number for any reasonable expectation. That’s number one.

Number two is that we go through many methods during the
fabrication to assure that the number generation is unique and that it’s not
duplicated. First, the number itself is laser-engraved on an integrated
circuit, so it’s not something that’s programmed after manufacture; it’s
actually laser-engraved on it, which means that people can’t select the number
– it’s there, and you have to take what you get and then move on from
there. That answers your second question.

And again, during the manufacturing process, there are many
checks and balances to be sure that this number that’s generated is not
duplicated. And the proof statement of that is that our sister company, Digital
Angel Corporation, which is in fact the manufacturer of the VeriChip
technology, has been doing this for well over 10 years; in the animal world
there are now 30 million of these microchip transponders implanted in various
life forms around the world, and they have a very well honed system to assure
that a number is not duplicated. And we are using their well-established
manufacturing practices compliant with CDRH requirements for a human device to
assure that very thing.

The third element is: Does that number contain any intelligent
information that someone could figure something out? And the answer is no,
except for one thing, and that one thing – and actually there’s another
thought I had – the one thing is this, that the first four numbers of that
sequence are 1022, and that’s our internal workings for the human, so humans
will not be mistaken for cattle or a chicken or a sheep or anything else, so
that’s the only thought that comes into those 16-digit numbers.
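The two properties just described – a fixed width of 16 digits and a leading “1022” marking a human-series chip – can be sketched as a simple validation routine. The uniqueness checks performed during manufacture are represented here only by an in-memory registry; everything else is an assumption for illustration.

```python
# Minimal sketch of validating a human-series identifier: exactly 16 decimal
# digits, a "1022" prefix, and never reissued. The registry is illustrative only.
HUMAN_PREFIX = "1022"
_issued: set[str] = set()


def validate_and_register(chip_id: str) -> None:
    if len(chip_id) != 16 or not chip_id.isdigit():
        raise ValueError("Identifier must be exactly 16 decimal digits.")
    if not chip_id.startswith(HUMAN_PREFIX):
        raise ValueError("Not a human-series identifier (animal chips use other prefixes).")
    if chip_id in _issued:
        raise ValueError("Duplicate identifier; numbers must never be reissued.")
    _issued.add(chip_id)


validate_and_register("1022300004720001")   # accepted
# validate_and_register("9922300004720001") # would raise: wrong prefix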

Now, a subset of that is something that we haven’t talked at
all about this morning and I daresay we’re going to get into, is actually
technical aspects of what RFID is and that I think it’s important to say very
clearly to everyone in this room: RFID is not a black and white issue. There is
not one frequency. There is not one size. There is not one type of transmitter.

There are active tags, which means battery-powered, and passive
tags, which our two products are, which contain no power at all. Distance of
read and range – there’s a whole industry out there which we’re not
discussing today.

Having said that, the frequency that we’re using is FCC
licensed for implantation in life forms – human life and animal life forms.
It’s very, very low frequency, so it transmits through body fluids. We use a
different frequency in humans than the pets use so that we have different
readers for different applications.

Sorry for that lengthy –

MR. BLAIR: Thank you.

DR. WAXMAN: I’ll give a briefer answer.

We’re just the opposite. We have all intelligent information on
the chip. We don’t access a database; I wanted to try to avoid that.

This is a closed-circuit system. In an operating room, we want
the surgeon to be able to get the information right away without having to
access the database.

Each step, in our particular instance, introduces the
possibility of an error, and the whole purpose of SURGICHIP is to prevent
error. All the information is essential and intelligent; it’s on the chip.
Again, that’s why it’s password-protected. That’s why the chip is put in the
chart. That’s why all the information is locked so that it can’t be changed
maliciously or inadvertently.

We try to get as much information as we can on the tag. We’re
limited to 256 bits of information. The main reason is that some surgical
procedures are very long; a spine procedure can involve a description of
multiple levels. So I’ve tried to limit the amount of space for the name and
the site so that I have plenty of room for the procedure. Hopefully, somebody
smarter than I will devise a chip that can get more than 256 bits.
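The space-budgeting constraint just described can be sketched as a simple capacity check. The testimony gives a figure of 256; whether that is bits or bytes on the real tag, and how the fields are actually laid out, is not specified, so the capacity parameter and field separators below are assumptions.

```python
# Minimal sketch of fitting name, site, and procedure text into a fixed tag
# capacity, with most of the budget left for the (often long) procedure name.
def fit_on_tag(name: str, site: str, procedure: str, capacity_bytes: int = 256) -> bytes:
    """Pack the three fields as plain words (no codes, no abbreviations)."""
    payload = f"{name}|{site}|{procedure}".encode()
    if len(payload) > capacity_bytes:
        raise ValueError(
            f"Payload is {len(payload)} bytes but the tag holds only {capacity_bytes}; "
            "shorten the name/site fields to leave room for the procedure."
        )
    return payload


print(fit_on_tag("J. Doe", "Left knee",
                 "Arthroscopic anterior cruciate ligament reconstruction "
                 "with patellar tendon allograft"))
```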

But this is different than VeriChip. We want the information to
be intelligent and accurate and rapidly accessible.

MR. BLAIR: Are you using one of the standardized code sets for
the procedure, and if so, which one?

DR. WAXMAN: I’m not sure what you mean by code set?

MR. BLAIR: Whether you’re using —

DR. WAXMAN: Oh, no, we’re not doing –

MR. BLAIR: — CPT codes –

DR. WAXMAN: Not at all.

MR. BLAIR: — using surgical codes, standardized surgical
codes.

DR. WAXMAN: Not at all. We’re using words, not any
abbreviations. Hopefully, no initials. The only limitation is the space. I want
it to be written out. I don’t want “99281” or “29881” to be
confused with something else. There are a million codes. You can hardly get a
surgeon to write the name or the “S” on the site and look at it, besides
having to reference and remember a whole book of codes. I want it to say
“total hip replacement,” and the surgeon says, “That’s what I’m
going to do.”

MR. BLAIR: Yes.

DR. WAXMAN: I want it to be simple.

MR. ROTHSTEIN: Well, thank you. I understand now we have Dr.
Halamka on the line?

DR. HALAMKA (on phone): Yes, I am here.

MR. ROTHSTEIN: Welcome. Good to hear from you again, and we’re
anxious to hear your testimony.

AGENDA ITEM: Presentation – DR. JOHN HALAMKA
(on phone)

DR. HALAMKA: Oh, very good. Well, thanks for having me. And
just as background for the Committee, I am the Chief Information Officer of
Harvard Medical School and the Chief Information Officer of CareGroup, a group
of six hospitals in Boston. Beth Israel Deaconess is the flagship hospital.

And today what I’ll describe is the use of RFID, both passive
and active, at Beth Israel Deaconess Medical Center and some of the issues that
we have encountered, specifically issues of privacy and functionality, and how
is it we use them and what are our use cases.

I should also disclose that I am one of those 30 million
biological individuals who does have a chip implanted. I am 102230000472, and
that chip is in my right triceps.

That implantation was done about a month ago, surely not
because I am an evangelist for this technology but because I felt that the only
credible way to evaluate the technology was to have a human subject, and that
would be me as the volunteer, in terms of the pain of insertion, the efficacy
of the technology being read, the operational aspects of linking data to the
identifier, et cetera.

So with that as background, the two ways in which we use RFID
at Beth Israel Deaconess, passive and active. I’m going to start with active.

We have partnered with a company called PanGo Networks to
deploy 90 active RFID tags in the emergency department and in one of our
cardiology wards. And the use case for that active ID tag is to track
equipment, personnel and patients.

Specifically with regard to equipment, yes, there’s a theft
deterrent or tracking aspect to it, but most importantly, for our high-use
equipment such as EKG devices, ventilators and IV pumps, we really want to know
where they are so that they’re available just in time – that is, if a
trauma patient comes into the trauma bay, you want to make sure that you’ve got
the appropriate equipment handy, and therefore we have a heads up display that
shows the location of the high-use equipment items in our 48,000-square-foot
emergency room, based on active RFID transponder tags attached to those pieces
of equipment.
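The room-level tracking just described can be illustrated with a minimal sketch: active tags are heard by existing Wi-Fi access points, and the strongest-signal access point is mapped to a room. The tag IDs, room names, and signal values are invented for illustration and are not PanGo’s actual data model.

```python
# Minimal sketch of room-level asset location from access-point reads.
AP_TO_ROOM = {"ap-ed-01": "Trauma Bay 1", "ap-ed-02": "Room 2", "ap-ed-03": "Hallway C"}


def locate(tag_reads: list[tuple[str, str, float]]) -> dict[str, str]:
    """tag_reads: (tag_id, access_point, signal_strength_dBm). Returns tag -> room."""
    best: dict[str, tuple[float, str]] = {}
    for tag_id, ap, rssi in tag_reads:
        if tag_id not in best or rssi > best[tag_id][0]:
            best[tag_id] = (rssi, AP_TO_ROOM.get(ap, "unknown"))
    return {tag: room for tag, (_, room) in best.items()}


# A heads-up display could render this: the system knows the ventilator is
# "in Trauma Bay 1 somewhere", not where within the room.
print(locate([("vent-07", "ap-ed-01", -48.0), ("vent-07", "ap-ed-03", -67.0),
              ("ekg-02", "ap-ed-02", -52.0)]))
```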

We’re interested in locating the patients, and as the other
speakers have said, we are not using patient-identified information on an RFID
tag; specifically, we are just simply tracking the gurneys on which a patient
resides, and therefore that’s a proxy for where that patient is – radiology? Bed
one, bed two? Et cetera.

And we use that information purely in a work flow setting. We
have an electronic dashboard that is viewed by our clinicians. The public view
of the dashboard is de-identified; there is no name of a patient; purely the
initials of a patient and the current location of that patient as indicated by
the RFID tag as shown on the board, as well as some basic clinical information
such as chief complaint and the status of various outstanding tests and whether
they’re admitted or not admitted, waiting for a bed, that sort of thing.
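The de-identified public view just described can be sketched as a single row-building function: initials, gurney-derived location, chief complaint, and status, never the patient’s name. The field names and sample values are illustrative, not the hospital’s actual dashboard schema.

```python
# Minimal sketch of a de-identified dashboard row for the public board view.
def dashboard_row(full_name: str, gurney_location: str, chief_complaint: str, status: str) -> dict:
    initials = "".join(part[0].upper() for part in full_name.split() if part)
    return {
        "patient": initials,             # e.g. "JS", never "John Smith"
        "location": gurney_location,     # proxy: where the tagged gurney currently is
        "chief_complaint": chief_complaint,
        "status": status,                # e.g. "admitted, waiting for a bed"
    }


print(dashboard_row("John Smith", "Radiology", "chest pain", "labs pending"))
```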

We also track staff, nurses and doctors, and that’s where a
real privacy issue certainly came up. When we went to do this pilot, the staff
said, are you going to be tracking us such that you could identify us by name
and use that in some sort of punitive fashion? So you were 17 minutes in the
break room and you only have 15 minutes allocated.

We said, from a person-tracking standpoint, workflow analysis,
we’re going to track roles of nurses and doctors and not individual names.

And so we use it in cases like this: if you look at the average
emergency room day, 20 percent of nursing time, as indicated by RFID tags, is
spent walking down this hallway; if we would only move the Pyxis pill cabinet
down to the center of the emergency department, we could enhance productivity
and workflow.

So we have very specifically told staff we do not track you as
an identified person, and that has certainly been a much easier way to use RFID
effectively for workflow analysis.

And thus far, our experience over the last three months in
using active RFID tags in the tracking of equipment, tracking of patients and
tracking of personnel has been effective to the level of a room. The technology
we’re using, PanGo, which uses WiFi networks, existing networks, as a mechanism
of triangulation of the RFID tag, can tell us not that a ventilator is in the
corner of Room 1 but that it’s in Room 1 somewhere.

And so to that level of granularity, tracking of personnel and
patients and equipment to the level of a room in an emergency department
setting has been effective. And as for privacy issues, as I’ve said, the
patients are purely de-identified, tracked by initials and by the gurney that
they’re on, and the individual personnel are tracked by role.

Passive RFID tags are something that are really at this point
more of a speculative use for us. The use case we’re looking at is in
medication administration.

Today, we’re using bar codes. Bar code the nurse, bar code the
patient wristband, and bar code the medication in a Pyxis device. And that’s a
repackaging of the medication into a plastic bag with a bar code.

We believe that passive RFID tags will have a significant
advantage over bar codes because bar codes require proximity to the reader and
they can get torn and bent. The notion is that with a patient wristband or, if
you really want to speculate – as I have – this implanted tag, a patient is in
a room and, because of RFID being present, we know the patient has been pushed
into that room. With a passive RFID tag we don’t have tracking per se, but we
do know that, say, a patient walked through the door, which happens to have a
passive RFID access point that is using an RF excitation signal to get an RFID
read as the patient goes through the door.

Well, a nurse goes through the door and a medication goes
through the door; that then can serve as a proxy for a medication
administration record, or in fact could be used in the sense of, gee, the nurse
has just walked into this patient’s room with this med and that’s not a med
that the patient should receive, and so it can be used for medication safety as
well as for the medication administration record.
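The doorway-read idea just described can be sketched as a single event check: when a nurse, a patient, and a medication are read at the same threshold, the event is logged as a tentative administration and compared against that patient’s orders. All identifiers and the order list below are hypothetical.

```python
# Minimal sketch of a doorway passive-RFID event used for medication safety and
# as a proxy medication-administration-record (MAR) entry. Values are invented.
ORDERS = {"patient-123": {"metoprolol 25 mg", "heparin 5000 units"}}


def doorway_event(patient_id: str, nurse_id: str, medication: str) -> str:
    allowed = ORDERS.get(patient_id, set())
    if medication not in allowed:
        return f"ALERT: {medication} is not ordered for {patient_id} (carried in by {nurse_id})."
    # Otherwise the read serves as a tentative MAR entry.
    return f"MAR entry: {nurse_id} brought {medication} to {patient_id}."


print(doorway_event("patient-123", "nurse-45", "metoprolol 25 mg"))
print(doorway_event("patient-123", "nurse-45", "warfarin 5 mg"))
```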

So we believe that the technology is very early with regard to
the passive RFID tags, as the other speakers have said.

Standards are very much just evolving at this point, and so in
my conversations with industry leaders, the EPCglobal group and folks at the
FDA, it sounds to me as if we should look at these potential uses of passive
RFID. For the moment, however, bar codes are the practical approach to track
medication, medication administration records and positive patient IDs. But
looking out on the one-year to two-year horizon, replacing bar codes in those
settings with RFID tags, assuming that standards are in place and that the
passive RFID technology is robust, would have the advantage of allowing a much
broader scanning area, such as walking through the threshold into a room rather
than passing a reader over a specific bar code.

So today, I would say that the passive side of things has just
one privacy issue, and I think the other speakers did address it, and that is,
you can imagine a scenario where if we had all injected passive RFID tags and
the database that related our tag number to our name or other protected health
information was somehow available, that I could be walking in a mall past a
Mont Blanc pen shop and a billboard would appear: “Hey, John – we’re
having a sale on the pen that you like, and I understand that three months ago
you bought some ink. It’s time to buy some more.”

You know, one could envision interesting privacy violations
based on the fact that an individual could be associated with a passive RFID
tag that’s implanted.

But, as the other speakers have said, the intent is any linkage
is going to be kept private and certainly, at this point, no – at least in
the case of VeriChip – protected health care information is on the chip
itself, purely a 16-digit identifier which is utterly useless unless you have
access to the linkage data.

So, yes, privacy concerns are something we need to be aware of,
but at the moment, it does seem that the approach has separated protected
health information from the identifier itself and therefore I don’t see privacy
violations in the immediate future.

So that’s been my experience thus far, and certainly happy to
answer any questions the Committee may have.

MR. ROTHSTEIN: Thank you very much. I’m sure we will have some
questions, so if you could stay on the line for another half-hour, we would
appreciate it very much.

DR. HALAMKA: That would be fine.

Questions, Answers and Comments

MR. ROTHSTEIN: So, other Committee members? Dr. Cohn.

DR. COHN: Yes. John, this is Simon Cohn. Actually, I'm pleased
that you're on as a user of the technology – I guess a personal user.

I actually had a question for you, and this is a little –
I'm not sure it's tangential, but it maybe takes the question a step further,
because obviously, as I think you commented, there are patient issues; there
are also employment issues and employment-related privacy, confidentiality and
discrimination issues.

But I'm curious – and I'm talking now about access to your
electronic medical record and access to terminals in your environment that have
patient-sensitive information – as we've all talked about security and
password-protected log-on, obviously the issue of biometrics comes up, versus
secure cards, versus secure IDs, versus just a regular password.

And I’ve heard recently a lot about proximity devices.
Normally, I guess, worn around your neck, which might be RFID technology. But
of course you bring up the interesting question about the fact that you’re
already implanted with something internally.

Are you comfortable – once again, as an employee of your
institution – having that become part of your biometric authentication
when you walk up to a computer device and start entering patient-specific
information?

DR. HALAMKA: Sure. Well, a couple of answers to that. We are
using proximity detectors today to enable qualified personnel to enter secure
areas such as our data center where we have nine million patient records stored
electronically. So I think the notion of carrying something that is used as a
means of identification is certainly workable and positive.

The challenge, I would think, in using, let's say, the VeriChip
as the mechanism of authentication is obviously that this is a medical
procedure. Now, it's a relatively painless procedure and it's relatively fast,
but I don't think we would want to compel any individual to get this thing
inserted as part of their employment or part of their use of a computing
system.

Now, obviously there are other macabre variations one could
think about, which is if somebody wanted to break into a highly secure system,
they could extract a chip from a known valid user, whether that meant removing
their arm or simply removing the chip, and clearly there are speculative
concerns about what kind of either violence or violation might occur should
that be used as a primary means of authentication.

But I will say that with biometrics, in my experience –
where a biometric is defined as a thumbprint, hand geometry, iris scan, et
cetera – in our health care environment there have been so many false
positives and false negatives associated with standard biometric technology
that we have chosen not to use it. There is also the fact that I have 8,000
computers at Beth Israel Deaconess, and the notion of outfitting every one of
8,000 computers with a biometric reader is certainly a capital expense I can't
afford.

So it is certainly attractive to think about an implanted chip
which is easy to read, certainly more accurate in the sense that it is a
16-digit binary representation rather than a thumbprint which is an analog
representation.

So if I were to choose some kind of authentication device, it
certainly has appealing qualities. I just think the requirements for insertion
would not be tenable.

MR. ROTHSTEIN: The order of questioning will go Harry, then
Jeff, then Richard, then John, and then, if we have any time, I’ll ask some of
the 30 or 40 questions that have occurred to me. So, Harry?

MR. REYNOLDS: Yes, I have a question for Dr. Waxman and then
Dr. Seelig.

Dr. Waxman, you made a comment about the privacy law and that
if the information is compelling – something the doctor must know – they can
get it. Do you know where that is in the law, or can you give me a little more
on that?

DR. WAXMAN: I can't actually reference that. I looked it up
once on the Internet last week, and it had to do with, if I remember correctly,
a public institution – records in a college setting. I can't give you the
exact citation.

MR. REYNOLDS: Okay.

And Dr. Seelig, since you're implanted, which is now public
knowledge, I guess, give me a sense of how you see this whole database idea
working. I mean, obviously, once you get X amount of people chipped, you have a
database somewhere that obviously identifies them, and then that database
starts building information, and then you think of all the doctors and all the
hospitals – and you think of the fact that HMOs really didn't win in the
United States, so people are seeing many doctors for many things and everybody
needs information. How do you see that playing out, at least in your vision?

DR. SEELIG: The National Health Information Infrastructure
initiative that Secretary Thompson started in the spring of 2004, and that Dr.
Brailer is now moving forward in implementing, has 12 goals that they want to
see accomplished. And the one goal that you're referring to is something that
is yet to be achieved, and that's interoperability.

And that means if a particular hospital or physician’s office
uses one form of electronic medical record or another, what NHII wants to do is
to be sure that those different systems can bring to a third location, if you
will, the information relating to that patient.

What VeriChip can offer is an alternative form of pointing, or
directing, to that information, so that those of us in the room here who are
conversant can give our names or some other sort of pointing information –
and actually Dr. Halamka chaired a very good session at NHII dealing with this
whole issue of identifiers to access that information.

VeriChip will be another component of that identifier set to be
sure that you can get 100 percent certainty, or as close thereto, of knowing
that the information reposing on an information database pertains to the person
in front of you, and conversely that you, in front of a clinician, can access
that information. That’s the fit for VeriChip.

MR. REYNOLDS: So would you see me giving my doctor my 16-digit
number and then they could access that database?

DR. SEELIG: Yes, and that – for example, you're undergoing
chemotherapy; you're fatigued; you've been to your 13th doctor on
your second day going through the mill, if you will. Why should you constantly
have to fill out the same forms and repeat the same information over and over
again, even if you could, let alone if you were impaired? Yes.

So – and then the other thing is, it's going to be more
accurate because it's going to be consistent, and you're not likely to make
mistakes or cause confusion.

MR. REYNOLDS: Thank you. That’s it, Mark.

DR. HALAMKA: In fact, could I add to this?

DR. SEELIG: Certainly.

DR. HALAMKA: Is that possible?

DR. SEELIG: Please. Go ahead.

DR. HALAMKA: Okay. I have added my 16-digit identifier to the
master patient index that is used to link all of our six hospitals, so my
personal medical record can be looked up by my Beth Israel Deaconess medical
record number, my private physician’s office medical record number, or my
16-digit identifier. They’re all equivalent to get to my electronic medical
records in Boston.
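
[Illustrative sketch only: a master patient index in which several equivalent identifiers resolve to the same record, as just described. The identifiers and record contents below are made up for illustration.]

# Minimal sketch of a master patient index keyed by multiple identifiers.
MASTER_PATIENT_INDEX = {
    "HOSPITAL-MRN-0001234": "person-42",   # hospital medical record number
    "OFFICE-MRN-98765": "person-42",       # private physician's office number
    "1234567890123456": "person-42",       # 16-digit RFID identifier
}

MEDICAL_RECORDS = {
    "person-42": {"allergies": ["penicillin"], "medications": ["atenolol"]},
}

def lookup_record(identifier: str) -> dict:
    """Resolve any registered identifier to the single linked medical record."""
    person = MASTER_PATIENT_INDEX[identifier]
    return MEDICAL_RECORDS[person]

# All three identifiers retrieve the same record.
assert lookup_record("1234567890123456") == lookup_record("HOSPITAL-MRN-0001234")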

MR. ROTHSTEIN: Okay. Jeff?

MR. BLAIR: This is for Dr. Seelig and Dr. Halamka. Hello, John

DR. HALAMKA: Hello.

MR. BLAIR: — this is Jeff Blair.

When we were receiving testimony from the work we were doing on
the Subcommittee for Standards and Security for e-prescribing, one of the
– well, actually, it came up in bits and pieces; some of the information
that came up was that the pharmacy benefit managers that had a portion of the
medication record for a particular patient that’s part of a health plan would
be providing that information to the prescriber in order to facilitate clinical
decision support. But they were filtering it. And they felt they needed to
filter it because if the particular patient was taking medications for HIV or
sexually transmitted diseases or behavioral health diseases and that particular
physician wasn’t involved in those issues, they were actually making the
decision that they would filter that information out and not tell the
prescriber that that patient was taking that medication in order to protect
privacy.

In short, it was a decision where they’re making a tradeoff
between the safety of the patient in terms of a drug-to-drug interaction that
may not be detected and privacy. And I was wondering if either of you have
encountered this issue where you’re having to make this type of a tradeoff, and
if so, how you’ve dealt with it.

DR. HALAMKA: I’m certainly happy to take that one because we’ve
already done it.

We launched over the last six months in Massachusetts statewide
an initiative called Meds-Info. Meds-Info interlinks the databases of RX Hub,
Express-Scripts, Merck-Medco, MPC Health, effectively all of the pharmacy
benefit management databases that serve our entire state.

And today, if you walk into Beth Israel Deaconess, Boston
Medical Center or Emerson Hospital, our three pilot users, an authorized
physician types in the name, gender, date of birth of an individual and five
seconds later, with patient consent, a result of all of that information
seeking across all the PBMs in our region appears for use in clinical care.

It does not include medications for mental health, substance
abuse or HIV treatment, and the reason for that is a regulatory one. In the
state of Massachusetts, it turns out that we require a level of consent –
specifically, not just a global consent ("yes, you can go look up my data")
but an individual, separate consent by the patient for every place that data
lives, Express Scripts, RX Hub, to release the information therein if there is
going to be mental health and substance abuse information transmitted.

So in our first iteration, which, as I said, has been live and
working very well, we have not been able to get through a lot of this consent
and regulatory restriction. And we recognize that that could jeopardize patient
care. A patient could be on an MAO inhibitor, and I'm going to give them a
medication in the emergency department with a drug-drug interaction that causes
harm.

So we're working at the state level, through the Mass Health
Data Consortium and through our folks that are involved with the State House,
to try to change that restriction so that, with a global consent, we will be
able to share these medications that are not shown today.

MR. BLAIR: Thank you, John. Dr. Seelig?

DR. SEELIG: Very briefly, I think this reinforces our approach
to this whole matter, and that is that we will conform to whatever regulatory
issues and rules of the road are out there – the technology should not
shape them but actually be responsive to them.

MR. ROTHSTEIN: Richard?

DR. HARDING: Most of my questions have been asked, but one
quick one for Dr. Tillman. You mentioned that there were medical and
non-medical issues for insertion of an identifier like a VeriChip, and that if
it was for something like security, FDA would not have any responsibility
there. The insertion of a device under the skin is not an FDA responsibility?

DR. TILLMAN: That is correct. We have a legal decision from our
counsel that, if you look at the definition of a medical device, unless there
is a medical use of this product, it doesn't meet the legal definition of a
medical device – I'll give you an example: think of ear piercing. But that
is the legal decision we've been given.

So that is correct. Unless the VeriChip or an implantable
device has a medical indication, FDA has no legal authority over it.

DR. HARDING: So, the same device inserted in the same place can
have different – okay.

DR. TILLMAN: That is correct.

DR. HARDING: Thank you.

The other thing – something that has been alluded to but that
we haven't really addressed – is the issue of voluntary use of this device
versus coerced use of the device. And there's just something a little bit
creepy about inserting something under the skin – you know, tagging. Putting
the SURGICHIP on my arm doesn't have the same feeling to me as having something
inserted into my body.

And the idea of having that being removable is good, but there
is that issue of “I have something that I can’t do anything about without
the help of somebody else.”

And from the time that I decide I don’t want this thing anymore
to the time that I get it out, it’s a funny situation to me. Any thoughts about
it? Can it be neutralized? Can it be stopped?

DR. TILLMAN: Can you turn it off, right?

DR. HARDING: Yes – turn this thing off for a while, or
something like that.

DR. SEELIG: Perhaps Dr. Waxman would like – would you like
to make any comments about that first? You've obviously done it one way and
we've done it another.

DR. WAXMAN: Well, the SURGICHIP is not implantable and it’s
removed and put in the chart, never used again. So it doesn’t seem to be so
much of an issue.

DR. HARDING: Time limited kind of thing.

DR. WAXMAN: Well, it’s not time limited; it can always be read.
But there’s no need, once the surgery has been performed. Then it’s a part of
the hospital record. It’s just not there anymore after the – it has a
one-time, one-purpose use, and once it’s done, it’s never used again.

DR. SEELIG: We're here talking about health care, and if we can
just confine the discussion to health care, I think we can be very clear about
this. And that is that, setting health care information apart from any other
kind of information, patients and people have a level of expectation about
access to, and accuracy of, that information when it's needed. People demand,
or they expect, that somehow, as soon as you get to the office or as soon as
you get to the emergency room, everything that they need to know about you is
going to be at hand.

And they don't know about all the running around behind the
scenes – getting to the file rooms and the faxes and all of the other nonsense
with microfiches – that has to happen to obtain that information and deal with
the relevancy of that situation.

So with that as a backdrop, then consider medical hardware: now
you're going to put this creepy thing in your body that has a microprocessor
and a 10-year lithium battery, and it's going to keep sending shock waves to
your heart? That sounds pretty creepy to me! Or you're going to take your
natural bone out of your body and replace it with titanium and plastic, and
it's going to work, and you can play tennis instead of having your own natural
bone? Well, that sounds pretty creepy to me as well.

DR. WAXMAN: It doesn’t sound creepy to me.

[Laughter.]

DR. SEELIG: The point being that it depends on the perspective
of where and how that hardware is being used, and to what place and use it's
going to be put.

And I think that if we look at the application and ask, why am
I having this done? – and then work back from there to the benefits – I think
that's a safer approach and a more, if you will, conservative medical approach
to things, rather than something coming out of the blue and all of a sudden I
have to make a decision about it.

DR. HARDING: I think that’s the issue that I’m trying to
raise, is that there is a consent issue to have that thing put in my arm –

DR. SEELIG: Exactly.

DR. HARDING: — and that I have to be fully aware, fully
informed, of what the possibilities are. And as long as I have that, and
consent is given, then more power to it. But I just have some concerns about
the consent process in this.

DR. SEELIG: Well, that's right. And I think that we firmly
believe that this is a medical procedure and a medical device, and as such,
informed consent must be obtained. And as any clinician knows, you can make a
recommendation to a number of patients to get the same form of treatment, and a
certain subset of patients will say yes and another set will say no. Whatever
they want to have done is done.

DR. TILLMAN: I'd like to add one more comment, and that is that
currently these devices are what we consider prescription use only, so the chip
itself is available only by prescription, and the readers as well. So at least
for these two products, they both require prescriptions – so there is medical
involvement; they're not available to just anybody off the street for their
medical use.

For non-medical use, that's a different story.

MR. ROTHSTEIN: Okay. Mr. Houston, and please save me some time.

MR. HOUSTON: Three quick questions.

[Laughter.]

MR. HOUSTON: Three long questions. For all types of this
technology, I guess, durability and failure rates – have there been any
studies, especially of the implantable kinds? How long do these things last?
Are they prone to being damaged or destroyed?

And I guess the other question is: there is the capability for
somebody to steal the number – take the number and encode something else to
emit the same number, since RFID technology may not be as sophisticated as a
VeriChip or another chip – and frankly use that for some other theft;
basically, steal the number.

And then the third question I have is in terms of the range. I
mean, I know there are different powers of chips that give them longer ranges.
What are we talking about? How far away would somebody be, especially with the
implantable kind, before they could actually read the number off the chip?

DR. WAXMAN: I have brief answers to your questions.

As I stated in my testimony, the distance is a maximum of five
inches for the SURGICHIP. The SURGICHIP is only used once. The time for its use
is within a period of hours, so it's not going to degrade.

And finally, the chip is locked electronically so that it
cannot be tampered with. It’s as if you were to envision an old-time post
office with lots of mailboxes that are open and have little tablets you can
write information on. Once the information is placed, a glass door is closed
and can never be opened again; you can only read the information inside. It
cannot be tampered with.

DR. SEELIG: I’ll be very, very brief in these answers so we
have more time.

Useful life we estimate to be 15 years. Given the way it's
implanted in the body, in soft tissue, it would take blunt force severe enough
to make that extremity, or that life form, incompatible with life for it to be
destroyed.

The abuse of the number that – I guess we could have
fiction writers here come up with some good scenarios, but the answer is not
really, number one, and number two, so what? If you have a pacemaker or a total
hip replacement, who cares? And what are they going to do with that
information? So, again, it’s the Pepsi challenge: What are you going to do with
the information now that you’ve gotten it?

And again, because it's medical information, and because it's
medical information protected under HIPAA, there's a whole constellation of
civil and criminal penalties for using that information in unauthorized ways.

And finally, the read range is three inches.

MR. HOUSTON: Thank you.

MR. ROTHSTEIN: Well, I have some general observations and
questions for the panel.

Our second panel is going to explore specifically the privacy
issues that are raised by RFID, but I’d like to ask our first panel while we
have you here to address some of the issues that I anticipate will be coming up
in the second panel.

And in particular there is the question of, sort of, unintended
consequences, slippery slopes, the potential for extending the technology to
other uses, that I think it's prudent for us to explore. I don't see anyone
opposed to the idea of trying to prevent wrong-site surgery. I don't see anyone
opposed to the idea of getting access to essential patient information in
trauma situations or to identifying victims in disasters.

But it’s easy, I think, to contemplate more extensive uses of
this technology. For example, one of the premises of the technology is that it
will only be used with the informed consent of the patient. There are many
individuals who lack that decisional capacity and therefore the consent may be
legally made by others who have responsibility for them. So, in other words,
parents could require that their children get these implants or someone who had
custody over someone who was mentally retarded or who had Alzheimer’s disease.
It wouldn't take too much imagination to see a requirement in the military that
all service personnel have these kinds of chips.

So the notion that it would be used only in the informed
consent that we view in the clinical setting I think may not work out.

In addition, I think there are many other uses that one can
envision. For example, RFID devices on controlled substance containers, which
the FDA may approve, could be used by law enforcement very easily. I could see
public health officials insisting, in lieu of quarantine or isolation, that
infectious disease patients have RFID chips so that you could follow them
before they entered public buildings or public spaces where they might spread
disease.

Tomorrow, we're going to talk about the issue of third party
uses and compelled disclosures, and so one can imagine a situation where you
apply for a job, you've got an RFID chip, the occupational physician has the
reader, and as a condition of employment, you have to allow them to read your
chip, which would give them electronic access to your complete health records.

And one of the things we’re going to explore tomorrow is
whether in these kinds of settings there should be limited access to third
parties, which would be limited only to those essential things that they need
to know.

So I could go on and on and I imagine the next panel will
probably probe many of these and probably other issues as well.

I would like to give each of you a chance to respond: even
though your technology is benign and lifesaving in some respects, what do you
say to the criticism that some people might misuse it and go beyond those uses,
and that therefore some degree of regulation should be placed on the
technology?

DR. SEELIG: All right. I have more hours in me, so I’ll go
first.

[Laughter.]

DR. SEELIG: Let me just say, Mr. Chairman, that since this
first came about, we have been very active in addressing and thinking through
these particular questions and trying, as we evolve the technology, to address
them as this goes forward.

In terms of authority of consent, in fact that's what we have
to go on – that people can provide informed consent for themselves. We are the
health proxies for our children and for our disabled parents, and that power of
attorney provides for a third party to authorize consent on their behalf. So in
that regard, RFID, whether it's wearable or implantable, in any format, is no
different than any other form of authority given for use or management or
treatment.

In terms of use for other purposes, I can only tell you how
we've addressed it: with our database, which we call the Global VeriChip
Registry System, we in fact give the user a choice – in fact they have four
choices – as to who can access their health care information. So if we're
dealing with Alzheimer's or, say, autistic children, we want the first
responders to know who the person is and what their emergency contact
information is, but it's not necessary, or even appropriate, for them to know
the underlying medical information about that individual.

So the user has the option to tag, or to block, that
information from being available to those people.

And the same thing can be true for employers – we could put
another block in there for a person to say: no, my employer is not going to be
able to see my health information, but if they need to use this for my job,
which requires secure access as a condition of employment, yes, they can use it
for that purpose only, but not for the unintended consequence of knowing what
my other health information is.

And that's how what might look like compelled use, I think, remains voluntary.

And the other part of it, too, is that there are occupations
where you and I want to know that the right people are in the right place at
the right time – that the people in the cockpit of an airplane, or the control
center of a nuclear power plant, or who have their hands on the launch codes of
a nuclear submarine, are the right people in the right place at the right time.

So I think it’s application specific and we can control it in
our way, and that’s something we would certainly offer anyone who is interested
in looking at it.

MR. ROTHSTEIN: Dr. Waxman, do you want to comment?

DR. WAXMAN: As far as SURGICHIP goes, it’s not mandatory that
the patient submit to using SURGICHIP any more than it’s mandatory that they
allow “yes” to be written on the surgical site. It certainly makes a
lot of sense but if the patient refuses, they don’t have to use it. Of course,
the surgeon, except in an emergency situation, has the right to refuse to do
the surgery if there’s no protection of this sort.

As far as the information being available to third parties,
it’s used in a closed circumstance. There are no third parties watching. The
chip is in the chart and protected by the medical records, the HIPAA
regulations, and the information that’s on the reader is timed out and will
disappear. Sir, I don’t consider that to be an issue with SURGICHIP.

MR. ROTHSTEIN: Dr. Halamka?

DR. HALAMKA: Sure. Well, I’m a strong believer in patient
consent for all uses of retrieval of medical information except in those cases
where a patient is comatose and can’t otherwise consent.

Let me tell you what we had to do, because the Attorney General
of Massachusetts was a bit uncomfortable with even this medication information
retrieval on Medicare patients who could not consent.

We are now required to print on all discharge summaries,
“During the course of your care, we may have looked up your medication
information at a time when you were not able to consent to such a look-up. If
you have an interest in exploring this further, here is the phone number of our
privacy officer.”

So even in those cases where a patient is comatose or incapable
of consenting, we're still advising them and informing them of what we did and
giving them the opportunity to understand it better.

So, certainly those kinds of policies and procedures would need
to be wrapped around any use of this technology.

MR. ROTHSTEIN: Sarah, did you have a question?

MS. WATTENBERG: Yes. I’m interested in this idea that in some
cases the device will meet the definition of a medical device and in some cases
it won’t. And so you sort of have this parallel universe of medical versus
non-medical devices.

And I want to know whether or not there’s any sort of way that
they would intersect and what the potential for abuse might be.

For instance, if you have, you know, the Associated Press, and
they buy one of these in the general marketplace, and they have a very
ingenious IT person or somebody who doesn't have many scruples – whether or
not there's a potential that they could sort of walk down the hall of a
hospital and just kind of scan people and get information.

Perhaps a question out of my own ignorance of how health
technology works, but curious, nonetheless.

DR. SEELIG: Well, first of all, we struggled with the FDA on
this whole issue of bifurcation of application and of governance. We’re
comfortable with it now, as difficult as it was to come to that.

As far as the scenario that you laid out, again, HIPAA has put
into place civil and criminal penalties for obtaining that information. So if
someone were to obtain that data, then they would be violating the laws that
are already in place, number one.

Number two is that in order to gain access to the information,
there are many steps you need to go through to get it. We use proprietary
software in the scanner, so if you had an RF tuner – a scanner tuned to that
particular frequency – you wouldn't get the information off the chip. There's
a lot more memory on the chip dealing with authentication and transfer of
information back and forth, aside from that 16-digit number.

The third element is that if that information were obtained,
the question is to what use it would be put. And I would welcome hearing what
those challenges would be, to understand how we have addressed them or could
address that potential in the future.

The fourth element is that we're the only manufacturer of the
readers, so we know where these readers have been shipped and who is receiving
them. We have a registration program for the physicians who do the insertions
and a registration procedure for the affiliates, as we call them, who obtain
the scanners. We also have a post-market tracking system, so we know where the
chips have been shipped and in whom each chip was inserted.

So we have, as part of our best manufacturing practices, a
tracking program for both the scanners and the chips so that, if a bad thing
happens, we are in the best position to work back and understand it.

And not only that: in our own personal health information
system – as I mentioned, it's called GRVS, the Global Registry Verification
System – we have a tracking system, an audit trail, that automatically records
a time stamp along with the location and the facility that actually
interrogates the chip and accesses that information.
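
[Illustrative sketch only: the kind of audit trail just described, logging a time stamp, facility and location for each interrogation of a chip. The field names and identifiers below are assumptions, not the actual GRVS design.]

# Minimal sketch of an audit trail for chip interrogations; all values are made up.
from datetime import datetime, timezone

AUDIT_LOG = []  # in a real system this would be durable, append-only storage

def log_interrogation(chip_id, facility, location):
    """Record which chip was read, where, and when."""
    entry = {
        "chip_id": chip_id,
        "facility": facility,
        "location": location,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    AUDIT_LOG.append(entry)
    return entry

log_interrogation("1234567890123456", "Example Hospital ED", "Boston, MA")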

And that’s the best we can do right now, so if there’s
something else we can work with, we’d be glad to.

MR. ROTHSTEIN: Well, I want to – oh, I’m sorry – Dr.
Waxman?

DR. WAXMAN: I just wanted to mention, as far as SURGICHIP is
concerned, SURGICHIP just has a very limited amount of information. The chip
itself is now in the medical record after it’s been used and it can only be
read with a reader that has software installed by SURGICHIP. And even if
somebody were to steal the reader with the SURGICHIP software in it – and it's
not available to everybody; it is by prescription, as you know – you still
have to have the proper password to get in.

So it’s unlikely that it could be used for any other purpose.
And as a matter of fact, even if you could steal the reader, get into the
software using the password and read the information in the chart, it’s much
easier just to open to the page that says “Operative Consent.” It has
the same information and it’s easier to get.

MR. ROTHSTEIN: I want to thank all four members of our panel
for getting our RFID hearings off to such a lively, thought provoking start,
and we will take a 15-minute break.

And I want to alert our Internet listeners that we will begin
Panel 2 at 11:10, and that should not cut Panel 2 short because unlike the
first panel, we only have three witnesses, and so we should have ample time for
their testimony as well as questions.

So we are going to take a break until 11:10. Thank you.

DR. HALAMKA: Well, guys, thank you.

[Break at 9:52 a.m. Meeting resumes at 11:12 a.m.]

MR. ROTHSTEIN: Good morning. We are back with Panel Number 2
of our RFID hearings. I want to welcome all three of our Panel 2 members, and I
think you had a chance to hear at least some of the testimony that we heard
from our first panel group and it was very stimulating and I imagine you will
have some interesting perspective to add to that.

So I'd like to invite Ms. Sotto to begin.

AGENDA ITEM: Panel 2 – Radio
Frequency Identification

Presentation – LISA SOTTO, Esquire

MS. SOTTO: Thank you very much. Good morning.

My name is Lisa Sotto, and I’m a Partner with the law firm of
Hunton & Williams. I head the firm’s Regulatory Privacy and Information
Management Practice.

And I also lead privacy projects for the firm’s Center for
Information Policy Leadership, which is a privacy think tank affiliated with
the law firm. The Center brings together business leaders, government
officials, consumer advocates and academic experts to provide thought
leadership on information management issues.

Through both the law firm and the Center, I advise chief
privacy officers and other senior executives on the development of global
information management programs. I have written and spoken extensively on
information management issues, with a focus on privacy in the health care
arena.

Thank you very much for the opportunity to participate today.
I’m doing so on my own behalf and my views should neither be attributed to the
firm of Hunton & Williams nor to any of my clients.

RFID technology in the health care arena holds enormous
promise. If its use becomes widespread, it can lead to greater accuracy and
efficiency in treating patients by making medical information immediately
accessible to health care providers. Privacy concerns, however, present a
significant obstacle to its widespread acceptance.

The benefits of using RFID in medical settings are achievable
only if patients are confident that the data being transmitted will not be
misused. The value of RFID in the medical arena can be fully realized only if
patients have confidence in both the security of the technology and in the
related policy environment.

For purposes of today’s discussion, I’ve divided the privacy
concerns related to the use of RFID into several distinct categories of
potential harm. I’ll go through each of those categories; I’ll name them first
and then hit each one in turn.

The first is the inappropriate collection of health
information through RFID technology.

Second, the intentional misuse or unauthorized disclosure of
the data by an authorized data holder.

Third, the intentional interception of the data and its
subsequent misuse by an unauthorized party.

And fourth, the unauthorized alteration of the data.

So, going to the first potential harm first, the inappropriate
collection of data through RFID technology.

In non-medical settings, there’s a widespread concern that
RFID chips may be used to collect data surreptitiously. For example, in the
library setting, RFID chips can be attached to books without the knowledge of
individual borrowers, and information about the borrowers’ reading habits can
be collected.

In the health care context, this issue does not present a
significant concern. RFID devices used in health care generally are used only
with the individual’s knowledge and consent or that of the individual’s legal
representative.

Further, with respect to the VeriChip, which is considered the
most privacy-invasive of the approved RFID devices, all data maintained in the
database associated with the chip is self-reported.

So in the medical context, currently use of RFID devices is
fully opt-in. Patients affirmatively choose to provide medical information
through RFID technology.

In addition, also with respect to the VeriChip, no medical
data is stored in the chip itself. Instead, the information is maintained in a
separate database. Therefore, even if you were implanted with a VeriChip that
was secretly scanned, the information the interloper would receive would be
limited to a 16-digit ID number, which would have meaning only to those who
have access to the database itself.

The chip itself acts as a unique identifier, not as a
transmitter of health information. So the bigger privacy concern with respect
to the VeriChip really involves unauthorized access to the VeriChip database.
This sort of unauthorized access is a significant threat to individual privacy,
particularly when dealing with sensitive health information, but it’s not
unique to the RFID context. Unauthorized database access is an issue in many
other settings and it’s frequently managed using security tools like encryption
or authentication technologies.

The second category of potential harm is the intentional
misuse or unauthorized disclosure of the data by an authorized data holder.
This is where a party to whom an individual has granted permission to access
the data for authorized purposes may use or disclose it for unauthorized
purposes.

This is a legitimate and very significant concern, but again
it’s not unique to the RFID context. It’s an issue we confront daily in
connection with the collection and maintenance of data sets generally.

Whether information is recorded on paper or electronically,
guarding against its misuse or unauthorized disclosure is a security issue and
an organizational oversight issue that needs to be addressed by every entity
that maintains sensitive data.

The third category of potential harm, the intentional
interception of information and its misuse by unauthorized parties. Any
intentional and illicit interception of medical data and its subsequent use for
purposes for which it was not intended is a clear violation of patient privacy.

Here again, however, this is not a new privacy risk that has
arisen only as a result of the development of RFID technology. The risk of data
interception and misuse involves security issues that plague every organization
that stores sensitive data.

These problems generally are addressed through encryption and
authentication. As with any unauthorized interception of data, the solution
lies with better, more secure technology.

The fourth potential harm: the unauthorized alteration of
medical data. The risk that a patient's medical information may be
inappropriately altered poses a serious threat not only to a patient's privacy
rights but also to the patient's ability to obtain appropriate medical care.

Again, as with the other risks I've mentioned, the risks
related to data integrity are not unique to the RFID context. In any situation
in which data integrity is an issue, authentication technologies and other
safeguards must be used to help ensure that only those with authority to amend
the data are given access to amend it.

By dividing the issues into these bite-size chunks of potential
harms, we find that these harms, while extremely serious, are not unique to
the RFID context. In the privacy arena, we've been discussing these potential
harms for years.

The real question is whether the current regulatory
environment provides adequate protection against the potential dangers or
whether additional protections are needed to make RFID a secure option.

For most health care providers, HIPAA's privacy and security
rules impose strict limits on the use and disclosure of health information. The
restrictions apply without regard to how the data was collected, whether
through RFID or any other collection method.

So for covered entities and for their business associates who
are contractually restricted in their use and disclosure of the data, no
additional protections appear to be necessary.

There are, however, in the RFID setting as used in the medical
context, many parties – many entities – that will have access to the data but
that are not HIPAA-covered entities. For these entities not covered by HIPAA,
other existing laws provide protection against the potential risks.

For example, the unauthorized use or disclosure of medical
data may be considered a violation of Section 5 of the FTC Act which prohibits
entities from engaging in unfair or deceptive trade practices. The FTC has
availed itself of this provision on numerous occasions to protect against many
of the same privacy abuses that I’ve mentioned above.

With respect to unauthorized data users who illicitly
intercept and exploit medical information obtained through an RFID system,
existing law provides the necessary tools to actively combat and deter this
illegal behavior. While the threat of hackers and lots of other bad guys will
continue to exist, the tools that are currently in place do provide a
sufficient framework for law enforcement authorities and other security experts
in the private sector to combat illegal activity.

In addition to the protections provided by existing law, an
industry code of conduct should be developed for entities that maintain or
access RFID-related medical data. The Fair Information Practice Principles and
HIPAA's Privacy and Security Rules provide very strong guidance in developing
this type of code of conduct.

I would submit that a code of conduct should contain the
following principles:

First, notice, and we’ve discussed this a little bit in the
previous panel. Patients who are “chipped” must receive notice,
written in plain, understandable language, of the data holder’s information
practices. This sort of notice will allow patients to make truly informed
decisions as to their level of participation in an RFID network.

At a minimum, the notice should clearly identify the entity
collecting the data, the uses and disclosures of the data, the type of data
collected, the methods by which the data are collected, the security measures
used to safeguard the information, and the rights of the patient – for
example, the rights of the patient to amend the data or access the data.

Second, consent. Data holders must use and disclose health
data only in a manner to which the patient has clearly consented, and consent
also is very much dependent on clear and complete notice.

If the data must be disclosed pursuant to legal requirements –
for example, in response to a subpoena – the data holder should seek to ensure
that the recipient uses the data only for the narrow purpose for which it was
disclosed and should also seek to ensure that safeguards are put in place to
protect the data.

The third principle, access and amendment.

Patients must have the ability to access their health
information, to challenge the accuracy of the information, and to correct it
where that's appropriate. In the health care arena, accuracy of medical
information is absolutely critical.

Fourth in a code of conduct, data integrity and security.
Health information collected in connection with RFID technology must be both
accurate and secure. Minimum standards must be established to protect against
loss and unauthorized alteration, destruction, access, use and disclosure.

Fifth, the principle of data retention and chip deactivation.
There needs to be clear guidance, first, as to how an individual may deactivate
an RFID chip used for a medical purpose. And second, there needs to be guidance
as to how to request the destruction of medical data maintained either within
an RFID chip or within a database connected with an RFID chip.

Under most circumstances, data should be retained only for so
long as the individual agrees and must be permanently destroyed once the
individual has authorized its destruction.

The sixth principle, accountability – very, very
important principle. Strict accountability standards and enforcement and
redress mechanisms have to be established for all parties that participate in
an RFID system. There must be a price to be paid for being the weak link in a
security chain.

The privacy harms that may result from RFID abuses are
significant, but they are not unique to the RFID context. While I believe
existing laws are available to address potential harms, I would nevertheless
encourage RFID stakeholders to develop and adopt an industry code of conduct to
further protect against harms that might result from misuses of the data.

A coordinated approach by all stakeholders would provide the
public with the confidence needed to support the advancement of this beneficial
technology.

Thank you for the opportunity to appear before you today and
address these important privacy issues. I would of course be happy to answer
any questions.

MR. ROTHSTEIN: Thank you very much, and there will be
questions, I’m quite confident. And we’ll move now to our second witness, Mr.
Rotenberg.

AGENDA ITEM: Presentation – MARK
ROTENBERG

MR. ROTENBERG: Thank you very much, Mr. Chairman, members of
the Committee.

My name is Mark Rotenberg. I’m Executive Director and
President of the Electronic Privacy Information Center here in Washington. I’m
also on the faculty at Georgetown Law Center where I’ve taught privacy law for
15 years.

And I'd like to thank you very much for inviting me to
participate this morning, and also for looking into the issue of RFID
applications and privacy.

I wanted to begin by drawing your attention to a report that
I’ve circulated for you. It is the annual report by my organization, EPIC. It’s
titled Privacy and Human Rights. It’s an extensive survey that we
undertake each year of privacy developments around the globe. It’s broken down
essentially into two sections. The first is by subject matter and the second is
by country.

And what we are seeking to do is to identify emerging privacy
issues and then to understand on a comparative basis how different countries
are responding to new privacy challenges.

This publication has turned out to be an enormously useful
project, I think, for a Committee such as yours that is both assessing the
impact of RFID and trying to understand how others may be responding.

I will say that over the last few years of our research, RFID
has taken more and more of our attention. In fact, this year, 2005, we’ve just
announced our list of Top Ten issues to watch and we’ve identified RFID as one
of the Top Ten privacy issues for the coming year.

But interestingly, the applications in the health care setting
have not yet received the type of public attention and debate that the
applications in the consumer setting have received.

So by way of example, during the past year, the United States
Congress held its first hearing on RFID applications, focusing on new
developments in product distribution and inventory management and the tagging
of consumer products, and on what the impact of the use of RFID techniques in
the business environment might be on the privacy interests of consumers.

In similar fashion, the Federal Trade Commission held its
first public workshop just this past year on the use of RFIDs in the
consumer-merchant context, asking questions similar to those that have been
asked by Congress.

The medical discussion about RFID applications, if I may be
direct, I think has been distorted somewhat by the debate over VeriChip, and I
say this at the outset because in preparing my remarks for you today, I came to
understand that there are really a wide range of applications for RFID
technology in the health care setting and I think it’s very important to
understand the scope of applications and to assess the privacy impact in each
of these contexts.

And in fact, that will really be the focus of my presentation
today: to look at the range of applications for RFID and to try to identify
those applications where there are significant privacy interests and those
applications where there may be minimal privacy interests.

If we're talking, for example, about the problem of drug
counterfeiting, the labeling of products in bulk distribution turns out to be a
potentially effective way to track and manage inventory in the medical care
context, much as large corporations are beginning to realize that RFID is a
useful technique to track distribution and products at the inventory level.

We understand this; we don’t see a particular privacy concern.
If it helps diminish the risk of counterfeiting of drugs, it seems to be a
useful and important application of the technology.

But, of course, when we take the next step over and we move
the tagging from the bulk level to the individual product level, so, for
example, it’s on the prescription drug vial that’s in the possession of the
patient, then we may begin to understand that some privacy issues emerge,
because now there is a linkage between a particular product and a particular
individual. And I’m going to come back to this point in just a moment.

There's a third category I'd like to suggest to you, where the
primary purpose of the RFID is not to identify products but rather to identify
people, and I think here we are looking at two different types of applications.

It was actually very nicely illustrated on the first panel this
morning. There are applications of RFID for individuals that are temporary –
for example, to prevent errors in a surgical procedure. They exist at a
particular point in time and in a particular context; when that purpose is
fulfilled, the use of the RFID effectively goes away.

We can identify some privacy concerns that might arise in that
setting although I’d actually suggest security concerns may turn out to be
greater than the privacy concerns. The accuracy of the information, for
example, may be a greater concern than the misuse of the information.

But the troubling category, and I will conclude on this one by
the end of my presentation, has to do with the permanent identification of
patients using RFID technique, and I do believe that this raises profound
issues for the medical community that you will need to consider in more detail.

Now, you’re broadly familiar, I’m certain, with the various
privacy frameworks that are discussed in the context of medical privacy
protection. Ms. Sotto, I think, did a very nice job providing an example of how
Fair Information Practices can be applied in developing codes of conduct for
the use of RFID technique.

There is certainly this specific HIPAA Privacy Rule which comes
into play with the collection of medical information. Our own organization,
because of its work on RFID issues, over the past year actually spent quite a
bit of time in consultation with consumer groups and academic experts
developing a set of RFID guidelines, but as I said at the outset, these are
really directed toward the concerns of consumers in the marketplace and don’t
actually reflect some of the issues that might arise in the health care
setting.

The key point about all privacy frameworks, whether they’re
voluntary or codes of conduct or legislative, is that they focus on the
collection and use of personally identifiable information, broadly speaking.
And as a general matter, as a presumption, I think we can say that when
personally identifiable information is not collected, there really is not a
privacy issue. There may be other concerns; I don’t mean to say that our
concerns about the misuse of data disappear when the data is no longer
individually identifiable, but they’re almost by definition of a different
type.

Now, in thinking about the privacy risks with RFID, I think
what we're really talking about – apart from the problems of a procedure going
wrong with an implant, for example – are the possible misuses of personally
identifiable information. These are the concerns that privacy law and
regulation typically try to minimize or eliminate where possible.

The use of information out of its original context raises many
concerns: data about a subject can be used in subsequent determinations about
employment and about insurance, and can have a discriminatory impact.
Ultimately, I would argue, it can actually have a certain impact on the
individual's freedom, particularly in a circumstance where they simply don't
know what information about them has been collected and how it's being used.

Privacy scholars and others have certainly recognized the
importance of Jeremy Bentham, the great utilitarian, and his model of the ideal
prison, the Panopticon. The Panopticon had the unique feature that it gave the
guard in the prison the ability to constantly observe the prisoner, while the
prisoner had no ability to know whether or not he was being observed. And the
consequence of this, which Bentham realized, was that in such a world you
didn't even need a guard, because people would simply believe that they were
constantly being observed and would control their behavior as if that were the
case. The physical Panopticon as the ideal prison has oftentimes been described
as a sort of privacy nightmare, in which people lose control of personal
information.

HIPAA itself actually deals with this issue of personally
identifiable information in a very complex fashion and I won’t go into the
detail now, but you have, as I’m sure you know, different categories and
subcategories that seek to identify certain classes of information and the
obligations that the covered entities have in the use and collection of that
data.

Our own RFID guidelines distinguish again between RFID users
and applications where personally identifiable information is not collected and
those circumstances where it is collected and we place a much greater
obligation, of course, on the circumstances where the data is collected.

Briefly, a highlight of some of the recent legislative
developments. As I said, we do believe that this year there’ll be a great deal
of attention on RFID issues.

I think an important starting point for the policy debate was
provided just a little more than a year ago by the International Privacy
Commissioners. These are leading officials from around the world with
legislative and regulatory authority over privacy matters in their national
government. Unfortunately, we don’t have actually a comparable office in the
United States.

But when this group met in 2003 to consider the RFID issue,
they said that, as a starting proposition, they would presume the application
of current privacy laws to personally identifiable information that may exist
in an RFID-enabled environment, and they proposed that additional consideration
be given to the unique tracking features of RFID.

So, for example, you may need to enable certain types of
disabling, or so-called killing, of tags to protect privacy.

Various state bills are now under consideration, and as I
mentioned earlier, there are hearings at the Federal Trade Commission.

Well, I’d like to go directly then to my proposal to you as at
least a starting proposition for what an appropriate policy framework might be
for the regulation of the RFID environment in the health care setting. And this
proposal follows from the wide range of applications that I think we will be
seeing in the use of RFID.

In this first category, which I call Tier 1, we focus simply
on the bulk distribution of products. And as I suggested earlier, because there
are no links to specific individuals, the necessity of privacy rules in this
setting is – I don't want to say non-existent; I'm sure someday someone
will show me this slide and I'll regret saying today "no privacy
obligations."

But at least as a starting point, I think it seems reasonable
that at this level, there’s quite a bit of separation between identifiable
individuals and the use of the technique.

As I suggested earlier, I think once you can link the product –
once you have the prescription drug being filled in the pharmacy or the
hospital and you have a patient, a customer, in possession of a vial with an
RFID tag on it – a whole different set of questions should be asked: What is
the application of current privacy rules in that setting? What new safeguards
may be necessary, recognizing the unique capabilities of this technology? And I
think this is a very important inquiry that should be pursued.

The Tier 3 set of applications I think really focuses on the
question of whether context can be limited. And I fully appreciate the panel
presentation – I think it was almost the ideal description of a limited
context application for RFID tagging. And if I could be assured that that was
the full application of the RFID tag, it would seem that maybe the privacy
interest here is not so great.

But as with all of these techniques, as with all businesses
that seek to expand and become profitable, as with innovation itself, it is
very, very difficult to draw these lines.

So even as I said, recognizing what may almost be the ideal
case for temporary tagging presented on the first panel, I would ask you to
consider: What are the boundaries, legal, technological and regulatory, that
can assure us that that type of tagging will be limited to its defined
application?

The final category obviously concerns VeriChip and other
similar products that may be developed. And as I said, it is my view that the
discussion about RFID in the health care context has been focused in many
respects too much on this particular product. I’m not even certain that we’ll
be talking about it a couple of years from now.

But nonetheless, to the extent that the product exists, I think
we really do need to think carefully and seriously about what it means to
permanently tag someone with a unique identifier in this country.

This is so closely tied to national debates about the use and
misuse of the Social Security number, about whether we should have a national
I.D. card, about how we secure our borders so that we can link individuals
with Federal agency databases, that I don't think you can possibly expect that
implanting a unique identity number in an individual could be confined simply
to the health care setting.

I think there are also profound ethical issues here. I don't
mean to put this quite on the same level, for example, as forced
sterilization, but we begin to approach that realm when we change a person's
body in a way that they may or may not fully understand and that is really
beyond the means of most people to restore. I mean, that's what we're talking
about when we implant a chip of this type.

It’s significant, of course; I mean, I heard Dr. Seelig this
morning tell us that we’ve now got 30 million living organisms that have been
implanted with chips. I’m obviously working off of old numbers, because last
year we were simply talking about a million.

But this is a very serious issue, and I would simply propose to
you in your recommendations for HHS on the appropriate applications of RFID in
the health care setting that you consider a flat prohibition on the
implantation of an RFID chip.

Thank you.

MR. ROTHSTEIN: Thank you very much, and that certainly raises
more questions for us to talk about. Professor Solove?

AGENDA ITEM: Presentation – DANIEL J.
SOLOVE

MR. SOLOVE: Thank you. My name is Daniel Solove. I’m a law
professor at the George Washington University Law School. And I want to thank
the Committee for inviting me to give these remarks.

RFID tags have many potential benefits, but they also pose a
substantial threat to privacy. The future of RFID is vast and potentially very
significant. As prices come down for the tags, they’ll increasingly be placed
into products; they can be placed into licenses or I.D. cards or virtually any
item or product.

Technology over time might increase the range of the signal, and that will
enable chips to be read from greater distances. One scenario is:
Suppose the police want to track a person. They could, for instance, get a list
of the products that a person owns, or if the chip is implanted in the person,
it would be even easier, and then track the person with the chips 24 hours a
day.

Police might use an RFID reader to scan people’s luggage or
bags to get a full inventory of their contents.

Another potential use is a private company tracking people’s
RFID tags and beaming them individually tailored ads wherever they go; this is
from the movie “Minority Report,” for example.

What exactly is the threat to privacy that RFID tags pose?

Well, currently there’s a bit of a divide between the online
and offline world. Online, our transactions are readily tracked. Everything we
buy at stores like amazon.com is recorded, every single purchase. I purchase
quite a lot on Amazon and they have a list of everything I’ve ever bought since
I started shopping with them.

Everything we peruse on amazon.com, for example, can also be
recorded, and is being recorded.

And unless one is very computer savvy, it’s exceedingly
difficult to achieve true anonymity online, to be truly untraceable. Offline, it
is still possible to be readily anonymous. I can go to a bookstore and buy a
book with cash, and unless people remember my face, no record is going to be
kept that Daniel Solove bought that particular book, unless I use a credit
card, but if I use cash, I can still maintain anonymity.

RFID threatens to change this. On the Internet, people’s
Web-surfing can be tracked by cookies and spyware, and I think that RFID tags
threaten to be a kind of cookie or spyware equivalent in real space, to some
extent, a way of tagging people and permanently enabling the possibility of
monitoring their activities. They can be a very powerful information-gathering
tool and a way to track people’s movement.

And a key point that’s sort of the overarching theme of my
testimony today is that our existing legal regulation of privacy is not
prepared to deal with RFID. It’s not just an issue of technology; our legal
system isn’t ready for this as it currently stands. We have a weak regulatory
infrastructure for repositories of information gathered by private sector
businesses and institutions and RFID information will enter into this realm.

To understand the full implications of the technology, we must
understand the rise of what I call “digital dossiers” and this legal
regulatory infrastructure that protects our privacy.

In my new book, The Digital Person, I discuss the
unprecedented amount of personal information that’s collected by businesses and
the government and stored in gigantic databases. Every person is living with a
digital counterpart, a digital person who resides in a database. And this
digital person isn’t made up of flesh and blood but of bits and bytes of
information that are aggregated together.

There are hundreds of records today that detail a person’s
lifestyle, their consumption habits and other information, and these records are
being gathered and collected and assembled into databases, and profiling
techniques are being developed to analyze this information and analyze patterns
of behavior to profile people and to make very important decisions that affect
people’s lives.

So currently, the information in the dossiers is used to
determine whether we receive loans, jobs or licenses.

I think there are several problems with this development, the
birth and growth of the digital person.

First, the dossiers are not kept adequately secure. Companies
do not protect personal data with adequate security, and we’re seeing a current
reality that identity thieves readily are stealing people’s identities and in
the process polluting people’s dossiers with false and erroneous information.

Second, personal information is being traded and sold between
companies quite readily. Some companies will sell your personal information to
anyone who’s willing to pay a small fee for it. We have little control
currently over how a lot of this information is used to judge us.

Additionally, the government is tapping into our dossiers of
personal information maintained by businesses.

And the law has attempted to deal with some of these problems
and since the early ‘70s, Congress has passed over 20 laws pertaining to
privacy. Federal regulation covers records from Federal agencies, educational
institutions, cable television, video rental companies, state motor vehicle
agencies.

However, it does not cover records maintained by many state and
local officials, by libraries, charities, supermarkets, department stores,
mail-order catalogues, bookstores and other merchants.

So what we see is a law that has a number of gaps and holes,
pretty significant gaps and holes. A lot of the law, the privacy statutes that
Congress has passed, are very difficult to enforce. And it’s often difficult
for the individual to find out what information has been gathered and how it is
being used.

There are some special unique issues involving RFID in the
medical context, and I’ve been talking about RFID generally and now I want to
really focus in on the medical context.

In the medical context, as we’ve heard today, I think there are
some potential benefits to the use of RFID and they will allow medical
personnel to read chips and locate health information about particular people.
And it could be that people might voluntarily want to use this technology, but
I think that there are costs that could inhibit this process and potentially
outweigh some of the beneficial uses, and obviously there’s a big range of uses
and I’m speaking at a fairly broad level of generality.

To understand what I mean, I think we need to begin with the
long-standing tradition of doctors maintaining patient confidentiality. This is
embodied in the Hippocratic Oath which dates to around 400 B.C. which states
that doctors must keep patient information confidential. In many states,
patient information is protected by an evidentiary privilege.

And the reason why we have these protections, the Supreme Court
has said, I think said it very well: “The mere possibility of disclosure
may impede development of the confidential relationship necessary for
successful treatment.”

One court expressed the rationale for the special privilege
protecting patient-physician confidentiality best. I think this court said the
following:

“When a patient seeks out a doctor and retains him, he
must admit him to the most private part of the material domain of man. Nothing
material is more important or more intimate to man than the health of his mind
and body.

“To promote full disclosure, the medical profession
extends the promise of secrecy referred to above. The candor which this promise
elicits is necessary to the effective pursuit of health; there can be no
reticence, no reservation, no reluctance when patients discuss their problems
with their doctors.”

The problem that we face, though, is that there is great
uncertainty in existing law about the degree to which medical information can
remain confidential. HIPAA regulations, for example, provide that a covered
entity may disclose health information to a law enforcement official in
compliance with a subpoena or court order, as well as an administrative
subpoena or demand. Subpoenas have a very low level of protection, much less
than a search warrant.

It is unclear whether the Fourth Amendment would apply when the
government seeks a patient’s medical information from a physician, hospital, or
other care giver.

Under what has become known as the third party doctrine, the
court has held that the Fourth Amendment doesn’t apply when a person divulges
information to a third party, so in one case, despite the fact that bankers
promise to keep people’s information confidential and often make such a promise
either explicitly or implicitly, the Fourth Amendment doesn’t require that the
government get a warrant in order to obtain those records. The government can
get those records with just a subpoena.

The logic of the third party doctrine appears, on its surface,
to apply to information provided to health care providers, because, after all,
these are third parties. On the other hand, there is this long-standing
tradition that I just spoke about of doctors maintaining patient
confidentiality.

I think if the court were to apply the third party doctrine to
doctors and hospitals, it would seem almost farcical to claim that a person has
no reasonable expectation of privacy in his or her medical records.

But this is a problem that the courts are going to have to
confront. Until this time, there’s still a big question mark in the law.

And I submit that these open questions might make people more
reluctant to use RFID technology. Many people might not want to use RFID if
their information can be gathered and used for purposes beyond their treatment.
The government can get it at the whim of a government official for some future
purpose. People might fear that RFID could be used by law enforcement to track
them or that RFID data will be turned over to the government and used
insecurely.

And I think the chips will also harm people who unwittingly use
the technology with the false assumption that their data will be provided with
adequate legal protection when our current system has all these gaps, holes and
open questions. Our law simply is not yet ready for RFID technology.

I think that we need a more comprehensive statutory protection,
protection not just at the stage of gathering data from RFID chips but also
after the information is gathered and
stored. Otherwise, I think you could severely damage the patient-physician
relationship and make people reluctant to use these chips for medical purposes.

And I think there are significant dangers that RFID tags will
be used to track people’s movement, like a beeper or tracking device installed
on their very person. And the Supreme Court has held that the Fourth Amendment
does not apply when a tracking device tracks a person’s movements in public.

And the danger is that the police might use RFID to track
people in public and might get this information from doctors or hospitals or
physicians with just a subpoena if they want to then track somebody.

So, we’re really putting this into a world where there’s very
little regulation of what the police may do with tracking technologies and the
Supreme Court has left this very big void by holding that the Fourth Amendment
doesn’t apply, and Congress hasn’t filled it yet. Congress hasn’t addressed
that issue yet.

And this is just the law enforcement use of RFID. I focused
mainly on law enforcement access. I haven’t focused on issues of security, for
example. What if the database of the VeriChip gets hacked and someone gets it,
gets the information, or steals a reader? Does that mean everyone needs a chip
re-implanted in themselves, a new chip? One security breach could in fact prove
to be a significant problem.

So I don’t think the problems end with the collection of data
or just getting people’s consent at the stage of implanting a chip. I think the
problems come from what happens downstream when all this data is stored.

We need to ensure that the tags can be permanently deactivated.
We need a way to ensure that they’re not read by unauthorized persons. And we
need to ensure that when people agree to use a tag, the tags are not then used
for other purposes, as well as a way to protect the information, the profound
amount of information that could be gathered from the use of these tags.

I don’t think the technology of RFID is malignant or benign in
and of itself. I think it all depends on how we regulate it. But I think that
right now, and my concern right now, is that our law protecting personal
information needs to advance a lot further for RFID to be of net benefit to
society.

Questions, Answers and Comments

MR. ROTHSTEIN: Thank you very much. You’ve all given us a great
deal to think about, and I’ll be thinking while I call on the rest of the
Committee members with questions. John? Oh, okay. Well, maybe they’ll be
thinking while I start with my list of questions.

The first question I have is for Ms. Sotto, but the others, I’d
like your comments as well.

And you proposed a code of conduct which would be presumably
voluntarily adopted by the industry. And given our experience with codes of
conduct in various industries, do you think that would be adequate from both an
objective and a subjective standpoint? In other words, would it work and would
it satisfy the concerns of the public that they were being adequately
protected?

MS. SOTTO: There is right now I would say a reasonably small
network of entities that are developing RFID technologies both in the health
care arena and outside the health care arena. I think peer pressure plays a
great role in dictating how folks operate within the RFID arena, and I think
that a code of conduct would in fact have a great result in forcing the
community, because of peer pressure really, to adopt these standards.

I would also suggest that the community be very much involved
in developing the standards so that they buy into the principles, understand
them and adopt them internally within their own organizations.

MR. ROTHSTEIN: Either of you want to comment on that?

MR. ROTENBERG: Well, let me say I think a code of conduct can
be a useful supplement to a legislative or regulatory framework.

Our experience looking at codes of conduct in a wide range of
privacy areas is that they’re very difficult to sustain over time, even good
ones, and we’ve seen examples with companies that had a CEO, for example, who
was committed to do absolutely the right thing on privacy and articulated a
very good set of privacy principles and conveyed that to his or her company,
and then over the time there were competitors, there were new product
opportunities, there were new regulatory requirements.

Invariably, over time, codes of conduct are ratcheted down, and
ultimately, I think, that’s the reason that, to have effective privacy
protection, you need legislation.

MR. SOLOVE: I would agree with everything that Mark said and would
also just add that even a code of conduct alone is not enough, because there are
also external pressures and factors that could breach confidentiality of
information, such as law enforcement access, for example. So no matter how good
a code of conduct is, there are still limitations on what particular
physicians or industries can do without a regime of adequate protection in
the law.

MR. ROTHSTEIN: One of the –

MS. SOTTO: May I –

MR. ROTHSTEIN: Sure.

MS. SOTTO: I think there’s a real danger in considering
legislation at this point. I don’t believe that the issue is really the
technology by which information is collected. The real issue is how is data
used? How is it misused? How is it disclosed?

So it doesn’t really matter what technology we’re using to pull
data into a system, whether it’s a database or another sort of system that we
can’t even envision today. The problem is technology changes so rapidly that we
can’t really legislate around the technology.

And I think there’s some danger of legislating around
technology, particularly this technology, which is really in its infancy. We
don’t even know the real issues to legislate around right now.

In addition, Mark mentioned some of the state laws that have
been proposed. And I think the danger in having all of these state laws in
place is that, number one, they won’t be consistent with each other; it will
provide a virtually impossible scheme for compliance officers to deal with.
Very, very difficult, I think.

The privacy legal environment is problematic right now in that we
already have this patchwork quilt of privacy laws in many other areas, and I
think to add this to the mix would really be an untenable situation.

MR. ROTENBERG: Well, if I could say briefly, actually I agree
with a good part of what Ms. Sotto just said. I think the goal is not to
regulate the technology. I think it’s to regulate the practices involving the
collection and use of personal information, which was the central point of my
presentation.

But I don’t think that’s the complete answer, because
technology enables new forms of collection and new forms of use that the
current regulatory framework may not have anticipated.

So, for example, I think a very logical way to understand the
privacy challenge of RFID is to begin by asking the question: What is the
application of current privacy law? Is it sufficient? I think Professor Solove
made a very good point about some of the weaknesses, broadly speaking, with
HIPAA.

But then we begin to look more precisely at what RFID
enables for data collection, which poses some new challenges. And there is
clearly a need to establish some safeguards.

My final point in terms of, you know, do we wait till there’s a
tremendous problem and then begin a legislative process, or do we try to create
a framework in which technology can evolve and patients and consumers can be
protected? I think the second course is always preferable.

In fact, one of the very interesting developments I’ve learned
from studying the history of U.S. privacy laws is that the most successful
privacy regimes are those that have been established at the outset of a new
technology. Think, for example, about the development of government databases
in Federal agencies in the 1960s – huge concerns about the creation of a
Big Brother police state.

The Privacy Act was passed in 1974, not with the goal of
shutting down the use of technology or preventing the use of government
databases, but rather to create a regulatory framework in which the technology
can be used and the privacy rights of citizens, generally speaking, protected.
This has been the experience with electronic mail, it’s been the experience
with cable television, and it’s in areas such as the Internet, where people
have said let’s hold off on privacy legislation, let’s try self-regulation
instead, that we have the greatest problems today in the U.S.

MR. ROTHSTEIN: All of the witnesses from the first panel as
well as this second panel either explicitly or implicitly said that consent is
a very important element of using the technology. I don’t think anybody would
question that. But the question is: What is adequate consent?

And it seems to me that if you went through the list of
possible privacy concerns beyond health care and how an implantable chip, for
example, might be used, and had to disclose those to everyone who was thinking
about being implanted, that might sort of dampen the enthusiasm of people who
are otherwise willing to consent in the health care or medical setting. I’m
wondering if you wanted to comment on that.

MR. SOLOVE: Yes. I actually think that the issue of consent is
a complicated one, and I think you’ve hit on it – you know, how can you
disclose all the possible future uses of something?

And I think that one of the difficulties is that if you had a
system with a better legal and technological way of ensuring that secondary
uses of the RFID tags could not occur, then these potential future downstream
uses would be curtailed.

I mean, the only reason why they’re possible is because either
the technology is not limited in such a way – and we’ve seen two different
kinds of technology with different kinds of limits, the SURGICHIP, which I
think has a lot of limits that are built into the technology in use that will
prevent a lot of downstream uses, and the VeriChip, which I think does not have
the same degree of protections from the downstream use.

And I think getting consent in the latter scenario is much more
difficult because it’s really hard to consent to something when one doesn’t
know what could potentially lie in the future and one hasn’t really thought of
those scenarios down the road.

And I think that one thing to think about is it’s not just
getting consent but it’s also – consent is really meaningful in a regime
where future uses are limited, and then I think people can give meaningful
consent if they know, okay, I consent to this data use and that’s the use and
that’s it.

MS. SOTTO: Consent very much depends on notice, and in order to
provide accurate notice, you do need to lay out the possible scenarios.

Now, the possible sort of real fear-inducing scenarios could be
curtailed, I think, with better technology. And I’m no technology expert at
all, but I would imagine that there are huge security improvements that can be
made both in a chip that contains data and, you know, the fact is right now the
VeriChip does not contain data itself, but in the future, as somebody mentioned
on the previous panel, there is, I think, a strong likelihood that the chip
itself will contain medical data, much like the SURGICHIP but possibly
implanted.

So I think we need to think about those issues, as Mr. Rotenberg
has said, in advance of these uses of these technologies actually being here
today.

With respect to consent, however, I think the issue of data
being disclosed in an unauthorized way, that’s illegal conduct and can be dealt
with through criminal statutes and with better technology, better security,
authentication technologies, encryption and the like.

MR. ROTENBERG: Well, let me say consent has long been a very
problematic term both in the privacy world and the field of medical ethics. I
think, in fact, medical ethics has usefully informed the privacy world about
what meaningful consent means because in the medical context, people’s
decisions about the types of procedures to accept/not accept really do have
lifelong consequences. And for that reason I think there’s actually been an
elevated concern about the use of the term consent that’s not generally found
elsewhere.

But at the same time, it’s important to understand that most
privacy regimes don’t rely solely on the concept of consent. There are a lot of
related interests intended to insure fairness and transparency and
accountability that help make consent meaningful.

And I think it’s critically important – I mean, I didn’t
begin with a conclusion today that I thought that the implantation should be
effectively prohibited. But the conclusion I came to was that the line between
consent and coercion, or the lack of opportunity, for example, of informing
someone that the same purpose can be fulfilled in another way that doesn’t
require being permanently tagged with a unique identification number, is so
significant, and there are so few circumstances where I could even conceive of
meaningful consent given the viable alternatives for obtaining the same
objective, that I do think you should prohibit it.

MR. ROTHSTEIN: I have one last question, and I think many of us
recognize the obvious privacy concerns of requiring implantable chips in
parolees or non-citizen visitors to the country or you can think of all sorts
of examples. Our jurisdiction is limited to health information privacy.

And so what I am concerned about is how we can come up with
recommendations where we don’t throw the baby out with the bath water, where
we’re concerned about all these abuses, but the abuses are in areas where we
don’t have jurisdiction.

MR. BLAIR: Potential abuses.

MR. ROTHSTEIN: Potential – well, I think there are some
that I could postulate that everybody would say they’re abuses.

So what I’m concerned about is that we not come up with
recommendations that preemptively throw the baby out with the bath water and
lose the potential benefits in the health care setting because of our concerns
of downstream uses. And I’m wondering if you could give us some guidance about
that.

MR. ROTENBERG: Well, I agree. I mean, I’m not asking you to
focus on the RFID applications outside of the health care setting. It’s only
because there’s one application that’s been, you know, highly publicized and
the FDA – you know, I just have to comment: I thought it was somewhat
remarkable that the witness from the FDA said that the counsel for the FDA
concluded that the VeriChip was not a medical device, and Dr. Seelig
representing VeriChip said less than 10 minutes later that it was a medical
device. So there’s at least some confusion within the regulatory framework
about what we do with that particular product.

But I certainly agree with your point. I’m not asking you to
consider applications outside of the health care setting.

MR. SOLOVE: I would recommend in particular – I mean, I
know I’ve mentioned some problems or some other aspects of the law that are
beyond your regulatory jurisdiction.

I think the difficulty is, though, when you bring this
technology to bear, and especially implant chips in people, I think you need to
think holistically and at least make recommendations to Congress to address
some of these problems that the technology runs into, or create regulations
that limit the technology in such a way that those problematic uses can’t
occur until the point where Congress might adequately address those concerns.

And then maybe those technological limitations that might be
regulated on the technology could be lifted.

MR. ROTHSTEIN: Mr. Houston, do you have a question? Oh, I’m
sorry – Ms. Sotto, you wanted to comment on that.

MS. SOTTO: Thank you. I think we have a difficult framework in
this country in which to work with a technology like RFID. In Europe, where
there is overarching privacy legislation, I think there’s a different sort of
methodology for addressing these issues in that they address the uses, the
disclosures of the data rather than the technology by which the data is
gathered or maintained.

This is a much, much bigger issue than this panel is charged
with, although it might be interesting to discuss, but I would caution against
restricting uses of a particular technology and suggest instead that we look to
the bigger picture in this country of how we use data, how we collect data, and
how we disclose data, and that we do so early.

MR. ROTHSTEIN: John? And then Harry.

MR. HOUSTON: I think you just said what I was going to say,
which is, I think, that what I’ve heard here is that RFID is as much a surrogate
for the bigger issue of rights and the application of new technologies that
might be used for good and for not-so-good purposes.

As Dr. Solove has indicated, if I want to be anonymous, I
simply don’t use my credit card. But it doesn’t have to be RFID; it’s the
facial recognition systems that they put on telephone poles and they’ve used in
a variety of settings to try to identify fugitives and felons.

This is really the bigger issue of personal privacy, and I
think RFID is simply a technology that causes us potentially to be less
anonymous. And sometimes it’s good in a medical case where you’re
incapacitated, somebody can figure out who the heck you are and try to get
information about you, but yet, where you want to be anonymous and you frankly
don’t want others to track you and know of your buying habits and your other
personal habits, then this is one of a number of technologies that could be
potentially harmful.

But it really isn’t much more than an extension of whether or not
you’re recognized in a store or otherwise; you’ve also indicated that if
somebody really wanted to know who you were and track you, I guess
it’s just a more convenient way to track you.

So, again, I think this is a surrogate for a bigger issue that
maybe really is outside the purview of this Committee even, unless we want to
look at a variety of technologies and the like.

MR. ROTHSTEIN: Harry, then Richard and then Helga.

MR. REYNOLDS: Yes – excellent presentations. Glad I’m not
a doctor asking this question. But some of us on the Committee are dealing with
standards; we also deal with security, and we’re also dealing with privacy. So
we’re looking at the ball three different ways and kind of twisting it around.

As we talk about privacy, we always talk about the patient’s
privacy and their information privacy. On the other hand, we hold the care
givers liable and responsible if something happens to that person.

So as we build these things, whether it’s e-prescribing or an
electronic medical record or RFID – and I agree with you about the technology;
there will be 10 more two years from now that have a capability of gathering
information – and as you look at how health care is delivered, there are
multiple points of consent.

When somebody gets a prescription and they let the insurance
company pay it, they’ve given people consent to have the information. When they
go to a doctor’s office, they’ve given the doctor consent to have the
information.

Now they show up in an emergency room somewhere, and there is
technology that is available to deliver all that information to them. Where do
the rights of the care giver and the rights of the patient intersect? Where is
good medicine versus technology? Where does that hit?

That to me, I struggle personally; that’s the one I struggle
with, because the information is available, it’s been given to people for
different reasons, and now, can’t it be delivered for that patient’s purpose
without them going out and saying, I want this piece not in there and this
piece not in there and this piece not in there?

But they may decide their fate and/or the fate of the person
who is liable for their care at the moment that they appear. So I’d just love
some input from you just to –

MR. ROTENBERG: Let me say, sir, having spent a long time on
this issue and having also been involved in the development of the HIPAA
guidelines, I think in the ideal world, the interests are aligned.

In other words, I think it is in the patient’s interest that
the care giver know as much information as possible about the patient to give
the most complete, correct diagnosis and the best assistance that can be given.
I think that’s self-evident. And, of course, as the care giver, you try to
obtain all the information that you can to use your skills to the best
advantage.

The reason that we have debates about the protection of
medical privacy is because we understand that this same information has value
and use to others outside of the medical context. And that’s also self-evident.

It has value to employers who make determinations based on
insurance liability and other factors in hiring; it has value to the Federal
government in its investigation of crime; it has value to family members
because a person’s medical condition may affect their own condition.

And so privacy laws do in fact attempt to facilitate — and I
think this was Dan Solove’s point, actually — privacy laws attempt to
facilitate disclosure of medical information from the patient to the care giver
by providing in effect a protected conduit: I will tell you these things so
that you can give me better care because I have the assurance that they won’t
be disclosed to others.

And this, you see, is really the key here. Without that
assurance, I believe, as do others, a patient’s willingness to be forthcoming
is diminished.

So the interesting paradox about privacy protection is that
the more privacy protection you provide in these relationships, the more likely
you will have the open exchange of information that you need to deliver the
care.

MR. ROTHSTEIN: Dr. Harding?

DR. HARDING: Questions again have been asked and I’m trying to
keep focused on the data, you know, keeping that information private and so
forth and how we surround that data with safeguards. But I keep coming back to
the visceral issue, or subcutaneous issue, the feeling issue, of a permanent
versus a temporary tag.

If the VeriChip were on a bracelet or on a necklace, would we
be discussing this in the same way? Or is there something about the
subcutaneous nature that causes some kind of a feeling state or something or
that brings up the possibility of a national health identifier again, a
proprietary national health identifier? It gets us thrown off, somehow.

MS. SOTTO: There’s been a lot of discussion in the RFID
context of a kill switch, and while that is not yet in place on the devices
that we’ve heard about today, my sense is that the market would respond if
there were reluctance on the part of individuals who would otherwise be
chipped, who would resist because they can’t turn the device off at will.

So I would suggest that market mechanisms will respond to
force those who develop these technologies to adopt some sort of kill switch,
whether the device is in clothing, whether the device is in a library book,
or whether the device is in fact implanted subcutaneously.

MR. ROTHSTEIN: Richard, I would just say that I think it’s
both. To use your term, there’s a feeling that it’s a little “creepy” that it’s
subcutaneous, but I would have the same concerns with the uses of information
if it were a bracelet or a tag. You think about the ankle tags that they
sometimes put on offenders to make sure that they stay on home arrest. We’re concerned
about a national I.D. card and so on, and that is certainly not part of your
body.

So I think it’s both, but I think it’s a very interesting
question.

Helga?

DR. RIPPEN: I guess I’d like to continue in that theme to some
degree.

When you talk about technologies, a lot of the time it’s not
the technologies per se, and I think a lot of you brought that up; it’s
really the application of the technology. And if you look, then, at what is
unique about this RFID, one can think about, well, it depends on the use. If it’s
to identify an individual, one could argue then, what’s the cost benefit, the
risk, the harm, potential harm of using this as opposed to an I.D. card, as
opposed to a biometric, as opposed to anything else?

And I think depending on the circumstances, one may decide
that it isn’t really beneficial or the costs outweigh the use, especially once
you start talking about implantation because then you have a lot of other
risks, health risks, potentially, too, and removal issues, which I was worried
about, too.

The other issue is: What are the alternatives? And that then
also goes to other potential uses, for example, Alzheimer’s patients wandering
off. There, the technology is more a proxy for having someone man the door. And
again, the question then becomes: What are the costs, what are the benefits,
what are the appropriate uses of the technology?

And so I don’t know if there’s really a black and white to
technology per se, but actually when do we consider use of a specific
technology to help us address an issue and then what would be the appropriate
use of that?

Now, this is a little stickier because obviously of the
secondary use potential.

MR. ROTENBERG: I’d like to say something on this topic. I
know it’s oftentimes the experience that when you hear a presentation on a
privacy issue and someone makes a strong argument for privacy, you assume that
that person has concerns about technology broadly speaking. And that’s really not my
view of technology or of RFID, and I want to give you a very specific example.

There was an ongoing, very high-profile debate about the
technology of encryption 10 years ago. The question was: Do we have an
encryption standard that enables strong communications and minimizes the risk
of third party disclosure or do we have an encryption standard that enables
third party access that law enforcement and others may take advantage of?

And I think the privacy community, broadly speaking,
understood very quickly that this was not a debate for or against technology;
it was about the types of technologies that would or would not enable privacy.

Now, that’s my view of RFID. I think there are RFID
applications that are obviously beneficial. But from saying that there are RFID
applications that are obviously beneficial, I don’t think you can conclude that
therefore there are no RFID applications that raise privacy risks. I think
that view of the world would be as mistaken as a view of the world which says
all RFID applications pose privacy risks.

So what I’m trying to propose to you today is a way to think
about this technology that focuses on the degree of privacy risk, considering
primarily the collection and use of personal information but also recognizing
that this is a new technique that has certain specific capabilities that I
think you’d want to understand as well.

MR. ROTHSTEIN: Sarah, and then Jeff will have the final
question.

MS. WATTENBERG: I just want to know if there is anything we
can learn from everything in this book. What is the rest of the world doing,
you know? Is there anything yet or are we sort of more or less ahead of the
curve, so to speak?

MR. ROTENBERG: It’s really interesting. I mean, as Ms. Sotto
said, one of the classic comparisons in the privacy world is between the U.S.
and Europe. Europe tends to create a regulatory framework, which they did for
privacy protection; the U.S. tends to let the private sector lead, and
interestingly, in trying to resolve conflicts between the U.S. and Europe,
we’ve seen several examples now of sort of meetings in the middle around such
things as safe harbor.

You know, I’ve given you an approach this morning; many others
will have other approaches. I would say: Look at a lot of approaches. Look at a
lot of different ways of understanding privacy protection in the RFID
environment.

There are going to be a lot of committees like this one meeting
around the world over the next few years trying to answer the same question
about the delivery of health care services in their own countries and they will
come up with some very interesting answers. So those are my proposals.

MR. ROTHSTEIN: Jeff?

MR. BLAIR: I don’t know how many of the rest of you listen the
same way I do, but when Dr. Seelig was making his presentation and I was
listening, I was listening for the potential for abuse. And I was impressed,
because Dr. Seelig did an outstanding job of identifying all of the different
ways that he had designed the system. And when he was done, it shut off any
idea that I had of how that system could be abused.

Now, my imagination may not be as great as many others’ here,
and there could be future technologies or systems or organizations that may be
able to break the system, but it did leave me with the feeling that what Dr.
Seelig has done might be a very good beginning point for the best practices
that Ms. Sotto suggested, and that that may take us through many
years before we discover things that we don’t know now.

And when we discover those, either we add them to the best
practices or we see if they’re of such a nature that legislation is required.
Legislation, while it has some great strengths, takes a long time. Best
practices are something that could sustain us for the next year or two or three
or four or five or six or seven, until legislation is crafted.

So, given that those were the reactions I had, could I hear
from the testifiers who might point out the fallacy or flaw or shortsightedness
in my observation?

MR. SOLOVE: Well, I think, first of all, privacy law – I
think Mark made the point well – really helps and facilitates the use of
technology and facilitates people using the technology. And I think that all of
us agree on this panel that we don’t see this technology as malignant; we see
this technology as potentially very beneficial.

My concern is I think that people who are using the technology
have to be adequately protected. I think there are a lot of people who might be
very reluctant to use the technology or to implement it as effectively as they
might want it implemented unless these concerns are addressed, and addressed in
advance, because once the information is out there, once there is a potential
abuse, it’s often hard to put Humpty-Dumpty back together again, especially
when your information is out there and it’s identifiable to you. Your Social
Security number gets out there or someone steals your fingerprints; it’s very,
very hard to put the pieces back together.

One potential danger I see is: What would happen if everyone in
this room had an implantable chip and then, with millions of people in the
whole system, someone hacked into the database? Databases with very good levels of
protection, including government organizations with highly sensitive data, have
been hacked.

MR. BLAIR: But don’t we already have laws specifically against
that with the HIPAA privacy regs and security regs?

MR. SOLOVE: Well, we certainly have some protection against
hacking but I think we lack laws that create adequate incentives for companies
to maintain better security, I think especially when it comes to keeping
personal information private.

When you look at the law, I think it is often woefully inadequate
at protecting us against crimes like identity theft. The level of security
preventing someone from accessing our data is, I think, extremely low. And I
think that this is a problem of business practices that are often shoddy and of
law not adequately addressing some of those business practices.

That, I think, is important, that we make sure that when this
information is gathered, there’s a framework to ensure that at least there are
minimum, adequate security practices for keeping the data secure, and to ensure
that if something happens, there’s a way to put Humpty-Dumpty back together
again.

If there is a security breach, even in the best of
circumstances, even with all the law in place, the law has to provide some way
for people either to have the chip deactivated or some other way to fix the
problem once it happens.

MS. SOTTO: In my testimony, I tried to break down the harms
into smaller pieces of the puzzle so that we could analyze each piece
individually. And when you do that, I think we find that in most respects,
though not all, RFID technology carries the same dangers that many other
collection methods carry. For example, can you hack into a database? Without question.
Technology standards are not there yet to absolutely, without a doubt, protect
data.

Does that differ from any other hacking incident? Not really.
Is the data more sensitive? Absolutely.

We have in place HIPAA right now in the health care context.
It’s not perfect by any means, but it goes a long way, much farther than
protections in any other industry sector. It goes a long way towards
protecting data, and HIPAA’s security rule, I think, is quite prescriptive.

The other piece that I just want to quickly comment on is Mr.
Blair said that legislation takes time. Legislation done well takes time. And I
think it’s very important, if we are going down that track with respect to this
technology or any other, or with respect to data collection and use generally,
that we take the time to consider these issues in a bigger, national debate.

MR. ROTENBERG: If I could briefly say, I agree with you, Mr.
Blair. I think codes of practice can be very useful. We certainly support best
practices. My point was simply that I don’t think they should be viewed as an
alternative to legislation; I think they can be complementary to legislation,
they can provide a framework for legislation.

Now, if I may make a second point briefly, just in response to
your other comment regarding Dr. Seelig’s presentation. I actually had a very
different reaction, as I think did Dr. Halamka.

It hadn’t occurred to me actually that the chip could be
forcibly extracted and used basically as a form of high-tech identity theft.
And then I recalled that Dr. Seelig’s company, Applied Digital Solutions, has
an application of the chip involving – and this is not secret knowledge
– undercover police in Mexico.

So we have a situation in a very dangerous law enforcement
setting where people are relying, at least to some extent – I would assume
some redundancy in the identification protocols – but even so, they’re
relying to some extent on the authenticity of that chip in a particular body at
a particular point in time to prove that they’re an undercover agent to someone
who needs to know that.

If I was on the other side of that operation, I would be
interested in ways to obtain that chip.

So I think we really need to think through this. There is a
very significant difference in moving from the temporary application of RFID
for person identification to the permanent application.

MR. BLAIR: The case you just made I found very compelling
because you wound up saying maybe we need both. We need the best practices,
which we could do quickly, and we could base them on things that we’ve heard that
seem reasonably secure within the medical setting for medical devices, in this
case, RFID.

But you pointed out the fact that there’s a connection between
the information that’s gathered from this source and the ability to identify
somebody pretty definitively with an RFID chip and the broader issues which
legislation probably would be required for. Are we converging on our thinking
here?

MR. ROTENBERG: Yes, sir, I think we are.

MR. ROTHSTEIN: Sarah, you had a quick comment?

MS. WATTENBERG: A question.

MR. ROTHSTEIN: A quick question.

MS. WATTENBERG: A little bit off – something that keeps
going around in my head with this conversation and also the larger conversation
of electronic health records, that I haven’t really heard anyone talk about, so
perhaps I’m dreaming this up, but HIPAA permits disclosures without authorization
for the purpose of treatment. Is there a scenario in which these chips are sort
of construed as medical records, that the information – I mean,
everybody’s presuming that consent would be obtained; but is there is a
potential for this information being shared without consent because it would
further the treatment, quality treatment?

MR. ROTENBERG: And this actually goes to the point that RFID
has a new capability that was not anticipated with HIPAA. You see, consent sort
of requires knowledge of the flow of information. It requires the ability to be
aware of the circumstances when information is being conveyed to others.

There are certainly applications involving RFID where it’s
possible to obtain the person’s identity without their knowledge that it’s
being disclosed.

Now, there are technical questions about the reader and whether
you have an active transmitter, passive transmitter and so forth, but I think
what your question goes to is the recognition that this is a technique which
may enable the transfer of identity or personal information without the actual
knowledge and consent of the data subject.

MS. SOTTO: I do not think that HIPAA is a panacea in this area
because HIPAA covers only very specific covered entities with respect to
specific information. There’s no question that within the RFID system there are
parties that are not covered entities under HIPAA that will be privy to the
information.

So I don’t want to leave the impression here that HIPAA is the
be-all, end-all and provides all the privacy and security fixes that we need.

However, to go to your specific question, with respect to
treatment, HIPAA presumably was very well considered and it was determined that
for treatment purposes, data could be shared with other health care providers
to use it for that specific purpose.

MR. SOLOVE: Just one quick remark, building on some of those
comments: our current regulation of privacy, as a broad generalization,
regulates the particular data holder, so if you happen to be a covered entity,
you fall under the protection of HIPAA. If you’re not, you’re not, even if the
data is the same.

And I think one thing to think about in developing codes of
practices for this and regulation for this is to think about the data itself,
not just who’s holding it, but in fact the data, and protect it across the
board, because I think that one thing RFID does is that it can take medical
data and potentially change who the holders of it could be, and that could
actually alter, under the existing regulatory framework, what level of
protection it gets. And I think that’s a potential danger that needs to be
shored up in a regulation so we have a consistent level of protection of the
information no matter who has it.

MR. ROTHSTEIN: I want to thank all three panel members for
excellent testimony. I also want to thank Catherine Lorraine from the FDA for
putting Panels 1 and 2 together even though she had to leave and get back to
the FDA.

So, thank you again, and I hope we can call on you in the future
as we continue to struggle with these issues.

The Subcommittee will now take a lunch break until 12:30 when
we will start with Panel Number 3. Did I say 12:30? I’m sorry – 1:30.

[Lunch break at 12:34 p.m.; meeting resumed at 1:37 p.m.]

MR. ROTHSTEIN: Good afternoon. We are ready to resume our
hearings before the Subcommittee on Privacy and Confidentiality of the National
Committee on Vital and Health Statistics.

This morning we heard from two panels on RFID and this
afternoon we will begin with a panel on decedent health information. This is a
topic that we visited about six months ago and decided it was so important that
we needed to revisit the issue and ask for more witnesses to —

MR. BLAIR: Was it resurrected, maybe?

MR. ROTHSTEIN: Well, possibly to –

PARTICIPANT: Oh, that was bad!

[Laughter.]

MR. ROTHSTEIN: — dig into the issue. And so we have two
witnesses this afternoon and I want to first welcome both of you, and ask for
Dr. Novak to begin.

AGENDA ITEM: Panel 3 – Decedent Health
Information Presentation – STEPHEN E. NOVAK

MR. NOVAK: Well, first of all, I’m not a doctor, so –

MR. ROTHSTEIN: Okay!

MR. NOVAK: — “Mr. Novak” will be fine.

MR. ROTHSTEIN: Mr. Novak.

MR. NOVAK: First, I’d like to thank the Committee for having
me here. HIPAA’s had a huge impact on American life; I can see where its effect
on archives and libraries would be easy to ignore, and I’m grateful for being
given this chance to explain our concerns to you.

As Head of Archives and Special Collections at Columbia
University’s Augustus C. Long Health Sciences Library, I’ve had to deal with
the effect of HIPAA on researcher access to records in my collections.

I’ve written and spoken about the HIPAA Privacy Rule both for
the Society of American Archivists and the Archivists and Librarians in the
History of the Health Sciences.

That said, I want to be clear that I am not an official
spokesman for the archival profession, for historians of medicine, or for
Columbia University. Many of my colleagues in both organizations see me as
someone with expertise on HIPAA and its effect on libraries and archives, but
they might not agree with everything I say here. As a matter of fact, I’m sure
they won’t agree with everything I say here.

As a Department of Health and Human Services publication
notes, HHS received over 63,000 public comments on the HIPAA Privacy Rule while
it was being formulated. It’s clear from reading the legislation that resulted
that none of these comments were from archivists, librarians, manuscript
curators or historians of medicine.

Because the Privacy Rule’s basic concern is the use of records
containing protected health information in clinical and biomedical research,
applying it to archives and libraries has not been an easy fit for the types of
research requests that we deal with on a daily basis.

That said, I believe that most libraries and archives that
have found themselves to be a covered entity under the HIPAA definition now
have a policy in place to regulate access to these records in their collections
that contain protected health information.

At Columbia, our policies for researcher access to records
containing PHI have been in effect since the spring of 2003 and are posted on
our website.

My knowledge of access policies at other libraries and
archives makes me believe that most are similar to Columbia’s but that none are
exactly identical. Although one of the Privacy Rule’s goals was to impose
uniformity on access to records with PHI, its result among libraries and
archives is the opposite. Different institutions treat identical materials
differently.

Sometimes this is because an institution is not a covered
entity, and hence, not affected by HIPAA. But even libraries and archives which
are covered entities have divergent policies because their legal counsel has
given different interpretations of important aspects of the HIPAA Privacy Rule,
and these different opinions have their origin in the ambiguities of the Rule
itself.

While I believe the profession is coping more or less
successfully with the Privacy Rule, its opacity is irksome and worrisome to
librarians and researchers alike. Clarity is needed if we are to both abide by
the letter of HIPAA while still serving our patrons by providing prudent access
to records of the past.

Those parts of the Rule that have confused archivists and
libraries were described in an October 22nd, 2003, letter to
Secretary Tommy Thompson by then Society of American Archivists President
Timothy L. Ericson and President Jodi Koste of the Archivists and Librarians in
the History of the Health Sciences. This letter has never been answered, and
its questions are still relevant today. These questions are, and I will repeat
what’s in the letter of 2003:

First, does the Privacy Rule apply retroactively? If so, how
far back does it extend?

Libraries and archives hold records going back centuries, yet
HIPAA makes no distinction between the records of the 1990s and the records of
the 1790s. Why does someone looking to use an 18th century
physician’s casebook have to go through the same procedure as someone wanting
to use 20th century patient records? Is it possible to establish a
date before which records with PHI could be made available without researchers
having to go through a privacy or institutional review board?

Second, if a non-covered part of a hybrid institution receives records
with PHI from a covered part of the institution, must it create a business
associate agreement?

Libraries and archives don’t easily fit into HIPAA’s business
associate model. We’re not separate organizations that are contracting, but
rather internal departments of an organization. Must a university library have
an agreement with its university hospital before accepting records with PHI?
Must state archives have agreements with all those branches of state government
that generate records with PHI?

Three. Do guidelines for research in the PHI of deceased
patients allow the researcher to use actual patient names? If not, is there a
chronological point at which the names can be used?

The rule for research use of the PHI of deceased patients is
currently unclear as to whether access to these records allows researchers to
use actual patient names in their finished product, whether that be a book, a
dissertation, a documentary film, whatever.

If use of names is not allowed, it would mean that certain
historical, biographical and especially genealogical works where the identity
of the individual is the whole point could not be written.

Laurel Thatcher Ulrich’s Pulitzer Prize-winning A Midwife’s Tale,
based on the late 18th and early 19th century diary of Maine midwife Martha
Ballard, might be impossible to write now if the diary were held by a library
that was part of a covered entity.

Four. Physicians and other health care providers often mention
names of patients they are treating in their correspondence, sometimes
casually, sometimes in more detail. At what point does this correspondence
become PHI?

Personal papers of physicians, nurses and biomedical
scientists are filled with correspondence, outside of formal patient records or
clinical records, in which the names of patients may be mentioned. Often, this
is done in the most casual way, but occasionally it is more detailed. As the
Privacy Rule stands now, archivists will have to examine every document to make
sure no patient names are mentioned before opening them up to researchers, an
impossible task for most of the profession. It may lead to closing many
collections in which the amount of PHI is minimal.

Five. If photographs of patients were taken for publicity,
fundraising or clinical purposes and these images have appeared in published
form in the past, can we assume that the patients depicted gave their consent
to be published, even if the actual consent forms no longer exist?

Traditionally, photographs of patients have appeared in a wide
range of hospital and medical center publications. These photos are eventually
acquired by libraries and archives when they are no longer in active use by the
parent organization.

We know that at many institutions, some kind of permission was
asked of the subject of the photo before publication, but in most cases these
forms no longer exist. How can we allow use of these images? At what point does
a photo of a patient reach the threshold of PHI?

And that’s the end of the questions that they sent in 2003.

These last two questions, use of names and administration of photo collections, are particularly important because of the enormous effect they have on the way
we administer our collections. Perhaps the largest challenge we have had in
applying HIPAA has been the way it has changed the definition of what records
are considered confidential.

Previously, history of medicine archives and libraries regulated access to what we called “patient records,” usually hospital patient records, either in old-fashioned bound volumes or in modern file folders, which were easy to identify and segregate.

We realized that patient information sometimes appeared in other forms, say the correspondence between a physician and her patient or the research notebooks of a clinician, but these, too, were usually distinct and obvious among the larger mass of an individual’s papers, and access to them was still generally easy to administer.

Enlarging the definition of “confidential” to the
more expansive protected health information has made every piece of
correspondence and every photograph a potential landmine for librarians and
archivists. It has made the administration of the HIPAA Privacy Rule fraught
with unanswered questions and has resulted in makeshift expediencies to allow
us to function at all.

If every mention of a patient in physicians’ correspondence or
every photograph of a patient in a hospital bed falls under the definition of
PHI, those of us in charge of the nation’s history of medicine archives and
manuscripts will face an impossible task.

Modern manuscript collections tend to be large and, the digital revolution notwithstanding, are getting larger.

To give you an example, one physician’s papers which we
recently opened at Columbia comprise 128 cubic feet, or about 350 standard
archival storage boxes. With a professional staff of two at Columbia, we simply
do not have the people or the time to be constantly policing our collections.

My colleagues at the Archives of the Johns Hopkins University
Medical Institutions, whom we’ll hear from after me, have addressed part of
this problem by devising a form to cover what might be called “incidental
disclosures” when researchers encounter PHI in the papers they are using.
It is something we’re contemplating using at Columbia, too, but it has to be
said that I at least see nothing in HIPAA that says such an out is allowed.

The problem of photographs is thornier still. At Columbia, I’m
afraid to say, we basically just ignored the issue since to strictly follow
HIPAA, we’d have to probably close our photo collection entirely. Since we
currently hold over 100,000 images and since our photo collection is one of our
most heavily used ones, we do not feel that closing it is an option.

In conclusion, I’d like to mention my fears of an unintended
consequence of HIPAA on the historical record of medicine and the biomedical
sciences in this country. The major academic medical centers, of which the
majority of formally organized history of medicine libraries and archives are
part, will continue to deal with the complications of HIPAA. They have the
resources to hire legal experts; they operate IRBs and Privacy Boards, and they
have a commitment to supporting research in the sciences and humanities.

But the majority of 20th century medical records in
this country still remain, I suspect, in the custody of smaller, local medical
institutions that are not primarily research institutions and do not have the
resources to deal with these issues.

For these institutions, the solution to the question of
research using records containing PHI may be to destroy the records as soon as
possible. Hospital patient records in particular have always been vulnerable to
destruction – they take up space and cost money to maintain – and
after they are no longer being actively used for health care, administrators
may find in HIPAA yet another reason to trash them.

On the donor side, physicians and scientists giving their papers to archives may purge them of everything they think is covered by the Privacy Rule, even correspondence existing outside patient records.

All this leads me to fear that the primary documents needed to
write the history of health care and the biomedical sciences in the United
States may come to an end in the late 1990s. It would be paradoxical and
unfortunate if a law that was designed in part to shed light on the workings of
our health care system makes it impossible for the scholars of the future to do
just that.

Thank you.

MR. ROTHSTEIN: Thank you very much, and we’ll hold our
questions until the testimony of Ms. McCall. Please.

AGENDA ITEM: Presentation – NANCY
McCALL

MS. McCALL: Thank you for the opportunity to be here to
discuss the impact of HIPAA, the HIPAA Privacy Rule, on the ability to access
and utilize archives.

In administering access to archival holdings, the role of the
archivist is to uphold legal rights to privacy along with legal rights to
public disclosure. Archivists must develop and implement policies on access and
use of holdings that promote open intellectual inquiry while meeting compliance
requirements of the law.

Over the past 20 years, archivists have had an especially
challenging job keeping pace with the rise of new laws and regulations
governing access to records from the health fields. However, the enactment of
HIPAA has had a greater impact on access and use of archival holdings than any
of the previous laws for the following reasons:

The HIPAA Privacy Rule does not apply equally to all archives
with collections documenting the history of the health fields. It applies only
to archivists within entities covered under this Rule. Some institutions with
covered entities have chosen to place their archival and special collections
programs outside the covered entities of their institutions. The Privacy Rule
does not apply to archival repositories of the United States government.

Throughout the country, there are many other types of public
and private archives with records documenting the health fields which are not
subject to the compliance requirements of the Privacy Rule. As a result, there
are many disparities between archives that are covered by the Privacy Rule and
those that are not covered.

Archives within covered entities have to follow strict
limitations on access and use of their holdings. Archives not subject to the
Privacy Rule do not have to follow these same limitations.

As a result, archival patrons throughout the country face
significant inequities in how they may access and utilize archival holdings at
covered entities.

The HIPAA Privacy Rule is one of the first rules to apply to
identifiable health information in general in any format and in any medium.
Previous laws and regulations related to privacy of health records have applied
specifically to records of patients and records of research subjects in hard
copy formats.

At archival repositories, fragments of incidental health
information not related to patients and human subjects may be found throughout
all types of holdings, from Minutes of governing boards and committee records
to correspondence files, diaries, photographs et cetera.

And one thing we have noticed, in having to examine all of these other types of materials very carefully, is that there is a propensity in the health fields – professionals in the health fields simply like discussing health, their health, the health of their families, in professional correspondence. So I think the incidental health information that is found in these collections is probably greater than in, for instance, the collections of historians and other professions.

Having to limit access to the incidental health information of
all decedents presents enormous challenges to archivists. Principles of
archival theory and organization are based upon types of records and not upon
the various kinds of information that may be embedded in a record.

The only way to limit access to fragments of incidental health
information embedded in this wide spectrum of general non-patient holdings is
to limit access to the broad spectrum of records, or undertake the laborious
task of redacting the fragments of embedded incidental health information.

Such limitations on access to the broad holdings of an
archival repository create major impediments for patrons and are complex and
labor intensive for archival staff to administer.

The HIPAA Privacy Rule contains no principle for passage of
time. It is one of the first laws or regulations to assign rights of privacy to
the deceased in perpetuity. Under previous laws and regulations, an
individual’s rights to privacy ceased at death. Until the Privacy Rule went
into effect, archival repositories throughout the country had allowed open
access to the incidental health information of decedents that is found in
general correspondence files, diaries and other types of documents that are not
part of the official patient record and case file.

Most repositories with collections of patient records of the
deceased have had special policies and procedures in place for access and use
of these patient records and provisions in place for protecting the privacy of
patients.

Under the Privacy Rule, archival repositories at covered
entities must now limit reference access to all incidental health information
of decedents in general holdings that had previously been open and accessible
for research while archival repositories that are not part of covered entities
may continue to grant open access to incidental health information of decedents
in their general holdings.

To operate in compliance with the Privacy Rule, it has been
necessary to reorganize reference and research services and to transfer
additional staffing from other divisions of the Medical Archives to assist in
the operation of reference and research services. As a result, work in other
areas such as processing, cataloguing, exhibitions, publications and public
outreach has been severely constrained.

Since the Privacy Rule is an unfunded mandate, the Medical
Archives did not receive any additional funding from the government or Johns
Hopkins for managing the compliance requirements of the Privacy Rule. All staff
members must now participate to some extent in the operation of reference and
research services in order to meet the highly labor intensive compliance
requirements of the Privacy Rule.

The interpretation of the Privacy Rule and the implementation
of policies and procedures in reference and research services at the Medical
Archives have evolved as follows:

Since the Privacy Rule does not contain specific guidelines for
the operation of archives at covered entities, legal counsel for Johns Hopkins
Medicine and outside legal experts studied implications of the Privacy Rule for
the operation of archival programs within covered entities.

The consensus of opinion was that the Privacy Rule applied to all
types of information and records taken in and managed by the covered entity,
that the definition of protected health information referred to health
information about all individuals, not just information about patients and
human subjects, and that there was no distinction between the PHI of the living
and the deceased.

Guidelines for the operation of the Medical Archives evolved out of this interpretation of the Privacy Rule. Attached is Addendum A, which was prepared by Joanne E. Pollak, Vice President and General Counsel for Johns Hopkins Medicine, and which provides guidelines for the operation of our program.

The major challenge in revising reference policies to comply
with the Johns Hopkins interpretation of the Privacy Rule is that materials
that had been previously open and accessible for general reference must now be
restricted and the incidental health information of decedents and non-decedents
must be screened and redacted before showing documents to patrons.

The Privacy Rule now limits general reference and use of
nearly 95 percent of the holdings of the Medical Archives. Only health
information that has previously been disclosed through publication may be
provided without redaction.

However, the Privacy Rule does allow broad access to archival
holdings for the purpose of research. It defines research as a systematic
investigation, including research, development, testing and evaluation designed
to develop or contribute to generalizable knowledge, and offers guidelines for
adjudicating requests for access to holdings to conduct research.

Rules for research on decedents are less rigorous than those
for non-decedents. To conduct research in a collection that contains health
information of non-decedents, the patron must apply to the Privacy Board of the
Johns Hopkins Medical Institutions for a waiver of authorization to conduct the
research.

Although this broader access for research is helpful, it does
not cover general reference access, which is a large part of the activity of
the archives.

Whereas the Privacy Rule’s compliance requirements for conduct
of research are in keeping with regulations that had already been in place at
the Medical Archives, the major challenge has been how to handle requests for
provision of reference information and copies of documents.

To comply with the institutional interpretation of PHI under
the Privacy Rule and to be able to provide basic reference information to
patrons, the staff must first screen the files requested for presence of
incidental health information. When there is incidental health information
embedded in a document, a staff member must first photocopy that document and
redact the incidental health information before showing it to the patron.

Screening and redacting incidental health information from individual documents for general reference purposes is an enormously labor intensive and frustrating procedure for staff to manage. Having to expend
valuable staff time to screen and redact incidental health information from
documents that would be freely accessible at archives not designated as covered
entities is a significant drain on the program’s resources.

The requirement for redaction of incidental health information
of decedents results in a major disparity for archival patrons who would have
open access to the same type of information at repositories not subject to the
Privacy Rule.

I ask that you also refer to Addendum B, which is a compilation
of our policies and procedures for access, and it does include the forms and
procedures for HIPAA and then also some other privacy regulations.

Archival patrons are primarily frustrated by, A, limitations on access to archival holdings that had previously been accessible for general reference; B, the bureaucratic requirements for registration for general reference services and the length of time it takes to screen and redact documents; and, C, the fact that the Johns Hopkins interpretation of the Privacy Rule differs from that of some other repositories.

Since the Privacy Rule does not contain specific guidelines for
archives, archival repositories within covered entities have made individual
interpretations of the Privacy Rule to define the scope of PHI and to
administer the archival repositories at their respective institutions.

Patrons who are conducting research which they wish to publish
are especially frustrated by the Privacy Rule. If they receive a waiver of
authorization from the Privacy Board of the Johns Hopkins Medical Institution
to conduct archival research, they are still not free to publish the PHI of
deceased individuals which they may encounter in their research at the Medical
Archives.

The Privacy Rule requires that they obtain authorization from
the legal representatives of the deceased in order to publish PHI from a
covered entity.

In instances when the deceased have no legal representatives,
we believe that the only option is for the patron to obtain a ruling from a
judge.

That the Privacy Rule treats all types of health information
equally and makes no provisions for ranges of sensitivity poses major
impediments for patrons wishing to publish. Incidental health information found
in personal correspondence about conditions not considered to be highly
sensitive, such as colds and fractures, is regarded as the same as information
on patients about conditions considered to be highly sensitive, such as
psychiatric illnesses and sexually transmitted diseases.

Archival patrons who are conducting research on topics where
significant documentation on that particular topic is located at several other
archival repositories complain about having to submit to significantly
different policies on access and use of health information of decedents at
these various repositories.

For instance, if someone wishes to conduct research for a
biography of John Shaw Billings, who was the principal planner for the Johns
Hopkins Hospital, the National Library of Medicine, and the New York Public
Library, that individual would have to consult the Billings materials in the
archives at these three institutions. In order to publish the identifiable
health information found in a personal letter held at Johns Hopkins, perhaps
– this is hypothetical, not a real document – from Billings to his
wife about falling and fracturing his foot while on a trip to Europe, a
biographer would have to obtain authorization from a legal representative of
Billings, who died in 1913.

If the biographer found the same information in Billings’s
correspondence at the New York Public Library or the National Library of
Medicine, he or she would be free to publish it without having to obtain
authorization from a legal representative or a ruling from a judge if a legal
representative could not be found.

The staff of the Medical Archives finds the policies and procedures for provision of general reference services imposed by the Privacy Rule to be complex and time-consuming to administer. They find it especially frustrating, for general reference requests, to have to screen and redact incidental health information that had been accessible before the Privacy Rule and continues to be accessible at repositories not covered by the Privacy Rule and at repositories with different interpretations of the Privacy Rule.

Since the new policies and procedures went into effect on the
14th of April, 2003, the staff has made a continued and concerted
effort to clarify the language of forms and to improve the flow of procedures
for patrons. Despite all of their efforts to improve the quality of services to
patrons, the major obstacles created by the Privacy Rule remain. The major
obstacles to archival patrons may be summarized as follows:

There is no uniform interpretation of the Privacy Rule for
archival practice at covered entities. A uniform interpretation of the Privacy
Rule for archival reference services at covered entities would normalize these
services and save valuable staff time now being expended on the development of
policies and procedures which are based on different interpretations of the
Privacy Rule.

And there’s no consensus on the definition and scope of PHI at
archival repositories within covered entities.

Action should be taken to address problems that the Privacy
Rule has imposed on archival reference services at covered entities. If the
following clarifications were to be made, archival programs within covered
entities would be able to offer patrons access to and the ability to publish
the same kinds of incidental health information of decedents that is accessible
at archival repositories outside covered entities.

Limit the scope of PHI to information from records of patients
and human subjects both living and deceased and allow access to incidental
health information for the purpose of general reference.

Introduce principles for passage of time. Restrictions for reference use of incidental health information of decedents found in correspondence files and diaries that are hundreds of years old should be eased when the possibility of offense to living descendants is minimal.

Documents containing health information from antiquity, the
Middle Ages, the Renaissance, and the 18th and early 19th
centuries should be open and accessible for archival reference and research
whenever possible.

Define and adopt standards for sensitivity of identifiable
health information in archival holdings. Incidental health information may be
categorized by ranges of sensitivity. Restrictions for access and use of
incidental health information of decedents that refers to common conditions not
considered to be highly sensitive and embarrassing to decedents if disclosed
should be eased.

Allow the publication of incidental health information. The need to promote knowledge through publication should be taken into greater consideration. Archival patrons who wish to publish less sensitive incidental health information of decedents that appears outside patient records in professional and personal correspondence should not be encumbered by requirements to locate a legal representative or to obtain a court ruling in order to publish it. At a minimum, publication of incidental health information about decedents should be allowed.

Other recommendations for changes may be found in Addendum C,
which is a presentation our General Counsel made in March of 2004.

That the Privacy Rule applies only to archival repositories
within covered entities creates major disparities for the administration and
operation of these archives. Varied interpretations of the Privacy Rule at
archival programs within covered entities have resulted in widely varying
policies on access and use of archival holdings at these repositories.

Ultimately, there is much confusion over what constitutes the
protected health information in archival holdings at covered entities. Some
archival programs at covered entities have made the interpretation that
protected health information refers only to patient information and not the
fragmented incidental health information that is found in the correspondence
files et cetera.

Other archival programs have interpreted protected health
information to include the fragments of incidental health information found in
professional and personal correspondence files as well as all the identifiable
health information in patient records. As a result, there are varying
definitions of the scope of PHI at these archival repositories and how that PHI
may be utilized for purposes of reference and publication.

Extending privacy rights to all types of incidental health
information in perpetuity is unprecedented in archival practice and limits the
ability of students, scholars, writers, artists, film-makers, the media and
others to access archival sources at covered entities for general reference and
to publish from these archival sources when the same kind of information is
easily accessible at repositories not covered by the Privacy Rule. It greatly
constrains open intellectual inquiry, the ability to document facts and events, and the telling of biography and history.

Whereas the Privacy Rule introduced much needed measures for protecting health information of living and deceased patients, it now needs to be adjusted to make incidental health information of all individuals, not patients, accessible for archival reference and publication. The Privacy Rule should be revisited to extend the same opportunities for archival reference that exist at archival repositories outside covered entities to archival repositories within covered entities.

MR. ROTHSTEIN: Thank you very much. Questions? John?

Questions, Answers and Comments

MR. HOUSTON: I usually have questions. I can’t find it – Kathleen and I were talking about this; it relates to information prior to the compliance date of the HIPAA Privacy Rule. We were of the impression, and I was trying to find it as you were talking, that that wasn’t covered, that those materials were not covered by the Privacy Rule.

MR. ROTHSTEIN: Sure, they were.

MR. HOUSTON: Were they?

MR. ROTHSTEIN: Yes.

MS. HEIDE: Once you’re a covered entity –

MR. HOUSTON: All the priors?

MS. HEIDE: — the identifiable health information –

MR. HOUSTON: But if you carved it out beforehand –

MS. HEIDE: You have the option of carving out non-covered
functions through the hybrid entity provisions which were mentioned –

MR. ROTHSTEIN: Could you speak into the mike –

MS. HEIDE: But once you become a covered entity, the
identifiable health information that you hold –

MR. ROTHSTEIN: And identify yourself for the record.

MS. HEIDE: Christina Heide, Office for Civil Rights.

MR. ROTHSTEIN: So that my medical records before 2003 are
protected just as the ones after –

MS. HEIDE: Yes.

MR. ROTHSTEIN: — 2003.

MR. HOUSTON: Unless they were outside of some state law
provision and therefore – how do I say this? – would have been carved
out of the covered entities in part of an archive which was not part of the
covered entity, at which time they’d be open.

MS. HEIDE: Covered entities have the option of hybridizing, which would be to apply the Privacy Rule only to their covered functions. In doing so, they apply the Privacy Rule to their covered functions; to the extent that they have carved out a part of their business that does not perform a health care provider function or a health plan function, the Privacy Rule would not apply to that part of the entity.

MR. HOUSTON: Okay. Thank you. I guess I have one question,
though, not related to that, but the testimony.

I guess, if I’m a patient and I have medical information in my record and I die, I guess I would, for my family’s sake, especially if there was something that had some stigma attached to it, have strong feelings about not wanting to have that information made available to the public or available to archivists.

I mean, I think it’s – this is a very philosophical
question and I guess my gut reaction is I can understand how you’re talking,
you know, if there’s a letter and there’s a correspondence in a letter or
there’s something that’s tangential to the information that’s trying to be
reviewed, but, I mean, my gut reaction is – no matter if I’m dead a
hundred years or two weeks – I would still have concern that I don’t want
my information reviewed and disclosed and made part of a book. I guess
that’s the reaction I have. Am I missing something?

MS. McCALL: Well, I was speaking about incidental health
information between individuals who were not patients. At Johns Hopkins at the
Medical Archives, we have always restricted access to actual patient records
and case files and have had a review process for access and use of that
material.

There have been two provisions for access. One is for people who are doing quantitative studies and do not need to reveal identifiers; they have to enter into an agreement that they will protect the identifiers.

The other option has been for biography, and I will say that in
all the years that I have been working in the archives when biographers have
approached us, they usually come with letters from the legal representative of
the subject about whom they’re writing.

But what I was addressing in the incidental health information
is a correspondence – we have personal paper collections of faculty, staff
and former students from Johns Hopkins.

For instance, we have a wonderful collection of letters from
nurses who served overseas in military operations and rescue operations and
they are writing letters back to their families and they will discuss their own
health, they’ll discuss health matters of their families. And these are not
patients, but we as a covered entity have that material and therefore the
interpretation is that that is covered by the Privacy Rule also.

MR. NOVAK: Let me address that, too.

Archivists never let researchers have free access to patient
records even before HIPAA. The difference with HIPAA is that it’s no longer
just patient information, not just patient records – that is, your patient
record would be protected, but if your sixth cousin writes to your fifth cousin
saying that you have a cold, that’s also considered PHI. At least that’s how
most lawyers are interpreting it.

It’s that third party stuff that really throws us.

The other thing is that if in 200 years your patient record
ends up at Columbia where the Health Sciences Library is a covered entity, it
will be under HIPAA, if HIPAA is still in effect. If it ended up at the New York
Historical Society, which is not a covered entity, it would be wide open.

So, saying that you don’t want your record looked at, HIPAA
doesn’t necessarily affect that. It depends on where the record ends up.

MR. ROTHSTEIN: Christina?

MS. HEIDE: You both mentioned putting limits on the coverage of decedents’ information under the Privacy Rule. I mean, this is something that we have grappled with as well, especially when we were doing the original Rule. What period of time would you protect these records under the Rule?

MR. NOVAK: That’s a hard one. Partly it’s because biographers often do want to look at – fortunately, we haven’t had too many of those in my career – and as Nancy said, they usually have a letter from the family. But it gets to a point where what member of the family is in charge here?

You know, you’re talking about George Washington’s health
information – who do you go to?

So where is that dividing line? Certainly, I would say anything before 1880 or so. I mean, the census is open after 70, 75 years.

But again, part of the problem is, of course, that all patient information is not equal, and generally what I’ve found in my career is that we never gave out information from psychiatric records. And I speak from
experience where we had someone, a biographer of Melville, trying to get access
to the records of his German teacher who was incarcerated in the Bloomingdale
Asylum in New York in the 1840s. That is still in existence and when we applied
to what is now the New York Hospital Westchester Division, they said absolutely
not. That was a confidential record. So this was at this point 145 years old.

But if people come in looking – somebody for a fracture
or an operation in the 1880s, I think we’d feel much more comfortable about
opening that up to someone.

MR. ROTHSTEIN: Okay, Christina?

MS. HEIDE: I just wanted to follow up.

So you would be advocating for a different standard for
different types of information, so certain types of information you would feel
comfortable with applying the Privacy Rule in perpetuity?

MR. NOVAK: I don’t feel applying the Privacy Rule in perpetuity makes a lot of sense in general. It certainly goes against what the general common law has been over the years, which is that, apart from wills, the dead don’t have privacy rights.

MR. ROTHSTEIN: Well, there’s a difference between what the law used to recognize and what medical ethics has recognized since the time of Hippocrates. And under the Hippocratic Oath, medical information will never, never be divulged.

And that means it survives the death of the physician, it
survives the death of the patient; it’s not divulged.

And so if you are the successor of the physician – let’s
assume that the physician has also died – then you take those records with
the same ethical strictures that the physician had.

Now, the fact that our privacy laws did lag, and arguably still do lag, in some respects, certainly pre-2003 – I’m not sure that’s a great argument. I’ve got some better arguments for it, I think.

MR. NOVAK: I’m not a medical ethicist.

MR. ROTHSTEIN: I’d like to pursue the suggestion that is in the
back of the Hopkins testimony – it’s on the page – they’re not
numbered – about three from the end – where it says “Issue:
Application of the HIPAA Privacy Regulations to Medical Archives.” I’ll
let you catch up to that. Do you see where I am?

Okay, the suggestion there is: “Clarify that only PHI in
the ‘designated record set’ held by Archives within a covered entity is
subject to the HIPAA Privacy Regulations.” So I assume that this is an
attempt to segregate what we’ve been calling sort of “incidental
information” from sort of core diagnostic, prognostic medical records,
which may be a workable solution, but I think we need some more information.

So can you sort of flesh out how that would work? In other
words, how that distinction would be drawn in your view in terms of designated
record set? Would you be including within that only material that was within a
clinical record or could some other core information elsewhere be within the
designated record set?

MS. McCALL: I believe the interpretation was to address the
fact that the Medical Archives is the official archival repository for the
health divisions of Johns Hopkins which includes the three academic divisions,
schools of medicine, nursing and in public health, and then the Johns Hopkins
Hospital and health system.

And there are parts of the School of Public Health, for instance, that are not designated as part of the covered entity, whereas some divisions are. I believe the Department of Epidemiology and the Department of Biostatistics are designated as part of the covered entity at Johns Hopkins University.

So those materials that we would receive would be covered by
HIPAA but, for instance, the Department maybe of International Health, which is
not –

MR. ROTHSTEIN: No, I think maybe I didn’t make myself clear.
Within a covered entity, or the covered entity part of a hybrid entity, there
are still probably two kinds of records at both of your medical libraries,
right? You’ve got sort of incidental stuff where I write a letter telling you
about the health of my dad and it’s in your record somewhere. And there are the
actual medical records of perhaps historical figures or former patients from
the 19th century or whatever.

How can we devise, if we so choose, a way of meaningfully and
sort of clearly, for your benefit, separating these incidental from the
“designated record set” materials? Mr. Novak, can you help us?

MR. NOVAK: Hospital records, clinical records of doctors, not
their correspondence, clinic records, notebooks of scientists who are doing
clinical research that is with human subjects – that’s stuff that’s
usually fairly easy to segregate out. You get medical records up until about
1915; they’re bound in volumes and they say “Presbyterian Hospital
blah-blah-blah.”

The same thing with the notebooks of a research scientist.
There are Andre Cournand’s notebooks on cardiac catheterization – we know
he used human subjects on that, obviously. We know how to control that.

It’s when you give me the 18 cubic feet of the private papers
of a physician and we know that there may be six letters in there where he
says, “I saw your patient and she has diabetes,” you know. And we
have to go through all this – that one letter makes this radioactive in a
way.

This is how we proceeded before HIPAA. That is, patient
records, that is, records created by hospitals and records created by doctors,
were always restricted. Correspondence was not, photographs were not, even
though we knew that in some cases we had to be sensitive, depending on what
that correspondence was about, but even then generally you could segregate that
out of the larger collection or at least you could administer it separately
from the larger collection.

As it is now –

MR. ROTHSTEIN: I mean, that has some appeal under the theory
that the information in the traditional medical record was shared with the
health care provider under the implicit or explicit assumption or agreement
that that information would be subject to the normal strictures of medical
ethics, would not be re-disclosed without the consent of the individual and
therefore would be “protected.”

On the other hand, leaving everything out besides that, I’m
just thinking out loud whether it is letting out too much. So suppose my doctor
is kind of playing fast and loose with the ethics and I’m a famous person and
he wants to write a letter to his brother revealing all this information that I
gave during psychotherapy or whatever. Should the fact that it’s not in the
medical chart mean that it’s now fair game?

MR. NOVAK: I don’t think it’s fair game, and archivists have
never thought that was fair game. But I’m not sure it should be regulated by a
Federal policy. I think at some point you have to say: You have to trust us. If
this blows up, it blows up in our face.

When we process papers, we’re always on the alert for material
that we know is potentially embarrassing or confidential, and believe me, there
have been more sessions at the Society of American Archivists on this one than
you can imagine.

And that’s one reason why we have professionals process
records, not volunteers or paraprofessionals, because there is of course
confidential information that’s not medical and we have to be aware of that,
too, and we are.

I guess the other thing is that HIPAA doesn’t necessarily protect any of that because, again, if those letters end up at someplace that’s not a covered entity, right now nobody knows – most people are saying that the Privacy Rule doesn’t apply to it.

MR. ROTHSTEIN: But I understand that, but that would apply to
the medical charts, too, right?

MR. NOVAK: Yes.

MR. ROTHSTEIN: So if the medical records wind up at the D.C.
Public Library, they’re sort of not covered by HIPAA. I don’t know what happens
to them after that. So we are necessarily operating in a system that segregates
the privacy requirements based on whether it’s a covered entity or not covered
entity.

Could we have a system where there were three sort of levels of
materials? One would be the sort of clinical records.

On the other extreme would be the incidental references that
are not on their face very sensitive, the “I had a broken leg,” where
everyone knew it at the time but it’s now written down somewhere.

And then maybe there’s some sort of middle ground where it’s
not a medical record but the disclosure could be somehow sensitive and I hear
you saying that the professionals in the archive world already have in place
some way of internally dealing with that middle area, is that right?

MR. NOVAK: One other thing our general privacy rule had – and we did have a general access rule before HIPAA besides an access rule for patient records – was that I was given the authority to close those records that I thought were of a confidential nature, and that could be a whole number of things. That’s a very touchy subject because what’s confidential? And that varies with time and it varies with space, too. And we’ve talked about this endlessly.

One of the things we thought HIPAA might do was standardize that, but as a matter of fact, it hasn’t; it’s made it worse in some ways because now we have to abide by the opinions of lawyers who, for the most part, don’t know a thing about archives, even in the case of Hopkins and my own case. They’ve been very sympathetic to our concerns.

But it seems to me the worst of both worlds here. You’re not
protecting all protected health information because many records will not end
up in places that are covered. And on the other hand, you’re making it
impossible for archivists who are in covered entities to do the work they need
to do.

As a matter of fact, a colleague of mine who will remain
nameless has been telling donors to give records to non-covered entities –
that is, to libraries and archives that don’t have an interest in the history
of medicine because they won’t have to deal with this. He doesn’t want to have
to go through this huge amount of administration.

MS. WATTENBERG: I have a question. So when somebody hands over
their private letters to you all, isn’t there the expectation or the implicit
understanding that they’re authorizing researchers to come in and potentially
report and write on that? And do you have any agreements that you currently
sign with those people?

MR. NOVAK: All archives have donor agreements, and those donor
agreements specify access requirements to a lesser or greater degree. And in my
donor agreement at Columbia, we say, is there any part of these records you
want closed, and for how long? And we will not accept records that are closed
for long periods of time. It’s not worth our while to have our shelves clogged
with things that nobody can use.

But yes, the expectation is that if you give me your records,
researchers are going to use them. And often they do. I just acquired a set of
diaries from a physician still living. They are not going to be opened until
his wife dies. He can die. If she dies before him, they can be opened, but he
doesn’t want them opened while she’s alive. And that’s very easy to administer.

MS. WATTENBERG: And has there ever been a provision for whether
or not that information contains – well, so information about other people

MR. NOVAK: There’s also questions about, for instance, at
Hopkins, and Nancy can correct me if I have this wrong, I believe that any
health information – if Mrs. H. in 1887 writes Mr. H, neither of them a
doctor, and says “I am suffering from la grippe,” that is covered
because Hopkins is a covered entity. Is that correct, Nancy?

MS. McCALL: Yes.

MR. NOVAK: At Columbia, that information would not be
considered PHI because it wasn’t created by somebody who now would be
considered a covered entity – that is, Mrs. H is not a physician, she’s
not a nurse, she’s not a biomedical scientist; just because we hold it doesn’t
mean it’s PHI.

On the other hand, we have records of hospitals now defunct
that have had no connection with Columbia or Presbyterian Hospital. They are
considered covered, as are letters by doctors who, if they were alive today,
would be covered.

Now, my colleague at the University of Alabama at Birmingham,
they have ruled that patient records from defunct hospitals that are not
related to the University of Alabama medical system are not covered.

So it’s, you know, like sheep, we all go astray. Everyone is
doing their own thing. So the idea that you’re protecting information here is
not correct. You’re protecting some at some places and not at all at others.

And, of course, researchers find this immensely annoying and
frustrating because for historians of medicine, they’re usually going to more
than one place, as Nancy said, because records are not all in the same place,
and they find such varying access requirements that they just sometimes leave
in frustration.

MR. ROTHSTEIN: Well, I can understand why you’re frustrated and
confused because we certainly are.

[Laughter.]

MR. ROTHSTEIN: Jeff?

MR. BLAIR: Yes. Can I reflect my frustration and confusion with
a few questions that’ll reduce it slightly but not all the way?

You gave the example of a physician, or maybe somebody at Columbia University who’s on the academic medical staff, who indicated in the instructions not to use his archive records until his wife dies. That prompts a series of questions.

How do you get notified that the wife has died? How do you get notified that a patient who had surgery in your hospital six months, six years or 16 years ago is now deceased? How do you get notified of these things? So that’s one series of questions. How do you know that somebody’s passed away?

MR. NOVAK: Nancy, do you want to –

MS. McCALL: For, I guess, many of our major collections and donors, we find out through the news and also just through the institutional publications. It has never been particularly difficult to find out when donors or their relatives have died. We usually receive some notification.

MR. NOVAK: For personal papers, in the case of these diaries,
probably we might not know and we would not worry about it until someone asked
for access to it, and then we would try to contact the family.

The other thing, of course, is that the wonderful Social Security death index really helps a lot. You can usually tell, or guess, and even if you’re not sure, you can then generally kind of zero in on the locale where they died.

Now, for most researchers, they’re not really interested in
specific individuals; they’re interested in doing research that covers a whole
topic. So, someone’s writing a book on Tourette Syndrome or someone’s writing a
book on the introduction of technology into medical care in the early
20th century, they want to look at huge masses of patient records.

Because HIPAA distinguishes between research into the
information of decedents and that of people living, this is something we had to
decide: How do you tell if someone is dead? And obviously in a casebook filled
with dozens of patients, there’s no way anybody can decide all of those people
are dead.

So we quite arbitrarily said that people would be considered
dead if it was 100 years since the time of birth or the date of the patient
record. So in practical terms, since a lot of patient records in the
19th and early 20th century don’t actually include birth
dates, or accurate ones, it means that we considered anybody whose record dates from before 1904
to be dead, which is actually pretty generous since they’ve probably been dead
for quite a while but – so that’s how we do it. And that’s completely
arbitrary and it’s not covered by HIPAA, but we had to do something to decide
who was dead in the case of quantitative research.

MR. BLAIR: I have another question, and Mark, this reflects the fact that I’ve been irresponsible. I have been on the Privacy Subcommittee for a while, so I should know these things.

My understanding is that the basic motivation for the Privacy regs under HIPAA was to try to increase trust on the part of the public so that when we have their patient information in electronic form, that information wouldn’t be disclosed, and I think I recall that the primary fear Congress had to deal with when they were writing the Health Insurance Portability and Accountability Act was that health care information could wind up being used against the patient for employment purposes.

And then I guess you expanded – okay; it’s expanded beyond damage that could be done because of discrimination in employment to the fact that it can embarrass the person with a behavioral condition or HIV or a sexually transmitted disease – okay, and then you don’t want to embarrass the family so you’re going to extend it to that.

And the thing that I guess I kind of don’t understand, and my question, is that it seems like we’ve taken the interpretation of HIPAA beyond at least what I thought was the intent of HIPAA, into other areas of legal ethics that would say these records are protected in perpetuity. Was that really what HIPAA was intended to do? Was the intent of the legislation to do that?

MR. ROTHSTEIN: Well, Jeff, if that’s a sort of rhetorical,
open-ended question, I will venture my answer, that I think perhaps you’re
conflating the origins of HIPAA with the intent of HIPAA. It is true that the
origins of HIPAA in terms of why that was included in the Privacy title, Title
IV, in the statute was as you described.

But once having put that in there, HIPAA has with the full
knowledge of Congress taken on the role of a limited Federal health privacy
statute, and I don’t think it was ever Congress’s intent that it be viewed so
narrowly as to only deal with the original nexus to the –

MR. BLAIR: Okay.

MR. ROTHSTEIN: — statute.

MR. BLAIR: Okay.

MR. ROTHSTEIN: I have a question for Ms. McCall and then you
can answer mine and yours then as well.

If you can go back to that same page –

MS. McCALL: Yes.

MR. ROTHSTEIN: — can you clarify, or help us, help me,
understand what’s intended by that second bullet under proposed change? And for
the benefit of our listeners on the Internet, I’ll read that. It says,
“Clarify that PHI used in bona fide education dissertations may be
published if the minimum necessary standard is applied and the document is
reviewed and approved by an IRB or Privacy Board.” I’m about three pages
from the end of the big packet of testimony from Johns Hopkins.

MS. McCALL: Actually, I would like to defer that question for interpretation by
our legal counsel —

MR. ROTHSTEIN: Okay.

MS. McCALL: — who is actually here today in the building at
another meeting but couldn’t come to this session and she had wanted to be here
in case there were any questions pertaining to this.

MR. ROTHSTEIN: Well, if you can ask her perhaps to send a
letter to the Subcommittee explaining that, because I have some problems with
it but rather than spar with you about what my problems are, I’d rather have
Joanne set out what exactly she has in mind. And you had another comment.

MS. McCALL: Yes. I thought it would be helpful for you all,
because in listening to your comments, I feel you may not understand completely
the kinds of materials that we have in our collections at Columbia and at
Hopkins. Actually, patient records and case files constitute a very, very, very
small part of our collection. The official patient records are maintained by
the Medical Records Division at Johns Hopkins.

We have what may be sometimes called the “shadow
records,” the case files that have been maintained by clinicians. And, for
instance, in the papers of Helen Taussig, the personal paper collections of
Helen Taussig and Alfred Blalock, these are collections of their correspondence
and their memorabilia, beginning with family correspondence, family
photographs, records of their education and medical training, and they both
maintained collections of case files on their blue baby patients. And the
actual unit record for those patients is in the Medical Records Division. But
this is a rather unusual example.

Most of the personal paper collections of the faculty and staff
are just general correspondence, records of their education and training,
manuscripts and research notes.

The other large segment of material we maintain is the
corporate records, corporate archives, and founding documents of the
institution and records of governance and administration of all the divisions,
and the PHI, the incidental PHI that I was discussing in my presentation –
for instance, we have the building diaries for the Johns Hopkins Hospital from
the late 19th century in which there is information about some of
the construction workers who had accidents and were injured and we have the
names and the dates and that, under the interpretation of HIPAA, the Privacy
Rule, is considered PHI.

MR. BLAIR: Is that correct? Is that under HIPAA considered PHI?
People that died a hundred years ago?

MR. ROTHSTEIN: Well, there’s – Christina, do you want to –

MS. HEIDE: I mean, the PHI of deceased individuals is
protected, and generally once you’re a covered entity, the individually
identifiable health information that you hold becomes covered, subject to a few
exceptions such as for employment records.

MR. ROTHSTEIN: So I’m mentally trying to figure out how we can
work this. Would you be comfortable – I don’t know what it would do to
your archives; that’s why I’m asking this question – if we recommended
that there be no change for this designated record set? In other words, it
would still be covered by HIPAA, the actual records themselves –

MS. McCALL: The actual clinical and patient records?

MR. ROTHSTEIN: — the clinical patient files, but that for incidental materials that were 50 years old or more, they would not be subject
to HIPAA. Would that –

MS. McCALL: Oh, that would –

MR. NOVAK: It would make our lives much simpler.

MS. McCALL: Yes, yes.

MR. ROTHSTEIN: Well, but not on medical records, only on
incidental, which would mean a letter that my friend broke his leg last year?

MR. BLAIR: I guess my question is: Considering the value of the
medical records for clinical research and public health research, why just do
it on the incidental, if we’re going to have 50 years old, you know, 50 years
after somebody’s death?

MR. ROTHSTEIN: Because the clinical record was given and taken
with the explicit ethical, even though not legal or possibly legal, assurances
that they would never, never be disclosed. Now, if there is someone who wants
to disclose them who is the heir of the individual and has a legal right to do
that, that’s fine.

MR. HOUSTON: I think it even goes further than that. There are
mechanisms through the IRB in which you can get a waiver of informed consent in
order to review medical records information, whether it be decedent information
or still alive, so there are ways to do that.

MR. BLAIR: So there’s a mechanism to access that information
for clinical or public health, okay.

MR. ROTHSTEIN: Right. But the same problems don’t apply to somebody in a non-treatment relationship. Christina, do you –

MS. HEIDE: Right. There’s always the IRB or the Privacy Board
waiver of authorization, but in addition, for decedents’ information, there are
different provisions for where research is involved where you don’t even have
to go through an IRB or Privacy Board. The Rule requires certain
representations to the covered entity from the researcher.

So, there are even different provisions there for research on
decedents’ information.

I think part of the problem that was expressed by Mr. Novak was
that sometimes they don’t know which records are of decedents and which records
are of living individuals.

MR. ROTHSTEIN: I’m just raising the issue of whether the IRB
has jurisdiction to review that —

MR. HOUSTON: Well, they actually don’t, because –

MR. ROTHSTEIN: — because research on decedents –

MR. HOUSTON: Right, you’re right. I’m sorry. I’m thinking of my institution, which would actually have a separate research – I apologize. Great.

MS. WATTENBERG: Can I just ask a question of whoever can
answer it?

My understanding of what incidental information was is that it’s information that ends up being disclosed while performing another function, but that doesn’t address the issue of severity or the nature of the privacy of the information, right?

MS. HEIDE: Yes, I think they’re using incidental information slightly differently than it’s used under the Rule now, and they recognize that now. The Rule allows certain incidental disclosures to be permissible if they’re part of an otherwise permissible disclosure – if someone overhears something by accident, for example, and if there are reasonable safeguards in place and minimum necessary standards in place.

So, especially in the clinical setting when things may happen,
they are using incidental information slightly differently, just sort of
capture this other type of information that may not be so sensitive, and as I
understand it, may not be part of a medical record, a clinical record.

MS. WATTENBERG: If it was still sensitive, you would – I
mean, for instance, if it’s, you know, correspondence between two doctors but
it contains very sensitive information about a patient, that’s not – you
wouldn’t be asking to pass that through?

MS. McCALL: No.

MR. ROTHSTEIN: So could you live with a standard that said
– instead of “incidental,” let’s use “indirect” or
something so it’s not the same word – so indirect information that was
more than 50 years old would not be subject to the Privacy Rule unless the
archivists had reason to believe that it contained sensitive information?
Because I don’t think you should have to read every one of the 138 file
cabinets to figure out what’s in there and then segregate them.

But if somebody donates a file that is chock-full of –
that you know, it’s labeled – the real dirt on my psychiatric patients,
then I think you just can’t ignore that and you have to treat that – so,
some level of heightened duty where you have actual knowledge.

Sarah?

MS. WATTENBERG: Is there any situation where you all have
binding agreements with the researchers who are accessing the information, such
that if they come across something that they know to be defined as PHI, they
would come back to you or bring it to your attention or not publish it or
something like that?

MS. McCALL: That is, I believe, part of the agreement for the
waiver for authorization from the Privacy Board. And also our registration
agreement has that also.

MR. NOVAK: I mean, for research on decedents, we do not allow
them to disclose names because from what we can see under HIPAA, it’s not
allowed. But we can’t de-identify everything, so there’s a trust there that we
have to assume, and they do sign a form to that effect.

MR. ROTHSTEIN: I have another question. We’re very interested
in this issue, obviously; this is the second time we’ve dealt with it.

But you are a small panel, I guess the smallest possible panel

[Laughter.]

MR. ROTHSTEIN: — unless there were only one person, okay? And
you represent similar institutions. Can you help us identify what issues might
be out there for other kinds of institutions that are not as large, not as
comprehensive?

I for one, not speaking for the rest of the Subcommittee or the
full Committee, would be reluctant to try to come up with a proposal on the
basis of just your panel’s testimony.

MR. NOVAK: First of all, archivists are a rather small group in
general, and archivists of covered entities are smaller still. And of course,
two of the largest medical archives libraries in the country, the NLM and the
Countway at Harvard, are not considered covered entities. So that knocks a lot
of people out.

The reason that Nancy and I are concerned is that we’re part of
maybe 20 – we’re talking major academic medical centers. There are many
other hospitals that have records with PHI, but they don’t have archivists or
historical collections, so this is not on their radar at all.

MR. ROTHSTEIN: What about some community hospital in a very old
part of somewhere and they’ve got a medical library that in some corner of it
somewhere has 100-year-old stuff?

MR. NOVAK: That’s not unlikely. They’re not likely to be
members of the Society of American Archivists or Archivists and Librarians in
the History of the Health Sciences.

One thing you might want to look at, and this is a group I
don’t belong to, but the Medical Library Association has a History of Medicine
section, and they are often hospital libraries who have responsibility for the
records, which are usually in some basement somewhere. But there are some good
people in that.

Patricia Gallagher at the New York Academy of Medicine is
somebody I would contact, for instance, because she’s a very active member of
that group and is very conversant with these issues even though I don’t think
they affect her in her daily duties at the Academy.

MR. ROTHSTEIN: Because they’re not a covered entity?

MR. NOVAK: I don’t think they are. Don’t quote me on that, but –

MR. ROTHSTEIN: So we would really want a covered entity person –

MR. NOVAK: Pat would know people in a way that Nancy and I
don’t because we’re not really in contact with people who work at community
hospitals.

MR. ROTHSTEIN: Are we missing something in our exploration? Is
there another side to this thing we would be remiss if we didn’t consider?

I’m not asking you to be a good advocate for – you’ve
already done that. But, I mean, just help us out, somehow.

MS. McCALL: First of all, I think it might be interesting for
this panel to meet with archivists and manuscript librarians at institutions
that are not covered entities that have large sections of records of health
care. The New York Academy of Medicine would be one of those, the College of
Physicians in Philadelphia.

MR. NOVAK: NLM.

MS. McCALL: Yes, NLM.

MR. NOVAK: And Countway at Harvard.

MS. McCALL: Yes.

MR. ROTHSTEIN: Okay. So let me ask you a question then: Suppose
we recommended, and the Department adopted, something along the lines that
we’re kind of playing around with. Would that, in your view, with your
knowledge of the non-covered entity practices, more or less level the playing
field in terms of what you would have to do versus what, say, NLM would have to
do, with archival information?

MS. McCALL: I think it would be very helpful to have some
normalization of standards for access for records in the health fields, period,
at covered entities and non-covered entities. I think that is probably the most
crucial issue to be addressed by this disparity.

MR. NOVAK: I think one thing you’ll find is that HIPAA has made
non-covered entities who hold records with PHI – well, it’s not really
PHI because they’re not covered entities – more sensitive to access, and I
notice that the NLM now has a new policy on access to patient records that’s
very much like it would be if they were a covered entity.

Since this is a public record, I don’t want you to think that
people at Harvard and at the College of Physicians are just letting anybody
look at patient records that they have in their archives.

MR. ROTHSTEIN: So noted.

MR. NOVAK: They have policies in place as well.

But I do think you’re concerned about community hospitals,
hospitals that are not really kind of in the orbit of the academic medical
center and are probably not even aware of some of these issues or if they are
aware, their response has been simply to lock the door and not let anyone use
them.

MS. McCALL: There’s archives at Harvard, I believe Mass
General, Children’s. There’s also an archives at Memorial Sloan-Kettering that
I know has even more stringent interpretation of the Privacy Rule than Hopkins.
So we could provide you with I think a fairly broad, diverse list of other
contacts.

MR. ROTHSTEIN: Thank you. Harry?

MR. REYNOLDS: First a comment and then a question.

It appears to me that we keep running into this whole idea of
covered and non-covered entities whether you’re talking about transactions and
code sets or privacy or anything else. And as you look at the whole spectrum of
who’s dealing in the health world now, it is constantly changing and it’s not
real clear.

The question I have for the two of you is:

Regardless of whether somebody sets themselves up as a covered
or non-covered entity, they had to have gotten the records from a covered
entity, and with disclosure – I mean, this disclosure is kind of tricky
but the whole idea of inappropriate disclosure, accounting for disclosure and
the other things, does that come into play at all? Because basically a covered
entity takes all their records and hands it to a non-covered entity –

MR. HOUSTON: They may get correspondence through a third party
which is not a covered entity. So I think the point was that it might discuss a
health matter, it might discuss, you know, someone having some health
condition, and by being commingled into the records of a covered entity, all of
a sudden it’s considered protected health information because it deals with the
past medical history, past medical condition of an individual.

MR. REYNOLDS: Yes, that’s one condition. But you mentioned some
large entities that maybe the hospital’s covered and then their library’s not
covered, so you got people within a single entity philosophically pass the
stuff back and forth, being covered or non-covered.

MR. NOVAK: A couple things. One is, of course, there’s the
business associate model, and what I’ve been told at Harvard is that they get
records from covered entities; Countway will become the business associate of
whatever that hospital is. But their lawyers felt that did not apply to records
they had acquired before HIPAA went into effect, the Privacy Rule, that is.

So, as I said to you before, considering how Hopkins, Columbia
and UAB are dealing with it, the idea of whether kind of
“coveredness” is retroactive, whether it’s an innate quality that,
you know, if it passes from one person to another, does the coveredness go with
it? There is no standard interpretation of that among lawyers at academic
medical centers.

MS. McCALL: And I know another issue that’s been addressed at
archival conferences – I think this was in Birmingham when Joanne spoke
– that a number of state archives have acquired the collections of records
of defunct hospitals in their states and so they’re raising questions: Are they
covered entities?

And I think that this is another area for consideration, that
especially mental hospitals that have closed, and TB hospitals, many of those
records have been placed in state archives and the state archives are uncertain
whether these are covered entities or not.

MR. ROTHSTEIN: Any other questions from Subcommittee members?

Well, I want to thank the members of the panel. It may be a
small panel, but it was a very excellent panel.

And we will be taking up this issue again in Subcommittee.

Let me take a minute to go over our schedule. We had a
scheduled break, which we will take, followed by public testimony, which we
won’t take, because there are no public testimony slots signed up for. So we
will take a 15-minute break and then at 3:20 to 4:45, we will continue our
discussion as a Subcommittee. We’ve got several business issues to take up. So
we’re in recess for 15 minutes.

[Break at 3:07 p.m. Meeting resumes at 3:22 p.m.]

AGENDA ITEM: Subcommittee
Discussion

MR. ROTHSTEIN: Okay, we’re going to get started on the
afternoon meeting, and of course this is just for Subcommittee members although
it’s broadcast and it’s open to the public, so we have no witnesses.

And of course, the sooner we get finished with this meeting,
the sooner we can adjourn for the day. We’ve got a very packed day tomorrow.

The first thing I think we ought to take up is a consideration
of the draft letter that was written dealing with the medical device question.
As you know, we had a hearing on November 19th and it’s our
expectation that this letter will come before the full NCVHS at the next
meeting, which is in early March.

You should all have a copy of the letter in front of you and
does everybody have a copy of the letter?

MR. HOUSTON: Did you want me to read it?

MR. ROTHSTEIN: Should we read it because we’re on the Internet?
I guess – are we required to do that? Who’s our designated –

DR. COHN: Well, Jeff is here. I think in all fairness to Jeff.

MR. ROTHSTEIN: — Federal official? But Jeff is – I don’t
know whether he’s going to vote anyhow because he’s not officially a member of
the Subcommittee.

MR. HOUSTON: It’s not a very long letter. I mean, I’d be happy to
read it.

MR. ROTHSTEIN: Let ‘er rip!

MR. HOUSTON: I mean, it’s up to you.

MR. ROTHSTEIN: Okay. We don’t want to lose our broadcast
audience, so –

MR. HOUSTON: Okay. How do you say his –

MR. ROTHSTEIN: Leavitt.

MR. HOUSTON: Leavitt? Leavitt, just Leavitt. Okay.

[Reading] “Dear Secretary Leavitt:

“As part of its responsibilities under the Health
Insurance Portability and Accountability Act of 1996 (HIPAA), the National
Committee on Vital and Health

Statistics (NCVHS) monitors the implementation of the
Administrative Simplification Provisions of HIPAA, including the Security
Standard for Electronic Protected Health Information (Security Rule). The
Subcommittee on Privacy and Confidentiality of the NCVHS held hearings in
Washington, DC, on November 19th, 2004. The hearings were intended
to gather information about the effect of the Security Rule on medical devices.

“At the hearings, we heard testimony from the Veterans
Administration (VA), the Food and Drug Administration (FDA), as well as various
manufacturers of FDA regulated software and medical devices. We also received
written comments from an individual representing various medical device
industry groups.

“The witnesses indicated that there are a wide
variety” – actually, where do you want me to stop? Just keep going
until we –

MR. ROTHSTEIN: Oh, just keep going.

MR. HOUSTON: Okay.

[Resumes reading] “The witnesses indicated that there are
a wide variety of challenges associated with bringing medical devices into
compliance with the Security Rule, as well as providing effective security. The
witnesses’ testimony centered around two main themes:

“1. While most new and currently produced medical devices
are capable of complying with the Security Rule, much of the medical equipment
in use is no longer manufactured and may not be upgradeable by the
manufacturers. As a result, it may not be possible to bring these ‘legacy
devices’ into compliance with the Security Rule.

“2. Many of the medical devices manufactured today
contain commercial-off-the-shelf (COTS) software and operating systems. Because
of the critical nature of the medical equipment, any updates (including those
released by COTS software manufacturers in response to specific security
threats) must be tested to insure that the updates do not adversely affect the
operation of the medical device. This testing often delays implementing critical security
updates.

“One witness representing the VA testified that many of
the medical devices in use today have limitations that prevent them from fully
complying with the HIPAA Security Rule. The witness indicated that the Security
Rule has been perceived as a barrier to the continued use of certain medical
devices. Further, the witness asserted that attempting to add security features
on certain devices may cause the equipment not to function. Where medical
equipment needs to be modified to comply with the Security Rule, the providers
must often wait for the manufacturer to provide the appropriate updates.”
Oh, I think there’s one change there, which is we defined this to be the
Security Rule where I in this paragraph stated “the HIPAA Security
Rule,” so I’m going to strike out “HIPAA.”

MR. ROTHSTEIN: Okay.

MR. HOUSTON: That was in the second line of that sentence, of
that paragraph, I should say.

MR. ROTHSTEIN: Got it.

MR. HOUSTON: [Resumes reading] “Another witness
representing the FDA stated that the FDA’s primary focus is the safe and
effective use of medical devices. As such, the FDA does not evaluate security
in approving the use of a medical device. Rather, it is the responsibility of
the medical device manufacturer to insure that it has designed appropriate
security into its devices. Accordingly, the FDA sees no need for any new FDA
regulations concerning security.

“Witnesses representing medical device and software
manufacturers expressed concerns about updating FDA regulated medical devices
containing embedded computers and operating systems. They indicated that the
medical devices need to be tested to insure that any updates (including those
released to address security issues) do not interfere with the safe operation
of the medical device. This testing often results in delays in customers being
able to update the medical devices. Manufacturers are concerned that some
customers update medical equipment with the latest patches without first
verifying whether the updates affect the safe operation of the medical device.

“A number of witnesses recommended that a process be
developed to allow manufacturers to post Security Rule information for their
medical devices. The witnesses cited an initiative by the Healthcare
Information Management and Systems Society (HIMSS) Medical Device Security work
group. The work group proposed that the industry adopt the use of a
“Manufacturers Disclosure for Medical Device Security”
(MDS2) form.” And I think there’s a comment here –
“need another sentence describing what this is” – which I didn’t notice
before.

“While there is no consensus whether the HIMSS
MDS2 form was suitable for use, in concept it appears that this
approach would be of great value to providers.

“Based on the oral and written testimony, NCVHS
recommends the following:”

First bullet point. “HHS should provide clarification
regarding compliance obligations of covered entities with non-compliant and
non-upgradeable legacy medical devices. A range of options should be considered
based on the nature of the equipment, its replacement cost, and life
expectancy, the security problems, and the possibility of protecting the
security of PHI through other means.”

Second bullet point. “HHS should provide guidance to
covered entities to assist in bringing medical equipment into compliance with
the Security Rule.”

Third bullet point. “HHS should consider supporting
industry efforts to have medical device manufacturers self report the
compliance status of their medical devices.”

Fourth bullet point. “CMS should work with the FDA to
assist medical device manufacturers to comply with the Security Rule and
address security risks.

“We appreciate the opportunity to offer these comments
and recommendations.

“Sincerely, John Lumpkin.”

MR. ROTHSTEIN: Okay. Comments, other than that one sentence
that we need to add? Harry?

MR. REYNOLDS: Yes, I’d like to go back to my earlier comment.

In the top of the second page where it talks about
“Rather, it is the responsibility of the medical device manufacturer to
insure that it has designed appropriate security into its devices.”

Is it worthwhile there to note, again, that they are not
covered entities, because we keep bumping into this everywhere we go? And then
the last thing we say is “CMS should work with the FDA to assist medical
device manufacturers to comply with the Security Rule….” What are they
going to do? And they’re not a covered entity, so what’s “work with”
mean?

MR. ROTHSTEIN: Well, it clearly –

MR. REYNOLDS: So you got a group of people again that aren’t.
You’ve got some legacy systems that can’t be.

MR. ROTHSTEIN: Right.

MR. REYNOLDS: They don’t have to follow the guideline as the
manufacturer. They can each take a separate direction.

MR. HOUSTON: I think the point, though, is that these
manufacturers, especially with this legacy equipment, aren’t doing anything in
many cases. I don’t think they feel compelled to do anything.

MR. REYNOLDS: I agree.

MR. HOUSTON: It’s sort of like – and I think our
recommendation ultimately came down to that HHS should work with providers who
have this legacy equipment to address issues of non-compliance so that these
providers don’t feel that they’re at risk for using non-compliant equipment or
feel compelled to throw it away.

MR. REYNOLDS: I agree with that one, John. I’m not challenging
that one. I’m back to the whole point again of recommending at the top that the
device manufacturer, it’s their responsibility to insure security but nobody’s
got any jurisdiction over them. It’s like we didn’t have jurisdiction over
vendors in the transactions and code sets.

So as we continue to redefine everything that makes this whole
thing work, we’ve still got these entities out there that can float around.

MR. ROTHSTEIN: Well, I want to go back to Harry’s sentence.
Maybe we need to rewrite the “Rather” sentence because we’re giving
the impression that they’re under a legal duty when they’re not.

MR. REYNOLDS: Definitely, right.

MR. ROTHSTEIN: So I think what we really are trying to say is
that medical device manufacturers are in the best position –

MR. BLAIR: Could I suggest some wording here on this issue?

MR. HOUSTON: Well, let me clarify this because actually what
the “Rather” sentence was, that was a response out of the FDA. The
FDA actually also indicated, in the testimony, that it was –

MR. ROTHSTEIN: Okay. So what you’re saying is then it should be
as such, “Because the FDA does not evaluate security in approving the use of a
device and they
believe that blah-blah-blah-blah-blah,” so we want to attribute the
“Rather” sentence to the FDA, somehow.

MR. HOUSTON: What about rather than “Rather” say
“and indicated that?”

MR. ROTHSTEIN: Yes. Or “The FDA indicated that it is the
responsibility” – instead of the word “rather.”

MR. HOUSTON: Right. I’m taking “Rather” out.

MR. ROTHSTEIN: Okay.

MR. HOUSTON: In its place – take “rather” out
and insert “and indicated that” –

MR. ROTHSTEIN: Okay. Can we also then take up Harry’s second
point, which is in the last bullet?

MR. REYNOLDS: Last bullet, that’s correct.

MR. HOUSTON: Well, you want to take out the last bullet?

MR. ROTHSTEIN: Well, no, I want to discuss the last bullet.

MR. BLAIR: Is this still on the issue of vendor
responsibilities?

MR. ROTHSTEIN: Yes. Jeff. Did you want to –

MR. BLAIR: Yes. As a former vendor myself, I once dealt with
trying to get clarification from the FDA with respect to producing software
that would be considered a medical device and I feel uncomfortable having the
phrase in there that the vendor is responsible for insuring security because in
many cases the vendor provides the ability, or the tools, to facilitate
security, but it’s really the user that has to take the steps that will use
that device in a manner that is secure.

So I would change the word “insure security” to
“facilitate” or “enable security” and that it’s the
responsibility of the user, not the vendor.

MR. ROTHSTEIN: Well, the way it reads now, Jeff, is: “It
is the responsibility of the medical device manufacturer to insure that it has
designed appropriate security into its devices.” It’s not ensuring
security; it’s ensuring that it basically has enabled security to go forward.

MR. BLAIR: I’d still get either the word “enabled” or
“facilitate” into the sentence, that it is designed adequate –
well, I can’t see the sentence so, but to make sure that the vendors are not
held responsible for the fact that if it’s used in a manner which violates
HIPAA security that the vendor would be held accountable, as long as it
provided the tools to the user to facilitate security.

MR. REYNOLDS: See, we may forever have dumb devices where it’s
locked up in a room, being secured in a room, it’s locked up in a room; only
certain people go in the room. They run the rest. They lock the door. It comes over
your network. You know, that’s secure, and you identified the order in which
you put the samples in.

So I think we’re still going to have lots of different levels,
but the issue is, and I would say that any of this legacy stuff that’s out
there right now, if you did some of these things with it, you got to cover your
network because your network’s going to be the thing that’s going to shoot a
virus down or something else to a dumb device.

So there’s lots of ways to deal with it.

MR. BLAIR: I could give you examples to support what I think
both Harry and I are saying – a vendor could provide the ability
for the user to set passwords. It could provide the ability for the user to set
access controls. It could provide the ability for the user to encrypt the data.
But it cannot insure that any of those software and/or hardware tools will be
used by the user.

MR. ROTHSTEIN: Okay, I have a suggestion. Beginning with the
word “Rather” that we’ve taken out, it would now read: “And
indicated it is the responsibility of the medical device manufacturer to design
its devices to enable compliance with the Security Rule.”

MR. HOUSTON: Okay. Or could we say “design appropriate
security in its devices necessary to comply with the Security Rule?”

MR. ROTHSTEIN: Fine.

MR. BLAIR: Again, if you just took out “enable” or
“facilitate,” then you’re holding the vendor responsible for whether
it’s used the appropriate way or not.

MR. HOUSTON: Where’s “enabled” so I don’t –

MR. ROTHSTEIN: I have “device manufacturer to design its
devices to enable covered entities to comply with the Security Rule.”
How’s that? Is that better?

MR. BLAIR: Perfect.

MR. ROTHSTEIN: Because now we make it clear that they’re not
covered entities.

Still not crazy about that?

MR. HOUSTON: I was writing something –

MR. ROTHSTEIN: Oh, you want that again? Sorry. One more time,
with feeling, “and indicated that it is the responsibility of the medical
device manufacturer to design its devices to enable covered entities to comply
with the Security Rule.”

Marjorie, do you have a comment?

MS. GREENBERG: I haven’t seen this before, so maybe I need to
read it. But just in following the first reading, it seemed to me that it could
be tightened up quite a bit. Maybe you want to expand these themes. But then I
wasn’t sure of the extent to which then the next three paragraphs –
although I realize you’re now working on that middle paragraph – really
added that much and seemed kind of repetitive, particularly this third
paragraph starting with “Witnesses.” It seemed that everything that
was said there had already been –

MR. ROTHSTEIN: Are you on the second page?

MS. GREENBERG: I’m on the second page.

MR. ROTHSTEIN: Okay.

MS. GREENBERG: After the two points – number 1, number 2,
then there are three paragraphs, and they seem somewhat redundant. At least
some of it seemed kind of redundant of the two points, or of each other.

And I wasn’t sure – one person said this, one person said
that. I’m just thinking that it might strengthen the letter a little bit to
make it a little tighter because it did seem sort of repetitive, although
– I mean, as I’ve said, I’ve only heard this once – you may have
wanted to point out that the different – you know, one was for medical
device, one was from FDA, one was from VA. I don’t know. It’s just you might
look at it to see whether it could be tightened up.

MR. ROTHSTEIN: All right. I’ll put a tightening amendment on
the floor and let John respond to this. What would you say, John, if we took
out, on the second page, the second and third paragraphs? How much would that
hurt the letter?

MR. HOUSTON: Well, I think the second and third paragraphs on
the second page – well, first of all, the purpose of those two numbered
paragraphs on the first page was to sort of lay out themes.

The only reason why I added these other paragraphs primarily
was because they described what the witnesses actually indicated, so I mean,
they are redundant; I will absolutely agree with you on that point.

I could see the one point in the second paragraph on the second
page which I think is important is the very last sentence, which described
manufacturers’ concerns about customers updating medical equipment without
verifying that the update does not –

MS. GREENBERG: Right. But I think that was already mentioned
somewhere above, that that was a risk –

MR. HOUSTON: Well, what we talked about on number 2 on the
first page is that there’s a delay in testing, that testing often delays
implementing critical security patches. I guess we could add to that in a way
to add that last sentence.

MS. GREENBERG: That’s what I was thinking. Maybe you could add
it up there.

MR. HOUSTON: I can add that there and pull out the second
paragraph.

And I think that the third paragraph on the second page,
though, as it relates to the MDS2, I think that’s a new concept
which really isn’t discussed before.

MS. GREENBERG: Yes. That’s why I didn’t mention that paragraph.

MR. HOUSTON: So I would say take out the second paragraph, add
to the last –

MS. GREENBERG: That was one I really –

MR. ROTHSTEIN: Okay. And we’re going to move that sentence that
begins “Manufacturers,” the last one, we’re moving that.

MR. HOUSTON: To number 2. Bullet number 2.

MR. ROTHSTEIN: Okay.

MS. GREENBERG: And the other thing was, and maybe you’ve
addressed this, but in that previous paragraph, “Accordingly, the FDA sees
no need for any new FDA regulations concerning security,” that seemed to
me to come out of left field. I mean, was that one of the options?

MR. ROTHSTEIN: No, because you see a couple sentences before
they say – we’re talking about the FDA in that whole paragraph.

MS. GREENBERG: Yes.

MR. ROTHSTEIN: The FDA’s primary focus is safe and effective
use of medical devices. They don’t do anything with security. It’s the device
manufacturer and therefore the FDA isn’t going to regulate security.

MS. GREENBERG: Who suggested that they should regulate it?

MR. ROTHSTEIN: Nobody, but the FDA said it’s not going to be
us.

MR. HOUSTON: I think there was a question during testimony
where we asked the FDA about that and I believe their response was – I
don’t entirely remember the sequence of things, but I think they said, hey,
we’re responsible for the safe and effective operation of medical equipment but
it isn’t our responsibility to deal with security.

MS. GREENBERG: Well, maybe you need to say, then,
“Accordingly, they see no need for any FDA regulations to address this
problem,” or something.

MR. ROTHSTEIN: Okay.

MS. GREENBERG: Is that what you’re saying? You’ve identified a
problem, but FDA doesn’t think it’s their role to –

MR. HOUSTON: I don’t even know if it’s a problem. I mean, I
think that all we’re stating here is that right now the FDA –

MS. GREENBERG: To address these issues.

MR. HOUSTON: — doesn’t feel it’s a responsibility – now,
you could argue that it really is CMS’s responsibility to take responsibility
for this aspect of it. And all we’re saying in one of our recommendations is
for CMS to work with the FDA in order to accomplish –

MS. GREENBERG: But is part of the problem that the regulations,
the security regs, did not adequately address this area or that the security
regs have certain requirements that it’s difficult or impossible for
medical devices to meet?

MR. HOUSTON: I think it’s the latter.

MS. GREENBERG: The former legacy systems. So where would
additional regulation be used?

MR. HOUSTON: We’re not asking for additional.

MS. GREENBERG: I mean, when you say something like
“Accordingly, the FDA sees no need for any new FDA regulations concerning
security,” to me that’s like a leading question. It’s like somebody seems
to feel a need for that but the FDA doesn’t.

MR. HOUSTON: I guess we could take that out and simply leave it –

MS. GREENBERG: It leaves the question in my mind about should
the FDA – see a reason for this?

MR. ROTHSTEIN: We can scratch that.

MR. HOUSTON: It really doesn’t add anything because the first
two sentences sort of describe –

MR. ROTHSTEIN: Okay.

MR. REYNOLDS: I concur with Marjorie.

MR. ROTHSTEIN: That sentence is gone, okay.

MS. GREENBERG: Okay.

MR. REYNOLDS: Good.

MR. ROTHSTEIN: Well, I made somebody happy today.

[Laughter.]

MR. HOUSTON: Let me read this here. Let me read this paragraph
over since we’ve made some modifications –

MR. ROTHSTEIN: Please.

MR. HOUSTON: — to it. Just simply say, “Another witness
representing the FDA stated that the FDA’s primary focus is the safe and
effective use of medical devices. As such, the FDA does not evaluate security
in approving the use of medical devices and indicated that it is the
responsibility of the medical device manufacturers to insure” –

MR. ROTHSTEIN: No – “to design?”

MR. HOUSTON: I’m sorry. ” – to design its devices to enable covered
entities to comply with the Security Rule.”

MS. GREENBERG: Okay. That ties in with down here where you
suggest that CMS should work with FDA so they could assist medical device
manufacturers.

MR. ROTHSTEIN: Well, that’s the bullet that we put on hold that
Harry phrased earlier.

So let’s take up that fourth bullet now. Jeff, it currently
reads: “CMS should work with the FDA to assist medical device
manufacturers to comply with the Security Rule and address security
risks.” So, Harry, do you want to restate your concerns?

MR. REYNOLDS: Well, you’re right about that now, okay. We want
the device manufacturers to take the security reg into consideration in their
designs and their implementation.

MS. GREENBERG: Right.

MR. REYNOLDS: That’s what we want.

MR. REYNOLDS: Because as you sit back and look at this, the
covered entity, whether you’re talking about a legacy system or a new system,
the covered entity is going to be responsible.

MS. GREENBERG: Right.

MR. REYNOLDS: And then you stick their network in the middle,
which obviously is where the viruses flow.

MS. GREENBERG: Where the what?

MR. REYNOLDS: Viruses flow.

They are responsible. But at some point, if somebody doesn’t
put up some guidelines or somebody doesn’t give some of these manufacturers a
sense of what it means – maybe frequently asked questions, which have really been
helpful in the transactions, I know – so what would be the minimum things you
would hope were in a medical device?

MR. HOUSTON: Can I ask for a wording change because we wanted
to give FDA and CMS a lot of flexibility, I would think, how they go about
doing this. What if we say: “CMS should work with the FDA to assist
medical device manufacturers to insure that their medical devices are capable
of complying with the Security Rule.”

MR. BLAIR: But the devices can’t comply. The devices enable compliance by the user.

MS. GREENBERG: How about saying CMS should work with FDA to
develop guidance for medical device manufacturers that will assist them in
assuring compliance with the Security Rule, something like that.

MR. HOUSTON: But back to Harry’s point – you can’t have
the medical device manufacturers comply.

It’s –

MS. GREENBERG: Okay, but assuring them of the devices, those
who use – I don’t know, something. Isn’t that really guidance? See, right
now –

MR. REYNOLDS: Guidance of appropriate security capability.
That’s what you’re putting in the medical device. Then the covered entity needs
to decide –

MS. GREENBERG: Then maybe don’t talk about compliance.

MR. REYNOLDS: Yes, right –

MS. GREENBERG: It should work with FDA to develop guidance for
medical device manufacturers consistent with the Security Rule, maybe.

MR. REYNOLDS: Recommending capabilities.

MS. GREENBERG: Recommending capabilities –

MR. REYNOLDS: I’m okay with that.

MS. GREENBERG: — consistent with the Security Rule, because
you can give guidance to people even if they’re not covered by HIPAA.

MR. ROTHSTEIN: The other thing that I would raise is the first
three bullets are worded “HHS should.” Now, should we –

MS. GREENBERG: Yes, why should we –

MR. REYNOLDS: HHS is – that’s fine.

MR. ROTHSTEIN: Okay.

MR. HOUSTON: So let me read – I mean, I’ve been sort of
working through trying to listen and write at the same time. HHS would work
with the FDA –

MR. ROTHSTEIN: No. We don’t want to put in “work with the
FDA.” HHS is FDA, CMS.

MR. HOUSTON: Okay. Okay, so “HHS should develop –

MS. GREENBERG: Guidance.

MR. HOUSTON: — guidance to assist medical device
manufacturers to enable their medical devices” – that’s wrong; never
mind.

MR. REYNOLDS: Start at the beginning and I’ll tell you where
to stop and then I want to add some more to it.

MR. HOUSTON: “HHS should develop guidance to assist
medical device manufacturers” –

MR. REYNOLDS: Okay, stop right there. “In identifying the
appropriate capabilities” –

MS. GREENBERG: You said something about “guidance for
medical device manufacturers on the –

MR. REYNOLDS: How about “guidance regarding” –

MS. GREENBERG: — “on appropriate” –

MR. REYNOLDS: Capabilities.

MS. GREENBERG: — “capabilities consistent with the
Security Rule,” something like that.

MR. REYNOLDS: See, if I’m a device manufacturer, I’m not
required to follow the law —

MR. ROTHSTEIN: Right.

MR. REYNOLDS: — but I would like at least some minimum set of
what would everybody want in my device as a minimum standard so I can design
accordingly. If I design not to, and a hospital buys it, then they may have to
lock that device up in a room, they may have to have only certain people going
in and out of that place doing it; if I was them, I would set much stricter
procedures about how you’re monitoring what’s going in, I’d run more blanks possibly
in certain devices. Then I can assure that I’m getting what I need because,
again, the covered entity has to be the one in the end.

MR. BLAIR: Could I make a suggestion here to maybe even
tighten it up, because clearly we have many different players? Each of them
provides a different piece in the puzzle to provide security here.

I think HHS needs to provide guidance. One is the guidance to
the vendor so that the vendor provides the capabilities for data security. The
other’s probably guidance to the users so that the users use the appropriate
tools provided by the vendors.

MR. ROTHSTEIN: We actually have that –

MR. BLAIR: You got that already?

MR. ROTHSTEIN: — in the second bullet.

MR. BLAIR: Okay. All right.

MR. ROTHSTEIN: It says “HHS should provide guidance to
covered entities to assist in bringing medical equipment into compliance with
the Security Rule.”

MR. BLAIR: And it isn’t “bringing medical equipment”
but using the tools provided by the medical equipment vendors.

MS. GREENBERG: Assuming they have provided them.

MR. BLAIR: Or if you don’t want to use “tools,” or
“using the capabilities provided by” –

MR. HOUSTON: I think we’re trying to make that concept too
complex. I mean, I think the point is that there’s a number of strategies for
bringing it into compliance. By the way, it may not be manufacturers’ tools
that they use. It could be where your firewall is, is the medical equipment on?

MR. BLAIR: Okay.

MR. HOUSTON: So there’s a variety of things that might be
done.

MR. BLAIR: Okay.

MR. ROTHSTEIN: Okay. So here’s a proposed Bullet 4: “HHS
should provide guidance to medical device manufacturers about measures to
insure that covered entities using their devices will be able to comply with
the Security Rule.”

MR. HOUSTON: Read that again. I want to write it down.

MR. ROTHSTEIN: “HHS should provide guidance to medical
device manufacturers about measures to insure that covered entities using their
devices will be able to comply with the Security Rule.”

MR. HOUSTON: Why don’t we simply say that their devices are
“capable” of complying with the Security Rule?

MR. ROTHSTEIN: That their devices are capable –

okay.

MR. HOUSTON: So read that? Okay –

MR. ROTHSTEIN: Well, you read it now; you changed it!

MR. HOUSTON: No, no, no, no – I didn’t even write it
down; I was waiting till we had something.

MR. ROTHSTEIN: Oh. “HHS should provide guidance to
medical device manufacturers about measures” – now, you changed that;
I had “to insure that covered entities using their devices” and you
changed it to something –

MR. BLAIR: See, the point is it’s the covered entities that
need to comply.

MR. ROTHSTEIN: I said to insure that covered entities using
their devices.

MR. BLAIR: Right. And that’s the way you worded it.

MR. HOUSTON: No, but see my point is we were already talking
about the things above that HHS is going to do with covered entities. What
we’re doing now is making recommendation that HHS should also work with device
manufacturers.

MR. ROTHSTEIN: Oh, I know what you said. You said “HHS
should provide guidance to medical device manufacturers about measures to
insure that their devices are capable of complying with the security
rule.”

MR. BLAIR: And that’s – see, that I think goes too far.
You’re saying that the devices comply. And the devices are not under HIPAA.

MR. HOUSTON: The devices are capable of complying.

MR. BLAIR: No, no. It’s –

MS. GREENBERG: Why not “consistent with?”

MR. BLAIR: But, see that – I disagree with that. It’s the
users that have to comply. It’s the covered entities that have to comply. And
in this case, the vendors’ equipment has to enable or facilitate that
compliance or provide the capability so that the users can comply.

But now what you’ve done is you’ve reworded the sentence where
you’re winding up saying that the device complies or not, and they don’t.

MR. ROTHSTEIN: He was saying that covered entities using their
devices?

MR. HOUSTON: We’re already talking about the things above that
covered entities need to do. All we’re saying is, you know, a medical device, a
new medical device, can’t comply unless –

MS. GREENBERG: To be consistent with the Security Rule,
shouldn’t it?

MR. HOUSTON: I mean, all we’re saying is manufacturers have to
– you know, it’s like if somebody says car manufacturers have to have
airbags in their cars. Well, that’s not even a good example, but the point is
that somebody’s got to work with their device manufacturer to make sure that
they have devices that are capable of complying. Then it’s the other guidance
regarding assisting the covered entities to comply. It’s another theme.

MR. REYNOLDS: You’re talking about capabilities consistent
with the Security Rule. This is not capabilities consistent.

MR. HOUSTON: Right. That’s all I’m saying.

MR. REYNOLDS: That’s what we should say.

MR. HOUSTON: And that’s all we could say with the
manufacturers.

MR. ROTHSTEIN: Okay.

MR. BLAIR: The reason I’m pointing this out is that vendors in
the past, and this parallels what happened with the patient safety regs that
FDA worked with, and that is that vendors in many cases have provided security
tools and the user didn’t use them. And all the vendors could do was provide
the tools –

MR. HOUSTON: I agree.

MR. BLAIR: — but they can’t make the users use role-based
access controls, class-based access controls.

MR. HOUSTON: But, Jeff, that’s why the very first bullet
points out “HHS should provide clarification regarding the compliance
obligations of covered entities with non-compliant and non-upgradeable legacy
medical devices.”

We could add a bullet that we could simply say – we could
take out the “non-compliant and non-upgradeable legacy medical
devices” and simply say “medical devices.”

I mean, we’re dealing with two different themes here, I think.

DR. RIPPEN: Do you think it’d be better just to draft and
circulate?

MR. ROTHSTEIN: I’m sorry?

DR. RIPPEN: Draft and circulate? I mean, I think –

MR. ROTHSTEIN: Well, where we are now is we have to get an
approved, that is, a Subcommittee-approved, draft to the Executive Committee in
advance of the January 21st Executive Committee conference call
which will decide whether it’s in reasonable enough shape to get on the March
agenda.

So I think we are close enough that we can delegate to John the
responsibility for fixing this somehow.

DR. HARDING: Read it one more time.

MR. ROTHSTEIN: Any version!

[Laughter.]

MR. REYNOLDS: Yes, just read something, John, and we’ll go,
“Yeah, now, OK, you got it – now fix it.”

MR. HOUSTON: “HHS should provide guidance to medical
device manufacturers about measures necessary to insure that the medical device
manufacturers’ devices are capable of complying with the Security Rule.”

MS. GREENBERG: You don’t need that second “medical device
manufacturers.”

MR. HOUSTON: I know. Well, I’m writing as I’m thinking and –

MR. ROTHSTEIN: Okay, we got that.

DR. HARDING: Have the capacity to –

MR. BLAIR: Has to comply.

MR. HOUSTON: Let me wordsmith. What we’re trying to do in
concept is provide guidance to the medical device manufacturers so that they
can sort of develop compliance-capable equipment.

MS. GREENBERG: Harry and I agree we should talk about
“capabilities” –

MR. REYNOLDS: “Consistent with the Security Rule.”

MS. GREENBERG: Don’t talk about compliance. It just raises the
question of who’s complying.

MR. REYNOLDS: “Capabilities consistent.”

MR. ROTHSTEIN: Okay. Other concerns or issues related to the
letter? Hearing none –

[Laughter.]

MR. ROTHSTEIN: — heard a squeal over there. Okay, all in favor
of the letter as amended, say aye.

ALL: Aye.

MR. ROTHSTEIN: Opposed?

[No response.]

MR. ROTHSTEIN: Okay, so it’s unanimously approved by the
Subcommittee, and John, can you then make those changes –

MR. HOUSTON: Yes.

MR. ROTHSTEIN: — and get it to Marietta so she will forward it
to the Executive Subcommittee in advance of our –

MR. HOUSTON: Do we want to talk briefly about the bullet
tomorrow?

MR. ROTHSTEIN: I’m sorry?

MR. HOUSTON: Let me bring this back for discussion on bullet 4
tomorrow.

MR. ROTHSTEIN: When?

MR. HOUSTON: At the very beginning of the morning. Maybe we
could informally circulate it.

MR. ROTHSTEIN: Fine.

MR. HOUSTON: Now, I have one other point on here, too. There
was a comment that we need another sentence describing what this is under the
MDS2.

MR. ROTHSTEIN: Right.

MR. HOUSTON: While we were in here, I actually described
something.

MR. ROTHSTEIN: Okay.

MR. HOUSTON: I said – and this would go in place of this
question here, this bracketed question: “The MDS2 form is a
vehicle for vendors to uniformly report the compliance of their medical devices
with the Security Rule.”

MR. ROTHSTEIN: But it’s for their own use, right? I mean, for
whose use?

MR. HOUSTON: It was for the covered entities’ use. The covered
entity would go look at this form and see where –

MR. ROTHSTEIN: Okay.

MR. HOUSTON: — the device –

MR. ROTHSTEIN: That’s what you need to get in – I don’t
know if they’re reporting it for each other’s use or for CMS or FDA or –

MR. HOUSTON: Why don’t we say “vendors to uniformly report
to the covered entities the compliance of their medical devices with the
Security Rule?”

MR. ROTHSTEIN: Okay. And we’ll see that tomorrow morning,
bright and early?

MR. HOUSTON: Sure.

MR. ROTHSTEIN: I’m looking forward to it.

DR. RIPPEN: Except you may want to avoid the word “compliance.”

MR. BLAIR: You’re saying the devices are compliant, and –

MR. HOUSTON: Okay – okay.

MR. ROTHSTEIN: Okay. So, John, any other –

MR. HOUSTON: No.

MR. ROTHSTEIN: Very good.

Now, we’ve got a couple of other items to take up. The first
one is: What do we want to do with the RFID issue? Do we need more hearings? Do
we need to hear from more people? Do we need to get more written information?
Do we want to drop it? Are we ready to write a letter? Are we ready to have a
chip implanted in each of our arms?

[Laughter.]

MR. ROTHSTEIN: What’s the Subcommittee’s view on RFID? John?

MR. HOUSTON: I honestly didn’t hear anything actionable today.
I think there’s a lot of interesting concerns about RFID technology, but I
didn’t hear anything that’s specifically related to HIPAA that is something –

MR. ROTHSTEIN: Well, it doesn’t have to be related to HIPAA.

MR. HOUSTON: Okay. I thought that that was one of the purposes
here.

MR. ROTHSTEIN: No, it’s related to HHS jurisdiction more
broadly. We do other stuff besides HIPAA.

MR. HOUSTON: Okay – well –

MR. ROTHSTEIN: Even though one would not necessarily realize
that, at this point. Helga?

DR. RIPPEN: It occurred to me during the discussions that what
seemed to be missing, or at least from my perspective, was there was nothing
that we could use to evaluate whether a new technology would somehow change the
paradigm of the discussion of privacy and the use of the technology in the
health environment.

And I don’t know if that’s an appropriate thing to explore or
not, but it might provide a way to decide whether or not new technologies need
to be further discussed.

MR. ROTHSTEIN: Richard, you had strong feelings, or expressed a
colorful view of the technology earlier. Do you have any –

MR. REYNOLDS: Colorful and squeamish!

MR. ROTHSTEIN: No – creepy.

MR. REYNOLDS: Creepy, that’s right.

[Laughter.]

MR. ROTHSTEIN: Do you want to get your mike and –

DR. HARDING: The question being, is there any –

MR. ROTHSTEIN: I mean, do you have any feelings about what we
should do next, what if anything we should do next, based on what you heard
today? I mean, should we table this until we have more information? But what
more information would we have? Harry?

MR. REYNOLDS: Yes, I have a comment. I think if you take RFID,
you take some of the home monitors that are going to be used in home health
– in other words, there are more and more technologies showing up on the
street now that are going to be able to be used in many different ways to
monitor people, to identify people, and I think what we heard today is it’s
still about the data. It’s still all about protecting the data.

But I think it is going to be inherent that somebody watch
these new technologies closely because I think at some point they may cross a
line where it is out of somebody’s hands – you know, like you think of
some of the vests that people wear home now, you know, when they’re being
monitored. It takes their blood pressure, it takes the whole thing, and guess
what? They’re being monitored radio frequency. Or they’re plugging in or
they’re doing something else.

So, I mean, RFID to me just happened to be the subject today
but it just raises – this whole technology thing is just becoming –
and it’s going to continually become an issue, and the privacy of the person
and the protection of that data and their consent is what I think I heard
today. I mean, no different than what we’ve heard all along.

So if I were thinking about drafting a letter, it would be that
continued diligence needs to be on this new technology so that anything that
happens falls under the Privacy Rule, consent is continually pushed forward.

So RFID was a good example and some of the other things that
we’ve heard, but I’m not sure if I heard six more presentations on RFID from
four or five different entities, I don’t think I’d feel any different about
consent and I sure wouldn’t feel any different about what could be dealt with
as far as a person’s information from a privacy standpoint.

The ease of getting somebody’s information with this new
technology is what’s really – I mean, all of us walking around with these
and everything else, I mean, the same kind of thing could happen – so I’m
not sure holding any more hearings would make me feel any more or less about
RFID or any of the other things, technologies.

MR. ROTHSTEIN: So your view is that we didn’t hear anything
today that suggests action at this point? Is that what you’re saying?

MR. REYNOLDS: That would be my feeling.

MR. ROTHSTEIN: Well, I just want to ask the Subcommittee
something else.

Senator Leahy of the Senate Judiciary Committee wrote a letter
on October 18th to Secretary Thompson asking Secretary Thompson to
respond about RFID technology and in particular the VeriChip and the privacy
issues that were raised by it.

At the last full meeting of the NCVHS, I asked Jim Scanlon if
anything had been done to respond to the letter or if anything was sort of in
the works at the agency and he said no, and therefore that’s why we scheduled
this hearing, because we thought it would be helpful in assisting the Secretary
to respond to the inquiry from the Senate.

So now the question is from the Secretary’s standpoint. Would
it be valuable to receive some sort of letter or communication from us even
just setting forth that we held hearings, this is who we heard from, these were
the issues that were raised, we are on it, so that the Secretary could then
relay that to the Hill? I mean, we

have done that in the past, sort of like said that we’re
looking into stuff. Would that be appropriate, do you think?

MR. REYNOLDS: I don’t have a problem with that. I mean, that
would be pretty easy to draft.

MR. ROTHSTEIN: Is that volunteering?

MR. REYNOLDS: I was just making a comment.

[Laughter.]

MR. REYNOLDS: Knowing that the rest of you have initials past
your name, you ought to be able to do it easier.

MR. BLAIR: Would the Subcommittee feel comfortable if you
indicated that this is a complex area; there’s a lot more to learn about both
the technology, the emerging applications, and the implications of those
emerging applications, so that we don’t have recommendations with respect to
legislation at this time, but we could encourage that some of the existing
vendors work with like Ms. Sotto on those industry best practices so that the
best practices could be in place until more information is known and
legislation could be developed?

MR. ROTHSTEIN: Well, I think if we start saying that, that’s
really sort of a substantive judgment that we’re making that best practices are
valuable, that they ought to be encouraged in advance of legislation or in lieu
of legislation, and I’m not sure I’m prepared to endorse going that route.

But what I am prepared to do certainly is to advise the
Secretary that we have held these hearings, that it’s an issue that we will
continue to monitor, and here’s who we heard from. We heard from two
manufacturers, we heard from various privacy people, and they presented the
following information, and that, I would think, would give the Secretary
enough, if he so chose, to then respond to the inquiry from the Senate.

MS. GREENBERG: The only thing is, if you’re going to say that
you’re going to be monitoring this, then what are you going to do? Are you
going to hold more hearings?

MR. ROTHSTEIN: Possibly, I don’t know.

DR. HARDING: It’s just one of those cases like we get into so
often where the technology is developing so much faster than the policy. It’s
gone. I mean, that’s the feeling I get: it’s gone. And it’s already blurring
security of health information. They’re already talking about dust, you know,
RFID dust, just little micro-amounts that you can put into things that will ID
that. And things that I don’t know how we’re going to have a policy, how we’ll
keep up with that, because it’s booming so fast.

And that’s happened to us before. It’s a little frustrating.
Not that I want to slow technology, but I’d sure, like Europe, to have some
sort of parameters under which technology develops.

MR. ROTHSTEIN: Harry?

MR. REYNOLDS: I agree with Richard, but when it comes to
policy, the whole thing about consent and if NCVHS can’t continue to monitor
all these situations that exist about where data can and can’t be dealt with
and continue to recommend ways of tightening that consent, insuring the
consent’s there, keeping that as a consistent message, because once it gets out
– I mean, I’m with Richard; I mean, the technology itself has capabilities
that without that strong focus on that the patient has to have some kind of say
at some point about their information – you can’t take data points and fix
them, you can’t take individual technologies and fix them, because they’re all
going crazy.

DR. HARDING: I would love to read their notice – I’d love
to read it, what they say, the VeriChip.

MR. ROTHSTEIN: I have another suggestion. Because if we write
this letter to the Secretary, the purpose of the letter will be to enable the
Secretary to respond to this inquiry. Maybe what I ought to do is check with
Jim to see whether they even want the letter.

MS. GREENBERG: Maybe they already responded.

MR. ROTHSTEIN: Yes, or find out what’s happening.

MS. GREENBERG: It’s unusual not to send some kind of a response
to a Congressional inquiry.

MR. ROTHSTEIN: Well, the first letter was sent by Leahy in
April of 2004. At our last meeting in November, Jim indicated that there still
had not been a response. So if they didn’t respond in the first six months, I
mean, I don’t know why they would do it now.

MR. BLAIR: Maybe one of the ways that you can respond so you’re
not left in a position where you’re saying that you’re going to continue to
monitor it is you craft a letter where you wind up saying here were the findings
and here are the issues.

MR. ROTHSTEIN: Right.

MR. BLAIR: And then that is a close, and you’re turning it
back, in this case, to Congress. They could do some more hearings themselves.

MR. ROTHSTEIN: So the question is whether we should – I
mean, I think I ought to talk to Jim before we invest all this time to develop
a letter that’s just going to be filed even if we send it.

MS. GREENBERG: Well, I agree with Jeff that we don’t want to
send a letter that says we’re going to monitor this and gives a level of
comfort that, you know, we’re on the case when we have no plans –

MR. ROTHSTEIN: When we’re punting it, yes. I think Jeff’s
right; I agree.

MR. REYNOLDS: And Mark, since you did make the comment, if you
decide after talking to Jim, you’d like a draft letter, I would be happy to
work on it with you.

MR. ROTHSTEIN: I was just kidding you, but I –

MR. REYNOLDS: I know, but I’m not.

MR. ROTHSTEIN: Okay.

MR. REYNOLDS: I will, I’ll try to help you.

MR. ROTHSTEIN: Okay, the other item on today’s agenda for our
meeting is on the archival health information, and the question is, having
heard from our afternoon twosome, whether we are prepared to come up with a
recommendation or we want to seek additional testimony or we want to just file
it in our archives or do whatever?

DR. HARDING: It seems like it’s kind of like the issue of
fundraising, in that some places have a better deal than others and that if we
get into the issue of how to deal with shadow medical records or whatever you
call it, personal health information that’s not part of the medical record,
that it ought to be the same for everybody – I mean, that just seems like
a policy.

MR. ROTHSTEIN: But the easier part of the fundraising was we
were dealing with two categories of covered entities and now we're dealing with –

DR. HARDING: Self-determined.

MR. ROTHSTEIN: — one category of covered entities and one
category of non-covered entities.

DR. HARDING: Well, they were self-determined. Like Harvard says
theirs is not covered entities and Columbia says they are, by
self-determination.

MR. ROTHSTEIN: But like some others, like NLM, they couldn’t
elect to be covered entities. Simon?

DR. COHN: Well, without necessarily trying to solve the
question, I guess the question would be whether we could request the Secretary
to provide some sort of clarification to deal with this issue? I guess I
listened to him – maybe I’m like Richard, it was sort of like – that
doesn't sound fair! And yet I'm not quite sure I have exactly got an answer to how
you would – I mean, you don’t really want to write a new regulation just
to handle what may occur in 20 – I mean, you’d wonder how far you could
get away with sort of stretching by having CMS provide a clarification on this.

MR. ROTHSTEIN: OCR.

DR. COHN: OCR. To give them a little more unrestricted ability
as well as to understand better where the lines really are.

MR. ROTHSTEIN: Well – Marjorie?

MS. GREENBERG: Two things. One is I think that we have left it
with them that they were going to provide you with some more information.

MR. ROTHSTEIN: Well, Hopkins was going to send a letter
clarifying the second bullet point on their recommendation.

MS. GREENBERG: True, but also then you were asking them about
other –

MR. ROTHSTEIN: Names of –

MS. GREENBERG: — organizations and others. They made some
recommendations and I thought maybe they were going to then provide you with
some additional information; I don’t know.

MR. ROTHSTEIN: No, just names.

MS. GREENBERG: The other thing is that, talking about one
letter that didn’t get answered, another letter that didn’t get answered –
I think that they apparently have raised these questions with the Department,
and there are two ways that the Department can generally respond. One is a
direct response and the other is to develop FAQs or guidance.

But it seems like a third option is just to ignore it, and I
don't think that's one that we would support. So they're struggling with these
issues in a reputable, important area, and they're professionals who are well
trained to deal with records and information, and yet they have a lot of the
same questions that they've asked for answers to.

So that’s the part that bothered me the most – although
admittedly you do have this kind of – well, life is unfair and we know
that, but as Harry said, it’s the same with a lot of things. I mean, why –
we’re going to hear from employers tomorrow; they’re not covered. But that’s
the nature of this Rule, that it doesn’t cover everybody.

MR. ROTHSTEIN: It’s not the unfairness of the covered versus
non-covered entities that concerns me. It’s the consequences of having covered
entities having to comply with the Privacy Rule that results in the lack of
access –

MS. GREENBERG: Yes –

MR. ROTHSTEIN: — to these archival records by researchers.

MS. GREENBERG: — that's part of the whole bigger question of
how the Privacy Rule has impacted research, which this Committee's been
raising, but without much response. I mean, you really would need a systematic
evaluation.

Some of the testimony reminded me of back in that hotel in
Silver Spring when people were coming forward and talking about how all the
different IRBs were interpreting things differently and it was so difficult for
people to do their research, and in some cases my reaction was: Well, nobody
ever should have had that kind of access.

MR. ROTHSTEIN: That’s right.

MS. GREENBERG: But in other cases, it was like, well, this really
does seem to be unnecessarily burdensome. And you have to sort all that out.
Just because things are more difficult now doesn't mean that it's bad, because
maybe some things should have been more difficult to get access to and they
weren't.

MR. ROTHSTEIN: On the other hand, nobody knows what they can do
now.

MS. GREENBERG: But now there’s confusion –

MR. ROTHSTEIN: Yes.

MS. GREENBERG: — so that’s the part that bothered me, that
they have posed these questions. They’re legitimate questions. And maybe there
aren’t good answers and that’s why there hasn’t been a response.

But I think that now you’ve raised it to another level by
having a hearing, so I don’t think just filing with no kind of –

MR. ROTHSTEIN: I agree.

MS. GREENBERG: — reaction is appropriate.

MR. ROTHSTEIN: I agree. Christina, can I ask you if this is on
OCR’s radar screen now?

MS. HEIDE: It is.

MR. ROTHSTEIN: So are you looking to us to provide some
guidance based on our hearing?

MS. HEIDE: It’s helpful to hear the specific concerns. We
always welcome recommendations.

One of the things that was a little troubling, has always been
a little troubling, is picking a timeline for protecting these records, no
matter what you define the records to be. Any period of time you choose seems
arbitrary.

And, I mean, that aside, I think we would welcome as much
information on this topic as you can provide us.

MS. GREENBERG: Another thing was that – I mean, this is
another aspect of this and I don't know whether there's any possibility for
consistency – probably not – but when it comes to actual death
records, in some states these are completely open; in other states, they're
completely closed. And you've got variations along that spectrum.

And there are some model laws that are out there, but states do
deal with that differently. Some states will release the fact of death to
anybody, but you have to have a need to know for the cause of death. In other states, as
I said, once someone’s dead, anyone can get a hold of the record.

So it just seemed like, particularly when you’re talking about
those issues, about how much time after a person’s died, there should be some
kind of looking at: Well, what kind of policies are there about death
registration records and how does it fit in with that?

Obviously, the HIPAA rule doesn’t cover those records, I guess,
because they’re not held by covered entities.

MR. ROTHSTEIN: Correct.

MS. GREENBERG: I think they would hold a death record if a
hospital happened to have it, but that’s not their official –

MR. ROTHSTEIN: But once it gets to the state –

MS. GREENBERG: Right. And then they are a public health record,
so there’s some coverage there, you know, if they’re being used for public
health purposes.

But there are some kind of, I think, crossovers and issues here
and I didn’t know whether you wanted to think about those.

MR. ROTHSTEIN: Simon? Richard?

MR. REYNOLDS: I really struggled with the covered and
non-covered, you know, what was your structure to be, one or not the other and
a piece of them not to be and –

MS. GREENBERG: I mean, would Johns Hopkins have the option to
now decide that their library wasn’t covered or something? Or is it determined
by what your actual governance structure is or –

MR. ROTHSTEIN: Well, yes, that’s right, because these large
institutions have some non-covered entity functions and if they could feasibly
segregate them, they could become hybrid entities, and some institutions have
so chosen, and then there is different treatment.

MR. HOUSTON: Can I make a point, though? It seems somewhat
disingenuous to wait until a year after the Security Rule or the Privacy Rule
compliance period has passed, and then make a decision to sort of segregate
yourself differently as a hybrid entity, taking what was a covered function and
taking it out of – I think sort of like the cat's out of the bag or –

MS. HEIDE: If you have a defined covered function, you can’t
just take that out of your –

MR. ROTHSTEIN: Right.

MS. HEIDE: — covered entity piece, if you will. It’s just
covered. There are certain things you can also include in your covered
function, or you can choose to have your whole institution be covered.

MR. HOUSTON: But my point is more so that now that the Privacy
Rule compliance period has passed and you have stuff that now is
considered PHI, protected under the Privacy Rule, I would think it'd be
difficult now to just say, okay, it was yesterday; it's now no longer PHI, and
therefore you don't have to worry about coverage under the Privacy Rule. I
sort of sense that by declaring it once, you set your position. You
don't have a choice now but to handle it as such.

MR. ROTHSTEIN: Well, it may be. I mean, you’re assuming the
lack of good motives on the part of the institution. It may be that the institution
tried doing it in a certain way and it just didn’t work out and they think
it’ll be more efficient if the whole institution became a covered entity
because they can’t keep the firewall intact. I mean, I could see that.

Let me ask you the two questions that we need to resolve on the
archival stuff.

One is whether you think, based on the hearings, that there is
a need for us to write a letter to the Secretary to provide guidance to the
Department and to OCR on how to handle this issue.

And, do you think, assuming yes to the first one, there is a
sufficient consensus that we could develop at the Subcommittee and the full
Committee level that would justify our trying to do that?

I’ll take one at a time.

[Laughter.]

MR. ROTHSTEIN: I mean, as to the first one, Christina said that
OCR is interested in this issue –

MS. HEIDE: Yes.

MR. ROTHSTEIN: — so if we felt we had something to say and we
could reach agreement, it would be valuable to them. But those are two
“ifs,” so Harry?

MR. REYNOLDS: Somebody mentioned it earlier. This feels a lot
to me like the fundraising. The playing field to me –

MS. GREENBERG: The issue of not a level playing field.

MR. REYNOLDS: Yes, that's where I'm going. I just heard enough
– then again, we heard from one side, so – but there really seemed to
be an unlevel playing field. And when you talk about why somebody would want to
change their status – well, if you have libraries, and I would think from a
research standpoint more and more competition will continue to be an issue,
it's going to be a whole lot easier for people. Two companies in the same
business, two entities in the same business, and somebody can go to this one
and get access a whole lot easier and faster, with less stringent controls on
them, so why not? It cuts down your research cost, cuts down your time, cuts
down everything you have to do.

So that personally is why I was still concerned. I don’t
understand the covered and non-covered and how you get in and out and just why
there should be a difference and what the fuss is.

So I just don’t feel comfortable having such an unlevel playing
field and personally I don’t understand why. You guys might, but I don’t know.

MR. ROTHSTEIN: Well, because the statute only covers three
classes of entities and it was never designed to be a comprehensive health
privacy law. So that's the problem. And it can only be fixed by Congress.

MR. BLAIR: Mark, if there’s a specific action we want HHS to
take, then I’d say we should write the letter.

MR. ROTHSTEIN: Okay, and –

MR. BLAIR: If there’s not, then –

MR. ROTHSTEIN: — do you think there is?

MR. BLAIR: We certainly haven’t come to one at this point.

MR. ROTHSTEIN: Simon?

DR. COHN: I sort of agree with you on that. I mean, I’m
struggling because I, like Harry, agree that – I actually completely agree
with what Harry was just saying and what Richard said. But I’m also struggling
with what the right answer is. The right answer somehow escapes me.

Now, of course, John Paul has been good enough at writing letters
historically that sometimes we can express concern and ask for clarifications
to right this wrong without ever giving them the answer of how to right the
wrong.

MR. HOUSTON: Are you proposing I write a letter on that?

DR. COHN: I can't really get closer than that. What?

MR. HOUSTON: Are you proposing I write a letter?

DR. COHN: I did wake you up on that one, didn’t I?

MR. HOUSTON: My fingers are burning right now – yours,
too, I’m sure.

DR. COHN: I mean, I didn't come away from listening feeling that I
had gotten an answer on how you right this wrong.

MR. ROTHSTEIN: Well, here’s another option. We could declare
victory and adjourn the meeting. I mean, I only say that sort of half-jokingly.

OCR is considering the issue, we brought in two witnesses, they
submitted written testimony, OCR heard the testimony, they get the document. We
don't have a recommendation per se. They're working on it. We helped them sort
of get some more information for that.

MS. GREENBERG: I guess maybe if in several months there's still
no answer to their letter or no FAQs, then maybe OCR could let us know what
additional input would be helpful.

MR. REYNOLDS: And also the questions they submitted to the
Secretary before us should still be in play.

MR. ROTHSTEIN: Yes, that’s right.

MR. REYNOLDS: So we’re going to monitor the situation?

MR. ROTHSTEIN: So we’re going to keep our eyes on those guys
and boot it to them.

Yes – Jeff?

MR. BLAIR: Reflecting back, at least the inclination that I
have is that if we're going to write a letter on anything – because I'm about
to switch topics here – we really should have something where we're
asking HHS to do something, either take action or research or something.

So going back to the letter that we would have on the RFID
issue, may I suggest that in the letter, in addition to indicating what the
findings are and what the issues are, there should be something there, even if
it is that we suggest Congress explore or try to refine or resolve the
following issues – some direction as to what either HHS does or what
Congress does as a result of what we learned, even if we don't have the
answers. Something that they should do.

MR. ROTHSTEIN: Well, the way we left that issue, I’m going to
talk to Jim Scanlon, find out what’s happening at the Department level, and see
if a letter setting forth our findings from our two panels would be valuable to
the Data Council or to the Secretary's office or whomever.

If he wants to go ahead, then Harry and I will work on trying
to put together a letter for the Committee and I will report back at our
Subcommittee meeting that will accompany the March full Committee meeting.

The last item is just to alert you to tomorrow’s schedule. We
will be starting at 8:30 tomorrow instead of 9, so please be here early, and we
will go through till 4 o’clock. I don’t think we’re going to have the
opportunity at the end of the day to have one of these sort of debriefing
sessions like we did today, so we will have to take up by conference call
or by some other means how we want to follow up, if at all, on tomorrow's
hearing on compelled authorization disclosures.

Are there other matters to take up? Simon? Oh – John?

MR. HOUSTON: I made the changes and gave them to Marietta
– the letter. She's making copies. They're available now. I did a red-line
and a non-red-line version, so if you want to –

We don’t have to read them now. If somebody wants to take them
home and read them this evening –

MR. ROTHSTEIN: Okay, excellent.

MR. HOUSTON: — that might facilitate getting that letter done.

MR. ROTHSTEIN: Thank you. We will pick those up on our way out.

Simon, did you want to say something?

DR. COHN: Well, just a question having to do with – I
will apologize if you guys have already discussed this when I was out of the
room for a couple minutes – the next meeting is actually in late –

MR. ROTHSTEIN: February, correct. Yes, and did you get a copy
of the agenda? Kathleen was going to email the proposed agenda for the February
23rd hearing.

DR. COHN: Yes, I don’t think any of us got a copy of it.

MR. ROTHSTEIN: Okay. I have it somewhere.

DR. COHN: Normally at the end of tomorrow we would talk about
that and the agenda, but it doesn't look like there's going to be time.

MR. ROTHSTEIN: Right. And –

MR. BROWN: The topic of that hearing is –

MR. ROTHSTEIN: It’s our first hearing on NHIN. So we have one
scheduled February, I believe, 23rd and 24th here, and
then we were talking about scheduling our second meeting in Chicago in March
not because of the weather but because that was the meeting when we were going
to try to get AMA, AHA and a lot of those groups that are in Chicago to
testify.

MS. GREENBERG: Okay.

MR. HOUSTON: Is that scheduled in March already?

MR. ROTHSTEIN: Yes.

MR. HOUSTON: What is it, the first week?

DR. HARDING: It's the 30th and 31st.

MS. GREENBERG: Very end of March.

MR. ROTHSTEIN: Right.

DR. COHN: We write down Chicago on this one?

MR. ROTHSTEIN: Well, I mean, do we need some special approval
to have it outside of Washington?

MS. GREENBERG: Well, if there's a good rationale for it and a
signed plan, both of which seem to exist, I'll approve it.

MR. ROTHSTEIN: Okay. That’s the rationale. Chicago in March can
be nasty but I think, given who we have on our first hearing, and I’m sorry I
don’t have it with me, but Kathleen does and she’ll be here tomorrow morning
and I’ll get her to print that out.

We have basically, for the first day of the two-day hearings,
recruited some of the leading bioethics and health policy people to kind of
give us a grounding of what kind of considerations should go into formulating
policy for a national health information network.

We supplied them with a one-pager on what the NHIN was all
about so that they could focus on the issues.

The second day, we were going to hear from a variety of
consumer groups – AARP and so on and so forth – and the second round
of hearings –

MS. GREENBERG: That’s in February is what you’re talking about
right now?

MR. ROTHSTEIN: Yes, I just described the February hearings.

For the second round of hearings in March – and we're going to need
to talk about that some – we wanted to then start on the provider side and hear
from the AMA, some of the medical specialty colleges, from the hospital
association, the nursing association and so on, and they're all in Chicago and
that's why it seemed to us to be more logical to be there.

MS. GREENBERG: We’ll have our contractor start looking into
accommodations.

MR. HOUSTON: The February meetings, you discussed consumer –

MR. ROTHSTEIN: Correct.

MR. HOUSTON: We've just gone through, with an NHII working group,
a round of consumer comments regarding PHRs, so it's not entirely on point, but
we've just sort of gone through one round of consumer discussion on these types
of things. There is a little bit of a different angle to it. We probably should
make sure that we don't –

MR. ROTHSTEIN: It would be good to find out what you heard –

MR. HOUSTON: Yes, yes.

MR. ROTHSTEIN: — and from whom and –

MR. HOUSTON: Well, make sure we don't get the same people –

MR. ROTHSTEIN: Right.

MR. HOUSTON: — saying the same things.

DR. COHN: No, but for the record, any time there was a privacy
issue, we said, gee, the Privacy Subcommittee is going to hold hearings on
that. That was sort of what –

MR. HOUSTON: There was some privacy data. I’m just wondering
– make sure we don’t have too much of an overlap. I just want to make sure
you know it –

MR. ROTHSTEIN: Well, I thank you, and I hope I can get maybe a
list – do you think we can share that?

The two people who were working on that were Mary Jo, who is
still staff to your Subcommittee?

MR. HOUSTON: NHII?

MR. ROTHSTEIN: Yes.

MR. HOUSTON: Yes. Mary would know, yes.

MR. ROTHSTEIN: Okay, so Mary Jo and Kathleen are the staff
working –

DR. COHN: Who was there.

MR. HOUSTON: So they would get a sense whether they’re –

MR. ROTHSTEIN: Yes. Okay?

DR. COHN: Okay.

MR. ROTHSTEIN: Anything else?

Well, thank you and I’ll see you bright and early tomorrow
morning. We are adjourned for today.

[Meeting adjourned at 4:42 p.m.]