
Monday, February 18, 2013

Kaiser Health News/Philadelphia Inquirer on InformaticsMD: "The flaws of electronic records"

At my Dec. 2012 post "How an interview for Kaiser Health News rekindled memories of health IT dysfunction in the 90's that persist in the 10's" I mentioned an interview by a Kaiser Health News reporter regarding health IT flaws.

His article appeared in both the Philadelphia Inquirer and Kaiser Health News today under the title "The flaws of electronic records":

Philadelphia Inquirer / Kaiser Health News
Feb. 18, 2013
The flaws of electronic records

Drexel University's Scot Silverstein is a leading critic of the rapid switch to computerized medical charts, saying the notion that they prevent more mistakes than they cause is not proven.

Scot Silverstein of Lansdale, one of the most ardent critics of electronic medical records, works on an antique computer. A growing collection of evidence suggests poorly designed medical software can obscure clinical data, generate incorrect treatment orders, and cause other problems.

RON TARVER / Staff Photographer

Jay Hancock, KAISER HEALTH NEWS
Posted: Monday, February 18, 2013, 3:01 AM

Computer mistakes like the one that produced incorrect prescriptions for thousands of Rhode Island patients are probably far more common and dangerous than proponents of electronic medical records believe, says Drexel University's Scot Silverstein.

Flawed software at Lifespan hospital group printed orders for low-dose, short-acting pills when patients should have been taking stronger, time-release ones, the Providence-based system disclosed in 2011. Lifespan says nobody was harmed.

But Silverstein, a physician and adjunct professor of health-care informatics who is making a name for himself as a strident critic of electronic health records, says the Lifespan breakdown is part of a much larger problem.

"We're in the midst of a mania right now" as traditional patient charts are switched to computers, he said in an interview in his Lansdale home. "We know it causes harm, and we don't even know the level of magnitude. That statement alone should be the basis for the greatest of caution and slowing down."

In an ethical world, it would be.  Medicine, though, is in the throes of a loss of ethics, as many stories at this site and elsewhere attest.

Use of electronic medical records is speeding up, thanks to $10 billion and counting in bounties the federal government is paying to caregivers who adopt them. The consensus among government officials and researchers is that computers will cut mistakes and promote efficiency. So 4,000 hospitals have installed or are installing digital records, the Department of Health and Human Services said last month. Seventy percent of doctors surveyed in September by research firm CapSite said they had switched to digital data.

But the notion that electronic charts prevent more mistakes than they cause just isn't proven, Silverstein says. Government doesn't require caregivers to report problems, he points out, so many computer-induced mistakes may never surface.

The recent ECRI "deep dive" study of "EHR events" (link) is just the latest to raise red flags on that point.

Even Dr. David Blumenthal, former National Coordinator for Health Information Technology (head of ONC) at HHS, seems to have changed his message about the reporting of medical problems.

From a Feb. 16, 2013 New York Times article "Doctors Who Don’t Speak Out":

....  TRADITIONALLY, doctors have brought problems to the attention of colleagues by conducting research and publishing their findings in a medical journal. The advantage of that system helps ensure the credibility of study data and protects a researcher from random attack, said Dr. David Blumenthal, the president of the Commonwealth Fund, a group that studies health policy issues.

But getting a study published can take a year or two; some Johnson & Johnson consultants did publish studies about the hip’s flaws, but they largely appeared after it had been recalled.

Dr. Blumenthal said there was probably a need for more immediate ways for doctors to share their concerns, like forums supported by professional medical organizations.

Back to the Inquirer article:

He doesn't discount the potential of digital records to eliminate duplicate scans and alert doctors to drug interactions and unsuspected dangers.

But the rush to implementation has produced badly designed products that may be more likely to confound doctors than enlighten them, he says. Electronic health records, Silverstein believes, should be rigorously tested under government supervision before being used in life-and-death situations, much like medical hardware or airplanes.

In fact, arguments otherwise are specious.  "Harm to innovation" is the one I've heard most often.  Yet those proffering such claims cannot point out what aspect of regulation - adherence to GMPs (good manufacturing practices), pre-market safety and quality testing, post-market surveillance, etc. - will "harm innovation."  Indeed, they opine as if innovation in medicine without objective safety and quality checks is a virtue, rather than a potential vice.

Silverstein "is an essential critic of the field," said physician George Lundberg, editor at large for MedPage Today and former editor of the Journal of the American Medical Association. "It's too easy for those of us in medicine to get excessively enthusiastic about things that look like they're going to work out really well. Sometimes we go too far and don't see the downside of things."

(Dr. Lundberg mentioned me in Nov. 2011 at MedPage Today in a piece entitled "Health IT: Garbage In, Garbage Out," here.)

The patients - including ourselves and our own family members, I might add - are the ones who pay the price of our hyper-enthusiasm.

... The FDA's Jeffrey Shuren, a neurologist, has said such cases "likely reflect a small percentage of the actual events that do occur."

"Tip of the iceberg" were his exact words (link).

... At conferences and working from home on the "Health Care Renewal" blog, Silverstein chronicles digital failures and criticizes hospitals in the same dogged way he applied himself to building the 1970s-era Heathkit computers [and amateur radio equipment - ed.] he still keeps in his home, say people who know him.

"His message has been consistent": Health IT "provides far less benefit than is claimed by its proponents and opens new, sometimes potent, routes to failure," said Richard I. Cook, a medical error expert at the University of Chicago who sat on a panel examining electronic record safety at the authoritative Institute of Medicine. "No one wants him to be visible. But his message and tone have not wavered."

Dr. Cook wrote the lone dissent to milquetoast IOM recommendations on health IT risk - the magnitude of which IOM itself admitted is unknown - in their 2012 report "Health IT and Patient Safety: Building Safer Systems for Better Care" available here.  See Appendix E.

The last scientific conference at which I chronicled these failures was at the Health Informatics Society of Australia's HIC2012, as a keynote speaker on health IT trust (link).  I would have been at HIC2011, to which I had been originally invited, but was helping care for a relative injured by bad health IT at that time.  Her death freed me to travel Down Under in 2012.

I have presented at a number of plaintiffs' attorney meetings since then, however, such as the American Association for Justice Winter Convention (AAJ, formerly the Association of Trial Lawyers of America) just last week (link).  Trial lawyers don't seem to mind a very direct approach to the issues, unlike many so-called scientists who, as author Michael Crichton once warned (link), seem to believe in "consensus" rather than science.

The HIMSS Electronic Health Record Association, an industry group, declined to comment on Silverstein. A spokesman for the Office of the National Coordinator for Health Information Technology, the administration's proponent of digital records, said: "It's important to listen to all the voices" in the discussion of the subject. 

Some voices, unfortunately, are louder than others and backed by lobbyists and big money.  See, for instance, the May 2009 Washington Post article by Robert O'Harrow Jr. "The Machinery Behind Health-Care Reform: How an Industry Lobby Scored a Swift, Unexpected Victory by Channeling Billions to Electronic Records" (link to the article and my essay about it is here).

I do point out, however, for the benefit of those at HIMSS and ONC, that knowing of risks while doing nothing substantive beyond "listening to all voices in the discussion" can be seen as gross negligence.

Trained as an internist and in medical information technology [Medical Informatics- ed.] as a Yale University postdoc, Silverstein, 55, served as Merck & Co.'s director of scientific information in the early 2000s and then as a full-time Drexel professor, shifting in recent years to part-time teaching and working on medical liability cases for plaintiff attorneys. His insistent warnings about digital health risks over more than a decade have effectively barred him from a lucrative career at a hospital or software vendor.

Perverse as that reality may be, the causality also runs in reverse:  I would not want to work for a hospital or software vendor in 2013, where effecting change to protect patients from bad health IT is hard if not impossible, e.g., as at link, link, and link, and in the other case examples at that site.  I find it a far more effective use of my time to help enact change from the outside - while avoiding the pathological individuals who make such scenarios possible.

"I'm sure Scot would be better off by keeping his mouth shut and getting a job with a hospital that's just put in a big . . . system," said Matthew Holt, a Silverstein critic and cochairman of Health 2.0, which organizes health technology conferences.

I note that the raison d'être for this blog is the impact of many people doing just that sort of thing - "keeping their mouth shut" and making money, no matter what the ethical implications.  (The patients killed by such behaviors are, unfortunately, unavailable for comment.)

Many say he comes on too strong. Even admirers cringed when he began blogging about the 2011 death of his mother, which he blames in a lawsuit on a computer error that allegedly caused Abington Memorial Hospital to overlook a key medication. (Both he and the hospital said they couldn't comment on a pending suit.) Personalizing his campaign, some thought, made him seem less objective.

I'm at a loss here.  "Too strong" - on matters of life and death that affect everyone?  A proficient writer, who had been writing about health IT problems since 1998, should have kept silent about a firsthand story of HIT harm in 2010 of potentially great relevance to public safety?  Writing about the incident "lessens objectivity"?  That makes little sense, and is perverse.  Those who opined as such have their priorities in serious disarray.

Such personalized accounts are common and have caused great change.  Libby Zion's death due to hospital neglect, and her father's making the issues quite public, comes to mind, as do the many laws enacted that are named after people who've gone public after personal tragedy:

... Grieving the loss of their child, Zion's parents became convinced that their daughter's death was due to inadequate staffing at the teaching hospital.  Sidney Zion questioned the staff's competence for two reasons. The first was the administration of meperidine, known to cause fatal interactions with phenelzine, the antidepressant that Libby Zion was taking. The second issue was the use of restraints and emergency psychiatric medication. Sidney's aggrieved words were: "They gave her a drug that was destined to kill her, then ignored her except to tie her down like a dog." To the distress of the doctors, Zion began to refer to his daughter's death as a "murder." Sidney also questioned the long hours that residents worked at the time. In a New York Times op-ed piece he wrote: "You don't need kindergarten to know that a resident working a 36-hour shift is in no condition to make any kind of judgment call—forget about life-and-death." The case eventually became a protracted high-profile legal battle, with multiple abrupt reversals; case reports about it appeared in major medical journals.

An alternate explanation is that, lacking other credible means, this is an ad hominem reaction (of those "some") seeking an angle to attack the message ... and the messenger ... or is simply a reaction of, to put it bluntly, castrati who are more at home in a country club than in the world of ideas.

For a bona fide example of "coming on too strong", there's this:

In the NEJM article "The 'Meaningful Use' Regulation for Electronic Health Records" by David Blumenthal, M.D., M.P.P. (National Coordinator, ONC) and Marilyn Tavenner, R.N., M.H.A. (10.1056/NEJMp1006114, July 13, 2010), available at this link, the opening statement is (emphases mine):

The widespread use of electronic health records (EHRs) in the United States is inevitable. EHRs will improve caregivers’ decisions and patients’ outcomes. Once patients experience the benefits of this technology, they will demand nothing less from their providers. Hundreds of thousands of physicians have already seen these benefits in their clinical practice.

I think it fair to say those are grandiose statements and predictions presented with a tone of utmost certainty in one of the world's most respected scientific medical journals. 


Even though it is a "Perspectives" article, I long ago learned that in writing for esteemed scientific journals of worldwide impact, statements of certainty are at best avoided, or if made, should be exceptionally well referenced.

I note the lack of footnotes showing the source(s) of these statements.

Another bona fide example of "coming on too strong":

“We have the capacity to transform health with one thunderous click of a mouse after another,” said (former) HHS Secretary Michael Leavitt - 2005 HIMSS Summit 


We shall transform health (into what, exactly, is not specified) one thunderous mouse click after another!

It doesn't get any stronger than that, unless, perhaps, the thunderous wrath of God is invoked.

Back to the Inquirer once again:

"His refusal to temper his message makes it sometimes difficult to hear," said Ross Koppel, a University of Pennsylvania sociologist and digital health record skeptic.

As per a recent article by Joe Conn in Modern Healthcare entitled "Health IT Iconoclasts" (link), which profiled Dr. Koppel, Deborah Peel, Lawrence Weed and me, Dr. Koppel has firsthand experience with his message being found "difficult to hear" by the hyper-enthusiasts:

... Researcher Ross Koppel started an uproar in 2005 when he and a colleague coauthored an article in the Journal of the American Medical Association that found a first-generation computerized physician order entry system (CPOE) at the Hospital of the University of Pennsylvania was simultaneously creating new errors even as it reduced others.

Koppel’s bombshell—he’s now an adjunct professor of sociology at the University of Pennsylvania— brought down the wrath of information technology boosters. The Healthcare Information and Management Systems Society, a health IT trade group, challenged the study’s “methodology and its subsequent outcomes,” and criticized its authors for their “limited view” and not “looking at the big picture.”  [Others wrote that his work was 'disingenuous', although it had similar findings to my own observation of the very same CPOE system at Yale-New Haven Hospital ... in 1992 - ed.]

... In 2009, he revealed in another JAMA article that health IT vendors’ contracts included “hold harmless” clauses that shielded software developers from legal liability for medical errors their systems caused, even if the developers had been warned about the defects. “That got me major upheaval,” the worst of his career, Koppel recalls.

Koppel, a sociologist, has probably done more for health IT transparency and safety than the physicians of the entire academic medical and medical informatics community combined.

... But Silverstein says his position today is the same as it has always been. He believes in the potential power of electronic records for good, he says. But any doctor who feels bound by the Hippocratic oath's injunction to "first, do no harm," he adds, should balk at what's going on.

"Patients are being harmed and killed as a result of disruptions to care caused by bad health IT," he said. "I'm skeptical of the manner and pace" of implementation, "not of the technology itself. . . . My only bias is against bad medicine. And my bias is against people with complacent attitudes about bad medicine."

The issues are actually relatively simple, using terminology coined by Dr. Jon Patrick of U. Sydney at the aforementioned HIC2012 meeting in Australia during our discussions.  Bad health IT must be removed from the market, and good health IT must replace it.

Good Health IT ("GHIT") is defined as IT that provides a good user experience, enhances cognitive function, puts essential information as effortlessly as possible into the physician’s hands, keeps eHealth information secure, protects patient privacy and facilitates better practice of medicine and better outcomes.

Bad Health IT ("BHIT") is defined as IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation.
It is this author's opinion that this change will not happen by "going along to get along" or "listening to all the voices in the discussion of the subject" (especially those with Big Money and Big Lobbyists behind them) while doing nothing.

I also note that "complacent attitudes about bad medicine" are not the sole province of IT personnel or healthcare management.  Physicians and nurses who acquiesce to bad health IT are part of the problem.

-- SS

Saturday, February 2, 2013

Modern Healthcare: "Health IT Iconoclasts"

The healthcare executive-targeted journal "Modern Healthcare" has named Dr. Ross Koppel, Dr. Deborah Peel, Dr. Larry Weed, and yours truly as "healthcare IT iconoclasts."  Dr. William Bria corroborated the importance of iconoclasm in this domain.

(I would rather see the term "defender of patients' rights" than "iconoclast," but I'll settle for the latter if it gets the message out.)

Author Joe Conn writes:

... health IT has long had its critics, even among its pioneers and proponents, as these four prominent health IT iconoclasts will attest. All four consider themselves to be proponents of health IT, but they rail against a tide of health IT boosterism. Their targets: misplaced priorities, failing to promote EHR usability and interoperability, inadequate concern for patient safety and privacy, overemphasizing EHR adoption, understating IT costs and overestimating the return on public IT investments.

He then profiles the four.

On Univ. of Pennsylvania professor and industry punching-bag Dr. Ross Koppel:

Researcher Ross Koppel started an uproar in 2005 when he and a colleague coauthored an article in the Journal of the American Medical Association that found a first-generation computerized physician order entry system (CPOE) at the Hospital of the University of Pennsylvania was simultaneously creating new errors even as it reduced others.

Koppel’s bombshell—he’s now an adjunct professor of sociology at the University of Pennsylvania— brought down the wrath of information technology boosters. The Healthcare Information and Management Systems Society, a health IT trade group, challenged the study’s “methodology and its subsequent outcomes,” and criticized its authors for their “limited view” and not “looking at the big picture.”

... In 2009, he revealed in another JAMA article that health IT vendors’ contracts included “hold harmless” clauses that shielded software developers from legal liability for medical errors their systems caused, even if the developers had been warned about the defects. “That got me major upheaval,” the worst of his career, Koppel recalls.

On privacy advocate, psychiatrist Dr. Deborah Peel (no relation to Emma Peel, although there are similarities in the "has guts" department):

“Let’s face it,” Peel says, “HHS is the agency that eliminated patient control over electronic medical records and has remained hostile to patients’ rights ever since.”

Days before the 2002 revision [HHS redraft of the privacy rule of the HIPAA Health Insurance Portability and Accountability Act] went into effect, a group of patients calling themselves Citizens for Health, and more than dozen other plaintiffs, including Peel, sued HHS Secretary Tommy Thompson in federal court, alleging the revisions violated patients’ constitutional rights to privacy. They lost at both the trial and appeals-court levels and were denied a hearing on appeal to the U.S. Supreme Court in 2006.

Peel launched the not-for-profit Patient Privacy Rights Foundation in 2003.

... “Where I’m coming from is, I’ve spent all this time in a profession with people being hurt,” Peel says. “Starting in the 1970s, when I first let out my shingle, people came to me and said, if I paid you in cash, would you keep my records private. Now, we’ve got a situation where you don’t even know where all your records are. We don’t have a chain of custody for our data, or have a data map” to track its location.

On SOAP-note and Medical Informatics pioneer Dr. Larry Weed:

“It’s a rotten system,” declares Dr. Lawrence Weed, who at age 89 is the dean of healthcare information technology iconoclasts.

Weed isn’t disparaging any particular brand of electronic health record system. A dismissive “they’re inadequate” would fairly well cover a Weed-guided tour of today’s EHR systems.

“People don’t get the general picture,” he says. “It’s broken. It’s basically an unsound system.” By that he means the entire healthcare system, but not because its providers are using faulty information technology, but because they’re using IT the wrong way, at least in part.

... In 1984, to help physicians cope [with the knowledge explosion], Weed developed a computer-based, diagnostic support system he called the problem-knowledge coupler. The software company he founded, but is no longer with, PKC, now part of Sharecare, still sells the system. Weed still proselytizes with fervor, calling for the use of computers to store, retrieve and apply medical knowledge.

On me:

The title of Dr. Scot Silverstein’s teaching website at Drexel University, “Contemporary Issues in Medical Informatics: Good Health IT, Bad Health IT, and Common Examples of Healthcare IT Difficulties,” [link] summarizes the veteran physician informaticist’s general outlook on the current state of affairs in health information technology.

It tells you nothing, however, of the passion with which Silverstein speaks or writes about the subject. Also a frequent contributor to the popular reformist “Healthcare Renewal” blog, Silverstein writes with the fire you might expect coming from a self-described computer geek who says he has witnessed a faulty electronic health-record system mysteriously drop a single medication from a patient’s medication list. That missing drug led to a medical error that resulted in a year of suffering and, eventually, that patient’s death, he says.

Silverstein’s passion is even more understandable when he tells you that patient was the doctor’s own mother.

... The health IT world, Silverstein says, parts neatly between “good IT” and “bad IT.” There are those who push hard for the good and complain about the bad, physicians and other clinicians he calls “pragmatic,” and for whom he has sympathy and respect. And then there are those who stay silent, ignoring or acquiescing to the bad, the “hyper-enthusiasts” for whom he holds only unmitigated scorn. “The doctors who don’t speak up about health IT, who work around it, which can cause its own bad results, those are traitors to the oath they took to first do no harm,” he says.

“Physicians are still being accused of being Luddites for not adopting this stuff,” Silverstein says. “Physicians are not Luddites. When it’s good IT, it’s used. I see the tension now between hyper-enthusiasts, who turn a blind eye to the negatives, and pragmatic physicians and nurses who have work to do.”

Finally, Dr. William Bria, longtime president of the Association of Medical Directors of Information Systems (AMDIS) sums up the article this way:

... When it comes to the criticism, “the one thing we can’t do with this information is to ignore it."

“Many, many technologies have come and gone in the history of medicine over the centuries, and it often has been a maverick physician that has called a timeout on ineffective medications or treatments.

“There is little question that, going forward, medicine will be using information tools,” Bria says. “However, I believe it’s becoming also very apparent that we need a modulation and a proper regulation of information technologies used in day-to-day care.”

Amen to that.

The article is available here.

-- SS

Wednesday, January 30, 2013

AMIA: Enhancing patient safety and quality of care by improving the usability of EHR systems, but ... no sympathy for victims of bad health IT?

A panel of experts from the American Medical Informatics Association have written a paper "Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA."

The paper is publicly available at this link in PDF.

The authors are Blackford Middleton (Harvard Medical School), Meryl Bloomrosen (AMIA), Mark A Dente (GE Healthcare IT), Bill Hashmat (CureMD Corp.), Ross Koppel (Dept. of Sociology, Univ. of Pennsylvania), J Marc Overhage (Siemens Health Services), Thomas H Payne (U. Washington IT Services), S Trent Rosenbloom (Vanderbilt Informatics), Charlotte Weaver (Gentiva Health Services) and Jiajie Zhang (University of Texas Health Science Center at Houston).

The paper states what has been obvious to this author - and many others - for many years:

ABSTRACT:  In response to mounting evidence that use of electronic medical record systems may cause unintended consequences, and even patient harm, the AMIA Board of Directors convened a Task Force on Usability to examine evidence from the literature and make recommendations. This task force was composed of representatives from both academic settings and vendors of electronic health record (EHR) systems. After a careful review of the literature and of vendor experiences with EHR design and implementation, the task force developed 10 recommendations in four areas: (1) human factors health information technology (IT) research, (2) health IT policy, (3) industry recommendations, and (4) recommendations for the clinician end-user of EHR software. These AMIA recommendations are intended to stimulate informed debate, provide a plan to increase understanding of the impact of usability on the effective use of health IT, and lead to safer and higher quality care with the adoption of useful and usable EHR systems.

The paper is a respectable start at acknowledgement of the issues ... albeit years late.

That said:

I noted some typical language in the article characteristic of the reluctance of the health IT industry and its friends to directly confront these issues.  I wrote a letter to the authors that, as I indicate below, not unexpectedly went unanswered except by one individual -- not even a physician -- who has gone out on a limb professionally in the interest of patients' rights, and as a health IT "iconoclast" (i.e., patient advocate) suffered for doing so (link).  The lack of a response to the letter is itself representative, in my opinion, of a pathology that grants more rights to the healthcare computer and its makers than to patients.   More on this below.

First, I note I am rarely if ever cited by the academics.  They are not prohibited from doing so.  I've probably been writing on these issues -- poorly done health IT, improper leadership, the turmoil created, etc. -- publicly for longer than anyone else in the domain.

I also note that the paper takes the form of an analytical debate.  Analytical debates are relatively ineffective in this domain; they are like popcorn thrust against a battleship.  The paper, appearing as it does in a relatively obscure specialty journal (Journal of the American Medical Informatics Association), will probably get more exposure from this blog post than from the entire readership of that journal.  The authors need to be relating these issues in forums widely read by citizens and government, not in dusty academic journals - that is, assuming they want the messages to diffuse widely.

In my review of the article, I note the following:

... In an Agency for Healthcare Research and Quality (AHRQ) workshop on usability in health IT in July 2010, all eight participating vendors agreed that usability was important and many suggested it was a competitive differentiator, although some considered that usability was in the eye of the beholder and that the discipline of usability evaluation was an imperfect science, with results that were not useful.

A paper like this should have clearly repudiated antiquated viewpoints like that, not merely made note of them.   Not taking a stand is a sign of weakness...or sympathy.

As a matter of fact, if leaders such as these had paid attention to the 'iconoclasts' and their 'anecdotes', my own mother might not have gone through horrible suffering and death, with me as sad witness, as I related to them in my letter below.

... End-users of EHR systems are ultimately accountable for their safe and effective use, like any tool in clinical care.

I see a linguistic sleight of hand in the use of the word "tool" to describe HIT, trying to blend or homogenize this apparatus with other "tools" in clinical care.  The HIT "tool" is unlike any other, since no transaction of care can occur without going through this device; as such, all care is totally dependent on it.  Further, unlike pharmaceuticals and medical devices, this "tool" is unvetted and unregulated, yet its use is forced upon many users.

... [AMIA] subcommittees reviewed the literature on usability in health IT, current related activities underway at various US Federal agencies, lessons learned regarding usability and human factors in other industries, and current federally funded research activities.


Did they speak with the source of the most candid information?  The plaintiffs' and defendants' bars?

Need I even ask that question?

... Recent reports describe the safe and effective use of EHR as a property resulting from the careful integration of multiple factors in a broad sociotechnical framework

This is not merely 'recent' news.  The field of Social Informatics (link), which has studied IT in its social contexts for decades now, has offered observations on the importance of considering multiple factors in a broad sociotechnical framework.   The authors all know this - or should know this, or should have made it their business to know this.  The statement sounds somewhat protective of the HIT and hospital industries regarding their longstanding negligence towards those issues.

... User error may result in untoward outcomes and unintended negative consequences. These may also occur as a result of poor usability, and may also be an emergent property only demonstrated after system implementation or widespread use.

I note the use of the term "user error" and the absence of the term "use error" with significant disdain.  As I wrote here regarding the views of a HIT industry executive holding the mystical "American Medical Informatics Certification for Health Information Technology", NIST itself now defines "use error" (as opposed to "user error") as follows:

“Use error” is a term used very specifically by NIST to refer to user interface designs that will engender users to make errors of commission or omission. It is true that users do make errors, but many errors are due not to user error per se but due to designs that are flawed, e.g., poorly written messaging, misuse of color-coding conventions, omission of information, etc. From "NISTIR 7804: Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records." It is available at http://www.nist.gov/healthcare/usability/upload/Draft_EUP_09_28_11.pdf (PDF).

In the article, indefinites were exchanged with what should have been stronger, declarative statements, and vice versa:

User error ... may also represent a potential health IT-related error yet to happen.

I most decidedly wish they'd stop this "may" verbiage in policy papers like this.

... Anecdotal reports suggest that these application differences [where clinicians use more than one commercial EHR system] result in an increased training burden for EHR users.

"Anecdotal"?  How about "obvious to a third grader?" 

"Anecdotal" in academic papers often is a term of derision for inconvenient truths such as reports of health IT problems.  Its use often reflects a need for authors using the term (per a senior clinician from Victoria, Australia on the 'anecdotes' issue, link) "to attend revision courses in research methodology and risk management."

... Some suggest that the expected gains sought with the adoption of EHR are not yet realized.

"Some"?  How about "credible experts?"  "Suggest?"  They merely hint at it?  How about "opine?"
 
... The design of software applications requires both technical expertise and the ability to completely understand the user’s goal, the user’s workflow, and the socio-technical context of the intent

In the meantime, AMIA has been promoting national rollout of a technology where, most often, the latter does not apply.

To ... transform our healthcare delivery system ... clinicians need to use usable, efficient health IT that enhances patient safety and the quality of care.

This is the typical hyperenthusiast mantra.  Where's the proof?  And, transform into what, exactly?  Vague rhetoric like this in allegedly scientific papers is most unwelcome.

Some experts suggest that improving the usability of EHR may be critical to the continued successful diffusion of the technology.

More weak talk.  Why not come right out and say "Credible experts opine that ...."?

... While some EHR vendors have adopted user-centered design when developing health information technologies, the practice is not universal and may be difficult to apply to legacy systems.

From the patient advocacy perspective, that's their problem - it's a risk of being in this business.  Patients should not be used as experimental subjects while IT sellers figure out what other industry sectors have long mastered.  Further, sellers should be held accountable for failures that result in harm - another risk of doing business in this sector, one clinicians have long learned to live with.

... Some believe it is difficult or impossible to reliably compare one product with another on the basis of usability given the challenges in assessment of products as implemented.

Nothing is "impossible" and again, if it's "difficult", that's the industry's problem.  There is risk of being in the business of medicine or medical facilitation; nobody promised a rose garden, and a rose garden should not be expected.

... Many effects of health IT can be considered to be ‘emergent’ or only discovered after monitoring a system in use

One might ask: where have the industry and AMIA been regarding postmarket surveillance (common in other health sectors) for the past several decades?

... AMIA believes it is now critical to coordinate and accelerate the numerous efforts underway focusing on the issue of EHR usability.

Only "now?"

... Establish an adverse event reporting system for health IT and voluntary health IT event reporting

No, no, no ...voluntary reporting doesn't work.  Even mandatory reporting is flawed, but it's better than voluntary.

I am invariably disappointed by recommendations like this.  I've observed repeatedly, for example, that "voluntary reporting" of EHR problems already exists - in the form of the FDA MAUDE database - and most HIT sellers' reports are absent.  See my posts on MAUDE here, here and here.  (Also, the only seller that seems to report may have ulterior motives, i.e., restraint of trade.)

... A voluntary reporting process could leverage the AHRQ patient safety organizations (PSO) ... This work should be sponsored by the AHRQ.

These folks clearly don't want any teeth in this.  AHRQ is a research-oriented government branch, not a regulator, nor does it have regulatory expertise.

AMIA recommends:

Research and promote best practices for safe implementation of EHR

In 2013 this is valuable information in the same sense that advice to use sterile technique during neurosurgery is valuable.

"Promoting best practices" has been done for decades.  Not mentioned is avoiding worst practices.  I've long written that these are not the same thing: the toleration of inappropriate leadership by health IT amateurs (a term I use in the same sense that I am a Radio Amateur, not a telecommunications professional), politics, empire-building and other dysfunction in health IT endeavors negates laundry lists of "best practices."

What is required is to research and abolish worst practices, including the culture and dynamics of the 'health IT-industrial complex.'  I made this point in my very first website in 1998.  It appears the authors don't get it and/or won't admit to the dysfunction that goes on in health IT projects.
 
... The adoption of useful and usable EHR will lead to safer and higher quality care, and a better return on investment for institutions that adopt them.

"Will?"  With respect to my observation above about the paper's prominent misuse of indefinites versus stronger declarative terms, the word "may" would have been the appropriate term here.  As I wrote about similar statements from ONC in the NEJM in my 2010 post "Science or Politics? The New England Journal and The 'Meaningful Use' Regulation for Electronic Health Records", I'm quite disappointed to see speculation and PR presented as fact by alleged scientists and scientific organizations.

Finally, I wrote the following email letter to the authors, to which (except for Ross Koppel) I received no reply.  While Dr. Koppel (a PhD) graciously expressed sympathy for me and my mother, the others (many MDs) were silent.

Perhaps the silence is the best indicator of their concern for the rights of computers and HIT merchants relative to the rights of people:

Mon, Jan 28, 2013 at 1:12 PM
Dear authors,

I've reviewed the new paper "Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA" and wanted to express thanks for it.

It's a good start.  Late, but a good start at returning the health IT domain to credibility and evidence-based practice.

It's too bad it didn't come out years earlier.  Perhaps my mother would not have gone through a year of horrible suffering and death, with me as sad witness, due to the toxic effects of bad health IT. 

Perhaps you should hear how horrible it was to hear my mother in an extended agitated delirium; to hear her cry hysterically later on when the rehab people told her that her weight was 95 pounds; to have to make her a "no code" and put her on hospice-care protocols, and then to have watched her aspirate a sleeping pill when she was agitated, and die several days later of aspiration pneumonia and sepsis ... in the living room of my home ... and then watch the Hearse take her away from my driveway...as a result of bad health IT.

I will be writing more thoughts on your article at the Healthcare Renewal blog, of course, but wanted to raise three issues:

1.  The use of "may" and "will" is reversed, and the term "anecdote" is conflated with risk management-relevant case reports.


  • They may also represent a potential health IT-related error yet to happen.  --->  They likely represent a potential health IT-related error yet to happen
  • Anecdotal reports suggest that these application differences result in an increased training burden for EHR users.  ---> Credible reports indicate...
  • Some suggest that the expected gains sought with the adoption of EHR are not yet realized. ---> Credible experts opine ....
  • Some experts suggest that improving the usability of EHR may be critical to the continued successful diffusion of the technology. --->  "Credible experts opine that ..."
  • The adoption of useful and usable EHR will lead to safer and higher quality care, and a better return on investment for institutions that adopt them. ---> The adoption of useful and usable EHR may lead to safer and higher quality care

You really need to show more clarity ... and guts ... in papers like this, and drop the faux academic weasel words.

2.  You neglected to speak to the best source for information on EHR-related harms, evidence spoliation, etc... med mal attorneys.

3.  You also neglected to speak to, or cite, the writings of a Medical Informaticist on bad health IT now going back 15 years - and whose mother was injured and died as a result of the issues you write about - me.  In fact I am rarely cited or mentioned by anyone with industry interests.

An apparent contempt for 'whistleblowers' such as myself makes me wonder ... what kind of people are the leaders of health IT, exactly? 

Do they value computers' rights over patients'?


It is not at all clear to me which has been the primary motivator of many of the health IT leaders.

I think the rights which I value are quite clear.

Sincerely,

Scot Silverstein

I neglected to mention the horror of seeing my mother put in a Body Bag before being taken to the Hearse in my driveway.

-- SS

Saturday, January 5, 2013

ONC and "Health IT Patient Safety Action & Surveillance Plan": When Sociologists Uphold the Hippocratic Oath While Physicians Pay Respect to the Lords of Kobol, We Are in a Dark Place, Ethically

[Note: this essay contains many hyperlinks. They can be right-clicked and opened in a separate tab or window.]

I've been meaning to write more on the just-before-Christmas, Friday afternoon, minimal-visibility release of the ONC report I'd written about in my Dec. 23, 2012 post "ONC's Christmas Confessional on Health IT Safety: HIT Patient Safety Action & Surveillance Plan for Public Comment."   (The ONC report itself is available at this link in PDF.)

The Boston Globe and Globe staff writer Chelsea Conaboy, however, have beaten me to the punch in the Jan. 3, 2013 article "Federal government releases patient safety plan for electronic health records", link below.

('Lords of Kobol', of course, is a pun.  They were fictional gods in a sci-fi series from the 1970's and a remake a few years ago, but in my circles the term is used satirically and derisively to reflect people expressing inappropriate overconfidence in - and perhaps worship of - computers.   Cobol, the COmmon Business-Oriented Language, is one of the oldest programming languages and was the major programming language of the merchant computing sector, including business, finance, and administrative systems for companies and governments.)

First, I do want to reiterate what I'd mentioned in my earlier post:  the new ONC report is a sign of progress, in terms of a government body explicitly recognizing the social responsibilities incurred by conducting the mass human subjects experiment of national health IT.  However, I also wrote:

... [The ONC report] is still a bit weak in acknowledging the likely magnitude of under-reporting of medical errors, including HIT-related, in the available data, and the issue of risk vs. 'confirmed body counts' as I wrote at my recent post "A Significant Additional Observation on the PA Patient Safety Authority Report -- Risk".

The Globe quoted a number of people involved in the health IT debate, and I am now commenting on their Jan. 3 article:

Federal government releases patient safety plan for electronic health records
Boston Globe
01/03/2013 11:16 AM   

By Chelsea Conaboy, Globe Staff

The federal office in charge of a massive rollout of electronic health records has issued a plan aimed at making those systems safer by encouraging providers to report problems to patient safety organizations.

Though some in the field say it doesn’t go far enough, others said the plan is an important step for an office whose primary role has been cheerleader for a technology that has the potential to dramatically improve health care in the United States but that may come with significant risks.

A major issue at the heart of the controversy is the fact that, admittedly, nobody knows the magnitude of the risks - in large part due to systematic impediments to knowing.  This has been admitted by organizations including the Joint Commission (link), U.S. FDA (link; albeit in an "internal memo" never intended for public view, and discovered only through the hard work of Center for Public Integrity investigative reporter Fred Schulte when he was at the Huffington Post Investigative Fund), Institute of Medicine of the U.S. National Academies (link, quoted at midsection of post), and others. 

I have made the claim that when you don't know the level of harm of an intervention in healthcare, and there are risk management-relevant case reports of dangers, you don't go gung-ho and start a national-scale implementation with penalties for non-adopters, and then decide to study safety, quality, usability etc.  You determine safety first in more controllable and constrained environments.  Anything else is, as I wrote, putting the cart before the horse (link).


Things are a bit out of order here.


You also certainly don't dismiss risk management-relevant case reports from credible observers as "anecdotal", the common refrain of hyperenthusiasts and (incompetent) scientists who conflate scientific research with risk management - as a researcher from Down Under eloquently observed in the Aug. 2011 guest post "From a Senior Clinician Down Under: Anecdotes and Medicine, We are Actually Talking About Two Different Things."

 Back to the Globe:

A year ago, the Institute of Medicine issued a report urging the federal government to do more to ensure the safety of electronic health records. It highlighted instances in which the systems were linked to patient injury, deaths, or other unsafe conditions.

The report suggested creating an independent body to investigate problems with electronic records and to recommend fixes, similar to how the National Transportation Safety Board investigates aviation accidents.

Instead, the Office of the National Coordinator for Health Information Technology delegated various monitoring and data collection duties to existing federal offices, including the Agency for Healthcare Research and Quality [AHRQ].

The problem is that AHRQ is a research agency (as its name suggests), has no regulatory authority nor any experience in regulation, and most clinicians have never heard of it.  In effect, this ONC recommendation is lacking teeth, even compared to the relatively milquetoast recommendations of IOM itself (as I wrote about in a Nov. 2011 post "IOM Report - 'Health IT and Patient Safety: Building Safer Systems for Better Care' - Nix the FDA; Create a New Toothless Agency").


The [ONC] office has asked patient safety organizations, which work with doctors and hospitals to monitor and analyze medical errors, to add health IT to their agendas. Data from the organizations would be aggregated by the agency, but reporting by doctors and hospitals is completely voluntary.  [A prime example of what I term an extraordinary regulatory accommodation afforded the health IT industry - ed.]

Now we're into septic shock blood pressure-level weakness. Here is PA Patient Safety Authority Board Member Cliff Rieders, Esq. on mandatory, let alone voluntary reporting. From “Hospitals Are Not Reporting Errors as Required by Law", Philadelphia Inquirer, pg. 4, http://articles.philly.com/2008-09-12/news/24991423_1_report-medical-mistakes-new-jersey-hospital-association-medication-safety:
  


... Hospitals don’t report serious events if patients have been warned of the possibility of them in consent forms, said Clifford Rieders, a trial lawyer and member of the Patient Safety Authority’s board.

He said he thought one reason many hospitals don’t want to report serious events is that the law also requires that patients be informed in writing within a week of such problems. So, if a hospital doesn’t report a problem, it doesn’t have to send the patient that letter. [Thus reducing risk of litigation, and, incidentally, potentially infringing on patients' rights to legal recourse - ed.]


Rieders says the agency has allowed hospitals to determine for themselves what constitutes a serious event and the agency has failed to come up with a solid definition in six years.

Fixing this “is not a priority,” he added.

To expect hospitals to voluntarily report even a relevant fraction of mistakes and near-misses out of pure altruism, or permit their clinicians to do so, with the inherent risks to organizational interests such reporting entails, is risible.

The near-lack of reporting by most health IT sellers and hospitals in the already-existing FDA Manufacturer and User Facility Device Experience (MAUDE) database is substantial confirmation of that; the reports that do appear in MAUDE, however, are hair-raising.  See my Jan. 2011 post "MAUDE and HIT Risks: What in God's Name is Going on Here?" for more on that issue.

Here's an example of what happens to 'whistleblowers', even those responsible for system development and safety: "A Lawsuit Over Healthcare IT Whistleblowing and Wrongful Discharge."

ONC's recommendations thus in my opinion reflect bureaucratic window dressing, designed to create progress - but progress that can probably be measured in microns.

“There was no evidence that a mandatory program was necessary,” Jodi Daniel, the [ONC] office’s director of policy and planning, said in an interview.

Really?  See the aforementioned Philadelphia Inquirer article "Hospitals Are Not Reporting Errors as Required by Law", as well as numerous articles on pharma and medical device industry reporting deficits, such as starting at page 5 in my paper "A Medical Informatics Grand Challenge: the EMR and Post-Marketing Drug Surveillance" at this link in PDF.

There is no evidence mandatory reporting is necessary ... to someone who's either naïve, incompetent - or persuaded, e.g. with money, to not find evidence or rationale.

The [ONC] office has been under pressure to roll out the electronic health records systems quickly while protecting patient data and making sure that the systems don’t cause problems in medical care, said Dr. John Halamka, chief information officer at Beth Israel Deaconess Medical Center. 

Under pressure by the health IT lobby, perhaps; but nobody else that I can think of.

“It’s this challenging chicken-and-egg problem,” he said.

No, actually, it isn't.  Patient safety must come first. This becomes clear when one considers the late 5th century BC ethical principle Primum non nocere ("first, do no harm" or "abstain from doing harm") versus the late 20th and early 21st century IT-hyperenthusiast credo I've expressed as "Cybernetik Über Alles"  ("Computers above all"). Under CÜA, the computer has more rights than the patients, and the IT industry receives extraordinary regulatory accommodation to sloppy practices that no other healthcare or mission-critical non-healthcare sector enjoys.

I sent Dr. Halamka a set of arguments such as I make here, and a picture of a health IT 'chicken', my deceased mother in her death robes.

I received back a "thank you for the views" message - but no condolences.  (It occurs to me that I have rarely if ever received condolences from any senior HIT-hyperenthusiast Medical Informatics academic or government official to whom I've mentioned my mother.  Not to play amateur psychologist, but I believe it reflects the level of disdain or even hatred felt by these people towards health IT iconoclasts/patient's rights advocates.)

The plan, which is subject to public comment through Feb. 4, “is a reasonable start,” in part because it puts more pressure on hospitals and doctors to monitor safety, Halamka said.

As I expressed to Dr. Halamka, we are in agreement on that point.

The government would have risked stifling innovation in the industry if it had opted instead to require the kinds of tests and review by the Food and Drug Administration that new medical devices and drugs must go through, he said.

To that, I mention here (as I did in my email to him) my response to this industry meme, as I had expressed it at Q&A after my August 2012 keynote address to the Health Informatics Society of Australia:

... I had a question from the audience [after my talk], from fellow blogger Matthew Holt of the Health Care Blog.  (I've had some online debate with him before, such as in the comment thread at my April 2012 post here.)

Matthew asked me a somewhat hostile question (perhaps in retaliation for the thrashing he received at the end of my May 2009 post on the WaPo's HIT Lobby article here), that I was well prepared for, expecting a question along these lines from the seller community, actually.  The question was preceded by a bit of a soliloquy of the "You're trying to stop innovation through regulation" type, with a tad of Merck/VIOXX ad hominem thrown in (I ran Merck Research Labs' Biomedical libraries and IT group in 2000-2003).

His question was along the lines of - you were at Merck; VIOXX was bad; health IT allowed discovery of the VIOXX problem by Kaiser several years before anyone else; you're trying to halt IT innovation via demanding regulation of the technology thus harming such capabilities and other innovations.

The audience was visibly unsettled.  Someone even hollered out their disapproval of the question.

My response was along the lines that:

  • VIOXX was certainly not Merck at its best, but regulation didn't stop Merck from "revolutionizing" asthma and osteoporosis via Singulair and Fosamax;
  • That I'm certainly not against innovation; I'm highly pro-innovation;
  • That our definitions of "innovation" in medicine might differ, in that innovation without adherence to medical ethics is not really innovation.  It is exploitation.

I stand by that assessment.

More from the Globe article:

There is little good research into how the systems improve health care and there are big obstacles to fixing even the known problems, said Ross Koppel, a professor of sociology at the University of Pennsylvania who studies hospital culture and medication errors.

Some developers require providers to sign nondisclosure agreements before using their systems, and the safety plan does not prohibit such gag clauses.  [Note: I wrote on this issue here, and in a published July 2009 JAMA letter to the editor "Health Care Information Technology, Hospital Responsibilities, and Joint Commission Standards" here - ed.]  While the plan addresses reporting of known problems, Koppel said it will not help researchers and developers understand problems that go unnoticed but that may be causing real patient harm. 

“We only know the tip of the iceberg” about how electronic health records affect patient care, said Koppel, who was an official reviewer for the Institute of Medicine report.

As per the title of this blog post, we are in a dark place, ethically, when a PhD sociologist who's never taken the Oath of Hippocrates (to my knowledge) appears to express more concern for patient safety and patient's rights than a Harvard physician-informatics Key Opinion Leader such as Dr. Halamka.

Koppel said the mantra of the Office of the National Coordinator has been that more health IT leads to better health care. “It probably is better than paper,” he said, “but it could be so much better than it is.”

I agree, but with caveats.  I opine that bad health IT is likely worse for patients than a good, well-staffed paper-based system.  For instance, the former can cause systematic dangers that even a bad paper system cannot, such as tens of thousands of prescription errors (see my Nov. 2011 post "Lifespan Rhode Island: Yet another health IT 'glitch' affecting thousands - that, of course, caused no patient harm that they know of - yet") or mass privacy breaches (see the current 30 or so posts on that issue at this blog query link: http://hcrenewal.blogspot.com/search/label/medical record privacy).

On good health IT and bad health IT from my teaching site "Contemporary Issues in Medical Informatics: Good Health IT, Bad Health IT, and Common Examples of Healthcare IT Difficulties" at http://www.ischool.drexel.edu/faculty/ssilverstein/cases/:

Good Health IT ("GHIT") is IT that provides a good user experience, enhances cognitive function, puts essential information as effortlessly as possible into the physician’s hands, keeps eHealth information secure, protects patient privacy and facilitates better practice of medicine and better outcomes. 

Bad Health IT ("BHIT") is IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation.
 

The Boston Globe article concludes:

Ashish Jha, associate professor of health policy at Harvard School of Public Health and a member of the panel that drafted the Institute of Medicine report, said he wants doctors to be able to report problems -- errors in medication lists, for example -- in real-time so they can be found and fixed quickly. The safety plan does not require systems to have that capability, but Daniel said her office could soon add such a requirement for products that receive federal certification.

The bigger problem is that health care as a whole needs a better way of tracking patient safety, Jha said. Monitoring issues caused by electronic health records “should be a part of it, and then we can actually know if this is a small, medium or large contributor to patient safety issues,” he said. “But we don’t know that.”

I agree with Dr. Jha, but the IT sellers and healthcare organizations will (legitimately) claim that adding real-time error reporting/forwarding to their products will be extremely resource-intensive.

I have an alternate approach that will require little effort on the part of the sellers and user organizations.

  • Post a message at the sign-in screen of all health IT along the lines that "This technology is experimental, adopted willingly by [organization] although not rigorously vetted for safety, reliability, usability, nor fitness for purpose, and thus you use it at your own risk.  If problems occur, report them to the following" ...

"The following" could include a list of alternatives such as I wrote in my Aug. 2012 post "Clinicians: How to Document the EHR Screens You Encounter That Cause Concern."


... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.


See the actual post for an idea about clinicians seeking indemnification when forced by healthcare organizations to use bad health IT.  I can attest to actually seeing HIT policies that call for "human resources actions" if clinicians refuse to use HIT, or cannot learn to use it at a sufficient pace.

(Left out of this reiteration is the demonstration on photographing problematic EHR screens.  See the post for the details - it is easy to do, even with a commodity cellphone.)

HHS should be promoting laws on protection from retaliation upon clinicians reporting problems in good faith.

Thus, physicians, nurses and other clinicians can create needed health IT transparency and help our society discover the true level of risks of bad health IT.  They simply need the right information on what to do and where to report, bypassing the ONC office and, in the spirit of medicine, taking such matters into their own hands in the interests of patient care and medical ethics.

I also made recommendations to the Pennsylvania Patient Safety Authority on how known taxonomies of health IT-related medical error can be used, and need to be used, to promote error reporting in common formats.  Slides from my presentation to the Authority entitled "Asking the Right Questions:  Using Known HIT Safety Issues to Improve Risk Reporting and Analysis", given in July 2012 at their invitation, are at http://www.ischool.drexel.edu/faculty/ssilverstein/PA_patient_safety_Jul2012.ppt

Finally, another sign of progress:  unlike the HITECH Act, this new ONC plan is open to public comment.

-- SS

Addendum Jan. 8, 2013:

Dr. Halamka has put more details regarding his views in his blog.  The entry is entitled "Electronic Health Record Safety" at this link:  http://geekdoctor.blogspot.com/2013/01/electronic-health-record-safety.html .

He writes:

... Some have questioned the wisdom of moving forward with EHRs before we are confident that they are 100% safe and secure.   [That, of course, is not my argument - nothing is ever 100% safe and secure.  However, we don't yet know just how safe and secure - or unsafe and insecure - HIT is.  That is the issue I am concerned about - ed.] I believe we need to continue our current implementation efforts.

I realize it is a controversial statement for me to make, but let me use an analogy.

When cars were first invented, seat belts, air bags, and anti-lock brakes did not exist.    Manufacturers tried to create very functional cars, learned from experience how to make them better, then innovated to create new safety technologies, many of which are now required by regulation.

Writing regulation to require seat belts depended on experience with early cars.

My grandmother was killed by a medication error caused by lack of an EHR.  My mother was incapacitated by medication issues resulting from lack of health information exchange between professionals and hospitals.   My wife experienced disconnected cancer care because of the lack of incentives to share information.     Meaningful Use Stage 2 requires the functionality in EHRs which could have prevented all three events.

I express my condolences on those events.

I disagree, however, with continuing national implementation efforts at the current rate, with penalties for non-adopters.  I opine from the perspective of believing health IT has not reached a stage where it is ready for national rollout and remains experimental, its magnitude of harms admittedly unknown and information flows systematically impaired.  I recommend and prefer great caution under those circumstances, and remediation of those circumstances before full-bore national implementation.

I will leave it to the reader to ponder the two views.

-- SS