The US healthcare system has been under increasing pressure on multiple fronts. Healthcare costs have been rising faster than GDP growth in recent years, and an estimated 100,000 patients die from preventable medical errors each year – the equivalent of one jumbo jet of patients every week. In 2010, these errors cost an estimated 21 billion dollars.
Digitization has been touted as a solution for these ills – instant and simultaneous access to patient data, automated alerts for adverse drug interactions, and streamlined workflows are among the key benefits of the new technology. But Wachter argues that we are still in the early stages of digitizing medical information, and its success so far is mixed. He has written this book as a "state of digital healthcare IT" report, documenting its triumphs and pitfalls, focusing on Electronic Medical Record (EMR) and image management systems in the US hospital setting, with a glimpse into the bright future such technologies may bring.
Radiology
Prior to digitization, patient exams/studies, often using X-rays, would be developed or printed on film and stored centrally in a cabinet full of patient files. Doctors and radiologists would pull the films from the files and review them together during the diagnostic process. Doctors shared the clinical context while radiologists provided the reading of the film.
As imaging technologies matured, digitized images started showing their advantages – and in some cases became a necessity – in providing quality care.
In the early 80s, a CT scan study might contain up to a few dozen images. By the late 90s, however, a high-speed CT scanner could produce hundreds of images in one study. Printing each study on film, at $4 per film, would cost thousands of dollars for a single exam – cost prohibitive, time consuming, and too onerous to transport between departments. In one story, a doctor printed only the even-numbered slices of a study to save money, when the abnormality appeared only on the odd-numbered slices, nearly causing a misdiagnosis. Archiving physical films also became too expensive compared to the digital counterpart. Digital images are also much more accessible: while a film can only be viewed in one place at a time, a digital file can be viewed by several doctors in different locations at the same time. Such developments made digitization of images a very attractive option.
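As a rough back-of-the-envelope check on that cost claim (the $4-per-film price and "hundreds of images" come from the figures above; the exact image count is my own illustrative assumption):

```python
# Back-of-the-envelope printing cost for one late-90s CT study.
# $4 per film is quoted above; 500 images is an assumed example of "hundreds".
images_per_study = 500
cost_per_film = 4.00  # dollars
print(f"Film cost for one study: ${images_per_study * cost_per_film:,.0f}")
# -> Film cost for one study: $2,000
```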
The digitization effort became the Picture Archiving and Communication System (PACS), the software system that manages medical images. Doctors use it to receive, transmit, store, archive, and read medical images and to produce reports.
As PACS evolved, more automation was naturally added as software capabilities allowed and hardware processing power became abundant. In the film days, doctors might look at a study and decide to read an earlier study of the same patient for context. These earlier studies are called priors, and retrieving them used to mean walking over to the patient file again and searching for the prior with the relevant information – perhaps the most recent earlier study of a particular body part – a time-consuming process.
On a PACS, however, doctors can choose the prior-search criteria, set them once, and the system automatically retrieves the priors in the background every time a study is viewed. This way the priors are readily available when the doctors need them, significantly reducing the time spent searching through images. Digital images also allow for more flexible annotations: multiple layers of "virtual views" can be superimposed on an image and saved separately for ease of presentation and discussion.
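A minimal sketch of what such a prior-matching rule might look like in software – the data model and field names here are my own illustration, not any particular PACS vendor's API:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Study:
    patient_id: str
    body_part: str   # e.g. "CHEST"
    modality: str    # e.g. "CT", "XR"
    study_date: date

def find_relevant_prior(current: Study, archive: list[Study],
                        same_modality_only: bool = True) -> Optional[Study]:
    """Return the most recent earlier study of the same body part for this
    patient, mimicking a user-configured 'relevant prior' rule."""
    candidates = [
        s for s in archive
        if s.patient_id == current.patient_id
        and s.body_part == current.body_part
        and s.study_date < current.study_date
        and (not same_modality_only or s.modality == current.modality)
    ]
    # Most recent matching prior, or None if the patient has no matching history.
    return max(candidates, key=lambda s: s.study_date, default=None)
```

A PACS would run a rule like this in the background whenever a study is opened, so the prior is already on screen by the time the radiologist wants it.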
The automation and instant access greatly increased radiologists' productivity. By one estimate, radiologists' caseloads increased by 70%.
The clear cost-saving and productivity advantages of PACS, together with radiologists' relative ease with adopting newer technologies (radiology imaging equipment required a level of technical savvy even in its early days), meant the digitization of radiology occurred rapidly once the tools were available – and it occurred without requiring government intervention. In 2000, only 8% of hospitals had PACS; by 2008, over three quarters did.
The adoption of PACS, while beneficial, had some unintended consequences. No longer bound by the single physical copy of the film days, doctors now access and view images from different locations. Pressured by the pay-per-study model, radiologists spend most of their time in front of the screen reading studies rather than interacting with clinicians. Clinicians and radiologists cooperate less to arrive at a diagnosis, and radiologists rarely see patients. The Baltimore VA hospital estimated that after its implementation of PACS, in-person consultation rates for general radiology studies dropped by an estimated 82%. This separation can make diagnoses less accurate and offers fewer learning opportunities for both sides of the medical world.
Further, due to the increase in caseload, radiologists are reporting more incidents of burnout. Many PACS now provide telemedicine capabilities, allowing radiologists to read images from remote regions and creating competition for local, more costly radiologists. These changes are also occurring at a time when the U.S. healthcare system is shifting from a pay-per-volume model to a value-based model, with a focus on reining in unnecessary medical imaging.
In response to this environment, and to make themselves more valuable, some radiologists have started to actively participate in in-person consultations and to become more involved in bringing diagnostic value to clinicians and patients.
Looking to the future, as AI advances, some software programs are showing early promise in reading images in specialized areas, adding to radiologists' anxiety about job security. Vendors, however, are quick to add that AI is meant to complement and support radiologists, not to replace them. As Wachter argues, the complex art of combining image reading with the social and clinical context is still solely the domain of human radiologists.
Electronic Medical Record and Health IT systems
The earliest medical note dates to the 5th century BCE. Until a few centuries ago, it was mainly a narrative that read like a story describing the state of the patient, written in a doctor's journal. Lacking other diagnostic tools, the description of ills was often told by the patient. It looked like this:
The first symptoms always affect the extremities of the limbs and the lower limbs particularly. When the whole body becomes affected, the order of progression is more or less constant: (1) toe and foot muscles, then the hamstrings and glutei, and finally the anterior and adductor muscles of the thigh; (2) finger and hand, arm, and then shoulder muscles…
The doctor in those days, without scientific measurement or much in the way of effective treatment, offered care and attention to someone at their most vulnerable.
Gradually, the realm of science permeated medicine. It became apparent that the same disease has common symptoms from one individual to another, and that those symptoms can be summarized and measured by various tools – the knee hammer, the stethoscope, and now X-rays and MRIs. Over time, observations of a patient's symptoms overtook the patient's narrative, and measurements crept into the doctor's notes.
As medical knowledge grew, specializations arose; a patient was often cared for by many doctors, and those doctors' private notes became the public patient chart. To identify patients more efficiently, the patient ID came on the scene: each patient is uniquely identified by an ID tied to a chart, used by whoever is caring for that patient. This innovation vastly improved the efficiency and effectiveness of patient care. In the age of paper charts, hospitals like the Mayo Clinic built elaborate pneumatic tube systems to expedite the movement of patient charts.
Gradually, other institutions joined the healthcare ecosystem and the chart took on ever more duties. Besides patient care, charts are used for clinical research, for billing payers such as insurance companies, by auditors who safeguard patient safety, and by the lawyers and legal teams who fight for and against medical malpractice suits. The chart became a political battleground serving the demands of a multitude of stakeholders, and it came to emphasize the illness and its measurements over the patient. A chart today looks like the following – far from the narrative it once was:
vs stbl, ō comp.; no Δ resp. 02 sat ok; xam un-Δ’d – see note 11/12; fam. visit.; no nursing issues; labs = no incr. aldolase, CK’s; note: this enctr. took 65’ & inv. a hi deg. of complex. in dec. making.
Adding to this complex picture is digitization. Efforts to digitize the medical note started in the 70s, but after decades only a small percentage of hospitals had a digital medical record system. Unlike the digitization of radiology, which had clear benefits and occurred a decade earlier, the large marketable benefit of the electronic medical record (EMR/EHR) was interoperability – a patient treated at hospital A would have his or her record available at another hospital across the country. It was seen as a public good, like highway infrastructure, but it would not directly benefit the hospitals, which themselves spend millions on implementation (roughly $40,000 per physician, not to mention the high risk of such an enormous undertaking – about 30% of these systems fail; that risk was the selling point of Epic, the dominant EMR vendor, which is very expensive but has a tried-and-true strategy with a great success rate). The lack of incentive translated into low implementation rates for the EMR.
A pivotal moment came during Obama's presidency. Seeking to curb costs, the administration wanted to move from volume-based to value-based care. But measuring value requires tracking medical procedures and outcomes accurately, which can only feasibly be done with digital medical charts – creating the need for digitization. Second, engulfed in the financial crisis of 2008, the government wanted to keep spending flowing in the economy through a large stimulus package. These two imperatives convinced the Obama administration to channel 30 billion dollars into mandating and funding the use of electronic medical records at hospitals across the country. This dramatically boosted EMR adoption in hospitals and doctors' offices: in 2008, the adoption rate was in the low teens; by 2014 it had risen to about 70 percent.
Digital medical records did deliver on a number of major promises – there is no more confusion over doctors' illegible prescription notes. Robotic arms in the pharmacy can fill a drug order without error, exactly as prescribed. When drugs are dispensed at the bedside, a machine checks that all prescribed doses are given, reducing human error in distribution.
But digital medical notes brought their own set of issues. Doctors now spend much more time on the computer, very often looking at the screen while listening to the patient and missing the eye contact that helps patients feel attended to. The demands of learning to work with complex software place an additional burden and time drain on physicians – who are trained in science, not technology – and diminish their role as healers in patients' eyes, contributing to the rise of alternative medicine. The focus on the patient's file, often bloated by the ease and overuse of copy-and-paste in patient notes, can obfuscate the diagnostic process: doctors can miss the obvious, correct diagnosis in favor of complex ones.
Problems in Healthcare IT Systems
Alarm fatigue is another area needing improvement. The proliferation of sensors, the value placed on the sanctity of life, and the liability aversion of device manufacturers have led to a dramatic rise in the number of alarms, often with little prioritization. It is the lack of prioritization that is most troubling, as the overwhelming majority of alarms can be false. Often, a life-threatening alarm sounds not much different from one that isn't. Nurses can dash at the sound of an alarm only to find it was set off by a patient who has misplaced a sensor but is otherwise well.
Sometimes in frustration, or often for practical reasons of efficiency, alarms are ignored or turned off, with occasionally devastating results. This alert fatigue ranges from heart-rate monitors to drug-interaction alerts in prescription software. Between January 2005 and June 2010, at least 216 deaths were linked to alarm malfunction or alarm fatigue.
The University of California, San Francisco (UCSF) hospital's five intensive care units (ICUs) had an average of 66 patients each day. A study done in early 2013 showed that each cardiac monitor produced 187 audible alerts every day – roughly one every eight minutes. Counting both audible and inaudible alerts, there were 2.5 million unique alarms across all the ICUs in one month, the overwhelming majority of them false. And this is just the cardiac monitors; it does not include IV machine alarms, mechanical ventilator alarms, bed alarms, nurse call bells, or computerized alerts. The alerts are numerous, especially when compared to commercial aviation, another safety-critical field, where fewer than 10% of all flights get any alert whatsoever.
Wachter argues that much of the alarm fatigue could be addressed by simple measures: tweaking thresholds or delaying some alarms by a minute would eliminate a majority of the false alarms with no ill effect.
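A minimal sketch of that delay idea, assuming an oxygen-saturation monitor with a simple polling loop – the threshold, the 60-second delay, and the callback names are illustrative assumptions, not any real device's settings:

```python
import time

def monitor_with_delay(read_spo2, sound_alarm,
                       threshold=88, delay_s=60, poll_s=5):
    """Sound the alarm only if the reading stays below the threshold for the
    whole delay window; brief dips (a slipped sensor, a movement artifact)
    are ignored instead of waking the ward."""
    breach_started = None
    while True:
        value = read_spo2()                       # latest saturation reading
        if value < threshold:
            if breach_started is None:
                breach_started = time.monotonic() # start timing the breach
            elif time.monotonic() - breach_started >= delay_s:
                sound_alarm(value)                # sustained breach: alert the nurse
                breach_started = None             # avoid re-firing every poll
        else:
            breach_started = None                 # reading recovered: reset the timer
        time.sleep(poll_s)
```

The trade-off is deliberate: a short, bounded delay on a genuine event in exchange for suppressing the flood of transient false alarms.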
On the subject of the future of AI, Wachter shows that medical data is very unstructured, noisy, and diverse, making AI diagnosis difficult. In a sense, most diseases are unique. As an example, in a single year, 41,000 trauma patients were seen in the state of Pennsylvania. There were 1,224 different injuries and, in all, 32,261 unique combinations of injuries. Such uniqueness and such small datasets make it hard to train AI algorithms, which require large datasets for every case. Furthermore, clinical reality and an intuitive understanding of human life conditions (what does a patient's appearance tell us about his lifestyle, and how does it impact the diagnosis? what are the patient's life priorities?) are still missing from AI's general knowledge base. This compels Wachter to argue that machines will not be replacing general human diagnosis in the short term.
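Working through the arithmetic of those figures shows just how thin the data is per "case" – a quick sketch using only the numbers quoted above:

```python
patients = 41_000             # trauma patients seen in one year (from the text)
unique_combinations = 32_261  # distinct combinations of injuries (from the text)

avg_examples = patients / unique_combinations
print(f"On average, {avg_examples:.2f} patients per injury combination")
# -> On average, 1.27 patients per injury combination
```

With barely more than one example per combination, there is little for a data-hungry algorithm to generalize from.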
Future
Despite the shortcomings, Wachter is optimistic about the benefits technology can bring. Modern hospitals are expensive to run – they were built for urgent and intensive care and cost the patient and the state a lot of money. Wachter argues they might not be the most productive means of providing care to all patients: for many, bringing care to their homes is both cheaper and produces better outcomes. Wachter believes that in the future, hospital care will be reserved for the most urgent and critical cases – similar to the level of care currently given to ICU patients. Other measures, such as patient/doctor portals, remote/telemedicine consults, home care, and better preventative measures, will decrease the use of emergency care and hospitalization, all while increasing patient care quality and satisfaction.
Wachter promotes the view that health IT systems are not merely technological projects – they are social change projects that require changes to processes and cultural practices across the healthcare ecosystem. To software developers, he urges more care in considering usability for physicians (e.g. alert fatigue, prioritized error messages, and intuitive interfaces that don't overly tax memory). Respecting the background and training of physicians can ease the adoption of new technologies.
On the part of the government, Wachter urges more fine-tuning of regulation. Some laws and rules were well intended but in the end did not serve their purpose. For instance, to promote use of the EMR, the government mandated that physicians must perform certain functions in the EMR system themselves to be considered meaningful users and thus compliant. Yet this prevents doctors from having secretaries help with administrative tasks, adding to the burden of the care process and leaving less time for patient interaction.
Wachter believes these are still the early days of healthcare IT. Like the early days of electric motors in manufacturing plants, the full benefits will be realized only when a new mindset, new management practices, and more refined technological tools come together.
I look forward to the days when Wachter’s vision is realized.