Dear digital health — health information technology (IT), artificial intelligence (AI), etc. — leaders and healthcare leaders,
Digital health is part of our lives and has become a part of “us,” and I respect your leadership to make patients’ lives better. Along with digital health, leaders of healthcare organizations make decisions to implement and use digital products in our work environments. The collaboration of both parties makes up today’s healthcare, hence open and transparent communication channels are crucial to a healthy ecosystem in healthcare.
I have experienced almost every level of digital health integration in our industry and feel obliged, based on my experiences and the issues I have observed, to speak up. Being passionate about patient safety and empathy in healthcare, I am deeply concerned by the frustrations and concerns raised by providers and colleagues, and by the accompanying instances of patient harm. Having worked in both inpatient and outpatient settings, I share below direct quotes from people I have interacted with.
Let me start from the inpatient setting:
“I can’t get my medication from the Pyxis (medication dispensing machine)! It has been 2 hours past the schedule!” says one nurse, trying to give an insulin dose for high blood sugar to a patient. She could not log in because neither her biometric ID nor her password worked.
“I spent 50 minutes just trying different workarounds!” says an infusion nurse trying to scan an IV bag for assembly with an antibiotic; the barcode will not scan because of glare from the plastic wrap, and there is no clear solution.
“Can pharmacy come and help? I have a patient in surgery already!” says a surgery technician, frantically asking for anesthesia medication because she has an issue with a new login process for accessing Pyxis.
Is it an upgrade or a downgrade?
Whenever there is an upgrade to a health IT product or process we use, the upgrade is accompanied by cries for help. They are an almost standard response from staff trying their utmost to access medications for their patients. Even when no upgrade is in progress, we continuously have to devise workarounds for the problems created by AI and IT.
“I spent 2 hours talking directly to the support teams of all the involved products, and none of them are willing to accept the onus of addressing issues created by their technology in the field. Nurses are calling nonstop, and orders are piling up,” says a colleague.
“The message on the screen said to check the power cords and to try other things, then to call if none of those worked. Do you know what the real problem was? It was the browser. The application would only work in Chrome! Why didn’t the message say so? I wasted 2 hours troubleshooting an issue that I could have resolved within minutes. I did not see that sticky note,” says another colleague.
Is the outpatient setting any better?
“I cannot make an appointment for a vaccination. There is no feature that allows me to enter my problem in the appointments portal! Why isn’t there an option to enter free text? There is no one answering the phone at the time I can make a call, so instead of being able to use this conveniently located vaccination center, I had to drive 100 miles to get the vaccine through a different pharmacy,” says a friend.
“Thanks for covering my shift. Do you know what happened? I got to the appointment, and the receptionist said she couldn’t find me in the system. I had the confirmation number and everything. I am never going back there, and now I have to figure everything out all over again: the time, work, family,” says my coworker.
“I went to all the pharmacies I used before, and none of them had my prescription ready! The doctor said it was sent to the pharmacy electronically, and his office was closed when I called to find out which pharmacy he’d used. Can you guess where I found it the next day? It was at the outpatient pharmacy at the clinic!” says a patient.
Many of the flawed digital options I have described above are forced on patients without their consent, choice, or support. When calling a medical office, patients get directed to the digital solutions offered on providers’ websites.
As clinicians, we use various products, from tools for checking crash cart supplies for Code Blue responses to Epic, the electronic medical record system we use to record notes and communicate with everyone involved in a patient’s care. Patients access these records through Epic’s MyChart portal. When I attended South by Southwest (SXSW) in 2013 as a student startup attendee, speakers were already talking about a future in which every patient would wear devices like the Apple Watch to measure their health. That time has come.
And digital has become pervasive throughout healthcare. Even when we visit websites, we encounter chatbots that guide us through the visit, asking questions and directing us when we search for information, such as clinical trials. Patients and healthcare workers use these tools every day to make our lives more efficient, and I applaud the passion and effort to make healthcare better.
However, what if things don’t go as intended? What is the plan?
When things don’t work out as intended, no one is accountable: not for lost productivity, not for lost money, and not to the patients who experience the negative impacts of poorly functioning solutions. Most importantly, patient safety issues caused by digital health solutions go unaddressed. Healthcare’s bottom line can also be affected by how digital health products perform in a clinical setting, as in one example that allegedly resulted in the financial breakdown of an institution. Had that institution been the only hospital in its area, imagine how detrimental the effects would have been to the local community.
When we are told “It is an IT issue” in healthcare, we wait until someone in the IT department resolves it. However, because we use many different products, establishing which one is causing the issue takes time, and vendors point to one another’s products for the fault. In the meantime, we have to work around the issue while it is addressed, which often makes the situation worse. Identifying which vendor is to blame doesn’t always resolve the problem either. Sometimes flagged issues get lost somewhere in the communication process, and no one follows up. Sometimes vendors wait until enough people complain about an issue before they apply a patch or a fix.
Here are 3 recommendations I would like to voice as a healthcare professional and a patient advocate:
1. Your end users are ultimately patients, no matter who uses your product.
Whether your product line targets businesses, clinicians, or consumers, including patients, if your product is for healthcare, its ultimate end users are patients. Let’s say your product’s users are doctors and nurses, or even the nonclinical professionals in healthcare who support them. Your product is still ultimately helping the institutions that treat patients. Even though your product’s direct users are not patients, it needs to be designed “for” patients’ safety. Please remember that when your product breaks down or does not work as intended, you are ultimately compromising patient care and safety; the time and resources your product once saved no longer benefit its direct users, which in turn hurts their ability to properly assist their patients.
2. Healthcare is not binary, and your digital solutions shouldn’t be either.
Healthcare is not binary, and the digital solutions you provide should not be either when integrated into practice settings. I know it is easier when everything can be categorized with a yes or a no; data and AI like 1s and 0s, but healthcare is far more nuanced. Not everything fits into predetermined boxes, because many issues are interconnected. Offer “other” and “comment” as options at all times. In my humble opinion, digital healthcare works most efficiently when combined with a human touch that allows for the complexity of our species. Consider connecting the services you provide digitally with a person who can help with the process, as not everything is clearly yes or no, especially on the patients’ end.
3. Please look ahead and consider empathy for “patients” and provide solid feedback loops for the “users.”
Notice that “patients” and “users” are different here. I am a strong believer in empathy as the essential ingredient for addressing all healthcare problems. Digital health is no exception. Again, your products ultimately exist to help patients, even though their direct users may not be patients. Consider asking questions like,
“Are we providing our support staff with a broad enough briefing on how our products can not only help but also harm patients when things don’t work out?”
“How are we advocating for patients even though our direct users are not patients?”
“Are we regularly assessing feedback loops? Do our users have ways to properly bring their concerns to our attention? What is the pattern of communication breakdown? Do we have ways to collect this data?”
“Are our staff properly trained in how to resolve problems users bring to their attention? Are we training their listening skills and developing their empathy?”
Users’ concerns should always be seriously considered and never ignored. Healthcare is very detail-oriented. Continuous improvement and quality enhancement are must-haves, again, for patient safety. This is the difference between products that suffer poor adoption in the market and those that become indispensable. Great products work with and for the users and are flexible enough to change to address fluid workplaces.
Patient safety is a serious issue in digital health that does not receive enough attention. Not enough is being done about the problems users encounter on a daily basis. While the explosion of digital health is openly welcomed in the deeply flawed US health system, serious consideration of how the integration of digital technology into clinical settings and real lives occasionally compromises patient safety lags far behind. The digital health phenomenon is no longer based on “optional adoption”; it is intentionally and forcefully pushed on all of us.
Avishek Choudhury and his colleagues studied digital health research in the geriatric population and also studied artificial intelligence solutions in healthcare for patient safety. Their research highlighted:
- a lack of standardization in machine learning and AI,
- a lack of customization to clinical settings,
- and a lack of appropriateness in the data used by the solutions provided.
The most concerning issue for me, identified in the papers, was the inability of many systems to appropriately consider patients with comorbidities. While digital health solutions may help patients, huge limitations remain because, as stated above, healthcare is not binary. Are we assessing patients’ preferences even before the interactions start? How much time are we spending to find out more about what lies beyond the limitations of these systems and their narrowly defined categories? How much benefit do patients get from replacing people with digital health solutions?
Focusing on patients, recognizing that healthcare solutions should not be binary but should be integrated with a personal touch, and pairing empathy for patients with solid feedback loops for users are what I ask you to please consider.
I’ll leave you with a parting story for your consideration, as I am active as a patient advocate for patients with language barriers. I attended a continuing education committee conference for doctors as a patient advocate. When I brought up my concerns for patients with language barriers to one of the speakers, he shared an “unfortunate” story of a Japanese patient with limited English who died of pneumonia while trying to communicate with a doctor using a patient portal.
The patient’s request to speak to a doctor was routed through a digital portal, and the doctor was not able to clearly assess the patient’s condition before it got out of hand. While it can be argued that the breakdown in communication here was not simply due to technology, we have to be aware of how we use digital health and ensure that patients and users who are not tech-savvy, or not ready or willing to use it because of barriers, still have other ways to communicate.
Should a case like this remain merely an “unfortunate” case? I respectfully disagree. The assumption that all patients and providers can communicate via an electronic portal, and that all can communicate in English well enough to describe symptoms electronically, is fundamentally flawed. One in five US residents speaks a language other than English at home; that is not a percentage healthcare can ignore in favor of technology.
Thank you for your time, and please keep “patient safety” in mind at all times. Let’s not forget that it is a prerequisite for “being” in any type of healthcare, whether digital or personal. I humbly ask you to learn from patients and constantly improve the patient experience with empathy. COVID-19 should have been an awakening for patient safety, as we were all on the verge of being a patient, more so than at any other time in our lives.
We must improve patient safety together, digitally and in person.