Today, accessing medical information, which once required scheduling an appointment with a health provider, can be done through an app with a single tap. Health portals provide instant access to diagnostic test results, from complete blood count panels to kidney function markers to hormone levels.
While we should welcome ready access to health data, viewing it without the interpretation of a seasoned clinical expert can fuel fear. This is especially true when worrisome data prompts you to reflexively rush off to “Dr. Google” – a source that knows neither empathy nor accuracy.
On behalf of myself, my family, and the patients I know, I urge those who manage and improve health tech to recognize what now amounts to a gap in care: you must integrate responsible, clear, and reassuring interpretive information into your apps and platforms. For patients, such guidance is much more than a “nice-to-have” feature; supplying it is an ethical imperative.
Patients should not be left to parse complex lab data alone, nor should they make decisions based on fragmented information, especially when a single abnormal result might suggest a more serious situation than actually exists. While AI-powered engines process and summarize vast amounts of information in a blink, they lack the clinical judgment and humanity of a physician who knows a patient’s medical history.
Further, AI-based models cannot currently account for every factor influencing personal health metrics, such as lifestyle, age, genetic predispositions, or recent health events. Even the language used by AI can inadvertently amplify fear; a model might describe results as “abnormal” or “high-risk” when the variation is clinically insignificant. And because AI still lacks a provider’s empathy and human touch, patients may rightly feel uneasy receiving sensitive information without the guidance of a health professional.
The Prostate Cancer Anxiety Trap: Low PSA and Free PSA
I recently underwent a routine prostate cancer screening, where the lab results appeared to signal a high risk. But without knowledgeable interpretation, the data I received was incomplete and could easily have been misleading. Consider my PSA (Prostate-Specific Antigen) tests: while a low PSA level, around 1 ng/mL, is well within the safety margin, a low percentage of Free PSA—such as 10%—could indicate a higher risk of prostate cancer, even when the PSA level itself does not suggest immediate concern. Interpreting these results on my own, the low Free PSA percentage triggered no small amount of anxiety about prostate cancer.
Taken together, a low PSA level combined with a low Free PSA percentage does not paint a clear picture; further medical consultation was required to interpret my risk accurately. Age, overall health, and family history are crucial in clinical decision-making, but these factors are absent from raw diagnostic data. I was lucky to have the immediate benefit of my urologist’s expertise; without their judgment and insight, I might have jumped to conclusions, undergone unnecessary stress, or requested unneeded tests out of fear.
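For developers deciding how to surface this kind of context, the free PSA percentage itself is simple arithmetic; what matters is pairing the number with careful framing. The sketch below is illustrative only: the function name and the accompanying wording are assumptions for this example, and the thresholds discussed in the article are not clinical guidance.

```python
def free_psa_percent(total_psa_ng_ml: float, free_psa_ng_ml: float) -> float:
    """Free PSA expressed as a percentage of total PSA."""
    if total_psa_ng_ml <= 0:
        raise ValueError("total PSA must be a positive value")
    return 100.0 * free_psa_ng_ml / total_psa_ng_ml

# Values echoing the article: total PSA ~1 ng/mL with free PSA of 0.1 ng/mL
# yields a free PSA percentage of 10%.
pct = free_psa_percent(1.0, 0.1)

# An interpretive layer should attach context to the number, never a verdict:
note = (
    "A lower free PSA percentage can be associated with higher risk, "
    "but age, health history, and the total PSA level all matter. "
    "Discuss this result with your clinician."
)
```

The design point is that the computation is trivial; the value a platform adds lies in the reassuring, non-diagnostic note delivered alongside it.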
And it’s not just prostate exams. Thyroid function tests, heart-health markers, and diabetes screening metrics are three additional areas rife with potential confusion for patients reviewing their raw lab results. Elevated TSH readings, high LDL or CRP levels, or a single HbA1c result do not tell the whole story. These values are not necessarily predictive independently; they’re components of a broader clinical picture. Consumers need to be supported in reviewing that clinical picture, and AI needs to be part of providing that help.
Moving Toward Empowered, Informed Health Access
Recognizing the need to incorporate support for interpreting health data, some health diagnostic platforms have begun to address consumers’ need to know by adding helpful explanatory notes to lab results or providing easy-access telehealth consults for patients seeking immediate explanations. A patient receiving an abnormal test result might be able to speak with a health professional or receive brief, plain-language insights about what the numbers indicate in general terms. These options can serve as bridges, guiding patients away from alarmist conclusions and toward actionable steps. At the very least, they give patients prompts for the questions to ask their health professional during a face-to-face meeting.
For those building and maintaining portal apps and platforms, acknowledging that many patients lack extensive health literacy and may be unfamiliar with basic medical terminology is crucial. Developers of health platforms should consider embedding visual aids, simplified summaries, or links to reputable health information sources to provide a foundation for stronger understanding before patients seek further information.
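One minimal way to embed the simplified summaries described above is to pair each test with a reference range and a plain-language, non-diagnostic note. This is a sketch under assumptions: the test name, range, and wording below are illustrative placeholders, and a real platform would source reference ranges from the reporting lab, not hard-code them.

```python
from dataclasses import dataclass

@dataclass
class LabContext:
    name: str    # plain-language test name
    low: float   # lower bound of a typical reference range (illustrative)
    high: float  # upper bound of that range (illustrative)
    note: str    # reassuring, non-diagnostic explanation

# Hypothetical catalog entry; ranges vary by lab, age, and assay.
CATALOG = {
    "TSH": LabContext(
        "Thyroid-stimulating hormone", 0.4, 4.0,
        "A single value outside this range is common and is often "
        "rechecked before any conclusion is drawn."),
}

def explain(code: str, value: float) -> str:
    """Return a plain-language summary for a lab value."""
    ctx = CATALOG[code]
    status = "within" if ctx.low <= value <= ctx.high else "outside"
    return (f"{ctx.name}: {value} is {status} the typical range "
            f"({ctx.low}-{ctx.high}). {ctx.note}")
```

The deliberate choice here is that the note never labels a result “high-risk”; it states where the value sits and points the patient back to a clinician, which is exactly the framing the article argues raw portals lack.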
AI and Large Language Models: Help and Hazard
A recent JAMA study surprisingly suggests that patients often prefer responses from ChatGPT-style tools to physician conversations. While physicians understandably questioned the study’s conclusions, its data and the public conversation it sparked revealed an appetite for technology that can respond in real time to patients’ pressing clinical and emotional needs.
Large language models (LLMs) can provide general context, describing what test results typically mean and suggesting possible follow-up actions, like consulting a doctor if certain thresholds are exceeded. Technology can offer helpful insights, especially for people with conditions and concerns like cancer and other frightening health challenges requiring prompt access to professional input.
LLM cancer mentor apps such as Dave AI are revolutionary tools for cancer patients, caregivers, and physicians. It’s like having a Waze for navigating the complex cancer diagnostic and care journey. Patients no longer have to wait for their next appointment to get answers to their questions. Health tech innovators realize that diagnoses leave patients anxious between doctor visits, and the best solution may often be well-trained LLMs. [See LLM Cancer Mentor “Dave AI” Offers WAZE-like 24/7 Personalized Support, Making it a Game-Changer in Patient Care.]
The Cost of Health Data Without Context
The rapid consumerization of health data has democratized healthcare in powerful ways, but its pace has opened gaps in care and introduced risks. The emotional toll can be high when consumers receive real-time results without contextual guidance. Data that might otherwise encourage patients to engage actively in their wellness may instead provoke anxiety, confusion, or even mistrust in their own body’s signals.
“In clinical trials and healthcare, acknowledging and addressing the psychological symptoms brought on by the diagnosis, prognosis, and treatment of many conditions is vital,” writes Emily Epstein, LMSW, a Weill Cornell staff member who advocates for the inclusion of mental health support in research and medical settings. “When a provider overlooks the anxiety that accompanies the stress of managing physical symptoms, a disservice is done to the mental well-being of the patients and participants, which can have adverse effects on the trial itself,” she adds.
A Call for Physician-Guided Digital Health Experiences
Access to lab results is part of a significant and positive cultural shift in health, where the focus has moved toward patient empowerment and shared decision-making. However, for true empowerment, patients need more than raw data; they need empathetic guidance in understanding their data’s meaning in the context of their unique health profile. To address this, health systems and app developers should incorporate sophisticated LLM resources that help educate users about the strengths and limitations of raw lab data and how to better navigate the journey ahead.
Empowered patients are educated patients, capable of better understanding their health journeys in partnership with their health providers. As we advance, let’s strive for a digital health environment where patients can access their results with clarity and confidence, knowing that communication is always part of the care.