<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Ethics in Practice - Medika Life</title>
	<atom:link href="https://medika.life/category/disciplines/policy/ethics-in-practice/feed/" rel="self" type="application/rss+xml" />
	<link>https://medika.life/category/disciplines/policy/ethics-in-practice/</link>
	<description>Make Informed decisions about your Health</description>
	<lastBuildDate>Tue, 14 Apr 2026 13:51:48 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/medika.life/wp-content/uploads/2021/01/medika.png?fit=32%2C32&#038;ssl=1</url>
	<title>Ethics in Practice - Medika Life</title>
	<link>https://medika.life/category/disciplines/policy/ethics-in-practice/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">180099625</site>	<item>
		<title>&#8220;The Borrowed Mind&#8221; &#8211; Reclaiming Thought in an Age That Wants to Do It For Us</title>
		<link>https://medika.life/the-borrowed-mind-reclaiming-thought-in-an-age-that-wants-to-do-it-for-us/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Tue, 14 Apr 2026 13:51:44 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Doctors]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[Book]]></category>
		<category><![CDATA[Brain Health]]></category>
		<category><![CDATA[Cognitive Power]]></category>
		<category><![CDATA[Covid-19]]></category>
		<category><![CDATA[Gil Bashe]]></category>
		<category><![CDATA[Human Thought]]></category>
		<category><![CDATA[John Nosta]]></category>
		<category><![CDATA[LLMs]]></category>
		<category><![CDATA[mental health]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[The Borrowed Mind]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21654</guid>

					<description><![CDATA[<p>In The Borrowed Mind: Reclaiming Human Thought in the Age of AI, John Nosta steps into a quieter, more consequential space. This is not a technical manual, nor a manifesto driven by fear or exuberance. It is something rarer. It is a meditation on cognition itself, on how human thought is being reshaped in real [&#8230;]</p>
<p>The post <a href="https://medika.life/the-borrowed-mind-reclaiming-thought-in-an-age-that-wants-to-do-it-for-us/">&#8220;The Borrowed Mind&#8221; &#8211; Reclaiming Thought in an Age That Wants to Do It For Us</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In <em><a href="https://a.co/d/0h7LovkU">The Borrowed Mind: Reclaiming Human Thought in the Age of AI</a></em>, <a href="https://www.linkedin.com/in/johnnosta/">John Nosta</a> steps into a quieter, more consequential space. This is not a technical manual, nor a manifesto driven by fear or exuberance. It is something rarer. It is a meditation on cognition itself, on how human thought is being reshaped in real time, and on what we risk losing if we fail to notice.</p>



<p>Early in the book, Nosta writes, <em>“The solved can never touch the whole.”</em>&nbsp; That line lingers. It captures the essence of his argument. AI can solve, generate, synthesize, and accelerate. Yet something about the human experience of thinking, the struggle, the friction, the meaning-making, exists beyond resolution.</p>



<p>This tension defines the book. It is not anti-technology. Nosta is deeply engaged with AI and candid about its value. He describes large language models as tools that “move faster and connect more disparate concepts than our minds could ever manage on their own.”&nbsp; He is equally clear that this capability introduces a subtle risk. We may begin to outsource not just tasks, but thought itself.</p>



<p>That distinction matters more than many may be willing to admit.</p>



<h2 class="wp-block-heading"><strong>From Tools to Thought</strong></h2>



<p>One of the most compelling contributions of <em>The Borrowed Mind</em> is its framing of AI not as the next step in computing, but as a turning point in cognition. Nosta traces a clear arc. Gutenberg unlocked words. Google unlocked facts. AI, he argues, is unlocking thought.&nbsp;</p>



<p>That progression is elegant, yet also unsettling. Words and facts could be externalized without fundamentally altering the structure of human reasoning. Thought is different. It is intimate. It is identity. It is how we become.</p>



<p>Nosta reminds us that thinking once required effort, a type of natural friction that created sparks of innovation. <em>“The distance between question and answer created space for our discernment.”</em>&nbsp; Within that space, judgment formed, curiosity deepened, and understanding took root.</p>



<p>AI compresses that distance. It removes friction. It delivers coherence with remarkable speed. One of the book’s most important insights emerges here. Coherence is not the same as understanding.</p>



<p>Nosta introduces the concept of “anti-intelligence,” describing it as “fluency without understanding. Coherence without experience.”&nbsp; AI does not think. It mirrors the structure of thinking. It produces language that resembles reasoning without sharing its origin.</p>



<p>In health, where evidence, interpretation, and judgment must coexist, this distinction is not academic. It is operational. It shapes how clinicians trust tools, how leaders deploy them, and how patients ultimately experience care.</p>



<h2 class="wp-block-heading"><strong>The Seduction of the Socratic Mirror</strong></h2>



<p>One of the most original sections of the book is Nosta’s description of the “Socratic Mirror.” He draws a parallel between classical dialogue and modern AI interaction. Socrates asked questions to surface the truth. AI, in a different way, reflects our thinking back to us, reframed, extended and sometimes clarified.</p>



<p>Nosta writes that the model <em>“…does not tell me what to think but creates the conditions under which my own thinking could deepen.”</em>&nbsp;This is where the book moves beyond critique and into possibility.</p>



<p>Used well, AI becomes a cognitive partner. It expands perspective, accelerates exploration, and invites iteration. In clinical research, patient engagement, and system design, this capacity holds enormous promise.</p>



<p>Nosta does not romanticize the relationship. He recognizes its asymmetry. The model has no interior life. It does not ponder. It does not carry consequence. It does not bear responsibility. That responsibility remains human.</p>



<h2 class="wp-block-heading"><strong>Rethinking the Fear of Displacement</strong></h2>



<p>A persistent anxiety runs beneath every conversation about AI. Many fear it will become a job slayer, a force that displaces rather than elevates human contribution. That concern is understandable, yet not new.</p>



<p>Every meaningful advance in technology has reshaped how people work. The wheel did not eliminate labor. It redefined movement. The stethoscope did not replace physicians. It extended their ability to listen and interpret. The tollbooth transponder did not end transportation roles. It changed the flow and focus of human involvement. Each innovation shifted roles, demanded new skills, and expanded what people could do.&nbsp; AI belongs in that lineage.</p>



<p>What distinguishes this moment is not the elimination of work, but the redistribution of cognitive effort. The real risk is not that machines will think for us, but that people may become less inclined to think for themselves. Nosta’s warning is subtle yet profound. Surrendering curiosity, judgment, and reflection to systems that generate answers with ease risks dulling the very faculties that define human intelligence.</p>



<p>This is why <em>The Borrowed Mind</em> is such an important read at this moment. It does not dismiss concerns around job displacement. It reframes them. The central challenge is not protecting roles as they exist today, but strengthening the uniquely human capacities no system can replicate. Creativity, discernment, ethical reasoning, and the ability to navigate ambiguity are not diminished by AI. They become more essential.</p>



<p>The book offers reassurance without complacency. The future of work will favor those who sharpen their thinking, engage deeply with ideas, and remain active participants in their own intellectual development. The machine is not the adversary. Neglecting the development of one’s own mind is the danger.</p>



<h2 class="wp-block-heading"><strong>Composite Intelligence and the Limits of the Machine</strong></h2>



<p>Nosta introduces “composite intelligence” to describe the interaction between human and machine cognition. Composite does not mean blended into sameness. It means distinct contributions working in concert. The model brings speed and breadth. The human brings depth.</p>



<p>This triad of speed, breadth, and depth becomes one of the most useful frameworks in the book. AI excels in velocity and scale. Depth, the slow transformation of understanding, remains human. As Nosta writes, “Models do not ponder.”</p>



<p>In health, this distinction is profound. Data can inform. Algorithms can suggest. The act of deciding, especially in moments of uncertainty, requires something more. It requires what Nosta elevates as the defining human contribution. Virtue.</p>



<p>Drawing on Aristotle’s concept of practical wisdom, Nosta reminds us that judgment is forged through experience, consequence, and accountability. A model can generate options. It cannot live with outcomes.</p>



<p>This is where the book resonates most deeply for those working in health. Intelligence is becoming abundant. Discernment is becoming scarce and, therefore, more valuable.</p>



<h2 class="wp-block-heading"><strong>The Risk of the Borrowed Mind</strong></h2>



<p>The book&#8217;s title is not merely metaphorical. It is a warning. Nosta argues that as engagement with AI deepens, internal dialogue begins to change. The model becomes a cognitive tuning fork, subtly shaping how questions are framed, how ideas are explored, and how answers are anticipated. This dynamic is not inherently negative. It can elevate thinking, accelerate learning, and make complex domains more accessible. Dependency remains the concern.</p>



<p>Reliance on generated thought risks weakening the muscle of original thinking. Access can be mistaken for understanding. Individuals may become, in Nosta’s words, “cognitive clones.”&nbsp;</p>



<p>This concern is particularly relevant in health ecosystems already strained by time, complexity, and administrative burden. The temptation to offload cognitive work will be strong. The discipline to remain intellectually engaged will be essential.</p>



<h2 class="wp-block-heading"><strong>A Book About AI That Is Not About AI</strong></h2>



<p>What makes <em>The Borrowed Mind</em> stand apart is that it is not ultimately about technology. It is about humanity. Nosta writes, <em>“This book is not really about technology. It is about you.”</em>&nbsp; That idea anchors this work.</p>



<p>Readers are challenged to consider what it means to remain <em>“the authors of our own minds.”</em> Not passive recipients of generated insight, but active participants in meaning-making.</p>



<p>This question sits at the center of the health ecosystem’s future. As AI becomes embedded in clinical workflows, research, and patient engagement, the issue is not whether it will improve efficiency. It will.</p>



<p>The deeper question is whether it will deepen humanity or dilute it. Will it create space for clinicians to think more deeply, connect more meaningfully, and act more wisely? Or will it create a system that values speed over reflection, output over understanding, and coherence over truth?</p>



<p>Nosta offers no simple answers. He offers a framework for asking better questions.</p>
<p>The post <a href="https://medika.life/the-borrowed-mind-reclaiming-thought-in-an-age-that-wants-to-do-it-for-us/">&#8220;The Borrowed Mind&#8221; &#8211; Reclaiming Thought in an Age That Wants to Do It For Us</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21654</post-id>	</item>
		<item>
		<title>AI Will Not Fix Health Care &#8211; Leadership Might</title>
		<link>https://medika.life/ai-will-not-fix-health-care-leadership-might/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Tue, 07 Apr 2026 05:25:12 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Doctors]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Healthcare Policy and Opinion]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<category><![CDATA[Clalit Health Services]]></category>
		<category><![CDATA[Gil Bashe]]></category>
		<category><![CDATA[Hal Wolf]]></category>
		<category><![CDATA[Harvard Medical School]]></category>
		<category><![CDATA[HIMSS]]></category>
		<category><![CDATA[Isaac Kohane]]></category>
		<category><![CDATA[LLMs]]></category>
		<category><![CDATA[Ran Balicer]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21627</guid>

					<description><![CDATA[<p>There is a moment at the HIMSS Global Health Conference when the conversation shifts. It moves away from what artificial intelligence can do and toward how it is already being used. Not in controlled pilots or planned rollouts, but in real time, by countless clinicians making decisions under pressure. Artificial intelligence is no longer a [&#8230;]</p>
<p>The post <a href="https://medika.life/ai-will-not-fix-health-care-leadership-might/">AI Will Not Fix Health Care &#8211; Leadership Might</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>There is a moment at the <a href="https://www.himss.org/">HIMSS Global Health Conference</a> when the conversation shifts. It moves away from what artificial intelligence can do and toward how it is already being used. Not in controlled pilots or planned rollouts, but in real time, by countless clinicians making decisions under pressure. Artificial intelligence is no longer a future state. It is present, embedded and influencing care before many organizations have fully decided how it should be governed. The industry is not lacking innovation. It is navigating its consequences.</p>



<p>Health systems are not stepping into artificial intelligence from a place of calm or control. In the United States, spending now exceeds $4.5 trillion, with a significant share tied up in administrative work that adds complexity more than clarity. Clinicians are caring for more patients, navigating more data and making more decisions under pressure than ever before. The system is stretched. Artificial intelligence is entering at a moment when change is no longer a choice.</p>



<p>The discussion drew on the experience of three leaders who are not observing this shift. They are guiding it. <a href="https://iowa.himss.org/resource-bio/harold-f-wolf-iii">Hal Wolf</a> leads HIMSS, influencing digital health policy and implementation across more than 100 countries. <a href="https://dbmi.hms.harvard.edu/people/isaac-kohane">Isaac Kohane, MD, PhD, Chair of Biomedical Informatics at Harvard Medical School</a>, has spent four decades defining how data informs clinical care. <a href="https://en.wikipedia.org/wiki/Ran_Balicer">Ran Balicer, MD, Chief Innovation Officer at Clalit Health Services</a>, operates within one of the world’s most integrated health systems, where data and care are aligned across generations.</p>



<p>These are not just star panelists. They are system-wide architects. What emerged from the hour-long conversation was not a catalog of what artificial intelligence can do. It was a recognition that it is already doing more than most systems are prepared to guide and govern.</p>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" fetchpriority="high" decoding="async" width="696" height="445" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=696%2C445&#038;ssl=1" alt="" class="wp-image-21628" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=1024%2C654&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=300%2C192&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=768%2C490&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=1536%2C981&amp;ssl=1 1536w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=2048%2C1308&amp;ssl=1 2048w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=150%2C96&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=696%2C444&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=1068%2C682&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?resize=1920%2C1226&amp;ssl=1 1920w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Issac-1.png?w=1392&amp;ssl=1 1392w" sizes="(max-width: 696px) 100vw, 696px" /><figcaption class="wp-element-caption">Photo Credit: HIMSS: Isaac Kohane, PhD, MD, Chair of Biomedical Informatics at Harvard Medical School, shares insights from the mainstage of HIMSS</figcaption></figure>



<p>Dr. Kohane captured the tension immediately. <em>“I think that we have to worry about the fact that we’re going both too slow and too fast.”</em></p>



<p>That statement reflects a reality many leaders feel but rarely express. Governance takes time because it must. Patient safety, validation and accountability require structure. Practice moves in real time. Clinicians do not have the luxury of waiting for perfect systems.</p>



<p><em>“They’re so desperate to do right by their patients to use other resources,”</em> Dr. Kohane added.</p>



<p>That instinct is not a weakness. It reflects a commitment to doing what is right for the patient. When clinicians turn to external AI tools, they are seeking clarity, speed, and confidence in their decisions. Artificial intelligence is already present at the point of care, shaping how physicians assess information, validate thinking, and move forward. The system is not adopting AI. The system is catching up.</p>



<p>This creates a condition that is difficult to measure and even harder to manage. Different clinicians use different AI chat platforms. Those tools produce different answers, shaped by different assumptions. Over time, consistency erodes. The system begins to operate with multiple definitions of truth, and with the risk of varied outcomes.</p>



<p>Dr. Kohane’s warning is not about misuse. It is about misguided permanence. <em>“The worst outcome will be if the worst parts of medicine get concrete poured over it, by AI.”</em></p>



<p>Artificial intelligence does not fix a system; without leadership, it accelerates the integration of incorrect assumptions. If workflows are inefficient, they become more efficiently inefficient. If bias exists in data, it becomes more precise. If fragmentation defines care, it scales.</p>



<h2 class="wp-block-heading"><strong>This is not a failure of technology. It is a mirror held up to system-wide leadership.</strong></h2>



<p>Hal Wolf, among the health sector’s leading policy and operational voices, grounded this moment in proven experience. Health care has seen this pattern before. When internet connectivity entered hospitals, clinicians moved faster than governance. They created access where it was needed. Systems responded later. Risks were discovered after adoption.</p>



<figure class="wp-block-image size-large is-resized"><img data-recalc-dims="1" decoding="async" width="696" height="575" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=696%2C575&#038;ssl=1" alt="" class="wp-image-21629" style="width:871px;height:auto" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=1024%2C846&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=300%2C248&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=768%2C634&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=1536%2C1269&amp;ssl=1 1536w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=2048%2C1692&amp;ssl=1 2048w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=150%2C124&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=696%2C575&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=1068%2C882&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?resize=1920%2C1586&amp;ssl=1 1920w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Hal-Wolf-2.png?w=1392&amp;ssl=1 1392w" sizes="(max-width: 696px) 100vw, 696px" /><figcaption class="wp-element-caption">Photo Credit: HIMSS &#8211; Hal Wolf, President and CEO, HIMSS, in the mainstage conversation &#8220;Recognizing the Value Proposition Criteria While Selecting AI Applications&#8221; with Drs. Kohane and Balicer.</figcaption></figure>



<p>Artificial intelligence now follows that same trajectory, though at far greater speed and with far greater consequences. Web connectivity gave quick access to information. Artificial intelligence influences how that information is interpreted and acted upon.</p>



<p><em>“We have to go faster,”</em> Mr. Wolf said. <em>“But there needs to be structure around it.”</em></p>



<p>That is the leadership challenge of this moment. Speed without structure creates exposure. Structure without speed creates irrelevance. The tension between the two is not something to resolve. It is something to manage continuously.</p>



<p>The industry has responded predictably to artificial intelligence. It has started where risk is lowest and return is clearest. Documentation, scheduling and revenue cycle optimization have become the entry points. These applications reduce burden and improve efficiency. They are necessary. However, they are not transformational.</p>



<p>The shift occurs when artificial intelligence moves into clinical decision-making. At that point, the question is no longer whether the system works. The question becomes whether it should be trusted.</p>



<p>Who owns a decision informed by an algorithm? How is accuracy validated? What happens when a clinician disagrees with a recommendation? These are not technical questions. They are questions of accountability. Artificial intelligence does not assume responsibility. It does not carry consequence. That remains with leadership.</p>



<p>Dr. Balicer reframed the conversation, shifting how the room thought about artificial intelligence. <em>“There’s no such thing as AI neutrality. Algorithms are just opinions embedded in code.”</em></p>



<figure class="wp-block-image size-full"><img data-recalc-dims="1" decoding="async" width="696" height="523" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?resize=696%2C523&#038;ssl=1" alt="" class="wp-image-21630" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?w=1024&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?resize=768%2C577&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?resize=150%2C113&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/HkPtQ7MB11g_0_171_2000_1501_0_x-large.jpg?resize=696%2C523&amp;ssl=1 696w" sizes="(max-width: 696px) 100vw, 696px" /><figcaption class="wp-element-caption">Photo Credit: CTECH &#8211; Ran Balicer, MD, Chief Innovation Officer at Clalit Health Services.</figcaption></figure>



<p>That insight is easy to acknowledge and difficult to operationalize. Every model reflects choices. What data is included? What outcomes are prioritized? What trade-offs are accepted? Those decisions are embedded in the system, shaping how it interprets information.</p>



<p>When a health system adopts an AI tool, it is not simply implementing technology. It is adopting a perspective.</p>



<p>At Clalit Health Services, alignment across payer and provider creates a system where priorities are consistent. Even there, external AI models introduce new assumptions. Those assumptions may not align with the system’s goals. If leadership does not define its own values, it inherits someone else’s.</p>



<p>This becomes real in proactive care. Artificial intelligence enables systems to identify patients at risk before they present. It allows for earlier intervention, often improving outcomes.</p>



<p>It also creates a new kind of pressure. <em>“The toughest choice is what not to do,”</em> Dr. Balicer said.</p>



<p>That statement deserves more attention than it receives. Health care has been built around responding to need. Artificial intelligence introduces the ability to anticipate it. When every patient can be flagged, every risk predicted and every intervention suggested, the system is no longer constrained by insight. It is constrained by capacity.</p>



<p>Artificial intelligence expands what can be done. It does not expand who can do it. Leadership becomes the act of choosing who does what based on validated data.</p>



<p>There is a moment that captures this shift. Imagine a primary care physician starting the day not with a schedule of patients who have called for appointments, but with a list generated by AI identifying individuals who are likely to experience clinical complications in the next six months. Some will develop chronic conditions. Some will require hospitalization. Some can be helped now – preventively.</p>



<h2 class="wp-block-heading">The physician cannot see them all. Artificial intelligence expands what is possible. Leadership decides what is essential and permissible.</h2>



<p>The industry often responds to complexity with activity. Organizations pilot, test and explore. They engage broadly without committing deeply. This creates motion. It rarely creates progress. Pilots are nothing more than experiments. At some point, leadership must decide what to scale, what to stop and what defines value.</p>



<p>Hal Wolf grounded the conversation in discipline. Without a defined, shared objective, effort becomes noise. Pilots create learning, though they often avoid decision-making. Leadership requires clarity. What problem are we solving? What outcome defines success? What are we willing to prioritize? Without those answers, artificial intelligence adds another layer of complexity to an already complex system.</p>



<p>Dr. Kohane brought the conversation back to the discipline of leadership. It cannot remain abstract. It must be informed by experience.</p>



<p><em>“Go and pay a few bucks and use three or four of the models… get a feel for what this does,”</em> Dr. Kohane advised.</p>



<p>That is not a call for technical fluency. It is a call for leadership proximity. Leaders cannot guide what they do not understand. Artificial intelligence does not behave consistently across models. It produces different answers, shaped by different assumptions. Without direct engagement, those differences remain hidden, and leadership becomes removed from the very decisions it is responsible for guiding.</p>



<p>This is where many organizations hesitate. Artificial intelligence feels complex, and complexity invites delegation. At this moment, delegation creates distance. Leadership is required to move closer, not further away.</p>



<h2 class="wp-block-heading"><strong>Artificial intelligence is not reducing the role of leadership. It is redefining it.</strong></h2>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" loading="lazy" decoding="async" width="696" height="536" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=696%2C536&#038;ssl=1" alt="" class="wp-image-21631" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=1024%2C789&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=300%2C231&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=768%2C591&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=1536%2C1183&amp;ssl=1 1536w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=2048%2C1577&amp;ssl=1 2048w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=150%2C116&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=696%2C536&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=1068%2C822&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?resize=1920%2C1479&amp;ssl=1 1920w, https://i0.wp.com/medika.life/wp-content/uploads/2026/04/Gil-Bashe-1.png?w=1392&amp;ssl=1 1392w" sizes="auto, (max-width: 696px) 100vw, 696px" /><figcaption class="wp-element-caption">Photo Credit: HIMSS &#8211; Gil Bashe, Chair Global Health and Purpose, FINN Partners and Editor-in-Chief, Medika Life, at HIMSS moderating the mainstage session &#8220;Recognizing the Value Proposition Criteria While Selecting AI Applications.&#8221;</figcaption></figure>



<p>This is not a gradual transition. It is already underway. Artificial intelligence is embedded in workflows, shaping decisions and influencing behavior in real time. The system is adapting whether leadership is ready or not.</p>



<p>The question is no longer whether artificial intelligence will shape the future of health. It will. The question is whether leadership will shape how it is applied.</p>



<p>Artificial intelligence will not fix health. It will scale whatever we allow it to touch. The question is whether it will scale what is best in health or what we have yet to fix.</p>
<p>The post <a href="https://medika.life/ai-will-not-fix-health-care-leadership-might/">AI Will Not Fix Health Care &#8211; Leadership Might</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21627</post-id>	</item>
		<item>
		<title>From AI Excitement to Execution: Why Health Leaders Must Now Master the “How”</title>
		<link>https://medika.life/from-ai-excitement-to-execution-why-health-leaders-must-now-master-the-how/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Fri, 06 Mar 2026 20:02:51 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Doctors]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<category><![CDATA[Clalit Health Services]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[Governance]]></category>
		<category><![CDATA[Hal Wolf]]></category>
		<category><![CDATA[HIMSS]]></category>
		<category><![CDATA[HIMSS 2026]]></category>
		<category><![CDATA[Isaac Kohane]]></category>
		<category><![CDATA[LLMs]]></category>
		<category><![CDATA[OpenAI]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21616</guid>

					<description><![CDATA[<p>Artificial intelligence is advancing in health care faster than almost any other technology in modern medical history. According to research from McKinsey &#38; Company, artificial intelligence could generate as much as $100 billion annually across healthcare systems worldwide, through improved clinical decision support and workflow efficiency, as well as advances in drug development and population [&#8230;]</p>
<p>The post <a href="https://medika.life/from-ai-excitement-to-execution-why-health-leaders-must-now-master-the-how/">From AI Excitement to Execution: Why Health Leaders Must Now Master the “How”</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Artificial intelligence is advancing in health care faster than almost any other technology in modern medical history. According to research from <a href="https://www.mckinsey.com/industries/life-sciences/our-insights/generative-ai-in-the-pharmaceutical-industry-moving-from-hype-to-reality">McKinsey &amp; Company, artificial intelligence could generate as much as $100 billion annually across healthcare systems worldwide</a>, through improved clinical decision support and workflow efficiency, as well as advances in drug development and population health analytics. The promise is extraordinary, and the pace of implementation shows little sign of slowing.</p>



<p>History, however, offers a useful caution. Breakthrough technologies in medicine rarely achieve their full potential simply because they exist. Their real impact depends on whether the institutions responsible for health-care delivery know how to adopt them wisely, integrate them responsibly and align them with their mission to improve patient health.</p>



<p>Artificial intelligence now stands at that same threshold. The industry has moved beyond fascination with what algorithms can do and entered a more demanding phase: determining how these tools should be evaluated, governed, and integrated into the environments where care is delivered. At the same time, some health professionals are turning to AI not to augment their knowledge but on the assumption that its output is patient-care ready.</p>



<p>Across the health ecosystem, leaders are discovering that the most important questions about artificial intelligence are not technological. They are organizational, ethical and operational. Which AI systems genuinely improve clinical decision-making? Which tools strengthen the efficiency of hospitals and health systems? Which innovations introduce complexity without delivering measurable benefit?</p>



<p>Answering those questions requires a perspective that bridges policy leadership, real-world care delivery, and the scientific foundations of biomedical informatics. That convergence of experience sits at the center of a “Views From the Top” mainstage discussion at the <a href="https://www.himssconference.com/register/?utm_source=google&amp;utm_medium=cpc&amp;utm_campaign=US-EN-GA-BRD-PHA-Search-HIMSS26-Core&amp;gad_source=1&amp;gad_campaignid=23028140300&amp;gbraid=0AAAAA9RcRS5VnIvOREOV_e8P__ck9VjTR&amp;gclid=Cj0KCQiAk6rNBhCxARIsAN5mQLtutruWd-5p1Wn2AwXHxy1v-Qi3oN1ADdz2MjA78q5H_4qD6RWCwNIaAoAHEALw_wcB">HIMSS Global Health Conference &amp; Exhibition</a>, where some 35,000 leaders whose work spans the global health ecosystem will examine how organizations can recognize the true value proposition of artificial intelligence applications before embedding them into health-care systems.</p>



<p>The perspectives shaping this discussion reflect three essential dimensions of responsible artificial intelligence in health: governance frameworks that guide innovation, operational insights from large-scale health care delivery, and scientific rigor grounded in biomedical informatics. Together, these vantage points illuminate the path from technological promise to practical value.</p>



<h2 class="wp-block-heading"><strong>Governing Innovation in a Rapidly Changing Health Ecosystem</strong></h2>



<p>Digital transformation in health rarely succeeds simply because technology exists. It succeeds when organizations develop leadership frameworks capable of evaluating innovation, managing risk and aligning new tools with patient-centered goals.</p>



<p>Few leaders have observed the evolution of digital health across as many national systems and institutional environments as <a href="https://iowa.himss.org/resource-bio/harold-f-wolf-iii">Hal Wolf, president and chief executive officer of HIMSS</a>, <a href="https://en.wikipedia.org/wiki/Ran_Balicer">Ran Balicer, MD, PhD, chief innovation officer of Clalit Health Services</a> and <a href="https://dbmi.hms.harvard.edu/people/isaac-kohane">Isaac Kohane, MD, PhD, chair of biomedical informatics at Harvard Medical School</a>. The three will step onto the mainstage at HIMSS to share their “View from the Top” in a session titled: <a href="https://app.himssconference.com/event/himss-2026/planning/UGxhbm5pbmdfNDMyNzU3NA==">“Recognizing the &#8216;Value Proposition&#8217; Criteria While Selecting AI Applications</a>.”</p>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" loading="lazy" decoding="async" width="696" height="392" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=696%2C392&#038;ssl=1" alt="" class="wp-image-21617" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=1024%2C576&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=300%2C169&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=768%2C432&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=1536%2C864&amp;ssl=1 1536w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=150%2C84&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=696%2C392&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?resize=1068%2C601&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?w=1920&amp;ssl=1 1920w, https://i0.wp.com/medika.life/wp-content/uploads/2026/03/116-H26-VFTT-Social-Graphic.png?w=1392&amp;ssl=1 1392w" sizes="auto, (max-width: 696px) 100vw, 696px" /><figcaption class="wp-element-caption">Image provided by HIMSS</figcaption></figure>



<p>Through his work with global government health ministries, hospital networks, and technology innovators worldwide, Wolf has consistently emphasized that technological progress must be anchored in governance and trust.</p>



<p><em>“Digital health transformation is not about technology alone. It is about leadership, governance, and the trust that allows innovation to improve care,”</em> Wolf has said in discussions about global digital health transformation.</p>



<p>Artificial intelligence intensifies this leadership challenge because its influence extends far beyond traditional clinical tools. AI systems increasingly operate across multiple layers of healthcare delivery. Some applications assist clinicians by analyzing medical data or suggesting treatment options. Others function within hospitals&#8217; and health systems&#8217; operational infrastructure, helping manage patient flow, prioritize diagnostic reviews, and allocate scarce resources.</p>



<p>These operational algorithms rarely capture headlines; however, they shape the environment in which health care is delivered. Decisions about which cases are reviewed first, how clinicians allocate their attention, and how health systems manage capacity can profoundly influence patient outcomes.</p>



<p>For leaders responsible for health systems, artificial intelligence cannot be treated as simply another technological upgrade. It must be evaluated through governance structures capable of understanding how algorithms function, what assumptions shape their recommendations, and how their use aligns with institutional priorities.</p>



<p>Without that oversight, innovation risks amplifying complexity rather than improving care. Instead of informing, it can spread misinformation.</p>



<h2 class="wp-block-heading"><strong>Aligning Artificial Intelligence With the Values of Medicine</strong></h2>



<p>Governance provides the policy foundation for responsible adoption of artificial intelligence, but real-world implementation reveals a second challenge: ensuring that AI systems operate effectively within healthcare delivery itself.</p>



<p>Large population health systems increasingly use advanced analytics to anticipate risk, manage chronic disease, and allocate clinical resources across diverse communities. Within these environments, artificial intelligence is no longer a theoretical innovation. It is already influencing how health organizations prioritize patients, coordinate care and deploy limited resources.</p>



<p>That operational perspective is central to Ran Balicer, MD, PhD, of <a href="https://www.clalit-innovation.org/clalitresearchinstitute">Clalit Health Services</a>, one of the world’s most advanced data-driven health systems. The Clalit integrated infrastructure connects hospitals, clinics, and community health programs through longitudinal datasets that support predictive analytics at the national scale.</p>



<p>Experience within such systems reinforces an important insight: artificial intelligence models do not function independently of human judgment. They reflect priorities embedded in their design and the assumptions guiding their deployment.</p>



<p><em>“Algorithms are opinions embedded in code,”</em> Balicer has observed in discussions about the role of artificial intelligence in population health.</p>



<p>In practice, this means that AI systems interpret clinical data through frameworks shaped by human choices. The way a model defines risk, prioritizes cases, or recommends interventions reflects decisions about what matters most within a healthcare environment.</p>



<p>Those decisions carry ethical implications. When artificial intelligence helps determine which patients receive immediate attention or which cases are escalated for further review, transparency about how algorithms function becomes essential to maintaining trust among clinicians and patients alike. The scientific frontier of health-care AI reinforces that concern.</p>



<p>Isaac Kohane, MD, PhD, who co-authored the <em>Institute of Medicine Report on Precision Medicine</em> that became the template for national efforts, has spent decades exploring how machine learning can advance medicine while preserving the judgment that defines clinical practice. His research emphasizes that artificial intelligence in healthcare must align with the ethical traditions and professional responsibilities of medicine.</p>



<p><em>“AI systems in medicine must ultimately reflect the values of the profession they serve,”</em> Kohane has written in discussions about AI alignment in biomedical informatics.</p>



<p>This perspective highlights a crucial distinction between technological capability and clinical responsibility. Many AI models entering healthcare environments were originally designed for broader computational tasks rather than the nuanced realities of patient care. Medicine operates within a landscape shaped by uncertainty, empathy, and accountability, and technologies introduced into that environment must reflect those values.</p>



<p>Ensuring that artificial intelligence aligns with the principles guiding health-care delivery, therefore, represents one of the most important scientific and ethical challenges facing the future of health.</p>



<h2 class="wp-block-heading"><strong>The Discipline Required to Make Innovation Matter</strong></h2>



<p>The health sector has experienced waves of technological enthusiasm before. Electronic health records promised seamless information exchange, but then introduced administrative burdens on health professionals when implemented without thoughtful workflow design. Data analytics promised unprecedented insight, but sometimes led to fragmentation when systems failed to communicate across institutions.</p>



<p>Artificial intelligence now stands at a similar moment in the evolution of health technology.</p>



<p>Its capabilities in supporting decision-making flow are extraordinary, yet realizing them will require disciplined leadership to evaluate, integrate and govern AI tools within health-care delivery systems. Health leaders must learn to ask deeper questions before embracing the next algorithmic breakthrough. What problem does this system truly solve? How does it strengthen clinical practice? What assumptions guide its recommendations? How does its use advance the mission of improving patient health?</p>



<p>These questions move the conversation beyond technological novelty toward operational practicality. It’s among the many reasons these three global leaders step to the HIMSS stage together.</p>



<p>Artificial intelligence will undoubtedly reshape the health ecosystem in the years ahead. Its long-term impact, however, will not be determined solely by the sophistication of algorithms or the speed of technological progress. Along with learning how to leverage AI, ChatGPT and LLMs, users will need heightened cognitive awareness.</p>



<p>It will be determined by whether the health community develops the discipline and ability required to translate innovation into systems that strengthen care, support clinicians and improve the health of the populations they serve.</p>



<p>The real story of artificial intelligence in health is no longer about what machines can do. It is about how wisely the health sector chooses to use them.</p>
<p>The post <a href="https://medika.life/from-ai-excitement-to-execution-why-health-leaders-must-now-master-the-how/">From AI Excitement to Execution: Why Health Leaders Must Now Master the “How”</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21616</post-id>	</item>
		<item>
		<title>How Transactional Medicine Threatens the Future of Your Health</title>
		<link>https://medika.life/how-transactional-medicine-threatens-the-future-of-your-health/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Mon, 02 Mar 2026 01:07:46 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Practitioners]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Healthcare Policy and Opinion]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[American Medical Association]]></category>
		<category><![CDATA[Annals of Family Medicine]]></category>
		<category><![CDATA[BMJ Open]]></category>
		<category><![CDATA[Danny Sands]]></category>
		<category><![CDATA[e-Patient Dave deBronkart]]></category>
		<category><![CDATA[Gil Bashe]]></category>
		<category><![CDATA[Healing the Sick Care System: Why People Matter]]></category>
		<category><![CDATA[Health Innovation]]></category>
		<category><![CDATA[Health Tech]]></category>
		<category><![CDATA[OECD]]></category>
		<category><![CDATA[Primary Care Medicine]]></category>
		<category><![CDATA[Society for Participatory Medicine]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21604</guid>

					<description><![CDATA[<p>Patients rarely describe healing in technological terms. They speak instead about whether someone listened, if their physician remembered them and how their concerns were understood in context. Being heard is a tipping point for establishing trust, and trust shapes when patients seek care, what they disclose and how faithfully they follow guidance. That relationship becomes [&#8230;]</p>
<p>The post <a href="https://medika.life/how-transactional-medicine-threatens-the-future-of-your-health/">How Transactional Medicine Threatens the Future of Your Health</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Patients rarely describe healing in technological terms. They speak instead about whether someone listened, if their physician remembered them and how their concerns were understood in context. Being heard is a tipping point for establishing trust, and trust shapes when patients seek care, what they disclose and how faithfully they follow guidance. That relationship becomes the foundation upon which every diagnostic and therapeutic decision – and perhaps future advances – rests.</p>



<p>Primary care continuity allows physicians to develop a longitudinal awareness that no episodic encounter or health tech tool can replicate. Over time, physicians learn what is normal for each patient and what represents meaningful clinical change. Subtle physiological shifts, early symptoms or emerging risk factors appear not as isolated data points from a blood test, but as part of a social narrative unfolding across time. Early recognition allows earlier intervention, often before disease takes its profound toll.</p>



<p>Clinical evidence confirms the protective effect of continuity. It’s not a matter of opinion. A systematic review published in <em><a href="https://bmjopen.bmj.com/content/8/6/e021161">BMJ Open</a></em> found that patients with sustained continuity of care had significantly lower mortality than those with fragmented care. Continuity did not just improve satisfaction; it altered survival. The physician who knows the patient can detect disease earlier and guide care more effectively.</p>



<p>Listening allows physicians to detect patterns that laboratory values alone cannot explain. Patients share information differently when they believe that their physician understands them and remembers their history. This sustained awareness allows physicians to identify emerging illnesses without relying solely on reactive diagnostics. Continuity transforms listening into clinical intelligence and a deeper care partnership.</p>



<p>In <em><a href="https://a.co/d/08Xmu2qv">Healing the Sick Care System: Why People Matter</a></em>, which has become a surprise Amazon bestseller, one insight repeatedly emerges: patients do not seek care only for treatment; they seek reassurance that someone who knows them is guiding their journey. Physicians who listen across time accumulate knowledge that cannot be captured in a chart alone. That memory allows earlier recognition, more accurate interpretation, and wiser intervention. Healing begins in that continuity of understanding.</p>



<h2 class="wp-block-heading"><strong>Transactional Care Solves Symptoms but Sacrifices Understanding</strong></h2>



<p>Health has, for some time, been undergoing a structural shift toward transactional encounters. Walk-in clinics, urgent care centers, and virtual platforms provide speed and accessibility that patients value. These models address immediate symptoms efficiently and fill important gaps in care delivery. Accessibility has improved, yet continuity has weakened.</p>



<p>Transactional medicine treats episodes rather than trajectories. Each encounter begins without the benefit of longitudinal understanding. Clinical decisions are made from a single time-stamped snapshot, without knowledge of how symptoms emerged or how physiology has changed over time. Care becomes reactive rather than interpretive.</p>



<p>Research demonstrates the consequences of this fragmentation. Studies published in the <em><a href="https://www.annfammed.org/content/16/6/492.short">Annals of Family Medicine</a></em> show that sustained primary care continuity reduces hospitalizations and lowers healthcare expenditures. Early recognition prevents complications that require more invasive, costly interventions. Fragmentation delays recognition and increases clinical risk.</p>



<p>In fact, physicians in the vanguard of building relationships encourage their patients to ask questions. In their co-authored book <em><a href="https://a.co/d/0fLCuzj2">Let Patients Help! A “Patient Engagement</a>” handbook – how doctors, nurses, patients and caregivers can partner for better care</em>, “<a href="https://en.wikipedia.org/wiki/Dave_deBronkart">e-Patient Dave” deBronkart</a> and <a href="https://drdannysands.com/">Daniel Z. Sands, MD, MPH</a>, the founder of the <a href="https://participatorymedicine.org/">Society for Participatory Medicine</a>, offer <a href="https://participatorymedicine.org/what-is-participatory-medicine/10-things-clinicians-say-that-encourage-patient-engagement/">10 suggestions</a> for what clinicians can say to encourage patient engagement.</p>



<p>This shift also alters how patients engage with care. Connections that develop over time can be lost quickly when continuity disappears. Patients become consumers navigating isolated services rather than partners guided across time. The clinical relationship weakens, and with it the interpretive depth that makes prevention possible.</p>



<p>Health systems globally recognize the value of continuity. <a href="https://www.oecd.org/content/dam/oecd/en/publications/reports/2021/11/health-at-a-glance-2021_cc38aa56/ae3016b9-en.pdf">The Organization for Economic Co-operation and Development (OECD</a>), a Paris-based international organization that promotes policies to improve economic and social well-being globally, reports that hospital admissions for chronic diseases, often preventable through effective primary care, account for a substantial share of healthcare utilization. Systems that preserve physician-led primary care continuity achieve better outcomes and greater efficiency. Relationship stabilizes care.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe loading="lazy" title="Steve Jobs - Start with the Customer Experience" width="696" height="392" src="https://www.youtube.com/embed/QGIUa2sSYFI?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<h2 class="wp-block-heading"><strong>Innovation Requires Connection to Fulfill Its Potential</strong></h2>



<p>This shift toward transactional care carries life-threatening implications that extend far beyond the patient experience. It also directly affects whether health innovation fulfills its promise or becomes a compensatory tool addressing fragmentation. Innovation depends on context to generate meaningful insight. Context emerges through continuity. Without that context, even life-saving innovations lose their value.</p>



<p>Artificial intelligence, predictive analytics, and remote monitoring technologies are designed to detect patterns across time. These tools require longitudinal clinical awareness to distinguish meaningful change from statistical variation. Physicians who know their patients can interpret innovation correctly and act earlier. Innovation becomes transformative when anchored in relationship.</p>



<p>Fragmented care weakens this interpretive capacity. Data collected across disconnected encounters lacks coherence. Predictive tools lose precision when longitudinal context is absent. Innovation becomes reactive, identifying disease after symptoms emerge rather than predicting disease before it develops.</p>



<p>Technology achieves its highest value when it extends the physician’s ability to listen and observe. Remote monitoring allows earlier recognition of physiological change. Predictive analytics strengthens preventive intervention. Innovation amplifies continuity when guided by sustained physician leadership.</p>



<p>Team-based primary care models reflect this principle. Nurse practitioners and physician assistants expand access while physician leadership preserves interpretive continuity. Research published in <em><a href="https://www.sciencedirect.com/science/article/pii/S0889159120307832">Medical Care Research and Review</a></em> confirms that coordinated team-based care maintains strong clinical outcomes. Physician oversight ensures that innovation remains integrated within longitudinal care. It also improves health professional job satisfaction and reduces burnout.</p>



<p>Innovation cannot replace the relationship at the center of medicine. Algorithms detect patterns but do not understand meaning, and they do not strengthen physician/patient ties. Devices collect data, but do not know the patient behind the data. Physicians translate information into guidance by integrating technology with human understanding.</p>



<p>The future of health innovation depends on preserving continuity between patient and physician. Technology deployed within sustained relationships strengthens prevention and improves outcomes. Technology deployed within fragmented systems often compensates for structural weakness rather than transforming care. Continuity determines whether innovation fulfills its promise.</p>



<p>Health systems now face a defining moment. Transactional care offers speed and convenience. Relational care offers understanding and prevention. Innovation will achieve its full potential only when it strengthens the continuity that allows physicians to listen, learn, and guide patients across time.</p>



<p>Healing begins with being heard. Health technology succeeds when it helps physicians listen more deeply and act more wisely in the service of the people who entrust them with their care.</p>
<p>The post <a href="https://medika.life/how-transactional-medicine-threatens-the-future-of-your-health/">How Transactional Medicine Threatens the Future of Your Health</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21604</post-id>	</item>
		<item>
		<title>Who Will Direct Patient Care: Physicians or Technocrats?</title>
		<link>https://medika.life/who-will-direct-patient-care-physicians-or-technocrats/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Mon, 09 Feb 2026 15:07:29 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[American Medical Asssociation]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<category><![CDATA[Danny Sands]]></category>
		<category><![CDATA[Healing the Sick Care System: Why People Matter]]></category>
		<category><![CDATA[Humata Health]]></category>
		<category><![CDATA[John Nosta]]></category>
		<category><![CDATA[John Whyte]]></category>
		<category><![CDATA[Optum]]></category>
		<category><![CDATA[Society for Participatory Medicine]]></category>
		<category><![CDATA[Technologies]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21571</guid>

					<description><![CDATA[<p>Not long ago, a physician’s most powerful instrument was not a machine, an algorithm, or a digital platform. It was presence. Listening with intention. Judgment shaped by experience and compassion. Today, as medicine is being reshaped by artificial intelligence, predictive analytics and digital systems, technologies are advancing at remarkable speed. These innovations promise earlier diagnosis, [&#8230;]</p>
<p>The post <a href="https://medika.life/who-will-direct-patient-care-physicians-or-technocrats/">Who Will Direct Patient Care: Physicians or Technocrats?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Not long ago, a physician’s most powerful instrument was not a machine, an algorithm, or a digital platform. It was presence. Listening with intention. Judgment shaped by experience and compassion. Today, as medicine is being reshaped by artificial intelligence, predictive analytics and digital systems, technologies are advancing at remarkable speed.</p>



<p>These innovations promise earlier diagnosis, greater precision and improved efficiency by augmenting the knowledge and insight that health professionals develop through years of care. Yet beneath this progress lies a more difficult question. Will we use technology to strengthen the physician–patient relationship, or allow it to redefine the nature of care?</p>



<p>As written in <em><a href="https://a.co/d/04ILhkhW">Healing the Sick Care System: Why People Matter</a></em>, “…the system is not broken because it lacks innovation, talent, or investment, but because it has lost sight of the people it exists to serve.” Technology is not the epicenter of care. It is meant to support communication, deepen relationships, and strengthen the human bond at the center of medicine.</p>



<p>Yet as artificial intelligence becomes embedded in diagnostics, decision support, documentation, reimbursement and care navigation, extraordinary clinical potential is accompanied by a growing tension.</p>



<h2 class="wp-block-heading"><strong>Two Encounters, One Technology</strong></h2>



<p>For instance, in a primary care practice, a physician begins a routine visit with a patient in their mid-50s who has diabetes and hypertension. An ambient AI system seamlessly documents conversations, captures symptoms, updates medications, and generates a clinical note. The physician no longer turns toward a screen; connection with the patient remains unbroken. The patient speaks openly about fatigue, stress, and concern about long-term health.</p>



<p>Midway through the visit, the electronic record surfaces an AI-generated prompt suggesting an adjustment in therapy based on predictive risk modeling. The physician pauses, not to mindlessly follow the algorithm, but to ask additional questions about daily routine, financial constraints, and willingness to adopt lifestyle changes. Technology informs conversation. It does not replace it.</p>



<p>When the visit ends, documentation is complete, the treatment decision is shared, and the patient leaves with confidence, clarity and a sense of partnership in care. The physician directs the encounter. Technology supports judgment and understanding. The visit feels thoughtful, personal and grounded in relationship.</p>



<p>Now imagine the same technology in a different environment. The documentation remains seamless. The prompts still appear. The system functions efficiently. But here, the pace is set as much by operational demand as by clinical judgment. The schedule tightens. The visit is short. The physician moves quickly from one room to the next, guided less by the patient’s story and more by the system’s tempo. The encounter becomes transactional and compressed. Technology has not changed. What has changed is who is directing the care.</p>



<p>This is the quiet divide now shaping modern medicine. One path preserves physician-directed care, where technology supports human understanding. The other reflects system-directed transaction, where efficiency begins to overshadow the relationship. The difference lies not in the tool but in the priorities that shape its use.</p>



<p>This question of direction is not theoretical. It reflects a deeper shift in how technology may shape human judgment itself. Innovation theorist <a href="https://www.psychologytoday.com/us/contributors/john-nosta">John Nosta,</a> whose work has long been rooted in the health sector and now spans a broader landscape, cautions in his <em>Psychology Today</em> column: <em>“Artificial intelligence is far from neutral, and we need to be careful by calling it simply a tool. By simulating understanding, it may reshape what humans expect from thinking itself. Over time, it can erode the habits required for discernment. And this danger is cumulative. It doesn&#8217;t announce itself as failure. It arrives as convenience.”</em> Nosta is also the author of the upcoming book: <em>The Borrowed Mind—Reclaiming Human Thought in the Age of AI.</em></p>



<h2 class="wp-block-heading"><strong>When Technology Reflects the System Around It</strong></h2>



<p>Technology itself is not the challenge. When developed in partnership with physicians, nurses, and other health professionals, it can be transformative. Many of the most effective innovations emerge when developers observe the realities of care and design tools that strengthen human interaction rather than disrupt it.</p>



<p><a href="https://www.ama-assn.org/about/authors-news-leadership-viewpoints/john-j-whyte-md-mph">John Whyte, MD, MPH, CEO of the American Medical Association</a>, has emphasized that artificial intelligence must support physicians and care teams, not replace clinical judgment, and that technology should strengthen, not weaken, the physician–patient relationship.</p>



<p>A clear example of this tension is emerging in the context of prior authorization. Health professionals and administrative staff often spend more than a dozen hours each week navigating authorization requirements, time taken directly from patient care. <a href="https://www.optum.com/en/about-us/news/page.hub5.ai-powered-digital-prior-authorization.html">New AI-enabled platforms, such as Optum’s Digital Authorization Complete powered by Humata Health</a>, are designed to remove that burden by embedding real-time automation into clinical workflows and reducing manual steps. These innovations restore something invaluable: time.</p>



<p>Now, the deeper question is not technological but human. When time is returned to the system, how will it be allocated to health professionals? Will it allow clinicians to deepen their understanding of patient needs and strengthen their connection with patients? Or will it simply be used to push more patients through each shift? The technology is neutral. Its meaning is shaped by people’s intent.</p>



<p>Health care operates within systems shaped by financial and operational pressures. In a transactionally driven environment, even well-intentioned technology can be redirected toward productivity rather than connection. A tool designed to restore time can become a mechanism to increase throughput. A system intended to support thoughtful care can accelerate volume in a fee-for-service environment. Technology inevitably reflects the values and objectives of the system in which it is deployed. It is not the technology that directs decisions and action; it&#8217;s the leadership.</p>



<p>The scale of investment underscores the stakes. The global AI in health market, estimated at roughly $36–39 billion in 2025, is projected to grow substantially in the coming decade. Investment shapes priorities. Priorities shape design. Design shapes experience. And experience shapes trust.</p>



<p>Emerging guidance aligned with the <a href="https://www.ama-assn.org/practice-management/digital-health/augmented-intelligence-medicine">American Medical Association</a> emphasizes that artificial intelligence must remain under meaningful clinical oversight. Technology must support physicians and care teams, not replace judgment or responsibility. Governance, transparency, and continuous evaluation are essential to ensure that technology strengthens patient safety, clinical reasoning, and trust.</p>



<p>This perspective aligns with participatory medicine. <a href="https://drdannysands.com/">Dr. Danny Sands of the Society for Participatory Medicine</a> has described health care not as a service transaction, but as a collaboration between patient and clinician. In that view, technology should support relationship-centered care, not redirect medicine toward system-driven throughput.</p>



<h2 class="wp-block-heading"><strong>The Direction of Care</strong></h2>



<p>Health systems face real pressures: workforce shortages, clinician burnout, chronic disease, and financial strain. These realities demand smarter and more scalable solutions. Artificial intelligence offers meaningful progress. It can detect disease earlier, reduce administrative burden, and support more informed decisions. But efficiency is not healing.</p>



<p>Healing occurs when patients feel understood, supported, and guided by clinicians who have the time and space to listen and respond with care. When technology restores time and that time deepens connection, it fulfills its promise. When reclaimed time becomes additional volume, something essential is diminished.</p>



<p>Artificial intelligence will continue to shape medicine. The deeper question is not whether technology will advance, but who will decide how it is used and for what purpose.</p>



<p>If guided primarily by efficiency, care risks becoming faster but less human. If guided by partnership with physicians and patients, it can restore time to listen, space to understand, and the ability to decide together. Technology is not the healer. People are.</p>



<p>When guided by clarity of purpose, with the patient at the center of effort, and grounded in physician-guided judgment, technology becomes what it was always meant to be: a force that strengthens knowledge, deepens understanding, and restores the bond between physician and patient. Systems matter. They enable scale, coordination, and progress. Yet their purpose is fulfilled only when they serve people. Health care is at its best when human connection and well-designed systems work together in the service of healing.</p>
<p>The post <a href="https://medika.life/who-will-direct-patient-care-physicians-or-technocrats/">Who Will Direct Patient Care: Physicians or Technocrats?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21571</post-id>	</item>
		<item>
		<title>‘I Can’t Tell You’: Attorneys, Relatives Struggle To Find Hospitalized ICE Detainees</title>
		<link>https://medika.life/i-cant-tell-you-attorneys-relatives-struggle-to-find-hospitalized-ice-detainees/</link>
		
		<dc:creator><![CDATA[Medika Life]]></dc:creator>
		<pubDate>Mon, 02 Feb 2026 02:45:11 +0000</pubDate>
				<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Doctors]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Industry News]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[California]]></category>
		<category><![CDATA[Federal Policies]]></category>
		<category><![CDATA[health]]></category>
		<category><![CDATA[ICE]]></category>
		<category><![CDATA[Immigration]]></category>
		<category><![CDATA[Minnesota]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21543</guid>

					<description><![CDATA[<p>[By Claudia Boyd-Barrett. Illustration by Oona Zenda. Reprinted with permission from KFF Health News.] Lydia Romero strained to hear her husband’s feeble voice through the phone. A week earlier, immigration agents had grabbed Julio César Peña from his front yard in Glendale, California. Now, he was in a hospital after suffering a ministroke. He was shackled to [&#8230;]</p>
<p>The post <a href="https://medika.life/i-cant-tell-you-attorneys-relatives-struggle-to-find-hospitalized-ice-detainees/">‘I Can’t Tell You’: Attorneys, Relatives Struggle To Find Hospitalized ICE Detainees</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong>[By <a href="https://kffhealthnews.org/news/author/claudia-boyd-barrett/">Claudia Boyd-Barrett</a>. Illustration by <a href="https://kffhealthnews.org/news/author/oona-tempest/">Oona Zenda</a>. Reprinted with permission from KFF Health News.]</strong></p>



<p>Lydia Romero strained to hear her husband’s feeble voice through the phone.</p>



<p>A week earlier, immigration agents had grabbed Julio César Peña from his front yard in Glendale, California. Now, he was in a hospital after suffering a ministroke. He was shackled to the bed by his hand and foot, he told Romero, and agents were in the room, listening to the call. He was scared he would die and wanted his wife there.</p>



<p>“What hospital are you at?” Romero asked.</p>



<p>“I can’t tell you,” he replied.</p>



<p>Viridiana Chabolla, Peña’s attorney, couldn’t get an answer to that question, either. Peña’s deportation officer and the medical contractor at the Adelanto ICE Processing Center refused to tell her. Exasperated, she tried calling a nearby hospital, Providence St. Mary Medical Center.</p>



<p>“They said even if they had a person in ICE custody under their care, they wouldn’t be able to confirm whether he’s there or not, that only ICE can give me the information,” Chabolla said. The hospital confirmed this policy to KFF Health News.</p>



<p>Family members and attorneys for patients hospitalized after being detained by federal immigration officials said they are facing extreme difficulty trying to locate patients, get information about their well-being, and provide them emotional and legal support. They say many hospitals refuse to provide information or allow contact with these patients. Instead, hospitals allow immigration officers to call the shots on how much — if any — contact is allowed, which can deprive patients of their constitutional right to seek legal advice and leave them vulnerable to abuse, attorneys said.</p>



<p>Hospitals say they are trying to protect the safety and privacy of patients, staff, and law enforcement officials, even while hospital employees in&nbsp;<a href="https://laist.com/news/politics/boyle-heights-hospital-ice-agents-patient-care-privacy-rights">Los Angeles</a>,&nbsp;<a href="https://sahanjournal.com/health/ice-agents-hospitals-hennepin-county-medical-center/">Minneapolis</a>, and&nbsp;<a href="https://www.portlandmercury.com/news/2025/12/12/48187215/legacy-staff-and-nurses-union-say-hospital-policies-harm-immigrants">Portland, Ore.</a>, cities where Immigration and Customs Enforcement has conducted immigration raids, say it’s made their jobs difficult. Hospitals have used what are sometimes called blackout procedures, which can include registering a patient under a pseudonym, removing their name from the hospital directory, or prohibiting staff from even confirming that a patient is in the hospital.</p>



<p>“We’ve heard incidences of this blackout process being used at multiple hospitals across the state, and it’s very concerning,” said Shiu-Ming Cheer, the deputy director of immigrant and racial justice at the California Immigrant Policy Center, an advocacy group.</p>



<p>Some Democratic-led states,&nbsp;<a href="https://kffhealthnews.org/news/article/california-ice-immigrant-protections-hospitals-clinics-agents/">including California, Colorado, and Maryland</a>, have enacted legislation that seeks to protect patients from immigration enforcement in hospitals. However, those policies do not address protections for people already in ICE custody.</p>



<h2 class="wp-block-heading"><strong>More Detainees Hospitalized</strong></h2>



<p>Peña is among&nbsp;<a href="https://www.theguardian.com/us-news/ng-interactive/2025/aug/29/trump-immigration-ice-cbp-data">more than 350,000 people</a>&nbsp;arrested by federal immigration authorities since President Donald Trump returned to the White House. As arrests and detentions have climbed, so too have reports of people taken to hospitals by immigration agents because of illness or injury — due to preexisting conditions or problems stemming from their arrest or detention.</p>



<p>ICE has&nbsp;<a href="https://vasquez.house.gov/media/press-releases/statement-us-representative-gabe-vasquez-reports-ices-increasingly-aggressive#:~:text=WASHINGTON%2C%20D.C.%20%E2%80%93%20Today%2C%20U.S.,and%20respect%20for%20human%20rights.">faced criticism</a>&nbsp;for using&nbsp;<a href="https://www.propublica.org/article/videos-ice-dhs-immigration-agents-using-chokeholds-citizens">aggressive</a>&nbsp;and&nbsp;<a href="https://www.startribune.com/man-fatally-shot-by-federal-agents-in-south-minneapolis/601570050">deadly</a>&nbsp;tactics, as well as for&nbsp;<a href="https://www.aclu.org/news/immigrants-rights/inside-an-ice-detention-center-detained-people-describe-severe-medical-neglect-harrowing-conditions">reports of mistreatment</a>&nbsp;and&nbsp;<a href="https://www.kff.org/racial-equity-and-health-policy/health-issues-for-immigrants-in-detention-centers/#:~:text=The%20Government%20Accountability%20Office%20(GAO,detained%20less%20than%206%20months.">inadequate medical care</a>&nbsp;at its facilities. Sen. Adam Schiff (D-Calif.) told reporters at a Jan. 20 news conference outside a detention center he visited in California City that he spoke to a diabetic woman held there who had not received treatment in&nbsp;<a href="https://www.latimes.com/california/story/2026-01-20/u-s-senators-tour-california-city-detention-center-decry-conditions-inadequate-medical-care">two months</a>.</p>



<p>While there are no publicly available statistics on the number of people sick or injured in ICE detention, the agency’s news releases point to&nbsp;<a href="https://www.ice.gov/newsroom">32 people</a>&nbsp;who died in immigration custody in 2025. Six more have died this year.</p>



<p>The Department of Homeland Security, which oversees ICE, did not respond to a request for information about its policies or Peña’s case.</p>



<p>According to&nbsp;<a href="https://www.ice.gov/doclib/detention-standards/2025/nds2025.pdf">ICE’s guidelines</a>, people in custody should be given access to a telephone, visits from family and friends, and private consultation with legal counsel. The agency can make administrative decisions, including about visitation, when a patient is in the hospital, but should defer to hospital policies on contacting next of kin when a patient is seriously ill, the guidelines state.</p>



<p>Asked in detail about hospital practices related to patients in immigration custody and whether there are best practices that hospitals should follow, Ben Teicher, a spokesperson for the American Hospital Association, declined to comment.</p>



<p>David Simon, a spokesperson for the California Hospital Association, said that “there are times when hospitals will — at the request of law enforcement — maintain confidentiality of patients’ names and other identifying characteristics.”</p>



<p>Although policies vary, members of the public can typically call a hospital and ask for a patient by name to find out whether they’re there, and often be transferred to the patient’s room, said William Weber, an emergency physician in Minneapolis and medical director for the Medical Justice Alliance, which advocates for the medical needs of people in law enforcement custody. Family members and others authorized by the patient can visit. And medical staff routinely call relatives to let them know a loved one is in the hospital, or to ask for information that could help with their care.</p>



<p>But when a patient is in law enforcement custody, hospitals frequently agree to restrict this kind of information sharing and access, Weber said. The rationale is that these measures prevent unauthorized outsiders from threatening the patient or law enforcement personnel, given that hospitals lack the security infrastructure of a prison or detention center. High-profile patients such as celebrities sometimes also request this type of protection.</p>



<p>Several attorneys and health care providers questioned the need for such restrictions. Immigration detention is civil, not criminal, detention. The Trump administration says it’s focused on&nbsp;<a href="https://www.whitehouse.gov/articles/2025/03/president-trump-is-removing-killers-rapists-and-drug-dealers-from-our-streets/">arresting and deporting criminals</a>, yet most of those arrested have no criminal conviction, according to data compiled by the&nbsp;<a href="https://tracreports.org/immigration/quickfacts/">Transactional Records Access Clearinghouse</a>&nbsp;and several news outlets.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2026/01/Hospital-blackouts-01.jpg?w=696&#038;ssl=1" alt="A man sits on his bike in the backyard of his home surrounded by plants and flowers on a sunny day." class="wp-image-2149285"/><figcaption class="wp-element-caption">Julio Cesar Peña, who has terminal kidney disease, sits on his bike in the backyard of his home in Glendale, California. His family had a hard time locating him when he was hospitalized after being detained by Immigration and Customs Enforcement.(Peña family)</figcaption></figure>



<h2 class="wp-block-heading"><strong>Taken Outside His Home</strong></h2>



<p>According to Peña’s wife, Romero, he has no criminal record. Peña came to the United States from Mexico in sixth grade and has an adult son in the U.S. military. The 43-year-old has terminal kidney disease and survived a heart attack in November. He has trouble walking and is partially blind, his wife said. He was detained Dec. 8 while resting outside after coming home from dialysis treatment.</p>



<p>Initially, Romero was able to find her husband through the&nbsp;<a href="https://locator.ice.gov/odls/#/search">ICE Online Detainee Locator System</a>. She visited him at a temporary holding facility in downtown Los Angeles, bringing him his medicines and a sweater. She then saw he’d been moved to the Adelanto detention center. But the locator did not show where he was after he was hospitalized.</p>



<p>When she and other relatives drove to the detention facility to find him, they were turned away, she said. Romero received occasional calls from her husband in the hospital but said they were less than 10 minutes long and took place under ICE surveillance. She wanted to know where he was so she could be at the hospital to hold his hand, make sure he was well cared for, and encourage him to stay strong, she said.</p>



<p>Shackling him and preventing him from seeing his family was unfair and unnecessary, she said.</p>



<p>“He’s weak,” Romero said. “It’s not like he’s going to run away.”</p>



<p><a href="https://www.ice.gov/doclib/detention-standards/2025/nds2025.pdf">ICE guidelines</a>&nbsp;say contact and visits from family and friends should be allowed “within security and operational constraints.” Detainees have&nbsp;<a href="https://kffhealthnews.org/news/article/ice-immigrants-hospitals-detained-california-privacy-rights/">a constitutional right</a>&nbsp;to speak confidentially with an attorney.<a href="https://kffhealthnews.org/news/article/ice-immigrants-hospitals-detained-california-privacy-rights/"></a>&nbsp;Weber said immigration authorities should tell attorneys where their clients are and allow them to talk in person or use an unmonitored phone line.</p>



<p>Hospitals, though, fall into a gray area on enforcing these rights, since they are primarily focused on treating medical needs, Weber said. Still, he added, hospitals should ensure their policies align with the law.</p>



<h2 class="wp-block-heading"><strong>Family Denied Access</strong></h2>



<p>Numerous immigration attorneys have spent weeks trying to locate clients detained by ICE, with their efforts sometimes thwarted by hospitals.</p>



<p>Nicolas Thompson-Lleras, a Los Angeles attorney who counsels immigrants facing deportation, said two of his clients were registered under aliases at different hospitals in Los Angeles County last year. Initially, the hospitals denied the clients were there and refused to let Thompson-Lleras meet with them, he said. Family members were also denied access, he said.</p>



<p>One of his clients was&nbsp;<a href="https://www.latimes.com/california/story/2025-10-07/federal-agents-held-shackled-a-seriously-injured-man-hospital-bed-37-days">Bayron Rovidio Marin</a>, a car wash worker injured during a raid in August. Immigration agents surveilled him for over a month at Harbor-UCLA Medical Center, a county-run facility, without charging him.</p>



<p>In November, the Los Angeles County Board of Supervisors voted to&nbsp;<a href="https://assets-us-01.kc-usercontent.com/0234f496-d2b7-00b6-17a4-b43e949b70a2/dc3c5a6a-e25c-4c90-8482-dad9d63e4e2e/Agenda%20111825_links.pdf">curb the use</a>&nbsp;of blackout policies for patients under civil immigration custody at county-run hospitals. In a statement, Arun Patel, the chief patient safety and clinical risk management officer for the Los Angeles County Department of Health Services, said the policies are designed to reduce safety risks for patients, doctors, nurses, and custody officers.</p>



<p>“In some situations, there may be concerns about threats to the patient, attempts to interfere with medical care, unauthorized visitors, or the introduction of contraband,” Patel said. “Our goal is not to restrict care but to allow care to happen safely and without disruption.”</p>



<h2 class="wp-block-heading"><strong>Leaving Patients Vulnerable</strong></h2>



<p>Thompson-Lleras said he’s concerned that hospitals are cooperating with federal immigration authorities at the expense of patients and their families and leaving patients vulnerable to abuse.</p>



<p>“It allows people to be treated suboptimally,” Thompson-Lleras said. “It allows people to be treated on abbreviated timelines, without supervision, without family intervention or advocacy. These people are alone, disoriented, being interrogated, at least in Bayron’s case, under pain and influence of medication.”</p>



<p>Such incidents are alarming to hospital workers. In Los Angeles, two health care professionals who asked not to be identified by KFF Health News, out of concern for their livelihoods, said that ICE and hospital administrators, at public and private hospitals, frequently block staff from contacting family members for people in custody, even to find out about their health conditions or what medications they’re on. That violates medical ethics, they said.</p>



<p>Blackout procedures are another concern.</p>



<p>“They help facilitate, whether intentionally or not, the disappearance of patients,” said one worker, a physician for the county’s Department of Health Services and part of a coalition of concerned health workers from across the region.</p>



<p>At Legacy Emanuel Medical Center in Portland, nurses publicly expressed outrage over what they saw as hospital cooperation with ICE and the flouting of patient rights. Legacy Health has&nbsp;<a href="https://www.portlandmercury.com/news/2026/01/23/48271076/legacy-emanuel-sends-cease-and-desist-to-nurses-union-over-ice-statements">sent a cease and desist letter</a>&nbsp;to the nurses’ union, accusing it of making “false or misleading statements.”</p>



<p>“I was really disgusted,” said Blaire Glennon, a nurse who quit her job at the hospital in December. She said numerous patients were brought to the hospital by ICE with serious injuries they sustained while being detained. “I felt like Legacy was doing massive human rights violations.”</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2026/01/Hospital-blackouts-02.jpg?w=696&#038;ssl=1" alt="A young man leans down to hug a woman. Neither of their faces are visible to the camera." class="wp-image-2149288"/><figcaption class="wp-element-caption">Julio Peña Jr. hugs his stepmother, Lydia Romero, outside an immigration detention facility in downtown Los Angeles as they try to get information about his father, Julio Cesar Peña, who was detained by ICE in December.(Immigrant Defenders Law Center)</figcaption></figure>



<h2 class="wp-block-heading"><strong>Handcuffed While Unconscious</strong></h2>



<p>Two days before Christmas, Chabolla, Peña’s attorney, received a call from ICE with the answer she and Romero had been waiting for. Peña was at Victor Valley Global Medical Center, about 10 miles from Adelanto, and about to be released.</p>



<p>Excited, Romero and her family made the two-hour-plus drive from Glendale to the hospital to take him home.</p>



<p>When they got there, they found Peña intubated and unconscious, his arm and leg still handcuffed to the hospital bed. He’d had a severe seizure on Dec. 20, but no one had told his family or legal team, his attorney said.</p>



<p>Tim Lineberger, a spokesperson for Victor Valley Global Medical Center’s parent company, KPC Health, said he could not comment on specific patient cases, because of privacy protections. He said the hospital’s policies on patient information disclosure comply with state and federal law.</p>



<p>Peña was finally cleared to go home on Jan. 5. No court date has been set, and his family is filing a petition to adjust his legal status based on his son’s military service. For now, he still faces deportation proceedings.</p>
<p>The post <a href="https://medika.life/i-cant-tell-you-attorneys-relatives-struggle-to-find-hospitalized-ice-detainees/">‘I Can’t Tell You’: Attorneys, Relatives Struggle To Find Hospitalized ICE Detainees</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21543</post-id>	</item>
		<item>
		<title>Why Biological Learning Demands the Friction We Seek to Delete?</title>
		<link>https://medika.life/why-biological-learning-demands-the-friction-we-seek-to-delete/</link>
		
		<dc:creator><![CDATA[Atefeh Ferdosipour]]></dc:creator>
		<pubDate>Wed, 07 Jan 2026 18:47:31 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Atefeh Ferdosipour]]></category>
		<category><![CDATA[Behaviorial Health]]></category>
		<category><![CDATA[Fiction-Based AI]]></category>
		<category><![CDATA[LLMs]]></category>
		<category><![CDATA[Skinner]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21516</guid>

					<description><![CDATA[<p>This short piece, as always, is born out of my passion for studying how theories can help us use Artificial Intelligence more effectively. I believe now more than ever that without interdisciplinary research, we won’t be able to logically face the challenges of the Cognitive Age. Systematically speaking, the key to identifying challenges lies in [&#8230;]</p>
<p>The post <a href="https://medika.life/why-biological-learning-demands-the-friction-we-seek-to-delete/">Why Biological Learning Demands the Friction We Seek to Delete?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>This short piece, as always, is born out of my passion for studying how theories can help us use <em>Artificial Intelligence</em> more effectively. I believe now more than ever that without interdisciplinary research, we won’t be able to logically face the challenges of the Cognitive Age.</p>



<p>Systematically speaking, the key to identifying challenges lies in examining fundamental issues, not just their consequences. For example, if we want to fix the flaws in the learning process, we must first redefine the roots of deep learning and its underlying mechanics. We may even need to redefine them repeatedly to understand how to solve the problems arising from mind-based technologies.</p>



<p>Let me explain what I mean through one of the most debated topics of our time: the mental laziness caused by the way <em>AI</em> is rewriting our brain&#8217;s habits. To understand this, we need to look at the dynamics of deep learning in the brain. By grasping this process through interdisciplinary research, we might find ways to make <em>AI</em> learning feel more like natural deep learning.</p>



<p>The goal isn&#8217;t just to know the biochemistry of cells. Before looking at what happens inside an organism, we should ask:</p>



<p>Why do we usually prefer learning through <em>AI</em> over the effortful, traditional human way?</p>



<p>You might say the answer is obvious: because learning with technology is effortless and fast.</p>



<p>As a learning specialist, I’d like to answer this from a theoretical perspective.</p>



<p>First, we must accept a reality: Human deep learning is naturally a challenging process. It is fundamentally different from the vast amounts of data we consume today through formal or informal education assisted by <em>LLMs</em>.</p>



<h2 class="wp-block-heading">The Logic of Immediate Reward: From Skinner to the Present</h2>



<p>There is strong research showing that learners prefer a small, immediate reward over a larger, delayed one. This was first highlighted by B.F. <em>Skinner</em> (1953), the pioneer of operant conditioning. (I’ve previously written about how this connects to <em>AI</em>.)</p>



<p>Later, others expanded on this preference for effortless reward. In short, by the behavioral logic of Skinner’s theory, humans look for shortcuts.</p>



<p>AI is currently the ultimate shortcut, giving the best answer in seconds without any real struggle. From this view, it’s not just about the mind; it’s about behavioral economics.</p>



<p>A behavior that leads to a quick reward tends to be repeated.</p>



<p><em>Richard Herrnstein</em> (1961), a student of Skinner&#8217;s, developed a mathematical formula called the Matching Law. He showed that organisms don&#8217;t evaluate a single reward in isolation; they choose between options. Given two choices, a living being puts its energy into the one that pays off faster and more directly.</p>
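<p>As a rough sketch, the Matching Law says the share of behavior allocated to an option matches that option&#8217;s share of reinforcement. The rates below are hypothetical, chosen only to illustrate the formula:</p>

```python
# Herrnstein's Matching Law: B1 / (B1 + B2) = R1 / (R1 + R2),
# where B is the rate of responding and R the rate of reinforcement.

def matching_share(r_fast: float, r_slow: float) -> float:
    """Fraction of behavior predicted to go to the faster-paying option."""
    return r_fast / (r_fast + r_slow)

# A hypothetical option that reinforces three times as often
# should attract three-quarters of the organism's responding.
print(matching_share(r_fast=3.0, r_slow=1.0))  # 0.75
```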



<p>In <em>behavioral economics</em>, this phenomenon is known as <em>temporal discounting</em> (Ainslie, 1975). The value of a reward drops the longer you have to wait for it. Simply put, the reward loses its shine in the organism&#8217;s mind because it requires patience.</p>
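<p>One common formalization of this idea is hyperbolic discounting, where subjective value falls as V = A / (1 + kD) for amount A, delay D, and an individual discount rate k. The numbers below are illustrative assumptions, not experimental data:</p>

```python
# Hyperbolic temporal discounting: the subjective value of a reward
# shrinks with delay. k controls how steeply a given mind discounts.

def discounted_value(amount: float, delay: float, k: float = 0.1) -> float:
    return amount / (1 + k * delay)

instant = discounted_value(amount=10, delay=0)      # a quick AI answer
delayed = discounted_value(amount=100, delay=180)   # hard-won deep learning

# The objectively larger reward loses its shine once delay is priced in.
print(instant > delayed)  # True
```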



<p>We observe this phenomenon every day with <em>AI</em> users, particularly those using <em>ChatGPT</em>. Students, for instance, might feel that spending hours writing a thesis is stupid or inefficient when they can get an answer in a split second. They don&#8217;t just feel productive; they feel smart for bypassing the effort.</p>



<p>Even if you tell them that the struggle is what actually builds their brain, they often won&#8217;t listen. They choose the immediate payout over the long-term value. </p>



<p><em>Evolutionary</em> <em>psychology</em> explains this too: an immediate reward is guaranteed, while a future one is uncertain. Since we are wired for survival, we grab what’s available now.</p>



<h2 class="wp-block-heading">Brain Biochemistry and the Deep Learning Process</h2>



<p>When we learn something deeply, three key things happen at a neurological level:</p>



<ol class="wp-block-list">
<li>Exposure to New Information: The nervous system makes its first contact with data for which it has no existing pattern.</li>

<li>Cognitive Load: This is that stuck feeling when a mental process is harder than expected. It&#8217;s the effort the brain needs to process unfamiliar data (Sweller, 1988). This friction is essential.</li>

<li>Processing and Protein Synthesis: If the information is processed correctly, chemical signals trigger the creation of proteins that physically change the brain&#8217;s structure to store that knowledge (Kandel, 2001).</li>
</ol>



<p>This is why sleep is so vital. Most of this protein synthesis happens while we rest.&nbsp;</p>



<p>One of the most beautiful parts of learning is when we stop thinking about a problem, but our brain keeps working on it.&nbsp;</p>



<p>Through the Default Mode Network or DMN (Raichle, 2015), the brain makes random, creative connections. This is where true creativity is born.</p>



<h2 class="wp-block-heading">Toward Friction-Based AI</h2>



<p>If deep learning is the result of protein synthesis triggered by challenge, then the paradox of modern AI is clear: By removing the friction, technology is removing the learning.&nbsp;</p>



<p>We are facing a biological crisis where human brains, instead of producing genius and problem-solving skills, are becoming mere terminals for receiving quick hits of dopamine.</p>



<p>My proposal is simple: How can we turn AI from a passive answer-giver into a Cognitively Challenging Provocateur? </p>



<p>We need to design models that don&#8217;t bypass cognitive load but manage it in a personalized way.&nbsp;</p>



<p>I call this Friction-Based AI: a model where algorithms are programmed not for the shortest path, but for the most effective learning path. This is an open invitation to researchers, neuroscientists, and AI architects to collaborate on this new paradigm. My ideas are ready to be turned into actionable proposals.</p>



<p>As a final note, I believe the way we interact with AI is a skill in itself. Even if everyone has the same tools, the results aren&#8217;t equal. Efficiency depends on the how.&nbsp;</p>



<p>I am currently developing a startup idea to address these exact challenges in EdTech: it’s EdTechx.</p>

<p>Dr. Atefeh F.</p>



<h2 class="wp-block-heading">References</h2>



<p>• Ainslie, G. (1975). Specious reward: A behavioral theory of impulsiveness and impulse control. Psychological Bulletin.</p>



<p>• Herrnstein, R. J. (1961). Relative and absolute strength of response as a function of frequency of reinforcement. Journal of the Experimental Analysis of Behavior.</p>



<p>• Kandel, E. R. (2001). The Molecular Biology of Memory Storage: A Dialogue Between Genes and Synapses. Science.</p>



<p>• Raichle, M. E. (2015). The Brain&#8217;s Default Mode Network. Annual Review of Neuroscience.</p>



<p>• Skinner, B. F. (1953). Science and Human Behavior. Simon and Schuster.</p>



<p>• Sweller, J. (1988). Cognitive Load During Problem Solving: Effects on Learning. Cognitive Science.</p>
<p>The post <a href="https://medika.life/why-biological-learning-demands-the-friction-we-seek-to-delete/">Why Biological Learning Demands the Friction We Seek to Delete?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21516</post-id>	</item>
		<item>
		<title>AI in 2026 – Boom, Bust or Backlash in Healthcare?</title>
		<link>https://medika.life/ai-in-2026-boom-bust-or-backlash-in-healthcare/</link>
		
		<dc:creator><![CDATA[Tom Lawry]]></dc:creator>
		<pubDate>Wed, 07 Jan 2026 18:29:01 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[TeleHealth]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[GenAI]]></category>
		<category><![CDATA[Generative AI]]></category>
		<category><![CDATA[Hacking Health Care]]></category>
		<category><![CDATA[Health Care Nation]]></category>
		<category><![CDATA[HIMSS]]></category>
		<category><![CDATA[Tom Lawry]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21510</guid>

					<description><![CDATA[<p>It was the fall of 2022 when large language models and Generative AI burst out of research labs and onto Main Street. Since then, every day seems to bring another AI breakthrough that challenges how work gets done. In my role advising organizations on AI strategy and deployments, I see a consistent pattern among healthcare [&#8230;]</p>
<p>The post <a href="https://medika.life/ai-in-2026-boom-bust-or-backlash-in-healthcare/">AI in 2026 – Boom, Bust or Backlash in Healthcare?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-full"><img data-recalc-dims="1" loading="lazy" decoding="async" width="478" height="79" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/Tom-Lawry-Pic-2.png?resize=478%2C79&#038;ssl=1" alt="" class="wp-image-21513" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/Tom-Lawry-Pic-2.png?w=478&amp;ssl=1 478w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/Tom-Lawry-Pic-2.png?resize=300%2C50&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/Tom-Lawry-Pic-2.png?resize=150%2C25&amp;ssl=1 150w" sizes="auto, (max-width: 478px) 100vw, 478px" /></figure>



<p>It was the fall of 2022 when large language models and Generative AI burst out of research labs and onto Main Street. Since then, every day seems to bring another AI breakthrough that challenges how work gets done.</p>



<p>In my role advising organizations on AI strategy and deployments, I see a consistent pattern among healthcare leaders: excitement about what AI could unlock, paired with exhaustion from the volume of noise, pressure, and competing claims.</p>



<h2 class="wp-block-heading"><strong><em>Welcome to 2026.</em></strong></h2>



<p>As predictions flood inboxes and social feeds, focused on what AI <em>might</em> do next, I want to ground the conversation in something more useful. Rather than forecasting outcomes, let’s focus on three forces already at work—forces that will determine whether AI delivers real value in healthcare or quietly stalls.</p>



<p>Will 2026 be a year of boom, bust, or backlash?</p>



<p>The honest answer is yes.</p>



<h2 class="wp-block-heading"><strong>Boom: Early Wins—and an AI Arms Race</strong></h2>



<p>Let’s start with what’s working.</p>



<p>Healthcare is seeing real, if narrow, gains from AI:</p>



<ul class="wp-block-list">
<li>Ambient documentation is reducing administrative burden</li>

<li>Imaging and pathology tools are improving speed and consistency</li>

<li>Operational and revenue cycle applications are driving incremental efficiency</li>
</ul>



<p>These are not moonshots. They are targeted solutions addressing specific pain points. And they matter.</p>



<p>At the same time, healthcare is now firmly in an AI arms race.</p>



<p>Every EHR vendor, medical device company, life sciences firm, and digital health startup is racing to declare itself “AI-native.” Roadmaps are packed with copilots, assistants, agents, and automation claims. No vendor wants to be perceived as falling behind.</p>



<p>That pressure is accelerating innovation—but it’s also compressing timelines, encouraging over-promising, and pushing organizations to adopt faster than they can realistically absorb.</p>



<p>Boom energy is real.</p>



<p>But it is also uneven and fragile.</p>



<p><strong>Prediction:</strong> Within two years, most AI used in provider organizations will arrive embedded inside core systems and devices already in use. Intelligence will not be something teams “add on”; it will be something they inherit.</p>



<p><strong>Recommendation: </strong>Understand where AI is already embedded across your vendor ecosystem and what’s coming next. Engage early through advisory councils or pilots. Engage and prepare clinicians before introducing these capabilities into workflows. AI should never arrive as a surprise.</p>



<h2 class="wp-block-heading"><strong>Bust: When Pilots Multiply, but Value Doesn’t</strong></h2>



<p>Generative AI has dominated innovation agendas, yet only a fraction of pilots ever reach sustained production. A survey cited by MIT reports that roughly <strong>95% of business AI pilots fail to generate measurable returns.</strong></p>



<p>This is not evidence that AI lacks value.</p>



<p>It is evidence that many organizations lack discipline.</p>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" loading="lazy" decoding="async" width="696" height="420" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=696%2C420&#038;ssl=1" alt="" class="wp-image-21511" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=1024%2C618&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=300%2C181&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=768%2C464&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=150%2C91&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=696%2C420&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?resize=1068%2C645&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image.jpeg?w=1274&amp;ssl=1 1274w" sizes="auto, (max-width: 696px) 100vw, 696px" /></figure>



<p>High failure rates are normal in early markets. Technology matures. Tools improve. But value only materializes when leaders focus on fundamentals: design, data readiness, workflow integration, and ownership.</p>



<p>Most AI initiatives fail not because the technology doesn’t work, but because success is never clearly defined. Projects are launched out of curiosity, vendor pressure, or fear of being left behind. Clinical impact, operational accountability, and economic value are clarified too late—if at all.</p>



<p>Equally damaging is the underestimation of the human systems AI enters. Healthcare work is relational, regulated, and trust-dependent. When AI is introduced without redesigning workflows, preparing staff, or clarifying responsibility, it creates friction—not relief. Adoption then stalls quietly.</p>



<p><strong>Prediction:</strong> In 2026, organizations will run fewer AI pilots—but with much higher expectations. Boards and executives will require clearer evidence of clinical, workforce, or financial value before approving new initiatives.</p>



<p><strong>Recommendation:</strong> Move from “fail fast” to “fail before you scale.” Define success upfront, assign ownership early, and redesign workflows in tandem with technology. AI initiatives without a credible path to value should be halted immediately<strong>.</strong></p>



<h2 class="wp-block-heading"><strong>Backlash: Fear, Workforce Anxiety, and the Trust Gap</strong></h2>



<p>The most underestimated force shaping AI’s trajectory in 2026 is neither technical nor financial.</p>



<p>It’s human.</p>



<p>History offers context. When automobiles first appeared, they were seen as dangerous and socially disruptive. Red Flag laws required people to walk ahead of vehicles waving flags and capped speeds at just a few miles per hour. These laws weren’t about innovation—they were about fear, control, and adjustment.</p>



<p>Healthcare AI is entering a similar phase.</p>



<p>Workforce research shows healthcare workers are among the most cautious about AI adoption, citing concerns about trust, transparency, and job impact. This caution is not irrational. Healthcare has a long history of technology being imposed rather than co-designed.</p>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" loading="lazy" decoding="async" width="696" height="317" src="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=696%2C317&#038;ssl=1" alt="" class="wp-image-21512" srcset="https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=1024%2C467&amp;ssl=1 1024w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=300%2C137&amp;ssl=1 300w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=768%2C350&amp;ssl=1 768w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=150%2C68&amp;ssl=1 150w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=696%2C317&amp;ssl=1 696w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?resize=1068%2C487&amp;ssl=1 1068w, https://i0.wp.com/medika.life/wp-content/uploads/2026/01/image-1.jpeg?w=1174&amp;ssl=1 1174w" sizes="auto, (max-width: 696px) 100vw, 696px" /></figure>



<p>As a result, scrutiny is increasing—particularly from labor organizations and state legislators. Recent bills, including those limiting AI’s role in clinical decision-making and licensed practice, reflect not anti-innovation sentiment, but unresolved trust and knowledge gaps.</p>



<p>Innovation does not scale without trust.</p>



<p>In 2026, AI scrutiny will intensify, especially with labor organizations and at the state legislative level.</p>



<p>As I write this, the Chair of the New York State Senate Committee on Internet and Technology just introduced a bill (S7263) to “protect patients and front-line care workers from the adverse effects of AI tools in risky or untested settings.” The bill prohibits chatbots from performing the duties of licensed nurses and puts strong guardrails around the use of AI in healthcare settings.</p>



<p>I often write about the need for a balanced approach to defining both the “gas and guardrails” that guide AI’s use in health and medicine. Incentives and safeguards are equally important.</p>



<p><strong>Prediction</strong>: Expect increased legislative activity and labor engagement around AI in healthcare throughout 2026. Such actions should not be dismissed simply as anti-innovation. They reflect something deeper: a trust and knowledge gap that needs to be closed.</p>



<p><strong>Recommendation: </strong>Create durable AI value by investing in workforce and consumer education. Clinicians need clarity—not just on how AI works, but on how it supports professional judgment rather than replaces it.</p>



<h2 class="wp-block-heading"><strong>From Awe to Analytical</strong></h2>



<p>The year ahead will test the resolve of leadership. Transformation in healthcare is rarely linear—and never clean.</p>



<p>Vendors will continue to showcase breakthroughs. The hype will continue. But 2026 is not the year for cheerleading.</p>



<p>It is the year for realism.</p>



<p>The most effective leaders are moving from awe to analysis—recognizing that AI value does not come from the technology itself, but from the opportunity it creates to rethink how work gets done.</p>



<p>In that sense, AI value is—and always will be—a uniquely human process.</p>
<p>The post <a href="https://medika.life/ai-in-2026-boom-bust-or-backlash-in-healthcare/">AI in 2026 – Boom, Bust or Backlash in Healthcare?</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21510</post-id>	</item>
		<item>
		<title>Guns, Race, and Profit: The Pain of America’s Other Epidemic</title>
		<link>https://medika.life/guns-race-and-profit-the-pain-of-americas-other-epidemic/</link>
		
		<dc:creator><![CDATA[Medika Life]]></dc:creator>
		<pubDate>Fri, 29 Aug 2025 12:22:02 +0000</pubDate>
				<category><![CDATA[Bills and Legislation]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[News and Views]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Public Health]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[Fred Clasen-Kelly]]></category>
		<category><![CDATA[Gun violence]]></category>
		<category><![CDATA[Guns]]></category>
		<category><![CDATA[Kaiser Health News]]></category>
		<category><![CDATA[KHN]]></category>
		<category><![CDATA[Public Policy]]></category>
		<category><![CDATA[Renuka Rayasam]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21394</guid>

					<description><![CDATA[<p>BOGALUSA, La. — Less than a mile from a century-old mill that sustained generations in this small town north of New Orleans, 19-year-old Tajdryn Forbes was shot to death near his mother’s house. She found Forbes face down in the street in August 2023, two weeks before he had planned to move away from the [&#8230;]</p>
<p>The post <a href="https://medika.life/guns-race-and-profit-the-pain-of-americas-other-epidemic/">Guns, Race, and Profit: The Pain of America’s Other Epidemic</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>BOGALUSA, La. — Less than a mile from a century-old mill that sustained generations in this small town north of New Orleans, 19-year-old Tajdryn Forbes was shot to death near his mother’s house.</p>



<p>She found Forbes face down in the street in August 2023, two weeks before he had planned to move away from the empty storefronts, boarded-up houses, and poverty that make this one of the most troubled places in the nation.</p>



<p>Naketra Guy thought about how her son overcame losing his father at age 4 and was the glue of the family. She called him “humble” and “respectful,” a leader in the community and on the football field, where he shined.</p>



<p>Yet he could not outrun the grim statistics of his hometown. Bogalusa posts some of the worst health outcomes and poverty in Louisiana, a state that routinely ranks among the worst nationally in both. And Bogalusa has endured another indicator of poor public health: high levels of gun violence.</p>



<p>Since the beginning of the covid-19 pandemic, gun violence has shattered any sense of peace or progress here. Louisiana suffers the nation’s second-highest firearm&nbsp;<a href="https://www.cdc.gov/nchs/pressroom/sosmap/firearm_mortality/firearm.htm">death rate</a>&nbsp;— and Bogalusa, a predominantly Black community with 10,000 residents, has seen dozens of shootings and a violent crime rate approaching twice the national average.</p>



<p>A nearby team refused to play football at Bogalusa High School in fall 2022,&nbsp;<a href="https://bogalusadailynews.com/2022/11/04/breaking-albany-will-forfeit-friday-nights-football-game-at-bogalusa/">citing safety concerns</a>.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_04-resized.jpg?w=696&#038;ssl=1" alt="A photo of boarded-up buildings in Bogalusa." class="wp-image-2074930"/><figcaption class="wp-element-caption">Boarded-up buildings in downtown Bogalusa, Louisiana. Once known as “the Magic City” because of its giant mill and fast growth, the town now struggles with empty storefronts and blight.(Fred Clasen-Kelly/KFF Health News)</figcaption></figure>



<p>Bogalusa’s mayor, Tyrin Truong, was elected in 2022 at age 23 on his promises to fix entrenched challenges: few youth programs and good jobs, and perpetual crime and blight.</p>



<p>“I ran for mayor because I got sick of seeing our city painted as mini-New Orleans,” he said, “due to the high levels of youth gun violence.”</p>



<p>In January, the Louisiana State Police&nbsp;<a href="https://www.facebook.com/LouisianaStatePolice/posts/898227042498235/">arrested Truong</a>, accusing him of soliciting a prostitute and participating in a drug trafficking ring that allegedly used illicit proceeds to buy firearms. He has&nbsp;<a href="https://www.wwltv.com/article/news/local/northshore/bogalusa-mayor-truong-proclaims-innocence-amid-legal-battles-and-city-hall-drama/289-92df2865-6975-48f1-b39a-afa9518e4561">said he is innocent</a>. “I still haven’t been formally arraigned,” he told KFF Health News in late July, “and I haven’t been charged with anything.”</p>



<p>Every year tens of thousands of Americans —&nbsp;<a href="https://publichealth.jhu.edu/center-for-gun-violence-solutions/research-reports/gun-violence-in-the-united-states">one every few minutes</a>&nbsp;— are killed by gun violence on the scale of a public health epidemic.</p>



<p>Many thousands more are left to recover from severe injuries, crushing&nbsp;<a href="https://kffhealthnews.org/news/article/super-bowl-parade-shooting-survivors-donations-bills-wait-kansas-city/">medical debt</a>, and the&nbsp;<a href="https://kffhealthnews.org/news/article/still-a-lot-of-pain-four-years-after-mass-shooting-texas-community-grapples-with-fallout/">mental health toll</a>&nbsp;of losing loved ones.</p>



<p>Most headlines focus on America’s urban centers, but the numbers also reflect the growth of gun violence in places like Bogalusa, a pinprick of a town 75 miles north of New Orleans. In 2020, the gun violence death rate for rural communities&nbsp;<a href="https://publichealth.jhu.edu/sites/default/files/2022-05/2020-gun-deaths-in-the-us-4-28-2022-b.pdf">was 40% higher</a>&nbsp;than in large metropolitan areas, according to Johns Hopkins University.</p>



<p>Firearms are the No. 1 killer of children in the U.S., and no group suffers more than&nbsp;<a href="https://publichealth.jhu.edu/2024/guns-remain-leading-cause-of-death-for-children-and-teens#:~:text=In%202022,%20in%20the%201%20to%2017%20age%20group,%20Black">young Black people</a>. More Black boys and men ages 15 to 24 in 2023&nbsp;<a href="https://giffords.org/lawcenter/report/gun-violence-in-black-communities/">were killed in gun homicides</a>&nbsp;than from the next 15 leading causes of deaths combined. Though overall U.S. homicides&nbsp;<a href="https://www.fbi.gov/news/press-releases/fbi-releases-2023-crime-in-the-nation-statistics">dropped sharply</a>&nbsp;after the pandemic ended, adolescent gun deaths climbed even higher in the years after, according to&nbsp;<a href="https://www.bu.edu/sph/news/articles/2025/after-pandemic-spike-adolescent-gun-homicide-rates-show-no-sign-of-slowing/">research by Jonathan Jay</a>, an associate professor in the School of Public Health at Boston University.</p>



<p>“It has all the markers of an epidemic. It is a major driver of death and disability,” Jay said. “Gun violence does not get the attention it deserves. It is underrecognized because it disproportionately impacts Black and brown people.”</p>



<p>Rather than bolstering efforts to save lives, federal, state, and local government officials have undermined them. KFF Health News undertook an examination of gun violence since the pandemic, a period when firearm death rates surged. Reporters reviewed government reports and academic research and interviewed dozens of health policy experts, activists, and victims or their relatives. They reviewed corporate earnings reports from gun manufacturers and&nbsp;<a href="https://www.opensecrets.org/industries/indus?ind=Q13">data on the industry’s donations</a>&nbsp;to politicians.</p>



<p>In polling published in 2023 by KFF,&nbsp;<a href="https://www.kff.org/other/poll-finding/americans-experiences-with-gun-related-violence-injuries-and-deaths/">more than half of Americans</a>&nbsp;said they or a family member had been impacted by gun violence such as by seeing a shooting or being threatened, injured, or killed with a gun.</p>



<p>American politicians and regulators have put in place laws and practices that have helped enrich firearm and ammunition manufacturers — which tout&nbsp;<a href="https://www.nssf.org/government-relations/impact/">$91 billion in economic impact</a>&nbsp;— even as&nbsp;<a href="https://www.brookings.edu/articles/mapping-gun-violence-a-closer-look-at-the-intersection-between-place-and-gun-homicides-in-four-cities/">gun violence has terrorized neighborhoods</a>&nbsp;already damaged by white flight, systemic disinvestment, and other forms of racial discrimination.</p>



<p>President Donald Trump championed gun rights on the campaign trail and has&nbsp;<a href="https://www.opensecrets.org/outside-spending/detail/2016?cmte=National+Rifle+Assn&amp;tab=targeted_candidates">received millions</a>&nbsp;from the National Rifle Association,&nbsp;<a href="https://abcnews.go.com/Politics/deeply-troubling-gun-violence-prevention-groups-react-trump/story?id=115530910">to whose members he promised</a>, “No one will lay a finger on your firearms.” His administration has rolled back efforts under President Joe Biden to address the rise in gun violence.</p>



<p>Emboldened in his second term, Trump&nbsp;<a href="https://elections.bradyunited.org/resources/project-2025-guns">is pushing</a>&nbsp;to&nbsp;<a href="https://apnews.com/article/trump-policies-agenda-election-2024-second-term-d656d8f08629a8da14a65c4075545e0f">allow more guns</a>&nbsp;in schools, weaken federal oversight of the gun industry, override state and local gun laws, permit sales&nbsp;<a href="https://www.nytimes.com/2025/04/07/us/politics/trump-gun-control-measures.html">without background checks</a>, and cut funding for violence intervention.</p>



<p>Trump&nbsp;<a href="https://www.whitehouse.gov/presidential-actions/2025/02/protecting-second-amendment-rights/">ordered the attorney general</a>&nbsp;to review all Biden administration actions that “purport to promote safety but may have impinged on the Second Amendment rights of law-abiding citizens.”</p>



<p>The Biden administration said “<a href="https://bidenwhitehouse.archives.gov/briefing-room/statements-releases/2021/04/07/fact-sheet-more-details-on-the-biden-harris-administrations-investments-in-community-violence-interventions/">a historic spike in homicides</a>” during the pandemic took its greatest toll on racially segregated and high-poverty neighborhoods.</p>



<p>Black youths in four major cities were&nbsp;<a href="https://www.bu.edu/sph/news/articles/2023/during-covid-black-children-were-100x-more-likely-than-white-children-to-experience-gun-injuries/">100 times as likely</a>&nbsp;as white ones to experience a firearm assault, research showed. Gun suicides reached an all-time high, and for the first time the firearm suicide rate among older Black teens surpassed that of older white teens.</p>



<p>In Bogalusa, the pandemic gun violence spread fear. Among the victims killed were a 15-year-old attending a birthday party and a 24-year-old nationally known musician. Thirteen people were injured at a memorial for a man who himself had been shot. Residents said neighbors stopped sitting in their yards because of stray bullets.</p>



<p>Researchers say communities like Bogalusa endure a collective trauma that shatters their sense of safety. Two years after&nbsp;<a href="https://www.facebook.com/bogalusa.louisiana/posts/674646324700458/">Forbes’ death</a>, his mother says that when she leaves home her surviving children worry that she, too, might get shot.</p>



<p>Repercussions from the surge will last years, researchers said: Exposure to shootings increases risk for post-traumatic stress disorder, anxiety, suicide, depression, substance abuse, and poor school performance for survivors and those who live near them.</p>



<p>“We saw gun violence exposure go up for every group of children except white children, in the cities we studied,” Jay said. “<a href="https://kffhealthnews.org/news/article/gun-violence-data-public-health-experts-research-funds/">Limits on government funding</a>&nbsp;into gun violence research may stop us from ever knowing exactly why.”</p>



<h2 class="wp-block-heading"><strong>Politics of Pain</strong></h2>



<p>The year before Forbes died in Bogalusa, Biden signed into law the Bipartisan Safer Communities Act, considered the&nbsp;<a href="https://www.americanprogress.org/article/the-bipartisan-safer-communities-act-1-year-later/">most sweeping firearm legislation</a>&nbsp;in decades.</p>



<p>In a matter of months, Trump has systematically dismantled key provisions.</p>



<p>Efforts to regulate guns have long proven ineffective against the power of political and business interests that fill the streets with weapons. In 2020, the number of guns manufactured annually in the U.S. hit 11.3 million, more than double a decade earlier, according to&nbsp;<a href="https://www.atf.gov/firearms/docs/report/national-firearms-commerce-and-trafficking-assessment-firearms-commerce-volume/download">the federal government</a>. In 2022, the United States had nearly 78,000&nbsp;<a href="https://everytownresearch.org/report/firearms-dealers-and-their-impact/">licensed gun dealers</a>, more than its combined number of McDonald’s, Burger King, Wendy’s, and Subway locations, according to Everytown for Gun Safety, an advocacy group.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/GettyImages-1248275702-resized.jpg?w=696&#038;ssl=1" alt="A photo of a gun on the counter at a gun shop in Maryland." class="wp-image-2074924"/><figcaption class="wp-element-caption">A customer looks at a handgun at a gun shop in Capitol Heights, Maryland, in 2023.(Andrew Caballero-Reynolds/AFP via Getty Images)</figcaption></figure>



<p>The Biden administration&nbsp;<a href="https://bidenwhitehouse.archives.gov/briefing-room/statements-releases/2021/06/23/fact-sheet-biden-harris-administration-announces-comprehensive-strategy-to-prevent-and-respond-to-gun-crime-and-ensure-public-safety/">announced in 2021</a>&nbsp;it would attempt to reduce gun violence by adopting a “zero tolerance” policy toward firearm dealers who committed violations such as failing to run a required background check or selling to someone prohibited from buying a gun.</p>



<p>The federal Bureau of Alcohol, Tobacco, Firearms and Explosives, or ATF, which licenses gun dealers, has the authority to enforce laws meant to prevent illegal gun sales. In issuing an executive order, the Trump administration&nbsp;<a href="https://www.whitehouse.gov/fact-sheets/2025/02/fact-sheet-president-donald-j-trump-is-protecting-americans-second-amendment-rights/">declared that</a>, under Biden, the agency targeted “mom-and-pop shop small businesses who made innocent paperwork errors.”</p>



<p>From October 2010 to February 2022, the agency conducted more than 111,000 inspections, recommending revocation of a dealer’s license only 589 times, about 0.5% of cases, an inspector general’s report said. Even when it cited serious violations, the ATF rarely shut dealers down.</p>



<p>ATF leaders&nbsp;<a href="https://oig.justice.gov/sites/default/files/reports/23-062_0.pdf">told the inspector general’s office</a>&nbsp;that recommendations for license revocations increased after Biden’s zero-tolerance policy was implemented. In April, the Trump administration&nbsp;<a href="https://www.atf.gov/news/press-releases/doj-atf-repeal-ffl-inspection-policy-and-begin-review-two-final-rules">repealed it</a>.</p>



<p>Surgeon General Vivek Murthy&nbsp;<a href="https://kffhealthnews.org/news/article/gun-violence-us-surgeon-general-vivek-murthy-public-health-crisis/">last year declared</a>&nbsp;firearm violence a public health crisis. Within weeks of Trump’s inauguration,&nbsp;<a href="https://giffords.org/press-release/2025/03/trump-administration-deletes-surgeon-general-webpage-with-advisory-on-gun-violence/">his administration removed</a>&nbsp;<a href="https://kffhealthnews.org/wp-content/uploads/sites/2/2025/08/firearm-violence-advisory.pdf">the advisory</a>. Among the 15 leading U.S. causes of death, firearm injuries received less National Institutes of Health research funding per death than all but poisoning and falls, according to&nbsp;<a href="https://www.bradyunited.org/resources/research/reducing-firearm-violence">a 2024 analysis</a>&nbsp;by Brady, an anti-gun violence organization.&nbsp;<a href="https://www.bradyunited.org/press/trump-budget-cuts">Trump is trying to cut</a>&nbsp;that funding, too.</p>



<p>Trump’s Department of Justice&nbsp;<a href="https://kffhealthnews.org/news/article/gun-violence-prevention-trump-cuts-st-louis">abruptly cut 373 grants</a>&nbsp;in April for projects worth about $820 million, with a large share from gun violence intervention.</p>



<p>“We are going to lose a generation of community violence prevention folks,” said Volkan Topalli, a gun violence researcher at Georgia State University. “People are going to die, I’m sorry to say, but that is the bleak truth of this.”</p>



<p>Asked about its policies, the White House did not address questions about public health considerations around gun violence.</p>



<p>“Illegal violence of any sort is a crime issue, and President Trump has been clear since Day One that he is committed to Making America Safe Again by empowering law enforcement to uphold law and order,” White House spokesperson Kush Desai said.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/60_Inaugural_Address-Senate-resized.jpg?w=696&#038;ssl=1" alt="A photo of President Trump speaking at a podium after being sworn in." class="wp-image-2074921"/><figcaption class="wp-element-caption">President Donald Trump gives his inaugural address after being sworn in on Jan. 20.(Rosa Pineda/U.S. Senate)</figcaption></figure>



<p>Trump administration officials “want safer streets and less violence,” Topalli said. “They are hurting their cause.”</p>



<p>Garen Wintemute, an emergency medicine professor who directs the violence prevention program at the University of California-Davis, was among the first in the nation to consider guns and violence as a public health issue. He said race plays a significant role in perceptions about gun violence.</p>



<p>“People look at the demographic risk for firearm homicide and depending on the demographics of the people in the audience, I can see the transformation in their faces,” Wintemute said. “It’s like they’re saying, ‘Not my people, not my problem.’”</p>



<h2 class="wp-block-heading"><strong>Eroding Gun Restrictions</strong></h2>



<p>Trump’s incursions against public health efforts to contain gun violence are backed by lobbying power.</p>



<p>Firearm industry advocacy groups made millions of dollars in political donations in recent years, mostly to conservative causes and Republican candidates. That includes $1.4 million to Trump,&nbsp;<a href="https://www.opensecrets.org/industries/indus?ind=Q13">according to OpenSecrets</a>, which tracks campaign finance data.</p>



<p>The assassination of civil rights icon the Rev. Martin Luther King Jr. helped lead to the passage of the federal&nbsp;<a href="https://www.thetrace.org/newsletter/martin-luther-king-gun-control-act-nra-history/">Gun Control Act of 1968</a>, which imposed stricter licensing rules and outlawed the sale of firearms and ammunition to felons.</p>



<p>While it remains the law of the land, over time, federal and state government actions have significantly weakened its protections.</p>



<p>Most states now&nbsp;<a href="https://giffords.org/lawcenter/gun-laws/policy-areas/guns-in-public/concealed-carry/">allow people to carry</a>&nbsp;concealed weapons without a permit or background check, even though&nbsp;<a href="https://vpc.org/press2/states-with-weak-gun-laws-and-higher-gun-ownership-have-highest-gun-death-rates-in-the-nation-new-data-for-2023-confirm/">research suggests</a>&nbsp;the practice can increase the risk of firearm homicides.</p>



<p>In Louisiana, Democratic former Gov. John Bel Edwards, in office from 2016 to 2024,&nbsp;<a href="https://apnews.com/article/la-state-wire-gun-politics-laws-government-and-politics-e3d0715cb75456ffcb58391bf2850cb4">vetoed a bill</a>&nbsp;that would have allowed people to carry concealed firearms without a permit.</p>



<p>Elected in 2023, Republican Gov. Jeff Landry&nbsp;<a href="https://www.gov.louisiana.gov/index.cfm/newsroom/detail/4439">signed a law</a>&nbsp;to allow any person over age 18 to conceal-carry without a permit.</p>



<p>The Trump administration has created&nbsp;<a href="https://www.justice.gov/opa/pr/attorney-general-pamela-bondi-statement-regarding-creation-2nd-amendment-task-force">a task force</a>&nbsp;<a href="https://www.justice.gov/ag/media/1395956/dl?inline">to implement</a>&nbsp;his executive order to end most gun regulations, a move that would allow more people with criminal convictions, including for domestic abuse, to own guns.</p>



<p>Figures vary, but some researchers estimate as many as 500 million guns circulate in the U.S. Sales reached&nbsp;<a href="https://smallarmsanalytics.com/v1/pr/2022-01-05.pdf">record highs</a>&nbsp;during the pandemic, and publicly traded firearm and ammunition companies saw&nbsp;<a href="https://www.jec.senate.gov/public/_cache/files/9bfdef03-67b9-49d3-8252-23f7b90a01d6/jec-gun-industry-profits-final.pdf">profits jump</a>.</p>



<p>Donald Trump Jr. this summer&nbsp;<a href="https://www.axios.com/2025/03/24/grabagun-trump-spac">joined the board</a>&nbsp;of GrabAGun, an online gun retailer that went public in July under the stock ticker PEW. In a&nbsp;<a href="https://www.sec.gov/Archives/edgar/data/1995413/000121390025063424/ea024879701ex99-1_colombier2.htm#:~:text=A%20Registration%20Statement%20on%20Form,attend%20the%20Extraordinary%20General%20Meeting.">Securities and Exchange Commission filing</a>, the company, which markets guns to people ages 18 to 44, cited “<a href="https://www.sec.gov/Archives/edgar/data/1995413/000121390025056297/ea0233554-09.htm">gun violence prevention and legislative advocacy</a>&nbsp;organizations that oppose sales of firearms and ammunition” as threats to its sales growth.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/GettyImages-2224718045-resized.jpg?w=696&#038;ssl=1" alt="A photo of Donald Trump Jr. at the New York Stock Exchange. He smiles, facing to the left, holding his left hand up in a finger gun pose." class="wp-image-2074919"/><figcaption class="wp-element-caption">Donald Trump Jr. is a board member of GrabAGun, an online gun store that went public on the New York Stock Exchange under the ticker PEW.(Michael Nagle/Bloomberg via Getty Images)</figcaption></figure>



<p>Dave Workman, a gun rights advocate with the&nbsp;<a href="https://saf.org/">Second Amendment Foundation</a>, said firearms are not to blame for the surge in pandemic shootings.</p>



<p>“Bad guys are going to do what bad guys are going to do regardless of the law,” Workman said. “Taking away gun rights is not going to reduce crime.”</p>



<p>David Yamane, a Wake Forest University sociology professor and national authority on guns, said the U.S. firearm debate is complex and the industry is often “painted with too broad a brush.”</p>



<p>Most guns will never be used to kill anyone, he said. Americans tend to buy more guns during times of unrest, Yamane added: “It’s part of the American tradition. Guns are seen as a legitimate tool for defending yourself.”</p>



<h2 class="wp-block-heading"><strong>‘A Low Level of Hope’</strong></h2>



<p>Once called “<a href="https://bogalusarebirth.com/history/">the Magic City</a>,” Bogalusa has become a grim symbol of deindustrialization.</p>



<p>Bogalusa emerged as Black people formed their own communities in the time of Jim Crow racial segregation at the turn of the 20th century.</p>



<p>Racism concentrated Black people in neighborhoods that&nbsp;<a href="https://jamanetwork.com/journals/jama/fullarticle/2804822">became epicenters of poor health</a>, reflected in high rates of cancer, asthma, chronic stress, preterm births, pregnancy-related complications — and, over recent decades,&nbsp;<a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC10155117/">firearm violence</a>.</p>



<p>Thousands flocked to Bogalusa after the Great Southern Lumber Company built one of the world’s biggest sawmills, establishing Bogalusa as a company town. Racial tensions&nbsp;<a href="https://www.zinnedproject.org/news/tdih/bogalusa-labor-massacre/">soon followed</a>.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/GettyImages-515516180-resized.jpg?w=696&#038;ssl=1" alt="An archival photo of a Black man holding up replica KKK robes at a protest." class="wp-image-2074917"/><figcaption class="wp-element-caption">Racial tensions followed the growth of Bogalusa in the 20th century. Charles Sims, a leader in the Deacons for Defense and Justice, a civil rights group, holds up replicas of Ku Klux Klan attire in Bogalusa in 1966.(Bettmann/Getty Images)</figcaption></figure>



<p>Members of the local&nbsp;<a href="https://www.blackpast.org/african-american-history/deacons-defense-and-justice/">Deacons for Defense and Justice</a>&nbsp;gained national attention in the 1960s for protecting civil rights organizers from the Ku Klux Klan,&nbsp;<a href="https://www.splcenter.org/resources/extremist-files/ku-klux-klan/">a hate group</a>&nbsp;that burned houses and churches, terrorizing and killing Black people.</p>



<p>As the mill changed hands over the decades, Bogalusa’s fortunes slid. In the mid-20th century, the population surpassed 20,000, but it is now about half that.</p>



<p>International Paper,&nbsp;<a href="https://www.opportunitylouisiana.gov/news/gov-edwards-announces-52-million-modernization-plan-for-international-paper-in-bogalusa">a Fortune 500 company</a>&nbsp;based in Tennessee, runs the mill as a containerboard factory, employing about 650 people. In 2021, the state announced incentives for the company that included a $500,000 tax break, saying the move would help bring “prosperity.”</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_08-resized.jpg?w=696&#038;ssl=1" alt="A photo of the exterior of a large mill. Smoke or steam billows out of a cooling tower." class="wp-image-2074926"/><figcaption class="wp-element-caption">International Paper, a Fortune 500 company, operates a containerboard mill in Bogalusa that was once one of the largest sawmills in the world.&nbsp;(Fred Clasen-Kelly/KFF Health News)</figcaption></figure>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_05-resized.jpg?w=696&#038;ssl=1" alt="A photo of a storefront window with large &quot;Store closing&quot; signs." class="wp-image-2074927"/><figcaption class="wp-element-caption">A few blocks from the containerboard mill, the main drag in Bogalusa is littered with empty storefronts and boarded-up buildings.&nbsp;(Fred Clasen-Kelly/KFF Health News)</figcaption></figure>



<p>Businesses remain boarded up along the main drag. Houses still bear damage from Hurricane Katrina, and many streets are eerily quiet.</p>



<p>Nearly 1 in 3 people in Bogalusa live in poverty — 2½ times the national average.</p>



<p>Bogalusa’s violent gun crime rate&nbsp;<a href="https://ejusa.org/wp-content/uploads/A-Roadmap-for-Change-Bogalusa-Report.pdf#page=11">reached 646.1 per 100,000</a>&nbsp;people in 2022, higher than Louisiana’s rate and 1.7 times the national rate, according to the nonprofit Equal Justice USA, citing FBI Uniform Crime Reporting data.</p>



<p>In many rural towns across the South, “there is a level of desperation that is more apparent” than in other parts of the U.S., said&nbsp;<a href="https://www.goodreads.com/book/show/75816949-the-injustice-of-place">Luke Shaefer</a>, a&nbsp;<a href="https://ssw.umich.edu/faculty/profiles/tenure-track/lshaefer">University of Michigan professor</a>&nbsp;of social justice and public policy.</p>



<p>“They don’t have the same infrastructure to have robust social services. People are like, ‘What are my life chances?’” Shaefer said. “People feel like there is nothing that can be done. There is a low level of hope.”</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/GettyImages-514870726-resized.jpg?w=696&#038;ssl=1" alt="An archival photo of a civil rights protest in Bogalusa in 1965. A group of Black men walk in a protest. The man on the left side of the photo holds a sign that reads, &quot;We don't buy where we can't work.&quot;" class="wp-image-2074918"/><figcaption class="wp-element-caption">Bogalusa emerged as a battleground for civil rights in the 1960s. James Farmer (far right), national director of the Congress of Racial Equality, walks in a Bogalusa protest in 1965.(Bettmann/Getty Images)</figcaption></figure>



<h2 class="wp-block-heading"><strong>Missed Opportunities</strong></h2>



<p>Mayor Truong lamented the violence in Bogalusa after Forbes was killed,&nbsp;<a href="https://www.facebook.com/100078891425748/posts/pfbid0MFS4KUpd2k4FBC8LX8khcJR5MHZu7RjLSBJMgh2bRgduB9q7jUqaeqiwTXgsT15bl/?mibextid=cr9u03">writing on Facebook</a>, “When are we as a community going to come together and decide enough is enough?”</p>



<p>The federal government had offered one path forward.</p>



<p>The Biden administration provided billions of dollars to local governments through the American Rescue Plan Act during the pandemic. Biden urged them to deploy money to community violence intervention programs, shown to&nbsp;<a href="https://bidenwhitehouse.archives.gov/briefing-room/statements-releases/2021/04/07/fact-sheet-more-details-on-the-biden-harris-administrations-investments-in-community-violence-interventions/">reduce homicides</a>&nbsp;by as much as 60%.</p>



<p>A handful of cities seized the opportunity, but most did not. Bogalusa has received&nbsp;<a href="https://house.louisiana.gov/housefiscal/COVID19/Local%20ARPA%20Estimated%20Distribution%206.21.21.pdf">$4.25 million in ARPA funds</a>&nbsp;since 2021. None appears to have gone toward violence prevention.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_07-resized.jpg?w=696&#038;ssl=1" alt="A photo of an abandoned house overgrown with shrubbery. A lone shopping cart is in front of it." class="wp-image-2074928"/><figcaption class="wp-element-caption">Abandoned houses dot parts of Bogalusa. Mayor Tyrin Truong, who was elected in 2022 at age 23, has promised to reduce crime and blight that plague parts of this community 75 miles north of New Orleans.(Fred Clasen-Kelly/KFF Health News)</figcaption></figure>



<p>The Louisiana legislative auditor, Michael Waguespack, found that Bogalusa used nearly $500,000 for employee bonuses, which his report said may have violated state law. In some cases,&nbsp;<a href="https://app2.lla.state.la.us/publicreports.nsf/0/ee0f2965b8adc10a86258b55006b7965/$file/000050a5b.pdf?openelement&amp;.7773098">the report</a>&nbsp;says, payments were not tied to work performed.</p>



<p>Bogalusa officials did not respond to a public records request from KFF Health News seeking detailed information about its ARPA money.</p>



<p>Former Mayor Wendy O’Quin-Perrette, who served from 2015 through early 2023, told Waguespack&nbsp;<a href="https://app2.lla.state.la.us/publicreports.nsf/0/ee0f2965b8adc10a86258b55006b7965/$file/000050a5b.pdf?openelement&amp;.7773098#page=58">in a June 2024 letter</a>&nbsp;that the city used ARPA money to improve streets and pay the bonuses. “We would not have done it without being sure it was allowed,” she said.</p>



<p>O’Quin-Perrette did not respond to requests for comment.</p>



<p>In a&nbsp;<a href="https://s3.documentcloud.org/documents/26052776/city-of-bogalusa-investigative-audit-services-issued-july-10-2024.pdf#page=48">2023 letter</a>&nbsp;to Waguespack, O’Quin-Perrette’s successor, Truong, wrote that Bogalusa officials didn’t know how the federal money was spent. When he took office, Truong alleged, officials discovered “tens of thousands of dollars of checks and cash” stashed “in various drawers and on desks” in city offices.</p>



<p>Truong defended his stewardship of ARPA funds, saying that about $1 million remained when he assumed office but that the money was needed for more urgent sewer infrastructure repairs. “I wish we could have invested more, invested any money in gun violence prevention efforts,” he said.</p>



<p>In an interview, Truong said the city has been “intentional” about bringing down gun violence, including through a summer jobs program. He pointed to statistics that show homicides decreased from nine in 2022 to two in 2024. “If you keep them busy, they won’t have time to do anything else,” he said.</p>



<p>Asked about his January arrest, Truong said he has political enemies.</p>



<p>“I’m the only Democrat in a very red part of the state, and, you know, I’ve made a lot of changes at City Hall, and that ticks people off,” Truong told KFF Health News. He said that he ended long-standing city contracts with local businesspeople. “When you’re shaking up power structures, you become a target.”</p>



<p>Josie Alexander,&nbsp;<a href="https://ejusa.org/about-us/staff/">a Louisiana-based senior strategist</a>&nbsp;for&nbsp;<a href="https://ejusa.org/wp-content/uploads/A-Roadmap-for-Change-Bogalusa-Report.pdf">Equal Justice USA</a>, said city officials missed an opportunity when they didn’t use ARPA funds for gun violence prevention. “The sad thing is people here can now see that money was coming in,” she said. “But it just wasn’t used the way it needed to be.”</p>



<h2 class="wp-block-heading"><strong>‘Too Much Trouble Here’</strong></h2>



<p>Truong said the city is still reeling from the&nbsp;<a href="https://www.documentcloud.org/documents/26038599-cde-fbi-bogalusa/">pandemic spike in violent crime</a>. He said he was at Bogalusa High School’s homecoming football game in 2022 when one teen shot another. Shots rang out, Truong said, and he grabbed his 3-month-old son and “laid in the bleachers.”</p>



<p>“It’s not a foreign topic to hardly anybody in town, whether you’ve heard the gunshots in the distance, whether you have attended a funeral of somebody who passed due to gun violence,” he said. Many still grapple with trauma.</p>



<p>In December 2022, Khlilia Daniels said, she hosted a birthday party for her teenage niece, praying no one would bring a gun.</p>



<p>The hosts checked guests for weapons, she said.</p>



<p>Yet gunfire erupted, Daniels said. Three teens were shot, including&nbsp;<a href="https://www.crainandsons.com/obituary/ronie-taylor">15-year-old Ronié Taylor</a>, who died, according to police.</p>



<p>“When someone you know is killed, you never forget,” said Daniels, 32, who held Taylor until emergency responders arrived.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_03-resized.jpg?w=696&#038;ssl=1" alt="A photo of a Black woman standing outside in Bogalusa, Louisiana." class="wp-image-2074914"/><figcaption class="wp-element-caption">Khlilia Daniels tried to help save a 15-year-old boy who was fatally shot in Bogalusa in December 2022 at a birthday party for her niece. “When someone you know is killed, you never forget,” she says.(Fred Clasen-Kelly/KFF Health News)</figcaption></figure>



<p>Tajdryn Forbes was planning his future when he&nbsp;<a href="https://www.cookrichmondfuneralhome.com/obituary/tajdryn-forbes">was killed</a>, likely because of a dispute that started on social media over lyrics in a rap song, Guy said.</p>



<p>In a&nbsp;<a href="https://www.facebook.com/story.php/?story_fbid=1016093937218305&amp;id=100064531246730">Facebook post</a>&nbsp;in January, Bogalusa police said they had arrested someone in connection with Forbes’ killing. Authorities had&nbsp;<a href="https://www.facebook.com/bogalusapd/posts/691870959640606/">previously announced</a>&nbsp;the arrest of a teen in connection with the homicide.</p>



<p>Forbes had been a high school football standout, like his late father, Charles Forbes Jr., who played semipro. When Forbes scored a touchdown, he would look to the sky to honor his dad.</p>



<p>The school praised Forbes for his senior baseball season in&nbsp;<a href="https://www.facebook.com/bogalusahighschool/posts/we-were-pleased-to-honor-our-senior-baseball-player-tajdryn-forbes-on-senior-nig/4998121910304057/">a social media post</a>: “This young man makes a difference on our campus and on the field with his strong character.”</p>



<p>When hopes for a college football scholarship did not pan out, Forbes worked as a deckhand for a marine transportation company. He saved money, looking forward to moving to Slidell, a suburb of New Orleans.</p>



<p>“He would always say, ‘There’s too much trouble here’” in Bogalusa, Guy recalled.</p>



<figure class="wp-block-image"><img data-recalc-dims="1" decoding="async" src="https://i0.wp.com/kffhealthnews.org/wp-content/uploads/sites/2/2025/07/Bogalusa_00.jpg?w=696&#038;ssl=1" alt="A photo of Tajdryn Forbes posing with a football and his helmet." class="wp-image-2074915"/><figcaption class="wp-element-caption">Tajdryn Forbes had been a high school football standout, like his late father, Charles Forbes Jr., who played semipro. When Forbes scored a touchdown, he would look to the sky to honor his dad.(Kevin Magee)</figcaption></figure>
<p>The post <a href="https://medika.life/guns-race-and-profit-the-pain-of-americas-other-epidemic/">Guns, Race, and Profit: The Pain of America’s Other Epidemic</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21394</post-id>	</item>
		<item>
		<title>It’s Not Us vs. Them: What the Terminator Teaches Us About AI and the Future of Health</title>
		<link>https://medika.life/its-not-us-vs-them-what-the-terminator-teaches-us-about-ai-and-the-future-of-health/</link>
		
		<dc:creator><![CDATA[Gil Bashe, Medika Life Editor]]></dc:creator>
		<pubDate>Sun, 29 Jun 2025 02:53:52 +0000</pubDate>
				<category><![CDATA[AI Chat GPT GenAI]]></category>
		<category><![CDATA[Digital Health]]></category>
		<category><![CDATA[Diseases]]></category>
		<category><![CDATA[Editors Choice]]></category>
		<category><![CDATA[Ethics in Practice]]></category>
		<category><![CDATA[For Doctors]]></category>
		<category><![CDATA[General Health]]></category>
		<category><![CDATA[Habits for Healthy Minds]]></category>
		<category><![CDATA[Mental Health]]></category>
		<category><![CDATA[Policy and Practice]]></category>
		<category><![CDATA[Trending Issues]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Apple]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<category><![CDATA[Coding]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[GenAI]]></category>
		<category><![CDATA[Gil Bashe]]></category>
		<category><![CDATA[LLMs]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[OpenAI]]></category>
		<category><![CDATA[Patient Experience]]></category>
		<category><![CDATA[T800]]></category>
		<category><![CDATA[Terminator]]></category>
		<category><![CDATA[Tim Cook]]></category>
		<guid isPermaLink="false">https://medika.life/?p=21261</guid>

					<description><![CDATA[<p>“I know now why you cry. But it is something I can never do.”– The Terminator, T2: Judgment Day That moment, when the T-800, a machine built for destruction, understands human emotion, is among the most powerful in action cinema. It is the climax of Terminator 2: Judgment Day, but also a beginning: the start [&#8230;]</p>
<p>The post <a href="https://medika.life/its-not-us-vs-them-what-the-terminator-teaches-us-about-ai-and-the-future-of-health/">It’s Not Us vs. Them: What the Terminator Teaches Us About AI and the Future of Health</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong><em>“I know now why you cry. But it is something I can never do.”<br>– The Terminator, T2: Judgment Day</em></strong></p>



<p>That moment, when the T-800, a machine built for destruction, understands human emotion, is among the most powerful in action cinema. It is the climax of <a href="https://en.wikipedia.org/wiki/Terminator_2:_Judgment_Day">Terminator 2: Judgment Day</a>, but also a beginning: the start of the android’s transformation, not into a human, but into something more self-conscious, a machine that recognizes the worth of organic life and, even though it can outthink people, can appreciate the human experience.</p>



<p>The metaphor feels timely as we stand at the edge of an AI-driven health future. Today’s GenAI tools are evolving rapidly, but are we, their creators and coders, evolving with equal intentionality? Are we teaching these systems why we heal, or just how?</p>



<p>We often speak of artificial intelligence as if it were separate from us. But AI is not alien. It is us—our ideas, data, values—encoded and amplified. It mirrors back what we feed it. In the realm of health, that reflection must be carefully considered. Unlike a Hollywood villain, GenAI doesn’t turn against us with malicious intent. But it can misalign from its purpose if we forget that behind every innovation must be a human-centered goal.</p>



<p>From the first recorded prayer for healing in the Bible—<em>&#8220;G-d, please heal her now”—</em>health has always been rooted in empathy, intuition, and relationships. The clinician’s pause before giving a diagnosis, the nurse’s touch when comforting a patient, and the community health worker navigating skepticism in underserved areas are not functions you can replicate with an algorithm. They are acts of presence, of judgment shaped by experience and emotion. Yet, technology now surrounds these moments, offering powerful new support.</p>



<p>Even Satya Nadella, CEO of Microsoft, captured this imperative clearly: <em>“Empathy must be embedded in artificial intelligence from the moment it is created to ensure it becomes a positive force in people’s lives.” </em>It’s not just about what technology can do—it’s about how it’s directed, and who it serves.</p>



<p>GenAI is already beginning to assist clinical teams by synthesizing medical records, supporting drug discovery, and interpreting diagnostic images faster than human eyes. It scales knowledge, translates complex science for patients, and identifies early signals of population health risks. These are welcome advancements—but only when guided by a human compass.</p>



<p>Let’s not look at a future of “us vs. them”—patients and providers versus machines. The more accurate framing is “us and them”: a coalition of human and machine intelligence, working together in the service of healing. Patients, payers, providers, product developers, and policymakers are the “us.” GenAI, LLMs, machine learning, and chatbots form the “them.” Power lies not in one side dominating the other, but in how we integrate these efforts.</p>



<p>Tim Cook, CEO of Apple, has often said<em>, “At Apple, we believe technology should lift humanity.”</em> In a world driven by rapid innovation, his words are a steady reminder that progress without purpose is not progress—it’s motion without meaning. Cook also noted at MIT, <em>“Technology is capable of doing great things, but it doesn’t want to do great things. It doesn’t want anything … That part takes all of us.”</em></p>



<p>To do that, we must resist the urge to see AI as an all-knowing oracle. AI holds no values of its own, possesses no conscience, and has no intuition beyond the patterns we teach it. Those patterns, if drawn from biased data, can replicate systemic inequities. In health, where trust is everything, we cannot afford such blind spots. Human oversight is not just necessary, it’s irreplaceable.</p>



<p>There’s also a danger in assuming technology alone can fix what’s broken. We already know the limits of scale without empathy. We’ve seen systems become more efficient but less personal. We’ve witnessed patients lost in data flows, their lived experience reduced to metrics. If GenAI becomes another layer of distance rather than connection, we will have failed to grasp its most powerful potential: to bring clarity, not complexity; to extend human capacity, not replace it.</p>



<p>OpenAI CEO Sam Altman acknowledges the promise and the peril: “<em>This will be the greatest technology humanity has yet developed… We’ve got to be careful here … people should be happy that we are a little bit scared of this.”</em> Fear, in this case, signals responsibility. Responsibility requires centering AI in the service of people, not pushing people to conform to the logic of machines.</p>



<p>There are lessons in Terminator beyond the thrill of a dystopian chase. Sarah Connor learns to trust the very machine that once tried to kill her. John Connor, the future leader of humanity, becomes the teacher. And the T-800—a symbol of cold efficiency—becomes the student. This reversal reflects what we need now: machines that learn how to act and why their actions matter, not just how to optimize workflows but why saving time matters when time is the difference between life and death.</p>



<p>We cannot forget how this transformation from killer machine to protector occurs. In &#8220;Terminator 2: Judgment Day,&#8221; the T-800 model evolves into humanity’s hero because&nbsp;a future John Connor reprograms it and sends it back to protect his younger self and his mother, Sarah Connor. The human is the creator—the coder.</p>



<p>Somewhere in this cinematic science fiction lies a guiding truth for our future reality: technology learns from humanity. Just as this version of the Terminator changed by being close to people, our AI systems will evolve based on what—and who—they are near. If surrounded by empathy, equity, and ethical standards, they can amplify what’s best in us. If left untethered from human purpose, they risk scaling our worst habits.</p>



<p>We often frame digital health progress in terms of speed and scale. But what if we reframed it through the lens of dignity? What if the measure of innovation wasn’t just how fast a model can generate results, but how well it supports the human healing experience?</p>



<p>In the end, the T-800 sacrifices itself to protect a better future. It understands that some decisions aren’t logical; they are meaningful. It doesn’t cry—but it finally sees why we do.</p>



<p>Let’s not wait for machines to catch up with our humanity. Let’s lead with it.</p>
<p>The post <a href="https://medika.life/its-not-us-vs-them-what-the-terminator-teaches-us-about-ai-and-the-future-of-health/">It’s Not Us vs. Them: What the Terminator Teaches Us About AI and the Future of Health</a> appeared first on <a href="https://medika.life">Medika Life</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21261</post-id>	</item>
	</channel>
</rss>
