The room was standing room only. At the HIMSS AI in Healthcare Forum, the energy was palpable, and the audience quiet and focused. This wasn’t a tech demo or a sales pitch. It was a gathering of health sector stewards—leaders seeking clarity amid the fog of anticipated disruption. Setting the tone for the two-day event was keynote speaker Tom Lawry, former National Director for AI at Microsoft Health and now a strategic advisor to global institutions that shape the future of care.
Lawry, the author of Hacking Healthcare and Healthcare Nation, is no stranger to the crossroads of innovation and implementation. His talk didn’t begin with algorithms—it began with accountability. With courage. The unvarnished truth is that the role of AI in the health sector will not be determined by tech developers alone, but by leaders willing to stand for ethical adoption and clinical collaboration.
AI Is Not an Add-On—It Is the Infrastructure
One of the most powerful refrains in Lawry’s address was this: Artificial Intelligence (the author’s definition: “augmented implementation”) is a general-purpose technology. Like electricity or the printing press, it doesn’t simply optimize existing processes—it redefines them. It changes the sector’s (and perhaps society’s) operating system. It is not “intelligent”; it is intelligence, as John Nosta suggests.
In the health sector, AI must be treated not as a pilot project or a staffing replacement but as core infrastructure. It’s not a department, it’s a foundation. Lawry urged leaders to go beyond adoption cycles and recognize AI’s capacity to reshape systems, relationships and responsibilities. That shift requires not just technical integration but cultural transformation. It requires integration of clinical, operational and human resource functions.
“What Does This Mean for Me?”
Clinicians aren’t resisting AI—they’re seeking relevance. Their reflective question is: “What does this mean for me?” This question is surfacing across hospitals, clinics, and systems worldwide. Health professionals are not asking for more white papers or coding walkthroughs—they want to know if their judgment, autonomy, and voice will be protected.
His message was clear: Don’t ask clinicians to adopt. Invite them to co-design. Empower them to lead alongside technologists. That is how AI earns trust and ensures value.
Elevating the Workforce Through Upskilling
Referring to McKinsey forecasts, Lawry noted that up to one-third of clinical activity—primarily administrative—can be automated. But this is not about eliminating jobs. It’s about restoring the joy of practice and aligning people with purpose. If deployed wisely, AI can liberate talent from tasks that dull passion and delay patient care. The real challenge will be forging a bridge between aspirational and operational intent.
This is possible if health systems democratize AI knowledge. Upskilling cannot remain the domain of senior executives and IT teams. The professionals most affected by AI must also be those most prepared to use and question it with confidence.
Governance Is the Bedrock of Responsibility
Too many institutions speak about responsible AI but fail to structure it. As Lawry outlined, and as reflected in the Responsible AI Discussion Guide he shared at the Forum, governance is not an aspiration. It is a requirement.
Lawry asks three key questions to test organizational readiness:
- Has the institution formally adopted responsible AI principles, ratified by top leadership or the board?
- Are those principles consistently applied to all AI projects and partnerships?
- Are AI standards written into procurement practices?
The health sector cannot afford partial answers. Most AI will arrive embedded within larger platforms—EHRs, diagnostics or billing systems. Governance is insufficient if it does not extend to vendors and their embedded solutions. His cautionary guidance: policies without enforcement expose organizations to reputational and regulatory risk.
Judge AI by Outcomes, Not Headlines
It’s easy to get excited about new tools and early pilots. However, Lawry warned against measuring success by the number of AI deployments. Actual value must be measured by mission alignment. That means asking: Are outcomes improving? Are clinicians regaining time and focus? Are costs becoming more sustainable? Is ethical compliance being elevated? It is a nuts-and-bolts call to action.
Lawry urges organizations to treat AI not as a buzzword, but as a continuous improvement program. Like clinical quality, it requires constant evaluation and a clear connection to purpose.
Leadership Is the Deciding Factor
Perhaps the most lasting message from his keynote is that technology does not create transformation—leadership does. From boards to bedside, AI requires a mindset of clarity, courage, and consistency.
Leadership must:
- Understand what AI can—and cannot—do
- Create a culture where experimentation and transparency thrive
- Build governance rather than treating AI as just another vendor offering
“AI value at scale is about leadership,” Lawry declared. “No algorithm, no matter how powerful, can substitute for moral clarity and institutional courage.”
A Gathering That Signals Momentum
The HIMSS AI in Healthcare Forum was more than a chance to compare notes; it was a catalyst for change. It brought together visionaries and practitioners, policymakers and informaticists, engineers and ethicists. What unites them is the understanding that we are not waiting for the future—we are in it.
From HIMSS CEO Hal Wolf’s empowering opening remarks, to Forum Master of Ceremonies Rob Havasy, HIMSS Senior Director of the Personal Connected Health Alliance, to Tom Lawry’s powerful opening keynote, it was clear that this event is more than a “professional meeting”—it is an invitation to the possibilities of the age of AI in the health sector. But its success will not come from software, coding and flashy healthtech alone. It will arise from systems that put people at the center: patients, providers and the health ecosystem community.
AI Has No MRI: It’s All About Leadership
Tom Lawry offered more than a presentation—he provided a roadmap. One that begins not with technology but with trust. One that demands more than innovation—it requires intention.
The transformation ahead is not only technical. It is cultural, operational and profoundly human. The institutions that rise to the occasion will do much more than survive disruption. They will define the next era of healing.
AI is here. The question now is whether we are ready to use it and lead with it.