AI was a particularly pervasive conversation topic across The Health Management Academy’s executive membership programs in 2023. To close out this year, we wanted to acknowledge the collective wisdom we’ve gleaned from some great member discussions and share our team’s takeaways on how LHS can use these insights in actionable ways come 2024.
LHS execs’ anxiety about AI is widespread and warranted, but it shouldn’t be paralyzing
Sarah O’Hara, Senior Director, AI Catalyst
As THMA has launched AI Catalyst across 2023, our team has had the privilege of talking to A LOT of Leading Health System executives about AI. Though people vary in how they engage with the topic, one consistent reaction has been fear. Many LHS leaders express anxiety about how little they understand about AI. They fear its implications both narrowly (patient harm, malpractice) and broadly (elimination of entire job classes, extinction of humanity). They fear missing out on its benefits if they don’t move quickly enough—and they fear making bad investments if they move too fast. The concerns and uncertainties are rampant.
Fear is not unwarranted. The move to AI is different from past industry disruptions for several reasons. First is the speed with which change is happening. Unlike, say, the rise of value-based care, consumerism, or digital health—all of which felt disruptive conceptually but in practice occurred more slowly—AI is evolving from novelty to mainstream practice at lightning speed. The more recent disruption of Covid-19 perhaps provides a better parallel, in that it also emerged rapidly and, like AI, had implications that stretched well beyond healthcare. Yet even there, we at least understood the underlying mechanisms of a viral pandemic and knew that it would ultimately end. With AI, even the experts don’t always know how it works or where it will go. In many ways, LHS are truly in uncharted territory, and fear is a natural response.
But just because fear is justified doesn’t mean it must be paralyzing. Even if the AI revolution is unprecedented, we are not bereft of knowledge about how to respond. Health systems that navigate this transition successfully will be those that recognize that, as with past disruptions we’ve confronted, it’s about bringing structure to uncertainty: building governance, identifying where AI can advance system strategy, establishing investment assessment protocols, managing the workforce through change.
At the end of the day, AI is just another new technology to be integrated—a complex and rapidly evolving technology, to be sure, but not wholly unfamiliar. Hopefully, that fact will help LHS leaders sleep a little better at night.
LHS have an opportunity to lead on developing—and commercializing—clinical AI use cases
Jackie Kimmel, Senior Director, and MonYi Lwin, Senior Analyst, Strategy Catalyst
Turning only one year old on Nov. 30, ChatGPT has already garnered significant attention and curiosity from LHS chief strategy officers (CSOs). Yet only 6% of health systems have developed a generative AI strategy. How can strategy leaders plan for the impact of AI, especially without knowing how it’s going to develop over the next five years?
While we don’t know exactly how the field will evolve, a key lesson we learned in 2023 is that health systems will play an indispensable role in developing AI use cases involving clinical decision-making. Note this is different from use cases that have matured commercially this past year, such as ambient listening and clinical documentation. Many CSOs believe the only safe way to train AI tools—especially those affecting clinical pathways—is to use LHS knowledge (i.e., health system data), which they consider to be the industry’s best representation of clinical knowledge.
Some may find it surprising that other industry stakeholders agree. In September, THMA’s Strategy Catalyst team sat down with Peter Durlach, CVP and CSO of Microsoft Health & Life Sciences, and Dr. David Rhew, global CMO and VP of healthcare at Microsoft, for a conversation about setting a health system AI strategy. Durlach acknowledged that there are “a million other use cases that are not being developed by the commercial market.” He believes that “health systems have an enormous opportunity because the only people that can really build these are health systems or other companies that bring in clinical expertise.” He predicts tech vendors will likely avoid these clinical use cases because they face uncertain regulations.
Several LHS are already considering how these tools could help diversify revenue; Durlach adds that there are “a lot of health systems who are also looking to get commercial value out of building these applications.” Radiology may be an initial area for development. However, the key questions will be how feasible this development will be in-house and what go-to-market strategies are possible. Many LHS feel they will need partners, especially in the commercialization phase, to provide the infrastructure and commercial footprint. Therefore, while it’s too early to predict exactly which use cases will emerge as revenue generators for LHS and how partnerships will play out, one thing is certain: there is consensus among both LHS strategy leaders and industry leaders that LHS can use their differentiated position in the market to develop clinical AI.
Don’t ignore workforce impact in assessing AI’s ROI
Sara Zargham, Senior Analyst, AI Catalyst
Few can dispute that health system margins were under increased pressure in 2023. Yet the greatest strategic pressure on LHS continued to be the repercussions of our Covid-accelerated staffing crisis, with little apparent improvement in conditions. As administrative work continues to take time away from what clinicians signed up to do—namely, practice medicine—the signs of dissatisfaction remain obvious. Only 57% of doctors would choose medicine again, and more than 50% of nurses report experiencing burnout.
In response to what amounts to an existential crisis for some health systems, we are seeing a shift in how LHS view strategic imperatives with the potential to improve clinicians’ lives. That’s why even at a time of thin margins, the investment calculus for AI requires an expanded ROI analysis: one that looks beyond cost savings to account for other strategic and indirect benefits, especially as they hold the potential to improve clinician satisfaction.
We’ve seen this mindset shift play out as well with other technology investments like virtual nursing and back-end IT changes such as single sign-on. In each of these cases, we’ve heard members share that as long as they break roughly even, they are willing to invest in novel technologies and AI solutions that improve staff satisfaction. But to ensure maximum return on using AI to alleviate the staffing crisis, here are a few additional things you should consider:
Be prepared to explain how AI will help staff do their jobs: After years of head-spinning tech implementations, many healthcare workers are jaded and feel major change fatigue when it comes to implementing new technologies. Secure employee buy-in by showing how AI use cases can improve staff workflows.
Guard against duplicating technological solution sets: Make sure you are already leveraging existing technology (like your EMR) as much as you can before adding additional point solutions.
Start small and then grow: Begin with a pilot and track satisfaction/turnover measures at regular intervals before you decide whether to grow the program.
Alleviate “replaced-by-AI” workforce concerns: Despite the powerful transformation that AI will bring to the healthcare workforce, few LHS anticipate wholesale workforce reductions. Rather, AI can improve the ability of staff to work at top of license, providing a value-add to productivity and to employee satisfaction.
If you skip building organization-wide AI literacy, you’ll never see the full return on AI adoption
Alex Polyak, Director, AI Catalyst
In speaking with dozens of LHS on the topic of AI adoption, we’ve heard time and again about the importance of bringing in clinical expertise, financial stakeholders, and a wide cross-section of CXOs. What we’ve rarely heard is any mention of bringing in Chief Learning Officers or other continued-learning professionals. This is a problem, because one of the most common refrains from LHS is that they lack a common organizational definition of AI, coupled with poor AI literacy across the organization. The results of this illiteracy are far-reaching: LHS have told us how they’ve struggled to implement AI, build AI governance, or define value metrics for the ROI of AI because, as one member put it, “we can’t agree on what we’re talking about.”
THMA surveyed 24 of the most AI-progressive LHS on the topic of their AI maturity. In an analysis of 11 determinants of AI maturity, the worst-performing determinant—by a long shot—was AI literacy. Seemingly advanced determinants such as tech infrastructure, change champions, and a formalized AI business case all scored far higher than AI literacy: in effect, putting the cart before the horse. No wonder, then, that even among the most AI-savvy LHS, overall AI maturity was rated at a middling 2.52 on a five-point scale.
AI has the potential to transform every workflow within LHS, but this requires organization-wide education. In the same way that we recognize digital literacy (and increasingly, data literacy) as integral skillsets for healthcare employees, AI literacy must receive the same recognition. Already, we’re hearing proposals on how training on the ethical use of AI should be rolled out alongside trainings on HIPAA and data privacy that are already compulsory. In much the same way that newly minted healthcare managers receive supplemental education on budgeting and workforce planning, these new leaders should be upskilled on data governance and AI fundamentals. And just as organizations roll out system-wide mission statements and health equity definitions, so too should they socialize a system-wide definition of artificial intelligence.
Across all industries, only 21% of employees feel confident in their data literacy skills. If AI adoption is to ever reach its full potential, collectively, healthcare workers have to get comfortable with data, digital, and AI literacy. Otherwise, we’ll continue to reap the consequences of building a complex house on a cracked foundation.