LAS VEGAS — When the University of Illinois Hospital and Health Sciences System was testing an artificial intelligence-backed tool that drafts responses to patient messages, a patient misspelled the name of a medication, Karl Kochendorfer, the system’s chief health information officer, recalled during a panel at the HLTH conference last week.
The misspelling led the AI to draft a response listing side effects for a drug the patient wasn’t using, and a nurse forgot to double-check the message before it went out.
Ultimately, it wasn’t a huge issue: the team just needed to call the patient or send another message with a correction, he said. But it could have had serious consequences for the tool.
“It almost killed the pilot. […] And it happened on day one,” he said.
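The episode points to a general safeguard, independent of any vendor’s internals: a drafting tool can decline to answer when a drug name in a message doesn’t confidently match the patient’s medication list, and route the message to a human instead. A minimal sketch of that check, with an invented medication list and standard-library fuzzy matching:

```python
import difflib

def match_medication(user_text: str, med_list: list[str],
                     cutoff: float = 0.85) -> str | None:
    """Return the closest medication on the patient's list, or None
    when the match is too weak to trust without human review."""
    matches = difflib.get_close_matches(
        user_text.lower(), [m.lower() for m in med_list], n=1, cutoff=cutoff
    )
    return matches[0] if matches else None

meds = ["metformin", "lisinopril", "atorvastatin"]  # invented med list
print(match_medication("metforminn", meds))  # 'metformin': obvious typo
print(match_medication("metoprolol", meds))  # None: route to a human
```

The cutoff is the design lever: set it too low and the tool confidently answers about the wrong drug, as happened here.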
As healthcare grapples with how to safely implement AI, investors and health systems are finding the most immediate promise in tools that automate administrative and back-office work, which could make a dent in provider burnout while posing fewer risks to patient care, experts said at the HLTH conference.
But the pressure is on to adopt the technology. Proponents argue AI could help solve healthcare’s significant workforce challenges: The nation is projected to face a shortage of more than 100,000 critical healthcare workers by 2028 as the overall population ages and needs more care, according to a report by consultancy Mercer.
While AI could be transformative, the sector has to move with caution as it implements emerging tools, experts say. The stakes are high, as policymakers and experts have raised concerns about accuracy, bias and security.
Implementing AI in healthcare is complex, a lesson the industry should take from earlier deployments of predictive tools, said Rohan Ramakrishna, co-founder and chief medical officer at health information app Roon, during a panel at HLTH.
“I think one of the things we’ve learned is that you have to be exceedingly careful applying AI solutions in healthcare settings,” he said.
How AI could help a ‘simple mismatch between supply and demand’
AI could help alleviate one of the biggest problems in healthcare: a growing number of older patients with more complex, chronic conditions and fewer providers who can help them, Daniel Yang, vice president of AI and emerging technologies at Kaiser Permanente, said during a panel discussion.
As older Americans need more care, millennials, the largest generation in the U.S., want a more on-demand consumer experience, he added. But that will be difficult to achieve under supply constraints: It takes years to train a new doctor, and fewer physicians mean more burnout, care delays and higher costs.
“It’s not even about AI, it’s really about what I think ails healthcare in general,” Yang said. “What we’re seeing is a simple mismatch between supply and demand.”
AI could augment clinicians’ workflows, potentially helping them offer better care. Yang said an algorithm developed by researchers at the Permanente Medical Group helps save 500 lives each year by flagging patients at risk of clinical decompensation, a sudden worsening of their condition.
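The article doesn’t detail the Permanente model, but the pattern it describes, scoring patients and alerting the care team above a threshold, can be sketched in a few lines. Every feature, weight and threshold below is invented for illustration; real deterioration models are trained on large retrospective datasets.

```python
import math

WEIGHTS = {"resp_rate": 0.09, "heart_rate": 0.03, "lactate": 0.55}
BIAS = -8.5
ALERT_THRESHOLD = 0.30  # escalate to the care team above this risk

def decompensation_risk(vitals: dict[str, float]) -> float:
    """Map vital signs to a 0-1 deterioration risk via a logistic score."""
    z = BIAS + sum(w * vitals[name] for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

patient = {"resp_rate": 28, "heart_rate": 112, "lactate": 3.8}
risk = decompensation_risk(patient)
if risk >= ALERT_THRESHOLD:
    print(f"ALERT: deterioration risk {risk:.0%}, notify rapid response")
```

In practice the threshold is tuned to balance sensitivity against alarm fatigue, since every alert costs clinician attention.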
The technology could also lessen burnout and improve retention among clinicians by cutting the time they spend on administrative work such as note-taking. Providers have long reported spending hours working in their electronic health records, often to the detriment of patient care.
Christopher Wixon, a vascular surgeon at the Savannah Vascular Institute, said he was close to leaving medicine as the industry shifted to electronic health records. Collecting information while trying to listen to patients was a challenge, and it was easy to miss non-verbal cues while focused on a laptop screen.
But ambient documentation, in which an AI tool typically records the conversation between clinician and patient and then drafts a note, has changed the game, he said.
“It’s better for me because it does save you time,” Wixon said during a panel. “But at the end of the day, it’s better for the patient because they feel like they’re being heard. It truly has been a transformative experience for me.”
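Vendors’ pipelines differ and aren’t described here, but the basic shape of ambient documentation, paired with the human sign-off that distinguishes it from clinical decision tools, can be sketched with hypothetical names throughout:

```python
from dataclasses import dataclass

# Invented prompt; a real pipeline pairs speech-to-text with an LLM.
PROMPT_TEMPLATE = (
    "Summarize the following clinician-patient conversation as a draft "
    "SOAP note. Flag anything you are unsure about.\n\n{transcript}"
)

@dataclass
class DraftNote:
    text: str
    signed_by: str | None = None  # None until a clinician reviews it

    def sign(self, clinician: str, final_text: str) -> None:
        # The clinician's edited text, not the raw draft, gets signed.
        self.text = final_text
        self.signed_by = clinician

def file_to_ehr(note: DraftNote) -> None:
    # Hard gate: an unreviewed AI draft never enters the record.
    if note.signed_by is None:
        raise PermissionError("draft must be reviewed and signed first")
    print(f"Filed note signed by {note.signed_by}")

# In practice the draft text would come from an LLM call over
# PROMPT_TEMPLATE; a canned string stands in for it here.
draft = DraftNote(text="S: Patient reports calf pain when walking...")
draft.sign("clinician_0042", final_text="S: Reports claudication, 2 mo...")
file_to_ehr(draft)
```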
Investors, health systems focus on administrative burden
Given providers’ heavy administrative workloads, and worries about errors or bias in models more deeply involved in clinical decision-making, products that tackle administrative tasks are among the top priorities for AI adoption.
Some investors are more interested in automating those operational tasks too.
“We’re going to continue to focus on the unsexy back-office automation that’s going to alleviate the workforce burden, truly, and nothing related to clinical as it relates to AI,” said Payal Agrawal Divakaran, partner at .406 Ventures.
Administrative AI is also bringing in more venture capital dollars this year, according to a report published by Silicon Valley Bank earlier this month. So far in 2024, administrative AI companies have raised $2.4 billion, compared with $1.8 billion for clinical AI, likely because clinical products, especially decision support tools, face steeper regulatory and institutional hurdles.
“Where we’re seeing all that action is on administrative tasks, but also on pre-authorization, more low-value tasks,” Megan Scheffel, head of credit solutions for life science and healthcare banking at SVB, said in an interview. “You could take your office staff and move them to higher value projects.”
Large language models now present a major opportunity as drafting tools, including for clinical notes or nurse handoff documents, Greg Corrado, distinguished scientist and head of health AI at Google Research, said during a panel.
The oversight is built in because providers have to review the output before finalizing it. Quality is also easier to evaluate, by asking users about their experience or checking how many revisions they had to make, he said.
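One concrete way to run the revision check Corrado describes is to diff each AI draft against the note the clinician ultimately signed; drafts that come back heavily rewritten saved little time. A minimal sketch using the standard library, with invented sample text:

```python
import difflib

def revision_fraction(draft: str, final: str) -> float:
    """Share of the draft the clinician changed: 0.0 means untouched,
    1.0 means fully rewritten, based on token-level similarity."""
    sim = difflib.SequenceMatcher(None, draft.split(), final.split()).ratio()
    return 1.0 - sim

# In a pilot, draft/final pairs would come from the EHR audit trail.
draft = "Patient reports mild calf pain after walking two blocks"
final = "Patient reports severe calf pain after walking one block"
print(f"Revised: {revision_fraction(draft, final):.0%}")  # Revised: 33%
```

Aggregated per clinician, a metric like this pairs naturally with the user-experience surveys Corrado mentions.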
But buyers still have to be methodical when evaluating operational or administrative tools, testing them against the health system’s local patient data, said Todd Schwarzinger, partner at Cleveland Clinic Ventures, in an interview. Cleveland Clinic’s governance process also probes what data a tool uses, how that information is protected and whether the product is safe and positively impacts patient care.
Administrative or operational products, like ambient scribes or revenue cycle management tools, are a safer place to start.
“You’re not running the risk of making clinical decisions, right?” he said. “That level of trust is not there. I think it’s going to take time.”
Preparing for AI deployment
Though AI shows promise at alleviating provider burnout, health systems face challenges in getting providers on board, setting up pilots, establishing governance policies and rolling the products out.
In one example shared at HLTH, St. Louis-based provider BJC recruited for an ambient note-taking pilot by using vendor data to identify clinicians who took many days to sign off on documentation or wrote long notes, said Michele Thomas, associate chief clinical information officer for ambulatory at BJC Healthcare and chief medical information officer at BJC Medical Group.
“I think we had one person out of 20 respond. So right away we’re thinking the people who need it the most aren’t interested in that,” she said during a panel.
They decided to switch course, inviting anyone interested to join the pilot — prompting a quick response, Thomas said.
Still, it’s important to think through which providers should participate in testing; some doctors didn’t understand the requirements of participating in a pilot, such as the need to spend time providing feedback or to manage frustrations with a new product.
Health systems should also consider what outcomes they want when adopting AI tools. Many physicians didn’t see much time savings because they spent so long editing their notes, not to correct errors but to bring the notes in tune with their personal writing style, Thomas said. In contrast, advanced practice providers were quicker to review and sign off on the ambient notes.
“You really have to decide what your ROI is. Are you looking for a financial savings? Are you looking for time savings? Are you looking for hard numbers that justify this technology? Or is it softer than that? Are you looking for patient satisfaction?” she said.
Cybersecurity — already a challenge for the healthcare sector more generally — is also key to AI deployment.
Organizations should think through the same questions they’d ask any system that uses protected health information, Melanie Fontes Rainer, director of the HHS’ Office for Civil Rights, said in an interview.
If they enter into a relationship with a developer, do they have a business associate agreement in place? Are they thinking about data deletion policies for information stored in the cloud? Are they considering who needs to have access to the data?
“I think there’s certainly a balance to walk here, but it requires us all to be responsible and think about how we’re using this information and how it might harm our systems, our patients, and how we can take proactive steps to protect it,” she said.