What it takes to engage the clinical workforce in AI



BOSTON – When asked about the rise of artificial intelligence and when AI will become an integral part of the healthcare system, Dr. Patrick Thomas, director of digital innovation in pediatric surgery at the University of Nebraska Medical Center College of Medicine, said clinical training needs to be overhauled.

“We need to prepare clinical students for the real world they’re about to enter,” he said at the HIMSS AI in Healthcare Forum on Thursday.

Moderating a discussion on helping the healthcare workforce trust and adopt AI — one that emphasized governance, keeping humans in the loop, and sharing responsibility for clinical AI quality with developers — Thomas asked the other panelists how they address skepticism and distrust among clinicians.

Joining Thomas on the panel were Dr. Sonya Makhni, medical director of the Mayo Clinic Platform, Dr. Peter Bonis, chief medical officer of Wolters Kluwer Health, and Dr. Antoine Keller of Ochsner Lafayette General Hospital.

Expanding clinical resources

Using large language models to ease the heavy cognitive burden facing clinicians still poses many complications, from bias and hallucination to cost.

Bonis noted that application developers will likely have to bear system development costs on top of the cost of the underlying foundation models they build upon.

Keller added that the healthcare industry generates more information than the clinical workforce can absorb.

“We don’t have the manpower to respond” with accurate clinical decisions in a timely manner, he said. While developers focus on risk mitigation — building in safeguards to address concerns about using AI — giving clinicians a level of comfort with the technology is essential to adoption.

Keller, a cardiac surgeon, described how Louisiana-based Ochsner Health equips community health partners with an AI-powered screening tool called Heart Sense to drive interventions.

By enabling diagnosis in underserved communities with low-cost technology, the AI tool “dramatically expands the workforce,” he said.

Improving access in underserved communities

Leveraging the cardiac screening tool not only improves utilization at Ochsner, it also focuses attention on the patients who need care most.

Thomas asked what AI adoption looks like when a healthcare organization doesn’t have a data scientist on staff — and how Ochsner’s public health partners come to understand AI tools and how to use them.

Keller said there is a lot of hesitancy, and that serving these communities well takes dedicated support.

“You have to be present and aware of the obstacles or problems people have when using technology,” he explained.

He said, however, that people in areas with a shortage of health services are grateful for access to the technology.

One of the key diagnostic criteria – a heart murmur – is required before a patient can undergo aortic valve replacement. Using the AI-driven screening tool, the health system found that 25% of people over the age of 60 in its community have a pathological heart murmur – a condition that can be treated with surgical intervention.

“Prevalence data show that there is a large proportion of the population that is undiagnosed,” said Keller.

Using AI to identify which patients are at risk is a significant advantage, allowing treatment before patients develop irreversible dysfunction.

But he said adoption depends on having a well-trained workforce, and that training must be visual – “with something tangible” – and understandable even to those with limited formal education.

High stakes and shared responsibility

“The stakes for clinical care are high,” Bonis acknowledged, noting that keeping humans in the loop on clinical decisions is a guiding principle for developing trustworthy AI, in all its nuances.

While frontline doctors aren’t always concerned with how the AI “sausage is made,” “from a vendor perspective, caution is important,” he said.

The question, Makhni said, is how to connect expertise across the entire AI lifecycle.

The Mayo Clinic Platform works directly with AI developers and with Mayo Clinic to find ways to deploy clinical AI in a user-centric way — “and then communicate that information transparently to empower the end user.”

Such multidisciplinary review can determine whether an AI developer has attempted to assess bias and whether that information can be communicated to clinicians in a way they understand.

“We meet [developers] where they are on their journey,” she said, but the goal is for them to demonstrate principles of safety, fairness and accuracy.

Considering the digital divide and asking the clinical workforce about its concerns is critical to delivering safe AI systems — a burden that cannot be placed solely on users.

“Sometimes it should also fall on the solution developer,” she said. “We have to take collective responsibility.”

She added that while healthcare can’t solve every AI challenge quickly, “we can take a measured approach.”

Andrea Fox is senior editor of Healthcare IT News.
Email: [email protected]

Healthcare IT News is a publication of HIMSS Media.
