
Ethical AI-Based Transcription Technology Helps WellPower Cut Documentation Time



A recent Gartner survey found that most customers are “afraid of AI”: 64% said they would prefer companies not to incorporate AI into their customer experience. Customers are also concerned about AI and disinformation (42%), data privacy (34%), and bias/inequality (25%).

Ethical AI can help organizations create innovative, trustworthy user experiences – protecting their brands, allowing them to maintain a competitive edge and foster better customer relationships. And ethical AI is part of the story at WellPower.

PROBLEM

In the mental health field, there aren't enough therapists to help everyone who is struggling. Community mental health centers like WellPower in Colorado serve some of the most vulnerable populations in need of help.

Due to the complex needs of the people they serve, WellPower clinicians face more complex documentation rules than therapists in private practice. These additional rules create an administrative burden, consuming time that could otherwise be spent on clinical care.

WellPower has been researching how technology can act as a tool to enhance the mental health workforce.

The provider organization turned to the Iliff Innovation Lab, which works with AI, to see how health information technology could make it easier for people to connect with health care, such as through telehealth; how people could get treatment faster by enabling high-fidelity evidence-based practices and remote treatment monitoring; and how WellPower could reduce administrative burdens by enabling therapists to create accurate, high-quality documentation while focusing more on providing care.

“When used properly, clinical documentation is a particularly promising area for AI deployment, especially in behavioral health,” said Wes Williams, CIO and vice president of WellPower. “Large language models have proven to be particularly effective at summarizing large amounts of information.

"In a typical 45-minute psychotherapy session, there is a lot of information that needs to be summarized to document services," he continued. "Staff often spend 10 minutes or more completing documentation for each service, adding up to hours that could otherwise be spent providing clinical care."

PROPOSAL

Williams said WellPower’s commitment to health equity drives the company’s approach to deploying technology, making the partnership with Iliff essential to furthering that mission.

"AI tools are often black boxes that hide how they make decisions and can perpetuate biases that lead to the health care disparities faced by the people we serve," he explained. "This puts us in a difficult position, because not using these emerging tools withholds their benefits from those who need them most, but adopting them without assessing for bias can widen disparities if an AI system with a history of health care bias is integrated into care.

“We came up with a system that leverages AI as a passive listening tool that can join therapy sessions (both telehealth and in-person) and act as a kind of digital scribe, creating draft notes for our clinicians to review and approve,” he added. “However, we needed to ensure that the digital scribe could be trusted to create summaries of therapy sessions that were accurate, useful, and unbiased.”

Behavioral health data is some of the most sensitive data out there, in terms of privacy and security; these protections are necessary to ensure people feel comfortable seeking the help they need, he continued. For this reason, it is important for WellPower to thoroughly test any new system, especially one that relies on AI, he said.

RESULTS

In implementing the AI digital scribe tool, WellPower needed to ensure it did not infringe on the privacy or safety of the people it serves.

“Many therapists were initially hesitant to try this new system, citing these legitimate concerns,” said Alires Almon, director of innovation at WellPower. “We worked with the Iliff team to ensure that the digital scribe was built ethically with privacy in mind.

"One example: The system doesn't record therapy sessions, but encrypts the conversation immediately," she continued. "This means that at the end of a therapy session, the only thing stored is metadata about the topics covered during the session. With insights from the team at Iliff, we can ensure privacy for our patients while still freeing up more time for care."
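The metadata-only pattern Almon describes can be illustrated with a minimal sketch: the transcript exists only in memory, topic metadata is derived from it, and the raw conversation is never persisted. This is an illustrative assumption, not Eleos' actual implementation; the names `extract_topics`, `SessionMetadata`, and the keyword list are hypothetical.

```python
# Hypothetical sketch of a metadata-only scribe pipeline: derive topic
# tags from an in-memory transcript, then discard the raw conversation.
# Not the actual Eleos design; all names here are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class SessionMetadata:
    """The only artifact persisted after a session ends."""
    session_id: str
    ended_at: str
    topics: list[str] = field(default_factory=list)


def extract_topics(transcript: str) -> list[str]:
    # Placeholder topic tagger; a production system would use an NLP model.
    keywords = {
        "sleep": "sleep hygiene",
        "anxiety": "anxiety",
        "work": "work stress",
    }
    lowered = transcript.lower()
    return [topic for word, topic in keywords.items() if word in lowered]


def summarize_and_discard(session_id: str, transcript: str) -> SessionMetadata:
    """Build topic metadata from the transcript; the transcript itself
    never leaves this function and is never written to disk."""
    meta = SessionMetadata(
        session_id=session_id,
        ended_at=datetime.now(timezone.utc).isoformat(),
        topics=extract_topics(transcript),
    )
    del transcript  # raw conversation is dropped with this scope
    return meta
```

The design choice this illustrates is that privacy comes from what is stored, not only from how storage is protected: if only topic metadata survives the session, there is no recording to breach.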

She added that implementing an AI-powered platform to assist with transcription and development of progress note drafts has significantly improved the treatment experience for both staff and the people WellPower serves.

“Since implementing the Eleos system, WellPower has seen a significant improvement in staff progress note completion,” Almon reports. “Three out of four outpatient therapists are using the system.

"For this group, the average time to complete documentation improved by 75% and the total time spent documenting was reduced by 60%, cutting the time it took to write notes from 10 minutes to four," she said. "Our therapists were so pleased with Eleos that some said they would think twice about leaving WellPower because of their experience with it."

ADVICE FOR OTHERS

Almon noted that artificial intelligence is a new and exciting prospect for health IT, but it carries perceptions shaped by science fiction, media hype and its actual capabilities.

"It is important for your organization to educate your staff about AI and define it for them," she advised. "Explain how AI will be used and what processes and policies will be put in place to protect them and their clients. AI is not perfect and will continue to evolve.

"If possible, before you start deploying AI-enabled tools, take your staff's pulse to assess their level of understanding of AI and how they feel about it," she continued. "Partnering with a program like Iliff's AI Trust framework not only helps you choose ethical technologies, but also communicates that your organization has considered the potential harms AI-enabled platforms can cause."

That message, she added, is even more important than the results achieved.

"Finally, reassure your staff that they cannot be replaced by AI," she concluded. "Human relationships are the most important relationships in an individual's healing process. AI is there to support humans in their role; it is an assistive technology. AI can support and help, but it can never replace the therapeutic connection."

Follow Bill’s HIT coverage on LinkedIn: Bill Siwicki
Email him: [email protected]
Healthcare IT News is a publication of HIMSS Media.
