Northwell narrows maternal health disparities with AI chatbot


Northwell Health is trying to address maternal health disparities with the help of an artificial intelligence chatbot.

Northwell Health’s Pregnancy Chat tool, developed in partnership with Conversa Health, guides patients through their prenatal and postnatal journey while screening for social barriers and mental health issues.

The tool is part of an initiative within Northwell’s Center for Maternal Health that aims to reduce maternal mortality, particularly among Black women. Dr. Zenobia Brown, senior vice president of population health and deputy chief medical officer at the New Hyde Park, New York-based health system, said a major part of that work is addressing gaps in healthcare, behavioral health, education and community resources.

Using a “high-tech, high-touch” approach, the chatbot helps Northwell providers manage high-risk pregnant patients through personalized education and patient assessment. The tool gives patients information for each stage of pregnancy, such as blood pressure monitoring, prenatal testing, birth planning and breastfeeding support, and regularly screens them for social and mental health needs.

The chatbot is integrated with Northwell’s care management team and can direct patients to relevant resources and alert providers if intervention is needed. When a patient tells the chatbot they are experiencing medical complications, the tool triggers a call from a Northwell representative or directs the patient to the emergency department.

“I could have someone call mothers three times a week and ask them how they’re doing. But this lets us do so much more with technology than we can do with humans,” Brown said.

Since its launch earlier this year, the AI chatbot has shown promising preliminary results, according to the health system. An internal survey found that 96% of users expressed satisfaction with their experience. In addition, the chatbot effectively identified patients with complications and guided them to appropriate care, Brown said.

For example, the chatbot identified a woman with postpartum depression even though she had not disclosed her symptoms during a prior mental health screening with her physician. The patient confided to the chatbot that she was having suicidal thoughts, prompting the care team to respond with mental health and psychiatric support.

According to a study by researchers at the University of California, San Diego, published in the journal JAMA Internal Medicine in April, AI-powered chatbots in healthcare can enhance engagement, producing more detailed and empathetic responses than traditional doctor-patient interactions.

“These chatbots never tire,” said John Ayers, associate director of innovation for the infectious diseases and global public health division of the UC San Diego School of Medicine. The findings suggest that AI chatbots have the potential to increase patient satisfaction while reducing the administrative burden on clinicians.

“We’re using these really cool, fancy tools to get back to the things that we know for sure work in healthcare, which is listening to your patient, letting them ask a lot of questions and engaging them in their care,” Brown said.

The approach could also increase what doctors can bill insurers by enabling them to answer more patient emails, Ayers said. However, to realize the technology’s full potential, tools must be tailored to the needs of each individual patient. Many chatbots on the market are designed to reduce clinician burnout and streamline patient management; for patients, such tools can feel like a phone tree, he said. A chatbot should hand off to a real person when a patient needs more complex assistance.

Bioethicists are wary of treating AI-powered chatbots as a definitive solution to patient engagement and have called for closer scrutiny.

“Regulation has to be in place in some form,” said Bryn Williams-Jones, a bioethicist at the University of Montreal. “It’s not clear what form it will take because the thing you’re trying to regulate is evolving extremely rapidly.”

To deploy the technology responsibly now, healthcare providers should clearly understand the methodology behind the software, vet how it works and create an accountability mechanism for responding when something goes wrong. These tools should be designed in line with standards of care and guard against overreliance, he said.
