Novel chatbot could ask emergency department visitors about social needs

Americans visit hospital emergency departments nearly 130 million times per year. Although the focus of these visits is to address acute illness and injury, doctors are increasingly finding that social needs, such as food and housing insecurity, place many patients at higher risk of getting sick and requiring emergency care.

In order to better serve patients and possibly prevent future emergency department visits, doctors need a way to assess incoming patients to establish a wider context behind their visit.

A team led by the University of Washington developed a chatbot that could ask emergency department visitors about social needs, including housing, food, access to medical care and physical safety. The team tested it on 41 patients in Seattle and Los Angeles emergency departments. Results show that two groups of patients preferred the chatbot: patients who had less than a middle school level of health literacy and patients who appreciated establishing emotional connections.

The team presented these results in July at the Conference for Conversational User Interfaces 2021.

"A few years ago there was a huge buzz around chatbots, and then people started realizing that maybe they aren't meant for everything. We have been trying to figure out opportunities where having a chatbot would actually be meaningful and make sense."

Gary Hsieh, co-senior author, UW associate professor in the human centered design and engineering department

One good opportunity involved collaborating with emergency department doctors.

"We want to understand the upstream issues that bring people into the emergency department. What are the social needs of the patients that we serve and how can we develop interventions that address these needs?" said co-author Dr. Herbert Duber, associate professor of emergency medicine in the UW School of Medicine. "For many people, including those with low literacy levels, a chatbot makes so much sense for collecting this information."

The team designed a chatbot named HarborBot, after the hospitals where it was tested. HarborBot takes patients through a social needs survey that was developed by the Los Angeles County Health Agency. This survey asks patients 36 questions related to demographics, finances, employment, education, housing, food and utilities. It also asks questions related to physical safety, legal needs and access to care.
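
The article does not describe how the survey is encoded inside HarborBot. As a rough, illustrative sketch, a sectioned question bank like the one described above could be represented as follows; the individual questions and field names here are assumptions, not the actual Los Angeles County Health Agency items.

```python
# Illustrative sketch only: one way a chatbot could store a sectioned survey
# like the 36-question instrument described above. The questions below are
# placeholders, not the actual Los Angeles County Health Agency items.
SOCIAL_NEEDS_SURVEY = [
    {"section": "demographics",    "questions": ["What is your age?"]},
    {"section": "housing",         "questions": ["Do you have a steady place to sleep?"]},
    {"section": "food",            "questions": ["In the past month, did you worry about food running out?"]},
    {"section": "access to care",  "questions": ["Do you have a regular doctor or clinic?"]},
    {"section": "physical safety", "questions": ["Do you feel safe where you live?"]},
]

def iter_questions(survey):
    """Yield (section, question) pairs in the order the bot would ask them."""
    for block in survey:
        for question in block["questions"]:
            yield block["section"], question

for section, question in iter_questions(SOCIAL_NEEDS_SURVEY):
    print(f"[{section}] {question}")
```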

HarborBot is displayed on a tablet as a typical chat window with the patient's and bot's conversation showing up in different colored bubbles. HarborBot's chat bubble shows animated ellipses when the bot is "typing."

Based on a previous study, the researchers improved the chatbot's efficiency and social skills.

For efficiency, the researchers:

  • modified how long the bot appeared to be typing to match the length of the text it displayed, so the bot "types" for a shorter time before a shorter response (a rough sketch of this pacing follows the list)
  • added a question at the beginning of the interaction that would allow patients to stop HarborBot from reading all of its questions and responses aloud
  • placed the patients' answer options in the same part of the screen so that patients, who were often tired or in pain, could respond without having to move their hands
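
The article does not give HarborBot's exact pacing rule, but the idea of scaling the typing indicator to message length can be sketched roughly as below; the characters-per-second rate, delay bounds, and function name are illustrative assumptions, not the bot's actual parameters.

```python
# Illustrative sketch: show the animated "typing" ellipses for a duration
# proportional to the message length. The rate and bounds are assumed values,
# not HarborBot's actual settings.
def typing_delay_seconds(message: str,
                         chars_per_second: float = 30.0,
                         min_delay: float = 0.5,
                         max_delay: float = 3.0) -> float:
    """Return how long to show the typing indicator before displaying a message."""
    raw = len(message) / chars_per_second
    return max(min_delay, min(raw, max_delay))

print(typing_delay_seconds("Sure."))  # short reply -> short pause
print(typing_delay_seconds("That must be stressful, thank you for letting me know."))
```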

To increase the empathy of the interaction, the team changed the bot's reactions to better match the content of the questions and patient responses.

"Some of the questions are quite sensitive (there are questions about violence and sexual abuse) and the bot's original responses said 'Sure,' 'Great' or 'Thanks for sharing with us,'" said lead author Rafał Kocielnik, who completed this project as a doctoral student at the UW and is now a postdoctoral fellow at Caltech. "We tried tailoring its responses in a way that made them more appropriate for the content and specific to the patients' responses, such as 'That must be stressful, thank you for letting me know.'"

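The article does not detail how HarborBot chooses these tailored acknowledgments internally; a minimal sketch, assuming a simple lookup keyed by question sensitivity and the patient's answer, might look like this. The categories and most phrasings below are invented for illustration.

```python
# Illustrative sketch: pick an acknowledgment matched to the question's
# sensitivity and the patient's answer. The categories and phrasings are
# assumptions, apart from the example response quoted in the article.
ACKNOWLEDGMENTS = {
    ("sensitive", "yes"): "That must be stressful, thank you for letting me know.",
    ("sensitive", "no"):  "Thank you for answering; I'm glad that is not a concern.",
    ("routine", "yes"):   "Thanks, got it.",
    ("routine", "no"):    "Okay, thank you.",
}

def acknowledge(question_category: str, patient_answer: str) -> str:
    """Return an acknowledgment suited to the question and the answer given."""
    key = (question_category, patient_answer.strip().lower())
    return ACKNOWLEDGMENTS.get(key, "Thank you for sharing that with me.")

print(acknowledge("sensitive", "Yes"))  # empathetic reply to a sensitive disclosure
print(acknowledge("routine", "No"))     # brief reply to a routine question
```
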
After HarborBot received its upgrades, the researchers tested it at two emergency departments: one at Harborview Medical Center in Seattle and the other at the Harbor-UCLA Medical Center in Los Angeles.

For both locations, the researchers worked evening and overnight shifts (between 8 p.m. and 1 a.m. in Seattle and between 4 p.m. and 4 a.m. in Los Angeles). They collaborated with triage nurses to select potential participants and then took participants to a visitor room where they could still hear announcements. After the patients signed a consent form, they completed:

  • two surveys to gauge health literacy. One survey asks patients to pronounce health-related terms and the other asks patients to answer questions about the nutritional facts label on a pint of ice cream
  • the social needs survey, as both a web form through SurveyGizmo and an interaction with HarborBot, given in a randomized order (a rough sketch of this counterbalancing follows the list)
  • evaluations for both the web form and HarborBot
  • a survey to gauge a patient's desire for emotional interactions
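
The article only notes that the two formats were given in a randomized order. A minimal sketch of per-participant counterbalancing, with an invented function name and labels, could look like this.

```python
import random

# Illustrative sketch: randomize whether a participant completes the web form
# or the chatbot first, so neither format benefits from always coming second.
def assign_order(participant_id: int, seed: str = "study") -> list:
    """Return the order of the two survey formats for one participant."""
    rng = random.Random(f"{seed}-{participant_id}")  # reproducible per participant
    order = ["web form", "HarborBot"]
    rng.shuffle(order)
    return order

for pid in range(1, 6):
    print(pid, assign_order(pid))
```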

At the end, the researchers interviewed the participants about the experience.

The team was not surprised to find that many people with low health literacy preferred the HarborBot version of the survey: 17 of the 20 low-literacy participants chose HarborBot, compared with 8 of the 21 high-literacy participants. People who valued emotional connection also liked the chatbot, but these two groups didn't necessarily overlap.

"We thought maybe people with low health literacy would also be more in need of emotional interaction," Kocielnik said. "But it turns out, the two groups are not strongly correlated."

Of the 23 participants who scored high on the emotional interactions questionnaire, 18 chose HarborBot. Meanwhile, only 7 of the 18 participants who scored lower on that questionnaire preferred it.
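
For reference, the preference rates implied by the counts reported above work out as follows; this is just arithmetic on the published numbers, not additional study data.

```python
# Arithmetic on the counts reported above: the share of each group that
# preferred HarborBot over the web form.
groups = {
    "low health literacy":                    (17, 20),
    "high health literacy":                   (8, 21),
    "high desire for emotional interaction":  (18, 23),
    "lower desire for emotional interaction": (7, 18),
}

for name, (chose_bot, total) in groups.items():
    print(f"{name}: {chose_bot}/{total} = {chose_bot / total:.0%}")
```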

"It's important to understand that chatbots can benefit people in different ways," said co-author Raina Langevin, a UW doctoral student in human centered design and engineering.

In the future, the team plans to design a survey system that could tailor the experience to each user. For example, it could start out as the chatbot, but then based on how a user is answering the questions, it could shift into more of a survey format.
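
That adaptive system does not exist yet; as a rough sketch of the idea, one hypothetical rule could switch from chat to a plain form when a user answers several questions quickly and tersely. The signals and threshold below are invented for illustration.

```python
# Illustrative sketch of the adaptive idea described above: start in chatbot
# mode and fall back to a plain form when the user's recent behavior suggests
# they just want to get through the questions. Signals and threshold are assumed.
def choose_next_mode(recent_answers: list, current_mode: str = "chatbot") -> str:
    """Return 'chatbot' or 'form' based on how the user has been answering."""
    if current_mode != "chatbot" or len(recent_answers) < 3:
        return current_mode
    terse = sum(1 for a in recent_answers[-3:]
                if a["seconds_to_answer"] < 2 and not a.get("free_text"))
    return "form" if terse == 3 else "chatbot"

answers = [
    {"seconds_to_answer": 1.2, "free_text": ""},
    {"seconds_to_answer": 0.9, "free_text": ""},
    {"seconds_to_answer": 1.5, "free_text": ""},
]
print(choose_next_mode(answers))  # -> "form": the user is moving fast, skip the chit-chat
```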

"Our vision would be some sort of kiosk people could use while they are waiting. Or even a QR code that people can scan with their own devices and then answer these questions," Hsieh said. "Ultimately we want to connect people entering emergency departments as smoothly as possible with the resources that they need."
