
End of Life planning with a chatbot assistant 

Exploring how conversational AI can help facilitate sensitive conversations
UCSD DESIGN LAB x TRAITS AI
TIMELINE
1 year (July 2020 - June 2021)

ROLE
UX Designer

WHAT I DID
UX Research, UI/UX, Prototyping, Conversation Design, User Testing

*NDA-limited case study*

Overview

CONTEXT

Artificial intelligence is already making a difference in healthcare. One example is Woebot, a mental health chatbot that has shown significant reductions in depression among its users. During the summer of 2020, I was recruited by Heidi Rataj, the project lead for an AI Design Sprint, to collaborate with Traits AI, a start-up that develops animated artificial intelligence avatars with distinct personalities.

PROJECT BRIEF

Our project was to use human-centered artificial intelligence and conversational design concepts to design a chatbot app that would aid palliative care patients with their advance care planning (also known as end-of-life planning). I worked with three other UCSD student designers to research pain points in end-of-life planning, design a chatbot assistant, and design the app it would live in.

Desk Research

Our first step was to thoroughly educate ourselves on advance care planning and identify any problems that arise with it. We focused our preliminary research on three main components: patients, healthcare providers, and advance directives (AD).

Advance care planning allows the patient to plan the healthcare they would want to receive in the event of a medical crisis. The initial problem we saw in advance care planning was that only ⅓ of hospital patients have an advance directive (a “living” will in which the patient writes all of their end-of-life wishes).

Many adults have never even heard of advance directives, which often results in stressful end-of-life discussions.

PATIENTS

We wanted to understand the underlying reasons behind why the majority of patients lack advance directives. Here are some of the main points we found:

  • While 90% of patients recognize the importance of end-of-life talks, only 27% actually have them, according to a 2014 report from Kaiser Health News.

  • Education level, religious beliefs, and acculturation are significant predictors of who completes an AD.

  • Many older patients are asked about ADs for the first time only when they become hospitalized.

  • Additionally, many have to make end-of-life choices without loved ones by their side. 

  • Patients who engage in end-of-life discussions prior to a potential hospitalization are less likely to encounter these stressful situations. 

HEALTHCARE PROVIDERS

From the healthcare provider’s perspective, there are many barriers to effective end-of-life talks. Oftentimes, physician-patient communication is inadequate, stemming from disparities in cultural values and religious beliefs and, occasionally, from language and medical interpretation challenges. We found that doctors feel more confident when decisions are guided by an advance directive, which can lead to better treatment.

ADVANCE DIRECTIVE

We discovered that advance directives are oftentimes too vague, leading patients to include information that may not always be meaningful to the healthcare provider. ADs typically follow a one-size-fits-all style, which does not work for everyone.

We tried to complete an advance directive ourselves to better understand the perspective of the users we aimed to help. Our overall experiences with the process are summarized below.


As we learned more about the importance of advance care planning, we all felt the need to talk about it with our families. However, there was no natural segue into that uncomfortable conversation. When we talked with our families, we noticed four common patterns.

Our preliminary research showed us that patients are not completing advance directives because they are not giving any thought to end-of-life planning.

We hypothesized that a chatbot might encourage more people to fill out an advance directive properly.

Our team was curious about how AI could aid patients, so we decided to explore solutions centered on a chatbot assistant. 

How might we encourage patients to think about end-of-life care?


Competitive Analysis

We conducted a competitive analysis of existing alternatives that aim to simplify the process of completing an advance directive. We also examined existing chatbots, including those used in medical settings, evaluating the strengths and weaknesses of each option and noting how we wanted our own chatbot solution to look in the end.

 

Below is a screenshot of our competitive analysis that we created on Miro.

Competitive Analysis of alternatives to advance directives & chatbot assistants

User Research

RESEARCH OBJECTIVES

  • How do adults above the age of 50 feel about end-of-life planning?

  • What do healthcare providers, including clinical social workers and palliative care physicians, consider the main barriers to completing an advance directive?

  • How do healthcare providers and adults feel about a chatbot assistant helping people with end-of-life planning?

METHODS


Semi-Structured Interviews

To have conversations with participants that provide valuable insights for our research questions

Research Findings

Advance Directive User:

There are important details about end-of-life preferences that are not included in the advance directive, making it less useful.

Healthcare Worker:

Many healthcare workers agree that while ADs are a useful tool, they are not the most important part of advance care planning.

“[My husband] went into hospice and he lost his ability to speak quickly. I knew he wanted to die at home, but I didn’t know what kind of setting it should be...like visitors? Music? Hospital bed?...all those things could have been personalized. But I had to guess...and make assumptions. These things are lacking in the legal forms.”

-Advance Directive User 1

“People thinking about these issues and talking to their family/friends is way more helpful [than completing an AD in isolation].” 

-Healthcare worker 1

“I felt cheated because I felt like I did all the Advance Directive and stuff and it didn’t help”

-Advance Directive User 2

Reactions To a Chatbot:

We gathered opinions from physicians and others about a chatbot aiding in advance care planning. We received valuable feedback and examples of how a chatbot could help, some of which we incorporated into the design of our chatbot. We carefully noted any skepticism and confusion regarding the chatbot.

MODIFIED DESIGN CHALLENGE

After talking to healthcare providers, we realized that the problem was less about the advance directive and more about having end-of-life conversations with loved ones.

 

How might we encourage people to have end-of-life conversations with loved ones and possibly their healthcare providers?


Solution

To tackle this problem, we designed an artificial intelligence chatbot that guides patients through advance care planning by encouraging them to think about what brings them joy in their lives. The chatbot's persona was built around personality traits intended to best support patients during difficult times.

We chose the term joy rather than happiness because happiness is a fickle state of being, while joy is something that can be conjured by specific moments.


We wanted patients to truly understand what brought them the most comfort.

We emphasized that the chatbot would not replace the healthcare provider but rather support their role. It would act as a constant assistant whenever patients decided to contemplate their end-of-life planning.

CHATBOT SERVES TO:

#1

Get patients and loved ones thinking about their end-of-life care.

#2

Provide a framework for discussions.

#3

Provide healthcare workers with a useful end-of-life care plan for their patients.

Prototype

Next, we taught ourselves the fundamentals of AI conversation design and prototyped the chatbot's conversational dialogue and part of its functionality, as seen below. To make the dialogue flow more personalized to the user, we accounted for a variety of user contexts (e.g., health, religion). 


Dialogue flow with the chatbot educating the user about end-of-life planning.
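For illustration only, here is a minimal sketch of how a context-aware, multiple-choice dialogue flow like this could be represented. The node names, prompts, and fields are hypothetical placeholders, not the actual Traits AI implementation.

```python
# Hypothetical sketch of a branching dialogue flow, not the production Traits AI system.
# Each node holds the chatbot's prompt variants, the multiple-choice replies offered to
# the user, and the node to visit next for each reply. User context (e.g. health,
# religion) selects which variant of a prompt is shown.

FLOW = {
    "intro": {
        "prompt": {
            "default": "I'd like to help you think about the care you would want. Is now a good time?",
            "serious_illness": "We can go at whatever pace feels right. Would you like to start with what matters most to you?",
        },
        "choices": {"Yes, let's start": "joy", "Not right now": "schedule_later"},
    },
    "joy": {
        "prompt": {"default": "What brings you the most joy in your daily life?"},
        "choices": {
            "Family and friends": "comfort",
            "Faith or spirituality": "faith",
            "Being independent": "comfort",
        },
    },
    # ...remaining nodes elided...
}

def next_prompt(node_id: str, user_context: dict) -> str:
    """Pick the prompt variant that best matches the user's context."""
    prompts = FLOW[node_id]["prompt"]
    if user_context.get("serious_illness") and "serious_illness" in prompts:
        return prompts["serious_illness"]
    return prompts["default"]
```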

USER TESTING INSIGHT

The chatbot cannot respond if someone asks a question or shares something personal with the bot. This constraint can come off as insensitive.

CHANGES WE MADE

  • Let the user know of the chatbot's limitations. The chatbot is there to guide, encourage, and listen.

  • For the MVP, user input will consist only of multiple-choice options (see the sketch below).
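Under those constraints, a minimal MVP turn loop might look like the following sketch. The notice text and function are hypothetical, included only to show how multiple-choice-only input and an upfront limitations disclosure fit together.

```python
# Hypothetical sketch of the MVP turn loop under the constraints above:
# the bot discloses its limitations, and user input is restricted to the
# multiple-choice options it offers (free text is gently redirected).

LIMITATIONS_NOTICE = (
    "I'm here to guide, encourage, and listen. I can't answer medical questions, "
    "but I can help you think through what matters most to you."
)

def run_turn(prompt: str, choices: list[str]) -> str:
    """Show one prompt and return the option the user selected."""
    print(prompt)
    for i, choice in enumerate(choices, start=1):
        print(f"  {i}. {choice}")
    while True:
        raw = input("Choose an option: ").strip()
        if raw.isdigit() and 1 <= int(raw) <= len(choices):
            return choices[int(raw) - 1]
        # Anything else (questions, personal stories) is acknowledged and redirected
        # rather than ignored, so the constraint reads as supportive, not insensitive.
        print("Thank you for sharing. " + LIMITATIONS_NOTICE)
```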

At the end of the summer, we presented our design process and findings to stakeholders, including healthcare providers and other designers, where we received tons of positive feedback and gained future collaborators!

"I am floored and delighted by all that your team has achieved! They clearly listened deeply, especially to some of the more skeptical folks I sent their way!"

-Palliative Care Physician

UI Design

Next, we wanted to explore what the app that would house our chatbot would look like, so that we and our stakeholders could better visualize the full product. We started by creating a high-level architecture flow of the app's main screens, as shown below.


High level information architecture flow of the app

Then we paper-prototyped multiple layouts of each screen, using methods like Crazy 8s.


Paper Prototypes

After that, we merged all of our ideas into two flows for A/B testing, each with different design features. We conducted A/B tests on the information architecture, navigation options, and color scheme.


A/B Testing

DESIGN FOR ACCESSIBILITY

  • The primary target audience is older adults, so the app needs to be easy to use and accessible, with large, bold fonts.

  • The learning curve is steeper for older adults, so we weighed the tradeoff between on-screen clutter and forcing users to recall hidden options.

While the functionality and design of the app are important, it is equally important to understand the user journey and where the app comes into play. So our team hosted a design workshop, inviting students, healthcare providers, and community members to explore our chatbot and app and discuss with one another how it could help solve the current problems with advance care planning. My role was to communicate our design progress thus far and facilitate the ensuing discussions.

Next Steps

Throughout the year, our team continued to design high-fidelity screens. We had plans to observe clinical social workers as they interacted with patients to help model the conversation dialogues. As I neared graduation in June 2021, the project slowed down while the team looked for funding to start development.

Reflection

This was a remarkable experience working in a field I didn't even know existed: human-centered design in healthcare! I learned a lot about working in cross-functional teams, handling multiple roles, and communicating with stakeholders. This project was the first time I interviewed real user participants, so my confidence and moderating skills started to develop. I also learned how to perform user testing through A/B tests and create different prototype flows in Figma. 

 

Thinking about this project years later, I believe a series of co-design workshops with healthcare providers would have been extremely effective.


Here is my design team, working remotely through broken audio, low-quality cameras, and poor lighting. :)

*NDA Limited Case Study. Graphics and images from slides.go, Storyset, and Undraw
