How Are Academy Members Embracing This Transformative New Technology?
Larry Beresford
Artificial intelligence (AI) is a now-ubiquitous technology, sometimes described as computers programmed to think and act like humans or to simulate intelligent behavior. Since the November 30, 2022, launch of ChatGPT, a conversational AI program by OpenAI (a San Francisco-based company founded in 2015 to develop artificial general intelligence), the technology has spread rapidly while generating controversy and questions. For instance, what has been, and what will be, its impact on hospice and palliative medicine, perhaps the ultimate person-centered branch of health care?
“One of the important things I would like everyone to know is that AI is here. It has already been integrated, it’s happening,” said Sonal Admane, MD MPH, a physician in the department of palliative, rehabilitation, and integrative medicine in the division of cancer medicine at The University of Texas MD Anderson Cancer Center in Houston, TX. “It’s even on your favorite web search portal. It’s happening all around us, whether we love it or we hate it. And it’s going to integrate into the medical field as well.”
Dr. Admane is a hospice and palliative care physician who works mainly with patients with cancer. She is also a clinical researcher studying AI in palliative and supportive care, and she leads an institution-wide journal club at MD Anderson that discusses AI-related clinical research and industry articles, with a focus on improving understanding and education about AI and digital health technology.
While forms of AI have been around for years, the technology has recently evolved from performing basic computational tasks to more sophisticated chatbots, which can display human-like reasoning and even respond to questions. “The goal behind the AI technology—or if you want to call it augmented intelligence—is to have machines or computers think like humans and even surpass human intellectual ability,” Dr. Admane said.
This all may seem very technical to many palliative care clinicians. According to Dr. Admane, though, “our role as clinicians and healthcare personnel is to make sure that we monitor this technology and see it in the context of our patient care. It’s important that we understand a little about the technological framework of AI applications, but mainly we need to make sure that they are accurate, as free of bias as is possible, and equitable, not worsening health disparities.”
“Our role as clinicians and healthcare personnel is to make sure that we monitor this technology and see it in the context of our patient care.”
“If we can view AI as an adjunct to patient care, then we can use this technology to make our work more efficient, allowing us to provide better patient care, whether that means more time with patients while automating some administrative tasks that can take clinicians away from their patients or improving our symptom assessment, triage and referral, or prognostication.” Clinical workflow in palliative care can be complex and can be broken down into multiple steps. A lot of research is now looking at where to integrate AI and how to make this process faster, Dr. Admane explained.
“What can we automate? The results are mixed. Some studies show that it improves efficiency; others show that it does not.” Dr. Admane collaborated with Dr. David Hui, director of research; Dr. Akhila Reddy, section chief; and Dr. Eduardo Bruera, chair of the department of palliative care at MD Anderson, on two recent studies examining how conversational AI agents define key terms in hospice and palliative care.1,2
The researchers tested early iterations of common conversational AI agents in order to see how those agents behave. “We found that even though they gave us seemingly accurate answers, we had six physician reviewers take a closer look and there were a lot of deficiencies in their answers. We sometimes got completely inaccurate examples of the terms that we asked them to define or fabricated citations and nonfunctional links to webpages of what appeared to be legitimate hospice and palliative care organizations. And this is a common drawback of all the conversational AI agents, which we are still attempting to study,” Dr. Admane said.
Just Another Clinical Tool?
Arif Kamal, MD MBA MHS FASCO FAAHPM, chief patient officer for the American Cancer Society and the Academy’s current president, compares AI to other tools used by hospice and palliative medicine clinicians. “I see this as another natural evolution in augmenting our clinical practice,” he said.
“I certainly don’t think it takes away from the high-touch, compassion-centric care that we deliver, particularly as it opens up time and space for compassion by taking over some of the components of our clinical practice that are complex or can be automated,” he said. “For example, if you can use AI to help do your documentation and note taking and billing and things like that, it can also help you make your clinical practice more efficient.”
Dr. Kamal recently penned a “Message from the President”3 to Academy members about AI’s place in their work. He acknowledged that it may feel to some like a threat to the ethos of palliative medicine or to its high-touch delivery of care. But if AI is redefined as assistive—”helping humans be smarter, more efficient, and more resilient at their jobs, [it becomes] a complement to human-delivered palliative care,” he wrote. “At its essence, AI harnesses the mathematical skills of computers at speeds and with complexity beyond human capabilities.”
Dr. Kamal’s message highlighted three primary areas where AI could revolutionize palliative medicine. The first is identifying appropriate, high-priority candidates for palliative care from vast databases such as a health system’s electronic health record. The second is identifying trends in that population and even suggesting interventions, such as home-based palliative care visits, before the patient confronts an actual crisis. The third is helping to prevent burnout in the palliative care workforce by offloading some administrative tasks to AI assistants, allowing clinicians to focus more of their time and energy on their clinical work.
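To make the first of these areas concrete, here is a minimal, purely illustrative sketch of the kind of rule-based screen an EHR-driven identification tool might start from. The field names, thresholds, and the screen_candidates function are hypothetical and are not drawn from any system described in this article.

```python
# Purely illustrative: a simple rule-based screen over EHR-style records.
# Every field name and threshold here is hypothetical, not an actual clinical model.
import pandas as pd

def screen_candidates(ehr: pd.DataFrame) -> pd.DataFrame:
    """Flag patients who may warrant review for a palliative care referral."""
    flags = (
        (ehr["admissions_last_6_months"] >= 2)      # repeated hospitalizations
        & ehr["serious_illness_diagnosis"]          # eg, advanced cancer or heart failure
        & ~ehr["advance_care_plan_on_file"]         # no documented advance care plan
    )
    return ehr.loc[flags].sort_values("symptom_burden_score", ascending=False)

# Toy data to show the idea; a human team would review every flagged patient.
patients = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "admissions_last_6_months": [3, 0, 2],
    "serious_illness_diagnosis": [True, True, True],
    "advance_care_plan_on_file": [False, True, False],
    "symptom_burden_score": [7, 2, 5],
})
print(screen_candidates(patients)[["patient_id", "symptom_burden_score"]])
```

In practice, the real systems discussed later in this article use far richer predictive models; the point of the sketch is only that the screening step is, at heart, a set of criteria applied across a database.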
“I encourage you to learn with an open mind and to lean into opportunities that could impact our day to day with colleagues and patients,” Dr. Kamal noted. “If there is an AI committee or task force at your workplace, get a seat at the table. If other departments are leading AI, they won’t be thinking of—or fully understand—how it relates to palliative care. We need representation in these rooms. This is our chance to use our voices, our expertise, and our compassion to shape AI for good.”
“This is our chance to use our voices, our expertise, and our compassion to shape AI for good.”
Think about the challenges that the average palliative care practitioner faces, Dr. Kamal added in an interview for this article. “One is that you [are] consulted and the patient and their family have a long medical history that you need to read through or at least try to understand.” Trying to identify where there is conflict, where there is complexity, where time intensity is called for, where there are unmet needs, he said, “a lot of that is nuance, not necessarily structured data.” An AI “agent” (ie, a software program designed to think and act like a human) can read the chart through the lens of palliative care and identify what is most important.
Or consider the question of who on the palliative care team needs to see a specific patient: the chaplain, the social worker, the nurse, or someone else. “In the delivery of palliative care, we have often taken a one-size-fits-all approach,” Dr. Kamal said. “But what if the AI agent can say these patients probably need a chaplain, and those probably don’t but should see a social worker? Although it will still require human confirmation and adjustments of care with follow-up, the AI agent can help to tee up who should go first, doing the background work for the team.”
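As a thought experiment only, the routing idea Dr. Kamal describes could be prototyped as something as simple as the keyword-based sketch below. The roles, keywords, and suggest_team_member function are invented for illustration; a real agent would draw on far richer data, and a clinician would confirm every suggestion.

```python
# A thought experiment, not a product: a toy "agent" that suggests which team member
# might see a patient first, based on keywords in a free-text note. All roles and
# keywords are invented for illustration; a human confirms every suggestion.
SUGGESTED_ROUTES = {
    "chaplain": ["spiritual", "faith", "meaning", "existential"],
    "social worker": ["housing", "caregiver strain", "finances", "insurance"],
    "nurse": ["pain", "nausea", "dyspnea", "medication"],
}

def suggest_team_member(note: str) -> list[str]:
    """Return roles whose keywords appear in the note, or default to full-team review."""
    text = note.lower()
    matches = [
        role
        for role, keywords in SUGGESTED_ROUTES.items()
        if any(word in text for word in keywords)
    ]
    return matches or ["full team review"]

print(suggest_team_member(
    "Patient reports uncontrolled pain and worries about caregiver strain."
))  # ['social worker', 'nurse']
```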
Other areas where AI could contribute to hospice and palliative care, now or in the near future, as identified in the literature include
- optimizing staff scheduling, which is particularly important in a tight labor market
- maximizing staffing resources in other ways, such as streamlining recruitment and enhancing employee retention
- addressing prior authorization demands, such as helping to draft appeal letters to insurers after a coverage denial, a task that can be very repetitive
- creating personalized care plans that draw on patient history, symptom data, expressed care preferences, and other information
- using data, statistics, algorithms, machine learning, and predictive analytics to better define the end of life and determine which patients are nearing it, as well as to anticipate other needs
- facilitating advance care planning for patients and helping to prepare their advance directive documents
- supporting family communication by providing information and answering their questions
- supporting remote patient monitoring, such as tracking vital signs for home-based patients
- managing the clinician’s inbox of patient email messages
- understanding ethical and legal considerations.
Mayo Clinic Leads the Way
Mayo Clinic, based in Rochester, MN, has been at the forefront of AI for years, leveraging agentic AI solutions to revitalize the workplace, expand capacity, and reduce staff stress, reported Laura Dyrda for Becker’s Health IT.4 At Mayo Clinic, a multidisciplinary team of researchers and practitioners developed a “control tower,” complete with a medical “air traffic controller.”5 With its feedback, palliative care providers are able to spot the patients most likely to benefit from this type of support.
Mayo’s initial AI project in palliative care started in 2017, and since then the team has tried different variations of a learning AI tool for risk assessment, said Alisha Morgan, DO, a palliative care specialist at Mayo who has been helping to drive its ventures into AI. It incorporates numerous different metrics: patient history, what’s happening to the patient in the hospital, uncontrolled symptoms, pain scores, anxiety and depression scores, the absence of advance care planning, and other specific risk factors, largely pulled from the electronic health record.
“All of those kinds of metrics together are fed into a quantile system to gauge low, medium, and high-risk patients in the hospital,” Dr. Morgan said. The program sends a report every morning to the palliative care team with a specified number of patients in each of two Mayo hospitals. “With the risk score, we’re able to evaluate [and] make sure that we think we can be helpful with this patient,” Dr. Morgan said.
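As a way to picture the quantile approach Dr. Morgan describes, here is a minimal sketch in Python; the weights, inputs, and cut points are invented for illustration and do not represent Mayo Clinic's actual tool.

```python
# Minimal sketch of quantile-based risk tiers, to picture the approach described above.
# Weights, inputs, and cut points are invented and are not Mayo Clinic's model.
import numpy as np

def risk_scores(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Combine each patient's metrics (one row per patient) into a weighted score."""
    return features @ weights

def risk_tiers(scores: np.ndarray) -> list[str]:
    """Bin scores into low/medium/high using the cohort's 33rd and 66th percentiles."""
    low_cut, high_cut = np.percentile(scores, [33, 66])
    return ["high" if s > high_cut else "medium" if s > low_cut else "low" for s in scores]

# Toy columns: [pain score, anxiety/depression score, no advance care plan on file (0/1)]
features = np.array([[8, 6, 1], [2, 1, 0], [5, 4, 1], [1, 2, 0]], dtype=float)
weights = np.array([0.5, 0.3, 2.0])  # hypothetical weights
scores = risk_scores(features, weights)
print(list(zip(scores.round(1), risk_tiers(scores))))
```

However it is computed, the score only prioritizes the morning report; as Dr. Morgan notes, the team still evaluates each patient to decide whether it can be helpful.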
Finding patients with unmet needs might lead to engaging with the primary oncologist, making a recommendation for palliative care, and explaining why. “We’re also looking at earlier engagement with the emergency department from data being entered into the electronic health record.”
How far the palliative care team is able to go down the list of potential patients identified each morning by AI varies day to day, Dr. Morgan added. “There are definitely ways that you can personalize AI for your team’s needs, given that every palliative care team, every hospital, every patient population is different,” she said. “Looking at our early studies, we certainly reached patients more often than we otherwise may have. Also, with this engagement, we were able to show that we reduced hospital readmission rates, particularly 60- and 90-day readmissions. A lot of our outcomes showed some benefit.”
AI is not just one thing, added April Christensen, MD MS, another palliative medicine physician at Mayo Clinic. “There has been an explosion of different AI tools in our practice. And so just trying to keep up with those various AI tools and all the ways we’re utilizing AI in practice is an incredible process and journey in and of itself for each of us,” she said.
“We’ve also had the opportunity in clinic to draft clinical notes with AI.” The Abridge AI-powered platform summarizes palliative clinical encounters and integrates them into the electronic health record. One of the drawbacks of this system is that it hasn’t been as good at capturing advance care planning conversations, which aren’t as medically based as some other interventions. As a result, it doesn’t record things about people’s values and what’s important to them, Dr. Christensen said. She has also used AI “to think through, ‘how do I improve some of my own communication and how I think about communication challenges.’”
Dr. Christensen shared a recent example of thinking through how to communicate with an individual from a highly analytical background, such as engineering or accounting—“someone who’s very logical—and then they have what’s called an amygdala hijack, an immediate and overwhelming emotional response that overrides their frontal lobe’s ability to engage in logical conversation. And yet they’re still trying to reason their way through a highly emotional situation,” she explained.
“So my question was: How do I address this when my typical statements and strategies and skills don’t work? How do I meet this person where they’re at and help them move through this really challenging situation?”
Nothing is ever going to replace humans—”our nonverbals, our human presence, just being there for another human being in these important and crucial times in their lives,” Dr. Christensen said. “I would say we should always be looking at AI as an opportunity to enhance our practice, grow our skill set, and learn rather than something that’s going to replace us. It should always be a partnership. We should be giving input on what we want to be getting back from it.”
Nothing is ever going to replace humans—”our nonverbals, our human presence, just being there for another human being in these important and crucial times in their lives.”
Empowering Authors
David Casarett, MD, section chief of palliative care at Duke Health in Durham, NC, sees great potential for AI to enhance the work of palliative care. “I hear much more concern than excitement about AI in our field, which is understandable,” he said. “There are plenty of technologies that have been foisted on us, so an innate skepticism is probably healthy. But there are opportunities out there [from AI] and we need to be open to them.”
Dr. Casarett, who has worked in the National Institutes of Health grant-funded research space for many years, thinks start-up initiatives offer other pathways to improve palliative care more quickly. He is a medical advisor to CareYaya, a health technology company based in Research Triangle Park, NC. He has worked with the company on projects involving AI-powered technology, for example, helping patients craft appeal letters when an insurer denies coverage of a health service.
“AI is one of those rare tools with an almost endless list of applications,” he said. “It pulls in new technologies to answer old problems, it opens doors. And when you go through the door into a room, there are a million other rooms leading off from that.”
“AI is one of those rare tools with an almost endless list of applications.”
Dr. Casarett is also the editor in chief of the Journal of Pain and Symptom Management at a time when one of the biggest controversies around AI involves its ability to write papers, mimicking various styles of discourse, including academic writing, and obscuring the actual contribution of the putative authors. “Our journal has adopted a policy that doctors need to disclose if they used AI in preparing their papers,” he said. But that disclosure won’t necessarily cause a paper to be rejected.
Many times, AI-aided manuscripts come in from other countries and from authors for whom English is not their first language. “I’ve seen the quality of those manuscripts improved by AI, made more intelligible. Sometimes we have said to authors, ‘You need to work with an English language editor,’” he said. AI can fill a comparable role.
“One negative we’ve seen is when physicians lean too heavily on AI, not just to proofread or edit but to rewrite, to the point where it’s obvious,” Dr. Casarett said. “There is a middle ground with how to use it that improves what’s already there. I’m hesitant to say we’d turn down a manuscript because of too much AI,” he said.
“But I think when some manuscripts coming across the transom are heavily AI-influenced, they tend not to be very data-driven but instead are just reviews. And those manuscripts tend to be lower priority for us.” Dr. Casarett uses AI himself for proofreading. And he thinks it’s a good thing if it empowers people to write better papers.
Where Do We Go from Here?
Much of AI in this setting is still being defined or tested, Dr. Admane said. But Academy members should make themselves familiar with the terms and the trends, even if they are not actively embracing them. “We are currently exploring how AI can be used in hospice and palliative care settings to generate predictions, provide prognostic models, and assist communication. How can we leverage AI’s advanced reasoning and conversational ability? The more ambitious use will be having AI counsel patients.”
Dr. Admane offered one more message to Academy members: “Don’t be scared by the technical terms. Most people are intimidated and fearful of technical terms, and that is why they avoid this area. Try to understand the basic terminology. You don’t have to be a technical expert to use AI or to understand this. You just need to know how to use the tools in your setting and appreciate how AI is going to be useful to you.”
Editor’s Note: This article was developed with the help of AI programs, primarily the publicly available Gemini AI assistant in Google Chrome.
References
- Admane S, Kim MJ, Reddy A, et al. Performance of three conversational artificial agents in defining end-of-life care terms. J Palliat Med. 2025;28(8):1102-1107. https://doi.org/10.1089/jpm.2024.0526.
- Kim MJ, Admane S, Chang YK. Chatbot performance in defining and differentiating palliative care, supportive care, hospice care. J Pain Symptom Manage. 2024;67(5):e381-e391. https://doi.org/10.1016/j.jpainsymman.2024.01.008.
- Kamal A. The future of artificial intelligence and palliative care. AAHPM Quarterly. Spring 2025. https://aahpm.org/publications/aahpm-quarterly/issue-archive/spring-2025/the-future-of-artificial-intelligence-and-palliative-care/
- Dyrda L. 2025 is becoming the year of AI agents in healthcare. Becker’s Health IT. Published June 27, 2025. Accessed October 8, 2025. https://www.beckershospitalreview.com/healthcare-information-technology/ai/2025-is-becoming-the-year-of-ai-agents-in-healthcare/
- Streed J. AI and your medical air traffic controller. Mayo Clinic News Network. Published July 13, 2021. Accessed October 8, 2025. https://newsnetwork.mayoclinic.org/discussion/ai-and-your-medical-air-traffic-controller/
Larry Beresford is a medical journalist in Oakland, CA, with a strong interest in hospice and palliative care.