Issue link: https://beckershealthcare.uberflip.com/i/1122871
PATIENT EXPERIENCE

Banner Health launches podcast featuring medical stories
By Leo Vartorella

Phoenix-based Banner Health has launched a podcast called "Bedside Stories" to give listeners the opportunity to hear some of the most engaging stories of patients and clinicians at the system. The 15-minute episodes feature medical challenges, tense moments and inspiring successes. The podcast's narrative storytelling is similar to shows such as "This American Life" and "Serial."

"These truly brave and unforgettable people are sharing some of the most intimate and profound moments of their lives," said Corey Schubert, host and narrator of "Bedside Stories." "That sounds heavy, but there's a lot of laughter, too. These are stories that remind us how we're all more alike, as humans, than we are different."

Illinois nurses canoe to hospice patients' homes amid flooding
By Mackenzie Bean

Some Illinois nurses waded through floodwaters and canoed to hospice patients' homes in April after the Mississippi River flooded some areas of the Quad Cities, reported WQAD News 8. The nurses, Stephanee Petersen and T.A. Peterson, work for Nashville, Tenn.-based Hospice Compassus. Twice a week, the nurses traveled through floodwaters to reach a hospice patient on Campbell's Island, which sits in the Mississippi River in Illinois. Some days, the trip took the women up to three-and-a-half hours.

"These patients need us. We can't have the floodwaters keep us from doing [our] job," Ms. Peterson told WQAD News 8 in April. "This is the time when they need us the most."

Viewpoint: How to tell patients AI is part of their care
By Megan Knowles

As artificial intelligence use expands in the healthcare space, physicians must be aware of how to properly explain the role of AI to patients and address ethical concerns that arise, a commentary published in the American Medical Association's Journal of Ethics said.
The commentary was written by Daniel Schiff, a PhD student at the Georgia Institute of Technology in Atlanta who studies AI and its intersection with social policy, and Jason Borenstein, PhD, director of graduate research ethics programs at the institute.

Here's how physicians can address several ethical concerns around telling patients AI is involved in their care:

1. Informed consent. One ethical challenge stemming from AI in healthcare is the difficulty of obtaining a patient's informed consent to use a novel AI device. "The novelty and technical sophistication of an AI device places additional demands on the informed consent process," the authors said. "When an AI device is used, the presentation of information can be complicated by possible patient and physician fears, overconfidence or confusion."

Additionally, physicians must be able to effectively explain to patients how an AI device works for an informed consent process to proceed appropriately, the authors said. Assuming the physician is informed about the AI technology, he or she should explain the basic nature of the technology to the patient and distinguish between the roles human caregivers will play during each part of the procedure and the roles the AI/robotic system or device will play, the authors said.

2. Patient perceptions of AI. Patients and providers have various perceptions about AI, including a concern for potential medical errors. "In addition to delineating the role of the AI system, the physician can address the patient's fears or overconfidence by describing the risks and potential novel benefits attributable to the AI system," the authors said. For example, beyond sharing that they have done a procedure with an AI system in the past, the physician should describe studies comparing the system to human surgeons.
"In this way, the patient's inaccurate perceptions of AI can be countered with a professional assessment of the benefits and risks involved in a specific procedure," the authors said.

3. Potential medical errors and AI. Identifying who is morally responsible and perhaps legally liable for a medical error that involves AI technology is often complicated by the "problem of many hands," the authors said. This problem refers to the challenge of attributing moral responsibility when the cause of patient harm is distributed among several persons or organizations.

"A first step toward assigning responsibility for medical errors (thus hopefully minimizing them in the future) is to disentangle which people and professional responsibilities might have been involved in committing or preventing the errors," the authors said. These actors may include the coders and designers who created the technology; the physicians responsible for understanding the technology; the medical device companies that sell the technology; and the hospitals responsible for ensuring best practices when using AI systems.