Becker's Hospital Review — April 2019

Issue link: https://beckershealthcare.uberflip.com/i/1092388


FINANCE | CMO / CARE DELIVERY

Viewpoint: How to tell patients AI is part of their care

By Megan Knowles

As artificial intelligence use expands in the healthcare space, physicians must be aware of how to properly explain the role of AI to patients and address ethical concerns that arise, a commentary published in the American Medical Association's Journal of Ethics said.

The commentary was written by Daniel Schiff, a PhD student at the Georgia Institute of Technology in Atlanta who studies AI and its intersection with social policy, and Jason Borenstein, PhD, director of graduate research ethics programs at the institute.

Here's how physicians can address several ethical concerns around telling patients AI is involved in their care:

1. Informed consent. One ethical challenge stemming from AI in healthcare is the difficulty of obtaining a patient's informed consent to use a novel AI device.

"The novelty and technical sophistication of an AI device places additional demands on the informed consent process," the authors said. "When an AI device is used, the presentation of information can be complicated by possible patient and physician fears, overconfidence or confusion."

Additionally, physicians must be able to effectively explain to patients how an AI device works for an informed consent process to proceed appropriately, the authors said. Assuming the physician is informed about the AI technology, he should explain the basic nature of the technology to the patient and distinguish between the roles human caregivers will play during each part of the procedure and the roles the AI/robotic system or device will play, the authors said.

2. Patient perceptions of AI. Patients and providers have various perceptions about AI, including a concern for potential medical errors.
"In addition to delineating the role of the AI system, the physician can address the patient's fears or overconfidence by describing the risks and potential novel benefits attributable to the AI system," the authors said.

For example, beyond sharing that they have done a procedure with an AI system in the past, the physician should describe studies comparing the system to human surgeons. "In this way, the patient's inaccurate perceptions of AI can be countered with a professional assessment of the benefits and risks involved in a specific procedure," the authors said.

3. Potential medical errors and AI. Identifying who is morally responsible and perhaps legally liable for a medical error that involves AI technology is often complicated by the "problem of many hands," the authors said. This problem refers to the challenge of attributing moral responsibility when the cause of patient harm is distributed among several persons or organizations.

"A first step toward assigning responsibility for medical errors (thus hopefully minimizing them in the future) is to disentangle which people and professional responsibilities might have been involved in committing or preventing the errors," the authors said.

These actors may include the coders and designers who created the technology; the physicians responsible for understanding the technology; the medical device companies that sell the technology; and the hospitals responsible for ensuring best practices when using AI systems.

Mount Carmel: 5 patients who died under former physician's care may have lived if given right treatment

By Megan Knowles

At least five patients who died under the care of a former physician at Columbus, Ohio-based Mount Carmel Health System may have survived if they had received the correct treatment, health system officials told The Columbus Dispatch in late February.
The five patients are among nearly three dozen intensive-care patients who died after receiving excessive painkiller doses ordered by William Husel, MD, hospital officials said. Dr. Husel was fired from the hospital in December.

"We [are continuing to review] the records of all patients who were treated by Dr. Husel and died in the hospital," said Ed Lamb, president and CEO of Mount Carmel Health System. "At this point, we have identified one additional patient who received an excessive and potentially fatal dose of medication ordered by Dr. Husel."

That brings the number of patients involved to at least 35, including at least 29 who received a potentially fatal dose of medication ordered by Dr. Husel.

A review of patient records and the care they received found five patients may have survived if given the right treatment, said Dan Roth, MD, executive vice president and chief clinical officer for Livonia, Mich.-based Trinity Health, Mount Carmel's parent company. The health system previously indicated that all of the patients were near death. On Feb. 22, Dr. Roth described the patients as being "critically ill."

Mount Carmel has added a new protocol to set maximum appropriate doses for pain medication in its EMR; implemented a new escalation policy for deviations in its pain medication protocols; restricted the ability to bypass pharmacy review of medication orders; and increased clinician education on end-of-life care practices.
