Issue link: https://beckershealthcare.uberflip.com/i/763161
QUALITY MEASUREMENT

Hospital Quality Measures Need Work: Mass General Experts Weigh In
By Heather Punke

Measuring the quality of hospitals is an imperfect, messy science some have likened to "sausage making" — meaning there is definitely room for improvement.

In a post for U.S. News & World Report, two executives from Boston-based Massachusetts General Hospital — Elizabeth Mort, MD, senior vice president of quality and safety, and Peter Slavin, MD, president — laid out some ways hospital quality measurement can improve. A recap of their points is below.

1. Incorporate structural measures. Several hospital ratings agencies exclude things like the availability of technologies, services offered and designations like Magnet for nursing excellence. "Many report cards…rely heavily on measures of process and outcomes and fail to include structural features as part of the scoring system," the authors wrote, calling this a "missed opportunity."

2. Get rid of patient safety indicators. PSIs are derived from administrative billing codes and have several limitations. "We should work hard to identify reasonable replacements and plan for their obsolescence," according to the article.

3. Use caution on clinical metrics. Hospitals collect their clinical data on things like hospital-acquired infections in different ways, meaning data accuracy can vary from hospital to hospital. Drs. Mort and Slavin said "surveillance bias can influence the results when these measures are used for public reporting" and urged ratings agencies to avoid using clinical data.

4. Stop dividing scores into meaningless categories. "We believe that partitioning performance should only be done if there are meaningful clinical differences between categories," the authors wrote. Partitioning data into categories should only be done with "sound statistical testing."

5. Improve the readmission metric.
Recent studies have called into question the usefulness of 30-day readmissions in quality measurement, and Drs. Mort and Slavin also cast doubt on the metric. "We urge measure designers to refine and improve the metric which, arguably, is an important, albeit crude, measure of quality," they wrote.

6. Boost transparency and the use of clinical registries. Even though the authors note the public may not yet value this type of information, "we believe that over time, consumers will seek this information and push providers to share more of it."

7. Look at research and clinical trials. Patients are increasingly valuing physicians' participation in clinical trials, according to the authors. "We believe excluding research has been a lost opportunity, and would encourage U.S. News and others to revisit this," they wrote.

8. Examine equality of care. Drs. Mort and Slavin believe hospitals can measure care gaps between races, genders, ethnicities, locations and socioeconomic statuses, and would "encourage these measures to be refined and adopted for broader use."

Dr. Peter Pronovost: 'Patients Deserve Quality Measures That Are More Science, Less Sausage-Making'
By Heather Punke

Too often, hospital ratings and rankings reflect how well a hospital codes rather than how it provides care, according to Peter Pronovost, MD, PhD, who argues against using patient safety indicators to rate hospitals in a post for U.S. News & World Report.

Dr. Pronovost, the director of the Armstrong Institute for Patient Safety and Quality and senior vice president for patient safety and quality at Baltimore-based Johns Hopkins Medicine, likened the hospital ratings process to sausage-making because it is "messy" and "involves ingredients that have to be recombined, repackaged and renamed."

While hospital ratings can be useful for patients, the "recipe needs to be right," Dr. Pronovost wrote — and patient safety indicators should be left out.

Dr.
Pronovost called PSIs "notoriously inaccurate" because the data are derived from administrative codes in bills sent to CMS, not from clinical records. This means hospitals with the time and resources to improve their coding process can get a leg up on other organizations and improve their rating without improving care.

For instance, Johns Hopkins reduced the number of patient safety indicator incidents it reports to CMS by 75 percent, thereby reducing its penalties. "About 10 percent of the improvement resulted from changes in clinical care. The other 90 percent resulted from documentation and coding that was more thorough and accurate," Dr. Pronovost wrote.

Instead of using PSIs as part of the ratings process, Dr. Pronovost argued for "valid and reliable measures" that would be audited, similar to audited financial data.

"In the end, patients deserve quality measures that are more science and less sausage-making," he concluded.