Doing not so good? Blame the test.
On May 8, the Leapfrog Group released the third iteration of its Hospital Safety Scores. The first appeared in June 2012, and the second in November. I have expressed concerns about the scores’ value to curious professionals and consumers alike, citing the volatility of scores over the short term, the exclusion of many of the state’s small hospitals, and the lack of correlation with safety scores proposed by other organizations. Nonetheless, the Leapfrog Group was one of the earliest to push for public disclosure of hospital safety parameters, and its evaluation deserves to be taken seriously. I will break out the data underlying the scores and compare it to last November’s in more detail later, but for the time being, here is a raw count of the results for Kentucky hospitals.
Fewer Hospitals Scored
Compared to 54 in the November list, only 45 Kentucky hospitals received an individual score. One of these was new. Ten hospitals (nearly one quarter) that were rated in November were absent from the current list. These include three of the St. Joseph hospitals and four of the ARH hospitals. This high dropout rate suggests to me problems with uniformity of reporting and comparability of data, and it is quite troubling. I will try to find out what the reasons were. (I am aware that there was a delay in May’s report because of some central problem with calculation of the data.)
Better, Worse, and the Same
Forty-four hospitals were on both lists. Thirty-two of these retained the same letter grade. One hospital improved by two grades, from a C to an A, and five others improved from a B to an A. Seven hospitals fell one grade from a variety of starting points. (One of these fell to a D, and another to an F.) I will calculate and append later how May’s scores differ from the first set of last June. Some hospitals either improved or worsened twice in a row over the course of the year, significant movements indeed. I prepared lists of the individual hospitals in these various categories.
Worsening in Louisville
There were some changes in scores of the Louisville Hospitals that have received special attention in these pages. Baptist East and Baptist Northeast kept their Bs. The four adult Norton Hospitals and Jewish Hospital Shelbyville kept their Cs. University of Louisville Hospital fell from a B to a C. Jewish Hospital and St. Mary’s Health Care fell from a C to a D, the second lowest in the state, just above Taylor Regional Hospital.
I am withholding additional comment until I have a chance to look at the underlying details from the Leapfrog and Medicare Compare databases. Suffice it to say that no Louisville-area hospital reached the “outstanding” level, although with its B, Baptist leads the local pack. Jewish and St. Mary’s hit the cellar with their D. Because as far as Medicare Compare is concerned the two are a single entity, it is not possible for me to speculate where the problem lies.
No One Likes to Look Bad.
The common reflex response of hospitals that do not do well in such ratings is to blame the test in some way. Leapfrog gives hospitals advance warning of what their grade is going to be and an opportunity to rebut or correct any of the underlying data. Thus, on the day the scores were released to the public, KentuckyOne Health sent an email to its employees giving them a heads-up about their obviously disappointing score. KentuckyOne alleges methodological shortcomings in the evaluation methods and laments that its most recent clinical information was not used, which presumably would make it look better. Of course, up-to-the-last-minute data can never be included in any survey, and all Kentucky hospitals were evaluated over the same periods for each category of data and are in the same boat. All hospitals are vulnerable to looking better or worse when newer data is added to the mix.
Is the System Unfair?
The methodological criticisms put forward by the American Hospital Association were posted earlier in these pages, along with a rebuttal by Leapfrog. KentuckyOne focuses in particular on the fact that Leapfrog includes some self-reported quality and safety items from its separate proprietary hospital quality reports. Because there is “no validation process to ensure that reports are accurate,” Jewish and St. Mary’s “chose not to submit the survey information over the past five years. We want to assure you that our score was calculated without our survey input, which heavily skewed the findings.” In my opinion, using the fact that your competitors might cheat as a reason for not participating yourself is a pretty thin argument. [I confess that I also worry about the integrity of self-reported data, as did Medicare when it set up the present system of evaluation. Transparency is only valid when accompanied by verifiability and accountability.]
Of course, almost none of the other hospitals in Kentucky have chosen to participate in Leapfrog’s Hospital Quality Survey either, so it seems to me that the playing field is still pretty level! Of the 13 Kentucky hospitals currently participating with Leapfrog, seven are in the Norton and St. Elizabeth systems, and another is University of Louisville Hospital. There were five A’s and seven C’s among them. In my opinion it would be a stretch to claim that filling in the extra blanks gave anyone an edge, although I did wonder myself whether UofL’s new participation last November may have been responsible for its jump from an unranked score to a B. Appropriate staffing of ICUs and low infection rates can give a bump to hospitals that choose to work with Leapfrog. UofL continues to participate and has claimed that its new results are better. Nonetheless, its score has dropped to a C.
For hospitals like Jewish & St. Mary’s that choose not to submit the full panel of information, the blanks are, strictly speaking, not held against them. Instead, all the other, more objective outcome data from Medicare Compare are given greater weight. An initial look at the raw data suggests to me that this is what is driving the drop in UofL’s and in Jewish & St. Mary’s safety scores. (All three of these hospitals are now part of KentuckyOne Health.)
[Addendum, May 25: As I look at individual hospitals over time, I must conclude that the criticism that participating in Leapfrog’s proprietary system can meaningfully affect the final score of a hospital deserves to be addressed definitively. I believe Leapfrog should release its entire database to the general public for free, just as Medicare does, so that health policy experts can analyze it. Too much is riding on the scores for any of the process to be less than fully open.]
Better System Needed but More Information Must Be Disclosed.
I remain in favor of even more public disclosure of financial and clinical information by hospitals, health professionals, insurers, and government. Every time the spotlight gets turned on, major anomalies in cost and outcome are revealed that demand to be explained. Some of these, like getting a D, can be embarrassing. The process of examination and understanding is central to continuous evidence-based improvement in the delivery of health care. We should not be afraid of this. Some hospitals in Kentucky are willing to “let it all hang out.” All should be, but they are not. However, I must reiterate my comments in earlier pages that I am not convinced that the current attempts at simplifying massive complexity are as useful as is claimed. The lability of scores, the inconsistencies of participation, and the lack of congruity among different systems of evaluation frighten me. I would not use the scores alone to make any personal decisions, but I do want to understand what is behind them.
Shopping for a Good Rating.
Innumerable awards, ratings, and commendations are attached to hospitals and used in their marketing. I am reminded of the old saying, “Even the Devil can quote scripture to his own purpose.” It seems to me that every hospital can find some way to claim that it is the best, and that may even be true, at least for something. For now, I have little confidence that I can reliably identify what that thing might be or whether it is a thing that matters.
Peter Hasselbacher, MD
President, KHPI
Emeritus Professor of Medicine, UofL
May 10, 2013