{"id":1740,"date":"2012-10-07T16:01:32","date_gmt":"2012-10-07T20:01:32","guid":{"rendered":"http:\/\/www.khpi.org\/blog\/?p=1740"},"modified":"2012-10-08T22:09:10","modified_gmt":"2012-10-09T02:09:10","slug":"the-joint-commission-publishes-list-of-top-performing-hospitals","status":"publish","type":"post","link":"http:\/\/www.khpi.org\/blog\/the-joint-commission-publishes-list-of-top-performing-hospitals\/","title":{"rendered":"The Joint Commission Publishes List of Top Performing Hospitals."},"content":{"rendered":"<p><span style=\"color: #800080;\"><strong>Who better should know how hospitals are doing?<br \/>\nKentucky gets a few nods.\u00a0<\/strong><\/span><\/p>\n<p>At the end of September, the Joint Commission<a title=\"Joint Commission Top Performing Hospitals\" href=\"http:\/\/www.jointcommission.org\/accreditation\/top_performers.aspx\" target=\"_blank\"> released its list<\/a> of \u201cTop Performers on Key Quality Measures for 2012.\u201d The Joint Commission accredits all hospitals in the US, and does the bulk of data collection for Medicare\u2019s <a title=\"Medicare Hospital Compare\" href=\"http:\/\/www.hospitalcompare.hhs.gov\" target=\"_blank\">Hospital Compare<\/a>. A total of 45 accountability measures in 8 clinical areas were evaluated for some 3500 hospitals. To make the Top Performers list, a hospital had to have performed a required action 95% of the time for each indicator individually and in aggregate. Since these same quality measures are central to the rating systems of several different organizations one might expect that the lists of top hospitals would more-or-less agree with each other. To my initial observation, any agreement seems to be less rather than more. When different sets of quality measures are applied to different subsets of hospitals, the subsequent results may not be easy for us in the trenches to interpret or use.<\/p>\n<p>Kentucky is home to 18 of the 620 top hospitals. Our fair share would have been 31. 
As <a title=\"Robley Rex Veterans Hospital is Rated Highest for Quality and Safety in Louisville\" href=\"http:\/\/www.khpi.org\/blog\/?p=1718\" target=\"_blank\">previously reported<\/a>, the only hospital in Louisville making the list is the Robley Rex VA Hospital. As seemed to be the case with the <a title=\"Further Details of Louisville\u2019s Hospital Safety Scores.\" href=\"http:\/\/www.khpi.org\/blog\/?p=1488\" target=\"_blank\">Leapfrog Safety Scores<\/a> and <a title=\"Consumer Reports Releases a New Set of Hospital Safety Scores.\" href=\"http:\/\/www.khpi.org\/blog\/?p=1506\" target=\"_blank\">Consumer Reports<\/a> safety evaluations, smaller and rural hospitals appeared to have a better chance of looking good. Of the hospitals on the Joint Commission list, 43 were psychiatric hospitals, including 2 in Kentucky. A list of the Kentucky hospitals is available from KHPI on request.<\/p>\n<p><strong>Summary of observations:<\/strong><\/p>\n<ul>\n<li>Most high-profile hospitals, and virtually all teaching hospitals, failed to make the list.<\/li>\n<li>It appears that few if any safety net hospitals made the list either, but I cannot yet tell how few.<\/li>\n<li>Some hospital systems successfully placed several of their individual hospitals on the list.<\/li>\n<li>Standing in the Joint Commission list does not appear to correlate well with ratings of hospital quality and safety from other organizations.<\/li>\n<\/ul>\n<p><strong>and comments:<\/strong><\/p>\n<ul>\n<li>We need to be confident of the reliability of self-reported hospital data.<\/li>\n<li>Merger of hospitals can diminish the usefulness of hospital ratings.<\/li>\n<li>Transparency is important in the financial relationships between the rating organizations and the institutions being rated.<\/li>\n<li>Does what is being measured really matter?<\/li>\n<li>Does quality in the limited number of measured processes and outcomes \u201ctrickle down\u201d to the rest of a hospital\u2019s
patients?<\/li>\n<li>How might the rating process itself distort the provision of healthcare in undesirable ways?<\/li>\n<li>Are there too many different rating organizations slicing and dicing the same information?<\/li>\n<\/ul>\n<p><!--more--><span style=\"color: #800000;\"><strong>Additional Discussion.<\/strong><\/span><\/p>\n<p><strong>Teaching hospitals nearly absent in the list.<\/strong><br \/>\nI went through the list to see how many of the 620 hospitals I recognized \u2013 it was only a handful. I looked particularly for teaching hospitals to see if there might be any validity to the claim that teaching hospitals are handicapped in these quality evaluations. In fact, I could recognize only two teaching hospitals in the entire top-dog list: Duke University Hospital, and Creighton Medical Center \u2013 St. Joseph. While I probably missed some minor teaching hospitals that host a few trainees, from my perspective as a former group chairman of the Association of American Medical Colleges, I did not recognize any others by name. Strikingly, none of the nation\u2019s many high-profile teaching hospitals in New York, Massachusetts, California, Pennsylvania, Maryland, Illinois, or Connecticut made the list. These states contain some of the most famous hospitals in the world, but they were not considered \u201ctop performers\u201d by their own accrediting organization!<\/p>\n<p>As a lifelong academician, I am greatly troubled by the suggestion that teaching hospitals cannot provide the same quality of care as other hospitals. I believe that it is impossible to deliver high quality education in an institution that does not provide high quality care.
Claims (or excuses) that teaching hospitals or safety net hospitals cannot look good in head-to-head comparisons with other hospitals include the following: that their patients are sicker, that the socioeconomic overlays of their patients magnify their illnesses and complicate their treatment, that the institutions do not have enough money to do their job, or even that they are staffed by the least experienced or inadequately supervised physician-trainees. All of the above and more may be valid explanations, but whatever the causes, perceived quality disparities in teaching hospitals cannot be allowed to stand unchallenged or unfixed.<\/p>\n<p><strong>Do hospital systems do better?<\/strong><br \/>\nSome other patterns seemed to emerge from my initial review. In California, hospitals from the Kaiser Hospital System blew the competition away. Our own Robley Rex VA Hospital was joined by 11 other VA hospitals in the nation, making the VA one of the larger hospital systems to appear on the Top-Gun List. I suspect that healthcare systems such as these may do well because they have more control over their staff and physicians than other institutions! With reference to the paragraph just above, you will not be surprised that large teaching hospitals have very little control over their doctors. It is easier to herd cats.<\/p>\n<p><strong>Differences between evaluations.<\/strong><br \/>\nI assumed I would see a fair amount of concordance between the Joint Commission and Leapfrog Safety Score lists for our Kentucky hospitals. I was quite surprised that this was not the case. Of the 16 acute care hospitals on the Joint Commission\u2019s top performers list, only 8 appear at all on Leapfrog\u2019s list of 49 rated Kentucky hospitals, and of those on both lists, only 4 received \u2018A\u2019s. Of the other four, two received \u2018B\u2019s, and two received only \u2018C\u2019s from Leapfrog.
It appears to me that some of the hospitals evaluated by the Joint Commission were small Medicare Critical Access Hospitals that were not evaluated by Leapfrog because there is no federal requirement for these tiny limited-service community hospitals to report quality data. In fact, hospitals in this category can apparently achieve Joint Commission accreditation even if their quality measures are terrible! So much for protecting the public!<\/p>\n<p><strong>Merged for some things but not others?<\/strong><br \/>\nThere were some other quirks in the two lists. Two of the several St. Elizabeth Hospitals in Northern Kentucky (Florence and St. Thomas) were top performers in the Joint Commission list, but neither appears with any rating at all in the Leapfrog list. (St. Elizabeth of Covington was rated with a \u2018B\u2019 in the Leapfrog list and does not appear in the Joint Commission list.) Are these hospitals being lumped together for some reporting purposes but split apart for others? How is a person to know whether rating organizations are lumpers or splitters when using their recommendations?<\/p>\n<p>I have discussed earlier, and will again, the fact that hospital mergers make it difficult to evaluate individual hospitals within those systems. For example, in Louisville, all five Norton Hospitals appear as one in Medicare Compare and some other rating systems. The same is true for Jewish &amp; St. Mary\u2019s Hospitals. This is because they use the same Medicare Provider number to bill the government. While there may be financial advantages to merged hospitals, such as leveraging Medicare bonuses for teaching beds or Medicaid patients, it can hardly be assumed that the hospitals involved provide identical clinical care.
With all the hospital mergers going on around the country, such lumping together diminishes the specificity of quality or safety evaluations, and with it their value to the patient or physician.<\/p>\n<p><strong>What are we measuring?<\/strong><br \/>\nIt may be validly argued that I am comparing apples to oranges. The Joint Commission list discussed above makes reference to Quality Measures, and the Leapfrog list and Consumer Reports list to Safety Scores. What is the difference between quality and safety? (What does \u201cExcellence\u201d mean for that matter? We hear that term all the time in hospital marketing.) I confess that the differences at this level remain obscure to me, especially since both determinations draw substantially from a common set of indicators. Are these distinctions without utility? Is it possible to have a high-quality hospital if it is not safe? What good is a safe hospital that does not provide high-quality care? If a hospital is neither safe nor of measurably high quality, can it ever be excellent? Is it fair to assume that multiple hospitals reported as a single entity provide care of the same quality? I began my recent considerations of this subject with an assumption that evaluation of quality is better than no evaluation. Like all medical hypotheses, however, this must be proven. Surely, if it is not easy to understand how measurements of \u201cquality\u201d are conducted or what they mean, how useful can such efforts be? I have the feeling we are still in the very early stages of this national endeavor.<\/p>\n<p><strong>Can all hospitals claim some sort of quality award?<\/strong><br \/>\nAs I have visited many individual hospital websites recently, it seemed that a majority displayed some sort of commendation or quality award on their home page. Many of the awarding organizations were unknown to me.
If any or every hospital could get an award, it would be very confusing for us consumers.<\/p>\n<p>For example, <a title=\"Cure and Outrage Coexist Comfortably in American Medicine\" href=\"http:\/\/www.khpi.org\/blog\/?p=1689\" target=\"_blank\">St. Joseph\u2019s Hospital London<\/a> highlights that it was named as one of the 50 Top Cardiovascular Hospitals for 2011 by Thomson Reuters. We are told that the hospital \u201cmore than exceeds the standards for advanced quality care\u201d as an accredited Chest Pain Center. We are told that the healthcare rating organization, HealthGrades, found that St. Joseph London was the number one hospital in Kentucky for Overall Cardiac Care and Cardiac Surgery for two years in a row and in the top 5% nationally. The hospital does not appear on the Joint Commission Top Hospital list or on the Consumer Reports safety lists. It received low scores from Consumer Reports for overall quality. On the Leapfrog Safety Score list, it received a \u2018B.\u2019<\/p>\n<p>In the lists for 2013 prepared by Truven Health Analytics (which I understand to be the successor company to Thomson Reuters above), St. Joseph London no longer appears as a top cardiac hospital. The only Kentucky hospital on the company\u2019s 100 Top Hospital list for \u201ccurrent performance and fastest long-term improvement\u201d is the Owensboro Medical Health System, which did not appear on the Top Performer list of the Joint Commission, and which received only a \u2018C\u2019 from Leapfrog. It received low quality scores from Consumer Reports. Yet all of these rating organizations make use of the same basic set of information collected by Medicare! What is someone like me to make of all of this?<\/p>\n<p><strong>Accrediting and rating as a business.<\/strong><br \/>\nThe accreditation and rating of hospitals has become big business.
Hospitals realize that to be competitive and to negotiate for higher payments, they are going to have to not just claim, but prove, that they are doing a good job. Both governmental and private healthcare payers have already initiated payment systems that are linked to exactly the quality indicators we are talking about in this series of articles! This is a very expensive undertaking! Hospitals spend a large fortune collecting and reporting all these numbers. The reporting requirements change in real time, as does the number of entities requesting reports. Some raters charge hospitals for the privilege of collecting their information, and then may charge them again for using the results in their advertising! All the players stand to make or lose huge sums of money and reputation. We need to be confident we have it right. I am not yet confident but would like to be. At the very least, I believe the public has a right to know of any financial exchanges between the evaluators and those they evaluate.<\/p>\n<p><strong>Are reported results trustworthy?<\/strong><br \/>\nBecause the financial implications are so great, it is not surprising that some research is now concluding that many hospitals go beyond \u201cgaming the system\u201d to exaggerating if not falsifying their reported results. We have long known that doctors and hospitals can be reluctant to report bad news. (In Kentucky we have a special problem because internal quality evaluations are discoverable in civil litigation.) Even if reporting accurately, hospitals are allowed to decide for themselves just what categories of information (if any) they submit. In essence, they can design their own report cards. To use metaphorical comparisons of healthcare to education, we all know that cheating occurs in school from elementary to professional levels. College degrees can be purchased by mail or on the Internet. Healthcare institutions are no more ethical or honest than the individuals who make them up.
We as consumers must have confidence that the systems of quality evaluation on which we depend, based as they are on self-reporting, represent the full, the whole, and nothing but the truth. I am not there yet.<\/p>\n<p><strong>Good in one thing or many?<\/strong><br \/>\nCan we generalize from evaluations of specific clinical diagnoses or processes? The whole premise of defining specific standards for processes of care for a limited number of diagnoses is that there will be some \u201ctrickle down\u201d of quality to other areas of the hospital and other disorders. This hoped-for result is not unreasonable, but because the opposite result is also quite possible, it needs to be proven. It is possible, and perhaps likely, that the special attention given to limited areas results in a relative decrease in others. I offer another example from the field of education. In both elementary and medical schools, I and other educators have argued that \u201cteaching for the test\u201d narrows the scope of both engagement and knowledge for the student. So it can be in medicine when grades are given as part of a highly standardized and predictable evaluation.<\/p>\n<p><strong>Important outcomes or just busy work?<\/strong><br \/>\nThe discussion of whether it is more important to measure processes of care or outcomes of care has been going on for as long as I have been involved in clinical medicine. The accountability measures in the Joint Commission evaluation are all processes of care. Did the heart attack patient get aspirin on admission and beta-blockers at discharge? Did the pneumonia patient get the right antibiotic and on time? Were they immunized against pneumonia? These measures are on the list because we mostly think they are the right things to do, and that they probably lead to better clinical outcomes for our patients. Doctors who object to outside review of their actions like to call this \u201ccookbook\u201d medicine.
Most of the rest of us call this using best medical practices or checklists. What is not among the Joint Commission\u2019s accountability measures are clinical outcomes like what percent of heart attack patients survived for 30 days, how many urinary tract infections occurred because catheters were not maintained properly, or how many patients developed blood clots in their veins or clots that traveled to their lungs. Such outcomes are the results that matter most to patients and which we doctors and hospitals are ultimately trying to impact. Outcome measurement, in my opinion, is not substantially different from medical research. For the results to be reliable, the evaluation must be planned carefully; it is often complicated or difficult, and may be expensive to do correctly. My thinking about process vs. outcome measurement is that both should be attempted. Outcomes are what we are trying to effect with our treatment processes. We must be prepared to put our resources of money and manpower behind those processes that have the most impact. We are likely never going to be able to do everything humanly possible and will need the evidence to choose wisely.<\/p>\n<p>Enough! This has gotten too long again. I am in favor of independent external review of hospitals but do not think we are where we need to be. At the very least, I would not yet personally feel comfortable selecting a hospital based only on any of the scoring systems I have seen. What do you think?
If you are an expert, we need your input and advice.<\/p>\n<p>Peter Hasselbacher, MD<br \/>\nPresident, KHPI<br \/>\nEmeritus Professor of Medicine, UofL<br \/>\nOctober 7, 2012<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Who better should know how hospitals are doing? Kentucky gets a few nods.\u00a0 At the end of September, the Joint Commission released its list of \u201cTop Performers on Key Quality Measures for 2012.\u201d The Joint Commission accredits all hospitals in the US, and does the bulk of data collection for Medicare\u2019s Hospital Compare.
A total &hellip; <a href=\"http:\/\/www.khpi.org\/blog\/the-joint-commission-publishes-list-of-top-performing-hospitals\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;The Joint Commission Publishes List of Top Performing Hospitals.&#8221;<\/span><\/a><\/p>\n","protected":false},"author":21,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"spay_email":"","jetpack_publicize_message":"","jetpack_is_tweetstorm":false,"jetpack_publicize_feature_enabled":true},"categories":[6],"tags":[],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p5mRQe-s4","_links":{"self":[{"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/posts\/1740"}],"collection":[{"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/users\/21"}],"replies":[{"embeddable":true,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/comments?post=1740"}],"version-history":[{"count":5,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/posts\/1740\/revisions"}],"predecessor-version":[{"id":1745,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/posts\/1740\/revisions\/1745"}],"wp:attachment":[{"hre
f":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/media?parent=1740"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/categories?post=1740"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.khpi.org\/blog\/wp-json\/wp\/v2\/tags?post=1740"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}