AAOS Now

Published 11/1/2016 | Eeric Truumees, MD

How Valuable Are Healthcare Quality Rankings?

In 2010, the passage of the Patient Protection and Affordable Care Act required the U.S. Centers for Medicare & Medicaid Services (CMS) to create Physician Compare (www.medicare.gov/physiciancompare), a site that aims to "provide information for consumers to encourage informed healthcare decisions" and "create explicit incentives for physicians to maximize performance." Since its inception, the information Physician Compare provides has evolved, and CMS data have joined those compiled by dozens of other online physician rating systems, all of which assess various types of data, such as patient satisfaction scores and surgical complication rates.

Along with launching Physician Compare, CMS also began publishing hospital quality data at Hospital Compare (www.medicare.gov/hospitalcompare). In July 2016, after CMS started issuing quality ratings of one to five stars on Hospital Compare in addition to the individual measures previously offered, I checked out the rankings to see how my hospital and other institutions fared. Much as when I looked up the Consumer Reports rankings of hospitals for orthopaedic surgery, I was surprised by how much the star rankings differed from my perception of each hospital's quality. Using U.S. News & World Report's list of top 10 hospitals for orthopaedic surgery, I compared the ratings those same institutions received from other sources, CMS included. As I expected, there was little agreement among the various rating systems. The scatter of grades raises three questions: How are these ratings developed? Do they really represent the quality of care offered? And do they help or confuse patients seeking care?

CMS quality data
Let's take a closer look at Hospital Compare, which was developed in partnership with the Yale New Haven Health Services Corporation Center for Outcomes Research and Evaluation (CORE). CMS reports a 2-year development process with "substantial stakeholder input." A single, aggregate score of hospital quality is generated using "an evidence-based approach reflecting both modern statistical methods and expert insights."

CMS incorporates 64 of the 100 care quality measures it collects into the star ratings. Several of its seven "key measure" groups apply to orthopaedic care, such as postsurgical infection rates and complication rates after hip surgery. Other key measures include length of emergency department (ED) waits and readmission rates after heart attacks. Each group carries a different weight in the final rating, and results are updated quarterly.
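
CMS does not publish a simple formula, but the gist of a weighted aggregate can be sketched in a few lines of Python. The group names and weights below are illustrative assumptions, not CMS's published parameters, and the real methodology also applies latent-variable modeling and clustering steps that this toy omits:

    # Minimal sketch of a weighted hospital summary score. Group names and
    # weights are illustrative assumptions; CMS's actual method also uses
    # latent-variable models and clustering, which this toy example omits.

    GROUP_WEIGHTS = {
        "mortality": 0.22,
        "safety_of_care": 0.22,            # e.g., postsurgical infection rates
        "readmission": 0.22,
        "patient_experience": 0.22,
        "effectiveness_of_care": 0.04,
        "timeliness_of_care": 0.04,        # e.g., ED wait times
        "efficient_use_of_imaging": 0.04,
    }

    def summary_score(reported: dict[str, float]) -> float:
        """Weighted average over whichever measure groups a hospital reports.

        Weights are renormalized over the reported groups, so hospitals that
        report different subsets of groups can still be scored and compared.
        """
        total_weight = sum(GROUP_WEIGHTS[g] for g in reported)
        return sum(GROUP_WEIGHTS[g] * s for g, s in reported.items()) / total_weight

    # A hospital reporting only four of the seven groups (scores are made up):
    example = {"mortality": 0.6, "safety_of_care": 0.8,
               "readmission": 0.5, "patient_experience": 0.7}
    print(round(summary_score(example), 3))  # 0.65

Renormalizing the weights over reported groups is one plausible way to score hospitals that report different subsets of measures; it also hints at why missing groups can move a hospital's score, a concern raised below.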

Nearly 4,600 hospitals of all sizes and types were included in the analysis. Because only hospitals reporting at least three of the seven measure groups were scored, 938 hospitals were not rated. Five-star ratings were awarded to 102 hospitals (2.2 percent). Four, three, two, and one star(s) were awarded to 934 (20.3 percent), 1,770 (38.5 percent), 723 (15.7 percent), and 133 (2.9 percent) hospitals, respectively.
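
Worth making explicit: the quoted percentages only line up if they are taken against all hospitals in the analysis, unrated ones included. A quick arithmetic check using the figures above:

    # Reproducing the star-distribution percentages quoted above. The shares
    # are computed against all hospitals in the analysis, including the 938
    # that were not rated, not just the 3,662 that received stars.
    counts = {"5 stars": 102, "4 stars": 934, "3 stars": 1770, "2 stars": 723, "1 star": 133}
    analyzed = sum(counts.values()) + 938   # 3,662 rated + 938 unrated = 4,600
    for stars, n in counts.items():
        print(f"{stars}: {n} hospitals ({100 * n / analyzed:.1f} percent)")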

Of course, the algorithm is complex and has to account for the heterogeneity of the existing measures as well as the fact that different hospitals report different numbers and types of measures. In addition, existing measures are gradually retired and new measures are added. In a September 22 feedback letter to CMS, the Medicare Payment Advisory Commission (MedPAC) noted that, of the 102 five-star hospitals, only 56 percent were rated on all four outcome groups. MedPAC asked whether missing data were associated with higher ratings, and also expressed concern about whether the star ratings were adequately risk-adjusted. For example, one-star hospitals received, on average, 78 percent of their admissions through the ED; in five-star hospitals, only 36 percent of admissions came through the ED.
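
MedPAC's risk-adjustment concern is easy to illustrate with invented numbers. Suppose ED admissions carry a higher complication risk than elective admissions, and two hospitals perform identically within each admission type; their different case mixes alone will pull the raw rates apart. The within-type rates below are made up for illustration; only the 78 percent and 36 percent ED shares come from the MedPAC letter:

    # Toy illustration of the case-mix problem: both hospitals have identical
    # complication rates within each admission type, yet their different
    # ED/elective mixes make the unadjusted overall rates diverge.
    # The within-type rates are invented; the ED shares match the text above.

    def overall_rate(ed_share: float, ed_rate: float, elective_rate: float) -> float:
        """Unadjusted complication rate as a case-mix-weighted average."""
        return ed_share * ed_rate + (1 - ed_share) * elective_rate

    ED_RATE, ELECTIVE_RATE = 0.08, 0.02   # identical quality at both hospitals

    print(f"{overall_rate(0.78, ED_RATE, ELECTIVE_RATE):.3f}")  # 0.067 (78% ED mix)
    print(f"{overall_rate(0.36, ED_RATE, ELECTIVE_RATE):.3f}")  # 0.042 (36% ED mix)

Without adjustment, the hospital with the heavier ED mix looks half again worse despite identical quality of care within each admission type.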

Another consideration is that, in some cases, healthcare systems with multiple hospitals report their data together using a unified provider number. The individual hospitals may be miles apart, treating very different patient populations, with very different staffing and equipment.

A closer look at the ratings
I ran into some issues when I compared the various institutions' ratings. First, I could not find the Mayo Clinic Rochester on Hospital Compare, and Maryland hospitals, including Johns Hopkins, are not required to publicly report the safety data that most rating groups need to render a score. Critically, some semi-independent orthopaedic hospitals were graded separately (eg, Hospital for Special Surgery), while others were graded only with their parent institution (eg, Rothman with Thomas Jefferson).

In 2013, Consumer Reports didn't include any of the U.S. News & World Report's top 10 hospitals for orthopaedic surgery in either its "highest-rated hospitals for hip surgery" or "knee surgery" categories. Of the U.S. News & World Report's top 10 hospitals for orthopaedic surgery, only three—Mayo Clinic, Johns Hopkins, and Thomas Jefferson—were included in The Joint Commission's 2014 Top Performer list, while only the Mayo Clinic was listed in Truven Health Analytics' "15 Top Health Systems Study, 2016." 

"I worry a lot about these ratings," Jerod Loeb, executive vice president for healthcare quality evaluation at The Joint Commission told Kaiser Health News (KHN) in 2013. "They're all justifiable efforts to provide information, but at the end of the day, every single one of them is flawed in some respect. Rather than enlightening, we may be confusing." In 2016, The Joint Commission put its Top Performer list "on hiatus" to "better fit the evolving national measurement environment."

Even "slight" changes in methodology can lead to significant changes in quality rankings. For example, KHN reported on a Commonwealth Fund study that found that only 46 percent of hospitals ranked as top performers by Thomson Reuters in 2008 were also winners in 2007.

Who uses the data and how?
Aside from CMS, there are at least a dozen other hospital evaluators, from for-profit websites to state-based reports. Adding to the mix, some hospitals have begun publishing ratings of their own physicians on their institutional websites.

Many physicians look at these data with a jaundiced eye. Orthopaedic surgeons whose practice activities center on their office or outpatient surgery centers may ignore hospital ratings altogether; however, for academic, employed, and hospital-based orthopaedists, a poor quality rating or online review has the potential to drive patients away.

While I was unable to find published data on how many people use the CMS site, the use of rating services to select hospitals and physicians has risen dramatically. To stand out in this busy market, sites such as Yelp.com are aggregating data.

Many web raters charge hospitals licensing fees to advertise their awards. In 2013, KHN quoted Andrew Brotman, MD, chief clinical officer at NYU Langone Medical Center, as saying "Healthgrades, which is one we did well on, charges $145,000 to use this even on the website as a logo, so we don't do that. U.S. News & World Report is in the $50,000 range. Leapfrog is $12,500." Consumer Reports does not allow hospitals to advertise their ratings.

One of the issues with the Hospital Compare site is that most hospitals are rated as "average" for most measures. And yet, KHN reported that a "third of U.S. hospitals—more than 1,600—last year won at least one distinction from a major rating group or company." It also noted that "In the greater Fort Lauderdale hospital market, 21 of 24 hospitals were singled out as exemplary by at least one rating source. In the Baltimore region, 19 out of 22 hospitals won an award."

A 2013 KHN article quoted Dr. Brotman as saying, "Even though there's not a hospital executive who won't tell you that they have a great deal of skepticism about a lot of the methodology, there's not one who will tell you they don't want to be on the lists." This seems a telling statement. Despite questions about the accuracy and helpfulness of various quality rankings, these rating systems will likely continue to proliferate and garner an increasing amount of attention.

Eeric Truumees, MD, is the editor-in-chief of AAOS Now.
