
It's Time to Move on From the U.S. News Rankings

— The lack of data and transparency should raise questions

Caplan is a professor and medical ethicist.

On August 1, buried amidst the news of unprecedented heat waves and political conflicts of the day, was the annual release of U.S. News & World Report's "Best Hospitals" rankings. But there was a huge difference this year from prior years: no hospital got first place.

U.S. News has been getting negative pushback about the validity of all its rankings, from law schools to colleges, with numerous complaints about the metrics it uses. In response to gripes about hospital rankings, the publication announced last month that its annual Honor Roll of best hospitals would stop numerical ranking in favor of simply identifying top performers. They explained the change by saying, "Specifically, there will be no ordinal ranking for hospitals selected for this year's Honor Roll when that list is ultimately published." And there weren't. Instead, there is an "honor roll" of 22 best hospitals, a number that in itself seems arbitrary, listed alphabetically.

I am not going to republish the list, because a group award is hardly worth repeating. Not all agree.

Many hospitals and academic medical centers continue to flaunt their U.S. News awards with full-page ads in major newspapers or on social media. They claim they are "Among the Best in NYC" or "4 CT Hospitals Among Nation's Best" or "3 Wisconsin Hospitals Ranked Among Nation's Best: U.S. News Ranking." Far too many hospital PR departments have gone into overdrive to brag about their local, regional, state, or national status.

Potential patients do want to know where to go for the best diagnosis or care. And medical students want to know where to go for their residencies. But the U.S. News ranking is not the guide to follow.

First, when the response of the ranker, U.S. News, is to ignore any data they do have and simply give all of the highest performers an arbitrarily defined trophy (even those hospitals that didn't supply any data), then the whole rationale for ranking collapses. In their own words, ranking the list "obscures the fact that all of the Honor Roll hospitals have attained the highest standard of care in the nation." Really? U.S. News has in essence concluded that if you're on their honor roll, then you're ranked #1 in your region. So, California has five #1 hospitals, New York has four, and Massachusetts has two. That is playing awfully loosely with the idea of #1.

U.S. News has data but has become afraid to use it, seemingly due to the complaints that flow from highly competitive hospitals. The right response is not participation trophies but to get better data and use fair metrics to permit ordinal ranking.

Second, if the rankings they do offer, including the "best hospitals honor roll," continue to rely in part on perceived "reputation" from survey respondents, then a subjective measure is playing an unwarranted role. Reputations are slow to change, and therefore may ignore the most current realities of hospital safety and quality. Furthermore, using "reputation" can skew results because there could be bias among those motivated to fill out surveys.

Third, U.S. News doesn't reveal the justification for the methodology it uses. The data that patients need to judge the likely quality of care -- misdiagnoses, adverse events, readmissions, prevention of longer-than-appropriate hospital stays, infection rates, nurse staffing, coordination of follow-up care, patient satisfaction, and so on -- may simply be absent or not given sufficient weight.

Years ago, U.S. News & World Report was a distant competitor among weekly news magazines such as Time and Newsweek. As social media began to make weekly news reporting obsolete, U.S. News stopped its print publication. They found a way to survive by shifting into the rankings business. They have a huge, and likely lucrative, footprint in this space. Hospitals, diets, nursing homes, health plans, states, countries, universities, high schools, mutual funds, vacation destinations, cruise ships, travel reward programs, cars, and real estate agents are all ranked. Healthcare has a big presence. But as desirable as it might be to get advice on the best hospital, specialty care, or nursing homes, it isn't clear why that task ought to fall to U.S. News.

The mere popularity of the annual rankings in healthcare is no guarantee that the right metrics are used, that the data U.S. News receives are valid and properly analyzed, or that "reputation" is not tipping the scales in ways that don't reflect quality. The degree of negative pushback from the healthcare community is sufficient to cast doubt on the rankings. When the ranking data and methods are not transparent, we should question whether we can rely on them.

So, what is a prospective patient to do? Ask questions about your doctors. Ask how many procedures they have done. If you're deciding among hospitals, the more specific safety and outcome data they will share, the better. It would not hurt to track down a nurse or two and get their opinions. And if you are a student wondering where to apply for residency, the key for you is also data and transparency. It also helps to reach out to current students, residents, and fellows. My experience is that they will be frank and honest.

We all care about the quality of care we are likely to receive from our hospitals, doctors, and nurses. But how much sense does it make to rely on a self-appointed, for-profit journalism company to tell us? Healthcare should demand standardized evaluation by healthcare professionals.

Caplan is the Drs. William F. and Virginia Connolly Mitty Professor of Bioethics and founding head of the Division of Medical Ethics at NYU Grossman School of Medicine in New York City.