Rankings are alluring. And because they feed into our hierarchy instinct, we tend to take them more seriously than we should. Let me elaborate with some examples in this post.
College Rankings
My friends and acquaintances sometimes consult me, a college instructor, regarding college choice decisions.
In these interactions, I am surprised by how heavily they rely on college rankings without knowing the measurement methodology behind them.
Here are some examples from the most recent edition of the popular US News & World Report college rankings that highlight the value of knowing the measurement methodology.
- In “best national universities,” the University of Washington ranks 55; in “best global universities,” it ranks 6!
- In “best business schools,” Harvard ranks 5 and Stanford 6. Yet hardly any MBA aspirant would prefer one of the top four schools over Harvard or Stanford!
- In “best national universities,” Cornell ranks 17. In rankings by a different source, the reputable Academic Ranking of World Universities (ARWU), it ranks 12 worldwide!
We can reconcile the above contradictions if we understand the ranking measure(s).
Specifically: which criteria comprise the measure (faculty resources, student outcomes, and the like), the relative weight assigned to each criterion, and the data used to represent each criterion. Variations in these three elements produce different results across ranking systems.
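To see concretely how criteria, weights, and data drive results, here is a minimal sketch in Python with made-up colleges, scores, and weights; none of these numbers come from US News, ARWU, or any actual ranking system:

```python
# Toy illustration: the same underlying scores, ranked under two
# hypothetical weighting schemes. All colleges, criteria, scores, and
# weights below are invented for illustration only.

# Normalized scores (0-100) on three illustrative criteria.
colleges = {
    "College A": {"faculty_resources": 90, "student_outcomes": 70, "research_output": 95},
    "College B": {"faculty_resources": 75, "student_outcomes": 92, "research_output": 60},
    "College C": {"faculty_resources": 82, "student_outcomes": 85, "research_output": 78},
}

# Two ranking systems using the same criteria with different weights:
# one emphasizes student outcomes, the other research output.
systems = {
    "outcome-weighted":  {"faculty_resources": 0.3, "student_outcomes": 0.5, "research_output": 0.2},
    "research-weighted": {"faculty_resources": 0.2, "student_outcomes": 0.2, "research_output": 0.6},
}

for name, weights in systems.items():
    composite = {
        college: sum(weights[c] * scores[c] for c in weights)
        for college, scores in colleges.items()
    }
    print(name, sorted(composite, key=composite.get, reverse=True))
# outcome-weighted  -> ['College C', 'College A', 'College B']
# research-weighted -> ['College A', 'College C', 'College B']
```

With these invented numbers, the outcome-weighted system puts College C first while the research-weighted system puts College A first, even though the underlying data never changed.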
Ranking measures are reasonable for ascertaining overall college quality, but because of their inherent limitations, they cannot meaningfully discriminate among same-tier colleges.
Accordingly, I wouldn’t fret too much about rank differences of a few spots.
Rankings capture quality at a general level and do not fully align with an individual’s specific requirements.
As such, rankings are best used to get an overall idea of college quality. Supplement this information with criteria that matter at the individual level but are not captured, or are underemphasized, in rankings.
Country Rankings
Hundreds of indexes rank countries on social, political, and cultural attributes such as gender equality, religious freedom, happiness, and more.
These indexes rest on the principle that anything and everything is measurable. This notion has merit, but it often leads to flawed measurement, especially under resource constraints and in partisan hands.
Consider the World Press Freedom Index (WPFI) by Reporters Sans Frontières (Reporters Without Borders), wherein for 2023, India ranks 161/180 and falls in the bottommost “very serious” category. This label is inconsistent with my estimation of press freedom in India.
I believe India has vibrant, diverse, and free media. Hundreds of print, television, radio, and online media platforms offer a range of perspectives and voices without restrictions.
To me, WPFI does not pass the smell test. Upon scrutiny, it reveals a narrow focus and a survey-based methodology. It comprises two components, sketched in a toy example after the list:1
- a tally of abuses against media and journalists in connection with their work
- a qualitative analysis based on survey responses from press freedom specialists
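To make the structure concrete, here is the toy sketch promised above of how such an index might combine the two components. This is emphatically not RSF’s actual formula (see the footnote for their methodology); the weighting scheme and all numbers are invented:

```python
# Purely illustrative: NOT RSF's actual formula. It shows how a heavily
# weighted survey component can swing a composite index even when the
# quantitative abuse tally is identical.

def toy_press_freedom_score(abuses_per_100_journalists: float,
                            survey_score: float) -> float:
    """Combine an abuse tally (lower is better) with a 0-100 survey
    score (higher is better) into a single 0-100 index value."""
    # Convert the abuse tally into a 0-100 "safety" subscore.
    safety = max(0.0, 100.0 - 10.0 * abuses_per_100_journalists)
    # Equal weights, chosen arbitrarily for illustration.
    return 0.5 * safety + 0.5 * survey_score

# Two hypothetical countries with identical abuse tallies:
print(toy_press_freedom_score(abuses_per_100_journalists=2.0, survey_score=80.0))  # 80.0
print(toy_press_freedom_score(abuses_per_100_journalists=2.0, survey_score=40.0))  # 60.0
```

The point: when the survey component carries substantial weight, who answers the survey largely determines the final score and rank.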
The abuse tally seems to assume journalists should be unchallenged. But biased and irresponsible journalists are the norm and must be challenged and held accountable.
The website does not disclose how “press freedom specialists” are selected and surveyed. In India, it could be a group whose personal biases, political leanings, and monetary incentives produce lopsided survey responses.
Country index rankings are typically reported in the media in summary form, without methodological detail, so it’s inadvisable to take them at face value. Unfortunately, that is exactly what most people do.
There are other reasons to be wary!
Some of these indexes tend to be from partisan sources such as ideology-based think tanks, media companies, overzealous NGOs, and government organizations.
The most widely disseminated and followed indexes come from an American/West European reference frame; when applied worldwide, they sometimes impose American standards on vastly different contexts.
In developing countries especially, some of these indexes might be inadequate because they do not reflect those countries’ sociopolitical and cultural dynamics.
People Rankings
Ranking individuals on influence, physical appearance, leadership, and the like is fun.
Unfortunately, these rankings receive far more attention than they warrant.
Several decades ago, Vogue magazine featured Gayatri Devi, an Indian royal, among the ten most beautiful women. Even today, biographical sketches of Gayatri Devi mention her Vogue listing.
It’s crazy that a subjective and frivolous list by a couple of Vogue editors long ago still has currency.
Consider Time magazine’s annual list of the 100 most influential people. In the digital era, such gimmicky lists are Time’s attempt to stay relevant.
The compilation is by Time’s editorial staff, and the process is subjective and opaque.
Isn’t it ridiculous that a list compiled by a bunch of editors at a failing news magazine is referenced as a badge of honor by many bright and intelligent people worldwide?
Notes
US News’s “national universities” ranking caters to undergraduate students whereas its “global universities” ranking emphasizes knowledge creation.
Assume you strongly believe Harvard and Stanford’s business schools are the best. The US News rankings then suggest that your criteria, and the relative importance you assign to them, differ from those in the US News methodology.
US News rankings are based partly on survey data, which has many limitations, and data submitted by college administrators, who might be tempted to fudge data. Recent data-fudging culprits include Columbia University and Temple University’s School of Business.
The ARWU rankings employ objective, publicly available data, such as the number of Nobel laureates and Fields Medalists among alumni and faculty, publications in the top journals Science and Nature, and other such indicators.
When my son attended college, he narrowed his choice to three colleges that accepted him. One college had a strong basketball program but was 12 hours from home. As an avid college basketball fan, my son preferred this college. However, we wanted him to attend one of the other two colleges within four hours of home. After considering several factors, including his intended major, campus life, course offerings, and the like, he enrolled at the lowest ranked, per US News, of the three colleges. Some of his ranking-enamored friends were baffled by his decision.
The US Department of State’s International Religious Freedom (IRF) report is a prime example of blatantly partisan and political evaluation. The IRF assessment of freedom of religion draws on reports of religious violence and discrimination incidents by government officials, NGOs, religious leaders, and human rights activists. I believe the Christian Church, for whom India is important as its biggest target market for Christian conversion, drives the IRF assessment of religious freedom in India. Read about the Christian Church nexus here.
1. https://rsf.org/en/methodology-used-compiling-world-press-freedom-index-2023?year=2023&data_type=general
6 Comments
Point well made.
It’s important for ranking agencies to be transparent about their methodology and criteria, and equally important for people to make the effort to go through these, and then decide for themselves.
Thanks, Anju. Yes, it is a two-way street.
Thanks, Sangeeta. Rankings provide information, but we must know the rankings computation procedure.
I think you are right, but rankings are taken seriously by a majority of people.
Rankings in India, especially for private universities, are often questionable, e.g., the grading by NAAC.
Great post. Always have to take rankings with a grain of salt.
Yes.