The price of a citation, or How did King Abdulaziz University get in the world’s top 10?

[reblogged from Matters Mathematical]

According to a great recent blogpost by Berkeley academic Lior Pachter, there is something very fishy about university rankings.  In last week’s global university ranking published by US News & World Report (USNWR), the top 10 universities listed in mathematics are:

1. Berkeley
2. Stanford
3. Princeton
5. University of Oxford
6. Harvard
7. King Abdulaziz University
8. Pierre and Marie Curie – Paris 6
9. University of Hong Kong
10. University of Cambridge

The USNWR rankings are based on eight attributes:

– global research reputation
– regional research reputation
– publications
– normalized citation impact
– total citations
– number of highly cited papers
– percentage of highly cited papers
– international collaboration

Now, how did KAU end up in the top 10?  Its chair received his PhD in 2005 and has zero publications.  Its own PhD programme is only two years old. It has separate campuses for men and women.  The author (and probably many other mathematicians) had never heard of KAU. Apparently, the secret of its ranking success lies in the fact that,

“[a]lthough KAU’s full time faculty are not very highly cited, it has amassed a large adjunct faculty that helped them greatly in these categories. In fact, in “normalized citation impact” KAU’s math department is the top ranked in the world. This amazing statistic is due to the fact that KAU employs (as adjunct faculty) more than a quarter of the highly cited mathematicians at Thomson Reuters.”

The article goes on with a very interesting and evidence-supported discussion of the ranking system, and of the particular approach taken by KAU in order to put itself on the world’s mathematical map. There are also comments by various academics, a few of whom work for KAU. Well worth a read if you have time to be scared about the $$$$$future$$$$$ of global academia.

Pachter’s blogpost raises some very interesting questions about the future of global academia. First of all, it is not at all surprising that universities from the periphery (the “global south”, as we sociologists like to call it) are trying to gain prestige and put themselves out there.  Nor is it surprising that some of the more affluent among them will attempt to buy their way into the global academic system. In doing so, they are merely exploiting loopholes and bugs – which to them are “features” – in the ranking and prestige system created by old-world academia.

Our indignation at this, while justified, is also somewhat hypocritical: after all, they are simply taking the “money makes research go round” principle that bit further. Academics and administrators in US and European universities should take this as a warning – a mirror held up to our own institutional practices, which may be less blatant and aggressive, but are nevertheless often the same in nature.  UK universities in particular – more so than the rest of Europe, though still less so than the US – are also doing their best to hire highly cited academics.

I’m not at all worried about universities from other places taking the lead in research, and no doubt many of the names on the list are doing just that.  What is really worrying is the increasing overreliance on numeric indicators of academic quality as a substitute for much more detailed, more qualitative assessment.  I think that we… or someone? but who? well, we – vice-chancellors, academics and administrators – should take the hint from KAU’s success on paper and change the system of science quality assessment, not just by tightening existing loopholes, but by not relying on simplified indicators at all.

Categories: Higher Education, Matters Mathematical

