The latest global rankings contain many items that academics would be advised not to read in public places lest they embarrass the family by sniggering to themselves in Starbucks or Nando's.
THE would, for example, have us believe that St. George's, University of London is the top university in the world for research impact as measured by citations. This institution specialises in medicine, biomedical science and healthcare sciences. It does not do research in the physical sciences, the social sciences, or the arts and humanities and makes no claim that it does. To suggest that it is the best in the world across the range of scientific and academic research is ridiculous.
There are several other universities with scores for citations that are disproportionately higher than their research scores, a sure sign that the THE citations indicator is generating absurdity. They include Brandeis, the Free University of Bozen-Bolzano, Clark University, King Abdulaziz University, Anglia Ruskin University, the University of Iceland, and Orebro University, Sweden.
In some cases, it is obvious what has happened. King Abdulaziz University has been gaming the rankings by recruiting large numbers of adjunct faculty whose main function appears to be listing the university as a secondary affiliation in order to collect a share of the credit for publications and citations. The Shanghai rankers have stopped counting secondary affiliations for their highly cited researchers indicator, but KAU is still racking up the points in other indicators and other rankings.
The contention that Anglia Ruskin University is tenth in the world for research impact, equal to Oxford, Princeton, and UC Santa Barbara, and just above the University of Chicago, will no doubt be met with donnish smirks at the high tables of that other place in Cambridge, 31st for citations, although there will probably be less amusement about Oxford being crowned best university in the world.
Anglia Ruskin's output of research is not very high, about a thirtieth of Chicago's according to the Web of Science Core Collection. Its faculty does, however, include one professor who is a frequent contributor to global medical studies with large numbers of authors, although never more than a thousand, and hundreds of citations a year. Single-handedly he has propelled the university into the research stratosphere, since the rest of the university has been generating few citations (there's nothing wrong with that: it's not that sort of place) and so the number of papers by which the normalised citations are divided is very low.
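The arithmetic behind this distortion is simple enough to sketch. The following toy example (with entirely hypothetical numbers, not actual counts for any university) shows how a citations-per-paper metric rewards a small denominator: a handful of heavily cited papers can lift a small institution's average far above that of a large research university.

```python
# Toy illustration (hypothetical numbers) of how a citations-per-paper
# metric rewards a small denominator. One prolific contributor to highly
# cited multi-author studies can dominate a small institution's average.

def citations_per_paper(citations, papers):
    """Average citations per paper -- the core of a normalised citations score."""
    return citations / papers

# Hypothetical large university: huge output, solid citation counts.
large = citations_per_paper(citations=500_000, papers=60_000)

# Hypothetical small institution: tiny output, but a share of the
# citations to a few very heavily cited multi-author papers.
small = citations_per_paper(citations=40_000, papers=2_000)

print(f"Large university: {large:.1f} citations per paper")   # 8.3
print(f"Small institution: {small:.1f} citations per paper")  # 20.0
```

The actual THE indicator applies field and year normalisation on top of this, but the denominator effect is the same: the fewer papers an institution publishes, the more a single citation magnet distorts its score.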
The THE citations methodology is badly flawed. That university heads give any credence to rankings that include such ludicrous results is sad testimony to the decadence of the modern academy.
There are also many universities that have moved up or down by a disproportionate number of places. These include:
Peking University rising from 42nd to 29th
University of Maryland at College Park rising from 117th to 67th
Purdue University rising from 113th to 70th
Chinese University of Hong Kong rising from 138th to 76th
RWTH Aachen rising from 110th to 78th
Korean Advanced Institute of Science and Technology rising from 148th to 89th
Vanderbilt University falling from 87th to 108th
University of Copenhagen falling from 82nd to 120th
Scuola Normale Pisa falling from 112th to 137th
University of Cape Town falling from 120th to 148th
Royal Holloway, University of London falling from 129th to 173rd
Lomonosov Moscow State University falling from 161st to 188th
The point cannot be stressed too strongly that universities are large and complex organisations. They do not, in 12 months or less, short of major restructuring, change sufficiently to produce movements such as these. Such instability can only arise in two ways: through the entry into the rankings of universities with attributes different from those of the established participants, thus changing the means from which standardised scores are derived, or through significant methodological changes.
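The first of these mechanisms is worth spelling out. Rankings typically standardise each raw indicator against the mean and standard deviation of the whole field, so admitting new entrants with different attributes moves every institution's standardised score, even when its own raw value is unchanged. A minimal sketch, with illustrative numbers:

```python
# Why new entrants can move everyone's score: standardised (z) scores
# depend on the mean and standard deviation of the whole field.
# All numbers here are illustrative, not actual ranking data.
from statistics import mean, stdev

def z_scores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

established = [55.0, 60.0, 65.0, 70.0, 75.0]
before = z_scores(established)

# Two new entrants arrive with much lower raw scores ...
after = z_scores(established + [20.0, 25.0])

# ... and the third-ranked university's standardised score changes,
# even though its raw value (70.0) did not.
print(before[3], after[3])
```

Here the university with a raw score of 70.0 sees its z-score rise simply because weaker institutions joined the pool and dragged the mean down. Nothing about the university itself changed.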
There have in fact been significant changes to the methodology this year, although perhaps not as substantial as those of 2015. First, books and book chapters are included in the count of publications and citations, an innovation pioneered by US News in their Best Global Universities. Almost certainly this has helped English-speaking universities with a comparative advantage in the humanities and social sciences, although THE's practice of bundling indicators together makes it impossible to say exactly how much. It would also work to the disadvantage of institutions such as Caltech that are comparatively less strong in the arts and humanities.
Second, THE have used a modest version of fractional counting for papers with more than a thousand authors. Last year they were not counted at all. This means that universities that have participated in mega-papers such as those associated with the Large Hadron Collider will get some credit for citations of those papers, although not as much as they did in 2014 and before. This has almost certainly helped a number of Asian universities that have participated in such projects but have a generally modest research output. It may also have benefitted universities in California such as UC Berkeley.
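The difference between full and fractional counting is easy to illustrate. The sketch below assumes each contributing institution receives an equal 1/N share of a mega-paper's citations; THE's actual weighting scheme is more elaborate and not fully public, so the function and its numbers are purely illustrative.

```python
# Sketch of full vs. fractional counting for multi-author papers.
# Assumption (not THE's published scheme): fractional counting gives
# each of N participating institutions an equal 1/N share.

def credit_per_institution(citations, n_institutions, fractional=True):
    """Citation credit one institution receives for a single paper."""
    if fractional:
        return citations / n_institutions  # credit is shared out
    return citations  # full counting: every institution gets all of it

# A hypothetical Large Hadron Collider-style paper:
# 2,000 citations, 500 participating institutions.
full = credit_per_institution(2_000, 500, fractional=False)   # 2000.0
shared = credit_per_institution(2_000, 500, fractional=True)  # 4.0
```

Under full counting, a small university in a 500-institution collaboration banks the same 2,000 citations as everyone else; under this equal-share fractional scheme it banks 4. THE's "modest" version sits somewhere between the two extremes of counting such papers fully and (as last year) not at all.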
Third, THE have combined the results of the academic reputation survey conducted earlier this year with that used in the 2015-16 rankings. Averaging reputation surveys is a sensible idea, already adopted by QS and US News in their global rankings, but one that THE has avoided until now.
This year's survey saw a very large reduction in the number of responses from researchers in the arts and humanities and a very large increase, for reasons unexplained, in the number of responses from business studies and the social sciences, separated now but combined in 2015.
Had the responses for 2016 alone been counted, there might have been serious consequences for UK universities, relatively strong in the humanities, and a boost for East Asian universities, relatively strong in business studies. Combining the two surveys has limited the damage to British universities and slowed the rise of Asia to media-acceptable proportions.
One possible consequence of these changes is that UC Berkeley, eighth in 2014-15 and thirteenth in 2015-16, is now, as predicted here, back in the top ten. Berkeley is host for the forthcoming THE world summit although that is no doubt entirely coincidental.
The overall top place has been taken by Oxford to the great joy of the vice-chancellor who is said to be "thrilled" by the news.
I do not want to be unfair to Oxford, but the idea that it is superior to Harvard, Princeton, Caltech or MIT is nonsense. Its strong performance in the THE WUR is in large measure due to the over-emphasis in these tables on reputation, income and a very flawed citations indicator. Its rise to first place over Caltech is almost certainly a result of this year's methodological changes.
Let's look at Oxford's standing in other rankings. The Round University Ranking (RUR) uses Thomson Reuters data just like THE did until two years ago. It has 12 of the indicators employed by THE and eight additional ones.
Overall, Oxford was 10th, up from 17th in 2010. In the teaching group of five indicators Oxford is in 28th place. For specific indicators in that group, its best performance was teaching reputation (6th) and its worst academic staff per bachelor's degree (203rd).
In research it was 20th. Places ranged from 6th for research reputation to 206th for doctoral degrees per admitted PhD. It was 5th for international diversity and 12th for financial sustainability.
The Shanghai ARWU rankings have Oxford in 7th place and Webometrics in 10th (9th for Google Scholar Citations).
THE is said to be trusted by the great and the good of the academic world. The latest example is the Norwegian government including performance in the THE WUR as a criterion for overseas study grants. That trust seems largely misplaced. When the vice-chancellor of Oxford University is thrilled by a ranking that puts the university on a par for research impact with Anglia Ruskin then one really wonders about the quality of university leadership.
To conclude my latest exercise in malice and cynicism (thank you, ROARS), here is a game to amuse international academics.
Ask your friends which university in their country is the leader for research impact and then tell them who THE thinks it is.
Here are THE's research champions, according to the citations indicator:
Argentina: National University of the South
Australia: Charles Darwin University
Brazil: Universidade Federal do ABC (ABC refers to its location, not the courses offered)
Canada: University of British Columbia
China: University of Science and Technology of China
France: Paris Diderot University (Paris 7)
Germany: Ulm University
Ireland: Royal College of Surgeons
Japan: Toyota Technological Institute
Italy: Free University of Bozen-Bolzano
Russia: ITMO University
Turkey: Atilim University
United Kingdom: St George's, University of London.