
Are International University Rankings Misleading?

October 2008 - The 2008 edition of the Times Higher-QS World University Rankings has been released, showing the dominance of US and UK universities in the top ten. Technologically inclined universities have risen in the list, which is based on survey responses from 6,354 academics (compared with 5,101 in 2007) and 2,339 employers (compared with 1,482 in 2007).

2008 RANK   2007 RANK   INSTITUTION NAME
1           1           HARVARD University
2           2=          YALE University
3           2=          University of CAMBRIDGE
4           2=          University of OXFORD
5           7=          CALIFORNIA Institute of Technology (Caltech)
6           5           IMPERIAL College London
7           9           UCL (University College London)
8           7=          University of CHICAGO
9           10          MASSACHUSETTS Institute of Technology (MIT)
10          11          COLUMBIA University

Source: QS Quacquarelli Symonds (www.topuniversities.com)

According to Ann Mroz, editor of THE: "These Rankings use an unprecedented and accurate amount of data to deliver the best overall look at the strength of the world's top universities."

Nunzio Quacquarelli, Managing Director of QS and co-editor of the Top Universities Guide, said: "In just five years, the THE-QS World University Rankings have become the primary benchmark for comparing universities across borders."

But research published in the open access journal BMC Medicine at the end of 2007 concluded that international university rankings are "misleading and should be abandoned".

The study analyzed 2006 data from the Times Higher Education Supplement "World University Rankings" and the Shanghai Jiao Tong University "Academic Ranking of World Universities". It found that only 133 institutions appeared in both top 200 lists, and that four of the top 50 in the Shanghai list did not feature in the top 500 of the Times ranking.

The researchers suggest that these discrepancies result from poor methodology and the use of inappropriate indicators. For example, the Shanghai ranking assessed research excellence partly by the number of Nobel laureates among an institution's staff and alumni. The researchers argue that a laureate's prize-winning work has often been conducted at a previous institution, and that a laureate's presence does not necessarily imply better undergraduate education. The Times ranking drew on a survey of more than 190,000 researchers asked to list the top 30 universities in their field. The authors point out that this entirely subjective method had a response rate of less than 1 per cent.

Lead researcher John Ioannidis said:

"There are flaws in the way that almost every indicator is used to compile these two popular rankings. I don't disagree that excellence is important to define, measure, interpret and improve, but the existing ranking criteria could actually harm science and education."

The authors conclude that global collaboration is needed to standardize key data on universities and other institutions, with the limitations of such data acknowledged rather than underestimated.

John Ioannidis commented:

"Evaluation exercises should not force spurious averages and oversimplified rankings for the many faces of excellence. And efforts to improve institutions should not focus just on the numbers being watched."



Copyright © 2006-2024 Alan Price and AnythingButWork.com contributors. All rights reserved.