Friday, October 28, 2011

An Error

This replaces an earlier post.

Last year Times Higher Education admitted to an error involving Monash University and the University of Adelaide

Also, after the launch of the World University Rankings 2010 it became apparent that, owing to a data processing error, the ranking positions of two Australian universities in the top 200 list were incorrect — the University of Adelaide and Monash University.

Both universities remain in the top 1 per cent of world universities.


This year, a representative of Adelaide commented on the error: 


Adelaide's DVCR Mike Brooks said it had been "disconcerting'' that there had been a data processing error last year in the first iteration of the revised rankings since their split from QS. "It certainly raises further questions about the credibility of the rankings,'' Professor Brooks said.

"Based on our own analysis we believe that we have a similar ranking this year to that of 2010. The shift in position is attributed to the error in the processing last year, ongoing changes in THE methodology and increased competition.''

"I think the students and the wider community are able to judge for themselves.  As South Australia's leading research-university and only member of the Group of Eight, I know that we are in an incredibly strong position for the future.''

Adelaide's fall seems to have been largely due to a massive fall in its score for research impact. How much of this was due to the correction of the 2010 error, how much to changes in methodology and how much to the inherent instability of the normalisation procedure is not clear.

Monday, October 17, 2011

GLOBAL: Despite ranking changes, questions persist 

My article on the Times Higher Education World University Rankings can be accessed at University World News.

The international university ranking scene is starting to look like the heavyweight boxing division. Titles are proliferating and there is no longer an undisputed champion of the world. Caltech has just been crowned top university by Times Higher Education and Thomson Reuters, their data collectors and analysts, while QS have put Cambridge in first place. Over at Webometrics, MIT holds the number one spot. But Harvard has the consolation of remaining the top university in the Scimago and HEEACT rankings as well as the Academic Ranking of World Universities, ARWU, published by Shanghai Jiao Tong University.

Read here

Sunday, October 09, 2011

Rising Stars of the THE - TR Rankings

These are some of the universities that have risen significantly in the rankings compared to last year.

Europe

Karolinska Institute
Munich
LSE
Zurich
Leuven
Wageningen
Leiden
Uppsala
Sheffield
Humboldt

USA

UC Davis
Minnesota
Penn State
Michigan State

Australia

Monash

Asia

Osaka
Tohoku
Caltech in First Place

The big news of the 2011 THE - TR rankings is that Caltech has replaced Harvard as the world's top university. So how exactly did they do it?

According to the Times Higher iPad apps for this year and last (easily downloadable from the rankings page), Harvard's total score fell from 96.1 to 93.9 and Caltech's from 96.0 to 94.8, turning a 0.1 Harvard lead into one of 0.9 for Caltech.

Harvard continued to do better than Caltech on two indicators, with 95.8 for teaching and 67.5 for international orientation, compared with 95.7 and 56.0 for Caltech.

Caltech is much better than Harvard in industry income - innovation, but that indicator has a weighting of only 2.5%.

Harvard's slight lead in the research indicator has turned into a slight lead of 0.8 for Caltech.

Caltech is still ahead for citations but Harvard caught up a bit, narrowing the lead to 0.1.

So, it seems that what made the difference was the research indicator. It is unlikely that Caltech could have overcome Harvard's massive lead in reputation for research and postgraduate teaching: last year it was 100 compared with 23.5. That leaves us with research income per faculty.
 
According to Phil Baty :

"Harvard reported funding increases that are similar in proportion to those of many other universities, whereas Caltech reported a steep rise (16 per cent) in research funding and an increase in total institutional income."

This seems generally compatible with Caltech's 2008-2009 financial statement according to which:

Before accounting for investment losses, total unrestricted revenues increased 6.7% including JPL, and 14.0% excluding JPL.

and

Research awards in FY 2009 reached an all-time high of $357 million, including $29 million of funds secured from the federal stimulus package. Awards from federal sponsors increased by 34.4%, while awards from nonfederal sponsors increased by 20.7%.  We also had a good year in terms of private giving, as donors continue to recognize the importance of the research and educational efforts of our outstanding faculty and students.

It seems that research income is going to be the tie-breaker at the top of the THE - TR rankings.  This might not be such a good thing. Income is an input. It is not a product, although universities everywhere apparently think so. There are negative backwash effects coming if academics devote their energies to securing grants rather than actually doing research.
Update on Alexandria

Elnaschiewatch reports that Hend Hanafi, President of Alexandria University, has resigned following prolonged student protests.

Apparently she was under fire because of her links to the old regime but one wonders whether her university's apparent fall of nearly 200 places in the THE - TR rankings gave her a final push. If so, we hope that Times Higher will send a letter of apology for unrealistically raising the hopes of faculty and students. 
Meanwhile over in Alexandria
One of the strangest results of the 2010 THE - TR rankings was the elevation of Alexandria University in Egypt to the improbable status of fourth best university in the world for research impact and 147th overall. It turned out that this was almost entirely the work of precisely one marginal academic figure, Mohamed El Naschie, former editor of the journal Chaos Solitons and Fractals, whose work was copiously cited by himself, by other authors in his journal and by those in an Israeli-published journal (now purchased by De Gruyter) of which he was an editor.

The number of citations collected by El Naschie was not outrageously high, but it was much higher than usual for his discipline, and many of them came within a year of publication. This meant that El Naschie and Alexandria University received massive credit for his citations, since Thomson Reuters' normalisation system compared them with the international average in a field where citations are low, especially in the first year of publication.

Alexandria was not the only university to receive an obviously inflated score for research impact. Hong Kong Baptist University received a score of 97.6 and Bilkent one of 95.7, although in those two cases it seems that the few papers that contributed to these scores did have genuine merit.

It should be remembered that the citation scores were averages and that a few highly cited papers could have a grossly disproportionate effect if the total number of published papers was low.
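The arithmetic is easy to sketch. A minimal illustration, with entirely hypothetical numbers, of how a mean-based citations indicator lets a handful of heavily cited papers outweigh thousands of solidly cited ones:

```python
# Illustration (hypothetical numbers): with a small publication count,
# a few highly cited papers dominate a citations-per-paper average.
def citations_per_paper(citation_counts):
    """Mean citations per paper -- the kind of average the indicator rests on."""
    return sum(citation_counts) / len(citation_counts)

# A small school: 50 papers, mostly uncited, plus three heavily cited ones.
small = [0] * 47 + [300, 250, 200]
# A large school: 5,000 papers averaging a solid 10 citations each.
large = [10] * 5000

print(citations_per_paper(small))  # 15.0 -- outscores the large school
print(citations_per_paper(large))  # 10.0
```

Because the indicator is an average rather than a total, the small school's three outliers are diluted by only 47 other papers, so it outscores an institution a hundred times its size.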

This year Thomson Reuters went to some lengths to reduce the impact of a few highly cited papers. They have to some extent succeeded. Alexandria's score for citations is down to 61.4 (it is in 330th place overall), Bilkent's to 60.8 (222nd place overall) and HKBU's to 59.7 (290th place overall).

These scores are not as ridiculous as those of 2010, but they are still unreasonable. Are we really expected to believe that these schools have a greater research impact than the University of Sydney, Kyoto University, the London School of Economics, Monash University and Peking University, all of which have scores in the fifties for this indicator?

I for one cannot believe that a single paper or a few papers, no matter how worthwhile, can justify inclusion in the top 300 world universities.

There is another problem. Normalisation of citations by year is inherently unstable. One or two papers in a low-citation discipline cited within a year of publication will boost the citations indicator score, but after a year their impact diminishes because the citations are now coming more than a year after publication.
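To see why, consider a toy version of field-and-year normalisation, in which a paper's impact is its citation count divided by the world average for papers of the same field and age. The baseline figures below are purely illustrative, not Thomson Reuters' actual values:

```python
# Hypothetical expected citation counts for a low-citation field,
# indexed by years since publication (the world-average baseline).
EXPECTED = {0: 0.2, 1: 1.0, 2: 2.5}  # illustrative numbers only

def normalised_impact(citations, years_since_publication):
    """Actual citations divided by the field/year world average."""
    return citations / EXPECTED[years_since_publication]

# Five first-year citations in a field where 0.2 is typical: a 25x boost.
print(normalised_impact(5, 0))  # 25.0
# The same five citations, assessed a year later, are only 5x the average.
print(normalised_impact(5, 1))  # 5.0
```

The same five citations that produced a 25x ratio in the first year produce only a 5x ratio a year later, so a university's score can fall sharply with no change in anyone's behaviour.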

Alexandria's score was due to fall anyway because El Naschie has published very little lately, so his contribution to the citations score would have fallen whatever methodological changes were introduced. And if he ever starts publishing again?

Also, if Thomson Reuters are normalising by field across the board, this raises the possibility that universities will be able to benefit simply by reclassifying research grants, moving research centres from one field to another, manipulating abstracts and keywords, and so on.

Friday, October 07, 2011

Who else is down ?

Just looking at the top 200 of the THE rankings, these universities have fallen quite a bit.

University of North Carolina Chapel Hill
Sydney
Ecole Normale Superieure
Ecole Polytechnique
Trinity College Dublin
University College Dublin
William and Mary College
University of Virginia
Asian Decline?

The Shanghai rankings have shown that universities in Korea, China (including Taiwan and Hong Kong) and the Middle East  have been steadily advancing over the years. Did they get it wrong?

The latest Times Higher Education - Thomson Reuters rankings appear to prove that Asian universities have inexplicably collapsed over the last year. Tokyo has dropped from 26th to 30th place. Peking has fallen twelve places to 49th. Pohang University of Science and Technology and the Hong Kong University of Science and Technology have slipped out of the top fifty. Bilkent and Hong Kong Baptist University are way down. The decline of China University of Science and Technology is disastrous: from 49th to 192nd. Asian universities are going to be dangerous places for the next few days, with students and teachers dodging university administrators jumping out of office windows.

Of course, massive declines like this do not reflect reality: they are simply the result of the methodological changes introduced this year. 

Anyone accessing a ranking site or downloading an iPad app should be made to click on a box reading "I understand that the methodological changes in the rankings mean that comparison with last year's ranking is pointless and I promise not to issue a public statement or say anything to anyone until I have had a cup of tea and I have made sure that everybody else understands this."

Thursday, October 06, 2011

New Arrivals in the THE Top 200.

Every time a new ranking is published there are cries for the dismissal or worse of vice-chancellors or presidents who allowed their universities to lose ground. There will no doubt be more demands as the results of this year's THE rankings are digested. This will be very unjust since there are reasons why universities might take a tumble that have nothing to do with any decline in quality.

First, Thomson Reuters, THE's data collectors, have introduced several methodological changes. In the top 20 or 30 these might not mean very much, but lower down the effect could be very large.

Second, rankers sometimes make mistakes, and so do those who collect data for institutions.

Third, many new universities have taken part this year. I counted thirteen just in the top 200, and there are certainly many more in the 200s and 300s. A university ranked 200th last year would lose 13 places even if it had exactly the same relative score.

The thirteen newcomers are Texas at Austin, Rochester, the Hebrew University of Jerusalem, the University of Florida, Brandeis, the Chinese University of Hong Kong, Nijmegen, the Medical University of South Carolina, Louvain, Universite Paris Diderot VII, Queen's University (Canada), Sao Paulo and Western Australia.
Highlights of the THE rankings

Some interesting results.

57. Ohio State University
103. Cape Town
107. Royal Holloway
149. Birkbeck
184. Iowa State
197. Georgia Health Sciences University
201-225. Bilkent
201-225. University of Medicine and Dentistry of New Jersey
226-250. Creighton University, USA
226-250. Tokyo Metropolitan
251-275. Wayne State
276-300. University of Crete
276-300. University of Iceland
276-300. Istanbul Technical University
276-300. Queensland University of Technology
276-300. Tokyo Medical and Dental University
301-350. Alexandria
301-350. Aveiro University
301-350. Hertfordshire
301-350. Plymouth University, UK
301-350. Sharif University of Technology
301-350. National University of Ireland, Maynooth
301-350. Taiwan Ocean University
301-350. Old Dominion University, USA

Wednesday, October 05, 2011

THE Rankings Out


Here is the top 10.

1. Caltech
2. Harvard
3. Stanford
4. Oxford
5. Princeton
6. Cambridge
7. MIT
8. Imperial College London
9. Chicago
10. Berkeley
THE Rankings: Caltech Ousts Harvard

This is from the Peninsula in Qatar


LONDON: US and British institutions once again dominate an annual worldwide league table of universities published yesterday, but there is a fresh name at the top, unseating long-time leader Harvard.
California Institute of Technology (Caltech) knocked the famous Massachusetts institution from the summit of the Times Higher Education (THE) league table for the first time in eight years, with US schools claiming 75 of the top 200 places.
Next is Britain, which boasts 32 establishments in the top 200, but an overhaul in the way in which the country’s universities are funded has raised concerns over its continuing success.
Asia’s increasing presence in the annual table has stalled, with 30th placed University of Tokyo leading the continent’s representation.
China’s top two universities hold on to their elite status, but no more institutions from the developing powerhouse managed to break into the top 200.
THE attributed Caltech’s success to “consistent results across the indicators and a steep rise in research funding”.
THE Rankings

Caffeineblogging

The Guardian appears to have heard something.

On Thursday, Times Higher Education publishes its global university rankings. As usual, UK universities shine disproportionately. Altogether a dozen are in the top 100 in the world, with seven in the top 50.

Tuesday, October 04, 2011

Latin American Rankings

QS have produced their new Latin American rankings. The top five are:

1. Universidade de Sao Paulo
2. Pontificia Universidad Catolica de Chile
3. Universidade Estadual de Campinas, Brazil
4. Universidad de Chile
5. Universidad Nacional Autonoma de Mexico (UNAM)
Suggestion

In Times Higher Education, Terrance Karran claims that universities that do well in the THE rankings (and the other ones?) are those that show more regard for academic freedom, which is equated to "compliance" with the AAUP's academic freedom statement.

Perhaps an annual prize could be awarded to the university that has the most academic freedom. I propose that it be called the Lawrence Summers Prize.
Expectation

David Willetts, the British minister for universities and science, says that he expects more British universities to be in the Times Higher Education World University Rankings top 200.

And if more British universities, then fewer........?
The US News rankings

The U.S. News rankings of American colleges and universities were released on September 13th. For more information go here.

The top 10 national universities are:

1. Harvard
2. Princeton
3. Yale
4. Columbia
5= Caltech
5= MIT
5= Stanford
5= Chicago
5= University of Pennsylvania
10. Duke

Tuesday, September 13, 2011

Announcement from THE

Times Higher Education have just announced that they will only rank 200 universities this year. Another 200 will be listed alphabetically but not ranked.

Let us be clear: the Times Higher Education World University Rankings list only the world’s top 200 research-led global universities.

We stop our annual list at the 200th place for two reasons. First, it helps us to make sure that we compare like with like. Although those ranked have different histories, cultures, structures and sizes, they all share some common characteristics: they recruit from the same global pool of students and staff; they push the boundaries of knowledge with research published in the world’s leading journals; and they teach at both the undergraduate and doctoral level in a research-led environment.
We unashamedly rank only around 1 per cent of the world’s universities – all of a similar type – because we recognise that the sector’s diversity is one of its great strengths, and not every university should aspire to be one of the global research elite.
But we also stop the ranking list at 200 in the interests of fairness. It is clear that the lower down the tables you go, the more the data bunch up and the less meaningful the differentials between institutions become. The difference between the institutions in the 10th and 20th places, for example, is much greater than the difference between number 310 and number 320. In fact, ranking differentials at this level become almost meaningless, which is why we limit it to 200.
 
If THE are going to provide sufficient detail about the component indicators to enable analysts to work out how universities compare with each other, this would be a good idea. It would avoid raucous demands that university heads resign whenever the top national university slips 20 places in the rankings, but would still allow analysts to figure out exactly where schools stand.

It is true, as Phil Baty says, that there is not much difference between being 310th and 320th, but there is, or there would be if the methodology were valid, a difference between 310th and 210th. If THE are just going to present us with a list of 200 universities that did not (quite?) make it into the top 200, a lot of usable information will be lost.

The argument that THE is interested only in ranking the leading research-led institutions seems to run counter to THE's emphasis on its bundle of teaching indicators and to the claim that normalisation of citations data can uncover hidden pockets of excellence. If we are concerned only with universities with a research-led environment, then a few pockets, or even a single pocket, should be of little concern.

One also wonders what would happen if disgruntled universities decided that it was not worth the effort of collecting masses of data for TR and THE if the only reward is to be lumped among 200 also-rans.
700 Universities

QS have released a ranked list of 700 universities. See here.

Saturday, September 10, 2011

QS: The Employer Survey

The employer survey indicator in the QS World University Rankings might be regarded as a valuable assessment tool since it provides an external check on university quality. There are, however, some odd things about this indicator in the 2011 QS Rankings.

Thirteen universities are given scores of 100, of which 10 are listed as in 4th= place, presumably meaning that they had scores that were identical down to the first or second decimal point. Then 15 schools are listed as being in 15th place with a score of 90, 48 in 51st place with a score of 59.4 and 52 in 100th= place with a score of 55.9.

This probably has something to do with a massive upsurge in responses from Latin America, although exactly what is not clear. QS report that:

"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
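QS give no details of the adjustment, but one plausible form of it is to weight each response by the ratio of a country's target share of the sample to its actual share. A sketch with entirely hypothetical figures:

```python
# A sketch of one plausible reweighting scheme (the QS note gives no details):
# scale each region's responses so its weighted share matches a target share.
def response_weights(counts, target_shares):
    """Weight per response so that weighted shares match the target shares."""
    total = sum(counts.values())
    return {region: target_shares[region] * total / n
            for region, n in counts.items()}

counts = {"Latin America": 6000, "Rest of world": 4000}   # hypothetical counts
target = {"Latin America": 0.15, "Rest of world": 0.85}   # hypothetical shares
w = response_weights(counts, target)

# Weighted totals now match the targets: 6000 responses count as 1500,
# while 4000 responses count as 8500.
print(round(w["Latin America"] * 6000), round(w["Rest of world"] * 4000))
```

Under a scheme like this, a region that supplies 60% of the raw responses can be held to, say, 15% of the weighted total, which would explain how a "dramatic level of response" could be neutralised.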
Baloney

The economist David Blanchflower has dismissed the QS rankings as "a load of old baloney".

Much of what he says is sensible, indeed obvious. But not entirely.

"This ranking is complete rubbish and nobody should place any credence in it."

A bit too strong. The QS rankings are not too bad in parts, having improved over the last few years, and they are moderately accurate at sorting out universities within a country or region. I doubt that anyone seriously thinks that Cambridge is the best university in the world, unless we start counting May balls and punting on the Cam, but it is quite reasonable to say that it is better than Oxford or Durham. Similarly, I wonder if anyone could argue that it is rubbish to say that Tokyo is the best university in Japan or Cape Town the best in Africa.

"It is unclear whether having more foreign students and faculty should even have a positive rank; less is probably better."

Students, yes; but if nothing else, more international faculty does mean that a university is recruiting from a larger pool of talent.

Blanchflower does not mention the academic and employer surveys, both of which are flawed but do provide another dimension of assessment, or the faculty-student ratio, which is very crude but might have a slightly closer relationship to teaching quality than the number of alumni who received Nobel prizes decades ago.

He then goes on to compare the QS rankings unfavorably with the Shanghai rankings (that is actually Shanghai Jiao Tong University, not what he calls the University of Shanghai). I would certainly agree with most of what he says here, but I think we should remember that, flawed as they are, the QS rankings do, unlike the Shanghai index, give some recognition to excellence in the arts and humanities, make some attempt to assess teaching, and provide a basis for discriminating among those universities without Nobel prize winners or Fields medalists.

Finally, I would love to see if Blanchflower has any comments on last year's THE-Thomson Reuters rankings which put Alexandria, Bilkent and Hong Kong Baptist University among the world's research superpowers.

Friday, September 09, 2011

Well Done, QS
 
QS have just indicated that they have excluded self-citations from their citations per faculty indicator in this year's World University Rankings. This is a very positive move that will remove some of the distortions that have crept into this indicator over the last few years. It would have been even better if they had excluded citations within journals and within institutions. Maybe next year.
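Mechanically, excluding self-citations is straightforward. A minimal sketch, assuming a hypothetical data model in which each citation records the author lists of the citing and cited papers:

```python
# A citation is treated as a self-citation if the citing and cited
# author lists overlap (hypothetical data model, for illustration).
def is_self_citation(citing_authors, cited_authors):
    """True if any author appears on both the citing and cited papers."""
    return bool(set(citing_authors) & set(cited_authors))

citations = [
    (["El Naschie"], ["El Naschie"]),         # self-citation: excluded
    (["Smith", "Jones"], ["El Naschie"]),     # independent: counted
    (["Lee", "El Naschie"], ["El Naschie"]),  # co-author self-citation: excluded
]
counted = [c for c in citations if not is_self_citation(*c)]
print(len(counted))  # 1
```

Extending the same overlap test to journal names or institutional affiliations, rather than author lists, would be the mechanical equivalent of also excluding within-journal and within-institution citations.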

It will be interesting to see if Times Higher Education and Thomson Reuters do the same with their rankings in October. It would not be very difficult, and it might help to exclude Alexandria University and a few others from an undeserved place among the world's top universities for research impact.

(By the way, Karolinska Institute is not in the US.)


Although it may not make very much difference at the very top of this indicator, it seems that some places have suffered severely and others have benefited from the change. According to the QS Intelligence Unit:

  • Of all of the institutions we looked at the institution with the largest absolute number of self-citations, by some margin, is Harvard with over 93,000 representing 12.9% of their overall citations count
  • The top five institutions producing over 3,000 papers, in terms of proportion of self-citations are all in Eastern Europe – St Petersburg State University, Czech Technical University, Warsaw University of Technology, Babes-Bolyai University and Lomonosov Moscow State University
  • The top five in terms of the difference in citations per paper when self-citations are excluded are Caltech, Rockefeller, UC Santa Cruz, ENS Lyon and the University of Hawaii
  • And the top 10 in terms of the difference in citations per faculty when self-citations are included are:
# Institution Country
1 California Institute of Technology (Caltech) United States
2 Rockefeller University United States
3 Stanford University United States
4 Gwangju Institute of Science and Technology (GIST) South Korea
5 Karolinska Institute United States
6 Princeton University United States
7 Leiden University Netherlands
8 Harvard University United States
9 University of California, San Diego (UCSD) United States
10 University of California, San Francisco (UCSF) United States

Tuesday, September 06, 2011

The Best University in the World
Update 8/9/2011 -- some comments added

For many people the most interesting thing about the QS rankings is the battle for the top place. The Shanghai rankings put Harvard in first place year after year and no doubt will do so for the next few decades. QS, when it was in partnership with Times Higher Education, also routinely put Harvard first. This is scarcely surprising, since the research prowess of Cambridge has steadily declined in recent years. Still, Cambridge, Oxford and two London colleges did quite well, mainly because they got high scores for international faculty and students and for the academic survey (not surprising, since a disproportionate number of responses came from the UK, Australia and New Zealand), but not well enough to overcome their not very distinguished research record.

Last year, however, Cambridge squeezed past Harvard. This was not because of the academic and employer surveys: those remained at 100 for both places. What happened was that between 2009 and 2010 Cambridge's score for citations per faculty increased from 89 to 93. This would be a fine achievement if it represented a real improvement. Unfortunately, almost every university with a score above 60 for this indicator in 2009 went up by a similar margin in 2010, while universities with scores below 50 slumped. Evidently, there was a new method of converting raw scores. Perhaps a mathematician out there can help.
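One way such an across-the-board shift could arise is from a change in how raw citation counts are converted to the published 0-100 scale. The sketch below uses hypothetical data and two candidate conversion methods (QS has not published its actual procedure) to show that identical raw counts yield different published scores depending solely on the conversion:

```python
import math
import statistics

# Illustrative raw citation counts for six hypothetical universities.
raw = {"A": 1000, "B": 400, "C": 300, "D": 200, "E": 100, "F": 50}

def minmax_scores(raw):
    """Score each university as a percentage of the top raw value."""
    top = max(raw.values())
    return {k: 100 * v / top for k, v in raw.items()}

def zscore_scores(raw):
    """Score each university as the normal CDF of its z-score, times 100."""
    mean = statistics.mean(raw.values())
    sd = statistics.pstdev(raw.values())
    return {k: 100 * 0.5 * (1 + math.erf((v - mean) / (sd * math.sqrt(2))))
            for k, v in raw.items()}

# Same raw data, different conversion, different published scores.
print(minmax_scores(raw)["B"])        # 40.0
print(round(zscore_scores(raw)["B"])) # B's score shifts upward markedly
```

Neither method is necessarily the one QS used; the point is only that a switch of conversion method moves every university's score without any change in the underlying data.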

And this year?

Cambridge and Harvard are both at 100 for the academic and employer surveys, just like last year. (Note that although Harvard does better than Cambridge in both surveys, they get the same reported score of 100.)


For the faculty-student ratio, Harvard narrowed the gap a little, from 3 to 2.5 points. In citations per faculty, Cambridge slipped by 0.3 points. However, Cambridge pulled further ahead on international students and faculty.

Basically, from 2004 to 2009 Harvard reigned supreme because its obvious superiority in research was more than enough to offset the advantages Cambridge enjoyed with regard to internationalisation (small country and policies favouring international students), faculty student ratio (counting non-teaching research staff) and the academic survey (disproportionate responses from the UK and Commonwealth). But this year and last the change in the method of converting the raw scores for citations per faculty artificially boosted Cambridge's overall scores.

So, is Cambridge really the world's top university?

Monday, September 05, 2011

The THE-TR Rankings

The THE-TR World University Rankings will be published on October 6th.

There will be some changes. The weighting given to the citations indicator will be slightly reduced to 30% and internationalisation gets 7.5% instead of 5%.

There will be some tweaking of the citations indicator to avoid a repeat of the Alexandria and other anomalies. Let's hope it works.

In the research indicator there will be a reduction in the weighting given to the survey and public research income as a percentage of research income will be removed.

There will, unfortunately, be a slight increase in the weighting given to international students and a decline in that given to international faculty.
Commentary on the 2011 QS World University Rankings

From India

"University of Cambridge retains its number one spot ahead of Harvard, according to the QS World University Rankings 2011, released today. Meanwhile, MIT jumps to the third position, ahead of Yale and Oxford.

While the US continues to dominate the world ranking scenario, taking 13 of top 20 and 70 of top 300 places, 14 of 19 Canadian universities have ranked lower than 2010. As far as Europe is concerned, Germany, one of the emerging European destinations in recent times, has no university making it to the top 50 despite its Excellence Initiative.

Asian institutions - particularly those from Japan, Korea, Singapore, Hong Kong and China - have fared well at a discipline level in subject rankings produced by QS this year - this is particularly true in technical and hard science fields.

Despite the Indian government's efforts to bring about a radical change in the Indian higher education sector, no Indian university has made it to the top 200 this year. However, China has made it to the top 50 and Middle East in the top 200 for the first time.

According to Ben Sowter, QS head of research, "There has been no (relative) improvement from any Indian institution this year. The international higher education scene is alive with innovation and change, institutions are reforming, adapting and revolutionising. Migration amongst international students and faculty continues to grow with little sign of slowing. Universities can no longer do the same things they have always done and expect to maintain their position in a ranking or relative performance.""