Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Monday, July 03, 2017
Proving anything you want from rankings
Year | Oxford | Cambridge | Gap | % responses arts and humanities
2011 | 68.6 | 80.7 | 12.1 | --
2012 | 71.2 | 80.7 | 9.5 | 7%
2013 | 73.0 | 81.3 | 8.3 | 10.5%
2014 | 67.8 | 74.3 | 6.5 | 9%
2015 | 80.4 | 84.3 | 3.9 | 16%
2016 | 67.6 | 72.2 | 4.6 | 9%
2017 | 69.1 | 69.1 | 0 | 12.5%
Tuesday, June 19, 2018
Are the US and the UK really making a comeback?
The THE reputation rankings include only 100 universities. QS now ranks close to 1,000 universities and provides indicator scores, including academic reputation and employer reputation, for 500 of them.
The publication of these rankings has led to claims that British and American universities are performing well again after a period of stress and difficulty. In recent years we have heard a great deal about the rise of Asia and the decline of the West. Now it seems that THE and QS are telling us that things are beginning to change.
The rise of Asia has perhaps been overblown, but if Asia is narrowly defined as Northeast Asia and Greater China then there is definitely something going on. Take a look at the record of Zhejiang University on the Leiden Ranking publications indicator. In 2006-09 Harvard produced a total of 27,422 papers and Zhejiang University 11,173. In the period 2013-16 the numbers were 33,045 for Harvard and 20,876 for Zhejiang. In seven years Zhejiang has gone from 41% of Harvard's output to 63%. It is not impossible that Zhejiang will reach parity within two decades.
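The arithmetic above can be sketched as a compound-growth extrapolation from the two Leiden data points quoted in the text. The constant-growth assumption is of course a strong one; this is an illustration, not a forecast.

```python
import math

# Leiden Ranking publication counts quoted above (2006-09 and 2013-16 windows).
harvard = {2009: 27422, 2016: 33045}
zhejiang = {2009: 11173, 2016: 20876}

years = 2016 - 2009  # seven years between the two windows

# Implied annual growth rate for each university over the period.
g_harvard = (harvard[2016] / harvard[2009]) ** (1 / years) - 1
g_zhejiang = (zhejiang[2016] / zhejiang[2009]) ** (1 / years) - 1

# Years until parity if both growth rates were to persist unchanged.
gap = harvard[2016] / zhejiang[2016]
t_parity = math.log(gap) / math.log((1 + g_zhejiang) / (1 + g_harvard))

print(f"Zhejiang as share of Harvard in 2013-16: {zhejiang[2016] / harvard[2016]:.0%}")
print(f"Implied years to parity after 2016: {t_parity:.1f}")
```

On these figures Zhejiang's implied growth is roughly 9% a year against Harvard's 3%, which would close the publications gap well within the two decades suggested above.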
We are talking about quantity here. Reaching parity for research of the highest quality and the greatest impact will take longer but here too it seems likely that within a generation universities like Peking, Zhejiang, Fudan, KAIST and the National University of Singapore will catch up with and perhaps surpass the Ivy League, the Russell Group and the Group of Eight.
The scientific advance of China and its neighbours is confirmed by data from a variety of sources, including the deployment of supercomputers, the use of robots, and, just recently, the Chinese Academy of Science holding its place at the top of the Nature Index.
There are caveats. Plagiarism is a serious problem and the efficiency of Chinese research culture is undermined by cronyism and political conformity. But these are problems that are endemic, and perhaps worse, in Western universities.
So it might seem surprising that the two recent world rankings show that American and British universities are rising again.
But perhaps it should not be too surprising. QS and THE emphasise reputation surveys, which have a weighting of 50% in the QS world rankings and 33% in THE's. There are signs that British and American universities and others in the Anglosphere are learning the reputation management game while universities in Asia are not so interested.
Take a look at the top fifty universities in the QS academic reputation indicator, which is supposed to be about the best universities for research. The countries represented are:
US 20
UK 7
Australia 5
Canada 3
Japan 2
Singapore 2
China 2
Germany 2.
There is one each for Switzerland, Hong Kong, South Korea, Mexico, Taiwan, France and Brazil.
The top fifty universities in the QS citations per faculty indicator, a measure of research excellence, are located in:
USA 20
China 4
Switzerland 4
Netherlands 3
India 2
Korea 2
Israel 2
Hong Kong 2
Australia 2.
There is one each from Saudi Arabia, Italy, Germany, UK, Sweden, Taiwan, Singapore and Belgium.
Measuring citations is a notoriously tricky business and probably some of the high flyers in the reputation charts are genuine local heroes little known to the rest of the world. There is also now a lot of professional advice available about reputation management for those with cash to spare. Even so it is striking that British, Australian, and Canadian universities do relatively well on reputation in the QS rankings while China, Switzerland, the Netherlands, India and Israel do relatively well for citations.
For leading British universities the mismatch is very substantial. According to the 2018-19 QS world rankings, Cambridge is 2nd for academic reputation but 71st for citations, Manchester is 33rd and 221st, King's College London 47th and 159th, and Edinburgh 24th and 181st. It is not surprising that British universities should perform well in rankings where academic reputation carries a 40% weighting.
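The effect of the weighting is easy to demonstrate. The sketch below uses the QS world ranking weights in force in 2018-19 (40% academic reputation, 10% employer reputation, 20% faculty-student ratio, 20% citations per faculty, 5% each for international faculty and international students); the two indicator profiles are invented for illustration.

```python
# QS world ranking indicator weights as used in 2018-19.
weights = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def qs_total(scores):
    """Weighted sum of indicator scores, each on a 0-100 scale."""
    return sum(weights[k] * scores[k] for k in weights)

# Invented profiles: a reputation-heavy university (strong surveys, modest
# citations) versus a citations-heavy one (the reverse pattern).
reputation_heavy = {"academic_reputation": 99, "employer_reputation": 98,
                    "faculty_student": 95, "citations_per_faculty": 60,
                    "international_faculty": 95, "international_students": 95}
citations_heavy = {"academic_reputation": 70, "employer_reputation": 65,
                   "faculty_student": 80, "citations_per_faculty": 99,
                   "international_faculty": 60, "international_students": 55}

print(qs_total(reputation_heavy))
print(qs_total(citations_heavy))
```

With half the total weight on the two surveys, the reputation-heavy profile comes out comfortably ahead despite a far weaker citations score, which is essentially the Cambridge pattern described above.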
Meanwhile, several American universities have risen in the THE reputation rankings. UCLA has risen from 13th to 9th
Cornell from 23rd to 18th
University of Washington from 34th to 28th
University of Illinois Urbana-Champaign from 36th to 32nd
Carnegie Mellon from 37th to 30th
Georgia Institute of Technology from 48th to 44th.
Some of this is probably the result of a change in the distribution of survey responses. I have already pointed out that the fate of Oxford in the THE survey rankings is tied to the percentages of responses from the arts and humanities. THE have reported that their survey this year had an increased number of responses from computer science and engineering and a reduced number from the social sciences and the humanities. Sure enough, Oxford has slipped slightly while LSE has fallen five places.
The shift to computer science and engineering in the THE survey might explain the improved reputation of Georgia Tech and Carnegie Mellon. There is, I suspect, something else going on and that is the growing obsession of some American universities with reputation management, public relations and rankings, including the hiring of professional consultants.
In contrast, Asian universities have not done so well in the THE reputation rankings.
University of Tokyo has fallen from 11th to 13th place
University of Kyoto from 25th to 27th
Osaka University from 51st to 81st
Tsinghua University is unchanged in 14th place
Peking University is unchanged in 17th
Zhejiang University has fallen from the 51-60 band to 71-80
University of Hong Kong has fallen from 39th to 40th.
All but one of the US universities have fallen in the latest Nature Index, UCLA by 3.1%, University of Washington 1.7%, University of Illinois Urbana-Champaign 12%, Carnegie Mellon 4.8%, Georgia Tech 0.9%.
All but one of the Asian universities have risen in the Nature Index, Tokyo by 9.2%, Kyoto 15.1%, Tsinghua 9.5%, Peking 0.9%, Zhejiang 9.8%, Hong Kong 25.3%.
It looks as though Western and Asian universities are diverging. The former are focussed on branding, reputation, relaxing admission criteria, and searching for diversity. They are increasingly engaged with, or even obsessed with, the rankings.
Asian universities, especially in Greater China and Korea, are less concerned with rankings and public relations and more with academic excellence and research output and impact.
As the university systems diverge it seems that two different sets of rankings are emerging to cater for the academic aspirations of different countries.
Saturday, December 09, 2023
Global Subject Rankings: The Case of Computer Science
Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings. QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered for subject. The methods are different and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.
1. Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.
These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each, and Berkeley and Caltech in one each.
The top five for Computer Science are:
1. University of Oxford
2. Stanford University
3. Massachusetts Institute of Technology
4. Carnegie Mellon University
5. ETH Zurich
Tsinghua is 13th.
2. The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.
According to these rankings China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects, although Harvard, MIT, and Northwestern University lead for seven subjects. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.
Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.
The top five for Computer Science and Engineering are:
1. Massachusetts Institute of Technology
2. Stanford University
3. Tsinghua University
4. Carnegie Mellon University
5. University of California Berkeley
Oxford is 9th.
3. The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and University of Pennsylvania for Humanities.
RUR's narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.
The top five for Computer Science are:
1. National University of Singapore
2. Nanyang Technological University
3. Massachusetts Institute of Technology
4. Huazhong University of Science and Technology
5. University of Electronic Science and Technology of China
Tsinghua is 10th.
Oxford is 47th.
4. The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index, and International research network. At the top they are mostly led by the usual suspects: MIT, Harvard, Stanford, Oxford, and Cambridge.
The top five for Computer Science and Information Systems are:
1. Massachusetts Institute of Technology
2. Carnegie Mellon University
3. Stanford University
4. University of California Berkeley
5. University of Oxford
Tsinghua is 15th.
5. University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.
The top five for Computer Science are:
1. Tsinghua University
2. University of Electronic Science and Technology of China
3. Nanyang Technological University
4. National University of Singapore
5. Xidian University
Oxford is 19th.
6. The US News Best Global Universities can be filtered by subject. They are based on publications, citations, and research reputation.
The top five for Computer Science in 2022 were:
1. Tsinghua University
2. Stanford University
3. Massachusetts Institute of Technology
4. Carnegie Mellon University
5. University of California Berkeley
Oxford was 11th.
7. The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.
The top five for Computer Science are:
1. Nanyang Technological University
2. Tsinghua University
3. University of Electronic Science and Technology of China
4. National University of Singapore
5. Xidian University
Oxford is 111th.
So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.
Saturday, September 24, 2016
The THE World University Rankings: Arguably the Most Amusing League Table in the World
The latest global rankings contain many items that academics would be advised not to read in public places lest they embarrass the family by sniggering to themselves in Starbucks or Nandos.
THE would, for example, have us believe that St. George's, University of London is the top university in the world for research impact as measured by citations. This institution specialises in medicine, biomedical science and healthcare sciences. It does not do research in the physical sciences, the social sciences, or the arts and humanities and makes no claim that it does. To suggest that it is the best in the world across the range of scientific and academic research is ridiculous.
There are several other universities with scores for citations that are disproportionately higher than their research scores, a sure sign that the THE citations indicator is generating absurdity. They include Brandeis, the Free University of Bozen-Bolzano, Clark University, King Abdulaziz University, Anglia Ruskin University, the University of Iceland, and Orebro University, Sweden.
In some cases, it is obvious what has happened. King Abdulaziz University has been gaming the rankings by recruiting large numbers of adjunct faculty whose main function appears to be listing the university as a secondary affiliation in order to collect a share of the credit for publications and citations. The Shanghai rankers have stopped counting secondary affiliations for their highly cited researchers indicator, but KAU is still racking up the points in other indicators and other rankings.
The contention that Anglia Ruskin University is tenth in the world for research impact, equal to Oxford, Princeton, and UC Santa Barbara, and just above the University of Chicago, will no doubt be met with donnish smirks at the high tables of that other place in Cambridge, 31st for citations, although there will probably be less amusement about Oxford being crowned best university in the world.
Anglia Ruskin's output of research is not very high, about a thirtieth of Chicago's according to the Web of Science Core Collection. Its faculty does, however, include one professor who is a frequent contributor to global medical studies with large numbers of authors, although never more than a thousand, and hundreds of citations a year. Single-handedly he has propelled the university into the research stratosphere, since the rest of the university has been generating few citations (there's nothing wrong with that: it's not that sort of place) and so the number of papers by which the normalised citations are divided is very low.
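The mechanism is easy to reproduce with a toy calculation. THE's real indicator is field- and year-normalised, so the sketch below, with invented citation counts, illustrates only the small-denominator effect, not the actual formula.

```python
# Toy illustration of the small-denominator effect: a handful of heavily
# cited multi-author studies dominates a small publication base.
def citations_per_paper(paper_citations):
    return sum(paper_citations) / len(paper_citations)

# Hypothetical small institution: 50 ordinary papers (~2 citations each)
# plus 5 global medical studies drawing ~800 citations each.
small_university = [2] * 50 + [800] * 5

# Hypothetical research giant: 1,500 papers averaging 15 citations.
large_university = [15] * 1500

print(citations_per_paper(small_university))  # roughly 74.5
print(citations_per_paper(large_university))  # 15.0
```

Dividing the citations of a few mega-papers by a small paper count yields an average impact several times that of a far larger producer, which is essentially the Anglia Ruskin story.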
The THE citations methodology is badly flawed. That university heads give any credence to rankings that include such ludicrous results is sad testimony to the decadence of the modern academy.
There are also many universities that have moved up or down by a disproportionate number of places. These include:
Peking University rising from 42nd to 29th
University of Maryland at College Park rising from 117th to 67th
Purdue University rising from 113th to 70th
Chinese University of Hong Kong rising from 138th to 76th
RWTH Aachen rising from 110th to 78th
Korean Advanced Institute of Science and Technology rising from 148th to 89th
Vanderbilt University falling from 87th to 108th
University of Copenhagen falling from 82nd to 120th
Scuola Normale Pisa falling from 112th to 137th
University of Cape Town falling from 120th to 148th
Royal Holloway, University of London falling from 129th to 173rd
Lomonosov Moscow State University falling from 161st to 188th.
The point cannot be stressed too strongly that universities are large and complex organisations. Short of major restructuring, they do not change sufficiently in 12 months or less to produce movements such as these. Such instability can only occur through the entry into the rankings of universities with attributes different from the established ones, which shifts the means from which standardised scores are derived, or through significant methodological changes.
There have in fact been significant changes to the methodology this year, although perhaps not as substantial as in 2015. First, books and book chapters are included in the count of publications and citations, an innovation pioneered by US News in their Best Global Universities. Almost certainly this has helped English-speaking universities with a comparative advantage in the humanities and social sciences, although THE's practice of bundling indicators together makes it impossible to say exactly how much. It would also work to the disadvantage of institutions such as Caltech that are comparatively less strong in the arts and humanities.
Second, THE have used a modest version of fractional counting for papers with more than a thousand authors. Last year they were not counted at all. This means that universities that have participated in mega-papers such as those associated with the Large Hadron Collider will get some credit for citations of those papers although not as much as they did in 2014 and before. This has almost certainly helped a number of Asian universities that have participated in such projects but have a generally modest research output. It might have benefitted some universities in California such as UC Berkeley.
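The three treatments of mega-papers described here can be compared with a small sketch. The "fractional" formula below is the simplest equal-shares version; THE's actual scheme differs in detail, and the paper's figures are invented.

```python
# Credit for a hypothetical mega-paper with 2,000 authors and 1,500
# citations under three counting schemes.
def credit(citations, n_authors, scheme):
    if scheme == "full":
        return citations              # pre-2015: full credit to every affiliation
    if scheme == "zero":
        return 0.0                    # 2015-16: mega-papers excluded entirely
    if scheme == "fractional":
        return citations / n_authors  # now: credit divided among the authors
    raise ValueError(scheme)

for scheme in ("full", "zero", "fractional"):
    print(scheme, credit(1500, 2000, scheme))
```

Under equal-shares fractional counting a participating institution gets a sliver of credit rather than the full citation count or nothing at all, which is why Large Hadron Collider participants regain some, but not all, of their pre-2015 boost.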
Third, THE have combined the results of the academic reputation survey conducted earlier this year with that used in the 2015-16 rankings. Averaging reputation surveys is a sensible idea, already adopted by QS and US News in their global rankings, but one that THE has avoided until now.
This year's survey saw a very large reduction in the number of responses from researchers in the arts and humanities and a very large increase, for reasons unexplained, in the number of responses from business studies and the social sciences, separated now but combined in 2015.
Had the responses for 2016 alone been counted there might have been serious consequences for UK universities, relatively strong in the humanities, and a boost for East Asian universities, relatively strong in business studies. Combining the two surveys would have limited the damage to British universities and slowed down the rise of Asia to media-acceptable proportions.
One possible consequence of these changes is that UC Berkeley, eighth in 2014-15 and thirteenth in 2015-16, is now, as predicted here, back in the top ten. Berkeley is host for the forthcoming THE world summit although that is no doubt entirely coincidental.
The overall top place has been taken by Oxford to the great joy of the vice-chancellor who is said to be "thrilled" by the news.
I do not want to be unfair to Oxford but the idea that it is superior to Harvard, Princeton, Caltech or MIT is nonsense. Its strong performance in the THE WUR is in large measure due to the over-emphasis in these tables on reputation, income and a very flawed citations indicator. Its rise to first place over Caltech is almost certainly a result of this year's methodological changes.
Let's look at Oxford's standing in other rankings. The Round University Ranking (RUR) uses Thomson Reuters data just like THE did until two years ago. It has 12 of the indicators employed by THE and eight additional ones.
Overall Oxford was 10th, up from 17th in 2010. In the teaching group of five indicators Oxford was in 28th place. For specific indicators in that group the best performance was in teaching reputation (6th) and the worst in academic staff per bachelor's degrees (203rd).
In Research it was 20th, with places ranging from 6th for research reputation to 206th for doctoral degrees per admitted PhD. It was 5th for International Diversity and 12th for Financial Sustainability.
The Shanghai ARWU rankings have Oxford in 7th place and Webometrics in 10th (9th for Google Scholar Citations).
THE is said to be trusted by the great and the good of the academic world. The latest example is the Norwegian government including performance in the THE WUR as a criterion for overseas study grants. That trust seems largely misplaced. When the vice-chancellor of Oxford University is thrilled by a ranking that puts the university on a par for research impact with Anglia Ruskin then one really wonders about the quality of university leadership.
To conclude my latest exercise in malice and cynicism (thank you ROARS), here is a game to amuse international academics.
Ask your friends which university in their country is the leader for research impact and then tell them who THE thinks it is.
Here are THE's research champions, according to the citations indicator:
Argentina: National University of the South
Australia: Charles Darwin University
Brazil: Universidade Federal do ABC (ABC refers to its location, not the courses offered)
Canada: University of British Columbia
China: University of Science and Technology of China
France: Paris Diderot University: Paris 7
Germany: Ulm University
Ireland: Royal College of Surgeons
Japan: Toyota Technological Institute
Italy: Free University of Bozen-Bolzano
Russia: ITMO University
Turkey: Atilim University
United Kingdom: St George's, University of London.
Wednesday, February 28, 2024
Comments on the THE Reputation Rankings
Times Higher Education (THE) has announced the latest edition of its reputation ranking. The scores for this ranking will be included in the forthcoming World University Rankings and THE's other tables, where they will have a significant or very significant effect. In the Japan University Ranking they will get an 8% weighting, and in the Arab University Ranking, 41%. Why THE gives such a large weight to reputation in the Arab rankings is a bit puzzling.
The ranking is based on a survey of researchers "who have published in academic journals, have been cited by other researchers and who have been published within the last five years," presumably in journals indexed in Scopus.
Until 2022 the survey was run by Elsevier, but since then it has been brought in-house.
The top of the survey tells us little new. Harvard is first and is followed by the rest of the six big global brands: MIT, Stanford, Oxford, Cambridge, and Berkeley. Leading Chinese universities are edging closer to the top ten.
For most countries or regions, the rank order is uncontroversial: Melbourne is the most prestigious university in Australia, Toronto in Canada, Technical University of Munich in Germany, and a greyed-out Lomonosov Moscow State University in Russia. However, there is one region where the results are a little eyebrow-raising.
As THE has been keen to point out, there has been a remarkable improvement in the scores for some universities in the Arab region. This in itself is not surprising. Arab nations in recent years have invested massive amounts of money in education and research, recruited international researchers, and begun to rise in the research-based rankings such as Shanghai and Leiden. It is to be expected that some of these universities should start to do well in reputation surveys.
What is surprising is which Arab universities have now appeared in the THE reputation ranking. Cairo University, the American University in Beirut, Qatar University, United Arab Emirates University, KAUST, and King Abdulaziz University have achieved some success in various rankings, but they do not make the top 200 here.
Instead, we have nine universities: the American University in the Middle East, Prince Mohammed Bin Fahd University, Imam Mohammed Ibn Saud Islamic University, Qassim University, Abu Dhabi University, Zayed University, Al Ain University, Lebanese University, and Beirut Arab University. These are all excellent and well-funded institutions by any standards, but it is hard to see why they should be considered to be among the world's top 200 research-orientated universities.
None of these universities makes it into the top 1,000 of the Webometrics ranking or the RUR reputation rankings. A few are found in the US News Best Global Universities, but none get anywhere near the top 200 for world or regional reputation. They do appear in the QS world rankings but always with a low score for the academic survey.
THE accepts that survey support for these universities comes disproportionately from within the region, in marked contrast to US institutions, and claims that Arab universities have established a regional reputation but have yet to sell themselves to the rest of the world.
That may be so, but again, there are several Arab universities that have established international reputations. Cairo University is in the top 200 in the QS academic survey, and the RUR reputation ranking, and the American University of Beirut is ranked 42nd for regional research reputation by USN. They are, however, absent from the THE reputation ranking.
When a ranking produces results that are at odds with other rankings and with accessible bibliometric data, then a bit of explanation is needed.
Sunday, June 13, 2021
The Remarkable Revival of Oxford and Cambridge
There is nearly always a theme for the publication of global rankings. Often it is the rise of Asia, or parts of it. For a while it was the malign grasp of Brexit which was crushing the life out of British research or the resilience of American science in the face of the frenzied hostility of the great orange beast. This year it seems that the latest QS world rankings are about the triumph of Oxford and other elite UK institutions and their leapfrogging their US rivals. Around the world, quite a few other places are also showcasing their splendid achievements.
In the recent QS rankings Oxford has moved up from overall fifth to second place and Cambridge from seventh to third while University College London, Imperial College London, and Edinburgh have also advanced. No doubt we will soon hear that this is because of transformative leadership, the strength that diversity brings, working together as a team or a family, although I doubt whether any actual teachers or researchers will get a bonus or a promotion for their contributions to these achievements.
But was it leadership or team spirit that pushed Oxford and Cambridge into the top five? That is very improbable. Whenever there is a big fuss about universities rising or falling significantly in the rankings in a single year it is a safe bet that it is the result of an error, the correction of an error, or a methodological flaw or tweak of some kind.
Anyway, this year's Oxbridge advances had as much to do with leadership, internationalization, or reputation as goodness had with Mae West's diamonds. It was entirely due to a remarkable rise for both places in the score for citations per faculty, Oxford from 81.3 to 96, and Cambridge from 69.2 to 92.1. There was no such change for any of the other indicators.
Normally, there are three ways in which a university can rise in QS's citations indicator. One is to increase the number of publications while maintaining the citation rate. Another is to improve the citation rate while keeping output constant. The third is to reduce the number of faculty physically or statistically.
None of these seem to have happened at Oxford and Cambridge. The number of publications and citations has been increasing but not sufficiently to cause such a big jump. Nor does there appear to have been a drastic reduction of faculty in either place.
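The three levers amount to simple arithmetic on the citations-per-faculty ratio; the numbers below are invented for illustration.

```python
# Citations per faculty: the indicator behind the Oxbridge jump.
def citations_per_faculty(citations, faculty):
    return citations / faculty

base = citations_per_faculty(100_000, 2_000)         # starting position: 50.0
more_citations = citations_per_faculty(120_000, 2_000)  # levers 1 and 2: more papers,
                                                        # or a better citation rate
fewer_staff = citations_per_faculty(100_000, 1_600)     # lever 3: fewer reported faculty

print(base, more_citations, fewer_staff)
```

Either a 20% rise in citations or a 20% cut in reported faculty lifts the ratio substantially; the point of the paragraphs above is that none of these appears to have happened at Oxford or Cambridge.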
In any case it seems that Oxbridge is not alone in its remarkable progress this year. For citations, ETH Zurich rose from 96.4 to 99.8, University of Melbourne from 75 to 89.7, National University of Singapore from 72.9 to 90.6, and Michigan from 58 to 70.5. At the top levels of these rankings nearly everybody is rising except for MIT, which has the top score of 100, although it is noticeable that the nearer the top, the smaller the increase.
It is theoretically possible that this might be the result of a collapse of the raw scores of citations front runner MIT which would raise everybody else's scores if it still remained at the top but there is no evidence of either a massive collapse in citations or a massive expansion of research and teaching staff.
But then as we go to the other end of the ranking we find universities' citations scores falling, University College Cork from 23.4 to 21.8, Universitas Gadjah Mada from 1.7 to 1.5, UCSI University Malaysia from 4.4 to 3.6, American University in Cairo from 5.7 to 4.2.
It seems there is a bug in the QS methodology. The indicator scores that are published by QS are not raw data but standardized scores based on standard deviations from the mean. The mean score is set at fifty and the top score at one hundred. Over the last few years the number of ranked universities has been increasing, and the new entrants tend to perform less well than the established ones, especially for citations. In consequence, the mean number of citations per faculty has declined, and therefore universities scoring above the mean will see their standardized scores, which are derived from the standard deviation from the mean, increase. If this interpretation is incorrect I am very willing to be corrected.
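The suspected bug can be reproduced in a few lines. The 50-plus-z scaling below is a simplified stand-in for QS's actual transformation (an assumption), but the direction of the effect does not depend on the details.

```python
import statistics

def standardise(cohort, value):
    """Score a raw value as 50 + 10 * z against the cohort: a simplified
    version of QS's mean-fifty scaling (the exact transform is assumed)."""
    mu = statistics.mean(cohort)
    sigma = statistics.pstdev(cohort)
    return 50 + 10 * (value - mu) / sigma

# An incumbent with 80 citations per faculty in an established cohort.
cohort = [80, 70, 60, 50, 40, 30]
before = standardise(cohort, 80)

# The same raw value after six low-scoring newcomers join the ranking.
expanded = cohort + [5] * 6
after = standardise(expanded, 80)

# The incumbent's standardized score rises with no change in its raw data.
print(round(before, 1), round(after, 1))
```

Adding weak newcomers drags the mean down and widens the incumbent's distance above it, so its published score inflates even though nothing about the incumbent has changed, which is exactly the pattern of rising scores at the top and falling scores at the bottom described above.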
This has an impact on the relative positions of Oxbridge and leading American universities. Oxford and Cambridge rely on their scores in the academic and employer surveys and on international faculty and students to stay in the top ten. Compared to Harvard, Stanford, and MIT they do not perform well for quantity or quality of research. So the general inflation of citations scores gives them more of a boost than the US leaders, and their total scores rise accordingly.
It is likely that Oxford and Cambridge's moment of glory will be brief, since QS will have to do some recentering in the next couple of years to prevent citation indicator scores bunching up in the high nineties. The two universities will then fall again, although that will probably not be attributed to a sudden collapse of leadership or a failure to work as a team.
It will be interesting to see if any of this year's rising universities will make an announcement that they don't really deserve any praise for their illusory success in the rankings.