Monday, December 24, 2007

Cambridge and Harvard

The THES-QS rankings can be viewed as a collection of complex interweaving narratives. There is the rise of China and its diaspora, the successful response of Australian universities to financial crisis, the brave attempts of Africa, spearheaded by the University of Cape Town, to break into the top 200.

The most interesting narrative is that of British universities -- Oxford, Cambridge and Imperial and University Colleges, London -- steadily coming closer to Harvard and pulling ahead of Princeton, Caltech and the rest.

This particular narrative requires rather more suspension of disbelief than most. By all accounts, including the Shanghai rankings and THES’s own count of citations per faculty, the research record of Cambridge and Oxford has been less than spectacular for several years.

Until this year Cambridge’s apparent near equality with Harvard was largely the result of its performance on QS’s survey of academic opinion, the so-called peer review. Since this has such an astonishingly low response rate, since it is noticeably biased against the US, since its relationship with research proficiency measured by citations per faculty or per paper is very limited, it should not be taken seriously.

This year methodological changes mean that the differences between Cambridge and Harvard on most measures are virtually obliterated. Both universities get 100 or 99 for the “peer review”, employer review and student faculty ratio. Both get 91 for international students.

Harvard stays ahead of Cambridge because of a much better performance on citations per faculty. I thought it might be interesting to see how this margin was achieved.

QS is now using the Scopus database, for which a 30-day free trial is available. THES states that the consultants counted the number of citations of papers published between 2002 and 2006 and then divided the total by the number of faculty. I have tried to reproduce QS's scores for Cambridge and Harvard.

First, here is the number of papers published by authors with an affiliation to “Cambridge University” between 2002 and 2006 and the number of citations of those papers. The number of documents in the Scopus database is increasing all the time so a count done today would yield different results. These numbers are from two weeks ago.

CAMBRIDGE (“Cambridge University”), 2002-2006

Life sciences: 7,614 documents, 116,875 citations
Health sciences: 4,406 documents, 65,211 citations
Physical sciences: 11,514 documents, 100,225 citations
Social sciences: 2,636 documents, 24,292 citations

Total: 26,170 documents, 306,603 citations

Using the FTE faculty figure of 3,765 provided by QS on their website, we have roughly 81 citations per faculty (306,603 divided by 3,765).

I noticed that a number of authors gave their affiliation as “University of Cambridge” instead. Adding these papers contributes a further 26,710 citations, for a total of 333,313 citations and 89 citations per faculty.
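The arithmetic can be checked in a few lines. This is just a sketch of the division described above, using the Scopus snapshot totals and the 3,765 FTE figure quoted in this post:

```python
# Scopus citation totals for Cambridge, 2002-2006 (snapshot values from the post)
citations_cambridge_univ = 116875 + 65211 + 100225 + 24292  # "Cambridge University" affiliation
citations_univ_of_cambridge = 26710                         # "University of Cambridge" affiliation
fte_faculty = 3765                                          # QS's FTE faculty figure

narrow = citations_cambridge_univ / fte_faculty
broad = (citations_cambridge_univ + citations_univ_of_cambridge) / fte_faculty

print(round(narrow))  # 81 -- citations per faculty, narrow affiliation only
print(round(broad))   # 89 -- both affiliation spellings included
```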

Now for Harvard. Searching the Scopus database reveals the following totals of papers and citations for “Harvard University”.

HARVARD ("Harvard University"), 2002-2006

Life sciences: 4,003 documents, 79,663 citations
Health sciences: 2,577 documents, 47,486 citations
Physical sciences: 6,429 documents, 91,154 citations
Social sciences: 3,686 documents, 48,844 citations

Total: 16,695 documents, 267,147 citations

I suspect that most observers would consider Cambridge's superiority to Harvard in number of publications and citations indicative more of the bias of the database than anything else.


If we use QS’s faculty headcount figure for Harvard of 3,389 and assume that 8 per cent of these are part-timers with a quarter-time teaching load, then we have 3,167 FTE faculty. This would give us 84 citations per faculty -- slightly better than Cambridge if citations of "University of Cambridge" publications are excluded, and somewhat worse if they are included.
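The same check for Harvard, as a sketch: the 8 per cent part-time share and quarter-time load are the assumptions stated above, and the exact FTE adjustment is not known, so the formula below is only one plausible reading of it.

```python
headcount = 3389        # QS's faculty headcount figure for Harvard
part_time_share = 0.08  # assumed share of part-timers
part_time_load = 0.25   # assumed quarter-time teaching load

# Full-timers count as 1.0 FTE, part-timers as 0.25 FTE
fte = headcount * (1 - part_time_share) + headcount * part_time_share * part_time_load
harvard_citations = 79663 + 47486 + 91154 + 48844  # 267,147 total

print(round(fte))                      # about 3,186 by this exact formula
print(round(harvard_citations / fte))  # 84 citations per faculty
```

This exact formula gives about 3,186 FTE rather than 3,167, so a slightly different adjustment was presumably used; either figure rounds to 84 citations per faculty.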


The problem, though, is that QS gives Harvard a score of 96 for citations per faculty and Cambridge a score of 83. The only plausible explanation I can think of for Harvard doing so much better despite having fewer citations is that a smaller faculty figure was used to calculate Harvard's citations per faculty than was used to calculate its student faculty ratio. The Harvard web site refers to "about [sic] 2,497 non-medical faculty", and QS's school profile of Harvard refers to "more than 2,000 faculty". I suspect that this smaller number was used to calculate the citations per faculty score while the larger number was used to calculate the student faculty ratio. Had the former been used for both criteria, then Cambridge and Harvard would have been virtually equal for citations, and Cambridge would have moved into the lead by virtue of a better international faculty score.
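As a rough illustration of why the choice of divisor matters so much: the 2,497 figure below is the one quoted from Harvard's web site, and whether QS actually used it is only the hypothesis being floated here.

```python
harvard_citations = 267147
cambridge_citations = 333313  # both Cambridge affiliation spellings, from above

rate_large_faculty = harvard_citations / 3167  # FTE figure consistent with the student faculty ratio
rate_small_faculty = harvard_citations / 2497  # hypothetical non-medical faculty figure
cambridge_rate = cambridge_citations / 3765

print(round(rate_large_faculty))  # 84 -- roughly level with Cambridge
print(round(rate_small_faculty))  # 107 -- comfortably ahead of Cambridge
print(round(cambridge_rate))      # 89
```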

There may be some other explanation. If so, I would be glad to hear it.

If this is what happened, it would be interesting to know whether it was simply another run-of-the-mill error -- that ubiquitous junior staff member using two different faculty figures to calculate the two components -- or a cynical ploy to prevent Cambridge from moving into the lead too early.


2 comments:

Anonymous said...

Richard,

If you are aware of the Taiwan Performance Rankings of Scientific Papers for World Universities, then what's your assessment?

I look forward to your blog. See http://www.heeact.edu.tw/ranking/index.htm

Anonymous said...

It's kind of hard to define the ranking of universities based on just a few criteria... perhaps the greatest influence is whether or not the university has a public endowment... but I really think that a measure of universities' success is what type of people they produce... any comment?

http://www.worldbest-universities.blogspot.com