Sunday, January 22, 2017

What's Wrong with Ottawa?

The University of Ottawa (UO) has been a great success over the last few years, especially in research. In 2004 it was around the bottom third of the 202-300 band in the Shanghai Academic Ranking of World Universities. By 2016 it had reached 201st place, although the Shanghai rankers still recorded it as being in the 201-300 band. Another signing of a highly cited researcher, another paper in Nature, or a dozen more papers listed in the Science Citation Index, and it would have made a big splash by breaking into the Shanghai top 200.

The Shanghai rankings have, apart from recent problems with the Highly Cited Researchers indicator, maintained a stable methodology, so this is a very solid and remarkable achievement.

A look at the individual components of these rankings shows that UO has improved steadily in both the quantity and the quality of its research. The score for publications rose from 37.8 to 44.4 between 2004 and 2016, from 13.0 to 16.1 for papers in Nature and Science, and from 8.7 to 14.5 for highly cited researchers (Harvard is 100 in all cases). For productivity (the other five indicators divided by the number of faculty) the score went from 13.2 to 21.5 (Caltech is 100).
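
To make those "Harvard = 100" and "Caltech = 100" figures concrete, here is a minimal sketch of the scaling involved, assuming the simplest case in which an indicator score is just a percentage of the top scorer's value; ARWU's actual method applies further statistical adjustments, and all the raw numbers below are invented.

```python
# Illustrative only: a simplified version of how a ranking indicator can be
# rescaled so that the top institution (Harvard for the raw ARWU indicators,
# Caltech for per-capita productivity) gets 100. Raw figures are invented.

def indicator_score(value, top_value):
    """Express an institution's raw value as a percentage of the leader's."""
    return 100 * value / top_value

def productivity_score(weighted_sum, staff, top_per_capita):
    """Per-capita productivity: weighted sum of the other indicators divided
    by academic staff, again rescaled so the leader scores 100."""
    return 100 * (weighted_sum / staff) / top_per_capita

print(round(indicator_score(4_400, 16_000), 1))          # 27.5, hypothetical
print(round(productivity_score(40.0, 1_300, 0.15), 1))   # 20.5, hypothetical
```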

It is well known that the Shanghai rankings are entirely about research and ignore the arts and humanities. The Russian Round University Rankings (RUR), however, get their data from the same source as Times Higher Education (THE) did until two years ago, include data from the arts and humanities, and put a greater emphasis on teaching-related indicators.

In the RUR rankings, UO rose from 263rd place in 2010 to 211th overall in 2015, from 384th to 378th on the five combined teaching indicators, and from 177th to 142nd on the five combined research indicators. Ottawa is doing well for research and creeping up a bit on teaching-related criteria, although the relationship between these and actual teaching may be rather tenuous.

RUR did not rank UO in 2016. I cannot find any specific reason, but it is possible that the university did not submit data for the Institutional Profiles at Research Analytics.

Just for completeness, Ottawa is also doing well in the Webometrics ranking, which is mainly about web activity but does include a measure of research excellence. It is in the 201st spot there as well.

It seems, however, that this is not good enough. In September, according to the Fulcrum, the university's student newspaper, there was a meeting of the Board of Governors that discussed not the good results from RUR, the Shanghai Ranking, and Webometrics, but a fall in the THE World University Rankings from the 201-250 band in 2015-16 to the 251-300 band in 2016-17. One board member even suggested taking THE to court.

So what happened to UO in last year's THE world rankings? The only area where it fell was Research, from 36.7 to 21.0. In the other indicators or indicator groups (Teaching, Industry Income, International Orientation, and Research Impact, i.e. citations) it got the same score or improved.

But this is not very helpful. The research group of indicators, which has a weighting of 30%, actually contains three components: research reputation, research income, and research productivity (publications), the last two of which are scaled against the number of academic staff. A fall in the research score might therefore be caused by a fall in the score for research reputation, a decline in reported research income, a decline in the number of publications, a rise in the number of academic staff, or some combination of these.
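
To see why the arithmetic points where it does, here is a rough sketch of how a pillar score combines its components. The 18/6/6 split reflects THE's published weights for research reputation, research income per staff and publications per staff; the linear combination and the component scores themselves are my own simplifying assumptions, since THE actually standardises each indicator before weighting.

```python
# A rough, illustrative model of THE's Research pillar (30% of the total):
# research reputation (18%), research income per staff (6%) and
# publications per staff (6%). THE standardises each indicator before
# weighting; here I just use toy 0-100 scores to show the arithmetic.

def research_pillar(reputation, income_per_staff, pubs_per_staff):
    """Combine the three component scores using THE's published weights,
    rescaled so the pillar itself runs 0-100."""
    return (18 * reputation + 6 * income_per_staff + 6 * pubs_per_staff) / 30

# Hypothetical component scores: not UO's actual figures.
print(research_pillar(reputation=35, income_per_staff=40, pubs_per_staff=40))  # 37.0
print(research_pillar(reputation=15, income_per_staff=37, pubs_per_staff=40))  # 24.4
# A large fall in the pillar is easiest to produce through the reputation
# component, because it carries 18 of the 30 points.
```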

The fall in UO's research score could not have been caused by more faculty: the number of full-time faculty was 1,284 in 2012-13 and 1,281 in 2013-14.

There was a fall of 7.6% in Ottawa's "sponsored research income" between 2013 and 2014, but I am not sure whether that is enough to produce such a large decline in the combined research indicators, since research income carries only six of the thirty points in the research group.

My suspicion (and until THE disaggregate their indicators it cannot be anything more) is that the problem lies with the research reputation survey, which carries an 18% weighting. Between 2015 and 2016 the percentage of survey respondents from the arts and humanities was significantly reduced while that from the social sciences and business studies was increased. This would work to the disadvantage of English-speaking universities, including those in Canada, that are relatively strong in the humanities, and to the advantage of Asian universities that are relatively strong in business studies. UO, for example, is ranked highly by Quacquarelli Symonds (QS) for English, Linguistics, and Modern Languages, but not for Business and Management Studies or Accounting and Finance.
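
If that suspicion is right, the mechanism would look something like the toy calculation below: even if the opinions of academics in each field are unchanged, shifting the subject mix of respondents moves the total vote count. All the vote shares and respondent numbers are invented; this is only a sketch of the general effect, not THE's actual survey processing.

```python
# Illustrative only: how a shift in the subject mix of survey respondents can
# move a university's reputation score even if opinion within each field is
# unchanged. Vote shares and respondent counts are invented.

def reputation_votes(vote_share_by_field, respondents_by_field):
    """Total votes = sum over fields of (respondents in that field *
    share of them who name the university)."""
    return sum(vote_share_by_field[f] * respondents_by_field[f]
               for f in respondents_by_field)

# Hypothetical: strong in arts and humanities, weak in business.
votes = {"arts_humanities": 0.04, "social_sciences_business": 0.01}

survey_2015 = {"arts_humanities": 16_000, "social_sciences_business": 9_000}
survey_2016 = {"arts_humanities": 9_000, "social_sciences_business": 16_000}

print(reputation_votes(votes, survey_2015))  # 730.0
print(reputation_votes(votes, survey_2016))  # 520.0: same opinions, lower score
```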

This might have something to do with THE wanting to get enough respondents for business studies after it had been taken out of the social sciences and given its own section. If that is the case, Ottawa might get a pleasant surprise this year, since THE are now treating law and education as separate fields and may have to find more respondents to get around the problem of small sample sizes. If so, this could help UO, which appears to be strong in those subjects.

It seems, according to another Fulcrum article, that the university is being advised by Daniel Calto from Elsevier. He correctly points out that citations had nothing to do with this year's decline. He then talks about the expansion in the size of the rankings, with newcomers pushing in ahead of UO. It is unlikely that this in fact had a significant effect on the university, since most of the newcomers would probably have entered below the 300th position and since there has been no effect on its scores for teaching, international orientation, industry income or citations (research impact).

I suspect that Calto may have been incorrectly reported. Although he says it is unlikely that citations had anything to do with the decline, he is reported later in the article to have said that THE's exclusion of kilo-papers (those with more than 1,000 authors) affected Ottawa. But the kilo-papers were excluded in 2015, so that could not have contributed to the fall between 2015 and 2016.

The Fulcrum article then discusses how UO might improve. M'hamed Aisati, a vice-president at Elsevier, suggests getting more citations. This is frankly not very helpful. The THE citations indicator is field-normalised, so more citations are meaningless unless they are concentrated in exactly the right fields, that is, unless they push papers above the world average for their field and year. And if the extra citations are accompanied by more publications, the effect could even be counter-productive, since each additional paper enlarges the denominator.
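
A toy example of field-normalised impact, the kind of measure underlying THE's citations indicator, shows why. The world averages and citation counts below are invented, and the real calculation (done on Scopus data) also normalises by year and document type, so this is only a sketch of the general behaviour.

```python
# Illustrative only: a toy field-normalised citation impact. Each paper is
# scored as its citations divided by the world average for its field; the
# institution's score is the mean over its papers. All numbers are invented.

WORLD_AVERAGE = {"medicine": 12.0, "linguistics": 2.5}

def impact(papers):
    """papers: list of (field, citation_count) tuples."""
    return sum(cites / WORLD_AVERAGE[field] for field, cites in papers) / len(papers)

base = [("medicine", 18), ("linguistics", 5)]   # both above the world average
print(round(impact(base), 2))                    # 1.75

# Adding papers that merely match the world average pulls the score down
# towards 1, even though total citations have gone up.
padded = base + [("medicine", 12)] * 10
print(round(impact(padded), 2))                  # 1.12
```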

If UO is concerned about a genuine improvement in research productivity and quality, there are now several global rankings that are quite reasonable. There are even rankings that attempt to measure things like innovation, teaching resources, environmental sustainability and web activity.

The THE rankings are uniquely opaque in that they hide the scores for specific indicators. They are extremely volatile, and they depend far too much on dodgy data from institutions and on reputation surveys that can be highly unstable. Above all, the citations indicator is a hilarious generator of absurdity.

The University of Ottawa, and other Canadian universities, would be well advised to forget about the THE rankings or at least not take them so seriously.

