Friday, November 17, 2017

Another global ranking?

In response to a suggestion by Hee Kim Poh of Nanyang Technological University, I have had a look at the Worldwide Professional University Rankings, which appear to be linked to "Global World Communicator" and the "International Council of Scientists" and may be based in Latvia.

There is a methodology page but it does not include essential information. One indicator is "number of publications to number of academic staff" but there is nothing about how either of these is calculated or where the data come from. There is a reference to a survey of members of the International Council of Scientists but nothing about the wording of the survey, the date it was conducted, the distribution of respondents or the response rate.

Anyway, here is the introduction to the methodology:

"The methodology of the professional ranking of universities is based on comparing universities and professional evaluation by level of proposed training programs (degrees), availability and completeness of information on activities of a university, its capacity and reputation on a national and international levels. Main task is to determine parameters and ratios needed to assess quality of the learning process and obtained specialist knowledge. Professional formalized ranking system based on a mathematical calculation of the relation of parameters of the learning process characterizing quality of education and learning environment. Professional evaluation criteria are developed and ranking is carried out by experts of the highest professional qualification in relevant fields - professors of universities, specialists of the highest level of education, who have enough experience in teaching and scientific activities. Professional rating of universities consists of three components.. "

The top five universities are 1. Caltech, 2. Harvard, 3. MIT, 4. Stanford, 5. ETH Zurich.

Without further information, I do not think that this ranking is worth further attention.


http://www.cicerobook.com/en/ranks

Wednesday, November 15, 2017

Rankings Calendar: QS BRICS University Rankings

The QS BRICS (Brazil, Russia, India, China, South Africa) university rankings will be announced on November 23 at the QS-APPLE conference in Taiwan.


China overtakes USA in supercomputing

The website TOP500 keeps track of the world's most powerful computers. Six months ago the USA had 169 supercomputers in the top 500 and China 160. Now China has 202 and the USA 143.

They are followed by Japan with 35, Germany 20, France 18 and the UK 15.

There are four supercomputers in India, four in the Middle East (all in Saudi Arabia), one in Latin America (Mexico) and one in Africa (South Africa).



Tuesday, November 14, 2017

The closing gap: When will China overtake the USA in research output?

According to the Scopus database, China produced 387,475 articles in 2016 and the USA 409,364, a gap of 21,889.

To be precise, there were 387,475 articles with at least one author affiliated to a Chinese university or research center and 409,364 with at least one author affiliated to an American university or research center.

So far this year there have been 346,425 articles with Chinese affiliations and 352,275 with US affiliations.

The gap is now 5,850 articles.

I think it is safe to say that at some point early next year the gap will close and that China will then pull ahead of the USA.
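For what it is worth, the arithmetic behind that guess can be sketched as follows. The projection method here is mine, a crude linear extrapolation, not anything offered by Scopus:

```python
# Scopus article counts quoted above (papers with at least one author
# carrying the relevant national affiliation); 2017 figures are year to date.
us_2016, china_2016 = 409_364, 387_475
us_2017, china_2017 = 352_275, 346_425

gap_2016 = us_2016 - china_2016   # 21,889 for the whole of 2016
gap_2017 = us_2017 - china_2017   # 5,850 so far this year

# Crude linear extrapolation: if the gap keeps narrowing at this rate,
# it closes well within a year. (Comparing a full year with a partial
# year is rough, but the direction of travel is clear.)
annual_narrowing = gap_2016 - gap_2017
years_to_close = gap_2017 / annual_narrowing
print(gap_2016, gap_2017, round(years_to_close, 2))
```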

Some caveats. A lot of those articles are routine stuff and not very significant. For a while, the US may continue to do better in high-impact research as measured by citations. Also, US universities contribute more leaders of research projects.

On the other hand, I suspect that many of the researchers listed as having American affiliations did their undergraduate degrees or secondary education in China.

And if we counted Hong Kong as part of China, then the gap would already have been closed.

Sunday, November 05, 2017

Ranking debate: What should Malaysia do about the rankings?


A complicated relationship

Malaysia has had a complicated relationship with global university rankings. There was a moment back in 2004 when the first Times Higher Education Supplement-Quacquarelli Symonds (THES-QS) world rankings put the country's flagship, Universiti Malaya (UM), in the top 100. That was the result of an error, one of several QS made in its early days. Over the following years UM went down and up in the rankings, but generally trended upwards, with other Malaysian universities following behind. This year it is 114th in the QS world rankings and the top 100 seems in sight once again.

There has been a lot of debate about the quality of the various ranking systems, but it does seem that UM and some other universities have been steadily improving, especially with regard to research, although, as the recent Universitas 21 report shows, output and quality are still lagging behind the provision of resources.  

There is, however, an unfortunate tendency in many places, including Malaysia, for university rankings to get mixed up with local politics. A good ranking performance is proclaimed a triumph by the government and a poor one is deemed by the opposition to be punishment for failed policies.

QS rankings criticised

Recently Ong Kian Ming, a Malaysian opposition MP, said that it was a mistake for the government to use the QS world rankings as a benchmark to measure the quality of Malaysian universities and that the ranking performance of UM and other universities is not a valid measure of quality.

"Serdang MP Ong Kian Ming today slammed the higher education ministry for using the QS World University Rankings as a benchmark for Malaysian universities.
In a statement today, the DAP leader called the decision “short-sighted” and “faulty”, pointing out that the QS rankings do not put much emphasis on the criteria of research output.

According to the QS World University Rankings  for 2018, released on June 8, five Malaysian varsities were ranked in the top 300, with Universiti Malaya (UM) occupying 114th position."

The article went on to say that:


"However, Ong pointed to the Times Higher Education (THE) World University Rankings for 2018, which he said painted Malaysian universities in a different light.

According to the THE rankings, which were released earlier this week, none of Malaysia’s universities made it into the top 300."



Ong suggests that they should rely on locally developed measures.

“Instead of being ‘obsessed’ with the ranking game, he added, the ministry should work to improve the existing academic indicators and measures which have been developed locally by the ministry and the Malaysian Qualifications Agency to assess the quality of local public and private universities.”

Multiplication of rankings

It is certainly not a good idea for anyone to rely on any single ranking. There are now over a dozen global rankings, and several regional ones, that assess universities according to a variety of criteria. Universities in Malaysia and elsewhere could make more use of these rankings, some of which are technically much better than the well-known big three or four: QS, THE, the Shanghai Academic Ranking of World Universities (ARWU) and sometimes the US News Best Global Universities.

Dr. Ong is also quite right to point out that the QS rankings have methodological flaws. However, the THE rankings are not really any better, and they are certainly not superior in the measurement of research quality. They also have the distinctive attribute that 11 of their 13 indicators are not presented separately but bundled into three groups, so that the public cannot, for example, tell whether a good score for research is the result of an increase in research income, more publications, an improvement in reputation for research, or a reduction in the number of faculty.

The important difference between the QS and THE rankings is not that the latter are focussed on research. QS's academic survey is specifically about research, and its faculty-student ratio, unlike THE's, includes research-only staff. The salient difference is that the THE academic survey is restricted to published researchers, while QS's allows universities to nominate potential respondents, something that gives an advantage to upwardly mobile institutions in Asia and Latin America.


Ranking vulnerabilities
All three of the well-known rankings, THE, QS and ARWU, now have vulnerabilities: metrics that can be influenced by institutions and where a modest investment of resources can produce a disproportionate and implausible rise in the rankings.

In the Shanghai rankings the loss or gain of a single highly cited researcher can make a university go up or down dozens of places in the top 500. In addition the recruitment of scientists whose work is frequently cited, even for adjunct positions, can help universities excel in ARWU’s publications and Nature and Science indicators.

The THE citations indicator has allowed a succession of institutions to over-perform in the world or regional rankings: Alexandria University, Anglia Ruskin University in Cambridge, Moscow Engineering Physics Institute, Federico Santa Maria Technical University in Chile, Middle East Technical University, Tokyo Metropolitan University, Veltech University in India and Universiti Tunku Abdul Rahman (UTAR) in Malaysia. The indicator officially has a 30% weighting, but in reality it counts for even more because of THE’s “regional modification”, which gives a boost to every university except those in the top-scoring country. The modification used to apply to all of the citations but now covers half.
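As I understand it, the regional modification divides a university's citation impact by the square root of its country's average impact, and this adjusted figure is now blended half-and-half with the raw one. A toy sketch of the mechanism (my own reconstruction from THE's published descriptions, not their actual code; the numbers are invented):

```python
import math

def blended_citation_score(raw, country_avg, blend=0.5):
    """Toy version of the 'regional modification': divide the raw impact
    score by the square root of the country's average, then blend the
    adjusted figure 50:50 with the raw one."""
    adjusted = raw / math.sqrt(country_avg)
    return blend * adjusted + (1 - blend) * raw

# A university in a low-scoring country (average 0.5) gets a boost;
# one in the top-scoring country (average normalised to 1.0) does not.
boosted = blended_citation_score(raw=0.8, country_avg=0.5)
unchanged = blended_citation_score(raw=0.8, country_avg=1.0)
print(round(boosted, 3), round(unchanged, 3))  # 0.966 0.8
```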

The vulnerability of the QS rankings is the two survey indicators, accounting for 50% of the total weighting, which allow universities to propose their own respondents. In recent years some Asian and Latin American universities, such as Kyoto University, Nanyang Technological University (NTU), the University of Buenos Aires, the Pontifical Catholic University of Chile and the National University of Colombia, have received scores for research and employer reputation that are out of line with their performance on any other indicator.

QS may have discovered a future high-flyer in NTU, but I have my doubts about the Latin American places. It is also most unlikely that Anglia Ruskin, UTAR and Veltech will continue to do so well in the THE rankings if they lose their highly cited researchers.

Consequently, there are limits to the reliability of the popular rankings and none of them should be considered the only sign of excellence. Ong is quite correct to point out the problems of the QS rankings but the other well known ones also have defects.


Beyond the Big Four


Ong points out that if we look at "the big four" then the high position of UM in the QS rankings is anomalous. It is in 114th place in the QS world rankings (24th in the Asian rankings), 351-400 in THE, 356 in the US News global rankings and 401-500 in ARWU.

The situation looks a little different when you consider all of the global rankings. Below is UM's position in a dozen global rankings. The QS world rankings are still where UM does best, but there it is at the end of a continuum. UM is 135th for publications in the Leiden Ranking, generally considered by experts to be the best technically, although it is lower for high-quality publications; 168th in the Scimago Institutions Rankings, which combine research and innovation; and 201-250 in the QS Graduate Employability Rankings.

The worst performance is in the uniRank listing (formerly 4icu), based on web activity, where UM is 697th.

The Shanghai rankings are probably a better guide to research prowess than either QS or THE since they deal only with research and, with one important exception, have a generally stable methodology. UM is 402nd overall, having fallen from 353rd in 2015 because of changes in the list of highly cited researchers used by the Shanghai rankers.  UM does better for publications, 143rd this year and 142nd in 2015.

QS World University Rankings: 114 [general, mainly research]
CWTS Leiden Ranking:  publications 135,  top 10% of journals 195 [research]
Scimago Institutions Rankings:  168 [research and innovation]
QS Graduate Employability Rankings: 201-250 [graduate outcomes]
Round University Ranking: 268 [general]
THE World University Rankings: 351-400 [general, mainly research]
US News Best Global Universities: 356 [research]
Shanghai ARWU: 402 [research]
Webometrics: overall 418 (excellence 228) [mainly web activity]
Center for World University Rankings: 539 [general, quality of graduates]
Nature Index: below 500 [high impact research]
uniRank: 697 [web activity]


The QS rankings are not such an outlier: looking at indicators in other rankings devoted to research gives fairly similar results. Malaysian universities would, however, be wise to avoid concentrating on any single ranking, and they should look at the specific indicators that measure the features they consider important.


Universities with an interest in technology and innovation could look at the Scimago rankings which include patents. Those with strengths in global medical studies might find it beneficial to go for the THE rankings but should always watch out for changes in methodology. 

Using local benchmarks is not a bad idea, and it can be valuable for those institutions that are not so concerned with research. But many Malaysian institutions are now competing on the global stage and are subject to international assessment, and that, whether they like it or not, means assessment by rankings. It would be an improvement if benchmarks and targets were expressed as reaching a certain level in two or three rankings, not just one. Also, they should focus on specific indicators rather than the overall score, and different rankings and indicators should be used to assess and compare different places.


For example, the Round University Rankings from Russia, which include five of the six metrics in the QS rankings plus others but with sensible weightings, could be used to supplement the QS world rankings.


For measuring research output and quality, the Leiden Ranking might be a better alternative to either the QS or the THE rankings. Universities with an innovation mission could refer to the innovation knowledge metric in the Scimago Institutions Rankings.

When we come to measuring teaching and the quality of graduates there is little of value from the current range of global rankings. There have been some interesting initiatives such as the OECD's AHELO project and U-Multirank but these have yet to be widely accepted. The only international metric that even attempts to directly assess graduate quality is QS's employer survey.

So, universities, governments and stakeholders need to stop thinking about using one ranking as a benchmark for everyone and also to stop looking at the overall rankings. 

Friday, November 03, 2017

Ranking Calendar

Over on the right there will be a list of events such as conferences, workshops, and announcements of rankings.

First is the 7th World-Class Universities Conference in Shanghai starting next Monday, November 6th.




Resuming Posting

I have been busy with family and work matters recently but I shall resume posting tomorrow.

I shall be adding some features that I hope will make the blog more of a useful resource.

Sunday, September 17, 2017

Criticism of rankings from India

Some parts of the world seem to be increasingly sceptical of international rankings, or at least those produced by Times Higher Education (THE). MENA (Middle East and North Africa) and Africa did not seem very enthusiastic about THE's snapshot or pilot rankings. Many Latin American universities have chosen not to participate in the world and regional rankings.

India also seems to be suspicious of the rankings. An article by Vyasa Shastri in the e-paper livemint details some of the ways in which universities might attempt to manipulate rankings to their advantage.

It is well worth reading although I have one quibble. The article refers to King Abdulaziz University recruiting faculty who would list the university as their secondary affiliation (now 41) when publishing papers. The original idea was to get top marks in the Shanghai Ranking's highly cited researchers indicator. The article correctly notes that the Shanghai rankings no longer count secondary affiliations but they can still help in the Nature and Science and publications indicators and in citations and publications metrics in other rankings.

Also, other Saudi universities do not recruit large numbers of secondary affiliations. There are only four for the rest of Saudi Arabia although I notice that there are now quite a few for Chinese and Australian universities, including five for the University of Melbourne.

Last word, I hope, on Babol Noshirvani University of Technology

If you type 'Babol University of Technology' rather than 'Babol Noshirvani University of Technology' into the Scopus search box, then the university does have enough publications to meet THE's criteria for inclusion in the world rankings.

So it seems that it was those highly cited researchers in engineering that propelled the university into the research impact stratosphere. That, and a rather eccentric methodology.

Saturday, September 09, 2017

More on Babol Noshirvani University of Technology

To answer the question in the previous post, how did Babol Noshirvani University of Technology in Iran do so well in the latest THE rankings, part of the answer is that it has two highly cited researchers in engineering, Davood Domiri Ganji and Mohsen Sheikholeslami. I see no reason to question the quality of their research.

But I still have a couple of questions. First THE say that they exclude universities whose research output is less than 1,000 articles between 2012 and 2016. But checking with Scopus indicates that the university had 468 articles over that period, or 591 documents of all kinds including conference papers, book chapters and reviews, which seems way below the threshold level for inclusion. Is it possible that THE have included the Babol University of Medical Sciences in the count of publications or citations? 

Those documents have been cited a total of 2,601 times, which is respectable but not on a scale that would rival Oxford and Chicago. It is possible that some of those articles, or just one, have for some reason received an unusual number of citations compared with the world average, and that this has distorted the indicator score. If so, then we have yet another example of a defective methodology producing absurd results.
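The mechanism is easy to demonstrate. Field- and year-normalised impact is essentially an average of each paper's citations divided by the world average for its field and year, and an average over a few hundred papers is easily dragged upward by one extreme value. A sketch with invented numbers:

```python
# Invented numbers: a handful of ordinary papers plus one paper from a
# huge multi-author project, against a hypothetical world average of 10
# citations for the field and year.
WORLD_AVG = 10

def normalised_impact(citation_counts):
    """Mean of each paper's citations divided by the world average."""
    return sum(c / WORLD_AVG for c in citation_counts) / len(citation_counts)

typical = [4, 6, 3, 5, 7]       # a modest portfolio: half the world average
with_outlier = typical + [900]  # add one heavily cited collaboration paper

print(normalised_impact(typical))       # 0.5
print(normalised_impact(with_outlier))  # about 15.4 -- instant 'research impact'
```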




Friday, September 08, 2017

Why did Babol Noshirvani University of Technology do so well in the THE rankings?

The THE world rankings and their regional offshoots have always been a source of entertainment mixed with a little bewilderment. Every year a succession of improbable places jumps into the upper reaches of the citations indicator, which is supposed to measure global research impact. Usually it is possible to tell what happened. Often it is because of participation in a massive international physics project, although not so much over the last couple of years, contribution to a global medical or genetics survey, or even assiduous self-citation.

However, after checking with Scopus and the Web of Science, I still cannot see exactly how Babol Noshirvani University of Technology got into 14th place for this metric, equal to Oxford and ahead of Yale and Johns Hopkins, in the latest world rankings, and 301-350 overall, well ahead of every other Iranian university.

Can anybody help with an explanation? 

Wednesday, September 06, 2017

Highlights from THE citations indicator


The latest THE world rankings were published yesterday. As always, the most interesting part is the field- and year-normalised citations indicator that supposedly measures research impact.

Over the last few years, an array of implausible places have zoomed into the top ranks of this metric, sometimes disappearing as rapidly as they arrived.

The first place for citations this year goes to MIT. I don't think anyone would find that very controversial.

Here are some of the institutions that feature in the top 100 of THE's most important indicator which has a weighting of 30 per cent.

2nd     St. George's, University of London
3rd=    University of California Santa Cruz, ahead of Berkeley and UCLA
6th =   Brandeis University, equal to Harvard
11th=   Anglia Ruskin University, UK, equal to Chicago
14th=   Babol Noshirvani University of Technology, Iran, equal to Oxford
16th=   Oregon Health and Science University
31st     King Abdulaziz University, Saudi Arabia
34th=   Brighton and Sussex Medical School, UK, equal to Edinburgh
44th     Vita-Salute San Raffaele University, Italy, ahead of the University of Michigan
45th=   Ulsan National Institute of Science and Technology, best in South Korea
58th=   University of Kiel, best in Germany and equal to King's College London
67th=   University of Iceland
77th=   University of Luxembourg, equal to University of Amsterdam



Thursday, August 24, 2017

Milestone passed

The previous post was the 1,000th.

Comment by Christian Schulz

This comment is by Christian Schulz of the University of Hamburg. He points out that the University of Hamburg's rise in the Shanghai rankings was not the result of highly cited researchers moving from other institutions but of the improvement of research within the university.

If this is something that applies to other German universities, then it could be that Germany has a policy of growing its own researchers rather than importing talent from around the world. It seems to have worked very well for football, so perhaps the obsession of British universities with importing international researchers is not such a good idea.


I just wanted to share with you that we did not acquire two researchers for the HCR list to get a higher rank in the Shanghai Ranking. Those two researchers are Prof. Büchel and Prof. Ravens-Sieberer. Prof. Büchel has been working at our university for over a decade now and Prof. Ravens-Sieberer has been at our university since 2008.

Please also acknowledge that our place in the Shanghai Ranking was very stable from 2010 to 2015. We were very unhappy when they decided to use only the one-year list of HCR, because none of our researchers made it onto the 2015 list, which caused the descent from 2015 to 2016.

Guest Post by Pablo Achard

This post is by Pablo Achard of the University of Geneva. It refers to the Shanghai subject rankings. However, the problem of outliers in subject and regional rankings is one that affects all the well-known rankings and will probably become more important over the next few years.


How a single article is worth 60 places

We can’t repeat it enough: an indicator is bad when a small variation in the input is overly amplified in the output. This is the case when indicators are based on very few events.

I recently came across this issue (again) with Shanghai's subject ranking of universities. The universities of Geneva and Lausanne (Switzerland) share the same School of Pharmacy, and a huge share of the articles published in this discipline are signed in the name of both institutions. But in the "Pharmacy and pharmaceutical sciences" ranking, one is ranked between the 101st and 150th position while the other is 40th. Where does this difference come from?

Comparing the scores obtained under each category gives a clue:

               Geneva    Lausanne    Weight in the final score
PUB              46        44.3          1
CNCI             63.2      65.6          1
IC               83.6      79.5          0.2
TOP               0        40.8          1
AWARD             0         0            1
Weighted sum    125.9     166.6


So the main difference between the two institutions is the score for "TOP". Actually, the difference in the weighted sums (40.7) is almost equal to the value of this score (40.8). If Geneva and Lausanne had the same TOP score, they would be 40th and 41st.
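The weighted sums can be checked directly from the table (a trivial sketch, using the scores and weights listed above):

```python
# Scores and weights from the table above.
weights  = {"PUB": 1.0, "CNCI": 1.0, "IC": 0.2, "TOP": 1.0, "AWARD": 1.0}
geneva   = {"PUB": 46.0, "CNCI": 63.2, "IC": 83.6, "TOP": 0.0,  "AWARD": 0.0}
lausanne = {"PUB": 44.3, "CNCI": 65.6, "IC": 79.5, "TOP": 40.8, "AWARD": 0.0}

def weighted_sum(scores):
    return sum(weights[k] * v for k, v in scores.items())

print(round(weighted_sum(geneva), 1))                          # 125.9
print(round(weighted_sum(lausanne), 1))                        # 166.6
print(round(weighted_sum(lausanne) - weighted_sum(geneva), 1)) # 40.7
```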

Surprisingly, a look at other institutions for that TOP indicator shows only five different values: 0, 40.8, 57.7, 70.7 and 100. According to the methodology page of the ranking, “TOP is the number of papers published in Top Journals in an Academic Subject for an institution during the period of 2011-2015. Top Journals are identified through ShanghaiRanking’s Academic Excellence Survey […] The list of the top journals can be found here […] Only papers of ‘Article’ type are considered.”

Looking deeper, there is just one journal in this list for Pharmacy: Nature Reviews Drug Discovery. As its name indicates, this recognized journal mainly publishes ‘reviews’. A search on the Web of Knowledge shows that in the period 2011-2015 only 63 ‘articles’ were published in this journal. That means a small variation in the input is overly amplified.

I searched for several institutions and rapidly found this rule: Harvard published 4 articles during these five years and got a score of 100; MIT published 3 articles and got 70.7; 10 institutions published 2 articles and got 57.7; and finally about 50 institutions published 1 article and got 40.8.

I still don’t see why this score is so nonlinear. But Lausanne published one single article in Nature Reviews Drug Discovery and Geneva none (it published ‘reviews’ and ‘letters’ but no ‘articles’), and that small difference led to a gap of at least 60 places between the two institutions.
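One possible explanation for the step pattern, offered purely as a guess and not confirmed by ShanghaiRanking, is their usual square-root scaling of indicator scores. The observed values fit score = 100·√(n/6) almost exactly:

```python
import math

# Hypothesis (not confirmed by ShanghaiRanking): indicator scores scale
# as 100 * sqrt(n / n_top). The published steps 40.8, 57.7 and 70.7 fit
# n_top = 6 almost exactly, which does not match the 4 Harvard 'articles'
# found above -- so the counting behind n_top remains unclear.
def arwu_style_score(n, n_top=6):
    return 100 * math.sqrt(n / n_top)

for n in (1, 2, 3):
    print(n, round(arwu_style_score(n), 1))  # 40.8, 57.7, 70.7
```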


This is of course just one example of what happens too often: rankers want to publish sub-rankings and end up with indicators whose outliers can't be absorbed into large distributions. One article, one prize or one co-author in a large and productive collaboration all of a sudden makes very large differences in final scores and ranks.

Friday, August 18, 2017

Comment on the 2017 Shanghai Rankings

In the previous post I referred to the vulnerabilities that have developed in the most popular world rankings, THE, QS and Shanghai ARWU: indicators that have a large weighting and can be influenced by universities that know how to work the system, or that are sometimes just plain lucky.

In the latest QS rankings four universities from Mexico, Chile, Brazil and Argentina have 90+ scores for the academic reputation indicator, which has a 40% weighting. All of these universities have low scores for citations per faculty which would seem at odds with a stellar research reputation. In three cases QS does not even list the score in its main table.

I have spent so much time on the normalised citations indicator in the THE world and regional rankings that I can hardly bear to revisit the issue. I will just mention the long list of universities that have achieved improbable glory through a few researchers, or sometimes just one, taking part in a multi-author international physics, medical or genetics project.

The Shanghai rankings were once known for their stability but have become more volatile recently. The villain here is the highly cited researchers indicator, which has a 20% weighting and consists of those scientists included in the lists now published by Clarivate Analytics.

It seems that several universities have now become aware that if they can recruit a couple of extra highly cited researchers to the faculty, they can get a significant boost in these rankings. Equally, if they should be so careless as to lose one or two, the ranking consequences could be most unfortunate.

In 2016 a single highly cited researcher was worth 10.3 points on the Shanghai HiCi indicator, or 2.06 on the overall score after weighting, which is the difference between 500th place and 386th. That is a good deal, certainly much better than hiring a team of consultants or sending staff to excruciating transformational sharing sessions.

Of course, as the number of HiCis increases, the value of each incremental researcher diminishes, so it would make little difference if a top 20 or 30 university added or lost a couple.

Take a look at some changes in the Shanghai rankings between 2016 and 2017. Kyoto University fell three places from 32nd to 35th, or 0.5 points from 37.2 to 36.7. This was due to a fall in the number of highly cited researchers from seven to five, which meant a drop of 2.7 in the HiCi score, or a weighted 0.54 points on the overall score.

McMaster University rose from 83rd to 66th, gaining 2.5 overall points. The HiCi score went from 32.4 to 42.3, equivalent to 1.98 weighted overall points, representing an increase in the number of such researchers from 10 to 15.

Further down the charts, the University of Hamburg rose from 256th, with an overall score of 15.46, to 188th, with a score of 18.69, brought about largely by an improvement in the HiCi score from zero to 15.4, which was the result of the acquisition of two researchers.
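The arithmetic in these examples is just the 20% HiCi weighting applied to the change in the indicator score; a quick check using the figures quoted above:

```python
# ARWU gives the highly cited researchers (HiCi) indicator a 20% weighting,
# so a change in the HiCi score moves the overall score by one fifth as much.
HICI_WEIGHT = 0.20

def weighted_effect(delta_hici):
    """Overall-score effect of a change in the HiCi indicator score."""
    return round(delta_hici * HICI_WEIGHT, 2)

print(weighted_effect(-2.7))         # Kyoto: -0.54 overall points
print(weighted_effect(42.3 - 32.4))  # McMaster: +1.98
print(weighted_effect(15.4 - 0.0))   # Hamburg: +3.08, most of its 3.23-point rise
```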

Meanwhile the Ecole Polytechnique of Paris fell from 303rd place to 434th partly because of the loss of its only highly cited researcher.

It is time for ShanghaiRanking to start looking around for a Plan B for their citations indicator.



Wednesday, August 16, 2017

Problems with global rankings

There is a problem with any sort of standardised testing. A test that is useful when a score has no financial or social significance becomes less valid when coaching industries work out how to squeeze a few points out of docile candidates and motivation becomes as important as aptitude.

Similarly, a metric used to rank universities may be valid and reliable when nobody cares about the rankings. But once they are used to determine bureaucrats' bonuses, regulate immigration, guide student applications and distribute research funding then they become less accurate. Universities will learn how to apply resources in exactly the right place, submit data in exactly the right way and engage productively with the rankers. The Trinity College Dublin data scandal, for example, has indicated how much a given reported income can affect ranks in the THE world rankings.

All of the current "big three" of global rankings have indicators that have become the source of volatility and that are given a disproportionate weighting. These are the normalised citations indicator in the THE rankings, the QS academic survey and the highly cited researchers list in the Shanghai ARWU.

Examples in the next post.


Monday, August 14, 2017

Some implications of the Universitas 21 rankings

Universitas 21 (U21) produces an annual ranking not of universities but of 50 national university systems. There are 25 criteria grouped in four categories, resources, connectivity, environment and output. There is also an overall league table.

The resources section consists of various aspects of expenditure on tertiary education. Output includes publications, citations, performance in the Shanghai rankings, tertiary enrolment, graduates and graduate employment.

The top five in the overall rankings are USA, Switzerland, UK, Denmark and Sweden. No surprises there. The biggest improvements since 2013 have been by China, Malaysia, Russia, Saudi Arabia, Singapore and South Africa.

It is interesting to compare resources with output. The top ten for resources comprise six European countries, three of them in Scandinavia, Canada, the USA, Singapore and Saudi Arabia.

The bottom 10 includes two from Latin America, four, including China, from Asia, three from Eastern Europe, and South Africa.

There is a significant correlation of 0.732 between resources and output. But the association is not uniform. China is in 43rd place for resources but 21st for output. Saudi Arabia is in the top ten for resources but 33rd for output. Malaysia is 11th for resources but 38th for output.

I have constructed a table showing the relationship between resources and output: dividing the score for output by the score for resources gives a measure of how efficient systems are at converting money into employable graduates, instructed students and research. This is very crude, as are the data and the way in which U21 combines them, but it might have some interesting implications.

The top ten are:
1. China
2. USA
3. Italy
4. Russia
5. Bulgaria
6. Australia
7. UK
8. Ireland
9. Israel
10. Denmark

We have heard a lot about the lavish funding given to Chinese tertiary education. But it seems that China is also very good at turning resources into research and teaching.

The bottom ten are:

41. Austria
42. Brazil
43. Serbia
44. Chile
45. Mexico
46. India
47. Turkey
48. Ukraine
49. Saudi Arabia
50. Malaysia

At the moment the causes of low efficiency are uncertain. But it seems reasonable to suppose that the limitations of primary and secondary school systems, and cultural attitudes to science and knowledge, may be significant. The results of standardised tests such as PISA and TIMSS deserve careful attention.


Sunday, August 13, 2017

The Need for a Self Citation Index

In view of the remarkable performance of Veltech University in the THE Asian rankings, rankers, administrators and publishers need to think seriously about the impact of self-citation, and perhaps also intra-institutional citation. Here is the abstract of an article by Justin W. Flatt, Alessandro Blasimme, and Effy Vayena.

Improving the Measurement of Scientific Success by Reporting a Self-Citation Index

Abstract:
Who among the many researchers is most likely to usher in a new era of scientific breakthroughs? This question is of critical importance to universities, funding agencies, as well as scientists who must compete under great pressure for limited amounts of research money. Citations are the current primary means of evaluating one’s scientific productivity and impact, and while often helpful, there is growing concern over the use of excessive self-citations to help build sustainable careers in science. Incorporating superfluous self-citations in one’s writings requires little effort, receives virtually no penalty, and can boost, albeit artificially, scholarly impact and visibility, which are both necessary for moving up the academic ladder. Such behavior is likely to increase, given the recent explosive rise in popularity of web-based citation analysis tools (Web of Science, Google Scholar, Scopus, and Altmetric) that rank research performance. Here, we argue for new metrics centered on transparency to help curb this form of self-promotion that, if left unchecked, can have a negative impact on the scientific workforce, the way that we publish new knowledge, and ultimately the course of scientific advance.
Keywords: publication ethics; citation ethics; self-citation; h-index; self-citation index; bibliometrics; scientific assessment; scientific success


Saturday, August 12, 2017

The public sector: a good place for those with bad school grades

From the Economist ranking of British universities, which is based on the difference between expected and actual graduate earnings.

 That, as Basil Fawlty said in a somewhat different context, explains a lot.  

"Many of the universities at the top of our rankings convert bad grades into good jobs. At Newman, a former teacher-training college on the outskirts of Birmingham, classes are small (the staff:student ratio is 16:1), students are few (around 3,000) and all have to do a work placement as part of their degree. (Newman became a university only in 2013, though it previously had the power to award degrees.)

Part of Newman’s excellent performance can be explained because more than half its students take education-related degrees, meaning many will work in the public sector. That is a good place for those with bad school grades. Indeed, in courses like education or nursing there is no correlation between earnings and the school grades a university expects." 

Friday, August 11, 2017

Malaysia and the Rankings Yet Again

Malaysia has had a complicated relationship with global university rankings. There was a fleeting moment of glory in 2004 when Universiti Malaya, the national flagship, leaped into the top 100 of the THES-QS world rankings. Sadly, it turned out that this was the result of an error by the rankers, who thought that ethnic minorities were international faculty and students. Since then the country's leading universities have gone up and down, usually because of methodological changes rather than any merit or fault of their own.

Recently though, Malaysia seems to have adopted sensible, if not always popular, policies and made steady advances in the Shanghai rankings. There are now three universities in the top 500, UM, Universiti Sains Malaysia (USM) and Universiti Kebangsaan Malaysia (UKM). UM has been rising since 2011 although it fell a bit last year because of the loss of a single highly cited researcher listed in the Thomson Reuters database.

The Shanghai rankings rely on public records and focus on research in the sciences. For a broader-based ranking with a consistent methodology and teaching metrics we can look at the Round University Rankings, where UM is 268th overall. Across the 20 metrics included in these rankings, UM's scores range from very good for number of faculty and for reputation (except outside the region) to poor for doctoral degrees and normalised citations.

The story told by these rankings is that Malaysia is making steady progress in providing resources and facilities, attracting international students and staff, and producing a substantial amount of research in the natural sciences. But going beyond that is going to be very difficult. Citation counts indicate that Malaysian research gets little attention from the rest of the world. The Shanghai rankings report that UM has zero scores for highly cited researchers and papers in Nature and Science.

In this year's QS world rankings, UM reached 114th place overall and there are now hopes that it will soon reach the top 100. But it should be noted that UM's profile is very skewed with a score of 65.7 for academic reputation and 24.3 for citations per faculty. Going higher without an improvement in research quality will be very challenging since the reputation curve becomes very steep at this level, with dozens of survey responses needed just to go up a few points.

It might be better if Malaysia focused more on the Shanghai rankings, the Round University Rankings and the US News Best Global Universities. Progress in these rankings is often slow and gradual but their results are usually fairly consistent and reliable.







Tuesday, August 08, 2017

Excellent Series on Rankings

I have just come across a site, ACCESS, that includes a lot of excellent material on university rankings by Ruth A Pagell, who is Emeritus Faculty Librarian at Emory University and Adjunct Faculty at the University of Hawaii.

I'll provide specific links to some of the articles later.

Go here  


Saturday, August 05, 2017

There is no such thing as free tuition

It is reported that the Philippines is introducing free tuition in state universities. It will not really be free: the government will have to find P100 billion from a possible "re-allocation of resources."

If there is a graduate premium for degrees from Philippine universities then this measure will increase existing social inequalities and result in a transfer of wealth from the working class and small businesses to the privileged educated classes.

Unless lecturers work for nothing and buildings and facilities materialize, Hogwarts style, out of nothing, tuition is never free.




Who educates the world's leaders?

According to Times Higher Education (THE), the UK has educated more heads of state and government than any other country. The USA is a close second, followed by France. No doubt this will get a lot of publicity as the THE summit heads for London but, considering the state of the world, is it really something to be proud of?



Thursday, August 03, 2017

America's Top Colleges: 2017 Rankings



America's Top Colleges is published by Forbes business magazine. It is an unabashed assessment of institutions from the viewpoint of the student as investor. The metrics are post-graduate success, debt, student experience, graduation rate and academic success.

The top three colleges are Harvard, Stanford and Yale.

The top three liberal arts colleges are Pomona, Claremont McKenna and Williams.

The top three low debt private colleges are College of the Ozarks, Berea College and Princeton.

The top three STEM colleges are MIT, Caltech and Harvey Mudd College.







Wednesday, August 02, 2017

Ranking Rankings



Hobsons, the education technology company, has produced a ranking of global university rankings. The information provided is very limited and I hope there will be more in a while. Here are the top five according to a survey of international students inbound to the USA.

1. QS World University Rankings
2. THE World University Rankings
3. Shanghai ARWU
4. US News Best Global Universities
5. Center for World University Rankings (formerly published at King Abdulaziz University).



University of Bolton head thinks he's worth his salary



George Holmes, vice-chancellor of the University of Bolton, with a salary of GBP 220,120 and the owner of a yacht and a Bentley, is not ashamed of his pay. According to an article by Camilla Turner in the Daily Telegraph, he says that he has had a very successful career and hopes his students will get good jobs and have Bentleys too.

The university is ranked 86th in the Guardian 2018 league table, which reports that 59.2% of graduates have jobs or are in postgraduate courses six months after graduation. It does not appear in the THE or QS world rankings.

Webometrics puts it 105th in the UK and 1846th in the world so I suppose he could claim to be head of a top ten per cent university.

Perhaps Bolton should start looking for the owner of a private jet for its next vice-chancellor. It might do even better.



Tuesday, August 01, 2017

Highlights from the Princeton Review

Here are the top universities in selected categories in the latest Best Colleges Ranking from Princeton Review. The rankings are based entirely on survey data and are obviously subjective and vulnerable to sampling error.

Most conservative students: University of Dallas, Texas
Most liberal students: Reed College, Oregon
Best campus food: University of Massachusetts Amherst
Happiest students: Vanderbilt University, Tennessee
Party schools: Tulane University, Louisiana
Don't inhale: US Coast Guard Academy, Connecticut
Best college library: University of Chicago, Illinois
Best-run college: University of Richmond, Virginia
Most studious students: Harvey Mudd College, California
Most religious students: Thomas Aquinas College, California
Least religious students: Reed College, Oregon
Best athletic facilities: Auburn University, Alabama.

The world is safe for another year

The Princeton Review has just published the results of its annual survey of 382 US colleges with 62 lists of various kinds. I'll publish a few of the highlights later but for the moment here is one which should make everyone happy.

"Don't inhale" refers to not using marijuana. Four of the top five places are held by service academies (Coast Guard, Naval, Army, Air Force).

The academies also get high scores in the stone cold sober rankings (opposite of party schools) so everyone can feel a bit safer when they sleep tonight.


Wednesday, July 19, 2017

Comments on an Article by Brian Leiter

Global university rankings are now nearly a decade and a half old. The Shanghai rankings (Academic Ranking of World Universities or ARWU) began in 2003, followed a year later by Webometrics and the THES-QS rankings which, after an unpleasant divorce, became the Times Higher Education (THE) and the Quacquarelli Symonds (QS) world rankings. Since then the number of rankings with a variety of audiences and methodologies has expanded.

We now have several research-based rankings: University Ranking by Academic Performance (URAP) from Turkey, the National Taiwan University Rankings, Best Global Universities from US News, and the Leiden Ranking, as well as rankings that include some attempt to assess and compare something other than research, the Round University Rankings from Russia and U-Multirank from the European Union. And, of course, we also have subject rankings, regional rankings, even age group rankings.

It is interesting that some of these rankings have developed beyond the original founders of global rankings. Leiden Ranking is now the gold standard for the analysis of publications and citations. The Russian rankings use the same Web of Science database that THE did until 2014 and include 12 of the 13 indicators used by THE, plus another eight, in a more sensible and transparent arrangement. However, both receive only a fraction of the attention given to the THE rankings.

The research rankings from Turkey and Taiwan are similar to the Shanghai rankings but without the elderly or long departed Fields and Nobel award winners and with a more coherent methodology. U-Multirank is almost alone in trying to get at things that might be of interest to prospective undergraduate students.

It is regrettable that an article by Professor Brian Leiter of the University of Chicago in the Chronicle of Higher Education, 'Academic Ethics: To Rank or Not to Rank', ignores such developments and mentions only the original "Big Three": Shanghai, QS and THE. This is perhaps forgivable, since the establishment media, including THE and the Chronicle, and leading state and academic bureaucrats have until recently paid very little attention to innovative developments in university ranking. Leiter attacks the QS rankings and proposes that they should be boycotted while academics try to improve the THE rankings.

It is a little odd that Leiter should be so caustic, not entirely without justification, about QS while apparently being unaware of similar or greater problems with THE.

He begins by saying that QS stands for “quirky silliness”. I would not disagree with that although in recent years QS has been getting less silly. I have been as sarcastic as anyone about the failings of QS: see here and here for an amusing commentary.

But the suggestion that QS is uniquely bad in contrast to THE is way off target. There are many issues with the QS methodology, especially with its employer and academic surveys, and it has often announced placings that seem very questionable, such as Nanyang Technological University (NTU) ahead of Princeton and Yale, or the University of Buenos Aires in the world top 100, largely as a result of a suspiciously good performance in the survey indicators. The oddities of the QS rankings are, however, no worse than some of the absurdities that THE has served up in its world and regional rankings. We have had places like Cadi Ayyad University in Marrakesh, Morocco, Middle East Technical University in Turkey, Federico Santa Maria Technical University in Chile, Alexandria University in Egypt and Veltech University in India rise to ludicrously high places, sometimes just for a year or two, as the result of a few papers or even a single highly cited author.

I am not entirely persuaded that NTU deserves its top-12 placing in the QS rankings. You can see here QS's unconvincing reply to a question that I submitted. QS claims that NTU's excellence is shown by its success in attracting foreign faculty, students and collaborators, but when you are in a country where people show their passports to drive to the dentist, being international is no great accomplishment. Even so, it is evidently world class as far as engineering and computer science are concerned, and it is not impossible that it could reach an undisputed overall top ten or twenty ranking in the next decade.

While the THE top ten or twenty or even fifty looks quite reasonable, apart from Oxford in first place, there are many anomalies as soon as we start breaking the rankings apart by country or indicator, and THE has pushed some very weird data in recent years. Look at these places supposed to be regional or international centers of across-the-board research excellence as measured by citations: St George's, University of London, Brandeis University, the Free University of Bozen-Bolzano, King Abdulaziz University, the University of Iceland, Veltech University. If QS is silly, what are we to call a ranking where Anglia Ruskin University is supposed to have a greater research impact than Chicago, Cambridge or Tsinghua?

Leiter starts his article by pointing out that the QS academic survey is largely driven by the geographical distribution of its respondents and by the halo effect. This is very probably true, and to that I would add that a lot of the responses to academic surveys of this kind are likely driven by simple self-interest: academics voting for their alma mater or current employer. QS does not allow respondents to vote for the latter, but they can vote for the former, and also for grant providers or collaborators.

He says that "QS does not, however, disclose the geographic distribution of its survey respondents, so the extent of the distorting effect cannot be determined". This is not true of the overall survey. QS does in fact give very detailed figures about the origin of its respondents, and there is good evidence here of probable distorting effects. There are, for example, more responses from Taiwan than from mainland China, and almost as many from Malaysia as from Russia. QS does not, however, break the geographic distribution down to subject level.

He then refers to the case of University College Cork (UCC), where faculty were asked to solicit friends in other institutions to vote for UCC. This is definitely bad practice, but it was in violation of QS guidelines and QS has investigated. I do not know what came of the investigation, but it is worth noting that the message would not have been an issue if it had referred to the THE survey.

On balance, I would agree that THE's survey methodology is less dubious than QS's and less likely to be influenced by energetic PR campaigns. It would certainly be a good idea if the weighting of the QS survey were reduced and if there were more rigorous screening and classification of potential respondents.

But I think we also have to bear in mind that QS does prohibit respondents from voting for their own universities and does average results over a five-year period (formerly three years).

It is interesting that while THE does not usually combine and average survey results, it did so in the 2016-17 world rankings, combining the 2015 and 2016 survey results. This was probably because of a substantial drop in 2016 in the percentage of respondents from the arts and humanities, which, if unadjusted, would have caused a serious problem for UK universities, especially those in the Russell Group.

Leiter then goes on to condemn QS for its dubious business practices, reporting that THE dropped QS for that reason. That is what THE says, but it is widely rumoured within the rankings industry that THE was also interested in the financial advantages of a direct partnership with Thomson Reuters rather than getting data through QS.

He also refers to QS's hosting a series of "World Class" events, where university leaders pay $950 for "seminar, dinners, coffee breaks" and to "learn best practice for branding and marketing your institution through case studies and expert knowledge", and to the QS Stars plan, under which universities pay to be audited by QS in return for stars that they can use for promotion and advertising. I would add to his criticism that the Stars program has apparently undergone a typical "grade inflation", with the number of five-star universities increasing all the time.

Also, QS offers specific consulting services and has a large number of clients from around the world, although there are many more from Australia and Indonesia than from Canada and the US. Of the three from the US, one is MIT, which has been number one in the QS world rankings since 2012, a position it probably achieved after a change in the way in which faculty were classified.

It would, however, be misleading to suggest that THE is any better in this respect. Since 2014 it has launched a serious and unapologetic “monetisation of data” program.

There are events such as the forthcoming world "academic summit" where, for 1,199 GBP (standard university rate) or 2,200 GBP (corporate), delegates can get "exclusive insight into the 2017 Times Higher Education World University Rankings at the official launch and rankings masterclass", plus a "prestigious gala dinner, drinks reception and other networking events". THE also provides a variety of benchmarking and performance-analysis services, branding, advertising and reputation-management campaigns, and a range of silver and gold profiles, including adverts and sponsored supplements. THE's data clients include some illustrious names like the National University of Singapore and Trinity College Dublin, plus some less well-known places such as Federico Santa Maria Technical University, Orebro University, King Abdulaziz University, National Research Nuclear University MEPhI Moscow, and Charles Darwin University.

Among THE’s activities are regional events that promise “partnership opportunities for global thought leaders” and where rankings like “the WUR are presented at these events with our award-winning data team on hand to explain them, allowing institutions better understanding of their findings”.

At some of these summits the rankings presented are trimmed and tweaked and somehow the hosts emerge in a favourable light. In February 2015, for example, THE held a Middle East and North Africa (MENA) summit that included a "snapshot ranking" putting Texas A and M University Qatar, a branch campus that offers nothing but engineering courses, in first place and Qatar University in fourth. The ranking consisted of precisely one of the 13 indicators that make up THE's world university rankings: field- and year-normalised citations. United Arab Emirates University (UAEU) was 11th and the American University of Sharjah in the UAE 14th.

The next MENA summit was held in January 2016 in Al Ain in the UAE. There was no snapshot this time, and the methodology for the MENA rankings included all 13 indicators of THE's world rankings. Host-country universities were now in fifth (UAEU) and eighth place (American University of Sharjah). Texas A and M Qatar was not ranked and Qatar University fell to sixth place.

Something similar happened in Africa. In 2015, THE went to the University of Johannesburg for a summit that brought together "outstanding global thought leaders from industry, government, higher education and research" and unveiled THE's Africa ranking, based on citations (with the innovation of fractional counting), which put the host university in ninth place and the University of Ghana in twelfth.

In 2016 the show moved on to the University of Ghana, where another ranking was produced based on all 13 world-ranking indicators. This time the University of Johannesburg did not take part and the University of Ghana went from 12th place to 7th.

I may have missed something, but so far I see no sign of THE Africa or MENA summits planned for 2017. If so, African and MENA university leaders are to be congratulated on a very healthy scepticism.

To be fair, THE does not seem to have done any methodological tweaking for this year’s Asian, Asia Pacific and Latin American rankings.

Leiter concludes that American academics should boycott the QS survey but not THE's, and that they should lobby THE to improve its survey practices. That, I suspect, is pretty much a nonstarter. QS has never had much of a presence in the US anyway, and THE is unlikely to change significantly as long as its commercial dominance goes unchallenged and as long as scholars and administrators fail to see through its PR wizardry. It would be better for everybody to start looking beyond the "Big Three" rankings.