A more rounded view: The U21 ranking of national higher education systems 2012

The U21 Ranking of National Higher Education Systems 2012, released by Universitas 21 on 11 May, is the most comprehensive attempt to date to evaluate higher education systems by country, an approach perhaps precluded until now by the glut of institution-based university rankings over the last decade.

The motivation for the exercise is indicated, if obliquely. The authors assert that 'Given the importance of higher education, a nation needs a comprehensive set of indicators in order to evaluate the quality and worth of its higher education system'. They add that the existing international rankings of universities emphasise research excellence but 'throw no light, however, on issues such as how well a nation's higher education system educate all its students [of] different interests, abilities and backgrounds'.

The methodology is in part derived from and inspired by the work of Jamil Salmi, formerly of the World Bank. A team of researchers at the Melbourne Institute of Applied Economic and Social Research ranked 48 national higher education systems based on 20 measures grouped into four categories:

  • Resources (private and public expenditure)
  • Environment (regulation, diversity and participation opportunities)
  • Connectivity (number of international students and articles written jointly with international collaborators)
  • Output (amount and impact of research; connection to labour market)

These were integrated into a single score using weightings as follows: 40% for output, 25% each for resources and environment, and 10% for connectivity. These shares reflect the authors' judgement and the availability and quality of data. Connectivity carries a low weighting because of a lack of 'data on joint activity between HEIs and the rest of society'.
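The aggregation described above is a simple weighted sum. A minimal sketch, using the weights stated in the report but entirely invented category scores (the report does not publish its raw inputs in this form):

```python
# Weights as stated in the U21 report; category scores are on a 0-100 scale.
WEIGHTS = {
    "resources": 0.25,
    "environment": 0.25,
    "connectivity": 0.10,
    "output": 0.40,
}

def overall_score(category_scores: dict) -> float:
    """Weighted sum of the four category scores."""
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

# Invented example: a country scoring 80/70/50/90 across the four categories.
example = {"resources": 80, "environment": 70, "connectivity": 50, "output": 90}
print(round(overall_score(example), 1))  # 78.5
```

The low connectivity weight means even a large deficit in that category (as in the US case discussed below) moves the overall score only modestly.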

US in pole position

The US is ranked first and given a score of 100. The authors note that this would not change even if the output variable were omitted, so the result is 'not solely a size effect'. In output specifically the US (also 100 points) is far ahead of the second-placed country, the UK (62.2). The US is, however, ranked 36th in connectivity, so it is fortunate for the US that the category carries a low weighting.



[Map: The Americas]


The authors note that groupings of neighbouring countries can be discerned in the ranking, an indication of regional competition - keeping an eye on the neighbours. A cluster of Nordic countries occupies some of the top positions, as shown on the map below. 





In East Asia, Hong Kong is ranked 17, Japan 20, Taiwan 21 and Korea 22. France and Germany are very close, as are the southern European countries (Portugal 23, Spain 24, Greece 29, Italy 30, Bulgaria 31). The central and eastern European countries also sit together in the middle of the list. Finally, two clusters lie at the bottom: Latin America (Chile 37, Argentina 38, Brazil 40, Mexico 43) and South East Asia (Malaysia 36, Thailand 41, Indonesia 47). Singapore is 11th, closer to Hong Kong than to its immediate neighbours.

Canada is ranked third because of its high scores in resources (1st) and output (3rd). Switzerland is sixth (3rd in connectivity) and the Netherlands is ninth (1st in environment). Australia is seventh (4th in connectivity, 7th in output).

The UK is tenth; this reflects a poor ranking in resources (27th, including 41st for government expenditure, which will decline further) and a high rank in output (2nd), which is consistent with the institutional league tables. The lead author of the report, Ross Williams, noted in an article for the Guardian that this discrepancy between output and resources is the greatest of all 48 countries and reflects very high productivity, normally attributed to the discipline enforced by the research assessment exercises over the years. In most other cases there is a positive correlation between resources and output.

Professor Williams remarked in an interview with BBC News that 'if you want to maintain high output you must maintain high resource levels'. Music to the ears of the UK sector, but given that it has so far managed to defy this causality, what are we to make of it? The UK government need only glance at the report to say it ain't so. This is ironic given that the government is not in fact withdrawing from research funding; its axe now falls on the teaching grant. The current swing of the pendulum from public to private funding should indeed depress the position of the UK system in subsequent exercises, but whether that will have anything to do with research output remains to be seen.

A higher weighting for connectivity (in which the UK is 6th) would have earned the UK a higher overall position. Interestingly, the UK is ranked 13th in environment, just below France and also below Poland, Bulgaria and the Czech Republic. The reason is a relatively low proportion of female academic staff, along with a poor ranking in the qualitative index used by the authors of the report to measure the efficiency of policy and regulatory environments.

A warning for China and India?

A major finding of the report is that India and China have a long way to go to match the quality standards of higher education in Europe and North America. Among the 48 countries India is ranked last and China 39th, as shown on the map of Asia below. Both countries are in the bottom quartile in most categories. India is in a much worse position (45th) than China (27th) in the environment category.

It is no surprise that both countries need to invest more in resources (India 47th and China 45th), although the sheer sizes of their populations will have contributed to poor rankings in this category. In output China is mid-table (26th) but India is fourth from the bottom (45th). China shares the UK profile of higher output and lower resources, but Chinese articles have less impact than UK ones. Connectivity, roughly a measure of internationalisation, is also an area where China and India lag behind, in contrast with Singapore, which seems to be at the forefront.



Where top countries need to improve

The U21 ranking supports what we already know: that higher education systems in northern Europe, North America and Australia are the most competitive in the world. But certain findings are still worrying. For example, the US's 36th place in connectivity is difficult to interpret. Is it a sign of insularity or just a reflection of the country's dominant role in academic research? According to the authors, 'the United States, Korea and Japan are in the bottom quartile for research collaboration, in part reflecting the existence of a critical mass within the national research community'. On the other hand, countries such as the UK and Switzerland that also excel in academic research are at the top of the connectivity list. Surprisingly enough, the UK, Germany and Switzerland do not fare well in the environment category, while French higher education is more internationally orientated than Germany's (76.8 and 60.1 respectively in connectivity).

A contribution to research on higher education

As the first systematic attempt to evaluate the higher education systems of a wide range of countries, this report is a useful contribution to research on international higher education. It provides a wealth of data and a methodology that other researchers can use.

The following two graphs, derived from the U21 report, show that for most countries there is a positive correlation between the impact of articles produced and the number of articles written with international collaborators. The US is an anomaly, however, with disproportionately great impact.
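The relationship the graphs depict can be illustrated with a Pearson correlation coefficient. A minimal sketch; the data points below are invented for illustration, not taken from the U21 report:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented toy data: internationally co-authored article counts vs citation impact.
co_authored = [10, 20, 30, 40, 50]
impact = [0.8, 1.0, 1.1, 1.3, 1.5]
print(pearson(co_authored, impact) > 0)  # True: positive correlation in this toy data
```

An outlier like the US (high impact without correspondingly high international co-authorship) would sit well above a fitted line through such data, which is what makes it an anomaly in the graphs.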



Correlation between impact and number of articles co-authored with international researchers: Countries are ranked according to position in U21’s connectivity ranking (1st to 24th) 




Correlation between impact and number of articles co-authored with international researchers: Countries are ranked according to position in U21’s connectivity ranking (25th to 48th)


The U21 report is the first attempt to look at the broader picture rather than at top institutions, which enrol only a small portion of a country’s student population. It is not surprising that Universities UK draws attention to the fact that ‘other more established compilers of world rankings’ such as THE, QS and Shanghai Jiao Tong rate the UK as second to the US. But as William Lawton noted in 2010, drawing conclusions on national prowess and competitiveness from institution-based methodology is a tricky and almost arbitrary game – among the things such rankings do not take into account are population sizes and national wealth.

Country rankings have to address the population issue. The authors assert that they ‘control for national size in most measures’. It is not clear how this is done, and it will have to be spelled out next time. Some countries might excel in specific sectors crucial to their economies (e.g. IT in India), but such an apparently useful indicator has yet to be reflected in a ranking – as far as we are aware.

The weighting system used for the overall ranking will attract criticism, as the weightings of other rankings have. It seems reasonable to weight output at 40%, but this does not overcome the English-language bias of published research in the established rankings. And U21 uses the Shanghai Jiao Tong ranking (SJTU is a U21 member) to measure output, thereby reproducing the biases inherent in it. A measure combining findings from the three big university rankings (THE, Shanghai and QS) might be perceived as more reasonable.

The U21 score based only on the output variable unsurprisingly produces the closest match with the institution-based rankings. The other three categories provide what many might see as a corrective to the research-output bias. Many have expressed unease about the power, influence and unintended consequences of the established institutional rankings, and have suggested ways of reining them in. Who would have thought that another ranking system would be the closest thing yet?


Alex Katsomitros, William Lawton