The Academic Ranking of World Universities (ARWU), compiled by Shanghai Jiao Tong University in China, is considered one of the very first university ranking systems. Launched in 2003, it aimed to measure the global standing of China's universities.
Since then, several international models have sprung up, most notably those managed by Times Higher Education (THE) and Quacquarelli Symonds (QS). From 2004 to 2009, THE published rankings in association with QS. However, the companies parted ways in 2009, each deciding to develop its own methodology and draw its own performance comparisons.
Part of the inspiration behind the THE rankings came from the UK government, asserts Phil Baty, Rankings Editor. “Government published a review of university and business links which called for better global comparisons between universities to help benchmark Britain’s higher education system globally.” Baty adds that the rankings plug an important global information gap and have become a valuable resource for students, student recruitment agents and many others. Results are readily available online and via a handy new iPhone app that allows users to search and filter the rankings.
After identifying a need to make its model more rigorous and transparent, THE partnered with media conglomerate Thomson Reuters in 2009. “A balanced, comprehensive and rigorous system was developed after almost a year of open consultation, and expert advice from a senior advisory group of more than 50 leading experts from all over the world,” relates Baty.
Employing 13 separate performance indicators across five areas (teaching, research, citations, industry income and internationalisation), it is this attention to areas other than research that helps differentiate the THE system from others. “The Shanghai system is very well regarded because it is transparent, but it is very narrow: it only looks at a university’s performance in research, and even within those narrow confines, it only really looks at research in the hard sciences.” While it is a respected benchmark of research excellence, he adds, it offers little to the student.
Danny Byrne, Editor of TopUniversities.com, home of the QS World University Rankings, concurs, adding that the aim of the QS rankings from day one has been to “empower students to make informed choices”. With a myriad of ranking systems now available, Byrne asserts that there are different rankings to suit different audiences, from rankings that measure web presence to those that count the number of CEOs a university produces. “Any sane methodology will produce some sort of useful information for the right person,” he observes.
According to Byrne, the QS rankings are designed to cover aspects of university performance related to the broader needs and priorities of prospective students. Focusing on six main areas (academic reputation, employer reputation, citations per faculty member, student/faculty ratio, international student ratio and international faculty ratio), it is perhaps the academic reputation and employer surveys that set it apart. The former is a major part of the overall assessment (weighted at 40 per cent), says Byrne, and is based on peer reviews from some 30,000 scholars. The employer survey, meanwhile, is a unique barometer that measures graduate employability and helps determine the ‘market value’ of a degree from a given provider.
Becky Smith at the University of Toronto’s School of Continuing Studies in Canada relates that the rankings can provide a general indication of a university’s standing if interpreted correctly. “They can, if contextualised and the way they are compiled is understood, be helpful in assisting students to make their decisions about which university to select.” With the university ranked 19th in the latest THE listing, Smith relays, “We recognise that there are a multitude of factors that aid a student in making their educational choices, and rankings are only one such factor.”
Pleased that the university is regularly included in the top 100 institutions, Markus Laitinen at the University of Helsinki in Finland opines that there is a slight bias towards English-speaking countries. With the university ranked 89th globally and 33rd in Europe in the recent QS system, he says, “We consider global rankings interesting but not as the whole truth when it comes to institutional quality.” As to whether rankings help drive international student enrolments, Laitinen notes they are not the most important drivers. “The increase in English-taught master’s programmes and good feedback from past students are much more important,” he observes.
Ranking systems have encountered their fair share of criticism. Given there are an estimated 15,000 universities worldwide, these rankings represent a relatively small pool of institutions. Quacquarelli Symonds claims to independently review 700 universities, while Times Higher Education, in conjunction with Thomson Reuters, officially ranks the top 200 universities.
Justifying a small sample size, Danny Byrne from the QS World University Rankings relates, “The top 500 is actually around the top three per cent of universities. Non-appearance therefore does not imply a university is offering a substandard service to students, but is often merely a reflection of the intensity of global competition.” Similarly, Phil Baty, of the Times Higher Education Rankings, reflects, “We could rank many more than [we do], but we want to be sure we are comparing like with like, so keeping the list relatively short ensures that only institutions with a similar global outlook, and a similar research-led profile, are compared.”
While rankings can help in the decision-making process, Byrne asserts that they should only be used as a guide. “They are not intended as a complete solution, but they can certainly be a very useful starting point in identifying universities that excel in a given area or discipline.”
Despite being widely recognised by a broad community that includes government, some within the education industry argue these schemes are too subjective, and therefore not an accurate indication of a university’s national or international credentials. Others feel they are open to manipulation by institutions. “Students and their advisors should always take a hard look at the methodologies of rankings and make their own minds up about which ones are going to be the most rigorous and the most useful for their needs,” counters Baty.