Good British Universities

The ranking list begins to struggle to distinguish between the institutions beyond No. 99, and groups the lower-ranked institutions into broad bands. The University of Sussex is in the first of these bands, 102-150; LSE is in the next band, 151-202; and Warwick in the next, 203-304.

If one examines the methodology, some explanations begin to emerge. One of the indicators measured is "Articles published in Nature and Science", used to assess the quality of an institution's "Research Output". Where this indicator does not apply, its points are reallocated across the other categories, but I have not discovered how. There is a problem here. Firstly, these are both English-language publications, and thus may bias scores towards institutions in English-speaking countries, helping to explain the US and UK dominance, and possibly being part of the explanation of Canada's 5th place position. (It might be interesting to calculate the distribution of the top 100 by language.) Secondly, a number of institutions would not consider these publications documents of record for their primary research focus. Shanghai Jiao Tong University has developed a workaround for those institutions specialising in the Humanities and Social Sciences, which it applies to the LSE.

I have examined the base data, and cannot reproduce Shanghai Jiao Tong University's total scores by applying the published weights to the published indicator scores. The University publishes its scores and weightings, so I have recreated the summary scores for the purposes of my analysis. I have also designed two alternative weightings. The first redistributes the 20% assigned to the publication of articles in Nature & Science across the remaining categories in proportion to their existing weights. The second seeks to keep the "Research Output" score at 40% and allocates the missing 20% to the second Research Output indicator, "Articles in Science Citation Index-Expanded and Social Science Citation Index". The original weights by category are as follows; a sketch of both reweighting schemes follows the table.

Category               Weight   Factors
Quality of Education     10%       1
Quality of Faculty       40%       2
Research Output          40%       2
Size                     10%       1
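
A minimal sketch of the two schemes, assuming the six published ARWU indicator weights (Alumni 10%, Award 20%, HiCi 20%, N&S 20%, PUB 20%, PCP 10%); the mapping of indicators onto the four categories above is my own reading of the methodology, for illustration only.

```python
ORIGINAL_WEIGHTS = {
    "Alumni": 0.10,  # Quality of Education
    "Award": 0.20,   # Quality of Faculty
    "HiCi": 0.20,    # Quality of Faculty
    "NS": 0.20,      # Research Output: articles in Nature & Science
    "PUB": 0.20,     # Research Output: SCI-E / SSCI articles
    "PCP": 0.10,     # Size: per-capita performance
}

def proportional_weights(weights, drop="NS"):
    """Scheme 1: drop the N&S indicator and spread its 20% across the
    remaining indicators in proportion to their existing weights."""
    remaining = {k: v for k, v in weights.items() if k != drop}
    scale = 1.0 / sum(remaining.values())  # 1 / 0.80 = 1.25 here
    return {k: v * scale for k, v in remaining.items()}

def research_output_weights(weights, drop="NS", into="PUB"):
    """Scheme 2: keep Research Output at 40% by moving the N&S weight
    onto the SCI-E / SSCI publications indicator."""
    revised = {k: v for k, v in weights.items() if k != drop}
    revised[into] = revised[into] + weights[drop]  # PUB: 0.20 -> 0.40
    return revised

def total_score(indicator_scores, weights):
    """Weighted total for one institution; scores are on a 0-100 scale."""
    return sum(indicator_scores[k] * w for k, w in weights.items())
```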

The final problem with having two calculation methods is deciding when to apply the second one, i.e. when is "Articles published in Nature and Science" an irrelevant indicator? (I am sure there are some who'd argue never.) I have calculated scores using both schemes, the original and my Research Output orientated scheme, which allows me to compare the effect of the different weights on the ranking. I have applied these techniques to a number of UK universities, and also applied the Guardian's teaching quality score to those universities to see if there was much of a difference. The Guardian's teaching quality score is departmentally based, and I chose to use the ICT departmental scores. Applying my revised "Research Output" score doesn't have much of an effect on the position of the LSE; there are one or two interesting differences, but it would seem to me that we are back to asking how good the indicators are. I noted in my previous article that the methodology favoured science and, anecdotally, universities with large medical and bioscience faculties. It might be interesting to look at the big movers and examine the methodological causes of the changes.
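
To illustrate the comparison, here is a toy example using the functions sketched above; the institutions and their indicator scores are entirely fictitious, chosen only to show how a ranking can flip when the N&S weight moves onto publications.

```python
institutions = {
    "Univ A": {"Alumni": 25, "Award": 30, "HiCi": 28, "NS": 40, "PUB": 40, "PCP": 30},
    "Univ B": {"Alumni": 22, "Award": 28, "HiCi": 30, "NS": 10, "PUB": 60, "PCP": 32},
}
for label, w in (("original", ORIGINAL_WEIGHTS),
                 ("research output", research_output_weights(ORIGINAL_WEIGHTS))):
    ranked = sorted(institutions,
                    key=lambda u: total_score(institutions[u], w),
                    reverse=True)
    print(label, [(u, round(total_score(institutions[u], w), 1)) for u in ranked])
```

Here Univ A leads under the original weights (33.1 vs 31.0) but falls behind under the Research Output scheme (33.1 vs 41.0), because Univ B's strength is in SCI-E/SSCI publications rather than Nature and Science.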

I have come to the conclusion that the Shanghai Jiao Tong University method's indicators are too narrow to easily answer the questions I am asking, and that the Guardian's research cannot be used to rank the institutions: it only evaluates departments, and aims to evaluate the undergraduate teaching experience. In the notes on its methodology, the Guardian says

To use the indicators' absolute values would make it virtually impossible to produce an overall table for the institutions, since their position would be dependent on what subjects they teach, rather than on how well they teach it...

and added that

Note that we don't include research funding, figures from the research assessment exercise or data in that line - this is supposed to be a ranking for undergraduates, not a health check for the university as a whole.

Tediously, it seems that I am repeating criticisms made by sufficient others to have made it to the University Rankings page on Wikipedia, but looking into the data always improves one's understanding.

So the survey may overestimate English-speaking institutions' success, it probably devalues teaching outside the pure sciences, and it uses very few indicators. These factors may explain why the 'wisdom of crowds' market evaluation, the entry grades required, comes out with very different answers about the LSE. My final conclusion is that this survey is seen by the EU, the Commission and its advisors as too important. Someone should do another one, but what is really needed is an economic or political model that defines a successful university. These are issues for public policy makers and, increasingly in the UK, for the people funding tertiary education, which is becoming the students and their families. But to anyone looking to attend a UK university, I'd thoroughly recommend the relevant Guardian guide. They are published each year to help the school-leaving cohort, and they helped me advise my children over the last five years, to the extent they let me.

Notes

The UK universities in the 102-150 group include Glasgow, Leeds, Liverpool and Sussex.

Those in the top 100 are Cambridge, Oxford, Imperial, UCL, Manchester, Edinburgh, Bristol, Sheffield, Nottingham, King's College London and Birmingham.

The Guardian does not score the LSE for teaching ICT.

Canada is 5th, beating France, Italy and Spain, all of which have more people and similar or greater per capita GDP. I am not looking to denigrate Canada's tertiary education system.
