Let’s think, for a minute (if we’re inclined to care enough…), about why British universities might be slipping down some international league tables.
The figures are fairly clear, after all: 51 of the UK’s top 76 institutions, including 16 from the Russell Group, have dropped in the latest QS rankings.
The Telegraph had a go at this exercise on election morning, and concluded that “experts [blame] the decline on pressure to admit more disadvantaged students”. The word “experts” is a curious one here. It seems to me they had just one. Their other interviewees seemed to be pulling in different directions; but, hey, why miss an opportunity for some reactionary elitism?
Our expert is quoted as follows (and I’m not even sure that he would be comfortable with the way his words have been used):
Professor Alan Smithers, who is head of the centre for education and employment at the University of Buckingham, said that the decline was because “universities are no longer free to take their own decisions and recruit the most talented students which would ensure top positions in league tables”.
He said that instead, universities are forced to comply with “all sorts of requirements in terms of the ethnic mix, the levels of income of the students and whether they come from low income areas”.
What this extraordinary explanation for the fall of Britain is doing, after all, is blaming a programme of social mobility for a decline in quality. Or, in a not wholly subliminal way, it’s suggesting that our top universities could be great again if only they didn’t have to admit so many of the wrong kind of person. Those poor people from underfunded schools: they really pull us all down. More means worse. How reassuring this must all be for the Telegraph’s declining readership.
Another way of analysing these results might have been to start with the QS methodology. It considers six metrics:
- Academic Reputation
- Employer Reputation
- Faculty/Student Ratio
- Citations per Faculty
- International Faculty Ratio
- International Student Ratio
As hard as I look, I don’t see anything here about the average net worth or skin colour of a university’s students. Indeed, I don’t see anything about recruiting the “most talented students” that Professor Smithers claims are being so cruelly marginalised. Funny that; if we could only take ourselves back a few generations, it was all so much more straightforward.
So where else might we look for explanations? First of all, I’d suggest we might consider the level of international competition. There are countries around the world, not least in Asia, that have methodically and ruthlessly targeted success in the international league tables. They have increased investment across the board, and also concentrated resources on identified elite groups of universities.
Secondly, let’s pause on the final two measures, which are all about international outlook. For all the ‘We Are International’ hashtags, British universities are hamstrung by a government hostile – at least in its rhetoric – towards international students and insular in its outlook. Other countries are increasing their numbers of international students while we are going backwards. Our participation in EU research funding schemes, which have been the single greatest engine of international collaboration, is in serious doubt.
Which leads us to Brexit. After an election campaign in which both major parties have made promises about this and that while determinedly ignoring the fact that Brexit will rip a bloody great hole in the nation, it seems appropriate that we should be looking every which way for an explanation for these league table trends. Because it couldn’t have anything to do with Brexit, could it? It surely couldn’t – or not, anyway, for the Telegraph – be influenced in any way by this historical act of insularity and xenophobia.
No: it must be caused by our dreadfully misguided efforts to drag forward all these frightfully uneducated oiks. Britain was an altogether greater nation when they knew their place, and the higher education system was designed to damn well keep them there. Thank God we have the QS World University Rankings to prove it.
What a very strange take on the situation, in particular “all sorts of requirements in terms of the ethnic mix”. What are these ‘requirements’?
Brilliant, Andrew. Thank you.
May I punch just one more hole beneath the waterline of the Telegraph’s unseaworthy article: the argument blaming wider access for a mildly lower position for one ‘top’ uni in one ranking presupposes that rankings tell us anything realistic about quality.
Look again at those metrics that QS uses…
• This is a shamefully narrow set of things to measure: staff:student ratios, citations, employer reputation, etc., with two (out of six) measures focused on internationalisation. Where’s teaching quality, student engagement, value added, learning gain, developing rounded individuals, opportunity created, culture enriched, regional or economic contribution? Impossible things to measure perhaps, but that’s no excuse for replacing them with six shades of gobshite.
• Even if QS had correctly identified the factors truly representative of some objective standard of quality, QS uses poor proxies of those measures. For example, internationalisation is about far more than just incoming staff and students. Employability is not even remotely well measured by an employer reputation survey, etc.
• Even if they were good proxies, the methodology of measurement is shameful. Take staff:student ratios: it’s impossible to establish a consistent international definition of either ‘staff’ or ‘students’. How much less possible is it to measure the ratio with any comparability? As for academic reputation, that’s a two-dime freak show of embedded privilege…
• Even if the metric methodology were good, the aggregation of the data is entirely arbitrary and designed to prop up the crumbling argument that a certain type of university is ‘better’ than a certain other type (even though they exist for entirely different purposes).
• And even if the aggregation had any defensible basis, the way the data is misrepresented instantly undermines its own claims to any authority. Rankers spend half their time showing how consistent their results are in order to claim credibility for their methods, and the other half trying to read grand meaning into any swapping of places, which is far more likely to be down to statistical noise (especially given the dodgy methods).
When you recognise the fantasy foundations on which league tables are built, it is breathtaking that *anyone* takes them seriously. But for an academic like Prof Smithers – whose own research is subject to scrutiny, and who presumably applies the same scrutiny to the work of colleagues and students – to choose not to apply even the slightest intellectual rigour suggests that either he has been misquoted as badly as Jesus’ pronouncements about vending machines, or that he should consider carefully his ability to contribute to human (or even vegetable) knowledge. Who knows, perhaps Smithers has a Mr Burns urging him to talk such nonsense?