Publication of this year’s National Student Survey results has – as is the case every year – prompted a display of masterful creativity from communications and press teams across the country.
I’d like to pay homage to this work and to compile some hints and tips for anyone new to this summer-time game. There’s still just enough time to play before hitting confirmation and clearing week.
There is every good reason to spin the story, to place one’s university in the best possible light and to achieve the most desirable headline. Where there’s scope for being ‘first’, such as within a nation, region or mission group, then that will take a prominent position in the press release. It’s also clearly a delight for a smattering of universities to point out that they’re above rivals: coming out better than Oxbridge is a particular favourite.
If there isn’t a ‘first place’ available outright, then there’s the opportunity to eliminate those not seen as fair competition, and you get to determine what fair means. With the Universities of Buckingham and Law, both ‘private’ providers, having performed particularly well, one can deem it appropriate to make fair comparison only against “publicly funded” institutions or the more loaded “mainstream.”
And if that hasn’t got rid of the requisite number of institutions ahead in the table, it’s fairly normal to exclude “small and specialist” institutions and focus solely on the “multi-faculty,” “broad based” or “full service” players. Bye bye, high-scoring Courtauld. If you hail from a small and/or specialist institution, you have the opportunity to make a comparison excluding the big beasts: don’t miss out on being top amongst the conservatoires.
With St Andrews and Aberystwyth having done well this year too, you may also have scope to go for an England-only comparison. Just for clarity, obviously.
If you really want to push the boundaries, you can additionally set an arbitrary limit on the number of students for your own league table. One institution deemed it appropriate to allow comparison only with those universities with a survey population of more than 500. Out goes Bishop Grosseteste, but only just, as its survey population is exactly 500. But that may just be a coincidence.
If those handy hints haven’t provided you with the ranking you were hoping for, you could also just make your statement a little vague and pick your own group against which to measure your performance. One university tautologically gave its ranking as “when compared against comparator institutions.” Quite.
And finally, if you really, really want to push it, it’s possible to twist the overall ranking position into ‘highest score’. That means that everyone with 97% shares first place, everyone with 94% second place, and so on. With multiple institutions listed jointly for each score, because scores are presented with zero decimal places, you can rocket up the table. Who’s going to pay that much attention anyway?
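The arithmetic of the trick is easy to sketch. Using a toy score list (not the real NSS data), a “dense” rank over the distinct rounded scores lets an institution sitting tenth in the table claim a far grander place:

```python
# Illustrative sketch only: how 'highest score' flatters compared with
# overall table position. Toy scores, rounded to zero decimal places,
# sorted in descending order.
scores = [97, 97, 94, 94, 94, 92, 92, 92, 92, 91]

# Ordinal position: your row number in the full table.
ordinal_position = scores.index(91) + 1

# 'Dense' rank: each distinct score gets the next place number,
# so every tie shares a place and the numbers shrink fast.
distinct = sorted(set(scores), reverse=True)
dense_rank = distinct.index(91) + 1

print(ordinal_position)  # 10th in the table...
print(dense_rank)        # ...but '4th highest score'
```

The gap between the two numbers grows with the amount of rounding: fewer distinct scores means more ties, and more ties means everyone rockets up.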
Here’s a rundown of some of the claims on offer and, please note, this is not a comprehensive league table. Surely everyone’s entitled to their caveats?
Institution | Overall satisfaction | Claim |
---|---|---|
University of Buckingham | 97% | 1st |
University of Law | 97% | Equal 1st |
Courtauld Institute of Art | 94% | 4th |
Keele University | 94% | 1st |
University of St Andrews | 94% | Equal 1st |
Aberystwyth University | 92% | Top 10 |
Bishop Grosseteste University | 92% | 2nd |
Harper Adams University | 92% | 2nd and 7th |
Liverpool Hope University | 92% | Top 4 |
University of Dundee | 91% | 8th |
University of East Anglia | 91% | Equal 3rd in England |
University of Exeter | 91% | 9th and 11th |
Lancaster University | 91% | Top 10 |
University of Lincoln | 91% | 11th |
University of Essex | 90% | Equal 8th in England |
University of Kent | 90% | 4th |
University of Leeds | 90% | Equal 2nd in Russell Group |
Newcastle University | 90% | 12th |
Swansea University | 90% | 14th |
Bangor University | 90% | Top 15 |