I don’t play with data to generate headlines – I play with data to try to understand what the sector is like.
People do generally seem to appreciate a focus on analysis in these articles, but it doesn’t stop me wondering what things would be like if I went for big splashy headline findings and bothered less about what the data said.
Sometimes things are difficult for a reason
Because HESA has wisely made it very difficult to build league tables off the back of Graduate Outcomes, I’m fairly sure it will be a while before anyone else builds one. Here’s what they’ll find – and the rest of this article is about why those findings are largely meaningless.
- Suffolk graduates in employment are more likely to be in a highly skilled role than Exeter graduates. Bath graduates are similarly more likely to be in one than those from Oxford.
- Graduates of the University of Cumbria are less likely to be unemployed than graduates of Oxford, Cambridge, or Imperial College. There appears to be a London effect contributing to unemployment too.
- A quarter of Aberystwyth graduates in full time employment disagree that they are utilising what they learned on their course. Graduates of specialist arts colleges are generally more likely to report that their current full time employment is not meaningful, whereas all but 2 per cent of Buckingham graduates find meaning in their work.
Honestly, there’s a career’s worth of this stuff in here.
OfS is right – down with universities!
One big problem faced by the sector and regulators is that the higher education provider is such a beguiling unit of analysis – I mean there’s an entire industry out there that churns out league tables based on this idea. And it makes sense on an immediate level that processes and decisions about teaching within individual providers may have some impact on how graduates do later in life.
The charm fades, however, when you give it some thought. Institutions have differing offers around subject of study and qualification type. They attract different types of students from different backgrounds, and release graduates into a range of local areas. And all of these things have an impact on graduate outcomes.
It is possible to find a great course or department in a provider that is surrounded by a sea of what we may politely call “run of the mill” provision. Likewise, a university filled with superb teaching might have a handful of, well, low-quality courses. We don’t see any of this if we look at graduate outcomes at provider level – instead we are drawn to make lazy assumptions about the kind of course quality available at a particular provider based on nothing more than whatever brand of snobbery we have internalised.
Who is in the data
Provider response rates vary in Graduate Outcomes, and there is no way to know what kind of selection bias issue this causes. Are satisfied graduates more likely to respond? Angry graduates? Graduates with time on their hands? Social sciences graduates who just flat out love survey instruments? We don’t know.
So HESA, perhaps reasonably, have released provider level data in such a way that you need a fair amount of analytical skill to produce worthwhile rankings from it – I suspect in the hope that anyone who would do the hard yards on this would also caveat it properly. That’s what I’m intending to do.
Toast droppers
Let’s start with what we might reasonably consider a big question – which providers have the highest proportion of graduates who entered full time employment in highly skilled jobs, and how much do they earn at whatever skill level?
You’ll spot a lot of the teasers from the top in that one. But our issues with this use of the data are manifold.
Very small providers, with very small graduating cohorts, are both flattered and punished by a proportional measure, with the life course of an individual graduate (or group of 2-3 graduates, given HESA-style rounding) having a major effect. By default, I’ve filtered out any provider with fewer than 100 graduate responses, but this effect still has an impact.
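To make the small-cohort problem concrete, here’s a minimal pandas sketch of that size filter. The column names (provider, responses, high_skilled) are hypothetical – Graduate Outcomes data doesn’t arrive in this shape – but the logic is the same.

```python
import pandas as pd

# Hypothetical provider-level table: one row per provider, with the
# number of responding graduates and the number in highly skilled work.
df = pd.DataFrame({
    "provider": ["A", "B", "C", "D"],
    "responses": [2500, 430, 95, 12],
    "high_skilled": [1900, 310, 80, 11],
})

# Proportion in highly skilled work - for a tiny cohort, one graduate's
# fortunes can swing this by several percentage points.
df["high_skilled_rate"] = df["high_skilled"] / df["responses"]

# The default filter used above: drop providers with fewer than
# 100 graduate responses before ranking on the proportion.
ranked = df[df["responses"] >= 100].sort_values(
    "high_skilled_rate", ascending=False
)
print(ranked)
```

The filter reduces the noise but doesn’t remove it – provider D disappears, but even at 430 responses a single graduate still moves provider B’s rate by nearly a quarter of a percentage point.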
And then we have the subject effect. As we saw in the other article, different subjects lead to different outcomes. If you’ve got a medical school you get a bonus; if you’re an arts college you are going to look bad on this measure. The state of the local graduate employment market is also going to have an impact.
I’ve never liked the salary information in Graduate Outcomes – it’s self-reported, and as we know student characteristics have a huge impact on salary, as do subject of study and location. For that reason I’ve only made this viewable as a time series for a single provider – you can choose your provider of interest by hovering over a dot on the chart at the top.
A more sensible way of looking at occupations by provider
This version shows numbers of responding graduates rather than proportions, with the ranking by total number of responses – so you’ll see Nottingham Trent (2,500+) at the top, and Wirral College (5, so between 3 and 6) at the bottom. Mousing over a provider lets you compare across the three years of data we have.
This is less of a headline-generating machine than the plot above, but it does present factually what is going on with graduate outcomes. Ideally we’d still be able to see this at least at subject resolution – and when Graduate Outcomes 2019-20 finds its way into the Unistats data set I’m sure I’ll let you see what is available. The problems we will face there are low numbers, rounding, and inconsistent subject coding.
What they do
Which providers have the most unemployed graduates? Here I’m using just the “unemployed” marker, not the “unemployed and…” ones that suggest that, had the survey been completed a month later, the result would have been different.
Here our major complicating issue is response rates – because Graduate Outcomes survey responses are a self-selecting group, we don’t know how representative the responses we have are of all graduates.
This is clear if we plot provider by provider – the grey bar each time represents the number of graduates we know nothing about.
There are two ways we can go here – and both run into response bias problems almost straight away. Choice one is to plot the proportion of respondents who are unemployed; choice two is to plot the proportion of all graduates that we know are unemployed. Both are a problem in that we don’t know how much more (or less) likely unemployed graduates are than others to complete their survey.
I’ve gone for option two:
Again, I’ve used a size filter – not presenting proportions for providers producing fewer than 500 graduates a year overall, but you can change this with the filter if you dare.
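For clarity, here’s a sketch of the two options and the size filter, again using hypothetical column names rather than anything in the published data.

```python
import pandas as pd

# Hypothetical provider-level table: 'graduates' is the full cohort,
# 'respondents' those who completed the survey, and 'unemployed' the
# respondents with the plain "unemployed" marker.
df = pd.DataFrame({
    "provider": ["A", "B", "C"],
    "graduates": [3200, 900, 450],
    "respondents": [1600, 500, 200],
    "unemployed": [80, 40, 6],
})

# Option one: unemployed as a share of respondents.
df["option_one"] = df["unemployed"] / df["respondents"]

# Option two (used in the chart): graduates we *know* are unemployed
# as a share of all graduates - non-respondents swell the denominator.
df["option_two"] = df["unemployed"] / df["graduates"]

# The size filter: no proportions for providers producing fewer than
# 500 graduates a year.
print(df[df["graduates"] >= 500])
```

Neither denominator fixes the underlying problem – if unemployed graduates respond at a different rate from everyone else, both numbers are biased, just in different directions.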
There’s not a clear pattern here – but there are some startling findings. The University of Cumbria, for instance, produces graduates less likely to be unemployed than those of Imperial, Cambridge, or Oxford… but if this ends up in a prospectus I will be singularly unimpressed.
At the other end of the chart we find quite a few larger London universities, alongside a range of more specialist providers and small colleges. There are some arts providers up the top, but there are also quite a few much further down.
How does it feel?
Though SOC codes and employment are the points of interest in regulation, the most striking findings from Graduate Outcomes have always come from the self-reflection questions. Although we (rightly) spend a lot of time thinking about how students feel, the idea of any kind of duty of care for graduates is a very new one.
We don’t get the ONS-specification wellbeing questions by provider – which is a shame, as I have a pet theory about a link to provider prestige I would love to explore. But we do get proportional (not numerical) responses by provider for the other questions – here’s a version that sorts by the proportion of responding graduates who disagree or strongly disagree with each statement.
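The sort itself is simple enough – a sketch, assuming a hypothetical long-format table of the proportional responses:

```python
import pandas as pd

# Hypothetical long-format table: one row per provider and statement,
# with the proportion of respondents in each negative band.
df = pd.DataFrame({
    "provider": ["A", "A", "B", "B"],
    "statement": ["meaningful", "fits_plans", "meaningful", "fits_plans"],
    "disagree": [0.10, 0.15, 0.25, 0.30],
    "strongly_disagree": [0.02, 0.05, 0.08, 0.10],
})

# Combine the two negative bands, then rank providers within each
# statement by the combined proportion.
df["negative"] = df["disagree"] + df["strongly_disagree"]
ranked = df.sort_values(["statement", "negative"], ascending=[True, False])
print(ranked)
```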
One thing I note here is the preponderance of specialist arts providers towards the top of each question ranking. Because we are asking questions here that link the ostensible degree subject with activity 15 months on, we’re asking creative arts graduates struggling to establish a creative career (there are no other ways to establish a creative career) how they feel about the way they are struggling. Unsurprisingly they are not happy.
Here’s a time series lookup by provider. We can see disagreement grow through the pandemic – at a time when very few paid arts-related roles were available – on using acquired skills (though there’s no prompting in the survey on whether these are the wider range of skills or just subject-specific skills), on planned career fit, and on ideas of meaning. And it is specifically an arts provider effect, not a specialist provider effect.
Regarding specialist arts providers: I would be very interested in an analysis that explored the bias around the type of student attracted to, and taking, art practice courses, and the aspirations for early career work that the type of course is potentially standing as a proxy for in this data. The flip side of creativity is dissatisfaction with the status quo, so by extension creative industries graduates are more likely to be dissatisfied with a post-degree situation that graduates from other disciplines would be more accepting of. In addition, creative industry courses tend to identify leadership roles as the ultimate employment aspiration, so subordinate starting positions (once acquired) compound this frustration. In contrast, professions with a more structured hierarchy fit the question set more accurately: to become a barrister it is essential to go through pupillage at chambers, so that will be seen as evidently meaningful and as fitting with future plans; the progression is much less clear in most creative professions. I would propose that if it were possible to disaggregate the arts courses from the other disciplinary areas within a single larger institution, the same disparity may well appear.
This is excellent stuff David – and shows a largely positive story in challenging times for our graduates UK-wide. It would be useful to be able to compare unemployment rates (the old HESA PI) by taking out non-respondents, to show a percentage of respondents in employment or further study, or a percentage of respondents unemployed. This also confirms how the DfE request to publish data on graduates into highly skilled employment/further study is meaningless (response rate bias, possible equality issues?) – and is almost designed to penalise the creative arts and humanities, at which our country excels and from which our graduates generate billions. They make us human – we gave the world, and shared with the world, the English language!