How did the pandemic affect Graduate Outcomes?
David Kernohan is Deputy Editor of Wonkhe
So, it’s Graduate Outcomes time – but not quite.
Today saw the release of the summary statistics – for the detailed tables (including the provider level stuff) you’ve another seven days to wait. In terms of headlines, the proportion of 2018-19 graduates in full time employment is down three percentage points from this point in the career of the 2017-18 cohort – with correspondingly more in part time work and in combined work and further study. Unemployment (in the strictest sense, without being due to start employment or further study) was up a single percentage point.
Given that this is data collected on young people during the Covid-19 pandemic – the second of four phases of collection started alongside the March lockdown, and the fourth ended in early December with the nation dealing with lockdown two – things could have looked a lot worse. We’re hampered in knowing just how far out of the ordinary these patterns are by the fact that Graduate Outcomes is still new, experimental, data – we don’t know what a standard year-on-year variation would be.
So we only really have HESA’s administrative data to account for just how weird last year was – and happily the summary release is accompanied by a fantastic briefing note from Lucy Van Essen-Fishman that explains things in detail.
Nothing else to do?
Response rates are up. That’s the starting point for understanding the year. Over the four cohorts, the first (pre-Covid) was up 7.4 percentage points on the same point last year, and the second (March through to June) was up 6.1. The final two cohorts were very slightly under last year, but not significantly so – putting the year up slightly overall. The age of graduates wasn’t a factor, and neither was sex, ethnicity, or disability. Clearly these groups (and others, for example by subject area – and thus likely career) had different experiences of the year, but nothing that affected the likelihood of responding.
Because Graduate Outcomes is a mixed mode survey (some graduates fill in a form online, others are interviewed over the phone), HESA keep a very close eye on differences in responses between methods – social desirability bias is the name of the phenomenon that sees interview subjects more likely to want to give socially acceptable answers. In pre-pandemic data, this is an effect seen in the wellbeing questions: graduates interviewed over the phone are more positive about positively worded wellbeing questions (happiness, life satisfaction, feelings of worth) and less likely to score negatively worded questions (anxiety) highly.
For the graduates interviewed during the pandemic, the pattern on positive questions continued – but phone interviews were more likely to report being more anxious. This was only a “slight” effect (we don’t get the numbers), but the briefing speculates that our more open societal attitude to discussing mental health during the pandemic had an impact.
Covid-19 and employment
We get an unusual look at employment rates by survey cohort – pitting the pre-pandemic cohort one (responses collected between December and March) against the toilet paper hoarding (March-June) cohort two, the “eat out to help out” (June-September) cohort three, and the second wave cohort four (September-December).
Unexpectedly, full time employment is up slightly on last year only for the second cohort, who responded during the first lockdown period. Unemployment is up slightly for each period, and the difference between this year and last year rises as we go through the year. We would expect this – young people suffered disproportionately in terms of employment during the pandemic – although the low numbers compared to the youth unemployment rate (which was up around 14 per cent) point to a graduate premium of sorts.
Graduates opting to travel during 2020 dropped by 50 per cent – it was about the same as last year for cohort one but this dropped sharply with the closing of the borders.
How does it feel?
HESA reports a slight contraction in highly positive responses to wellbeing questions during this year, particularly towards the end of the year – but these were balanced by a growth in moderately positive responses. Despite a hugely disquieting and difficult year, in the main graduates do appear – on these survey metrics, at least – to be resilient enough to be reporting similar responses to the only other existing cohort that has been surveyed in this way.
Again – huge flashing caveat here – this is a newish, experimental, data series and there’s clearly a lot we don’t know about what a normal year would be, or if there is such a thing. My instinct is no – there’s always something going on that will have an impact on graduate outcomes.
I hope that these commentaries continue alongside future releases – especially while Graduate Outcomes beds in. It’s a real benefit of having an expert and disinterested (in policy terms) data body dealing with information about outcomes. This year’s findings are (hopefully) a one off, but in a rational world they should make us pause just a little before measuring the “quality” of the system via output metrics.