HESA UKPI Continuation, 2018-19
David Kernohan is Deputy Editor of Wonkhe
Technically, a provider performing poorly against a non-continuation benchmark could find itself in a B3-related Office for Students investigation very quickly. Performance against the benchmark would also, in theory, feed into whatever becomes of the Teaching Excellence Framework.
But as we sail closer to the edge of the Covid discontinuity, data on student attitudes and behaviour feel like a much less robust regulatory tool. We all know that next year’s iteration of this data, which will cover students who started their courses in 2019-20, will be as fascinating to examine as the implications are terrifying to consider. Many students may have left their course through no fault of their provider. Study during a pandemic is harsh and unrelentingly painful – provider responses can mitigate that but can’t make it stop.
I would honestly be surprised (or, at least, disappointed) to see OfS step in on the basis of numbers that may as well have come from another world. But sometimes it is interesting to travel there:
Not in higher education – 2018-19 entry
Here’s the classic plot of observed non-continuation against benchmark. Bear in mind that this relates to full-time undergraduate students who remained on their course for longer than 50 days.
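If you want to build something like this yourself from the published data, a minimal sketch in Python might look like the following – the filename and column names (provider, observed, benchmark) are placeholders for whatever is in the actual HESA export, and the rates are assumed to be in per cent.

```python
# Minimal sketch of an observed-vs-benchmark scatter.
# Assumes a CSV export with hypothetical columns: provider, observed, benchmark
# (non-continuation rates in per cent) - adjust names to the real file.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ukpi_non_continuation_2018-19.csv")  # hypothetical filename

fig, ax = plt.subplots(figsize=(7, 7))
ax.scatter(df["benchmark"], df["observed"], alpha=0.6)

# The diagonal is the benchmark itself: points above it are providers
# losing more students than the benchmark suggests they "should".
lim = max(df["benchmark"].max(), df["observed"].max()) * 1.05
ax.plot([0, lim], [0, lim], color="black", linewidth=1)

ax.set_xlabel("Benchmark non-continuation (%)")
ax.set_ylabel("Observed non-continuation (%)")
ax.set_title("Non-continuation vs benchmark, full-time first degree entrants, 2018-19")
plt.show()
```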
There are other options to dive deeper into the data via age, POLAR, and previous HE experience (think of this as “expert mode” – the default presentation of the datafile assumes you already know your way around the HESA data), but the chart is preset to show all first degree students.
You can see here the kinds of “high absolute” values that could see a quality and standards-driven OfS knock on the door (or, indeed, break the doors down – s61 of the Higher Education and Research Act 2017 applies). Smaller providers and those that specialise particularly in access missions are, as always, among the names at the top of the list.
I’ve also plotted the difference from the benchmark – a view some people find easier to work with. Here the smaller lines show the standard deviation (I’ve pointed them in the right direction to make things easier) to give us a more nuanced view of statistical significance than just the flags.
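For anyone working from the raw file rather than the interactive chart, the same view can be put together by scaling the difference by the published standard deviation – again just a sketch, using the same placeholder column names plus a hypothetical std_dev.

```python
# Sketch: difference from the benchmark, expressed in standard deviations.
# Assumes hypothetical columns: provider, observed, benchmark, std_dev
# (rates in per cent) - adjust to the real column names in the HESA file.
import pandas as pd

df = pd.read_csv("ukpi_non_continuation_2018-19.csv")  # hypothetical filename

df["difference"] = df["observed"] - df["benchmark"]

# Scaling by the standard deviation gives the more nuanced view of
# significance described above, rather than relying only on the published
# flags (HESA's flags combine a standard-deviation test with a
# percentage-point threshold - see the UKPI definitions for the cut-offs).
df["diff_in_sds"] = df["difference"] / df["std_dev"]

top = df.sort_values("diff_in_sds", ascending=False)
print(top[["provider", "difference", "std_dev", "diff_in_sds"]].head(10))
```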
If you’re thinking that this looks like a pretty normal year, you are right. Non-continuation lingers between 11 and 12 per cent, as it has for the last five years. Smaller providers (a mathematical, not a student experience, phenomenon) and those who specialise in offering HE to non-traditional groups tend to overshoot the benchmark – specialist providers (and the OU) tend to be better at keeping students on course than the benchmark suggests.
Here’s a look at how your (or any) provider got on compared to previous years – “no longer in HE” is blue (with the benchmark as a black line), “transfers” are orange, and continuation/qualification is red.
Chicken entrails time
This data release also projects outcomes for the 2018-19 cohort (for larger providers in England). As far as I can see, this is based purely on historic patterns and takes no account of the Covid discontinuity, so I can only assume it is presented as a charming novelty. Here’s the plot anyway:
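For anyone wondering what “based purely on historic patterns” amounts to, the idea is simply that the new cohort is assumed to behave like previous ones – the toy sketch below illustrates that logic with made-up rates, and is emphatically not HESA’s actual projection methodology.

```python
# Very rough illustration of a "historic patterns" projection - NOT HESA's
# actual method: apply continuation rates observed for earlier cohorts,
# year on year, to the new cohort. All rates here are placeholders.
historic_continuation = [0.90, 0.95, 0.97]  # hypothetical year 1->2, 2->3, 3->qualify

cohort = 1.0  # start with the whole 2018-19 entry cohort
for rate in historic_continuation:
    cohort *= rate  # assume this cohort behaves like the ones before it

print(f"Projected to qualify: {cohort:.1%}")  # ~82.9% with these placeholder rates
```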
Certainly, questions will already have been asked in Luton.