OfS has updated a bunch of its dashboards – can the regulation keep up?
Jim is an Associate Editor at Wonkhe
The figures matter a lot because they’re the core of Condition B3 – providers must deliver successful outcomes for all of their students, which are recognised and valued by employers, and/or enable further study.
What we spent a long time describing as the “B3 bear” is a set of minimum percentage outcomes that all providers must hit – and if they don’t, they need a convincing justification and a plan. We explained the system on the site when it formally launched last September.
We can also see the data that drives the TEF – the usual toggles allow us to see benchmarks, and comparisons with those benchmarks.
In the context of OfS’ board papers earlier this week, the publication is interesting insofar as we now have a new theoretical list of providers that could go on to be judged to be in breach of B3. It doesn’t seem to have got very far investigating last year’s lot, but time marches on.
Since last year, some will have outcomes looking worse, and some better. The standard dashboard allows you to see where your provider stands, both at aggregate level and for each of the student characteristic, mode and subject splits.
Separately, the sector distribution of student outcomes and experience measures data dashboard shows… the distribution of student outcome and experience measures calculated for each provider across the sector.
This one is absolutely fascinating because it allows us to easily see how many and which providers are below each of the main thresholds at aggregate level.
So if we take full-time, first degree continuation for example, we see big names like the University of Bedfordshire, London Met and the University of Suffolk rubbing shoulders with a collection of FE colleges, some small and specialist providers, and more generalist private HE providers like the Global Banking School, Nelson College, the ICON College of Technology and Management and the London School of Commerce & IT.
In that group is the Bloomsbury Institute, which shows up with a 66.8 per cent continuation rate. Bloomsbury, you will recall, was the provider that was involved in a legal battle with OfS back when the regulator was less keen on publishing the acceptable thresholds.
OfS has tended to emphasise that it will target providers with volumes of students on the basis that doing so addresses the biggest risks – but if it continues in that vein the risk is that entire groups of providers who collectively enrol a large number of students in the OfS register’s “long tail” won’t be tackled.
I know that plenty of folk bemoan the metric and its use as a basis for regulation, but when you are looking (as I am now) at an HE provider with over 2,000 students, a completion rate of 50.8 per cent and a progression rate of 57.8 per cent, if nothing else surely a regulatory eyebrow ought to go up.
On progression, if you ignore the Norland Nannies (not a graduate job, see) on 3.7 per cent, there are a number of art and design providers – we don’t yet know whether any of them have been under scrutiny, or whether they’ve been able to argue that students somehow get other good outcomes.
Add in other providers and look at subject level, and it’s a wonder that OfS hasn’t launched a full-on outcomes assault on the arts – although maybe even OfS’ taste for taking on the sector doesn’t stretch that far, at least not yet.
There’s also a clutch of those Willetts-era private providers around the edge of London who are almost certainly sending different marketing signals on the career benefits of their programmes than the data here suggests.
There’s one other dashboard update – that shows data about the size and shape of a provider’s student population.