When reading through the Office for Students (OfS) consultations and the regulatory condition B3 proposals to set thresholds for assessing institutional performance based on student outcomes, my initial reaction was that the package was relatively balanced, and that each of the component parts can individually be rationally explained and understood. Indeed, had it been written by a different regulator I might have been relatively relaxed about the proposals.
There are, of course, some areas that I would want to see tightened up – particularly the progression to graduate employment indicator and some of the data that underpins it. I do also have real concerns about the regulatory burden of using such a heavily data driven approach, especially on smaller providers. But the more I’ve been thinking about the proposals the more it has struck me that the primary issue is one of trust.
Regulation does not exist in a vacuum. The proposals come against the backdrop of how OfS has operated in its first few years. I would suggest that many of the sector's requests for additional information or transparency in OfS processes, and the wider outcries and consternation about the proposals, stem from an erosion of trust between the regulator and the sector.
The first few years of the OfS regime have been marked by overly legalistic and confrontational communications, a lack of genuine consultation with, and consideration of the views of, those that OfS regulates, as well as insufficient consideration of UK and European regulatory cohesion and of international reputation.
There will always be a tension in the relationship between regulator and regulated, but that relationship does need to be built on trust. This was highlighted in last week's National Audit Office report, which suggested that OfS needs to communicate more effectively with the sector to build trust in how it regulates financial sustainability. I would recommend that OfS considers what more it might do to rebuild and strengthen its relationship with the higher education sector.
Progression to graduate employment
In the details of the proposals I have particular concerns about the progression indicator, both on a principled basis – it does not measure the quality of learning opportunities – and on a practical level.
The role of the regulator is to regulate aspects that an institution can control. There are structural inequalities within society that affect the type and level of jobs to which people are appointed. There are also likely to be regional differences in current skills needs, and ignoring this factor by removing benchmarks has the potential to undermine government plans to level up regional economies.
However, we recognise that OfS has expressed a policy intent of seeking to measure successful graduate outcomes. We therefore propose that if OfS insists on proceeding with the progression indicator, it needs to be strengthened to ensure that it is robust.
First, the data is not currently robust enough. This is particularly important in the context of the OfS’s role as a provider of official statistics. Graduate Outcomes data shouldn’t be used until response rates are at least 50 per cent and coding errors resulting from self-reporting of jobs and coding by non-experts have been resolved.
Second, the SOC codes themselves need to be more flexible, to recognise that jobs change their skills needs, that new jobs are created, and that some jobs are already misclassified. A review of SOC every ten years is not enough to capture these inconsistencies and changes, so OfS should maintain an additional list alongside the SOC codes.
Third, and most important, OfS should draw on a wider set of information to contextualise the purely quantitative SOC data. This context should provide a better recognition of student and graduate expectations, and could be gathered through the additional questions in Graduate Outcomes about whether graduates find their job meaningful and whether it meets their future needs.
Regulatory burden
The final area of concern relates to the regulatory burden of the proposal. I have real concerns about the mass of data resulting from the thousands of indicators, split metrics and data points. This would create a significant data analysis burden for the regulator and for providers, and would particularly disadvantage smaller providers, who do not have teams of data practitioners to scrutinise the data workbooks.
Well over half of the providers that OfS regulates are small and micro providers with volatile datasets. Their indicators are unlikely to be statistically significant, and they are likely to require contextual conversations on a more regular basis than larger providers – baking in additional regulatory burden for smaller providers.
It is also worth highlighting that the lack of coherence between the datasets OfS is proposing for B3, TEF and APP is likely to further increase the burden on institutions, as each uses slightly different data or benchmarking approaches.
OfS prides itself on being a risk-based regulator and so could consider developing a set of key performance indicators that are more directly linked to the risks. The OfS also has duties to have regard to the need to use its own resources efficiently, effectively and economically and to have regard to best regulatory practice. We believe that the scale of data analysis and associated burden is not an efficient and effective use of OfS resources, nor is it proportionate to the regulatory risk.
The mass of data means that it is impossible to see the wood for the trees, and essentially means that OfS can decide which institutions it wants to investigate, because every institution is likely to fall below an indicator somewhere. This emphasises the importance of the processes by which OfS prioritises where it focuses its resources. A clear, transparent and robust prioritisation process, free from political interference, will be key to maintaining confidence in the regulator.
Finally, regulators should always be mindful of unintended consequences. The current proposal risks swamping providers with data and would, in practice, divert considerable staff resource away from established processes of quality management such as annual monitoring and periodic review.
The OfS proposals could be improved by clearer prioritisation of areas of focus – through annual key performance indicators publicised in advance, with only the data for those areas published – and by significantly enhancing the data underpinning the progression metric. However, the key issue OfS will need to consider is how it goes about rebuilding trust, which in a largely compliant sector is essential for good regulation and the efficient use of resources.
If you want to be cynical, the metrics seem perfectly designed for newspaper headlines – "fewer than six out of ten University of Nowhere students get a good job" – that are hard to fight.