December 2022 saw the Office for Students write to twelve providers with concerns about student outcomes. Data analysed against the numeric thresholds set – under registration condition B3 – for continuation, completion, and progression gave the regulator cause to seek assurances and explanations.
Of the eleven providers we have reports for – three FE colleges, four traditional universities, two specialist providers and two alternative providers – eight received a notice to improve, with the remaining three getting a clean bill of health.
The notices to improve require an initial provider-level review, and the development of an action plan with a built-in evaluation. Each of these cases represents an additional condition of registration, with improvement (performance at or above the appropriate numerical threshold) required by either 2027 or 2028, depending on the case.
What’s a B3?
It’s worth refreshing our understanding of B3 assessments – a provider’s own performance (and also the performance of groups of students, such as those linked to a particular subject or mode of study) is assessed against a set of numeric thresholds for continuation, completion, and progression. Though there is not an automatic link, performance below these thresholds represents a risk of breach of regulatory conditions and can trigger an investigation.
Each provider involved is able to submit contextual information to “explain” the data of concern. There are no inspections – this is purely a desk-based exercise.
In thinking about the B3 data that has driven these investigations, we should bear in mind the lagging nature of the three indicators. This illustration considers full-time first degree students for the 2022 dataset – the data will be different for other groups.
- Continuation: includes four years of data, dealing with cohorts who started their courses between 2017 and 2020
- Completion: includes four years of data, dealing with cohorts who started their courses between 2014 and 2018
- Progression: includes four years of data, dealing with cohorts who started their courses between 2014 and 2018
About time
In the fast-moving world of UK higher education, 2022 was quite a while ago. But although the process has been long and torturous – I’ve heard concerns about the quality and speed of OfS communications from providers involved – the sheer age of the data should really give us pause.
If you are involved with one or more higher education providers you’ll know that the combination of at least one internal course review, numerous module reviews, changes in staff, changes in delivery methods, and so on, will mean that a decade will leave pretty much everything unrecognisable. Students are not studying on, much less applying to, the same course that is reflected in this data. It’s a well-understood weakness in the OfS’ data-driven approach to regulation that has never properly been addressed.
So, when these eight providers are asked to show an improvement in the data before the start of the 2027 or 2028 academic year, that date represents the first point at which none of the data used in constructing the current indicator will feature in the new one. Even then, the indicators for completion and progression will include only one year of data referring to students who started their course after the OfS began the investigation – and none from after the results were published and action plans needed to be developed.
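To make the lag concrete, here is a rough sketch in Python – purely illustrative, with the lag ranges inferred from the full-time first degree figures listed above rather than taken from any OfS technical specification – of which entry cohorts a rolling indicator covers in a given dataset year.

```python
# Illustrative only: lag ranges are inferred from the full-time first degree
# figures quoted above, not from any OfS technical specification.

def entry_cohorts(dataset_year, min_lag, max_lag):
    """Entry years covered by an indicator whose data lags the dataset year
    by between min_lag and max_lag years."""
    return list(range(dataset_year - max_lag, dataset_year - min_lag + 1))

# The 2022 dataset, as described above
print(entry_cohorts(2022, 2, 5))  # continuation: [2017, 2018, 2019, 2020]
print(entry_cohorts(2022, 4, 8))  # completion:   [2014, 2015, 2016, 2017, 2018]

# The 2027 review point
print(entry_cohorts(2027, 2, 5))  # continuation: [2022, 2023, 2024, 2025]
print(entry_cohorts(2027, 4, 8))  # completion:   [2019, 2020, 2021, 2022, 2023]
```

On these assumed lags, by 2027 continuation no longer contains any of the 2017–2020 entrants that triggered the investigation, while completion includes only one entry cohort (2023) that post-dates the start of the investigation.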
OfS’ John Blake, in his many impressive pronouncements about evaluation and access, likes to talk about theories of change – what, in other words, makes you think that the action you are taking will have the result you are looking for. It’s a great discipline for anyone making interventions in processes to think about a theory of change. I would be delighted to hear more from OfS on what it thinks an intervention made in 2024 will do for cohorts who graduated in 2022 and 2023.
Process engineering
Though it is right that OfS considered the context in each case, and that providers were able to make representations before a regulatory penalty was applied, it should not have taken a year and a half to publish conclusions and make recommendations. Providers should have been kept informed about the timescales involved, and should have had their queries and concerns answered properly.
The appearance of TEF results during the investigation hasn’t been a good look either. Among the eleven providers investigated we find one gold (“typically outstanding”), three silvers (“typically very high quality”), and three bronzes (“typically high quality”). There was one provider holding a “requires improvement”, and three that did not choose to enter. It is therefore the case that a provider with “typically very high quality” provision also requires a blanket registration condition relating to student outcomes.
Though the data has been available publicly, you would have needed to hunt around a bit (or used my 2022 version of the data dashboard) to get a sense of whether a given provider would be in breach of the numeric threshold. Though you can find some of the names below at the tail end of these charts, there are many providers with similarly challenging data that have not been investigated – we are never told the rationale for why these eleven providers have been chosen.
One of the providers, Norland College, bumps along the bottom of progression data because the Office for National Statistics believes that childcare is not a graduate job. Norland nannies earn far more than most of us ever will, and all those who complete the course end up in their chosen job. This is neither new nor controversial information, and given that OfS awarded Norland a Gold TEF one would imagine that this contextual argument had been accepted. It is not clear why they needed to make it again here.
The findings
Of the twelve B3 investigations, OfS have published eleven sets of findings and eight notices to improve.
The default finding was either a B3A, B3B, or B3C specific condition of registration (the supplementary letters just refer to the number of previous B3 conditions held by each provider), requiring the provider to develop and monitor an action plan to show an improvement on the required indicator by the time of another OfS review in either 2027 or 2028.
In cases where the OfS found that contextual information did explain performance against an indicator threshold, it still maintained that the provider was “at risk” of breaches unless otherwise stated.
In reviewing Arden University the OfS had concerns around continuation, completion, and progression for full-time first degree students – alongside concerns related to part-time provision. The university argued that the socioeconomic profile of the student body (mature students, deprived areas) was a contributing factor, noted actions and investment since 2020, and had concerns about some healthcare-related outcomes not being classified as graduate jobs. OfS recognised that some substantial steps had been taken to improve outcomes, and noted the withdrawal of part-time other undergraduate provision, but did not agree with the two arguments made. It considered Arden at risk of a future breach of condition B3 and imposed condition B3A.
For Blackburn College, the concerns were around completion and continuation for full-time first degree courses, and continuation for full-time other undergraduate courses. The college argued that the socioeconomic makeup of the student body was a factor (mature, ethnic minority, low attainment backgrounds), that changes to leadership since 2019 had driven improvement and further actions were planned, and submitted internal information on current performance against the benchmarks. Here, OfS found that the context justified performance on continuation but not completion, and imposed condition B3B.
There was a similar split finding at Burnley College, where OfS found that contextual factors did not justify performance on continuation but did on progression. In both cases full-time first degree provision was in scope – Burnley made a socioeconomic argument (deprivation), noted the impact of the pandemic, and supplied information on student support and quality assurance (including data, case studies, and policy). OfS did not feel that actions already in train would fully address continuation issues, and imposed condition B3A.
At Croydon College, concerns related to continuation and completion for full-time first degree and part-time other undergraduate courses. The college submitted contextual information covering changes to the curriculum, changes to the leadership team, the socioeconomic profile of students (mature, female, deprivation), ongoing improvement plans and the termination of a student recruitment partnership. The argument about the recruitment partnership impressed OfS, but the other contextual factors not so much. Condition B3C was imposed.
Concerns about continuation at the University of Cumbria related to full-time first degree courses in business, and part-time taught masters provision. The evidence provided related to the socioeconomic profile of students at the London campus, and to previous actions taken regarding top-up courses and postgraduate student support. Cumbria was not found to be in breach of the continuation requirements for full-time first degree business students, but condition B3A was imposed to ensure early signs of improvement in postgraduate continuation were maintained.
The OfS found that Leeds Beckett University was at “increased risk” of breaches relating to completion among full-time first degree students, but was convinced by contextual arguments relating to a range of other student groups. Beckett told OfS about quality assurance processes, a number of courses that had been closed or withdrawn, courses subject to internal “enhanced monitoring” and the performance of business and management provision. Condition B3A refers only to completion for full-time, first degree, computing provision.
At London Film School, OfS focused on completion rates among postgraduate taught masters students. The investigation revealed historic data issues, and the corrected data did not give cause for concern.
For London Metropolitan University, contextual information justified data on continuation and completion for part-time, other undergraduate courses, and completion for full-time first degree courses. However, the university was found to be at increased risk of breaches of B3 for continuation outcomes for full-time first degree courses and all modes of postgraduate taught masters provision. LMU made the contextual case around student demographics (disadvantage), integrated foundation years, historical data issues, and actions already put in place – only some parts of the data issues washed with the regulator, so condition B3B was imposed.
I dealt with the Norland College story above – suffice it to say the same issues that are always raised about the construction of the progression indicator were noted by OfS once again.
OfS had concerns about full-time first degree continuation and completion at Richmond, the American International University in London. The university raised issues around reporting transfers to overseas providers, the nature of the model of participation (more similar to US norms around intensity and flexibility), actions underway, and the treatment of international students. None of this washed, so condition B3A was imposed.
At the University of Worcester an investigation into the continuation of full-time, first degree business and management students was conducted. Many of the students in question were studying via a partnership with an embedded college that had since been terminated. OfS took no regulatory action.
And there is one further investigation report still pending.
What’s the difference between these B3 reports and the existing quality assessment reports that we’ve seen in the meantime? Are they related at all? Are both instigated using student outcomes data? So confused.
Hi – the other quality assessment reports relate to conditions B1, B2, and B4. Though they can be triggered by concerns identified via B3 data, they can also be sparked by notifications or other information – this leads to an in-depth examination of teaching, assessment, and policy matters within a given subject area at a provider.
These B3 investigations are triggered only by B3 data, and are a desk-based exercise.
‘Contextual’ is a term that hides many issues that providers have little or no control over or ability to influence. Whilst some simply lower the bar for such students, others do so for the whole cohort, and every employer who looks closely and sees this happening excludes their graduates when recruiting, adversely affecting those with the ability and application required to graduate without ‘adjustments’…
Thanks for this summary. It seems the office is focused on process and throughput counting, with very little on quality accountability. As a London Met postgrad student I understand why many students leave the courses. No curriculum and nothing happening. The uni continues to exploit international students, who, once enrolled, are essentially trapped, fleeced and forgotten. The disadvantaged claim is a sad excuse for why the university is so bad.
OfS is doing the Ofsted version for universities, especially in replacing the Tier 4 sponsorship team – you had that coming, i.e. policing in education. (The Ofsted approach sadly led to a few teacher deaths in the past few years.)
Anyway, while this may be good for identifying certain providers that are either failing the system or exploiting the system, on the other hand genuine providers have no support mechanism from OfS, and the delays are also killing.
These bodies should focus on fixing the broken system first and then move to providers.
What Colin mentioned above makes sense – unless it’s a really technical or practical course, most degrees are obsolete but kept alive to generate revenue.
Employability challenge is a problem nationally on itself, if govt doesn’t create enough jobs let’s penalise the universities for it because their students are not employable.
As you dig in, there are more and more watchdog issues.
Self-reflection is required by those who run the system.
Vicky Green’s remark that “Employability challenge is a problem nationally on itself, if govt doesn’t create enough jobs let’s penalise the universities for it because their students are not employable.” seems very strange to me and a bit worrying.
It is not the job of The Government to create jobs. The public sector already employs too many people in my view and all public sector jobs (and generous pensions) are paid for by taxpayers. I and many others do not want to pay higher taxes.
In a free market economy like the UK, employers in the private sector offer work to prospective employees and currently there are around 1 million unfilled vacancies. There is no shortage of work or jobs.
Sensible employers will not, however, employ individuals who do not have the necessary skills and attitudes and who are not likely to add value to the business or cover the cost of their wages.
It is important that the OfS carries out these investigations and tries to prevent universities using taxpayer money to recruit unsuitable students who do not complete their courses and/or do not get a job at the end of their studies and end up on benefits, which are also paid for by taxpayers.
It is also unfair to students if those that universities take on do not progress to good jobs, as it may leave them with massive debt and heavy interest and loan repayments for up to 40 years.