The Skills and Post-16 Education Bill’s bumpy ride to Royal Assent is nearly over.
One final round of Parliamentary ping-pong, and this miscellany of proposed legislation – covering everything from essay mills to defamation to (most controversially) the end of BTEC qualifications – will become actual law.
Perhaps the least controversial facet of the Skills Bill is the Local Skills Improvement Plans (LSIPs). There has been a lot of – clearly fascinating – debate about how these plans will mesh with the various permutations of administrative geography. But the core idea – that what is taught should be decided by those who control entry to employment and have an idea of the skills that this might involve – has been curiously uncontroversial.
Who says what goes
Much is said in higher education of institutional autonomy – of the right of a provider to determine what is taught and how it is delivered. This autonomy is checked by the market for applications, because a course that nobody wants to do is not a viable one. And, much more controversially, it will soon be checked by the kind of jobs (or other graduate outcomes) that the course has historically led to.
If only there were some kind of documentation concerning what a course of a given type or title might be expected to contain. Providers deciding to update, extend, or refresh their portfolio could access a central, agreed source of truth about what subject experts, professional bodies, former students, and key employers expect a course to cover. A course designed to these standards would be attractive to applicants – who would feel confident they were getting to study the same topics wherever and however they chose to study – and would lead to good employment outcomes.
Best kept secret
Today sees the launch of the latest set of subject benchmark statements. We’ve always been fans of these at Wonkhe – but Alison Felce’s characterisation of the statements as the QAA’s “best kept secret” feels like it cuts rather close to home at the moment. You’ll look in vain for a mention of them in the voluminous, multiple-ring-binder regulatory framework developed by OfS – and engagement with the statements does not characterise any of the myriad diatribes about “Mickey Mouse” courses that appear to have sustained the commentary industry (and latterly ministers) for more than twenty years.
Despite QAA being a member organisation, the statements are free for anyone to read. Staff, students, parents, employers, professional bodies – anyone can use these nationally recognised standards to hold a university to account if a course does not do what it says on the tin. And any provider (from the oldest to the very newest) can draw on the accumulated experience contained in each statement to develop a course that can compete with any similar offering.
The moment comes
The Skills Bill takes legislative steps to develop localised specifications for courses to address immediate needs. These specifications (the LSIPs) will need to be modulated through a longer-term perspective to ensure that courses do more for the learner than get them their first job – for me, at least, a higher education qualification should set you up for a lifetime of employment or further study. The development of higher technical qualification specifications and degree apprenticeship descriptions involves bringing together employer and professional expertise via the Institute for Apprenticeships and Technical Education (IfATE).
And in traditional higher education we have the subject benchmarks. For a twenty-year-old intervention, this latest iteration of statements feels incredibly timely.
Other parts of the post-compulsory sector are playing catch-up – dragged into their own agreed standards via mandates and requirements. Higher education is more consensual and more mannered in scaffolding the world-class universities that DfE likes to brag about – there is no mandate to use these benchmarks, but it would be very courageous (in the Sir Humphrey Appleby sense of the word) to develop a course without them.
So it frustrates me that we don’t make more noise about how great these subject benchmarks are. Hidden alongside the Quality Code as evidence of a sector that takes standards and expectations seriously and maturely, they hold the key to squaring the quality circle in a world of fashionable metrics and moral panics.
“there is no mandate to use these Benchmarks but it would be very courageous (in the Sir Humphrey Appleby sense of the word) to develop a course without them.”
Yes, to an extent – but in its analysis of responses on the quality conditions, the OfS cites subject benchmark statements, external examining, and the use of externality in research degree examination as examples of gold plating, while asking (slightly aggressively) that we dismantle the complex internal procedures to which they relate (committing to writing something which has previously mostly been confirmed orally). I remain perplexed about this, as the research degree examination process (in particular) is straightforward and largely unchanged since the 1950s.
The statements – useful as they are – seem to be something the OfS is actively hostile to, rather than merely ambivalent about. The change in frequency, volume, and tone of demands that we dismantle processes the OfS dislikes (seemingly on the basis that its bureaucracy is necessary and a powerful force for good, while everyone else’s is a pointless waste of time) makes me worry we’re about to face much more explicit demands to “reduce internal bureaucracy”.
Making noise would be good, and may be necessary.
If only it were true that unviable courses were closed and replaced by more attractive ones co-designed (where relevant) with alumni, employers, and professional bodies.
One issue, though, is that benchmarks are the product of consultation rather than proper market research. The views of some easy-to-access organisations are thus over-represented, and I suspect this means that dynamic change in the labour market is often under-reported.
In my experience the relevant “employers” for a course are ill-defined (the best insight comes from the line managers of graduates, not recruiters or senior execs), with too few small and micro-business graduate employers engaged.
The insights of graduates who have recently transitioned into professional roles are woefully under-emphasised. Surely they, with their immediate supervisors, can identify the clear gaps in their experience and skills that course designers might be able to address?
Couldn’t agree more with the article’s comments about the value of the subject benchmarks, and I’d endorse what Andy says about the perplexing hostility of OfS towards these. For example, if we need to demonstrate (under Condition B1) that our courses are up to date, provide challenge, and are coherent, aligning them with benchmark statements developed and agreed by subject experts, professional bodies, employers, and former students is an effective (though of course not the sole) way of doing so. It’s yet another example of OfS’s illogical, and ultimately damaging, scorched-earth policy towards the work of other sector bodies.
I think you may be making the mistake of assuming that the regulator cannot simultaneously hold two contradictory thoughts.
OF COURSE we are only required to comply with their bureaucracy, and everyone else’s is pointless; equally, OF COURSE we must produce (lots of) documentary evidence in relation to anything at all that they choose to enquire into. There is apparently no inconsistency between these two statements.