It’s now a “representative sample” of assessed work
David Kernohan is Deputy Editor of Wonkhe
One of the biggest concerns about OfS regulation has just been partially addressed. Though commentators have focused on free speech and financial stability, the issue that has most exercised academic registrars has been the retention of all assessed work for five years.
Storing archives like this – with assessed work including physical artefacts (architects’ models, fashion, sculpture), digital recordings (music, dance, presentations), and the thoughts of others (observations, placement reports) – is a complicated and expensive business, as large universities that store all kinds of research data for posterity already know. It’s all there in the guidance pertaining to condition of registration B4:
As part of its approach to assessing compliance with this condition, the OfS is likely to need access to students’ assessed work, including for students who are no longer registered on a course. A provider is therefore expected to retain appropriate records of students’ assessed work for such regulatory purposes for a period of five years after the end date of a course.
To be clear, this isn’t just a regulatory requirement – providers may also want to store assessed work (and the wider pool of “records of assessment”, including rubrics and guidance) against the possibility of a future legal challenge, with Jisc recommending a retention period of six years after graduation (the statute of limitations applies).
Today’s supplementary guidance from OfS finesses the ask – it is now clear that only a “representative sample” is required. This shift comes from a working group including representatives from Universities UK, the Association of Heads of University Administration (AHUA), Jisc, and the Academic Registrars’ Council (ARC).
But what constitutes a representative sample? Well, a university has to decide what is appropriate and be ready to explain its reasons for that choice. It’s the classic OfS “think of a number – no, not that one” approach, which allows the regulator to be as tough as it likes while simultaneously criticising the sector for “gold plating” regulations and going beyond what is required. Though many providers have retained some assessed work for a period to allow for appeals, the retention of work for regulatory purposes goes far beyond previous regulatory assessments, which examined the assessment and marking process rather than specifics.
To address specific registrarial sarcasm, the end of the document contains a table detailing what is to be kept for each type of assessed work – for instance, OfS would expect to see the written work, marks, feedback, and assessment brief from an assessed lab practical (not the actual collection of hydrocarbons each student had fractionally distilled from crude oil). For a performance, just the brief and a record of assessment (including assessor observations) would be needed, not a digital recording of the performance.
This section “does not provide an exhaustive list and is intended to provide a framework for providers to use to think about the records that might be appropriate for retention”. However, one is also left with the impression that if at least a sample of the required records is not available, any OfS assessment team would not be impressed.
When Paul Greatrix saw a draft of this guidance, back in January of this year, he did not feel like all of his concerns had been addressed. It feels like we’ve got a bit more specific advice here, but the word “appropriate” (as Greatrix notes) still does a lot of heavy lifting.
The guidance makes things much less bad than they were, and represents a classic OfS partial rollback – I’ve heard a couple of OfS-linked folks talk about this as a clarification of what was always meant, but it isn’t what the guidance said, and is emphatically not what OfS staff were saying in person at the time (when all work meant all work).
It’s still (yet another) huge regulatory burden with significant cost and time implications for over-stretched HEIs. While I can see the logic that this is something reviewers may want to see, five years is a very long time to keep a sample of all assessments, for all programmes and students, across all providers. You only need to look at how many reviews the OfS can realistically support in a year (small single figures so far, focused on bits of subjects in some providers; it is notably the QAA, not the OfS, looking at international foundation programmes) to see that this is another sledgehammer to crack some pretty tiny nuts. Some 99.9999999 per cent of the assessments held are never going to be looked at by anyone, ever – this is data being collected and saved for no purpose other than regulation.
If the OfS (or, given the true starting point for every issue, more likely the Telegraph, the Mail, or the Times) has concerns about the marking of a subject in a provider, it still isn’t clear to me why it couldn’t just go in and review the marking from that year or the year before.
But we are where we are, and this is at least more manageable than the starting point (which came into effect in May 2023(?), so there are presumably a fair few places already in breach of the expectations, *if* they are unlucky enough to trigger the ire of the OfS, the government, and/or any of the right-wing newspapers).