Data even further into the future
David Kernohan is Deputy Editor of Wonkhe
Will there ever come a day when student data is collected from universities in-year?
Whenever that glorious day falls, it will not now be in the 2024-25 academic year. The implementation of the Data Futures programme – a seven-year epic with roots extending back to the sector-led HEDIIP project – has once again been delayed.
Today’s letter from the Office for Students does not include a long hoped-for apology to the army of sector professionals who return student records data (for that, we turn to a note from Jisc as Designated Data Body). It is important that we recognise the intolerable pressure placed on staff in this position – unpicking genuine concerns about submissions from a blizzard of inapplicable error messages and unexpected stop points, and with the available support and advice often unable to address these concerns.
The issue is within the submission platform itself – when OfS talks about technical issues, these are with the central technology, which is being developed by Jisc on a bespoke basis. Despite a detailed planned testing regime, and a programme architecture that included regulatory customers at both board and delivery group level, serious issues only became apparent in August – and addressing these had a knock-on impact on other work planned during the last three months.
If you are reading this wondering just how hard it can be to count students, the sheer scope and detail present in the HESA Student data collection is not to be sniffed at. There are numerous rules, conditional requirements, and error checks that ensure the vast quantity of data collected each year is fit for regulatory purposes. When these processes break down, or struggle, the quality of data is affected. The issues faced with data collection this year (and, by extension, next year) put the suitability of this data in question.
It’s worth quoting from the OfS letter:
I also want to remind you of the important role that context plays in our regulatory approach. Whether through the narratives included in access and participation plans, or the dialogue involved in our assessments of student outcomes, the OfS would not normally expect any single year of student data to be determinative of a regulatory decision. We expect that the data quality that has been possible for the 2022-23 Student data return may contribute relevant contextual information for some of our assessments.
That line at the bottom is hugely telling. There are serious concerns here about data quality – which, in turn, is why OfS is being so forgiving about submission and sign-off deadlines, both on the Student collection (where providers are urged to contact the regulator where extensions beyond 22 November are needed) and on other returns relating to the student record. The HESES data return (which supports OfS funding allocations, among other things) has been pushed back from 11 December to 18 January, with a knock-on impact on the sign-off deadline. Extensions are available on request for the submission of contact details for Graduate Outcomes and the National Student Survey.
OfS will apply caveats to this year’s data where appropriate:
There are various actions we can take in response to concerns about data quality. For example, we can publish clear explanations of known weaknesses in the data, suppress or remove data, or require a provider to submit data amendments to correct the most serious issues. As we do each year, we will consider those actions on a case-by-case basis, informed by the nature and severity of any data error and its implications for how we want to use the data.
And at a provider level the data amendment process is still running.
The decision to pause the roll-out of in year data collection while an independent review of the programme takes place is the right one. A shift in data collection plans of this scale requires absolute confidence in the systems available – currently this simply isn’t there.
The surprise here for me is the existence of critical issues with the platform at this late stage in the process. OfS had already taken steps to beef up governance and oversight of this programme – and currently sits, alongside senior Jisc staff, on both the programme board and the Data Futures Delivery Group. It is vanishingly unlikely that issues were not known about, or at least suspected, at the coal face before August – the independent review will likely look at reporting lines within Jisc and the programme governance structure to understand where and why concerns were not passed up the chain of command.
Thank you for highlighting this under-appreciated and often invisible area of HE statutory compliance, critical to the continued functioning and funding of universities. The work is complex and detailed, and requires a subject expertise in student data operations that is taken for granted and under-valued by institutions. Student data teams have worked under impossible conditions this summer to attempt to deliver to the programme, and the negative impact on teams and people has been high.
The elephant in the room is why OfS were absent from so many of the HESA/Jisc sessions at the time we were trying to understand the requirements of the new DF records. By the time errors in interpretation of the requirements of the DF return were flagged in the HDP, we were way too far down the line to engineer a different solution; earlier engagement from OfS would have gone a long way to mitigating this.
Maybe that triennial review of the DDB OfS had promised might have been a good idea after all.
I think what’s most concerning in all the finger pointing that has taken place over recent months is the severe breakdown in trust between OfS, HESA/Jisc and providers themselves. Those damaged relationships will continue to impact data quality, and the spectre of this year’s SNAFU will be felt for years to come.
The point about the breakdown of trust between providers and HESA is very true. This is not a reflection on the people at HESA at the coalface – I feel for those colleagues in HESA Liaison who have had to deal with these issues. The questions are at the very senior levels, where they either did not know what was going wrong or, as I have been led to believe, hid the scale of the problem from the Programme Board.
In previous years HESA had deadlines (Insert, Commit, etc) which involved engagement with their business/quality rules. This year institutions had to submit files containing 50% (May) or 90% (August) of the expected total population, and which complied with schema rules only. That meant that most institutions could say they were on track until August/September, as there was no external pressure to engage with the quality rules process. It is not really very surprising, therefore, that all the quality rules problems emerged so late in the day.