The twenty-second of November should be marked in your calendar.
That’s the official, twice-delayed sign-off date for the Jisc/HESA Data Futures return – the date by which all validation checks must be passed and all the subsequent data queries addressed and signed off.
The funders and regulators who depend on this data continue to offer pragmatic flexibility around the deadlines, recognising the problems that have been encountered in the delivery of Data Futures by many, if not all, stakeholders.
These delays also knock on to the tightly packed calendar of annual data returns and events: HESES, the Finance Return, ILR, NSS, Graduate Outcomes, and so on.
The professionals
At the eye of this storm is a thinly spread cadre of higher education data professionals. These are the people tasked with making the data returns; the people who have to digest hundreds of pages of dense, technical coding manuals in order to prepare and quality-assure their data submissions. This community has spent the last few months wrestling with a data collection system that has encountered a myriad of problems, most painfully the inability of the validation systems to reject bad data and accept good data.
This community is used to pressure. Deadlines are always tight and the quality thresholds are demanding. As funders, regulators, and league-tablers seek to drive increasing amounts of value from data, the significance of these data submissions ratchets up year after year – and the impact of poor data submissions becomes ever more painful.
Every year the autumnal peak of data returns is accompanied by an outbreak of gallows humour in student records offices across the land, and on social media. There are occasional howls of frustration and a smattering of ennui as the sector’s data professionals wrangle their datasets into shape and chase down the errors and queries to arrive at the rite of the data sign-off.
One way or another I’ve worked with this community for over thirty years and I’m sure they won’t mind me saying: they’re an unusual bunch. Not in a bad way; they are the most lovely, brilliant, tenacious, and thoroughly professional people you could wish to meet. They have a great sense of camaraderie and shared experience and an underlying feeling of “how did I get here?” They come from such a wide variety of professional and academic backgrounds and I’ve never met one who had “doing the HESA return” as a career goal in their earlier life.
Data careers
And herein lies the rub. HE institutions are critically dependent on their data professionals to meet the rapacious demands of funders and regulators, and to satisfy the aspirations for data-driven insights that will create better organisations and provide better, more targeted student support. But we often struggle to articulate the skills and knowledge required to undertake these roles. Anecdotally, we know it is becoming increasingly difficult to recruit and retain these staff, and in many cases there’s little sense of a defined career path once you’re on the treadmill of data submissions.
The recent Wonkhe/Advance HE study into the changing people needs of higher education highlighted the significance of data technology as a driver for change and the challenge of developing capabilities to meet future needs. Similarly, Jisc is emphasising the need to put people at the heart of digital development in HE.
Growing the community
The challenges here are broadly understood, so how can we better support, grow, and develop our community of data professionals?
We need to recognise that the community of data professionals transcends the myriad of professional groups in HE. This issue belongs to no one group or community; we’re all data professionals now and some kind of broad collective action is necessary to achieve change across the range of roles and professional disciplines in this area.
And we need a clear definition of the required data skills so that we can create professional standards and a career path for the data-centric roles in HE. This could include a competencies framework for data skills, supported by a training pathway or apprenticeship standard that covers both the science and the art of data.
It’s important that we make these roles more desirable in the eyes of aspiring data scientists. This is not just about salary levels. There is something about the lack of prominence of these capabilities in organisational design: a failure to properly acknowledge the value that these roles can and do deliver to institutions.
My final suggestion is a little more immediate and is addressed directly to leaders across the sector.
Some of your data professionals have had a terrible time these past few months. A few have taken to social media to vent anger and frustration at the Data Futures experience; many more have suffered in silence. There have been immense levels of stress and exhaustion, exacerbated by the knowledge that the stakes associated with these data returns have never been higher. Some have been damaged by this experience; some have been broken.
Right now, you need to show them some love.
The final paragraph is the most important – this is about people. There is an ever-increasing requirement for HE data to support the regulation of the sector in ways we could not imagine five years ago. We should step back and ask whether the perceived value of this forensic level of scrutiny is a good return on an investment built upon significant human stress and undervalued endurance.
The complexity and intricacy of student, staff, and other HE data are much greater than is widely understood or appreciated. This is made even more complicated by the layers of performance indicators built on top of these (e.g. B3, TEF), which require a detailed understanding of the often shifting definitions applied to the statutory data. It is not so much the complexity of the transformations and analyses applied to the data; it is more the complexity, interdependency, and fiddly nature of the definitions used to specify student populations and the like.
The registries and other teams that put these returns together are largely unsung heroes who create the data on which even the basic functioning of HE depends (I say this as an observer – I don’t work in a registry). Hopefully one good outcome of the DF troubles is that the work of these teams will be better appreciated and valued – both within institutions and more broadly!
In the US and Australia this work is incorporated into the role of Institutional Researchers. There is a clear role and career progression, with annual national and regional conferences offering networking and support. Historically there has been nationally supported training linked to relevant PhD programmes, with certification and training on working with national datasets. This supports both internal institutional research and wider sector research and evaluation. It overlaps with, but is distinct from, the strategic planning roles in the UK. Greater engagement with, and extension of, the HEIR group could be a way forward.
I wonder how many “Institutional Researchers” have the time to take on yet another role, given that many universities have downsized their academic support admin staff and that ALL academics are now loaded with much of the administration work on top of what they were employed to do. “All work and no play makes Jack a dull boy.”
@John, in the US ‘institutional research’ (effectively, research about institutions) is a term used to describe primarily administrative staff who are involved in collating, transforming, and using data for decision making, such as internal management information and sector benchmarking.
Data Futures is horrific.
From day one, universities told HESA and others why it wouldn’t work, and were never listened to.
If the main aim is to gather in-year data then it should have been done within the previous, well-established framework. Changes to the items collected and structure could have come later.
Those involved in the return are accustomed to the stress and pressure it brings annually, but this year has been off the scale and I sympathise immensely with all those who’ve had to contend with it.
Somebody should have been brave enough to call it the disaster it’s turned out to be and cancel it.
This is nothing to do with the people submitting the returns or data skills and everything to do with the people who forced this through.
Andy Youell is a former Director of Data Policy and Governance for HESA and was heavily involved in introducing Data Futures. Reading this article by him makes me even more furious.
100% agree. Andy Youell was a prime mover in the introduction of Data Futures and totally refused to listen to all those in the sector who tried to warn how misconceived the approach was and what the likely outcomes would be. Frankly the article is breathtaking in its hypocrisy and highly patronising.
It is a little too modest. I’d be wary of HE leaders bearing ‘love’ or ‘recognition’ rather than salary and staffing levels commensurate with the skills required and value created.
“There is something about the lack of prominence of these capabilities in organisational design” shouldn’t be a mystery. These people have to diagnose errors produced by others’ systems and processes, so ‘fixing’ data to pass validation becomes a twilight process that allows systems and processes to carry on regardless. Now, without reliable validation from JISC and with a blizzard of auto-queries, reactive data quality and maintenance can’t cover the gap, and every field and error is new.
HESA’s pragmatic flexibility? Followed by skills development in universities. This article is written with the intention of deflecting from HESA’s inadequacies.
We’ve certainly found this to be the case with our members, which is exactly why a membership model in which multiple providers share evaluation expertise works well. Great to see the shortage of staff with relevant data expertise being given the attention it deserves (although not so great the issue exists). I’ve lost count of the number of providers I’ve seen struggling to fill posts with a data/evaluation focus.
As someone who has worked in HE data leadership roles on both sides of the Atlantic, I would argue that US required data submissions are not comparable to the UK’s data returns. The level of complexity and detail required in the UK is much higher…and the data coming out the other end has also traditionally been of much higher quality and more useful in the UK. It is very sad to see things where they are, especially as many of us were raising red flags five years ago.
I agree with Andy’s general argument. Data people in universities come from a variety of different backgrounds. It is not always clear how you train to be a data person. Many of us, including myself, landed in it by accident. But these people have never been more important, and developments in AI (all types) only make high-quality data more central to everything that we do, operationally and strategically. Institutional research in the US is also trying to reinvent itself to recognize this broader reality.
We may all be ‘data professionals’, but the first link (Hilditch, 2018) distinguishes return compilers (Data Quality, Statutory Reporting, or External Returns Officers) from systems managers and data customers, who report on validated data enriched by HESA. It outlines the range of skills necessary to generate valid data from disparate systems and processes. They know the defects in records systems and processes, and how the data got made; that’s why they’re kept in obscurity.
HESA once moderated data customers’ requirements with precaution, but since it was absorbed into JISC, the data customers (led by OfS, with internal data customers on board nodding along) have ushered in their ideal Data Futures model — all things to all customers. This disregarded the risks of depending on software suppliers to redevelop their records systems and on records systems managers to implement those developments, and ultimately rested on the assumption that return compilers can and must weave together data that conforms to data customers’ ideal model. HESA Liaison seem to have been put in a similar position to compilers, obliged to work around holes in the validation of data under JISC’s experimental model and software, while the Project Board sample and monitor feedback remotely.
Data Futures is the return data customers wished for, on the assumption that return compilers can and must stitch data together to suit the customers. Don’t they always? Now, due to complete dependency, the only mitigation data customers can conceive of is extensions to ward off their own day of reckoning (before retrospective corrections).
There have been various groups running since 2020, raising concerns with HESA and software suppliers and working through issues. These groups were instigated by customers of software providers and by local networks, and they have been invaluable; however, there has been no formal group officially representing the sector on the HESA Data Futures project, with clear expectations for engaging with the project established by the sector. This is where I would have liked to see HESA’s pragmatic approach extend, rather than falling short at deadline requests from the sector.
Considering the volume of providers expected to implement the change, this should have been a fundamental project requirement. More engagement was required than sending out surveys about the data model.
This has resulted in the sector reluctantly, and with significant difficulty, accepting the project implementation issues from HESA. Providers have had to deal with their own internal challenges around the volume of change in the project, as well as software supplier issues and HESA’s issues. Even appreciating the complexity of the project, the approach above would have been a more collaborative one, working towards a more successful project.
The written announcement from OfS states that in-year collection will go live when HESA have the necessary functionality in place; however, this wasn’t in place for the first end-of-year return. The absence of functionality, and the release of functionality with significant bugs, added to the existing complexity.