In 2017, I wrote “Beyond Metrics: an open letter to Michael Barber.”
I congratulated him on his role as chair of the newly formed Office for Students, and acknowledged that his commitment to accessible, low-cost, innovative education and to whole-system reform preceded him, as did his reputation for deliverology. Deliverology, in a nutshell, is the process of setting the right targets, combined with the right incentives and penalties, and then standing back to watch as the sector reshapes itself to deliver them.
I noted, colourfully I thought, the dangers of creating perverse incentives (Goodhart's Law in action), using the case study of Hamsterdam from the legendary television series The Wire: faced with impossible targets to clear crime from street corners and reduce the homicide statistics, the police create a zone where the Baltimore drug cartels can operate freely. I asked Sir Michael to look beyond the statistics and recognise the value of less easily quantifiable benefits.
But that was then and this is now, and I think I’ve been won over to deliverology.
The purpose
For higher education providers who transform the futures of students from lower socioeconomic groups, creating a public narrative about the value of our work can be an uphill struggle, in the face of brickbats, media caricatures and the measurements that feed league tables. The absence of public validation of the purpose and merit of the contribution made by teaching-led anchor institutions undermines the morale and confidence of staff and students.
The provider TEF submission, underpinned by publicly available evidence, proved to be an opportunity to write that narrative: of the lives changed, and of the immense, dedicated, imaginative, resilient, persistent, expert, collegiate work that goes on across the sector to support students to achieve their academic and professional goals and potential.
The relatively short timeframe forced a “sprint” approach, used in project management to achieve an output at speed. The complexity of the TEF requires groups of people with different expertise and perspectives to work as a team to solve a series of ambiguous problems (so, although none of the longitudinal sector-wide projects successfully defined and measured learning gain, now you’d like us to have a shot in the next three months?).
It displaced other activities and imposed a schedule with momentum. Once surrendered to, that pressure generated focus, energy and teamwork.
And the process
We started from the perspective that the narrative must address student experience and outcomes and align with the provider-level data. The public underpinning data limits any tendency to self-delusion and supports honest reflection and a shared understanding within the university. Once the overarching narrative was in place, we started exploring where the data for sections of the provision, or for groups of students, was off the pace.
We found some data reporting errors (yes, really!) but we also found a lot of self-awareness, change already under way, and evidence of improvements which hadn't yet hit the lagging public metrics. The page limit enforced brevity and selectiveness, restrictions that turn out to be the conditions for a readable, recognisable, authentic narrative describing the richness and diversity of our provision. The proofreader said it was the first time she'd really had a sense of the work of the university.
The pay-off
Over 50 staff, governors and students provided feedback on the draft document over a three-week period in December and January, which meant there was genuine co-creation. Colleagues contributed new sections of text, case studies, data, amendments and clarifications, and their challenges produced debates about priorities and about how activities and goals should be articulated.
One of the perpetual challenges of academic quality is to stop it degenerating into bureaucracy and reporting, an industry of ticking boxes without meaning or traction. The TEF exercise felt real and productive and, compared to some processes, efficient. It showed the landscape of the university as it is at the moment, based on a shared understanding of the metrics, and its impact on students and the region. A portrait of the university as a whole and in its parts, it's a foundation on which to plan the next five years. If we're here now, and we want to be there, what's the journey? If this aspect, now we've drawn it fully out into the light, isn't working as well as we'd like, what are we going to do about it?
Of course the public result, when it arrives, will be important to us. But perhaps even more importantly, TEF has been an opportunity to understand our provision better, to review our aims and how well we are achieving them, to renew our pride in all that we achieve, and to use this to shape our priorities for the next five years: to improve how we as a university community educate and support our students, and enable them to fulfil their potential and their dreams.
I agree wholeheartedly that the exercise allows a valuable deep dive into how we do teaching and learning in the sector, but only for those who have failed to scrutinise the quality of their T&L on a regular basis. If it takes a TEF – and like REF it will be interesting to capture the cost of the exercise – to achieve this, then it's a poor reflection on the sector, frankly. I guess it smacks of the age-old maxim that what gets measured gets done, sadly. Students deserve better.
I recognise elements of the picture that Sian paints, and I’ve always felt that the data presented by TEF (including the earlier iterations) was interesting and could serve as a useful prompt for reflection and action (alongside reflection on all the other data, quantitative and qualitative) on the education we offer. The biggest problem with TEF has always been, and remains, the desire of the DfE and now OfS to paint in primary colours. The Gold/Silver/Bronze schema is simplistic, reductive and of no value. A real opportunity was missed with the current TEF to reform the way in which submissions are evaluated to adopt a more nuanced and valid approach, for example drawing on the way that KEF operates (which I’m sure isn’t perfect, but at least tries to recognise the complexity of the area it evaluates and reflect this in its outcomes).