How students influenced TEF panel statements

Jim Dickinson identifies whether and how students were listened to in this iteration of the TEF

Jim is an Associate Editor at Wonkhe

In the olden days, the Teaching Excellence Framework (TEF) “listened” to students in two ways – it used some of a provider’s National Student Survey (NSS) scores, and providers were “encouraged” to involve students in the production of their narrative submission, and to show how they had done so.

That represented a significant downgrade from Quality Assurance Agency (QAA) led institutional review processes – which as far back as 2002 had given students the opportunity to develop their own, independent Student Written Submission as part of the process – with plenty of evidence of impact.

So when Shirley Pearce’s review of the TEF kicked off in 2018, it was pretty much inevitable that we would end up with a call for students to be able to contribute in a way that was both more independent and more current than the satisfaction and outcomes metrics driving at least part of the process.

So building on a notably successful round of inviting student submissions into evaluations of provider progress on Access and Participation Plans, the regulator proposed something similar for this new TEF exercise – a lead student contact for every provider would be able to choose to develop and submit their own report, offering “additional insights” to the TEF panel on what it is like to be a student at that provider, and what they gain from their experience.

We now have 143 of these student submissions, along with the panel statements that tell us whether and how those submissions have been taken into account. I’ve spent some time looking at those statements (where they’ve been published), asking what caught the panel’s eye, what it gave weight to, and what appeared to make a difference to the overall rating.

Process

We might see student representation as a means to an end. Many of the statements noted strong partnerships between providers and their students’ unions (SUs), leading to significant improvements in student experiences, campus spaces, and educational support. There’s a major focus on how student views are captured and acted upon at a more granular level too, with many statements noting SU comments on the effectiveness of student representation systems.

And there are numerous examples of universities responding to SU feedback, such as altering teaching methods, repurposing campus spaces, and introducing new initiatives linked to student outcomes. One is highlighted in six initial case studies.

There’s a particular focus on support for diverse student needs. Several student submissions recognise efforts to support Disabled students, as well as tailored support for different student groups – a read across from access and participation work.

And perhaps inevitably, given both the pandemic and ongoing concerns about students’ proximity to campus, digital education and blended learning are recurring themes, with many statements noting that universities had received positive feedback from their SU on online resources and blended learning approaches.

Being seen to respond

This theme of being seen to respond to feedback as part of an ongoing partnership seems to be particularly important. Maybe SUs were relatively shy about direct criticism of the student experience at the point of submission – but evidence of responding to feedback over the four-year period appears to have picked up points.

Some universities had modified their teaching approaches based on student feedback, aiming to enhance the learning experience and better meet the needs of their students.

Several statements noted that feedback from students led to the repurposing of campus spaces, making them more suitable for study and other student activities. A number noted that universities had enhanced their student welfare and support services in response to feedback – particularly focusing on mental health and academic support.

The point is not so much that the student statements agreed or disagreed with the university statement. It’s that they demonstrated the way in which a culture of student representation causes change across the piece.

Not uncritical

That’s not to say that SUs were uncritical – although a handful of the “student” submissions from alternative providers don’t feel especially authentic. Some stress the need for more engaging course content and better support services, and point to variability in the quality of teaching and assessment.

A common issue – not a surprise given the state of NSS scores – was dissatisfaction with the quality, consistency, and timeliness of assessment and feedback. This included concerns about the effectiveness and impartiality of assessment methods. Some submissions noted lower than average satisfaction rates on student voice and representation, which for the panel backed up NSS metrics on how effectively student feedback is being acknowledged and acted upon.

A number of statements also note SU feedback on variability in the quality of teaching. Submissions that reported inconsistencies across different courses and departments, affecting the overall learning experience, were especially noted. And issues related to the availability and quality of learning resources – including library services, online learning platforms, and physical infrastructure – were highlighted in plenty of SU submissions picked up by the panel.

Several SU submissions had picked up criticisms from students over the need for more engaging and intellectually stimulating course content – some students felt that the content was not sufficiently challenging or relevant. And several SU submissions noted by the panel picked up on Eurocentric course content and a lack of steps towards enhancing equality, diversity, and inclusion, including the need to decolonise the curriculum and better represent minority groups.

And several statements mentioned student submission comments on the need for an improved focus on employability, work readiness, and the practical aspects of courses to better prepare students for their future careers.

Incentives for honesty

I’m still not 100 per cent convinced by the theory of change here – where there are problems, the incentives to be honest in the student submission are just too weak.

What’s also clear across the spread is that the absence of a student submission, and the variable quality of those that were submitted, ought also to be read as a signal of a provider’s wider commitment to the aspect of Condition B2 that covers student engagement.

The best bit of all this – and the worst bit – is that it feels like a one-off exercise rather than an ongoing process. The panel picked up positively on an ongoing response relationship where it existed – yet OfS hasn’t offered anything on what ought to be happening between now and TEF 2027.

If we think the SU ought to be writing a submission once every few years, we probably ought to be expecting it every year, shouldn’t we? Finding ways in the next exercise to allow the student submission to come in early – as used to happen with QAA – so it can be responded to as part of the exercise would take some of the peril out of the process.

And both on B condition minimums and on TEF aspirations, OfS really needs to try harder at putting students and their reps in a position where they understand what’s expected of them.
