This article is more than 1 year old

What TEF submissions told us about the student experience

Many provider TEF submissions describe innovative, co-created initiatives. Livia Scott, Sunday Blake, and Jim Dickinson found that, in comparison, the student submissions told a slightly different story.

Livia Scott is Partnerships Coordinator at Wonkhe


Sunday Blake is associate editor at Wonkhe


Jim Dickinson is an Associate Editor at Wonkhe

When we took a look at the TEF submissions, we were interested in seeing how providers are delivering high-quality student experiences for a diverse range of students at scale.

This is a challenge that many institutions are grappling with in real time – the student population is both diversifying and growing at speed, and the needs of students are growing at a similar pace.

A common trend in the submissions was to discuss the diverse nature of a provider’s student population. Some mentioned increasing proportions of students declaring a disability or mental health condition; others mentioned a higher proportion of commuting students or students with caring responsibilities. Providers with a larger black student population described the work they were doing on access and participation to address awarding gaps between students of colour and their white peers.

Few providers mentioned economic diversity within their student populations, although there was mention of widening access schemes that aimed, for instance, to help prepare first-in-family students in their transition to university. Similarly, some brilliant academic and professional mentoring schemes were explicitly created to support students of colour or those with a declared disability.

Measuring belonging

Providers often referred to initiatives to increase students’ “sense of belonging” – particularly those with diverse and widening participation demographics – as evidence that students are having a good experience. However, there did not seem to be consistent metrics or measures used across the sector to demonstrate what contributes to students’ strong sense of belonging or what evidenced measures or methods can be used to improve belonging.

These efforts should be celebrated – as the TEF panel often did. Yet it is striking that there was little to no mention of economic diversity, considering the significant financial challenges facing students – and their institutions – over the past eighteen months, which were becoming increasingly evident when many TEF submissions were written.

There were some fascinating descriptions of good practice across providers of all sizes, often detailing flexible learning options that go beyond the dichotomy of online or in-person, incorporating technology into teaching sessions to make learning both more accessible and more exciting and innovative. Additionally, a few providers described their institution’s commitment to smaller class sizes in order to provide students with one-to-one support – although this was not always recognised by the panel – and regular access to personal tutoring.

Upwards trajectories

Where subjects were marked below the benchmark for the student experience, the general trend among submissions was to describe how the provider is trying to improve, or to include some narrative around the local – and sometimes national – context. These narratives often articulated the impact of the pandemic and subsequent lockdowns on student course satisfaction, the impact of industrial action and disruption on teaching delivery, and/or the effect of changes in the graduate career environment in some fields on graduate outcomes.

This could potentially be verified by the panel by comparing providers with similar demographics, regions, and offers to see if the unavoidable or insurmountable impact was felt across the sector, or if some providers managed it better.

The awarding panel also expressed confusion where student outcomes and student experience metrics varied by programme or subject and a provider had not addressed the context for why this may be, or the steps it is taking to improve consistency.

What seems to mark out “outstanding” or “gold” providers to the TEF panel is consistency in the student experience across the board, regardless of a student’s individual programme – or, where there are inconsistencies in the metrics, the provider’s ability to explain what innovative interventions it is making to address them. For instance, responding to student feedback in real time to better understand what is going wrong on particular programmes, and implementing a student-led response, often in partnership with the students’ union or voice structures.

NSS-mas

Many providers use changing NSS scores – i.e. an improvement in student satisfaction ratings, particularly by demographic – to evidence impact on the student experience; others use internal surveys with similar questions to the NSS, module evaluations, students’ union surveys and speak weeks. Speak weeks are usually run by SUs and involve sabbatical officers or students’ union staff asking students on-the-spot questions about topics ranging from learning resources and study spaces to assessment and feedback.

They were commonly mentioned in student submissions, as well as referenced in provider submissions when discussing what they did with that feedback. That said, there does not appear to have been a preference from the panel on the type of evidence provided. What seems to have mattered more to the panel is that the provider gathered student voice and then how it responded to student feedback.

Panels looked favourably on providers who demonstrated dynamic responses to student feedback – and any innovative ways feedback was gathered. How quickly a provider could respond to that feedback and implement change where needed also mattered to the panel – i.e. within an academic year, before students complete the module. Where panels seem to have pulled providers up is where there appears to be a recurring issue with an area of the student experience, such as assessment and feedback quality, or assessment bunching.

Student submissions often comment that students have raised an issue consistently via NSS open-text comments or in course rep forums, for example. From here, provider submissions highlight future intervention plans to act on student feedback going forward – perhaps prompted by conversation and thinking around the TEF submission. For example, implementing an institution-wide review of assessment and feedback, usually in association with the students’ union.

One notable example the panel viewed as “very high quality” was an institution that delivers a “one block at a time” learning structure to a course, allowing tailored feedback, with assessment in each block, designed to inform learning and delivery in the next block of teaching. Another provider had responded to student feedback about space on campus by investing money, time, and resources into the library, collaboration workspaces, and social spaces.

Frequently, submissions would point to the existence of a student voice structure in principle – such as a staff-student liaison forum – rather than what this structure actually achieves. This often resulted in the institution being credited by the panel with a “very high quality” feature. Our hypothesis is that, to reach the dizzy heights of “outstanding,” a submission would have needed to articulate how the provider realises the value of student partnership to the wider student experience, not just state that the structure exists.

We found that the corresponding student submissions in some of these cases were critical and gave examples of where their providers have not proactively responded to feedback. This also links back to the idea that the speed and dynamism at which a provider can prove it has responded to student feedback is particularly important.

Another way providers evidenced the impact they had on the student experience was by detailing financial investments made. For example, some submissions described how they had spent X amount of money on refurbishing collaborative study spaces following student feedback. Some had invested in the SU building.

Investment in physical spaces was a clear theme: universities are redesigning spaces based on direct student input, focusing on creating environments conducive to both social interaction and individual study.

Others detailed how they had improved teaching spaces to be more accessible by installing lecture recording equipment. Usually, this investment in improving physical spaces was the result of student feedback that had criticised the availability of study spaces or detailed demand for lecture recordings following the availability of these during the Covid-19 lockdown periods.

External awards and league tables are often used as ways of measuring success. However, comparison between providers not using the same league tables is difficult due to the variation and uncertainty of metrics used in each. It’s not clear that touting these awards had much of an impact on the panel.

Curriculum (re)design

Panels clearly had preferences for curriculum (re)design with input from students and industry, such as modules created in consultation with students and industry partners. Some of these collaborations with local businesses and communities to shape curricula aim to address regional skill gaps and provide practical, real-world learning opportunities for students.

There was also a preference for “authentic assessment”: integrating real-world scenarios and project-based learning – often in collaboration with industry partners – to provide students with hands-on, practical experience. Others emphasised a diverse range of assessment options. Several student submissions, however, cited “assessment bloat”, complaining that there was no joined-up approach to assessment timetabling between modules.

There is also a bit of innovation around personal tutoring in the sector – trialling different structures, monitoring engagement and introducing more intervention points for staff to reach out to students.

This was in contrast to other submissions where students were expected to self-declare as needing support at existing support points – similar to disparities in student voice where some institutions had the structure in place, whereas others described both the structure and the impact its use is having in their submission.

The above article appears as part of a crowdsourced analysis of TEF submissions, panel statements, and student submissions in December 2023. The authors would like to thank the following colleagues who gave up their time voluntarily to take part in this exercise and supported the analysis of the student experience: Gulfareen Naz, Peach Smith, Sheila Amici-Dargan, Megan Knight, Amanda Bolt, Mandy Bentham, Sephora Imomoh, Katy Kennedy, Rebecca Lindner, Gayatri Patel, Nicola McCullogh, Duna Sabri, Lynne Wyness, Nicky Hindmarch, Adam Keogh, Harvey Anderson, Jack Fox, Dan Chard, Emily Pollinger, and Kim Edwards.
