“Just a few weeks left to complete the National Student Survey!” “Take the survey now!” “Tell us about your experience!” Around the country, final year students are being bombarded with messages about making their voices heard.
We’ve known for a long time that NSS only captures part of the student experience. It’s a moderately useful snapshot for checking on student experience in the (increasingly narrow) areas it asks about and provides a national public dataset that can do some helpful work in benchmarking and informing regulatory intervention. But since its inception it has been afforded totemic status far in excess of what it deserves.
Worse, it has created perverse incentives that can shape student feedback practice inside higher education institutions in ways that actively contribute to student disengagement. How so? Well, many institutions have adopted the NSS questions for internal surveys – a bit like a dress rehearsal before students are invited to complete the real thing in their final year. Worse still, while students are asked to complete the same generic list of questions at the end of every module, they report little or no awareness of how their views are acted on. Failing to close the feedback loop with students sends the message that their views aren’t taken seriously. It deprives them of the opportunity to offer a thoughtful reflection on their learning. And it takes the possibility of addressing problems out of the equation entirely.
It’s time to dethrone the practice of socialising students to respond positively to NSS, in favour of feedback mechanisms and processes that create a dialogue with students. When students are asked to share insight with their institution, they should expect to be asked questions about things that matter to them. And they have a reasonable expectation of being listened to and responded to in a timely way – something that can only happen if they have been asked questions that actually support a decision-maker in taking action.
Our work with institutions that are developing their student feedback practice suggests a number of effective approaches that offer a path towards a more authentic conversation with students. But none of these approaches will work in isolation or when deployed off the peg – revitalising student feedback requires acting with intention, and with a fair bit of courage as well. After all, gathering feedback and taking it on board often means hearing inconvenient truths! But when student disengagement and low wellbeing are such widespread concerns, no action plan or solution can be created without first listening carefully to students and being genuinely interested in their opinions.
Take a step back to look at student feedback in the round
Students express their opinions all the time, in lots of formal and less formal ways – but those opinions only become feedback when they are gathered, digested, and generate a response. Different feedback mechanisms can serve different purposes – sometimes decision-makers need to respond to an issue that has bubbled up among a specific cohort of students, and at other times students need to be asked to contribute their insight on a particular issue or agenda.
At the University of Derby, associate provost for learning and teaching Neil Fowler describes three dimensions that organise institutional thinking on student voice: a temporal axis, which considers whether the focus of the feedback is immediate and pressing or longer-term and strategic; a constituency axis, which considers whether the issue is meaningful to all students, a subset of students (such as a module group or those with a protected characteristic), or just a few students; and a “formality” axis, which places feedback on a spectrum from informal engagement with students, through more formal survey systems and encounters with student representatives, all the way up to the submission of complaints.
“If we can plot it out we can show students the best route to raise concerns or share ideas,” Neil explains. “Navigating higher education can be really difficult, especially if you are a first-in-family student or new to the UK sector. The lack of a ‘here is how to talk to us’ guide can make the whole system ineffective. We are teaching adult learners – we should expect them to be able to take responsibility for raising issues, but we can’t expect them to arrive with that skill set. So we do need to break down the model, look for ways to develop that practice, and show students that we have listened to them.”
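To make the idea of “plotting it out” concrete, here is a minimal sketch of how the three axes might be encoded in a simple triage tool. Everything in it – the category names, the routing rules, the code itself – is our own illustration of the concept, not Derby’s actual model or any real system.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical encodings of the three axes. The categories and the routing
# rules below are illustrative only, not the University of Derby's model.
class Urgency(Enum):          # temporal axis
    IMMEDIATE = "immediate and pressing"
    STRATEGIC = "longer-term and strategic"

class Constituency(Enum):     # constituency axis
    ALL = "all students"
    SUBSET = "a subset of students"   # e.g. a module group
    FEW = "just a few students"

class Formality(Enum):        # formality axis
    INFORMAL = "informal engagement"
    SURVEY_OR_REP = "surveys and student reps"
    COMPLAINT = "formal complaint"

@dataclass
class FeedbackItem:
    summary: str
    urgency: Urgency
    constituency: Constituency
    formality: Formality

def suggested_route(item: FeedbackItem) -> str:
    """Suggest a 'best route' for raising the issue (invented rules)."""
    if item.formality is Formality.COMPLAINT:
        return "formal complaints procedure"
    if item.urgency is Urgency.IMMEDIATE:
        return "module leader or course rep"
    if item.constituency is Constituency.ALL:
        return "institutional survey or students' union"
    return "programme-level survey or personal tutor"

# Example: a pressing issue affecting one module group, raised informally.
issue = FeedbackItem(
    "Week 3 lecture recordings are missing",
    Urgency.IMMEDIATE, Constituency.SUBSET, Formality.INFORMAL,
)
print(suggested_route(issue))  # -> module leader or course rep
```

The value of plotting feedback this way is less the routing logic itself than the shared map it gives staff and students of where each kind of concern should go.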
Breaking down the various dimensions of student feedback in this way creates space for university staff to be thoughtful about the ethos and purpose of the university’s student feedback systems, and challenge the usefulness of the idea of “satisfaction” as a heuristic for quality. “It’s not about satisfaction in the soft, comfortable, fluffy way,” says Neil. “It’s about moving away from the notion of customer satisfaction towards have we challenged you, have we met your expectations in that regard? Did you know what you were going to get? Did what you got match that? Did you know it was going to be hard? Was it hard? Good, then that’s what we set out to do.”
Help students to give you constructive feedback
Surface-level questioning generates surface-level responses, and it is very difficult to take useful action on the answers to generic questions. Every question that students are asked should be meaningful, in the sense that it offers students an opportunity to take a view on something that matters to them – and that the person responsible for that thing is empowered to change or adapt it in light of students’ feedback.
At Queen’s University Belfast, pro vice-chancellor for education and students Judy Williams emphasises the importance of working closely with students throughout the feedback cycle – starting with spending informal time with students to better understand their lives and the various blocks and barriers to engagement with learning, and continuing through to working with student representatives and the students’ union to develop survey topics and identify actions in response to findings.
“In the past we haven’t necessarily asked students the questions they most wanted to answer,” says Judy. “Rather than simply pushing our own agenda, working with students allows us to gather a wealth of information and knowledge. Our students’ union is as passionate as we are about wanting to drive forward and create transformative experiences for students. Students are amazing and they will find creative solutions that you hadn’t actually thought of.”
At module level, Judy argues that module leaders may need permission to ask students targeted questions that enable them to adapt their teaching approach to the needs and preferences of the current cohort of students. “Asking for feedback demonstrates emotional intelligence in the classroom,” she says. “It’s really important to have a flexible approach so that small adaptations can be made. You probably can’t do something like totally changing the assessment, but sometimes responding to suggestions can make a difference to learners. Each group of learners will have different preferences and ideas about what will work for them, and we should be flexible enough to respond to that.”
Try to respond to all students, not just accept the views of the majority
In democratic systems, a representative majority is a powerful force for decision-making. But most of the time when students are asked for feedback, it’s not with the aim of arriving at a single point of truth or making a collective decision, but of trying to capture the range of opinions and experiences. If the goal is simply to check whether the majority of students are satisfied with some aspect of their module or programme then reporting a majority view is a reasonable approach. But in the spirit of creating a dialogue with students it might be even more important to hear from those who are less happy, so that their issues can be acknowledged and addressed.
Taking a majority view could mask the strength of feeling among the minority – for example, on a classic five-point Likert scale, the split between those selecting neutral, mostly (dis)agree, or strongly (dis)agree offers valuable insight into how deeply the sentiment is felt. It could also obscure the particular experiences of a subset of students who happen to be in the minority. Other feedback contexts simply aren’t about finding a majority view at all. Pre-arrival questionnaires, for example, are much less about deciding what most students think than about surfacing the variation in responses that can inform a more nuanced approach to the delivery of services.
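A toy example makes the point. In the sketch below – with numbers invented purely for illustration – two cohorts produce the same headline “agree” figure, but the underlying distributions tell very different stories.

```python
from collections import Counter

# Invented responses on a five-point scale
# (1 = strongly disagree ... 5 = strongly agree).
cohort_a = [4] * 12 + [3] * 6 + [2] * 2   # mild agreement, mild dissent
cohort_b = [5] * 12 + [1] * 8             # polarised: strong views both ways

for name, responses in (("A", cohort_a), ("B", cohort_b)):
    agree = sum(r >= 4 for r in responses) / len(responses)
    spread = dict(sorted(Counter(responses).items()))
    print(f"Cohort {name}: {agree:.0%} agree, distribution {spread}")

# Both cohorts report "60% agree", but cohort B conceals eight students who
# strongly disagree -- exactly the strength of feeling a headline figure masks.
```

Reporting the full distribution alongside the headline number keeps that minority visible to decision-makers.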
This is where survey data can usefully interact with other forms of feedback to allow a more rounded picture to emerge. At Middlesex University, the student engagement and advocacy team is developing a range of approaches to hearing from a highly diverse student body – and is proud to be a top ten university for students reporting via NSS that their feedback has been acted on. Module surveys are scheduled so that results can feed into meetings between student representatives and programme teams. For every major survey, the team follows up with communication back to students: sharing the findings, celebrating the positives, appreciating students’ efforts, and setting out any changes planned in response to the feedback.
“We are trying to find the right balance for Middlesex on creating a more dialogic approach,” says student engagement and enhancement manager Ravinder Bassi. “We have focused a lot on module surveys and pulse surveys, and we are looking at developing programme surveys that can inform a more personalised approach to student support. But there are moments when we need to hear from particular students or we need insight in a different way – it’s about balancing and triangulating feedback from surveys with different kinds of student insight. We are constantly weighing up what is really needed, how things fit together, what different mechanisms might engage students differently, and what might create overwhelm.”
One example is a calling campaign in which the team contacted students who had been identified via the learner engagement analytics system as less engaged and at risk of non-progression. While the point of the campaign was to support those students to re-engage in their studies, the feedback the calls generated offered significant insight into why those students had disengaged in the first place.
“These less engaged students might not reply to a survey or participate in other student voice mechanisms,” explains David Gilani, head of student engagement and advocacy. “This makes hearing from them incredibly important, but we have to be careful about how we position their feedback. Their particular experiences may mean that their feedback is less positive than we’re used to seeing through other feedback mechanisms, and so we need to prepare staff for that.”
Get the hygiene factors right
It’s not uncommon, in our work with higher education institutions, to discover that there is no clear ownership of student surveys, or no join-up with wider student feedback mechanisms, or that the team that owns the system isn’t empowered to make the most effective use of the features and automations built into our platform.
There’s really no shame in that – universities are complicated organisations with policies and processes that are constantly evolving. But if you want to engage students by asking for their feedback, getting the plumbing sorted behind the scenes – so that survey implementation is streamlined and automated as much as possible – frees up time for the much more complex work of designing questions and interpreting results. Clarity over lines of responsibility and processes for initiating new surveys also ensures that the system as a whole remains authentic and engaging: avoiding over-use of surveys, directing data to the right decision-makers, and keeping purpose, survey design, and implementation closely aligned.
At London Metropolitan University, we worked with the academic quality and development team to support their efforts to build a survey ecosystem comprising institution-wide surveys, module surveys, and ad hoc surveys. Making better use of features such as linking module leaders to their modules on the system, making surveys accessible to students through the VLE and the institution’s study portal, automating reminder emails and data reporting, and piloting qualitative analysis tools made the process much more efficient – and, unsurprisingly, response rates improved and staff engagement with the surveys increased.
But much of the added value of this project came through the academic quality and development team working much more closely with colleagues across the institution to improve survey practice – reducing the volume of questions, advising staff on good practice in survey design and promotion, and overhauling student communications. Good hygiene in system management enabled good survey practice to evolve.
Encouraging students to complete the NSS is an important part of the student feedback cycle. But if students have been engaged in a dialogue with their institution, module and programme leaders and professional services from the very start of their journey, in authentic and supportive ways, and can see the impact of their feedback, the chances are they will tick those “agree” buttons in NSS as well.
For those who have struggled at times, or had a bad experience, repeatedly being asked the same set of broad-based questions with no direct relevance to them, or being endlessly told “you said, we did” when their personal experience is the opposite, isn’t going to increase their sense that they matter to their university. And for university staff attempting to parse the many possible meanings of students’ responses and build action plans for change, NSS-style questioning is only going to leave them feeling powerless and frustrated, and create barriers to positive relationships with students.
“It is really difficult and nobody does it perfectly,” acknowledges Neil Fowler. “In regulated environments the risk is that we become much more concerned with demonstrating compliance with the process than with engaging with the richness of the insight it generates. But students will only talk to us if they think we are really listening.”
“Everyone wants the same thing – great student experience, great staff experience, and a good work-life balance,” says Judy Williams. “If we create a culture of positive dialogue, we can make a difference by pulling together towards the same destination rather than pulling in different directions.”
Student feedback is crucial for improving the student experience and fostering success. However, current methods often prioritise metrics like NSS scores over genuine student engagement. This can incentivise superficial responses and discourage universities from addressing underlying issues.
The way forward lies in creating a culture of open dialogue. This requires:
- Meaningful questions: Ask students about things that matter to them, and make sure the people responsible are empowered to act on the answers.
- Diverse mechanisms: Utilise various feedback channels, from surveys to informal conversations, to capture a wider range of perspectives.
- Actionable insights: Actively respond to feedback by implementing changes and communicating them to students.
- Student ownership: Help students understand how to navigate feedback systems and empower them to voice their concerns.
By prioritising genuine student engagement over chasing metrics, universities can create a more positive and productive learning environment for everyone.
This article is published in association with evasys. Join Wonkhe and evasys for a free online event, The changing face of student feedback, on Tuesday 23 April, 2.00-3.00pm.