It was in the first year of my politics degree that I first heard about the idea of participation in democracy as a sort of “civic duty”.
My memory now associates this principle with John Stuart Mill writing in the 1850s, but perhaps it’s even older than that.
Regardless, it was an idea that instantly clicked within me – and I entirely agreed with it. I vote and respond to surveys when they are sent to me because it is my civic duty and privilege to do so.
I assumed everyone thought the same – until I found myself discussing Amazon and Love2Shop vouchers at an SU Voice Team meeting, as if responding to a housing survey deserved a reward.
I was shocked and horrified – but that initial shock has worn off as more time has gone by in the role. I don’t know if I’m losing the idealist in me, or buying the line that anything we do to put money in students’ pockets is good.
It seems to be okay that we nudge them to participate in elections, fill out surveys, and essentially have an opinion on things.
But I wanted to see how deeply rooted this new concept was – one that I’m sure is making Mill, Campbell, and Gurin turn in their graves. I researched a few standard things that all SUs and universities do, to figure out whether this is a sector-wide trend – or whether just some SUs have fallen down the incentivising-participation hole.
I found it particularly interesting that some universities are very openly institutionalising this approach to driving engagement. Imperial has a whole section on their website about improving participation rates, where incentives are welcome and categorised as a vehicle to make participation better.
And if you look at the big engagement moments in both universities and SUs, the image is even clearer.
NSS Survey
The Office for Students (OfS) encourages institutions to promote the survey among eligible students and does little to clarify what that means. But a quick Google search showed how seriously many institutions take this encouragement.
Warwick have a full page on NSS incentives and what students could get for filling it out. It outlines that departments can spend £5 per student to achieve engagement. That sounds like a massively expensive way of getting students to be active citizens.
Manchester’s approach was a little different – publishing a blog encouraging staff to promote it. Their webpage on the NSS also includes a subheading on NSS incentives, which listed 17 different prizes, among which an iPad Air and “graduation day packages” stood out.
The University of the Highlands and Islands (UHI) doesn’t avoid the trend either, offering £200 Amazon vouchers as a ‘thank you to students participating in the survey’. And let me clarify: I am in no way singling these institutions out for giving incentives. Here at Bath, we have done the same thing.
I chose the Russell Group as a sample pool (purely for the purpose of giving myself some guidance as to which universities to look at) and went to each of its universities’ websites to look at any incentives they give students to participate in the survey. 18 out of 24 have clear incentives as part of their NSS website – three-quarters.
The incentives vary from a £5 voucher given just for participating, to a pound given to charity per student that fills it out, to incentivising the departments instead, by giving them money based on how many of their students fill it in.
I might have missed incentives in the other six, or perhaps they were only offered to students in more direct communications that I don’t have access to. Either way, there is a real question about why such a high proportion of our HEIs have normalised rewarding students for filling out a survey.
Bristol stands out because their Code of Practice for Surveying University Students includes a commitment to avoiding completion incentives. But after asking people who work at Bristol, I found out they don’t escape unscathed.
They threw pizza parties to celebrate the NSS, at which people were thanked and fed for participating. But perhaps if you get the pizza before you answer the survey, it isn’t a completion incentive.
SU Officer Elections
This one was a little harder to put together.
As with NSS promotion emails, some of the incentives that SUs offer for their elections are unlikely to be recorded on their websites.
For example, in Bath I know we encouraged students to “Doughnut forget to vote” while, in case it isn’t obvious, giving them a doughnut in exchange for a vote. But this sort of incentive isn’t listed on our website, and I suspect the same is true in several SUs across the country.
But some incentives were, like the NSS ones, institutionalised. Westminster introduced a vote meter, with incentives increasing with the number of votes cast. The same is true of Salford SU, with daily giveaways to bring more excitement to the election process. At UEA, the SU once promised free food and free entry to clubs, convinced the VC to “dab”, and pledged to construct a 10ft statue of “Cloud Dog”, the campus mascot, in the SU if 2,000 votes were cast.
Lincoln SU offered over 2,000 prizes for casting a vote in 2022, achieving a 32 per cent voter turnout with this tactic – a number that will sound impressive to anyone who has stood in, voted in, or organised sabbatical officer elections.
There’s also an interesting phenomenon where candidates themselves incentivise students to vote. In places like Nottingham, candidates get a capped campaign budget which they can spend on their campaigns. The same is true in Bath, and I can certainly tell you people spend most of their budgets on sweets and other things to offer students in exchange for their time and a vote.
I have committed the same sin. I even got Red Bull to give me free cans to hand out. People wouldn’t stop walking if you asked for a vote, but they would if you said “do you want a free Red Bull in exchange for a vote? It’ll take 30 seconds!” They would then ask you how to vote and who you were, and you would see them choose the candidates for the other posts at random, with R.O.N getting quite a few clicks.
That was probably the last straw for me. It certainly made me wonder whether we should be as proud as we’d been to publicly (and loudly) state that we reached 22.5 per cent voter turnout. How many of those votes were cast at random, just at the promise of a Red Bull, a doughnut, or a sweet?
SU specific surveys
The same question remains for the other surveys that SUs run throughout the year. Whether it’s to hear about the state of private sector housing, wellbeing, or engagement in university and SU activities, it is relatively common to have some sort of data and insight team that helps shape campaigns and informs the direction unions go in each year.
I was tempted to put out a survey for Bath students asking, “does the promise of a reward or some incentive improve the chances of you filling out a survey or participating in elections?”, but clearly, my decision whether or not to offer incentives for that survey would have influenced its results.
Instead, I went through a few of the recent surveys we’ve done. In Bath, we don’t offer all students something for participating; we do it in the form of prize draws. This is mostly a budget decision rather than an ethical one. For context, as of December 2023 we had 20,470 students. Unless otherwise specified, that is the pool of possible respondents for each survey below.
- Student Life Pulse Survey (run quarterly with a random sample of students each time): for each pulse we offered a £75 voucher of choice as an incentive. We received 229 responses to the December 2023 pulse, 238 to the February one, and 180 to the April 2024 one.
- Student Housing Survey: run jointly with Bath Spa’s SU in February this year, we offered a £100 voucher of choice and 602 University of Bath students responded.
- Advice feedback form: this one was only sent to students who’d used our advice team’s services. With no incentive to participate, 17 students out of a possible 545 (approximately) have filled it in to date (it is still open, but considering the academic year is over, it seems unlikely that we’ll get many more responses).
- BUCS value survey: run in March 2024 to ask students about the value for money of their BUCS experience. We offered 20 x £10 Plug and Tub (our SU bar) vouchers and received 609 responses out of a possible 1,500 (approximately).
- Together We Shape Tomorrow survey: an SU mass engagement survey informing our five-year manifesto. We ran it in May 2024 and got 885 responses. Incentives were daily prizes ranging from £20 vouchers to summer ball tickets, a Stanley cup, and SU-branded two-pint cups, plus overall prizes of £150, £75, £50, and £25 in cash or vouchers.
- Be Well Survey: run jointly with the university in October–November 2023; 1,995 responses. Incentives were £500, 2 x £100, and 2 x £50 in cash or vouchers.
I understand the limitations of this sample of surveys: they are all from recent years and from only one institution. It doesn’t show an overall picture of the sector, much less across-the-board trends. But I still think there are some interesting things to say about these specific surveys.
Surveys that had more incentives, were advertised more broadly, and were open for longer received higher levels of engagement – the Be Well Survey, for instance, received a 9.75 per cent response rate from our entire student body.
Surveys with no incentives, like the Advice feedback form, received a response rate of just 3.12 per cent despite being open for a longer period (and it is still open, as a matter of fact). The Together We Shape Tomorrow survey got a 4.32 per cent response rate.
This is only marginally higher than the advice form, and whether that has to do with incentives, the time of year we held it, or the amount of comms students received during that period is hard to tell.
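For anyone wanting to check my working, the response rates quoted here follow directly from the figures in the survey list above (a quick sketch; the respondent pools are the ones stated earlier, and the two-decimal rounding is mine):

```python
# Response-rate arithmetic for the Bath surveys discussed above.
# Pools: 20,470 students overall; 545 eligible for the advice form;
# roughly 1,500 for the BUCS survey.
surveys = {
    "Be Well": (1995, 20470),
    "Advice feedback form": (17, 545),
    "Together We Shape Tomorrow": (885, 20470),
    "BUCS value": (609, 1500),
}

for name, (responses, pool) in surveys.items():
    rate = 100 * responses / pool
    print(f"{name}: {rate:.2f} per cent")
# Be Well: 9.75 per cent
# Advice feedback form: 3.12 per cent
# Together We Shape Tomorrow: 4.32 per cent
# BUCS value: 40.60 per cent
```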
What these things tell us
Students come to university to learn and earn a degree. But I think everyone in education will agree with me when I say that the overall experience is infinitely more enriching than the textbook knowledge you gain.
The fact that my group of close friends and I all did some sort of social science probably limits my ability to look at the issue objectively.
We’re all suckers for active citizenship. But what about those who weren’t won over to it during their time at uni, and spent more time in labs than in round-table discussions?
My sense is that, as SUs, we can either hope that all the good work we do encourages people enough to think they should have a say in who runs things and how (which is what we’ve been doing thus far, and failing), or we can approach active citizenship differently.
I think the answer to the ‘moral’ incentive question for those students is a matter of framing. We need to give them a sense of having skin in the game. The best turnout we’ve had in elections, surveys, and other engagement activities has come from the things students actually care about.
See, for example, the response rate to our BUCS survey. The incentives in themselves weren’t any different from what we offered in other cases. Yet the survey received a 40.6 per cent response rate – students care about how much they pay to be a student athlete.
Students will participate when they have something to say. Maybe the real definition of active citizenship is about asking people to participate in things that matter to them, and not things we’re telling them to care about.
In my experience, active citizenship is one of the things that you should learn while at university. My admittedly limited research on the matter shows that we’re either desensitising students by generating expectations of incentives in exchange for participation, or these expectations are forcing us to play into this vicious cycle.
Which way the causal relationship goes, I cannot say. But either way, my intention was to start a conversation about the intentionality behind our incentives. Do we want to continue down this path, or do we want to stand our ground against incentivising participation? Would the officer elections be empty if we didn’t offer doughnuts? Would the NSS response rate drop?