It’s been an amazing thirty years for anyone with an interest in learning and teaching in higher education.
Since 1988 nearly a billion pounds has been directly allocated to support the enhancement of teaching quality – enabling academics and support staff to improve teaching.
But in 2018 – in England, at least – the well has run dry. This is the first year since 1988 in which there has been no direct funding for teaching quality enhancement. Part of this is due to changes in the way higher education is funded; part is due to changes in the way academics are employed. And there have been wild swings in the fashions surrounding models of change management – six different prime ministers heading six different governments, each with their own perspective on the quality of teaching in higher education.
A certain level of decorum has been maintained in Scotland. The Enhancement Themes programme, now in its 16th year, has provided stability and reliability, though it has never been a major stream of funding. In Wales and Northern Ireland, enhancement has tended to follow English trends – again, at a lower level of funding than England, and sometimes with the more obviously harmful edges smoothed off.
Thirty years has seen a government-sponsored industry – or if you prefer, a subject discipline complete with a sizeable literature – grow, and now virtually disappear.
Why has this happened? And what can we learn from the whole experience?
Why public funding for teaching enhancement?
Although there has always been a certain level of government interest in what actually happens in university teaching, the first serious look under the bonnet – as it were – happened in the early 1960s. No – not Robbins. I’m talking about the 1964 Report of the Committee on University Teaching Methods – otherwise known as the Hale Report. Sir Edward Hale, incidentally, was a senior civil servant, and an honorary graduate of the University of Leeds.
Hale was a response to the lack of insight into university teaching in Robbins – part of a current of opinion that eventually led to the birth of SEDA:
We fear that the quality of university teaching has not improved over recent years and, in fact, are inclined to believe that it is declining with the expansion of the universities.
The Hale Report includes what can justifiably be described as the first ever National Student Survey. There’s a lot of fun to be had with it (though it’s sample- rather than population-based, so not directly comparable – maybe more so with the HEPI-HEA survey). There are a few eyebrow-raising comparisons between then and now – like the findings on undergraduate contact hours.
But I digress.
An optimistic sixty-three per cent of academic staff surveyed for the Hale Report claimed to have checked the efficacy of their teaching practice. In response, the report comments on the widespread traditionalism of university teaching, and makes the following recommendations (inspired by the work of the Ford Foundation in the US):
- A “policy of well-directed experiment in university teaching”, organised on an “inter-university basis”
- A body to “promote and steer a concerted programme of experiment”, with the ability to “make or recommend grants for experiments in the methods and organisation of university teaching”
- Publication of the results of experiments “in such a way as to reach all those for whose teaching the results might be significant”
Unfortunately, none of these things happened (we got the tiny “Academic Co-Ordinator” funding instead – eventually axed by Thatcher in the 1980s) and Hale languishes, mostly forgotten. The literature, though, remained rich and occasionally – Donald Bligh, Sally Brown, Lewis Elton, Liz Beaty, Phil Race, or Graham Gibbs, for example – popular. But funding for such activity has always been next-to-nonexistent.
Up to date with Dearing
The Hale Report committee could not have foreseen the focus on research at universities that came to define the 1980s. The first review of research quality, in 1986, and the subsequent linking of this measure to a (shrinking) funding stream, meant that there was a financial incentive for research to be given priority. The business-minded leanings of post-Jarratt Report managers, and the perennial keenness of many academics to spend more time on research, meant that this tendency grew and metastasised.
It was the Dearing Report that saw the first attempt to bring about the chimera we have come to deride as “parity of esteem”. The terms of reference said that “the effectiveness of teaching and learning should be enhanced”, and the report noted that “One current barrier is that staff perceive national and institutional policies as actively encouraging and recognising excellence in research, but not in teaching.”
So what was needed, clearly, were national and institutional policies encouraging and recognising excellence in teaching. And Dearing proposed:
- The development of institutional learning and teaching strategies
- An Institute for Learning and Teaching in Higher Education
The latter would support:
- the accreditation of teacher education programmes
- research and development in learning and teaching
- stimulation of innovation in learning and teaching
There was also some expectation that the new institute would support computer-aided learning. But what is notable for our purposes is the clear position on national funding.
If the Institute is to do a useful and credible job it will need to be adequately funded. The Funding Bodies already provide some funding for developments in learning and teaching (for example, the Higher Education Funding Council for England provided £8 million over two years for the Fund for the Development of Teaching and Learning, and all the Funding Bodies have contributed some £32 million to the Teaching and Learning Technology Programme over three years).
We believe that, in the future, this funding would be better spent by an institutionally-owned body with a coordinated and focused mission towards learning and teaching development. Institutions may wish to second staff to the Institute to contribute towards its costs. As a result of coordination and coalescence (as appropriate), the organisation should cost less than the current complex arrangements.
There should be three elements to the funding of the organisation:
- core funding to support the organisation through institutional subscription;
- payment by institutions for specific services;
- public funds from the Funding Bodies or government departments, for example to enable the Institute to launch focused initiatives in learning and teaching development.
From recommendation to action
So even before Dearing, HEFCE and its predecessor bodies were investing in teaching quality enhancement. Initial support came in the form of funded investigation and development of computer-aided teaching – programmes like the Computers in Teaching Initiative (CTI) and the Teaching and Learning Technology Programme (TLTP) were noted by Dearing (as a possible source of ILTHE income). The Fund for the Development of Teaching and Learning (FDTL) had arisen as a way to address the issues arising from subject review – or, for the more cynical, as sugar to sweeten the QA pill as the Quality Assurance Agency (QAA) took over quality assurance from the Higher Education Quality Council (HEQC).
In the great missed opportunity that was the Institute for Learning and Teaching in Higher Education (ILTHE), we never quite got the “coordinated and focused” body that would effectively have been a research council for learning and teaching. All of the above-mentioned schemes continued – HEFCE added funding in support of learning and teaching strategies to the mix, and kicked off what has been the longest-running of our enhancement funds, the National Teaching Fellowship Scheme (NTFS).
A year or so later saw the launch of another part of the Teaching Quality Enhancement Fund (TQEF, as this portfolio, designed by Graham Gibbs, became known): the Learning and Teaching Support Network (LTSN). Twenty-four subject centres drawing on the old CTI network, along with a generic centre co-located with the ILTHE, would “promote and disseminate” good teaching practice. The Teaching and Learning Research Programme (TLRP) was launched at a similar time, but there was no clear link to the work of any of the other initiatives.
Paradoxically, the ILTHE – other than some start-up costs and one emergency cash injection – was left to stand on its own, under the ownership of the sector representative bodies. The 2003 formation of the Higher Education Academy (from the ILTHE, the LTSN and the TQEF NCT) added significant (and rising) state funding – which turned out to be both a strength and a weakness.
As Roger Brown put it in 2008:
My view is that we need a body that takes a neutral view, something such as the King’s Fund in health or even Ofqual (the new Office of the Qualifications and Examinations Regulator), which monitors school standards. If the academy cannot do this, then we may have to think afresh.
£1bn later…
The 2003 White Paper did nothing to stem the tide of initiatives – with the advent of the behemothic Centres for Excellence in Teaching and Learning (CETLs) viewed with suspicion by those who were already starting to build careers around the smaller projects offered by HEFCE (and, increasingly, Jisc). The CETLs were, at heart, an honest attempt to build the Research Assessment Exercise for teaching that Dearing hinted at. Seventy-four Centres, sharing a colossal £315m, were established from 2005. The sheer scale of the awards, coupled with a bidding process that blended equal parts future plans and track record, seemed custom-designed to capture the attention of senior managers.
Alongside this, the Higher Education Academy gathered together the guts of the ILTHE and a range of other initiatives. Now a component of Advance HE, it faced adversity from before its birth – the ILTHE only narrowly voted to join the new organisation, with many members feeling they had no other choice. In taking on the LTSN and the TQEF National Coordination Team (NCT), and responsibility for the NTFS, it gained access to the funds that kept the organisation going in the face of less-than-stellar subscription levels.
But, little remembered now, it was the funds to Support Professional Standards (SPS) that were most welcomed by professionals. These HEFCE allocations, following on from a little-noticed paragraph in the 2003 White Paper, doubled the ring-fenced institutional TQEF funds, and were linked to a required strategy that explicitly drew on the skills of institutional educational developers.
Twenty-twelve
The recommendations of the Better Regulation Review Group (BRRG) back in 2003 made it clear that the smaller “jam jars” of funding (to use Alan Johnson’s phrase) were on borrowed time. Initially this affected smaller projects only, but by the turn of the decade the whole idea of ring-fenced allocations was anathema to the senior strata of the sector.
Accordingly, in 2009, funding for learning and teaching strategies, staff development, and other priorities was rolled into a single non-hypothecated allocation: “teaching enhancement and student success” (TESS). Though nominally this continued the funding that supported educational development in institutions, in practice this was a time of learning and teaching centre closures, educational developer job losses, and general deprioritisation of the agenda.
Coupled with the post-Browne Review move to teaching funding largely following the student – via higher fees and less public money – hypothecation was no longer a viable model either. TESS disappeared, and with the gradual defunding of the (perhaps unfairly?) unpopular Higher Education Academy – and a change of focus that meant the end of Jisc-supported projects – a few Catalyst projects and the NTFS were all the sector had to support teaching development.
By 2018, even these had been laid to rest, and thirty years of funded activity came to a close.
What was the use?
There are three key evaluative sources that look back on the efficacy, or otherwise, of all this activity and funding. First, Trowler, Ashwin and Saunders examined the role of HEFCE in enhancement for the HEA in 2014. Second, David Gosling published “Quality enhancement in England” in 2013. Third, Roger Brown’s 2004 book “Quality Assurance in Higher Education”, alongside his later paper “What price quality enhancement?”, covers similar themes. There are elements of “practitioner research” here – Roger Brown led HEQC, and David Gosling co-managed the TQEF NCT.
Each of these attempts to unpick a theory (or theories) of change – often castigating HEFCE or ministers for an absence of theory and of coherence. However, it is worth considering whether “whole sector enhancement” was ever a realistic endpoint for such investment.
Trowler, Ashwin and Saunders identified a range of implied theories of change from previous HEFCE initiatives:
| Instrument | Mechanism |
| --- | --- |
| Pilot or beacon projects (eg CETLs) | Mini projects, web resources |
| Bid-and-deliver (eg NTFS) | Developing curricular materials and resources |
| Allocated formula funding (eg TESS) | Teaching projects, piloting new approaches |
| Conditional funding (eg enhancement themes) | Embedding a teaching theme across the curriculum |
| Professionalisation of teaching (eg UKPSF) | Qualification frameworks for staff |
| Consumer empowerment (eg KIS, NSS) | Instruments to measure 'satisfaction' |
While a few of these classifications are arguable (is NTFS *really* bid-and-deliver? and, as TESS was not ring-fenced, could it really be seen as enhancement funding?), this is a good overview of both the range of activity and the differing goals of enhancement funding. Adding in consumer empowerment as a model of change in itself, though, is perplexing in this context. NSS (and now TEF) do nothing in themselves to enhance teaching, merely serving as tools for measuring the effectiveness of other interventions, or as catalysts to bring interventions about.
The idea that such data sources could offer reliable and comprehensible student-facing information would be laughable, were it not apparently the belief of those designing the future of the sector. I feel Roger Brown is apposite here:
– There are not, and there never can be, valid and reliable indicators of educational quality.
– There are simply too many variables, too many of them ‘unknowns’.
– Even if there were such indicators, and they could be made sufficiently accessible to the ‘two clicks’ generation, there is little evidence that more than a small subset of students would ever refer to them. Moreover, they would probably be those who, because of inherited or acquired social and cultural capital, had least need of such information.
The very thing such data should encourage – experimentation with teaching enhancement – is immediately lost to a Campbell effect: the more an indicator is used for decision-making, the more it distorts the processes it is intended to monitor. Although the initial threat to link TEF performance with fee levels foundered in Parliament, enough senior managers draw personal validation from league tables to reasonably postulate an inverse effect – the visibility of these data sources leads to a decline in “risky” experimentation, and will eventually lead to learning and teaching practice that is, at best, “good enough” and, at worst, “non-threatening”.
The Michael Barber bit
The idea of a “beacon” model of change in education has an unlikely forefather – Michael Barber, now Chair of the Office for Students. He was at the DfES when the model was promulgated through the Beacon Schools initiative, and had a hand in both its design and promotion. Though throughout his published work he has been happy to draw on the “literacy hour” experience, he has yet to write about his part in this particular scheme – which saw schools designated as excellent and rewarded with funding and promotion, with the expectation that other teachers and schools would learn from the best.
As Gosling put it:
The ideological foundation of the CETLs was intended to be ‘reward’.
By the time the CETLs came around, this model was becoming discredited in the school system – while it gave the appearance of enhancement (networks, discussion, and so on), there was no way of plotting what, if anything, had been achieved. Sure enough, the interim and summative evaluations of the CETLs (the latter, incidentally, briefly managed at HEFCE by one Mark Leach – whatever happened to him?) reached a similar conclusion.
To quote Trowler, Ashwin and Saunders:
The theory of change embedded in this ‘dissemination from a beacon’ conception is relatively weak in that how and under what conditions an exemplar would create changes within the wider system is often not made explicit or remains opaque.
Barber’s thinking on change mechanisms has shifted a little in the intervening years. His “How to run a government…” presents five paradigms of systemic reform:
- Trust and altruism
- Hierarchy and targets
- Choice and competition
- Devolution and transparency
- Privatisation
Spoiler alert – he doesn’t think the first one works. We can see evidence of the last three in the development of the Office for Students – notably market-based models that are predicated on the availability of data to inform consumers.
Though Barber’s description of the trust and altruism model – “give us money and get out of the way” – is clearly loaded, and his choice of examples deliberately shocking, there is a lot to be said for this model of change. A combination of an emphasis on staff professionalism and a light-touch monitoring and support mechanism describes a number of the more successful HEFCE initiatives – here measuring success as sustained impact on the practice of individual academics.
Examples include the FDTL and the smaller Jisc-funded projects. Evaluations of these, both singly and en masse, tend to focus on the lack of wider sector impact. Which surely misses the point – the impact on the careers of those touched by these projects is often huge.
Where these worked best they were supported and encouraged by institutional structures – often those supported by TQEF/SPS funding. This ring-fenced allocation required investment in learning and teaching – monitoring was again light touch (more so with the advent of the “single conversation”) but enough to ensure that what was promised was built.
What could work here as a viable model of change is something akin to von Hippel’s “lead user theory”: empower and support professionals to identify the best solutions to the problems they face, then extrapolate those solutions elsewhere. I like this because of the impact on individual educational developers and individual academic staff.
You are here
All this sounds lovely, but it is not where we are now in terms of enhancement. To start with, there is scarce funding for pedagogic research, and we no longer fund teaching quality enhancement in England at a national level. Some institutions do notably well (often down to the empowerment of key enthusiasts) in encouraging academics to take teaching – and by this I mean actual teaching, not the TEF – seriously.
But this is nothing short of a national scandal, and a lot more noise should be made about it. With TEF and NSS we have plenty of pressure to improve teaching, so how about some support for those who are trying to do so? I’m not talking about another round of CETLs (although there was initially supposed to be one, scheduled for 2010…) but about low-level funding for educational development and institutional support.
The loss of the link between TEF and fee levels means that we need to wait and see what use is made of the exercise’s results going forward. TEF entry is – after all – a condition of OfS registration. And the TEF results – thanks to the downgrading of NSS data and the addition of new considerations – will not be comparable between Year 2 and Year 3. Perhaps TEF will focus employer attention on the need for improvements where the problem actually lies: in the graduate job market. We can but hope.
Attention in enhancement circles is focused on measurement – analytics, learning gain, the student experience. While any field of inquiry with pretensions of rigour should be keen to derive methodologies that capture reliable data, we do seem to be capturing data at the expense of measuring impact… and it is unclear exactly what impact we are measuring.
The policy environment seems actively designed to discourage experimentation. Can we legitimately use the same data both as an indicator and as a measurement? Are there forms of teaching that benefit students but that students do not enjoy? As things stand, we are unable to say.
Another time?
There are two big changes to the sector – certainly since Hale, and even since Dearing – that suggest that it may be time to come back to the idea of small funded projects in teaching quality enhancement.
The first is the way that – in England, Wales, and Northern Ireland – funding now follows the student. Ring-fencing funding would require active investment from regulators who are under pressure to support vulnerable subjects and other, more fashionable strategic goals. The sheer range of higher education institutions, particularly the rise of the smaller private college, means that in many cases it would not be possible to support work at any meaningful scale.
The second is the rise of precarious employment. Enhancement has historically relied on enthusiasts, able to use “spare” time to participate in the latest local and national developments. With the growth of academic staff employed purely to teach – without the funded capacity even to carry out their own research – this capacity has largely disappeared. Even full-time staff now find that they have less spare time, and fewer incentives to use what little they do have, to work on teaching enhancement projects.
Small funded projects actively purchase the time of staff in such situations, making it possible to devote energy to improving teaching. The downside is the opportunity cost of participating in the bidding process, but work within Jisc and elsewhere has seen the establishment of lighter-touch processes – for example, not insisting on a fully-costed business plan at an early stage. We should hope that institutions, wary of the use of TEF and NSS in league tables, and increasingly collecting data via analytics approaches, can also find space to focus on actual enhancement. Smaller funded enhancement projects can help with this.
Putting students at the heart of the system is a laudable aim, but without giving the sector the support it needs to deliver excellent teaching we use data (like TEF) as a punishment without offering a path to improvement. Enhancement needs time, needs permission, and needs support. All of these things need funding – and of all the ways the OfS seeks to actively intervene in the sector, teaching quality enhancement does not currently appear to be a priority.
Comments

David
(a) this is a brilliant summary
(b) it is a sad history of too many mostly unsuccessful attempts to make a difference
(c) the LTSN/HE Academy subject centres were one of the most successful initiatives because they were run by teachers in the disciplines, for teachers in the disciplines. They give the lie to Barber’s ‘trust and altruism don’t work’. It was a sad day when the HEA under Paul Ramsden decided to close the subject centres down, rather than do the unthinkable and close the York office down, in the face of big budget cuts.
(d) the CETL initiative was hamstrung by appealing too much to the worst instincts of institutions and being written too much by people (at the centre) wanting to leave a legacy. The requirement to spend a lot of the money on a building was very odd, and in practice the CETLs were all designed for narrow subject-based initiatives. This wasn’t in itself a bad thing, but there was an opportunity to support an institution-wide L&T strategy, which was missed. (Full disclosure, I fronted a bid to seek funding for such a strategy. It might have been a bad bid, but I don’t think so – it was put together by a group of colleagues for whom I had and have the greatest respect. It made the second round but I think it just didn’t fit the mindset of the CETL-choosers). See also https://www.academia.edu/4697241/The_failure_of_the_CETLs_and_the_usefulness_of_useless_research_SRHE_News_8_Editorial_May_2012
(e) “without giving the sector the support it needs to deliver excellent teaching we use data (like TEF) as a punishment without offering a path to improvement. Enhancement needs time, needs permission, and needs support. All of these things need funding”. Absolutely right. But it is surely for the sector to find the funding. We shouldn’t need the OfS to tell us what our priorities are. Unfortunately, we probably do, because too many people think higher teaching quality takes too long to translate into higher revenue, or never does.
Thanks Rob – you are very kind.
I’ve written in more depth on CETLs before – https://wonkhe.com/blogs/cetls-and-the-ghosts-of-teaching-excellence-past/ – I was a part of the small team at HEFCE who delivered the programme within the constraints set by the DfES.
I would agree with you on the subject centres (interestingly many still exist as a network beyond funding, demonstrating the strength of the community – and a few have been taken on wholesale by relevant professional bodies).
But I disagree with you at point (e). I don’t expect OfS to tell people what their priorities are, merely to offer small amounts of funding and support so people can address their own priorities. Time, permission and support all have a cost – and very few institutions have the willingness (or in some cases, capacity) to cover this.
Thanks for this valuable summary. It is dispiriting to find I have spent so much of my career chasing the shape-shifting mirage of teaching quality. In its assessment form it was largely treated with resignation verging on contempt. The few real attempts at enhancement were much more favourably regarded. Chief among these were the Subject Centres. These were staffed by professionals dedicated to searching out good practice and then curating resources for others to use – really use – in the classroom. This is the only thing that made a real difference to my teaching practice and I mourned their loss. Bring back the Subject Centres.
David, all who are concerned with teaching and learning in higher education in the UK are, I feel, in your debt. It is so important to have this story (dismal as it will be to many) set out and placed on the record, not least in an era where matters of history and institutional memory seem to have fallen by the wayside.
I’d just add a further historical strand: an additional impetus for developments in higher education teaching was provided by the Nuffield Foundation in the early 1970s. Led by Tony Becher, its Group for Research and Innovation in Higher Education produced – on the basis of research into, and extensive visits to, higher education institutions – a huge array of reports, many of which stand the test of time. Its final report, ‘Making the Best of It’, is lucidly written, and many of its case studies and even several of its recommendations hold water today.
Perhaps, too, particular mention might be made of (Professor Sir) David Watson who played a significant part in many of the organisations involved in this story, from CNAA onwards, in promoting the cause of teaching and of its development in higher education.
Cheers
Ron Barnett
David, let me add my thanks to those of the distinguished colleagues who have already commented. An excellent and valuable piece. I will now go away to read your piece about the CETLs.