Higher education has often been slow to embrace digital tools or to use them to their full advantage.
There are many reasons for this, including gaps in digital capability, the capital investment required, critical perspectives towards technology, and a reliance on a handful of tech enthusiasts to lead the way.
In the age of generative AI, though, the sector may need to move faster to embrace these new tools, as other sectors already have done or are now doing. Generative AI is already becoming deeply embedded in working practices across a wide range of professions.
By following this lead, university teachers, and indeed professional services staff, can save time on tasks – freeing more time to talk to and engage with students, individually or in small groups. And if we are transparent about our use of generative AI, we can also demonstrate to our students the significance of this technological advance and model safe ways to use it.
Bouncing ideas off the screen
Course planning is a hugely time-consuming task for education professionals. Whether designing a new course or reviewing an old one, the process takes considerable effort. Systems like ChatGPT and Bing Chat, if prompted carefully and thoughtfully, can be invaluable in supporting this work.
For example, prompting a generative AI system to write learning outcomes from a narrative description of a course or module can provide a very useful starting point for polishing those outcomes – or, conversely, might make it very clear that the narrative isn’t saying what we thought it was.
Similarly, early drafts of learning outcomes can be put to generative AI to find clearer or more student-focused ways of pitching them for a course or module. The tool in effect becomes more than a proof-reader, enabling us to engage in critical conversation with ourselves by challenging the meaning of our own words.
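For colleagues who prefer to script this kind of prompting rather than work through a chat window, the same idea can be expressed in a few lines of code. The sketch below is purely illustrative: it assumes access to OpenAI’s Python client, and the model name, course narrative and prompt wording are placeholders rather than recommendations of any particular provider or phrasing.

```python
# Illustrative sketch only: drafting learning outcomes from a course narrative.
# Assumes the OpenAI Python client is installed and an API key is configured
# in the environment; model name and prompt wording are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

course_narrative = """
An introductory module on research methods for first-year social science
students, covering study design, research ethics, and basic quantitative
and qualitative analysis.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are an experienced higher education course designer."},
        {"role": "user",
         "content": "Draft five measurable, student-focused learning outcomes "
                    "for the module described below. Use active verbs.\n\n"
                    + course_narrative},
    ],
)

# The draft outcomes are a starting point for critical review, not a final text.
print(response.choices[0].message.content)
```

As with the chat-based approach, the value lies less in the draft itself than in what it reveals: outcomes that read oddly usually point back to a course narrative that needs rethinking.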
Alongside the statements that form the basis for a course, the planning team also have to think through how the course might best be delivered. This includes learning materials and learning activities, including assessments. Here again, much time may be spent in isolation developing ideas for classroom exercises, ice-breaker questions, examples to illustrate complex concepts – possibly designed for different levels of prior knowledge and understanding – slides and other presentation formats, discussion questions, and scenarios for authentic assessments.
All of these can now be created with the support of generative AI tools. Nor is this list exhaustive: there are no doubt many more teaching artefacts that generative AI can help to create – artefacts that only the subject expert, deeply immersed in ensuring the success of their students, will think of.
Digital juicing
Some may doubt that generative AI tools like ChatGPT can save time, especially given the need to carefully check the outputs from such systems.
The authors’ experience – having used such tools, for example, to generate a first draft of slides for a presentation, to turn a bland descriptive video script into a more engaging dialogue between two fictional characters, or to develop an overall plan for a teaching session – is that time really is saved.
And not only is time saved: there can also be relief, fulfilment and indeed joy in being able to seek advice, inspiration and genuinely good ideas from this assistive tool. When we struggle to find the energy to fill a blank page, in our experience the views and ideas offered by a generative AI can get our creative and analytical juices flowing again – even if we completely disagree with, or simply hate, its outputs.
A critical friend
We do not advocate for the wholesale move to having lessons, reading lists and assessments planned and generated solely by generative AI. Rather, despite – or indeed because of – the need to carefully and critically analyse AI output, these tools provide a useful companion, a second voice, or perhaps even a critical friend, as we begin the process of planning courses, modules and classroom activities.
By transforming many of the time-consuming processes that go into planning a course, these tools can support us in delivering a quality learning experience.
We would be doing our students a disservice by not embracing time-saving generative AI tools in our own work, such as our course planning. We also need to be transparent about how we legitimately and sensibly use these tools, and to encourage our students to follow our example in their own work.
We believe that – for a sector seeking more substantial partnerships with students in learning – this is a far better way to handle innovation than further opening the divide between those who assess and those who are assessed.
In my opinion this fails to consider the history of what you call ‘time saving’ devices, which have surely, under capitalism, functioned as labour-saving devices – devices whose primary aim is to render capital independent of workers, who are deskilled and/or thrown into unemployment. Thus far digital tools, from PowerPoint to Blackboard or Moodle, have not reduced academic workloads but increased them. If AI is indeed a tool which will free capital – e.g. ed tech capital, but also marketised universities – further from its dependence on labour, it would be quite wrong to argue that academics would be doing a ‘disservice to students’ by resisting further digital encroachment and automated alienation: in my experience students prefer their human tutors to HEI surpluses and the automated means to extract them. The moralising discourse presenting pressure to push increasing amounts of academic labour online as a service to students is frequently a technosolutionist mask for an age-old capitalist ideology. It’s a short piece, I realise, but it would be good to know how, this time, things would be different.