In our discussions with UK higher education leaders, we’ve become fascinated by the question of how universities are thinking about digital technology as part of their strategic planning – especially in the context of learning and teaching.
The pressures to deliver value for money for students and to be more efficient are powerful drivers. So, too, is the power of digital technology to expand access to education, enrich the learning environment, and generate data-informed insight about how students are engaging with learning – and how teaching staff are engaging with students.
As a result, over the last few decades the vast majority of universities have embraced digital, from cloud-based student record systems to virtual learning environments, from augmented reality classrooms to bots managing student queries, from lecture capture to learning analytics. High-speed wifi and device charging points everywhere on campus are a matter of basic hygiene.
But these efforts have not been straightforward, and some providers have had greater success than others in adopting digital technologies. University leaders have described to us creaking systems that require endless patches and workarounds to meet the needs of the students and staff of today, never mind tomorrow. For most senior leaders, stories of enterprise systems blocking improvements to the education experience are a major pain point, and “technical debt” is commonplace language in university boardrooms.
Technology moves faster than universities do, and it’s not always clear which technologies are a fad and which will become seamlessly integrated into our daily lives in the next decade. Backing the wrong horse could prove to be costly, and so “let’s wait and see” often becomes the dominant voice.
However, if universities are to master digital it will be vital to avoid digital determinism: the idea that organisations are obliged to adopt new technologies – or allow them to “disrupt” established ways of doing things – simply because the technology is there.
Technology should work for us, not the other way around! Yet across the sector we sense a degree of anxiety about what it looks and feels like to be an organisation that is fully prepared for a digital future.
There’s plenty of guidance out there about digital transformation, digital leadership, “digital first” approaches and the rest of it – and Jisc does a great job in bringing all this together and supporting universities to connect to the best of the tech landscape. We won’t presume to supersede the advice of the experts but we do have some reflections on how universities might focus their strategic thinking on digital to create the best possible experience for students.
The future is uncertain, so flexibility is key
When it’s not clear which technologies will hold sway in the next ten years, or what students’ expectations might be, the focus of digital thinking has got to be less on trying to predict the future than on creating space to adapt and experiment as it unfolds. Think about something like voice technology – frequently hyped as the next big thing, it can’t and won’t be anything at all unless it presents a solution to a specific learning and teaching challenge that students or teaching staff are grappling with – and that challenge may present differently in different subject areas or levels of study.
What matters most, then, is not that technologies and organisations are designed around current assumptions about how people ought to use them to meet a hazy set of “future” needs, but that they are capable of adapting to future changes and evolutions of practice.
But the real challenge and opportunity, in our view, is building the possibility of change and evolution into curriculum design. Having technology that can – for example – help students build connections around shared interests that cross subject disciplines and year groups to supplement the formal curriculum is no use if those connections cannot be realised within the institution’s pedagogical approach.
Co-production with students
We’ve heard university leaders say confidently that students are digital natives and are better equipped than educational leaders to articulate what they want from their digital learning environment. Generalisations aside, the first claim may be true, but we don’t think the second follows automatically.
There are good examples of students giving useful feedback on their experience of digital technology, and we are certainly not arguing against asking students for their perspective. But unless the students themselves happen to be experienced digital enthusiasts, we’re sceptical about putting too much confidence in student views of how technologies should evolve.
Instead, tried and tested methods of engaging with students – about how they experience the learning environment, what inspires them and what makes them feel discouraged, where they lack confidence and where they feel under-served – are probably going to generate more meaningful insight into how digital technology can support a thriving learning community.
Digital consumption and production
One higher education leader articulated the challenge as “students are very familiar with the consumption of digital materials, but they are not always skilled at being digitally productive”.
The capability to understand the power and limitations of data, and to work collaboratively with others in different locations and time zones, is vital in a technologically saturated age.
Understanding the importance of data privacy and resisting the lure of unproductive – or even harmful – digital engagement may be as important a skill as productive engagement, and one that is vital to the collective wellbeing.
The digital landscape is full of buzzwords. Many universities are pursuing “personalisation” as a way of being digitally up to speed. But personalisation is arguably just another word for students being able to chart their own course through the available learning resources (whether these are text-based, other forms of media, or people) and forge a personal connection with the material.
Aula’s ethos has always been pedagogy first, technology second. It is fantastic that digital technologies can enable and enhance this personalisation journey, but for the technology to be effective, teaching staff need a collective understanding of how students can develop the capability to co-create their learning.
Digital technologies should embed values
On the staff side, the development of digital capabilities can be too focused on how to use specific technologies. Good tech shouldn’t require detailed training – people should want to use it because it’s easy to use and likely to enhance their teaching practice.
People will always have different attitudes towards digital technology, ranging from early-adopting enthusiasm to outright suspicion. Both extremes have pros and cons, and most of us fall somewhere in the middle.
This means that, at the organisational level, it matters that space is created for deep thinking about how to adapt to the digital future. We’ve heard stories of digital innovation ideas shunted from committee to committee because nobody wants to take ownership of them. This is not an environment in which digital innovation can flourish and scale up when it works.
Likewise, digital brings real ethical challenges around data collection and use, mental health and community standards of online behaviour, and differential access to digital capital. These need thinking through within institutions – with the voices of the sceptics heard as well as those of the enthusiasts. Digital technologies can and should be carriers of institutional values, not an end in themselves.
This article is published as part of a Conversations on Learning series, in association with Aula.