Concerns over assessment and challenge in latest OfS quality investigation
Michael Salmon is News Editor at Wonkhe
The latest “boots on the ground” investigation from an Office for Students (OfS) assessment team has reported – it’s the second to look at undergraduate provision in the subject area of computing, after one published in November which found no areas of concern at Goldsmiths.
This one, at Bradford College, does find some things that the inspectors were worried about, specifically in the areas of educational challenge and assessment. Bradford College's six different computing BScs (or three, each with a foundation year option) are validated and awarded by the University of Bolton, and in the academic year 2022–23 catered to 72 FTE students.
As with previous investigations, there's no snap decision about the consequences for providers of having areas of concern detected – OfS will “continue to carefully consider this report before deciding whether to take further regulatory action.” We remain in the dark about how long these careful considerations are likely to take. All published investigations which have found areas of concern remain open – the earliest, into University of Bolton business courses, dates from 12 September 2023, more than eight months ago.
Areas of concern
The review team – three academics with expertise in computing, and an OfS staff member – visited the college in March and April 2023, with the decision to investigate taken back in December 2022.
The lines of inquiry that emerged during and from these visits were around regulatory conditions B1 (academic experience), B2 (resources, support and student engagement), and B4 (assessment and awards). Following investigation, these have given rise to three concrete concerns which are now reported back to OfS, potentially for regulatory action:
- the level of educational challenge and coherence was “below what would be expected of a computer science higher education course” (B1)
- high volumes of non-technical assessment in modules with technical subject matter, and issues around marking criteria (B4)
- issues with standards and marking that “may suggest that students achieve higher grades than the technical skills demonstrated support” (B4).
The report, we’re reminded, “does not represent any decision of the OfS in respect of compliance with conditions of registration.”
Once again, the assessment team has done a commendably thorough job, looking in some depth at module specifications and assessments. One area where this shines through is the comparison between the college's modules at levels 4, 5 and 6 and typical A level syllabus learning outcomes. This leads to the conclusion that:
in the academic judgement of the assessment team, core elements of a computer science curriculum were not taught at a level that was as challenging or deep as a Level 3 computer science course.
We also get what the team considered to be “indicative learning outcomes” for the topics at higher education level, drawing on what the British Computer Society (BCS) requires of the degrees it accredits elsewhere. This gets quite specific – so where for the team at higher education level students “should be confident in manipulating Boolean expressions and values to solve a range of complex computer science problems,” at the college students “only need to demonstrate a theoretical understanding of Boolean logic and to illustrate this through the mechanical process of constructing a truth table.”
The reviewers argue that this is “a process similar to basic arithmetic, and part of the AQA Computer Science GCSE curriculum.”
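For readers who don't teach computing, the distinction the reviewers are drawing is roughly the one sketched below. This is an illustrative example only – it is not taken from the report or from the college's assessments, and the expression and function name are invented for the purpose. Enumerating a truth table is a mechanical exercise, whereas applying Boolean identities to simplify an expression is the kind of manipulation the team says should be expected at degree level.

```python
from itertools import product

# Mechanical exercise: enumerate the truth table for (A and B) or (A and not B).
# This is the sort of task the reviewers liken to GCSE-level work.
def truth_table(expr, variables):
    rows = []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        rows.append((env, expr(env)))
    return rows

expr = lambda env: (env["A"] and env["B"]) or (env["A"] and not env["B"])
for env, result in truth_table(expr, ["A", "B"]):
    print(env, "->", result)

# Manipulating the expression itself is a different skill:
#   (A AND B) OR (A AND NOT B)  =  A AND (B OR NOT B)   [distributivity]
#                               =  A AND True           [complement]
#                               =  A                    [identity]
# Checking the simplification against every row of the table:
assert all(result == env["A"] for env, result in truth_table(expr, ["A", "B"]))
```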
The reviewers also looked beyond the syllabus to student work in making their assessment. For a selection of final year projects on software development lifecycles, they judge that none of the submissions they saw demonstrated a particular learning outcome that they felt should be there. Elsewhere it’s mentioned that not one of 14 final year project examples “demonstrated a level of technical skills and knowledge that would be expected of a computer science graduate.”
Assessment
Other concerns specifically about assessment arrangements are also flagged, alongside an overall concern about awarding:
In most examples of marked work that was shared with the assessment team, marking criteria were not strictly applied to the work being assessed and instead awarded grades were higher than would be earned through strict application of the criteria.
They note that a high proportion of assessments involve written reports and presentations, that academic writing and referencing appear to be heavily emphasised in both teaching and marking (“despite it not being one of the learning outcomes for a module, nor a skill that is especially relevant to being a computing or IT practitioner”), and that students add code to supplied templates, rather than writing code from scratch, at points where the team expected to see evidence of fully independent work.
Questions around marking rubrics also give rise to the review team's only comments on validation arrangements:
In discussions with college staff, it was explained to the assessment team that all assessments must be marked using rubrics supplied by the college’s validating partner – the University of Bolton – that are designed for the assessment of written work. This meant that where assessments were designed to test students’ practical technical skills, rather than testing writing skills, these were marked using a rubric which was not designed for the purpose of assessing practical skills.
The assessment team considered that this suggested that the academic regulations (marking criteria) were not designed to ensure the credibility of relevant awards because they were not designed for the assessment of technical practical work.
The wider context
Of the students studying computing degrees at Bradford College in the 2020–21 academic year, 89.4 per cent were from IMD quintiles 1 and 2 – the most deprived – according to OfS analysis cited in the report. The assessment team recognises this in its interviews, and links it to the question of challenge and appropriate course design:
Staff acknowledged that they considered a purpose of their courses was to provide “educational opportunities for all” including students who were not confident enough to succeed at other providers. The fact that many students were the first member of their family to study for a degree was provided as an explanation for why many students lack confidence. Teachers explained that a consideration in the design of the programme was to “not put students off” by the level of challenge.
The review doesn’t spell out any conclusion here around the vexed question of context and regulatory judgement, but the assessors are clear throughout that the educational challenge and coherence of the courses was “below what would be expected of a computer science higher education course,” giving rise to concerns that students were not required to develop the skills they should be developing.
As with some of the previous assessments, and in keeping with OfS’ move away from the QAA’s approach, within the report there’s a contrast between the quality assurance process in place at the college and the output of this process:
The assessment team acknowledge that the provider’s quality assurance processes have been robustly followed. These include programme approval, including external academic input, periodic review of programmes, consultation with industry partners, consideration of external technology certification and external examiner reports. The assessment team can see evidence from the paperwork resulting from these quality assurance processes that some aspects of the courses have benefited.
However, the conditions of registration require that the application of a provider’s quality assurance processes result in a high-quality course. It is the responsibility of a provider, not only to create and apply quality assurance processes, but also to ensure that these processes and their application are sufficient to meet the conditions of registration.
And, as ever, there is an interesting readthrough to 2023 TEF results, where the college was awarded silver for student experience and overall, and commended for its “very high quality teaching, feedback and assessment practices that support students’ learning, progression, and attainment.”
***
On 22 May, Bradford College told us that the computing programmes were to be discontinued. Its comment in full:
We are open to the findings of the recent Office for Students (OfS) quality assessment report relating to our BSc (Hons) Computing courses and addressing any identified areas of improvement. However, the College undertook an internal review of this area prior to the assessment and has already executed plans to target these concerns.
As such, the BSc (Hons) Computing programmes will discontinue after current students complete their studies this academic year. The College is also working with local employers to develop a range of relevant alternative higher technical qualifications that will boost in-demand regional skills and ensure we retain computing progression routes.
Bradford College is one of the largest education providers in the region with Ofsted rated ‘Good’ provision and a ‘Silver’ quality rating for undergraduate provision from the Teaching Excellence Framework (TEF). Our career-focused courses give students a head start across a wide range of aspirational careers, as we continue our mission of working together to transform the lives of some of the city’s most hard-to-reach students.
The report makes little to no mention of external examiners, who should have picked up on the issues the assessment team identified with the level of student work. So how robust were the quality processes? I'd be interested to know whether the sample from the college was buried in a wider University of Bolton sample for the external examiner (EE), and so overlooked, or whether the EE did review work from students at the college. If it was the latter, it raises the question of two different peer review processes (EE and assessment team) not agreeing on the matter of standards.