It had been rumoured that the latest batch of Longitudinal Education Outcomes (LEO) data was due for release the same week as the Teaching Excellence Framework (TEF) as a sop to those prestigious universities expected to do badly in TEF, but whose graduates are high earners. Both releases are set to shake up the HE sector, but LEO has the potential, through its granularity, to have a much bigger impact, both on prospective students’ decision-making and as a tool for the government.
Publishing data about graduates’ earnings is a sure-fire way to draw attention to the economic value of a degree. Or, rather, to the value for those individuals who hold particular qualifications and end up paying tax in the UK. Let’s say it upfront: the link between the subject studied at a particular university and graduate earnings is not a neat cause-and-effect relationship.
Daylight come
LEO is supposed to help students make choices about where to study; the sunshine of presenting the data should be the disinfectant sought by policy-makers hoping to stimulate the great HE marketplace. The insipid quote from the Department for Education says as much:
“Our universities rank among the best in the world and this data confirms that having a degree can lead to rewarding and well-paid jobs. Young people recognise this, with more of them going to university than ever before – including record numbers of 18-year olds from disadvantaged backgrounds. We are improving the university system to benefit all students by giving them and their families the best possible information so they can make the right choices for their futures.”
But there is a legitimate fear that – in addition to its role in public information for prospective applicants – LEO will be used as a policy tool to manage institutions’ or students’ access to student loans company funds. As David Morris has said elsewhere on Wonkhe:
“At present, the Treasury continues to underwrite a significant proportion of student loan repayments under the terms of income contingent loans. LEO’s power is that it can be used to identify the long-run individual economic returns to higher education, which can in turn be used to regulate (read: minimise) levels of state subsidy.”
Understanding the politics of LEO requires some understanding of problems with the data, as well as its capacity for political deployment. Much of the sector’s concern about LEO has focused on these deficits, and they’re important, though pointing out the problems isn’t the same as getting your own way on how the dataset will be used.
Known groans
The meat of the latest LEO data release is the capacity for analysis by subject and institution; that’s the data of greatest interest to the sector given the earlier publication of earnings data by subject at aggregate level. However, the DfE’s explainer on the release goes into significant detail on the subject areas rather than drawing out the institutional results (leaving that for the league tablers?). As those of you familiar with subject coding know, the 23 subject titles in use cover a wide variety of programme types and subjects. These broad groupings therefore mask variation by factors such as whether any given course of study has a vocational focus.
Woven throughout the LEO analysis are caveats about what’s in, and what’s not. For example, the data release doesn’t include alternative providers, but the University of Buckingham is included as it returns data to HESA. The assessment of students’ prior attainment covers their A-level scores, but doesn’t capture other qualifications (e.g. BTECs). And prior attainment is only available for English HEIs.
The biggest caveat of all, glibly expressed in the DfE document, is that while we have the data by institution and subject studied: “There are a number of factors that can influence the employment and earnings outcomes of graduates beyond the subject and institution attended.” And there’s a similarly important note about regional variations:
“There are also well-documented regional difference in pay across the UK. We have published the region that each university is located in. However, we do not have the current address of the graduates, so we do not know whether they have stayed within the region where they went to university or have moved to a different region to access a job with higher pay.”
The debate will rage about whether, given these known issues, it was ever the right thing to do to release the data in this way. But that debate is well-known to DfE: as part of its suite of documents released with the LEO data, the Department published a response to its survey on LEO following last December’s data release. The consultation received a paltry 24 responses from providers, representative bodies and other interested groups.
The most pressing concern in the sector’s responses was a plea for the contextualisation of the LEO data, something which might have overcome some of those identified issues. DfE’s survey response shows that there were calls for data to be released concurrently, showing salaries relative to regional variations and benchmarked against similar student cohorts (as TEF is):
The response to the questions prompted a large number of suggestions. Generally, respondents felt that including information from the Destination of Leavers from Higher Education (DLHE) and the Teaching Excellence Framework (TEF) was important. There was also a number of suggestions for including regional information on salary differences, the average graduate salary along with regional labour market characteristics and sector benchmarking as a means of contextualising the employment and wage data.
While we know that the social and financial capital of students prior to university has a big impact on their employment prospects, there is the potential – perhaps the inevitability – for LEO to shape student recruitment strategies. With variations by gender and ethnicity, the danger of misuse of the LEO data should be a significant concern:
One respondent argued that the large variations in outcomes and earnings by subject and characteristics should be measured against benchmarks rather than raw measures. They feared the data would offer a perverse incentive to avoid recruiting groups with characteristics that are associated with lower employment rates.
The response promises, helpfully, that there will be more information to follow: “DfE has also commissioned research which will use LEO data to look at graduate outcomes after controlling for the different characteristics of graduates.” This should be welcomed, but in spite of the survey responses, and the widely-discussed problems with presenting the data in precisely the way this release does, the heat of the debate may still centre on particular institutions and particular subject areas without sufficient nuance.
Sin of omission
The call for benchmarking the LEO data to take into account the many factors affecting student employability and earnings should, in theory at least, leave us with results which show the ‘added value’ of particular courses and institutions. But doing so is a particularly complex process, not least when we don’t tackle some of the major issues facing students’ prospects. In the December data release, there was extensive analysis of variations in outcomes by ethnic group, something completely missing from the June release. As we have seen in UCAS data, the HE sector has some uncomfortable questions to answer about race. Perhaps this was too uncomfortable for DfE.
The big picture
The LEO data release is just one of the new ways in which higher education providers are finding themselves judged on outcome measures. Institutions, alert to this quantitative turn, need to understand their own data and to make the necessary interventions if they’re going to survive. If LEO’s full force is deployed – either as the rocket fuel for the market, or as a tool to micro-manage access to funding – then there is the potential for some very hard times ahead for universities facing a string of political, social and economic woes. LEO’s roar will be heard.