We’ve been waiting a long time for this. But now we know, for sure, that region of employment does have an impact on graduate salaries.
That may not in itself seem like much. It’s an obvious conclusion that anyone who has ever applied for a job in the UK would draw. But now we know that DfE are fully, publicly, quotably aware of this differential – and of the problems it causes for using Longitudinal Education Outcomes (LEO) data to identify the “best” or “worst” providers.
We got the first inklings that our (frequent) requests for regional data in LEO were being taken seriously back in the summer of last year – this release looked at the propensity for graduates from each HE provider to live in the region in which they studied, and offered a tantalising glimpse of the impact of the home region, and the regions of study and residence. (For the super keen, we also saw earnings by the current local authority of residence.)
It makes sense, on any level of interpretation, to consider regional salary and employment norms when looking at graduates – and with the tension between a “civic mission” that sees graduates enrich the local area but possibly on a low salary, and a “wealth mission” that sees graduates moving to London to maximise incomes, this omission has been one of the key strands of the emerging LEO critique.
But what was missing was the impact of these regional variations on the salaries of graduates from individual providers. The DfE told us that there was clear evidence that “region has a large and significant effect on earnings”. And if you squinted, you could see provider-level data plotted in figure 3. But that’s where they left it – with a dangling proposal to:
show the outcomes for each provider split into the following categories:
those that stay in the region of the provider, those that move to London, and those that move elsewhere in the UK.”
And that, today, is where we are. “Experimental statistics showing employment and earnings outcomes of higher education graduates by provider and current region of residence”. What we have is actually better than what we expected – we can see graduates separated out into all regions (including graduates working abroad, where we have that data) for each provider.
Still not perfect
The region a graduate lives in has an impact on provider-level medians. DfE use a weighted median in their top-level data (which I haven’t plotted here, for reasons that will become clear), and gleefully report significant differences between median salary by provider even after a graduate region-based weighting is applied.
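DfE’s actual weighting methodology isn’t reproduced in this release, but the general idea of a region-weighted median (reweight each provider’s graduates so their regional mix matches a national benchmark, then take the median) can be sketched as follows. The data, column names and weighting scheme here are my own illustrative assumptions, not DfE’s method.

```python
import numpy as np
import pandas as pd

def weighted_median(values, weights):
    """Return the value at which the cumulative weight passes half the total."""
    order = np.argsort(values)
    values = np.asarray(values)[order]
    cum = np.cumsum(np.asarray(weights)[order])
    return values[np.searchsorted(cum, 0.5 * cum[-1])]

# Hypothetical graduate-level records: provider, region of residence, salary
grads = pd.DataFrame({
    "provider": ["A", "A", "A", "B", "B", "B"],
    "region":   ["London", "London", "North East", "North East", "North East", "London"],
    "salary":   [32000, 30000, 24000, 23000, 25000, 31000],
})

# Benchmark: share of all graduates living in each region
national_share = grads["region"].value_counts(normalize=True)
# Each provider's own regional mix
provider_share = grads.groupby("provider")["region"].value_counts(normalize=True)

# Weight each graduate so the provider's regional mix matches the national mix
grads["weight"] = [
    national_share[r] / provider_share[(p, r)]
    for p, r in zip(grads["provider"], grads["region"])
]

# Region-weighted median salary per provider
print(grads.groupby("provider").apply(lambda g: weighted_median(g["salary"], g["weight"])))
```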
Of course, a gold standard LEO would absolutely need to take into account subject of study and provider alongside current region (or indeed, local authority) of residence – as well as the sex of a graduate, their GCSE performance, and a suitable measure of inequality. However, such a nuanced examination would produce numbers too small to publish without identifying individuals.
There is hopefully a point in the middle, where LEO gives us enough information to be usable while remaining publishable. This release is not at that point, but I feel like we are gradually iterating around it.
The publication itself is clear on the limitations:
It should be noted that the data presented here does not control for many other factors that can influence graduate outcomes e.g. prior attainment, subject studied and other characteristics. It should also be noted that a higher education will have a range of personal and societal benefits that extend beyond earnings, which by its nature are not captured in the statistics presented here.”
So how useful is the weighted median? If we’re not controlling by subject or by sex, not very. We (and DfE) know from previous LEO releases that these are two major predictors of graduate salary, to the extent that even the very first release separated data out in this way. Subject will also have a regional component – providers frequently offer courses that directly support local or regional industry. So rather than plot the top level data, I’ve gone straight to the underlying sources.
What we have got
First up, I’ve plotted the standard upper and lower quartiles and median for each provider, filterable by years after graduation and region of current graduate residence (the usual filters by group and provider region are there too). As well as giving us a chance to note which provider supplies the highest-earning graduates living in the North East five years after graduation (take a bow, er, King’s College London), we also run into our first problem.
I’ve included on each tooltip the number of graduates and of matched graduates (the ones they could find in the data), plus the percentage of these in employment with or without further study. This allows you to take a view on how meaningful the (median, remember) salary data is – if we are only talking about 20 (+/- 5, for rounding) graduates then the median can be dragged all over the place. Another sign that this has happened is out-of-kilter upper and lower quartiles.
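To make the small-numbers point concrete, here is a minimal sketch of the quartile and median calculation for each provider and region cell, with the graduate count alongside so that cells where a handful of outliers can drag the median around are flagged. All data and column names below are invented for illustration, not taken from the published file.

```python
import pandas as pd

# Invented graduate-level salaries: provider A has only four graduates in
# the region (two of them high earners), provider B has twenty
salaries = pd.DataFrame({
    "provider": ["A"] * 4 + ["B"] * 20,
    "region":   ["North East"] * 24,
    "salary":   [18000, 21000, 45000, 60000] + list(range(22000, 42000, 1000)),
})

summary = (
    salaries.groupby(["provider", "region"])["salary"]
            .agg(lower_quartile=lambda s: s.quantile(0.25),
                 median="median",
                 upper_quartile=lambda s: s.quantile(0.75),
                 graduates="count")
            .reset_index()
)

# With very few graduates, one or two high earners distort the median and
# push the quartiles out of kilter, so flag small cells for caution
summary["small_cell"] = summary["graduates"] < 20
print(summary)
```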
I’ve also included a plot of the proportion of graduates working in each region, by cohort. This is a way of understanding the behaviour of graduates from each provider – and makes more sense if you look at a mission or regional grouping.
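The underlying calculation for the proportion plot is straightforward: a cross-tabulation of provider against region of residence, normalised within each provider. A sketch (again with invented data and labels, not the published extract) might look like this:

```python
import pandas as pd

# Invented records: one row per matched graduate in a given cohort,
# with provider attended and region of current residence
grads = pd.DataFrame({
    "provider": ["Prov A"] * 5 + ["Prov B"] * 5,
    "region":   ["Scotland", "Scotland", "Scotland", "Scotland", "London",
                 "East Midlands", "London", "London", "South East", "London"],
})

# Share of each provider's graduates living in each region (rows sum to 1)
proportions = pd.crosstab(grads["provider"], grads["region"], normalize="index")
print(proportions.round(2))
```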
Many providers see the majority of graduates stay in the region their alma mater is in – with a particularly pronounced Scotland effect. The effect is weaker for Russell Group providers and for a handful of other exceptions (for example, only 18 per cent of Loughborough graduates and 24 per cent of Leicester graduates are still in the East Midlands five years on), and particularly strong for Million+ providers and FECs.
And finally, both salary and proportion data can be examined on my provider dashboard – which gives you insight into trends in a single institution. The proportion is shown as a thin red line, the salary data using the usual symbols.
What it means
This publication is a critique of previous LEO publications – demonstrating how incomplete previous analysis has been without this component. However, as it does not include other characteristics (most notably, subject of study, sex, and prior attainment) already known to have an impact on salary, we can’t use this release for detailed analysis either.
It’s a promising direction of travel – it makes the salary data used everywhere from Discover Uni to Sam Gyimah’s app competition winners obsolete (not that we’ll see any changes there!) and gives us a glimpse at trends in graduate behaviour we have not been able to see before. But question marks remain over the practical utility of LEO for policy.
This data-driven approach to student outcomes is the future of education, providing better information for future students to choose where and what to study.
For international students we have been collecting data since 2016 and are currently tracking the graduate outcomes of 162,290 unique international graduates.
The latest report from Cturtle looks at data from international graduates who have studied in the UK, including: university support in part-time employment while a student, internship program participation, time to first job after graduation, location of first and current job, industry of first and current job, and monthly income of first and current job.
Cturtle is the market leader in tracking international graduates who have studied in the UK, USA and Australia with a network of over 1.3M international alumni who have returned to ASEAN, Greater China and South Asia.
Cturtle works with innovative universities to connect their international graduates with over 12,000 hiring managers across Asia.
Report Link: https://cturtle.co/2020/01/14/uk-universities-international-graduate-career-outcomes-2020/
“many providers see the majority of graduates stay in the region their alma mater is in – with a particularly pronounced Scotland effect. ”
Sigh … Scotland is not a ‘region’ like those in England!
It’s not rocket science or profound. Scotland is a nation of a union-state, with its own education system, distinct from that of England and the rest of the UK, hence the ‘stay’ pull. It certainly isn’t to do with extraneous factors – such as the glorious climate and scenery!
Of course Scotland is also part of the UK macro-economy and employment market (at least for the time being…) but ‘region/nation’ is a more accurate descriptor of top-level UK variations … see ‘The UK Regional-National Economic Problem’ by Philip McCann for more correct use of terminology. See “The Scottish Economy” by Gibb, Maclennan, McNulty & Comerford for some up to date analysis and data.
But also within the Scottish national education system and economic environment, there are significant regional variations comparable in significance to those in (much larger) England. Examples include the impact of the oil and gas industries in the north-eastern region on the offer and graduate destinations of universities located there, the ‘pull’ of the tourism and finance sectors in Edinburgh, and the impact of poverty and de-industrialisation on participation and graduate employment in the western areas of the central belt.
So, treating the whole of Scotland as an ‘additional’ single region of the English economy is false science and deserving of more than an aside or footnote in analytical frameworks.
Region or *nation*, please, if you’re using Wales… 😉
I use region to denote European Parliament electoral region, as this is what most UK official data uses. When I am offered more detailed geographic areas I prefer to use them.
I have been wondering if we’ll still get data using these regions post-Brexit…
Sorry if I have misunderstood, but I can’t see that any consideration has been given to the demographics of student intake within each region – i.e. some institutions have a relatively local pool of students, so it would be misleading to consider these students as “stay[ing] in the region their alma mater is in”; they’re just *continuing* to stay in their home region after graduation.
@JW you are correct – this analysis is missing from the report.
This further highlights the limitation of the LEO data. Institutions in the central belt may bemoan comparisons with institutions in the South East of England, but the same would be true for an institution like UHI being compared with a university in Edinburgh. Charlie Ball has lots to say on this, including a Wonkhe blog, and more insights are coming soon…