Third Mission Activities · Impact

Why Do Most University Impact Studies Fail to Adequately Assess Universities’ Engagement with Communities?

The traditional ‘big number’ approach to measuring economic impact is out of step with what places need from their universities. As a result, improvements in engagement are difficult to plan, implement and evaluate.
Written by James Ransom

Assessing economic impact in higher education has always been a challenging endeavour. With their traditional foci on teaching and research, universities tend to produce impact that is not always directly observable or easily measurable. Universities’ ‘Third Mission’ activities, and especially their varied formal and informal engagements with the local community and beyond, render measuring impact particularly problematic. Still, institutions need to ensure continuous improvement in engagement by consistently evaluating their performance against targets. Economic impact studies have gained popularity over the past decade as a convenient way of showcasing progress, but they tend to focus on ‘big picture’ numbers and are therefore often misleading, if not entirely meaningless. By overlooking important nuances and details, they provide a simplistic picture and can hardly be used as a foundation on which universities can develop successful strategies. I argue that universities should understand the limitations of impact studies and should find alternative methods of evaluating their performance in engagement activities.

Why we need change

You don’t have to look far to see economic impact studies. My former employer had a flagship biennial report with a steadily increasing figure for the impact of UK universities – £21.5 billion to UK gross domestic product at last count – which it has used successfully for lobbying and campaigning. As long as this figure keeps increasing, everybody is happy. Many institutions have their own studies – £650 million of impact here, £400 million there – often with regional or Local Enterprise Partnership (LEP) level disaggregation. Of course, such studies are not limited to higher education. We’re informed that shooting contributes £2 billion to the UK economy and supports the equivalent of 74,000 full-time jobs, and that ornamental horticulture and landscaping contributed £24.2 billion to national GDP in 2017.

There are helpful academic papers that deconstruct the methodologies for calculating economic impact and their common pitfalls, so I will not repeat that work here. Instead, I want to challenge the preoccupation we seem to have with ‘one big number’ impact studies, and ask what we lose in the process.

There are two shifts taking place which render the traditional impact study less effective:

1. A single large number fails to capture what is increasingly important. The shift towards universities being ‘for’ a place, rather than simply ‘in’ or ‘from’ a place, means this data needs to be far more nuanced. We need to know specifically who is benefitting, how, and who is left out. We need to know the businesses and the communities behind these numbers. As disillusionment grows with traditional measures of economic success – GDP, GVA – and attention to ‘inclusive’ and social development begins to be translated into policy change, economic impact analysis needs to keep up.

Traditional impact studies simply don’t do justice to the range of university activities. They measure spending, output and employment, but do not capture the full impact of engaging with communities in a marginalised neighbourhood, or of working with small businesses to strengthen their supply chains – activities that may have huge impact but make little difference to a £400 million headline figure. (Accounting for social value can help here.)

Traditional impact studies measure spending, output and employment, but do not capture the full impact of engaging with communities.

2. As we grapple with recovery from Covid-19, it is both tone-deaf and ineffective for universities to shout about how good they are while also asking government for assistance. Rather than communicating the size of their value added, university messaging needs to focus on solutions and partnerships. Policymakers need a more sophisticated understanding of impact, one that moves beyond broad figures to specific information on which communities, businesses and industries have benefited from the university, and who stands to benefit from future support.


What else is wrong with traditional impact studies?

I should note that economic impact studies are not all bad. It is helpful to see returns on investment, and to raise awareness that universities have economic clout and should be seen alongside other major industries. But they risk being a blunt instrument, obscuring highly patchy and inconsistent local impact behind impressively large numbers. Economic impact studies need to be married to a rich understanding of local impact – perhaps through something like an institutional heat map, combined with a survey of perceptions or social impact assessments.

Four further shortcomings come to mind:

Uniformity. Despite huge variation in local contexts across the UK, and in the individual histories and missions of universities, impact studies all end up looking pretty much the same. If you line up five university impact studies and remove the university names, can you tell who (or where) they are talking about? The uniformity of approach, and the practice of measuring success against numerical benchmarks, mean we lose sight of what may actually be needed. By working towards what is measured and counted, impact converges into a standardised set of headline numbers and the local context is lost.

If you line up five university impact studies and remove the university names, can you tell who (or where) they are talking about? The uniformity of approach, and the practice of measuring success against numerical benchmarks, mean we lose sight of what may actually be needed.

Impact. Slightly tongue-in-cheek, I would like to see an impact study of impact studies. Do they lead to positive change? Or boost perceptions of universities? Quite possibly. But next time you are in a taxi to a university, ask the driver about the impact of the university. You’re unlikely to be quoted an economic impact figure of £450 million a year to the LEP’s economy. You’ll probably be told about the business that decided to open a new site near the university, or the impact of students volunteering with communities (and how the university is good business for the taxi company – at least before lockdown). You might argue that economic impact analysis is aimed instead at funders and policymakers. But should it not also reach residents and businesses?

Fatigue. Somewhat cynically, does anyone really care whether the economic impact is £600 million or £900 million? Beyond a certain point, big-number fatigue sets in. Figures are not always directly comparable between institutions, and the process of reaching them is not always transparent (or easily replicable).

Unintended consequences. We are not at this point yet, but I can imagine a league table ranking universities by economic impact. Universities should be well aware of the limitations of league tables, and of the uncanny ability of rankings to shape and warp policies away from what is important – both for the institution and for the place.

Above all, my concern is that economic impact analysis can mask inequalities and ‘cold spots’ in university engagement. Of course, heat mapping as an experimental alternative brings its own set of issues: consistency between institutions, subjective judgements over the importance and intensity of shading, and the complexity of trying to map such a wide range of activity all need to be resolved. But heat maps may also expose quite starkly where a university is not working, and not having an impact – things that are hidden in the ‘one big number’ approach. Without a more nuanced approach to assessing impact, universities will find it difficult to ensure continuous improvement of their engagement practices.
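To make the heat-mapping idea slightly more concrete, here is a minimal sketch in Python of the kind of data exercise involved. Everything in it is hypothetical – the ward names, activity types and intensity weights are invented for illustration – and it deliberately sidesteps the hard questions raised above about who assigns the weights and how activities are defined.

```python
# Minimal, illustrative sketch of an institutional engagement heat map.
# All wards, activities and intensity weights below are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical engagement records: one row per activity, tagged with a
# local ward and a rough intensity weight (the subjective 'shading'
# judgement discussed above).
records = pd.DataFrame([
    {"ward": "Central",    "activity": "business support",     "intensity": 3},
    {"ward": "Central",    "activity": "student volunteering", "intensity": 2},
    {"ward": "Northfield", "activity": "school outreach",      "intensity": 1},
    {"ward": "Riverside",  "activity": "community research",   "intensity": 2},
])

# Include every ward the university claims to serve, so wards with no
# recorded activity appear as blank rows ('cold spots') rather than
# silently disappearing from the picture.
wards = ["Central", "Northfield", "Riverside", "Eastmoor"]
grid = (records.pivot_table(index="ward", columns="activity",
                            values="intensity", aggfunc="sum", fill_value=0)
               .reindex(wards, fill_value=0))

# Render the ward-by-activity grid; pale rows are the gaps that a
# single headline figure would hide.
plt.imshow(grid, cmap="Reds", aspect="auto")
plt.xticks(range(len(grid.columns)), grid.columns, rotation=30, ha="right")
plt.yticks(range(len(grid.index)), grid.index)
plt.colorbar(label="Engagement intensity (illustrative weights)")
plt.title("Engagement by ward: blank rows flag potential cold spots")
plt.tight_layout()
plt.show()
```

Even a toy grid like this makes the design choices visible: ‘Eastmoor’ only shows up as a cold spot because the list of wards is defined independently of the activity records – exactly the kind of judgement a single headline figure never has to make.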


The original article was published on my website and adapted for publication in ACEEU Spotlight.




Keywords

engagement, engaged university, assessment, impact, Third Mission

About the author

James Ransom
Independent Higher Education Researcher

James is an independent higher education researcher who has recently worked with the British Council, the Royal Society and the National Centre for Entrepreneurship in Education. He is a PhD candidate at the UCL Institute of Education, a Research Affiliate at the University of Rwanda and an Associate at Yorkshire Universities, and previously worked for Universities UK and the Association of Commonwealth Universities. His research focuses on the relationship between universities and place.

LinkedIn Profile

Image References

jchizhe @ envato