Assessing the impact of university research is not complicated; it carries benefits for both the university and society, says Professor Chris Brink

Published On: 23 September 2023

Professor Chris Brink, who has been focusing on the societal impact of higher education research for about 10 years in both the United Kingdom and Hong Kong, says people are initially perplexed by the idea of it.

They say: “I don’t know what it means, and I don’t know how to do it. I’m not sure how you can have a definition of it. I don’t see how you can evaluate it. It’s all very complicated”.

He demonstrated to delegates at Universities South Africa’s (USAf’s) 7th Biennial Research and Innovation Dialogue how simple this was, conceptually. He was speaking during a panel discussion on Research and Innovation Impact at the Dialogue, which took place in Umhlanga on 21 and 22 September. The Dialogue is an initiative of USAf’s Research and Innovation Strategy Group and is part of the Group’s priority to strengthen research and innovation in the higher education sector.

Brink is regarded as an expert on the subject. He started focusing on the impact of higher education when he was Vice-Chancellor of Newcastle University in the United Kingdom (UK), a position he took up after holding the same role at Stellenbosch University. He was convener of the sector-wide Hong Kong Research Assessment Exercise (RAE) in 2020 and is also convening the 2026 exercise. Appointed Commander of the Order of the British Empire (CBE), one of the Queen’s honours, for his distinguished contribution to higher education, Brink is the author of the seminal book, The Soul of a University – Why excellence is not enough (Bristol University Press, 2018).

Switching the mind

Brink said understanding impact is a mind switch. “And I’m going to try to persuade you it’s worth doing,” he said.

He suggested starting with case studies of impact, such as those submitted to the Research Excellence Framework in the United Kingdom (UK), which assesses the quality of research in the country’s institutions. The framework is conducted by the four UK higher education funding bodies, namely Research England, the Scottish Funding Council, the Higher Education Funding Council for Wales, and the Department for the Economy, Northern Ireland.

It was first carried out in 2014, again in 2021, and replaced the previous Research Assessment Exercise. It involves each research unit at 150 universities submitting its research outputs along with narrative case studies of the societal impact of its research.

Each department must answer two questions:

  • What beneficial change has your unit’s research brought about in society; and
  • What evidence can you provide of such change?

The website of the Research Excellence Framework 2021 includes a searchable database (https://results2021.ref.ac.uk/impact) of every case study that every department and research unit had to submit. Type “biology” into the database, for example, said Brink, and, although the search is quite slow because it holds thousands of examples, it brings up 325 impact case studies.

For each case study there is a summary. There are also options to see the underpinning research, the details of the research, and the evidence. And the search facility includes the option to select research from a particular university.

Brink’s search for mathematics research brought up 208 examples. One case study was Acoustics for the benefit of musicians, done by London South Bank University, and it stated the impact type was cultural.

“You can do it for UK 2014. You can do it for Hong Kong RAE (https://impact.ugc.edu.hk/). And for the Australian Research Council’s engagement and impact assessment (https://dataportal.arc.gov.au/EI/Web/impact/ImpactStudies),” said Brink.

Definition of research impact

Brink said the informal definition of impact used to be what your peers in the same discipline think of your research. Although that was valuable because it ensured quality, impact now has an added dimension. “Now it is about what difference has this made in society? This is the mind switch,” he said.

He also provided the formal definition of impact used in Hong Kong. Its key word, he said, is “demonstrable”:

“Impact is defined as the demonstrable contributions, beneficial effects, valuable changes, or advantages that research qualitatively brings to the economy, society, culture, public policy or services, health, the environment or quality of life, and that are beyond academia. Impact in this context includes, but is not limited to:

  • positive effects on, constructive changes or benefits to the activity, attitude, awareness, behaviour, capacity, opportunity, policy, practice, process or understanding of an audience, beneficiary, community, constituency, organisation or individuals; or
  • the reduction or prevention of harm, risk, cost or other negative effects.”

He said impact is not the same as outreach, engagement or community work. Neither is it geographically bound. “It might be, but it doesn’t have to be where you are. You might generate an impact in your town or in Africa. Or you might generate an impact in China or around the world. It doesn’t matter. Just tell me what impact you made. You might have planned it, or you might be lucky to get some impact without it being planned,” he said.

Evaluating impact

He said points to consider when evaluating research impact included:

  • It must be research-based, that is, “If you are the head of the Department of Biology at a particular university, you must be able to trace the impact back to research done in that department at your university,” he said;
  • It must be substantiated by verifiable evidence; and
  • It must already have happened: “Promises are not good enough,” he said. “It must have happened already.”

Evaluation needed peer-review panels, in the same way research outputs do. “But usually, and importantly, you supplement the academics with people from society. So if some department submitted impact studies in entrepreneurship, then you have some entrepreneurs on the panel. If somebody submitted impact case studies on water management, you have some people who work in water on the panel. This is about societal impact, so evaluation must have societal input,” he said.

Evaluation criteria must also include reach. “That is, how far did the impact go relative to the natural domain of application? So if it’s about water, which areas of drought or water-scarce areas did you reach? If it’s about education in schools, which schools did you reach?” said Brink.

Impact is judged on the same scale as research outputs, using ratings similar to South Africa’s National Research Foundation categories, such as world-class, internationally recognised and so on.

He pointed out that websites such as the Research Excellence Framework’s do not have the judgments because those are given to universities and are not in the public domain.

Impact attracts research funding

One lesson they have learnt is that impact generates funding. “When you begin, people are sceptics. It takes them out of their comfort zone, and you just must keep explaining. Eventually, it turns out that if you have a portfolio of case studies, when you next apply for funding, that really turns people’s minds,” said Brink. And that way, the sceptics are won over.

He said it is not a case of excellence versus impact: “It is the same thing, just looking at it from a different angle.”

Impact is not about activities or promises

An activity is not an impact. “Oftentimes you get people saying, ‘But I have organised school children to come and visit my department’. And I say, ‘Well, I’m delighted to hear it. But that was an activity. Now tell me, what difference does it make?’” said Brink. “Doing something is not an impact. You must check to see what has changed as a result of doing that.

“And finally, I’m not interested to hear about what you intend to do five years from now. I’d like to know what you accomplished five years ago, or three years ago or last year or this year. It’s track record, not promises”.

Impact is not boxed into disciplines

Brink concluded his presentation with a diagram that showed disciplines on the left-hand side and, on the right, areas where those disciplines had created impact. The diagram was a complicated mesh of connecting lines. It was not fabricated; it came from an analysis done by a company called Clarivate.

“The left-hand side is organisational, the right-hand side is an empirical study. If you follow the discipline of biology, for example, you see biology could have an impact in agriculture and fisheries. And it did. Because this is a real study.

“And likewise, if you begin on the right-hand side and look at public policy, for example, and see where the impacts in public policy come from, you will see they came from practically anywhere. So the lesson here is that impact doesn’t come boxed in disciplinary boxes. It is by nature interdisciplinary,” said Professor Brink.

Gillian Anstey is a contract writer for Universities South Africa.