Grants and hiring: will impact factors and h-indices be scrapped?

Researchers across all disciplines worldwide could soon be assessed under a new agreement that drops impact factors and other standard metrics. Credit: Getty

Universities, scientific academies, funding institutions and other organizations around the world will have the option to sign a document that would oblige signatories to change how they assess researchers for jobs, promotions and grants.

Signatories would commit to moving away from standard metrics such as impact factors, and adopting a system that rewards researchers for the quality of their work and their full contributions to science. “People are questioning the way they are being evaluated,” says Stephane Berghmans, director of research and innovation at the European University Association (EUA). The Brussels-based group helped to draft the agreement, which is known as the Agreement on Reforming Researcher Assessment. “This was the time.”

Universities and other endorsers will be able to sign the agreement from 28 September. The European Commission (EC) announced plans last November for putting together the agreement; it proposed that assessment criteria reward ethics and integrity, teamwork and a variety of outputs, along with ‘research quality’ and impact. In January, the commission began to draft the agreement with the EUA and others.

The details of the agreement were published on 20 July. The intention behind the two-month head start, Berghmans says, was to give universities time to fully consider the document and the ramifications of signing. The process is likely to involve reviews by institutions’ upper levels of administration, and perhaps approval by legal departments, he says.

Labour-intensive

“This initiative is an excellent move,” says Pernilla Wittung-Stafshede, a biophysical chemist at Chalmers University of Technology in Gothenburg, Sweden, and a member of the Nobel Committee for Chemistry at the Royal Swedish Academy of Sciences. She adds, however, that her own university is unlikely to sign. The proposed approach is much more labour-intensive than a simple calculation of an impact factor, she says. “It’s going to be really hard as it speaks for doing quality evaluations, which take more time.”

More than 160 European universities and more than 190 other research organizations expressed support for the initiative at a stakeholder assembly in July, but Berghmans says that there’s much work yet to be done to raise awareness of the final agreement. The EUA represents more than 850 universities, so a significant proportion of the membership has yet to express a willingness to sign. “We know there are many universities out there that don’t know the details or haven’t really heard about it,” Berghmans says. “Our hope is to build momentum.”

Pernilla Wittung-Stafshede, a biophysical chemist at Chalmers University of Technology in Gothenburg, Sweden, backs the initiative, but warns that many universities still know little about it. Credit: Chalmers University

Only two universities in the United Kingdom — the University of Glasgow and Loughborough University — participated in the assembly. By comparison, the Netherlands was represented by a dozen universities. Berghmans notes that universities in the Netherlands are generally “well ahead” in efforts to reform researcher assessment. “The place where you start doesn’t matter as long as there’s progress,” he adds.

The agreement requires signatories to consider researchers’ contributions to science that go beyond publications and citations; these would include teaching, academic leadership and peer review. It calls for institutions to abandon the “inappropriate” use of the journal impact factor and the h-index, two metrics commonly used as proxies for researcher productivity. Such rankings offer an incomplete and sometimes misleading summary of researcher quality, Berghmans says. “We aren’t saying stop using these metrics,” he says. “We’re saying don’t misuse them.”
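Part of the h-index’s appeal, and part of the concern about its misuse, is how little it captures: a researcher has index h if h of their papers have each been cited at least h times. As a minimal illustrative sketch (the citation counts below are hypothetical, not drawn from any real researcher):

    def h_index(citations):
        """Return the h-index: the largest h such that h papers
        have at least h citations each."""
        # Rank papers by citation count, highest first, then count how
        # many papers still have at least as many citations as their
        # 1-based rank.
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    # Hypothetical example: five papers with these citation counts.
    print(h_index([25, 8, 5, 3, 3]))  # -> 3 (three papers have >= 3 citations)

A single integer like this says nothing about teaching, peer review, leadership or the other contributions the agreement asks institutions to weigh.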

Subconscious bias

The agreement also calls for institutions to stop considering the ranking or reputation of a researcher’s institution when making hiring decisions or delivering awards and grants. Such factors give some researchers an unfair advantage, Wittung-Stafshede says. She notes that researchers who have positions at highly ranked institutions are often seen as more accomplished, resulting in a sometimes subconscious bias that is hard to erase. She adds that such bias sometimes takes place at a country-wide level. “It’s easier to get ERC [European Research Council] grants if you work in the UK — before Brexit — or Germany or France than if you work in, say, Hungary or Norway,” she says.

“University rankings, being a concoction of opinion polls and weak proxies, generally do a poor job of evaluating the richness and breadth of institutional capacity for research and education,” says Stephen Curry, a structural biologist at Imperial College London who chairs the global Declaration on Research Assessment (DORA). “No serious scholar could defend their use in the evaluation of individual researchers.”

DORA, a statement originally drafted in 2013, calls for doing away with impact factors in hiring and promotion decisions. The initiative has been signed by more than 19,400 individuals and 2,600 organizations worldwide.

Berghmans emphasizes that the new agreement goes beyond a declaration because it comes with actual commitments. No enforcement measures or specific penalties for non-compliance are in place, but signatories will face “peer pressure” to live up to the document, he says. “Over a year, after five years, there will be a need for organizations that sign up to show what they are doing and how they are progressing.”

Although the document was created by European institutions, Berghmans says that he would welcome signatories from elsewhere in the world. “This is not just a European issue,” he says.
