Funders and research institutions want to be able to monitor the progress of the research they are funding and, in turn, are increasingly requiring researchers to provide outcomes data. There are clear advantages in improving insight into researcher contributions and impact – whether cultural, economic, medical, political, scientific, or social – but this often comes at a cost. The administrative burden on researchers, in particular, is arguably higher today than it has ever been. There is also concern within the research community about the overuse of metrics such as the Impact Factor in making hiring, promotion, and tenure decisions, as expressed, for example, in the DORA declaration.
The Metric Tide, a recent Higher Education Funding Council for England (HEFCE) report on the role of metrics in research assessment and management, calls for what it terms ‘responsible metrics’, which are defined as having:
• Robustness: basing metrics on the best possible data in terms of accuracy and scope
• Humility: recognizing that quantitative evaluation should support – but not supplant – qualitative, expert assessment
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
• Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
• Reflexivity: recognizing and anticipating the systemic and potential effects of indicators, and updating them in response
James Wilsdon, Chair of the report, gave us a sneak preview at the recent ORCID-CASRAI conference in Barcelona, and we are delighted that the final report confirms the recommendation to mandate ORCID iDs for the next UK Research Excellence Framework (REF) – the national research assessment system – as part of a broader effort to improve the data infrastructure that supports research information management. Perfect timing, given our recent agreement with Jisc for a national consortium in the UK – and also because the UK Research Councils’ preferred research management system (RMS) is Researchfish, whom we have just welcomed as an ORCID member.
“Working with ORCID will enable researchers to link their ORCID iD with their Researchfish account,” says Frances Buck, Director of Researchfish. “This will allow researchers to push and pull publications and award data between the two systems. This researcher-led interoperability means that any system able to consume the ORCID iD will be able to benefit from the high-quality, verified-at-source data that the researcher chooses to share publicly via their profile. Researchfish will automatically associate the funder’s grant reference number (verified by the funders using Researchfish) with the publication data, to enrich the ORCID repository and provide another valuable link in the research data chain. To date researchers have reported in excess of 500,000 publications in Researchfish.”
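The interoperability Buck describes rests on systems being able to consume an ORCID iD and retrieve the publicly shared record behind it. As a minimal sketch of what that looks like for a consuming system, the snippet below builds a request against ORCID's public API and pulls work titles out of the response. The endpoint version (v3.0), the response shape, and the helper names are illustrative assumptions, not part of the Researchfish integration described above; consult the ORCID API documentation for the authoritative interface.

```python
ORCID_PUBLIC_API = "https://pub.orcid.org/v3.0"  # assumed public-API base URL

def works_url(orcid_id: str) -> str:
    """Build the public-API URL for a researcher's works, given their iD."""
    return f"{ORCID_PUBLIC_API}/{orcid_id}/works"

def extract_titles(works_response: dict) -> list:
    """Pull work titles out of a /works response body.

    Assumes the v3.0 response shape: a top-level "group" list, each
    group holding one or more "work-summary" entries with a nested
    title object.
    """
    titles = []
    for group in works_response.get("group", []):
        for summary in group.get("work-summary", []):
            title = summary.get("title", {}).get("title", {}).get("value")
            if title:
                titles.append(title)
    return titles
```

A consuming system would then issue a GET request to `works_url(...)` with the header `Accept: application/json` (for example via `urllib.request`) and feed the parsed body to `extract_titles`. The point of the design is the one Buck makes: any system that stores the iD can fetch the same verified-at-source record, rather than each system maintaining its own copy.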
Researchfish is also starting to support funders outside of the UK, in particular in Canada and Scandinavia, to help researchers in those countries benefit from a lighter administrative burden. Our national agreement in Italy, also announced recently, has a similar goal. Under the auspices of ANVUR which, like HEFCE, is responsible for the national assessment exercise (the Italian VQR), the Cineca consortium is implementing ORCID across 70+ universities and research institutes in the first instance. They see ORCID as a way to reduce the reporting burden on researchers, for example, as a result of integrating ORCID iDs into university and funder systems, as well as auto-updating between CrossRef and ORCID records (more on this shortly). Their intention is to ensure that at least 80% of Italian researchers have an ORCID iD, with links to their research outputs back to 2006, by the end of 2016.
At the ORCID-CASRAI Conference in May, João Moreira (FCT) and Alcino Cunha (Universidade do Minho) presented the Portuguese vision for an ORCID-based synchronization framework for a national CRIS ecosystem. Again, the goal is to use ORCID as a way of reducing the administrative burden on researchers through the twin aims of inputting once, reusing often; and automatic synchronization between research data systems.
Other countries will soon be launching similar national burden-reduction schemes. In Australia, for example, the recently announced Joint Statement of Principle: ORCID – Connecting Researchers and Research has been endorsed by Universities Australia (UA), the Australasian Research Management Society (ARMS), the Council of Australian University Librarians (CAUL), and the Australian National Data Service (ANDS). A joint statement issued by the ARC and NHMRC states that: “The use of ORCID may help streamline research administration and reporting for researchers and administering institutions.”
To quote Liz Allen, Head of Evaluation at the Wellcome Trust, ORCID Board member, and a member of the steering group that worked on The Metric Tide report: “The cost of national research exercises is great and comes in many forms, including monetary, administrative burden, and emotional energy for all involved…The call for more ‘science of science’ is therefore exciting and timely – how might we make funding science more scientific? The question presents an opportunity for sector-wide engagement and collaboration to bring insight and efficiencies to how we fund research, what we should value in different contexts, and what does and doesn’t work.”