Open Research Indicators
UKRN Open Research Programme Lead
Neil Jacobs, University of Bristol
Background and scope
The Programme aim is to accelerate the uptake of high-quality open research practices. The Center for Open Science (COS) theory of change highlights five types of action that can promote change. This is a supporting project, focused on enabling institutions and UKRN to monitor changes in open research practice over time and between groups, so that support and other interventions can be better targeted at all levels of the COS theory of change.
Aim
The overall aim is to establish good practice in institutional monitoring of open research, for example in the design and use of dashboards and reporting tools.
Progress so far
So far:
- sector priorities have been identified;
- potential solution providers have commented on their interest in, and ability to respond to, those priorities;
- publishers and funders have shared their priorities on monitoring open research; and
- institutions have reviewed this evidence and selected the pilot projects they would like to pursue.
Current activities
The remaining project objectives are to convert sector priorities into candidate indicators, and to develop, test and evaluate prototype/pilot solutions within institutions. We are interested, therefore, in exploring the feasibility of providing valid, reliable and ethical indicators for the following features of open research: open/FAIR data, the use of data availability statements, pre-registration and the use of the CRediT taxonomy.
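As an illustration of what one candidate indicator could look like in practice, the Python sketch below checks a small sample of article texts for phrases that commonly introduce a data availability statement and reports the share that contain one. The phrases and sample texts are assumptions for illustration only; a real pilot would rest on the agreed definition and data sources described under "Final report" below.

```python
import re

# Hypothetical phrases that often introduce a data availability statement (DAS).
# A production indicator would need an agreed definition of what counts as a DAS.
DAS_PATTERNS = [
    r"data availability statement",
    r"availability of data and materials",
    r"data (?:are|is) available (?:from|at|upon request|on request)",
]
DAS_REGEX = re.compile("|".join(DAS_PATTERNS), re.IGNORECASE)


def has_data_availability_statement(full_text: str) -> bool:
    """Return True if the text contains a recognisable data availability statement."""
    return bool(DAS_REGEX.search(full_text))


def das_share(texts: list[str]) -> float:
    """Proportion of a sample of article texts containing a data availability statement."""
    if not texts:
        return 0.0
    return sum(has_data_availability_statement(t) for t in texts) / len(texts)


if __name__ == "__main__":
    sample = [
        "Methods ... Data Availability Statement: the data are available upon request.",
        "Introduction ... Conclusion (no statement present).",
    ]
    print(f"Share of sample with a statement: {das_share(sample):.0%}")
```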
Method
Each pilot will include the following steps:
- establish a clear rationale for monitoring any particular aspect of open research, using the INORMS SCOPE framework [led by institutions]
- develop and test prototype/pilot solutions within institutions [collaboration between institutions and providers]
- evaluate the pilots using defined and agreed evaluation criteria [led by UKRN and institutions, with input from providers]
Participants
UK universities
The following UK universities are participating in and leading various pilots:
- Bristol
- Edinburgh
- Exeter
- Glasgow
- Hull
- King's College London
- Leeds
- Leicester
- Liverpool
- Manchester
- Newcastle
- Reading
- Sheffield
- Surrey
- Sussex
Providers
The following providers were prioritised by the institutions and are participating in various pilots:
Participation principles
The work involves collaboration between multiple institutions and multiple solution providers, some of whom may compete in markets or in other ways. However, the work is expected to be pre-competitive, and so all partners (institutions and solution providers) will be expected to act according to the following principles:
- Sector-led: The work will be driven by the priorities articulated by the UK research sector, based on the UKRN call for priorities. This means that potential indicators and related solutions will be preferred that best meet sector priorities, before any consideration of which systems, solutions, or providers might be relevant to them.
- Interoperability and solution neutrality: All partners are free to use their own or third-party products and services. All partners will make all reasonable efforts to ensure optimal interoperability for their data and services.
- Transparency, inclusion, and collaboration: The work aims to make science and research more transparent, efficient, inclusive, openly and freely accessible, and collaborative. The partners agree that the broadest possible audiences should have the opportunity to participate, to make use of and to contribute to the scientific process.
- Access to research data and metadata: Preference will be given to solutions that both exploit and contribute to FAIR and open sector-wide / system-wide data about research. At the same time, preference will also be given to solutions that both exploit and contribute to institutional data about research, so that solutions are, as far as possible, context-sensitive and unsuitable for rankings.
- Data portability: All partners agree that any data they choose to share as part of this work can be freely transferred by others to their own or to a third-party host environment, in line with the terms of any sharing agreement.
Outputs
- Webinar March 2023
- Webinar March 2024: YouTube | Slides
- Information on further opportunities to engage with the pilots as their work progresses will be advertised here and on the UKRN News page shortly.
Final report
The final report template is evolving but the following sets of draft headings give an overview of the intended outcomes.
Indicator report
- A reasonably precise definition of the thing being counted that provides criteria for deciding what counts and what doesn’t in almost all likely cases;
- Guidance on the quantitative or qualitative evidence provided by the indicator, including:
  - the data needed to produce the indicator, in terms of its quality (relevance, scope, extent, granularity) and its features (validity, reliability, transparency, etc.), but not a specification of a particular dataset, although examples might be given;
  - the processing steps needed to turn data from potentially several sources into an indicator, in terms of its functions (inputs/outputs) and its features (transparency, reproducibility, etc.); this does not specify particular tools, although examples might be given (a minimal sketch follows this list);
- Guidance on the narrative within which the indicator should be placed, including relevant aspects of:
  - how to present the institutional context, e.g. how to draw from the INORMS SCOPE review;
  - the features of the evidence described above, e.g. scope of application, limitations, sensitivity, susceptibility to gaming.
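To make the processing-steps heading concrete, here is a minimal sketch in Python of one possible pipeline: it joins records from two assumed sources (a publication list and a set of publication-dataset links) by DOI and computes the share of publications with at least one linked open dataset. The record shapes and field names are illustrative assumptions rather than an agreed specification; the point is that each step is explicit enough to be re-run and scrutinised.

```python
from dataclasses import dataclass


@dataclass
class PublicationRecord:
    """Minimal publication record; real inputs might come from a CRIS or repository."""
    doi: str


@dataclass
class DatasetLink:
    """Hypothetical link between a publication DOI and an openly shared dataset."""
    publication_doi: str
    dataset_url: str


def open_data_share(publications: list[PublicationRecord],
                    dataset_links: list[DatasetLink]) -> float:
    """Proportion of publications with at least one linked open dataset.

    The steps are deliberately explicit so the calculation is transparent and
    reproducible: normalise DOIs, join the two sources, compute the proportion.
    """
    linked = {link.publication_doi.strip().lower() for link in dataset_links}
    if not publications:
        return 0.0
    matched = sum(pub.doi.strip().lower() in linked for pub in publications)
    return matched / len(publications)


if __name__ == "__main__":
    pubs = [PublicationRecord("10.1000/abc"), PublicationRecord("10.1000/xyz")]
    links = [DatasetLink("10.1000/ABC", "https://example.org/dataset/1")]
    print(f"Open data share: {open_data_share(pubs, links):.0%}")
```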
Evaluation report
- validity of monitoring / indicator [does it measure what it claims to?]
- reliability and generalisability of monitoring / indicator [is the measure repeatable – at different times, in different contexts?]
- openness / transparency of monitoring / indicator [does the measure allow for ongoing scrutiny and reproducibility?]
- feasibility of monitoring / indicator for institution [resource, ethical and other implications for the institution]
- feasibility of monitoring / indicator for provider [resource, ethical and other implications for the provider]
- interest of the institution in supporting the indicator [priority, likelihood of investing and sustaining resources in this]
- interest of the provider in supporting the indicator [priority, likelihood of investing and sustaining resources in this]