This section collates the case studies and scenarios presented in the Good institutional practice guide (the Guide).


The Guide is designed as a resource for institutional and research leaders (hereafter referred to as leaders) seeking to promote an open, honest, supportive and respectful institutional research culture that supports the conduct of high-quality research.

The following case studies (based on real-world examples) and hypothetical scenarios demonstrate how some institutions have achieved positive cultural change.

Implementing cultural change

Case studies: Two different approaches to achieving institutional cultural change

At the University of Glasgow, a multi-disciplinary team is focused on creating a positive research culture by promoting collegiality, career development, research recognition, open research and research integrity.1 The team’s objectives, activities and measures of progress are comprehensively described in the university’s Institutional strategic priorities for research culture 2020-2025 plan.2 To date, the team has:

  • established a Research Culture Commons, which people can join to contribute to cultural change and shared goals
  • undertaken research culture surveys to understand where they are making progress and where there is still work to be done
  • established annual awards to recognise and celebrate supervisors, principal investigators and research professional colleagues who contribute to a positive research environment
  • created a Talent Lab with six diverse initiatives focusing on developing leadership in research and researchers as leaders.

The Stanford Program on Research Rigor and Reproducibility (SPORR), run by Stanford Medicine, has a variety of initiatives in place, and others in the pipeline, to support a culture of research rigor and reproducibility (R&R).3 The initiatives are aimed at faculty, staff, graduate and postgraduate students and fellows, and include:

  • core R&R courses such as ‘The practice of reproducible research’ and ‘Foundations of statistics and reproducible research’
  • ReproducibiliTea, an international community of journal clubs that advances open science and improves academic research culture
  • monthly R&R Grand Rounds
  • a consultation and feedback service on data sharing and data management plans
  • free consultations for research teams writing training grants.

Early-career researchers can obtain help with study design, analysis and interpretation from a network of like-minded experts across Stanford Medicine. It is intended that Stanford Medicine researchers and staff will be rewarded for their R&R accomplishments. SPORR also plans to incorporate R&R monitoring and accountability, incentives, and cultural change into the everyday research workflow.

Role modelling and leadership

Scenario: Research Quality Champions

After a team meeting at which a postdoctoral fellow gave a presentation on research quality, followed by a robust discussion, the research team leader decided to meet with the other research leaders in the department to talk about how they could give research quality more focus. The result was the Research Quality Champions, a networking group in which early career researchers could discuss research quality and responsible research practices. The idea for the network was based on the model of Research Integrity Advisors, as required by the Australian Code for the Responsible Conduct of Research,4 and the University of Cambridge’s Data Champion program.5

A pilot for the Research Quality Champions network was actively supported by leaders. The Champions organised training, delivered by internal and external experts, about research quality and responsible research practices. They now hold regular face-to-face meetings and have a virtual community space to provide peer support and to exchange experiences and ideas. Not only does the network allow researchers to seek advice about research quality from researchers outside their own team, but the Champions also help their institution to continually develop and improve its processes related to research quality and research culture. Evaluation of the pilot clearly indicated its success, and the network has been expanded across all departments in the institution. Participation in the network will also soon be recognised by leaders in workload and promotion criteria.

Scenario: Learning to give and receive respectful feedback

A team leader noticed that giving and receiving feedback during team meetings was becoming a little fraught: members were taking feedback as personal criticism, and this was preventing what could have been constructive discussions about different ways of tackling problems. In response, the team leader engaged a facilitator to run a ‘giving and receiving feedback’ workshop with the team. Although some members were initially sceptical and saw it as an imposition on their time, they all participated, and it turned out to be a very worthwhile investment. The workshop gave the team a shared language and purpose around giving and receiving constructive feedback and having respectful conversations. The team felt valued, their communication skills improved, and much less time was spent defusing tension and overcoming misunderstandings. In addition, the team leader noticed that ideas were becoming bolder, which meant that projects were being taken in new and interesting directions.

Institutional resources

Case study: Institutional resources provided to ensure clinical trials are reported

In response to publicity in 2019 indicating that only 17% of clinical trials at European universities had reported their results, Karolinska Institutet (KI) decided to address this issue within its own institution.6,7 By 2022, KI was reported to have uploaded the most results between December 2020 and November 2021 and had received international praise from TranspariMED for its initiative. The following steps were important to the success of the initiative:

  • Having the support of management, who could ensure that resources were allocated for the long term. Responsibility for the registration and reporting of clinical trials/studies was centralised to KI’s existing research support unit, and two additional full-time staff were hired for this unit.
  • Making it easy for researchers to register their clinical trials. Staff developed a template containing the same mandatory fields as the European clinical trials portal. Researchers could complete the template with trial results without having to learn how to navigate the portal, and the support staff could then easily and efficiently upload the results to the portal on their behalf.
  • Developing an internal website with important and detailed information about registration and reporting of clinical trials so that researchers can easily find what needs to be done and how. The website includes a step-by-step guide for various trial registers and frequently asked questions.
  • Providing specific support for researchers. The Chief Data Officer offers individual research support via email, as well as lectures and workshops, covering what is required and how to carry it out.
  • Joining networks of other research administrators working on registration and reporting. The Chief Data Officer found this to be a good way to make valuable contacts who could provide advice and tips.

Case study: Senior role – Academic Lead for Research Improvement

Institutions can create formal roles in their senior management teams (an Academic Lead for Research Improvement or similar) with responsibility for supporting and implementing activities that promote the conduct of high-quality research. This approach is based on a key element of the UK Reproducibility Network (UKRN).

The UKRN was established as a peer-led organisation with the aim of raising research quality, promoting initiatives that may help achieve this, and supporting a positive research culture. Its work includes investigating the factors that contribute to robust research, promoting training activities, disseminating best practice, and working across local networks, institutions and external stakeholders to ensure coordination of efforts across the sector. The key feature of reproducibility networks is their structure, which is flexible enough to allow for national, institutional and disciplinary differences, while also enabling coordination of activity within and between these agents in the research ecosystem.8,9 Key features of the UKRN are:

  • local networks – informal, self-organising groups of researchers and other staff at individual institutions, represented by a Local Network Lead
  • institutions – universities that have formally joined the Network by creating a senior academic role focused on research improvement
  • other sectoral organisations – organisations that have a stake in the quality of research (for example, funders, publishers, learned societies).

Institutions in Australia can consider joining and supporting the Australian Reproducibility Network, which was recently established and is modelled on the UKRN.10

Scenario: Appointment of a biostatistician to the Human Research Ethics Committee

The Chair of an Australian Human Research Ethics Committee (HREC) was struck by the apparently poor knowledge of biostatistics amongst those conducting research involving human participants. Following consultation with a senior manager in the institution, a qualified biostatistician was appointed as a member of the HREC. Initially, the biostatistician found problems with the biostatistics and protocol design in roughly one quarter of the research protocols in applications submitted to the HREC. Errors included simple ones, such as sample size calculations that could not be replicated, incorrect use of commercial statistical software and flawed protocol design. The institution also supported a system of ‘biostatistician interns’ for the HREC – biostatistics students who had the chance to examine real-world protocols as part of their studies.
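
To illustrate what replicating a sample size involves, consider a standard two-group comparison of means. The following is a minimal worked example only; the numbers are illustrative and not drawn from the scenario. With a two-sided significance level of $\alpha = 0.05$, power of $1 - \beta = 0.80$, an assumed standard deviation of $\sigma = 10$ and a minimum difference of interest of $\Delta = 5$, the usual approximation gives

$$ n = \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^2\,\sigma^2}{\Delta^2} = \frac{2\,(1.96 + 0.84)^2 \times 10^2}{5^2} \approx 63 \text{ per group.} $$

A protocol that states its assumed $\alpha$, $\beta$, $\sigma$ and $\Delta$ allows a reviewer to reproduce the calculation; one that omits them cannot be checked.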

Addressing these issues in consultation with the researchers led to improvements in research design and analysis, which are essential for the conduct of high-quality research. It also demonstrates respect for the participants in the research, because well-designed research and appropriate analysis of the results are more likely to lead to useful outcomes.

Education and training

Case study: Improving experimental design through education and training

With the support of senior management, the Baker Heart and Diabetes Institute undertook an exercise to encourage preclinical researchers to improve the quality of their cardiac and metabolic animal studies.11 This involved providing education and training to increase awareness of the concerns that can arise from suboptimal experimental designs, and providing knowledge, tools and templates to overcome bias.

Participants received a one-hour presentation that included questions and discussion on concerns regarding the quality of animal research, the ARRIVE Guidelines,12 types of bias, and practical examples for improving experimental design. They also attended a seminar on improving disease modelling and candidate drug evaluation and were provided with flowcharts and templates to encourage them to track and report exclusions of animals. Two short surveys were conducted over 12 months to monitor and encourage changed practices. The major findings included:

  • a willingness of investigators to make changes when provided with knowledge and tools that were relatively simple to implement, for example, structured methods for randomisation, and de-identifying interventions/drugs
  • resistance to change if this involved more personnel and time
  • evidence that changes to long-term habits require time, follow-up, and incentives/mandatory requirements.

Case study: The Dilemma Game: An app to stimulate critical discussion

As in any profession, researchers are frequently faced with dilemmas: Can I exclude particular observations from my research? Can I use exactly the same data set for multiple papers? The Dilemma Game app was developed by Erasmus University Rotterdam to stimulate awareness of, and an open and critical discussion about, integrity and professionalism in research.13

The game prompts participants to consider, choose and defend (and possibly reconsider) alternative courses of action regarding a realistic dilemma related to professionalism and integrity in research. Each dilemma offers four possible courses of action from which the players can choose. It is important to note that, owing to the complexity of integrity-related dilemmas, there is no winning or losing in this game. Rather, by defending and discussing these choices in the context of a critical dialogue, the game aims to support researchers in further developing their moral compass. The game can be used in a variety of settings and has three modes: Individual, Group and Lecture.

For some years, the Dilemma Game was played as a card game. In 2020 the game was digitised to reach a wider audience and inspire continuous attention to the topic of research integrity. Discussing research integrity is vital because it contributes to an open, safe and inclusive research culture in which responsible research practices are deeply embedded.

Rewards and recognition

Case study: Evaluating for hiring and tenure

When the QUEST (Quality-Ethics-Open Science-Translation) Center for Transforming Biomedical Research at the Berlin Institute of Health in Germany evaluates applications for hiring and tenure, criteria include responsible research practices, with questions covering practices such as publishing of null results, open data and stakeholder engagement.14 QUEST office staff screen applications and participate in hiring committee meetings to support committee members in understanding, evaluating, and applying the criteria.

Case study: Evaluating research staff

University Medical Center Utrecht in the Netherlands undertook a consultative process with staff to develop a new framework for evaluating staff for promotion that moved away from bibliometrics and formally required qualitative indicators and a descriptive portfolio.15 Among other elements, candidates now provide a short essay about who they are and what their plans are as faculty members. Candidates must discuss their achievements in terms of the following domains, with bibliometrics comprising only one domain:

  • managerial responsibilities and academic duties, for example, conducting reviews for journals and contributing to internal and external committees
  • teaching and supervision of students, for example, how much time is devoted to students and any courses they have developed
  • clinical work undertaken, for example, involvement in organising clinical trials and research into new treatments and diagnostics
  • entrepreneurship and community outreach.

Reported outcomes of this change are:

  • group leaders engaging with, debating about and then embracing the new framework
  • early- and mid-career researchers (EMCRs) engaging with the framework and proposing forward-looking ideas to improve scientific outcomes
  • students organising a brainstorming session with high-level faculty members about how to change the medical and life-sciences curriculum to incorporate reward-and-incentive structures
  • the PhD council choosing a ‘supervisor of the year’ on the basis of the quality of supervision, instead of the previous practice of the highest number of PhD students supervised.

Scenario: Rewarding desired behaviours

The leaders of a research group were aware that, although they had invited a speaker to their regular meeting to present on transparent research behaviours and had followed up with an email linking to resources, there had been no change in the uptake of those behaviours. They decided to implement a reward scheme, under which any member of the research group could receive $100 as a dining/movie/retail voucher, or as a contribution to their research account, for:

  • pre-registering their research project
  • preparing a data management plan, including a plan to share the data at the end of the project
  • depositing a preprint of any manuscript
  • making any publications openly accessible
  • sharing data from the project in accordance with the FAIR principles16
  • publishing code from the project.

When communicating about this reward scheme, the research group leaders were careful to stress that it was not intended as a reward based on metrics. Because each of the behaviours eligible under the reward scheme was measurable, the leaders were able to see a quantifiable improvement in the behaviours after 12 months.

Communication

Case study: Openness of animal research

To address misconceptions surrounding research involving the use of animals, the Openness Agreement on Animal Research and Teaching in Australia was launched in 2023.17 NHMRC is a supporter of this agreement. Similar openness agreements have been developed for other countries.18 Internationally, the outcomes from openness agreements include:

  • better public access to information about animals in research, directly from those who do the research
  • a greater understanding and appreciation of the role of animal care staff, both in and outside the sector
  • increased profile of animal facilities within their establishments, leading to greater investment and better animal welfare
  • better access to see inside animal facilities (for those interested in this work)
  • fewer reactive communications on the use of animals in research, due to more information proactively placed in the public domain.18

Scenario: Using a Research Quality Promotion Plan to improve communication

An analysis of the issues reported by staff at a research institution showed that the majority related to miscommunication, such as misunderstandings between scientific collaborators and uncertainty about how best to engage with the public. Institutional and research leaders agreed that these issues could be addressed by putting better procedures in place to optimise communication and collaboration. They supported the development of a Research Quality Promotion Plan (RQPP), based on the SOPs4RI consortium’s Research Integrity Promotion Plan,19 to guide the development of appropriate policies and procedures. The Research Office was given responsibility for designing the RQPP. The plan comprised:

  • a description of the current situation, including the policies and procedures already in place and how effective they are
  • areas in need of improvement
  • a detailed plan for future activities.

The plan for future activities involved:

  • specifying the change-related goals
  • securing employee participation and agreement on a shared outcome of the change
  • describing the institutional set-up for implementing the envisioned change
  • finding the right tools in the SOPs4RI toolbox20 that match the goals
  • specifying actions to be taken by specific people
  • defining a set of indicators or targets to be used for evaluating the effectiveness of the change process.

The outcome was implementation of sound policies and procedures to guide effective and transparent communication and collaboration between staff. The policies and procedures were communicated regularly to all institutional staff via internal staff communications and newsletters and were made available on the institution’s internal and external websites.

Case study: Using the Open Science Framework to keep track of your lab work

The Open Science Framework (OSF) was created by Brian Nosek and his graduate student, Jeff Spies, with the aim of preventing the loss of research material while creating incentives for preservation and transparency.21 It is a free, open-source web application that helps individuals and research teams organise, archive, document and share their research materials and data. Information connected to an OSF project might include study materials, analysis scripts and data, as well as a wiki, attached files, submissions to institutional review boards, notes about research goals, posters, lab presentations and preprints. Because each action is logged and version histories of the wikis and files are kept, the history of the research process is recoverable and materials are not lost. This means that the work is more easily reproduced, either by the project authors or by others.

The OSF allows research groups to make their scripts, code and data available to the public, enabling others to reproduce their analyses and findings or reanalyse the data for their own purposes. To encourage such transparency, the OSF includes incentives such as statistics documenting the number of project views and file downloads for public projects, and a novel citation type called a ‘fork’ that registers when others are using and extending your research outputs. As Nosek says, ‘without openness and reproducibility in the scientific process, we are forced to rely on the credibility of the person making the claim, which is not how it should be. The evidence supporting the claim needs to be available for evaluation by others, hence the need to help create a research culture that is open and transparent.’
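
For research groups that prefer to script their archiving, the OSF exposes an API that community tools wrap. The following is a minimal sketch using the community-maintained osfclient Python library (not an official OSF product); the project ID ‘abc12’, the token placeholder and the file paths are hypothetical, and osfclient’s interface may change between versions:

    # Minimal sketch using the community-maintained osfclient library
    # (pip install osfclient). Assumes a personal access token created in
    # OSF account settings and an existing project with the ID 'abc12'.
    from osfclient import OSF

    osf = OSF(token='YOUR_OSF_TOKEN')        # authenticate with a personal access token
    project = osf.project('abc12')           # hypothetical project ID
    storage = project.storage('osfstorage')  # the default OSF storage provider

    # Upload an analysis script. Because OSF keeps version histories,
    # updating an existing path records a new version rather than
    # silently overwriting the old one.
    with open('analysis/clean_data.py', 'rb') as fp:
        storage.create_file('scripts/clean_data.py', fp, update=True)

    # List everything currently stored in the project.
    for f in storage.files:
        print(f.path)

The same logging and versioning happens automatically when files are added through the OSF web interface, so scripting is a convenience rather than a requirement.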

Monitoring, evaluation and reporting

Case study: How to evaluate education and training about responsible research practices22

There are various ways of evaluating a course on the responsible conduct of research, ranging from recording student attendance to assessing their attitudinal changes.

A decision should be made about whether to evaluate the effectiveness of the course delivery or to assess actual learning and change. The simplest forms of evaluation are paper or online surveys, whose questions often focus on program mechanics, delivery by presenters and completion of required activities. Such surveys do not indicate whether any learning has actually occurred or whether behaviours will change as a result of the education and training. In contrast, qualitative evaluation questions require written responses and take more time and effort from the respondent. However, they can provide useful information on, for example, how the discussions and readings were received.

Since there are significant benefits to be gained from determining whether any learning is taking place, it may be worthwhile collecting standardised data over several years to look for a cumulative effect (summative or outcome evaluation).

When formulating questions to assess what has been learned, it is useful to categorise the types of learning that can take place into the following: knowledge, skills, attitudes, and behaviours, and possibly beliefs. Then carefully specify the intended learning outcomes from each session under each of these categories. It is important that these learning outcomes are designed to be measurable. As it is particularly difficult to measure impact on people’s behaviours, it is useful to formulate questions that ask about their anticipated future behaviours. With carefully designed questions, it should be possible to obtain useful feedback on how participants are receiving and processing the information presented, and this can then be used to continually improve the teaching process.

Case study: Feedback on ways to facilitate data management and sharing

Following the introduction of a Data Management and Sharing Policy by the US National Institutes of Health (NIH) in 2023, the Stanford Program on Research Rigor and Reproducibility (SPORR) at Stanford Medicine, in collaboration with Stanford’s Lane Library, conducted focus groups to understand the data sharing and management practices of Stanford early career researchers and the support they might need to follow the NIH policy.23 The results showed that participants:

  • wrote data management plans only when required by an ethics committee or funding body
  • shared data only when required by funders or journals
  • generally used cloud-based services to store their research data and to share with collaborators or statisticians but were unsure about the security of these services and the best methods for using them
  • emphasised the effort required to prepare and store data properly
  • feared that, without dedicated funding, incentives or mandates, investing time in data management might put them at a disadvantage for career advancement.

The participants in the focus groups suggested that the following web resources would be helpful:

  • one main data page that collates all data policies, services and resources
  • data management plan templates
  • flowcharts for data sharing and management per data type
  • guidance on how to initiate data discussions in their lab.

These results from the focus groups will now be used to develop a survey that will be sent to all members of Stanford School of Medicine.


1 University of Glasgow. Research and Innovation Services: research culture. [Internet]. Accessed 21 Nov 2023 from: https://www.gla.ac.uk/myglasgow/ris/researchculture/

2 University of Glasgow. Institutional strategic priorities for research culture 2020-2025. [Internet]. Accessed 21 Nov 2023 from: https://www.gla.ac.uk/media/Media_705595_smxx.pdf

3 Stanford Medicine: Stanford Program on Research Rigor & Reproducibility. [Internet]. Accessed 21 Nov 2023 from: https://med.stanford.edu/sporr/about/missionstrategyvalue.html

4 Australian Code for the Responsible Conduct of Research, 2018 Guide to Managing and Investigating Potential Breaches of the Australian Code for the Responsible Conduct of Research, 2018, and NHMRC policy on misconduct related to NHMRC funding, 2016. [Internet] Accessed 21 Nov 2023 from: https://www.nhmrc.gov.au/research-policy/research-integrity

5 University of Cambridge. Data Champions. [Internet] Accessed 21 Nov 2023 from: https://www.data.cam.ac.uk/intro-data-champions

6 Bruckner T. Tips and tricks for improving clinical trial reporting from Karolinska university. 2023 TranspariMED. [Internet] Accessed 21 Nov 2023 from: https://www.transparimed.org/single-post/karolinska-university

7 Possmark S, Martinsson Bjorkdahl C. Responsible reporting of clinical trials: Reporting at Karolinska Institutet is well underway. Lakartidningen [Internet] 2023;120: 23007. Accessed 21 Nov 2023 from: https://lakartidningen.se/klinik-och-vetenskap-1/artiklar-1/temaartikel/2023/05/ansvarsfull-rapportering-av-kliniska-lakemedelsprovningar/

8 UK Reproducibility Network. Terms of reference. [Internet] 2023 Apr. Accessed 5 Oct 2023 from: https://www.ukrn.org/terms-of-reference/

9 UK Reproducibility Network Steering Committee. From grassroots to global: A blueprint for building a reproducibility network. PLOS Biology [Internet] 2021;19(11): e3001461. doi: 10.1371/journal.pbio.3001461

10 Australian Reproducibility Network. [Internet] Accessed 21 Nov 2023 from: https://www.aus-rn.org/

11 Weeks KL, Henstridge DC, Salim A, Shaw JE, Marwick TH, McMullen JR. CORP: Practical tools for improving experimental design and reporting of laboratory studies of cardiovascular physiology and metabolism. American Journal of Physiology-Heart and Circulatory Physiology [Internet] 2019;317:3, H627-H639. doi:10.1152/ajpheart.00327.2019

12 ARRIVE guidelines. [Internet] Accessed 19 Dec 2024 from: https://arriveguidelines.org/

13 Erasmus University Rotterdam. Dilemma game. Professionalism and Integrity in Research. [Internet]. Accessed 21 Nov 2023 from: https://www.eur.nl/en/about-eur/policy-and-regulations/integrity/research-integrity/dilemma-game

14 Strech D, Weissgerber T, Dirnagl U, on behalf of QUEST Group. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLOS Biology [Internet]. 2020 Feb 11;18(2): e3000576. doi:10.1371/journal.pbio.3000576

15 Benedictus R, Miedema F, Ferguson M. Fewer numbers, better science. Nature [Internet] 2016; 538, 453–455. doi:10.1038/538453a

16 FAIR Principles. GO FAIR [Internet]. Accessed 22 Nov 2023 from: https://www.go-fair.org/fair-principles/

17 ANZCCART. Openness Agreement on Animal Research and Teaching in Australia. [Internet]. Accessed 22 Nov 2023 from: https://anzccart.adelaide.edu.au/openness-agreement

18 Lear AJ, Hobson H. Concordat on Openness on Animal Research in the UK. Annual Report 2022 [Internet]. Accessed 22 Nov 2023 from: https://concordatopenness.org.uk/resources

19 SOPs4RI consortium. How to create and implement a research integrity promotion plan (RIPP). A guideline (ver. 2.0) [Internet]. Accessed 22 Nov 2023 from: https://sops4ri.eu/wp-content/uploads/Implementation-Guideline_FINAL.pdf 

20 SOPs4RI consortium. Toolbox for research integrity. V5.0 (CC BY 4.0) [Internet]. Accessed 22 Nov 2023 from: https://sops4ri.eu/toolbox/

21 Nosek BA. Improving my lab, my science with the open science framework. Association for Psychological Science [Internet]. 2014 Feb 28. Accessed 22 Nov 2023 from: https://www.psychologicalscience.org/observer/improving-my-lab-my-science-with-the-open-science-framework

22 McGee R. Evaluation in RCR training- are you achieving what you hope for? Journal of Microbiology & Biology Education [Internet]. 2014 Dec 15;15(2). Accessed 22 Nov 2023 from: https://journals.asm.org/doi/10.1128/jmbe.v15i2.853

23 Stanford School of Medicine. Stanford Program on Research Rigor and Reproducibility. Focus group and Stanford-wide survey on data sharing and management practices. [Internet] Accessed 20 Nov 2024 from: https://med.stanford.edu/sporr/monitoring.html?tab=proxy#focus-groups-and-stanford-wide-survey-on-data-sharing-and-management-practices