title: A network of change: united action on research integrity
authors: Evans, Thomas Rhys; Pownall, Madeleine; Collins, Elizabeth; Henderson, Emma L.; Pickering, Jade S.; O’Mahony, Aoife; Zaneva, Mirela; Jaquiery, Matt; Dumbalska, Tsvetomira
date: 2022-04-14
journal: BMC Res Notes
DOI: 10.1186/s13104-022-06026-y

The last decade has seen renewed concern within the scientific community over the reproducibility and transparency of research findings. This paper outlines some of the various responsibilities of stakeholders in addressing the systemic issues that contribute to this concern. In particular, this paper asserts that a united, joined-up approach is needed, in which all stakeholders, including researchers, universities, funders, publishers, and governments, work together to set standards of research integrity and engender scientific progress and innovation. Using two developments as examples, the adoption of Registered Reports as a discrete initiative and the use of open data as an ongoing norm change, we discuss the importance of collaboration across stakeholders.

Evidence of a number of problematic practices and norms across the research cycle gives us good reason to doubt the credibility of much research [12, 15]. This, coupled with mostly unsuccessful attempts to replicate core research findings in psychology [18] and elsewhere [5], exemplifies the far-reaching issues of research integrity that the scientific community currently faces. Researchers prioritising research transparency, quality, and culture have driven changes in research norms across the world, with open science/scholarship initiatives playing a central role in developing and championing new approaches and standards.
Whilst the scale of change achieved in the last decade is notable, a central barrier to sustainable change in integrity norms is the extent to which all research stakeholders collaborate to embed and progress such developments [19]. Here, we summarise two developments, open data and Registered Reports, which can tackle this wider crisis of science through increased transparency, research quality, and changes to research culture. We discuss how the research community needs to tackle such issues collectively, acknowledging how action from one stakeholder can alter demands and value for other stakeholders, thus requiring coordinated action.

One driver of the current crisis is a lack of transparency: a lack of open sharing of data and materials. As observed during the COVID-19 pandemic, making data openly accessible is transformative for scientific and public understanding, providing accountability within psychological research [1]. Unfortunately, sharing data has historically been uncommon, and when materials and data are not shared, researchers, funders, and journals cannot adequately assess the robustness of published work, slowing scientific progress. Openness is also an important facilitator of reproducibility. Inaccessibility of data, and thus low transparency, makes attempts to build progressively upon previous research an inefficient use of funding and researcher hours. It is harder to replicate and establish the boundaries of effects, and to evaluate the quality of work. It can also hinder error detection and correction, and the identification of fraud (e.g., [22]). Research transparency therefore has multifaceted direct and indirect consequences for the quality and speed of research developments, and should be a priority for stakeholders. Advocating for transparency in research requires a cultural shift and a fundamental realignment of expectations.
Currently, scientific norms encourage researchers to state that data is available "upon reasonable request", but subsequent rates of data sharing by request are unacceptably low ([13]; Wicherts et al., 2016; [6]). A priority for the scientific community should be ensuring that data are safely preserved, conform to the FAIR principles (Findable, Accessible, Interoperable, Reusable; [23]), and are openly available for re-use and re-analysis where possible. Researchers who are willing to share their data face challenges in resourcing, and in knowing how to do so ethically whilst conforming to FAIR principles [23] (see Table 1). To facilitate data sharing, co-ordinated change is needed across stakeholders. For example, changes to journal data availability statement policies can facilitate sharing practices (e.g., [10]), but this increases demands upon training, support, and infrastructure of consequence to researchers, research support (e.g., libraries, technicians), universities, and funders [11]. Table 2 considers the various responsibilities each research stakeholder has towards co-ordinated reform of standards.

Research quality is a vital component of research integrity. We cannot promote better research integrity if we do not first consider how quality (i.e., robustness, reliability, and validity) can be improved. One barrier to research quality actively propagated by many publishers and journals is 'publication bias', whereby null/non-significant results are much less likely to be published than statistically significant findings. This incentivises questionable practices such as p-hacking data to 'find' a significant result, or selectively reporting significant results [2, 8]. Publication bias directly contributes to the crisis because it makes publication contingent upon the results of the work, rather than the theoretical significance and methodological rigour of the research.
Concerned by publication bias, researchers have developed several initiatives to improve research practices and standards in methodology and publishing. Registered Reports (RRs) are one such innovation in publication, deviating from the traditional route in which papers are peer-reviewed only after study completion. At Stage 1, the introduction, hypotheses/research questions, methods, and analyses undergo peer review before data collection. This feedback can identify flaws in the protocol and allows substantive changes to be made before resources (e.g., funding, participant time) are used. The work receives in-principle acceptance from the journal, whereby the subsequent completed (Stage 2) report will be published regardless of the findings, provided the authors have collected and reported the data according to the Stage 1 protocol [3]. RRs reduce publication bias because acceptance is based on the importance of the research question and methodological rigour, rather than the results. This reduces pressure to produce significant results and counters the incentives that drive selective reporting and other questionable research practices [4]. RRs are valuable amid ongoing concerns about widespread 'false-positive findings' in the published literature: hypotheses are supported much less frequently among RRs than among conventional research articles [21], providing initial evidence for the value of the approach (Fig. 1). Further structural support is needed to implement RRs more widely, including training, funding, and wider journal adoption. Tables 1 and 2 outline the interconnected roles and responsibilities of research stakeholders for RRs. Registered Report Funding Partnerships have been proposed as a method of extending the RR model by integrating it with the grant funding process, such that researchers receive both funding and in-principle acceptance for publication based on the integrity of the theory and methods.
Combining funding and publication decisions may streamline processes and reduce the burden on reviewers, while also providing the aforementioned benefits of RRs in reducing questionable research practices and publication bias [14]. Such RR funding partnerships, and similar innovations for drug marketing authorisation [16], offer important and innovative examples of how stakeholders and processes can be unified to improve standards for research quality.

Overcoming the issues underlying the current crisis requires united action across research stakeholders. For example, individuals may wish to conduct RRs, but journals must offer this option and funders must value and incentivise such work. Similarly, journals can mandate open data sharing, but researchers require training, support, and infrastructure to facilitate this. Initiatives designed to improve research integrity should be mapped out with consideration of the different demands on, and value provided to, each of the different stakeholder groups. This allows obstacles to be anticipated and encourages co-ordinated action, increasing the likelihood of such initiatives becoming sustainable. Acknowledging our priorities of transparency, rigour, and culture, open data and RRs represent only two initiatives that require more collective action. While we focused here on open data, transparency could also be prioritised by promoting open sharing of research materials, which relies on the same mechanisms. Similarly, we focused on RRs as one method of alleviating publication bias, but other initiatives, such as open peer review and crowd-sourced open review, also represent promising avenues to improve research integrity. Thus, the priorities and ideas here should be viewed as a starting point for a wider, more comprehensive consideration of how the transparency, quality, and culture of research, and thus its integrity, can be improved together.
References
1. Open science saves lives: lessons from the COVID-19 pandemic.
2. Ethical consistency and experience: an attempt to influence researcher attitudes toward questionable research practices through reading prompts.
3. Registered reports: a new publishing initiative at Cortex.
4. The past, present and future of registered reports.
5. Investigating the replicability of preclinical cancer biology.
6. Developments in open data norms.
7. A network of change: three priorities requiring united action on research integrity. UK Parliament.
8. Negative results are disappearing from most disciplines and countries.
9. Retracted science and the retraction index.
10. Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition.
11. Wagenmakers EJ. Data sharing in psychology: a survey on barriers and preconditions.
12. Why most published research findings are false.
13. The dawn of open access to phylogenetic data.
14. Improving the efficiency of grant and journal peer review: registered reports funding.
15. A manifesto for reproducible science.
16. An open science pathway for drug marketing authorization: registered drug approval.
17. Promoting an open research culture.
18. Estimating the reproducibility of psychological science.
19. Promoting open science: a holistic approach to changing behaviour.
20. The what, why, and how of born-open data.
21. An excess of positive results: comparing the standard psychology literature with registered reports.
22. Just post it: the lesson from two cases of fabricated data detected by statistics alone.
23. The FAIR guiding principles for scientific data management and stewardship.

Publisher's Note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Acknowledgements: Not applicable.

Author contributions: TRE was responsible for conceptualization, project administration, funding, writing (original draft) and writing (review & editing). MVP, EC, ELH, JSP, AO and MZ were responsible for conceptualization, writing (original draft) and writing (review & editing). MJ and TD were responsible for conceptualization and writing (review & editing). All authors read and approved the final manuscript.

Funding: The authors have no funding specific to this work to declare.

Availability of data and materials: Not applicable.

Ethics approval and consent to participate: Not applicable.

Consent for publication: Not applicable.

Competing interests: All authors have contributed to attempts to reform scientific practice. This includes through leadership, membership, roles, or collaboration within a number of groups including the Framework for Open and Reproducible Research Training (FORRT), Journal of Open Psychology Data, UK Reproducibility Network (UKRN), Registered Reports Steering Committee, and Society for the Improvement of Psychological Science (SIPS). A previous version of this work was submitted and published (RRE0007) as written evidence towards the UK Parliament's Science and Technology Committee on Reproducibility and Research Integrity, and was subsequently preprinted ([7]; https://psyarxiv.com/r6gpj).