title: Strategy management in collaborative clinical research partnerships
authors: Kagan, Jonathan; Lassa, Jerome; Zuckerman, Judith; Cull, Ellen; Boan, David; Lysander, Julia; Njoh, Wissedi; Johnson, Kumblytee; Sardana, Ratna; Stern, Kaytee; Grace, Beth; McNay, Laura; Tegli, Jemee
date: 2021-08-18
journal: Contemp Clin Trials Commun
DOI: 10.1016/j.conctc.2021.100833

PURPOSE: Today's clinical trial partnerships frequently join multi-disciplinary investigators and stakeholders, from different countries and cultures, to conduct research with a broad array of goals. This diversity, while a strength, can also foster divergent views about priorities and what constitutes success, thereby posing challenges for management, operations, and evaluation. As a sponsor and partner in such collaborations, we seek to support the development and implementation of sound research strategies that optimize efficiency, sustainability, and public health impact. This report describes our efforts using an adaptation of the well-established Kaplan-Norton strategy management paradigm in our clinical trials setting. We share findings from our first test of the utility and acceptance of this approach for evaluating and managing research strategies in a collaborative clinical research partnership.

RESULTS: Findings from pilot studies and our first implementation in an ongoing clinical research partnership in Liberia provide initial support for our hypothesis that an adapted version of the Kaplan-Norton strategy management model can be useful in this setting. With leadership from within the partnership, analysis artifacts were gathered and assessments made using standardized tools. Practical feasibility, resonance of the findings with partners, and convergence with other empirical assessments lend initial support to the view that this approach holds promise for obtaining meaningful, usable results for assessing and improving clinical research management.

CONCLUSIONS AND IMPLICATIONS: Engaged leadership, thoughtful timing to align with partnership planning cycles, support for the process, and an eye toward the collaboration's long-term goals appear important for developing model understanding and practice. Skepticism about evaluations, and unease at exposing weaknesses, may hinder the effort. Acceptance of the findings and associated opportunities for improvement by group leadership supports a growing sense of validity. Next steps aim to test the approach in other partnerships, streamline the methodology for greater ease of use, and seek possible correlations of strategy management assessments with performance evaluation.

There is hardly a better example than the COVID-19 pandemic to spotlight the need for efficient and effective clinical research partnerships to address global health challenges. While heartened by the collaborative spirit driving the effort so far, we cannot let our enthusiasm lull us into thinking that nobility of purpose or an abundance of good will is sufficient. Careful monitoring of clinical research strategy, and adjustment in response to changes (e.g., demographics, pathogen evolution, research acceptance, political and cultural environments), are vital to guiding these programs toward successful outcomes.
We hope that our work can raise awareness of the importance, relevance, and feasibility of sound strategy management in clinical research partnerships, especially at a time when so much is at stake.

As of 2010 there were nearly 300 clinical research networks in the United States and Canada, nearly half of which carry out clinical trials [1]. Many involve partnerships between governments and organizations in different nations. These programs frequently take a multi-disciplinary approach and have broader goals than traditional individual investigator-initiated research. In addition to their scientific outputs, they are often expected to foster team science, develop new investigators, build scientific infrastructure, and apply state-of-the-art prevention, diagnostic, and treatment techniques. It is expected that efficiencies and synergism in these consortia will positively impact population health and behavior and inform health policy by generating and completing innovative, relevant, and timely research [2]. These broad goals often foster diverse perceptions of what constitutes success and can pose substantial management, operations, and evaluation challenges for leadership.

Infectious disease clinical trials networks, like those conducting trials for HIV, Ebola, and COVID-19, exemplify the complexities of strategizing, managing, and implementing timely, quality clinical research for optimal global health impact. We hypothesized that a structured evaluation of their research strategies could strengthen these partnerships by: 1) critically appraising their scientific goals and objectives; 2) determining how strongly their trial portfolios align with their research goals; 3) gauging the suitability of the partnership's organization, staffing, and budget allocations to carry out its priority research; 4) evaluating the efficiency and effectiveness of the group's core processes; and 5) assessing the partnership's skill at monitoring progress on its goals and responding to change. Our manuscript describes a model for guiding this kind of evaluation and reports on its first implementation, including lessons learned, risks, and limitations.

In 2005, to help meet its goals and objectives, NIAID's Division of Clinical Research (DCR) implemented the Kaplan-Norton (K-N) strategy management paradigm [3]. Strategy management is an overarching process incorporating strategic planning, monitoring, analysis, and assessment of all the elements (including organizational structure, core business processes, and budgets) that affect an organization's success. The K-N paradigm grew out of strategic planning models developed in the 1980s and remains one of the most widely used strategy management frameworks. Its core elements, spanning strategy formulation and translation into strategic objectives (Stages 1 & 2); alignment of organizational structure and resources (fiscal, human), operations planning, and process improvement (Stages 3 & 4); and monitoring and adapting (Stages 5 & 6), give leaders a way to keep strategy in continuous focus while attending to environmental changes and allowing evolution based on learning by doing. A schematic depiction of the K-N framework is shown in Fig. 1. The model has been widely employed around the world, in many environments, both private and public. Breakdowns in its fundamental elements have been associated with organizational underperformance, whereas adherence has been shown to help achieve improved outcomes [5,6].
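For readers who find a concrete representation helpful, the minimal sketch below (Python, purely illustrative) encodes the six K-N stages as they are referred to throughout this report. The stage identifiers and groupings paraphrase the framework summary above; they are not part of the K-N framework itself or of our assessment tools.

```python
# Illustrative only: the six Kaplan-Norton (K-N) stages as summarized above,
# encoded so later sketches can refer to them by a stable identifier.
from enum import IntEnum


class KNStage(IntEnum):
    DEVELOP_STRATEGY = 1      # formulate mission, vision, values, and strategy
    TRANSLATE_STRATEGY = 2    # translate strategy into objectives and measures
    ALIGN_ORGANIZATION = 3    # align structure, resources, and partners ("Align the Partnership")
    PLAN_OPERATIONS = 4       # operations planning and process improvement
    MONITOR_AND_LEARN = 5     # monitor performance; learn from operational and strategy reviews
    TEST_AND_ADAPT = 6        # test and adapt the strategy as the environment changes


# The groupings below mirror the narrative above: Stages 1-2 cover formulation
# and translation, 3-4 alignment and operations, and 5-6 monitoring and adapting.
FORMULATION = (KNStage.DEVELOP_STRATEGY, KNStage.TRANSLATE_STRATEGY)
ALIGNMENT_AND_OPERATIONS = (KNStage.ALIGN_ORGANIZATION, KNStage.PLAN_OPERATIONS)
MONITORING_AND_ADAPTING = (KNStage.MONITOR_AND_LEARN, KNStage.TEST_AND_ADAPT)
```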
After five years of building experience with the K-N model 'in house', DCR sought to extend the approach, bridging to our clinical research partnerships and, in so doing, integrating both environments under a unified strategic plan. We reasoned that, given its broad usage, the K-N model could be contextualized for use in our collaborative clinical research environment and evaluated for its utility and acceptance among our partners and stakeholders. In this report we show how we tailored the K-N strategy management paradigm for use in our clinical trials partnerships, and report our first findings testing the fidelity, applicability, and validity of the model in one of our collaborative clinical research projects, the Partnership for Research on Vaccines and Infectious Diseases in Liberia (PREVAIL).

Because the K-N model was developed for business, several of its elements and terms were not relevant to our research setting and needed to be contextualized to fit an enterprise in which knowledge generation, rather than profit, is the 'bottom line'. Careful examination of K-N's fundamental elements and terminology allowed us to recast standard K-N terms in language more familiar and applicable to researchers. For example, "sales forecasting" was rephrased in terms of projecting future research resources (e.g., funding). Similarly, "profitability" was re-expressed in terms of research results and progress. For our strategy management effort, we used these and other adapted clinical research terms, as shown in Table 1. The NIAID DCR clinical research partnerships in which we applied strategy management are shown in Table 2, which documents our 10+ years of cumulative experience using this approach in a variety of research settings.

While several methods exist for evaluating business strategy [7,8] and processes [9,10], the K-N paradigm itself provides no assessment tools for measuring the full cycle of strategy management. This led us to build our own assessment scheme shaped around three elements: 1) Fidelity: the scheme should fully address each stage of K-N strategy management; 2) Applicability: the scheme should integrate the core elements of clinical research; and 3) Validity: the scheme should be acceptable to clinical researchers and able to generate findings that are credible and have value across a range of partnership stakeholders. To develop our K-N assessment scheme we convened a team of experts in clinical research management, strategy management, evaluation, and program administration. Recognizing the need for integrity and reproducibility, we required the scheme to be evidence-based and quantifiable. For each stage of K-N strategy management, data would be gathered from multiple sources, including documents (e.g., mission statements, strategic analyses and reports, operational plans, metric analyses, meeting notes, and leadership communications), evaluations (e.g., internal and external reviews), and interviews with partnership leaders, managers, and operations personnel (individuals and focus groups). Data would be categorized by K-N stage relevance and jointly analyzed by DCR strategy management staff and partnership personnel.
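As one way to picture the evidence bookkeeping just described, the hypothetical sketch below tags each gathered artifact (document, evaluation, or interview) with the K-N stage(s) it informs and groups the collection by stage for joint review. The class, field names, and example records are invented for illustration; the actual analysis was a collaborative, qualitative review by the assessment team, not an automated pipeline.

```python
# Hypothetical evidence bookkeeping: each artifact gathered for the assessment
# is tagged with the K-N stage(s) it informs, so the team can review the
# evidence stage by stage.
from collections import defaultdict
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class EvidenceItem:
    source: str                                       # e.g. "strategic plan", "focus group notes"
    kind: str                                         # "document", "evaluation", or "interview"
    stages: List[int] = field(default_factory=list)   # relevant K-N stages (1-6)
    summary: str = ""                                 # reviewer's short qualitative note


def group_by_stage(items: List[EvidenceItem]) -> Dict[int, List[EvidenceItem]]:
    """Bucket evidence items by each K-N stage they inform."""
    buckets: Dict[int, List[EvidenceItem]] = defaultdict(list)
    for item in items:
        for stage in item.stages:
            buckets[stage].append(item)
    return dict(buckets)


# Entirely invented example records:
evidence = [
    EvidenceItem("strategic plan", "document", stages=[1, 2],
                 summary="Mission, vision, values, and strategic objectives stated."),
    EvidenceItem("operations review minutes", "document", stages=[5],
                 summary="Reviews held frequently early on; frequency tapered over time."),
    EvidenceItem("site manager interview", "interview", stages=[3, 4],
                 summary="Cascade of measures and targets to working groups unclear."),
]
by_stage = group_by_stage(evidence)
```

In practice the same tagging could be kept in a simple spreadsheet; the point is only that organizing evidence by stage makes the joint, stage-by-stage review described above easier to conduct and to reproduce.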
Next, we needed a way to translate qualitative assessments of K-N data into a score reflective of a partnership's strategy management proficiency. We also strove to maintain a utilization focus, conducting these evaluations in ways that would enhance the use of the findings (and of the process itself) to improve the organization [11]. With these goals in mind, we crafted a developmental evaluation construct incorporating elements from two sources: 1) the Baldrige Performance Excellence Program [12], a framework that helps leaders identify strengths and improvement opportunities, and 2) the Capability Maturity Model Integration (CMMI) [13], a process-level improvement training and appraisal program used to help organizations meet their goals. The Baldrige framework includes methods to examine how organizations develop and implement strategy; design and improve work systems and processes; and measure, analyze, and improve organizational performance, all of which align with elements of the K-N strategy management model. The CMMI maturity model recognizes developmental stages of improvement and supports comparison of an organization's processes to best practices. We combined elements from both, forming a way to assess the degree to which K-N strategy management requirements were fulfilled along a continuum of three developmental tiers: basic, intermediate, and advanced (analogous to the Baldrige "basic, overall and multiple" levels). Additional quantitation (ranging from 0 to 100%) within each tier allowed for further discernment in assessing maturity. We believe this provided a reasonably intuitive way to assess maturity and identify opportunities for growth.

Since many individuals and groups would be involved in evidence review and scoring, it was vital to assure the validity and reproducibility of the analyses. For this, we developed scoring guides specific to each K-N stage, designed to: 1) exemplify stage-specific best practices, and 2) clearly define the requirements for each developmental tier (basic, intermediate, advanced). For each K-N stage, the guide included the emphasis and principal requirements of the stage and the main stage-specific activities (using the adapted terms from Table 1), supporting assessment of partnership practices against the model. Excerpts from the partnership scoring guide show how it facilitated assessment and scoring. Using a single K-N stage (Stage 3, "Aligning the Partnership") as an example, Table 3A shows how the guide provided a definition of the stage's emphasis and the factors (1-4) to be considered in assessing a partnership's attainment of the stage. Again using K-N Stage 3 as an example, Table 3B shows how the developmental scoring model supported the rating of partnership strategy management, linking increasing accomplishment of stage-specific requirements to higher levels of K-N fulfilment.
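As a worked example of how a tier rating and a within-tier percentage can be expressed as a single comparable number, the sketch below maps a (tier, percent) pair onto a 0-300 scale, allotting 100 points per tier. The 0-300 scale and the arithmetic are choices made for this illustration only; the scoring guides themselves rate each stage qualitatively against stage-specific requirements, as Tables 3A and 3B describe.

```python
# Illustrative only: fold a developmental tier (basic, intermediate, advanced)
# and a within-tier percentage (0-100%) into one number on a 0-300 scale,
# with 100 points allotted to each tier.
TIERS = ("basic", "intermediate", "advanced")


def maturity_score(tier: str, within_tier_pct: float) -> float:
    """Map a (tier, within-tier %) rating to a 0-300 score."""
    if tier not in TIERS:
        raise ValueError(f"unknown tier: {tier!r}")
    if not 0.0 <= within_tier_pct <= 100.0:
        raise ValueError("within-tier percentage must be between 0 and 100")
    return TIERS.index(tier) * 100.0 + within_tier_pct


# Hypothetical readings, e.g. "high end of intermediate" vs. "lower end of advanced":
print(maturity_score("intermediate", 85.0))  # 185.0
print(maturity_score("advanced", 15.0))      # 215.0
```

On such a scale, a partnership at the high end of the intermediate tier scores below one at the lower end of the advanced tier, which matches the intent of the developmental model: attainment of a higher tier reflects fulfilment of a more demanding set of requirements.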
Using a descriptive case study method, we piloted our strategy management assessment scheme in one of the DCR research partnerships. The pilot was designed to address three questions: 1) Was the evidence-gathering plan feasible for use in our partnership setting, and could it yield data of sufficient relevance and quality to inform strategy management assessment in full alignment with the K-N framework? 2) Was our approach capable of enabling a wide-ranging strategy management assessment that encompassed the vital elements of clinical research management in our environment? 3) Was our scheme acceptable for use in the partnerships and able to produce results that held convergent validity for all involved (i.e., the clinical researchers, managers, funders, and the strategy management staff)?

The pilot was conducted in three stages. First, leadership and staff from PREVAIL were briefed on the K-N model and our proposed assessment scheme, and the collection of document-based evidence began. The second stage comprised semi-structured interviews with leaders, managers, and operational staff to fill gaps in knowledge and understanding not addressed in stage 1. The third stage gauged the assessment scheme's acceptability and its ability to yield valid findings, and engaged partnership personnel in collaborative interpretation of those findings. The ensuing discussions enhanced understanding of the assessments and the evidence underpinning them. Specific opportunities for improvement were collaboratively identified that could help the partnership move to higher levels of strategy management proficiency. The findings from our pilot lent plausibility to the fidelity of our assessment scheme to the K-N paradigm, its applicability to clinical research, and its perceived validity with partnership stakeholders, and encouraged us to move forward with our first 'real' implementation.

In 2017 we began an implementation of our assessment scheme in a case study of the Partnership for Research on Vaccines and Infectious Diseases in Liberia (PREVAIL). This partnership between the Liberian Ministry of Health and the U.S. Department of Health and Human Services began in late 2014, with the goal of developing treatment and prevention strategies for Ebola virus disease (EVD) [14]. Our aims in this first application of our approach were to: 1) engage, sustain, and build capacity for this type of work within the partnership; 2) produce analyses that could add meaningful value; and 3) gain further experience that could improve and possibly streamline future assessments. The main questions of the assessment were about how the partnership:

1. Develops its strategy
2. Translates strategy into objectives, initiatives, and measures
3. Plans its operations to support its initiatives
4. Organizes itself and allocates resources in alignment with its strategy
5. Monitors performance on operations and strategy and learns from results
6. Learns how and when to adapt its strategy

We began by forming a collaborative team comprising four of PREVAIL's senior Liberian scientific, operational, and management leads (including site managers and physicians) and members of the DCR strategy management staff who had developed the assessment scheme. Following briefings on the K-N model and the assessment scheme, the team carried out its strategy management assessment, beginning with initial impressions shaped by documents and artifacts and supplemented by team members' hands-on knowledge. These early evaluations were further refined through additional information gathering and collaborative interpretation engagements with PREVAIL staff and stakeholders across the several domains of the partnership, including clinical, site and data management (e.g., laboratory, pharmacy), regulatory and ethics, social mobilization, and communication. The team prepared an interim report for PREVAIL leadership and then finalized its strategy management assessments, including a derived set of opportunities for improvement (OFIs), for presentation to leadership. The process, from start to finish, took 18 months. A summary of the PREVAIL K-N stage-level assessments is shown in Table 4, followed by a narrative rationale for the scoring of each stage.
PREVAIL satisfied the criteria for strategy development at the high end of the intermediate level. Senior leaders were engaged and communicative and developed a strategy that included mission, values, and vision statements. The strategy delineated strategic objectives and described strategic shifts, the "from/to" changes that would be addressed. Understandably, the initial planning for PREVAIL was on a very fast track, driven by the need to initiate research at the height of the Liberian EVD outbreak. Nonetheless, as the epidemic abated and the program grew, deeper assessments of both internal capacities (e.g., staffing, skills, facilities, technology) and the external environment (e.g., other collaborating or competing research and public health efforts in the region) could have helped to evolve the strategy in ways that might better have supported the effort going forward.

PREVAIL planned the operational implementation of its strategy at the intermediate level. The plan specified strategic objectives to carry out the high-level goals but had few measures and targets to guide the operational working groups and support units, or to guide resource allocation. A strategy map was created which illustrated how strategic objectives related to the perspectives of mission, customer/resource, process, and learning and growth. However, the map lacked sufficient specificity to function as a management tool and did not appear to have been used in that way.

Alignment helps assure that strategy can be cascaded throughout the organization. PREVAIL's organizational alignment was assessed at the lower end of the intermediate level, with some stage-specific factors scored at the basic level. As PREVAIL grew from its earliest days, much like other 'startups', it needed to expand to carry out its mission. What emerged was a complex structure with large numbers of working groups, for which purpose, priorities, responsibility, and authority were sometimes unclear and measures and targets not well defined. A coordinating center and some research sites had written plans describing alignment with parts of the partnership strategy, but these were not consistently in place for research and support units across the organization. An operations plan, though available, was not integrated into use by leaders and managers, and this correlated with observed breakdowns in cascading strategic goals through the organizational structure. Of nearly 50 intermediate and end outcomes in the operational plan, few had specific goals, objectives, and measures. Communications from leadership helped employees understand the strategy and sustained motivation to execute it. Staff training was also seen to align with operational challenges, with an emphasis on strengthening key processes. However, there was little evidence of linkage between employee performance plans and the partnership strategy.

[Table 3B. Baldrige/CMMI developmental scoring model for K-N Stage 3: Align the Partnership. An example of the developmental scoring model used to guide the rating of partnership strategy management, linking increasing accomplishment of stage-specific requirements to higher levels of K-N fulfilment. Excerpt: "Most scientific and support units (sites, labs, HR, Finance, etc.) have well-developed plans that directly cascade from the partnership strategic plan. Employee performance and training plans are fully aligned to support achievement of partnership goals; knowledge and best practices are shared across operating units."]
PREVAIL's operational planning, linking business process improvement to strategic priorities and aligning resource allocations to strategy needs, was assessed at the basic level. Business processes (e.g., research, operational) were in early stages of definition, as was their linkage to strategic priorities. There were numerous examples of process improvement efforts focused on capacity building (e.g., biospecimen protocol implementation, biospecimen inventory and management, data management), but few were linked to measures that could demonstrate impact on strategy. Financial planning in PREVAIL was complex, involving many organizations, agencies, and funding streams. Forecasting, resource capacity planning, costing, and budgeting were largely determined at higher levels of governance and were not always accessible to operational working groups, research sites, and support units. Better understanding of resource allocations at the operational level could help to improve the alignment of those groups' activities with the strategy.

PREVAIL's monitoring of, and learning from, operations and strategy reviews was assessed at the basic level. Operational review meetings and monitoring took place frequently during the earliest days of the partnership, when the EVD emergency response was the main focus. Over time, however, this was not as well sustained. The operational plan was not consistently updated. This limited the ability to maintain awareness and understanding of trouble spots and to detect priorities and resource allocations that may have strayed from the partnership's strategic focus. Strategy review meetings were established and occurred periodically, though these more often addressed operational issues rather than strategy assessments.

PREVAIL's strongest strategy management performance was in the way it adapted its strategy over time in response to a changing research environment, justifying its rating at the lower end of the advanced level. The gradual abatement of the EVD epidemic in Liberia, coupled with advances in EVD prevention, diagnosis, and treatment, represented major changes from the environment in which PREVAIL had begun. Leaders guided PREVAIL through cycles of strategy testing and adaptation, gathering input from experts within and outside of PREVAIL. In this way, they assured continuous realignment of strategy with changing goals and resources and steered the partnership toward an evolving long-term vision as a sustainable clinical infectious disease research program, with broader goals to improve the health of the Liberian citizenry and contribute to global health research. The detailed scoring for each K-N stage, including the overall score and the 'within-stage' factor-specific scores for PREVAIL, is shown in Supplementary Table 1.

In seeking to optimize its clinical research partnerships, NIAID's Division of Clinical Research selected the Kaplan-Norton (K-N) strategy management paradigm as a scaffold to plan and advance its strategic goals, with the aim of ultimately achieving greater success in fulfilling the mission, vision, and values of these programs. We theorized that, though born of the business world, the model could be adapted for use in our government/academic clinical research environment.
Our research gives preliminary support to our proposition that a viable adaptation of the K-N model can be deployed and evaluated for feasibility and utility in our clinical research partnerships, as demonstrated by its ability to generate cogent results that resonated with, and had practical value for, our research partners and stakeholders.

Since the 1960s, when strategic planning began to become widespread, its impact has been under study. Analyses of its value and effects on organizational performance have not always been consistent [15-17]. False assumptions, lack of commitment to the plans, failures of leadership, inflexibility, politics, and misunderstandings about the environment can all affect impact [18]. Kaplan and Norton maintain that it is breakdowns in an organization's processes and tools for managing strategy that lead to underperformance. This view has been broadly accepted, and numerous adaptations of the K-N model, especially the Balanced Scorecard [3], have been used to help improve strategies and outcomes in both for-profit and not-for-profit organizations, including healthcare and health research and development [19-22]. Much has been written on the challenges faced in international clinical research collaborations [23-25], often pointing to burdensome regulations and requirements [26,27] and inefficient processes [28-30]. We have not, however, seen investigations of how research strategy is managed or evaluated in these partnerships; hence the focus of our work herein. Our experience and findings provide initial indications that our approach, with its built-in guidance on K-N best practices, developmental scoring model, and standardized data-gathering tools, is feasible and applicable. The reception of our findings by PREVAIL leadership, and their uptake in identifying opportunities for improvement, further suggest the validity of the approach.

Five lessons learned may help us and others seeking to strengthen strategy management in clinical research collaborations.

1. Successful assessment depends heavily on the engagement of the group's leadership to promote the effort within the organization, allocate resources, and translate learnings into practice. Leaders help articulate their partnership's 'as is' and 'to be' states and can help scope assessments that are scaled to their group's needs and capabilities. Of note, PREVAIL was just three years old when it began this work, and as such would be considered at the 'younger' end of the horizon for organizational maturation [31,32]. That its leaders were able to begin building strategy management capacity at a relatively early stage suggests that this kind of work may contribute meaningfully even in relatively new partnerships.

2. Plan for a learning curve, for both the facilitators and the organization. With the effort led 'from within', the assessment of PREVAIL took about 18 months. While future iterations of the process may be streamlined for quicker deployment, building the expertise and capacity is an investment that takes time.

3. To optimize meaningful use of findings, it is best if strategy management reviews are timed to produce results that coordinate with an organization's existing planning cycles (e.g., strategy, funding, budgeting).
The results of the PREVAIL assessment were provided in time to support a strategic planning initiative for the transition of PREVAIL from its original emergency-response focus to a long-term, sustainable research capacity.

4. It can be advantageous to enlist professional consulting support for greater efficiency in training, analysis, report writing, and facilitation of ongoing monitoring. External consultants may also approach the work with greater objectivity.

5. Envision a longer-term horizon, perhaps greater than 2-3 years, for model understanding and practice to grow and become a routine part of partnership management.

We also identified risks in this work.

1. Evaluations, especially when led by funders, can be met with cynicism and mistrust. Reluctance to reveal problems or spotlight areas in need of improvement can undermine the value of the work. Approaches we took to help avert this included: a) encouraging leadership of the assessment from within the organization; b) openly sharing the assessment model; c) focusing on findings that could be put to practical use; d) avoiding comparisons to other partnerships; and e) maintaining sensitivity to language barriers and to the cultures and norms of the communities within which the work is being done, especially differences in comfort with critique or disagreement.

2. Dismissal of the K-N model as irrelevant because it originated in the business world. Our efforts to avert this centered on: 1) adapting K-N terms to the clinical research context, and 2) drawing on experience gained using the model within our own Division at NIAID.

Our findings from this first implementation encourage our thinking that the K-N model can be contextualized and adapted for use in assessing strategy management in collaborative clinical research partnerships. Strategy management reviews can be led from within a partnership and conducted effectively. With appropriate resources, evidence can be gathered and reproducibly analyzed using standardized tools. Resonance of the findings with partner leadership, and convergence with other empirical assessments, suggest the approach has the potential to produce meaningful, usable results for clinical research groups. Future studies with other partnerships will seek to: 1) corroborate these initial observations; 2) enhance our understanding of the adaptability and utility of our assessment model in different settings (e.g., cultural, political); 3) streamline the methodology for easier use; and 4) gain insight into potential correlations of strategy management assessments with clinical research performance evaluation.
References

The State of Clinical Research in the United States: An Overview
Developing a conceptual framework for an evaluation system for the NIAID HIV/AIDS clinical trials networks
The balanced scorecard - measures that drive performance
The Execution Premium: Linking Strategy to Operations for Competitive Advantage
Mastering the management system
Strategy as the focus for evaluation
How to Evaluate Corporate Strategy
The evaluation of business strategy
A structured evaluation of business process improvement approaches
Developing and evaluating a methodology for business process improvement
Utilization-Focused Evaluation
Baldrige Excellence Framework (Health Care): A Systems Approach to Improving Your Organization's Performance
PREVAIL I Study Group: Beating the odds: successful establishment of a Phase II/III clinical research trial in resource-poor Liberia during the largest-ever Ebola outbreak
The tenuous link between formal strategic planning and financial performance
Strategy planning and financial performance: a meta-analytic review
Strategic planning and firm performance: a synthesis of more than two decades of research
The Fall and Rise of Strategic Planning
A novel performance monitoring framework for health research systems: experiences of the National Institute for Health Research in England
The balanced scorecard in healthcare organizations: a performance measurement and strategic planning methodology
Application of balanced scorecard in the evaluation of a complex health system intervention: 12 months post intervention findings from the BHOMA intervention: a cluster randomised trial in Zambia
Development of performance indicators for clinical research coordinators using the balanced scorecard in South Korea
Challenges of international oncology trial collaboration - a call to action
Implementing clinical trials on an international platform: challenges and perspectives
Conducting clinical trials - costs, impacts, and the value of clinical trials networks: a scoping review
Regulatory impediments jeopardizing the conduct of clinical trials in Europe funded by National Institutes of Health
Increasing value and reducing waste in biomedical research regulation and management
Invisible barriers to clinical trials: the impact of structural, infrastructural, and procedural barriers to opening oncology clinical trials
Steps and time to process clinical trials at the cancer therapy evaluation program
Evaluating protocol lifecycle time intervals in HIV/AIDS clinical trials
Planning horizon as a key element of a competitive strategy
Strategic planning and individual temporal orientation

Acknowledgments

The authors gratefully acknowledge the PREVAIL leadership and staff for the willingness and enthusiasm with which they have engaged in this work. We are thankful to Lisa Giebeig, Christen Osburn and Kathleen Igo for outstanding administrative support. We also appreciate the cooperation and contributions of the several PREVAIL partnership organizations who have played vital roles in making the partnership successful. Finally, we thank the PREVAIL study participants, their families, and close contacts.

This project has been funded in whole or in part with Federal funds from the National Cancer Institute, National Institutes of Health, under Contract No. 75N91019D00024. The content of this publication does not necessarily reflect the views or policies of the Department of Health and Human Services, nor does mention of trade names, commercial products or organizations imply endorsement by the U.S. Government.
The authors declare that they have no conflicts of interest. We wish to confirm that there are no known conflicts of interest associated with this publication and that there has been no significant financial support for this work that could have influenced its outcome.

Supplementary data to this article can be found online at https://doi.org/10.1016/j.conctc.2021.100833.