Patient and Public Involvement (PPI) in Health Economics Methodology Research: Reflections and Recommendations
Al-Janabi, Hareth; Coles, Jenny; Copping, John; Dhanji, Nishit; McLoughlin, Carol; Murphy, Jacky; Nicholls, Jean
Patient (2020-09-17). DOI: 10.1007/s40271-020-00445-4

Patient and public involvement (PPI) can be used in methods research, as well as applied research, in health economics. However, methods research goals may seem quite abstract when compared to the lived experiences of lay participants. This article draws on 4 years of PPI in a research project to develop methods for including family carer outcomes in economic evaluation. Key challenges in using PPI for health economics methods research relate to (1) training and preparation, (2) maintaining involvement, and (3) selecting suitable tasks. We suggest three criteria for selecting a research task for PPI input, based on the task's importance, the professional researchers' skills gap, and the potential PPI contribution.

Health economics research is beginning to benefit from patient and public involvement (PPI) [1-3]. This research can be 'methodological' or 'applied' in nature, and both forms of research raise important challenges in using PPI. In this article, we reflect on PPI input alongside a health economics project to develop methods to include informal (family) carer outcomes in economic evaluation. The article starts with a brief overview of PPI and methods research in health economics. We then describe our research project and experience of using PPI. Drawing on this experience, the remainder of the article discusses issues in PPI in health economics methods research.

PPI has grown substantially over the last two decades and is increasingly required for health research in the UK [4] and elsewhere [5, 6]. It can be used at various stages of the life cycle of a research project, from prioritising research questions through to impact evaluation [7]. PPI has been used in various ways in health economics, including in support of priority setting [3], setting the scope of costs and outcomes [2], selecting health states for valuation [1], and identifying approaches to collecting cost and outcome data [2]. It is important to distinguish PPI from qualitative research (an important parallel development in health economics [8]): PPI is research done in partnership with the public, rather than research using the public as participants. PPI treats 'the experience of patients, service users and carers [as] a fundamental and valued source of knowledge' [9]. Various benefits of using PPI in health economics have been noted, including covering gaps in knowledge, embedding the voice of the public in research, ensuring tasks are presented in an appropriate format, and providing an economic explanatory framework [1-3].

Methods research in health economics potentially covers a wide terrain, but a key focus is improving the techniques of economic evaluation. This includes, for example, the development and testing of new outcome measures [10, 11], the development of costing tools [12], the devising of theoretical or practical frameworks for economic evaluation [13], and methods advances, such as in decision modelling [14], that are 'nested' in applied studies.
Good methods research is crucial to ensure that costs and outcomes are accurately measured, synthesised, and modelled; that economic analyses are comprehensive and reflect society's values; and that innovation and scientific rigour are promoted in economic evaluation research.

Two important features differentiate methods research from applied research in health economics. First, the intended impact is different. Methods research seeks to develop techniques for application across the discipline, and therefore the intended impact is improved science conducted by the research community. Conversely, applied health economics research focuses on directly informing how health and social care services can be efficiently and equitably delivered; the intended impacts are thus on the provision of services by care professionals for members of the public. While both methods and applied research can ultimately affect the delivery of care, the impact of methods research is indirect, and therefore the research goals may be more abstract from the public perspective. Second, methods research is often undertaken as a standalone project, and therefore there may be a need to establish a bespoke PPI group. Conversely, applied health economics research is often nested within a clinical study and can benefit from PPI undertaken as part of the wider clinical study.

In the next section we discuss the input of a bespoke PPI group (the lived experience advisory panel [LEAP]) as part of a standalone health economics methods project. The aim of the research project itself was to generate (and test) research methods to measure and value the quality of life of family carers. A LEAP consisting of five people with current or recent experience of family care was recruited. The group included individuals who had fed into the design of the project, particularly in terms of how PPI would be used and how the project was communicated. The LEAP members were recruited through charity organisations in mental health and dementia and, in one case, via a colleague. As far as possible, we sought diversity in caring experiences (e.g. care for spouses, adult children, and parents, both in the home and outside it). The group's size was based on a desire to reflect the diversity of caring experiences while ensuring participants felt involved and able to contribute throughout the process. Ultimately, the LEAP contributed to three work packages of the research project:

- Understanding the mechanisms behind carer spillovers. This work package comprised interviews, focus groups [15], and a Delphi survey to understand the mechanisms by which health and social care interventions may affect carer wellbeing.
- Measuring carer quality of life. This work package comprised postal questionnaires [16] and think-aloud interviews to study the validity of different tools for measuring the quality of life of carers.
- Valuing carer quality of life. This work package comprised a person trade-off (PTO) study [17] and econometric analysis [18] to value carer quality-of-life improvements for use in economic evaluation (a simple numerical illustration of what this means in practice is sketched below).
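To make this last work package more concrete for readers less familiar with economic evaluation, the sketch below gives a minimal, hypothetical illustration of what including family carer outcomes in a cost-effectiveness calculation can look like: carer quality-of-life gains are counted alongside patient gains. The figures, the simple additive treatment of spillovers, and the function itself are our own illustrative assumptions, not outputs of the project.

```python
# Minimal, hypothetical illustration (invented numbers, not results from this
# project): one common way to include carer outcomes in a cost-effectiveness
# analysis is to add carer quality-adjusted life-year (QALY) gains to patient
# QALY gains before computing the incremental cost-effectiveness ratio (ICER).

def icer(delta_cost, delta_patient_qalys, delta_carer_qalys=0.0):
    """Cost per QALY gained, optionally counting carer QALY spillovers."""
    return delta_cost / (delta_patient_qalys + delta_carer_qalys)

delta_cost = 12_000          # extra cost of the new intervention (hypothetical, GBP)
delta_patient_qalys = 0.40   # QALYs gained by the patient (hypothetical)
delta_carer_qalys = 0.10     # QALY spillover to the family carer (hypothetical)

print(f"ICER, patient only: {icer(delta_cost, delta_patient_qalys):,.0f} per QALY")
print(f"ICER, incl. carer:  {icer(delta_cost, delta_patient_qalys, delta_carer_qalys):,.0f} per QALY")
# Counting the carer spillover lowers the ICER from 30,000 to 24,000 per QALY,
# which can change whether an intervention looks cost-effective at a given threshold.
```

The three work packages above can be read as supplying the ingredients for this kind of calculation: evidence on when and why spillovers arise, validated measures of carer quality of life, and value weights for changes in carer quality of life.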
The work was funded through a National Institute for Health Research (NIHR) Career Development Fellowship. To support the work, the LEAP and academic research team met 12 times, with meeting dates timed to coincide with relevant work. Twelve full-day meetings represented a significant commitment for all concerned, but it was felt this was needed, given the multiple work packages, the different phases of work within each work package (e.g. recruitment, survey design), and the need to maintain contact throughout a 4-year project. The LEAP meetings consisted of discussions and specific tasks related to the work packages of the project. Table 1 summarises the details of the LEAP involvement in the work, in the form of the Guidance for Reporting Involvement of Patients and the Public, version 2 (GRIPP2) short form, the recommended system for documenting PPI within health research [6]. Below, we discuss in more detail the range of tasks undertaken by the LEAP over the 4-year period.

Understanding the mechanisms behind carer spillovers involved qualitative research with family carers and care professionals to investigate how health and social care delivery may affect family carers [15]. In early meetings involving the LEAP, we discussed which individuals would have insight into these issues, how to ensure different perspectives could be understood, where potential participants could be located, and how individuals should be approached. We discussed the process of the interviews and focus groups, how participants could be made to feel comfortable, and how we could structure the sessions to best meet the research objectives. Once interviews and focus groups started, the LEAP reviewed initial transcripts and identified issues that contributed to the process of open-coding the transcripts [8]. Once the work had been completed, it became apparent that the research findings on 'spillover mechanisms' would potentially have wider relevance (beyond health economics and the research community). The LEAP helped to design a summary of the findings and to identify organisations with which to share this research.

Another major area of LEAP involvement was in designing surveys across the work programme, where it was important to achieve satisfactory response rates and meaningful answers. The LEAP completed early versions of the surveys. Terminology, in relation to mental health issues, was amended to improve clarity. The timescales and format of reminders to complete the survey were also informed by the LEAP.

Table 1 Details of the LEAP involvement in the work, reported using the GRIPP2 short form

Aim: The aim of the programme was to develop a framework for measuring and valuing carer quality of life for economic evaluation. This comprised investigations of the mechanisms behind carer quality-of-life spillovers (work package 1), the validity of different quality-of-life measures with carers (work package 2), the valuation of carer quality of life (work package 3), and the inclusion of carer and patient quality of life in economic evaluation (work package 4). The main clinical contexts for the work were mental health, dementia, and stroke, three areas associated with major, though differing, challenges for family carers.

Methods: A LEAP was established to engage individuals with recent experience of family care. LEAP members were recruited via charities in each of the three clinical areas. The LEAP included individuals with experience of caring for adult children, spouses, and parents, within the home and outside the home. Four of the five individuals had been involved in research before as lay participants, and two were involved in conducting research themselves. Twelve 1-day meetings (2016-2019) were held to coincide with relevant work and were led by HA. Panel members were reimbursed for their time and expenses using INVOLVE guidelines.
The meetings focused on recruitment strategies (e.g. for the interviews and Delphi), the language and content of research materials (including questionnaires and participant information sheets), the process of interviews and focus groups, the interpretation of study findings, and dissemination strategies. Each meeting had a major focus (e.g. participant recruitment), with supplementary issues also discussed. Some preparation was needed for the meetings. Detailed notes and actions were written up by the academic researchers. A member of the LEAP also sat on the scientific advisory panel for the project. Four core members attended all meetings, and one further member attended one meeting.

Results: The PPI work resulted in the following outcomes:
1. Participant recruitment. Recruitment to focus groups and interviews was expanded and facilitated by contacts in the LEAP.
2. Interview process. The process of the focus groups and interviews was influenced by the LEAP; this included the location, terminology, socio-demographic mix, prompts (to ensure focus), brevity of questions, use of flip charts, and the 'magic bullet question'.
3. Coding. The early coding of transcripts built on the perspectives of the LEAP.
4. Questionnaire design. Survey questions were amended in various ways, including to ensure appropriate terminology and clarity (especially around mental health). Changes were made to response categories, question ordering, and question content (including removal of a question on 'strain'). The timescales and format of reminders to complete the survey were also informed by the LEAP.
5. Delphi study design. In the Delphi survey, a number of elements were amended, including the instructions (e.g. in relation to how long the survey was likely to take), language, use of open responses, formatting of feedback information, and the 'prioritisation task'.
6. Think-aloud interviews. In the 'think-aloud' study, the number of instruments included (a maximum of three), the handling of upsetting questions, and ways of handling the presence of the care recipient were informed by the LEAP.
7. PTO design. The PTO tasks were made more comprehensible. The LEAP encouraged us to use PTO rather than a discrete choice experiment (DCE), and to estimate equivalence values through iterative rather than direct methods. Instructions were amended to standardise information. Graphical methods (scales and stick people), worked examples, and the scale and duration of benefit were all informed by discussion with the LEAP.
8. Dissemination. The work on 'IMPACT' mechanisms was disseminated to various health and care organisations, NHS trusts, and charities on the recommendation of the LEAP. The format and language used in dissemination were influenced by the LEAP. Further contact and face-to-face presentations have resulted from the dissemination.

Discussion and conclusion: The LEAP played a particularly important role in opening up new avenues for recruitment and dissemination and in making research surveys more accessible for participants. The latter was challenging because the surveys required participants to make difficult trade-offs, for example between improving the lives of patients and improving the lives of carers. An important objective of the research was to ensure carers could answer questions about their own quality of life openly and honestly, and the input of the LEAP was important in achieving this.

Reflection and critical perspective: Members of the LEAP commented on feeling useful in helping to shape surveys and on understanding more about the research process (putting them in a good position to contribute to other studies).
The LEAP made numerous contributions to making the research project more effective (see the results above). However, there were also important lessons in terms of how best to prepare for the work, maintain involvement over a period of time, and ensure tasks are appropriate for joint working. The LEAP involvement gave the participants the experience and motivation to become involved in future PPI and research activity. This is worth stressing because the expansion of PPI nationally will require a greater number of such participants.

In later meetings, the LEAP piloted and helped to refine the Delphi survey. This influenced the instructions (e.g. in relation to how long the survey was likely to take), language, use of open responses, formatting of feedback information, and the 'prioritisation task'. The LEAP also contributed in a similar way to designing the PTO study (to value carer quality of life) [17]. In this case, the LEAP completed a lengthy initial survey with both discrete choice experiment (DCE) and PTO tasks, formatted in various ways to try to identify the most feasible method. This work was followed by in-depth discussion between the LEAP and the academic team, which helped us identify that an iterative PTO task was the most viable option. Graphical methods (scales and stick people), worked examples, and the scale and duration of benefit were all informed by the discussion with the LEAP. Towards the end of the project, the LEAP completed an online version of the PTO task as part of the process of moving to the final experiment.

Our work over the 4 years (presented in Table 1) has suggested three important lessons for making best use of PPI in a health economics methods project. First, lay participants and professional researchers need training and preparation. Second, practical measures (such as reimbursement) and 'soft' skills (such as maintaining a welcoming atmosphere) are needed to ensure ongoing engagement in a long research project. Third, care needs to be taken to select appropriate research tasks for PPI. We discuss these three lessons in more detail in the next two sections.

A key issue raised in our experience of using PPI for methods research is the need for prior training and clear objectives. In our study, the team (i.e. LEAP and academics) had initial discussions about the purpose of the project and how PPI could inform this. However, some lay participants reflected that more upfront training would have helped them feel more reassured about their input. Discussions within our team also revealed differing expectations of timescales and understanding of how PPI would fit into a research project. Training needs include research methods (and timescales) as well as health economics, with specific training (e.g. in outcome valuation) added as needed. Written materials that came to light during the project [19, 20] could also be a useful supplement to more interactive training. To avoid duplication, it might be helpful to develop online training modules that all PPI participants could access.

Making PPI work also requires effort from academic researchers. Two of the academic team were trained in qualitative research methods. However, in common with most health economists, the academic team had not been exposed to PPI through their training and career development. As with lay participants, a combination of written materials and interactive training would undoubtedly help.
Specific areas of training could include the practicalities of recruiting and communicating with lay participants, running tasks, facilitating group discussion, safeguarding welfare, handling different perspectives, and ending a research project.

As with other academic projects, methodology research projects can often last several years, and the project's impact on the field may be gradual. As such, practical measures are needed to ensure lay participants feel engaged throughout. In our project, the LEAP supported a range of activities (Table 1), which helped to vary the meetings; we met every 3-4 months; and we distributed agendas and actions from the meetings to ensure a clear trail of what we were doing and how the LEAP was contributing. LEAP members were reimbursed for full-day meetings in accordance with INVOLVE guidelines [21]. Diaries (i.e. keeping a record of tasks and comments), remote participation, and individualised feedback on impact [22] might be additional ways to maintain involvement through a long project. Thought also needs to be given to how to end the project. We used the final few meetings to discuss follow-on projects and dissemination. It will make sense to disband the group in some cases and to maintain it in others, to support dissemination or follow-on work.

Using 'soft skills' to maintain a positive group dynamic was also important in sustaining involvement throughout a long project. PPI meetings can be intimidating, even if they feel fairly informal from the perspective of the academic team. We were fortunate that four of the group stayed throughout the 4-year project. This created a sense of cohesion and familiarity that encouraged involvement. However, when people do drop out of PPI groups, this needs to be carefully handled, recognising that there may be a trade-off between replacing a lost perspective in the group and maintaining the group's identity. We found it particularly useful that several of the panel had prior experience of research, either as a lay member or through other means. This enabled them to bridge the lay and academic worlds to some extent, helping others in the group to feel at ease. Of course, having lay participants with research experience may not always be helpful, particularly if some members feel alienated by a lack of research experience. This reinforces the need for training prior to the start of the research.

As detailed in Table 1, we undertook a wide range of tasks with the PPI group. In general, these tasks worked well, but not all aspects of a research project are equally amenable to PPI. Reflecting on the process of selecting tasks, we suggest three elements that are needed for PPI to add real value to a research task. First, the task must be necessary to meet the research goals of the project. This may sound obvious, but it means that tasks should not be invented for the purposes of PPI; this creates unnecessary work for everyone and undermines the legitimacy of PPI. Second, the task should be one where the academic research team, on their own, have a knowledge or skills gap. It would not make sense, for example, for PPI participants to execute analysis that the researchers are trained and experienced in, simply for the sake of it. Third, the lay participants should have some additional knowledge or skills in relation to the task, beyond those the professional researchers possess.
This means a task will not necessarily be suitable for PPI just because the academic team are struggling with it: there should be a reasonable expectation that PPI participants can bring additional insights to the task. We would argue that all three criteria need to be fulfilled for PPI to be valuable in relation to a research task.

Qualitative coding of transcripts is one task where careful consideration of the relevance of PPI may be needed. Coding is a vital stage of the qualitative research process. In our experience, involvement of the LEAP enriched initial discussions about the issues raised in interviews. However, once the formal (or axial) coding was underway, academic researchers were best placed to do this, given their professional training and arguably greater objectivity. Of course, lay participants bring many skills, which may include relevant research skills, but in general, data preparation and analysis are areas where lay participants ought not to be replacing professional researchers.

Conversely, identifying organisations for sampling is an activity that can really benefit from PPI. For the research to happen, participants need to be identified, approached, and recruited. While professional researchers will be expected to know the sampling criteria, they will not always be aware of the voluntary and professional groups that exist in each disease context or how open such groups are to participating in research. As documented in Sect. 2, the LEAP members were helpful in identifying carers and care professionals we could include, where to access them, and how to approach them. PPI participants will often have first-hand experience of these organisations, personal networks, and an awareness of how open such organisations might be to research involvement. Relating this to our framework in Fig. 1, this is certainly an area where professional researchers may have a knowledge gap and lay participants can make an important contribution.

A second research activity that is likely to benefit from PPI is what we might term 'deep piloting' of surveys and other research materials. Over the course of the work, the LEAP assisted in developing a postal survey to assess the validity of quality-of-life measures, a Delphi study of carer and care professional attitudes, and a person trade-off experiment. Good practice in outcome measurement and valuation of course involves piloting measures and experiments before use [23-25]. However, PPI is capable of going further than conventional piloting. In our experience, this process encompasses several elements; in Box 1 we focus on what we did in the PTO task, but we used a broadly similar process to develop our Delphi survey and quality-of-life survey. The real benefit of deep piloting with a PPI group is the ability to work with a group of people who have a 'foot in both camps' (as researchers and participants) in identifying problems and generating solutions through several iterations.

Box 1 Deep piloting of the person trade-off (PTO) task
Step 1: Initial pilot. Lived experience advisory panel (LEAP) members completed paper-based PTO and discrete choice experiment (DCE) scenarios as if they were participants, to generate information on response patterns, time taken, and completion rate.
Step 2: Reflective survey questions. At the end of the exercise, LEAP members completed survey questions about feasibility, ease, attractiveness, confusing questions, and so forth.
Step 3: Group discussion on initial pilot. Following steps 1 and 2, we had a group discussion about the feasibility of, and problems with, the task; we then co-developed solutions to these problems.
Step 4: Redesign task. The task was redesigned based on feedback from steps 1-3, and steps 1-4 were repeated with version 2 of the pilot.

The academic team redesigned the preference elicitation study (and put it into an online format) for a second, similar phase of piloting a few months later. The final PTO survey built on these two phases of piloting and redesign.
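The 'iterative' PTO approach described earlier and piloted in Box 1 can be pictured as a narrowing sequence of choices rather than a single open-ended question. The sketch below is purely illustrative: the bisection rule, group sizes, and function names are our own assumptions, not the protocol used in the study [17]. It shows how repeated 'better, worse, or equivalent' judgements could converge on an equivalence number, from which a relative value weight might be derived.

```python
# Hypothetical sketch (not the authors' protocol): an iterative person trade-off
# (PTO) question loop. The respondent is repeatedly asked whether improving the
# quality of life of n carers is better than, worse than, or equivalent to
# improving the quality of life of a fixed reference group of patients. The
# interval [lo, hi] is narrowed until an equivalence number is found, and a
# relative value weight is derived as reference_group_size / equivalence number.

def iterative_pto(ask, reference_group_size=100, lo=1, hi=1000, max_rounds=10):
    """ask(n) returns 'carers', 'patients', or 'equivalent' for the choice:
    help n carers versus help `reference_group_size` patients."""
    for _ in range(max_rounds):
        n = (lo + hi) // 2
        answer = ask(n)
        if answer == "equivalent":
            break
        elif answer == "carers":   # helping n carers preferred: equivalence lies below n
            hi = n
        else:                      # helping the patients preferred: equivalence lies above n
            lo = n + 1
    equivalence_n = (lo + hi) // 2
    weight = reference_group_size / equivalence_n  # value of the carer improvement relative to the patient improvement
    return equivalence_n, weight

# Example with a scripted respondent who values the carer improvement at about
# half of the patient improvement (equivalence at roughly 200 carers).
responses = lambda n: "carers" if n > 200 else ("equivalent" if n == 200 else "patients")
print(iterative_pto(responses))  # -> (200, 0.5)
```

In a survey, the same logic would be implemented as a scripted sequence of question screens rather than a function call, which is one reason the LEAP's feedback on instructions and graphical presentation mattered for the feasibility of the task.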
PPI contributed in many important ways to the quality, timeliness, and relevance of our methodology research project. We found that effective PPI required careful preparation, steps to maintain involvement in the project, and the careful selection of tasks. PPI on a research task will only add value if the task is central to the research objectives and in an area where academic researchers have a knowledge or skills gap that PPI can fill. This is particularly important in health economics methods research, where the steps and tasks might be quite abstract to the lay public.

A significant feature of our lay panel was that it was composed of family carers. This is a group whose voice is often under-represented, and their role in this research study was especially relevant, given its focus on the quality of life of family carers. The LEAP provided a valuable opportunity for members to discuss the impact of caring on their own wellbeing, rather than their role as a provider of care to a patient. These discussions were facilitated by a smaller PPI group, which we felt led to carers feeling more comfortable sharing their own concerns. The group size may need to be larger for projects where a high level of diversity is needed or for certain research tasks, for example if extensive co-interviewing is needed. Global and national pressures on family carers, including coronavirus disease 2019 (COVID-19), inadequate social care provision, and ageing populations, mean that the importance of involving family carers in PPI will remain and is likely to grow.

Because health economics methods research may not occur alongside a clinical project, it is less likely that there will be an existing PPI group to access. This means specific efforts will be needed, on the part of health economists, to engage and retain members (as discussed in Sect. 3). The point is well made in other studies that researchers need to work sensitively [26], see PPI as a two-way education process [2], and avoid exploitation [27].

Our work adds to a small but growing literature on PPI in health economics. Many of the principles for best practice will be relevant regardless of the focus of the research project. Others have highlighted the need for clear aims and objectives and for ensuring appropriate tasks are selected [1, 2]. The fact that PPI is most successful when members can contribute actively is also underscored by Gibson et al. [28], who highlight PPI as a dynamic interaction between lay and professional knowledge. However, health economics methods research requires perhaps even more careful attention to preparation and task selection, given the more abstract nature of the work.

PPI requires significant resources from lay participants and academics, and ultimately from the funder and society. It is therefore legitimate to ask how value for money can be maximised. Where PPI is needed for a methodology project, we suggest devoting resource to upfront training and to the careful selection of tasks.
Fewer meetings, with well-briefed participants, focused on key tasks will contribute more to the research than more frequent, tokenistic discussions.

In conclusion, PPI in health economics methods research has much potential, but making the best use of PPI requires careful consideration. PPI contributed in many important ways to improving the quality, timeliness, and relevance of our work. We hope the tools developed through the work, in addition to the ideas in this article about how to make best use of PPI, will be helpful to the broader research community. We look forward to further discussion and debate about this important topic.

References
Involving members of the public in health economics research: insights from selecting health states for valuation to estimate quality-adjusted life-year (QALY) weights. Appl Health Econ Health Policy.
Working with patients and members of the public: informing health economics in child health research. PharmacoEconomics Open.
Knowledge of public patient involvement among health economists in Ireland: a baseline audit.
PPI resources for applicants to NIHR research programmes.
GRIPP2 reporting checklists: tools to improve reporting of patient and public involvement in research.
Guidance for researchers on PPI.
Qualitative methods for health economics. London: Rowman & Littlefield.
Going the extra mile: improving the nation's health and wellbeing through public involvement in research.
Development of a self-report measure of capability wellbeing for adults: the ICECAP-A.
Developing a descriptive system for a new preference-based measure of health-related quality of life for children.
Core items for a standardized resource use measure (ISRUM): expert Delphi consensus survey.
A framework to include family health spillovers in economic evaluation.
Alternative decision modelling techniques for the evaluation of health care technologies: Markov processes versus discrete event simulation.
Six mechanisms behind carer wellbeing effects: a qualitative study of healthcare delivery.
Validity and responsiveness of preference-based quality of life measures in informal carers: a comparison of 5 measures across 4 conditions.
Valuing carer quality of life for economic evaluation: a person trade-off (PTO) study. Discussion paper at the Health Economists' Study Group Meeting, Newcastle.
Valuing care-related outcomes for economic evaluation: an application of the wellbeing valuation method. Discussion paper at the Health Economists' Study Group Meeting.
A research handbook for patient and public involvement researchers.
A researcher's guide to patient and public involvement.
Is it worth it? Patient and public views on the impact of their involvement in health research and its assessment: a UK-based qualitative interview study.
Qualitative research and content validity: developing best practices based on science and experience.
Reporting formative qualitative research to support the development of quantitative preference study protocols and corresponding survey instruments: guidelines for authors and reviewers.
Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations.
A framework for public involvement at the design stage of NHS health and social care research: time to develop ethically conscious standards.
Patient and public involvement: how much do we spend and what are the benefits?
Evaluating patient and public involvement in health research: from theoretical model to practical workshop.

The authors thank Eve Wittenberg (Harvard University) and James Hall (University of Birmingham) for comments on an earlier draft of the manuscript.

Funding: This research was funded by a National Institute for Health Research (NIHR) Career Development Fellowship awarded to HA (award number: CDF-2015-08-025). This paper presents independent research funded by the National Institute for Health Research (NIHR). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care. The first draft of the manuscript was written by HA, and all authors reviewed previous versions of the manuscript. All authors read and approved the final manuscript.