key: cord-0855604-j7rl84cg
authors: Childs, Jessie; Thoirs, Kerry; Quinton, Ann; Osborne, Brooke; Edwards, Christopher; Stoodley, Paul; Lombardo, Paul; Mcdonald, Sandra; Slade, Debbie; Chandler, Amanda; Taylor, Lucy; Long, Jodie; Pollard, Karen; Halligan, Toni
title: Development of a professional competency framework for Australian sonographers—perspectives for developing competencies using a Delphi methodology
date: 2022-03-21
journal: Int J Qual Health Care
DOI: 10.1093/intqhc/mzac017
sha: 26d267d9a3f7181c0f79693d1e2c46ac36999e13
doc_id: 855604
cord_uid: j7rl84cg

BACKGROUND: Professional competencies are important for enhancing alignment between the needs of education, industry and health consumers, and for describing public expectations of health professionals. The development of competency standards for the sonography profession defines the behaviours, skills and knowledge sonographers should demonstrate at each learning and experience level.

OBJECTIVE: The objective of this project was to develop a set of professional competency standards for the sonography profession describing in depth the behaviours, skills and knowledge sonographers should demonstrate across multiple learning and experience levels.

METHODS: Representatives of three Australian ultrasound professional associations and seven tertiary institutions involved in entry-level sonographer education in Australia formed a research team (RT). The RT recruited an expert panel that responded to six survey rounds. Using a Delphi methodology, the results and free-text comments from each round were fed back to participants in subsequent survey rounds to achieve a consensus.

RESULTS: The project developed a professional competency framework for sonographers comprising four major domains: detailed competency standards, sonographer knowledge, sonographer attitudes and a holistic competency matrix (https://doi.org/10.6084/m9.figshare.17148035.v2).

CONCLUSION: The Delphi methodology is an effective way to develop professional competency standards. This paper describes the methods used, and the challenges encountered, in developing such standards for sonographers, which could be translated to other health professions.

Health professions use competencies as measurable standards to describe specific behaviours, skills and knowledge [1]. Competencies facilitate safe practice, transparency in professional regulation, standardized assessments and curriculum development. Furthermore, they enhance alignment between the needs of education, industry and health consumers and differentiate one health profession from another [1, 2]. Competencies can be defined for different levels of practice [2].

Sonographers in Australia who complete relevant qualifications apply to enter the Australian Sonographer Accreditation Registry (ASAR). The ASAR approves and accredits training courses, requiring each course to assess a range of theoretical and practical elements. The ASAR's 'Standards for the accreditation of sonographer courses' [3] define each element and incorporate a list of competencies described in the Australasian Sonographers Association (ASA) competency standards [4], published in 2011. The existing ASA competencies [4] address minimum expectations but not modern ultrasound practices and advanced professional roles [5, 6]. Point of Care Ultrasound (POCUS) [7] and new and emerging technologies such as Artificial Intelligence [8] also challenge current work practices.
The development of a more detailed and contemporary competency framework is therefore timely. A lack of conceptual and practical guidelines or methods for developing competency frameworks leaves framework developers and users grappling with how to interpret the suitability and utility of different options [9]. The Delphi methodology is one way to build competency standards, using anonymous sequential questionnaires to achieve a consensus amongst 'experts' [10, 11]. Consensus is facilitated by the sharing of opinions across multiple rounds [9]. This collective review by a representative group is appropriate for competency development, which requires a judgemental rather than an empirical process [11]. The aim of this paper is to describe the Delphi methodology used to update Australian sonographers' competencies and to offer insights for competency development in other health professions. The final outcome of this project, the professional competency framework for sonographers, is published on the non-peer-reviewed platform figshare and is available at https://doi.org/10.6084/m9.figshare.17148035.v2 [12].

A research team (RT), consisting of representatives from three Australian ultrasound professional associations and seven institutions involved in entry-level sonographer education, developed and implemented the study, including the drafting of competencies. The RT used the study results to develop a professional competency framework for sonographers. Ethical approval was granted before the study commenced (UniSA HREC protocol number 201916).

The overall purpose was to determine the behaviours, skills and knowledge that sonographers should demonstrate, and the appropriateness of these across different levels of learning and experience. This was to be achieved across four domains: (i) detailed competencies, which differentiate entry-level sonographers from more advanced levels; (ii) expected sonographer attitudes; (iii) knowledge items specific to areas of practice, with differentiation between those expected at entry level and those expected at more advanced practice levels; and (iv) holistic competencies scaffolded across five practice levels. For each competency domain, consensus was sought using two to three survey rounds. Survey rounds specific to each domain were staggered across a total of six survey rounds (Table 1).

An expert panel (EP) of sonographers was recruited by the RT to participate in the Delphi study. EP sonographers were required to be accredited by the ASAR, the minimum standard for sonographers practising in Australia, and to have at least 5 years of post-accreditation experience, including experience in at least one of the following: (i) clinical supervision and/or assessment of sonography students, (ii) performance management of sonographers or (iii) receiving or responding to patient or referrer feedback. Sonographers working in academic roles and not currently in clinical practice were excluded, as were members of the RT. Sonographers for the EP were recruited via posts on professional websites (ASA, ASAR and Australian Society for Ultrasound in Medicine) and electronic mailing lists. Interested sonographers were provided with written project information and invited to submit an expression of interest (EOI). Information in the EOI was used to confirm eligibility and to record EP demographics by geographic area and clinical practice setting.
Eligible sonographers were sent information prior to the study and returned written consent before commencing Round 1. Recruitment aimed to obtain a pool of sonographers from diverse geographic locations and clinical settings, with expertise spanning all sonographic procedures. There are no prescriptive guidelines regarding the ideal number of experts to recruit for a Delphi study [13]. Recruitment therefore focused on obtaining sufficient expertise encompassing relevant practical, theoretical and research perspectives to inform competency standards [10, 14]. After initial recruitment, any deficits in representation were addressed by promoting the study via professional networks and targeted invitations.

After recruitment of the EP was finalized, the Delphi process commenced. Anonymous surveys were distributed using the web-based survey management platform LimeSurvey (Hamburg, Germany) [15]. Survey responses were either binary (yes/no) or weighted 5-point Likert agreement scales. Binary scales were used initially; as a strategy to facilitate consensus, weighted 5-point Likert scales were introduced in subsequent rounds. At the end of each question group and at the end of each survey, free-text questions allowed qualitative responses and gave the EP an opportunity to suggest alternate or additional competencies. Written and/or video-recorded survey instructions were supplied at the beginning of each round to explain the survey's purpose and format and the results from the previous round. These results included quantitative summaries (percentage agreements) and unedited free-text responses. Free-text responses guided the development of surveys in subsequent rounds and informed the EP of alternate perspectives they may not have considered.

Competencies reaching 100% consensus were accepted into the competency framework. Competencies not reaching 100% consensus were carried forward into the following round; if they then reached 70% consensus or more, they were accepted into the framework, and if not, they were carried forward once again. Competencies not reaching 70% consensus agreement after three rounds were notated as such in the competency framework. The definition of consensus, and how best to achieve it using a Delphi study, are unclear in the literature; methods used include a percentage agreement above a threshold ranging from 50% to 97%, or a proportion of ratings within a particular range [16, 17]. In this Delphi project, the threshold for achieving consensus on each item was set a priori at 70%, consistent with minimum rates recommended to maintain rigour [10, 18].

Each survey round remained open for approximately 3 weeks, during which reminders of the closing date were communicated to the EP. Short extensions were granted on request. A 3-week interval between survey rounds was used to collate and summarize survey data and to develop the subsequent round. Before distribution, each survey was checked for alignment with the project aims, clarity and lack of bias, and the accuracy of the data generated from the previous round was also checked. The RT used online meetings to discuss any arising issues. Disagreement and agreement scores (%) determined consensus and were calculated for both binary scales and Likert scales.
For binary scales, the percentage of total responses answering in the affirmative or the negative of the statement represented the 'agreement score' or 'disagreement score', respectively. For Likert scale questions, the 'agreement score' was calculated by assigning values of 1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree and 5 = strongly agree. The number of participants selecting each option was multiplied by the value of that option, giving five resultant values, one for each option. These values were added together to give the 'total score'. The maximum possible total score occurred if every respondent selected 'strongly agree' (total number of respondents × 5). The total score was then divided by the maximum possible total score and multiplied by 100 to determine the 'agreement score'; a hypothetical worked example is given in the sketch below. The 'disagreement score' was calculated in the same way with the numerical values of the options reversed: 1 = strongly agree, 2 = agree, 3 = neither agree nor disagree, 4 = disagree and 5 = strongly disagree.

Recruitment occurred from May 2019 to August 2019, and the six survey rounds were administered between August 2019 and May 2020. The EP was representative of all sonographic areas of expertise and all Australian states and territories, and included representation of metropolitan, regional and remote areas. The geographic distribution of the EP across all states and territories was similar to the geographic distribution of Australia's population of sonographers as reported by the ASAR (Table 2; note that some of the EP worked in more than one geographical area). Most sonographers worked in private settings (45%, 25/55); twelve (22%, 12/55) worked in public settings and sixteen (29%, 16/55) worked in both private and public settings. Fifty-one sonographers (93%, 51/55) had combined roles of clinical practice and education, one was an applications specialist, one (2%, 1/55) worked clinically but not in education, and two (4%, 2/55) were clinical educators only. Table 3 shows the EP expertise across different clinical areas of practice. There was gradual attrition of respondents between the first and final rounds, in which 1/55 (2%) and 17/55 (31%), respectively, did not complete the survey (Table 4).

Across the six survey rounds, competencies for each of the four domains were developed (Table 1), resulting in the new professional competency framework for sonographers [19]. Detailed competencies were developed across Rounds 1-3. In Round 1, the EP was presented with a draft list of detailed competencies, informed by an existing sonographer competency list [4] and competency lists from related professions [19-21]. The list documented five primary units split into 19 elements, with each unit containing between two and six elements (Table 5). The elements were further broken into 77 basic competency units, each called a 'Performance Criterion' (PC), representing the behaviours and attributes required to successfully perform critical work functions. The PCs were supported by 383 'cues' that provided illustrative examples of skills and knowledge (Table 5). The EP was asked whether each PC and its 'cues' should be included, excluded or amended. Round 1 reached consensus on the exclusion, inclusion or amendment of all PCs and 'cues'. Round 2 addressed the 63 PCs and 'cues' that were voted to be amended, by presenting revised wording options based on free-text responses from Round 1.
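To make the scoring arithmetic and the 70% consensus rule described in the Methods concrete, the following minimal Python sketch reproduces the calculation. This is an illustration only, not the study's analysis code, and the response tallies used are hypothetical.

```python
# Minimal illustrative sketch of the Likert agreement-score calculation and
# the a priori 70% consensus threshold described in the Methods. Not the
# study's analysis code; the tallies below are hypothetical.

LIKERT = {"strongly disagree": 1, "disagree": 2,
          "neither agree nor disagree": 3, "agree": 4, "strongly agree": 5}

def agreement_score(tallies):
    """Sum of (respondent count x option value), divided by the maximum
    possible total (number of respondents x 5), multiplied by 100."""
    total = sum(n * LIKERT[option] for option, n in tallies.items())
    max_total = sum(tallies.values()) * 5
    return total / max_total * 100

def disagreement_score(tallies):
    """Identical calculation with the option values reversed
    (strongly agree = 1, ..., strongly disagree = 5)."""
    total = sum(n * (6 - LIKERT[option]) for option, n in tallies.items())
    return total / (sum(tallies.values()) * 5) * 100

# Hypothetical responses from 40 panellists for one competency item:
tallies = {"strongly agree": 18, "agree": 14, "neither agree nor disagree": 5,
           "disagree": 2, "strongly disagree": 1}

score = agreement_score(tallies)  # (18*5 + 14*4 + 5*3 + 2*2 + 1*1) / 200 * 100 = 83.0
print(f"agreement = {score:.1f}%")                           # 83.0%
print(f"disagreement = {disagreement_score(tallies):.1f}%")  # 37.0%
print("consensus reached" if score >= 70 else "carry forward to next round")
```

Under the study's rules, this hypothetical item would meet the 70% threshold; an item scoring below 70% would be carried into the next round and, if still unresolved after three rounds, notated as not reaching consensus.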
Following Round 2, there were 45 PCs and 'cues' that had reached consensus for inclusion, but whose wording had not reached consensus. To avoid EP fatigue, the wording was finalized by the RT, who agreed they could do so without bias; the finalized items were presented back to the EP in Round 3 to determine inclusion or exclusion.

Rounds 4-6 aimed to differentiate whether the 'cues' determined in Rounds 1-3 were minimum requirements for entry-level sonographers or beyond the requirements of an entry-level sonographer. In Round 4, consensus was reached for 247/383 cues as appropriate for entry-level sonographers (71-98%) and for 22/383 cues as beyond the expectations of entry-level sonographers (72-98%). In Round 5, the 114 'cues' not reaching consensus in Round 4 were revisited; consensus was reached for a further 16 'cues' as appropriate for entry-level sonographers (70-92%) and 18 as beyond the expectations of entry-level sonographers. Those not reaching consensus were revisited in Round 6. After Round 6, consensus determined that 307 cues were suitable for entry-level practitioners and 42 were appropriate for more advanced levels, while 34 cues did not reach consensus.

A comprehensive list of knowledge items for sonographers was developed in Rounds 1-3. In Round 1, the EP was asked whether 63 knowledge items drawn from the existing competency standards [4] should be included as 'core' knowledge (the minimum knowledge required of an entry-level sonographer). Consensus agreement (76-100%) was reached for all knowledge items to be included as 'core' knowledge. The EP also provided suggestions for additional knowledge items. Round 2 addressed an additional 148 knowledge items drawn from Round 1 free-text responses. The EP were asked whether these additional knowledge items were required for sonography practice and, further, whether the knowledge should be regarded as 'core' or as 'advanced or specialist'. In Round 3, the knowledge items not reaching consensus in Round 2 were revisited. There was consensus that 10/62 were 'core' (73-74%) and 15 were 'advanced or specialist' (70-78%); consensus was not reached for 37 knowledge items. After Round 3, there was consensus to denote 125 knowledge items as 'core' and 49 as 'advanced or specialist'; the 37 items not reaching consensus were indicated as such in the final document.

A list of sonographer 'attitudes' was developed across Rounds 2 and 3. In Round 2, the EP was presented with a list of twenty 'attitudes' extracted from Round 1 qualitative responses and asked whether this list should be included in the professional competency framework as a standalone domain or integrated into the detailed sonographer competency standards. The EP was also asked to rate their agreement to include each listed 'attitude'. Consensus was reached to include all 'attitudes' (71-96%), but there was no consensus on how they should be incorporated into the professional competency framework, and despite consensus to include all attitudes, free-text responses proposed multiple changes. In Round 3, a modified list of 16 'attitudes', developed by the RT using Round 2 qualitative responses, was presented to the EP, who were asked to rate their agreement for this modified list against the unmodified list, and to rate their agreement on three options for how 'attitudes' should be presented in the professional competency framework. Consensus was reached for the modified list of 16 'attitudes' (71%) and for presenting it as a standalone sonographer attitudes domain in the professional competency framework (77%).
The holistic competency matrix was developed in Rounds 4-6 to provide a more generalized competency document focused on a set of universal characteristics. In Round 4, a draft holistic competency matrix, developed by the RT and informed by other competency models and taxonomies for skills, knowledge and attitudes [22-27], was presented to the EP. The matrix contained eight general competencies (applied knowledge, psychomotor skills, standard of work, autonomy, coping with complexity, perception of context, attitudes to learning, and attitudes towards self, professional colleagues and patients/clients). Each competency was supported by descriptions of how expertise is demonstrated across five sonographer levels based on the Dreyfus model of skill acquisition [22]: novice student, advanced beginner student, competent sonographer, proficient sonographer and advanced sonographer. This resulted in a matrix of 40 cells. The EP was asked to rate their agreement with the description in each cell of the matrix. Consensus was achieved for the descriptions in all cells (83-88% agreement), although free-text responses suggested changes. In Round 5, across 75 questions, the EP was asked to rate their agreement on potential amendments to the descriptions within 35 matrix cells, based on Round 4 free-text responses. Consensus was not achieved for the descriptors in six cells. Round 6 addressed the descriptors in the cells not reaching consensus in Round 5: twenty-four descriptor options relating to the six cells were offered, based on Round 5 free-text responses, and consensus was reached for amendments to the descriptors in each cell (70-81%). After Round 6, the consensus-based holistic competency matrix was accepted into the professional competency framework.

This paper describes the development of a professional competency framework for sonographers using a Delphi methodology and offers insights that may be translated to other health professions. The detailed set of sonographer competency standards (PCs) describes the knowledge and skills required for sonographers to perform effectively in different areas of practice. The 'cues' support the PCs by providing illustrative examples of skills and knowledge at both entry and advanced levels, helping to define specific threshold expectations for educators, students and regulatory bodies. The list of attitudes can guide personal development and the overall approach to practice. The holistic competency matrix focuses on universal characteristics fundamental to practice; its five incremental levels from novice to expert, and its reduced granularity, lend it to the assessment of students, to performance management and to career planning post accreditation [26].

The professional sonographer competency framework describes the minimum skill thresholds that student sonographers should develop to be eligible for accreditation, as well as milestone targets for accredited sonographers who are developing advanced skills over their years of practice and continuing professional development. Similar frameworks have been adopted by health professions internationally [19-21]. The Delphi methodology allowed the EP to steer study outcomes, with potentially positive effects on EP engagement and ownership and on future stakeholder acceptance and adoption. EP feedback highlighted the need to differentiate between entry-level and more advanced practitioners and to develop holistic competencies.
The anonymous process was important to encourage candid responses, to limit individuals dominating the process and to give confidence to those hesitant to share views contrary to the majority [28]. Although the EP was self-selected and therefore likely to have an interest in the project, we used strategies to foster engagement and motivation, aiming to minimize dropout and maximize response rates, which were threatened by time-consuming survey rounds with high question volumes [10, 29]. We used a personal communication approach, including emails informing the EP of project aims and outcomes and their potential relevance to sonographic practice [10]. Personal communications from individuals were answered rapidly and respectfully and, with permission, shared with the EP when relevant to the whole group [9]. Complex concepts were explained using video messages. Flexibility with survey closing dates and fast turnaround times between rounds also aimed to minimize dropouts [30].

Despite these strategies, there was a gradual attrition of ∼30% from the first round to the final round. Because of anonymity, we could not track non-respondents or their reasons for attrition. Two natural disasters (fire and flood) affecting large areas of the country in the Australian spring and summer, and the emergence of the coronavirus disease 2019 pandemic from January 2020, were probable external factors contributing to attrition. The web-based survey platform presented occasional technical problems, such as failing to save responses, an unwelcome inconvenience for people with busy work and personal schedules; more stringent trialling of the selected survey platform may have helped minimize such problems. Despite attrition, respondents continued to generate prompt and meaningful data: in the final round, for example, 38 respondents generated 135 free-text comments.

EP fatigue was also considered. There are no strict guidelines for the optimum number of survey rounds in the Delphi methodology; recommendations vary between two and four [10]. In this project, survey rounds for each domain were capped at three to avoid fatigue and limit project overrun. A total of six surveys was necessary, as the different domains emerged from iterative EP feedback, and staggering the domains reduced the load within single rounds. Limiting the survey rounds meant that not all competencies reached consensus; we denoted these as such in the final framework. Other competency developers using consensus methodologies should decide how to treat competencies not meeting consensus. Lowering the consensus threshold would improve consensus results, and thresholds as low as 51% have been used [30]; however, if the threshold is too low, EP morale may be threatened when there is little difference between the scores of competencies reaching and not reaching consensus [18]. Some items may never reach consensus, in which case the stability of responses should be considered [10].

In the initial rounds, we used binary scale questions, believing they would be less burdensome than multiple-point scales, and introduced 5-point scales in later rounds to provide more detail. This approach may not have been as burdensome as we thought, as questions with more options may require less psychological effort [31]. We also avoided involving the EP in complex exploratory processes to determine competency domains and lists [10] by presenting the EP with predeveloped draft competencies based on existing and relevant competency frameworks [32].
An EP consisting of industry stakeholders who understood the current and future roles of sonographic practice [10], and who would be affected by the introduction of a new competency framework, was essential to minimize the risk of potential biases [18]. Fortunately, because sonographers often work in settings with diverse case mixes and across multiple geographic areas and practice settings, the EP could be a smaller and more manageable group than initially estimated. The EP had a geographic distribution similar to national data. No national data were available on the distribution of clinical settings and roles, but anecdotal industry knowledge suggests the EP represented the profession well. Bias was also minimized by asking respondents to answer only questions within their expertise. This resulted in lower response rates for competencies in specialized areas; however, this corresponds with the smaller proportion of sonographers who work in those areas. A limitation of this research is the lack of representation of sonographer trainees, who may have provided additional insight into the practicality of adopting some components of the framework as 'core'.

One RT member undertook most of the work of analysing the data, preparing summary results, soliciting RT feedback, acting on that feedback and preparing the next survey within the 3-week interval between rounds. This was challenging, and the process could be strengthened by allocating tasks across more people [33]. Furthermore, the construction of the initial questionnaire by the RT, rather than beginning with suggestions from the EP, is a potential limitation. There is a risk in the Delphi methodology that anonymity leads to a lack of responsibility and accountability for responses [34]. EP members may also change their opinions to reflect the majority view, resulting in conformity rather than consensus [35]. Particular to this study was the risk of influence bias from the RT on the competencies at both the beginning and the end of the process [30]. Moreover, the results reflect the opinion of a group of experts at a single moment in time. Further work is required to assess the authenticity of the competencies and to validate them; nevertheless, they provide a base for future policy development and for driving future practice standards for sonographers.

This paper describes a Delphi competency project that identified an extensive collection of competencies in detailed and holistic formats, together with descriptions of the expected behaviours, skills and knowledge of sonographers across different levels of practice. Challenges and strategies in recruiting a representative EP, limiting EP fatigue, maintaining engagement and managing large amounts of data are discussed. The competencies will be useful in education, professional development and performance management.
References:
1. Dichotomy and dialogue in conceptualizations of competency in health professionals' education.
2. Developing clinical competency: experiences and perceptions of Advanced Midwifery Practitioners in training.
3. Standards for the Accreditation of Sonographer Courses.
4. Competency Standards for the Entry Level Sonographer.
5. Organisational and professional structures shaping the sonographer role in obstetrics.
6. Advanced cardiac sonographer: a reality at last.
7. Point of care ultrasound: a WFUMB position paper.
8. The application of artificial intelligence in the sonography profession: professional and educational considerations.
9. Identifying core competencies for practicing public health professionals: results from a Delphi exercise in Uttar Pradesh.
10. Improving the practical application of the Delphi method in group-based judgment: a six-step prescription for a well-founded and defensible process.
11. The Delphi technique in educational research.
12. Professional Competency Framework for Sonographers.
13. The Delphi technique: making sense of consensus.
14. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review.
15. LimeSurvey: an open source survey tool.
16. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies.
17. How to use the nominal group and Delphi techniques.
18. Consulting the oracle: ten lessons from using the Delphi technique in nursing research.
19. Medical Radiation Practice Board AHPRA. Professional Capabilities for Medical Radiation Practice.
20. Medical Radiation Technologist Board. Competence Standards for Medical Imaging and Radiation Therapy Practitioners in Aotearoa New Zealand (2018-Jul-V2-MRT).
21. Royal College of Physicians and Surgeons of Canada. Physician Competency Framework.
22. The five-stage model of adult skill acquisition.
23. The classification of educational objectives in the psychomotor domain.
24. A Taxonomy for Learning, Teaching, and Assessing: a Revision of Bloom's Taxonomy of Educational Objectives.
25. Understanding the Dreyfus model of skill acquisition to improve ultrasound training for obstetrics and gynaecology trainees.
26. Holistic competence and its assessment.
27. Developing and Writing Behavioral Objectives.
28. Delphi methodology in healthcare research: how to decide its appropriateness.
29. Higher number of items associated with significantly lower response rates in COS Delphi surveys.
30. Methodological and conceptual issues confronting a cross-country Delphi study of educational program evaluation.
31. Considerations in using the Delphi approach: design, questions and answers.
32. Doing competencies well: best practices in competency modeling.
33. Reflections on the application of the Delphi method: lessons from a case in public transport research.
34. Utilizing the Delphi survey approach: a review.
35. The effect of controlled opinion feedback on Delphi features: mixed messages from a real-world Delphi experiment.

Not applicable.

Funding: The authors disclose receipt of the following financial support for the research, authorship and/or publication of this article: this work was financially supported by the Australian Sonographer Accreditation Registry with a $20 000 grant.
Contributions: All listed authors (J.C., K.T., A.Q., B.O., C.E., P.S., P.L., S.M., D.S., A.C., L.T., J.L., K.P. and T.H.), as members of the research team, participated substantially in the conception and design of the work, the execution of the work, the analysis of the data and the contribution of methodological expertise. All authors also contributed to the writing of the manuscript and approved the final version.

Availability of data and material: Data containing the results of each Delphi round are available on reasonable request from the corresponding author.