Fostering the Assessment Processes of Academic Programs in Higher Education During COVID-19: An Example from James Madison University

Caroline O. Prendergast, Paulius Satkus, Sarah Alahmadi, and Yu Bao

Assessment Update, April 4, 2022. DOI: 10.1002/au.30291

Assessment at JMU is overseen by the Center for Assessment and Research Studies (CARS). Within CARS, a faculty member leads a team of graduate students specifically tasked with assisting academic programs in their assessment responsibilities. This team, called Program Assessment Support Services (PASS), offers free consultations to academic programs, conducts data analysis, and oversees the collection and review of annual assessment reports. PASS does not collect data for programs, nor does the team determine which measures should be used. Instead, each academic program has an assessment coordinator who directs the program's assessment process (often in consultation with PASS).

To tailor our support services in response to the pandemic, the PASS team sought extensive feedback from the assessment coordinators after JMU announced its transition to remote learning in March 2020. Recognizing the difficulties assessment coordinators would likely encounter, the PASS team responded before the changes took full effect: we administered several surveys to the assessment coordinators to determine their needs, the barriers they expected to face in conducting their assessment, and their recommendations for altering the typical assessment procedures and timelines. Sixty-six percent of the responding assessment coordinators expressed a preference for postponing the annual assessment report submission date, and 61% reported apprehension that decreased assessment quality would damage the evaluations of their processes. Specifically, respondents identified concerns about delays in administering assessment instruments and collecting data, abrupt changes to the mode of assessment (e.g., unplanned shifts to remote or web-based assessment), decreased data quality, and increased time and resource constraints due to the sudden switch to emergency remote teaching. Upon approval from the provost, we used this information to implement a modified assessment reporting process for assessment coordinators whose regular assessment processes were infeasible during the pandemic.

In a typical year, all academic programs at JMU submit a report by June 1 detailing their assessment procedures, findings, and plans for future assessment. The reports typically summarize assessment activities in the preceding academic year, although some programs instead use a calendar year-based assessment timeline. In the weeks following the report submissions, the reports are evaluated by trained faculty and staff raters as well as PASS consultants, who provide numerical ratings and qualitative feedback on 14 rubric elements. The rubric used to evaluate assessment reports at JMU is freely available on the Center for Assessment and Research Studies website (https://bit.ly/3HYaSYr). After an extensive quality control process, these reviews are released to the assessment coordinators on October 1. Afterward, the assessment cycle begins anew.
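For readers who track such ratings programmatically, the minimal Python sketch below models one review of one report. It is purely illustrative and not part of JMU's actual tooling: the ElementRating and ReportReview classes, the element names, and the scores are all hypothetical, and the sketch assumes only what the text above states, namely that each of the 14 rubric elements receives a numerical rating and a qualitative comment.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class ElementRating:
    element: str   # one of the 14 rubric elements (names here are invented)
    score: int     # numerical rating assigned by a trained rater
    comment: str   # qualitative feedback returned to the program

@dataclass
class ReportReview:
    program: str
    ratings: list[ElementRating] = field(default_factory=list)

    def overall_score(self) -> float:
        """Average numerical rating across the rated rubric elements."""
        return mean(r.score for r in self.ratings)

# Hypothetical usage: two of the 14 elements, with invented names and scores.
review = ReportReview(program="Example B.S.")
review.ratings.append(ElementRating("Objectives", 3, "Objectives are measurable."))
review.ratings.append(ElementRating("Use of Results", 2, "Describe planned changes."))
print(review.overall_score())  # 2.5
```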
This meta-assessment process has been the subject of over a decade of research (Fulcher and Orem 2010; Rodgers, Grays, Fulcher, and Jurich 2012), and it has been adopted by multiple universities (e.g., Auburn University, the University of North Florida, the University of Idaho, and the University of South Alabama). Academic programs are provided with extensive resources to complete their assessment reports, including on-demand consultations with the PASS team, sample exemplary assessment reports from previous years, and detailed explanations of the rating process and criteria. Additionally, CARS offers various professional development opportunities every academic year, including a week-long assessment crash course, workshops for both new and returning assessment coordinators, and assessment report rater training. Participation in assessment report rating, in particular, both provides quality feedback to the programs submitting reports and develops the assessment skills and knowledge of the faculty who serve as raters.

After receiving feedback from assessment coordinators and other stakeholders within the university, we made two major adjustments to the annual reporting process. The first adjustment was to offer programs a choice between two postponed due dates for submitting the assessment report: August 1 and October 15 (instead of the standard June 1 deadline). Programs that submitted the report in August received feedback in early October (close to the typical feedback date), whereas programs that submitted the report in mid-October received feedback in mid-December. Although receiving feedback later in the year may have prevented some programs from incorporating it into their 2020-2021 academic year assessment processes, delaying the due dates allowed crucial additional time for programs that were unable to meet the standard deadline. Of the 111 programs required to submit reports in 2020, 43 met the first due date, 66 met the second, and only two met neither.
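As a quick arithmetic check on these counts (and a reusable pattern for summarizing similar tallies), the snippet below computes each group's share of the 111 reports. This is simple illustrative Python, not output from any JMU system.

```python
# Submission counts reported above for the 111 programs in 2020.
submissions = {"August 1 deadline": 43, "October 15 deadline": 66, "neither": 2}
total = sum(submissions.values())  # 43 + 66 + 2 = 111
for group, count in submissions.items():
    print(f"{group}: {count} programs ({count / total:.1%})")
# August 1 deadline: 43 programs (38.7%)
# October 15 deadline: 66 programs (59.5%)
# neither: 2 programs (1.8%)
```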
Additionally, we offered programs their choice of two feedback types. The formal feedback option retained the typical, highly structured feedback, including the quantitative ratings and qualitative comments provided in a typical year. The second option, informal feedback, provided a less structured feedback report that omitted the quantitative ratings but included a brief summary of the assessment strategy and targeted suggestions for improving the assessment process. We developed the informal option specifically to accommodate pandemic-related disruptions: it was intended for programs that were severely affected by the pandemic and were concerned that they would be unable to meet the typical reporting standards. Programs pursuing this option received briefer feedback that took the impact of the pandemic into account. This option reduced the time spent both producing the reports (for the program) and rating the reports (for the PASS team) while easing fears about the repercussions of disrupted assessment during the pandemic.

Regardless of the deadline and feedback type selected by a program, we encouraged all assessment coordinators to provide a narrative of the impact of the pandemic on their assessment process. A review of these narratives provides an overview of common disruptions, which we have used to provide targeted support as the pandemic continues. Multiple programs reported canceling signature assessments, such as poster presentation sessions, due to the abrupt shift away from campus-based learning. Other programs indicated low response rates to assessments, as contacting students became more difficult without face-to-face meetings. Finally, many programs noted that faculty were overextended and overwhelmed because of the pandemic, necessarily reducing the person-hours available to devote to assessment work. Notably, a handful of programs reported that the pandemic did not disrupt their assessment procedures at all.

Responses to these barriers provide insights that will be useful long after the pandemic ends. First, it is imperative to consistently communicate expectations for assessment coordinators, as well as the resources available to aid programs in meeting those expectations. At our university, assessment coordinators may change every few years for a given program. What is known to one assessment coordinator may not be known to their successor unless continuous communication is prioritized. Moreover, maintaining such communication remotely is especially challenging. We interacted with assessment coordinators via various means, including emails, virtual meetings, and surveys. We also organized the annual New Assessment Coordinator Orientation to smooth the transition into the new role.

Second, transparency between all parties is critical to quality assessment.
JMU's criteria for evaluating assessment reports focus on the quality of the assessment and the thoroughness of its documentation rather than the degree of learning identified by a program's measurement strategy. However, if an assessment coordinator expects to be "judged" for a dip in student scores, additional stress is added to an already stressful assessment cycle. As another example, many programs experienced steep declines in response rates; when these limitations were well documented and considered in the interpretation of assessment results, they did not lead to declines in report ratings. However, when these rating practices are not communicated, or when the gap between report writing and the release of report feedback is long, there is ample opportunity for stress to build unnecessarily among the assessment coordinators. In years when flexibility is paramount (as it has been since the spring of 2020), it is important to communicate that flexibility to the assessment coordinators to prevent undue pressure.

Third, this experience has underscored the importance of a proactive approach to problem-solving. As assessment professionals, we are well versed in options for adapting measures, platforms for test administration, and balancing the constraints and requirements of real-world data collection. Proactively offering consultations about navigating the move to remote assessment, assistance with migrating formerly paper-based tests to an online environment, recommendations for internally developed online resources, and other useful services can build rapport with assessment coordinators while saving precious time, a scarce commodity during a pandemic. While anticipating and quickly responding to our colleagues' needs was crucial in the past year, doing so will continue to be an important part of our role as we return to some semblance of normalcy. We will benefit from scanning the horizon for potential disruptions and developing resources and solutions before they are needed.

Finally, we must continue to balance the need for ongoing, high-quality assessment with the need to maintain strong relationships with our colleagues across campus. Each of the assessment coordinators is a person undergoing a time of severe stress, uncertainty, and upheaval, as we all are. It is important, now and always, to acknowledge the human (or team of humans) represented by each report. Although we do not wish to portray assessment as an optional chore that can be cast aside during difficult times, we can and should search for ways to make the task more meaningful and less taxing when resources, financial and otherwise, are thin.

As we begin the process of returning to our typical assessment timelines, we hope to carry lessons from the pandemic forward to foster more efficient and resilient assessment systems. As noted above, consistent communication with assessment coordinators and other faculty is key to ensuring that needs are acknowledged and met. For our team, this has meant administering surveys to gather information about the roadblocks to returning to "normal" assessment practices and using the responses to tailor resources for academic programs. The disruptions wrought by the pandemic may also offer opportunities for reflection on existing assessment processes. Which strategies are useful? Which approaches are not providing high-quality, actionable information? What components of the assessment plan can be streamlined? Which measures could be improved?
These questions can guide the transition back to a typical assessment timeline by providing new insight into the quality and utility of assessment approaches. Finally, as we return to our typical assessment process, we hope to see more programs using assessment to improve student learning. Frustrations with assessment frequently stem from the perception that assessment data are not useful, or that the time devoted to collecting, scoring, analyzing, and interpreting student work is not well spent. Shifting to a learning improvement model, in which assessment is a single tool in a larger effort to improve students' knowledge and skills across an entire academic program, can help to refocus assessment efforts and maximize their utility. ■

References

Banta, T. W. 2002. Building a Scholarship of Assessment. San Francisco: Jossey-Bass.

Encyclopedia of International Higher Education Systems and Institutions.

New Forums Press and the Professional and Organizational Development Network in Higher Education.

James Madison University: Assessing and Planning During a Pandemic.

Fulcher, K. H., and C. D. Orem. 2010. "Evolving from Quantity to Quality: A New Yardstick for Assessment." Research & Practice in Assessment 5.

Rodgers, M., M. P. Grays, K. H. Fulcher, and D. P. Jurich. 2012. "Improving Academic Program Assessment: A Mixed Methods Study." Innovative Higher Education.