key: cord-0955692-hqaevffm
authors: Ewell, Sharday N.; Josefson, Chloe C.; Ballen, Cissy J.
title: Why Did Students Report Lower Test Anxiety during the COVID-19 Pandemic?
date: 2022-04-05
journal: Journal of microbiology & biology education
DOI: 10.1128/jmbe.00282-21
sha: 1f9dbc69b8afd33f0c3a711cacd5a305ec3e2e42
doc_id: 955692
cord_uid: hqaevffm

Test anxiety is a common experience shared by college students and is typically investigated in the context of traditional, face-to-face courses. However, the onset of the COVID-19 pandemic resulted in the closure of universities, and many students had to rapidly shift to and balance the challenges of online learning. We investigated how the shift to online learning during the pandemic impacted trait (habitual) and state (momentary) test anxiety and whether there was variation across different demographic groups already vulnerable to performance gaps in science, technology, engineering, and mathematics (STEM) courses. Quantitative analyses revealed that trait and state test anxiety were lower in Spring 2020 (COVID semester) than in Spring 2019 and were higher overall in women than men. We did not find a difference in either trait or state anxiety in first-generation students or among persons excluded because of ethnicity or race. Qualitative analyses revealed that student priorities shifted away from coursework during Spring 2020. While students initially perceived the shift to online learning as beneficial, 1 month after the shift, students reported more difficulties studying and completing their coursework. Taken together, these results are the first to compare reports of test anxiety during a traditional, undisrupted semester to the semester where COVID-19 forced a sudden transition online.

Summative assessments are a common part of the college experience, but performance challenges can be associated with course failure and science, technology, engineering, and mathematics (STEM) attrition (1). As a result, students frequently cite tests as one of their greatest sources of anxiety (1). Test anxiety is defined by increased levels of discomfort and worry during an exam and results in a decline in academic performance that ultimately misrepresents student content knowledge (1, 2). Test anxiety is characterized by two components: cognitive (i.e., negative thoughts and worry) and affective-physiological (i.e., emotionality). The cognitive component refers to concerns that students may have about being evaluated in a testing situation, while the affective-physiological component refers to student perception of the physiological reactions (e.g., increased heart rate) that may occur as a result of the anxiety (3, 4). Test anxiety impacts all students regardless of background, and learners with high anxiety have been shown to have deficits in the encoding and retrieval of information (5). One model that may explain the relationship between test anxiety and performance is the skills deficit model. This model proposes that high test anxiety is the result of a deficit in learning acquisition due to poor study skills (3, 6). These poor study skills result in anxiety that leads to poor test performance because of interference or distractibility (5). After a test, students may attribute their failure to a lack of academic competence and eventually develop negative attitudes toward preparation for testing (5).
In contrast, others have put forth a course deficit model approach to considering the relationship between test anxiety and performance, which focuses on shortcomings of the assessment itself. From this perspective, attributes of the assessment create high anxiety, which impacts performance (7).

Previous literature across STEM fields repeatedly shows that test anxiety negatively impacts academic performance (8-12). On average, test anxiety levels vary by gender, with women consistently reporting higher levels than men (2, 3, 9, 13-15), even when performance outcomes are similar (13). Previous work demonstrates that in introductory biology (14, 16) and across other STEM disciplines (11, 17, 18), test anxiety has a stronger negative influence on exam performance for women. However, other research did not observe these patterns, concluding that the relationship between performance and test anxiety was unrelated to gender (11). To our knowledge, test anxiety in higher-education STEM among persons historically excluded from science due to ethnicity or race (PEERs) has not been studied as extensively. One study examined the relationship between test anxiety, incoming preparation, PEER status, and academic performance outcomes (19). Through mediation analyses, the authors showed that PEER status was related to students' incoming preparation, and both were associated with higher anxiety scores; these higher anxiety levels were in turn associated with lower exam scores. In another study, Salehi et al. showed that test anxiety and its impact on performance for PEERs varied by institutional context, painting a more complex picture of the relationship between affective traits and performance (20).

We focused on two different perceptions of test anxiety: trait test anxiety is a student's predisposition to feelings of apprehension, dread, and tension in a testing situation, whereas state test anxiety refers to momentary perceptions of anxiety in response to an immediate testing environment (11, 21, 22). Because these two constructs can lead to different results (20), measuring both allowed us to gain a more nuanced understanding of student testing experiences.

While few studies have investigated test anxiety in the context of online assessments, previous research has demonstrated that online testing does not induce additional anxiety or impact academic performance (23). Instead, students who typically experience higher levels of test anxiety in the classroom report lower levels of test anxiety in online environments (23, 24). However, it is important to note that previous research centers on students who chose to take online coursework and had more experience with online learning. In contrast, during the COVID-19 pandemic, students were forced to participate in online learning and often had little to no previous experience. Furthermore, many students also had to contend with the challenges presented by the pandemic (e.g., illness of a loved one and challenges associated with employment status or housing). These students may have been less proficient using the technology required for learning, felt uncomfortable preparing for exams or in their testing environments, and experienced increased test anxiety during this time (23, 25). To our knowledge, no studies have compared reports of student test anxiety from an undisrupted, traditional semester to the semester when COVID-19 forced a sudden transition online.
In this study, we examined the relationship between the rapid transition to emergency remote learning during Spring 2020 and student test anxiety. We hypothesized that differences in student experiences of anxiety related to how students prepared for exams, which was dramatically different in the COVID-19 semester. Specifically, this exploratory study asked the following questions:

1A. How did the transition to emergency remote learning impact trait and state test anxiety?
1B. Were there variations in anxiety across different demographic groups already vulnerable to performance gaps?
2. How did potential differences in student experiences of test anxiety relate to how students prepared for exams?

The data from our exploratory study provide insight into student experiences during this time and provide instructors with best practices for effective teaching online and during times of crisis.

Data for this study were gathered from two introductory biology courses with a total enrollment of 691 students across two class sections led by the same instructor during Spring 2019 (n = 264) and Spring 2020 (n = 427) at a research-intensive, land-grant university in the southeast region of the United States. This course focuses on developing student understanding of the evolution, classification, structure, and diversity of living organisms, with a specific emphasis on plants and animals. It is the second course in an introductory biology sequence, and students who take this course are generally life science majors. All procedures for this study were approved by the Auburn University Institutional Review Board.

Students enrolled in the Spring 2019 semester met in a traditional face-to-face format. Specifically, students met for two 75-min class periods per week in a large auditorium with seating for 270 students. Each class period included lecture, formative assessments (e.g., checks for understanding using iClicker questions), and group activities to reinforce concepts learned during the lecture. Furthermore, students completed three individual examinations and group examinations to assess content mastery.

Students enrolled in the Spring 2020 semester initially met in the traditional face-to-face format for the first 10 weeks of the semester. However, due to the COVID-19 pandemic, face-to-face class meetings were canceled, and class instruction shifted online for the remainder of the semester. The shift online occurred approximately 1 week before the second midterm exam. Because exam 1 was face to face and took place during what students experienced as a relatively "normal" semester, for the purposes of this research we focus on exam 2 and exam 3, both of which took place online, to assess the impacts of the pandemic on student test anxiety. During the period of online instruction, students learned from prerecorded lectures that were accessible through the university's learning management system (Canvas) and completed 10-question quizzes to review the concepts discussed during the prerecorded lectures. The same instructor developed course materials for both semesters and taught independently in Spring 2019; in Spring 2020, the instructor cotaught the course with another instructor. While the content from lectures was nearly identical and the exams covered the same concepts, students did not work in groups to complete the online quizzes, as they would have completed iClicker questions in the face-to-face class.
The online exams were administered differently from the face-to-face exams in several ways. For example, they were proctored using an online platform. While exams were timed across both semesters, because online students were spread across multiple time zones in Spring 2020, we allowed a 24-h window during which exams were to be completed. Students who required extra time on exams were provided this accommodation in Spring 2020 as they would have been during a traditional semester (26).

Most of the participants in this study were women and did not identify as a first-generation college student or as persons excluded because of their ethnicity or race (PEER) (Table 1). We had access to institutional data that included American College Test (ACT) and Scholastic Assessment Test (SAT) composite scores; for students who did not have an ACT composite score available, their SAT composite score was converted using the ACT/SAT concordance calculator available on the ACT website. For ease of interpretation during analyses, all raw scores were normalized. Additionally, high school grade point average (GPA) and anxiety constructs were normalized for the entire sample.

During the Spring 2019 and Spring 2020 semesters, instructors encouraged students to take a voluntary survey online via Qualtrics for a small amount of extra credit, which students received for clicking into the survey. Data included in this article were collected from a total of 417 students during Spring 2019 (n = 181) and Spring 2020 (n = 236). The survey included questions about their experience with test anxiety at five different time points: at the beginning of the semester, after each of three exams, and at the end of the semester. Students were included as participants in this study if they responded to the surveys at all five time points. Students were asked the following questions using a 1-to-7 Likert scale, where 1 represented "not true of me at all" and 7 represented "very true of me": (i) "I am so nervous during a test that I cannot remember facts I have learned"; (ii) "I have an uneasy, upset feeling when I take a test"; (iii) "I worry a great deal about tests"; and (iv) "When I take a test, I think about how poorly I am doing." This construct is part of the Motivated Strategies for Learning Questionnaire (MSLQ) and has been validated on several student populations (13, 14, 16, 27-29).

We focused on trait (habitual) anxiety and state (momentary) anxiety in biology (11, 22). To measure trait anxiety, at the beginning and end of the semester we specified in the survey instructions that the test anxiety items were meant to reflect how students felt about exams in general; to measure state anxiety, after each exam we asked students to gauge the extent to which they felt anxiety about the exam they had just completed. We confirmed that the measures from the survey represented the intended construct of test anxiety through confirmatory factor analysis (CFA), following the work of Knekta et al. (30).

In Spring 2020, the survey also included an open-ended question that asked students to discuss the impact of COVID-19 on their study habits. While this did not directly ask students about test anxiety, we hypothesized that student descriptions of study habits would illuminate why they did or did not experience anxiety during the exams.
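As a concrete illustration of the score handling and confirmatory factor analysis described above, the following is a minimal sketch in R, the analysis environment named in the Methods; it is not the authors' code. The data frame `survey`, its column names (act, sat, hs_gpa, ta1 through ta4), and the linear sat_to_act() helper standing in for the ACT/SAT concordance calculator are hypothetical placeholders.

```r
library(lavaan)  # confirmatory factor analysis

# Hypothetical data frame `survey` with columns: act, sat, hs_gpa, ta1..ta4 (1-7 Likert items).

# Crude linear stand-in for the ACT/SAT concordance calculator used in the study;
# the real conversion is a published lookup table, not a linear rule.
sat_to_act <- function(sat) round(approx(x = c(590, 1600), y = c(9, 36), xout = sat)$y)

# Use the ACT composite where available; otherwise convert the SAT composite.
survey$act_equiv <- ifelse(is.na(survey$act), sat_to_act(survey$sat), survey$act)

# z-normalize incoming preparation, high school GPA, and the anxiety construct
# (mean of the four MSLQ items) across the whole sample.
survey$act_norm     <- as.numeric(scale(survey$act_equiv))
survey$gpa_norm     <- as.numeric(scale(survey$hs_gpa))
survey$anxiety      <- rowMeans(survey[, c("ta1", "ta2", "ta3", "ta4")])
survey$anxiety_norm <- as.numeric(scale(survey$anxiety))

# One-factor CFA over the four test anxiety items; std.lv = TRUE standardizes
# the latent factor (mean 0, variance 1).
cfa_model <- 'test_anxiety =~ ta1 + ta2 + ta3 + ta4'
cfa_fit   <- cfa(cfa_model, data = survey, std.lv = TRUE)
fitMeasures(cfa_fit, c("chisq", "df", "pvalue", "cfi", "rmsea", "srmr"))
standardizedSolution(cfa_fit)  # inspect standardized factor loadings
```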
This survey question was previously analyzed for another project (31), but we reanalyzed these data and created new codes for the purposes of our research questions related to test anxiety. This question allowed us to investigate potential changes in testing experiences for students. We found that 80% of students responded to the question following exam 2 and 71% of students responded following exam 3. We analyzed students' responses to this question to identify factors that might explain potential changes in test anxiety within the semester, coding the open-ended responses following established qualitative coding methods (32). The responses were assigned as many codes as fit, meaning that a single student's response might generate one or more codes. One of the authors (S.N.E.) and an undergraduate research assistant coded through first- and second-cycle analyses. Briefly, the coders conducted initial coding independently and then met to code to consensus, meaning that the two coders agreed on code assignments for all responses. Twelve percent of the responses were not specific enough to fit any of the codes (e.g., "a lot" or "very much") and were not considered in the analysis.

We used mixed-model regression analyses in R (RStudio v. 1.2) (33) with the statistical package lme4 (34) to examine the impact of COVID semester, exam number, and gender on student test anxiety following exams. We used the variables semester, exam, and gender as fixed effects and student identifier (ID), normalized ACT or ACT-equivalent score, and precourse anxiety score as random effects. We compared Akaike information criterion (AIC) values to determine the best-fit model (see Table S1 in the supplemental material). Additionally, we used mixed-model regression analyses to examine whether there was a statistical difference in performance (i.e., final grade for the class and averaged grades for exams 2 and 3) between the two semesters, using incoming preparation (normalized ACT or ACT-equivalent score) as a random effect.

We specified the CFA using a one-factor model with four items as described in Methods. The specified CFA demonstrated close model fit (χ² = 6.126, df = 2, P = 0.047, comparative fit index [CFI] = 0.998, root mean square error of approximation [RMSEA] = 0.034, and standardized root mean square residual [SRMR] = 0.007). Together, these parameters indicate that the model fit was acceptable but not excellent. The latent factor was constrained to have a mean of 0 and a variance of 1 in our model so that it was standardized. Factor loadings were greater than 0.70 for all four items; for an item with a factor loading of 0.70, a 1 standard deviation increase in the theorized factor corresponds to a 0.70 standard deviation increase in the survey item.

1A. How did the transition to emergency remote learning impact trait and state test anxiety? 1B. Were there variations in anxiety across different demographic groups already vulnerable to performance gaps?

To address our first two research questions, we used mixed-model regression analysis performed in R. Our analysis revealed lower postcourse (trait) anxiety in the COVID semester (Spring 2020) relative to the traditional face-to-face semester (Spring 2019) [β = −0.19, t(362) = −1.96, P = 0.05] (Fig. 1A). We also found that trait anxiety in both semesters was significantly higher in women than in men [β = 0.484, t(363) = 3.59, P < 0.001]. We did not find a significant semester-by-gender interaction.
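To make the mixed-model specifications described in the statistical methods above concrete, here is a minimal lme4 sketch of the kinds of models involved; it is not the authors' exact code. The data frames (postexam, students) and column names (anxiety_norm, semester, gender, exam, student_id, act_norm, pre_anxiety, final_grade, exam23_avg) are hypothetical, and the random-effects structure shown is a literal reading of the Methods wording; the authors' actual specification and model-selection path (Table S1) may have differed.

```r
library(lme4)      # mixed-effects models
library(lmerTest)  # adds Satterthwaite degrees of freedom and p-values to lmer summaries

# Postexam (state) anxiety: semester, exam number, and gender as fixed effects;
# student ID, normalized ACT/ACT-equivalent score, and precourse anxiety as
# random intercepts (a literal reading of the Methods). REML = FALSE so that
# AIC values are comparable across models with different fixed effects.
m_full  <- lmer(anxiety_norm ~ semester * gender + exam +
                  (1 | student_id) + (1 | act_norm) + (1 | pre_anxiety),
                data = postexam, REML = FALSE)
m_noint <- update(m_full, . ~ . - semester:gender)  # drop the semester-by-gender interaction
AIC(m_full, m_noint)                                # compare candidate models
summary(m_full)                                     # coefficients with t and P values

# Performance comparison: final course grade (or the exam 2/3 average) by semester,
# with normalized incoming preparation as a random intercept.
m_grade <- lmer(final_grade ~ semester + (1 | act_norm), data = students)
m_exams <- lmer(exam23_avg  ~ semester + (1 | act_norm), data = students)
summary(m_grade)
summary(m_exams)
```

In this framing, the coefficient on semester estimates the Spring 2020 versus Spring 2019 difference after accounting for the grouping terms; the standard deviations reported for the random intercepts indicate how much variation is attributable to student and incoming-preparation differences.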
Our model for trait anxiety was not significantly improved by including first-generation or PEER status, and thus, these variables were left out of the final model. However, we found that postexam (state) anxiety was significantly lower in Spring 2020 than in Spring 2019 [β = −0.38, t(499) = −3.58, P < 0.001] (Fig. 1B). For both semesters, postexam anxiety was lower after the third exam than after the second exam [β = −0.33, t(393) = −2.29, P = 0.022], and there was a semester-by-exam interaction such that students reported relatively more anxiety after exam 3 in the Spring 2020 semester.

To examine whether our results might simply be due to more lenient grading during the COVID semester, we compared student performance across the two semesters using final grades and average exam scores. When we analyzed student performance using final grades, we found no significant difference between the two semesters [β = −1.48, t(415) = −1.88, P = 0.062]. However, when controlling for incoming preparation (i.e., normalized ACT or ACT-equivalent score), there was a difference between the semesters such that students in the Spring 2020 semester had a small but significant decrease in their final grade, amounting to 1.72% (on a 100-point scale) less than the final grades in Spring 2019 [β = −1.72, t(394.6) = −2.18, P = 0.030]. We also found a marginal difference in the averaged exam scores for exams 2 and 3 between the two semesters (β = 2.43, t = 1.98, P = 0.059) but not when controlling for incoming preparation [β = 2.25, t(396) = 1.82, P = 0.069].

2. How did potential differences in student experiences of test anxiety relate to how students prepared for exams?

The results from our quantitative analysis led us to question why we observed, paradoxically, less trait and state test anxiety during the semester interrupted by a global pandemic and a transition to emergency remote instruction. We hypothesized that this related to how students prepared for exams, which was dramatically different in the COVID-19 semester. Students completed exam 2 one week after the university's transition to emergency remote learning at the onset of the COVID-19 pandemic and exam 3 one month after the transition online. After both exams, we qualitatively analyzed responses to the survey question "to what extent did the Coronavirus disease impact your study habits?" Six categories related to factors that impacted test anxiety emerged: (i) difficulty maintaining attention, (ii) inability to use academic supports, (iii) difficulty constructing meaning, (iv) competing/shifting priorities, (v) difficulty organizing academic tasks, and (vi) no change/limited impact/more time to prepare (Table 2).

For exam 2, 85 students indicated that they had competing/shifting priorities that took their focus off coursework, while 72 students viewed the transition to online learning as beneficial. Interestingly, while students expressed that they were unable to access academic supports (76 students) and had difficulty maintaining attention (51 students), few students expressed difficulty with completing academic tasks (24 students), lack of motivation (17 students), and difficulty with constructing meaning (11 students) (Fig. 2). Students completed exam 3 one month after the university's transition to emergency remote learning. Unlike the responses observed for exam 2, fewer students viewed the shift to emergency remote learning as beneficial (50 students) and/or had competing priorities that distracted them from classwork (39 students).
Instead, the majority of the survey responses indicated that students experienced difficulty maintaining attention (75 students), were unable to use previously used academic supports (46 students), had difficulty constructing meaning (43 students), had difficulty organizing academic tasks (35 students), and experienced lack of motivation (30 students) (Fig. 2).

The transition to emergency remote learning due to COVID-19 has been described by students as an unpleasant experience because of difficulties learning in an online environment, with students in STEM classes reporting decreases in motivation and engagement (35, 36). However, despite these difficulties, some students experienced increases in academic performance (37, 38). The aim of this study was to investigate the factors that contributed to potential changes in testing experiences for students during the transition to emergency remote learning in the COVID-19 semester. Specifically, we investigated the relationship between the rapid transition to emergency remote learning and test anxiety. We found that during the transition to emergency remote learning, students reported lower trait and state test anxiety compared to the previous semester. Our qualitative analyses revealed that students initially perceived the shift to online learning as beneficial to their learning. However, 1 month after the shift, fewer students considered it beneficial, and they reported more difficulties preparing for exams.

Test anxiety is context specific and commonly experienced by college students. Experiences can be described as trait anxiety (i.e., baseline anxiety, where the effects of test anxiety are more severe) or state anxiety (i.e., anxiety triggered by the testing event) (5). Previous work showed that higher rates of anxiety during the COVID-19 pandemic were the result of the transition to online classes and of difficulty with online learning (36, 39, 40). In biology courses, while instructors attempted to ease this transition by implementing increased flexibility with students (e.g., use of extended deadlines and open-book exams), students reported negative impacts on their learning (41, 42). Specifically, students reported challenges with focus and with understanding/remembering content (41).

Interestingly, our results indicate that during the Spring 2020 shift to emergency remote learning, students did not experience increased trait or state test anxiety compared to the previous face-to-face semester. This finding may be due to students perceiving that, initially, the shift to online learning was beneficial. Online learning allowed students to participate in activities in various locations at times that were convenient for them (43-46). Specifically, students commonly stated that the shift provided them with more time to prepare for the second exam, which took place shortly after the transition online during the COVID-19 semester, and the third exam, which took place 1 month after the transition online. Furthermore, students reported that the availability of recorded lectures during the third exam aided their ability to adequately prepare for the exam. This supports the course deficit model, in which classroom practices are primarily responsible for shifting student perceptions, leading to enhanced or reduced performance outcomes. During the COVID-19 semester, why did we observe heightened test anxiety over time?
Our results showed that during the COVID-19 semester, students felt lower test anxiety after the second exam (1 week into the emergency transition online) and higher test anxiety after the third exam (1 month after the emergency transition online). We explain our results in the context of two non-mutually exclusive concepts: (i) competence beliefs and (ii) competing priorities.

Test anxiety can be impacted by situational factors surrounding the exam, such as students' perceptions of academic competence, or their level of confidence in their knowledge of the content (5, 47). Previous work relates academic competence to test anxiety, with lower perceptions of competence predicting higher levels of test anxiety and higher competence beliefs providing a buffer against test anxiety (5, 47, 48). Because the second exam occurred immediately after the shift online, students had received the course content in the traditional format, which could have bolstered students' perceptions of preparedness for the exam and therefore lowered their state anxiety. Prior to the third exam, students received all the course content virtually, which may have reduced competence beliefs, leading to the observed increase in anxiety relative to the second exam.

FIG 2. Impact of COVID-19 on student study habits. Students were asked to respond to the following prompt: "to what extent did the Coronavirus disease impact your study habits?" Student responses are ordered by frequency of student reporting.

Our results also showed that immediately after the third exam, fewer students considered the online format beneficial, while more students had difficulties with constructing meaning, organization, focus, and motivation. The skills deficit model of test anxiety may explain the higher reports of test anxiety after the third exam, where relatively high test anxiety resulted from deficits in learning caused by poor study skills during prolonged quarantine and the unfamiliar online learning environment (3, 6). The course deficit model also explains these findings, with the changes in content delivery influencing student perceptions and anxiety surrounding the test.

While students in online learning environments are expected to manage the expectations of work, school, and home, the abrupt shift to emergency remote learning during the COVID-19 pandemic was a significant disruption to students' lives that had the potential to impact their learning (49-51). In addition to learning how to navigate an online learning environment, some students had to contend with housing and food insecurity due to the closure of on-campus accommodations, volatile home environments, discrimination and xenophobia due to COVID-19, and financial insecurity (36, 49, 52). Furthermore, in addition to schoolwork, some students had expanded caregiving roles and were tasked with caring for children or siblings due to school closures, tending to ill family members, and caring for elders. Some students became ill themselves (49). Previous studies have demonstrated that additional priorities, such as the ones previously mentioned, are barriers that limit student academic engagement in introductory STEM courses (53). Taken together, this suggests that immediately following the transition to emergency remote learning, students shifted their attention from exams to more pressing concerns, which was reflected in lower reported test anxiety after exam 2.
However, 1 month after the transition online, after students completed the third exam, fewer students reported competing priorities that shifted their attention. At that point, they had spent a month learning online and attempting to focus on coursework. However, students still struggled with motivation and focus on studying during this time (31). We found that a month after the transition online, fewer students considered the online format to be beneficial, and they had difficulties with constructing meaning from the posted content, organization, focus, and motivation. All of these contribute to lower perceptions of academic competence. While some consider this generation of students "digital natives" because of their use of digital technology for communication activities (e.g., social media) from an early age, students often have no prior experience with online learning and lack the skills necessary to successfully navigate and remain engaged in online learning environments (36, 54-56). Specifically, students lack competency in interacting with the learning content and constructing meaning (36, 55). Our data demonstrate that a month after shifting to online learning, these competencies not only remained underdeveloped in students but may also have contributed to student test anxiety. Consequently, it is important for educators to provide students with additional support in constructing meaning and identifying concepts to prepare for tests. We present a few ideas that may help instructors foster student perceptions of academic competence.

Incorporate online formative assessments. Online formative assessments are essential for gaining, refocusing, and extending student attention following STEM lectures, as well as for providing students with immediate feedback that they can use to modify their learning (57). To help students develop effective self-assessment skills, instructors are encouraged to use their institution's learning management system to create online assessments that students can access. Additionally, instructors can use websites such as Kahoot!, Quizizz, and EdPuzzle to add elements of game playing (e.g., point scoring and competition with others) to online formative assessments and encourage student engagement.

Provide high-quality feedback. In online learning environments, providing regular feedback becomes a crucial element of the learning process, as it allows students to identify gaps in knowledge and assess their learning progress (58). To promote the development of academic competence, instructors should not only provide timely feedback on course assignments but also include content-related information in their feedback and recommend effective study or learning strategies (58).

There are several limitations that can impact the interpretation of findings in the present study. First, we acknowledge that we collected data over the course of two semesters in two sections of introductory biology at a single institution. Students who had a different instructor, took a different biology course, or were enrolled at another institution may have experienced COVID-19 differently than our student population. The present study recruited students from a research-intensive, public university located in the southeastern United States. Taken together, these findings may not be applicable to all student populations, and future research should include a wider range of colleges and universities. For example, participants were mostly women and identified as non-PEERs.
Furthermore, few students identified as first-generation college attendees. COVID-19 presented significant challenges for first-generation students, including financial hardship, food and housing insecurity, and challenges adapting to online learning (59). Furthermore, during this time, PEERs also had to contend with structural racism, discrimination, and xenophobia (49). As previously mentioned, these are all barriers that can potentially impact test anxiety. Additionally, within our student population, we may have introduced bias through selection effects, because a subset (rather than the whole) of the student population responded to our survey. We note, however, that student participation was relatively high across both semesters. Finally, without controlling for incoming preparation, we found marginally significant differences in some metrics of student performance between semesters. We cannot rule out the possibility that lower anxiety during the COVID-19 semester was the result of higher grades on exams. However, the grade differences were relatively small (approximately 2 to 3 percentage points), suggesting that other variables may better explain the observed differences in test anxiety.

Despite these caveats, this study provides insights into student experiences during the COVID-19 pandemic and gives instructors insight into how best to support students when teaching during times of crisis. Regardless of the learning environment, effective science instruction increases content knowledge and develops the metacognitive learning skills necessary for success at higher levels of science (60). Our results demonstrate that while students reported decreased anxiety in the online environment and viewed online learning as generally beneficial in terms of flexibility and accessibility, they also found it challenging to interact with the learning content and construct meaning, which may have contributed to test anxiety. As students continue to take online courses due to the unpredictable nature of the pandemic, instructors must revisit their perception of students as "digital natives" who are adept at navigating the online learning environment. Instead, instructors should be cognizant of the challenges students encounter with online learning and take an active role in helping students develop the skills needed to construct their conceptual knowledge and online learning procedures.

Supplemental material is available online only. SUPPLEMENTAL FILE 1, PDF file, 0.1 MB.

We thank the Biology Education Research lab at Auburn University for their support. We also thank Taylor Gusler for their assistance with the qualitative analysis of the data. C.C.J. was supported by NSF grant number 1097314 during this work. Any opinions, findings, conclusions, and recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

We declare that there are no conflicts of interest. The authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers' bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent licensing arrangements) or nonfinancial interest (such as personal or professional relationships, affiliations, knowledge, or beliefs) in the subject matter or materials discussed in this manuscript.
1. Perceptions and incidence of test anxiety
2. Using collaborative two-stage examinations to address test anxiety in a large enrollment gateway course
3. Effect of study habits on test anxiety and academic achievement of undergraduate students
4. Test anxiety and physiological arousal: a systematic review and meta-analysis
5. Test anxiety and metacognitive performance in the classroom
6. Test anxiety versus academic skills: a comparison of two alternative models for predicting performance in a statistics exam
7. Can mixed assessment methods make biology classes more equitable
8. Relationship of test anxiety with students' achievement in science
9. Test anxiety and academic performance in undergraduate and graduate students
10. On the relationship between test anxiety and academic performance
11. Can test anxiety interventions alleviate a gender gap in an undergraduate STEM course?
12. Test anxiety among nursing students: a systematic review
13. Gender performance gaps across different assessment methods and the underlying mechanisms: the case of incoming preparation and test anxiety
14. Exams disadvantage women in introductory biology
15. Cognitive test anxiety and academic performance
16. Gender gaps in the performance of Norwegian biology students: the roles of test anxiety and science confidence
17. Demographic gaps or preparation gaps?: the large impact of incoming preparation on performance of students in introductory physics
18. A multivariate investigation of the differences in mathematics anxiety
19. Variation in incoming academic preparation: consequences for minority and first-generation students
20. Context matters: social psychological factors that underlie academic performance across seven institutions
21. State and trait anxiety revisited
22. Do girls really experience more anxiety in mathematics?
23. The effects of online formative and summative assessment on test anxiety and performance
24. Effects of online testing on student exam performance and test anxiety
25. 21st century assessment: online proctoring, test anxiety, and student performance
26. COVID-19 and undergraduates with disabilities: challenges resulting from the rapid transition to online course delivery for students with disabilities in undergraduate STEM at large-enrollment institutions
27. Reliability and predictive validity of the motivated strategies for learning questionnaire (MSLQ)
28. Effects of test anxiety on engineering students' STEM success. 2020 ASEE Virtual Annual Conference Content Access
29. Validating and adapting the motivated strategies for learning questionnaire (MSLQ) for STEM courses at an HBCU. AERA Open 4
30. One size doesn't fit all: using factor analysis to gather validity evidence when using surveys in your research
31. Learning principles of evolution during a crisis: an exploratory analysis of student barriers one week and one month into the COVID-19 pandemic
32. The coding manual for qualitative researchers
33. R: a language and environment for statistical computing. R Foundation for Statistical Computing
34. Fitting linear mixed-effects models using lme4
35. Student engagement declines in STEM undergraduates during COVID-19-driven remote learning
36. College students' use and acceptance of emergency online learning due to COVID-19
37. Emergency remote teaching and students' academic performance in higher education during the COVID-19 pandemic: a case study
38. Influence of COVID-19 confinement on students' performance in higher education
39. The Covid-19 pandemic and mental health of first-year college students: examining the effect of Covid-19 stressors using longitudinal data
40. Effects of COVID-19 on college students' mental health in the United States: interview survey study
41. Undergraduate biology students received higher grades during COVID-19 but perceived negative effects on learning
42. Lessons learned through listening to biology students during a transition to online learning in the wake of the COVID-19 pandemic
43. Virtual classrooms: how online college courses affect student success
44. A randomized assessment of online learning
45. Performance gaps between online and face-to-face courses: differences across types of students and academic subject areas
46. Online and hybrid course enrollment and performance in Washington state community and technical colleges. CCRC working paper no. 31
47. Personal and situational predictors of test anxiety of students in post-compulsory education
48. Self-perceived competence and test anxiety: the role of academic self-concept and self-efficacy
49. More than inconvenienced: the unique needs of U.S. college students during the COVID-19 pandemic
50. Integrating work-life balance with 24/7 information and communication technologies: the experience of adult students with online learning
51. Student views of the online learning process during the COVID-19 pandemic: a comparison of upper-level and entry-level undergraduate perspectives
52. Concerns of college students during the COVID-19 pandemic: thematic perspectives from the United States, Asia, and Europe
53. External and internal barriers to studying can affect student success and retention in a diverse classroom
54. Online learning: a panacea in the time of COVID-19 crisis
55. Student preparedness for university e-learning environments
56. Are digital natives a myth or reality? University students' use of digital technologies
57. Exploring design elements for online STEM courses: active learning, engagement & assessment design
58. Automatic feedback in online learning environments: a systematic literature review
59. First-generation students' experiences during the COVID-19 pandemic
60. Promoting self-regulation in science education: metacognition as part of a broader perspective on learning