title: Student and instructor perceptions of engagement after the rapid online transition of teaching due to COVID-19
authors: Walker, Kristen A.; Koralesky, Katherine E.
date: 2021-02-07
DOI: 10.1002/nse2.20038

Engagement involves students' investment in learning activities, as well as interrelated affective (emotive responses), behavioral (active responses), and cognitive (mental effort) components. This study assessed undergraduate student and instructor perceptions of the interrelated components of engagement during and after the rapid online transition of teaching in March 2020 due to the COVID-19 pandemic. Fifteen courses (including laboratory, discussion-based, large lecture, tutorial, and problem-based learning) within a multi-disciplinary faculty at a large research-intensive Canadian university were surveyed to: (a) assess student and instructor perceptions of students' levels of engagement during and after the rapid transition to online teaching due to the COVID-19 pandemic; (b) describe which aspects of engagement were enhanced or diminished due to the rapid online transition; and (c) identify which learning activities students would find most engaging in an online setting so as to assist in developing student-centered online pedagogical techniques. Student engagement was lower after the rapid online transition. Students who engaged by connecting with peers and instructors through in-class discussion (affective engagement) had diminished engagement, whereas students who engaged by listening to lectures, reading course materials, and reviewing slides (cognitive engagement) had enhanced engagement. Overall, students found synchronous activities more engaging. Students experienced positive and negative outcomes related to classroom engagement when transitioning rapidly to online learning during a global pandemic.

In March 2020, due to the global pandemic and the response to the spread of the coronavirus disease (COVID-19) caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), instructors at the University of British Columbia (UBC) were informed that all classes would transition online […] (… & Heiberger, 2013; Means, Toyama, Murphy, & Baki, 2013). Online learning can be effective for students who are motivated, self-disciplined, organized, and have good time-management skills (Jacob & Radhai, 2016), but can be less effective for students who lack appropriate technology or where there are reduced interactions between students and instructors (Bullen, 1998; Chakraborty & Nafukho, 2014). Moving away from recorded lectures and standard exams toward a variety of engaging, interactive learning activities, including strategies that, for example, motivate students and encourage social interaction, can provide quality online learning environments (Dixson, 2010; Johnson & Aragon, 2003). Students engage differently in online courses (Dumford & Miller, 2018). Some students prefer in-person or face-to-face learning over online courses due to the perceived lack of interaction (Tichavsky, Hunt, Driscoll, & Jicha, 2015). Dixson (2010) surveyed students across six universities in the United States regarding engagement in online courses and found that higher student engagement correlated with student-student and student-instructor communication.
Dixson (2010) highlighted that identifying multiple ways for instructors to create meaningful connections with students was more important than the type of activities conducted in a course. In the absence of physical presence, meaningful connections can help students feel more engaged in online courses (Dixson, 2010).

Researchers have approached studying student engagement using different definitions and conceptual frameworks; conceptions of engagement in the late 1980s focused on time-based indices, for example measuring "time-on-task" or "student involvement" (Chapman, 2002; Mandernach, 2015; Trowler, 2010). Later, researchers expanded their framework to include behavioral, cognitive, and emotional dimensions (Fredricks, Blumenfeld, & Paris, 2004). Other conceptions of engagement have been used in research, such as measuring interactions between learners, instructors, and content (Bollinger & Martin, 2018), using engagement indicators from the U.S. National Survey of Student Engagement to assess how students engage in online courses (Dumford & Miller, 2018), or focusing on behaviors related to class attendance, assignment completion, and participation (Gares, Kariuki, & Rempel, 2020). However, many researchers agree that engagement is a multi-dimensional construct (Reschly & Christenson, 2012). Researchers have used this construct to better understand how students engage in face-to-face classrooms (Reeve & Lee, 2013; Wiggins et al., 2017) and in online teaching environments (Chakraborty & Nafukho, 2014; Dumford & Miller, 2018; see also the review by Pentaraki & Burkholder, 2017). There is still debate, however, regarding construct dimensions and definitions (Reeve & Lee, 2013; Reschly & Christenson, 2012), measurement in online courses (Henrie, Halverson, & Graham, 2015), and how to apply the construct in changing contexts (Trowler, 2010), for example during the rapid online transition of teaching that occurred in March 2020.

Core Ideas
• Student engagement in online teaching was lower during the COVID-19 pandemic.
• Students who engaged by interacting with others (affective engagement) had diminished engagement.
• Students who engaged through independent work (cognitive engagement) had enhanced engagement.
• Students found synchronous activities more engaging than asynchronous activities.

For this study, we used the definition of student engagement discussed in reviews by Chapman (2002), Fredricks et al. (2004), Mandernach (2015), and Trowler (2010): student engagement is a multi-dimensional construct consisting of three interrelated dimensions, namely affective, behavioral, and cognitive engagement. Affective engagement refers to students' emotional investment in learning activities. Behavioral engagement refers to students' active response to learning activities. Cognitive engagement refers to students' mental effort in learning activities. Trowler (2010) also proposed that students can engage positively or negatively within each dimension. For example, positive affective engagement can include a sense of belonging and enjoyment in learning, whereas negative affective engagement can include feelings of boredom or even rejection. Positive behavioral engagement can include attending class and being involved with classroom activities, whereas negative behavioral engagement can be shown through missing class or being inattentive during class (e.g., having side conversations, using social media).
Finally, positive cognitive engagement can include incorporating and integrating other course materials into assignments, whereas negative cognitive engagement can include failing to meet the minimum standards of an assignment.

The rapid transition to online teaching during the COVID-19 global pandemic may have impacted student engagement in courses. Furthermore, activities that were once conducted in person were transitioned to an online setting, and course instructors were faced with the task of quickly designing an online course, which may or may not have addressed student engagement. These combined aspects led to our study, with three specific research aims: (a) assess student and instructor perceptions of undergraduate students' levels of engagement during the rapid transition to online teaching during the COVID-19 pandemic; (b) describe which aspects of engagement were enhanced or diminished due to the rapid online transition; and (c) identify which learning activities undergraduate students would find most engaging in a sustained online setting so as to assist in developing student-centered online pedagogical techniques.

This study was conducted in the multi-disciplinary Faculty of Land and Food Systems (LFS) at the University of British Columbia (UBC) in Vancouver, Canada. Instructor surveys and student surveys were administered through the online survey development platform Qualtrics (Qualtrics, 2020). In May 2020, instructor surveys were administered to 13 instructors teaching 24 courses from 4 different degree programs within LFS. Courses were taught in a variety of teaching formats including laboratory, discussion-based, large lecture, tutorial, and problem-based learning. Courses that went online in March 2020 used a variety of online platforms including Zoom, Collaborate Ultra, and Canvas. Eleven instructors responded to the survey, covering 20 courses of varying formats. Information on perceptions of student engagement and participation in the course was gathered. Instructors were asked if they would agree to send a pre-designed survey (see Supplement 1) to students in their courses.

In July 2020, student surveys were sent by 9 instructors to 15 courses, in which approximately 1,000 students were enrolled, covering 3 different degree programs. Student surveys remained open for 12 days from the date of survey dissemination. A total of 145 students responded to the survey; 34 responses were from courses with ≤30 students, 68 responses from courses with 31-99 students, and 45 responses from courses with ≥100 students. Students were required to provide consent before they could participate in the survey; no reimbursement was offered for participation. Survey responses were anonymous and confidential. We did not ask for demographic information because some class sizes were less than 20 students and we did not want individual students to be potentially identifiable.

After consenting, students were presented with a mixed-methods survey including quantitative Likert-scale questions and qualitative open-ended questions. We asked students to focus their responses on their cognitive and affective engagement and presented them with the following statement before they started the survey: "For this survey we would like you to think of your own level of engagement in [course name]; including your cognitive engagement (exerting effort in an activity) and your affective engagement (being invested in an activity)."
We did not include behavioral engagement because it is typically assessed by observing students in classrooms, and students' self-reported engagement correlates poorly with how engaged they appear to an observer (Wiggins et al., 2017). Therefore, we asked students to focus on their cognitive and affective engagement, and we then identified instances when they referred to behavioral engagement in the open-ended questions.

To assess engagement in the course during the rapid online transition, we asked instructors and students the same question about engagement: "During the rapid shift to online teaching, overall, how engaged were you/your students in [course name] compared to previous in-person teaching?" Responses were collected using a 5-point Likert scale to measure engagement (1 = more engaged; 2 = a little more engaged; 3 = same as in-person; 4 = a little less engaged; 5 = not engaged). After responding, instructors and students were asked the following set of questions: "Were there any aspects of engagement that you felt were diminished in [course name] as a result of the rapid shift to online teaching?" If participants selected "Yes," we asked them: "Please explain specifically how engagement was diminished in the text box below." We then asked the same question but, instead of the word "diminished," asked participants to explain what was "enhanced." There were no word limits on these open-ended responses.

Students were then presented with 29 activity prompts that were created using both the scholarly literature on engagement and the UBC Guiding Principles for Fall 2020 Course Adaptations (UBC, 2020b) distributed in June 2020. The 29 prompts included questions related to synchronous and asynchronous activities, student preparation, peer student engagement, individual assessments, and laboratory activities (where applicable). For each prompt, student responses were collected using a Likert scale (1 = very engaging, 2 = engaging, 3 = neutral, 4 = a little engaging, 5 = not engaging).

Survey responses were downloaded from Qualtrics (Qualtrics, 2020) and organized in Microsoft Excel 16.42. Median values were calculated for Likert-scale response sets of student and instructor perceptions of students' levels of engagement. Student Likert response sets from questions pertaining to future online engagement activities were collapsed into three nominal categories by combining the engaging and the less engaging ends of the scale (Brill, 2011): more engaging (1 = very engaging and 2 = engaging), neutral (3 = neutral), and less engaging (4 = a little engaging and 5 = not engaging).
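As an illustration of the quantitative summarization described above, the following sketch shows how the median Likert values and the three-category collapse could be computed programmatically. This is not the authors' workflow (the analysis was done in Microsoft Excel); the pandas code below uses hypothetical column names and dummy ratings purely to make the collapsing rule explicit.

```python
import pandas as pd

# Hypothetical long-format export: one row per student rating of one activity
# prompt on the 5-point scale (1 = very engaging ... 5 = not engaging).
responses = pd.DataFrame({
    "prompt": ["instructor-led polls", "instructor-led polls",
               "pre-recorded video lectures", "pre-recorded video lectures"],
    "rating": [1, 2, 5, 4],
})

# Median rating per prompt (medians were also reported for the
# engagement-comparison Likert sets).
medians = responses.groupby("prompt")["rating"].median()

# Collapse the 5-point scale into three nominal categories:
# 1-2 -> more engaging, 3 -> neutral, 4-5 -> less engaging.
def collapse(rating: int) -> str:
    if rating <= 2:
        return "more engaging"
    if rating == 3:
        return "neutral"
    return "less engaging"

responses["category"] = responses["rating"].apply(collapse)

# Percentage of students per category for each prompt (the layout of Table 1).
table1 = (
    responses.groupby("prompt")["category"]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
    .unstack(fill_value=0)
)

print(medians)
print(table1)
```

Applying the same grouping to all 29 prompts reproduces the three percentage columns reported in Table 1.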
Survey responses to the open-ended questions regarding what was diminished or enhanced about engagement were downloaded and organized in Microsoft Excel 16.42. Diminished and enhanced responses were uploaded separately into NVivo (version 2020; QSR International) to analyze and compare responses for qualitative analysis. Directed content analysis as described by Hsieh and Shannon (2005) was used. Broadly, content analysis refers to the process of focusing on the content or contextual meaning of text through the identification of themes in the text. With directed content analysis, the goal is to use an existing theory as initial thematic categories for analyzing the data. In our case, we used the multi-dimensional engagement construct consisting of affective, behavioral, and cognitive dimensions discussed earlier. Student responses were first categorized by K.E. Koralesky as affective, behavioral, or cognitive engagement. Some responses included more than one dimension, and in these cases each part of the response was categorized. Once categorized, coding (i.e., the process of determining a word or short phrase to capture the meaning of a participant's response) was used to develop sub-themes within the three dimensions. A subset of responses (15% of total responses) was randomly selected by K.E. Koralesky, and K.A. Walker categorized these responses following the same process to establish intercoder reliability. Intercoder reliability can help identify variance in the interpretation of themes and codes (Guest, MacQueen, & Namey, 2012). Discrepancies were discussed and resolved when agreement was reached. This process enabled us to determine and specifically describe what was enhanced or diminished regarding student engagement within the three dimensions.
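The intercoder reliability step above was resolved through discussion, and no numerical agreement statistic is reported in the study. For readers who want to quantify agreement on such a double-coded subset, a common choice is percent agreement together with Cohen's kappa; the sketch below is ours, assumes hypothetical coder labels, and uses scikit-learn's cohen_kappa_score.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical dimension labels assigned independently by two coders to the
# same 15% subset of open-ended responses (dummy data for illustration only).
coder_1 = ["affective", "cognitive", "behavioral", "affective", "cognitive", "behavioral"]
coder_2 = ["affective", "cognitive", "affective",  "affective", "cognitive", "behavioral"]

# Simple percent agreement across the double-coded responses.
agreement = sum(a == b for a, b in zip(coder_1, coder_2)) / len(coder_1)

# Cohen's kappa corrects observed agreement for agreement expected by chance.
kappa = cohen_kappa_score(coder_1, coder_2)

print(f"Percent agreement: {agreement:.2f}, Cohen's kappa: {kappa:.2f}")
```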
Instructors and students had similar perceptions of student engagement; both instructor perception of student engagement and student self-reflection of engagement had a median Likert-scale value of 4.0, "a little less engaged." Across the 15 courses surveyed, 60% of instructors and 57% of students felt students were "a little less engaged" after the rapid online transition of teaching due to COVID-19, whereas 40% of instructors and only 18% of students felt that student engagement remained "the same as in-person" (Figure 1). Approximately 12% of students felt they were "a little more engaged" or "more engaged," and 13% of students felt "not engaged."

FIGURE 1 Instructor perception and student self-reflection of student engagement during the rapid online transition of teaching due to COVID-19.

Students were presented with 29 activity prompts preceded by the question: "If you were to take [course name] again how engaging would you find the following activities?" Students found synchronous activities to be more engaging (Table 1). For example, greater than 70% of students expressed higher engagement (Likert ≤ 2) for activities such as "answering instructor-led polls during class" and "having the instructor interact with the chat at regular intervals during the class." Approximately 50% of students rated activities such as "summarizing postings of your peers from a course discussion board" and "watching pre-recorded video lectures" as less engaging (Likert ≥ 4) (Table 1).

TABLE 1 Activity prompts presented to students to assess how engaging a student would find a particular activity if they were to take the same course again in a fully online setting. Data are presented as the percentage of students reporting in three categories: more engaging (Likert ≤ 2), neutral (Likert = 3), and less engaging (Likert ≥ 4). Note. The bold numbers are the highest values in the "more engaging" category and the lowest in the "less engaging" category.

Students also provided open-ended qualitative responses regarding what was enhanced or diminished about their engagement. A multi-dimensional construct of engagement consisting of affective, behavioral, and cognitive engagement was used to thematically analyze and describe what was enhanced and diminished for students regarding their engagement after the rapid online transition due to COVID-19. Results are described in further detail below, and quotes are presented to exemplify themes.

A unique identifier is included for each quote, which consists of the letter "S" for "student" followed by a number randomly generated in Microsoft Excel (e.g., S18 or S106).

Affective engagement was enhanced in two primary ways: (a) increased inclusivity and (b) decreased nervousness, both through use of the online portal chat box. First, students expressed an increased sense of belonging and inclusivity due to the use of the chat box. One student wrote that they thought there was a "more diverse group of people asking questions compared to our in-class discussions" (S52). Second, students noted that they felt less nervous about asking questions or making comments using the chat box compared to the previous in-person classroom setting. One student wrote: "I liked being able to ask questions without having to speak during lecture in front of all the other students" (S106) and another admitted that they felt "more brave [sic] to ask and answer questions" (S18) using the chat box.

Affective engagement was also diminished for students, however, and this was mostly due to the perceived inability to replicate in-class discussions online. One student explained: "I personally didn't feel as motivated to go to classes after they were transitioned virtually. The best part of attending the in-class lectures was to be supported and encouraged by my team members. It is a very team-oriented course and I feel that being online really negatively affected our ability to work as a group" (S103). For this student and others, feeling "supported" and "encouraged" through work with peers and in-class activities was important for their learning. Additionally, this quote exemplifies that for some students, affective engagement ("feeling supported and encouraged by team members") was related to their behavioral engagement ("not motivated to attend class").

Behavioral engagement was enhanced for students who described how they adapted to an increased amount of time in their schedules and changed their behaviors toward coursework. For example, one student wrote: "Office hours were easier to visit because I had more time free due to no commuting" (S28). Another stated that they "had more time to work on the term paper" (S65). In these cases, students adjusted their behaviors in a positive way after the rapid online transition by investing time and effort in other course materials.

In contrast, students expressed that their behavioral engagement was diminished due to (a) distractions and (b) a lack of accountability. Students mentioned a variety of distractions that made it difficult to focus while participating in the online classroom. First, students were distracted by the pandemic and the "really nerve-wracking circumstances" (S87) that made it harder to concentrate. Several students struggled in general to "pay attention and focus in online courses" (e.g., S5, S52, S55, S80). Other students linked their difficulty in paying attention to technical difficulties with the online platform, as one student wrote: "Online discussions with group mates were made slightly difficult if individuals had technical difficulties such as microphones that were not working" (S77). Finally, some students communicated that they felt a lack of accountability for attending class online, demonstrated by one student who stated: "Because there was no accountability, I essentially had no reason to pay attention" (S135). For these students, actively responding to an online classroom was difficult.
Cognitive engagement was enhanced due to two main subthemes: (a) revisiting recorded lectures and (b) use of other course materials. For example, one student wrote: "I understood the concepts in greater detail listening to the prerecorded videos" (S121) and another articulated: "I liked being able to revisit or pause the lectures in order to get the right information" (S91). Other students wrote similar statements that linked revisiting recorded lectures, or "pausing," "slowing down," or "re-watching" lectures, to their ability to learn and understand course material. Additionally, students mentioned that they engaged with other course materials, for example: "As I am learning I continuously refer to other books or sources to see if there is more detailed information that can help me to study" (S119). For these students, engagement with lectures and other course materials helped them to better understand concepts discussed in the course.

Cognitive engagement was diminished for students who linked their critical thinking and learning to in-class discussion. One student wrote: "I think the discussion element was particularly helpful to my learning. Although all the content was taught at that point where we switched to the online portion, I felt it was particularly difficult to write my essays without continual discussion even if it is to reinforce what is already taught" (S131). In another example, one student expressed: "Questions cannot be asked in real time and students don't get the benefit of hearing others' questions answered" (S93). For these students, learning was directly linked to in-class discussion, which they felt they could not do as well in an online setting.

Instructor perception of student engagement, and student self-report of engagement, demonstrated that the rapid transition to online teaching due to the COVID-19 pandemic resulted in lower student engagement. By viewing engagement as a multi-dimensional construct, we were able to analyze qualitative responses and disentangle the different ways in which students felt their engagement was enhanced or diminished. This helped us understand how some students successfully adapted whereas others struggled. For example, students adjusted their behavioral engagement positively by using the time gained to work on other aspects of their courses, but also expressed challenges in focusing due to distractions and a lack of accountability. Our findings align with other studies that have discussed the advantages and disadvantages for students in online education (Chakraborty & Nafukho, 2014; Dixson, 2010).

We found that students' affective engagement was mostly diminished after the rapid online transition. However, some students (12%) felt more engaged. None of the instructors, however, noted that students were more engaged, and thus these results demonstrate the need to assess engagement qualitatively. Asking students what was enhanced or diminished allowed us to identify aspects of the classroom that instructors could use to improve engagement for different types of student learners in online teaching. Perhaps if instructors in March 2020 had had more time and resources to review and change their course activities to make them more engaging, or had had information on student views of engagement, a different outcome might have followed.
Research assessing online and face-to-face instruction has shown that an existing face-to-face course cannot merely transition online; the course needs to keep face-to-face elements while also building in online learning activities (Means et al., 2013). Furthermore, if students had been offered a choice between more independent or more group-based learning (Nickerson & Shea, 2020), outcomes may have been different. Future studies could conduct a before-and-after survey to understand initial perceptions about engagement and how perceptions of engagement might change if instructors had more time to research, learn, and integrate other pedagogical methods of interaction into their online courses. The COVID-19 pandemic and the response to online instruction in higher education may result in courses and classrooms that look very different moving forward compared with before the pandemic; however, some recognize that online teaching may never fully replace face-to-face learning (Radcliffe et al., 2020).

Henrie et al. (2015) reviewed the literature on student engagement in technology-mediated learning experiences and found that clear definitions of engagement were lacking. Although many authors discussed engagement as a multi-dimensional concept including affective, behavioral, and cognitive dimensions, they measured only one dimension (e.g., 77% of the 113 articles reviewed measured behavioral engagement; Henrie et al., 2015). Additionally, some researchers have developed scales to measure these dimensions (Burch, Heller, Burch, Freed, & Steed, 2015; Gunuc & Kuzu, 2015; Wiggins et al., 2017). Our use of this engagement construct to analyze qualitative responses allowed us to disentangle the different ways in which students felt their engagement was enhanced or diminished, and therefore we recommend other researchers consider using this approach. However, through our analysis, we found that in some cases it was difficult to clearly separate the behavioral construct from the affective and cognitive constructs. For example, affective engagement was diminished mostly due to the perceived inability to replicate in-class, face-to-face discussions, and students indicated that this influenced their actions, such as attending class (i.e., behavioral engagement). We also found that students, having additional time in their schedules, redirected their behaviors to engage cognitively, for example by utilizing course readings or visiting office hours. Such links between engagement dimensions can help explain our finding that, overall, 57% of students felt less engaged after the rapid online transition.

As discussed by Henrie et al. (2015), the construct we used was developed for face-to-face instruction (Fredricks et al., 2004), where instructors could observe student behavior directly (e.g., students participating by asking questions or attending class), and therefore it may need to be adapted for use in online classrooms. We coded behavioral engagement as instances when students expressed enhanced engagement through actively using their time to engage further with material, or when students expressed diminished engagement due to difficulty in focusing or being distracted. We suggest that other studies using open-ended qualitative responses apply the multi-dimensional engagement construct in a similar way to determine whether similar difficulties are experienced when using this construct, or consider using learning management systems (e.g., Canvas) to measure the amount of time students spend engaging with course materials.
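As a rough illustration of that last suggestion, the sketch below queries per-student activity summaries for a Canvas course. This is a hypothetical example, not part of the study; the endpoint, field names, base URL, course ID, and token are assumptions based on the Canvas Analytics REST API and should be verified against the institution's Canvas documentation.

```python
import requests

# Placeholder values; substitute your institution's Canvas instance,
# a real course ID, and a valid API access token.
CANVAS_BASE = "https://canvas.example.edu"
COURSE_ID = 12345
TOKEN = "REPLACE_WITH_API_TOKEN"

# Assumed Canvas Analytics endpoint for course-level student summaries.
url = f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}/analytics/student_summaries"
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each summary is expected to include page view and participation counts,
# which serve only as coarse proxies for time spent with course materials.
for summary in resp.json():
    print(summary.get("id"), summary.get("page_views"), summary.get("participations"))
```

Page views and participation counts are coarse proxies rather than true time-on-task, so such data would complement, not replace, the self-report measures used here.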
Students rated synchronous activities as more engaging, and these activities involved instructor participation. However, some asynchronous activities, such as attending virtual office hours offered by the instructor and having the instructor respond to student posts on a discussion board, were also found to be more engaging by students. Activities that were found to be less engaging included those that required watching pre-recorded videos or summarizing postings from peer discussions, both conducted in the absence of live interaction with instructors or peer groups. It has been recognized that student-instructor interactions are an important aspect of the classroom, and these interactions may be even more important in an online setting (Chakraborty & Nafukho, 2014; Dixson, 2010; Wallace, 2003). These findings align with our qualitative results, which demonstrated that students who engaged through connecting and interacting with others in the classroom had diminished affective engagement. Synchronous activities such as shared whiteboards (Reushle & Loch, 2008) and the use of chat functions (Repman, Zinskie, & Carlson, 2005) have been shown to strengthen student-instructor interactions online. Asynchronous activities such as recorded instructor introductory videos (Malhotra, Mann, Avery, & Brett, 2019), the recording of synchronous lectures for students to access later (Salazar, 2010), providing consistent feedback in a timely manner (Chakraborty & Nafukho, 2014), and instructors being present in online discussion forums (Mandernach, Gonzales, & Garrett, 2006) have all been suggested to increase student engagement.

Given that we were in the midst of a global pandemic and that students had other personal factors influencing their studies, this study is limited in scope. Furthermore, there may have been an influence on engagement due to time zone differences for students located in other regions of the world, as well as for students whose first language is not English. Although we recognize the importance of demographics for student engagement, we chose not to collect this information because some classes were small and we wanted to ensure no student could be personally identified. Questions regarding student classroom performance or grade point average may have given insight into the link between classroom achievement and engagement and may have verified that our survey sample was representative of the general student population. Additionally, the analysis of student engagement by class type (e.g., discussion-based, laboratory) or class size was not possible due to the sample size obtained for this study. Future work could assess student engagement across various faculties, disciplines, and course types to get a fuller representation of student engagement. Future work could also investigate student perceptions of engagement in online activities before and after the implementation of online activities in a course to measure how engagement changes.

COVID-19 resulted in widespread alterations to higher education, including the rapid transition of teaching to an online setting. We assessed instructor and student perceptions of student engagement during this transition and found that students felt less engaged. We identified what was enhanced or diminished regarding student engagement and found that affective engagement was diminished while cognitive engagement was enhanced. Students found synchronous activities more engaging.
Future research is needed to test the multi-dimensional engagement construct in an online setting. These results highlight some of the ways undergraduate students adapted to, but were also challenged by, moving rapidly to online learning during a global pandemic. This study further highlights the need to consider multiple pedagogical techniques to engage students in the online setting.

This study was approved by the University of British Columbia Behavioural Research Ethics Board (no. H18-02392). This project would not have been possible without the support of the UBC Scholarship of Teaching and Learning (SoTL) SEED program funding. In particular, we would like to thank Gerald Tembrevilla for his SoTL specialist guidance. We would also like to thank Jonathan Graves for his input on earlier versions of this manuscript. Thank you as well to all of the instructors and students who participated in this project. The authors declare no conflicts of interest.

References
Instructor and student perceptions of online student engagement strategies
Likert scale
Participation and critical thinking in online university distance education
Student engagement: Developing a conceptual framework and survey instrument
Strengthening student engagement: What do students want in online courses?
Alternative approaches to assessing student engagement rates
Engaging online learners: The impact of web-based learning technology on college student engagement
Creating effective student engagement in online courses: What do students find engaging
Online learning in higher education: Exploring advantages and disadvantages for engagement
School engagement: Potential of the concept, state of the evidence
Community matters: Student-instructor relationships foster student motivation and engagement in an emergency remote teaching environment
Validity and reliability (credibility and dependability) in qualitative research and data analysis
Student engagement scale: Development, reliability and validity. Assessment & Evaluation in Higher Education
Measuring student engagement in technology-mediated learning: A review. Computers & Education
Three approaches to qualitative content analysis
Trends in ICT e-learning: Challenges and expectations
An instructional strategy framework for online learning environments
Putting twitter to the test: Assessing outcomes for student collaboration, engagement and success
Exploring the relationship between instructor created online video characteristics and pedagogy
Assessment of student engagement in higher education: A synthesis of literature and assessment tools
An examination of online instructor presence via threaded discussion participation
The effectiveness of online and blended learning: A meta-analysis of the empirical literature
First-semester organic chemistry during COVID-19: Prioritizing group work, flexibility, and student engagement
Emerging evidence regarding the roles of emotional, behavioral, and cognitive aspects of student engagement in the online classroom
Qualtrics (version
Moving online: Roadmap and long-term forecast
Students' classroom engagement produces longitudinal changes in classroom motivation
Effective use of CMC tools in interactive online learning
Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct
Conducting a trial of web conferencing software: Why, how, and perceptions from the coalface
Staying connected: Online education engagement and retention using educational technology tools
"It's just nice having a real teacher": Student perceptions of online versus face-to-face instruction
Guiding principles for Fall 2020 course adaptations. Vancouver: The University of British Columbia
Online learning in higher education: A review of research on interactions among teachers and students. Education
ASPECT: A survey to assess student perspective of engagement in an active-learning classroom