key: cord-0967441-8gww7192
authors: Saucier, Donald A.; Schiffer, Ashley A.; Jones, Tucker L.
title: "Exams by You": Having Students Write and Complete Their Own Exams During the COVID-19 Pandemic
date: 2022-05-06
journal: Teach Psychol
DOI: 10.1177/00986283221097617
sha: 06d57a36a993c11032c837f055904accd814851c
doc_id: 967441
cord_uid: 8gww7192

BACKGROUND: The COVID-19 pandemic made it difficult to proctor exams after the forced transition to remote teaching and learning.
OBJECTIVE: We evaluated students' experiences creating and answering their own exam items in an upper-level psychology course during the transition to remote teaching and learning during the COVID-19 pandemic.
METHOD: Students in an advanced social psychology course wrote and answered exam items to demonstrate their learning when proctored exams became impossible during the COVID-19 pandemic. The exams were graded for breadth, depth, and accuracy, and compared to other demonstrations of learning, including traditional in-class exams taken prior to the transition to remote teaching and learning.
RESULTS: Students performed well on these exams. Students reported positive perceptions of these exams during the COVID-19 pandemic but noted that, while the exams reduced test anxiety, they were time-intensive.
CONCLUSION: Students can demonstrate their learning by writing and completing their own exams. Their higher grades may be related to reduced test anxiety and greater time investments.
TEACHING IMPLICATIONS: Instructors should consider having students write and complete their own exams to demonstrate their learning, especially when proctoring exams is difficult or impossible.

One of the more ubiquitous changes to education during the COVID-19 pandemic was the rapid transition of face-to-face (F2F) classes to remote teaching and learning modalities.
Many instructors made revisions, often with little training or notice, to their classes so they could continue to teach in reasonable ways (Johnson et al., 2020; Silva de Souza et al., 2020; Sunasee, 2020). Immediately after our university's decision to move all F2F classes to remote teaching and learning modalities, we reissued our syllabus with an empathy addendum to let students know we understood things were different and challenging, to encourage connection and communication, and to remind them we wanted them to succeed. As evidence of this, we also shifted the values and parameters of our remaining assignments to be more realistic and appropriate for our students, namely by eliminating some assignments, substituting others, and lowering the stakes of other point opportunities (e.g., grading for completion). But assessing students' learning via online exams posed one of the greatest logistical challenges for instructors (Clark et al., 2020). At many universities, including our own, there was no possibility of assessing learning in traditional F2F ways during the COVID-19 pandemic (i.e., exams completed independently in class, without collaboration or resources, while instructors or teaching assistants proctor the exams against cheating and potentially answer questions). Even remote exams carried additional concerns, like limited access to technology (Nambiar, 2020), expensive online proctoring services (e.g., Cluskey et al., 2011), academic integrity (Guangul et al., 2020; Moralista & Oducado, 2020), and exacerbated test anxiety, which is already commonly experienced under much more normal conditions (Zeidner, 2010). However, less traditional learning assessment methods, like involving students more in exam development, may resolve these issues.
Some ways to involve students in exam development include having students create a question bank from which the instructor chooses exam items (e.g., Ahn & Class, 2011; Jobs et al., 2013) or having students write their own exams. As an example of the latter, Corrigan and Craciun (2013) had students in three undergraduate courses write and answer multiple-choice and short essay questions in an open-book, take-home format, limiting the instructor's role to specifying assignment parameters (e.g., the required number of questions). Corrigan and Craciun (2013) argued this approach minimizes academic dishonesty, and they found students reported greater involvement in their learning and less test anxiety compared to instructor-written exams. Accordingly, after the pivot to remote teaching and learning and our one F2F exam (i.e., "Exam 1 In Class"), we replaced the remaining exams in our Advanced Social Psychology course with "Exams by You" (i.e., "Exam 2 by You", "Exam 3 by You"). Going beyond a traditional essay exam in which instructors provide set topics or questions, Exams by You are consistent with Corrigan and Craciun's (2013) student-written exam approach in which students wrote and answered their own short essay questions. We believed Exams by You would be a good alternative to F2F or traditional online exams during the COVID-19 pandemic because they may reduce test anxiety (e.g., Corrigan & Craciun, 2013); reduce the potential for academic dishonesty, both by making cheating more easily identifiable (given the uniqueness of each student's questions and answers) and by allowing students to use their resources (e.g., notes, readings, lectures); and eliminate the need to access technology during specified time windows. Given that essay exams are a historically valid method of learning assessment, Exams by You allowed for further creativity and flexibility for both students and instructors during the pandemic.
This led us to the following research questions: (a) What were students' general experiences while writing their own exams? (b) How did these experiences relate to exam grades, if at all? (c) What was students' feedback for instructors about writing their own exams?

Our class included 28 students in an upper-level psychology course at a large, Midwestern, research-intensive university. After our F2F exam (i.e., Exam 1 In Class), we offered extra credit to students who completed surveys after both Exam by You opportunities. All participants were psychology majors. The average age was 21 years (SD = 1.20) for Exam 2 by You survey respondents (N = 19) and 21.13 years (SD = 1.20) for Exam 3 by You survey respondents (N = 16). In both surveys, the majority were White (n_Exam2 = 17, 89%; n_Exam3 = 15, 94%) women (n_Exam2 = 17, 89%; n_Exam3 = 13, 81%) who were either juniors (n_Exam2 = 7, 37%; n_Exam3 = 7, 47%) or seniors (n_Exam2 = 9, 47%; n_Exam3 = 7, 47%). In total, we received 35 survey responses from 22 unique respondents (i.e., students who completed at least one of the optional follow-up surveys; M_age = 20.91, SD = 1.15; White: n = 19, 86%; women: n = 19, 86%; juniors: n = 10, 45%; seniors: n = 9, 41%); 13 students completed both surveys, one after each Exam by You (M_age = 21.31, SD = 1.25; all White; women: n = 11, 85%; seniors: n = 7, 54%).

Study materials are available on the Open Science Framework (OSF; see Saucier et al., 2022). These include the Exams by You guidelines and rubric as well as the free-response survey items we created.

Exam 1 In Class. Over the course of the Spring 2020 semester, students completed three exams. Exam 1 In Class refers to the first and only exam administered before the shift to remote teaching and learning.
Consistent with traditional F2F exams, it included four sections: 10 multiple-choice items; 10 matching items; 10 short answer items answered with a sentence or phrase; and five longer answer items answered with a couple of sentences, a list, or a paragraph.

Exams by You. Rubric. Each Exam by You was worth 170 points (the class was based on 1,000 points), and students wrote 10 questions and answered each in a maximum of 250 words. Students had 4 and 5 weeks to complete Exams 2 and 3 by You, respectively, although most seemed to wait until near the end of the unit to begin. Students were graded on their adherence to the item ratio (i.e., 8 lecture items, 2 reading items; 10 points), breadth of lecture and reading content (20 and 5 points, respectively), depth of lecture and reading content (20 and 5 points, respectively), the accuracy of each answer (10 points each), and the overall clarity and quality of questions and answers (10 points). See Saucier et al. (2022).

Surveys. The surveys administered after each Exam by You inquired about students' experiences in writing their own exams and received IRB approval. Because the current project involved analyzing and reporting students' grades, we also received approval from our Registrar's office. We offered an alternative assignment of comparable time and effort in lieu of the survey, but no students completed it. On a 1 (Strongly Disagree) to 9 (Strongly Agree) Likert-type scale, students responded to 10 face-valid items we created regarding their general attitudes toward writing their own exams (e.g., "I was happy to write and take my own 'Exam by You'"), comparisons to standard testing measures (e.g., "I preferred writing my own exam over taking an exam written by my instructor"), and perceptions of this exam format's validity (e.g., "I feel as though this is an accurate and valid way to assess my learning").
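As a quick arithmetic check on the rubric above, the graded components sum to the stated 170 points (the component labels below are our shorthand for the rubric description, not the authors' own terms):

```python
# Sanity check: the Exam by You rubric components sum to 170 points.
# Keys are our shorthand for the rubric components described in the text.
rubric = {
    "item_ratio": 10,        # adherence to 8 lecture items, 2 reading items
    "breadth_lecture": 20,
    "breadth_reading": 5,
    "depth_lecture": 20,
    "depth_reading": 5,
    "accuracy": 10 * 10,     # 10 points per answer, across 10 answers
    "clarity_quality": 10,   # overall clarity and quality
}
total = sum(rubric.values())
print(total)  # 170, i.e., 17% of the 1,000-point course
```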
To broadly discuss our findings and to ease comparisons across surveys, we conducted a principal components analysis on all response ratings for the Exam by You surveys. No dimension constraints were imposed, and no items were eliminated due to poor loadings. This analysis yielded three factors among our response items that together explained more than 75% of the variance (see Table 1 for the complete list of items): Preference (eigenvalue = 4.98, all items loaded > |.70|, αs > .81; e.g., "I preferred writing my own exam over taking an exam written by my instructor"), Valid (eigenvalue = 1.44, all items loaded > .73, αs > .85; e.g., "I feel as though this is an accurate and valid way to assess my learning"), and Successful (eigenvalue = 1.11, all items loaded > .63, αs > .63; e.g., "I demonstrated my learning effectively on my 'Exam by You'"). We also created nine free-response items regarding students' Exams by You experience (e.g., What was the best part about doing the "Exam by You"? What was the worst part about doing the "Exam by You"? What would you recommend to other students who will be writing "Exams by You" in the future? What is your biggest complaint about the "Exam by You"? See Saucier et al., 2022). The first and second authors independently coded these responses and agreed on classifications.

This research used a within-subjects design in that, after each of the two Exams by You, we invited students to take extra-credit surveys to provide their feedback and opinions on this exam format. Participants provided informed consent, their unique school ID number (to match survey responses to exam grades), and basic demographic information, and then completed the survey items. Upon completion, participants were granted extra credit. This process repeated after each Exam by You. When appropriate, we examined students' grades in relation to their survey responses. The Exam 2 by You survey was students' first opportunity to provide feedback on the revised exam structure.
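The internal consistencies reported for each factor above (the α values) are Cronbach's alphas. A minimal, self-contained sketch of that computation on made-up ratings (illustrative numbers on the 1–9 scale used here, not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of equal-length lists, one list per item;
    positions within each list are respondents.
    """
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)          # sum of item variances
    totals = [sum(scores) for scores in zip(*items)]           # each respondent's scale total
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Illustrative ratings: 3 items rated by 5 respondents (made-up data).
items = [
    [7, 8, 6, 9, 5],
    [6, 8, 7, 9, 4],
    [7, 7, 6, 8, 5],
]
print(round(cronbach_alpha(items), 2))  # → 0.95 for these made-up ratings
```

Highly correlated items (as here) drive alpha toward 1; alphas above roughly .80, like those reported for the Preference and Valid factors, indicate strong internal consistency.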
Survey respondents were somewhat neutral in their preference for writing their own exams (M = 5.83, SD = 1.60), but perceived the exam as both a valid measure of their learning (M = 7.09, SD = 1.01) and a successful alternative during the pandemic (M = 7.32, SD = 0.96). Exam 3 by You survey respondents tended to enjoy writing their own exams (M = 6.16, SD = 1.94) and also felt the exam was both a valid (M = 7.02, SD = 1.64) and successful (M = 7.25, SD = 1.46) measure of their learning. Among students who completed both post-Exam by You surveys, there were no significant differences between Exams 2 and 3 by You in terms of Preference, t = 0.86, p = .406, d = .21; Valid, t = 0.30, p = .771, d = .09; or Successful, t < 0.01, p = 1.00, d = .00.

Note (Table 1). Factor 1 was named Preference, Factor 2 was named Valid, and Factor 3 was named Successful; ** denotes a reverse-scored item for the composite.

Given the variable nature of the Spring 2020 semester, we compared average performance before the pivot to remote learning to average performance during remote learning (see Table 2). Overall, given the adjustments we made in our evaluations of student learning, it appears that our students rose to the challenge and meaningfully engaged with the course content, earning higher grades on exams as the semester progressed.

Although we did not offer a formal research question regarding grades for survey respondents versus nonrespondents (i.e., students who completed the exam but did not respond to the survey), we performed independent-samples t-tests for these two groups to check for the possibility of response bias in our surveys. There were no significant differences in Exam by You grades for survey respondents versus nonrespondents for either Exam 2 by You (M_respondents = 94.92%, SD = 6.11%; M_nonrespondents = 92.75%, SD = 7.26%; t(27) = 0.78, p = .450, d = 0.34) or Exam 3 by You (M_respondents = 93.49%, SD = 3.64%; M_nonrespondents = 93.92%, SD = 3.94%; t(27) = 0.29, p = .771, d = 0.11). More interestingly, we examined how students' general Exam by You experiences (i.e., Preference, Valid, and Successful scores) correlated with their grades on these exams (see Table 3). While Successful scores, especially when students felt they had demonstrated their learning effectively, were positively associated with Exam 2 by You respondents' grades, Preference, Valid, and Successful scores did not correlate with Exam 3 by You respondents' grades, suggesting attitudes toward writing one's own exams did not manifest in performance. A summary of the responses to our qualitative questions in the Exam 2 by You survey and the Exam 3 by You survey is provided in Tables 4 and 5, respectively.

Table 4. Qualitative Responses From the Exam 2 by You Survey.

What was the best part about doing the Exam by You?
- More time to complete, 5 (26%): "I liked how we could work on the exam little by little. There was no rush and no time limit."; "We were able to create it over a span of time, so it was easier and less stressful to complete."
- Better learning, 5 (26%): "I felt like it made me more thoroughly learn the material."; "It allowed me to fully demonstrate concepts that I knew well."
- Reduced test anxiety, 3 (16%): "Less test anxiety"; "I didn't have to worry about completely drawing a blank when answering a question."
- Personal importance, 3 (16%): "It gave me the opportunity to focus on what I think is important instead of making me try to guess what the instructor thinks is important."; "I enjoyed that we could go in depth over the KEY things that stood out to us personally. It kind of made me want to know more or in a way I was more passionate about certain points concerning the lectures!"
- Creative and fun, 3 (16%): "Have the ability to be creative, I had a lot of fun writing the questions and providing answers."; "I liked the freedom we were allowed to write our own questions. In particular, I found the 'Creative' question to be a fun way to apply lecture material in a hypothetical situation."

What was the worst part about doing the Exam by You?
- Meeting parameter guidelines, 10 (53%): "Making sure you are meeting all of the requirements that are for creating the test."; "The structure of the exam is kind of loose so just trying to determine how many questions should be denoted to each section."
- Time-consuming, 5 (26%): "Time consuming"; "Probably the time it took. While the Exam by You format feels better, it also takes a LOT more time than a synchronous pre-made exam with multiple choice items."
- General difficulty, 4 (21%): "I think just the stress of this being the first time ever doing something like this and wanting to do a good job."; "Personally I don't like to write so that was a little intimidating but there was nothing wrong with this format of the test other than my personal biases."

What would you change about the Exam by You in the future?
- Nothing, 7 (37%): "Overall, I liked the set up and the assignment."; "Nothing! It was a unique alternative!"
- Keep it short, 4 (21%): "It was hard to cover ALL the material. Maybe shorten the length of material before doing the Exam?"; "Number of questions"
- Provide categories, 3 (16%): "I would assign a set number to how many questions you want from each lecture topic."; "categories would be helpful"
- Question variation, 2 (11%): "Perhaps not have to do so many of the long answer questions and more of a variation or long, shot and multiple choice."; "Maybe add in some other types of questions we can put in"
- Miscellaneous, 3 (16%): "Maybe giving some examples of expectable questions"; "Maybe have 3 research questions"

What would you recommend to other students who will be writing Exams by You in the future?
- Do it along the way, 12 (63%): "Even though it was said multiple times, but doing it along with the lectures was extremely helpful"; "Really try and start writing questions as you go through the content so you don't have to go back over it."
- Time management, 4 (21%): "Start a couple days early on the exam. it takes longer than you think to apply the info creatively"; "Take the time to do it well"
- Miscellaneous, 3 (16%): "More information the better"; "Yes."

What is your biggest complaint about the Exam by You?
- Uncertainty, 5 (28%): "No structure. When I was done with the exam i had no idea if I did well or not"; "The requirements were a bit vague"
- Better learning demonstration on regular exams, 4 (22%): "I think I would demonstrate my understanding better with a normal test."; "Although I mostly enjoyed the process of writing my own exam questions, I wasn't sure if they were truly representative of my learning than an exam written by the professor."
- None, 3 (17%): "I have none."; "None."
- Creating questions, 2 (11%): "Coming up with the actual questions."; "It was hard to think of questions."
- Time-consuming, 2 (11%): "As I said earlier, the time it takes to complete it. I can't see this being done on a piece of paper in a 50-minute time period. It just takes too long for that."; "Very time consuming."

Note. One respondent did not respond to the last question about their biggest complaint.

Table 5. Qualitative Responses From the Exam 3 by You Survey.

What was the best part about doing the Exam by You?
- Personal importance, 6 (40%): "Being able to focus on what I feel is important, instead of what someone else tells me is important"; "Getting to go over all of the information from the section and dive into the points you find most important."
- Creative and fun, 4 (27%): "You can be creative and apply the knowledge"; "I enjoyed creating my own questions and adding spin to them to question myself a little deeper when answering"
- Better learning, 3 (20%): "I didn't have to study for memorization. I actually learned"; "I think it helped me understand the material better"
- Reduced test anxiety, 2 (13%): "I didn't have to worry about forgetting the answer to something. Basically, there was less pressure."; "The best part was the time allowed for the test as you didn't have to write the exam within 2 hours"

What was the worst part about doing the Exam by You?
- Time-consuming/Amount of work, 6 (40%): "It takes a lot of time to properly get questions and answer them completely"; "Takes a lot longer than a 40 min test."
- Breadth, 3 (20%): "It was slightly hard to completely make sure that my questions were all encompassing"; "The hardest part was trying to cover every subject to fit the breadth part of the assignment, but I wouldn't use the word 'worst'. That was just my biggest challenge."
- Coming up with questions, 2 (13%): "The worst part was actually coming up with questions"; "The worst part was coming up with questions that I thought would be the best to showcase my learning. This exam was more difficult than the last one because I thought it was more difficult to create questions for the material."
- Uncertainty, 2 (13%): "Not knowing exactly what you want from each question... Basically questioning if each question was good enough"; "You have no idea how you did on them"
- Miscellaneous, 2 (13%): "It didn't really feel like an exam. It felt more like submitting a project I've been working on. Which is fine, but it takes away from the 'magic' of a test. That sounds weird"; "Not getting to see what the instructor thought were the most important parts of the material. Also, you don't really have to study the material to do the Exam which can decrease the amount of information retained."

What would you change about the Exam by You in the future?
- Nothing, 6 (40%): "Nothing, I honestly would keep it the same"; "I don't think I would change anything"
- Provide examples, 4 (27%): "This is just a suggestion, but I might suggest including a sample question with an attached answer (just so students can get an idea of what a good, content-filled question and answer looks like)"; "Maybe provide some written examples"
- Keep it short, 3 (20%): "I would have a 100-300 word limit. There is no reason to be forced to write 230 words for each"; "It was a lot of material to cover for 10 questions."
- Question variation, 2 (13%): "It would be really cool if students were able to make a full exam that was not just all long answer items."; "I understand the longer answer questions are more for showing a deeper understanding and knowledge on the subject, but I think giving students the opportunity to write multiple choice, matching, or fill in the blank answers would be fun for points or even extra credit!"

What would you recommend to other students who will be writing Exams by You in the future?
- Do it along the way, 11 (73%): "Write the exam as you go."; "Definitely come up with the questions as you go through the material."
- Miscellaneous, 4 (27%): "Ask for clarification and more information topics you don't understand."; "Create questions that could be out of the box. Ask for opinions or how they can relate to your life and in social settings. Don't over think it"

What is your biggest complaint about the Exam by You?
- Parameters, 5 (33%): "If you can answer the question in less words that should be allowed"; "I guess just having to do all long answer and no other form of question"
- A lot of information and work, 4 (27%): "I think it was a lot of content to cover in only a few exams."; "For me (someone who doesn't have to study much for tests and has no test anxiety), it was more work to write and take my own test."
- None, 3 (20%): "Nothing"; "None"
- Miscellaneous, 3 (20%): "It's different than anything I've done before"; "Just that we don't get to hear from the instructor about what the most important parts of the lectures/readings are."

Note. Although 16 students completed this survey, only 15 responded to these qualitative questions.

In general, students liked being able to focus on the information they felt was most important or interesting. Additionally, students reported appreciating the reduced test anxiety with this exam format. However, this may only have been the case when students did not procrastinate writing (and answering) their own questions, given that they noted how Exams by You require more work from students than traditional in-class exams. Accordingly, students consistently recommended that future Exams by You students spread the work out (e.g., writing two questions per week) while working through the lectures. In terms of what we should change about the Exam by You format, students suggested we provide more structure (e.g., topic categories) and examples, as well as allow for different types of questions (e.g., multiple-choice, short answer). Finally, 58% of respondents to the Exam 2 by You survey (n = 11) and 73% of respondents to the Exam 3 by You survey (n = 11) recommended using Exams by You in future F2F classes instead of more traditional exam formats. Thus, although students may have initially been ambivalent toward the Exam by You format, they nonetheless took it seriously and put in effort to do well.

This approach is not without its limitations; most notably, Exams by You may be more labor-intensive for students and graders, at least at first. Another limitation is that our grading may have been too lenient, creating a ceiling effect in our analyses; however, based on common feedback, students worked for their grades. Although it is possible students avoided the most difficult material, they would then have failed to earn the "breadth" points outlined in our rubric.
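The response-bias checks reported earlier compare respondents and nonrespondents with independent-samples t-tests and Cohen's d effect sizes. A minimal sketch of both statistics computed from summary data (the group summaries below are illustrative numbers, not the study's raw grades):

```python
import math

def pooled_t_and_d(m1, s1, n1, m2, s2, n2):
    """Student's independent-samples t and Cohen's d from summary statistics."""
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m1 - m2) / sp  # standardized mean difference
    return t, d

# Illustrative percent-grade summaries for two groups (made-up numbers).
t, d = pooled_t_and_d(94.0, 6.0, 19, 92.0, 7.0, 10)
print(round(t, 2), round(d, 2))  # → 0.81 0.31
```

Note that d compares the mean difference to the pooled spread, so a nonsignificant t with unequal group sizes can still correspond to a small-to-moderate d, as in the Exam 2 by You comparison above.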
Another possibility is that student-written exams simply tap into a different kind of learning that more closely resembles essay exams (e.g., Bizzell & Singleton, 1988). In any case, grading Exams by You may be less tedious because of the variability and creativity in students' exams, as Corrigan and Craciun (2013) suggest. Finally, this approach may be too time-consuming for graders in larger classes (Corrigan & Craciun, 2013). However, the approach could easily be adapted to overcome these limitations. For instance, more specific guidelines (e.g., one question on [altruism]) would likely benefit both students and graders. It may also be beneficial to consider a hybrid option for future testing, mixing who writes the question (i.e., instructor or student) and the type of question (i.e., multiple choice, short answer, longer answer). In this case, one could also specify that student-written questions cover specific topics, both to make sure that content is represented on the exam and to make the items more predictable for graders. While we have addressed many limitations of the Exam by You format, others likely exist. Most notably, it is difficult to disentangle the potentially confounded relationship between the adoption of this exam format (or any other COVID-19-inspired pedagogical technique) and the emergence of the COVID-19 pandemic. Nonetheless, our goal was to offer a method of assessment, during and perhaps beyond the pandemic, that instructors can incorporate into their classes at their discretion.

Student-written exams present many benefits for both students and instructors. Most notably, based on students' feedback, this approach reduced the test anxiety that can accompany standard testing. Additionally, students took these exams seriously and seemed to enjoy the opportunity to demonstrate their learning in our Exams by You format because it encouraged creativity and exploration of the topics that interested students the most.
Moreover, given the increased concern about maintaining academic integrity with the pivot to remote learning (Guangul et al., 2020; Moralista & Oducado, 2020), the Exam by You approach would likely make academic dishonesty more easily identifiable given that every question and answer is idiosyncratic to each student. The flexibility of having students write their own exams can apply to a variety of disciplines and can easily be adapted to fit a given course's objectives and design. Overall, the Exam by You format presented a wonderful alternative to standard F2F testing when teaching and learning remotely, one that worked to address issues related to student anxiety and the possibility of academic dishonesty. As a result, Exams by You should be incorporated into course design even after the COVID-19 pandemic to provide a flexible, empathetic approach to testing for students and instructors.

Declaration of Conflicting Interests. The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding. The author(s) received no financial support for the research, authorship, and/or publication of this article.

Open Practices. This article has received the badge for Open Materials. More information about the Open Materials badge can be found at http://www.psychologicalscience.org/publications/badges

ORCID iD. Ashley A. Schiffer https://orcid.org/0000-0003-2191-869X

References
Ahn & Class (2011). Student-centered pedagogy: Co-construction of knowledge through student-generated midterm exams.
Bizzell & Singleton (1988). What can we do about essay exams?
Clark et al. (2020). Testing in the time of COVID-19: A sudden transition to unproctored online exams.
Cluskey et al. (2011). Thwarting online exam cheating without proctor supervision.
Corrigan & Craciun (2013). Asking the right questions: Using student-written exams as an innovative approach to learning and evaluation. Marketing Education Review.
Guangul et al. (2020). Challenges of remote assessment in higher education in the context of COVID-19: A case study of Middle East College.
Jobs et al. (2013). Question-writing as a learning tool for students: Outcomes from curricular exams.
Johnson et al. (2020). U.S. faculty and administrators' experiences and approaches in the early weeks of the COVID-19 pandemic.
Moralista & Oducado (2020). Faculty perception toward online education in a state college in the Philippines during the coronavirus disease 19 (COVID-19) pandemic.
Nambiar (2020). The impact of online learning during COVID-19: Students' and teachers' perspective.
Saucier et al. (2022). "Exams by you": Having students write and complete their own exams during the COVID-19 pandemic.
Silva de Souza et al. (2020). Brazilian students' expectations regarding distance learning and remote classes during the COVID-19 pandemic.
Sunasee (2020). Challenges of teaching organic chemistry during COVID-19 pandemic at a primarily undergraduate institution.
Zeidner (2010). Test anxiety. In The Corsini Encyclopedia of Psychology.