title: COVID-19, College Academic Performance, and the Flexible Grading Policy: A Longitudinal Analysis
author: Rodríguez-Planas, Núria
date: 2022-01-25
journal: J Public Econ
DOI: 10.1016/j.jpubeco.2022.104606

I use an unbalanced panel of over 11,000 academic records spanning from Spring 2017 to Spring 2020 to identify the difference in effects of the COVID-19 pandemic across lower- and higher-income students' academic performance. Using difference-in-differences models and event study analyses with individual fixed effects, I find a differential effect by students' pre-COVID-19 academic performance. Lower-income students in the bottom quartile of the Fall 2019 cumulative GPA distribution outperformed their higher-income peers with a 9% higher Spring 2020 GPA. This differential is fully explained by students' use of the flexible grading policy, with lower-income students being 35% more likely to exercise the pass/fail option than their counterparts. While no such GPA advantage is observed among top-performing lower-income students, in the absence of the flexible grading policy these students would have seen their GPA decrease by 5% relative to their counterfactual pre-pandemic mean. I find suggestive evidence that this lower performance may be driven by lower-income top-performing students experiencing greater challenges with online learning. These students also reported a higher use of incompletes than their higher-income peers and being more concerned about maintaining (merit-based) financial aid. There is mounting evidence that the COVID-19 pandemic, with its subsequent closing of schools and campuses and move to online teaching, may be widening socio-economic educational gaps (Andrew et al., 2020; Aucejo et al., 2020; Chetty et al., 2020; Engzell et al., 2021; Maldonado and De Witte, 2020; Sass and Goldring, 2021; Rodríguez-Planas, forthcoming).
The digital divide and uneven access to online learning resources are one mechanism underlying the greater learning losses among lower-income students (Barnum and Bryan, 2020; Bacher-Hicks et al., 2021; Altindag et al., 2021). Other factors contributing to the learning delay include disadvantaged students' lower access to physical learning space and a conducive learning environment (Andrew et al., 2020), and higher stress and anxiety due to greater uncertainty and disruptions (Aucejo et al., 2020; Barnum and Bryan, 2020; Rodríguez-Planas, forthcoming; Jaeger et al., 2021). At the same time, because online learning requires more discipline and self-regulated learning than traditional in-person learning, it would be reasonable to expect the educational gap between low- and high-performing students to widen if basic skills are necessary to acquire additional skills (Cunha et al., 2006). Using pre-COVID-19 estimates of learning losses from extended absences from school (i.e., summer months and natural disasters), Kuhfeld et al. (2020) project substantial learning losses from the COVID-19 pandemic, especially for low-performing students. Using data from primary schools in Germany, Grewenig et al. (2020) find that, after COVID-19 school closures, higher-performing students spent more time on school-related activities daily than their lower-performing peers. Kofoed et al. (2021) find that West Point students randomized into an online version of a "Principles of Economics" course in Fall 2020 underperformed on assignments and exams relative to those in the in-person sections. Using data from the Virginia community college system, Bird et al. (2021) find a decrease in course completion in the courses students started in person in Spring 2020 compared to those they started online. Both studies find that the losses were greater among low-performing students.
Despite this evidence, little is known about how the pandemic affected college students' academic performance during Spring 2020 and whether it had any differential effect on lower-income students' academic performance relative to that of their higher-income peers. This is the main objective of this paper. This study provides novel evidence on the differential effects of the pandemic on Spring 2020 GPA and credits taken, earned, and failed by students' pre-pandemic income among students enrolled in Queens College. To do so, it uses an unbalanced panel of over 11,000 academic records spanning from Spring 2017 to Spring 2020. The study further identifies the mechanisms driving its main findings by analyzing students' Spring 2020 transcripts as well as their responses to a rich online survey. Queens College (QC) is a four-year college in the City University of New York (CUNY) system. Considered one of the most affordable colleges in the country, with a median undergraduate tuition of $6,530, QC is an urban college with a socially vulnerable and ethnically diverse student population located in the borough of Queens in New York City. The identification strategy relies on both difference-in-differences models and event study analyses. In addition, to control for time-invariant unobserved heterogeneity, I exploit within-student variation by controlling for student fixed effects. During Spring 2020, higher-income students earned a 13.4% higher Spring 2020 GPA relative to their pre-pandemic mean. In contrast with students' academic expectations for Spring 2020 (Aucejo et al., 2020; Rodríguez-Planas, forthcoming), lower-income students outperformed their wealthier peers, earning a 5.1% higher GPA. However, average effects hide important differences by pre-COVID-19 academic performance.
Lower-income students in the bottom quartile of the Fall 2019 cumulative GPA distribution outperformed their higher-income counterparts with a 9% higher Spring 2020 GPA relative to the counterfactual's pre-pandemic mean. This higher performance is strongly associated with the flexible grading policy, as it vanishes when I use the GPA students would have earned in the absence of the flexible grading policy. Indeed, transcript data reveal that lower-income students from the bottom quartile were 35% more likely to exercise the pass/fail option than their higher-income counterparts. I further find suggestive evidence that greater concerns with maintaining financial aid among bottom-performing lower-income students (relative to their peers) may have driven their differential prevalence of the pass/fail grade. While no such GPA advantage is observed among top-performing lower-income students, in the absence of the flexible grading policy these students would have underperformed relative to their higher-income counterparts as their GPA would have been 4% lower (relative to the pre-pandemic mean for the comparison group of 3.718). To put it differently, in the absence of flexible grading, lower-income top-performing students would have seen their GPA decrease by 5% relative to their counterfactual pre-pandemic mean. I find suggestive evidence that this lower performance may be driven by lower-income top-performing students experiencing greater challenges with online learning than their wealthier peers. These students also reported a higher use of incompletes and being more concerned about maintaining (merit-based) financial aid. Transcript data also reveal that they were 57% less likely to exercise the pass/fail option than bottom-performing lower-income students relative to the differential observed between their wealthier counterparts.
To the best of my knowledge, this paper is the first to use higher-education administrative records and transcript data to study the short-run effects of the pandemic on college students' academic performance using a student fixed effects model. This work relates to at least the following two strands of literature. First, it is close to studies analyzing the effect of violent conflicts or natural disasters on students' academic performance (Brück et al., 2019; Sacerdote, 2012) or of economic recessions on students' earnings after graduation (Oreopoulos et al., 2012; Fernández-Kranz and Rodríguez-Planas, 2018). While these studies focus on the effects of events long after they occurred, I study the immediate effects of the event. Second, the current paper contributes to a nascent literature analyzing the consequences of the COVID-19 pandemic on college education. In contrast with Aucejo et al. (2020) and Rodríguez-Planas (forthcoming), which focus on students' self-perceived challenges, I find that the immediate effects of the COVID-19 pandemic on academic performance are not only positive but seem to benefit lower-performing lower-income students the most. While Kofoed et al. (2021) study the impact of switching from in-person to online learning on grades during the pandemic, their analysis is confined to an Economics course during Fall 2020 with no temporal analysis. Bird et al. (2021) and Altindag et al. (2021) also estimate the effect of the pandemic on students' academic performance due to the unexpected switch to online learning during Spring 2020. However, both papers focus on universities with well-established online education programs, and their identification strategy relies on comparing the academic performance of students who took courses in both types of instruction modes before and after the pandemic.
Because online teaching was practically non-existent prior to the pandemic at QC, estimates in the current study pick up both the switch to online teaching and other disruptions caused by the pandemic. By merging administrative records with transcript data and students' responses to a survey on COVID-19 challenges, I document differential effects of the pandemic given students' pre-pandemic income and pre-pandemic academic performance more broadly. Importantly, my analysis reveals students' differential use of the flexible grading policy based on their financial and academic needs. The data in this paper come from three different sources. I merged individual administrative academic records from QC with survey data on students' challenges collected during the early months of the pandemic. To better understand students' use of the flexible grading policy, I also used transcript-level data. Most of the analysis focuses on an unbalanced panel of 11,443 academic records from 2,817 students spanning from Spring 2017 or later (if the student enrolled in QC at a later date) to Spring 2020. For each semester, I observe students' semester GPA; credits taken, which do not include courses officially withdrawn as they do not affect the GPA; credits earned; and credits for which a failing grade was earned. In regular years, the latter include credits from: (1) courses unofficially withdrawn (where the student stopped attending and never withdrew); (2) courses with an F grade; and (3) courses with a grade of Failed Incomplete, which is the grade assigned when an incomplete is not resolved by the following semester. In Spring 2020, this category also includes credits for courses with a grade of No Credit (NC). Other information available in QC administrative records includes students' sex, age, race, and ethnicity.
In addition, I also observe the following information collected at the beginning of Spring 2020: students' major, class level (indicating whether the student is a freshman, sophomore, junior, senior, or in graduate school), Fall 2019 cumulative GPA, part-time student status, and whether the student had ever received the federal Pell grant. I use information on ever receiving the Pell grant to define lower-income students because most Pell grant money goes to students with a total family income below $20,000. In addition, Spring 2020 transcript data provide information on whether the student exercised the pass/fail option (no-letter grade), took an incomplete, or simply removed the class record after the end of the semester (the NC grade). I use such information to estimate both the prevalence and intensity of such options during Spring 2020. Transcript data also inform us whether the student withdrew from a course officially or unofficially. If the student officially withdrew from the course, she did so within the first nine weeks of the semester, and her GPA was not affected. If the student never withdrew from the course but stopped attending, it is considered an unofficial withdrawal, and the student received a failing grade. I use such information to estimate both the prevalence and intensity of courses withdrawn officially and unofficially. Importantly, since transcript data also include the initial grade received before changes were made, and the credits each course was worth, I use such information to calculate the students' Spring 2020 GPA in the absence of the flexible grading policy. To further identify potential mechanisms behind these findings, I use the students' responses to an online survey on their experiences during Spring 2020. The survey was sent to all students enrolled in Spring 2020, and it was fielded between Friday, July 24th, and Friday, September 18th, 2020.
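The counterfactual-GPA calculation described above can be sketched in a few lines. This is a minimal illustration, not QC's actual grading schema: the grade-point mapping and field layout are assumptions, and it shows only the core idea of restoring initial letter grades for courses converted to pass/fail (P) or No Credit (NC).

```python
# Sketch of a credit-weighted GPA with and without the flexible grading policy.
# The 4.0 grade-point scale and the (initial_grade, final_mark, credits) layout
# are illustrative assumptions, not QC's actual records format.
GRADE_POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
                "C+": 2.3, "C": 2.0, "C-": 1.7, "D": 1.0, "F": 0.0}

def gpa(courses, use_initial_grades=False):
    """Credit-weighted GPA over (initial_grade, final_mark, credits) tuples.

    With use_initial_grades=True, courses converted to P or NC are restored
    to their initial letter grade, yielding the GPA the student would have
    earned absent the flexible grading policy. Otherwise, P and NC marks are
    excluded from the GPA, as under the Spring 2020 policy.
    """
    points, credits = 0.0, 0.0
    for initial, final, cr in courses:
        if use_initial_grades:
            grade = initial            # undo the pass/fail or NC conversion
        elif final in ("P", "NC"):
            continue                   # flexible-grading marks do not enter the GPA
        else:
            grade = final
        points += GRADE_POINTS[grade] * cr
        credits += cr
    return points / credits if credits else 0.0

# A student who converted a D into a P keeps only the A in the official GPA:
transcript = [("A", "A", 3), ("D", "P", 3)]
official = gpa(transcript)                                  # 4.0
counterfactual = gpa(transcript, use_initial_grades=True)   # (4.0*3 + 1.0*3)/6 = 2.5
```

The gap between `official` and `counterfactual` in this toy example is exactly the kind of policy-driven GPA inflation the paper measures.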
The response rate of 23% is higher than the usual 13% response rate on CUNY online surveys, and higher than the 10% to 12% response rates obtained around the same time in 28 universities (Jaeger et al., 2021). The sample in the current paper is almost twice as large as that of other post-COVID-19 higher-education survey studies (Aucejo et al., 2020). As seen in Table 1, I observe small differences between students in my sample (columns 1 and 2) and the overall QC student population (column 4) in the racial/ethnic distribution, the share of part-time students, the distribution of majors, and the share of Pell recipients. For example, the share of students who ever received the Pell grant in my sample is 57% if only undergraduate students are considered, which is not far from the 55% observed at the college level. There is a higher share of females (68% versus 57%) and students older than 25 (35% versus 29%) than in the overall QC population, and a lower share of US-born students (44% versus 68%), English as a second language (ESL) learners (22% versus 36%), and transfer students (22% versus 55%). The lower rate of transfer students reflects these students' lower engagement in regular college life, as they frequently combine college with either part-time or full-time employment. CUNY is known to be an institution that educates some of the poorest students in the country. It is also known to have a very diverse student population. Hence, it is not surprising that QC students are more racially diverse than students from the largest public university in each state. For example, only 27% of QC students are non-minority students compared to an average of 61% at the largest public universities in each state (Aucejo et al., 2020). Comparing columns 1 and 2 reveals that lower-income students (defined as those who ever received a Pell grant) are more likely to be Asian or Hispanic than higher-income students.
They are also more likely to be first-generation college students, transfer students, and ESL students, and less likely to be US born than higher-income students. They also have a lower Fall 2019 cumulative GPA. Given the Pell-grant requirements, lower-income students are younger, and less likely to be graduate students or to study part-time than higher-income students. To estimate the differential effect of the COVID-19 pandemic on lower-income students' academic performance, I estimate the following difference-in-differences model with individual fixed effects:

Y_ist = β1·Spring2020_s + β2·(Spring2020_s × Low-Income_i) + θ_i + δ·Fall_s + Year_t + ε_ist   (1)

where Y_ist is the outcome of interest (for example, semester GPA) for student i in semester s and year t. Spring2020_s is a dummy equal to 1 if the academic record is for Spring 2020 and 0 prior to that. Low-Income_i is equal to 1 if student i ever received the Pell grant and 0 if the student never received it. θ_i represents the individual fixed effects, Fall_s is a dummy equal to 1 if the academic record is for a Fall semester and 0 if it is for a Spring semester, and Year_t represents the year fixed effects. Standard errors are clustered at the student level. The coefficient of interest, β2, captures the differential post-pandemic effect on the outcome, Y_ist, for lower-income students relative to their higher-income peers. Note that the individual fixed effects, θ_i, absorb the lower-income indicator (as well as all the other time-invariant observable and unobservable characteristics). The coefficient β1 captures how the academic performance of higher-income students changed in Spring 2020 when the COVID-19 pandemic hit. As identification comes from comparing outcomes from the same student before and after the pandemic, there is no need to control for time-invariant observable characteristics. The Fall semester dummy controls for semester-specific characteristics, and the year fixed effects control for year differences over time.
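The interaction coefficient in the difference-in-differences model just described has a simple group-means interpretation: the Spring 2020 change in mean outcome for lower-income students minus the change for higher-income students. A minimal sketch, with illustrative toy numbers rather than the paper's estimates, and abstracting from the fixed effects and seasonal controls:

```python
# Toy 2x2 difference-in-differences: the income interaction coefficient equals
# the change in mean GPA for lower-income students minus the change for
# higher-income students. Numbers are illustrative, not the paper's data.
def did_estimate(means):
    """means: dict keyed by (group, period) -> mean outcome."""
    low_change = means[("low", "post")] - means[("low", "pre")]
    high_change = means[("high", "post")] - means[("high", "pre")]
    return low_change - high_change

means = {
    ("high", "pre"): 2.90, ("high", "post"): 3.29,  # higher-income: +0.39
    ("low", "pre"): 2.70,  ("low", "post"): 3.24,   # lower-income:  +0.54
}
beta_2 = did_estimate(means)  # 0.54 - 0.39 = 0.15
```

In the paper's specification the same contrast is estimated within students, so time-invariant differences between the two groups net out by construction.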
The critical identifying assumption is that there are parallel trends in the outcome variable across both groups (lower- versus higher-income students). To assess the validity of this assumption, I check for pre-existing diverging trends using the following event-study framework:

Y_ist = Σ_j β_j·S_sj + Σ_j γ_j·(S_sj × Low-Income_i) + θ_i + δ·Fall_s + Year_t + ε_ist   (2)

where S_sj is a dummy which takes the value 1 if the outcome is observed in the j-th semester before (−j) or after (+j) January 27th, 2020, which is when Spring 2020 began, and 0 otherwise. The Fall 2019 dummy is the omitted semester. In the absence of any pre-existing differential pre-trends between lower- and higher-income students, the estimated coefficients γ_j corresponding to the semesters prior to Spring 2020 would not be statistically different from zero. Because average effects may hide differences by students' pre-COVID-19 academic performance, I also estimate the differential effect for students in the different quartiles of the Fall 2019 cumulative GPA distribution using the following difference-in-differences model with individual fixed effects:

Y_ist = β1·Spring2020_s + β2·(Spring2020_s × Low-Income_i) + Σ_{q=2..4} λq·(Spring2020_s × Q_i,2019^q) + Σ_{q=2..4} μq·(Spring2020_s × Low-Income_i × Q_i,2019^q) + θ_i + δ·Fall_s + Year_t + ε_ist   (3)

where Q_i,2019^q is a dummy variable which takes the value 1 if the Fall 2019 cumulative GPA of student i is in the q-th quartile; the reference category is the first (bottom) quartile. All the other covariates have been previously defined. In this specification, the individual fixed effects, θ_i, absorb the lower-income indicator, the quartile indicators, Q_i,2019^q, and the interaction of the low-income indicator with the quartile indicators, as well as all the other time-invariant observable and unobservable characteristics. The coefficient β1 captures the change in outcome during Spring 2020 experienced by higher-income students in the bottom quartile. The sum of β1 and λq, (β1 + λq), captures the change in outcome during Spring 2020 experienced by higher-income students in the q-th quartile.
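The event-time indicators used in the event-study framework above can be sketched as follows. The semester labels and helper names are illustrative; the only substantive features carried over from the text are that semesters are indexed relative to Spring 2020 and that Fall 2019 (event time −1) is the omitted baseline.

```python
# Sketch: event-time indicators for an event study around Spring 2020.
# Semester labels are illustrative; Fall 2019 (j = -1, the semester just
# before Spring 2020) is the omitted category.
SEMESTERS = ["Spring 2017", "Fall 2017", "Spring 2018", "Fall 2018",
             "Spring 2019", "Fall 2019", "Spring 2020"]
EVENT = "Spring 2020"

def event_time(semester):
    """Semesters relative to Spring 2020: ..., -2, -1 (Fall 2019), 0 (event)."""
    return SEMESTERS.index(semester) - SEMESTERS.index(EVENT)

def event_dummies(semester, omitted=-1):
    """One indicator per event time except the omitted Fall 2019 baseline."""
    j = event_time(semester)
    return {k: int(k == j)
            for k in range(-len(SEMESTERS) + 1, 1) if k != omitted}

# Fall 2019 rows contribute all zeros, so their coefficient is normalized to
# zero; each pre-period coefficient then tests for differential pre-trends.
```

Interacting each of these indicators with the lower-income dummy gives the coefficients whose pre-period values should be indistinguishable from zero under parallel trends.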
And the coefficient λq measures the size of the differential effect in the change in outcome before and after January 27th, 2020 for higher-income students in the q-th quartile relative to similar students in the bottom quartile. To identify changes by income level, the coefficient β2 captures the differential effect in the change in outcome after the pandemic between lower- and higher-income students in the bottom quartile. The sum of β2 and μq, (β2 + μq), captures the differential effect in the change in outcome after the pandemic between lower- and higher-income students in the q-th quartile. The coefficient μq measures the differential effect in the change in outcome post-pandemic between lower- and higher-income students in the q-th quartile relative to that experienced between lower- and higher-income students in the bottom quartile. At the bottom of Panel B of Table 2, I display (β1 + λq) and (β2 + μq) for the top three quartiles (q = 2, 3, and 4). Because equation (3) is an individual fixed effects model, time-invariant individual unobserved heterogeneity is held constant. Finally, when analyzing pre-pandemic income and academic-performance heterogeneity effects with transcript or survey data, which are only available for Spring 2020, I estimate the following equation using ordinary least squares (OLS):

Y_i,2020 = α0 + α1·Low-Income_i + Σ_{q=2..4} κq·Q_i,2019^q + Σ_{q=2..4} τq·(Low-Income_i × Q_i,2019^q) + X_i'γ + ε_i   (4)

where Y_i,2020 is student i's outcome during Spring 2020, and X_i is a vector of all the baseline characteristics listed in Table 1 and students' major. Importantly, I control for students' pre-pandemic academic performance with the Fall 2019 semester GPA. The coefficient α1 captures the association between being a lower-income student in the bottom quartile and the student's outcome relative to their higher-income counterfactual. Similarly, (α1 + τq) captures the association between being a lower-income student in the q-th quartile and the student's outcome relative to their higher-income peers.
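The OLS specification just described can be sketched with a noiseless toy design, so the coefficients are recovered exactly. This is a simplified two-group version (lower-income × top-quartile only, with the baseline controls omitted); the coefficient values are synthetic, not the paper's estimates.

```python
import numpy as np

# Sketch of the cross-sectional OLS: regress the Spring 2020 outcome on a
# lower-income indicator, a top-quartile indicator, and their interaction
# (a 2x2 toy version of the full quartile specification; baseline controls
# are omitted). Data are synthetic and noiseless, so OLS recovers the
# coefficients exactly.
rng = np.random.default_rng(0)
n = 400
low = rng.integers(0, 2, n)       # lower-income indicator
top = rng.integers(0, 2, n)       # top-quartile (Fall 2019 GPA) indicator
alpha0, alpha1, kappa4, tau4 = 3.0, 0.10, 0.60, -0.25
y = alpha0 + alpha1 * low + kappa4 * top + tau4 * low * top

X = np.column_stack([np.ones(n), low, top, low * top])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef recovers [3.0, 0.10, 0.60, -0.25]; the sum of the income coefficient
# and the interaction gives the lower-income differential within the top
# quartile, mirroring the sums reported in the text.
```

With survey outcomes that are binary indicators, the same linear specification is a linear probability model, which is why the estimates in the results section are read in percentage points.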
The coefficient τq measures the differential in the Spring 2020 outcome between lower- and higher-income students in the q-th quartile relative to that experienced between lower- and higher-income students in the bottom quartile. Higher-income students' academic performance improved in Spring 2020 as students took and earned more courses than before the pandemic. They also got higher grades. As seen in Panel A of Table 2, the effect of the pandemic on higher-income students' academic performance shows a 13.4% increase in semester GPA, a 4% increase in credits taken, and a 5% increase in credits earned during Spring 2020 relative to pre-pandemic means. All three effects are statistically significant at the 1% level. It is plausible that the pandemic affected the number of credits taken because these do not include any courses officially withdrawn within the first nine weeks of the semester (that is, by March 30th, 2020). While the flexible grading policy took effect on April 1st, 2020, discussions of its content occurred during most of March, and both faculty and students were aware of the policy's consequences for students' GPA. Hence, the Spring 2020 increase in credits taken is likely due to students not officially withdrawing from courses by March 30th because they already knew that they would be able to exercise the NC grade without penalizing their GPA. In contrast, the flexible grading policy could not affect credits earned as the student is not awarded credit with the NC grade. Instead, the increase in credits earned can only be explained by an increase in passing versus failing grades during Spring 2020. The increase in GPA is also likely related to the fact that as many as 17.2% of higher-income students exercised the pass/fail option, 6.3% exercised the NC grade, and 5% exercised the incomplete option.
Comparing the official GPA increase before and after the pandemic with the GPA increase students would have experienced in the absence of the flexible grading policy (columns 1 and 5, Panel A of Table 2), it is important to underscore that, even in the absence of the flexible grading policy, the Spring 2020 GPA increased by 7.6% among higher-income students relative to the pre-pandemic mean, suggesting that other factors beyond flexible grading may be at play here. Moving to the differential post-pandemic effect on academic performance across lower- and higher-income students, lower-income students outperformed higher-income ones during Spring 2020 as they earned a 5.1% higher GPA and failed 28% fewer credits (or exercised fewer NC grades) than their wealthier counterparts. Both effects are statistically significant at the 5% level or lower. Lower-income students' higher relative academic performance during the first semester of the pandemic is largely associated with the flexible grading policy, as the GPA differential by income status vanishes (β2 is close to zero and not statistically significant) when I estimate the effect of the pandemic with the GPA prior to students exercising the flexible grading option. As seen in Panels A and D of Figure 1, after five semesters of pre-pandemic parallel trends, we observe an increase in lower-income students' official GPA and a decrease in credits failed in Spring 2020 relative to their counterparts. While in Panel B lower-income students attempted fewer credits in Spring 2020 than their wealthier counterparts, they also did so in earlier semesters. In Panel C, the relative increase in credits earned between lower- and higher-income students is small and not precisely estimated. As discussed earlier, in the absence of the flexible grading policy, there is no differential effect in the semester GPA post-pandemic between lower- and higher-income students (as seen in Panel E).
As seen in Panel B of Table 2, the higher relative average performance among lower-income students is solely driven by students in the bottom quartile of the distribution. These students' Spring 2020 GPA increased by an additional 9% relative to their higher-income counterparts' pre-pandemic average. This differential effect is marginally statistically significant at the 10% level. Adding the overall post-pandemic effect, (β1 + β2) from equation (3), results in lower-income students from the bottom quartile of the distribution earning 1.073 additional points on their semester GPA in Spring 2020, an increase of 55% relative to their counterparts' pre-pandemic GPA mean of 1.939. The coefficient (β1 + β2) is statistically significant at the 1% level (standard error = 0.051). This higher relative GPA among low-performing students in the bottom quartile is strongly associated with the flexible grading policy, as the differential vanishes when I use the GPA students would have earned in the absence of the flexible grading policy. Column 5, Panel B in Table 2 shows that β2 is close to zero, negative, and not statistically significant for this group of students. Indeed, as seen in column 3 of Table 3, lower-income students from the bottom quartile were 8.1 percentage points more likely to exercise the pass/fail option than their higher-income counterparts. This effect represents a 35% increase in the prevalence of the pass/fail option relative to their counterparts' prevalence of 22.9% and is statistically significant at the 5% level. A distinct pattern is observed between lower-income students in the top and bottom quartiles of the distribution relative to the GPA gap by pre-pandemic performance of their higher-income counterparts, as shown by a negative and statistically significant μ4 in column 1, Panel B of Table 2.
While the Spring 2020 GPA of top-performing higher-income students, (β1 + λ4), increased by 0.097 points relative to their pre-pandemic mean of 3.718 (a statistically significant 2.6% increase), the Spring 2020 GPA of lower-income top-performing students, (β2 + μ4), would have been 4% lower than that of their higher-income counterparts relative to the pre-pandemic mean for the comparison group. This effect is statistically significant at the 5% level. To put it differently, in the absence of flexible grading, lower-income top-performing students' Spring 2020 GPA would have been 0.189 points lower than their pre-pandemic average (β1 + λ4 + β2 + μ4 = −0.189), representing a 5% decrease relative to their counterfactual pre-pandemic mean GPA of 3.718. This effect is statistically significant at the 1% level. Consistent with these estimates, the event study (shown in Figure 2) paints a quite different picture for lower-income students in the bottom and top quartiles of the Fall 2019 cumulative GPA distribution relative to their wealthier peers. Panel A, which focuses on the bottom quartile, shows a lack of pre-pandemic trends followed by an increase in lower-income students' Spring 2020 official GPA relative to their wealthier peers (LHS figure). However, no such increase is observed had the flexible grading policy not been available (RHS figure). In contrast, Panel B, which focuses on the top quartile, shows that the lack of pre-pandemic trends is followed by a decrease in lower-income students' Spring 2020 GPA prior to students exercising the flexible grading policy relative to their wealthier peers (RHS figure), with no post-pandemic change observed in the official GPA (LHS figure). Appendix Figure A.1 shows the event studies for the other outcomes and quartiles.
To address concerns that this differential pattern in the official Spring 2020 GPA may be driven by students differentially selecting majors according to their income, I re-estimate equation (3) accounting for students' choice of major. The results (Appendix Table A.3) are robust to the above specifications, suggesting that these alternative explanations are not driving this differential pattern by income. It is worthwhile highlighting that, as for bottom-performing students, the flexible grading policy helped lower-income top-performing students increase their Spring 2020 GPA. In this case, the increase relative to their wealthier counterparts was 0.088 points (from −0.144 to −0.056), the equivalent of a 2.4% increase relative to the pre-pandemic counterfactual mean of 3.718. But, in contrast with lower-income students in the bottom quartile, the GPA of those in the top quartile did not increase relative to their wealthier peers once the flexible grading policy is taken into account. Table 3 suggests that this may be due to the differential exercise of the pass/fail option between top- and bottom-performing lower-income students relative to the differential prevalence among their wealthier peers. Indeed, column 3 shows that top-performing lower-income students were 9.8 percentage points less likely to exercise the pass/fail option than bottom-performing lower-income students relative to the differential observed between their wealthier counterparts. As 17.25% of higher-income students exercised the pass/fail option, this represents a reduction of 57%. This effect is statistically significant at the 5% level. Yet, this does not explain the differential income effect of the pandemic on top-performing students before students exercised the flexible grading policy. Below, I explore plausible mechanisms behind lower-income top-performing students' underperformance relative to their wealthier counterparts.
Because of the disruptions that COVID-19 represented for higher education and the abrupt move to online learning, we would have expected the academic performance of college students to drop. As explained in Section IV, this did not happen. Instead, the official Spring GPA of higher-income students increased by an average of 13.4% relative to the pre-pandemic mean, after controlling for time-invariant heterogeneity. While about two fifths of this increase is explained by the flexible grading policy, a 7.6% increase remains unexplained. Several reasons could explain this improvement beyond students' greater flexibility in their grading choices due to the flexible grading policy: (1) a different assessment process with easier exams and/or more lenient grading; (2) a more difficult supervision process, as exams moved online, leading to potentially greater cheating; (3) an improvement in students' learning strategies with online learning; (4) lower opportunity costs of studying due to less employment being available; and (5) lower financial stress due to greater availability of emergency relief funds from the college or the government. It is plausible that changes in faculty's leniency, as well as in exams' assessment and supervision, may be behind some of the higher post-pandemic GPA observed for the whole sample. However, given the findings on the role of the flexible grading policy, it is unlikely that they drive the observed academic-performance differences between lower- and higher-income students. Assuming dynamic complementarity and self-productivity, we would expect multiplier effects of skills and ability (Cunha and Heckman, 2007). In that case, the online-learning costs would be lower for higher- than lower-performing students, consistent with recent findings (Grewenig et al., 2020; Kuhfeld et al., 2020; Kofoed et al., 2021; Bird et al., 2021). Yet, estimates from Panel B of Table 2 indicate that the Spring 2020 GPA increase was inversely related to pre-pandemic performance.
To explore the other possibilities, I estimate model (4) using as outcomes students' survey responses regarding: (1) their perception of having challenges with online learning; (2) their opportunity costs of studying, measured by whether they worked during Spring 2020 and by whether they worked less due to COVID-19; (3) their receipt of emergency relief funds; and (4) their perception of how flexible grading influenced their grading choices. While these estimates do not capture a causal relationship, they may present suggestive evidence of which mechanisms may be associated with the differential outcomes by income. As seen in column 1 of Table 4, top-performing lower-income students were 6.8 percentage points more likely to experience challenges with online learning than their higher-income counterparts. This effect is statistically significant at the 5% level. A similar income disadvantage is observed among students in the third quartile, albeit the effect is only marginally significant. No such income differential effect is observed among students in the bottom half of the distribution. Lower-income students were, on average, 3.9 percentage points more likely to report working less due to the pandemic and 15 percentage points more likely to receive emergency relief funds than their higher-income peers. Both effects are statistically significant at the 5% level or lower (estimates available from the author upon request). While there is no differential effect by pre-pandemic performance for the receipt of emergency relief funds among lower-income students, the income differential effect on working less in Spring 2020 is driven by the lowest-performing students (albeit only marginally significant). This suggests that a plausible driver of the differential Spring 2020 GPA, prior to exercising the flexible grading option, between lower- and higher-income students in the bottom quartile could be the post-COVID-19 lower opportunity cost of studying among lower-income students.
It is noteworthy that, because of the pandemic, top-performing lower-income students were 9.7 percentage points more likely to report asking for an incomplete than their bottom-performing lower-income peers, and 5.8 percentage points more likely to ask for an incomplete than their higher-income peers, as seen in column 7 of Table 4. Both estimates are significant at the 1% level. It is important to underscore that students' survey responses on the influence of the flexible grading policy are not directly comparable to the grading choices observed in the transcripts, because the survey responses capture a change in behavior in response to the pandemic. To the extent that incompletes are allowed in normal times and students can sometimes also select the Pass/No Credit option, the survey responses may be picking up a change, whereas the transcript data only measure levels for Spring 2020. In contrast, bottom-performing lower-income students were 7.1 percentage points more likely to report choosing a pass/fail or NC grade than their higher-income peers, as seen in column 6. 7 Even though this estimate is only marginally significant (at the 10% level), it corroborates the transcript-data results suggesting that bottom-performing lower-income students improved their Spring 2020 academic performance by converting poor grades into pass/fail, preventing those grades from lowering the semester GPA and avoiding course drops. It is interesting to underscore that bottom-performing lower-income students are 17.1 percentage points more likely than their higher-income peers to report facing challenges in maintaining financial aid. To maintain financial aid (including the Pell grant), students must be enrolled in at least 6 credits (12 to receive the full amount) and maintain a GPA of 2.0.
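To see why converting a poor grade shields the semester GPA, consider a credit-weighted GPA computed with and without a pass/fail conversion. This is a toy sketch, not the college's actual computation: the grade-point scale and helper names are illustrative, and P/NC grades are assumed to be excluded from the GPA, as is standard practice:

```python
# Standard 4.0 grade-point scale (illustrative subset)
POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def semester_gpa(courses):
    """Credit-weighted GPA over (credits, letter_grade) pairs; courses graded
    P or NC carry no grade points and are excluded from the computation."""
    graded = [(cr, POINTS[g]) for cr, g in courses if g in POINTS]
    total = sum(cr for cr, _ in graded)
    return sum(cr * pts for cr, pts in graded) / total if total else 0.0

# A student with one failing grade among four 3-credit courses
official = [(3, "B"), (3, "B"), (3, "C"), (3, "F")]
flexible = [(3, "B"), (3, "B"), (3, "C"), (3, "NC")]  # F converted to No Credit

print(round(semester_gpa(official), 2))  # 2.0  -- right at the aid floor
print(round(semester_gpa(flexible), 2))  # 2.67 -- conversion shields the GPA
```

In this stylized example, one pass/fail conversion is the difference between sitting exactly at the 2.0 financial-aid floor and comfortably above it, which is consistent with the reported behavior of bottom-performing lower-income students.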
Hence, it is not surprising that lower-income bottom-performing students, concerned about losing their Pell grant or having to return a portion of the funding already received, may have been willing to use the pass/fail grade to avoid receiving a grade that would have hurt their GPA. Interestingly, there is a statistically significant income differential in students' challenges in maintaining financial aid for those in the third and top quartiles, although this differential is half to two thirds the size of that observed among bottom performers. This income differential among top performers may reflect their need to maintain a certain GPA to preserve merit-based scholarships, which would explain their self-reported higher intake of incompletes due to COVID-19.

Using individual students' administrative academic records from Spring 2017 to Spring 2020 and controlling for individual fixed effects, this paper first documents an increase in college students' Spring 2020 GPA relative to prior academic performance. Difference-in-differences models and event-study analyses with individual fixed effects reveal heterogeneity by pre-pandemic income and pre-pandemic academic performance. I find that bottom-performing lower-income students outperform their higher-income peers. This differential is fully explained by students' use of the flexible grading policy, with lower-income students being 35% more likely to exercise the pass/fail option than their counterparts. Students' survey responses suggest that lower-income students' greater concerns about maintaining financial aid may be behind their higher use of flexible grading. In contrast, there is no GPA advantage by income among top-performing students. Instead, in the absence of the flexible grading policy, lower-income top-performing students would have underperformed relative to their higher-income counterparts.
I find suggestive evidence that this lower performance may be driven by lower-income top-performing students experiencing greater challenges with online learning than their wealthier peers. These students also reported a higher use of incompletes than their higher-income peers and greater concerns about maintaining (merit-based) financial aid. Transcript data also reveal that they were less likely to exercise the pass/fail option than bottom-performing lower-income students, relative to the corresponding differential among their wealthier counterparts. The current research underscores the relevance of the flexible grading policy and its differential use based on students' particular needs. The findings in this paper suggest that the flexible grading policy was able to counteract negative shocks, especially among the most disadvantaged students. Because low-income students regularly face idiosyncratic challenges, the results in this paper suggest that a broader use of the pass/fail grade (if not for all courses, then for certain courses) may support students during critical moments. Future research ought to analyze how the pandemic will affect students' academic performance in the medium and longer run, especially once the flexible grading policy no longer applies and in-person teaching resumes. Another interesting line of research is to analyze whether flexible grading per se will impact students' future academic performance.

FIGURE 1. Semester GPA and Credits Differential by Income Level. Notes: These figures plot the coefficients on the interaction between the semester dummies and the low-income dummy (and the 95% confidence intervals) from an individual fixed-effects model as in equation (2) in the main text. The Fall 2019 semester dummy is the omitted semester. Standard errors are clustered at the individual level. In Panel A, the outcome is the official semester GPA. In Panels B and C, the outcome is credits taken and credits earned, respectively.
In Panel D, the outcome is credits failed or NC earned. In Panel E, the outcome is the official semester GPA, except for Spring 2020, in which case I have replaced the official GPA with the GPA the student would have earned had the flexible grading policy not been available. I estimated this GPA using the courses the student enrolled in and the grades originally earned in those courses (including official and unofficial withdrawals) from transcript data.

Standard errors are reported in parentheses in column 3. Column 3 presents the coefficient on the low-income dummy from a regression model with no other controls. Significant at the: *** 1 percent level, ** 5 percent level, * 10 percent level. a Source: https://www.qc.cuny.edu/about/research/Pages/CP-Enrolled%20Student%20Profile.aspx. b Excludes graduate students.

Notes: Spring 2020 is a dummy equal to 1 if the outcome is measured during the Spring 2020 semester, and 0 otherwise. Low Income is a dummy equal to 1 if the student ever received the Pell grant, and 0 otherwise. Q1 is a bottom quartile dummy equal to 1 if the student's cumulative GPA is below 2.64, and 0 otherwise. Q2 is a second quartile dummy equal to 1 if the student's cumulative GPA ranges between 2.64 and 3.23, both included, and 0 otherwise. Q3 is a third quartile dummy equal to 1 if the student's cumulative GPA ranges between 3.24 and 3.71, both included, and 0 otherwise. Q4 is a top quartile dummy equal to 1 if the student's cumulative GPA is higher than 3.71, and 0 otherwise. Panel A reports individual fixed-effects estimates associated with post-pandemic and low- versus high-income students for the dependent variables indicated in the column headings. I estimate equation (1) in the main text.

Notes: The table reports OLS estimates on the low-income dummy, quartile dummies, and interactions between the two from a regression where the outcome is indicated in the column headings. I estimate equation (4) in the main text. The bottom quartile is the omitted quartile.
All regressions include a female indicator, race and ethnicity indicators, a USA-born indicator, a first-generation student indicator, a transfer-student indicator, an ESL-learner indicator, class-level indicators (including a graduate student indicator), major dummies, and the Fall 2019 cumulative GPA. Robust standard errors are reported in parentheses. *, **, *** Estimate significantly different from zero at the 0.1, 0.05, or 0.01 level.

Appendix

References
Is Online Education Working?
Inequalities in Children's Experiences of Home Learning during the COVID-19 Lockdown in England
The impact of COVID-19 on student experiences and expectations: Evidence from a survey
Inequality in household adaptation to schooling shocks: Covid-induced online learning engagement in real time
America's great remote-learning experiment: What surveys of teachers and parents tell us about how it went
Learning the hard way: The effect of violent conflict on student academic achievement
Negative Impacts from the Shift to Online Learning during the COVID-19 Crisis: Evidence from a Statewide Community College System
The Economic Impacts of COVID-19: Evidence from a New Public Database Built Using Private Sector Data
The Technology of Skill Formation
Learning loss due to school closures during the COVID-19 pandemic
The Perfect Storm: Effects of Graduating in a Recession in a Segmented Labor Market
COVID-19 and Educational Inequality: How School Closures Affect Low- and High-Achieving Students
The Global COVID-19 Student Survey: First Wave Results
Zooming to Class?: Experimental Evidence on College Students' Online Learning during COVID-19
Projecting the potential impact of COVID-19 school closures on academic achievement
The effect of school closures on standardised student test outcomes
The short- and long-term career effects of graduating in a recession
Hitting Where It Hurts Most: COVID-19 and Low-Income Urban College Students
When the saints go marching out: Long-term outcomes for student evacuees from Hurricanes Katrina and Rita
Student Achievement Growth During the COVID-19 Pandemic

Notes: Spring 2020 is a dummy equal to 1 if the outcome is measured during the Spring 2020 semester, and 0 otherwise. Pell is a dummy equal to 1 if the student ever received the Pell grant, and 0 otherwise. Q1 is a bottom quartile dummy equal to 1 if the student's cumulative GPA is below 2.64, and 0 otherwise. Q2 is a second quartile dummy equal to 1 if the student's cumulative GPA ranges between 2.65 and 3.23, both included, and 0 otherwise. Q3 is a third quartile dummy equal to 1 if the student's cumulative GPA ranges between 3.24 and 3.71, both included, and 0 otherwise. Q4 is a top quartile dummy equal to 1 if the student's cumulative GPA is higher than 3.71, and 0 otherwise. The table reports individual fixed-effects estimates associated with post-pandemic and low- versus high-income students from different cumulative Fall 2019 GPA quartiles on the semester GPA. Standard errors clustered at the individual level are reported in parentheses. All regressions include a Fall-semester indicator and year 2017 and year 2018 dummies. Column (1) is our baseline specification, also shown in Table 2. Column (2) adds to the baseline specification interactions between the major dummies and the Spring 2020 dummy. Column (3) adds to the baseline specification interactions between the class-level dummies and the Spring 2020 dummy. Class-level dummies include freshman, sophomore, junior, and senior for undergraduates, and graduate degree. Column (4) re-estimates our baseline specification excluding observations from Spring 2017. *, **, *** Estimate significantly different from zero at the 0.1, 0.05, or 0.01 level. Notes: These figures plot the coefficients on the interaction between the semester dummies and the low-income dummy (and the 95% confidence intervals) from an individual fixed-effects model as in equation (2) in the main text.
In column 1, the outcome is the official semester GPA. Each panel shows estimates using students from a different quartile of the Fall 2019 cumulative GPA distribution. In column 2, the outcome is the official semester GPA, except for Spring 2020, in which case I have replaced the official GPA with the GPA the student would have earned had the flexible grading policy not been available. I estimated this GPA using the courses the student enrolled in and the grades originally earned in those courses (including official and unofficial withdrawals) from transcript data. In columns 3 and 4, the outcome is credits taken and credits earned, respectively. In column 5, the outcome is credits failed or NC earned.

Núria Rodríguez-Planas

Highlights
• We find an income differential effect of the pandemic on college students' GPA
• We use DiD models and event-study analyses with individual fixed effects
• Low-income low-performing students outperformed their wealthier peers
• This differential is fully explained by students' use of flexible grading
• No such GPA advantage is observed among top-performing lower-income students
• Without flexible grading, these students would have seen their GPA decrease by 5%
• Flexible grading counteracted negative shocks, especially among disadvantaged students