Effects of COVID-19 on school enrollment
Pinka Chatterji and Yue Li
Economics of Education Review, 2021-05-07. DOI: 10.1016/j.econedurev.2021.102128

Abstract: We estimate effects of the COVID-19 pandemic on self-reported school enrollment using a sample of 16-to-18-year-old youth from the January 2010 to December 2020 Current Population Survey (CPS). The pandemic reduced the likelihood of students reporting that they were enrolled in high school by about 1.8 percentage points in April 2020 vs. the same month in prior years, although enrollment rebounded to typical levels by October 2020. Adverse effects on school enrollment were magnified for older vs. younger students, for males vs. females, and for adolescents without a college-educated household member vs. adolescents from more educated households. Greater school responsiveness to the pandemic and high school graduation exit exams appear to have protected students from disengaging from school.

The widespread closure of schools is a byproduct of the COVID-19 pandemic that may have long-lasting, detrimental effects on youth in the U.S. On March 12, 2020, Ohio became the first state to close all schools state-wide due to the pandemic (Education Week, July 1, 2020a). By March 25, 2020, every public school building in the US was closed, with the vast majority closing for the remainder of the school year (Education Week, July 1, 2020a). After closing in March 2020, most schools offered remote learning, but the quality was, at best, uneven (Gross and Opalka, June 10, 2020; Harris et al., July 2020; Hamilton et al., 2020). As of April 2021, at least 12 states had mandated that schools open in-person to some extent (Education Week (Map), July 28, 2020b).
Most states, however, left that decision to local school districts, and some districts have not yet re-opened or have only just re-opened their buildings (CNN, July 15, 2020; Education Week (Map), July 28, 2020b). At this time, we know little about the COVID-19 pandemic's impact on student outcomes, and about whether higher-quality online schooling and more stringent state-level education policies may have buffered any adverse effects. In this paper, we provide some of the first evidence on this topic by estimating the effects of the pandemic on school enrollment using data on 16-to-18-year-old youth from the January 2010 to December 2020 Current Population Survey (CPS). The CPS is a monthly labor force survey administered to civilians aged 16 and older who are not living in institutions. In February 2020, the response rate was 82 percent (US Bureau of Labor Statistics, 2020a; Flood et al., 2020). The COVID-19 pandemic caused the CPS to halt in-person interviews as of March 20, 2020, which negatively affected survey response (US Bureau of Labor Statistics, July 2, 2020b). The response rate was 70 percent in April 2020 and rose to 77 percent in December 2020. The analysis sample includes respondents aged 16-18 who do not have a high school degree, have not attended college, do not reside in group quarters, and were in a CPS household for at least one month between January 2010 and December 2020, excluding the summer months of each year. We consider the summer months to be May, June, July, August, and September: because the start and end dates of school years vary across states and schools, students may be out of school for summer vacation in any of these months, so these five months are excluded from the analysis sample. Our outcome variable is an indicator of whether the respondent self-reports being enrolled in high school during the previous week.
Respondents are coded as not enrolled in school if they are on summer vacation (unless enrolled in summer school), but they are considered enrolled if they were on vacation during the school year in the prior week. First, we estimate month-by-month effects of the pandemic on school enrollment using Eq. (1):

y_{iast} = Σ_τ δ_τ D_{it}^τ + γ_{at} + t_t + γ_s + X_{it}'β + ε_{iast}    (1)

In Eq. (1), y_{iast} is a 0-1 indicator of "currently enrolled in school" for individual i of age a living in state s in year t. The term D_{it}^τ represents a set of indicators for each month in calendar year 2020 (other than May through September, which are excluded from the sample, as described previously). Eq. (1) includes the full set of interactions between each of the 7 calendar months (Jan-Apr and Oct-Dec) and age dummies for ages 16, 17, and 18 (γ_{at}) to control for differential seasonal patterns by age, in addition to a linear trend in year (t_t) and state fixed effects (γ_s). The individual controls (X_{it}) include dummy variables for female, completed 10th grade or below, Black, Other race/ethnicity, Latino, and household has any member with a college degree, as well as county population density (thousands of residents per square mile as of 2019). The estimated coefficients δ_τ capture the month-by-month effects of the pandemic: the difference in school enrollment in each month of calendar year 2020 relative to the corresponding month in previous calendar years. Next, to estimate and evaluate the heterogeneity of the average effect in April 2020, we use Eqs. (2)-(3):

y_{iast} = δ Post_t + γ_{at} + t_t + γ_s + X_{it}'β + ε_{iast}    (2)
y_{iast} = δ Post_t + (Post_t × Group_i)'ζ + Group_i'λ + γ_{at} + t_t + γ_s + X_{it}'β + ε_{iast}    (3)

When estimating Eqs. (2)-(3), for the year 2020 we include only the months of January, February, and April in the sample. The rationale for including only these three months in 2020 is: (1) March 2020 was a transitional month for schools, and thus should be excluded; and (2) the month-by-month analyses (Eq. (1)) show no effects of the pandemic on school enrollment in October-December 2020, so we do not include those months in this analysis of average effects.
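The month-by-month design can be sketched on simulated data. Everything below is illustrative, not the authors' CPS extract: the variable names, the data-generating process, and the magnitudes are hypothetical, and for simplicity the sketch uses unweighted least squares with additive month and age dummies rather than CPS person weights, the full month-by-age interactions, and two-way clustered standard errors.

```python
# Illustrative sketch of the month-by-month event-study design (Eq. 1)
# on simulated data. Names and magnitudes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 60_000
school_months = np.array([1, 2, 3, 4, 10, 11, 12])  # May-Sep excluded

year = rng.integers(2010, 2021, n)
month = rng.choice(school_months, n)
age = rng.integers(16, 19, n)          # 16-18-year-olds
state = rng.integers(0, 51, n)

# Simulated truth: enrollment falls 1.8 pp in April 2020 only
true_effect = -0.018
p_enroll = 0.96 + true_effect * ((year == 2020) & (month == 4))
y = (rng.random(n) < p_enroll).astype(float)

# Design matrix: intercept, linear year trend, month/age/state dummies,
# plus one indicator per 2020 school-year month (the D_it^tau terms)
cols = [np.ones(n), (year - 2010).astype(float)]
cols += [(month == m).astype(float) for m in school_months[1:]]
cols += [(age == a).astype(float) for a in (17, 18)]
cols += [(state == s).astype(float) for s in range(1, 51)]
tau_col = {m: len(cols) + i for i, m in enumerate(school_months)}
cols += [((year == 2020) & (month == m)).astype(float) for m in school_months]

beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
print(f"estimated April 2020 effect: {beta[tau_col[4]]:+.4f}")
print(f"placebo (Jan 2020) effect:   {beta[tau_col[1]]:+.4f}")
```

Each δ_τ coefficient compares the corresponding 2020 month to the same calendar month in 2010-2019, net of the trend and fixed effects; in a correctly specified model, only the April 2020 indicator should pick up a nonzero effect here.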
In Eq. (2), Post_t is an indicator for the month of April in 2020. The coefficient on Post_t therefore captures the effect of the pandemic (April 2020) on school enrollment, after adjusting for other potentially confounding trends. To test for heterogeneity in this effect, we include interaction terms between Post_t and the variables Group_i (Eq. (3)). In Eq. (3), Group_i is a set of subgroup indicators, and the estimated coefficients ζ capture the differential effects of the pandemic in each subgroup vs. baseline. The subgroups are defined by: individual/household characteristics (female, highest grade completed is 10th grade or lower, Black, Other race/ethnicity, Latino, household has at least one college graduate); community characteristics (county population density, COVID-19 deaths in county as of April 12, 2020 per thousand population in 2019); and state-level measures of school responsiveness/quality (state-level school responsiveness score from Harris et al. (July 2020), state compulsory school age, state required students to pass exit exams to earn high school diplomas but exams were cancelled due to the pandemic, and state had exit exams that were not cancelled). 2,3 The models also include the employment rate of people aged 16-25 in the current month in each county; this variable is not interacted with Post_t, so its level effect is absorbed by the intercept. In all models, we apply CPS person weights and estimate robust standard errors with two-way clustering on state and year-month. In Table 1, we see that in the years 2010-2019 there were no changes in school enrollment or demographics between the months of January-February and April, but there was a 2.3 percentage point drop in school enrollment between January-February and April in 2020. Sample characteristics did not change between January-February and April 2020 relative to the same period in previous years, suggesting that nonrandom attrition may not be an issue. Fig.
1, which shows the estimated coefficients on the month indicators in Eq. (1), highlights that school enrollment was about 2 percentage points lower in April 2020 relative to the same month in prior school years. Enrollment rebounded to typical levels by the start of the 2020-2021 school year (Fig. 1). In Fig. 2, we show results from the same model estimated using a sample that includes 16-18-year-olds who are high school graduates or have ever attended college. Using this "unrestricted sample" allows for adjustments along the extensive margin due to the pandemic. The pattern of findings in Fig. 2 is similar to that of Fig. 1. Using the CPS, Ahn, Winters and Lee (October 2020) estimate that high school completion rates were 6.9 percentage points higher for 18-year-olds in 2020 vs. in prior years. They argue that this increase may be due to decreased labor market opportunities. Using the unrestricted sample, we also find that the likelihood of having a high school diploma was elevated during the pandemic, and that employment was almost 10 percentage points lower in April 2020 relative to the same month in prior years, with a steady return to a slightly lower-than-usual employment level by the fall of 2020 (results available upon request). As a group, these findings may indicate that while the pandemic reduced enrollment, marginal students were at the same time more likely to graduate, perhaps because some schools relaxed standards.

Footnote 2: Harris et al. (2020) develop a school-level score of pandemic responsiveness based on data from school/district websites representing 3,519 private, public, and charter schools, including elementary, middle, and high schools. They regress the score on school-level characteristics obtained from the National Longitudinal School Database (NLSD) and state fixed effects. The state responsiveness scores used in this paper come from the estimated coefficients shown in Table 9 of Harris et al. (July 2020). Information on exit exams comes from Ahn, Winters and Lee (October 2020).

Footnote 3: For ease of interpretation in the models, we subtracted the mean (as of 2019) from each of the following variables: county population density, COVID-19 death rate, school responsiveness score, and minimum school dropout age. For population density and minimum school dropout age, we de-mean using the sample-wide mean in 2019. For the COVID-19 death rate, we use the death rate from the week before the April 2020 CPS interview and assign this death rate to each county (regardless of year); we then de-mean it using the sample-wide mean of this death rate in 2019. Had we used the mean death rate from 2020 instead, we would have faced the problems that many months of 2020 are excluded from the analysis and that response rates were unusually low during the pandemic. We follow a similar procedure for the COVID-19 school responsiveness score: we merged the score into the CPS data by respondent's state (regardless of year) and then de-meaned it using the sample-wide mean score in 2019.

Column 1 in Table 2 shows estimates of Eq. (2), which indicate that school enrollment was 1.8 percentage points lower than usual (a 2 percent reduction at the sample mean) in April 2020 versus the same month in prior years. The magnitudes differ substantially across subgroups (column 2). First, we note that the pandemic had smaller detrimental effects on females, younger students, non-Latinos, and students from households with a college-educated member. Second, the findings show that students were more likely to stay enrolled in school if they were from states with higher-than-average school responsiveness to COVID-19, and if they were from states with high school graduation exit exams that were required and were not cancelled, holding other factors constant.
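The average-effect and heterogeneity specifications can be illustrated with a stripped-down simulation. The sketch below is hypothetical: it uses a single Group_i indicator (female), made-up magnitudes, and plain unweighted least squares, omitting the fixed effects, person weights, and two-way clustering used in the paper.

```python
# Illustrative sketch of Eqs. (2)-(3): an average Post effect (delta)
# and a Post x Group interaction (zeta). Simulated data, hypothetical names.
import numpy as np

rng = np.random.default_rng(1)
n = 40_000
post = rng.random(n) < 0.05        # April 2020 observations
female = rng.random(n) < 0.5       # example Group_i indicator

# Simulated truth: -3.0 pp drop for males, buffered by +1.5 pp for females
p = 0.96 - 0.030 * post + 0.015 * (post & female)
y = (rng.random(n) < p).astype(float)

# Eq. (2): regress y on Post only -> delta is the average April 2020 effect
X2 = np.column_stack([np.ones(n), post]).astype(float)
delta = np.linalg.lstsq(X2, y, rcond=None)[0][1]

# Eq. (3): add the group main effect and the Post x Group interaction
X3 = np.column_stack([np.ones(n), post, female, post & female]).astype(float)
b3 = np.linalg.lstsq(X3, y, rcond=None)[0]
print(f"delta (Eq. 2): {delta:+.4f}")
print(f"baseline Post effect: {b3[1]:+.4f}, zeta (female x Post): {b3[3]:+.4f}")
```

In Eq. (3) the coefficient on Post_t becomes the effect for the omitted baseline group, and ζ is the differential for the indicated subgroup; the paper applies this logic with the full set of subgroup indicators simultaneously.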
Compulsory school ages, having exit exams that were required but cancelled, death rates from COVID-19, and county population density did not interact with the effects of the pandemic on school enrollment. In particular, a one standard deviation (i.e., 1.7) increase in the school responsiveness score is associated with a 1.1 percentage point increase in school enrollment, and the difference between the estimated coefficients for exit exams required and held and exit exams required but cancelled is significant at the 1% level. Higher employment rates are associated with lower school enrollment, which is consistent with Ahn, Winters and Lee (October 2020). However, Ahn, Winters and Lee (October 2020) find that state education policy responses to the pandemic have no significant effects on high school completion after controlling for local employment changes, whereas column 2 shows that exit exams and greater school responsiveness improve enrollment even after controlling for local employment rates. This difference likely reflects that requiring students to pass exit exams to earn diplomas and greater school responsiveness are indicators of higher school standards, which are protective in keeping students engaged in school but do not necessarily lead to higher graduation rates. In column 3 of Table 2, we show findings from a model that includes only three interaction terms: school responsiveness score interacted with Post_t; exit exams required but cancelled interacted with Post_t; and exit exams required and held interacted with Post_t. The findings from this pared-down model help clarify the interpretation of these interaction effects, which are of considerable policy interest. From column 3, we see that, holding other factors constant, higher-than-average school responsiveness to the pandemic reduced the pandemic's detrimental impact on school enrollment.
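The de-meaning described in footnote 3 matters for reading these coefficients: centering a continuous moderator at its 2019 mean leaves the interaction slope unchanged but shifts the main Post coefficient so that it measures the pandemic effect at the mean moderator value rather than at zero. A minimal sketch, with hypothetical names and simulated data:

```python
# Centering a moderator: the interaction slope is unchanged, while the
# Post coefficient becomes the effect at the mean moderator value.
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
post = (rng.random(n) < 0.1).astype(float)
score = rng.normal(3.0, 1.7, n)    # e.g., a school responsiveness score

# Simulated truth: the pandemic effect varies with the moderator
y = (0.95 - 0.02 * post + 0.006 * post * score + 0.005 * score
     + rng.normal(0, 0.1, n))

def fit(mod):
    X = np.column_stack([np.ones(n), post, mod, post * mod])
    return np.linalg.lstsq(X, y, rcond=None)[0]

raw = fit(score)
centered = fit(score - score.mean())
print("interaction slope:", raw[3], centered[3])   # identical
print("Post coefficient :", raw[1], centered[1])   # shifts by slope * mean
```

This is why, with the centered score, the Post_t coefficient can be read as the April 2020 effect for a state at average responsiveness, and the interaction as the change in that effect per unit of the score.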
In addition, relative to states without exit exams, having exit exams that were not cancelled due to the pandemic eliminated the detrimental effect of the pandemic on enrollment, suggesting that these looming exams may have served as a powerful incentive for students to stay engaged. We found that exit exams that were cancelled are also protective for students, although this coefficient is not statistically significant and is smaller than the coefficient on exit exams that were held (the difference is significant at the 1% level).

Notes to Table 1: Data come from the CPS, 2010-2020. *** indicates that the difference between columns (6) and (3) is statistically significant at the 0.01 level. N = 320,045 for the entire sample; 86,641 for column (1); 45,064 for column (2); 7,078 for column (4); and 3,213 for column (5).

Notes to Table 2: The table shows estimated coefficients and standard errors from Eq. (2) in column 1 and Eq. (3) in columns 2-7. Estimates come from linear probability models with standard errors estimated with two-way clustering at the state and year-month level. Models also include state fixed effects, a linear trend in year, and the full set of interactions between calendar month and age (estimated coefficients and intercepts not shown in the table). "Post" is an indicator for the month of April in the year 2020. For ease of interpretation, we subtracted the sample-wide mean (as of 2019) from each of the following variables: population density, COVID-19 death rate, responsiveness score, and minimum school dropout age. *, **, and *** indicate statistical significance at the 0.10, 0.05, and 0.01 levels, respectively. The mean attendance rate is calculated for Jan-Feb of 2020.

In columns 4-5 and columns 6-7, we examine interactions among males vs. females and among students without and with a college-educated household member, respectively, holding other factors constant. Race/ethnicity and family education play different roles for male vs. female students.
Among females, Black race is associated with larger detrimental effects, but there is no difference by Black race among males, while being Latino exacerbates negative effects for males but not for females. In addition, having a college-educated household member buffers the negative effects of the pandemic for males, but not for females, as do higher-than-average compulsory school ages and required state exit examinations that were held. Higher population density and having required exams that were cancelled are protective for females, but not for males, holding other factors constant. In columns 6-7, the findings highlight differences in the effects of the pandemic on school enrollment by household education level. Female gender is protective for students in less-educated households, but gender does not matter for those living in more-educated households. Minority race/ethnicity exacerbates the detrimental effects of the pandemic for students living in households without a college graduate, but minority race/ethnicity is, if anything, protective for students living in households with a college graduate. Higher-than-average COVID-19 death rates are negatively associated with school enrollment for students in less-educated households, but death rates are positively associated with enrollment for students in more-educated households, controlling for other factors. School responsiveness protects students from disenrolling from school regardless of whether they have a college-educated household member, but state exit exam requirements (including exams that were cancelled and exams that were held) protect students living in less-educated households only (cols. 6-7). It is possible that having state requirements for exit exams is an indicator of high school standards, and, for more disadvantaged students, these higher standards (rather than the actual administration of the exams) protected them from the detrimental effects of the pandemic.
Similarly, higher population density buffers the effects of the pandemic for students living in less-educated households only. In addition, higher employment rates among 16-25-year-olds are negatively associated with enrollment for students living in less-educated households only, probably because these students face a higher risk of leaving school for employment opportunities during the pandemic. Our early evidence from the CPS suggests that the pandemic reduced school enrollment by about 2 percent during April 2020; the magnitudes of these effects varied widely, and they were larger among disadvantaged groups. While some "not enrolled" students may have dropped out of school, other students may respond that they are "not enrolled" if they are not attending remotely at the time of the CPS survey. Thus, we acknowledge that "not enrolled in high school" may capture lack of attendance/engagement in addition to high school dropout. We find that higher school responsiveness, state-level exit exams, and higher state-level minimum dropout ages buffer the detrimental effects of the pandemic on school enrollment, at least for some subgroups of students. These findings add to the growing literature documenting that state-level education policies often have broader, and sometimes unanticipated, effects on students. Higher state-level minimum dropout ages, for example, have been associated not only with reductions in dropout but also with reductions in crime and with displacement of crime (Anderson, Hansen and Walker, 2013; Angrist and Krueger, 1991). As the pandemic continues to disrupt daily life around the world, we need information about how policies can remediate the effects of the pandemic on adolescents. An important first step is to document effects on school enrollment during the pandemic in this age group.
Future research should address the longer-term effects of the pandemic on school engagement, on learning outcomes, and, ultimately, on college attendance and wages.

Declarations of interest: None.

Author contributions: Pinka Chatterji and Yue Li were involved in all aspects of the paper, with Yue Li leading the data analysis and Pinka Chatterji leading the writing of the paper.

References
Anderson, Hansen and Walker (2013). The minimum dropout age and school victimization.
Ahn, Winters and Lee (October 2020). Employment opportunities and high school completion during the COVID-19 recession.
Angrist and Krueger (1991). Does compulsory school attendance affect schooling and earnings?
CNN (July 15, 2020). Several big US school districts are extending remote classes into the fall.
Education Week (July 1, 2020a). The Coronavirus Spring: The historic closing of U.S. schools.
Education Week (July 28, 2020b). Map: Where are schools required to be open.
Gross and Opalka (June 10, 2020). Too many schools leave learning to chance during the pandemic.
Harris et al. (July 2020). How America's Schools Responded to the COVID Crisis.
Hamilton et al. (2020). COVID-19 and the State of K-12 Schools: Results and Technical Documentation from the Spring 2020 American Educator Panels COVID-19 Surveys.
Coronavirus infects surveys too: Nonresponse bias during the pandemic in the CPS ASEC.
Flood et al. (2020). Integrated Public Use Microdata Series.
Office of survey methods research.
The impact of the coronavirus (COVID-19) pandemic on the employment situation for