title: Patterns of smallpox mortality in London, England, over three centuries
authors: Krylova, Olga; Earn, David J. D.
date: 2020-12-21
journal: PLoS Biol
DOI: 10.1371/journal.pbio.3000506

Smallpox is unique among infectious diseases in the degree to which it devastated human populations, its long history of control interventions, and the fact that it has been successfully eradicated. Mortality from smallpox in London, England was carefully documented, weekly, for nearly 300 years, providing a rare and valuable source for the study of ecology and evolution of infectious disease. We describe and analyze smallpox mortality in London from 1664 to 1930. We digitized the weekly records published in the London Bills of Mortality (LBoM) and the Registrar General's Weekly Returns (RGWRs). We annotated the resulting time series with a sequence of historical events that might have influenced smallpox dynamics in London. We present a spectral analysis that reveals how periodicities in reported smallpox mortality changed over decades and centuries; many of these changes in epidemic patterns are correlated with changes in control interventions and public health policies. We also examine how the seasonality of reported smallpox mortality changed from the 17th to 20th centuries in London.

Smallpox was declared eradicated 40 years ago, in 1980 [1], after unparalleled devastation of human populations for many centuries [2, 3]. Until the 19th century, smallpox is thought to have accounted for more deaths than any other single infectious disease, even plague and cholera [2-7]. In the city of London, England alone, more than 320,000 people are recorded to have died from smallpox since 1664. Investigation of smallpox dynamics in the past is important not only for understanding the epidemiology of a disease that has been exceptionally important in human history but also in the context of the potential for its use as a bioterrorist agent in the future [8-12]. From the historical perspective, much can be learned by relating patterns of smallpox outbreaks to demographic changes and uptake of preventative measures [13-16] and to wars and other historical events [4]. More broadly, appreciation for how intentional interventions and coincident events might impact disease dynamics is important when attempting to control the spread of any infection, including new diseases such as Coronavirus Disease 2019 [17-19].

Previous work on smallpox dynamics in London has either been based on annual records [20-22] or been restricted to a few decades [23-25]. We present and examine 267 years of weekly records of smallpox mortality in London, beginning in 1664. The data span an early era before any public health practices were in place, the introduction of variolation and then vaccination, and then the decline of smallpox mortality until it became an extremely unusual cause of death. The temporal resolution in the data enables analysis of short-term fluctuations and seasonality, which are smoothed out by yearly data. Our statistical descriptions of the weekly smallpox data will help sharpen and quantify research questions concerning the mechanistic origin of changes in the temporal patterns of epidemics [13-16]. In addition, we present a timeline of major historical events that occurred during the epoch we have studied.
Overlaying the historical timeline with smallpox mortality and prevention patterns provides an illuminating view of three centuries of smallpox history.

Smallpox is an acute, highly contagious, and frequently fatal disease. The name "small-pox" was first used in England at the end of the 15th century to distinguish it from syphilis, which was known as "great-pox" ([2], pp. 22-29). The disease is caused by two variants of the Variola virus, which differ substantially in severity of symptoms and case fatality proportion (CFP) ([1], pp. 1-68; [26], pp. 525-527). Variola major, which was the only known smallpox type until the beginning of the 20th century, had a CFP of 5% to 25% and occasionally higher ([1], p. 4). Variola minor (also known as alastrim), which was first recognized in 1904 [2], was less virulent and had a CFP of about 1% or less. Variola minor was the only endemic type of smallpox present in England after 1920 ([2], pp. 8, 97; [1], p. 243). By 1935, transmission of smallpox within England appeared to have ended; further naturally acquired infections are believed to have arisen only from importation [2]. Among those who survived it, morbidity from smallpox was severe in many cases; victims could be left blind or disfigured for life [27].

In the pre-vaccination era, there were attempts to reduce morbidity and mortality from smallpox by a method initially known as inoculation, and later given the more specialized term variolation. This procedure involved deliberately infecting a healthy individual with smallpox virus taken from a pustule or dried scabs of a person suffering from smallpox [3].

Edward Jenner's discovery of a smallpox vaccine in 1796 [28] was a major milestone, not only for smallpox control but also for modern medicine more generally, as it inspired the development of vaccines for many other pathogens. Jenner's vaccine provided a safer, cheaper, and more effective alternative to variolation. His original method, which he called "vaccine inoculation" [28], was to inject a person with cowpox virus (Vaccinia, from the Latin vacca for cow).

The existence of a smallpox vaccine was the key factor that made eradication of the disease an achievable goal. Other important factors included the absence of asymptomatic infections, easy recognition of the disease from its symptoms, absence of an animal reservoir, and relatively low infectivity [29]. The World Health Organization launched its eradication campaign in 1967 [1, 30]. Ten years later, the world's last endemic smallpox case was registered in Somalia. In 1980, a Global Commission declared smallpox eradicated [31]. This was the first disease to be eradicated entirely by human efforts. The only remaining viral samples were stored in laboratories in Russia and the United States [11].

For most of the period we study, mortality records were based on the Bills of Mortality, which recorded burials, not all deaths. Later, with the creation of the office of the Registrar General in the mid-19th century, a formal system of registration of all deaths was developed and maintained. We accessed original documents in London, England, in the Guildhall Library, the British Library, the Wellcome Library, and the London Metropolitan Archive. We digitized weekly reported birth and death records for London throughout the period over which smallpox was listed as a cause of death (1661 to 1934). The last smallpox death reported in London was in the week beginning 17 February 1934.
The last year when more than one smallpox death was reported in a single week was 1930, so we do not present data or analyses after 1930 (in total, only 7 smallpox deaths were reported from 1931 to 1934). Until 1752, the New Year was celebrated on 25 March in England [33]. However, we always indicate years following the modern practice of defining New Year's Day to be 1 January. All data analyzed in this paper are shown in Fig 3.

The London Bills of Mortality (LBoM) included information about baptisms and church burials, categorized by cause of death (Fig 1). The accuracy of the LBoM has been considered by a number of historians and demographers [23, 37-42]. Shortcomings that are often highlighted include several data quality issues, all of which led to underreporting of mortality. Nevertheless, the data from the LBoM are "probably more complete and more accurate than any available elsewhere in England in that time" [37] and remain the most comprehensive summaries of baptisms and burials of the 17th and 18th centuries in London. While the bills are certainly not a perfect record of mortality, it is reasonable to assume-as we do in this paper-that the temporal pattern of smallpox burials quantified in the LBoM is roughly proportional to the true historical smallpox mortality in London. We implicitly assume that changes in the degree of underreporting of smallpox deaths were sufficiently slow that we can make inferences about seasonality of smallpox deaths and the frequency of epidemics. However, from the LBoM, we can estimate only a lower bound for the total burden of smallpox deaths in London between 1661 and 1841.

Burials are tabulated by cause in the LBoM, but the Bills predated formal classifications of disease, casting doubt on the reliability of the listed causes [37, 40, 44]. However, smallpox records are likely among the most accurate in the LBoM, due to the unique and easily identifiable presentation of the disease ([36], p. 530; [37, 40, 43]). Before 1701, smallpox was listed in the weekly LBoM under the heading "flox and smallpox". "Flox" is an older term that referred to a rare type of smallpox infection involving especially severe symptoms, including hemorrhaging, and high case fatality (approximately 96%) ([36], p. 436). After 1701, "smallpox" was used consistently.

Other disease names that occur in the bills and are considered by some to be associated with smallpox are "flux" and "bloody flux". Razzell [3] suggested that bloody flux was a name used for hemorrhagic smallpox and that it was considered a distinct disease ([3], p. 104). However, Creighton [36] described bloody flux as an old name for dysentery and not as something related to smallpox ([36], p. 774). Other historians also refer to "bloody flux" and "flux" as diseases not related to smallpox, but rather as names for dysentery and diarrhea, respectively. In any case, mortality from "bloody flux" and "flux" was negligible compared to smallpox (<5,000 deaths in total from "bloody flux" and "flux" compared to 322,219 total smallpox deaths). Consequently, even if they were truly smallpox deaths, including them would not significantly influence our findings. We therefore used the sum of "flox and smallpox" and "smallpox" records and did not include "bloody flux" and "flux" in our data.

By the end of the 18th century, it was evident that a better system of collection of vital statistics was needed in England.
The accuracy of the old system of parish records had been compromised by the rapid growth of London's population [48]. The Registrar General's Office was created in 1836 by the Births and Deaths Registration Act [49] to provide a comprehensive and accurate national registration system of births, marriages, and deaths, including causes of death ([50], p. 77). The new (national) system of civil registration began in 1837, and the first Registrar General's Weekly Return (RGWR) was published for the week ending 11 January 1840 ([51], p. 119). Unlike the LBoM, the new registration system included all deaths (not only burials), all sectors of the population (not only Anglicans), and greater geographical coverage. Initially, five additional parishes were included [39]. Other areas were added when the RGWRs were established, and this expanded area was used to define the metropolis in the 1841 census [52]. A thorough review of the quality of data in the Registrar General's Returns is provided by Hardy [44].

In 1841, 234 smallpox deaths were reported in the LBoM. In 1842, only 34 smallpox deaths were reported in the LBoM, whereas 357 were reported in the RGWR. We use the RGWR from January 1842 onwards.

From the middle of the 18th century, there was often a week late in the year with a remarkably high number of deaths reported in the LBoM. This phenomenon is well known ([53], p. 14) and resulted from backlogged records being submitted together (heaped), typically in early December in the last reporting week of the year. To address the problem of heaping, we examined each year from 1760 to 1841 and performed the following steps (illustrated in the sketch below):

• We considered the number of reported smallpox deaths from week 45 of the focal year to week 5 of the next year.
• We classified as a heap week any week from week 45 to week 5 in which reported smallpox mortality was more than three standard deviations from the median of these weeks.
• We verified by visual inspection that the automatically detected heap weeks were indeed unusual, and we looked for visually apparent heap weeks that were missed.
• We replaced the reported smallpox deaths in heap weeks with the average of the previous and following weeks.
• We calculated the difference between the original heaped count and the replaced value (the excess due to heaping) and redistributed this number of smallpox deaths in proportion to reported smallpox throughout the year (so the adjusted counts have noninteger values). This redistribution ensured that the original and revised time series contained the same annual numbers of smallpox deaths. (We separately considered redistributing the excess uniformly throughout the year and did not detect any differences in our results.)

The first year with a heap week identified in this way was 1768, and the last was 1841. Thirty-nine years with heap weeks were discovered by the three standard deviations criterion. Another 14 years with heap weeks were identified visually, giving a total of 51 years involving heaping. From 1793 to 1841, almost every year had a heap week.

The mortality bills for some weeks have been lost. Fortunately, all gaps are small (typically 1 to 5 weeks), with the largest gap being 9 weeks. We filled all gaps using linear interpolation, so there are no missing values in the final time series.
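To make the heap-week adjustment concrete, here is a minimal R sketch of the detection and redistribution steps. It assumes a data frame `smallpox` with columns `year`, `week`, and `deaths` in weekly order; these names are illustrative, not the published data format, and the visual-inspection step described above is not reproduced.

```r
adjust_heap_weeks <- function(smallpox, focal_year) {
  # Window: week 45 of the focal year through week 5 of the next year
  idx <- with(smallpox,
              (year == focal_year & week >= 45) |
              (year == focal_year + 1 & week <= 5))
  window <- smallpox$deaths[idx]

  # Heap weeks deviate from the window median by more than 3 standard deviations
  heap <- idx & abs(smallpox$deaths - median(window)) > 3 * sd(window)

  for (i in which(heap)) {
    original <- smallpox$deaths[i]
    # Replace the heaped count with the mean of the adjacent weeks
    replacement <- mean(smallpox$deaths[c(i - 1, i + 1)])
    excess <- original - replacement
    smallpox$deaths[i] <- replacement

    # Redistribute the excess across the focal year in proportion to reported deaths,
    # preserving the annual total (adjusted counts become noninteger).
    # This is a simplified version of the procedure described in the text.
    yr <- smallpox$year == focal_year
    smallpox$deaths[yr] <- smallpox$deaths[yr] +
      excess * smallpox$deaths[yr] / sum(smallpox$deaths[yr])
  }
  smallpox
}
```

Applying this function to each focal year from 1760 to 1841 reproduces the automatic part of the procedure; visual checks, as in the paper, would still be needed to catch heap weeks that the threshold misses.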
Annual summaries of mortality were published in London from 1629 onwards. Initially, these were annual Bills of Mortality, and later, they were annual summaries of the Registrar General's Returns. Creighton [36] tabulated annual smallpox mortality in London for the period 1629 to 1893 (based on annual Bills of Mortality until 1836 and on the Registrar General's Annual Returns for 1837 to 1893). In Fig 2, we compare these annual counts with annual aggregations of our weekly data. Differences between the two data sets are indicated with white stacked bars if the annual counts from Creighton are larger, and black stacked bars if our annual sums from the weekly data in this paper are larger. Before 1664, there are no weekly data, so the bars are entirely white. After 1893, we show only our annual sums and color them red. Annual smallpox mortality was greatest in 1871; our annual sum for that year is 7,982, and Creighton's table indicates 7,912. The top of this bar is cut off in Fig 2. In all other years, there were fewer than 4,000 smallpox deaths.

The data from the two sources align well for most years. The most substantial discrepancies are during the period of transition from the Bills of Mortality to the Registrar General's Returns (1837 to 1841); we summed weekly LBoM data for this period, whereas Creighton used the Annual Registrar General's Returns. Creighton notes that the annual count for 1837 includes only six months, not the full year ([36], p. 613), but even so, it exceeds our sum from the weekly bills (which is not surprising given the much larger geographical area covered by the Registrar General's Returns; see Transition to the Registrar General's Weekly Returns section).

Decennial censuses of London began in 1801 [54]. We estimated London's population at earlier times based on Finlay and Shearer [55] for the period up to 1700, and Landers [40], who provides decadal population estimates from 1735 to 1795 (see Table 1 and the top of Fig 3). Finlay and Shearer [55] estimated the total population of London for every half century from 1500 until 1700 based on information about the population north and south of the river Thames, taking into account inaccuracies in data collection. Their book also contains data for other years for the population north ([55], Table 2) and south ([55], Table 5) of the river that were not included in their summary. Population numbers for some years, for the south of the river, were missing (years highlighted with an asterisk in Table 1); we used linear interpolation to estimate the population south of the river for these missing years. Final total population estimates were calculated by the same method as in [55], i.e., adding the populations north and south of the river and multiplying the sum by an inflation factor (1.1, except 1.2 in 1650 and 1660) to account for inaccuracy in data reporting. (In Table 1, entries with an asterisk were estimated by linear interpolation from [55], Tables 2 and 5.) Note that Finlay and Shearer's estimate for 1650 appears to be inaccurate due to an arithmetic error (according to their own figures, it should be approximately 400,000, not 375,000).
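As a concrete illustration of this reconstruction, the short R sketch below applies the linear interpolation and inflation factors described above. The population figures in it are placeholders, not Finlay and Shearer's actual values.

```r
# Hypothetical example values; the real north/south counts come from [55], Tables 2 and 5.
years     <- c(1600, 1650, 1660, 1700)
pop_north <- c(180000, 350000, 380000, 490000)   # population north of the Thames
pop_south <- c(20000, NA, NA, 85000)             # NA = missing south-of-river years

# Linear interpolation for the missing south-of-the-river years (asterisked in Table 1)
pop_south <- approx(years[!is.na(pop_south)], pop_south[!is.na(pop_south)], xout = years)$y

# Inflation factor for reporting inaccuracy: 1.2 in 1650 and 1660, 1.1 otherwise
inflation <- ifelse(years %in% c(1650, 1660), 1.2, 1.1)
total_population <- round((pop_north + pop_south) * inflation)
```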
Over the 267 years that we study, London underwent major demographic and social changes, and there were a variety of historical events that may have had substantial impacts on smallpox dynamics. The introduction of smallpox control measures (variolation and later vaccination) would be expected to influence smallpox dynamics. Other events that could potentially have impacted smallpox epidemics include wars [56, 57] and the Industrial Revolution, which was accompanied by urbanization and demographic transitions [58-61]. We annotate the smallpox time series in Fig 3 with the major events and developments that we describe below.

The first recorded outbreak of smallpox in England is dated to 12

Lady Mary Wortley Montagu (an influential writer and poet [62]) is credited with introducing variolation to Great Britain [1-3, 30, 63]. She had her daughter professionally variolated in London in April 1721 (this year is annotated with "Introduction of variolation" in Fig 3). For the next two decades, variolation occurred but was not popular. Only 857 people were variolated in the whole of Great Britain from 1721 to 1727 and only 37 in 1728 [36]. The variolation level for this period is consequently indicated as "Very low" in Fig 3. In the decade following 1728, the frequency of variolation is unknown, but we assume it was between the very low level before 1728 and the moderate level later when it became a more common practice (in Fig 3, the period 1728 to 1740 is indicated as "Low" variolation levels). English medical practitioners implemented variolation very crudely with deep incisions that caused severe symptoms, morbidity, and high mortality of up to 2% ([64], p. 464) [65]. Variolation became more popular in the 1740s (indicated by "Moderate" uptake levels in Fig 3).

Until 1762, variolation was usually preceded by four to six weeks of preparation, which included purging, bleeding, and a restricted diet with limited quantities of food. Variolation was followed by an isolation period of two or more weeks. Isolated individuals were placed in purpose-built inoculation houses ([1]; [3], p. 255). The preparation period was shortened after Robert Sutton's improvement of variolation using light incisions in 1762 (annotated in Fig 3). Sutton's new method dramatically decreased the severity of symptoms and death and reduced the cost. Consequently, it became more common to offer variolation to the poor population free of charge. Sutton's variolation technique spread quickly around England and became very popular in rural areas. When a new epidemic appeared to be highly probable, "general variolation" of entire villages and communities was performed.

In large towns and cities, the situation was quite different [25]. In London, the use of variolation was irregular, and attempts to perform "general variolation" were sporadic and rare. Poor Londoners were variolated only through Smallpox Charities, which performed public variolation in batches, separately for males and females, 8 to 12 times a year ([36], p. 506). The charities did not admit children under 7 years of age, despite the fact that the vast majority of smallpox cases at that time were in infants and young children ([36], p. 507).

The full extent of variolation in London after the Suttonian innovation is unclear. The figures from London Hospital show that the number of variolated individuals increased dramatically from 29 in 1750, to 653 in 1767, and 1,084 in 1768 ([36], p. 506). Based on a variety of historical reports, Razzell concluded that variolation gained considerable popularity in London at the turn of the 19th century ([3], p. 72). From these qualitative descriptions, it seems likely that uptake of variolation increased after 1768 and reached a maximum during 1790 to 1808 [3, 69] (annotated in Fig 3).
During the Industrial Revolution (annotated as 1760 to 1830 [70] in Fig 3), a "growing number of people moved to urban centres" [71] such as London, precipitating significant demographic and social changes [38, 58-61]. London's population more than doubled from about 730,000 in 1765 to about 1,900,000 in 1831 (see Table 1 and section on London's population). If this increase in population size was associated with an increase in population density, and/or an increase in the average number of contacts people had, then the rate of transmission of smallpox would have increased and affected epidemic patterns [72]. Transmission dynamics would also have been affected by changes in fertility [13, 15], which may [38, 60] or may not [58] have increased during the Industrial Revolution. It has also been proposed that smallpox transmission increased as a result of viral evolution during this period [23-25].

The idea of vaccination was initially met with skepticism by the scientific and medical communities [1, 2]. Jenner "was advised not to send a record of his observations to the Royal Society, which was prepared to refuse it, but to publish it as a pamphlet; and as a pamphlet it appeared in 1798" ([73], p. 62). Unlike variolation, vaccination came with relatively little risk to the vaccinee, no preparatory period, and much lower cost. Consequently, in spite of the initial skepticism, vaccination was adopted by the public more quickly and more widely than variolation ever was [1]. There were initially many impediments associated with ineffective vaccine distribution and storage, shortage of cowpox virus, inadequate vaccine efficacy, waning immunity (hence a need for periodic revaccination), and religious and philosophical objections. These challenges were overcome over the course of the 19th century, and by the turn of the 20th century, vaccine uptake had risen sufficiently to cause a dramatic decline in smallpox mortality (Fig 3).

Unfortunately, quantitative reports of early smallpox vaccine uptake are sorely lacking. Vaccinations were poorly recorded until the end of the 19th century. Available data are incomplete, uncertain, and inconsistent. For example, the figures from the London Smallpox and Inoculation Hospital show the percentage of vaccinated patients admitted to the hospital increasing steadily from 32% in 1825 to 73% in 1856 [74], whereas the Royal Commission on Vaccination found that only 25% of newborns were vaccinated by 1820 and about 70% in some parishes by 1840 [35]. Mooney [75] states that during 1854 to 1856, the percentage of vaccinated infants might have ranged from 28% to 81% based on one source, but that another source indicates that infant vaccination rates for London during the period 1845 to 1890 were much lower than the national average and never increased above 500 per 1,000 live births (i.e., 50%). There appear to be no surviving records concerning vaccinations of older age groups during this period; however, it is hypothesized that many adults escaped vaccination ([74], p. 117). In Fig 3, we indicate the qualitative pattern of change in vaccine uptake.

A number of additional developments and policy changes during the 19th century are indicated in the timeline in Fig 3. State involvement in the control of smallpox in England began with the foundation of the National Vaccine Establishment in 1808. It provided free vaccination at its London stations and distributed vaccine to other parts of England [76].
Around this time, the London Smallpox and Inoculation Hospital ceased variolation and began vaccination in greater numbers. Smallpox outbreaks over the next decade were very mild, possibly resulting from increased vaccination; however, this was precisely the period of the Napoleonic Wars (1803 to 1815), during which other infectious diseases were also less prevalent than usual in England ([36], p. 569).

A strikingly large smallpox epidemic occurred in London from 1837 to 1838 and exploded into a Europe-wide pandemic [2]. The authorities in England realized that some radical measures had to be taken, which led to the first Vaccination Act of 1840, providing vaccination free of charge and banning variolation. It was followed by the Vaccination Act of 1853, which made vaccination of every child during the first four months of life compulsory. The Vaccination Act of 1867 introduced penalties for not complying with compulsory vaccination.

The Franco-Prussian war, which began in late July 1870, is believed to have initiated the worst pandemic of smallpox in all of Europe in the 19th century. It resulted in at least 500,000 deaths. England alone lost more than 40,000 people. Thanks to compulsory vaccination, fatality rates in England were three times lower than in Prussia, Austria, and Belgium ([2], pp. 87-91). The immediate response of the English government to this devastating pandemic was the Vaccination Act of 1871, which enforced very strict control (through the courts) of the implementation of the previous Acts.

In the second half of the 19th century, many vaccine-related challenges were resolved. Arm-to-arm vaccination (vaccine transfer from the infectious pustule of a vaccinated individual to a nonvaccinated individual [1]) was the main method of vaccine distribution in the beginning of the century. It was dangerous because it could transmit other diseases such as syphilis and was consequently outlawed in 1898 [2]. It was replaced by the new technique of passing cowpox from cow to cow. The new method of vaccine distribution was first introduced in Naples, Italy, in 1843 [77, 78]. However, it arrived in England only in 1881. Another important discovery was made in 1891 by Monckton Copeman [78], who demonstrated that adding glycerine to smallpox vaccine reduces bacterial contamination, making it more efficacious and reliable.

The last large outbreak of Variola major occurred in London from 1901 to 1902 (Fig 3) and was probably seeded from another country [74]. After 1902, only very small outbreaks occurred, with very low incidence and very few deaths. In 1967, the World Health Organization launched its global smallpox eradication campaign. In 1980, smallpox was certified as the first infectious disease to be eradicated by human efforts [27].

We used a variety of methods to elucidate the patterns in the London smallpox time series. Spectral analyses and a visualization of seasonality allow us to reveal important characteristics of the data that cannot be gleaned by inspection of the raw time plot (Fig 3). Due to population growth and changes in city boundaries, the size of the population from which smallpox deaths were reported for London changed over the course of the time series. In addition, sampling of deaths in the LBoM was less complete in the last few decades before the establishment of the RGWR.
Sampling deficiencies presumably affected deaths from all causes in the same way; consequently, we attempted to obtain a consistent normalization of weekly smallpox deaths by dividing by the trend of weekly all-cause mortality (ACM) in London, rather than estimated population size. To calculate the trend in ACM, we used Empirical Mode Decomposition (EMD), which is designed to identify trends in nonlinear and highly nonstationary time series [79-81]. EMD was developed to overcome the drawbacks of moving averages, other linear filters, or linear regression, which often perform poorly on nonstationary data. EMD decomposes a signal into several components with a well-defined instantaneous frequency via intrinsic mode functions (IMFs). IMFs are basically zero-mean oscillation modes present in the data: the first IMF captures the high frequency (shorter period) oscillations, while all subsequent IMFs have lower average frequency (longer period). Each IMF is extracted recursively, starting from the original time series, until there are no more oscillations in the residue. The last residual component of this process can be considered an estimate of the trend [80].

We used spectral analyses to identify the strongest periodicities in the smallpox time series, both globally (with a traditional Fourier analysis) and locally (via wavelet analysis). Before computing spectra, we normalized, square-root transformed, and detrended the data in order to reduce variation in amplitude without affecting periodicities [82, 83].

Classical power spectrum. We computed the standard power spectral density [14, 84-86] of the entire smallpox time series to obtain a global estimate of its frequency content. We used the R [87] function spec.pgram with no taper (taper = 0) and a standard modified Daniell smoother (kernel = kernel("modified.daniell", c(3,3))).

Wavelet spectrum. Because infectious disease time series are typically highly nonstationary, wavelet analysis has become increasingly popular in epidemiological research [16, 82, 83, 88-90]. We computed a wavelet transform [91, 92] of the smallpox time series in order to examine how smallpox periodicities changed over the course of the three centuries. A wavelet transform is computed with respect to a basic shape function, the analyzing wavelet, which has a scale parameter that controls its width. Narrower (wider) scales correspond to higher (lower) frequency modes in the time series. At any given time and scale, a stronger correlation between the analyzing wavelet and the signal yields a larger value of the wavelet transform. We obtain a complete (two-dimensional) time-frequency representation of the data by convolving the analyzing wavelet-at each scale-with the original time series. We used the Morlet wavelet [92] as the analyzing wavelet to obtain the wavelet transform of the smallpox time series.

The standard computation of the wavelet spectrum requires that the number of points in the time series be a power of 2. Consequently, we pad the ends of the time series with zeros to bring the number of time points in the data to the nearest power of 2. The resulting artificial discontinuity where the zero-padding begins reduces the accuracy of the wavelet transform at the ends of the time series. Regions of lower accuracy are identified by the cone of influence. Data outside this cone should be interpreted with caution. Ninety-five percent confidence regions are computed based on 1,000 Markov bootstrapped time series [83, 89].
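To make the normalization and classical spectrum steps concrete, here is a minimal R sketch. It assumes weekly vectors smallpox_deaths and acm_deaths of equal length (illustrative names), and it uses the EMD package as one possible implementation of empirical mode decomposition; the original analysis may have used different code, and the wavelet spectrum itself was computed with MATLAB code, so no R equivalent is shown.

```r
library(EMD)  # one possible EMD implementation; an assumption, not named by the authors

# Trend of weekly all-cause mortality: the residue left after all IMFs are extracted
acm_trend <- emd(acm_deaths)$residue

# Normalize weekly smallpox deaths by the ACM trend (rather than by population size)
smallpox_norm <- smallpox_deaths / acm_trend

# Square-root transform, then detrend (again taking the EMD residue as the trend)
x <- sqrt(smallpox_norm)
x_detrended <- x - emd(x)$residue

# Classical power spectrum with the settings quoted in the text:
# no taper and a modified Daniell smoother with spans c(3, 3).
# frequency = 52 treats a year as 52 weeks, an approximation.
spec <- spec.pgram(ts(x_detrended, frequency = 52),
                   taper = 0,
                   kernel = kernel("modified.daniell", c(3, 3)),
                   plot = FALSE)

# Periods (in years) ranked by spectral power
periods_years <- 1 / spec$freq
head(periods_years[order(spec$spec, decreasing = TRUE)])
```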
An indication of underlying seasonality in a time series is the occurrence of a spectral peak at a period of one year. However, this crude measure suppresses the detailed seasonal pattern and, in particular, does not reveal the times of year when peaks or troughs occur. Following Tien and colleagues [7], we visualized the evolving seasonal pattern of smallpox dynamics with a heat map in the time-of-year versus year plane (we refer to this as a seasonal heat map). Before constructing the seasonal heat map, we square-root transform and detrend the normalized smallpox time series (because we have found that seasonality is represented most clearly after this transformation). Detrending has the effect of shifting the local mean at a given time toward the global mean of the entire time series. Consequently, the last few decades of the smallpox time series, which have a true mean close to zero, are represented by "medium heat" in the seasonal heat map because the true zeros are shifted to the global mean.

To further clarify changes in smallpox seasonality, we separately identified the peak week of each epidemic by visual inspection of the normalized time series, and displayed all peak weeks in the time-of-year versus year plane. To assist with identification of trends, we also computed a moving average of the week-of-the-year in which an epidemic peak occurred (using a circular average that considered week 53 and week 1 to be the same). Our moving average was calculated with a 21-epidemic window, i.e., each plotted point is the average of the previous 10, current, and next 10 epidemic peak weeks (peaks did not necessarily occur every year).
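The circular moving average can be computed as in the following R sketch, where peak_weeks is assumed to be a vector of week-of-year values (1 to 53) for successive epidemic peaks in temporal order; the variable names are illustrative.

```r
# Circular mean of week-of-year values, treating week 53 and week 1 as adjacent.
# Returns a continuous week-of-year in [1, 54).
circular_mean_week <- function(w, weeks_per_year = 53) {
  theta <- 2 * pi * (w - 1) / weeks_per_year           # map weeks onto the unit circle
  mean_angle <- atan2(mean(sin(theta)), mean(cos(theta)))
  ((mean_angle / (2 * pi)) %% 1) * weeks_per_year + 1  # convert back to week-of-year
}

# 21-epidemic moving average: previous 10, current, and next 10 peak weeks
# (windows at the ends of the series are simply truncated in this sketch)
smoothed_peak_week <- sapply(seq_along(peak_weeks), function(i) {
  window <- peak_weeks[max(1, i - 10):min(length(peak_weeks), i + 10)]
  circular_mean_week(window)
})
```

Averaging on the circle avoids the artifact of an ordinary mean, which would place the average of a week-52 peak and a week-2 peak near midsummer rather than near the turn of the year.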
The time plot of the raw data (Fig 3, top panel) displays substantial changes in the structure, amplitude, and frequency of smallpox epidemics over time. However, some of the apparent changes are misleading because they do not account for population growth and inconsistency of data sources. For example, the epidemic of 1871 appears to be the largest, but it was not the largest relative to population size or relative to ACM. In contrast, the epidemic of 1838, which is frequently mentioned in the literature [2, 36], is not easily identifiable in the raw data, likely because it occurred just at the time of transition between the LBoM and the RGWR.

(Figure: weekly ACM in London, 1661 to 1930.)

From the earliest times in the series, there were recurrent epidemics. Four large outbreaks in 1667, 1674, 1681, and 1694 stand out in the normalized time series, and an epidemic of this magnitude did not occur again until 1752. The epidemic pattern before about 1705 appears to have been less stationary than the pattern in the subsequent decades (the frequency and amplitude of outbreaks were more consistent from 1705 to 1750). The years from 1770 to 1810 were characterized by stricter regularity of epidemics. This period coincided with more common variolation (the practice gained popularity after the Suttonian innovation of 1768). Beginning around 1810, the data show a dramatic reduction in the amplitude of epidemics, though outbreaks were more frequent and the data are noisier. The declining trend in epidemic severity is temporally associated with the introduction of vaccination; unfortunately, this was precisely the period over which the parish registration system collapsed, increasing the difficulty of estimating the true impact of vaccination in the early vaccine era.

After 1835, interepidemic intervals increased and (normalized) epidemic peak heights declined, with the exception of four large epidemics in 1837, 1871, 1876, and 1902. During this period, variolation was eliminated and vaccination levels increased. The coincidence of the 1840 Vaccination Act (which made variolation illegal) with the radical change in data collection from the LBoM to the RGWR limits our ability to identify potential causal links between policy and behavioral changes and epidemic patterns.

The bottom panel of Fig 3 also shows the EMD-computed trend of normalized smallpox mortality in London over the centuries. As a proportion of ACM, smallpox mortality rose steadily from 1664 until 1680 and then fell until about 1700. After this drop, the trend generally increased (with a shallow dip around 1740) until approximately 1770. After 1770, the trend declined gradually until 1908, when smallpox deaths were nearly eliminated in London.

After 1840, epidemics became more regular, with a longer interepidemic period of 3 to 4 years. After the exceptionally large epidemic of 1871 (there were 10,618 reported smallpox deaths from 1 January 1870 to 31 December 1872), subsequent smallpox outbreaks were much smaller and irregular. The last substantial outbreak was in 1902, after which there were very few smallpox deaths reported.

Classical power spectrum. Fig 5 shows the periodogram (power spectrum as a function of period) for the full smallpox time series. A strong peak at one year suggests underlying seasonality of epidemics. Other peaks (near 2.2, 2.4, 3, 5.1, and 6 years) suggest more complex dynamical patterns.

Wavelet spectrum. Fig 6B shows the wavelet transform of the normalized London smallpox time series. Colors indicate the strength of signal at given periods (blue meaning weak and red meaning strong). The cone of influence and 95% confidence contours are shown in black. The black curves through the bright areas of Fig 6B indicate the periods with greatest power at each time point. From 1664 to about 1720, multiple spectral modes are prominent (initially periods near 2, 3, and 5 years, then near 3 and 5 years only from 1690 to about 1705, and then near 2 years only until about 1728). There is a weak signal between 2 and 3 years from 1728 to 1740, and then a strong signal near 2 years (1740 to about 1765) and near 3 years (about 1770 to 1808). A relatively weak spectral peak at one year can be seen over much of the time series before 1820, though its magnitude is below the threshold for drawing a black peak line except for the decade from 1798 to 1808.

From about 1808 to about 1832, the time series is markedly noisier, displays much lower amplitude fluctuations, and much weaker spectral signals. The decay of the parish registration system may have contributed to this temporal pattern, though the clear recurrent epidemic pattern that emerges about 1832-well before the beginning of the RGWR era-suggests that there were genuine changes in smallpox dynamics at the end of the LBoM era, not just a change in sampling.

Reported smallpox mortality was lowest in the spring until about 1840, and in the fall/winter afterwards (these are the bluest regions in Fig 6C). The time series is generally very noisy during the troughs between epidemics, and there is rarely a clear minimum week. Around 1750, there was a shift in the typical timing of outbreak peaks from summer/fall to fall/winter. This change is clearest in the red moving average in Fig 6D.
Epidemic patterns might have become less seasonal during the period from 1808 to 1840 (the range of "temperature" is narrower in the heat map in this segment of Fig 6C). After 1840, as epidemics became less frequent, they peaked in the winter/spring.

We have digitized and analyzed what is, to our knowledge, the longest existing weekly time series of infectious disease mortality. The time span of the data, from 1664 to 1930, covers an extraordinary period in London, England, during which smallpox changed from a terrifying and unavoidable danger to an easily preventable infection.

A number of previous studies of smallpox in London are complementary to the analysis presented here. In the 1990s, Duncan and colleagues [21, 93, 94] studied annual smallpox mortality in London, from 1647 to 1893, and estimated interepidemic intervals in five segments of the time series ([21], Table 1) (these results are reproduced in Fig 7). These authors attributed changes in interepidemic intervals to population growth, malnutrition associated with changes in wheat prices, and seasonal variations in temperature and humidity. Cliff and colleagues ([48], p. 101) have also estimated interepidemic intervals for smallpox in London using traditional time series analysis (also reproduced in Fig 7). For comparison, in the lower panel of Fig 7, we show the interepidemic intervals inferred directly from the time between adjacent peaks in the smallpox time series. More recently, Davenport and colleagues [23-25] have examined individual, age-specific death records in a large area of London over a period of several decades (1752 to 1805). These authors discovered that, after 1770, smallpox mortality declined in adults and rose in children. They suggested that this change in the age distribution of smallpox mortality might have been a consequence of evolution of increased transmissibility of the Variola virus.

Digitization of the full weekly record of mortality from smallpox (and from all causes) in London has enabled us to conduct a number of informative analyses that reveal changes in the seasonal structure of smallpox epidemics over the centuries. The wavelet spectrum (Fig 6B) provides a more detailed view of the periodic structure of smallpox than has been accessible previously. In addition, our seasonal heat map (Fig 6C) and epidemic peak time visualization (Fig 6D) show how the seasonal timing of outbreaks varied over many decades. The annual data that have been studied previously smooth out the seasonal patterns that we have identified and restrict the resolution of spectral features. Even the classical power spectrum of the weekly data (Fig 5) reveals noninteger periods that cannot be detected from annual counts, and the wavelet spectrum (Fig 6B) shows gradual changes in periodic structure. In Fig 7, we compare the spectral peaks from our wavelet spectrum with the interepidemic intervals previously estimated from annual data.

The primary role of vaccination in the ultimate elimination of smallpox mortality in London is not in question. However, the causes of the various transitions in the spectral and seasonal structure of smallpox dynamics over the decades and centuries are not clear. The historical timelines of events and uptake of interventions, shown alongside the smallpox data and analyses in Fig 3, should aid in formulating mechanistic hypotheses, especially in relation to the history of control strategies.
(Figure caption, Fig 6: The red dots indicate the epidemic peaks, i.e., the highest value of normalized smallpox mortality during the epidemic year (identified by visual inspection). Panel B: wavelet transform of the normalized weekly smallpox mortality time series (square root-transformed and normalized to unit variance). Colors range from dark blue for low power to dark red for high power. Heavy black curves show the local maxima of wavelet power (squared modulus of wavelet coefficients [89]) at each time. Thin black curves show 95% confidence contours, estimated from 1,000 bootstrapped time series generated by the method of [89]. Medium black curves near the left and right edges show the cone of influence [89, 92], below which the calculation of wavelet power is less accurate because it includes edges of the time series that have been zero-padded to make the number of time points a power of 2. The wavelet spectrum was computed using MATLAB code kindly provided by Bernard Cazelles.)

Previous work has shown that long-term changes in birth rates and vaccination levels have induced transitions in transmission dynamics of other infectious diseases [13-16]. Time series of additional historical and environmental variables are also important to consider. Economic factors, such as wheat prices [21], might have affected transmission and/or mortality rates. Weather changes, especially in humidity and temperature, may also have influenced seasonality of smallpox [21, 40], and climatic changes (see Fig 1 of [95]) could have influenced periodic dynamical structure over longer timescales.

More information about the age, social, and spatial structures of smallpox mortality would be extremely valuable. A case in point is the discovery by Davenport and colleagues [23-25] that, around 1770, smallpox burials declined in adults and rose in the very young. This shift in age structure of mortality seems likely to have been driven by a corresponding shift in the age structure of infections. The most obvious potential cause of a trend toward younger age at infection is an increase in the transmission rate [72]. Davenport and colleagues suggest that greater transmission may have resulted from "a sudden increase in infectiousness of the smallpox virus" ([23], p. 1289). Alternatively, the virus could have evolved to yield a longer infectious period. The plausibility of unusual viral evolution around 1770 is supported by recent molecular genetic analyses, which found that "diversification of major viral lineages only [occurred] within the 18th and 19th centuries, concomitant with the development of modern vaccination" [96]. It is certainly possible that evolutionary pressures resulting from changes in control strategies could have selected for increased transmission [97, 98]. Of course, a variety of other factors could also have contributed to a rise in the transmission rate. In particular, more common variolation could have promoted transmission (even if it reduced mortality), and population density might have risen during the Industrial Revolution, as mentioned above.

Any or all of the sociodemographic, behavioral, and environmental factors mentioned above might have contributed in some way to the dramatic changes in the time of year when smallpox mortality peaked over the centuries (Fig 6C and 6D). Recurrence of outbreaks does not necessarily imply that transmission dynamics are seasonally driven.
However, the wavelet spectrum in Fig 6B shows a peak at one year from the beginning of the time series until the early 19th century, and the strong spectral peak at exactly one year in Fig 5 suggests underlying exogenous forcing by strictly seasonal variables (e.g., weather, holidays, or migrations during harvests). If smallpox transmission was seasonally forced, then the implications for dynamical interpretations of the time series are significant. Seasonal forcing of infectious disease transmission can stimulate persistence of complex epidemic cycles [99] and chaos [100]. The mechanistic origin of seasonal forcing may [101] or may not [102] be easy to determine. In the case of smallpox, previous work [1, 23, 103] found that, in temperate climates, the majority of smallpox cases occurred in the winter and spring, whereas in tropical climates, the seasonality was less pronounced. The general conclusion was that smallpox incidence always increases when the weather is cool and dry; this belief influenced the planning of the eradication campaign in India and seemed to help to improve its efficiency [1]. Previous smallpox seasonality studies were mainly based on data from the 19th and the 20th centuries, when preventative measures were already common [1, 103]. Our data set is of particular interest in this respect since it includes a period when only naturally acquired smallpox immunity existed.

Our goal in this paper has been to describe-and make publicly available-the weekly time series of smallpox mortality in London, and to present information on a variety of historical factors that might have influenced smallpox dynamics over the centuries. The annotated data (Fig 3) will help to frame hypotheses about the mechanisms that were responsible for the observed smallpox mortality patterns, including transitions from one type of pattern to another. Our spectral and seasonal analyses (Figs 5-7) quantify transitions in smallpox dynamics that should be possible to explain using mechanistic mathematical models [72, 86, 104, 105]. Dynamical transitions in measles [13, 16] and other childhood infections [14] during the 20th century have been successfully explained with mechanistic models, using observed changes in susceptible recruitment (through changing birth rates and vaccination levels) to predict changes in the frequency structure of epidemic patterns [13-16, 86]. For historical smallpox in London, we do have relevant birth data (shown in the top panel of Fig 3), but only qualitative information about vaccination levels (and earlier variolation levels, the effects of which are not entirely clear). Moreover, the time series is so long that the underlying pattern of seasonal forcing probably changed substantially. Changes in seasonal forcing can be accommodated in predictions [16, 106], and estimation of underlying seasonal variation in transmission is becoming easier [107]; however, obtaining a convincing mechanistic explanation for all the structure we have identified in London's smallpox dynamics represents a major challenge. Ultimately, we anticipate that meeting this challenge will involve further developments of the existing theory of transitions in epidemic patterns [13-16, 106], coupled with state-of-the-art methods of inference to obtain parameter estimates for dynamical models [108-112].

"The greatest killer" [2] has not circulated since its eradication in 1980.
While smallpox research in recent years has understandably tended to focus on the potential for accidental or intentional reintroduction in the future, it is enlightening to look back in time. The long history of documenting smallpox mortality in London provides an extraordinary opportunity to learn from the past about changing patterns in infectious disease transmission. Much remains to be understood about the data we have presented in this paper. From an ecological perspective, the key challenges are to explain the observed smallpox dynamics in London as consequences of intrinsic nonlinear interactions, influenced by identifiable extrinsic forces. Better control naturally led to less smallpox mortality over time. However, how interventions influence the frequency structure and seasonality of epidemic time series over decades and centuries is much more subtle [13-15]. While preliminary work has been promising [113], careful estimation [107, 114] and analysis [15, 106] of patterns of seasonal forcing will be required in order to use mechanistic models reliably to explain [16] the observed transitions in smallpox transmission dynamics.

References

World Health Organization, Geneva
The greatest killer: smallpox in history
The conquest of smallpox: the impact of inoculation on smallpox mortality in eighteenth century Britain. Caliban Books, 13 The Dock
Plagues and peoples. Anchor
The history of England from the accession of James the Second
Acceleration of plague outbreaks in the second pandemic
Herald waves of cholera in nineteenth century London
Emergency response to a smallpox attack: the case for mass vaccination
Group interest versus self-interest in smallpox vaccination policy
Game theory of pre-emptive vaccination before bioterrorism or accidental release of smallpox
The spectre of smallpox lingers
The development and approval of tecovirimat (TPOXX®), the first antiviral against smallpox
A simple model for complex dynamical transitions in epidemics
Transients and attractors in epidemics
Effects of the infectious period distribution on predicted transitions in childhood disease dynamics
A century of transitions in New York City's measles dynamics
Early transmission dynamics in Wuhan, China, of novel coronavirus-infected pneumonia
Covid-19 - Navigating the Uncharted
Facial Masking for Covid-19 - Potential for "Variolation" as We Await a Vaccine
The dynamics of smallpox epidemics in Britain, 1550-1800. Demography
Oscillatory dynamics of smallpox and the impact of vaccination
Modelling the dynamics of smallpox outbreaks in London
The decline of adult smallpox in eighteenth-century London
Inner London and Outer London: population and density history
London 1500-1700: The Making of the Metropolis. Longman Group Limited
Deadly comrades: war and infectious diseases
War epidemics: an historical geography of infectious diseases in military conflict and civil strife
The Industrial Revolution: past and future
The industrial revolution and the demographic transition
British Industrial Revolution, 1760-1860. World Economy History
Nutrition, population growth, and the Industrial Revolution in England
Lady Mary Wortley Montagu: comet of the enlightenment
The prevention and eradication of smallpox: a commentary on Sloane (1755) 'An account of inoculation'
The smallpox story: life and death of an old disease
Risks of smallpox vaccination
Smallpox: the triumph over the most terrible of the ministers of death
Scourge: the once and future threat of smallpox
Bioinformatics for vaccinology
The decline of adult smallpox in eighteenth-century London: A commentary
Encyclopaedia Britannica. Industrial Revolution
The British Museum. The Industrial Revolution and the changing face of Britain
Infectious diseases of humans: dynamics and control
The life of Edward Jenner MD, FRS: naturalist, and discoverer of vaccination
Smallpox in London: factors in the decline of the disease in the nineteenth century
"A tissue of the most flagrant anomalies": Smallpox vaccination and the centralization of sanitary administration in nineteenth-century London
Vaccination policy against smallpox, 1835-1914: a comparison of England with Prussia and Imperial Germany. The Society for the Social History of Medicine
The XIX century smallpox prevention in Naples and the risk of transmission of human blood-related pathogens
Development of smallpox vaccine in England in the eighteenth and nineteenth centuries
Decomposition of functions into pairs of intrinsic mode functions
On the trend, detrending, and variability of nonlinear and nonstationary time series
The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis
Regional-scale climate-variability synchrony of cholera epidemics in West Africa
Time-dependent spectral analysis of epidemiological time-series with wavelets
The analysis of time series: an introduction
Time series analysis and its applications
Mathematical epidemiology of infectious diseases
R: a language and environment for statistical computing
Travelling waves and spatial hierarchies in measles epidemics
Wavelet analysis of ecological time series
The role of mathematical models in explaining recurrent outbreaks of infectious childhood diseases
The illustrated wavelet transform handbook
A practical guide to wavelet analysis
Modelling the different smallpox epidemics in England
Smallpox epidemics in cities in Britain
History and climate: memories of the future?
17th century variola virus reveals the recent history of smallpox
Imperfect vaccination: some epidemiological and evolutionary consequences
The evolutionary epidemiology of vaccination
Infinite subharmonic bifurcation in an SEIR epidemic model
Chaos versus noisy periodicity: alternative hypotheses for childhood epidemics
Recurrent outbreaks of measles, chickenpox and mumps: I. Seasonal variation in contact rates
Dynamical resonance can account for seasonality of influenza epidemics
Smallpox and season: Reanalysis of historical data. Interdisciplinary Perspectives on Infectious Diseases
Essai d'une nouvelle analyse de la mortalité causée par la petite vérole et des avantages de l'inoculation pour la prévenir
An attempt at a new analysis of the mortality caused by smallpox and of the advantages of inoculation to prevent it. 1766
Invariant predictions of epidemic patterns from radically different forms of seasonal forcing
Fast estimation of time-varying infectious disease transmission rates
Time series analysis via mechanistic models
Mechanistic modelling of the three waves of the 1918 influenza pandemic
Inferring the causes of the three waves of the 1918 influenza pandemic in England and Wales
Statistical Inference for Partially Observed Markov Processes via the R Package pomp
Long-term dynamics of measles in London: Titrating the impact of wars, the 1918 pandemic, and vaccination
Predicting epidemiological transitions in infectious disease dynamics: Smallpox in historic London (1664-1930)
Parameterizing state-space models for infectious disease dynamics by generalized profiling: measles in Ontario

Acknowledgments

The data were photographed and/or entered primarily by Kelly Hancock, Claire Lees, James McDonald, Laxmi Pandit, and David Richardson. Valerie Hart facilitated work at the Guildhall Library, City of London. Bernard Cazelles shared his code for computing wavelet spectra. We thank all members of the Mathematical Biology Group at McMaster University for helpful comments and discussions.