title: Discrete Poisson hardcore 1D model and reinfections
authors: Cherednik, Ivan
date: 2022-02-18

Abstract. Modeling reinfections in epidemics appeared of importance during the recent stages of the Covid-19 pandemic (2021-22); they were frequent, especially due to the Delta and Omicron strains. The classical Poisson distribution describes reinfections without the impact of immunity. However, immunity is important here, and a hardcore lattice version of the Poisson distribution is needed, a discrete variant of the so-called Matérn II process in 1D. Combinatorially, segments of one or several different lengths (protective immunity intervals) are placed in a bigger segment (the epidemic cycle) with spaces between them. The role of the edges is significant: the duration of immunity is comparable with that of the epidemic cycles. We provide exact formulas for the corresponding distributions, the lattice one and its continuous limit.

1. Introduction. Modeling reinfections and recurrences of diseases is an obvious challenge, but this was of limited importance for the epidemics before Covid-19. Generally, the Poisson distribution is expected here if the immunity and the duration of disease are disregarded. However, immunity is very important for reinfections. For instance, reinfections are rare for short-term or seasonal epidemics (under the same strain) exactly because the period of natural immunity, or that due to vaccinations, is quite comparable with the duration of the cycle. There have not been many papers on modeling reinfections; see e.g. [ADDP], which was SIR-based. We note that SIR-type modeling generally proved to be insufficient for Covid-19; see [Ch1, Ch2].

Hardcore point processes. The corresponding mathematical tool is the theory of Poisson hardcore point processes, more specifically, the Matérn II process in dimension one and its lattice variant. There are quite a few processes where the distances between neighboring objects must be greater than some constant. The usual examples are forestry, ecology, vehicular networks, cellular networks, etc. Also, see [Gi] for the Tonks gas. In statistical physics, "small systems" are of this type, those far from thermodynamic equilibrium. In such and similar examples, the intervals between objects are mostly assumed small vs. the domains where they are considered. Accordingly, the edge effects are mostly ignored. This is different for epidemics. There is a vast literature on the Matérn processes I, II, III, though mostly in 2D; see e.g. [KD] on vehicular networks and references there. Modeling vehicular networks, which are clearly 1D processes, is somewhat similar to modeling reinfections: the interval between cars is a counterpart of the immunity intervals. It is not surprising that some of the expressions in our continuous model are similar to those in the moment measures in [KD]; see e.g. formula (1) there. Though our continuous truncated Poisson distribution from (6) seems new. The duration of immunity is generally comparable with the duration of epidemic cycles. Accordingly, we focus on configurations of subsegments of size comparable with that of the segment where they are considered. The continuous limit is when $\lim_{N\to\infty} L/N = \nu > 0$, where L is the immunity duration, and N is the total number of days of the epidemic cycle.
The corresponding distribution has finitely many states, namely $\lceil N/L\rceil$ (in both the discrete and continuous variants), so it is some truncation of the classical Poisson one. Furthermore, we need a lattice (discrete) variant of this model because the assumption that the distribution of reinfections depends only on L/N can be too approximate. For instance, this does not hold if the number of reinfections is close to N/L or if the chances to be infected are relatively high. Generally, when the hardcore objects are relatively "dense" in the considered domain, a lattice model can be necessary. This paper is mainly written in the lattice setting; we obtain the continuous distribution as a limit of our lattice one. The lattice model and the exact calculation of the corresponding distribution are our first step; this requires only basic combinatorics. Due to the edge effect, the formula contains a summation of L+1 binomial coefficients, where L is the immunity duration. We make it as explicit as possible in Corollary 3.3. To perform the continuous limit $\lim_{N\to\infty} L/N = \nu > 0$, we transform this formula into one whose number of terms depends only on the number of infections r. The latter number remains the same in the limit, namely $0 \le r \le \lceil 1/\nu\rceil$ for $\nu > 0$. This results in the distribution from (6), which works especially well when $r \ll 1/\nu$.

Reinfections (Covid-19). The duration of the Covid-19 epidemic (from late 2019) is already beyond 2 years; we are still in its 1st cycle. This epidemic was practically uninterrupted except for minor breaks between the waves (mostly during summer periods). Due to the unusually large number of strains of Covid-19, all with very high transmissibility, neither the natural immunity nor the immunity due to vaccinations lasted too long. For instance, those infected by the "wild strain" (the G-strain dominated in Europe in early 2020) could be reinfected by Alpha, then by Delta (B.1.617.2 and AY lineages), and then by Omicron (B.1.1.529 and BA lineages). Due to such variability, the average immunity durations were limited: presumably about 5-8 months. Being infected by the same strain twice, or simultaneously by 2 of them, was statistically insignificant.

The statistics of reinfections and recurrences for Covid-19 are not very reliable. Some countries reported only the total number of (known) infected individuals, not the total number of detected infections. For instance, this was the case with England until January 31, 2022; the data on Covid-19 in England are generally among the most systematic. According to the UK Health Security Agency (UKHSA), the share of detected reinfections was about 10% in early 2022. In quite a few countries, it was significantly greater than this, and reinfections were present well before 2022. We note that the available data are mostly for the detected and symptomatic cases, though massive testing began at the end of 2021 in quite a few countries (though not too many!). Cases of double reinfections (3 Covid-19 infections) were detected; they may be not too rare if asymptomatic cases are taken into consideration. We provide in this paper a general method, which can be naturally extended to any number of parallel infections with different immunity durations. There is only one constraint: the chances to have more than one disease at the same time must be almost zero. It is really very rare for someone to be infected by two parallel strains of the same virus at the same time.
One of the key factors for different immunity intervals is that those vaccinated, or with natural immunity against the earlier strain, generally have a weaker protection against the later one; this was the case with Delta and Omicron, and with the Omicron lineages BA.1 and BA.2. Though there are factors working in the opposite direction (some universal immunity).

Main hypotheses. Our 1st hypothesis is that people are exposed to the infection uniformly during the cycle of the epidemic, which is N days in the paper, with some constant probability β per day. According to [Ch1, Ch2], the curves of the total number of detected infections in very many countries (all we considered) are essentially of Bessel type for phase 1 and of linear type for phase 2. Actually, phase 1 has a relatively long period around the turning point when the curve is not far from linear as well. The uniformity assumption is perfectly applicable to the periods of linear growth of the total number of infections. Moreover, the number of consecutive waves in many countries was about 3-6, which provides another reason to assume that the spread of Covid-19 was statistically linear. Generally, the Law of Large Numbers (LLN) is always a rationale for the uniformity assumption for such averages; 3-6 waves are sufficient for this. Not much will change statistically in the chances of reinfections if we try to incorporate the nonlinear (Bessel-type) parts of the curves. The 2nd hypothesis is that the impact of the vaccinations and the (significant) number of undetected and asymptomatic cases can be addressed via diminishing the susceptible population of a country. The size of the population does not directly appear in the formulas. The vaccinations generally decrease β, but the strains of Covid-19 increased their transmissibility during the epidemic. The 3rd hypothesis is a minor one: we disregard the duration of the disease; it is simply added to the immunity interval. We think that all 3 assumptions are quite reasonable for the last 2 years of Covid-19. So the challenge is to provide the distribution of the reinfections based on them, and then adjust it to the real data.

Truncated Poisson distribution. We define and calculate the probabilities $\pi_r$ of (exactly) r infections during the total period. They depend on the duration of the epidemic, N days, the immunity interval L, and the probability β to be exposed to the infection during 1 day. Later, we set $\beta = \alpha/N$ for some parameter α. We note that the counting can be restricted to symptomatic and reported cases only, which does not change our analysis. We then assume that consecutive symptomatic cases in one individual must be separated by at least L days. The parameters α (equivalently β) and L can be determined if the total number of noninfected people (during the whole cycle) and the number of those with exactly 1 infection are known. In the absence of immunity, α can be estimated as the total number of infected people divided by the size of the population in the area. However, the immunity is of obvious importance for reinfections. Generally, one can proceed as follows. If the number of noninfected people (no disease during the whole period of the epidemic) is known, then $e^{-\alpha}$ is approximately this number divided by the population of the country. This is as for the classical Poisson distribution; indeed, the immunity is a factor only for people infected at least once. Similarly, we can (approximately) find L using our formula for the continuous limit $\pi'_1$ of the probability $\pi_1$ of exactly one infection during N days.
For the classical Poisson distribution (no immunity), it is $\alpha e^{-\alpha}$. This is different from our formula, where L is significant. So we can assume that $\beta = \alpha/N$ and L are known. Then $\pi_2$ and $\pi_3$ are the probabilities of 2 and 3 infections during N days. We obtain the formulas for any $\pi'_r$, but higher $\pi'_r$ are not likely to occur for epidemics. Evaluating the parameters α, L via $\pi_0, \pi_1$ for Covid-19 is approximate by now; the available reinfection data are insufficient. We consider throughout the paper 3 model situations: N = 750 (about 25 months), L = 150 (about 5 months), and the 3 values $\alpha = 1, \log(2), 0.5$ in (4). For these values of α, about 37%, 50%, 60% of the susceptible population remain noninfected during 25 months. The 3rd case basically matches the number of reinfections for Covid-19 reported in England (until 02/2022), taking into consideration many undetected (asymptomatic) cases and the impact of vaccination programs. The latter increase L and reduce the size of the susceptible population.

Conclusion. The famous Poisson distribution is a straightforward limit of a simple distribution in terms of binomial coefficients, its lattice variant. This is one of the fundamental links between combinatorics and statistics. Our basic distribution $\{\pi_r\}$ for the probabilities of r infections is a sum of (L+1) binomial coefficients, where L is the immunity duration. We transform it to one where the number of terms depends only on r, the number of infections. The latter is used to calculate the limits $\pi'_r$ of $\pi_r$ as $N\to\infty$ and $\lim_{N\to\infty} L/N = \nu > 0$; the justification we provide is interesting. We give other formulas for $\pi_r$ and those for the corresponding generating functions. In spite of the 1D setup, there can be various applications of the distributions we obtain, not only for reinfections. Almost all networks have refractoriness: excited agents cannot be immediately re-excited. Vehicular networks and trading equities in stock markets are typical examples. Though we focus in this paper on networks with a relatively small number of possible states, when $\nu = L/N$ cannot be assumed negligible, which is obviously the case with reinfections. Our truncated Poisson distributions can be applicable to other 1D hardcore Poisson-type processes: random configurations of disjoint subsegments of length L in a segment of length N under the Matérn II assumption, where the last L-segment can go beyond the N-segment. The subsegments can be of different types (with their own lengths and probabilities), which links our paper to stochastic processes. As far as we know, our "truncated" distributions $\{\pi_r\}$ and $\{\pi'_r\}$ are new, as well as their application to reinfections in epidemics. Though there is of course a long history of hardcore point models and related combinatorics. They are natural truncations of the Poisson distribution with $r \le \lceil N/L\rceil$. The edge effects are important: we allow one of the L-subsegments to go beyond the right endpoint of a given N-segment. For epidemics, this assumption is necessary. Stock markets are such too: there can be open positions after the end of the considered period. Mathematically, the corresponding sum of all probabilities will not be 1 without the edge effects.
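As an illustration of the parameter evaluation discussed above (α from the share of noninfected people, and then ν = L/N from the share of those infected exactly once), here is a minimal numerical sketch in Python. It is not taken from the paper: the function names are ours, and the expression used for $\pi'_1$ is derived directly from the model's continuous limit under the hypotheses above; it agrees with the values of $\pi_1$ quoted below (e.g. about 0.44 for α = 1, ν = 0.2 and about 0.33 for α = 0.5, ν = 0.2), but it is our reading of the model, not a quotation of formula (6).

import math

def pi1_limit(alpha, nu):
    """Continuous-limit probability of exactly one infection, derived from the
    model: the single infection occurs at some time t in [0, 1] (exposure
    intensity alpha), and no further infection happens once the immunity
    interval of length nu ends.  (Our sketch of the limit of pi_1.)"""
    a, v = alpha, min(nu, 1.0)
    return a * (1 - v) * math.exp(-a * (1 - v)) + math.exp(-a * (1 - v)) - math.exp(-a)

def fit_alpha_nu(p0_observed, p1_observed):
    """Recover alpha from pi_0 = exp(-alpha), then nu from pi_1 by bisection
    (pi'_1 is nondecreasing in nu for fixed alpha)."""
    alpha = -math.log(p0_observed)
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if pi1_limit(alpha, mid) < p1_observed:
            lo = mid      # a larger nu gives a larger pi_1
        else:
            hi = mid
    return alpha, (lo + hi) / 2

# The 3rd model case below: pi_0 ~ 0.61 and pi_1 ~ 0.33 should return
# alpha ~ 0.5 and nu ~ 0.2, i.e. L ~ 150 for N = 750.
print(fit_alpha_nu(0.61, 0.33))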
2. Hardcore Poisson-type processes. If the immunity factor is omitted, the distribution of reinfections is as follows. Assume that an epidemic lasts N days and $\beta = \alpha/N$ is the probability to be infected during one day. Then the probability to be infected r times during N days and its continuous limit are given by the classical Poisson distribution and its combinatorial counterpart. Namely:

$$p_r = \binom{N}{r}\,\beta^r (1-\beta)^{N-r}, \qquad p'_r = \lim_{N\to\infty} p_r = \frac{\alpha^r}{r!}\, e^{-\alpha} \ \ \text{for}\ \beta=\alpha/N, \qquad (1)$$

where e is the Euler number.

Three basic examples. Let α = 1, α = log(2) ≈ 0.69, α = 0.5. Then, approximately,

$$p'_0\approx 0.37,\ p'_1\approx 0.37,\ p'_2\approx 0.18,\ p'_3\approx 0.06\ \ \text{for}\ \alpha=1,$$
$$p'_0\approx 0.50,\ p'_1\approx 0.35,\ p'_2\approx 0.12,\ p'_3\approx 0.03\ \ \text{for}\ \alpha=\log(2), \qquad (2)$$
$$p'_0\approx 0.61,\ p'_1\approx 0.30,\ p'_2\approx 0.08,\ p'_3\approx 0.01\ \ \text{for}\ \alpha=0.5.$$

The corresponding values for the combinatorial $p_r$ are about the same.

Adding immunity. Assume that an individual infected at day x cannot be infected again on days $x+1, x+2, \dots, x+L$, i.e. $L < N$ is the duration of the immunity interval. Let $\pi_r$ be the probability of r infections during N days for $r = 0, 1, \dots$. If $1\le x_1 < x_2 < \dots < x_r \le N$ are the infection days, then $x_1 < x_2 - L < x_3 - 2L < \dots < x_r - (r-1)L$ and there are 2 cases: (a) $x_r + L \le N$, and, otherwise, (b) $x_r + L > N$. Here the $x_i$ are the actual infections, when the disease begins. The potential infections are the days when an individual was exposed to the infection, which is assumed to happen with probability β. Due to the immunity, not all of the exposures result in an actual infection (disease). Any number of potential infections can occur (anywhere) during the periods $x_i+1, \dots, x_i+L$ for $1\le i < r$ and during the end period $x_r+1, \dots, \min\{x_r+L, N\}$. This means that these periods can be removed from the consideration when counting the probabilities. Switching from $x_i$ to $y_i = x_i - (i-1)L$, case (a) is counted by the binomial coefficient $\binom{N-Lr}{r}$; it is 0 if $r < 0$ or when $N < Lr + r$. We obtain the following straightforward formula:

$$\pi_r \,=\, \binom{N-Lr}{r}\beta^r(1-\beta)^{N-r-Lr} \,+\, \sum_{j=1}^{L}\binom{N-Lr+j-1}{r-1}\beta^r(1-\beta)^{N-r-Lr+j}. \qquad (3)$$

To give an example, for L = 1:

$$\pi_r = \binom{N-r}{r}\beta^r(1-\beta)^{N-2r} + \binom{N-r}{r-1}\beta^r(1-\beta)^{N-2r+1}.$$

Here, and for any fixed L, $\lim_{N\to\infty} \pi_r = p'_r$, where the $p'_r$ are from (1) with $\beta = \alpha/N$. One has $\sum_{r=0}^{\infty}\pi_r = 1$, where $r \le \frac{N+L}{L+1}$ are sufficient in this sum. This is some combinatorial identity, which immediately follows from the definition of $\pi_r$. Obviously, $\pi_0 = p_0$ for any N, L.

For our three basic examples above, we will take N = 750, L = 150. Then ν = 0.2 and

$$\pi_0\approx 0.37,\ \pi_1\approx 0.44,\ \pi_2\approx 0.17,\ \pi_3\approx 0.02\ \ \text{for}\ \alpha=1,$$
$$\pi_0\approx 0.50,\ \pi_1\approx 0.39,\ \pi_2\approx 0.10,\ \pi_3\approx 0.01\ \ \text{for}\ \alpha=\log(2), \qquad (4)$$
$$\pi_0\approx 0.61,\ \pi_1\approx 0.33,\ \pi_2\approx 0.06,\ \pi_3\approx 0.004\ \ \text{for}\ \alpha=0.5.$$

The change is not dramatic vs. (2) since α and ν are relatively small. For instance, $\pi_1\approx 0.51$ if ν = 0.4 for α = 1 (with the same $\pi_0\approx 0.37$).

Let us rewrite the formula for $\pi_r$ without the L-summation. The number of terms in this formula depends only on r (not on L), which is the key when considering its limit as $L, N\to\infty$.

Continuous limit. Let us provide the first 4 cases of (5). Here $N\ge L$, $N\ge 2L$ and $N\ge 3L$ correspondingly; $x \overset{\text{def}}{=} 1-\beta$. Generally, $r\le \frac{N}{L}$. Recall that $\pi_r > 0$ for $N\ge L(r-1)+r$, i.e. for $r\le \frac{N+L}{L+1}$. For instance, $\pi_1 = 1-(1-\beta)^N$ for any L such that $L > N$. Setting $\beta \overset{\text{def}}{=} \alpha/N$ and $\lim_{N\to\infty} L/N = \nu \ge 0$, the limits $\pi'_r = \lim_{N\to\infty}\pi_r$ for r = 0, 1, 2 are as follows. We assume here and below that $r\nu\le 1$. Generally, the last value of r here is $r^\flat = \lfloor \frac{1}{\nu}\rfloor$, where $\lfloor x\rfloor$ is the integer part of x. The initial inequality $r\le\frac{N+L}{L+1}$ gives that the last nonzero $\pi'_r$ is for $r^\sharp = \lceil N/L\rceil$, which is $r^\flat+1$ if $\frac{1}{\nu}$ is not an integer, and $r^\flat$ otherwise. Indeed, $\frac{N}{L}+1 > \frac{N+L}{L+1} > \frac{N}{L}$. One has the corresponding additional formula when $r^\sharp = r^\flat+1$. The positivity of $\pi'_r$ for $0\le r\le r^\flat+1$ can be readily seen from these formulas, though it of course follows from the origin of $\pi'_r$. Here we calculated $\pi'_{r^\flat+1}$ directly from the definition. Alternatively, it can be obtained from the identity $\sum_{r=0}^{r^\sharp}\pi'_r = 1$. This sum is obviously 1 (the telescoping summation), which holds a priori because $\sum_{r=0}^{r^\sharp}\pi_r = 1$, which is due to the definition of $\pi_r$.
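Before stating the limit theorem, we note that the lattice distribution can also be evaluated numerically straight from the model description (daily exposure probability β; after each infection, L days of immunity, with the last immunity interval allowed to run past day N), which gives an independent check of (3) and of the values in (4). The following Python sketch is ours, not code from the paper; it conditions on the day of the first infection in the remaining window.

from functools import lru_cache
from math import log

def reinfection_probs(N, L, beta, r_max=5):
    """pi_r for r = 0..r_max: probability of exactly r infections in N days,
    with daily exposure probability beta and an immunity interval of L days
    after each infection (the last interval may extend beyond day N)."""
    @lru_cache(maxsize=None)
    def f(n, r):
        # exactly r infections in a window of n days starting susceptible
        if n <= 0:
            return 1.0 if r == 0 else 0.0
        if r == 0:
            return (1 - beta) ** n          # no exposure becomes an infection
        # condition on the day d of the first infection in the window;
        # the following L days are immune and are simply skipped
        return sum((1 - beta) ** (d - 1) * beta * f(n - d - L, r - 1)
                   for d in range(1, n + 1))
    return [f(N, r) for r in range(r_max + 1)]

N, L = 750, 150
for alpha in (1.0, log(2), 0.5):
    print(alpha, [round(p, 3) for p in reinfection_probs(N, L, alpha / N)[:4]])
# expected, as in (4): ~0.37 0.44 0.17 0.02; ~0.50 0.39 0.10 0.01; ~0.61 0.33 0.06 0.004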
We arrive at the following theorem.

Theorem 2.2. Assume that $N\to\infty$, $\beta = \alpha/N$, $\lim_{N\to\infty} L/N = \nu$ for some $\alpha > 0$ and $0\le\nu\le 1$. Then formula (6) holds for any $0\le r\le r^\flat$, as well as the additional formula for $r^\sharp = r^\flat+1$ when $\frac{1}{\nu}$ is not an integer, where $\pi'_{r^\sharp}$ is as above.

Proof. We can simplify $F_r$ from (5), considered in the limit, using the formal differentiation $B$ of the ring generated by $X^{N-sL}$ and $1/(1-X)$, treated as independent symbols. This differentiation is $\beta\, d/dX$ with the simplifications due to taking the limit. Namely, we replace $\binom{N-sL}{r}$ by $\frac{N^r}{r!}$ for $s\le r$, due to $r\ll N-sL$. Also, we replace $X^{N-sL\pm p}$ by $X^{N-sL}$ for $0\le p\le r$, because these powers will be finally evaluated at $X = 1-\beta = 1-\frac{\alpha}{N}\to 1$. We obtain that

$$\pi'_r = \frac{\alpha^r (1-r\nu)^r}{r!}\, e^{(r\nu-1)\alpha} + \frac{1}{(r-1)!}\,\Phi'_r,$$

where $\Phi'_r$ is the limit of $\beta\Phi_r$ after the evaluation $1-X\to\beta$, $X^{N-Ls}\to e^{\alpha(s\nu-1)}$. This gives formula (6). The calculation is similar for $\pi'_{r^\sharp}$.

The ν-dependence of $\pi'_r$ is interesting. For instance, since $\pi'_0$ does not depend on ν, $\pi'_1$ increases if ν increases. Indeed, the chances of reinfections (counted by $\pi'_r$ for $r\ge 2$) diminish. Similarly, $\pi'_{r^\sharp}$ decreases if present (if $1/\nu$ is not an integer). Generally, we have the following straightforward corollary.

Corollary 2.3. Let $\delta_r = r\alpha\, e^{(r\nu-1)\alpha}\,\frac{\alpha^r(1-r\nu)^r}{r!}$ for $0\le r\le r^\flat$, and $\delta_{-1} = \delta_{r^\sharp} = 0$. Then $d\pi'_r/d\nu = \delta_r - \delta_{r-1}$ for $0\le r\le r^\sharp$. In particular, $d\pi'_r/d\nu \ge 0$ for $1\le r\le \frac{1}{\nu}$ if and only if $\nu\alpha\, e^{\nu\alpha} \ge (r-1)(1+z)^{r-1} z$ for $z = \frac{\nu}{1-r\nu}$. Otherwise, this derivative is negative.

Practically, triple reinfections (r = 4) are hardly possible for one cycle of any epidemic. Though there can be other random processes of this kind where large r make sense. The distribution in (6) is some quantization of the Poisson distribution, where $\nu\to 0$ is the quasiclassical limit. Accordingly, (3) is its "quantization" with 2 parameters, L and N. One more parameter can be added to (3) by switching to the q-binomial coefficients there, which we will not discuss.

As a demonstration, let us try to apply these formulas to the Covid-19 data from England. "As of 31 January (2022), updated figures for England show 14845382 episodes of infection since the start of the pandemic with 588114 (4.0%) reinfections covering the whole pandemic." So, approximately 14845382 − 588114 = 14257268 people were detected to be infected at least once. Let us assume conditionally that about 35M were involved in collecting the data; the population of England is about 57M. Our approach can be applied if only detected cases and reinfections are taken into account; however, α depends on the number of all infections, including the asymptomatic and undetected ones. Technically, we diminish 57M to 35M, but this can be done directly for α (to make $\pi_0, \pi_1$ match the reported numbers). As in the 3 basic cases, we take N = 750 and L = 150. Then $\alpha\approx 0.5$; indeed, $\pi'_0 = e^{-0.5}\approx 0.6\approx 1 - 14/35$. This is basically the 3rd case in (4): $\pi_1\approx 0.33$, $\pi_2\approx 0.06$. Qualitatively, 0.06 matches the data from UKHSA: about 0.04 for $\pi_2$ (until January 31, 2022).
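The continuous limit can be checked independently by simulating the underlying hardcore process: exposures form a Poisson process of intensity α on the unit interval, and an exposure becomes an infection only if it occurs more than ν after the previous infection (Matérn II-type thinning). The Python sketch below is ours, an illustration under this reading of the model rather than code from the paper; for α = 0.5 and ν = 0.2 it reproduces approximately the values 0.61, 0.33, 0.06 used in the England discussion above.

import numpy as np

def simulate_limit_distribution(alpha, nu, trials=200_000, seed=0):
    """Empirical distribution of the number of infections in the continuous
    limit: exposures ~ Poisson process of intensity alpha on [0, 1]; an
    exposure becomes an infection only if the previous infection happened
    more than nu earlier (immunity as hardcore thinning)."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(8, dtype=np.int64)
    for _ in range(trials):
        exposures = np.sort(rng.uniform(0.0, 1.0, size=rng.poisson(alpha)))
        last, r = -np.inf, 0
        for t in exposures:
            if t > last + nu:            # susceptible again: infection occurs
                last, r = t, r + 1
        counts[min(r, 7)] += 1
    return counts / trials

print(np.round(simulate_limit_distribution(alpha=0.5, nu=0.2)[:4], 3))
# roughly 0.61, 0.33, 0.06, 0.004, matching the 3rd case of (4)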
3. Generating functions. We will provide the generating function for $\pi_r(N, L)$ and show the dependence of $\pi_r$ on N, L. Let $G^\bullet_N$ be the weighted number of tilings of the segment with N boxes by (L+1)-minos, i.e. sequences of L+1 consecutive boxes, and by 1-minos, where each 1-mino contributes the weight $1-\beta$ and each (L+1)-mino contributes $\beta u$; we fix L here and below. This is the classical tiling problem; its variant in a 2D square lattice with dominos and monominos (dimers and monomers) is important in statistical physics. Though there are no exact 2D formulas in the presence of 1-minos. Here we count the tilings with the weights as above. When all the weights are set equal to 1 and L = 1, $G^\bullet_N = f_{N+1}$ for the Fibonacci numbers $f_N$. Generally, $G^\bullet_N$ satisfies a simple recurrence; for instance, $G^\bullet_{L+1} = (1-\beta)^{L+1} + \beta u$. Using the standard facts in the theory of generating functions or a straightforward consideration, $G^\bullet(t, u) = \sum_N G^\bullet_N t^N = \frac{1}{1-(1-\beta)t-\beta u\, t^{L+1}}$. Due to formula (3), $G_N = \sum_{r=0}^{\infty}\pi_r(N, L)\,u^r$ satisfies the same recurrence as $G^\bullet_N$, but with different initial conditions. Namely, $G(t, u) = (1 + \beta u t + \beta u t^2 + \dots + \beta u t^L)\, G^\bullet(t, u)$. Finally, this gives a closed formula for $G(t, u)$. For u = 0, it reduces to the generating function of $\pi_0(N, L) = (1-\beta)^N$, which we know without any calculations. For u = 1:

$$\sum_{N=0}^{\infty}\Big(\sum_{r=0}^{\infty}\pi_r(N, L)\Big)\, t^N = \frac{1}{1-t},$$

which gives a combinatorial proof of the identities $\sum_{r=0}^{\infty}\pi_r(N, L) = 1$ for any N, L.

Explicit formulas. The theorem readily gives the coefficients of $u^r$ in $G(t, u)$. Performing the differentiation, we obtain the following "telescopic-type" presentation of $\pi_r$. This immediately gives that $\sum_{r=0}^{\infty}\pi_r(N, L) = 1$ for any N, L. One can use this corollary to make the formulas for $\pi_r$ quite explicit: directly expressed in terms of the binomial coefficients. The sums there can be calculated using the standard combinatorial identities.

Two processes or more. It is quite possible that several strains (point processes) are present simultaneously. They generally come with different intervals L and probabilities β. For instance, when the Delta strain and the Omicron strain overlapped for some time, the available vaccines were significantly more efficient against the former than against the latter. Statistically, this alone makes the immunity durations different. The transmissibilities were different too (greater for Omicron). Generally, vaccinations (a) make the chances to be infected smaller, and (b) increase the immunity intervals for those vaccinated if they are infected. Correspondingly, they are reflected in β, L and α, ν (for sufficiently large N). As we discussed above, the chances to have no infection during the whole cycle and the chances of being infected exactly 1 time must be known for this. These chances depend on various factors, but the corresponding L = νN can potentially be used to evaluate the efficacy of different vaccines in different countries.

Let $\beta_0, \beta_1, \beta_2$ be the probabilities of being noninfected during 1 day, of being infected by strain 1, and of being infected by strain 2, assuming that simultaneous infections by 1 and 2 are very rare. The corresponding immunity intervals after the infections will be $L_1, L_2$. Let $\pi_{r_1,r_2}(N, L_1, L_2)$ be the probability to have $r_1$ cases of strain 1 and $r_2$ of strain 2. Accordingly, we need to calculate the generating function $G = \sum_{N=0}^{\infty} G_N t^N$, where $G_N = \sum_{r_1,r_2=0}^{\infty} u_1^{r_1} u_2^{r_2}\, \pi_{r_1,r_2}(N, L_1, L_2)$. Similar to the above consideration, the basic combinatorial problem is now to count the number of coverings of an N-segment by non-overlapping $L_1$-subsegments, $L_2$-subsegments, and 1-subsegments (monomers). Here one of the subsegments is allowed to go beyond the second endpoint of the N-segment. Then $G_N$ satisfies the recurrence relation $G_N = \beta_0 G_{N-1} + \beta_1 u_1 G_{N-1-L_1} + \beta_2 u_2 G_{N-1-L_2}$, and we obtain

$$G(t, u_1, u_2) = \frac{1+\beta_1 u_1(t+\dots+t^{L_1}) + \beta_2 u_2(t+\dots+t^{L_2})}{1-\beta_0 t - \beta_1 u_1 t^{L_1+1} - \beta_2 u_2 t^{L_2+1}}.$$

The latter formula can be readily extended to any number of simultaneous strains, though 1 or 2 of them seem mostly sufficient for epidemics; for $\beta_0 = 1-\sum_i\beta_i$ in the natural notation, the same formula holds with the corresponding sums over i in the numerator and the denominator. Using that the dependence on $u_1, u_2$ is linear in the numerator and in the denominator, it is not difficult to perform the necessary u-differentiations and calculate the generating functions with fixed $r_1, r_2$.
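The probabilities $\pi_{r_1,r_2}(N, L_1, L_2)$ can also be evaluated numerically directly from the covering description above: each day a susceptible individual remains noninfected with probability $\beta_0 = 1-\beta_1-\beta_2$ or is infected by strain i with probability $\beta_i$, after which an immunity interval of $L_i$ days blocks any infection (the last interval may run past day N). The Python sketch below is ours; the parameter values at the end are purely illustrative, not taken from the paper.

from functools import lru_cache

def two_strain_probs(N, L1, L2, beta1, beta2, r_max=3):
    """pi_{r1,r2}(N, L1, L2): probability of exactly r1 infections by strain 1
    and r2 by strain 2 during N days; an infection by strain i is followed by
    L_i days of immunity to both strains, possibly extending beyond day N."""
    beta0 = 1.0 - beta1 - beta2
    pw = [1.0] * (N + 1)                     # pw[k] = beta0 ** k, precomputed
    for k in range(1, N + 1):
        pw[k] = pw[k - 1] * beta0

    @lru_cache(maxsize=None)
    def f(n, r1, r2):
        # exactly (r1, r2) infections in a susceptible window of n days
        if n <= 0:
            return 1.0 if (r1, r2) == (0, 0) else 0.0
        if (r1, r2) == (0, 0):
            return pw[n]
        total = 0.0
        for d in range(1, n + 1):            # day of the first infection
            if r1 > 0:
                total += pw[d - 1] * beta1 * f(n - d - L1, r1 - 1, r2)
            if r2 > 0:
                total += pw[d - 1] * beta2 * f(n - d - L2, r1, r2 - 1)
        return total

    return {(r1, r2): f(N, r1, r2)
            for r1 in range(r_max + 1) for r2 in range(r_max + 1)}

# Illustrative parameters only: a more transmissible strain 2 with a shorter
# immunity interval, on the N = 750, L ~ 150 scale used above.
probs = two_strain_probs(N=750, L1=180, L2=120, beta1=0.3 / 750, beta2=0.4 / 750)
print(round(probs[(1, 0)], 3), round(probs[(0, 1)], 3), round(sum(probs.values()), 3))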
For instance, let $u_1 = u = u_2$, let $\pi_r(N, L_1, L_2)$ be the probability that $r_1 + r_2 = r$, and let $P_r(t) = \sum_{N=0}^{\infty}\pi_r(N, L_1, L_2)\, t^N$. Then $P_1(t)$ can be calculated explicitly from the formula for G above; this is for 1 infection of any type (of the two). When $L_1 = L = L_2$ and $\beta = \beta_1+\beta_2$, we arrive at the case of one type of infection. An explicit combinatorial formula for $N > L_2\ge L_1$ can be given; it becomes somewhat simpler combinatorially in terms of $x, \beta_2$. There are $L_2+1$ powers of x here; the terms with the factors $1-(N-L_2+1)\beta_1$ and $(N-L_2+1)\beta_2$ are present only if $L_2 > L_1+1$. For $N\le L_2$, the number of terms is N: they are exactly the top N terms in the formulas above. The sums of the binomial coefficients in this formula can be readily calculated, which is useful for obtaining the limits as $N\to\infty$, when $N\beta_i\to\alpha_i$, $L_i/N\to\nu_i$ for i = 1, 2. We note that by setting $\beta_1 = 0$, $\beta = \beta_2$, $L = L_2$ in the 1st formula, we obtain the 2nd, where $\beta_2 = 0$, $\beta = \beta_1$, $L = L_1$. This is our formula for $\pi_1(N, L)$.

Similarly, one can calculate $\pi_{1,0}(N, L_1, L_2)$, which is the coefficient of $u_1$ in G, and $\pi_{0,1}$, which is the coefficient of $u_2$. By comparing them, we can analyze which combinations of the parameters $\alpha_i$ and $\nu_i$ make the 1st or the 2nd strain dominant. Namely, explicit formulas can be given for $N > L_2\ge L_1$, where $x \overset{\text{def}}{=} 1-\beta_1-\beta_2$; accordingly, the top N terms must be taken if $N\le L_i$. Here only $L_i$ occurs in formula i for i = 1, 2. Obviously, $\pi_1(N, L_1\le L_2) = \pi_{1,0}(N, L_1) + \pi_{0,1}(N, L_2)$. In the limit $\beta_i N\to\alpha_i$, $L_i/N\to\nu_i$, we obtain the corresponding limits for $\alpha_0 = \alpha_1+\alpha_2$. Similar to Corollary 2.3 for r = 1, we see that $\pi'_{1,0}$ and $\pi'_{0,1}$ are increasing in the corresponding $\nu_i$ for $0\le\nu_i<1$ and fixed $\alpha_1, \alpha_2 > 0$. This is of course obvious due to their meaning: the greater $\nu_i$, the smaller the number of reinfections. We use that $\pi'_0$ does not depend on $\nu_i$, so diminishing $\pi'_r$ for $r > 1$ results in increasing $\pi'_1$.

Stochastic processes etc. We note that the consideration of ensembles of segments of different lengths $L_i$ and different probabilities $\beta_i$ links our paper with stochastic processes. One can use the generating function above to obtain the formulas for the corresponding partition functions. This provides some combinatorial approach to Whittaker-type processes, where the distances between neighboring particles are the key; see [BC]. In our approach, the $L_i$ (with the corresponding multiplicities) are counterparts of these distances, and we allow "defects", the gaps between the $L_i$-segments. Thus, if we know the distribution of $\{L_i\}$, we can define/calculate the corresponding partition function. Here Matérn II statistics is used: a uniform distribution of points in a segment, where those belonging to previously (in time) created segments are deleted (the operation of thinning). Actually, formulas in terms of binomial coefficients can be expected for other (similar) statistical assumptions too. It seems quite reasonable to discretize: allow only finitely many possible lengths of the non-overlapping segments (with random gaps between them). The partition function will then be in terms of the number of segments of each type, counted with the corresponding probabilities.

Similarly, there is a connection with the so-called interlacing sequences $x_1 < y_1 < x_2 < y_2 < \dots < x_n < y_n < x_{n+1}$. They are sequences of n non-overlapping segments $[x_k, y_k]$ of lengths $y_k - x_k$ in $[x_1, x_{n+1}]$.
The corresponding transitional probabilities are associated with the t-residues of the function

$$F(t) = \frac{(t-y_1)(t-y_2)\cdots(t-y_n)}{(t-x_1)(t-x_2)\cdots(t-x_{n+1})}$$

at its poles $x_1, \dots, x_{n+1}$; see [Ke]. In our approach, these segments are $[x_k, x_k + L_{i_k}]$, where $L_{i_k}$ is the length of the segment starting at $x_k$. Then we calculate the generating function G(t) and expand it; its coefficient of $t^N u_1^{r_1} u_2^{r_2}\cdots$ is the probabilistic measure of the corresponding Young diagram of order $\le N$. This diagram has $r_i$ rows of length $L_i$. One can take here $L_i = i$ for $i = 1, 2, \dots, n$. Then our approach becomes some discretization of the transitional probabilities. Our variables $u_i$ are basically the coefficients of $t^i$ in the denominator of F(t). The numerator of G(t) incorporates the edge effects. The points $y_k$ do not appear in this approach; the distribution of the $L_i$ is used instead. Also, we add the probabilities $\beta_i$, which generally depend on the corresponding $L_i$. Potentially, they can depend on the whole configuration of segments, but then it will not be a Matérn II process. The classical theory results in the distribution of probabilities for Young diagrams related to the celebrated hook-formula and Jack polynomials. Thus, our formulas for Poisson-type distributions can be considered as some counterparts of explicit formulas for the coefficients of Jack polynomials.

Acknowledgements. The author thanks very much Alexei Borodin for an important discussion, and Evgeny Feigin for his help. Support by NSF grant DMS-1901796 is acknowledged.

References
[ADDP] Modelling and optimal control of multi strain epidemics, with application to COVID-19.
[BC] A. Borodin, I. Corwin, Macdonald processes, Probability Theory and Related Fields.
[Ch1] I. Cherednik, Momentum managing epidemic spread and Bessel functions.
[Ch2] I. Cherednik, Modeling the waves of Covid-19.
[Gi] Entropy and ordering of hard rods in one dimension.
[KD] Moments of interference in vehicular networks with hardcore headway distance.
[Ke] S. Kerov, Anisotropic Young Diagrams and Jack Symmetric Functions.

(I. Cherednik) Department of Mathematics, UNC Chapel Hill, North Carolina 27599, USA, chered@email