ON THE DISTRIBUTION OF MONOCHROMATIC COMPLETE SUBGRAPHS AND ARITHMETIC PROGRESSIONS

Aaron Robertson^1, William Cipolli^2, and Maria Dascălu^3

Department of Mathematics, Colgate University, Hamilton, New York

Abstract

We investigate the distributions of the number of: (1) monochromatic complete subgraphs over edgewise 2-colorings of complete graphs; and (2) monochromatic arithmetic progressions over 2-colorings of intervals, as statistical Ramsey theory questions. We present convincing evidence that both distributions are very well-approximated by the Delaporte distribution.

1. Introduction

Ramsey theory deals with finding order among chaos, two fundamental results espousing this being Ramsey's Theorem and van der Waerden's Theorem. Ramsey's Theorem, in particular, proves the existence, for any k ∈ Z⁺, of a minimal positive integer R(k) such that every 2-coloring of the edges of a complete graph on R(k) vertices contains a monochromatic complete subgraph on k vertices. Van der Waerden's Theorem, in turn, states that there exists a least positive integer w(k) such that every 2-coloring of {1, 2, . . . , w(k)} contains a monochromatic arithmetic progression of length k.

While the definitions of both Ramsey and van der Waerden numbers are simple, the computations of both, especially Ramsey numbers, are notoriously difficult. For example, the most recent Ramsey number was determined by McKay and Radziszowski [14], who used almost 10 years of CPU time; Kouril [11] used over 200 processors and 250 days to show that w(6) = 1132. Given the exponential nature of these numbers, the remaining unknown numbers seem intractable at the present time. Given the difficulty of computing Ramsey numbers exactly, we explore the potential value of a statistical approach, with the ultimate goal of gaining some insight into R(k) and w(k). Starting with Ramsey numbers, let all edgewise 2-colorings of the complete graph on n vertices be equally likely and define X_k = X_k(n) as the random variable giving the total number of monochromatic complete subgraphs on k vertices (i.e., copies of K_k). Our goal is to find a very good approximation for the probability mass function (pmf) of X_k.

In [6], it is shown that X_k is asymptotically Poisson as k → ∞ (with certain conditions on n and k); that is, with an appropriate restriction on n, as k → ∞ we have
$$P(X_k = j) \approx \frac{\lambda^j e^{-\lambda}}{j!}, \qquad \text{where } \lambda = \frac{\binom{n}{k}}{2^{\binom{k}{2}-1}}.$$
However, since this is an asymptotic (in k) result, using this for small values of k may not be appropriate. In this article, we present (hopefully very convincing) evidence of what the distribution of X_k for small k may be.

It is well-known that there is a "Poisson Paradigm" with Ramsey object counts; see [3]. We briefly account here for why such a paradigm exists. Let $\{Y_i\}_{i=1}^{n}$ be a sequence of indicator random variables, each with the same small probability of being 1. Let $Y = \sum_{i=1}^{n} Y_i$ with n large. If the Y_i are independent, then Y follows a Binomial distribution. Since P(Y_i = 1) = p for all i with p small, we know that the Poisson distribution well-approximates the Binomial (even for relatively small n).
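As a quick numerical illustration of this approximation (a sketch of our own, not from the paper; the function names are ours), one can compute the total-variation distance between the Binomial(n, p) pmf and the Poisson(np) pmf:

```python
from math import comb, exp, lgamma, log

def binom_pmf(j, n, p):
    # P(Binomial(n, p) = j)
    return comb(n, j) * p**j * (1 - p)**(n - j)

def poisson_pmf(j, lam):
    # P(Poisson(lam) = j), computed in log space to avoid overflow
    return exp(j * log(lam) - lam - lgamma(j + 1))

def total_variation(n, p):
    """Total-variation distance between Binomial(n, p) and Poisson(np)."""
    lam = n * p
    return 0.5 * sum(abs(binom_pmf(j, n, p) - poisson_pmf(j, lam))
                     for j in range(n + 1))
```

For fixed np, the distance is on the order of p, so shrinking p (with np held constant) tightens the approximation, which is the sense in which Poisson well-approximates Binomial here.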

^1 Principal corresponding author; [email protected]
^2 [email protected]
^3 Undergraduate student, [email protected]



Since we have p small when we are dealing with Ramsey object counts, we only need address the independence assumption. Specializing to edge-colored complete graphs, we let Y_i indicate when the i-th complete subgraph is monochromatic. Then Y_i and Y_j are independent if they share at most 1 vertex. This has probability bounded below by
$$1 - \frac{\binom{k}{2}\binom{n-k+2}{k-2}}{\binom{n}{k}},$$
which, via Stirling's formula, is roughly $1 - \left(\frac{ek^2}{n}\right)^{2}$, which approaches 1 as n grows.

Hence, we see that each indicator random variable is independent of almost all of the others. Thus, the Y_i are "almost" independent, and so the Poisson distribution should be a good approximation asymptotically.

2. Sampling Algorithm for 2-Colored K_n

Given a user input of positive integers n, k, and g, our Python program GraphCount^2 generates g graphs, each on n vertices, using an adjacency list. It colors the edges between pairs of vertices randomly using the Python random module. It then counts the total number of monochromatic complete subgraphs on k vertices of each given graph. Compiling all results gives us an empirical pmf. The counting algorithm is a straightforward recursive one.
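GraphCount itself is not reproduced here; the following is a minimal sketch of the same sampling scheme (our own illustrative code with hypothetical names, not the authors' program): color each edge of K_n uniformly at random, then count monochromatic k-subsets by direct enumeration.

```python
import random
from itertools import combinations

def random_coloring(n, rng=random):
    """Color each edge of K_n with 0 or 1 uniformly at random."""
    return {frozenset(e): rng.randint(0, 1) for e in combinations(range(n), 2)}

def count_monochromatic(coloring, n, k):
    """Count the K_k subgraphs whose C(k,2) edges all share one color."""
    total = 0
    for verts in combinations(range(n), k):
        colors = {coloring[frozenset(e)] for e in combinations(verts, 2)}
        if len(colors) == 1:
            total += 1
    return total

def sample_counts(n, k, g, seed=0):
    """Empirical sample of X_k over g random 2-colorings of K_n."""
    rng = random.Random(seed)
    return [count_monochromatic(random_coloring(n, rng), n, k)
            for _ in range(g)]
```

Histogramming sample_counts(n, k, g) for large g gives the empirical pmf; enumerating all C(n, k) vertex subsets is what produces the polynomial run-times reported below.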

GraphCount has a run-time of O(n²) for k = 3 and O(n^k) for k ≥ 4. Table 1 compiles a list of empirical run-times.

Our goal is to run GraphCount with g ≥ 1,000,000 in order to obtain an empirical probability mass function that is a fairly good approximation of the true probability mass function. However, this will only allow us (given reasonable time constraints) to investigate k = 6 for a few values of n. Furthermore, as can be seen from the table of run-times (Table 1), gathering enough samples to attempt distribution fitting for k = 7 would require approximately 75 years on a single computer, or a couple of years on the cluster of 36 computers we have available to us (with dedicated use, which we do not have). So, at this time, pursuit of k = 7 is not realistic with our algorithm.

Input k | Input n | Time per graph (sec)
   3    |    6    |   0.0002
   4    |   18    |   0.00275
   5    |   43    |   0.15
   5    |   49    |   0.296
   6    |  102    |  22.59
   6    |  165    | 407
   7    |  205    | 2368

Table 1: Run-times for GraphCount

In Figure 1, we present the empirical probability mass functions for the number of monochromatic K_k subgraphs over 2-colorings of the edges of K_n for small k and n. You will notice a similar shape for all presented. This occurred in all histograms we obtained (for sufficiently large sample sizes).

^2 Available at http://www.aaronrobertson.org.



[Figure 1 contains twelve panels of empirical pmfs ("EMPpmf"): k = 4 with n = 14 through 20 (sample size 1M each), and k = 5 with n = 43 (1.1M), n = 44 (1M), n = 45 (1M and 17.4M), and n = 46 (1.1M).]

Figure 1: Empirical probability mass functions (pmfs) for various scenarios.


3. Fitting the Empirical Probability Mass Function

We can view the random variable X_k as a sum of indicator random variables Y_i, where Y_i = 1 if the i-th K_k is monochromatic and Y_i = 0 otherwise. Since n is typically much larger than k, most pairs of K_k's are independent. Hence, we can view X_k as the sum of (somewhat) weakly dependent indicator random variables, each of which has a small probability of being 1. By weak dependence, we mean that the probability that two randomly chosen K_k's are dependent is near 0. To see this, note that two such subgraphs are dependent only if they share at least 2 vertices, so that this probability is
$$\frac{\binom{k}{2}\binom{n-2}{k-2}}{\binom{n}{k}} \approx \frac{k^4}{2n^2}.$$
Noting that n is typically much larger than k², we see that this probability is quite low.

Now, as k tends to infinity, the proportion of dependent pairs of subgraphs goes to 0. Hence, asymptotically, we can view the set of all subgraphs as almost entirely independent. In the situation where the subgraphs are completely independent, since the probability that any given subgraph is monochromatic is small, through the Poisson process we get a Poisson distribution and the result in [6]; see [8] for a general theorem about when we can have a limiting Poisson distribution with weak overall dependence.

However, when we investigate fixed values of (small) k, as we can see from Figure 2, the Poisson distribution is not a good fit. This is to be expected since the Poisson distribution has a variance equal to its expectation, while we know that E(X_k) ≠ Var(X_k) for k ≥ 4 (see Lemma 1 below). For the k = 3 case, we do have E(X_k) ≈ Var(X_k) for large n; see Table 2 in the next section. We can also see from Figure 2 that the Poisson distribution appears under-dispersed (and Lemma 1 below proves this). More fundamentally, for fixed values of k, the dependence between some of the subgraphs is not accounted for with Poisson modeling.

Lemma 1. For any k ≥ 4, we have
$$\lim_{n \to \infty} \frac{E(X_k(n))}{\mathrm{Var}(X_k(n))} = 0.$$

Proof. Lemma 3.5 in [9] gives the asymptotic order of Var(X_k): Define Φ(X_k) = min(E(X_H)), where the minimum is taken over all nontrivial subgraphs H of K_k. Then $\mathrm{Var}(X_k) \asymp \frac{1}{2}\,\frac{E^2(X_k)}{\Phi(X_k)}$. Taking H = K_3 (where K_2 is the degenerate case) we see that
$$\mathrm{Var}(X_k) \asymp \frac{1}{2} \cdot \frac{\binom{n}{k}^2 / 2^{2\binom{k}{2}-2}}{\binom{n}{3}/2^2} = \Omega\!\left(\binom{n}{k}\,\frac{n^{k-3}}{2^{2\binom{k}{2}-2}}\right),$$
which agrees with the expressions given in Table 2 (in the next section). We know that $E(X_k) = \binom{n}{k}/2^{\binom{k}{2}-1}$, and by comparison with the above expression we see that the lemma's statement holds.
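As a numerical sanity check on the lemma (our own illustration, not from the paper; the variance expression below keeps only the order of growth and drops constants), the ratio for k = 4 behaves like a constant multiple of 1/n:

```python
from math import comb

def expected_Xk(n, k):
    # E(X_k) = C(n, k) / 2^(C(k,2) - 1): each K_k is monochromatic
    # with probability 2 / 2^C(k,2).
    return comb(n, k) / 2 ** (comb(k, 2) - 1)

def var_growth(n, k):
    # Order of growth of Var(X_k) from Lemma 1, constants dropped:
    # C(n, k) * n^(k-3) / 2^(2 C(k,2) - 2).
    return comb(n, k) * n ** (k - 3) / 2 ** (2 * comb(k, 2) - 2)

def ratio(n, k):
    """E(X_k) divided by the variance growth term; tends to 0 for k >= 4."""
    return expected_Xk(n, k) / var_growth(n, k)
```

For k = 4 the ratio simplifies to 32/n, so it visibly vanishes as n grows, in line with the lemma.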

To further illustrate the point, in Figure 2 we present overlays of the best-fitting (defined in the next paragraph) Poisson distributions over the empirical pmfs presented in Figure 1. As you can see, the Poisson distribution is clearly not a good fit for small values of k.

Our measure of best-fitting is via the ℓ₁-distance between two probability mass functions f(k) and g(k): $\sum_{k \ge 0} |f(k) - g(k)|$.

To address the dependence and under-dispersion, we turn to a compound Poisson distribution, meaning a Poisson distribution whose "parameter" L is a random variable. We let L = c + G, where c is a constant and G is a random variable to be determined. The constant part of L accounts for the "Poisson part" of the distribution, covering the independent subgraphs, while we want G to account for the dependence issues. Simultaneously, this accounts for the under-dispersion issue of the straight Poisson random variable.

As motivation for our choice of G, we turn to the Polya-Eggenberger urn scheme. As noted in [13], Polya-Eggenberger urn models have been used as contagion models. This relates to our situation since if one K_k is monochromatic, then any other K_k that shares vertices with it is more likely to be monochromatic; i.e., monochromaticity is contagious. Consulting [4], [5], [7], we see that for large urns, the Gamma distribution is closely tied to the urn model.

Returning to our compound Poisson random variable, we describe L = λ + G as having a fixed "Poisson part" λ (for the independent subgraphs) and a "contagion driver" G, modeled by a Gamma random variable, which helps to model the weak dependence. This gives rise to the following convolution pmf, called the Delaporte distribution.
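The L = λ + G recipe can be simulated directly. Below is a sketch under our own naming (Knuth's classical Poisson sampler stands in for a library routine): draw G ~ Gamma(α, β), then draw a Poisson count with rate λ + G, so the sample mean should settle near λ + αβ.

```python
import random
from math import exp

def _poisson(rng, rate):
    # Knuth's multiplication method; adequate for the moderate rates here.
    limit = exp(-rate)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def delaporte_sample(lam, alpha, beta, g, seed=0):
    """Draw g counts from Poisson(L) with L = lam + Gamma(alpha, beta):
    a fixed Poisson part plus a Gamma 'contagion driver'."""
    rng = random.Random(seed)
    return [_poisson(rng, lam + rng.gammavariate(alpha, beta))
            for _ in range(g)]
```

For instance, with λ = 2, α = 3, β = 0.5, a large sample has mean near λ + αβ = 3.5 and variance near λ + αβ(1 + β) = 4.25, exhibiting the over-dispersion the plain Poisson lacks.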


[Figure 2 repeats the twelve panels of Figure 1 (k = 4, n = 14 through 20; k = 5, n = 43 through 46; sample sizes 1M–17.4M) with the best-fit Poisson distribution overlaid on each empirical pmf.]

Figure 2: Empirical pmfs for various scenarios with best-fit Poisson overlay


Definition 2. A random variable D = D(λ, α, β) is called a Delaporte random variable if it has probability mass function
$$P(D = j) = \sum_{i=0}^{j} \frac{\Gamma(\alpha + i)}{\Gamma(\alpha)\, i!} \left(\frac{\beta}{1+\beta}\right)^{i} \left(\frac{1}{1+\beta}\right)^{\alpha} \frac{\lambda^{j-i} e^{-\lambda}}{(j-i)!}.$$
Furthermore, μ = E(D) = λ + αβ, Var(D) = E((D − μ)²) = λ + αβ(1 + β), and E((D − μ)³) = λ + αβ(1 + 3β + 2β²).

Remark. In the Delaporte pmf above, λ is the parameter for the Poisson part, while α and β are parameters for the Gamma part of our L = λ + G model for the Poisson process rate, which leads to a negative binomial distribution with parameters α and β/(1 + β).
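Definition 2 translates directly into code (a sketch of ours, computed in log space for numerical stability; the function name is illustrative): each summand is a negative binomial term, from the Gamma mixing, convolved with a Poisson term.

```python
from math import exp, lgamma, log

def delaporte_pmf(j, lam, alpha, beta):
    """P(D = j) for D = D(lam, alpha, beta): a convolution of a
    negative binomial part and a Poisson part."""
    total = 0.0
    for i in range(j + 1):
        # Negative binomial term:
        #   Gamma(alpha+i)/(Gamma(alpha) i!) * (beta/(1+beta))^i * (1/(1+beta))^alpha
        log_nb = (lgamma(alpha + i) - lgamma(alpha) - lgamma(i + 1)
                  + i * log(beta / (1 + beta)) - alpha * log(1 + beta))
        # Poisson term: lam^(j-i) e^(-lam) / (j-i)!
        log_po = (j - i) * log(lam) - lam - lgamma(j - i + 1)
        total += exp(log_nb + log_po)
    return total
```

Summing j·P(D = j) and j²·P(D = j) over j recovers the stated moments, E(D) = λ + αβ and Var(D) = λ + αβ(1 + β), which is a convenient check on any implementation.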

In Figure 3, we present the same empirical pmfs as in Figures 1 and 2, along with an overlay of the best-fit Delaporte distribution.

In order to find the best-fitting such Delaporte distribution, we must find good estimates for the



[Figure 3 overlays the maximum likelihood Delaporte fit ("Delaporte w/ MLE") on the empirical pmfs: k = 4 with n = 14 through 20 (sample size 1M each) and k = 5 with n = 43 through 45 (1M–1.1M samples).]
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

● Delaporte w/ MLE

2000 3000 4000 5000 6000 7000

0.00

000.

0002

0.00

040.

0006

0.00

080.

0010

0.00

12

Empirical PMF with k=5 n=45 with Delaporte FitEM

Ppm

f

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

● Delaporte w/ MLE

2000 3000 4000 5000 6000 7000

0e+0

02e−0

44e−0

46e−0

48e−0

41e−0

3

Empirical PMF with k=5 n=46 with Delaporte Fit

EMPp

mf

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●
●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

● Delaporte w/ MLE

Sample size = 1M Sample size = 1M Sample size = 17.4M Sample size = 1.1M

Figure 3: Empirical pmfs for various scenarios with Delaporte Overlay


parameters $\lambda$, $\alpha$, and $\beta$. The two main approaches are the maximum likelihood estimates (MLE) and the method of moments estimates (MOM).

We have found that the MLEs consistently provide better results than the MOM estimates (this is generally true because likelihood methods are more efficient in the sense that they use the data better, one measure being faster convergence; see Section 5). In fact, these MLEs produce near-optimal results, i.e., a total $\ell_1$-distance between the empirical pmf and the MLE-estimated Delaporte distribution very near 0. Unfortunately, closed-form formulas for the MLEs of $\lambda$, $\alpha$, and $\beta$ do not exist in our situation (it should also be noted that R's calculation of the MLEs for $k = 5$ neared a day of computation time for each $n$). Hence, in the next section we use MOM estimators for theoretical work. Although the MOM estimators do not provide the best fit, they are still very reasonable and relatively close to the MLEs, as we show in our Simulation Study section.
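The fit quality referred to here, the total $\ell_1$-distance between an empirical pmf and a fitted pmf, is straightforward to compute once both pmfs are in hand. A minimal sketch (the function and variable names are ours, purely illustrative):

```python
def l1_distance(empirical, fitted, support):
    """Total l1-distance between an empirical pmf (dict: value -> prob)
    and a fitted pmf (callable: value -> prob) over the given support."""
    return sum(abs(empirical.get(j, 0.0) - fitted(j)) for j in support)

# Toy example: a two-point empirical pmf against a flat fit on {0, 1, 2, 3}.
emp = {0: 0.5, 1: 0.5}
print(l1_distance(emp, lambda j: 0.25, range(4)))  # 0.25*4 = 1.0
```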

A remark about the support of the Delaporte distribution as an approximation for the distribution of $X_k$ is in order. We know that $X_k$ can only take on values in $\{0, 1, 2, \ldots, \binom{n}{k}\}$ while the Delaporte distribution's support is the nonnegative integers. The $\ell_1$-distances do include the tail of the Delaporte

Page 7: ON THE DISTRIBUTION OF MONOCHROMATIC COMPLETE …math.colgate.edu/~aaron/pap/Delaporte.pdf · 2018. 7. 31. · all edgewise 2-colorings of the complete graphs on n vertices be equally

distribution, i.e., the value of $\sum_{j>\binom{n}{k}} P(D = j)$. Hence, over the support of $X_k$, the total $\ell_1$-distance is smaller, although we will show that the difference is negligible.

Given $n, k \in \mathbb{Z}^+$, let $q = P\big(D > \binom{n}{k}\big) = \sum_{j>\binom{n}{k}} P(D = j)$. Our goal is to show that $q$ is negligible. We will use the one-sided Chebyshev inequality: $P(X - \mu \geq x) \leq \frac{\sigma^2}{\sigma^2 + x^2}$, where $\mu = E(X)$ and $\sigma^2 = \mathrm{Var}(X)$. In our situation we have

$$\mu = \frac{2}{2^{\binom{k}{2}}}\binom{n}{k}, \qquad\text{so we let}\qquad x = \frac{2^{\binom{k}{2}} - 2}{2^{\binom{k}{2}}}\binom{n}{k} + 1$$

to bound $P\big(D > \binom{n}{k}\big)$. We also know that $\mathrm{Var}(D) = \mu + \alpha\beta^2$. In the next section, we present evidence to suggest that $\sigma^2 \sim \binom{n}{k}\, n^{k-3}\, 2^{-\binom{k}{2}}$, so we will use this as an assumption. Putting this all together, we find that

$$P\left(D > \binom{n}{k}\right) \lesssim \frac{1}{2^k k^k n^3}.$$

In practice, we have observed, for small $k$ and $n$, that this bound is quite weak. Nevertheless, this does show that the tail probabilities of our Delaporte distributions are quite negligible.

4. Implications and Evidence

As stated before, it was shown in [6] that, loosely speaking, $X_k$ is asymptotically Poisson. This was done by showing that the total $\ell_1$-distance between the distributions of $X_k$ and a Poisson random variable with mean $E(X_k)$ tends to 0 as $k$ tends to infinity (with $n$ bounded from above by a function of $k$). We will first show that the proposed Delaporte distribution is consistent with this fact. We will be using the MOM estimates and note that $E(X_k) = \lambda + \alpha\beta$ under the method of moments. We use the following notation.

Notation. Let $X$ be a random variable. We denote its moment generating function by $\mathrm{mgf}(X)$; that is, $\mathrm{mgf}(X) = E(e^{tX})$.

Theorem 3. Let $n, k \in \mathbb{Z}^+$ with $k \geq 3$. Define $D \sim \mathrm{Delaporte}(\lambda, \alpha, \beta)$ and $P \sim \mathrm{Poisson}(\lambda + \alpha\beta)$. Then $\mathrm{mgf}(D) \to \mathrm{mgf}(P)$ as $k \to \infty$ under the following assumptions:

(1) $\alpha \sim \dfrac{\binom{n}{k}}{n^{k-1}}$;

(2) $\beta \sim \dfrac{n^{k-2}}{2^{\binom{k}{2}}}$;

(3) $n = O\big(k^{1+\frac{1}{k-1}} \cdot 2^{k/2}\big)$.

Proof. Since $D$ is a convolution of a Negative Binomial random variable with success probability $\frac{\beta}{1+\beta}$ and mean $\alpha\beta$ and a Poisson random variable with mean $\lambda$, using the moment generating functions of these, we easily have

$$\mathrm{mgf}(D) = \frac{e^{\lambda(e^t-1)}}{\big(1-\beta(e^t-1)\big)^{\alpha}}.$$

Isolate the denominator and use $\ln(1+x) \approx x$ for small $x$. Since $\beta \to \infty$ as $k, n \to \infty$, with the restriction $t \leq \ln\big(1+\frac{1}{\beta^2}\big)$ we have $\beta(e^t - 1) \leq \frac{1}{\beta}$, so that $\beta(e^t-1)$ is small for sufficiently large $\beta$. Hence,

$$\ln\Big(\big(1-\beta(e^t-1)\big)^{\alpha}\Big) \approx -\alpha\beta(e^t-1).$$


For large $n$ and $k$, this gives $\big(1-\beta(e^t-1)\big)^{\alpha} \approx e^{-\alpha\beta(e^t-1)}$ for $t \leq \ln\big(1+\frac{1}{\beta^2}\big)$. Hence, we find that $\mathrm{mgf}(D) \approx e^{(\lambda+\alpha\beta)(e^t-1)} = \mathrm{mgf}(P)$ on a small interval including $t = 0$, which is enough to conclude the result.

Having Theorem 3 and knowing that there is a one-to-one correspondence between random variables and their moment generating functions, we can state that, loosely, the Delaporte random variable is asymptotically Poisson. Hence, we are not violating the "Poisson Paradigm" noted in [3].

We will now give evidence to suggest that the assumptions in Theorem 3 are satisfied. We will be using MOM estimates. We know that $\mu = E(X_k) = \binom{n}{k} 2^{1-\binom{k}{2}}$. Using Zeilberger's Maple package SMCramsey that accompanies [17], we find the leading terms for the second and third moments about the mean for $X_k$ for small $k$:

$$
\begin{array}{c|c|c}
k & E\big((X-\mu)^2\big) & E\big((X-\mu)^3\big) \\ \hline
3 & \binom{n}{3}\cdot\dfrac{3}{2^{4}} & \binom{n}{3}\cdot\dfrac{6n}{2^{6}} \\
4 & \binom{n}{4}\cdot\dfrac{12n}{2^{10}} & \binom{n}{4}\cdot\dfrac{24n^{3}}{2^{15}} \\
5 & \binom{n}{5}\cdot\dfrac{15n^{2}}{2^{18}} & \binom{n}{5}\cdot\dfrac{15n^{5}}{2^{27}} \\
6 & \binom{n}{6}\cdot\dfrac{10n^{3}}{2^{28}} & \binom{n}{6}\cdot\dfrac{10n^{7}}{3\cdot 2^{42}} \\
7 & \binom{n}{7}\cdot\dfrac{35n^{4}}{2\cdot 2^{42}} & \binom{n}{7}\cdot\dfrac{35n^{9}}{6\cdot 2^{64}} \\
8 & \binom{n}{8}\cdot\dfrac{84n^{5}}{15\cdot 2^{56}} & \binom{n}{8}\cdot\dfrac{42n^{11}}{255\cdot 2^{84}}
\end{array}
$$

Table 2: Second and third moment orders

As noted in Definition 2, for the Delaporte random variable $D$, we have

$$E(D) = \lambda + \alpha\beta; \qquad E\big((D-\mu)^2\big) = \lambda + \alpha\beta(1+\beta); \qquad E\big((D-\mu)^3\big) = \lambda + \alpha\beta(1+3\beta+2\beta^2).$$
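These moment formulas can be verified numerically from the convolution form of the Delaporte distribution (a Poisson($\lambda$) variable plus a Negative Binomial variable with shape $\alpha$ and success probability $\frac{\beta}{1+\beta}$). A minimal sketch; the truncation point of the infinite support is our choice and is harmless here:

```python
from math import exp, lgamma, log

def delaporte_pmf(j, lam, alpha, beta):
    """P(D = j) as the convolution of Poisson(lam) with a Negative
    Binomial of shape alpha and success probability beta/(1+beta)."""
    total = 0.0
    for i in range(j + 1):  # i = Poisson part, j - i = Negative Binomial part
        log_pois = -lam + i * log(lam) - lgamma(i + 1)
        log_nb = (lgamma(alpha + j - i) - lgamma(alpha) - lgamma(j - i + 1)
                  + (j - i) * log(beta / (1 + beta)) - alpha * log(1 + beta))
        total += exp(log_pois + log_nb)
    return total

lam, alpha, beta = 2.0, 1.0, 2.0
support = range(400)  # truncation; the tail beyond this point is negligible
pmf = [delaporte_pmf(j, lam, alpha, beta) for j in support]
mean = sum(j * p for j, p in zip(support, pmf))
m2 = sum((j - mean) ** 2 * p for j, p in zip(support, pmf))
m3 = sum((j - mean) ** 3 * p for j, p in zip(support, pmf))
# Matches lam + alpha*beta = 4, lam + alpha*beta*(1 + beta) = 8,
# and lam + alpha*beta*(1 + 3*beta + 2*beta^2) = 32.
print(round(mean, 3), round(m2, 3), round(m3, 3))
```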

By Lemma 1 and the fact that $E((D-\mu)^2) = E(D) + \alpha\beta^2$, we can deduce that $\alpha\beta^2 \sim \binom{n}{k}\frac{n^{k-3}}{2^{2\binom{k}{2}-2}}$. Looking at the third moments in Table 2, we have evidence to suggest that $2\alpha\beta^3 \sim \binom{n}{k}\frac{n^{2k-5}}{2^{3\binom{k}{2}-3}}$. Taking the ratio of these last two expressions yields

$$\alpha \sim \frac{\binom{n}{k}}{n^{k-1}} \qquad\text{and}\qquad \beta \sim \frac{n^{k-2}}{2^{\binom{k}{2}}}.$$

Remark. The interested reader can obtain more accurate results by using the following MOM formulas calculated in Mathematica:

$$\hat{\lambda} = \frac{2m^2v + 2vf - m\big(2f + v^2(s^2v+7)\big) + 6v^3}{4m^2 - 12mv + v^2(9 - s^2v)};$$

$$\hat{\alpha} = \frac{-4(m-v)^4}{4m^2(m-4v) - f(4m-6v) + s^2v^3(m-v) + v^2(21m-9v)}; \qquad \hat{\beta} = \frac{-2m^2 + f + 5mv - 3v^2}{2(m-v)^2},$$

where $m$ is the sample mean, $v$ is the sample variance, $s$ is the sample skewness, and we use the notation $f = \sqrt{s^2v^3(m-v)^2}$. Note that we calculate the sample skewness using the default R command skewness(). A summary of sample skewness calculations can be found in [10].
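As a sanity check, these formulas can be implemented directly; feeding them the population mean, variance, and skewness of a Delaporte distribution recovers $(\lambda, \alpha, \beta)$ exactly, up to floating point. A sketch in Python (names ours):

```python
from math import sqrt

def delaporte_mom(m, v, s):
    """Method-of-moments estimates (lambda-hat, alpha-hat, beta-hat)
    from mean m, variance v, and skewness s, per the Remark's formulas."""
    f = sqrt(s**2 * v**3 * (m - v)**2)
    lam = ((2*m**2*v + 2*v*f - m*(2*f + v**2*(s**2*v + 7)) + 6*v**3)
           / (4*m**2 - 12*m*v + v**2*(9 - s**2*v)))
    alpha = (-4*(m - v)**4
             / (4*m**2*(m - 4*v) - f*(4*m - 6*v)
                + s**2*v**3*(m - v) + v**2*(21*m - 9*v)))
    beta = (-2*m**2 + f + 5*m*v - 3*v**2) / (2*(m - v)**2)
    return lam, alpha, beta

# Population moments of Delaporte(lambda=2, alpha=1, beta=2):
m = 2 + 1*2                        # mean = lambda + alpha*beta = 4
v = 2 + 1*2*(1 + 2)                # variance = 8
mu3 = 2 + 1*2*(1 + 3*2 + 2*2**2)   # third central moment = 32
s = mu3 / v**1.5                   # skewness
print(delaporte_mom(m, v, s))      # recovers approximately (2, 1, 2)
```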

Remark. We are clearly extrapolating in our formulas for $\alpha$ and $\beta$, but it is interesting to note that the order of $n$ required in Theorem 3 is actually slightly better (by a factor of $k^{\frac{1}{k}}$) than the best known lower bound on $R(k,k)$, while a larger order for $n$ would void our proof of the asymptotic Poisson nature of $X_k$ via the Delaporte distribution. Might this be evidence that $k\,2^{k/2}$ is the correct order for the Ramsey number $R(k,k)$?

5. Simulation Study: MLE vs. MOM

Though the MOM estimators can be calculated in closed form, these estimators are quite complicated and the derivation of their expected value is impractical. MLEs are more burdensome and so far have been calculated numerically using the optim function in R [15].

Due to the nature of these estimators, a discussion of their accuracy and precision is difficult, as closed-form expectations are impractical to calculate. To explore and compare the accuracy and precision of the MOM estimators and MLEs, consider their asymptotic behavior across three simulations motivated by estimators from the $k = 4$, $n = 14$; $k = 5$, $n = 20$; and $k = 5$, $n = 49$ graphs.

Tables 3 and 4 contain the results of these simulations of Delaporte data with parameter values as well as MOM estimates and MLEs across increasing sample sizes $n$. As $n$ increases, the MLEs quickly approach the true values, whereas the MOM estimators appear to require higher sample sizes. These simulations suggest that both the MLEs and MOM estimates might be asymptotically unbiased where, as expected, the MLEs have lower variability.
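Delaporte data of this kind can be simulated with nothing more than a uniform generator, using the convolution structure: a Poisson($\lambda$) draw plus a Gamma($\alpha$, $\beta$)-mixed Poisson draw for the Negative Binomial part. This is an illustrative sketch, not the code used to produce the tables:

```python
import random
from math import exp

def poisson_draw(lam, rng):
    """Knuth's method; adequate for the moderate means used here."""
    limit, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def delaporte_draw(lam, alpha, beta, rng):
    """Poisson(lam) plus a Gamma(alpha, beta)-mixed Poisson draw,
    the latter being a Negative Binomial sample."""
    return poisson_draw(lam, rng) + poisson_draw(rng.gammavariate(alpha, beta), rng)

rng = random.Random(0)
lam, alpha, beta = 16.75, 1.84, 7.89   # first parameter block of Table 3
draws = [delaporte_draw(lam, alpha, beta, rng) for _ in range(100_000)]
print(sum(draws) / len(draws))         # close to lam + alpha*beta = 31.27
```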

MLEs

Parameters      n        α̂              β̂                λ̂
α = 1.84        100      2.03 (1.40)    8.90 (3.41)      16.96 (3.65)
β = 7.89        1000     1.87 (0.39)    7.92 (0.91)      16.70 (1.22)
λ = 16.75       10000    1.85 (0.10)    7.88 (0.27)      16.74 (0.36)
                100000   1.84 (0.03)    7.89 (0.09)      16.74 (0.11)

α = 3.74        100      4.55 (2.46)    15.48 (4.71)     90.66 (14.76)
β = 15.46       1000     3.68 (0.75)    15.83 (1.88)     94.46 (5.28)
λ = 93.57       10000    3.74 (0.22)    15.46 (0.53)     93.60 (1.51)
                100000   3.74 (0.08)    15.47 (0.18)     93.58 (0.55)

α = 9.45        100      11.39 (10.25)  183.02 (69.73)   2159.43 (560.00)
β = 163.51      1000     9.84 (2.20)    162.73 (19.11)   2159.16 (154.19)
λ = 2178.57     10000    9.56 (0.66)    162.66 (6.17)    2171.45 (49.41)
                100000   9.48 (0.18)    163.25 (1.67)    2176.68 (14.19)

Table 3: MLEs for Delaporte data simulated with previously calculated parameters across various sample sizes


MOM Estimators

Parameters      n        α̂              β̂                λ̂
α = 1.84        100      3.13 (2.46)    7.96 (4.10)      14.08 (5.87)
β = 7.89        1000     1.98 (0.63)    7.89 (1.43)      16.41 (2.02)
λ = 16.75       10000    1.86 (0.17)    7.88 (0.44)      16.71 (0.63)
                100000   1.85 (0.06)    7.87 (0.14)      16.72 (0.20)

α = 3.74        100      6.10 (4.20)    14.06 (4.84)     82.30 (20.90)
β = 15.46       1000     3.95 (1.23)    15.60 (2.71)     92.85 (8.31)
λ = 93.57       10000    3.77 (0.37)    15.44 (0.81)     93.42 (2.69)
                100000   3.73 (0.10)    15.49 (0.24)     93.63 (0.70)

α = 9.45        100      14.92 (12.81)  173.22 (95.54)   1950.67 (709.17)
β = 163.51      1000     10.49 (4.11)   162.38 (29.14)   2125.06 (275.23)
λ = 2178.57     10000    9.62 (1.06)    162.60 (9.31)    2167.94 (79.22)
                100000   9.46 (0.28)    163.46 (2.55)    2178.36 (21.67)

Table 4: MOM estimators for Delaporte data simulated with previously calculated parameters across various sample sizes

The results of this simulation study lend credence to the validity of using the MOM estimates in the previous section.

6. Monochromatic Arithmetic Progressions

We follow a similar strategy for van der Waerden numbers. Let all possible 2-colorings of a given interval $[1, n]$ of positive integers be equally likely and define $Y_k = Y_k(n)$ to be the random variable giving the total number of monochromatic arithmetic progressions of length $k$ in the interval. To approximate the distribution of $Y_k$, the program APCount³ takes a user input of positive integers $n$, $k$, and $g$ and generates $g$ instances of the integers between 1 and $n$, each having one of two colors. The color of any given integer is randomly decided, again with Python's random module. The program counts the total number of monochromatic arithmetic progressions of length $k$ in each instance. Over all $g$ instances, we then produce an empirical probability mass function for $Y_k$.
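A bare-bones version of this procedure (our own illustrative re-implementation, not the authors' APCount) fits in a few lines:

```python
import random

def count_mono_aps(colors, k):
    """Count monochromatic k-term APs; colors[i] is the color of i + 1."""
    n = len(colors)
    count = 0
    for a in range(n):                      # 0-based starting index
        d = 1
        while a + (k - 1) * d < n:          # common difference d >= 1
            if len({colors[a + t * d] for t in range(k)}) == 1:
                count += 1
            d += 1
    return count

def empirical_counts(n, k, g, seed=0):
    """g uniformly random 2-colorings of [1, n]; returns the g counts."""
    rng = random.Random(seed)
    return [count_mono_aps([rng.randrange(2) for _ in range(n)], k)
            for _ in range(g)]

# Every 3-term AP in [1, 9] is monochromatic under a constant coloring,
# and there are 16 of them.
print(count_mono_aps([0] * 9, 3))  # 16
```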

Though APCount deals with larger numbers than GraphCount, it works much more quickly, given that arithmetic progressions are simpler than graphs. However, like GraphCount, APCount has a run-time of $O(n^2)$ for $k = 3$ and $O(n^k)$ for $k \geq 4$.

Input k    Input n    Time per progression (sec)
3          9          0.00016
4          35         0.00049
5          178        0.0226
6          1132       4.9
7          3703       167
8          11495      4980

Table 5: Run-times for APCount

In Figure 4, we present the empirical pmfs for small $k$ and $n$, with $n$ near $w(k)$.

At first blush, it is quite striking that the histograms for the number of monochromatic arithmetic progressions have very similar shapes to the histograms for the number of monochromatic complete

³Available at http://www.aaronrobertson.org.


[Figure 4: nine empirical pmfs, for $(k, n) = (3, 9)$, $(4, 35)$, $(5, 176)$, $(5, 177)$, $(5, 178)$, $(6, 1131)$, $(6, 1132)$, $(6, 1133)$, and $(7, 3703)$, most panels overlaid with a Delaporte fit. Sample sizes: 1M, 1M, 1.7M, 1.1M, 1M, 0.85M, 0.89M, 0.85M, and 67K, respectively.]

Figure 4: Empirical pmfs for the number of monochromatic arithmetic progressions


subgraphs. However, our arithmetic progressions are mostly independently colored with some depen-dence between some pairs of arithmetic progressions. So, heuristically, the number of monochromaticarithmetic progressions would be asymptotically Poisson by very similar reasoning to the number ofmonochromatic subgraphs.

We will now show that for fixed (small) k, the Poisson distribution is under-dispersed for thedistribution of the number of monochromatic arithmetic progressions.

Lemma 4. Let $k \ge 3$. We have
$$\lim_{n\to\infty} \frac{E(Y_k(n))}{\operatorname{Var}(Y_k(n))} < 1.$$

Proof. Order all $k$-term arithmetic progressions in $\{1, 2, \ldots, n\}$ as desired (for example, lexicographically based on the starting value and then the common difference). Let $Z_i$ be the indicator function for whether or not the $i$th $k$-term arithmetic progression is monochromatic, so that $Y_k = \sum_{i=1}^{n^2/(2(k-1))} Z_i$. By the linearity of expectation, we have
$$E(Y_k) = \sum_{i=1}^{n^2/(2(k-1))} E(Z_i) = \frac{n^2}{2(k-1)} \cdot \frac{1}{2^{k-1}} = \frac{n^2}{(k-1)2^k}.$$
We also have
$$\operatorname{Var}(Y_k) = \operatorname{Var}\Bigl(\sum_{i=1}^{n^2/(2(k-1))} Z_i\Bigr) = \sum_{i=1}^{n^2/(2(k-1))} \operatorname{Var}(Z_i) + \sum_{i \ne j} \operatorname{Cov}(Z_i, Z_j) = \sum_{i=1}^{n^2/(2(k-1))} \bigl(E(Z_i^2) - E^2(Z_i)\bigr) + \sum_{i \ne j} \operatorname{Cov}(Z_i, Z_j).$$
Since $Z_i^2 = Z_i$, this simplifies to
$$\operatorname{Var}(Y_k) = \sum_{i=1}^{n^2/(2(k-1))} \Bigl(\frac{1}{2^{k-1}} - \frac{1}{2^{2k-2}}\Bigr) + \sum_{i \ne j} \operatorname{Cov}(Z_i, Z_j) = \frac{(2^{k-1}-1)n^2}{(k-1)2^{2k-1}} + \sum_{i \ne j} \operatorname{Cov}(Z_i, Z_j).$$

We now must look at when $\operatorname{Cov}(Z_i, Z_j) \ne 0$, so we need only consider when $Z_i$ and $Z_j$ correspond to dependent arithmetic progressions, meaning that they share at least one term. Next, we note that if two arithmetic progressions $A$ and $B$ share only one term, then the corresponding indicators satisfy
$$\operatorname{Cov}(A, B) = E(AB) - E(A)E(B) = \frac{1}{2^{2k-2}} - \frac{1}{2^{k-1}} \cdot \frac{1}{2^{k-1}} = 0.$$

We have $\operatorname{Cov}(Z_i, Z_j) = E(Z_iZ_j) - E(Z_i)E(Z_j) = E(Z_iZ_j) - \frac{1}{2^{2k-2}}$. We will now give a lower bound for $\operatorname{Cov}(Z_i, Z_j)$. Given $i$, for each of those $j$ values that correspond to an arithmetic progression that shares (exactly) $t \ge 2$ values with the $i$th arithmetic progression, we have $E(Z_iZ_j) = \frac{1}{2^{2k-t-1}}$, so that
$$\operatorname{Cov}(Z_i, Z_j) = \frac{1}{2^{2k-t-1}} - \frac{1}{2^{2k-2}} \ge \frac{1}{2^{2k-t}}.$$
For each $i$, we use the trivial lower bound of 1 for the number of values of $j$ for which the $i$th and $j$th arithmetic progressions share exactly $t$ terms.

Putting the above together, and noting that for each $i$ we are using one value of $j$ for each $t \in \{2, \ldots, k-1\}$ based on how many common terms each has with the $i$th arithmetic progression, we get
$$\sum_{i \ne j} \operatorname{Cov}(Z_i, Z_j) \ge \frac{n^2}{2(k-1)} \sum_{t=2}^{k-1} \frac{1}{2^{2k-t}} = \frac{n^2}{2(k-1)} \Bigl(\frac{1}{2^{k}} - \frac{1}{2^{2k-2}}\Bigr) \ge \frac{n^2}{(k-1)2^{k+4}}.$$
We are now done since
$$\lim_{n\to\infty} \frac{E(Y_k(n))}{\operatorname{Var}(Y_k(n))} \le \lim_{n\to\infty} \frac{\dfrac{n^2}{(k-1)2^k}}{\dfrac{n^2}{(k-1)2^k} + \dfrac{n^2}{(k-1)2^{k+4}}} = \frac{1}{1+2^{-4}} = \frac{16}{17} < 1. \qquad \Box$$
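The under-dispersion established in Lemma 4 is easy to observe empirically. The sketch below (function names are ours; the parameters are chosen only for a fast demonstration, not the paper's sample sizes) estimates $E(Y_k)/\operatorname{Var}(Y_k)$ by Monte Carlo over uniformly random 2-colorings:

```python
import random
import statistics

def count_mono_aps(coloring, k):
    # Number of monochromatic k-term arithmetic progressions in a 0/1
    # coloring of {1, ..., n}; coloring[i] is the color of i + 1.
    n = len(coloring)
    total = 0
    for d in range(1, (n - 1) // (k - 1) + 1):
        for a in range(n - (k - 1) * d):
            first = coloring[a]
            if all(coloring[a + j * d] == first for j in range(1, k)):
                total += 1
    return total

def dispersion_ratio(n, k, trials, seed=0):
    # Monte Carlo estimate of E(Y_k) / Var(Y_k) over random 2-colorings.
    rng = random.Random(seed)
    counts = [count_mono_aps([rng.randint(0, 1) for _ in range(n)], k)
              for _ in range(trials)]
    return statistics.mean(counts) / statistics.variance(counts)

print(dispersion_ratio(50, 3, trials=2000))  # comfortably below 1
```

Already at n = 50 and k = 3 the ratio sits well below 1, since all covariances between overlapping progressions are nonnegative and those sharing at least two terms contribute strictly positive mass.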

Remark. We were unsuccessful in our attempts to show that the limit in the above lemma is 0, as is the case with monochromatic complete subgraphs; however, the Poisson fits to the empirical pmfs are visually very similar to those in Figure 2.

Based on the under-dispersion of the Poisson distribution compared to $Y_k$ and the weak dependence between monochromatic arithmetic progressions, we see that we are in a similar situation to the monochromatic subgraphs. Hence, relying on the same heuristics, we investigate how well the Delaporte distribution approximates these new empirical histograms (see Figure 5, below).
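For readers unfamiliar with it, the Delaporte distribution is the law of the sum of independent Poisson($\lambda$) and negative binomial (shape $\alpha$, scale $\beta$) random variables, with mean $\lambda + \alpha\beta$. The sketch below is our own illustrative code, assuming the $(\alpha, \beta, \lambda)$ parameterization used by the R `Delaporte` package [1]; it computes the pmf as a discrete convolution:

```python
import math

def delaporte_pmf(n, alpha, beta, lam):
    # P(X = n) for X = NegBin(alpha, beta) + Poisson(lam), as a convolution:
    # the negative binomial part contributes i, the Poisson part n - i.
    total = 0.0
    for i in range(n + 1):
        log_nb = (math.lgamma(alpha + i) - math.lgamma(alpha) - math.lgamma(i + 1)
                  + i * math.log(beta / (1.0 + beta)) - alpha * math.log1p(beta))
        log_pois = -lam + (n - i) * math.log(lam) - math.lgamma(n - i + 1)
        total += math.exp(log_nb + log_pois)
    return total

# The mass sums to 1 and the mean is lam + alpha * beta.
probs = [delaporte_pmf(n, 2.0, 1.5, 3.0) for n in range(200)]
print(sum(probs), sum(n * p for n, p in enumerate(probs)))
```

Maximum-likelihood fits such as the overlays in Figure 5 then amount to maximizing the log of this pmf summed over the simulated counts.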


[Figure 5: nine panels of empirical pmfs, each with a "Delaporte w/ MLE" overlay — k=3, n=9; k=4, n=35; k=5, n=176; k=5, n=177; k=5, n=178; k=6, n=1131; k=6, n=1132; k=6, n=1133; k=7, n=3703; sample sizes 1M, 1M, 1.7M, 1.1M, 1M, 0.85M, 0.89M, 0.85M, 67K.]

Figure 5: Empirical pmfs for various scenarios with Delaporte Overlay

The Delaporte distribution is, again, an unusually good approximation, this time for the number of monochromatic arithmetic progressions. The reader may notice that for the last histogram in Figures 4 and 5 (with a sample size of 67K), there are spikes at the peak and that the Delaporte overlay misses these spikes. Based on our many simulations (only a fraction of which are shown in this article), we find that these spikes diminish as the sample size increases and that they settle near the Delaporte peak. We included this histogram to show what is expected when the sample size is relatively "small."

Having both the number of monochromatic complete subgraphs and the number of monochromatic arithmetic progressions producing such similar empirical histograms, we end with the following question:

Is there a “Delaporte Paradigm” for Ramsey objects?


References

[1] A. Adler, Delaporte: Statistical functions for the Delaporte distribution, R package version 3.0.0, 2016, https://CRAN.R-project.org/package=Delaporte.

[2] P. Allison, Estimation and testing for Markov model of reinforcement, Sociological Methods and Research 8 (1980), 434-453.

[3] N. Alon and J. Spencer, The Probabilistic Method, fourth edition, Wiley, New Jersey, 2015.

[4] K. Athreya and P. Ney, Branching Processes, Springer-Verlag, Berlin, 1972.

[5] B. Chauvin, N. Pouyanne, and R. Sahnoun, Limit distributions for large Pólya urns, Ann. Appl. Probab. 21 (2011), 1-32.

[6] A. Godbole, D. Skipper, and R. Sunley, The asymptotic lower bound of diagonal Ramsey numbers: a closer look, Disc. Prob. Algorithms 72 (1995), 81-94.

[7] S. Janson, Functional limit theorems for multitype branching processes and generalized Pólya urns, Stochastic Processes and their Appl. 110 (2004), 177-245.


[8] S. Janson, Poisson convergence and Poisson processes with applications to random graphs, Stochastic Processes and their Applications 26 (1987), 1-30.

[9] S. Janson, T. Luczak, and A. Rucinski, Random Graphs, Wiley-Interscience, New York, 2000.

[10] D. Joanes and C. Gill, Comparing measures of sample skewness and kurtosis, The Statistician 47 (1998), 183-189.

[11] M. Kouril, The van der Waerden number W(2,6) is 1132, Experiment. Math. 17 (2008), 53-61.

[12] J. Lawless, Negative binomial and mixed Poisson regression, Canad. J. Stat. 15 (1987), 209-225.

[13] A. Marshall and I. Olkin, Bivariate distributions generated from Pólya-Eggenberger urn models, J. Multivariate Anal. 35 (1990), 48-65.

[14] B. D. McKay and S. P. Radziszowski, R(4,5) = 25, Journal of Graph Theory 19 (1995), 309-322.

[15] R Core Team (2016), R: A language and environment for statistical computing, R Foundation for Statistical Computing, Vienna, Austria, https://www.R-project.org.

[16] G. Venter, Effects of variations from Gamma-Poisson assumptions, CAS Proceedings LXXVIII (1991), 41-55.

[17] D. Zeilberger, Symbolic moment calculus II: why is Ramsey theory sooooo eeeenormously hard, Integers 7(2) (2007), #A34.
