Brun’s Sieve

  • Brun's Sieve. Let $B_1, \ldots, B_m$ be events, $X_i$ the indicator random variable for $B_i$ and $X = X_1 + \cdots + X_m$ the number of $B_i$ that hold. Let there be a hidden parameter $n$ (so that actually $m = m(n)$, $B_i = B_i(n)$, $X = X(n)$) which will define the following $o$, $O$ notation.

    Define $S^{(r)} = \sum \Pr[B_{i_1} \wedge \cdots \wedge B_{i_r}]$, the sum over all sets $\{i_1, \ldots, i_r\} \subseteq \{1, \ldots, m\}$.

  • Theorem 8.3.1. Suppose there is a constant $\mu$ so that $E[X] = S^{(1)} \to \mu$ and such that for every fixed $r$, $E\!\left[\binom{X}{r}\right] = S^{(r)} \to \mu^r / r!$. Then $\Pr[X = 0] \to e^{-\mu}$ and indeed for every $t$, $\Pr[X = t] \to \frac{\mu^t}{t!} e^{-\mu}$.
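
As a sanity check (not part of the slides), the hypotheses and conclusion of Theorem 8.3.1 can be illustrated with the simplest possible choice of events: $m$ independent events, each of probability $\mu/m$, so that $X$ is Binomial$(m, \mu/m)$. The sketch below makes that assumption (the parameters are arbitrary) and verifies numerically that $S^{(r)} = \binom{m}{r}(\mu/m)^r \to \mu^r/r!$ and that $\Pr[X = t]$ approaches the Poisson value $e^{-\mu}\mu^t/t!$.

```python
from math import comb, exp, factorial

mu, t, r = 2.0, 3, 2   # illustrative parameters (assumed, not from the slides)

for m in (10, 100, 1000, 10000):
    p = mu / m                                      # each B_i holds independently with prob. mu/m
    S_r = comb(m, r) * p**r                         # S^(r) for independent events
    prob_t = comb(m, t) * p**t * (1 - p)**(m - t)   # Pr[X = t] for X ~ Binomial(m, mu/m)
    print(f"m={m:6d}  S^({r}) = {S_r:.6f} -> {mu**r/factorial(r):.6f}"
          f"   Pr[X={t}] = {prob_t:.6f} -> {exp(-mu)*mu**t/factorial(t):.6f}")
```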

  • Recall $S^{(r)} = \sum \Pr[B_{i_1} \wedge \cdots \wedge B_{i_r}]$, where $\{i_1, \ldots, i_r\}$ ranges over the $r$-subsets of $\{1, \ldots, m\}$; equivalently $E\!\left[\binom{X}{r}\right] = S^{(r)}$.

    The Inclusion-Exclusion Principle gives that $\Pr[X = 0] = \Pr[\bar{B}_1 \wedge \cdots \wedge \bar{B}_m] = 1 - S^{(1)} + S^{(2)} - \cdots + (-1)^r S^{(r)} + \cdots$

    Bonferroni's inequality: let $\Pr[E_i]$ be the probability that $E_i$ is true and $\Pr[E_1 \vee \cdots \vee E_n]$ the probability that at least one of $E_1, \ldots, E_n$ is true. Then $\Pr[E_1 \vee \cdots \vee E_n] \le \sum_{i=1}^{n} \Pr[E_i]$.
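
To make the over/under-estimation concrete, here is a small self-contained check (the three events on two fair coins are my own toy example, not from the slides): $\Pr[X = 0]$ is computed exactly by enumeration and compared against the truncated inclusion-exclusion sums $1 - S^{(1)}$, $1 - S^{(1)} + S^{(2)}$, and so on.

```python
from itertools import product, combinations

# Toy example (assumed): three overlapping events defined on two fair coin flips.
events = [
    lambda w: w[0] == 1,                # B1: first coin is heads
    lambda w: w[1] == 1,                # B2: second coin is heads
    lambda w: w[0] == 1 and w[1] == 1,  # B3: both coins are heads
]
outcomes = list(product([0, 1], repeat=2))   # uniform space, each outcome has probability 1/4

def S(r):
    """S^(r) = sum over r-subsets of Pr[all events in the subset hold]."""
    return sum(sum(all(B(w) for B in subset) for w in outcomes) / len(outcomes)
               for subset in combinations(events, r))

exact = sum(not any(B(w) for B in events) for w in outcomes) / len(outcomes)
print("exact Pr[X=0] =", exact)

partial = 1.0
for r in range(1, len(events) + 1):
    partial += (-1) ** r * S(r)
    rel = "=" if abs(partial - exact) < 1e-12 else (">" if partial > exact else "<")
    print(f"truncated at r={r}: {partial:+.4f}  ({rel} exact value)")
```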

  • Proof. We do only the case $t = 0$. Fix $\varepsilon > 0$. Choose $s$ so that

    $$\left| \sum_{r=0}^{2s} (-1)^r \frac{\mu^r}{r!} - e^{-\mu} \right| \le \varepsilon \quad \text{and} \quad \left| \sum_{r=0}^{2s+1} (-1)^r \frac{\mu^r}{r!} - e^{-\mu} \right| \le \varepsilon.$$

    The Bonferroni Inequalities state that, in general, the inclusion-exclusion formula alternately over- and underestimates $\Pr[X = 0]$. In particular,

    $$\Pr[X = 0] \le \sum_{r=0}^{2s} (-1)^r S^{(r)}.$$

    Select $n_0$ (the hidden variable) so that for $n \ge n_0$,

    $$\left| S^{(r)} - \frac{\mu^r}{r!} \right| \le \frac{\varepsilon}{2s+1} \qquad \text{for } 0 \le r \le 2s.$$

  • Proof (cont.) For such $n$,

    $$\Pr[X = 0] \le \sum_{r=0}^{2s} (-1)^r S^{(r)} \le \sum_{r=0}^{2s} (-1)^r \frac{\mu^r}{r!} + \varepsilon \le e^{-\mu} + 2\varepsilon.$$

    Similarly, taking the sum to $2s+1$ we find $n_0$ so that for $n \ge n_0$,

    $$\Pr[X = 0] \ge e^{-\mu} - 2\varepsilon.$$

    As $\varepsilon$ was arbitrary, $\Pr[X = 0] \to e^{-\mu}$. $\square$

  • Let $G \sim G(n,p)$ be the random graph and let EPIT represent the statement that every vertex lies in a triangle.

    Theorem 8.3.2. Let $c > 0$ be fixed and let $p = p(n)$, $\mu = \mu(n)$ satisfy

    $$\binom{n-1}{2} p^3 = \mu, \qquad e^{-\mu} n = c.$$

    Then $\Pr[G(n,p) \models \mathrm{EPIT}] \to e^{-c}$.
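
Theorem 8.3.2 is easy to probe by simulation. The sketch below is a rough illustration: the values of $n$, $c$, the trial count and the pure-Python graph representation are my own choices, and convergence in $n$ is slow, so only rough agreement with $e^{-c}$ should be expected. It draws samples of $G(n,p)$ with $p$ chosen from the two displayed equations and estimates $\Pr[\mathrm{EPIT}]$.

```python
import random
from math import comb, log, exp

def vertex_in_triangle(adj, x):
    """True if vertex x has two adjacent neighbours, i.e. x lies in a triangle."""
    nbrs = list(adj[x])
    return any(v in adj[u] for i, u in enumerate(nbrs) for v in nbrs[i + 1:])

def epit_probability(n, c, trials=200, seed=0):
    """Estimate Pr[G(n,p) |= EPIT] for the p of Theorem 8.3.2, alongside the limit e^{-c}."""
    rng = random.Random(seed)
    mu = log(n / c)                        # from e^{-mu} * n = c
    p = (mu / comb(n - 1, 2)) ** (1 / 3)   # from C(n-1,2) * p^3 = mu
    hits = 0
    for _ in range(trials):
        adj = [set() for _ in range(n)]
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p:
                    adj[u].add(v)
                    adj[v].add(u)
        hits += all(vertex_in_triangle(adj, x) for x in range(n))
    return hits / trials, exp(-c)

print(epit_probability(n=300, c=1.0))   # (empirical estimate, limiting value e^{-c})
```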

  • Proof. First fix $x \in V(G)$. For each unordered pair $y, z \in V(G) - \{x\}$ let $B_{xyz}$ be the event that $\{x, y, z\}$ is a triangle of $G$. Let $C_x$ be the event $\bigwedge_{y,z} \bar{B}_{xyz}$ (that $x$ lies in no triangle) and let $X_x$ be the corresponding indicator random variable. We use Janson's Inequality to bound $E[X_x] = \Pr[C_x]$. Here $p = o(1)$ so $\varepsilon = o(1)$, and $\sum_{y,z} \Pr[B_{xyz}] = \binom{n-1}{2} p^3 = \mu$ as defined above.

  • Proof (cont.) Dependency $xyz \sim xuv$ occurs if and only if the two sets overlap in a vertex other than $x$. Hence

    $$\Delta = \sum_{xyz \sim xuv} \Pr[B_{xyz} \wedge B_{xuv}] = O(n^3 p^5) = o(1),$$

    since there are $O(n^3)$ such pairs, each joint event requires five edges, and $p = n^{-2/3 + o(1)}$. Thus by Janson's Inequality

    $$E[X_x] = \Pr[C_x] \sim e^{-\mu} = \frac{c}{n}.$$

    Now define $X = \sum_x X_x$,

    the number of vertices $x$ not lying in a triangle. Then from Linearity of Expectation,

    $$E[X] = \sum_x E[X_x] \sim n \cdot \frac{c}{n} = c.$$
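
The two order-of-growth claims above can be checked numerically. The script below is my own illustration (with $c = 1$ assumed): it prints the estimate $n^3 p^5$ for $\Delta$ and its exponent in $n$, which drifts toward $-1/3$ only slowly because of the hidden $(\log n)^{5/3}$ factor, together with $n e^{-\mu}$, which equals $c$ by construction.

```python
from math import comb, log, exp

c = 1.0   # assumed illustrative constant
for k in range(3, 16, 3):
    n = 10 ** k
    mu = log(n / c)                         # e^{-mu} * n = c
    p = (mu / comb(n - 1, 2)) ** (1 / 3)    # C(n-1,2) * p^3 = mu
    delta = n ** 3 * p ** 5                 # order of the correlation term Delta
    print(f"n=10^{k:<2}  mu={mu:6.2f}  Delta ~ {delta:9.3e}"
          f"  exponent={log(delta) / log(n):+.3f}  n*e^(-mu)={n * exp(-mu):.2f}")
```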

  • Proof (cont.) We need to show that the Poisson Paradigm applies to $X$. Fix $r$. Then

    $$E\!\left[\binom{X}{r}\right] = S^{(r)} = \sum \Pr[C_{x_1} \wedge \cdots \wedge C_{x_r}],$$

    the sum over all sets of vertices $\{x_1, \ldots, x_r\}$. All $r$-sets look alike so

    $$S^{(r)} = \binom{n}{r} \Pr[C_{x_1} \wedge \cdots \wedge C_{x_r}],$$

    where $x_1, \ldots, x_r$ are some particular vertices. But

    $$C_{x_1} \wedge \cdots \wedge C_{x_r} = \bigwedge \bar{B}_{x_i y z},$$

    the conjunction over $1 \le i \le r$ and all $y, z$.

  • Proof (cont.) We apply Janson's Inequality to this conjunction. Again $\varepsilon = p^3 = o(1)$. The number of sets $\{x_i, y, z\}$ is $r\binom{n-1}{2} - O(n)$, the overcount coming from those triangles containing two (or three) of the $x_i$. (Here it is crucial that $r$ is fixed.) Thus

    $$\sum \Pr[B_{x_i y z}] = r\mu - O(np^3) = r\mu - o(1).$$

    As before, $\Delta$ is $p^5$ times the number of pairs $x_i y z \sim x_j y' z'$. There are $O(rn^3) = O(n^3)$ terms with $i = j$ and $O(r^2 n^2) = O(n^2)$ terms with $i \ne j$, so again $\Delta = o(1)$. Therefore

    $$\Pr[C_{x_1} \wedge \cdots \wedge C_{x_r}] \sim e^{-r\mu} = \left(\frac{c}{n}\right)^r$$

    and

    $$S^{(r)} \sim \binom{n}{r}\left(\frac{c}{n}\right)^r \to \frac{c^r}{r!},$$

    so the hypotheses of Theorem 8.3.1 hold with constant $c$ and $\Pr[X = 0] \to e^{-c}$, which is the statement of Theorem 8.3.2. $\square$
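
The last limit is elementary but worth seeing numerically; the snippet below (the values of $c$ and $r$ are arbitrary choices of mine) shows $\binom{n}{r}(c/n)^r$ approaching $c^r/r!$.

```python
from math import comb, factorial

c, r = 1.5, 3   # assumed illustrative values
for n in (10, 100, 1000, 10000):
    print(f"n={n:6d}   C(n,{r})*(c/n)^{r} = {comb(n, r) * (c / n) ** r:.6f}"
          f"   c^{r}/{r}! = {c ** r / factorial(r):.6f}")
```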

  • Large Deviations. Given a point in the probability space (i.e., a selection of $R$) we call an index set $J \subseteq I$ a disjoint family (abbreviated disfam) if

    $B_j$ holds for every $j \in J$, and for no $j, j' \in J$ is $j \sim j'$.

    If, in addition, whenever $j' \notin J$ and $B_{j'}$ holds then $j \sim j'$ for some $j \in J$, then we call $J$ a maximal disjoint family (abbreviated maxdisfam).
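
These definitions become concrete in the standard Janson setup, where each $B_j$ is the event that a fixed subset $A_j$ of the ground set $R$ comes up "all heads" and $j \sim j'$ means $A_j \cap A_{j'} \ne \emptyset$. The sketch below uses a small hand-picked instance (the sets $A_j$ and the coin model are my own illustration) and greedily extracts one maxdisfam from the events that hold at a sampled point.

```python
import random

# Toy instance (assumed): R = {0,...,5} are independent fair coins; event B_j says
# "every coin in A_j came up heads"; j ~ j' means the sets A_j and A_j' overlap.
A = {0: {0, 1}, 1: {1, 2}, 2: {3}, 3: {3, 4}, 4: {5}}

def related(j, k):
    return j != k and bool(A[j] & A[k])

rng = random.Random(7)
point = {r: rng.random() < 0.5 for r in range(6)}         # one selection of R
holds = [j for j in A if all(point[r] for r in A[j])]     # indices j with B_j holding

# Greedily build a maximal disjoint family out of the events that hold.
maxdisfam = []
for j in holds:
    if not any(related(j, k) for k in maxdisfam):
        maxdisfam.append(j)

print("events that hold:", holds)
print("one maxdisfam:   ", maxdisfam)
# Defining properties: members pairwise non-related; every other holding event
# is related to some member of the family.
assert not any(related(j, k) for j in maxdisfam for k in maxdisfam)
assert all(j in maxdisfam or any(related(j, k) for k in maxdisfam) for j in holds)
```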

  • Lemma 8.4.1. With the above notation and for any integer $s$,

    $$\Pr[\text{there exists a disfam } J, |J| = s] \le \frac{\mu^s}{s!}.$$

    Proof. Let $\sum^*$ denote the sum over all $s$-sets $J \subseteq I$ with no $j \sim j'$. Let $\sum^{o}$ denote the sum over ordered $s$-tuples $(j_1, \ldots, j_s)$ with $\{j_1, \ldots, j_s\}$ forming such a $J$. Let $\sum^{oo}$ denote the sum over all ordered $s$-tuples $(j_1, \ldots, j_s)$.

  • Proof (cont.) Since the $B_{j_l}$ with no $j_l \sim j_{l'}$ are mutually independent and $\mu = \sum_{i \in I} \Pr[B_i]$,

    $$\Pr[\text{there exists a disfam } J, |J| = s] \le \sum{}^{*} \Pr\Big[\bigwedge_{j \in J} B_j\Big] = \frac{1}{s!} \sum{}^{o} \Pr[B_{j_1}] \cdots \Pr[B_{j_s}] \le \frac{1}{s!} \sum{}^{oo} \Pr[B_{j_1}] \cdots \Pr[B_{j_s}] = \frac{\mu^s}{s!}. \qquad \square$$

  • For smaller $s$ we look at the further condition of $J$ being a maxdisfam. To that end we let $\mu_s$ denote the minimum, over all $j_1, \ldots, j_s$, of $\sum \Pr[B_i]$, the sum taken over all $i \in I$ except those $i$ with $i \sim j_l$ for some $1 \le l \le s$. In applications $s$ will be small (otherwise we use Lemma 8.4.1) and $\mu_s$ will be close to $\mu$. For some applications it is convenient to set

    $$\nu = \max_{j \in I} \sum_{i \sim j} \Pr[B_i]$$

    and note that $\mu_s \ge \mu - s\nu$.

  • Lemma 8.4.2. With the above notation and for any integer $s$,

    $$\Pr[\text{there exists a maxdisfam } J, |J| = s] \le \frac{\mu^s}{s!} e^{-\mu_s + \Delta/2}.$$

    Proof. As in Lemma 8.4.1 we bound this probability by $\sum^*$ of the probability that $J = \{j_1, \ldots, j_s\}$ is a maxdisfam. For this to occur $J$ must first be a disfam and then $\bigwedge \bar{B}_i$ must hold, where the conjunction is over all $i \in I$ except those with $i \sim j_l$ for some $1 \le l \le s$.

  • Proof (cont.) We apply Janson's Inequality to give an upper bound to $\Pr[\bigwedge \bar{B}_i]$. The associated values $\mu^*, \Delta^*$ satisfy

    $$\mu^* \ge \mu_s, \qquad \Delta^* \le \Delta,$$

    the latter since $\Delta^*$ has simply fewer addends. Thus

    $$\Pr\Big[\bigwedge \bar{B}_i\Big] \le e^{-\mu^* + \Delta^*/2} \le e^{-\mu_s + \Delta/2}$$

    and

    $$\Pr[\text{there exists a maxdisfam } J, |J| = s] \le \sum{}^{*} \Pr[B_{j_1}] \cdots \Pr[B_{j_s}]\, e^{-\mu_s + \Delta/2} \le \frac{\mu^s}{s!} e^{-\mu_s + \Delta/2}. \qquad \square$$

  • When $\Delta = o(1)$ and $\nu = o(1)$ or, more generally, $\mu_{3\mu} = \mu + o(1)$, Lemma 8.4.2 gives a close approximation to the Poisson Distribution, since then

    $$\Pr[\text{there exists a maxdisfam } J, |J| = s] \le \frac{\mu^s}{s!} e^{-\mu} (1 + o(1))$$

    for $s \le 3\mu$, and the probability is quite small for larger $s$ by Lemma 8.4.1.
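
As a quick numeric illustration of the last remark (my own arithmetic, with $\mu = 4$ assumed), the Lemma 8.4.1 bound $\mu^s/s!$ collapses rapidly once $s$ passes roughly $3\mu$, so the Poisson-shaped bound of Lemma 8.4.2 is only needed for small $s$.

```python
from math import exp, factorial

mu = 4.0   # assumed illustrative value
for s in range(0, 25, 4):
    bound = mu ** s / factorial(s)                # Lemma 8.4.1 bound on Pr[some disfam of size s]
    poisson = exp(-mu) * mu ** s / factorial(s)   # Poisson-shaped bound from Lemma 8.4.2
    print(f"s={s:2d}   mu^s/s! = {bound:10.3e}   e^-mu * mu^s/s! = {poisson:10.3e}")
```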