7/30/2019 The Uniform Distributn
The Uniform Distribution
A continuous random variable X which has probability density function given by:

f(x) = 1/(b − a) for a ≤ x ≤ b    (1)

and f(x) = 0 if x is not between a and b, follows a uniform distribution with parameters a and b. We write X ~ U(a, b).
Remember that the area under the graph of the probability density function must be equal to 1 (see continuous random variables).
Expectation and Variance
If X ~ U (a , b), then:
E(X) = (a + b)/2,    Var(X) = (1/12)(b − a)²
Proof of Expectation
E(X) = ∫_a^b x/(b − a) dx = [x²/(2(b − a))]_a^b = (b² − a²)/(2(b − a)) = (a + b)/2
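As a quick numerical check of these two results (a sketch, not part of the original text; the parameter values a = 2, b = 5 and the grid size are arbitrary), a midpoint sum over the p.d.f. reproduces E(X) = (a + b)/2 and Var(X) = (b − a)²/12:

```python
# Midpoint-rule integration of the U(a, b) density to check
# E(X) = (a + b)/2 and Var(X) = (b - a)^2 / 12 numerically.
a, b = 2.0, 5.0                 # arbitrary example parameters
n = 100_000                     # number of subintervals
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]

mean = sum(x / (b - a) for x in xs) * dx               # ~ (a + b)/2 = 3.5
var = sum((x - mean) ** 2 / (b - a) for x in xs) * dx  # ~ (b - a)^2/12 = 0.75
print(mean, var)
```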
Cumulative Distribution Function
The cumulative distribution function can be found by integrating the p.d.f. between a and t:

F(t) = ∫_a^t 1/(b − a) dx = (t − a)/(b − a)
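This c.d.f. is simple enough to code directly; a minimal sketch (the function name uniform_cdf is our own label):

```python
def uniform_cdf(t, a, b):
    """C.d.f. of U(a, b): 0 for t < a, (t - a)/(b - a) on [a, b], 1 for t > b."""
    if t < a:
        return 0.0
    if t > b:
        return 1.0
    return (t - a) / (b - a)

# The midpoint of [a, b] has cumulative probability 1/2:
print(uniform_cdf(3.5, 2, 5))   # 0.5
```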
Methods for generating random numbers from a uniform distribution:
First: Inverse Transform method
Suppose we can generate a uniform random number r on [0, 1]. How can we generate numbers x with a given pdf P(x)?
To warm up our brain, let's first think about something else. Suppose we generate a uniform random number 0 < r < 1 and square it, so we have x = r². Clearly we also have 0 < x < 1. What is the pdf P(x) for x? A wild guess might be that it is just the square of the pdf for r, so x would also be uniform. It is however easy to see that this can't be true. Consider the probability p that x < 1/2. If x were uniform on [0, 1] we would get p = 1/2. In order to get an x < 1/2 we must have gotten an r < 1/√2. The probability for this is p = 1/√2 ≈ 0.707, not p = 1/2. So P(x) can't be uniform.
To figure out what P(x) is, consider the probability p that x falls in the interval [x, x + Δx]. In the limit Δx → 0 we have, according to the definition of a pdf, p = P(x) Δx. So if we can figure out p, we can compute P(x).

For x to fall in [x, x + Δx], we must generate r in the range √x ≤ r ≤ √(x + Δx). Since r is uniform, the probability for this (which is also p) is just the length of this interval. So we get

p = √(x + Δx) − √x

and we now use p = P(x) Δx and solve for P(x), obtaining

P(x) = (√(x + Δx) − √x) / Δx

Taking the limit Δx → 0 we thus obtain

P(x) = lim_{Δx→0} (√(x + Δx) − √x)/Δx = d√x/dx = 1/(2√x)
With this result, we now have an instance of the inverse transform method to generate random numbers with pdf 1/(2√x): generate a uniform random number r and square it.
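The warm-up claim is easy to confirm empirically (a sketch; the seed and sample count are arbitrary choices): squaring uniform random numbers and estimating P(x < 1/2) gives a value near 1/√2 ≈ 0.707 rather than 1/2:

```python
import random

random.seed(12345)              # fixed seed for a repeatable run
n = 200_000
xs = [random.random() ** 2 for _ in range(n)]   # x = r^2

# Fraction of samples below 1/2; the c.d.f. F(x) = sqrt(x) of the
# density 1/(2*sqrt(x)) predicts F(1/2) = 1/sqrt(2), not 1/2.
p = sum(x < 0.5 for x in xs) / n
print(p)
```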
The general form of the inverse transform method is obtained by computing the pdf of x = g(r) for some function g, and then trying to find a function g such that the desired pdf is obtained.
Let us assume that g(r) is invertible with inverse g⁻¹(x). The chance that x lies in the interval [x, x + dx] is P(x) dx, for infinitesimal dx. What values of r should we have gotten to get this? (Remember we are generating values x by calling a uniform random number generator to get r and then setting x = g(r).)
We should have gotten an r in the interval [r, r + dr], with r = g⁻¹(x) and r + dr = g⁻¹(x + dx). Writing out the last formula to first order gives

r + dr = g⁻¹(x) + (g⁻¹)′(x) dx,

where ′ denotes the derivative. Using r = g⁻¹(x) we can simplify this to

dr = (g⁻¹)′(x) dx

The probability for the r value to be in the interval [r, r + dr] is just dr. This is also the probability for x to be in [x, x + dx], which is P(x) dx. Equating the two, we get

P(x) dx = (g⁻¹)′(x) dx,

thus

P(x) = (g⁻¹)′(x).

Integrating both sides and remembering that F(x) = ∫_{−∞}^{x} P(y) dy gives us

F(x) = g⁻¹(x),

or

g(x) = F⁻¹(x),

provided F(x) has an inverse.

In summary, to generate a number x with pdf P(x) using the inverse transform method, we first figure out the cdf F(x) from P(x). We then invert that by solving r = F(x) for x, which gives the function F⁻¹(r). We then generate a uniform random number 0 < r < 1 and compute x = F⁻¹(r).
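As a worked instance of this summary (the exponential distribution is our illustrative choice, not an example from the text): for P(x) = λe^(−λx) the c.d.f. is F(x) = 1 − e^(−λx), and solving r = F(x) gives F⁻¹(r) = −ln(1 − r)/λ:

```python
import math
import random

def sample_exponential(lam, rng=random):
    """Inverse transform sampling for the exponential distribution:
    pdf P(x) = lam * exp(-lam*x), cdf F(x) = 1 - exp(-lam*x),
    so x = F^{-1}(r) = -ln(1 - r) / lam for uniform r on [0, 1)."""
    r = rng.random()
    return -math.log(1.0 - r) / lam

random.seed(1)                        # fixed seed for a repeatable run
samples = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))    # should be close to 1/lam = 0.5
```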
Second: Acceptance and rejection method
Let u and v be two independent random numbers with uniform distribution in the intervals 0 < u ≤ 1 and −1 ≤ v ≤ 1. Do the transformation x = s·v/u + a, y = u². The rectangle in the (u, v) plane is transformed into a hat function y = h(x) in the (x, y) plane. All (x, y) points will fall under the hat curve y = h(x), which is uniform in the center and falls off like 1/(x − a)² in the tails. h(x) is a useful hat function for the rejection method. The acceptance condition is y < f(x)/k. s and a are chosen so that f(x) ≤ k·h(x) for all x, where

h(x) = 1 for a − s ≤ x < a + s
h(x) = s²/(x − a)² elsewhere

The advantage of this method is that the calculations are simple and fast, and the rejection rate is reasonable. Quick acceptance and quick rejection areas can be applied. For discrete distributions, f(x) is replaced by f(⌊x⌋).
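A minimal sketch of this transformation for a continuous target, the standard normal density (our choice of example; the text applies the method to discrete distributions): taking a = 0 (the mode), k = f(0) and s = √(2/e) makes f(x) ≤ k·h(x) hold for all x, and the acceptance condition y < f(x)/k reduces to u² < exp(−x²/2):

```python
import math
import random

def normal_ratio_of_uniforms(rng=random):
    """Sample a standard normal via x = s*v/u + a, y = u^2,
    accepting when y < f(x)/k.  Here a = 0 (the mode), k = f(0),
    and s = sqrt(2/e) guarantees f(x) <= k*h(x) for all x."""
    s = math.sqrt(2.0 / math.e)
    while True:
        u = rng.random()                 # uniform on [0, 1)
        if u == 0.0:
            continue                     # avoid division by zero
        v = 2.0 * rng.random() - 1.0     # uniform on [-1, 1]
        x = s * v / u
        if u * u < math.exp(-0.5 * x * x):   # y < f(x)/k
            return x

random.seed(7)
samples = [normal_ratio_of_uniforms() for _ in range(50_000)]
m = sum(samples) / len(samples)
sd = math.sqrt(sum((z - m) ** 2 for z in samples) / len(samples))
print(m, sd)        # mean ~ 0, standard deviation ~ 1
```

The rejection rate here is about 27 percent, which matches the text's remark that the rate is reasonable.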
The following values are used for the hat parameters for the poisson, binomial and hypergeometric distributions:

a = μ + 1/2,  s = √((2/e)(σ² + 1/2)) + 3/2 − √(3/e)
where μ is the mean and σ² is the variance of the distribution (Ahrens & Dieter 1989). These values are reasonably close to the optimal values. It is possible to calculate the optimal values for a and s (Stadlober 1989, 1990), but this adds to the set-up time with only a marginal improvement in execution time. The optimal value of k is of course the maximum of f(x): k = f(M), where M is the mode.
This method is used for the poisson, binomial and hypergeometric distributions in StochasticLib.
Third: Convolution method
Convolution of Uniform Distributions
Consider a sum X of independent and uniformly distributed random variables Xᵢ ~ U[aᵢ, bᵢ], i = 1, …, n:

X = X₁ + … + Xₙ

then the following is true.

The sum X is symmetrically distributed around

(1/2)((a₁ + … + aₙ) + (b₁ + … + bₙ))
If

Σ_{i=1}^{n} (bᵢ − aᵢ)² → ∞ for n → ∞    (6)

then by the central limit theorem the distribution of

(Σ_{i=1}^{n} Xᵢ − E[Σ_{i=1}^{n} Xᵢ]) / √(V[Σ_{i=1}^{n} Xᵢ])

tends for n → ∞ to the standard normal distribution
Hence in case (6) and for sufficiently large n the distribution of X can be approximated by the normal distribution N(Σ_{i=1}^{n} E[Xᵢ], Σ_{i=1}^{n} V[Xᵢ]). However, it turns out that in spite of the symmetry relation convergence to the
normal distribution may be rather slow if the lengths bᵢ − aᵢ are very different, implying that the normal approximation might be bad even for large values of n. Therefore, the exact distribution of X would be desirable.
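A classic, if crude, instance of the convolution method follows directly from this normal approximation (our illustration, not from the text): the sum of twelve independent U(0, 1) variables has mean 6 and variance 12 · (1/12) = 1, so subtracting 6 yields an approximately standard normal variate:

```python
import random

def approx_normal(rng=random):
    """Convolution-method normal generator: sum of 12 independent
    U(0, 1) variables, shifted by its mean 6; variance is 12/12 = 1."""
    return sum(rng.random() for _ in range(12)) - 6.0

random.seed(42)                 # fixed seed for a repeatable run
samples = [approx_normal() for _ in range(50_000)]
m = sum(samples) / len(samples)
var = sum((z - m) ** 2 for z in samples) / len(samples)
print(m, var)                   # mean ~ 0, variance ~ 1
```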
Convolution of Identical Uniform Distributions
Historically, the case of independent and identically uniformly distributed Xᵢ played an important role, i.e.

X = X₁ + … + Xₙ with Xᵢ ~ U[a; b], i = 1, …, n    (8)

The distribution of X was first studied by N. I. Lobatchewski in 1842. He wanted to use it to evaluate the error of astronomical measurements, in order to decide whether the Euclidean or the non-Euclidean geometry is valid in the universe.
The density function of X is

f(x) = (1/((n − 1)!(b − a)ⁿ)) Σ_{i=0}^{ν̃} (−1)ⁱ C(n, i) (x − na − i(b − a))^{n−1}   if na ≤ x ≤ nb
f(x) = 0   otherwise

where C(n, i) denotes the binomial coefficient and

ν̃ = ν̃(n, x) = ⌊(x − na)/(b − a)⌋, the largest integer less than (x − na)/(b − a)
It is well-known that the speed of convergence to the normal distribution is extremely fast in the identically distributed case given by (8). Already for n = 4 the difference between the normal approximation and the exact distribution is often negligible.
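The density formula is straightforward to evaluate for a = 0, b = 1 (the sum of n standard uniforms, often called the Irwin-Hall distribution), which lets us see the n = 4 agreement directly; the function name irwin_hall_pdf is our own label:

```python
import math

def irwin_hall_pdf(x, n):
    """Exact density of X = X_1 + ... + X_n with X_i ~ U(0, 1),
    i.e. the convolution formula above with a = 0, b = 1."""
    if x < 0 or x > n:
        return 0.0
    nu = int(x)                 # largest integer <= x
    total = sum((-1) ** i * math.comb(n, i) * (x - i) ** (n - 1)
                for i in range(nu + 1))
    return total / math.factorial(n - 1)

# At the mode x = n/2 = 2 for n = 4: exact value 2/3, while the
# normal approximation N(2, 4/12) gives 1/sqrt(2*pi/3) ~ 0.691.
exact = irwin_hall_pdf(2.0, 4)
approx = 1.0 / math.sqrt(2.0 * math.pi / 3.0)
print(exact, approx)
```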
If the single distributions are not identical, but have a common length of their support,

bᵢ − aᵢ = b − a for i = 1, …, n,

then it is possible to reduce the problem of deriving the
distribution of X to the identically distributed case by suitable transformations. However, allowing arbitrary uniform distributions requires a different approach for determining the distribution function of X.
Convolution of Arbitrary Uniform Distributions
The convolution of arbitrary uniform distributions can be obtained by a result given in [1] which refers to the distribution function of a linear combination of independent U[0, 1]-distributed random variables. However, the given formula is rather unsuited for practical applications and, therefore, a different representation of the distribution function is derived here.