
Chapter 9: THE GAUSSIAN AND RELATED DISTRIBUTIONS

Charles Boncelet, "Probability, Statistics, and Random Signals," Oxford University Press, 2016. ISBN: 978-0-19-020051-0

Notes and figures are based on or taken from materials in the course textbook. B.J. Bazuin, Spring 2022, ECE 3800.

Sections
9.1 The Gaussian Distribution and Density
9.2 Quantile Function
9.3 Moments of the Gaussian Distribution
9.4 The Central Limit Theorem
9.5 Related Distributions
  9.5.1 The Laplace Distribution
  9.5.2 The Rayleigh Distribution
  9.5.3 The Chi-Squared and F Distributions
9.6 Multiple Gaussian Random Variables
  9.6.1 Independent Gaussian Random Variables
  9.6.2 Transformation to Polar Coordinates
  9.6.3 Two Correlated Gaussian Random Variables
9.7 Example: Digital Communications Using QAM
  9.7.1 Background
  9.7.2 Discrete Time Model
  9.7.3 Monte Carlo Exercise
  9.7.4 QAM Recap
Summary
Problems


Gaussian Distribution and Density

The Gaussian or Normal probability density function is defined as:

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

where $\mu$ is the mean and $\sigma^2$ is the variance.

The Gaussian Cumulative Distribution Function (CDF)

$$F_X(x) = \int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(v-\mu)^2}{2\sigma^2}\right)dv$$

The CDF cannot be expressed in closed form!

Normal Distribution – Gaussian with zero mean and unit variance.

The Normal probability density function is defined as:

$$f_N(x) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right), \quad -\infty < x < \infty$$

The Normal Cumulative Distribution Function (CDF)

$$F_N(x) = \int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{v^2}{2}\right)dv$$

Note the relationship between the Gaussian and the Normal (standard Gaussian):

$$F_X(x) = F_N\!\left(\frac{x-\mu}{\sigma}\right)$$

see the MATLAB: GaussianDemo.m

[Figure: Gaussian pdf and CDF plotted over -8 ≤ x ≤ 8 (from GaussianDemo.m).]
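GaussianDemo.m itself is not reproduced in these notes; the following is a minimal sketch (with assumed values of mu and sigma) that produces a comparable plot, using the erf-based closed form for the CDF:

x     = -8:0.01:8;
mu    = 0;  sigma = 2;                                  % assumed example values
pdfX  = exp(-(x-mu).^2/(2*sigma^2))/(sqrt(2*pi)*sigma); % Gaussian pdf
cdfX  = 0.5 + 0.5*erf((x-mu)/(sqrt(2)*sigma));          % Gaussian CDF via erf
plot(x, pdfX, x, cdfX); grid on
xlabel('x'); title('Gaussian pdf and CDF');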


Gaussian or Normal Distribution

http://en.wikipedia.org/wiki/Normal_distribution http://en.wikipedia.org/wiki/Normal_distribution#Occurrence

Reasons for its importance (stated here without proof):

1. It provides a good mathematical model for a great many different physically observed random phenomena that can be justified theoretically in many ways.

2. It is one of the few density functions that can be extended to handle an arbitrarily large number of random variables conveniently.

3. Linear combinations of Gaussian random variables lead to new random variables that are also Gaussian. This is not true for most other density functions.

4. The random process from which Gaussian random variables are derived can be completely specified, in a statistical sense, from a knowledge of the first and second moments. This is not true for other processes. All higher level moments are sums, products and/or powers of the mean and variance.

5. In system analysis, the Gaussian process is often the only one for which a complete statistical analysis can be carried through in either the linear or nonlinear situation.

6. The function is infinitely differentiable (all the derivatives exist).


Important notes on the Gaussian curve:

The pdf

1. There is only one maximum and it occurs at the mean value.

2. The density function is symmetric about the mean value.

3. The width of the density function is directly proportional to the standard deviation, σ. The width of 2σ occurs at the points where the height is 0.6065 (exp(-0.5)) of the maximum value. These are also the points of maximum slope. Also note that:

$$\Pr(\mu - \sigma \le X \le \mu + \sigma) \approx 0.683$$

$$\Pr(\mu - 2\sigma \le X \le \mu + 2\sigma) \approx 0.955$$

4. The maximum value of the density function is inversely proportional to the standard deviation, σ:

$$f_X(\mu) = \frac{1}{\sqrt{2\pi}\,\sigma}$$

5. Since the density function has an area of unity, it can be used as a representation of the impulse or delta function by letting σ approach zero. That is,

$$\delta(x-\mu) = \lim_{\sigma \to 0}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$


Specific Values for the Standard Normal CDF

$$\varphi(x) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right), \quad -\infty < x < \infty$$

where the mean is $\mu = 0$ and the variance is $\sigma^2 = 1$.

$$\Phi(x) = \int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{v^2}{2}\right)dv$$

$$\Pr(-1 \le X \le 1) \approx 0.683$$

$$\Pr(-2 \le X \le 2) \approx 0.955$$


Gaussian to Normal is a linear scaling

Letting the linear relationship be defined as

$$Z = \frac{X-\mu}{\sigma}$$

The inverse mapping is $X = Z\sigma + \mu$, and the Jacobian (derivative) is

$$\frac{dx}{dz} = \sigma$$

Therefore,

$$f_Z(z) = f_X(x)\cdot\frac{dx}{dz}$$

Then, for the normalized form of the R.V.,

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

$$f_Z(z) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(z\sigma + \mu - \mu)^2}{2\sigma^2}\right)\cdot\sigma$$

$$f_Z(z) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2\sigma^2}{2\sigma^2}\right)$$

$$f_Z(z) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right) = \varphi(z)$$

In addition, we would expect

$$F_X(x) = \Phi(z) = \Phi\!\left(\frac{x-\mu}{\sigma}\right)$$


Two-sided Gaussian Probability

$$\Pr(\mu - \sigma \le x \le \mu + \sigma) = 0.6827$$

$$\Pr(\mu - 2\sigma \le x \le \mu + 2\sigma) = 0.9545$$

$$\Pr(\mu - 3\sigma \le x \le \mu + 3\sigma) = 0.9973$$

One-Sided Gaussian Probability

$$\Pr(x \le \mu) = 0.5$$

$$\Pr(x \le \mu + \sigma) = 0.8413$$

$$\Pr(x \le \mu + 2\sigma) = 0.9772$$

$$\Pr(x \le \mu + 3\sigma) = 0.9987$$

For hypothesis testing and statistical confidence intervals … there will be multiple problems and examples where either a two-sided or one-sided Gaussian probability is required. There are differences in the solutions derived if the wrong one is selected!


Equivalent Gaussian probability representations

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

$$\varphi(z) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)$$

Manipulations

$$\Pr(a < X \le b) = F_X(b) - F_X(a)$$

$$\Pr(a < X \le b) = \Pr(a - \mu < X - \mu \le b - \mu) \quad \text{(shifting the mean)}$$

$$\Pr(a < X \le b) = \Pr\!\left(\frac{a-\mu}{\sigma} < \frac{X-\mu}{\sigma} \le \frac{b-\mu}{\sigma}\right) \quad \text{(linear scaling)}$$

$$Z = \frac{X-\mu}{\sigma}$$

$$\Pr(a < X \le b) = \Pr\!\left(\frac{a-\mu}{\sigma} < Z \le \frac{b-\mu}{\sigma}\right) \quad \text{(normalization)}$$

Using the normalized probability,

$$\Pr(a < X \le b) = \Phi\!\left(\frac{b-\mu}{\sigma}\right) - \Phi\!\left(\frac{a-\mu}{\sigma}\right)$$

The normalization of the Gaussian is often implemented using "Z". Computations with this standard normalization are referred to as z-scores.

Equivalent Probabilities

$$\Pr(Z \le b) = \Phi(b)$$

$$\Pr(Z > a) = 1 - \Phi(a)$$

$$\Pr(a < Z \le b) = \Phi(b) - \Phi(a)$$

Also note $\Phi(-z) = 1 - \Phi(z)$.
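As a quick illustration of the z-score manipulation (a minimal sketch with assumed values of mu, sigma, a, and b, not an example from the textbook):

mu = 3; sigma = 2; a = 1; b = 6;               % assumed example values
Phi = @(z) 0.5 + 0.5*erf(z/sqrt(2));           % standard normal CDF
p   = Phi((b-mu)/sigma) - Phi((a-mu)/sigma)    % Pr(a < X <= b)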


Other relationships with the normalized Gaussian

$$\Pr(-a < Z \le a) = \Phi(a) - \Phi(-a) = \Phi(a) - \bigl(1 - \Phi(a)\bigr)$$

$$\Pr(-a < Z \le a) = 2\,\Phi(a) - 1$$

or, in general,

$$\Pr(-a < Z \le b) = \Phi(b) - \Phi(-a) = \Phi(b) - \bigl(1 - \Phi(a)\bigr)$$

$$\Pr(-a < Z \le b) = \Phi(b) + \Phi(a) - 1$$

Performing Computations

The error function is typically defined as

$$\operatorname{erf}(z) = \frac{2}{\sqrt{\pi}}\int_{0}^{z}\exp(-y^2)\,dy$$

$$\Phi(z) = \frac{1}{2} + \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right), \qquad Z = \frac{X-\mu}{\sigma}$$

$$F_X(x) = \frac{1}{2} + \frac{1}{2}\operatorname{erf}\!\left(\frac{x-\mu}{\sqrt{2}\,\sigma}\right)$$

For multiple bounds,

$$\Pr(a < X \le b) = F_X(b) - F_X(a) = \frac{1}{2}\operatorname{erf}\!\left(\frac{b-\mu}{\sqrt{2}\,\sigma}\right) - \frac{1}{2}\operatorname{erf}\!\left(\frac{a-\mu}{\sqrt{2}\,\sigma}\right)$$

This definition is the one used by MATLAB, Excel, and Wikipedia. Other sources do not define the error function this way, so check before use!

$$\Phi(z) = \frac{1}{2} + \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right)$$


The complementary error function is defined as

$$\operatorname{erfc}(z) = 1 - \operatorname{erf}(z) = \frac{2}{\sqrt{\pi}}\int_{z}^{\infty}\exp(-y^2)\,dy$$

$$\Phi(-z) = 1 - \Phi(z) = 1 - \frac{1}{2} - \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right)$$

$$\Phi(-z) = 1 - \Phi(z) = \frac{1}{2} - \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right) = \frac{1}{2}\operatorname{erfc}\!\left(\frac{z}{\sqrt{2}}\right)$$

There are also inverse functions for erf and erfc!

$$z = \Phi^{-1}(\Pr) = \sqrt{2}\cdot\operatorname{erfinv}(2\Pr - 1)$$

The Q function in communications is "the tail of the Gaussian":

$$Q(z) = 1 - \Phi(z) = \frac{1}{2} - \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right) = \frac{1}{2}\operatorname{erfc}\!\left(\frac{z}{\sqrt{2}}\right)$$

See gaussian.m and qfunction.m

function [pdf, cdf] = gaussian(x, mean, sigma)
% The Gaussian probability density function given:
%   mean  is the mean
%   sigma is the standard deviation
%   pdf   is the probability density function
%   cdf   is the cumulative distribution function
pdf = (1/(sqrt(2*pi)*sigma))*exp((-(x-mean).^2)/(2*sigma^2));
cdf = 0.5 + 0.5*erf((x-mean)/(sqrt(2)*sigma));
%cdf1 = 0.5*erfc(-(x-mean)/(sqrt(2)*sigma));
%cdf2 = 1 - qfunction((x-mean)/sigma);

function y = qfunction(x)
% Q function generation routine
%
% From G.R. Cooper and C.D. McGillem,
% Probabilistic Methods of Signal and System Analysis,
% Oxford Univ. Press, New York, NY, 1999, p. 442.
y = 0.5*(ones(size(x)) - erf(x/sqrt(2)));
% From MATLAB:
%y = 0.5*erfc(x/sqrt(2));
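A usage sketch for the function above (an assumed call, not from the notes), reproducing the two-sided one-sigma probability for the standard normal:

[~, cdfHi] = gaussian( 1, 0, 1);     % F(mu + sigma) for N(0,1)
[~, cdfLo] = gaussian(-1, 0, 1);     % F(mu - sigma)
cdfHi - cdfLo                        % approximately 0.683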


Quantile Function

The inverse of the Gaussian CDF.

$$p = \Pr(X \le x) = F_X(x)$$

The inverse is then

$$x = F_X^{-1}(p) = Q(p), \qquad Q\bigl(F_X(x)\bigr) = x$$

As used in the textbook, the quantile function is particularly useful when, given a one-sided or two-sided probability value, you want to determine the corresponding offset from the mean (one-sided) or the "range" about the mean (two-sided).

Wikipedia has a much more extensive definition and discussion. See: https://en.wikipedia.org/wiki/Quantile

This will be used heavily for decision making and hypothesis testing in statistics.
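A minimal MATLAB sketch of the Gaussian quantile (assumed values of mu, sigma, and p), using the erfinv relation above; the Statistics Toolbox function norminv, if available, performs the same computation:

mu = 0; sigma = 1; p = 0.975;            % assumed one-sided probability
z  = sqrt(2)*erfinv(2*p - 1);            % standard normal quantile, ~1.96
x  = mu + sigma*z                        % quantile of the general Gaussian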


Moments of the Gaussian Distribution

All moments of a Gaussian can be defined based on the mean and variance. This makes the Gaussian unique and also convenient … once the two are known, all the others are known!

The normal function is even-symmetric, and z raised to an odd power is an odd function. Therefore, the product of an odd power of z and the pdf is odd-symmetric. If an odd-symmetric function is integrated from –infinity to +infinity, the result is zero. Therefore, for all odd moments,

$$E[Z^n] = 0, \quad n \text{ odd}$$

This does not help for the computation of even moments.

Textbook derivations

The text derives the mean and variance for the normal function (pp. 228-230).

$$\varphi(z) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)$$

$$E[Z] = \int_{-\infty}^{\infty} z\,\varphi(z)\,dz = \int_{-\infty}^{\infty} z\cdot\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$$

$$E[Z] = \left.-\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)\right|_{-\infty}^{\infty}$$

$$E[Z] = -\frac{1}{\sqrt{2\pi}}\left[\exp\left(-\frac{\infty^2}{2}\right) - \exp\left(-\frac{\infty^2}{2}\right)\right] = 0$$

And the variance:

$$E[Z^2] = \int_{-\infty}^{\infty} z^2\,\varphi(z)\,dz = \int_{-\infty}^{\infty} z^2\cdot\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$$


$$E[Z^2] = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z\cdot z\exp\left(-\frac{z^2}{2}\right)dz$$

Integrating by parts, $\int v\,du = uv - \int u\,dv$, with

$$v = z, \qquad du = z\exp\left(-\frac{z^2}{2}\right)dz, \qquad dv = dz, \qquad u = -\exp\left(-\frac{z^2}{2}\right)$$

$$E[Z^2] = \frac{1}{\sqrt{2\pi}}\left[\left.-z\exp\left(-\frac{z^2}{2}\right)\right|_{-\infty}^{\infty} + \int_{-\infty}^{\infty}\exp\left(-\frac{z^2}{2}\right)dz\right]$$

$$E[Z^2] = 0 - 0 + \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}\exp\left(-\frac{z^2}{2}\right)dz = 1$$

$$\operatorname{Var}(Z) = E[Z^2] - E[Z]^2 = 1$$

Structure for higher order even moments

$$E[Z^n] = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z^n\exp\left(-\frac{z^2}{2}\right)dz$$

Integrating by parts, $\int v\,du = uv - \int u\,dv$, with

$$v = z^{n-1}, \qquad du = z\exp\left(-\frac{z^2}{2}\right)dz, \qquad dv = (n-1)z^{n-2}\,dz, \qquad u = -\exp\left(-\frac{z^2}{2}\right)$$

$$E[Z^n] = \frac{1}{\sqrt{2\pi}}\left[\left.-z^{n-1}\exp\left(-\frac{z^2}{2}\right)\right|_{-\infty}^{\infty} + (n-1)\int_{-\infty}^{\infty} z^{n-2}\exp\left(-\frac{z^2}{2}\right)dz\right]$$

$$E[Z^n] = 0 - 0 + (n-1)\cdot\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z^{n-2}\exp\left(-\frac{z^2}{2}\right)dz$$

$$E[Z^n] = (n-1)\cdot E[Z^{n-2}]$$

For n even,

$$E[Z^n] = (n-1)(n-3)\cdots(1)\cdot 1$$

For n odd,

$$E[Z^n] = (n-1)(n-3)\cdots(2)\cdot 0 = 0$$
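A quick Monte Carlo check of the recursion (a sketch, not from the text): for Z ~ N(0,1), E[Z^3] = 0, E[Z^4] = 3·1 = 3, and E[Z^6] = 5·3·1 = 15.

z = randn(1,1e6);                        % standard normal samples
[mean(z.^3)  mean(z.^4)  mean(z.^6)]     % approximately [0  3  15]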


Expected value of a Gaussian

$$E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx$$

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

$$E[X] = \int_{-\infty}^{\infty} x\cdot\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx$$

Letting $z = \dfrac{x-\mu}{\sigma}$, with $dz = \dfrac{dx}{\sigma}$,

$$E[X] = \int_{-\infty}^{\infty}(\sigma z + \mu)\cdot\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$$

$$E[X] = \frac{\sigma}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z\exp\left(-\frac{z^2}{2}\right)dz + \mu\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$$

$$E[X] = \frac{\sigma}{\sqrt{2\pi}}\left[-\exp\left(-\frac{z^2}{2}\right)\right]_{-\infty}^{\infty} + \mu\cdot 1$$

$$E[X] = 0 + \mu = \mu$$


Variance of Gaussian

$$E\bigl[(X-\mu)^2\bigr] = \int_{-\infty}^{\infty}(x-\mu)^2\,f_X(x)\,dx$$

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

$$E\bigl[(X-\mu)^2\bigr] = \int_{-\infty}^{\infty}(x-\mu)^2\cdot\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx$$

Letting $z = \dfrac{x-\mu}{\sigma}$, with $dz = \dfrac{dx}{\sigma}$,

$$E\bigl[(X-\mu)^2\bigr] = \int_{-\infty}^{\infty}\sigma^2 z^2\cdot\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz = \frac{\sigma^2}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z\cdot z\exp\left(-\frac{z^2}{2}\right)dz$$

Integrating by parts, $\int v\,du = uv - \int u\,dv$, with

$$v = z, \qquad du = z\exp\left(-\frac{z^2}{2}\right)dz, \qquad dv = dz, \qquad u = -\exp\left(-\frac{z^2}{2}\right)$$

$$E\bigl[(X-\mu)^2\bigr] = \frac{\sigma^2}{\sqrt{2\pi}}\left[\left.-z\exp\left(-\frac{z^2}{2}\right)\right|_{-\infty}^{\infty} + \int_{-\infty}^{\infty}\exp\left(-\frac{z^2}{2}\right)dz\right]$$

$$E\bigl[(X-\mu)^2\bigr] = \sigma^2\bigl[0 - 0 + 1\bigr] = \sigma^2$$


The MGF of a Gaussian

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad -\infty < x < \infty$$

MGF:

$$M_X(t) = \int_{-\infty}^{\infty}\exp(tx)\,f_X(x)\,dx$$

$$M_X(t) = \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)\exp(tx)\,dx$$

When integrating Gaussians, form an integral of a "correctly formed" Gaussian pdf and equate it to 1.0.

$$M_X(t) = \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{x^2 - 2\mu x + \mu^2 - 2\sigma^2 t x}{2\sigma^2}\right)dx$$

Completing the square in the exponent,

$$x^2 - 2(\mu + \sigma^2 t)x + \mu^2 = \bigl(x - (\mu + \sigma^2 t)\bigr)^2 - 2\mu\sigma^2 t - \sigma^4 t^2$$

$$M_X(t) = \exp\left(\frac{2\mu\sigma^2 t + \sigma^4 t^2}{2\sigma^2}\right)\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{\bigl(x - (\mu + \sigma^2 t)\bigr)^2}{2\sigma^2}\right)dx$$

The integral is now equal to 1.0, and we have

$$M_X(t) = \exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)$$
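A numerical check of the result (a sketch with assumed values of mu, sigma, and t): the sample average of exp(tX) should approach the MGF.

mu = 1; sigma = 2; t = 0.3;                    % assumed values
x  = mu + sigma*randn(1,1e6);
[mean(exp(t*x))   exp(mu*t + sigma^2*t^2/2)]   % the two should agree closely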


Now we can generate the moments of a Gaussian function.

The 1st Moment

$$\left.\frac{d}{dt}M_X(t)\right|_{t=0} = \left.(\mu + \sigma^2 t)\exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)\right|_{t=0}$$

$$E[X] = (\mu + 0)\exp(0) = \mu$$

The 2nd Moment

$$\left.\frac{d^2}{dt^2}M_X(t)\right|_{t=0} = \left.\Bigl[\sigma^2 + (\mu + \sigma^2 t)^2\Bigr]\exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)\right|_{t=0}$$

$$E[X^2] = \bigl(\sigma^2 + \mu^2\bigr)\cdot 1 = \sigma^2 + \mu^2$$

The 3rd Moment

$$\left.\frac{d^3}{dt^3}M_X(t)\right|_{t=0} = \left.\Bigl[3\sigma^2(\mu + \sigma^2 t) + (\mu + \sigma^2 t)^3\Bigr]\exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)\right|_{t=0}$$

$$E[X^3] = 3\mu\sigma^2 + \mu^3$$


The 4th Moment

$$\left.\frac{d^4}{dt^4}M_X(t)\right|_{t=0} = \left.\Bigl[3\sigma^4 + 6\sigma^2(\mu + \sigma^2 t)^2 + (\mu + \sigma^2 t)^4\Bigr]\exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)\right|_{t=0}$$

$$E[X^4] = 3\sigma^4 + 6\mu^2\sigma^2 + \mu^4$$

See: https://en.wikipedia.org/wiki/Normal_distribution

The above derivation matches the table provided ….
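A Monte Carlo check of the MGF-derived moments (a sketch with assumed mu and sigma, not from the text):

mu = 2; sigma = 3;                               % assumed values
x  = mu + sigma*randn(1,1e6);
[mean(x.^2)   mu^2 + sigma^2]                    % second moment
[mean(x.^3)   mu^3 + 3*mu*sigma^2]               % third moment
[mean(x.^4)   mu^4 + 6*mu^2*sigma^2 + 3*sigma^4] % fourth moment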


9.4 Central Limit Theorem

https://en.wikipedia.org/wiki/Central_limit_theorem

“In probability theory, the central limit theorem (CLT) establishes that, in most situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (informally a "bell curve") even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.”

The convolution of the pdfs of summed R.V.s begins to look Gaussian after a large number of R.V.s are summed.

Sums of IID R.V.

$$S_n = \sum_{i=1}^{n} X_i$$

If n is known, the expected value and variance of the sum are

$$E[S_n] = E\left[\sum_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n}\mu = n\mu$$

$$\operatorname{Var}[S_n] = E\left[\sum_{i=1}^{n}\bigl(X_i - \mu\bigr)^2\right] = \sum_{i=1}^{n}\operatorname{Var}[X_i] = \sum_{i=1}^{n}\sigma^2 = n\sigma^2$$

If we normalize the summed random variable,

$$Y_n = \frac{S_n - E[S_n]}{\sqrt{\operatorname{Var}[S_n]}}$$

Then

$$E[Y_n] = \frac{E[S_n] - E[S_n]}{\sqrt{\operatorname{Var}[S_n]}} = 0$$

$$\operatorname{Var}[Y_n] = \frac{\operatorname{Var}\bigl[S_n - E[S_n]\bigr]}{\operatorname{Var}[S_n]} = 1.0$$

Based on the Central Limit Theorem, Y will be a Normal R.V. as n becomes very large.
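A minimal Monte Carlo sketch of this normalization (assumed: uniform(0,1) summands, so mu = 1/2 and sigma^2 = 1/12):

n = 50;  trials = 1e5;
S = sum(rand(n, trials), 1);                 % one sum of n uniforms per column
Y = (S - n*0.5)/sqrt(n/12);                  % subtract E[S], divide by sqrt(Var[S])
histogram(Y, 'Normalization', 'pdf'); hold on
z = -4:0.01:4;
plot(z, exp(-z.^2/2)/sqrt(2*pi));            % N(0,1) pdf overlay
hold off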


CLT example convolutions.

Convolution with rectangles “GausConv_rect.m”

Sum of 50 uniform pdf R.V.

Convolution with exponentials (Erlang) “GausConv_exp.m”

Sum of 5 exponential pdf R.V. The curve shown in the textbook.

[Figures: convolution of 50 uniform pdfs and of 5 exponential (Erlang) pdfs, each overlaid with the matching Gaussian (legend: Convolution, Gaussian).]


The CLT and discrete probability.
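A minimal MATLAB sketch of the idea (assumed binomial parameters, not the textbook's figures): a Binomial(n, p) R.V. is a sum of n IID Bernoulli R.V., so its pmf approaches a Gaussian with mean np and variance np(1-p).

n = 40; p = 0.3;                                                   % assumed values
k = 0:n;
pmf  = arrayfun(@(kk) nchoosek(n,kk), k) .* p.^k .* (1-p).^(n-k);  % binomial pmf
mu   = n*p;  sigma = sqrt(n*p*(1-p));
gpdf = exp(-(k-mu).^2/(2*sigma^2))/(sqrt(2*pi)*sigma);             % Gaussian approximation
stem(k, pmf); hold on; plot(k, gpdf); hold off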


Jointly Gaussian Random Variables

Notes and figures are based on or taken from materials in the course textbook: Probability, Statistics and Random Processes for Engineers, 4th ed., Henry Stark and John W. Woods, Pearson Education, Inc., 2012

If two R.V. are jointly Gaussian

$$f_{XY}(x,y) = \frac{1}{2\pi\,\sigma_X\sigma_Y\sqrt{1-\rho^2}}\exp\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right)$$

If $\rho = 0$:

$$f_{XY}(x,y) = \frac{\exp\left(-\dfrac{(x-\mu_X)^2}{2\sigma_X^2} - \dfrac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)}{2\pi\,\sigma_X\sigma_Y} = \frac{\exp\left(-\dfrac{(x-\mu_X)^2}{2\sigma_X^2}\right)}{\sqrt{2\pi}\,\sigma_X}\cdot\frac{\exp\left(-\dfrac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)}{\sqrt{2\pi}\,\sigma_Y}$$

Visualizing Joint Gaussians …

Figure 4.3-4: Contours of constant density for the joint normal (μX = μY = 0): (a) σX = σY, ρ = 0; (b) σX > σY, ρ = 0; (c) σX < σY, ρ = 0; (d) σX = σY, ρ > 0.


Joint Gaussian: Independent X and Y

For X and Y independent:

$$f_{XY}(x,y) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\exp\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2}\right)\cdot\frac{1}{\sqrt{2\pi}\,\sigma_Y}\exp\left(-\frac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)$$

$$f_{XY}(x,y) = \frac{1}{2\pi\,\sigma_X\sigma_Y}\exp\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2} - \frac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)$$

If both functions have zero mean and identical variances,

$$f_{XY}(x,y) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right)$$

Figure 2.6-10 Graph of the joint Gaussian density. Stark & Woods.

This has been referred to as a hat function ….


Example 2.6-11 independent Gaussians, zero mean unit variance Stark & Woods … example in Chap. 8

Rectangular to circular conversion …

$$r = \sqrt{x^2 + y^2} \qquad \text{and} \qquad \theta = \operatorname{atan}\!\left(\frac{y}{x}\right)$$

Note that for an infinitesimal area, $dx\,dy = r\,dr\,d\theta$.

Then, for the cumulative distribution function,

$$F_{XY}(x,y) = \int_{-\infty}^{y}\int_{-\infty}^{x}\frac{1}{2\pi}\exp\left(-\frac{\alpha^2 + \beta^2}{2}\right)d\alpha\,d\beta$$

We could consider a change to a circular area as

$$F_R(r) = \int_{0}^{2\pi}\int_{0}^{r}\frac{1}{2\pi}\exp\left(-\frac{\rho^2}{2}\right)\rho\,d\rho\,d\theta$$

$$F_R(r) = \int_{0}^{2\pi}\frac{1}{2\pi}\,d\theta\int_{0}^{r}\rho\exp\left(-\frac{\rho^2}{2}\right)d\rho$$

$$F_R(r) = \left.-\exp\left(-\frac{\rho^2}{2}\right)\right|_{0}^{r} = \exp(0) - \exp\left(-\frac{r^2}{2}\right)$$

$$F_R(r) = 1 - \exp\left(-\frac{r^2}{2}\right)$$

And the probability density function is

$$f_R(r) = r\exp\left(-\frac{r^2}{2}\right)$$

with

$$f_\Theta(\theta) = \frac{1}{2\pi}, \qquad 0 \le \theta < 2\pi$$

Also, they are independent:

$$f_{R\Theta}(r,\theta) = \frac{r}{2\pi}\exp\left(-\frac{r^2}{2}\right), \qquad 0 \le \theta < 2\pi,\; 0 \le r$$


The Rayleigh Distribution

If X and Y are IID zero-mean Gaussian R.V. with variance s², then under the rectangular-to-polar (magnitude and phase) transformation the magnitude R is Rayleigh distributed:

$$f_R(r) = \frac{r}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)$$

$$F_R(r) = \int_{0}^{r} f_R(\rho)\,d\rho = \int_{0}^{r}\frac{\rho}{s^2}\exp\left(-\frac{\rho^2}{2s^2}\right)d\rho$$

$$F_R(r) = \left.-\exp\left(-\frac{\rho^2}{2s^2}\right)\right|_{0}^{r} = -\exp\left(-\frac{r^2}{2s^2}\right) + \exp\left(-\frac{0}{2s^2}\right)$$

$$F_R(r) = 1 - \exp\left(-\frac{r^2}{2s^2}\right)$$

The mean:

$$E[R] = \int_{0}^{\infty} r\,f_R(r)\,dr = \int_{0}^{\infty}\frac{r^2}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)dr$$

Integrating by parts, $\int v\,du = uv - \int u\,dv$, with

$$v = r, \qquad du = \frac{r}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)dr, \qquad dv = dr, \qquad u = -\exp\left(-\frac{r^2}{2s^2}\right)$$


$$E[R] = \left.-r\exp\left(-\frac{r^2}{2s^2}\right)\right|_{0}^{\infty} + \int_{0}^{\infty}\exp\left(-\frac{r^2}{2s^2}\right)dr$$

$$E[R] = 0 - 0 + \sqrt{2\pi}\,s\int_{0}^{\infty}\frac{1}{\sqrt{2\pi}\,s}\exp\left(-\frac{r^2}{2s^2}\right)dr$$

The remaining integral is one half of a Gaussian!

$$E[R] = \sqrt{2\pi}\,s\cdot\frac{1}{2} = \sqrt{\frac{\pi}{2}}\,s$$

The variance computation begins with a second moment ….

$$E[R^2] = \int_{0}^{\infty} r^2 f_R(r)\,dr = \int_{0}^{\infty}\frac{r^3}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)dr$$

Integrating by parts, $\int v\,du = uv - \int u\,dv$, with

$$v = r^2, \qquad du = \frac{r}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)dr, \qquad dv = 2r\,dr, \qquad u = -\exp\left(-\frac{r^2}{2s^2}\right)$$

$$E[R^2] = \left.-r^2\exp\left(-\frac{r^2}{2s^2}\right)\right|_{0}^{\infty} + \int_{0}^{\infty} 2r\exp\left(-\frac{r^2}{2s^2}\right)dr$$

$$E[R^2] = 0 - 0 + 2s^2\int_{0}^{\infty}\frac{r}{s^2}\exp\left(-\frac{r^2}{2s^2}\right)dr$$

Recognizing the integral of the Rayleigh pdf,

$$E[R^2] = 2s^2$$

The variance is then

$$\operatorname{Var}[R] = E[R^2] - E[R]^2 = 2s^2 - \frac{\pi}{2}s^2 = \left(2 - \frac{\pi}{2}\right)s^2$$


Matlab

A Rayleigh distribution from a two-dimensional Gaussian:

x  = randn(1,numsamples);
y  = randn(1,numsamples);
xy = Gsigma*(x + 1i*y);
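Continuing the sketch (assumed values for Gsigma and numsamples): the magnitude of the complex samples follows the Rayleigh mean and variance derived above.

numsamples = 1e6;  Gsigma = 0.25;          % assumed values
x  = randn(1,numsamples);
y  = randn(1,numsamples);
r  = abs(Gsigma*(x + 1i*y));               % Rayleigh-distributed magnitude
[mean(r)   sqrt(pi/2)*Gsigma]              % E[R]   = sqrt(pi/2)*s
[var(r)    (2 - pi/2)*Gsigma^2]            % Var[R] = (2 - pi/2)*s^2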


Example: Archery target shooting with a Rayleigh distribution (a two-dimensional Gaussian).

The archer's capability is described by $\sigma_X = \sigma_Y = \frac{1}{4} = 0.25$ feet.

From the Rayleigh distribution,

$$f_R(r) = \begin{cases}\dfrac{r}{\sigma^2}\exp\left(-\dfrac{r^2}{2\sigma^2}\right), & r \ge 0\\ 0, & r < 0\end{cases}$$

$$F_R(r) = \begin{cases}1 - \exp\left(-\dfrac{r^2}{2\sigma^2}\right), & r \ge 0\\ 0, & r < 0\end{cases}$$

The specific values are

$$f_R(r) = \begin{cases}16\,r\exp(-8r^2), & r \ge 0\\ 0, & r < 0\end{cases}$$

$$F_R(r) = \begin{cases}1 - \exp(-8r^2), & r \ge 0\\ 0, & r < 0\end{cases}$$

$$E[R] = \sqrt{2\pi}\,s\cdot\frac{1}{2} = \sqrt{\frac{\pi}{2}}\,s = \sqrt{\frac{\pi}{2}}\cdot\frac{1}{4} \approx 0.3133$$

$$\operatorname{Var}[R] = \left(2 - \frac{\pi}{2}\right)s^2 = 0.4292\cdot\left(\frac{1}{4}\right)^2 \approx 0.0268$$

Assume a 1-foot radius target with a 1-inch radius Bulls-eye

The archer's expected performance can be described by …

Probability of a Bulls-eye (1 inch radius)

$$F_R\!\left(\frac{1}{12}\right) = 1 - \exp\left(-8\cdot\left(\frac{1}{12}\right)^2\right) = 1 - \exp\left(-\frac{8}{144}\right) \approx 0.0540$$

Probability of missing the target (1 foot radius)

$$1 - F_R(1) = 1 - \bigl(1 - \exp(-8\cdot 1^2)\bigr) = \exp(-8) \approx 3.35\times 10^{-4}$$
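A numerical check of the two probabilities (a sketch using the example's s = 0.25 ft):

s     = 0.25;                                % archer standard deviation, in feet
F_R   = @(r) 1 - exp(-r.^2/(2*s^2));         % Rayleigh CDF
pBull = F_R(1/12)                            % 1-inch-radius bulls-eye, ~0.0540
pMiss = 1 - F_R(1)                           % miss the 1-foot-radius target, ~3.35e-4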


Example 2.4-5 Cell phone received signal power model.

The power can be described as a Rayleigh distribution.

$$f_R(r) = \frac{r}{\sigma^2}\exp\left(-\frac{r^2}{2\sigma^2}\right), \quad r \ge 0$$

$$F_R(r) = 1 - \exp\left(-\frac{r^2}{2\sigma^2}\right), \quad r \ge 0$$

Assume σ = 1 mW for the power radius r.

What is the probability that the power W is less than 0.8 mW?

$$F_R(0.8) = 1 - \exp\left(-\frac{0.8^2}{2\sigma^2}\right)$$

or

$$\Pr(R \le 0.8) = \int_{0}^{0.8}\frac{r}{\sigma^2}\exp\left(-\frac{r^2}{2\sigma^2}\right)dr$$

Hint:

$$\int x\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = -\sigma^2\exp\left(-\frac{x^2}{2\sigma^2}\right)$$

$$\Pr(R \le 0.8) = \left.-\exp\left(-\frac{r^2}{2}\right)\right|_{0}^{0.8} = \exp(0) - \exp\left(-\frac{0.8^2}{2}\right)$$

$$\Pr(R \le 0.8) = 1 - \exp(-0.32) \approx 0.27$$
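A numerical check (a sketch using the example's sigma = 1):

sigma = 1;
F_R   = @(r) 1 - exp(-r.^2/(2*sigma^2));     % Rayleigh CDF
F_R(0.8)                                     % 1 - exp(-0.32), ~0.27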


The Laplace Distribution

$$f_X(x) = \frac{\lambda}{2}\exp\bigl(-\lambda|x-\mu|\bigr)$$

$$F_X(x) = \begin{cases}\dfrac{1}{2}\exp\bigl(\lambda(x-\mu)\bigr), & x \le \mu\\[4pt] 1 - \dfrac{1}{2}\exp\bigl(-\lambda(x-\mu)\bigr), & \mu \le x\end{cases}$$

$$E[X] = \mu$$

$$\operatorname{Var}[X] = \frac{2}{\lambda^2}$$

As described in the text, the Laplace distribution can be used under certain conditions to provide the density of the estimation error.
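A minimal sketch (assumed mu and lambda): Laplace samples generated by inverse-CDF sampling, checking the stated mean and variance.

mu = 1; lambda = 2;                          % assumed values
u  = rand(1,1e6) - 0.5;                      % uniform on (-1/2, 1/2)
x  = mu - sign(u).*log(1 - 2*abs(u))/lambda; % inverse CDF of the Laplace
[mean(x)   mu]                               % E[X] = mu
[var(x)    2/lambda^2]                       % Var[X] = 2/lambda^2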


The Chi-Squared

The chi-squared distribution is based on the summation of the squares of R.V. that have a Gaussian distribution.

$$S_k = Z_1^2 + Z_2^2 + Z_3^2 + \cdots + Z_k^2$$

$$S_k \sim \chi_k^2$$

This is defined as the χ² distribution with k degrees of freedom.

If the underlying R.V. are normal, IID with N(0,1),

$$f_S(x) = \frac{1}{2^{k/2}\,\Gamma(k/2)}\,x^{k/2 - 1}\exp(-x/2), \quad x \ge 0$$

where Γ is the Gamma function. A derivation of the pdf is available at Wikipedia.

The chi-square distribution is a special case of the gamma distribution and is one of the most widely used probability distributions in inferential statistics, e. g., in hypothesis testing or in construction of confidence intervals.

from https://en.wikipedia.org/wiki/Chi-squared_distribution

The gamma function is defined as

$$\Gamma(x) = \int_{0}^{\infty} t^{x-1}\exp(-t)\,dt$$

$$\Gamma(x) = (x-1)\cdot\Gamma(x-1)$$

For n a positive integer, $\Gamma(n) = (n-1)!$, and $\Gamma(1/2) = \sqrt{\pi}$.

Graphically the gamma function appears as


Mean and variance

$$E[S_k] = k, \quad \text{since } E[Z_i^2] = 1$$

$$\operatorname{Var}[S_k] = 2k, \quad \text{since } \operatorname{Var}[Z_i^2] = 2$$
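A Monte Carlo sketch (assumed k): the sum of k squared N(0,1) samples has mean k and variance 2k.

k = 5;  trials = 1e6;                        % assumed degrees of freedom
S = sum(randn(k, trials).^2, 1);             % one chi-squared sample per column
[mean(S)   k]                                % E[S_k] = k
[var(S)    2*k]                              % Var[S_k] = 2k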

Additional applications …

When determining mean squared error computations, particularly when signals and Gaussian noise are considered, the error is always squared and often defined based on the Gaussian noise. Therefore, it becomes the sum of IID squared Gaussians.

When we discuss confidence intervals, the chi-squared distribution will again be discussed.

F‐Distributions

The F distribution arises from the ratio of independent chi-squared random variables (S and T) with defined degrees of freedom ($d_s$ and $d_t$), such that

$$Z = \frac{S/d_s}{T/d_t} \sim F(d_s, d_t)$$


Multiple Gaussian Random Variables

For two Gaussian R.V., X and Y, that are independent, $f_{XY}(x,y) = f_X(x)\cdot f_Y(y)$:

$$f_{XY}(x,y) = \frac{\exp\left(-\dfrac{(x-\mu_X)^2}{2\sigma_X^2}\right)}{\sqrt{2\pi}\,\sigma_X}\cdot\frac{\exp\left(-\dfrac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)}{\sqrt{2\pi}\,\sigma_Y}$$

$$f_{XY}(x,y) = \frac{\exp\left(-\dfrac{(x-\mu_X)^2}{2\sigma_X^2} - \dfrac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)}{2\pi\,\sigma_X\sigma_Y}$$

$$F_{XY}(x,y) = F_X(x)\cdot F_Y(y) = \Phi\!\left(\frac{x-\mu_X}{\sigma_X}\right)\cdot\Phi\!\left(\frac{y-\mu_Y}{\sigma_Y}\right)$$

If the Gaussians are zero mean and IID

$$f_{XY}(x,y) = \frac{\exp\left(-\dfrac{x^2+y^2}{2\sigma^2}\right)}{2\pi\sigma^2} = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x^2+y^2}{2\sigma^2}\right)$$

If there are multiple zero mean IID Gaussians

$$f_{X_1 X_2 X_3 \cdots X_n}(x_1, x_2, x_3, \cdots, x_n) = \frac{1}{(2\pi\sigma^2)^{n/2}}\exp\left(-\frac{x_1^2 + x_2^2 + x_3^2 + \cdots + x_n^2}{2\sigma^2}\right)$$

and, for $s^2 = x_1^2 + x_2^2 + x_3^2 + \cdots + x_n^2$,

$$f_{X_1 X_2 X_3 \cdots X_n}(x_1, x_2, x_3, \cdots, x_n) = \frac{1}{(2\pi\sigma^2)^{n/2}}\exp\left(-\frac{s^2}{2\sigma^2}\right)$$

As an interpretation, the pdf depends only on a "vector length" measured from the origin of an n-dimensional space. Therefore, the probability function is circularly (or spherically) symmetric.

To "normalize" the symmetrical property we can define a new R.V.,

$$Y = \frac{X}{S}$$

where the components have zero mean and identical variances, and

$$s^2 = x_1^2 + x_2^2 + x_3^2 + \cdots + x_n^2$$


Two Correlated Gaussian Random Variables

If two R.V. are jointly Gaussian

$$U = \frac{X - \mu_X}{\sigma_X} \quad \text{and} \quad V = \frac{Y - \mu_Y}{\sigma_Y}, \qquad \rho = \frac{\sigma_{XY}}{\sigma_X\,\sigma_Y}$$

$$f_{XY}(x,y) = \frac{\exp\left(-\dfrac{1}{2(1-\rho^2)}\left[\dfrac{(x-\mu_X)^2}{\sigma_X^2} - \dfrac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \dfrac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right)}{2\pi\,\sigma_X\,\sigma_Y\sqrt{1-\rho^2}}$$

$$f_{UV}(u,v) = \frac{\exp\left(-\dfrac{1}{2(1-\rho^2)}\bigl[u^2 - 2\rho u v + v^2\bigr]\right)}{2\pi\sqrt{1-\rho^2}}$$

The conditional probability of U given V becomes (again knowing that they are correlated)

$$f_{U|V}(u|v) = \frac{f_{UV}(u,v)}{f_V(v)}$$

$$f_{U|V}(u|v) = \frac{\exp\left(-\dfrac{u^2 - 2\rho u v + v^2}{2(1-\rho^2)}\right)}{2\pi\sqrt{1-\rho^2}}\Bigg/\frac{\exp\left(-\dfrac{v^2}{2}\right)}{\sqrt{2\pi}}$$

$$f_{U|V}(u|v) = \frac{\exp\left(-\dfrac{u^2 - 2\rho u v + v^2 - v^2(1-\rho^2)}{2(1-\rho^2)}\right)}{\sqrt{2\pi(1-\rho^2)}}$$

$$f_{U|V}(u|v) = \frac{\exp\left(-\dfrac{u^2 - 2\rho u v + \rho^2 v^2}{2(1-\rho^2)}\right)}{\sqrt{2\pi(1-\rho^2)}} = \frac{\exp\left(-\dfrac{(u - \rho v)^2}{2(1-\rho^2)}\right)}{\sqrt{2\pi(1-\rho^2)}}$$

The result is a Gaussian whose mean (a DC bias) and variance both depend on the correlation coefficient! In addition, the mean depends on v, but the variance does not!


The result can be converted back to terms in X and Y.

However, computing the mean and variance is sufficient to describe X given Y, with $X = U\sigma_X + \mu_X$ and $Y = V\sigma_Y + \mu_Y$.

$$E[X|Y=y] = E\left[U\sigma_X + \mu_X \;\middle|\; V = \frac{y-\mu_Y}{\sigma_Y}\right] = \mu_X + \sigma_X\,\rho\,v$$

$$E[X|Y=y] = \mu_X + \frac{\sigma_X}{\sigma_Y}\,\rho\,(y - \mu_Y)$$

$$E[X|Y=y] = \mu_X + \frac{\sigma_X}{\sigma_Y}\cdot\frac{\sigma_{XY}}{\sigma_X\,\sigma_Y}\,(y - \mu_Y) = \mu_X + \frac{\sigma_{XY}}{\sigma_Y^2}\,(y - \mu_Y)$$

$$\operatorname{Var}[X|Y=y] = \operatorname{Var}\left[U\sigma_X + \mu_X \;\middle|\; V = \frac{y-\mu_Y}{\sigma_Y}\right] = \sigma_X^2\,(1-\rho^2)$$
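A Monte Carlo sketch of the conditional mean and variance (assumed parameter values; the conditioning is approximated by keeping samples in a thin slab around Y = y0):

rho = 0.7; sx = 2; sy = 3; mx = 1; my = -1;  % assumed values
n   = 1e6;  y0 = 2;
u   = randn(1,n);
v   = rho*u + sqrt(1-rho^2)*randn(1,n);      % correlated standard normals
x   = mx + sx*u;   y = my + sy*v;
idx = abs(y - y0) < 0.05;                    % samples with Y near y0
[mean(x(idx))   mx + rho*(sx/sy)*(y0 - my)]  % E[X|Y=y0]
[var(x(idx))    sx^2*(1 - rho^2)]            % Var[X|Y=y0]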


9.7 Digital Communications Using QAM

Digital Communications:

Typically, you need to estimate the signal magnitude and phase information for a “symbol period”.

Phase Shift Keying (PSK) and Quadrature Amplitude Modulation (QAM)

Received symbols involve signal plus noise.

Noise is modeled as a two-dimensional Gaussian R.V. that is independent for each symbol estimated.

Detection regions/thresholds around the “constellation points” are defined that provide the “estimated symbol” received.

Received estimates for one QAM transmitted symbol may look like
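A minimal Monte Carlo sketch of such received estimates (an assumed 4-QAM constellation and noise level; not the course's MPSK_Demo.m or QAMCFE_Example):

n     = 1e4;  sigma = 0.5;                                    % assumed noise std per dimension
sym   = (2*randi([0 1],1,n)-1) + 1i*(2*randi([0 1],1,n)-1);   % 4-QAM symbols: +/-1 +/- 1i
rx    = sym + sigma*(randn(1,n) + 1i*randn(1,n));             % received = symbol + 2-D Gaussian noise
est   = sign(real(rx)) + 1i*sign(imag(rx));                   % nearest-constellation-point decision
SER   = mean(est ~= sym)                                      % symbol error rate
plot(rx, '.'); axis equal; grid on                            % scatter of received estimates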


MATLAB simulations of PSK and QAM

MPSK_Demo.m

QAMCFE_Example – must change ED/No level