8/11/2019 Discrete Memoryless Source Final
http://slidepdf.com/reader/full/discrete-memoryless-source-final 1/34
DISCRETE MEMORYLESS SOURCE
Communication Systems by Simon Haykin
Chapter 9: Fundamental Limits in Information Theory
INTRODUCTION
The purpose of a communication system is to
carry information-bearing baseband signals
from one place to another over a
communication channel.
INFORMATION THEORY
● It deals with mathematical modeling and analysis of
a communication system rather than with physical
sources and physical channels.
● It is a highly theoretical study of the efficient use of
bandwidth to propagate information through
electronic communications systems.
INFORMATION THEORY
A remarkable result that emerges from information
theory is that if the entropy of the source is
less than the capacity of the channel,
then error-free communication over the channel
can be achieved.
UNCERTAINTY, INFORMATION, AND ENTROPY
DISCRETE RANDOM VARIABLE, S
Suppose that a probabilistic experiment involves the
observation of the output emitted by a discrete
source during every unit of time (signaling interval).
The source output is modeled as a discrete random
variable, S, which takes on symbols from a fixed
finite alphabet:

S = {s0, s1, ..., s(K-1)}    (9.1)
DISCRETE RANDOM VARIABLE, S
with probabilities:

P(S = sk) = pk,  k = 0, 1, ..., K-1    (9.2)

that must satisfy the condition:

Σ pk = 1,  summed over k = 0 to K-1    (9.3)
DISCRETE MEMORYLESS SOURCE
Assume that the symbols emitted by the source
during successive signaling intervals are
statistically independent.
A source having this property is called a
DISCRETE MEMORYLESS SOURCE; it is
memoryless in the sense that the symbol emitted at
any time is independent of previous choices.
DISCRETE MEMORYLESS SOURCE
Can we find a measure of how much information is
produced by a DISCRETE MEMORYLESS
SOURCE?
Note: the idea of information is closely related to
uncertainty or surprise.
LOGARITHMIC FUNCTION

The amount of information is related to the
inverse of the probability of occurrence.
The amount of information gained after observing the
event S = sk, which occurs with probability pk, is the
logarithmic function:

I(sk) = log(1/pk)    (9.4)

**The base of the logarithm is arbitrary.
LOGARITHMIC FUNCTION
3. I(sk) > I(si)  for  pk < pi    (9.7)
The less probable an event is, the more information we
gain when it occurs.

4. I(sk sl) = I(sk) + I(sl)  if sk and sl are statistically
independent.
BIT
When Equation (9.4) uses logarithm base 2, the
resulting unit of information is called the bit (a
contraction of binary digit).
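As a minimal sketch of Equation (9.4) in base 2, the information gained from an event can be computed directly; the helper name self_information is illustrative, not from the text:

```python
import math

def self_information(p):
    """Information gained, in bits, from observing an event of probability p (Eq. 9.4)."""
    return math.log2(1.0 / p)

# A certain event carries no information; rarer events carry more.
print(self_information(1.0))   # 0.0 bits
print(self_information(0.5))   # 1.0 bit
print(self_information(0.25))  # 2.0 bits
```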
I(sk)

The amount of information I(sk) produced by the
source during an arbitrary signaling interval
depends on the symbol sk emitted by the source at
the time.
Indeed, I(sk) is a discrete random variable that
takes on the values I(s0), I(s1), ..., I(s(K-1)) with
probabilities p0, p1, ..., p(K-1), respectively.
MEAN OF I(sk): ENTROPY

The mean of I(sk) over the source alphabet S is given
by:

H(S) = E[I(sk)] = Σ pk I(sk) = Σ pk log2(1/pk),
summed over k = 0 to K-1
ENTROPY OF A DISCRETE MEMORYLESS SOURCE

The important quantity H(S) is called the entropy
of a discrete memoryless source with source
alphabet S.
It is a measure of the average information content
per source symbol.
It depends only on the probabilities of the symbols.
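The averaging above can be sketched in a few lines of Python; entropy is an illustrative helper name assumed here, not part of the text:

```python
import math

def entropy(probs):
    """Entropy H(S) in bits per symbol of a discrete memoryless source.

    probs: list of symbol probabilities summing to 1; zero-probability
    symbols are skipped, following the convention p * log(1/p) -> 0.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Fair four-symbol source: every symbol equally surprising,
# so H(S) = log2(4) = 2 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```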
SOME PROPERTIES OF ENTROPY
Furthermore, we may make two statements:
1. H(S) = 0, if and only if the probability pk = 1 for
some k, and the remaining probabilities in the set
are all zero; this lower bound on entropy
corresponds to no uncertainty.
2. H(S) = log K, if and only if pk = 1/K for all k; this
upper bound on entropy corresponds to maximum
uncertainty.
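Both bounds can be checked numerically; this is a sketch, with entropy an assumed helper name rather than anything defined in the text:

```python
import math

def entropy(probs):
    """H(S) in bits; zero-probability symbols contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 4
# 1. Degenerate source: one symbol certain -> H(S) = 0 (no uncertainty).
print(entropy([1.0, 0.0, 0.0, 0.0]))         # 0.0
# 2. Equiprobable source: pk = 1/K -> H(S) = log2 K (maximum uncertainty).
print(entropy([1.0 / K] * K), math.log2(K))  # 2.0 2.0
```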
EXAMPLE 9.1
ENTROPY OF A BINARY MEMORYLESS SOURCE

Consider a binary memoryless source for which
symbol 0 occurs with probability p0 and symbol 1
with probability p1 = 1 - p0, with entropy:

H(S) = -p0 log2 p0 - (1 - p0) log2(1 - p0)  bits
EXAMPLE 9.1
SOLUTION

This function of p0 is frequently encountered in
information-theoretic problems, and defines the
entropy function:

H(p0) = -p0 log2 p0 - (1 - p0) log2(1 - p0)

The entropy function is a function of the prior
probability p0 defined on the interval [0, 1].
Plotting the entropy function versus p0 on the
interval [0, 1] gives Figure 9.2.
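The entropy function of Example 9.1 can be sketched directly; binary_entropy is an illustrative name assumed here:

```python
import math

def binary_entropy(p0):
    """Entropy function H(p0) of a binary memoryless source, in bits."""
    if p0 in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    p1 = 1.0 - p0
    return -p0 * math.log2(p0) - p1 * math.log2(p1)

# The curve rises from 0, peaks at 1 bit when p0 = 1/2, and falls back to 0.
for p0 in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(p0, round(binary_entropy(p0), 3))
```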
FIGURE 9.2 ENTROPY FUNCTION
The curve highlights the observations made
under points 1, 2, and 3.
EXTENSION OF A DISCRETE MEMORYLESS SOURCE

- Consider blocks rather than individual symbols.
- Each block consists of n successive source symbols.
- Because the source is memoryless, the probability of a
symbol in the extended source S^n is the product of the
probabilities of the n symbols in S constituting that
particular symbol in S^n; it follows that

H(S^n) = n H(S)
EXAMPLE 9.2: SOLUTION

The entropy of the source (with p0 = 1/4, p1 = 1/4,
p2 = 1/2, as in Table 9.1) is:

H(S) = (1/4) log2 4 + (1/4) log2 4 + (1/2) log2 2
     = 3/2 bits
EXAMPLE 9.2: SOLUTION

Consider next the second-order extension of the
source.
With the source alphabet S consisting of three
symbols, it follows that the extended source has nine
symbols.
Table 9.1 presents the nine symbols and their
corresponding probabilities.
Table 9.1
Alphabet particulars of the second-order extension of a
discrete memoryless source

Symbols of S^2:           σ0    σ1    σ2    σ3    σ4    σ5    σ6    σ7    σ8
Corresponding sequences
of symbols of S:          s0s0  s0s1  s0s2  s1s0  s1s1  s1s2  s2s0  s2s1  s2s2
Probability p(σi),
i = 0, 1, ..., 8:         1/16  1/16  1/8   1/16  1/16  1/8   1/8   1/8   1/4
EXAMPLE 9.2: SOLUTION

The entropy of the extended source is:

H(S^2) = Σ p(σi) log2(1/p(σi)),  summed over i = 0 to 8
       = 4 × (1/16) log2 16 + 4 × (1/8) log2 8 + (1/4) log2 4
       = 3 bits
EXAMPLE 9.2: SOLUTION

The entropy of the extended source is
H(S^2) = 3 bits = 2 × (3/2 bits), which proves:

H(S^2) = 2 H(S)
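The result H(S^2) = 2 H(S) for the source of Example 9.2 can be verified numerically; this is a sketch, with entropy an assumed helper name:

```python
import math
from itertools import product

def entropy(probs):
    """H in bits; zero-probability symbols contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Source of Example 9.2: p0 = 1/4, p1 = 1/4, p2 = 1/2.
p = [0.25, 0.25, 0.5]

# Second-order extension: blocks of two symbols; memorylessness makes
# each block probability the product of its symbol probabilities.
p2 = [pi * pj for pi, pj in product(p, repeat=2)]

print(entropy(p))   # 1.5  -> H(S)  in bits
print(entropy(p2))  # 3.0  -> H(S^2) = 2 H(S)
```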
Presented by Roy Sencil and Janyl
END OF PRESENTATION