Chapter 5: Markov Processes
Run-length coding
Gray code
Transition probabilities
Markov Processes
Let S = {s1, …, sq} be a set of symbols. A jth-order Markov process has probabilities p(si | si1 … sij) associated with it: the conditional probability of seeing si after seeing si1 … sij. This is said to be a j-memory source, and there are q^j states in the Markov process.

Transition Graph
[Transition graph: states a, b, c; an edge from sj to si is labeled with the transition probability p(si | sj). From a, each of a, b, c follows with probability ⅓; from b: b with ½, a and c with ¼ each; from c: c with ½, a and b with ¼ each.]
Weather Example: let j = 1, and think:
a means “fair”
b means “rain”
c means “snow”
Transition Matrix

M = [ p(si | sj) ], where j = row (current state) and i = column (next symbol). Each row sums to 1: ∑ outgoing edges = 1.

        a    b    c
   a    ⅓    ⅓    ⅓
   b    ¼    ½    ¼
   c    ¼    ¼    ½
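As a concrete sketch, the weather chain can be simulated directly from this matrix (the dictionary layout and the `sample_chain` helper are my own names, not from the notes):

```python
import random

# Transition matrix for the weather example: M[current][next];
# each row sums to 1.
M = {
    "a": {"a": 1/3, "b": 1/3, "c": 1/3},   # fair
    "b": {"a": 1/4, "b": 1/2, "c": 1/4},   # rain
    "c": {"a": 1/4, "b": 1/4, "c": 1/2},   # snow
}

def sample_chain(start, n, rng=random.Random(0)):
    """Draw n symbols from the first-order (j = 1) Markov source."""
    state, out = start, []
    for _ in range(n):
        r, acc = rng.random(), 0.0
        for nxt, prob in M[state].items():
            acc += prob
            if r < acc:
                break
        state = nxt
        out.append(state)
    return out

print("".join(sample_chain("a", 20)))
```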
Ergodic Equilibria
Definition: A Markov process M is said to be ergodic if
1. From any state we can eventually get to any other state.
2. The system reaches a limiting distribution.
Fact: lim(n→∞) p M^n exists.

The limit pe is called the equilibrium solution, and satisfies pe M = pe.

Starting from any distribution: (a, b, c) M = (a′, b′, c′); repeating this, p M^n → pe.

In the above example, pe = (3/11, 4/11, 4/11), which is the overall average weather.
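This fact can be checked numerically by power iteration; a minimal sketch, assuming the weather matrix from the example:

```python
# Power iteration: p M^n should converge to pe = (3/11, 4/11, 4/11)
# from any starting distribution.
M = [
    [1/3, 1/3, 1/3],   # row a (current state)
    [1/4, 1/2, 1/4],   # row b
    [1/4, 1/4, 1/2],   # row c
]

def step(p, M):
    """One row-vector-by-matrix multiply: (p M)_i = sum_j p_j M[j][i]."""
    return [sum(p[j] * M[j][i] for j in range(len(p))) for i in range(len(M))]

p = [1.0, 0.0, 0.0]      # start from "certainly fair"
for _ in range(50):
    p = step(p, M)

print(p)   # approaches (3/11, 4/11, 4/11) ≈ (0.2727, 0.3636, 0.3636)
```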
Predictive Coding
Assume a prediction algorithm for the source which, given all prior symbols, predicts the next.
input stream s1 … sn−1  →  predictor  →  prediction pn;  error en = pn ⊕ sn

What is transmitted is the error, en. By knowing just the error, the predictor at the destination also knows the original symbols:

  sn → [predictor] → en → channel → en → [predictor] → sn
     source                                        destination

We must assume that both predictors are identical and start in the same state.
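A minimal sketch of the scheme, assuming a binary source so the error is en = pn ⊕ sn; the "repeat the previous symbol" predictor is a stand-in of mine, since the notes leave the predictor unspecified:

```python
# Any deterministic predictor shared by both ends works the same way.
def predict(history):
    return history[-1] if history else 0

def encode(symbols):
    history, errors = [], []
    for s in symbols:
        errors.append(predict(history) ^ s)   # en = pn XOR sn
        history.append(s)                     # sender knows sn
    return errors

def decode(errors):
    history = []
    for e in errors:
        history.append(predict(history) ^ e)  # sn = pn XOR en
    return history

s = [0, 0, 0, 1, 1, 1, 0, 0]
e = encode(s)
print(e)                 # [0, 0, 0, 1, 0, 0, 1, 0]: mostly zeros
assert decode(e) == s    # identical predictors, same start state
```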
Accuracy: The probability of the predictor being correct is p = 1 − q, constant over time and independent of other prediction errors.

Let the probability of a run of exactly n 0's (the pattern 0^n 1) be p(n) = p^n ∙ q.

The total probability of runs of all lengths n = 0, 1, 2, … is:
∑(n=0..∞) p(n) = ∑(n=0..∞) p^n q = q ∙ 1/(1 − p) = q/q = 1

Expected length of a run (the n 0's plus the terminating 1):

f(p) = ∑(n=0..∞) (n + 1) p^n q

So, since ∑(n=0..∞) (n + 1) p^n = d/dp [ ∑(n=0..∞) p^(n+1) ] = d/dp [ p/(1 − p) ] = 1/(1 − p)²,

f(p) = q ∙ 1/(1 − p)² = (1 − p) ∙ 1/(1 − p)² = 1/(1 − p) = 1/q.

So f(p) = 1/q.

Note: an alternate method for calculating f(p) is to evaluate ∑(n≥0) n p^n directly.
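The closed form f(p) = 1/q can be sanity-checked by truncating the series; a numeric sketch (the truncation point is an arbitrary choice of mine):

```python
# f(p) = sum_{n>=0} (n + 1) p^n q should equal 1/q with q = 1 - p.
def f(p, terms=10_000):
    q = 1 - p
    return sum((n + 1) * p ** n * q for n in range(terms))

for p in (0.5, 0.9, 0.99):
    print(p, f(p), 1 / (1 - p))
```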
Coding of Run Lengths
Send a k-digit binary number to represent a run of zeroes whose length is between 0 and 2^k − 2 (small runs are coded in binary).

For run lengths larger than 2^k − 2, send 2^k − 1 (k ones) followed by another k-digit binary number, etc. (large runs are coded in unary blocks of k ones).

Let n = run length and fix k = block length. Write n = i ∙ m + j with 0 ≤ j < m = 2^k − 1; this is like "reading" a "matrix" with m cells per row and ∞ many rows.
code for n = i∙m + j:   111…1 (i blocks of k ones)  followed by  Bk(j) (j in k-bit binary)

length l(n) = (i + 1)∙k
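A sketch of this encoder/decoder (function names are mine); note that since j < m, the final k-bit block is never all ones, so decoding is unambiguous:

```python
# A run of n zeros, n = i*m + j with m = 2**k - 1, is sent as
# i blocks of k ones followed by j as a k-bit binary number.
def encode_run(n, k):
    m = 2 ** k - 1
    i, j = divmod(n, m)
    return "1" * (i * k) + format(j, "0{}b".format(k))

def decode_run(code, k):
    m, n, pos = 2 ** k - 1, 0, 0
    while True:
        block = code[pos:pos + k]
        pos += k
        if block == "1" * k:
            n += m          # an all-ones block stands for m more zeros
        else:
            return n + int(block, 2)

print(encode_run(10, 3))  # 10 = 1*7 + 3 -> "111" + "011"
```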
Let p(n) = the probability of a run of exactly n 0's (0^n 1), so p(n) = p^n q. Every n ≥ 0 can be written uniquely as n = i∙m + j where i ≥ 0, 0 ≤ j < m = 2^k − 1, so the expected code length is:

∑(n≥0) p(n) l(n)
  = ∑(i=0..∞) ∑(j=0..m−1) p(i∙m + j) ∙ l(i∙m + j)
  = ∑(i=0..∞) ∑(j=0..m−1) p^(i∙m + j) q ∙ (i + 1) k
  = k q ∑(i=0..∞) (i + 1) p^(i∙m) ∑(j=0..m−1) p^j
  = k q ∑(i=0..∞) (i + 1) p^(i∙m) ∙ (1 − p^m)/(1 − p)
  = k (1 − p^m) ∑(i=0..∞) (i + 1) (p^m)^i
  = k (1 − p^m) ∙ 1/(1 − p^m)²
  = k / (1 − p^m)

Expected length of the run-length code = k / (1 − p^m).
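As a quick check, the truncated expectation sum can be compared against the closed form k / (1 − p^m); the truncation point below is an arbitrary choice of mine:

```python
# Compare the direct sum of p(n) * l(n) with k / (1 - p^m).
def expected_length(p, k, terms=20_000):
    q, m = 1 - p, 2 ** k - 1
    return sum(p ** n * q * (n // m + 1) * k for n in range(terms))

p, k = 0.9, 3
print(expected_length(p, k), k / (1 - p ** (2 ** k - 1)))
```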
Gray Code
Consider an analog-to-digital “flash” converter consisting of a rotating wheel:
[Figure: a rotating wheel with three concentric tracks of 0's and 1's, divided into eight sectors; each sector reads out as a 3-bit codeword.]

The maximum error in the scheme is ± ⅛ rotation because … imagine "brushes" contacting the wheel in each of the three circles.
The Hamming distance between adjacent positions is 1. In ordinary binary, the maximum distance between adjacent positions is 3 (the max. possible).
Gray code: build the (n + 1)-bit code from the n-bit code Gn by prefixing 0 to Gn in order, then prefixing 1 to Gn in reverse order:

G(n+1) = ( 0Gn(0), …, 0Gn(2^n − 1), 1Gn(2^n − 1), …, 1Gn(0) )

1-bit Gray code: G1 = (0, 1)
Let G = gn−1 … g1 g0 ∈ {0,1}^n be a Gray codeword and B = bn−1 … b1 b0 ∈ {0,1}^n the corresponding binary number.

Encoding: gn−1 = bn−1;  gi = bi+1 ⊕ bi,  0 ≤ i < n − 1

Decoding: bi = gn−1 ⊕ gn−2 ⊕ … ⊕ gi  (keep a running total),  0 ≤ i ≤ n − 1
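These rules have a compact bit-twiddling form, sketched here:

```python
# g = b XOR (b >> 1) realizes g_i = b_{i+1} XOR b_i, and decoding
# accumulates the running XOR b_i = g_{n-1} XOR ... XOR g_i.
def gray_encode(b):
    return b ^ (b >> 1)

def gray_decode(g):
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

for x in range(16):
    assert gray_decode(gray_encode(x)) == x
    # adjacent values differ in exactly one bit (Hamming distance 1)
    assert bin(gray_encode(x) ^ gray_encode(x + 1)).count("1") == 1
```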
Inductive definition: let Gn = ( Gn(0), Gn(1), …, Gn(2^n − 1) ) denote the ordered list of the 2^n codewords of the n-bit Gray code.