
Information Theory and Coding

Computer Science Tripos Part II, Michaelmas Term

11 Lectures by J G Daugman

1. Foundations: Probability, Uncertainty, and Information

2. Entropies Defined, and Why they are Measures of Information

3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes

4. Channel Types, Properties, Noise, and Channel Capacity

5. Continuous Information; Density; Noisy Channel Coding Theorem

6. Fourier Series, Convergence, Orthogonal Representation

7. Useful Fourier Theorems; Transform Pairs; Sampling; Aliasing

8. Discrete Fourier Transform. Fast Fourier Transform Algorithms

9. The Quantized Degrees-of-Freedom in a Continuous Signal

10. Gabor-Heisenberg-Weyl Uncertainty Relation. Optimal “Logons”

11. Kolmogorov Complexity and Minimal Description Length


Information Theory and Coding

J G Daugman

Prerequisite courses: Probability; Mathematical Methods for CS; Discrete Mathematics

Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, compression, and efficient coding of audio-visual information.

Lectures

• Foundations: probability, uncertainty, information. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are related to information. Ensembles, random variables, marginal and conditional probabilities. How the metrics of information are grounded in the rules of probability.

• Entropies defined, and why they are measures of information. Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

• Source coding theorem; prefix, variable-, and fixed-length codes. Symbol codes. The binary symmetric channel. Capacity of a noiseless discrete channel. Error correcting codes.

• Channel types, properties, noise, and channel capacity. Perfect communication through a noisy channel. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.

• Continuous information; density; noisy channel coding theorem. Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Gaussian channels. Relative significance of bandwidth and noise limitations. The Shannon rate limit and efficiency for noisy continuous channels.

• Fourier series, convergence, orthogonal representation. Generalised signal expansions in vector spaces. Independence. Representation of continuous or discrete data by complex exponentials. The Fourier basis. Fourier series for periodic functions. Examples.

• Useful Fourier theorems; transform pairs. Sampling; aliasing. The Fourier transform for non-periodic functions. Properties of the transform, and examples. Nyquist’s Sampling Theorem derived, and the cause (and removal) of aliasing.

• Discrete Fourier transform. Fast Fourier Transform Algorithms. Efficient algorithms for computing Fourier transforms of discrete data. Computational complexity. Filters, correlation, modulation, demodulation, coherence.

• The quantised degrees-of-freedom in a continuous signal. Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. Diverse illustrations of the principle that information, even in such a signal, comes in quantised, countable, packets.

• Gabor-Heisenberg-Weyl uncertainty relation. Optimal “Logons”. Unification of the time-domain and the frequency-domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor’s expansion basis of “logons”. Multi-resolution wavelet codes. Extension to images, for analysis and compression.

• Kolmogorov complexity. Minimal description length. Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Fractals. Minimal description length, and why this measure of complexity is not computable.

Objectives

At the end of the course students should be able to

• calculate the information content of a random variable from its probability distribution

• relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities

• define channel capacities and properties using Shannon’s Theorems

• construct efficient codes for data on imperfect communication channels

• generalise the discrete concepts to continuous signals on continuous channels

• understand Fourier Transforms and the main ideas of efficient algorithms for them

• describe the information resolution and compression properties of wavelets

Recommended book

* Cover, T.M. & Thomas, J.A. (1991). Elements of information theory. New York: Wiley.


Information Theory and Coding

Computer Science Tripos Part II, Michaelmas Term

11 lectures by J G Daugman

1. Overview: What is Information Theory?

Key idea: The movements and transformations of information, just like

those of a fluid, are constrained by mathematical and physical laws.

These laws have deep connections with:

• probability theory, statistics, and combinatorics

• thermodynamics (statistical physics)

• spectral analysis, Fourier (and other) transforms

• sampling theory, prediction, estimation theory

• electrical engineering (bandwidth; signal-to-noise ratio)

• complexity theory (minimal description length)

• signal processing, representation, compressibility

As such, information theory addresses and answers the

two fundamental questions of communication theory:

1. What is the ultimate data compression?

(answer: the entropy of the data, H , is its compression limit.)

2. What is the ultimate transmission rate of communication?

(answer: the channel capacity, C, is its rate limit.)

All communication schemes lie in between these two limits on the compressibility of data and the capacity of a channel. Information theory can suggest means to achieve these theoretical limits. But the subject also extends far beyond communication theory.


Important questions... to which Information Theory offers answers:

• How should information be measured?

• How much additional information is gained by some reduction in

uncertainty?

• How do the a priori probabilities of possible messages determine

the informativeness of receiving them?

• What is the information content of a random variable?

• How does the noise level in a communication channel limit its

capacity to transmit information?

• How does the bandwidth (in cycles/second) of a communication

channel limit its capacity to transmit information?

• By what formalism should prior knowledge be combined with

incoming data to draw formally justifiable inferences from both?

• How much information is contained in a strand of DNA?

• How much information is there in the firing pattern of a neurone?

Historical origins and important contributions:

• Ludwig BOLTZMANN (1844-1906), physicist, showed in 1877 that thermodynamic entropy (defined as the energy of a statistical ensemble [such as a gas] divided by its temperature: ergs/degree) is related to the statistical distribution of molecular configurations, with increasing entropy corresponding to increasing randomness. He made this relationship precise with his famous formula S = k log W where S defines entropy, W is the total number of possible molecular configurations, and k is the constant which bears Boltzmann’s name: k = 1.38 × 10^−16 ergs per degree centigrade. (The above formula appears as an epitaph on Boltzmann’s tombstone.) This is equivalent to the definition of the information (“negentropy”) in an ensemble, all of whose possible states are equiprobable, but with a minus sign in front (and when the logarithm is base 2, k = 1.) The deep connections between Information Theory and that branch of physics concerned with thermodynamics and statistical mechanics hinge upon Boltzmann’s work.

• Leo SZILARD (1898-1964) in 1929 identified entropy with information. He formulated key information-theoretic concepts to solve the thermodynamic paradox known as “Maxwell’s demon” (a thought-experiment about gas molecules in a partitioned box) by showing that the amount of information required by the demon about the positions and velocities of the molecules was equal (negatively) to the demon’s entropy increment.

• James Clerk MAXWELL (1831-1879) originated the paradox called

“Maxwell’s Demon” which greatly influenced Boltzmann and which

led to the watershed insight for information theory contributed by

Szilard. At Cambridge, Maxwell founded the Cavendish Laboratory

which became the original Department of Physics.

• R V HARTLEY in 1928 founded communication theory with his

paper Transmission of Information. He proposed that a signal

(or a communication channel) having bandwidth Ω over a duration

T has a limited number of degrees-of-freedom, namely 2ΩT , and

therefore it can communicate at most this quantity of information.

He also defined the information content of an equiprobable ensemble

of N possible states as equal to log2 N .

• Norbert WIENER (1894-1964) unified information theory and Fourier

analysis by deriving a series of relationships between the two. He

invented “white noise analysis” of non-linear systems, and made the

definitive contribution to modeling and describing the information

content of stochastic processes known as Time Series.


• Dennis GABOR (1900-1979) crystallized Hartley’s insight by formulating a general Uncertainty Principle for information, expressing the trade-off for resolution between bandwidth and time. (Signals that are well specified in frequency content must be poorly localized in time, and those that are well localized in time must be poorly specified in frequency content.) He formalized the “Information Diagram” to describe this fundamental trade-off, and derived the continuous family of functions which optimize (minimize) the conjoint uncertainty relation. In 1974 Gabor won the Nobel Prize in Physics for his work in Fourier optics, including the invention of holography.

• Claude SHANNON (together with Warren WEAVER) in 1949 wrote the definitive, classic, work in information theory: Mathematical Theory of Communication. Divided into separate treatments for continuous-time and discrete-time signals, systems, and channels, this book laid out all of the key concepts and relationships that define the field today. In particular, he proved the famous Source Coding Theorem and the Noisy Channel Coding Theorem, plus many other related results about channel capacity.

• S KULLBACK and R A LEIBLER (1951) defined relative entropy

(also called information for discrimination, or K-L Distance.)

• E T JAYNES (since 1957) developed maximum entropy methods

for inference, hypothesis-testing, and decision-making, based on the

physics of statistical mechanics. Others have inquired whether these

principles impose fundamental physical limits to computation itself.

• A N KOLMOGOROV in 1965 proposed that the complexity of a

string of data can be defined by the length of the shortest binary

program for computing the string. Thus the complexity of data

is its minimal description length, and this specifies the ultimate

compressibility of data. The “Kolmogorov complexity” K of a string

is approximately equal to its Shannon entropy H , thereby unifying

the theory of descriptive complexity and information theory.


2. Mathematical Foundations; Probability Rules;

Bayes’ Theorem

What are random variables? What is probability?

Random variables are variables that take on values determined by probability distributions. They may be discrete or continuous, in either their domain or their range. For example, a stream of ASCII encoded text characters in a transmitted message is a discrete random variable, with a known probability distribution for any given natural language. An analog speech signal represented by a voltage or sound pressure waveform as a function of time (perhaps with added noise), is a continuous random variable having a continuous probability density function.

Most of Information Theory involves probability distributions of random variables, and conjoint or conditional probabilities defined over ensembles of random variables. Indeed, the information content of a symbol or event is defined by its (im)probability. Classically, there are two different points of view about what probability actually means:

• relative frequency: sample the random variable a great many times

and tally up the fraction of times that each of its different possible

values occurs, to arrive at the probability of each.

• degree-of-belief: probability is the plausibility of a proposition or

the likelihood that a particular state (or value of a random variable)

might occur, even if its outcome can only be decided once (e.g. the

outcome of a particular horse-race).

The first view, the “frequentist” or operationalist view, is the one that predominates in statistics and in information theory. However, by no means does it capture the full meaning of probability. For example, the proposition that "The moon is made of green cheese" is one which surely has a probability that we should be able to attach to it. We could assess its probability by degree-of-belief calculations which combine our prior knowledge about physics, geology, and dairy products. Yet the “frequentist” definition of probability could only assign a probability to this proposition by performing (say) a large number of repeated trips to the moon, and tallying up the fraction of trips on which the moon turned out to be a dairy product....

In either case, it seems sensible that the less probable an event is, the

more information is gained by noting its occurrence. (Surely discovering

that the moon IS made of green cheese would be more “informative”

than merely learning that it is made only of earth-like rocks.)

Probability Rules

Most of probability theory was laid down by theologians: Blaise PASCAL (1623-1662) who gave it the axiomatization that we accept today; and Thomas BAYES (1702-1761) who expressed one of its most important and widely-applied propositions relating conditional probabilities.

Probability Theory rests upon two rules:

Product Rule:

p(A, B) = “joint probability of both A and B”

= p(A|B)p(B)

or equivalently,

= p(B|A)p(A)

Clearly, in case A and B are independent events, they are not conditionalized on each other and so p(A|B) = p(A) and p(B|A) = p(B), in which case their joint probability is simply p(A, B) = p(A)p(B).


Sum Rule:

If event A is conditionalized on a number of other events B, then the

total probability of A is the sum of its joint probabilities with all B:

p(A) = ∑_B p(A, B) = ∑_B p(A|B)p(B)

From the Product Rule and the symmetry that p(A, B) = p(B, A), it

is clear that p(A|B)p(B) = p(B|A)p(A). Bayes’ Theorem then follows:

Bayes’ Theorem:

p(B|A) = p(A|B)p(B) / p(A)

The importance of Bayes’ Rule is that it allows us to reverse the conditionalizing of events, and to compute p(B|A) from knowledge of p(A|B), p(A), and p(B). Often these are expressed as prior and posterior probabilities, or as the conditionalizing of hypotheses upon data.

Worked Example:

Suppose that a dread disease affects 1/1000th of all people. If you actually have the disease, a test for it is positive 95% of the time, and negative 5% of the time. If you don’t have the disease, the test is positive 5% of the time. We wish to know how to interpret test results.

Suppose you test positive for the disease. What is the likelihood that

you actually have it?

We use the above rules, with the following substitutions of “data” D

and “hypothesis” H instead of A and B:

D = data: the test is positive

H = hypothesis: you have the disease

H̄ = the other hypothesis: you do not have the disease


Before acquiring the data, we know only that the a priori probability of having the disease is .001, which sets p(H). This is called a prior. We also need to know p(D).

From the Sum Rule, we can calculate that the a priori probability p(D) of testing positive, whatever the truth may actually be, is:

p(D) = p(D|H)p(H) + p(D|H̄)p(H̄) = (.95)(.001) + (.05)(.999) = .051

and from Bayes’ Rule, we can conclude that the probability that you actually have the disease given that you tested positive for it, is much smaller than you may have thought:

p(H|D) = p(D|H)p(H) / p(D) = (.95)(.001) / (.051) = 0.019 (less than 2%).

This quantity is called the posterior probability because it is computed

after the observation of data; it tells us how likely the hypothesis is,

given what we have observed. (Note: it is an extremely common human

fallacy to confound p(H|D) with p(D|H). In the example given, most

people would react to the positive test result by concluding that the

likelihood that they have the disease is .95, since that is the “hit rate”

of the test. They confound p(D|H) = .95 with p(H|D) = .019, which

is what actually matters.)
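A minimal Python sketch of this calculation (the variable names are purely illustrative):

p_H = 0.001             # prior probability of having the disease, p(H)
p_D_given_H = 0.95      # p(D|H): test positive given disease
p_D_given_notH = 0.05   # p(D|H-bar): test positive given no disease

# Sum Rule: total probability of a positive test
p_D = p_D_given_H * p_H + p_D_given_notH * (1 - p_H)

# Bayes' Rule: posterior probability of disease given a positive test
p_H_given_D = p_D_given_H * p_H / p_D

print(round(p_D, 3), round(p_H_given_D, 3))   # 0.051 and 0.019 (less than 2%)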

A nice feature of Bayes’ Theorem is that it provides a simple mechanism for repeatedly updating our assessment of the hypothesis as more data continues to arrive. We can apply the rule recursively, using the latest posterior as the new prior for interpreting the next set of data. In Artificial Intelligence, this feature is important because it allows the systematic and real-time construction of interpretations that can be updated continuously as more data arrive in a time series, such as a flow of images or spoken sounds that we wish to understand.


3. Entropies Defined, and Why They are

Measures of Information

The information content I of a single event or message is defined as the

base-2 logarithm of its probability p:

I = log2 p (1)

and its entropy H is considered the negative of this. Entropy can be

regarded intuitively as “uncertainty,” or “disorder.” To gain information

is to lose uncertainty by the same amount, so I and H differ only in

sign (if at all): H = −I. Entropy and information have units of bits.

Note that I as defined in Eqt (1) is never positive: it ranges between

0 and −∞ as p varies from 1 to 0. However, sometimes the sign is

dropped, and I is considered the same thing as H (as we’ll do later too).

No information is gained (no uncertainty is lost) by the appearance of an event or the receipt of a message that was completely certain anyway (p = 1, so I = 0). Intuitively, the more improbable an event is, the more informative it is; and so the monotonic behaviour of Eqt (1) seems appropriate. But why the logarithm?

The logarithmic measure is justified by the desire for information to

be additive. We want the algebra of our measures to reflect the Rules

of Probability. When independent packets of information arrive, we

would like to say that the total information received is the sum of

the individual pieces. But the probabilities of independent events

multiply to give their combined probabilities, and so we must take

logarithms in order for the joint probability of independent events

or messages to contribute additively to the information gained.

This principle can also be understood in terms of the combinatorics

of state spaces. Suppose we have two independent problems, one with n


possible solutions (or states) each having probability pn, and the other

with m possible solutions (or states) each having probability pm. Then

the number of combined states is mn, and each of these has probability

pmpn. We would like to say that the information gained by specifying

the solution to both problems is the sum of that gained from each one.

This desired property is achieved:

Imn = log2(pmpn) = log2 pm + log2 pn = Im + In (2)

A Note on Logarithms:

In information theory we often wish to compute the base-2 logarithms

of quantities, but most calculators (and tools like xcalc) only offer

Napierian (base 2.718...) and decimal (base 10) logarithms. So the

following conversions are useful:

log2 X = 1.443 loge X = 3.322 log10 X

Henceforward we will omit the subscript; base-2 is always presumed.
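For example, in Python (a small sketch using the standard math module) the same conversions can be checked:

import math

x = 1000.0
print(math.log2(x))             # 9.9658...
print(1.443 * math.log(x))      # about the same, via natural logarithms (1/ln 2 = 1.4427)
print(3.322 * math.log10(x))    # about the same, via decimal logarithms (1/log10 2 = 3.3219)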

Intuitive Example of the Information Measure (Eqt 1):

Suppose I choose at random one of the 26 letters of the alphabet, and we play the game of “25 questions” in which you must determine which letter I have chosen. I will only answer ‘yes’ or ‘no.’ What is the minimum number of such questions that you must ask in order to guarantee finding the answer? (What form should such questions take? e.g., “Is it A?” “Is it B?” ...or is there some more intelligent way to solve this problem?)

The answer to a Yes/No question having equal probabilities conveys one bit worth of information. In the above example with equiprobable states, you never need to ask more than 5 (well-phrased!) questions to discover the answer, even though there are 26 possibilities. Appropriately, Eqt (1) tells us that the uncertainty removed as a result of solving this problem is about -4.7 bits.


Entropy of Ensembles

We now move from considering the information content of a single event or message, to that of an ensemble. An ensemble is the set of outcomes of one or more random variables. The outcomes have probabilities attached to them. In general these probabilities are non-uniform, with event i having probability pi, but they must sum to 1 because all possible outcomes are included; hence they form a probability distribution:

∑_i pi = 1   (3)

The entropy of an ensemble is simply the average entropy of all the elements in it. We can compute their average entropy by weighting each of the log pi contributions by its probability pi:

H = −I = −∑_i pi log pi   (4)

Eqt (4) allows us to speak of the information content or the entropy of

a random variable, from knowledge of the probability distribution that

it obeys. (Entropy does not depend upon the actual values taken by

the random variable! – Only upon their relative probabilities.)

Let us consider a random variable that takes on only two values, one with probability p and the other with probability (1 − p). Entropy is a concave function of this distribution: it equals 0 if p = 0 or p = 1, and it reaches its maximum value of 1 bit when p = 1/2.
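A short Python sketch of Eqt (4), evaluated for this two-outcome case (the function name is illustrative):

import math

def entropy(probs):
    # Entropy in bits of a discrete distribution given as a list of probabilities summing to 1.
    return -sum(p * math.log2(p) for p in probs if p > 0)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, entropy([p, 1 - p]))
# 0 bits at p = 0 or p = 1, rising to the maximum of 1 bit at p = 0.5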


Example of entropy as average uncertainty:

The various letters of the written English language have the following relative frequencies (probabilities), in descending order:

E T O A N I R S H D L C ...

.105 .072 .066 .063 .059 .055 .054 .052 .047 .035 .029 .023 ...

If they had been equiprobable, the entropy of the ensemble would have been log2(26) = 4.7 bits. But their non-uniform probabilities imply that, for example, an E is nearly five times more likely than a C; surely this prior knowledge is a reduction in the uncertainty of this random variable. In fact, the distribution of English letters has an entropy of only 4.0 bits. This means that as few as only four ‘Yes/No’ questions are needed, in principle, to identify one of the 26 letters of the alphabet; not five.

How can this be true?

That is the subject matter of Shannon’s SOURCE CODING THEOREM

(so named because it uses the “statistics of the source,” the a priori

probabilities of the message generator, to construct an optimal code.)

Note the important assumption: that the “source statistics” are known!

Several further measures of entropy need to be defined, involving the

marginal, joint, and conditional probabilities of random variables. Some

key relationships will then emerge, that we can apply to the analysis of

communication channels.

Notation: We use capital letters X and Y to name random variables, and lower case letters x and y to refer to their respective outcomes. These are drawn from particular sets A and B: x ∈ {a1, a2, . . . , aJ}, and y ∈ {b1, b2, . . . , bK}. The probability of a particular outcome p(x = ai) is denoted pi, with 0 ≤ pi ≤ 1 and ∑_i pi = 1.


An ensemble is just a random variable X, whose entropy was defined in Eqt (4). A joint ensemble ‘XY’ is an ensemble whose outcomes are ordered pairs x, y with x ∈ {a1, a2, . . . , aJ} and y ∈ {b1, b2, . . . , bK}. The joint ensemble XY defines a probability distribution p(x, y) over all possible joint outcomes x, y.

Marginal probability: From the Sum Rule, we can see that the probability of X taking on a particular value x = ai is the sum of the joint probabilities of this outcome for X and all possible outcomes for Y:

p(x = ai) = ∑_y p(x = ai, y)

We can simplify this notation to: p(x) = ∑_y p(x, y)

and similarly: p(y) = ∑_x p(x, y)

Conditional probability: From the Product Rule, we can easily see that the conditional probability that x = ai, given that y = bj, is:

p(x = ai|y = bj) = p(x = ai, y = bj) / p(y = bj)

We can simplify this notation to: p(x|y) = p(x, y) / p(y)

and similarly: p(y|x) = p(x, y) / p(x)

It is now possible to define various entropy measures for joint ensembles:

Joint entropy of XY

H(X, Y) = ∑_{x,y} p(x, y) log [1 / p(x, y)]   (5)

(Note that in comparison with Eqt (4), we have replaced the ‘–’ sign in front by taking the reciprocal of p inside the logarithm).


From this definition, it follows that joint entropy is additive if X and Y

are independent random variables:

H(X, Y ) = H(X) + H(Y ) iff p(x, y) = p(x)p(y)

Prove this.

Conditional entropy of an ensemble X, given that y = bj

measures the uncertainty remaining about random variable X after specifying that random variable Y has taken on a particular value y = bj. It is defined naturally as the entropy of the probability distribution p(x|y = bj):

H(X|y = bj) = ∑_x p(x|y = bj) log [1 / p(x|y = bj)]   (6)

If we now consider the above quantity averaged over all possible outcomes that Y might have, each weighted by its probability p(y), then we arrive at the...

Conditional entropy of an ensemble X, given an ensemble Y:

H(X|Y) = ∑_y p(y) ∑_x p(x|y) log [1 / p(x|y)]   (7)

and we know from the Sum Rule that if we move the p(y) term from the outer summation over y, to inside the inner summation over x, the two probability terms combine and become just p(x, y) summed over all x, y. Hence a simpler expression for this conditional entropy is:

H(X|Y) = ∑_{x,y} p(x, y) log [1 / p(x|y)]   (8)

This measures the average uncertainty that remains about X, when Y is known.


Chain Rule for Entropy

The joint entropy, conditional entropy, and marginal entropy for two

ensembles X and Y are related by:

H(X, Y ) = H(X) + H(Y |X) = H(Y ) + H(X|Y ) (9)

It should seem natural and intuitive that the joint entropy of a pair of

random variables is the entropy of one plus the conditional entropy of

the other (the uncertainty that it adds once its dependence on the first

one has been discounted by conditionalizing on it). You can derive the

Chain Rule from the earlier definitions of these three entropies.
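One such derivation, sketched here in the notation of Eqts (4), (5) and (8), using the Product Rule p(x, y) = p(x) p(y|x):

H(X, Y) = ∑_{x,y} p(x, y) log [1 / p(x, y)]
        = ∑_{x,y} p(x, y) log [1 / (p(x) p(y|x))]
        = ∑_{x,y} p(x, y) log [1 / p(x)] + ∑_{x,y} p(x, y) log [1 / p(y|x)]
        = ∑_x p(x) log [1 / p(x)] + H(Y|X)
        = H(X) + H(Y|X)

(The step from the double sum to the single sum over x uses the Sum Rule, ∑_y p(x, y) = p(x); exchanging the roles of X and Y gives the other half of Eqt (9).)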

Corollary to the Chain Rule:

If we have three random variables X, Y, Z, the conditionalizing of the

joint distribution of any two of them, upon the third, is also expressed

by a Chain Rule:

H(X, Y |Z) = H(X|Z) + H(Y |X, Z) (10)

“Independence Bound on Entropy”

A consequence of the Chain Rule for Entropy is that if we have many different random variables X1, X2, ..., Xn, then the sum of all their individual entropies is an upper bound on their joint entropy:

H(X1, X2, ..., Xn) ≤ ∑_{i=1}^{n} H(Xi)   (11)

Their joint entropy only reaches this upper bound if all of the random variables are independent.


Mutual Information between X and Y

The mutual information between two random variables measures the amount of information that one conveys about the other. Equivalently, it measures the average reduction in uncertainty about X that results from learning about Y. It is defined:

I(X; Y) = ∑_{x,y} p(x, y) log [ p(x, y) / (p(x)p(y)) ]   (12)

Clearly X says as much about Y as Y says about X . Note that in case

X and Y are independent random variables, then the numerator inside

the logarithm equals the denominator. Then the log term vanishes, and

the mutual information equals zero, as one should expect.

Non-negativity: mutual information is always ≥ 0. In the event that the two random variables are perfectly correlated, then their mutual information is the entropy of either one alone. (Another way to say this is: I(X; X) = H(X): the mutual information of a random variable with itself is just its entropy. For this reason, the entropy H(X) of a random variable X is sometimes referred to as its self-information.)

These properties are reflected in three equivalent definitions for the mutual information between X and Y:

I(X; Y) = H(X) − H(X|Y)   (13)
I(X; Y) = H(Y) − H(Y|X) = I(Y; X)   (14)
I(X; Y) = H(X) + H(Y) − H(X, Y)   (15)

In a sense the mutual information I(X; Y) is the intersection between H(X) and H(Y), since it represents their statistical dependence. In the Venn-diagram view of these quantities (captioned below), the portion of H(X) that does not lie within I(X; Y) is just H(X|Y). The portion of H(Y) that does not lie within I(X; Y) is just H(Y|X).
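As a worked illustration of Eqts (13)–(15), the following Python sketch computes these quantities for a small, made-up joint distribution:

import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for two binary variables (rows: x, columns: y)
p_xy = [[0.30, 0.10],
        [0.20, 0.40]]

p_x = [sum(row) for row in p_xy]              # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]        # marginal distribution of Y

H_X = H(p_x)
H_Y = H(p_y)
H_XY = H([p for row in p_xy for p in row])    # joint entropy, Eqt (5)

I_XY = H_X + H_Y - H_XY                       # Eqt (15)
print(H_X, H_Y, H_XY, I_XY)                   # I(X;Y) >= 0, and 0 only under independence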


Distance D(X, Y ) between X and Y

The amount by which the joint entropy of two random variables exceeds their mutual information is a measure of the “distance” between them:

D(X, Y) = H(X, Y) − I(X; Y)   (16)

Note that this quantity satisfies the standard axioms for a distance: D(X, Y) ≥ 0, D(X, X) = 0, D(X, Y) = D(Y, X), and D(X, Z) ≤ D(X, Y) + D(Y, Z).

Relative entropy, or Kullback-Leibler distance

Another important measure of the “distance” between two random variables, although it does not satisfy the above axioms for a distance metric, is the relative entropy or Kullback-Leibler distance. It is also called the information for discrimination. If p(x) and q(x) are two probability distributions defined over the same set of outcomes x, then their relative entropy is:

DKL(p‖q) = ∑_x p(x) log [ p(x) / q(x) ]   (17)

Note that DKL(p‖q) ≥ 0, and in case p(x) = q(x) then their distance DKL(p‖q) = 0, as one might hope. However, this metric is not strictly a “distance,” since in general it lacks symmetry: DKL(p‖q) ≠ DKL(q‖p).

The relative entropy DKL(p‖q) is a measure of the “inefficiency” of assuming that a distribution is q(x) when in fact it is p(x). If we have an optimal code for the distribution p(x) (meaning that we use on average H(p(x)) bits, its entropy, to describe it), then the number of additional bits that we would need to use if we instead described p(x) using an optimal code for q(x), would be their relative entropy DKL(p‖q).
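A small Python sketch of Eqt (17), showing both the non-negativity and the lack of symmetry (the two distributions are made up for illustration):

import math

def D_KL(p, q):
    # Relative entropy in bits; p and q are distributions over the same outcomes.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.4, 0.4, 0.2]

print(D_KL(p, q), D_KL(q, p))   # both non-negative, but generally unequal
print(D_KL(p, p))               # zero when the two distributions coincide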


Venn Diagram: Relationship among entropies and mutual information.

Fano’s Inequality

We know that conditioning reduces entropy: H(X|Y ) ≤ H(X). It

is clear that if X and Y are perfectly correlated, then their conditional

entropy is 0. It should also be clear that if X is any deterministic

function of Y , then again, there remains no uncertainty about X once

Y is known and so their conditional entropy H(X|Y ) = 0.

Fano’s Inequality relates the probability of error Pe in guessing X from knowledge of Y to their conditional entropy H(X|Y), when the number of possible outcomes is |A| (e.g. the length of a symbol alphabet):

Pe ≥ ( H(X|Y) − 1 ) / log |A|   (18)

The lower bound on Pe is a linearly increasing function of H(X|Y ).

The “Data Processing Inequality”

If random variables X, Y, and Z form a Markov chain (i.e. the conditional distribution of Z depends only on Y and is independent of X), which is normally denoted as X → Y → Z, then the mutual information must be monotonically decreasing over steps along the chain:

I(X; Y) ≥ I(X; Z)   (19)

We turn now to applying these measures and relationships to the study

of communications channels. (The following material is from McAuley.)


We may also have sources with memory, in which the probability of emitting a given symbol depends on which symbols preceded it. Such a source can be modelled as a Markov process: a set of states, with transition probabilities between states, where each transition is accompanied by the emission of some symbol (figure 17 shows a memoryless source and a two-state Markov process over the alphabet {A, B, C, D, E}). The state occupancy can then be solved for using flow equations (in this example the solution is trivial).

[Figure 17: Graphs representing a memoryless source and a two-state Markov process. The memoryless source emits A, B, C, D with probabilities 1/2, 1/4, 1/8, 1/8; the labelled transitions of the two-state process are A, 1/2; B, 3/8; C, 1/8 and A, 1/4; B, 1/4; C, 1/8; D, 1/8; E, 1/4.]

In general we may define any finite state process with states {S1, . . . , Sn} and transition probabilities pi(j), the probability of moving from state Si to state Sj (with the emission of some symbol). First we can define the entropy of each state in the normal manner:

Hi = −∑_j pi(j) log2 pi(j)

and then the entropy of the system to be the sum of these individual state entropy values, each weighted with the state occupancy (calculated from the flow equations):

H = ∑_i Pi Hi = −∑_i ∑_j Pi pi(j) log2 pi(j)

Clearly, for a single state we recover the entropy of the memoryless source.

Source coding

Often we wish to represent efficiently the symbols generated by some source. We shall consider encoding the symbols as binary digits.

[Figure 18: Discrete memoryless source and encoder.]

Fixed length codes

Consider encoding the M symbols {xi}, with entropy H, as fixed length (N) blocks of binary digits. To ensure that we can decode these symbols we need

N = ⌈log2(M)⌉

where ⌈X⌉ is the smallest integer not less than X. The code rate is then N bits per symbol, and as we know H ≤ log2(M), it follows that H ≤ N. The efficiency of the coding η is given by:

η = H / N

When the M symbols are equiprobable, and M is a power of two, η = 1 and H = N. Note that if we try to achieve N < H in this case we must allocate at least one encoding to more than one symbol — this means that we are incapable of uniquely decoding. Still with equiprobable symbols, but when M is not a power of two, this coding is inefficient; to try to overcome this we consider sequences of symbols and encode each possible sequence as a block of binary digits. The average number of bits per symbol then falls, and as the sequence length gets large the efficiency η approaches 1.

Variable length codes

In general we do not have equiprobable symbols, and we would hope to achieve some more compressed form of encoding by use of variable length codes — an example of such an encoding is Morse code, dating from the days of telegraphs. We consider again our simple four-symbol alphabet and some possible variable length codes:

X    P(X)    Code 1    Code 2    Code 3
A    1/2     1         0         0
B    1/4     00        10        01
C    1/8     01        110       011
D    1/8     10        111       111

We consider each code in turn:

1. Using this encoding, we find that presented with a sequence like 1001, we do not know whether to decode it as ABA or DC. This makes such a code unsatisfactory. Further, in general, even a code where such an ambiguity could be resolved uniquely by looking at bits further ahead in the stream (and backtracking) is unsatisfactory. Observe that for this code, the coding rate, or average number of bits per symbol, is (1/2)×1 + (1/4)×2 + 2×(1/8)×2 = 1.5, which is less than the entropy.

2. This code is uniquely decodable; further, this code is interesting in that we can decode instantaneously — no backtracking is required; once we have the bits of the encoded symbol we can decode without waiting for more. Further, this also satisfies the prefix condition, that is there is no code word which is a prefix (i.e. same bit pattern) of a longer code word. In this case the coding rate is equal to the entropy.

3. While this is decodable (and the coding rate is equal to the entropy again), observe that it does not have the prefix property and is not an instantaneous code.

Shannon’s first theorem is the source coding theorem, which is:

For a discrete memoryless source with finite entropy H, for any (positive) ε it is possible to encode the symbols at an average rate R such that R = H + ε.

(For proof see Shannon & Weaver.) This is also sometimes called the noiseless coding theorem as it deals with coding without consideration of noise processes (i.e. bit corruption etc.). The entropy function then represents a fundamental limit on the number of bits on average required to represent the symbols of the source.

Prefix codes

We have already mentioned the prefix property; we find that for a prefix code to exist, it must satisfy the Kraft-McMillan inequality. That is, a necessary (not sufficient) condition for a code having binary code words with lengths n1 ≤ n2 ≤ · · · ≤ nN to satisfy the prefix condition is:

∑_{i=1}^{N} (1/2)^{ni} ≤ 1
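The comparison of these three codes can be checked with a short Python sketch (assuming the symbol probabilities and code words in the table above):

import math

probs = {'A': 0.5, 'B': 0.25, 'C': 0.125, 'D': 0.125}
codes = {
    'Code 1': {'A': '1', 'B': '00', 'C': '01', 'D': '10'},
    'Code 2': {'A': '0', 'B': '10', 'C': '110', 'D': '111'},
    'Code 3': {'A': '0', 'B': '01', 'C': '011', 'D': '111'},
}

H = -sum(p * math.log2(p) for p in probs.values())   # source entropy = 1.75 bits

for name, code in codes.items():
    rate = sum(probs[s] * len(w) for s, w in code.items())   # average bits per symbol
    kraft = sum(2 ** -len(w) for w in code.values())         # Kraft-McMillan sum
    print(name, rate, kraft)

print('entropy =', H)
# Code 1 has rate 1.5 < H (and Kraft sum > 1): it cannot be uniquely decoded.
# Codes 2 and 3 have rate 1.75 = H; only Code 2 also satisfies the prefix condition.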

Communication through a channel

We have considered the discrete source; now we consider a channel through which we wish to pass symbols generated by such a source by some appropriate encoding mechanism. We also introduce the idea of noise into the system — that is, we consider the channel to modify the input coding and possibly generate some modified version.

We should distinguish between systematic modification of the encoded symbols, i.e. distortion, and noise. Distortion is when an input code always results in the same output code; this process can clearly be reversed. Noise on the other hand introduces the element of randomness into the resulting output code.

[Figure 19: Coding and decoding of symbols for transfer over a channel: source encoder → channel (X to Y) → decoder.]

We consider an input alphabet X = {x1, . . . , xJ} and output alphabet Y = {y1, . . . , yK} and random variables X and Y which range over these alphabets. Note that J and K need not be the same — for example we may have the binary input alphabet {0, 1} and the output alphabet {0, 1, ?}, where ? represents the decoder identifying some error. The discrete memoryless channel can then be represented as a set of transition probabilities:

p(yk|xj) = P(Y = yk | X = xj)

That is the probability that if xj is injected into the channel, yk is emitted; p(yk|xj) is the conditional probability. This can be written as the channel matrix:

[ p(y1|x1)  p(y2|x1)  . . .  p(yK|x1) ]
[ p(y1|x2)  p(y2|x2)  . . .  p(yK|x2) ]
[   . . .     . . .   . . .    . . .  ]
[ p(y1|xJ)  p(y2|xJ)  . . .  p(yK|xJ) ]

Note that we have the property that for every input symbol, we will get something out:

∑_k p(yk|xj) = 1

Next we take the output of a discrete memoryless source as the input to a channel. So we have associated with the input alphabet of the channel the probability distribution of output from a memoryless source {p(xj), j = 1, 2, . . . , J}. We then obtain the joint probability distribution of the random variables X and Y:

p(xj, yk) = P(X = xj, Y = yk) = p(yk|xj) p(xj)

We can then find the marginal probability distribution of Y, that is the probability of output symbol yk appearing:

p(yk) = P(Y = yk) = ∑_j p(yk|xj) p(xj)

If we have J = K, we often identify each output symbol as being the desired result of some input symbol; or we may select some subset of output symbols, for example in the input {0, 1} and output {0, 1, ?} case. We then define the average probability of symbol error as the sum, over all output symbols, of the probability of receiving that symbol when a different symbol was intended:

Pe = ∑_k ∑_{j ≠ k} P(Y = yk | X = xj) p(xj)

and correspondingly, the average probability of correct reception as 1 − Pe.

The binary symmetric channel

The binary symmetric channel has two input and output symbols (usually written {0, 1}) and a common probability, p, of “incorrect” decoding of an input at the output; this could be a simplistic model of a communications link, figure 20a. However, to understand the averaging property of the error rate Pe described above, consider figure 20b, where we have 10^6 symbols, of which the first has a probability of being received in error (of 0.1), and the remainder are always received perfectly. Then, observing that most of the terms in the sum for Pe above are zero:

Pe = 0.1 × 10^−6 = 10^−7

[Figure 20: a) Binary symmetric channel: inputs 0, 1 map to outputs 0, 1 with crossover probability p and correct-transmission probability 1 − p. b) Tale of the unlucky symbol: symbol 1 is received in error with probability 0.1, symbols 2, 3, . . . , 10^6 are received perfectly.]

Conditional entropy and mutual information

Extending the ideas of information and entropy for the discrete source, we can now consider the information about X obtained by observing the value of Y. We define the entropy after observing Y = yk:

H(X|Y = yk) = ∑_j p(xj|yk) log [1 / p(xj|yk)]

This is a random variable, so we can take the average again:

H(X|Y) = ∑_k H(X|Y = yk) p(yk) = ∑_k ∑_j p(xj, yk) log [1 / p(xj|yk)]

H(X|Y) is the conditional entropy. We then write the mutual information of the channel:

I(X; Y) = H(X) − H(X|Y)

This provides us with a measure of the amount of information or uncertainty reduced about X after observing the output Y of a channel fed by X. The mutual information tells us something about the channel.

An alternative viewpoint is to think about the channel together with a correction device fed with information from observations of both the input and output of the channel, e.g. figure 21. One might ask how much information must be passed along the correction channel to reconstruct the input; this turns out to be H(X|Y).

[Figure 21: Correction system: an observer watches both the channel input X and output Y, and sends correction data over a separate channel so that the decoder can reconstruct X.]

We can further rewrite the entropy function H(X) as a sum over the joint probability distribution:

H(X) = −∑_j p(xj) log p(xj) × 1
     = −∑_j p(xj) log p(xj) × ∑_k p(yk|xj)
     = −∑_j ∑_k p(yk|xj) p(xj) log p(xj)
     = −∑_j ∑_k p(xj, yk) log p(xj)

Hence we obtain an expression for the mutual information:

I(X; Y) = ∑_j ∑_k p(xj, yk) log [ p(xj|yk) / p(xj) ]

We can deduce various properties of the mutual information:

1. I(X; Y) ≥ 0. To show this, we note that p(xj|yk) p(yk) = p(xj, yk) and substitute this in the equation above:

   I(X; Y) = ∑_j ∑_k p(xj, yk) log [ p(xj, yk) / (p(xj) p(yk)) ] ≥ 0

   by use of the inequality we established previously.

2. I(X; Y) = 0 if X and Y are statistically independent. If X and Y are independent, p(xj, yk) = p(xj) p(yk), hence the log term becomes zero.

3. I is symmetric: I(X; Y) = I(Y; X). Using p(xj|yk) p(yk) = p(yk|xj) p(xj), we obtain:

   I(X; Y) = ∑_j ∑_k p(xj, yk) log [ p(xj|yk) / p(xj) ]
           = ∑_j ∑_k p(xj, yk) log [ p(yk|xj) / p(yk) ] = I(Y; X)

4. The preceding leads to the obvious symmetric definition for the mutual information in terms of the entropy of the output:

   I(X; Y) = H(Y) − H(Y|X)

5. We define the joint entropy of two random variables in the obvious manner, based on the joint probability distribution:

   H(X, Y) = −∑_j ∑_k p(xj, yk) log p(xj, yk)

   The mutual information is then more naturally written:

   I(X; Y) = H(X) + H(Y) − H(X, Y)

Binary channel

Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} and associated probabilities {1/2, 1/2}, and the channel matrix is:

[ 1 − p    p   ]
[   p    1 − p ]

Then the entropy, conditional entropy and mutual information are given by:

H(X) = 1
H(X|Y) = −p log p − (1 − p) log(1 − p)
I(X; Y) = 1 + p log p + (1 − p) log(1 − p)

Figure 22a shows the capacity of the channel against the transition probability. Note that the capacity of the channel drops to zero when the transition probability is 1/2, and is maximised when the transition probability is 0 or 1 — if we reliably transpose the symbols on the wire we also get the maximum amount of information through. Figure 22b shows the effect on the mutual information for asymmetric input alphabets.
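A short Python sketch of these expressions, together with the maximisation over input distributions that defines the capacity in the next section (the coarse grid search is only illustrative):

import math

def mutual_information(p_x, channel):
    # I(X;Y) in bits for input distribution p_x and channel matrix p(y|x).
    p_y = [sum(p_x[j] * channel[j][k] for j in range(len(p_x)))
           for k in range(len(channel[0]))]
    return sum(p_x[j] * channel[j][k] * math.log2(channel[j][k] / p_y[k])
               for j in range(len(p_x))
               for k in range(len(channel[0]))
               if p_x[j] > 0 and channel[j][k] > 0)

p = 0.1                                  # illustrative transition probability
bsc = [[1 - p, p], [p, 1 - p]]

print(mutual_information([0.5, 0.5], bsc))   # 1 + p log p + (1-p) log(1-p) = 0.531 bits

# Capacity: maximise over input distributions (for the BSC the maximum is at 0.5)
C = max(mutual_information([a, 1 - a], bsc) for a in (i / 1000 for i in range(1001)))
print(C)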

[Figure 22: Mutual information I(X; Y) of the binary channel: a) symmetric case, plotted against P(transition); b) asymmetric case, plotted against P(transition) and P(X = 0).]

Channel capacity

We wish to define the capacity of a channel, using the model of a free input alphabet and dependent output alphabet (given by the channel matrix). We note that for a given channel, if we wish to maximise the mutual information, we must perform a maximisation over all probability distributions for the input alphabet X. We define the channel capacity, denoted by C, as:

C = max_{p(x)} I(X; Y)

When we achieve this rate we describe the source as being matched to the channel.

Channel coding

To overcome the problem of noise in the system, we might consider adding redundancy during the encoding process to overcome possible errors. The examples used here are restricted to sources which would naturally be encoded in a noiseless environment as fixed size block codes — i.e. a source alphabet X which has 2^n equiprobable symbols; however, the discussion applies to more general sources and variable length coding schemes.

One particular aspect to be considered in real uses of channel coding is that many sources which we are interested in encoding for transmission have a significant amount of redundancy already.

054:A=z68&80É(`1;*.681(É&`I=B&`*u9V25&>=B&80@25&8<c*m9&>9;684:<*m9;\Us4;=2=5IV9;0CE*-0505*-4:9;0/1I>NV&Ir05*-\:9;*âng6>I92ICE4;A92`4Ut=z&8<gA9;<gI9;68GI,m=B&>I<GQEh/4:9;0@*.<&>=r05&>9;<*m9;\­I P *-&868&4UÉ05G9V2I6825*-6>I,-,-G«684:= =B&8682_IV9;<c05&>CI92@*.6>IM,.,-G CE&>IV9;*u9;\MUA;,j¦9;\,-*-014:=t684:C P A;25&>= P =B4\;=5ICì25&8y;2É2b1=B4:A;\:1I681I99;&8,g(_1;*-681?=5I9;<4;CE,-GZ684;=5=5A P 25&8<?4;9ZIN&>=@I\&To/*m9o>÷`681I=5I6825&>=z0_Wx0A;681?I0CE*-\:1V2Y&~*m92=z4:<gA;68&8<«YG2=5I9;0bCE*-0505*-4:9 IM6>=B4050+I?=5IM21;&>=/05*-68L,-G È&8,.&8y0@G:052@&>C?^8Q&Q \gQm|oVQ+fÉ=z*u9;\=B&8*m9VUs4:=z68&>CE&>92@0>Hg(/& =z&_\4V*u9;\254cI<NI9;68&Ñ[Q+!2 0/&>I0@Gc254=B&8684\:9;*-05&~0 P &8&8681/&8684:9;0@2=5A;6825*-4:9~U=B4:C2b1;&/Us4V,.,-4(+*m9;\l<gA;&254T684:=5=5A P 2@*.4;9E4Uo/*m9o÷`6ã1I=@I6825&>=B0(/4:A;,-<Y&~684:C P I=@I25*-N&8,-Gc0@2=5I*-\:1V2¦Us4:=z(tIV=B< |oVQ+fÉ=z*j8\=B&8*m9VUs4:=z68&l&>9V250>H(/& =B&`\4*m9;\254?I<NOIV9;68&Ñ[Q+!2 0/&>I0@GCE4=B&8684\:9;*-05&~0 P &8&8681

Æ4O(/&8N&>=ÏH(_1;*-,-&«21;& =B&8<gA9;<gI9;68G34UT21;*-0c054;A=B68& P =B4V25&868250cI\;I*m9;052?0A;681D=5IV9;<4:C681I=5I6825&>=¦&>=5=z4:=8H684:9;0@*.<&>=21;&T&>=5=z4:=/<gA;&T254cI1:AC?IV9CE*-0Ba1;&>I=B*m9;\g|oVQ+fÉ=z*u9;\21=B&8&ZIV9;<Us4:A= P &>9;68&H(É&â=B&_\V4*m9;\c254?IE<gIV9;68&QÑ[Q+!2 0/&>I0@Gc254(_=B&868LcI?9;*-68& P &>I681Q

È+1;&?684;<*m9;\'9;&8&8<0Z2@4684:9;05*-<&>=~21;&?&>=5=B4;=`6ã1I=@I6825&>=B*-0525*-680E4U/21;&?681I99;&8,ÉI9;<7<&b684;<&>=8HÉIV9;<2=BG'254'I681;*-&8N& I«05*-\:9;*âng6>I9V26g5<*-052bI9;68&hY&82B(/&8&>9 P ,mIA;05*mY;,-&'&>9;684;<&8<CE&8050bI\&80>Q°±±x³ ¸¸;½>µm½>µ·Á »lÁ ¹¸ÂRT9;&«4UT21;& 05*mC P ,-&8052)684:<&80*-0 IDÛBÚÚ>Ý!ÿxÝ!ÿxØ]×ØÙÚ>Q]%'&)2IL& IJY;*m9I=BG305GCCE&82=B*-6681I99;&8,l(+*-21ëIJ2=5I9;05*-25*-4:9 P =B4:YIY;*-,-*-2aG é~21;*-0 \V*.NV&80­IJ681I99;&8,T6>I P I68*-2BG o/Ô,-4\W-g^ÔWzo_«g^[,.4V\[Wzo_«g^8Q~È+1;&?9I2A=5I,Y;*m9I=BG«&>9;684;<*u9;\qUs4;=`21;*-0T*.0~21;&>9d ÷ ok¦æZ21;&T=B& P &82@*.2@*.4;9?684;<&_(*.,-, =B& P &>I221;&80@&_<*-\*-250+I9E4;<<9:ACEY&>=ª4U 25*mCE&80+I9;<P &>=aUs4:=@C C?IGLB4:=B*-2BG?N4V25*m9;\gQÆ&>9;68&+(/&+2=5I9;0bCE*-2(ÑÔ«o+Y;*.2@0 P &>=0@GCEY4V,sH;I9;<?(+*-,-,g4;Y;2I*m9?I9~&>=5=B4;=* UZÔ«o4:=+CF4:=B&TY;*.2@0I=z&l=B&868&8*-N&8< *m9&>=@=B4:=8H;21I2*.0|

Z\ ¢¡10 ¢¡ s Ñ#Ô]o£ t Wzo+g^ % ¡ l h/4:9;0@*.<&>=TIF2=5I9;05*-25*-4:9 P =B4:YIY;*-,-*-2BG 4U÷ ÷oVQÈ+1;&l681I99;&8,6>I P I68*-2BG I0+\*-N&>9)YG&8wAI25*-4:9l¡[o*-0Z ë÷ ®o>®VÑW ng\:A=B&ÉÑVõI^8Q È+1;&¦684:<&/=5I25&4MU21;&/=B& P &82@*.2@*.4;9l2@&86ã19;*-wA;&I\:IM*u9;0@2+21;&l=B&805*-<gAI, P =B4:YIVY;*.,-*-2BG4U&>=5=B4:=¦*-0_<&>CF4:9;052=5IM25&8<*m9ng\:A=B&lÑVõYQ


[Figure: (a) capacity I(X;Y) of the binary symmetric channel against P(transition) at low loss rates, with I(X;Y)(0.01) marked; (b) residual error rate against code rate for the repetition code, compared with H(0.01).]

*-\:A=B&?Ñõ|+IV^_h+I P I68*-2aG4MUÉY;*u9IV=BG 05GC?CE&82b=B*-6Z681I99;&8,I2`,-4( ,-4050l=@I25&80>HY^_j¬?68*-&>9;68G 4U=B& P &825*-25*-4:9)684;<&TU!4:=/IE2=5IV9;05*-25*-4:9 P =B4;YIY;*-,.*-2BG4U÷ ÷o¤ §c"¦¦87~§c¨ª©À£" g%'&lI=@=B*-N&lI2 ç 1I99;4;9 0+05&8684:9;<)21;&84:=B&>CqH21;&?×;Ü/Ú×ØÙÿcÝ;ÚØÛBÚ,Z|

g4:=+I681I99;&8,4U6>I P I68*-2BG| I9;<'I054:A=B68&E4U&>9V2=B4 P GJé* U+ ÞH21;&>9U!4:=+I=@Y;*.2b=5I=B*-,-G 0CI,-,8H21;&>=B&~&8y:*-052@0lIE684:<*m9;\05681;&>CE&E0A;68121IM221;&T054:A=B68&T*-0T=B& P =B4;<gA;68&8< (+*-21)I?=B&805*-<gAI,&>=5=B4:=+=@I25&`,.&80@0_21IV98Qç 1I99;4:9 0 P =B4:4MUÉ4U/21;*-0T21;&84:=B&>Có*-0ZIV9 &8y;*-0525&>9;68& P =z4:4U+=5I21;&>=`2b1I9«ICE&>I9;0T254684:9;052b=5A;682E0A;6ã1J684;<&80*u9J2b1;&\&>9;&>=5I,/6>I05&QJ9 P I=B25*-6>A;,mI=E21;&681;4*-68&4UlI)\4:4;<684;<&_*-0+<*-682I2@&8<YVGE21;&_681I=5IM6825&>=B*-0525*-680/4U 21;&`6ã1I99;&8, 9;4*-05&Q¦9 P I=5IM,.,-&8,(+*-2121;&9;4*-05&8,-&8050¦6>I05&HY&82525&>=684;<&80I=B&¦4U!25&>9lI681;*-&8N&8<?YGT684;<*m9;\ZCEA;,-25* P ,-&+*m9 P A;205GCEY[4,-0>Q°±1¥±x³ ¦)»¸§D¿µm¸»g½?¿Á ¹µ!»¼h/4:9;0@*.<&>=¦I_=@I21;&>=I=B2@* ng68*mI,681I99;&8,(_1;*-6ã1?CI>Gl=@I9;<4:CE,-Gl684;=5=5A P 24:9;&+Y;*-2*m9E&>I681Y;,-4:68L)4U05&8N&>97A;05&8<«254&>9;684:<&05GCEY[4,-0T*u9 21;&E6ã1I99;&8,ªæ(/&E2ILM&l21;& P =z4:YIY;*-,â*-2BG4UlIJY;*.2684:=5=@A P 25*-4:9J&8N&>9V2?*-0c21;&)0ICE& IM0?684:=5=B&8682=B&868& P 25*-4:9Q]%'&*m9,LB&8682 Ã&8wA;* P =z4:YIY;,-&l*m9 P A;2`05GCEY[4,-0ZWx68,-&>I=B,-GÃÄÞDÑ m U!4:=/A9;*-wA;&~<&8684:<*m9;\:^8Q¦%1I2/*-0+21;&6>I P I68*-2BG4U2b1;*.0`681I99;&8,1¨%'&?1IN&ZÑ m *m9 P A;2EI9;<«4:A;2 P A;2 P I2525&>=59;0éUs4:=`I\*-N&>9)*m9 P A;2 : (+*-21'Y;*m9I=BG)<*.\V*.2=B& P =B&80@&>92IM25*-4:9ú ú ú þ ú ü ú©ú i ú m H(/&l1IN&T&8*-\:12&8wA;* P =B4:YIVY;,.& Wx*!Q &Q[(+*-21'oV¯ P =z4:YIY;*-,-*.2BG^+4;A;2 P A;2+05GCEY4,-0?W!I9;< 9;4?42b1;&>=B0^8|


ú ú ú þ ú ü ú©bú i ú mªú ú ú þ ú ü ú©bú i ú mú ªú ú þ ú ü ú©bú i ú mú ú ªú þ ú ü ú©bú i ú mú ú ú þ ªú ü ú © ú i ú mú ú ú þ ú ü ªú©bú i ú mú ú ú þ ú ü ú© ªú i ú mú ú ú þ ú ü ú©bú i ªú mÈ+1;&>9q684:9;05*-<&>=B*m9;\c2b1;&l*m9VUs4;=5C?I25*-4:96>I P I68*-2aG P &>=05GCEY4,!| « : W!W=T^ W= H 9T^@^ op PR pTJ F W ?GFKHT: ^[,.4V\ s oW ?GFKHT: ^ t W : ^ UW op PR p+Ô ¯gW o¯ ,-4\ o¯ ^ oà UW op p+ÔDÃW ¯¯ ,-4\ o¯ ^ oÃ~ pÈ+1;&«6>I P I68*-2BGD4MUl21;&«6ã1I99;&8,T*-0 :öVp'*m9VU!4:=5C?I2@*.4;9DY;*-250 P &>=Y;*m9I=BGD<*-\*-2)4Ul21;&681I99;&8,684;<*m9;\gQ?h+I9«(/&~n9;<ICE&8681I9;*-0Cè254&>9;684:<& *u9VU!4:=5CI25*-4:9'Y;*-250~*u97p681I99;&8,Y;*-250`0AYILB&8682+254?21;&T&>=@=B4:= P =B4 P &>=B2BGc<&8056>=z*uY[&8<'IY4N&¨È+1;&W!p :^Æ_ICCE*m9;\h/4;<& P =B4N:*-<&80TI,ãÝ!Ú,EÜÝ!ÿ ×`684;<&l254 P &>=aUs4:=@C 21;*-0æqI05G;0B25&>C?IM25*-6+684:<&`*-0/4:9;&+*m9?(`1;*.68121;&+4;YN;*-4:A;0ÉY;*m9I=BG&>9;684:<*m9;\?4U 21;&054:A=B68&`05GCZY[4,-0*-0 P =B&805&>9V2/*m921;&_681I99;&8,&>9;684;<&8<U!4:=5CQg4:=¦4:A=¦054:A=z68&_(_1;*-6>1&>CE*-250+I2¦&>I68125*mCE&*m92@&>=BNOI,/o~4Uto,aE05GCZY[4,-0>H(/&T2IL&T2b1;&ZY;*m9I=BG«=B& P =B&805&>9V2I25*-4:9q4U21;*-0lI9;< 684 P G*-2254Y;*-250lú þ H ú©H ú i IV9;<­ú m 4MUÉ21;&~&>9;684:<&8<Y;,-4;6ãLé 2b1;&F=z&>C?I*m9;*m9;\ Y;*.2@0ZI=B&~\*-N&>9«YGú ü ú ú HgIV9;<¬GgÙVÛaØZÚwtYVG Å ü Å bÅ |ú ü ú©e­ú i ­3ú m IV9;< HÅ ü ú ü ­ú©Z­3ú i ­3ú mú ú þ ­ú i ­3ú m IV9;< HÅ ú ­ú þ ­3ú i ­3ú mú ú þ ­ú©Z­3ú m IV9;< HÅM ú ­ú þ ­3ú©Z­3ú mRT9=B&868& P 25*-4:9* U21;&_Y;*m9I=BGc9;ACEY&>= Å ü Å8Å ë÷T21;&>9E2b1;&>=B&_*-0/9;4Z&>=@=B4:=8H&8,-05&lú®¯®°_®%±*-0_2b1;&lY;*.2`*m9&>=5=z4:=8QÈ+1;*-0 Æ`IC?CE*m9;\J684:<&'A;05&80 õJY;*-250c2@4684:=@=B&8682p]W! Ñ þ #o>^E&>=5=z4:= P IM2525&>=59;0I9;<2=5IV9;0BUs&>=ª ZA;05&bUA;, Y;*-250>Q9\&>9;&>=5I,IEÆ_ICCE*m9;\l684;<&A;05&80óY;*-250254~684:=5=B&8682/Ñ «o&>=5=B4;= P I252@&>=59;0lI9;<J2=5I9;0BU!&>=TÑ oTàA;05&bUA;,ÉY;*-250>Q È1;&cÆ_IVC?CE*m9;\ 684;<&80?I=B&6>I,-,-&8<²Ú>Û³$8Ú×ÝI021;&8G A;05&XáY;*-250`254E684:=5=B&8682_Ñ DoT&>=@=B4:=B0>Q


È+1;&lÆ`IC?CE*m9;\?684;<&80+&8y;*-052/U!4:=/I,-, P I*m=B0ZW!Ñ Jo Ñ l ^ÉIV9;<<&825&8682+4:9;&lY;*-2+&>=5=z4:=B0>Qe ,.0@421;&«T4,mIG684;<& *-0 IDW!Ñõ o>Ñ^~Y;,-4:68L684;<&«(_1;*-6ã1¥684:=5=B&868250A P 2@4'21=B&8&«Y;*.2&>=5=B4;=B0>HI9A99ICE&8<)684:<&~&8y:*-052@0_I2ZW!®V÷ p¯V^(_1;*-681684:=5=z&868250+A P 254?2B(/4ZY;*-2`&>=5=B4:=z0>Q

[Figure: residual error rate against mean error rate for the (7,4) Hamming code.]

*-\:A=B&EÑ g|+W!pM :^/Æ_ICCE*m9;\?684:<&E=B&805*-<gAI,&>=5=B4:=/=5IM25&%'&~6>I921;&>9684;9;05*-<&>=`21;&ZCE4;=B&T\&>9;&>=5I,6>I05&H(_1;&>=z&l(/&l1IN&l=5I9;<4:CóY;*-2_&>=@=B4:=B0A9;*âUs4;=5CE,-G?<*-052=B*mYA;25&8<'Wx*!Q &Q(/&_I,-,-4(D21;& P 4050@*uY;*-,-*-2BG?4U2B(/4T4:=CE4;=B&+Y;*.2¦&>=5=B4:=z0 P &>=plY;*-2+Y;,-4:68L;^8Q e \:I*m9E(/&_4;Y;2I*m9E21;& P =B4:YIY;*-,-*-2BG?4U=B&805*-<gAI,&>=5=B4;=I0¦21;&_=B&>C?IM*u9;<&>=4U21;&lY;*m9;4:CE*mI,05&>=B*-&80>|

\ m 10 s p £ t WBo+g^m@l ´ µ¶¸·¹vº@·#»X¶¸»¸¼º·¸½,¶¾]¿ÁÀ¹vº_¶¸·%'&l9;4(684:9;0@*.<&>=21;&T6>I0@&T*u9q(_1;*-6ã1q21;&T05*-\:9I,-0`4:=+CE&80@0I\&80+(/&`(+*-012@4?2=5I9;0zUs&>=I=B&684;925*m9;A;4:A;05,-GNOIV=B*mIY;,-&é_2b1I2E*.0Y4V21J21;&05GCEY4,-0(É&(+*-01¥254'2=5IV9;0CE*-2?I=B&684:9V25*m9:A;4;A;0UA9;6825*-4:9;0H_I9;<321;&­I0@054:68*mI2@&8< P =z4:YIY;*-,-*.2@*.&80)4Ul21;&805&qUA9;682@*.4;9;0 I=B&\*-N&>9)YGIE684:9V25*m9:A;4;A;0 P =B4:YIY;*-,-*-2BG <*-052=B*mYA;25*-4:9Q%'&+684;A;,.<2=BGEI9;<?4:Y;2bI*m9E21;&+=B&80A;,-250/I0ÉI`,-*mCE*-25*m9;\ P =B4;68&8050U=B4:C2b1;&+<*-056>=B&825&+6>IM05&Qg4;=`&8yIC P ,.&H684:9;0@*.<&>=EI)=5I9;<4:CèNOI=B*mIY;,-& Õ (`1;*.68172IL&80T4:9 NOI,mA;&80 : F ÃÂÄ : HÂDå÷ ;Å o ;Å Ñ 88 H(+*-21 P =z4:YIY;*-,-*.2BG'W :/F ^Ä : H*!Q &Q P =B4;YIY;*-,.*-2BGD<&>9;0@*.2BG7UA9;682@*.4;9W :8F ^8Q%'&E1IN&_2b1;&ZI050@4:68*mI25&8< P =z4:YIY;*-,-*.2BG«9;4:=5C?I,-*j>I25*-4:9| F W :8F ^_Ä : o


ø05*m9;\c4:A=ÉUs4:=@CZA;,mI~U!4:=¦<*-056>=B&825&~&>92b=B4 P GI9;<2bIL*m9;\?2b1;&T,.*mCE*-2>|Wq9^ ,-*uCÆ Ç k F W :/F ^_Ä : ,-4\ oW :/F ^_Ä : ~

,-*uCÆ Ç k#È F W :/F ^[,-4\ oW :/F ^ ~ Ä : ',-4\ WqÄ : ^ F W : ^_Ä :IÉ ÊÌËl Ë W : ^,-4\ oW : ^,~DÍ : ~,.*mCÆ Ç k ,-4\ WqÄ : ^ ~ û ʲËl Ë W : ^ Í : ÎWq9^å,.*mCÆ Ç k ,-4\ WÄ : ^ W!¡Ñ^È+1;*-0l*-0l=5IM21;&>=`(/4:=5=BG;*m9;\I0T2b1;&Z,mI2@25&>=`,-*uCF*.2~<4:&80l9;4V2T&8y:*-052>QEÆ4O(/&8N&>=ÏHI0T(/&EI=B&4U!25&>9«*m9V25&>=B&80525&8<7*u9721;&<*Ï&>=z&>9;68&80FY[&82B(É&8&>9 &>92=z4 P *-&80qWs*sQ &Q *m9'2b1;&?684:9;05*-<&>=5I2@*.4;94UÉCEA;2AI,&>92=z4 P G4;=+6>I P IM68*.2BG^8H(É&~<&bn9;&?21;& P =B4:Y;,-&>CáI(tIGYG«A;05*m9;\21;&Tn=z05225&>=5C 4:9;,-GI0`I?CE&>I0A=z&T4UtÙVÿ Ð_ÚÛBÚgÝ!ÿxÜÚÝ!ÛBØ@v|ÎWq9^ÑʲËl Ë W : ^,-4\ oW : ^ ~>Í : W!¡õ^%'&E6>I9«&8y;25&>9;<«21;*-0l2@4 I?684:9V25*m9;A;4:A;0Z=5IV9;<4:CóN&868254:=4U/<*uCF&>9;05*-4:9Ò(7684:9;68&>I,-*m9;\21;&X([!U!4,-< *m92@&8\:=5I,Y[&>1;*m9;<­NV&868254:=+9;42bI25*-4:9I9;<­Y[4,-< 2BG P &|ÎWqÓ^ ʲËl Ë W ¶ ^,-4\ oW ¶ ^ ~ Í ¶ W!¡ :^%'&~6>I921;&>9 IM,.0@4?<&bn9;&E21;&La4V*u9V2TI9;<684:9;<*-25*-4:9I,&>9V2=B4 P *-&80/U!4:=/684;925*m9;A;4:A;0`<*.0z2=B*mYA;25*-4:9;0| ÎWqÓ Ô ^à ÊÕÊ3W ¶ c ^,-4\ oW ¶ c ^G~ Í ¶ Í cÎWqÓ H Ô ^à ÊÕÊ3W ¶ c ^,-4\ W c ^W ¶ c ^G~DÍ ¶ Í cÎW ÔÖH Óc^à ÊÕÊ W ¶ c ^,-4\ W ¶ ^W ¶ c ^G~ Í ¶ Í c(+*-21|

W : ^à ÊDW ¶ c ^ Í cW ? ^à ÊDW ¶ c ^ Í ¶*m9I,-,-GHgCEA;2AI,*m9VUs4;=5C?I25*-4:9F4U 684:92@*u9;A;4:A;0/=5I9;<4:C NOI=B*mIY;,-&80 Õ I9;<DB*-0/<&bn9;&8<A;05*m9;\21;&T<4:AY;,-&E*u9V25&8\:=@I,!|£W Õ éB^ ÊÊ W : @? ^[,-4\ W :]HJ? ^W : ^ ~ Í : Í ?(+*-21q21;&T6>I P I68*-2BG4U21;&T6ã1IV99;&8,\*-N&>9)YGC?Iy;*mCE*j8*m9;\21;*-0CEA;2AI,*u9VU!4:=5CI25*-4:94N&>=/I,-, P 4V0505*mY;,-&Z*m9 P A;2T<*-052=z*uYA;2@*.4;9;0+U!4:= Õ Q


n× Ø?Vpo+gMM g%'&?4;Y;2I*m9«NOIV=B*-4:A;0 P =B4 P &>=B25*-&80?I9I,mI\V4:A;0T25421;&?<*-056>=z&825&c6>IM05&Qc 9«21;&EU!4,-,-4O(*u9;\(/&E<g=B4 P 21;&?Y[4,-<2BG P &HYA;2T&>IM6ã1«<*-052b=B*mYA;25*-4:9HNOI=B*mIY;,-&&8256?01;4;A;,.<JY&2IL&>9)254Y&~4U( <*mCE&>9;0@*.4;9;0>QoVQ+j9V2=B4 P G'*-0CIy:*mCE*j8&8<3(+*-21&8wA;* P =B4:YIY;,-&g505GCEY4,-0_hgQsU : *-0c,-*mCE*-25&8<]254054:CE&`N4,mACE&XÙ«Wx*!Q &Q*-0+4:9;,-G9;4:9V'j8&>=B4(+*-21;*m921;&TNV4,mACE&>^/21;&>9ÎW : ^¦*-0_C?IyV*uCF*j8&8<«(_1;&>9EW : ^#oöÙgQÑ[QÎW :;? ^Þ*ÎW : ^ÔÎW ? ^õ[Q/% 1IM2c*-0 2b1;& CIy:*mCEACà<*Ï &>=B&>92@*uIM,T&>92=z4 P G¥Us4;=?0 P &868*âng&8< NI=B*mI9;68&«æD(/&6ã1;4;405&?2b1;*.0?I0~21;&NOI=B*mI9;68&*-0?I CE&>I0bA=B&c4MUIN&>=@I\& P 4(/&>=8QÆ+&>9;68& I)=B&b052I25&>CF&>92/4U2b1;& P =B4:Y;,-&>C *-0`254~n9;< 21;&lCIy:*mCEACS<*Ï &>=B&>9V25*mI, &>9V2=B4 P G?U!4:=IE0 P &868*âng&8<'CE&>I9 P 4(É&>=ÏQh/4:9;05*-<&>=/21;&ob<*mCE&>9;05*-4:9I,=5I9;<4;CSNOI=z*uIVY;,.& Õ H:(*.2b121;&T684:9;0@2=5I*m9V250>|Ê3W : ^ Í : oÊW : ¬Ú^ W : ^ Í : Û

(`1;&>=B&|eÚ Ê : W : ^ Í : W!¡¡^È+1;*-0É4 P 25*mCE*j>I25*-4:9 P =B4:Y;,-&>Cì*-0/054,-N&8< A;05*m9;\)Ü I\:=@I9;\&_CEA;,-25* P ,-*.&>=z0I9;<C?IyV*uCF*j8*m9;\g| ÊÁÝ>W : ^[,.4V\ W : ^Ô-Þ W : ^>W : ßÚ^ Ô-Þ W : ^%à Í :(_1;*-6ã1q*.04:Y;2I*m9;&8<­YVG054,-N:*m9;\g|

To+',-4\W : ^Ô-Þ W : ¬Ú^ ÔÞ ÷054E21I2+(+*-21<gA;&E=B&8\:I=z<Us4:=t21;&T684:9;052=@I*m92504:921;&T05G;0525&>C|

W : ^ oá ÑâpÛã l l/ä °;å %æ °1;&>9;68&| ÎW Õ ^ oÑ ,.4V\W!Ñâ ã Û ^ W!¡a^%'&l4:Y;0@&>=BN&l2b1I2>|/*m^/U!4:=`I9VG =5I9;<4:C NOI=z*uIVY;,.&yB (*.2b1 NI=B*mI9;68&)ÛHÎWBE^TÞÎW Õ ^8H*-*u^~21;&<*ç &>=B&>9V25*mI,¦&>92=z4 P G)*-0l<& P &>9;<&>9V2E4:9;,-G'4:9)21;&?NI=B*mI9;68&I9;<*.0`*m9;<& P &>9;<&>9V2l4U2b1;&ZCE&>I9H1;&>9;68&Z*-*-*m^T21;&~\:=B&>I2@&>=/21;& P 4(É&>=ÏH21;&l\;=B&>I25&>=21;&T<*Ï&>=z&>925*mI,&>92=z4 P GQÈ+1;*-0+&8y:2@&>9;<0_254CZA;,-25* P ,-&E<*mCE&>9;05*-4:9;0`*u9)21;&T4;YN;*-4:A;0_C?I99;&>=ÏQõõ


n èE*é7 g921;&+6>IM05&_4MU684;925*m9;A;4:A;00@*.\;9I,-0tIV9;<c681I99;&8,-0/(É&TI=B&*u9V25&>=B&80@25&8<?*m9UA9;682@*.4;9;0rWs4U0IGE25*mCE&>^8HV6ã1;40@&>9EU=B4:C#Ir05&824U P 40505*mY;,-&`UA9;682@*.4;9;0>HI0*m9 P A;2/254l4;A=681I99;&8,!H(+*-21054:CF& P &>=B2A=5Y[&8<NV&>=B05*-4:9«4U/21;&EUA9;6825*-4:9;0?Y&8*m9;\ <&82@&86825&8<XIM2T21;&?4:A;2 P A;2Q?È+1;&805&UA9;6825*-4:9;0+21;&>92IVL&_2b1;& P ,mI68&~4U21;&T*m9 P A;2_I9;<4;A;2 P A;2+I, P 1IY&82504U2b1;&_<*-056>=B&82@&6>I05&Q%'&CEA;052ZI,-054«21;&>97*m9;68,mA;<& 21;& P =B4:YIY;*-,-*-2BG4U_I)\*-N&>9 UA9;682@*.4;9Y&8*m9;\'*m9,La&8682@&8<*m92@4Z2b1;&T6ã1I99;&8,(_1;*-6ã1Y=B*m9;\0`A;0É2@4Z2b1;&T*.<&>IF4UÉÚêãÚ,)ë-Úæ?21;*-0+\&>9;&>=5IM,IV=B&>IT4U(/4:=5LE*-0TL:9;4(_9IM0XEÚÜG;ÛaÚEÝ:ÚbØÛ@Qg4;=/&8yIVC P ,-&H684:9;05*-<&>=21;&l&>9;05&>CEY;,-&80>|oVQ ìGí Wîa^]0@*u9WîÔï^

jI681`NOI,mA;&¦4Uêït<&bn9;&80I/<*Ï&>=z&>92UA9;6825*-4:9HI9;<T2@4\&821;&>=(*.2b1lI P =B4;YIY;*-,.*-2BG<*.0@2=B*mYA;25*-4:9H0IGTWqï^8H(/&c1IN&I9«&>9;05&>CEY;,-&Q'Ð+4V25&?21I2ï 1;&>=B&C?IG'Y&<*.0@6>=B&825&l4;=684;925*m9;A;4:A;0æc684:9;05*-<&>= P 1I05&l0b1;* U!2_L&8G;*m9;\gQÑ[Q+h/4:9;05*-<&>=I705&824UZ=5I9;<4;CàNOI=B*mIY;,-&80 d «B|;£ ÷ @Å o ;Å Ñ >>8 k'(_1;&>=B&«&>I681«T2IVL&80?4:9IJ=5I9;<4:C NOI,mA;&'I68684:=B<*m9;\254JI'lIA;0@05*mI9<*-052=z*uYA;2@*.4;9](+*-21052I9;<gI=z<c<&8N;*mI25*-4:9 á Ã'é[21;&>9|ì W d «Vk îa^ «V 0@*u9;6W!Ñðñî6£ ^*.0/21;&yg5(_1;*-25&T9;4*-05&hE&>9;05&>CEY;,.&HYI9;<;,-*mCE*-25&8<254ð Æ&>=B2_j_I9;<(+*-21IN&>=5I\&P 4(/&>=Éà Qõ[Q/K­4;=B&T\&>9;&>=5I,-,-GH(/&l1IN&TU!4:=+=5IV9;<4:C NI=B*mIY;,-&80 d3: kl2b1;&l&>9;05&>CEY;,-&E4UÉYI9;<;,.*mCE*-25&8<)UA9;6825*-4:9;0>| ì W d3: k îa^ : 05*m9;6W!Ñðñî6£^(_1;&>=B&T4U684:A=B05&T(/&l=B&>CF&>CZY[&>=¦U=B4:CS21;&T0bIC P ,-*m9;\c21;&84;=B&>CS21I2|: ì £Ñð ~xU?(/&I,-054D684;9;05*-<&>=)UA9;682@*.4;9;0­,-*mCE*-25&8<#254]25*mCE&7*u9V25&>=BNI,òlH~21;&>9(/&'4;YV2I*m9J4:9;,-GXÑròð 9;4;9V'j8&>=B4'684;&b¬c68*-&>9250I9;<D(/&6>I9J684:9;05*-<&>=21;&&>9;05&>CEY;,-&254?Y&Z=z& P =B&805&>9V25&8<­YVGIV9(<*mCE&>9;0@*.4;9I,_Wq(ÑGòð^ P =B4:YIVY;*.,-*-2BG<*-052=B*mYA;25*-4:9W :8;:>>5;:[ ^ÏQ Q/K­4;=B&?0 P &868* ng6>IM,.,-G;H+*âU_(/&684:9;05*-<&>=E21;&&>9;05&>CEY;,-& 4U`,.*mCE*-25&8< P 4(/&>=W!YG ^8HYI9;<;,-*mCE*-25&8<DWx254 Å ð ^I9;<c2@*uCF&b,.*mCE*-25&8<)05*-\:9I,-0ZW!9;4;9V'j8&>=B4?4:9;,-G*u9*m9V25&>=BNI,W!÷ ò`^5^8H¦(É&)n9;<]21I22b1;&­&>9;0@&>CZY;,-&'*-0«=B& P =B&805&>9V25&8<YG]I9([<*uCF&>9;05*-4:9I,P =B4:YIY;*-,-*-2aG3<*-052=B*mYA;25*-4:9ë(_1;*-6ã13*-0j8&>=B44:A;2@05*-<&­2b1;&ó([0 P 1;&>=z& =@I<*mA;0Ìôá Ñð Q õ


f¦G684;9;05*-<&>=B*m9;\21;&,mI2525&>=~2BG P &80?4U`&>9;05&>CEY;,.&80H(/&6>I9 ng2?21;&>Cå*m92@4­2b1;&n9;*.2@&<*mCE&>9;05*-4:9I,684:9V25*m9:A;4;A;0_<*Ï &>=B&>9V25*mI, &>9V2=B4 P Gc<&bn9;*-25*-4:9;0T\V*.NV&>9*m905&8682@*.4;9¡Qnx¢ §c"¦¦87~§cvo¦!8$%'&+684;9;05*-<&>=/6ã1IV99;&8,.0/*m9(_1;*-681c9;4*-05&`*-0/*u9,LB&86825&8<*m9;<& P &>9;<&>9V25,-G4U 21;&+0@*.\;9I,!é21;&P I=B2@*.6>A;,mI=¦9;4*-05&+054;A=B68&+4U*u9V25&>=B&80@2*-021;&054 6>I,-,-&8<­ÜÙVÙÿ Ý!ÿõ>Ú#ö];ÿ Ý!Úy÷+Ü@8ÿ Ü>gØVÿ18Ú>QA=z21;&>=/(/&l=B&8052=z*.682684:9;05*-<&>=5I25*-4:9;0`254?2b1;&`n9I, 68,mI050`4U&>9;05&>CEY;,-&Q%'&'1IN&­I705*-\:9I,T(*.2b1ÀIN&>=@I\& P 4(/&>= H+2@*uCF&bF,-*mCE*-25&8<2@4ò I9;<YI9;<(*.<2b1,-*mCE*-25&8<«254DðìQ%'&)21;&>93684:9;05*-<&>=21;&Ì(#Ñðñòó0IC P ,-&80W ÕDF I9;<-B F ^E21IM2c6>I93Y&'A;05&8<]254681I=5I6825&>=z*.0@&lY4212b1;&T*u9 P A;2TI9;<4:A;2 P A;2É0@*.\;9I,-0>QÉÈ1;&>921;&l=B&8,mI25*-4:9;01;* P Y&82B(/&8&>921;&T*m9 P A;2TI9;< 4:A;2 P A;2+*-0_\*-N&>9YVG|B F ÕDF Ôà F H/Â?#o Ñ >8 ((_1;&>=B&?à F *-0`U=B4:C 21;&?YI9;<;+,.*mCE*-25&8<lIVA;0505*mI9)<*-052=B*mYA;25*-4:9«(*.2b1|j8&>=z4 CE&>I9«I9;<NOIV=B*mI9;68&| Û Ã k ð(_1;&>=B&là k *-0`21;& P 4(/&>=/0 P &8682=5I,<&>9;05*-2BGQe 0Fà *-0?*m9;<& P &>9;<&>9V24U Õ (/&4:Y;2I*m9J21;&684;9;<*.2@*.4;9I,+&>9V2=B4 P G'I0?Y&8*m9;\0@4,-&8,.G<& P &>9;<&>9V2T4:921;&l9;4V*.0@&l054:A=z68&HgI9;<U=B4:CS&8wAI25*-4:9¡aZn9;< *-250+NI,mA;&|ÎWB H Õ ^ÑÎW!ë^ oÑ ,-4\ÉÑâ ã à k ðÆ&>9;68&| £W Õ éB^ÑÎWB?^6ÎWí^È+1;&T6>I P IM68*.2BG4U21;&T681I99;&8, *-0+2b1;&>94:Y;2I*m9;&8< YVGC?IMy:*mCE*j8*m9;\c2b1;*.04ONV&>=ÉIM,.,*u9 P A;2<*-052=B*mYA;25*-4:9;0~æ 21;*-0E*-0ZI681;*-&8N&8<YVG­C?IMy:*mCE*j8*m9;\'(+*-21'=z&80 P &8682~25421;&E<*-052=B*mYA;25*-4:94U Õ 0AYILB&8682+25421;&lIN&>=5I\& P 4(/&>=/,-*mCE*-2I25*-4:9|ø M Õ F N e 0_(/&Z1IN&T0@&8&>9 2b1;*.0`*-0IM6ã1;*-&8N&8<«(_1;&>9)(/&l1IN&ZI?~IA;0505*mI9q<*-052=B*mYA;25*-4:9H1;&>9;68&(/&Tn9;<'Y4V21 Õ I9;<²B CEA;052_1IN&Z~IA;0505*mI9q<*-052=B*mYA;25*-4:9;0>QT 9 P IV=B25*-6>A;,mI= Õ 1I0NOIV=B*mI9;68& I9;<B 1I0+NOIV=B*mI9;68& Ôà k ðìQ%'&T6>I9q21;&>9&8NOIM,uAIM25&l21;&T681I99;&8,6>I P I68*-2BGHgI0>|ÎWB?^à oÑ ,-4\/Ñâ ã W Ôà k ð ^ÎW!ë^à oÑ ,-4\/Ñâ ã W!à k ð^ oÑ ,-4\>goÔ Ã k ð ~ W!¡p^


È+1;*-0/6>I P I68*-2BG?*-0*m9Y;*-250 P &>=ª6ã1I99;&8,05GCEY4,!H;054~(É&6>I9E4:Y;2IM*u9Il=@I25& P &>=0@&8684:9;< HYVGCEA;,-25* P ,-*.6>IM25*-4:9'YVGu(öGòlH[*sQ &Q;U=z4:CÃ(ÑðñòlHgCEA;,-25* P ,.*-6>I2@*.4;9­YVG Ñðì|ñðá,-4\ oÔ Ã k ð ~ Y;*-2;0ç 4 ç 1I99;4;9 0_2b1;*u=z<21;&84:=B&>CH[21;&)gØÿ³B×:Ü/gÚ,×Ü@Ü×>ÿxÝqÝ:ÚØVÛaÚ,E|

È+1;&?6>I P I68*-2BG'4U_Iq6ã1I99;&8,+YI9;<,-*mCE*-25&8<3254Ìð Æ&>=B2_jH P &>=B2A=5Y[&8<XYVGI<<*-25*-N&/(_1;*-25&_lIVA;0505*mI9l9;4V*.0@&+4U P 4(/&>=0 P &8682=@I,<&>9;05*-2BGZà k I9;<?YI9;<;,.*mCE*-25&8<«254)ð *-0`\*-N&>9YG|ñðá,-4\ goÔ Ã k ðù~ Y;*-2;0 W¡¯^(_1;&>=B& *-0+2b1;&ZIN&>=5I\V&_2=@I9;0CE*-2525&8< P 4(É&>=ÏQ ±m²±x³ úÁ½8¸Â

È+1;&/05&8684:9;<E25&>=@C(+*-21;*m9E2b1;&É,-4\T*m9~&8wAI25*-4:9E¡¯*.02b1;&É0@*.\;9I,254T9;4*-05&+=5I25*-4?W ç Ð^8QoVQ+RTY;05&>=BN&?2b1I2l21;&6>I P I68*-2BG'*m9;6>=B&>I05&80?CE4:9;4V254:9;*-6>I,-,-GI9;<J(+*-21;4:A;2ZY[4:A;9;<I0/21;& ç Ðû*m9;6>=B&>IM05&80>QÑ[Q ç *uCF*.,mI=z,.G2b1;&_6>I P IM68*.2BG*m9;6>=B&>I05&80_CF4:9;4254:9;*-6>I,-,-GI0/21;&lYI9;<(+*-<21*m9;6>=B&>I0@&80YA;2+254cI~,-*mCE*-2>Q+ø+0@*u9;\ÈI>G;,-4:= 0/&8y P I9;05*-4:9Us4:=t,u9|

,u9WBoÔ*ü^Ñü« ü Ñ Ô ü þõ ü ü Ô +,++(É&T4;Y;2I*m9| í à k ,-4\ ãõ[Q+È+1;*-0+*-0_4U!25&>9=B&8(_=B*-252@&>9*u9q25&>=5CE0/4U+ÚÚ>ÛKOÚ>ÛXë8ÿxÝ!HøýbH(`1;*.681*-0_<&bn9;&8<«YG ÑøûýqcQ¦È+1;&T,-*uCF*.2@*u9;\)NOI,mA;&~*.0`21;&>9|øûýà k í ,.4V\ \ Ñl÷ a®õÈ+1;*-0+*-0_6>I,-,-&8<«21;& ç 1I99;4:9²Ü*mCE*-2>Q Q+È+1;&6>I P IM68*.2BG~4Ug21;&/6ã1IV99;&8,*.0IM6ã1;*-&8N&8<E(_1;&>9~21;&/054:A=z68&þg@,.4;4:LV0,-*mL&_9;4V*.0@&hgQÈ+1;*-0?*-0c21;& YIM05*-0EUs4;=E0 P =B&>I<0 P &8682=5AC 25&86ã19;*-wA;&80c4MUrCE4;<gA;,mI25*-4:93I9;<*m9P I=B25*-6>A;,mI=+h/4;<&liT*-N;*-05*-4:9)K A;,-25* P ,-& e 6868&80@0ZW!h/iTK e ^8Q



Introduction to Fourier Analysis, Synthesis, and Transforms

It has been said that the most remarkable and far-reaching relationship in all of mathematics is the simple Euler Relation,

e^{iπ} + 1 = 0     (1)

which contains the five most important mathematical constants, as well as harmonic analysis. This simple equation unifies the four main branches of mathematics: 0, 1 represent arithmetic, π represents geometry, i represents algebra, and e = 2.718... represents analysis, since one way to define e is to compute the limit of (1 + 1/n)^n as n → ∞.

Fourier analysis is about the representation of functions (or of data, signals, systems, ...) in terms of such complex exponentials. (Almost) any function f(x) can be represented perfectly as a linear combination of basis functions:

f(x) = Σ_k c_k Ψ_k(x)     (2)

where many possible choices are available for the expansion basis functions Ψ_k(x). In the case of Fourier expansions in one dimension, the basis functions are the complex exponentials:

Ψ_k(x) = exp(iµ_k x)     (3)

where the complex constant i = √−1. A complex exponential contains both a real part and an imaginary part, both of which are simple (real-valued) harmonic functions:

exp(iθ) = cos(θ) + i sin(θ) (4)

which you can easily confirm by using the power-series definitions for the transcendental functions exp, cos, and sin:

exp(θ) = 1 + θ/1! + θ^2/2! + θ^3/3! + · · · + θ^n/n! + · · · ,     (5)

cos(θ) = 1 − θ^2/2! + θ^4/4! − θ^6/6! + · · · ,     (6)

sin(θ) = θ − θ^3/3! + θ^5/5! − θ^7/7! + · · · ,     (7)
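As a quick numerical check (a sketch added here, not part of the notes; the truncation depth of 20 terms is an arbitrary choice), the partial sums of the series (5)-(7), evaluated at the imaginary argument iθ, reproduce the identity in equation (4):

import cmath, math

def exp_series(theta, n_terms=20):
    # partial sum of series (5) evaluated at the imaginary argument i*theta
    z = 1j * theta
    return sum(z**n / math.factorial(n) for n in range(n_terms))

theta = 2.0
series_value = exp_series(theta)
euler_value = complex(math.cos(theta), math.sin(theta))   # cos(theta) + i sin(theta), equation (4)
print(abs(series_value - euler_value))        # essentially zero (rounding error only)
print(cmath.exp(1j * math.pi) + 1)            # the Euler Relation (1), again ~0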

Fourier Analysis computes the complex coefficients c_k that yield an expansion of some function f(x) in terms of complex exponentials:

f(x) = Σ_{k=−n}^{n} c_k exp(iµ_k x)     (8)

where the parameter µ_k corresponds to frequency and n specifies the number of terms (which may be finite or infinite) used in the expansion.

Each Fourier coefficient c_k in f(x) is computed as the (“inner product”) projection of the function f(x) onto one complex exponential exp(−iµ_k x) associated with that coefficient:

c_k = (1/T) ∫_{−T/2}^{+T/2} f(x) exp(−iµ_k x) dx     (9)

where the integral is taken over one period (T) of the function if it is periodic, or from −∞ to +∞ if it is aperiodic. (An aperiodic function is regarded as a periodic one whose period is ∞.) For periodic functions the frequencies µ_k used are just all multiples of the repetition frequency; for aperiodic functions, all frequencies must be used. Note that the computed Fourier coefficients c_k are complex-valued.
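The projection (9) and the resynthesis (8) are easy to sketch numerically. The following fragment is not part of the notes: it assumes a period T = 2π, takes µ_k = k, and uses a square wave as the example function.

import numpy as np

T = 2 * np.pi                         # assumed period, so that mu_k = 2*pi*k/T = k
x = np.linspace(-T/2, T/2, 4096, endpoint=False)
f = np.sign(np.sin(x))                # example periodic function: a square wave

def c(k):
    # equation (9): project f onto exp(-i*mu_k*x) and normalise by the period
    return np.trapz(f * np.exp(-1j * k * x), x) / T

ks = range(-15, 16)
coeffs = {k: c(k) for k in ks}

# equation (8): partial resynthesis from the complex coefficients
f_approx = sum(coeffs[k] * np.exp(1j * k * x) for k in ks)
print(np.max(np.abs(f_approx.imag)))          # ~0: imaginary parts cancel because f is real
print(np.max(np.abs(f_approx.real - f)))      # largest error, concentrated near the jumps

The residual overshoot near the discontinuities does not disappear as more terms are kept; this is the Gibbs phenomenon pictured in the figures below.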


ÿ "!$#&%('*),+-"-.+/10,23/4065&+7)98,+-:'*!&),'<;=>5,#&;7?('*!&#&%A@ BDCFE7G4'DHI+JHK=D!$-K%!$LM+ONP/J#&)PQRCG,@ BDC"S9N&E T@ BDCUEVHWX'<?ZY&!$5&?0*![%%O!1=]\J+#&+-(/40<'*?.^`_"+?Z/Ja4+NTcbJd HWe+!$f&%(+-.2J+?ZY&+O!&-.?ZY&!J\J!&#,/40<'*?.^g8,-:!$8,+-.?('*+7%!4=%'6#B>hCUE"/J#&);7!J%3B>iRCFE=j!$-'6#J?+7\[+-.%Oh/J#&)9iKk

lnm:op ;7!J%3BDiCUEq;7!J%BDhCUE(r$C T s b[d 't=huTciTcvdxwzy| !J?ZY&+-:_'<%+l m:op %'6#BDiCUEq%'6#BDhCUE(r$C T s v 't=huTciTcvdxw y| !J?ZY&+-:_'<%+l m:op %'6#BDiCUEq;7!J%BDhCUE(r$C T v QRh~Zi B.EWe+5&%+?ZY&+-.!$#&+7;7a4+-w=5,#&;7?'<!$#?(!gLP+/J#k

wzy|PT s hTciv !J?ZY&+-:_'<%+ Y&+#?ZY&+U!$5,-:'*+-&+-.'<+7%=j!$-"@ BDCFE"'<%k

N pb S3x B>NJ;7!J%3BDiCUExSz|%'6#B>iRCFEE BDbJE_OY&+-.+?zY&+U!&5,-.'<+- "!&+zg;7'<+#J?%/J-.+Jk

N[T d l m.op @ BDCUEq;7!J%BDiCUEr&C iv BDJEzT d l9m:op @ BDCUEq%'6#BDiCUEr&C ic B$E

We+PY&!$8,+?ZY,/4??ZY&+F!$5,-.'<+-&+-.'<+7%8,-.!42$'<),+7%/J#`+78,/[#&%'<!$#!4=]?zY&+=5,#&;7?'<!$#9@ BDCFE"'6#?+-LM%!4=;7!J%('#&+P/J#&)9%('#&+?+-LP%H ¡|¡¢J£,¤|¥¦c§F¨J¥£©cªx«¬­U§®¨9®4¯ °|§¢J­F®± +7?O²³´ BDCFE"f,+/[#J^`%Z5,Lµ!4= %'6#&+/[#&)9;7!J%'6#&+?+-LP%k

² ³´ BDCUE T N ³pb S ´¶ 4 BDN ³ ;7!J%3BDiCFES ³ %'6#KBDiCUEE BD·JE/J#&)9² ´ BDCFE7G&?ZY&+?Z-5,#&;/4?(+7)U!$5,-.'<+-&+-.'<+7%=D!$-/=>5,#&;7?'<!$#@ BDCFE7k

² ´ BDCUE T N pb S ´¶ 4 BDNJ;7!J%3BDiCFESz|%'6#KBDiCUEEJ¸


_OY&+-.+¹N /J#&)º /J-.+P?ZY&+PU!$5,-:'*+-;7!&+zg;7'<+#[?%H¹ "!$#&%('*),+-?ZY&+P'6#J?+7\&-/40 \J'<2&'#&\?ZY&+),'<%;-.+8,/J#&;7^`f,+7?._"+7+#¹@ BDCUE/J#&)g² ³ B>CUEBD/4%(%Z5,LP'6#&\@ BDCFEK'*%_"+70*0f,+Y,/2J+7)¹+#&!&5&\$YP=D!$-?ZY&+'6#[?+7\$-/40x?!P+7$'<%?ZEVk l m.op » @ BDCFE ¼e² ³ BDCUE½ m r$C_OY&'<;7Y%'6L¹8&0<'t¾U+7%?!Fkl m.op ¿ @ BDCFEzÀ m r$C¼ b[dN mp ¼Ád ´¶ 4x BDN m S m ES b[d|BDN ³p ¼eN p E m Sd ´¶ 3x" BDN ³ ¼ÁN[$E m ScBD ³ ¼ez$E m4à BDÄJE

Å !J?+P?ZY,/4??ZY&+¹?+-(LP%O'6#[2J!J0<2&'#&\eN[³]/[#&)Æz³ /J-.+P/40<0OÇv/J#&)23/J#&'<%ZYÈ_OY&+#eN[³ TÉNJ/J#&)e ³ TÊzFH9ËKÌ$ÍÏÎÐJÑ$ÒzÓÍÒÔRÍÒZÓÍ7ÕPÓ6Õ¹ÖÌ$Í×ÍVÕØÖÙzÚJÚRÒ.Ð4Û4ÓÝÜPÙJÖDÓÐJÞ,ßÓÝÞeÖDÍÒZÜÕ¹Ð.à¹ÜPÍZÙ[ÞÕ7áÑ&ÙJÒ.ÍZâ`ÍÒzÒ.ÐJÒß"ÖDÐ@ãÖÌ$ÙJÖäZÙ[ÞÆ×(ÍPÙJä7Ì&ÓÝÍåÍzâgÑ[ÕØÓÞJæÖÌ$ÍVÕØÍPäÓÝÒ.äÑ&ç<ÙJÒKà7Ñ$ÞFäÖDÓÐJÞ,Õè>é ê­U¯ °¥¢J­F¦X­U©x¨1®Ï£©ìëZ°|©|í&¨J¥£©|® Y&+¹F!$5,-.'<+-&+-:'*+7%¹/J#&)ÈU!$5,-.'<+-;7!$+zg;7'<+#[?%¹/J-.+P),+z¾R#&+7)/4%/Jfq!42J+JH¹î!4_"+72J+-_"+L¹/^¹+#&;7!$5,#[?+-%!$LP+8,-:!$f&0<+LP%k[H"'#[?+7\$-(/40<%'6#+7ï,5,/4?'<!$#&%,G&P=>/1'*0x?!¹+7&'*%(?H +JHI\UHðk

@]B>CUE T C!$- @ BDCUE Tñs C9-(/4?'<!$#,/40v C'6--(/4?'<!$#,/40bqH/40<?ZY&!$5&\$YN[RG,z¹+7&'<%?GF?ZY&+%+-.'<+7%),!$+7%#&!J?;7!$#[2J+-.\J+JGqH"+72J+#`?ZY&!$5&\$Y`?zY&+%+-.'<+7%O;7!&#J2J+-:\J+7%G,?ZY&+-:+7%Z5&0<?O'<%O#&!J?@]B>CUE

@ BDCUE TÉs SP vPòCód¼ dòCóbJd?ZY&+#k NJPTcv,~zT bd l op %('#B>iRCFEr&CgTs ô o i!&),)v i+72[+#%!Uk @ BDCUE`õT dcö %('#B>CUES %'6#KBDJCUE S %'6#BD·JCFE· SX÷÷÷ øf,5&?%+-.'<+7%\J'<2J+7%O@ BDidE õTcv,H


[Figure: approximations to the rectangular pulse rect(x) by truncated Fourier series fseries(1,x), fseries(7,x) and fseries(63,x).]

[Figure: Gibbs phenomenon; fseries(63,x) and fseries(163,x) plotted against rect(x) near the discontinuity.]


î!3_"+72J+-VG$'6#-.+/400<'t=j++7/[L¹8&0<+7%O+#&;7!$5,#[?+-.+7)'6#%'<\$#,/40 8,-.!&;7+7%%'6#&\`?ZY&'6#&\J%O/J-.+%('Lû8&0<+-O/1%k[Hü=|@]B>CUE"'<%f,!$5,#&),+7)'#BDvq~ZbJdE /[#&)n8&'<+7;7+7_'<%+P;7!$#[?'6#$5&!$5&%G,?ZY&+F!$5,-.'<+-"%+-:'*+7%;7!$#J2[+-.\J+7%?! ¿ @ BDC ¶ ExS@]B>CUýKEzÀþJb/4?'6#[?+-.'<!$-K8,!J'6#[?%]/[#&) ¿ @ BDvJýKERSã@ BDbJd ¶ EzÀþJb/4?v¹/J#&)9bJd H bqHü=@ BDCUE'*%"/40<%!;7!&#J?'6#&5&!$5&%"!$#9BDvq~ZbJdEVG?ZY&+%Z5,L!4=?ZY&+U!$5,-.'<+-&+-.'<+7%'<%|+7ï,5,/40?!¹@]B>CUE/4?/40<0 8,!J'6#[?%HqHNJg/[#&)nz¹?+#&)?(!¹ÿ7+-.!g/4?0*+/1%?O/4%"=/4%?/4%þJiKH

g£¦ì¡|¬­q¤ëZ£¢J¦ +7_O-:'*?(+5&%'6#&\ '6#?ZY&+!$f[2$'<!$5&%_/^P=-.!$Lµ?zY&+=j!$-(L5&06/4+=D!$-"%('#/[#&)9;7!J%k

@ BDCUET N pb S3x B>NJ;7!J%3BDiCUExSz|%'6#B>iRCFEET N pb S 4x N[ B S ¶ Eb Sz B ¼ ¶ Eb4' T ¶

BJE_'<?ZYk

p T N p þJbiv T BDN ¼ Eþ[b ¶ T BDNJSz$Eþ[b!$f&%+-.2J+Jk ¶ T

BJE_OY&+-.+

),+#&![?+7%?ZY&+;7!$L¹8&0<+7;7!$#5&\$/4?+JG,/J#&)k PT

bJd l m:op @ BDCUE ¶ r$C BD¸JE ¡|¡¢J£,¤|¥¦c§F¨J¥£©cªx«¥©x¨1­F¢J¡£¬§F¨1¥£© "!$#&%('*),+-Ï?ZY&+24/4065&+n!1=/e8q+-.'<!$),'<;=5,#&;7?'<!$#@ BDCFE¹/4? ),'<%;-.+7?++7ï5,/10*0<^%Z8,/1;7+7)23/105&+7%!4=|Ck C T"!$# B#T bJd G%!Tcv,~J~&&&.~'u¼E?Z-.^`?!¾R#&)9;7!&+zg;7'<+#[?% Ï%Z5&;ØY?ZY,/4?k@ B!$#E T ´¶ 3 p ( *)

. B.vJE+,.-0/213'145/-76*/8(9:45;<451=>45=?4:-18@/AB$CEDFAHG$DI8@DKJML*N(O(P>QR9:45;TSVUXWMJ?L*N>Y[ZEP$3-0AHJ?L*N]\^P%QR9:45;_SUXW^J?L*NM`ZEPIa


b 5&0<?'68&0*^fJ^ ¶ *) .y /J#&)%Z5,Lu_'<?ZY-.+7%Z8q+7;7?O?(!c!&k´¶ p @]Bd!0#Ee ¶ f) .y T ´¶ p ´¶

3 p ? *)2g< ¶ yih

f,5&?Of[^g?zY&+%Z5,Lµ!1=/P\[+7!$LP+7?Z-.'<;%+-:'*+7%/J#&)9f&06/4?Z/[#J?O/1%%+-.?'<!$#k´¶ p *) 2g< ¶ yih Tkjlllm llln

"¼*) ´ g< ¶ yih¼ *) g* ¶ yHh Tcv ioTch iTch%!Uk

y T ´¶ p @ B!$#RE ¶ *) .y B.JE

p &+-.;7'<%+Ï=j!&-?ZY&+-.+/4),+-7k%ZY&!4_ñ'6#&%+-:?'6#&\e+7ï5,/1?'<!$#[P'#[?!+7ï,5,/4?'<!$#ãv%Z/1?'<%.¾U+7%?ZY&+!$-:'*\['#,/10+7ï,5,/4?'<!$#&%H

-5

-4

-3

-2

-1

0

1

2

3

4

0 2 4 6 8 10 12

saw(x)’trig.pts.8’’trig.pts.64’

'<\$5,-.+,kqÏ/J#&)9Ä4g8q!J'6#J?'6#J?+-(8,!J06/4?'<!$#&%?!P%Z/_?!&!J?ZYM=5,#&;7?'<!$#We+;/J#?Z-/J#&%(0/1?+O?zY&'*%f,/4;7a¹'6#[?!¹;7!J%/J#&)9%'6#%+-.'<+7%k

@ BDC ZE|TìN p S ´[r m ¶ 3x BDN[|;7!J%3B bJdX!4i ESz %('#KB bJdX!4i EE_OY&+-.+gBÝ=j!$-vPóió ¹þJb[E7k

N p T ´¶ : p @ B b[dX! E$


NJ T b ´¶ : p @ B b[dX! Eq;7!J%B bJdX!4i E T b ´¶ : p @ B b[dX! Eq%'6#B bJdX!4i E B.bJE

ù%_"+¹'6#&;-.+/4%+s _ +P;/J#ÈL¹/Ja4+¹?zY&+¹'6#J?+-(8,!J06/4?'<!$#Æ=5,#&;7?'<!$#e/4\$-:+7+¹_'<?ZYe@ BDCFE/4?LP!$-:+/J#&)nLP!&-.+8q!J'6#J?%H /Ja['6#&\ N[/4%O/J#+7/JLÏ8&0*+JG/4%iutwv Bð=D!$-"_"+70<0fq+Y,/2[+7)@ BDCUE(E7kN T d ´O¶ p @ B bJdx! E,;7![%VB bJdX!4i E b[d t d l m.op @]B>CUE,;7![%VB>iRCFEr&C B.JE

dy g£®4¥©|­º§R©<z|]¥©|­]­U¢[¥­U®~f&%+-.2[+?ZY,/4?"'t= @ BDCUE'<%%^LÏLP+7?Z-.'<;JG,?ZY,/1?'<%O@ B.¼CUETc@ BDCUE?ZY&+#k

N p T bd l op @ BDCUEr&CN[ T bd l op @ BDCUEq;7!J%BDiCUEr&Cz T vY&+#&;7+?zY&+U!&5,-.'<+-"%+-.'<+7%'*%%'6L¹8&0<^kN pb S ] p N ;7!J%3BDiCUE~#P?ZY&+"!J?zY&+-Y,/J#&)Ï' =@ BDCUE T¼@ B.¼CFE7kK?ZY&+#_"+ \[+7? ?ZY&+";7!$--.+7%Z8q!$#&),'6#&\%'6#&+%(+-.'<+7%k] p z %'6#B>iRCFE_'<?ZYk zT bd l op @ BDCFE,%'6#BDiCUE(r$Cp ,/JL¹8&0<+JG?Z/Ja4+`@]B>CUE¹T Ï=D!$-PvÁóCìód HWe+`;/J#º+7&?+#&)?ZY&'<%¹?!ef,+98q+-.'<!$),'<;_'<?ZYe8q+-.'<!$)ºbJdº+7'<?ZY&+-%^,L¹LP+7?z-.'<;/40<0*^JGK_OY&+#Æ_ +P!$f&?z/4'6#?ZY&+P;7!J%'6#&+¹%+-:'*+7%B_OY&'<;7Y'<%%'6L¹8&0<^eN p Tµb/4%+7,8,+7;7?+7)UE7Gx!$-_'<?ZYe/[#J?'<%^,L¹LP+7?z-.^ÁB%ï,5,/J-.+P_/2J+E_Y&+#?ZY&+%'6#&+%(+-.'<+7%O\J'<2J+7%5&%k

T d ö %'6#KBDCUES %('#KBDJCFE S %('#KBD·JCFE· Sì÷÷÷ ø Bð=D!$-"vPóCódEù#&!J?ZY&+-"+7,/JL¹8&0<+JGF?Z/Ja4+@ BDCUE TcCKBDd¼eCFE=j!&-vPóCódH Y&+;7!J%'6#&+%+-.'<+7%O'<%k

@ BDCUE T d mÄ ¼È;7!J%3BDbJCFEK¼ ;7!J%3B$CFEb m ¼ ;7!J%BDÄJCFE m ¼÷÷÷J


Y&+#&;7+/1%O@ BDCUE[t vÏ/4%OCst v,kd mÄ T|S b m S m Sc÷÷÷?ZY&+;7!$-(-.+7%Z8,!&#&),'#&\`%'6#&+P%+-.'<+7%'<%k

CBDd`¼ºCUE T d ö %'6#B>CUES %'6#BD[CUE S %'6#KBD·JCUE·$ SX÷÷÷ ø/J#&)9/4?CgTcdxþJb,k d Tc[b ö ¼ $ S ·$ ÷÷÷ ø~f&%+-.2[+?ZY,/4?|!&#&+%+-.'<+7%;7!$#[2J+-.\J+7%=/4%?+-?ZY,/J#`?ZY&+!J?ZY&+-VG$_Y&+#`?ZY&+O24/4065&+7%O!1=?ZY&+f,/4%'<%=>5,#&;7?('*!&#&% L¹/1?;ØYÏ?ZY&+24/4065&+7%!4=]?zY&+=>5,#&;7?'<!$#/4?"?ZY&+8q+-.'<!$),'<;¹f,!&5,#&)U/J-.'<+7%H £°¢J¥­F¢¨J¢J§©|®VëZ£¢J¦&?Z/J-:?'6#&\=-.!$LX?zY&+|;7!$L¹8&0<+7%+-.'<+7%'6#+7ï,5,/4?'<!$#¸qG3L¹/[a4+ /";7Y,/J#&\J+!4=U%;/40<+ ;7!$#&%('*),+-/8,+-.'<!&),'<;=5,#&;7?'<!$#xB>CUE_'*?zY8,+-.'<!&)b$ ),+z¾R#&+9@ BDCUEPTBDC^gþ[dE7G_OY&'<;7YY,/4%8,+-:'*!&)ebJd H

T bJd l o¶ o @ BDCFEe ¶ r$CT bJd d l¶ B&Ee ¶ o r r( B,E T bJd l¶ xBd$Ee ¶ f r? B.7$E

_OY&+-.+ÏTcidþ$ÁGF/J#&) B,E T þJdKH|î+#&;7+Jk@ BDCFE T (

B&E T B,Ee * dT B,Ee * w$ B.·JE

_O-.'<?'6#&\w$Ï=j!$-?ZY&+%?+8dþ HxùO0<0<!4_'6#&\twvXG,?ZY&+#`_"+Y&!$8,+Jk B,Ee f w$.t

l ¶ B,Ee * r(î+#&;7+_"+!$f&?Z/4'6#`?ZY&+U!$5,-:'*+- -/[#&%.=j!&-Lµ8,/4'6-7kBDCUE T b[d l ¶ B,Ee *

r? B.ÄJE B,E T l ¶ BDCFEe ¶ * r&C B.JE


p ï,5,/4?'<!$#ÏÄ"+7,8,-.+7%%+7% /=5,#&;7?'<!$#P/4%/"%z8,+7;7?Z-5,Lì!4=&=>-:+7ï5&+#&;7^;7!$L¹8,!&#&+#J?%H /Ja4+#?!J\[+7?ZY&+-B_'<?ZY)U5&+;7!&#&%'<),+-/4?'<!$#=D!$-=>-.+7+24/J-.'6/Jf&0<+7%ZE_"+!$f&?Z/1'#k@ BDCFE T bJd l ¶ l ¶ @]Bd$Ee *

g ¶ h r?,r( B.JEU!&5,-.'<+-I%ü#[?+7\$-/10 Y&+7!$-:+L'<%O/%(?Z/4?+LP+#[?|!1= _Y&+#`?ZY&'<%|=D!$-LP5&06/Y&![0*),%,'t=]@ BDCUE'*%!4=f,!&5,#&),+7)23/J-.'6/4?('*!&#G/[#&) @ BDCUE4'<%"'#[?+7\$-(/Jf&0<+=-.!$Lu¼7v ?!.vXG&?ZY&+#¹+7ï,5,/4?'<!$#Y&!J0<),%¹B_"+70*0KLP!$-.+\J+#&+-(/40<0*^?zY&+),!$5,f&0<+'6#[?+7\$-/10R\['*2[+7%BD@ BDCUýKESã@]B>C ¶ EEþJbJEVHd _Ï¢J£¡­F¢4¨J¥­U®WÁ-.'<?'6#&\BDCUE[ B,E ?(!%'<\$#&'t=D^g?ZY&+?Z-/[#&%.=j!&-L8,/4'6-7G&_"+ Y,/2J+?zY&+=D!J0<0<!3_'#&\`8,-.!$8[û+-.?'<+7%k[H&^L¹LM+7?Z-.^g/J#&)-.+/40<'<?.^ ' =¡BDCUE"-.+/40x?ZY&+# B.¼7,E T B,E ' =¡BDCUE"-.+/40K/J#&)+72J+#k

B,EKTcbl p @ BDCUEq;7!J%B,CUEr&C ' =¡BDCUE"-.+/40K/J#&)!$),)

B,EKT¼b] l p @ BDCUEq%'6#B,CUE(r$C Y&+06/4%??._"!/J-:+/J#,/40<!J\$5&+7%!1=?ZY&+;7!J%'6#&+P/J#&)%'6#&+%+-:'*+7%¢¹;7!J%'6#&+/[#&)%'6#&+?Z-/J#&%.=D!$-LM%HbqH ± '6#&+/J-.'<?.^£=D!$-;7!$#&%?Z/[#J?%PNe/J#&)4G 't=7,3BDCFE. B,EP/J#&) m BDCFE. m B,E?ZY&+# N$,BDCUES m BDCUE[ N VB,ExS m B,EqH,8,/4;7+¤4?'6LP+%ZY&'t=j?'6#&\RxB>CP¼eN$E ¶ *'¥ B,EFH"R-.+7ï,5&+#&;7^%ZY&'t=j?('#&\RBDCUEe¦*§ B¼¨UE·qHH©'5ª+-.+#[?'6/4?'<!$#`!&#&;7+_ ³ BDCFE[@ B,EÄqHH7H7H/J#&)ni?'6LP+7%GM g<0h BDCFE[BI,E Bd,E« g£©x¬£¬° ¨J¥£x©|®

,5,8,8,![%+ @ BDCFE ­¹B,E7G¢xB>CUEu B,E¦_OY,/4?Ï=>5,#&;7?('*!&#¯®KBDCUEY,/4%9/º?z-/J#&%.=D!$-L° B,E T"­¹B,E B,Ee±ü#&%+-.?'6#&\`'6#J?!`?ZY&+'6#[2J+-.%+?Z-(/J#&%.=D!$-Lk®BDCFE T bJd l ­¹B,E B,Ee f r(

T bJd lll @ B&EIB²JEe f g ¶ ¶M³ h r(,r?$r?²T bJd lll @ B&EIBµ´ ¼&Ee * g¶ ¶^· h r?,r?$r´


R-:!$LÉ+7ï5,/4?('*!&#e,G&_"+?ZY&+#!$f&?Z/4'6#`?ZY&+¹äzÐJÞUåÐJç<Ñ$ÖDÓÐJÞÏ!4=@/J#&)sk®B>CUE T l ¶ @ B$EExBDC¼&Er? B.¸JET @ BDCFE>¸7BDCUE BDbJvJE!$-7k l ¶ @ B$EExBDCP¼&Er? ­¹Bd,E B,E&'6LP'<06/J-.0<^Rk @ B&EIxBd$EX l ¶ ­¹BI¨UE B¼¨UEr ¨ù%O/P%Z8q+7;7'/10;/4%(+;7!$#J2[!J0<2J+?ZY&+-:+/40F=>5,#&;7?'<!$#&%@ BDCUE/J#&)9@ B.¼CUE7Gq?ZY&+#k­¹B:¼7,ET ­ Bd,E° B,E T ­¹Bd,Ee­ B,ET ¹­¹B,E m BDb,Eº^» B>CUE T l ¶ @ B$E(@]BdSCUEr? BDbJbJE Y&+`=5,#&;7?'<!$# º^» '6# p ï,5,/4?'<!$#b[b'<%¹?ZY&+ÁÙ[Ñ$ÖDÐäZÐJÒzÒ.Íç<ÙJÖDÓÐJÞ=5,#&;7?'<!$#ÇB!4= @E7G _Y&'*0<+:­¹B,E m '6#+7ï5,/4?('*!&#9b,'<%O?zY&+PÕÚ,ÍZäÖDÒ Ù[çRâ[ÍÞ,Õ7ÓÖ¼3H<~f&%+-.2J+½ /J-:%+723/40I%?ZY&+7!$-.+Lkº BDv[ET l ¶ ¿¾ @ B$EEÀ m r?T

bJd l ¶ ­¹B,E m r?Á £¦c­_ ¡§¥¢J®&!$LP++7,/JL¹8&0<+ -/[#&%.=j!&-Lµ8,/4'6-.%k[H&'LÏ8&0*+P+7,/JL¹8&0<+Jk

@]B>CUE Tñs ¶ ¥ Cãvv Cóãv G%­¹B,ET NS@ü= _ +LÏ/Ja4+?ZY&+=5,#&;7?'<!$#%(^L¹LM+7?Z-.'<;/Jf,!&5&?Ov,k@ BDCFE T" ¶ ¥Â  G%­¹B,E|T bJNN m S mbqHú/J5&%%('/[#`+7/JLÏ8&0*+B%+7+¾U\$5,-:+$E7k@ BDCUE T ¶ §Ã í¹B,E T l ¶ §Ã à ¶ * r&C

T ¶Ä ÃÅFÆ Ã l ¶ B § ýÈÇ ÄÃ Æ E à r&CT ¶Ä ÃÅFÆ Ã¨ l ý Ç Äà ƶ ý Ç ÄÃ Æ ¶MÉ Ã r?ÊT Ë d¨ ¶ Ä ÃÅFÆ Ã BDbJJE

(

Page 51: Information Theory and Coding - University of Cambridge · 2008-10-20 · Information Theory and Coding Computer Science Tripos Part II, Michaelmas Term 11 Lectures by J G Daugman

0

0.2

0.4

0.6

0.8

1

-2 -1.5 -1 -0.5 0 0.5 1 1.5 20

0.2

0.4

0.6

0.8

1

-2 -1.5 -1 -0.5 0 0.5 1 1.5 2

'<\$5,-:+Uk|ú/[5&%%'6/J#`),'<%?Z-.'6f,5&?'<!$#&%/[#&)?ZY&+7'6-?Z-(/J#&%.=D!$-LP%ú/J5&%%('/[#Ï=>5,#&;7?'<!$#'<%O%+70t= )U5,/40DH A!$#&%'<),+-_Y,/4?OY,/J8,8q+#&% /4%̨ctvXHqH +7;7?Z/J#&\$5&06/J-8,5&0*%(+=5,#&;7?'<!$#k

!,BDCUE TÉs m ¥ C¡&óNv C¡?N G%͹B,E T %'6#B,N$E,N BDb4$EÅ !J?('*;7+?zY,/4?O/4%ONt v,GU/[#&)!$f&%+-.2J+͹Bd,E[t],_OY,/1?AY,/[8,8,+#&%?!Î!,BDCUE±

%Ï sÐ|­Ñe¥¢J§íz|­F¬D¨J§ÈëZ°|©|í&¨J¥£©We+%zY,/40<0,¾R#&)¹?ZY&+¢©'-(/4;OwVûD=>5,#&;7?('*!&#P?!f,+!4=;7!$#&%'<),+-/Jf&0<+5&%+'6#P?zY&+;7!$#&%'<),+-/4?('*!&#!4="%Z/JL¹8&0<'6#&\UHî!4_"+72J+-7G?zY&'*%Ò.=5,#&;7?'<!$#?Ó9'<%#&![?/`=5,#&;7?'<!$#e/4?/40<0'#È?ZY&+P;706/4%%'<;/40/J#,/40<^&%'<%%+#&%+JG/J#&)X_"+9_'<0<08&06/^=/4%?`/J#&)X0*!&!J%+È_'<?ZYã'*?(% 8,-.!&8,+-.?'<+7%/4% /È=5&0<0),'<%;5&%%'<!$#º'<%Pf,+7^J!$#&)º?ZY&+¹%(;7!$8,+`!4=?ZY&'<%;7!$5,-:%+JH Y&+g?z-.+/4?ZLP+#[?f,+70<!4_Ê'<%¹8,5,-.+70<^'6#[=j!&-L¹/40DH "!$#&%('*),+-:'#&\?ZY&+¹-.+7;7?Z/J#&\&5&0/[-8,5&0*%(+¹+7/[L¹8&0<+g/Jf,!42J+'6#+7ï5,/1?'<!$#b4g_"+¹/J-.+P'6#[?+-û+7%?+7)'6#?ZY&+8,-.!&8,+-.?'<+7%!4=Ô!,BDCFEBD-.+8&06/4;7'6#&\Ngf[^ÖÕ(E/4%×Õ<t v,H

l ¶ !,BDCUE(r$CPT s Î Õv ¹óc¼7Õ BDbJ·JE~#?zY&+_'<0*)e/4%(%Z5,L¹8&?'<!$#`?ZY,/1? '<?+7$'<%?(%k

0<'6LØÚÙ p !B>CUE Tcw,BDCFE&!$LP+8,-:!$8,+-.?('*+7%k(


[H";7!$#&;7+8&?Z5,/40<0<^ wB>CUE T sÜÛ vµCgTcvv CoTcvbqH"?Z/Ja['#&\Ï?ZY&+0<'LM'*?!1=+7ï5,/1?'<!$#9bJ·qk

l ¶ w,BDCUEr&C¹Tñs kãvv góãvqH/4%%Z5,LP'6#&\e@ BDCUE'<%¹;7!$#[?'6#$5&!$5&%P'6#º?ZY&+`'6#J?+-:23/40B ¼ Õ7~ S"ÕEP/J#&)5&%'6#&\e?ZY&+),'*%z8&0/1;7+7)Æ8,5&0<%+=>5,#&;7?('*!&#u! Ø BDC¹¼ E¦f[^`?ZY&+'6#J?+-(LP+7),'6/4?+24/4065&+P?ZY&+7!$-.+LÉ=D!$-%!$LP+È´¹'6#?ZY&+'6#[?+-.24/40 B ¼Õ7~ SÕE7kl ¶ @]B>CUEe!,BDCP¼ Er&C¹Tc@ Bµ´$Eî+#&;7+_'<?ZY5&%Z5,/40),'<%Z-:+7\$/J-.)=j!$-"-.'<\J!&-.!$5&%Z#&+7%%G,?Z/Ja['6#&\¹?ZY&+0<'6LP'<?kl ¶ @ BDCUE(wB ¼eCFEr$CÏTc@ B E~f&%+-.2J+?ZY&'<%'<% /;7!$#[2J!J065&?'<!$#ÈH7HHFH=>5,-.?zY&+-#&!J?+ÈBDCUE(wB>C¼ E TÝB Ew,BDC¼ EVHü#%!&LP+_/^$%?ZY&+wVûD=5,#&;7?'<!$#/J#&)9'<?%=>-:'*+#&),%;/[#nfq+P;7!$#&%'<),+-.+7)e/4%/J#,/40<!J\J!$5&%?!+7&?+#&),'6#&\g?ZY&+-/4?('*!&#,/40<%?!P?ZY&+-.+/40<%,?ZY&+7^`f,!J?ZY`+#,/[f&0*+%+7ï,5&+#&;7+7%?!¹Y,/2J+O0<'6LP'<?%HWe+),+z¾R#&+P?ZY&+¹ÓâJÍzÙJçÕ7ÙJÜÚRç<ÓÝÞ[æà7Ñ$ÞFäÖDÓÐJÞ¹!4= '6#J?(+-.23/40x /4%k

w BDCUE T w,BDCϼei%eE_"+;/J#`U!&5,-.'<+-?Z-(/J#&%.=D!$-LÉ?ZY&'<%B+7&+-.;7'<%+=D!$-"?zY&+-.+/4),+-E/[#&)!$f&?Z/4'6#k

Þ B,E|T y w,B?¼eb[dhE_'<?ZYw Þ H §¦c¡|¬¥©[ß sÐ|­F£¢J­F¦~5,-;7!$L¹8&0<+7/J#&)g;7![%'6#&+¤4%'6#&+%+-:'*+7%"\$/2J+5&%A/),'<%;-.+7?(+%+7?|!4=KU!$5,-.'<+-;7!&+zg;7'<+#J?%=D!$-P/e8,+-.'<!&),'<;9=5,#&;7?'<!$#à"!$-0<!&!$aJ'6#&\/4?P'<?g/J#&!J?ZY&+-M_A/^=D!$-P/=>5,#&;7?('*!&#_OY&'<;7Yº'*%#&!$#[ûÿ7+-.!`'6#/P¾R#&'<?+¹-/J#&\[+JGU_"+;/J#),+z¾R#&+¹/¹8,+-:'*!&),'<;g+7&?+#&%'<!$#!4= ?ZY,/4?A=>5,#&;7?'<!$#H Y&+P%^,L¹LP+7?Z-:'*;P#,/4?Z5,-.+!4=|?ZY&+PU!&5,-.'<+-?Z-(/J#&%.=D!$-Lu_ !&5&0*)%Z5&\J\J+7%(?O?ZY,/1?O%!&LP+7?ZY&'6#&\'6#J?(+-.+7%?'6#&\¹LP'<\$Y[?|Y,/J8,8,+#`'t=?ZY&+F!$5,-.'<+- ?Z-(/J#&%.=D!$-Lì=5,#&;7?'<!$#¹'<%"/40<%!#&!$#[ûÿ7+-:!!$#&0<^'6#9/¾R#&'<?+P-/J#&\J+JH


"!$#&%('*),+-K%z/JL¹8&0<'6#&\ /%'<\$#,/40áxB>CUE/4?K'#[?+-.24/40<%[µ?!!$f&?Z/1'#?ZY&+|),'<%;-:+7?+"?'6LP+|%'<\$#,/10jk BDCUE T xBDi>eEw,BDC¼ei>ÆE BDbJÄJET xB>CUEw BDCFE +LP+LPfq+-¹?ZY&+8,-.!$8q+-.?'<+7%g!1= ;7!$#[2J!J065&?'<!$#&%G"_ +9a&#&!4_?ZY,/4?¹/e8,-:!$)U5&;7?`'6#?zY&+9C),!$L¹/1'#ã'<%9/È;7!$#J2[!J065&?'<!$#'6#X?ZY&+?Z-/J#&%:=j!$-(L ),!$L¹/4'6#HÇWX'<?ZYBDCUE B,E`/J#&) BDCFE B,E7k B,E T B,E ¸TÞ B,ET y B,E ¸ w,B?¼ºbJdh`E

T y BP¼bJdh E BDb]JE

R-:!$LÉ?ZY&'<%_"+ #&!J?(+?ZY,/4? =D!$-"/J#[^ÎBDCFEâ B,E?ZY&+-.+7%Z5&0<?!4=%z/JL¹8&0<'6#&\ /4?"'6#[?+-.23/10*% '#Ï?ZY&+CÏ),!$L¹/4'6#¹-.+7%z5&0*?(%|'6#g/?z-/J#&%.=D!$-L B,EK_OY&'<;7Y¹'<%|?zY&+O8,+-.'<!&),'<;+7$?+#&%('*!&#!4= B,E_'<?ZY8,+-.'<!&)nbJdxþ$H "!$#[2J+-.%(+70*^'t=7BDCUE'<%P%?Z-.'<;7?0<^Áf,/J#&)&û0<'6LP'<?+7)f[^ã B>-/4),'6/J#&%¹8,+-CÁ5,#&'<?ZE7GK?ZY,/1?'*% Bd,E'<% ÿ7+-.!!$5&?%'<),+"?ZY&+"'6#J?(+-.23/40B.¼7ãº~'ãÇE7G?ZY&+#PfJ^%z/JL¹8&0<'6#&\/4?K'6#J?+-:23/40<%|bJdþ[b]ã'6#?ZY&+C),!$L¹/4'6#k

B,E TbJdb]ã B,EGB.¼7ã ó ó ã E/4%¹%zY&!3_#'6#º¾U\$5,-:+n·,H½K+-=D!$-LP'6#&\e?ZY&+F!$5,-.'<+-Ï?Z-/J#&%.=D!$-L !$#º+7ï5,/1?'<!$#bJÄqG|_"+!$f&?Z/1'#k

B,E TbJdb]ã xB bJdib]ã Ee ¶ * m.o r meä GB.¼7ã ó `ó ãÇE BDb]JEå 5&?¹_"+9a$#&!4_ B,E¹T v=j!&-/10*0Î X7æãç"?zY&+-.+z=j!&-.+?ZY&+9%(+7ï5&+#&;7+ ¿ xB>iRþ[b]ã EØÀ;7!$L¹8&0<+7?+70<^`),+z¾R#&+7% B,E/J#&)gY&+#&;7+¢xBDCFE7Hâè%'6#&\P+7ï5,/1?'<!$#gbá/J#&)g8q+-=D!$-LP'6#&\P?ZY&+'6#J2[+-.%+?z-/J#&%.=D!$-LkBDCUE T b[d l ¶ Bd,Ee *

r?T b[d l ä¶ ä bJdb]ã xB idã Ee ¶ * o r ä * r?T báã B idb]ã E l ä¶ ä * g¶ ¶ o r ä h r?T xB idã E %'6#BãÇC¹¼ºidEãÇC¹¼ºiRd BDbJ¸JE

/JaJ'6#&\`?ZY&++7,/JL¹8&0<+P!4= ?'6LP+=D!$-"?zY&+ ),!$L¹/4'6#GF_"+;/J#-.+7_-.'<?+?ZY&'<%'6#?(+-LP%!4=?ZY&+`%Z/JL¹8&0<'6#&\=-.+7ï,5&+#&;7^ @$éP#&!$-L¹/10*0<^ºï5&![?+7)Á'6#î+-.?ÿ-/4?ZY&+-?ZY,/J#º-/4),'6/J#&%¹8,+-%+7;Jk Bê E T xB i@$é E %'6#KBDd|BD@$éFêK¼eiRE(Ed|BD@$éF꼺iET xB i@$é E.%'6#&;,BD@$éeêK¼eiRE BDJvJE·Jv


ë ëì ìí íî î vû@ï ï

ðë ë ì ìí í î îë ë ì ìí í î îë ë ì ìí í î îë ë ì ìí í î îë ë ì ìí í î î

ð

ð

ñòó

ñ ó

ñ ó

û@ï ô ï

ôû@ï ï

° ñ ó

ñõ]ó

ñö^ó

÷[øù(ú -Fû.ü^ý å õ]þ?ÿ ø.ø û ÿ ø¶ù þ^õ ý ñõ]ó û ò - ú ñ>ó×ñö^ó û ò - ú ñ%óñòódÿ û õ û--Fû ! þ û -"-Fû ò þ - ú ò eø þ"# û%$'&(*)+-,/.- õ û]ý õ% øù þ^õ_ö^õ]þ?ÿ ø.ø û ÿ ö10 ã ñ2 û- 43 óRòõ]þ ö û ú þ ø65^ú û 0ÿ û û- cø þ û ÿ"ö107'õ ø þ ù õ õ - õ û 8:9 é<;>=áã "# û .ø þ ø?.ú 'õ ø þ ù - õ û9 éA@B=]ã ø !# ûDCFEGH IJ¦êO- õ û "# û 'õ ø þ ùK!# û -Fû æø L M û eø û _òõ86 û ÿ !# û$A&N(O)M+-,. !# û -Fû

PRQSPT UFVSWXYWZ[ß@þ -Fû õ8 ø 0 ø ø qþ [ ø ö û 4 ö ú?ø ÿ !# û õ]þ^õ ]ù(ú û û-]\ #?ø ò # \ (ú ÿ # õO^ û _# û û- û ò -Fû ! þ û-Fû 5^ú?ø - û ÿ 4 õ$ò #?ø û ^ û !# ûa` 0 5^ú?ø - õ û ñ -Fû ! þ û # \ þL ù(ú -Fû×ü ñòóeó÷[øù(ú -Fûcb ÿ û d þ - õ û !# û - ö û æøef_# û 'õ ø þ ù - õ û ø [M \ - õ ùáø ^ û þ û- ïû # õ$ÿõ4 ú û ÿ õK øù þ^õ7ö^õ]þ?ÿ ø.ø û ÿ 4 ã õ]þ?ÿ<'õ û ÿ õ ` 0 5Mú?ø - õ ûg=]ãBh ö úa!# û ø¶ù þ^õiñ - õÎö^õ$ÿ0i û-Fû ÿK^ û- ø þ !# û øù þ^õ*ó # õTþ þ1 3 û- -Fû 5Mú û þ?ò0ò M þ û þ õ L -Fû 5^ú û þ?ò ø û #?ø¶ù# û- !# õ]þ ï \ #?ø ò #%!# û û- ø ÿ ø ò ø6 0 !# û - õ]þ - kj ñl^óòõ ú û 4 ö û õ$ÿ^ÿ û ÿ ø þ7@þm MMnáø þ ùo þ60 ø þ !# û ø þ û- ^$õ


[Figure: the sinc function, y = sinc(x) = sin(πx)/(πx), plotted on (−8, 8).]

0

0.2

0.4

0.6

0.8

1

1.2

0-W W

÷[øù(ú - û:b^ý]x ø õ ø þ ù û_yXû ò ûz õ ûü=


ñ| ã<'ã ó ø õ û õ - !# õ g!# û õ ø # õO^ û ö ûû þ ~ ÿ û ÿ ^ û- õ]ö ?ú @ ã õ]þ?ÿ @"ã \ ø6_##?øùM# û- -Fû 5^ú û þ?ò ø û Tö û ø þ ù -Fû_%û ò û ÿ õ8 \Hû- -Fû 5Mú û þ?ò ø û *" õO^ ]ø ÿ !#?ø c - ö û h ø.ø þ - õ 4 õ ø?k4 2õ ûD\Hû iõáö ^ û !# û` 0 5Mú?ø - õ û ~ -Hûz õ û õáþ?ÿ%õ - ÿ qM ö Ag ú û ÿ ø þ !# û # þ û 40 û õ ø?4ö^õ]þ?ÿ ø.øA!# û õáþ^õ ]ù(ú û ø þ ú ø¶ù þ^õ 4 õáö (úa n 2 3 õ]þ?ÿ2õ û õ an 2 3 z õ û ×õ86 õ - ø û ø þ ø? õ ù û ÿ ø¶ùáø6ø63 õ eø þ \ # û- û ! ^õ eø õ -Fû 5^ú û þ?ò ø û #?øùM# û- !# õ]þ!# û-Fû úeø þ !# û eòõáþ^þ û- - ^ ø ÿ û òõ û- õ.òõ ú û õ ø õ ø þ ù û_yXû ò O g4]RgA [iR¡¢g£*¤]¥

ïû # õO^ û ûû þ # \¦\Hû òõ]þRÿ û eò - ø ö û õg û- ø ÿ ø òd øù þ^õXö10õ_ÿ ø eò - û û û â÷§(ú - ø û-ò û_¨ ò ø û þ _õ]þ?ÿò þ1^ û- û 60 # \ õÿ ø eò -Fû û û 8 ]ø þ Èòõ]þö û ú û ÿ 4 ÿ û eò - ø ö ûõÎö^õ]þ?ÿ ø.ø û ÿ ø¶ù þ^õ"Hø û 4 ò M û [ û õ - !# \Hû # õ[ò þ?ò û- þ (ú - û ^ û \ ø6_# ö^õáþ?ÿ ø.ø û ÿ© û- ø ÿ ø ò øù þ^õ×õáþ?ÿò þ ø ÿ û- !# ûLªA+,«*¬|­*.­®R¯)M¬_+°­*¬d±¬³²1´,°µ/¯¬_¶ õ !#?ø û þ?ÿ ø û 4 ò M ú õ ø6 þ ÷àú - !# û- · -Fû !# û ÷¸" ø .õ õ û þ^õ]ö û [ û_¨ ò ø û þ ø û û þ õ ø6 þõ _# ûd® ²8,/.]®R¯)¬!+°­*¬±¬|²´,°µ/¯1¬!¶

T¹QSP Ѻ¼»ZW½W¾RZY

þ ø ÿ û- õ¿ÿ%õ õi û 5Mú û þ?ò ûF¿ sÀi@¿ Á  OÃ*ÃOà ÄaÅ À ÷§ -_ûz õ û !# û û ò (ú ÿ-Fû -Fû û þ _# û ^0õ ú û A'õ û ÿ - M õ]þõ]þ^õ ]ù(ú û øù þ^õ J ñ ê ó \ ø!# sg@ÆJ ñÇ§È é ó"# û ø eò -Fû û ÷ ?ú - ø û- " - õ]þ - ø ý

jdÉ @ÄÅÂÊsË Á

sMÌ ÅÍÎ_ÏÐ É s h ñl @"ôOp*ÃOÃÃ|!C p ó ñ p óõ]þ?ÿ ø ø þ^ û- û]ý

sd@ pC

ÄÅÂÊÉ Ë Á

j:É Ì ÍÎ_ÏÐ É s h ñÇ @"ôOp*ÃOÃó_C p ó ñ = ó_þ û õÑ - ^õ ø þ !# û ò þ eø þ ú(ú ÷§(ú - ø û- " - õ]þ - õ]þ?ÿc û- ø û qþ \ ÿ ø 'õ û õ - h!# û-Fû ø 7þ :5^ú û eø þ 8 4 ø ö û ò þ^ û- ù û þ?ò û - ö û \ ø!# ø.ø ×õ8 _# û û ú õ -Fû õ86 £þ ø6 û T¹QT ÒcÓ1¾RÔº Ó8½8WSº Y

"# û - û- eø û !# û ÷¸"Æ.ø.ø ò _# û ò þ ø þ ú(ú ^ û- ø þ ýp 0 c û - 0õ]þ?ÿ -Fû õ ø 0

ü

Page 57: Information Theory and Coding - University of Cambridge · 2008-10-20 · Information Theory and Coding Computer Science Tripos Part II, Michaelmas Term 11 Lectures by J G Daugman

Õ øÖ] s - û õ !# û þ j Å¼É @ jd×ÉÕ õ s ø a û- ø6 ÿ ø ò h j t ÄÙØ[Ú w Å É @ j:×t ÄÙØ4Ú wÜÛ É

= Ý ø þ û õ - ø 0 Þ s'ß7àá¼s # õ8 ÷¸" Þ jdÉ ß©àâ ÉÈø þ !# û ö^ ø(ú õ]þ^þ û- #?øÖeø þ ù ö û- ^ û õ% #?øe~sø þ !# û s ^0õ ú û ø - û õ60õ - õ eø þ ÷ - !# û- õ û ÿD û 5^ú û þ?ò û s Å sOã !# û ÷¸"Ýø j:É Ì Å Ú u*r É s*ã ØÄ

"# û-Fû ø ×õ86 c!# û ^õ - õ û 8 ò þ1^ úeø þ "# ûc«O+ܬ|«O)M䲬A«_¯´ åO¯ä)M.+°¯´ û 5Mú û þ?ò û s õ]þ?ÿ ás ø ¢ÿ û £þ û ÿö10 ý

EOsc@ÄÅÂÊæ Ë Á

æ ás Å æ h ñÇ @"ôOp*ÃOÃÃ~C p ó"# û ÷¸"¦8 E s ø !# û þ ý

çÉ @ÄÅèÂÊsË Á

EOsMÌ Å ÍÎ_ÏÐ É s

@ÄÅèÂÊsË Á

ÄÅèÂÊæ Ë Á

æ ás Å æ ÌÅ ÍÎ!ÏÐ É s

@ÄÅèÂÊæ Ë Á

æ ÌÅ ÍÎ_ÏÐ É æ

ÄÅèÂÊæ Ë Á

á s Å æ ÌÅ ÍÎ_ÏÐ É ts Å æ w

@ j É â É ñ óT¹Q°é êXYO½ëê¾RìÓWº§ÓíÓXZ¸Yïî!¾RÓð

x ø ø eø ò ø û û þ õ eø þ 8!# û ÷¸" \ (ú ÿ -Fû 5^ú?ø -FûiC Ú ò M ûz .ú eø ø òõ eø þ_õ]þ?ÿ C ñ C p óHò M ûz õ$ÿ^ÿ øeø þ*2 \Hû ^ û- !# û 40 c û - 0 8¸!# û ÷¸"òõ]þö ûÎûz áø6 û ÿ õ]þ?ÿõsÿ ø ^ ø ÿ û õ]þ?ÿò þ 5Mú û- - õ û ù 0i û õ$ÿ 4i!# û ÷ õ _÷§(ú - ø û-" - õ]þ - \ # û þ C ø ×õñ \Hû- = ÷ - ø ø ò ø 0 \Hû\O- ø ûóòô@ÆÌ Å¼Ú u*r ØÄ h !# û þ _# û ÷¸"õ - õáþ û ^ û þ.þ ú ö û-Cö@=÷hö û ò û ý

jdÉ @Ú|ø§ÅÂÊsË Á

sò s É h l @ ô*pOÃ*ÃÃ=÷ p ñ ó@

ø§ÅÂÊsNË Á

sò s É ßÚ|ø§ÅÂÊsNË ø

sò s É

@ø§ÅÂÊsNË Á

ñ saß sÛ ø ò É4ø ó ò É s@

ø§ÅÂÊsNË Á

ñ saß sÛ ø ñ| p ó É ó ò É s ñ ü óü


[Figure: signal-flow diagram splitting an N-point DFT into two N/2-point DFTs (decimation in time), with inputs x_0...x_7, outputs X_0...X_7 and twiddle factors w_0...w_3.]

÷[øù(ú -Fû ý ø ^ ø ø þ ó ³ ]ø þ ×ø þ 4c \ ³ ]ø þ ÷¸" õ h¼ò ø @ ph # û þ?ò ûdò É[ø @ ñ| p ó É ïû òõáþ _# û þ ø? 60F û ^õ - õ û (ú û ^ û þõ]þ?ÿ ÿ^ÿ û- 8ÙjdÉ õ]þ?ÿ \iû ö õ ø þ ý

jdÚ @ø§ÅÂÊsNË Á

ñ saß sNÛ ø óñ ò Ú ó s h@"ô*p1*Ã*ÃÃ[!÷ p

j Ú Û Â @ø§ÅÂÊsNË Á

ñeñ s sNÛ ø ó ò s óñ ò Ú ó s h@"ô¼*pOÃ*ÃÃ|!÷ p

` eø ò û !# õ d!# û û õ -Fû \ ÷ ³ ]ø þ ÷¸" A!# û û 5^ú û þ?ò û ¿ s:ß sNÛ ø À õ]þ?ÿ¿ ñ s sÛ ø ó ò s À C ø _õd \Hû- =h !# û þ \Hû òõ]þsÿ û ò û ø þ !#?ø õ]þ^þ û- áù Ú C eø û ú þ eø \Hû ö õ ø þ C ø þ ù û ]ø þ - õáþ ~ - *"# û ÿ ø õ ù - õ |ø þ ù(ú -Fû # \ !# û ÿ ø ^ ø ø þ õáþ ³ ]ø þ ÷¸"Ýø þ 4d \ ³ ]ø þ ÷¸" * û ò ú - ø ^ û ÿ ø ^ ø ø6 þ ø þ !#?ø õ]þ^þ û--Fû ú ø þ _# û ÷ õ H÷§(ú - ø û- " - õ]þ - ñ ÷[÷¸" óï÷[øù(ú -Fû # \ !# ûÊ- û û eøeø ^ û ^õ 4 û- þ - û ÿuö û \Hûû þò û- õ ø þK ^õ ø - ]ø þ !#?ø )..­*¬& ^õ 4 û- þ #?øùM# øùM#1 a M û û õ 'ú -Fû 8Ù!# û ÷[÷¸" ý


Stage 1 Stage 2 Stage 3÷[øù(ú -Fû^ýx þ ûz õ û )..­*¬f& ^õ 4 û- þõ û õ$ò #i!# û _# -Fûû õ ù û ø # \ þ ø þö ¶ÿp qõ$ò # ö ú4 û-³ 0 - û 5Mú?ø -Fû þ û ò ûz .ú ø ø òõ eø þÝõ]þ?ÿ \ õ$ÿ^ÿ øeø þ # û þ?ò û ÷[÷¸" -Fû 5^ú?ø -Fû ñ C = ó¼ ]ù Ú C ò M ûz .ú eø ø òõ eø þ õáþ?ÿ C MÚ Cò M ûz õ$ÿ^ÿ øeø þ \Hû # õO^ ûÁ-Fû ÿ ú ò û ÿ !# û ñ C Ú ó ÷¸" - ò û 4 4 õ]þ ñ C ]ù Ú C ó þ û = x û õò #ø û- õ eø þ \ ø!#?ø þ û õò # õ ù ûh\Hû þ ûû ÿ"ò þ ø ÿ û- þ0 \ ø þ úò û_¨ ò ø û þ \ #?ø ò #ù û þ û- õ û \ i(ú ú ò û_¨ ò ø û þ øe \Hû\Hû-Fû 4 4 -Fû

!# û ò û_¨ ò ø û þ ø þÎõ]þõ -- õO0 h !# û \ ?ú ú [òõ]þ òò ú 0 !# û 'õ û òõ eø þõ !# û \ ø þ ú .ñ !#?ø ò (ú - û ÿ û - 0 !# û ø þ ú7ø þ !# û - ò û 4'ó¸ó^ û þøÖ \Hû 4 -Fû !# û (ú ú_ø þõ]þ !# û- õ -- õ*0 _#?ø õ8 ù1 - ø!#ø A eø 6 þ60 ñ C óø þ ! ^õ$ò û û- * "R £þ?ÿ !# û òõ ø6 þ 8¸j:Éø þ _# û ÷[÷¸"¦(ú ú õ -(- õO0 h õ n û lõ×õ]þö ø þ^õ - 0þ ú ö û- ]ù Ú C ö ø6 hU-Fû ^ û- û !# û õ]þ?ÿ -Fû õ õ ø þ?ÿ ûz ø þ 4 õ -(- õO0

T¹Q OZ¹º§ÓYºÑ%êAí êAêAí

"# û Iþ^ û- û ÷¸"Ýø ùáø ^ û þuö10 ý sd@ p

CÄÅÂÊÉ Ë Á

j:É Ì ÍÎ_ÏÐ É s h ñÇ @"ôOp*ÃOÃó_C p ó"#?ø Èòõ]þö ûP-Fû\O- ø4 û þõ8 ý

C ×s @ÄÅÂÊÉ Ë Á

j ×É ò É s h ñÇ @"ôOp*ÃOÃÃ|!C p ó"#?ø ø ¤ ûû þ 4 ö û õL ÷¸"¦¹_# û ò M ûz ò þOÑ ú?ù õ û 8¹!# û ÷ ?ú - ø û- ò û_¨ ò ø û þ !#?ú !# û ÷[÷¸" òõ]þö û ú û ÿõ8 õ]þ ø þ1^ û- û ÷¸" õ û- õ - - ø õ û õ84'õ ù]ø þ ùi!# û ò û_¨ ò ø û þ þ ø þ ú õ]þ?ÿ ?ú ú

üq


T¹Q ¾Óº"!Wðƺ§Z¸YW¾Z¸Y

ï # û þ û- õ eø þ ùd þ ø õ ù û h\Hû ~ û þ \ ø #[ ò þ ø ÿ û- !# û \ ÿ ø û þ ø þ^õ^ û- ø þ ¹!# û ÷ ?ú - ø û- û- ø û $# " - õ]þ - #g ø eò - û û " - õ]þ ~ - ÷§ - !# û ÷¸"m!# õ û õ]þxÿ û eò - ø ö ø þ ù õ dú þ?ò ø6 þ \ ^0õ - ø õáö û 9 ñ !E óàõàò M þ û þ Ì Å Ú uOr°t6sOv ØÄ Û&%(' Ø) w ý

*¹É,+ @ p- C)iÅÂÊ%Ë Á

ÄÅÂÊsNË Á 9

% + s Ì Å Ú u*r°t.0/1 Û3254Ð w÷[øù(ú -Fûpô # \ a û ¸_# û ò ø þ û ö^õ ø @ú þ?ò eø þ* ÷ - õc û 5Mú û þ?ò û qø õ ù û h\Hû .øùM#1 ò þ ø ÿ û- ÿ ø û þ ø þñ \ ! ^õò ûh þ û ø? û ó

cos(x)

01

23

45

6 01

23

45

6

-0.5

0

0.5

1

cos(y)

01

23

45

6 01

23

45

6

-0.5

0

0.5

1

cos(x)*cos(y)

01

23

45

6 01

23

45

6

-0.5

0

0.5

1

÷[øù(ú -Fûcpô^ý= ¯ò ø þ û ö^õ8 ø dú þ?ò ø6 þ"# û ÷[÷¸" õ ù - ø!# õ1 ø û û 5Mú õ860 \Hû ø þ #?øùM# û- ÿ ø û þ ø þ - ÿ ø? û þ ø þ76 h\Hû ö õ ø þõ]þ ñ C78 áù C óHõ ù - ø6_# - õ !# û- !# õ]þ !# û ñ C78 Û Â ó9 4]RiR¡¢g£*¤]¥ ¡¢;: g ¡¢g_¥ ¡=<[ïû # õO^ û .øù - õ û ÿ - Mæ÷§(ú - ø û- û- ø û Îõ]þ?ÿ " - õ]þ - ò þ eø þ ú(ú g_ ^õ$ò û -eø û @ú þ?ò eø þ \ #?ø ò # ûz -Fû 4 øù þ^õ ø þ -Fû 5^ú û þ?ò0_ ^õ$ò û 4 ÿ ø eò -Fû û - õ]þ - ÿ ø ò -Fû û dú þ?ò eø þO¸2 \Hû ^ û-h\Hû òõ]þ - û òõ !# û ÷¸"ø þÎõ d -Fû ù û þ û- õ - õ û \ - n ï - náø þ ùÎø þ \ ÿ ø û þ ø6 þ h \HûA\O- ø6 û?> 9A@R4 -Fû - û û þ ¤!# û õ - ø zàý

> 9A@ @BCCCCD

9*Á,+ Á 9OÁE+ Â FGFHF 9OÁE+ ÄÅÂ9ÂI+ Á 91Â+ Â FGFHF 91Â+ ÄÅÂ

9 )iÅÂ+ Á 9 )iÅÂ+ Â FGFHFB9 )iÅÂ+ ÄÅÂ

JLKKKKM

!# û þ \O- ø6ø þ ù > @ Åè - !# û ø þ1^ û- û 8a!# û-Fû û ^0õ]þ D õ - ø zèh _# û ÷¸" ^õ ø - ø !# û ö1^ ø(ú õ - ø z - ÿ ú ò ý

> *N@ @O>QP ) + )@ > 9R@ >QP Ä3+ Ä@> 9A@ @O>QP )S+ ) @ Å > *N@ >QP Ä3+ Ä @ ÅèÂ

üb


\ # û-Fû >TP=U + U @Xø !# û;VW7V õ - ø z\ ø!# û û û þ ñX Çàó ù]ø ^ û þö0 ýpV Ì Å ÍÎ_ÏY %¸s h X Ç @"ô*p1*Ã*ÃOÃ|ZV p

@þ ù û þ û- õ !# û þ - þ þ1 ø þ ù(ú *õ - õ - ø ò û >Q[ @ õ]þ?ÿ >]\ @ h\Hû òõ]þò þ ø ÿ û- !# ûù û þ û- õ8 ø û ÿÿ ø eò -Fû û - õáþ ~ - ^õ ø - ù]ø ^ û þuö10 ý> *N@ @^>Q[ @ > 9A@ >]\ @

> 9A@ @O>Q[ @ Åè > *N@ >]\ @ ÅèÂx - øùM#1R!#?ø ] 0M û -Fû û þ õ eø þ - õ = dú þ?ò ø6 þ \ (ú ÿõ û õ - 4Hø þ?ÿ ø òõ û!# õ \Hû .ú ö û ÿ û õ ø þ ù \ ø!# õ]þ ñ C û ói - ò û 4 4 û ^$õ ú õ û _# û - õ]þ - õ - ø zèh ö úâø þ õ]þ10Èòõ û Mø þ û-Fû ñõ \iû # õO^ û ûû þ !# û ÷[÷¸" ó ú û 40 c û - 0õ]þ?ÿ - !#]ù þ^õ8 ø 0 -Fû *õ eø þA - ^ ø ÿ û ×õ · -Fû_û_¨ ò ø û þ õ ù - ø!#

éQSP ísиº_WYa`Óº¼½ºbD¾RYWZº7íaÓXZYî!¾RÓ1ð

"# û g ""ø þ \Æ\ ø ÿ û 0 ú û ÿ ø þ ø õ ù û ò M -Fû 4 ø þñced¸gf õ]þ?ÿ7hid(f kjA|@ó õ]þ ø õ ù û ø Îòõ - ^ û ÿ ú ø þ 4 õ86 5^ú õ -Fû eø û ClWoC ø z?û õ]þ?ÿ !# û g "õ ø û ÿ 4 û õ$ò #eø û ñõ û- ú.ø þ^õ]þ?ò û #ò # - .ø þ^õ]þ?ò û û ^õ - õ ø6 þ^ó å 0õ - - ø õ û5^ú õ]þ ø63 õ eø þõ]þ?ÿ û þ?ò ÿ ø þ ùK_# û ò û_¨ ò ø û þ õ #?øùM# 0ò M -Fû [ û ÿ ~ - !# ûø õ ù û òõáþuö û ù û þ û- õ û ÿñþ û !# õ _#?ø ø ×þ ×ú ú õ60õL 4 -Fûû - ò û 4'óï

[Figure: DCT and DFT reconstructions compared with the original signal.]

÷[øù(ú -Fûcpp]ý g " ^è ÷¸" 4 õ*^ ]ø ÿÖö ò náø þ ù õ - eøe õ$ò - ø¶ùáø þ^õ8R øù þ^õ ø - õ]þ?ÿ M"# û ø û ø ûz û þ?ÿ û ÿ ø þõ[0 c û - ø ò õ]þ^þ û- 4 - ^ ø ÿ û õ dú þ?ò eø þ \ #?ø ò # -Fû û õ ø þ ö !# õáþ?ÿ E ò - ÿ ø þ^õ û \ ø!# û- ø ÿ =C "¤#?ø ñ û õÿ 4F þ0 !# û ò ø þ û

ü


û- Tö û ø þ ùù û þ û- õ û ÿö10 !# û ÷¸" "¤# û ø - õáþ - û- 0 !#?ø ø !# õ õ !# ûcCmWC ø z?û eø ûÎû ÿ ù û h _# û g " û þ ú -Fû _# õ '9 ñ Å ó @ 9 ñ Û óï è ø þ ùK!# û ÷¸" þ !# û - øù]ø þ^õ eø û û õ$ÿ [ ä¯O«EnN+°´po²¬_.+ µ²«*.°,h\ # û-Fû õ û- ø þ1^ û- ø6 þñ øe \Hû# õO^ û õ - z ø? õ û ÿ !# û #?ø¶ù#F -Fû 5^ú û þ?ò0 ò þ û þ 'ó h !# û ø z?û eø ûsû ÿ ù û 'ú - þ(úd4# õO^ û !# û û õ]þ<^$õ ú û a_# û ø z?û Öõ :_# ûû þ?ÿ û õ$ò # - \ - ò ú þ÷[øù(ú -Fûcpp # \ !# û_û_yXû ò éQT ísиºqoXr!XèðXèÓs! ¾RÓutöXVSY0Ð íÓXZ¸Yïî!¾RÓð

x ñ = ó3v ²pw²1¶:²1¬xw õ - ø zèhr>Öâ U + U @ h ø ×õL40 c û - ø ò VyWzV õ - ø zD\ # ûÈû û û þ õ -Fû¢û ø6_# û-(dp õáþ?ÿ \ # û-Fû _# ûO- \ ×ñõ]þ?ÿ # û þ?ò û þ û ò û 42õ - ø 0Îò ú þ'óHõ -Fû .ú'ú õ0 - _#]ù þ^õ ÷ -Hûz õ û]ý

BCCCDp p p pp p p pp p p pp p p p

JLKKKM

x \ ø!#<!# û ÷¸" h !# û 2Èõ$ÿ%õ õ - ÿ - õáþ ~ - ># õõ µ/²,. ^ û- ø þ h õ # õO^ û 2Èõ]õ - õ - ø ò û ×ñ \ ø!# ò û_¨ ò ø û þ ô5dp ó ` û !# õ ]!# û - õ]þ - õ ø6 þ # û-Fû õ - û M û \ # õ ø û- 4 ò M ú ûA\ ø!# !# õ]þ !# û ò ûz û- 8¹!# û ÷¸" éQ°é |Ó½]и¾Z¸¾RÓðXèVî[ìZ(`½W¾Z¸Y@þ ù û þ û- õ \Hû õ -Fû Mnáø þ ùg - õ û - !# - þ - õ dú þ?ò eø þ_ñ \Hû # õ*^ û ú û ÿ: ø þ ûhò ø þ ûh ò ûzÎûz û þ eø õ h õáþ?ÿc£þ^õ0i2×õ$ÿ%õ õ - ÿ%ó \ ø!# \ #?ø ò #4 -Fû -Fû û þ õ@ú þ?ò eø þ"# û.ûz õ û ù]ø ^ û þK ñ õ - õ -Fû õ86qõ -Fû eò - ø ö û ÿ û @ú þ?ò eø þ ø þ?ÿ û û þ?ÿ%õ]þ g!# û dú þ?ò ø6 þ \ #?ø ò # \Hû õ - û - 0 ø þ ù[ - û -Fû û þ # \Hû ^ û-h ø ø Îõ 4 ø ö û 4 û û ò - _# þ - õ8(ö^õ ø @ú þ?ò eø þ \ #?ø ò # õ - û eø.ø û ÿ ~ - õa ^õ - eø ò ú fõ - dú þ?ò eø þ h - · -Fû ø þ û-Fû eø þ ù 0 õ õ .ø 0 R@ú þ?ò eø þ*"# ûA õ - #?ú þ û þ1³Ý û ^ û _# û -Fû ÿ û eò - ø ö û ú ò # õ û ò # þ ø65^ú û ø þ \ #?ø ò # ­O+Qo­*´§å*­_«*.¯¬[,~ ¶:²1.¬!+°«!­ï,k ¸!# û õ ú[ ò -(-Fû *õ eø þ @ú þ?ò eø þõ -Fû !# û ö^õ ø *A òõ]þö û # \ þ !# õ !#?ø .ÿ û ò M øeø þ ø !# û ö û õ$ò #?ø û ^0õ]ö û 2 \iû ^ û- !# û ò ú õ eø þ !# û ûû øù û þ1^ û ò 4 - ø ûz û þ ø ^ ûh ñ C ü ó - õ = ø õ ù ûh õ !#(ú?ù#K õ8 û- ^ û- ø þ.ÿ ûz ø ö^õ û ÿ þ ÷[÷¸" _# û ÿ û- ø ^$õ eø þ 8 \ #?ø ò #ø þ L - !# û 5Mú û õ cø # ñ° ûû û þ û ÿõ]þ?ÿ õ n h øù]ø õd ø ò 'ú -Fû d - ò û 4 ø þ ù óéQ bD¾RZ¹¾RVSì½W¾Zhõ]þ10 - ò û 4 û ø þ øù þ^õ - û ò þ øù]øeø þ ø þ1^ ^ û !# û ú û ò þ1^ úeø þ

üp


G& þ ø ÿ û- !# û - ö û 8 û ÿ ù û ÿ û û ò eø þ ø þõ = ø õ ù ûh\ # û-Fû !# ûÎû ÿ ù û ø .þ þ û ò û 4'õ - ø 60L^ û- eø òõ - # - ø3 þ õ "#?ø ø þ1^ ^ û [õHò þ ø ÿ û- õ ø6 þ !# û ù - õÿ ø û þ R!# û @ú þ?ò eø þ 9 ñ _E ó¡õ þ ù ^õ !# þ !# û ú - õ$ò û !# õ [ø !# û ^ û ò 4 - ¹ õ ù þ ø'ú ÿ ûh ñ 9 %ó Ú ß ñ 9 E ó Ú h õáþ?ÿuÿ ø -Fû ò eø þ õ]þ ÅÂ ñeñ 9 E ó ñ 9 %óFó

[Figure: 1D edge detection by convolution; a step-like function and its gradient.]

¿ pOp'ô2ôÃÃÃ À prïû õ -Fû ø þ û-Fû û ÿ ø þ -Fû ò þ - ú ò eø þ ù õ = ø? õ ù û ñ \ #?ø ò #F# õ$ÿ ø þ øeø õ ø z?û ^0õ8 ú û 9 ñ _E óó h - M ö ú --Fû ÿ ø? õ ù û ö û- ^ û ÿ M û eø û fõ û- !# û ö ú --Fû ÿ ø õ ù û õ eø ûê ø ñ !E§eê ó ïû .øùM#id ÿ û _# û ö ú - ø þ ù7 õ]þ ø? õ ù û õ8Öõÿ ø y ú ø6 þ - ò û 4 ^ û- ø? û øÖ \Hû þ ûû ÿ 4 ò þ ø ÿ û- !# û úeø þ 4!# ûK\iû n þ \ ^õ - ø õ8Xÿ ø yKû- û þ eø õ û 5Mú õ ø6 þ ø þ^ ^ ø þ ù!# û ÝXõ *õ$ò ø õ]þ ý

Ú è Ú ß Ú E Ú @ pl ê

q]ô


[Figure: example data plots 'P0903A.DAT' and 'P0903B.DAT'.]


9 Quantized Degrees-of-Freedom in a Continuous Signal

We have now encountered several theorems expressing the idea that even though a signal is continuous and dense in time (i.e. the value of the signal is defined at each real-valued moment in time), nevertheless a finite and countable set of discrete numbers suffices to describe it completely, and thus to reconstruct it, provided that its frequency bandwidth is limited.

Such theorems may seem counter-intuitive at first: How could a finite sequence of numbers, at discrete intervals, capture exhaustively the continuous and uncountable stream of numbers that represent all the values taken by a signal over some interval of time?

In general terms, the reason is that bandlimited continuous functions are not as free to vary as they might at first seem. Consequently, specifying their values at only certain points suffices to determine their values at all other points.

Three examples that we have already seen are:

• Nyquist’s Sampling Theorem: If a signal f(x) is strictly bandlimited so that it contains no frequency components higher than W, i.e. its Fourier Transform F(k) satisfies the condition

F (k) = 0 for |k| > W (1)

then f(x) is completely determined just by sampling its values at a rate of at least 2W. The signal f(x) can be exactly recovered by using each sampled value to fix the amplitude of a sinc(x) function,

sinc(x) = sin(πx) / (πx) (2)

whose width is scaled by the bandwidth parameter W and whose location corresponds to each of the sample points. The continuous signal f(x) can be perfectly recovered from its discrete samples fn(nπ/W) just by adding all of those displaced sinc(x) functions together, with their amplitudes equal to the samples taken:

f(x) = Σn fn(nπ/W) · sin(Wx − nπ) / (Wx − nπ) (3)

Thus we see that any signal that is limited in its bandwidth to W, during some duration T, has at most 2WT degrees-of-freedom. It can be completely specified by just 2WT real numbers (Nyquist, 1911; R V Hartley, 1928). (A small numerical sketch of this sinc-series reconstruction appears after these three examples.)

• Logan’s Theorem: If a signal f(x) is strictly bandlimited to one octave or less, so that the highest frequency component it contains is no greater than twice the lowest frequency component it contains

kmax ≤ 2kmin (4)

i.e. F(k), the Fourier Transform of f(x), obeys

F (|k| > kmax = 2kmin) = 0 (5)

and F(|k| < kmin) = 0 (6)

and if it is also true that the signal f(x) contains no complex zeroes in common with its Hilbert Transform (too complicated to explain here, but this constraint serves to exclude families of signals which are merely amplitude-modulated versions of each other), then the original signal f(x) can be perfectly recovered (up to an amplitude scale constant) merely from knowledge of the set {xi} of zero-crossings of f(x) alone:

{xi} such that f(xi) = 0 (7)

Comments:

(1) This is a very complicated, surprising, and recent result (W F Logan, 1977).

(2) Only an existence theorem has been proven. There is so far no stable constructive algorithm for actually making this work – i.e. no known procedure that can actually recover f(x) in all cases, within a scale factor, from the mere knowledge of its zero-crossings f(x) = 0; only the existence of such algorithms is proven.

(3) The “Hilbert Transform” constraint (where the Hilbert Transform of a signal is obtained by convolving it with a hyperbola, h(x) = 1/x, or equivalently by shifting the phase of the positive frequency components of the signal f(x) by +π/2 and shifting the phase of its negative frequency components by −π/2), serves to exclude ensembles of signals such as a(x) sin(ωx) where a(x) is a purely positive function a(x) > 0. Clearly a(x) modulates the amplitudes of such signals, but it could not change any of their zero-crossings, which would always still occur at x = 0, π/ω, 2π/ω, 3π/ω, ..., and so such signals could not be uniquely represented by their zero-crossings.

(4) It is very difficult to see how to generalize Logan’s Theorem to two-dimensional signals (such as images). In part this is because the zero-crossings of two-dimensional functions are non-denumerable (uncountable): they form continuous “snakes,” rather than a discrete and countable set of points. Also, it is not clear whether the one-octave bandlimiting constraint should be isotropic (the same in all directions), in which case the projection of the signal’s spectrum onto either frequency axis is really low-pass rather than bandpass; or anisotropic, in which case the projection onto both frequency axes may be strictly bandpass but the different directions are treated differently.

(5) Logan’s Theorem has been proposed as a significant part of a “brain theory” by David Marr and Tomaso Poggio, for how the brain’s visual cortex processes and interprets retinal image information. The zero-crossings of bandpass-filtered retinal images constitute edge information within the image.

• The Information Diagram: The Similarity Theorem of Fourier Analysis asserts that if a function becomes narrower in one domain by a factor a, it necessarily becomes broader by the same factor a in the other domain:

f(x) −→ F (k) (8)

f(ax) −→ (1/|a|) F(k/a) (9)

The Hungarian Nobel-Laureate Dennis Gabor took this principle further with great insight and with implications that are still revolutionizing the field of signal processing (based upon wavelets), by noting that an Information Diagram representation of signals in a plane defined by the axes of time and frequency is fundamentally quantized. There is an irreducible, minimal area that any signal can possibly occupy in this plane. Its uncertainty (or spread) in frequency, times its uncertainty (or duration) in time, has an inescapable lower bound.
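As noted under the Sampling Theorem above, here is a minimal numerical sketch of the sinc-series reconstruction of equation (3), assuming NumPy; the bandwidth W, the test signal, and the range of sample indices are arbitrary illustrative choices.

import numpy as np

W = 8.0                                   # bandwidth: highest angular frequency present
def f(x):                                 # a test signal bandlimited below W (components at 3 and 6)
    return np.sin(3.0 * x) + 0.5 * np.cos(6.0 * x)

n = np.arange(-200, 201)                  # sample indices (wide enough to limit truncation error)
xn = n * np.pi / W                        # sampling points x = n*pi/W
fn = f(xn)                                # the discrete samples fn

x = np.linspace(-5.0, 5.0, 1001)          # fine grid on which to reconstruct
# Equation (3): a sum of displaced sinc functions weighted by the samples.
# np.sinc(t) = sin(pi t)/(pi t), so sinc((Wx - n*pi)/pi) = sin(Wx - n*pi)/(Wx - n*pi).
kernel = np.sinc((W * x[:, None] - n[None, :] * np.pi) / np.pi)
f_rec = kernel @ fn

# The error is small, limited only by truncating the infinite sample series.
print("max reconstruction error:", np.max(np.abs(f_rec - f(x))))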

63

Page 67: Information Theory and Coding - University of Cambridge · 2008-10-20 · Information Theory and Coding Computer Science Tripos Part II, Michaelmas Term 11 Lectures by J G Daugman

10 Gabor-Heisenberg-Weyl Uncertainty Relation. “Logons.”

10.1 The Uncertainty Principle

If we define the “effective support” of a function f(x) by its normalized variance, or the normalized second-moment (with all integrals taken from −∞ to +∞):

(∆x)² = ∫ f(x) f∗(x) (x − µ)² dx / ∫ f(x) f∗(x) dx (10)

where µ is the mean value, or normalized first-moment, of the function:

µ = ∫ x f(x) f∗(x) dx / ∫ f(x) f∗(x) dx (11)

and if we similarly define the effective support of the Fourier Transform F(k) of the function by its normalized variance in the Fourier domain:

(∆k)² = ∫ F(k) F∗(k) (k − k0)² dk / ∫ F(k) F∗(k) dk (12)

where k0 is the mean value, or normalized first-moment, of the Fourier transform F (k):

k0 = ∫ k F(k) F∗(k) dk / ∫ F(k) F∗(k) dk (13)

then it can be proven (by Schwarz Inequality arguments) that there exists a fundamental lower bound on the product of these two “spreads,” regardless of the function f(x):

(∆x)(∆k) ≥ 1/(4π) (14)

This is the famous Gabor-Heisenberg-Weyl Uncertainty Principle. Mathematically it is exactly identical to the uncertainty relation in quantum physics, where (∆x) would be interpreted as the position of an electron or other particle, and (∆k) would be interpreted as its momentum or de Broglie wavelength. We see that this is not just a property of nature, but more abstractly a property of all functions and their Fourier Transforms. It is thus a still further respect in which the information in continuous signals is quantized, since the minimal area they can occupy in the Information Diagram has an irreducible lower bound.
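A minimal numerical sketch of equations (10)–(14), assuming NumPy: the two normalized spreads are estimated on a discrete grid for a Gaussian-enveloped complex exponential (the Gabor wavelet of the next subsection), for which the product should come out essentially at the bound. Frequency is measured here in cycles per unit length, the convention under which the bound reads 1/(4π); the grid and the parameters x0, k0, a are illustrative choices.

import numpy as np

# Discrete grid: fine spacing and a range wide enough that the wavelet decays to ~0 at the ends.
dx = 0.01
x = np.arange(-20.0, 20.0, dx)

# A Gaussian envelope times a complex exponential (illustrative parameters).
x0, k0, a = 0.0, 2.0, 1.5
f = np.exp(-(x - x0)**2 / a**2) * np.exp(-1j * k0 * (x - x0))

def normalized_spread(values, weights):
    # Normalized first and second moments as in equations (10)-(13); weights = |f|^2 or |F|^2.
    w = weights / weights.sum()
    mu = np.sum(values * w)
    return np.sqrt(np.sum((values - mu)**2 * w)), mu

dX, _ = normalized_spread(x, np.abs(f)**2)

# Spectrum; fftfreq returns frequencies in cycles per unit length.
F = np.fft.fft(f)
k = np.fft.fftfreq(len(x), d=dx)
dK, _ = normalized_spread(k, np.abs(F)**2)

print("dx * dk =", dX * dK, "  lower bound 1/(4*pi) =", 1.0 / (4.0 * np.pi))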

10.2 Gabor “Logons”

Dennis Gabor named such minimal areas “logons” from the Greek word for information, or order: logos. He thus established that the Information Diagram for any continuous signal can only contain a fixed number of information “quanta.” Each such quantum constitutes an independent datum, and their total number within a region of the Information Diagram represents the number of independent degrees-of-freedom enjoyed by the signal.


The unique family of signals that actually achieve the lower bound in the Gabor-Heisenberg-Weyl Uncertainty Relation are the complex exponentials multiplied by Gaussians. These are sometimes referred to as “Gabor wavelets:”

f(x) = e^(−(x−x0)²/a²) e^(−ik0(x−x0)) (15)

localized at “epoch” x0, modulated by frequency k0, and with size or spread constant a. It is noteworthy that such wavelets have Fourier Transforms F(k) with exactly the same functional form, but with their parameters merely interchanged or inverted:

F(k) = e^(−(k−k0)²a²) e^(ix0(k−k0)) (16)

Note that in the case of a wavelet (or wave-packet) centered on x0 = 0, its Fourier Transform is simply a Gaussian centered at the modulation frequency k0, and whose size is 1/a, the reciprocal of the wavelet’s space constant.
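A small numerical sketch of this relationship, assuming NumPy: only the location of the spectral peak and the reciprocal scaling of the spectral width with the space constant a are checked, since the exact constants depend on the Fourier-transform sign and scaling convention; the parameter values are illustrative.

import numpy as np

dx = 0.005
x = np.arange(-30.0, 30.0, dx)

def gabor(x0, k0, a):
    # Gabor wavelet of equation (15).
    return np.exp(-(x - x0)**2 / a**2) * np.exp(-1j * k0 * (x - x0))

def spectral_peak_and_width(f):
    # Approximate the continuous spectrum by an FFT; k in radians per unit length.
    F = np.abs(np.fft.fft(f))
    k = 2.0 * np.pi * np.fft.fftfreq(len(f), d=dx)
    w = F**2 / np.sum(F**2)
    mean = np.sum(k * w)
    spread = np.sqrt(np.sum((k - mean)**2 * w))
    return k[np.argmax(F)], spread

k0 = 5.0
peak1, width1 = spectral_peak_and_width(gabor(0.0, k0, a=1.0))
peak2, width2 = spectral_peak_and_width(gabor(0.0, k0, a=2.0))

print("spectral peak |k| (expect k0 = 5):", abs(peak1), abs(peak2))
print("width ratio for a=1 vs a=2 (expect about 2):", width1 / width2)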

Because of the optimality of such wavelets under the Uncertainty Principle, Gabor (1946) proposed using them as an expansion basis to represent signals. In particular, he wanted them to be used in broadcast telecommunications for encoding continuous-time information. He called them the “elementary functions” for a signal. Unfortunately, because such functions are mutually non-orthogonal, it is very difficult to obtain the actual coefficients needed as weights on the elementary functions in order to expand a given signal in this basis. The first constructive method for finding such “Gabor coefficients” was developed in 1981 by the Dutch physicist Martin Bastiaans, using a dual basis and a complicated non-local infinite series.

The following diagrams show the behaviour of Gabor elementary functions as complex wavelets, as their separate real and imaginary parts, and as their Fourier transforms. When a family of such functions is parameterized to be self-similar, i.e. they are dilates and translates of each other so that they all have a common template (“mother” and “daughter”), then they constitute a (non-orthogonal) wavelet basis. Today it is known that an infinite class of wavelets exists which can be used as the expansion basis for signals. Because of the self-similarity property, this amounts to representing or analyzing a signal at different scales. This general field of investigation is called multi-resolution analysis.


Gabor wavelets are complex-valued functions, so for each value of x we have a phasor in the complex plane (top row). Its phase evolves as a function of x while its magnitude grows and decays according to a Gaussian envelope. Thus a Gabor wavelet is a kind of localised helix. The difference between the three columns is that the wavelet has been multiplied by a complex constant, which amounts to a phase rotation. The second row shows the projection of its real and imaginary parts (solid and dotted curves). The third row shows its Fourier transform for each of these phase rotations. The fourth row shows its Fourier power spectrum, which is simply a Gaussian centred at the wavelet’s frequency and with width reciprocal to that of the wavelet’s envelope.


The first three rows show the real part of various Gabor wavelets. In the first column, these all have the same Gaussian envelope, but different frequencies. In the second column, the frequencies correspond to those of the first column but the width of each of the Gaussian envelopes is inversely proportional to the wavelet’s frequency, so this set of wavelets forms a self-similar set (i.e. all are simple dilations of each other). The bottom row shows Fourier power spectra of the corresponding complex wavelets.


[Figure panels: “2D Gabor Wavelet: Real Part” (axes: Position in Degrees) and “2D Fourier Transform” (axes: Spatial Frequency (CPD)).]

Figure 1: The real part of a 2-D Gabor wavelet, and its 2-D Fourier transform.

10.3 Generalization to Two Dimensional Signals

An effective strategy for extracting both coherent and incoherent image structure is the computation of two-dimensional Gabor coefficients for the image. This family of 2-D filters was originally proposed as a framework for understanding the orientation-selective and spatial-frequency-selective receptive field properties of neurons in the brain’s visual cortex, and as useful operators for practical image analysis problems. These 2-D filters are conjointly optimal in providing the maximum possible resolution both for information about the spatial frequency and orientation of image structure (in a sense “what”), simultaneously with information about 2-D position (“where”). The 2-D Gabor filter family uniquely achieves the theoretical lower bound for joint uncertainty over these four variables, as dictated by the inescapable Uncertainty Principle when generalized to four-dimensional space.

These properties are particularly useful for texture analysis because of the 2-D spectral specificity of texture as well as its variation with 2-D spatial position. A rapid method for obtaining the required coefficients on these elementary expansion functions for the purpose of representing any image completely by its “2-D Gabor Transform,” despite the non-orthogonality of the expansion basis, is possible through the use of a relaxation neural network. A large and growing literature now exists on the efficient use of this non-orthogonal expansion basis and its applications.

Two-dimensional Gabor filters over the image domain (x, y) have the functional form

f(x, y) = e^(−[(x−x0)²/α² + (y−y0)²/β²]) e^(−i[u0(x−x0) + v0(y−y0)]) (17)

where (x0, y0) specify position in the image, (α, β) specify effective width and length, and (u0, v0) specify modulation, which has spatial frequency ω0 = √(u0² + v0²) and direction θ0 = arctan(v0/u0). (A further degree-of-freedom not included above is the relative orientation of the elliptic Gaussian envelope, which creates cross-terms in xy.)

The 2-D Fourier transform F(u, v) of a 2-D Gabor filter has exactly the same functional form, with parameters just interchanged or inverted:

F(u, v) = e^(−[(u−u0)²α² + (v−v0)²β²]) e^(i[x0(u−u0) + y0(v−v0)]) (18)

The real part of one member of the 2-D Gabor filter family, centered at the origin (x0, y0) = (0, 0) and with unity aspect ratio β/α = 1, is shown in the figure, together with its 2-D Fourier transform F(u, v).
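A minimal sketch of generating one such 2-D Gabor filter on a pixel grid, assuming NumPy; the grid size and the parameters (x0, y0, α, β, u0, v0) are arbitrary illustrative choices.

import numpy as np

def gabor_2d(size, x0, y0, alpha, beta, u0, v0):
    # 2-D Gabor filter of equation (17), sampled on a size x size pixel grid.
    x, y = np.meshgrid(np.arange(size, dtype=float),
                       np.arange(size, dtype=float), indexing="ij")
    envelope = np.exp(-((x - x0)**2 / alpha**2 + (y - y0)**2 / beta**2))
    carrier = np.exp(-1j * (u0 * (x - x0) + v0 * (y - y0)))
    return envelope * carrier

# Illustrative parameters: centred in a 128x128 patch, unity aspect ratio,
# modulation at spatial frequency sqrt(u0^2 + v0^2) and orientation arctan(v0/u0).
g = gabor_2d(size=128, x0=64, y0=64, alpha=12.0, beta=12.0, u0=0.3, v0=0.2)

real_part = g.real                                      # what the figure displays
spectrum = np.fft.fftshift(np.abs(np.fft.fft2(g)))      # a single Gaussian-shaped lobe, cf. equation (18)
print(real_part.shape, spectrum.shape)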

2-D Gabor functions can form a complete self-similar 2-D wavelet expansion basis, with the requirements of orthogonality and strictly compact support relaxed, by appropriate parameterization for dilation, rotation, and translation. If we take Ψ(x, y) to be a chosen generic 2-D Gabor wavelet, then we can generate from this one member a complete self-similar family of 2-D wavelets through the generating function:

Ψmpqθ(x, y) = 2^(−2m) Ψ(x′, y′) (19)

where the substituted variables (x′, y′) incorporate dilations in size by 2^(−m), translations in position (p, q), and rotations through orientation θ:

x′ = 2^(−m) [x cos(θ) + y sin(θ)] − p (20)

y′ = 2^(−m) [−x sin(θ) + y cos(θ)] − q (21)

It is noteworthy that as consequences of the similarity theorem, shift theorem, and modulation theorem of 2-D Fourier analysis, together with the rotation isomorphism of the 2-D Fourier transform, all of these effects of the generating function applied to a 2-D Gabor mother wavelet Ψ(x, y) = f(x, y) have corresponding identical or reciprocal effects on its 2-D Fourier transform F(u, v). These properties of self-similarity can be exploited when constructing efficient, compact, multi-scale codes for image structure.
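A minimal sketch of the generating function of equations (19)–(21), assuming NumPy and, purely for illustration, a 2-D Gabor function as the mother wavelet Ψ; the parameter values (m, p, q, θ) are arbitrary.

import numpy as np

def mother_wavelet(x, y):
    # A generic 2-D Gabor mother wavelet Psi(x, y) (illustrative parameters).
    return np.exp(-(x**2 + y**2) / 8.0**2) * np.exp(-1j * (0.4 * x + 0.2 * y))

def daughter_wavelet(x, y, m, p, q, theta):
    # Equations (19)-(21): dilation by 2^-m, translation by (p, q), rotation by theta.
    xp = 2.0**(-m) * (x * np.cos(theta) + y * np.sin(theta)) - p
    yp = 2.0**(-m) * (-x * np.sin(theta) + y * np.cos(theta)) - q
    return 2.0**(-2 * m) * mother_wavelet(xp, yp)

x, y = np.meshgrid(np.arange(-64, 64, dtype=float),
                   np.arange(-64, 64, dtype=float), indexing="ij")
psi = daughter_wavelet(x, y, m=1, p=10.0, q=-5.0, theta=np.pi / 6)
print(psi.shape, np.abs(psi).max())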

10.4 Grand Unification of Domains: an Entente Cordiale

Until now we have viewed “the space domain” and “the Fourier domain” as somehow opposite, and incompatible, domains of representation. (Their variables are reciprocals; and the Uncertainty Principle declares that improving the resolution in either domain must reduce it in the other.) But we now can see that the “Gabor domain” of representation actually embraces and unifies both of these other two domains. To compute the representation of a signal or of data in the Gabor domain, we find its expansion in terms of elementary functions having the form

f(x) = e^(−ik0x) e^(−(x−x0)²/a²) (22)

The single parameter a (the space-constant in the Gaussian term) actually builds a continuous bridge between the two domains: if the parameter a is made very large, then the second exponential above approaches 1.0, and so in the limit our expansion basis becomes

lim(a→∞) f(x) = e^(−ik0x) (23)

the ordinary Fourier basis. If the frequency parameter k0 and the size parameter a are instead made very small, the Gaussian term becomes the approximation to a delta function at location x0, and so our expansion basis implements pure space-domain sampling:

lim(k0,a→0) f(x) = δ(x − x0) (24)

Hence the Gabor expansion basis “contains” both domains at once. It allows us to make a continuous deformation that selects a representation lying anywhere on a one-parameter continuum between two domains that were hitherto distinct and mutually unapproachable. A new Entente Cordiale, indeed.
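A minimal numerical sketch of this continuum, assuming NumPy; the grid and parameter values are arbitrary. As a grows, the elementary function of equation (22) approaches a pure Fourier component (equation 23); as a and k0 shrink, it collapses toward a spike at x0 (equation 24).

import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
x0 = 1.0

def elementary(k0, a):
    # Equation (22): complex exponential times a Gaussian with space-constant a.
    return np.exp(-1j * k0 * x) * np.exp(-(x - x0)**2 / a**2)

# Large a: the Gaussian factor is ~1 everywhere, leaving the ordinary Fourier basis (eq. 23).
fourier_like = elementary(k0=3.0, a=1e6)
print("max deviation from a pure exponential:",
      np.max(np.abs(fourier_like - np.exp(-1j * 3.0 * x))))

# Small a and k0: the function collapses toward a delta-like spike at x0 (eq. 24).
delta_like = elementary(k0=1e-6, a=0.05)
print("fraction of |f| within 0.2 of x0:",
      np.sum(np.abs(delta_like[np.abs(x - x0) < 0.2])) / np.sum(np.abs(delta_like)))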


Reconstruction of Lena: 25, 100, 500, and 10,000 Two-Dimensional Gabor Wavelets

Figure 2: Illustration of the completeness of 2-D Gabor wavelets as basis functions.


11 Kolmogorov Complexity and Minimal Description Length

An idea of fundamental importance is the measure known as Kolmogorov complexity: the complexity of a string of data is defined as the length of the shortest binary program for computing the string. Thus the complexity is the data’s “minimal description length.”

It is an amazing fact that the Kolmogorov complexity K of a string is approximately equal to the entropy H of the distribution from which the string is a randomly drawn sequence. Thus Kolmogorov descriptive complexity is intimately connected with information theory, and indeed K defines the ultimate data compression. Reducing the data to a program that generates it exactly is obviously a way of compressing it; and running that program is a way of decompressing it. Any set of data can be generated by a computer program, even if (in the worst case) that program simply consists of data statements. The length of such a program defines its algorithmic complexity.

It is important to draw a clear distinction between the notions of computational complexity (measured by program execution time), and algorithmic complexity (measured by program length). Kolmogorov complexity is concerned with finding descriptions which minimize the latter. Little is known about how (in analogy with the optimal properties of Gabor’s elementary logons in the 2D Information Plane) one might try to minimize simultaneously along both of these orthogonal axes that form a “Complexity Plane.”

Most sequences of length n (where “most” considers all possible permutations of n bits) have Kolmogorov complexity K close to n. The complexity of a truly random binary sequence is as long as the sequence itself. However, it is not clear how to be certain of discovering that a given string has a much lower complexity than its length. It might be clear that the string

0101010101010101010101010101010101010101010101010101010101010101

has a complexity much less than 32 bits; indeed, its complexity is the length of the program: Print 32 "01"s. But consider the string

0110101000001001111001100110011111110011101111001100100100001000

which looks random and passes most tests for randomness. How could you discover that this sequence is in fact just the binary expansion of the irrational number √2 − 1, and that therefore it can be specified extremely concisely?

Fractals are examples of entities that look very complex but in fact are generated by very simple programs (i.e. iterations of a mapping). Therefore, the Kolmogorov complexity of fractals is nearly zero.
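Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a crude, computable upper bound on description length that already separates the two strings above. A minimal sketch using the Python standard library (the compressed length is only a proxy for K, not K itself; the regular string should compress further):

import zlib

periodic = "01" * 32                    # the obviously regular string above
expansion = "0110101000001001111001100110011111110011101111001100100100001000"  # looks random

for name, s in [("periodic", periodic), ("sqrt(2)-1 expansion", expansion)]:
    compressed = zlib.compress(s.encode(), level=9)
    print(f"{name}: {len(s)} symbols -> {len(compressed)} bytes compressed")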

A sequence x1, x2, x3, ..., xn of length n is said to be algorithmically random if its Kolmogorov complexity is at least n (i.e. the shortest possible program that can generate the sequence is a listing of the sequence itself):

K(x1x2x3...xn|n) ≥ n (25)

An infinite string is defined to be incompressible if its Kolmogorov complexity, in the limit as the string gets arbitrarily long, approaches the length n of the string itself:

lim(n→∞) K(x1x2x3...xn|n) / n = 1 (26)

An interesting theorem, called the Strong Law of Large Numbers for Incompressible Sequences, asserts that the proportions of 0’s and 1’s in any incompressible string must be nearly equal! Moreover, any incompressible sequence must satisfy all computable statistical tests for randomness. (Otherwise, identifying the statistical test for randomness that the string failed would reduce the descriptive complexity of the string, which contradicts its incompressibility.) Therefore the algorithmic test for randomness is the ultimate test, since it includes within it all other computable tests for randomness.

12 A Short Bibliography

Cover, T M and Thomas, J A (1991) Elements of Information Theory. New York: Wiley.

Blahut, R E (1987) Principles and Practice of Information Theory. New York: Addison-Wesley.

Bracewell, R (1995) The Fourier Transform and its Applications. New York: McGraw-Hill.

Fourier analysis: besides the above, an excellent on-line tutorial is available at: http://users.ox.ac.uk/~ball0597/Fourier/

McEliece, R J (1977) The Theory of Information and Coding: A Mathematical Framework for Communication. Addison-Wesley. Reprinted (1984) by Cambridge University Press in Encyclopedia of Mathematics.

McMullen, C W (1968) Communication Theory Principles. New York: MacMillan.

Shannon, C and Weaver, W (1949) Mathematical Theory of Communication. Urbana: University of Illinois Press.

Wiener, N (1948) Time Series. Cambridge: MIT Press.
