
1

•Information in Continuous Signals

[Figure: a continuous signal f(t) plotted against time t.]

In practice, many signals are essentially analogue, i.e. continuous: e.g. the speech signal from a microphone, or a radio signal.

So far our attention has been on discrete signals, typically represented as streams of binary digits.

How can we deduce the information capacity of continuous signals?


2

•Sampling Theorem

[Figure: a continuous signal f(t) and its sampled version fs(t), with samples at spacing T (at T, 2T, 3T, 4T, ...); the spectrum F(f) of the continuous signal, band-limited to -W ≤ f ≤ W, and the spectrum of the sampled signal, in which copies of F(f) repeat along the frequency axis.]

Number of samples/s ≥ 2W (i.e. 1/T ≥ 2W); 2W is the Nyquist rate for a signal band-limited to W.
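As a quick illustration (a minimal sketch, not from the slides; the test signal, rates, and window size are assumed example values): a signal band-limited to W can be rebuilt from samples taken at 1/T ≥ 2W by sinc interpolation.

import numpy as np

W = 100.0                       # highest frequency in the signal (Hz)
fs = 2.5 * W                    # sampling rate, above the Nyquist rate 2W
T = 1.0 / fs                    # sample spacing

def f(t):
    # Band-limited test signal: components at 30 Hz and at W = 100 Hz.
    return np.sin(2 * np.pi * 30 * t) + 0.5 * np.cos(2 * np.pi * W * t)

n = np.arange(-2000, 2001)      # finite window of sample indices
samples = f(n * T)

def reconstruct(t):
    # Ideal interpolation: f(t) = sum over n of f(nT) sinc((t - nT) / T).
    return np.sum(samples * np.sinc((t - n * T) / T))

t0 = 0.1234                     # an arbitrary instant between samples
print(f"true: {f(t0):.4f}   reconstructed: {reconstruct(t0):.4f}")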


3

•Information Capacity in Continuous Signals

Information per second: R = (number of independent samples/s) × (maximum information per sample)

Number of independent samples/s = 2W

What is the maximum information per sample in a continuous signal?

Maximum information per sample in a discrete signal: H = -Σ pi log pi, maximised when all levels are equiprobable.

Number of distinguishable levels = s/n, where s and n are the root-mean-square signal and noise amplitudes.

C = 2W log(s/n) = W log(s²/n²) = W log(SNR), where SNR = s²/n² is the mean-square signal-to-noise ratio.

For a continuous signal, the maximum information per second is usually denoted the information capacity C.
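Writing the counting argument out in full (a sketch; the symbol M for the number of levels is introduced here, not on the slide):

\[
H_{\max} = \log M \quad \text{for } M \text{ equiprobable levels}; \qquad
M = \frac{s}{n} \;\Rightarrow\; C = 2W \log\frac{s}{n} = W \log\frac{s^{2}}{n^{2}} = W \log(\mathrm{SNR}).
\]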


4

•Relative Entropy of Continuous Signal

Discrete systems: H = -Σ pi log pi

Continuous systems: H = -∫ p(v) log p(v) dv (the relative entropy)

Gaussian: p(v) = (1/√(2πσ²)) exp(-v²/(2σ²)), giving H(v) = log √(2πeP), where P = σ² is the mean-square signal power.
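A quick numerical check of the Gaussian result (a sketch; σ = 1.5 is an assumed example value): integrating -p(v) log p(v) directly should reproduce log √(2πeP).

import numpy as np

sigma = 1.5
P = sigma ** 2                              # mean-square signal power
v = np.linspace(-12 * sigma, 12 * sigma, 200001)
p = np.exp(-v**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Riemann-sum approximation of H = -integral of p(v) log p(v) dv, in nats.
H_numeric = -np.sum(p * np.log(p)) * (v[1] - v[0])
H_closed = np.log(np.sqrt(2 * np.pi * np.e * P))

print(H_numeric, H_closed)                  # both ≈ 1.8244 nats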


5

•Information Capacity of Continuous Signals

[Diagram: a channel with input x of power S; noise n of power N is added, so the output y has power S + N.]

I(x;y) = H(y) - H(y|x) = H(y) - H(n), where

H(y) = log √(2πe(S+N))

H(n) = log √(2πeN)

Information Capacity C = [H(y) - H(n)] × 2W

This leads to the Ideal Communication Theorem: C = W log(1 + S/N). Theoretically, information can be transmitted at any rate up to C with no net errors.
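Substituting the two entropies above makes the step explicit:

\[
C = 2W\left[\log\sqrt{2\pi e\,(S+N)} - \log\sqrt{2\pi e\,N}\right]
  = W \log\frac{S+N}{N}
  = W \log\left(1 + \frac{S}{N}\right).
\]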


6

Transmission Media

The physical transmission medium limits the achievable bit rate: it acts as a “filter” on the signal being transmitted.

7

[Slide consists of a figure only; no text recovered.]

8

Shannon’s Theorem

Our phone line can carry frequencies between 300 Hz and 3300 Hz unattenuated. The channel capacity C is

C = W log2(1 + S/N)

W is the bandwidth: 3300 - 300 = 3000 Hz. S/N is the signal-to-noise ratio, typically 1000, which corresponds to 10 log10(S/N) = 30 dB.

In our case C ≈ 30 kbit/s, which corresponds well with a 28.8 kbit/s modem.
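The slide's arithmetic, checked directly (a sketch; W = 3000 Hz and S/N = 1000 are the values quoted above):

import math

W = 3300 - 300                  # usable bandwidth (Hz)
snr = 1000                      # linear signal-to-noise ratio (30 dB)

C = W * math.log2(1 + snr)      # Shannon capacity (bit/s)
print(f"{10 * math.log10(snr):.0f} dB -> C = {C / 1000:.1f} kbit/s")
# prints: 30 dB -> C = 29.9 kbit/s, close to a 28.8 kbit/s modem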


9

•Implications of the Ideal Theorem

I = WT log(1 + SNR) bits in time T.

A given amount of information can be transmitted by many combinations of W, T, and SNR.

[Figure: the trade-off curve of bandwidth W (vertical axis, 1 to 3) against SNR (horizontal axis, 10 to 60) for C = 3 units, T = 1 s, with points a, b, c marked.]

a. W = 1, SNR = 7.
b. Halve W: S/N = 63 is required, a very large increase in power.
c. Halve S/N: W = 1.5 suffices. Useful: we can halve the power with only a 50% increase in bandwidth.
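The three points can be checked by solving C = W log2(1 + SNR) for the SNR needed at each bandwidth (a sketch, using the slide's C = 3 units and T = 1 s):

import math

C = 3.0                                   # bits, with T = 1 s
for label, W in [("a", 1.0), ("b", 0.5), ("c", 1.5)]:
    snr = 2 ** (C / W) - 1                # invert C = W log2(1 + SNR)
    print(f"{label}: W = {W:.1f}, SNR = {snr:.0f}")
# a: W = 1.0, SNR = 7   b: W = 0.5, SNR = 63   c: W = 1.5, SNR = 3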


10

•Maximum Capacity for a Given Transmitted Power

C = W log(1 + S/(N0 W)), where N0 is the noise power spectral density (so total noise power N = N0 W).

Using the limit loge(1 + x)/x → 1 as x → 0:

Cmax = lim (W → ∞) of W log(1 + S/(N0 W)) = S/N0 nats/s = 1.44 S/N0 bits/s

(About 3×10⁻²¹ W is required to transmit 1 bit/s.)

This suggests that, for efficiency in power requirements, the power should be spread over a wide bandwidth and transmitted at as low an S/N as possible.

The maximum value of C occurs as W → ∞ and S/N → 0.
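A numerical illustration of the limit (a sketch; S = N0 = 1 are assumed example values, so C should approach 1 nat/s):

import math

S, N0 = 1.0, 1.0
for W in [1, 10, 100, 1000, 1000000]:
    C_nats = W * math.log(1 + S / (N0 * W))   # capacity in nats/s
    print(f"W = {W:>7}: C = {C_nats:.6f} nats/s")
print(f"limit: S/N0 = {S / N0} nats/s = {1.44 * S / N0} bits/s")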