•Information in Continuous Signals
[Figure: a continuous signal f(t) plotted against time t]
In practice, many signals are essentially analogue, i.e. continuous: e.g. the speech signal from a microphone, or a radio signal.
So far our attention has been on discrete signals, typically represented as streams of binary digits.
How can we deduce the information capacity of continuous signals?
•Sampling Theorem
[Figure: a continuous signal f(t) and its sampled version fs(t), with samples at T, 2T, 3T, 4T; the spectrum F(f) of the continuous signal, band-limited to -W…W, and the spectrum of the sampled signal, repeated at multiples of 1/T]
Number of samples/s ≥ 2W (i.e. 1/T ≥ 2W); 2W is the Nyquist rate.
•Information capacity in continuous signal
Information per second: R = (number of independent samples/s) × (maximum information per sample)
Number of independent samples/s = 2W
What is the maximum information per sample in a continuous signal?
For a discrete signal, the maximum information per sample is the entropy H = -Σ p log p, maximised when all levels are equiprobable.
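The discrete case can be checked directly: with L equiprobable levels, H = -Σ p log₂ p reaches its maximum of log₂ L bits per sample. The 8 levels and the skewed distribution below are chosen only for illustration.

```python
import math

def entropy_bits(probs):
    """H = -sum p log2 p, in bits per sample."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

levels = 8
uniform = [1 / levels] * levels
print(entropy_bits(uniform))   # log2(8) = 3 bits per sample

# Any non-uniform distribution over the same levels carries less information
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]
assert entropy_bits(skewed) < entropy_bits(uniform)
```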
Number of distinguishable levels ≈ √(S/N), where S/N is the mean-square signal-to-noise ratio (S and N are mean-square powers, so the ratio of root-mean-square amplitudes is the square root).
C = 2W log √(S/N) = W log(S/N) = W log(SNR)
For a continuous signal, the maximum information per second is usually called the information capacity.
•Relative Entropy of Continuous Signal
Discrete systems: H = -Σᵢ pᵢ log pᵢ
Continuous systems: H = -∫ p(v) log p(v) dv
Gaussian: p(v) = (1/√(2πσ²)) exp(-v²/(2σ²))
H(v) = log √(2πeP) = ½ log(2πeP), where P = σ² is the signal power.
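As a numerical check (my own sketch, not from the slides), the integral H = -∫ p log p dv for a Gaussian can be evaluated by direct summation and compared against the closed form ½ log(2πeP); σ = 2 is an arbitrary illustrative value.

```python
import math

sigma = 2.0          # illustrative standard deviation
P = sigma ** 2       # signal power

def p(v):
    """Gaussian density with zero mean and variance P."""
    return math.exp(-v * v / (2 * P)) / math.sqrt(2 * math.pi * P)

# Numerically integrate -p(v) log p(v) over +/- 8 sigma (tails negligible)
dv = 1e-3
H_numeric = -sum(p(v) * math.log(p(v)) * dv
                 for v in (i * dv for i in range(-16000, 16001)))

H_closed = 0.5 * math.log(2 * math.pi * math.e * P)
assert abs(H_numeric - H_closed) < 1e-3
print(H_numeric, H_closed)
```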
•Information Capacity of Continuous Signals
[Figure: a noisy channel with input x of power S, additive noise n of power N, and output y of power S+N]
I(x;y) = H(y) - H(y|x) = H(y) - H(n)
where H(y) = log √(2πe(S+N)) and H(n) = log √(2πeN)
Information Capacity C = [H(y) - H(n)] × 2W
Since H(y) - H(n) = log √((S+N)/N) = ½ log(1+S/N), this leads to the Ideal Communication Theorem: C = W log(1+S/N). Theoretically, information can be transmitted at any rate up to C with no net errors.
Transmission Media
The physical medium of electronic transmission limits its achievable bit rate. It acts as a “filter” on the signal being transmitted.
Shannon’s Theorem
Our phone line can carry frequencies between 300 Hz and 3300 Hz unattenuated. The channel capacity C is
C=W log2 (1+S/N)
W is the bandwidth: 3300 - 300 = 3000 Hz. S/N is the signal-to-noise ratio, typically 1000, which corresponds to 10 log10(S/N) = 30 dB.
In our case C ≈ 30 kb/s, which corresponds well with a 28.8 kb/s modem.
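The arithmetic can be reproduced directly (a quick sketch; the 3000 Hz bandwidth and S/N = 1000 figures are the ones from the slide):

```python
import math

W = 3300 - 300       # bandwidth in Hz
snr = 1000           # signal-to-noise ratio (power ratio)

C = W * math.log2(1 + snr)       # Shannon capacity in bits/s
snr_db = 10 * math.log10(snr)    # same ratio expressed in dB

print(f"S/N = {snr_db:.0f} dB, C = {C / 1000:.1f} kb/s")  # about 29.9 kb/s
```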
•Implications of the Ideal Theorem
I = WT log(1+SNR) bits in time T.
A given amount of information can be transmitted by many combinations of W, T and SNR.
[Figure: trade-off curve of W against SNR for fixed C = 3 units, T = 1 s, with points a, b, c marked]
a. W = 1, SNR = 7.
b. Halve W: requires S/N = 63, a very large increase in power.
c. Halve S/N: W → 1.5. Useful: we can halve the power with only a 50% increase in bandwidth.
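These trade-offs follow from solving I = WT log₂(1+SNR) = 3 for one variable given the other; the helper names below are my own, checking the slide's numbers:

```python
import math

C = 3.0   # required information, in bits (T = 1 s)

def snr_needed(W):
    """SNR required to carry C bits/s in bandwidth W."""
    return 2 ** (C / W) - 1

def bandwidth_needed(snr):
    """Bandwidth required to carry C bits/s at a given SNR."""
    return C / math.log2(1 + snr)

print(snr_needed(1.0))        # a: W = 1   -> SNR = 7
print(snr_needed(0.5))        # b: W = 1/2 -> SNR = 63 (big power increase)
print(bandwidth_needed(3.0))  # c: SNR roughly halved -> W = 1.5
```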
•Maximum Capacity for given transmitted Power
C = W log(1 + S/(N0 W)), where N0 is the noise power spectral density.
Using lim x→0 of logₑ(1+x)/x = 1:
C max = lim W→∞ of W log(1 + S/(N0 W)) = S/N0 nats/s = 1.44 S/N0 bits/s
(About 3×10 ‾ ²¹ J of energy is required to transmit 1 bit.)
This suggests that power should be spread over a wide bandwidth and transmitted at as low P/N as possible for efficiency in power requirements.
The maximum value of C occurs for W → ∞ and P/N → 0.
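The limit can be seen numerically: as W grows with S and N0 fixed, C = W log₂(1 + S/(N0 W)) climbs towards (S/N0) log₂ e ≈ 1.44 S/N0. The unit values of S and N0 below are chosen only for illustration.

```python
import math

S, N0 = 1.0, 1.0     # illustrative signal power and noise spectral density

def capacity(W):
    """C = W log2(1 + S / (N0 * W)), in bits/s."""
    return W * math.log2(1 + S / (N0 * W))

limit = (S / N0) * math.log2(math.e)   # = 1.44 S/N0 bits/s

for W in (1, 10, 100, 10000):
    print(W, capacity(W))   # increases monotonically towards the limit

# Wide bandwidth gets within a fraction of a percent of the limit
assert capacity(10000) < limit < capacity(10000) * 1.001
```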