Lecture 7
System Models
Attributes of a man-made system.
Concerns in the design of a distributed system
Communication channels
Entropy and mutual information
Student presentation next week
Up to 5-minute presentations followed by discussions.
All presentations in PowerPoint.
Format:
Title of project/research
Motivation (why is the problem important)
Background (who did what)
Specific objectives (what do you plan to do)
Literature
Each student will provide feedback about each presentation (grades: A, B, C, F, plus comments).
Distributed system models
Diagram: Process A and Process B connected by a communication channel; A invokes send(message) and B invokes receive(message).
System models
Functional models
Performance models
Reliability models
Security models
The effect of the technology substrate.
Attributes of a man-made system
A. Functionality
B. Performance and dependability:
Reliability
Availability
Maintainability
Safety
C. Cost
Major concerns
Unreliable communication.
Independent failures of communication links and computing nodes.
Discrepancy between communication and computing:
Bandwidth
Latency
Information transmission and communication channel models
Physical signals
Digital/analog channels
Modulation/demodulation
Sampling and quantization
Channel latency and bandwidth
Diagram: timing of a message sent from source to destination. For a channel of length D, propagation velocity V, and bandwidth B, and a message of length L: the propagation delay is D/V, the transmission time is L/B, and the message latency is their sum, D/V + L/B.
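To make the timing diagram concrete, here is a minimal Python sketch of the latency formula it illustrates; the function name and the example numbers are illustrative, while D, V, L, and B follow the diagram's labels.

```python
def message_latency(D, V, L, B):
    """Message latency = propagation delay (D/V) + transmission time (L/B).

    D: channel length (m), V: propagation velocity (m/s),
    L: message length (bits), B: channel bandwidth (bits/s).
    """
    propagation_delay = D / V   # time for the first bit to reach the destination
    transmission_time = L / B   # time to push all L bits onto the channel
    return propagation_delay + transmission_time

# Example: 1000 km link, signals propagating at 2e8 m/s,
# an 8-megabit message over a 100 Mbit/s channel.
print(message_latency(D=1e6, V=2e8, L=8e6, B=1e8))  # 0.005 + 0.08 = 0.085 s
```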
Entropy
Input and output channel alphabets.
The output of a communication channel depends statistically upon its input. The output gives an idea of what was sent.
Measure of the uncertainty of a random variable.
Examples:
Binary random variable: H(X) = -p log(p) - (1-p) log(1-p)
Horse race
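A minimal Python sketch of the binary entropy formula above (the function name is illustrative):

```python
from math import log2

def binary_entropy(p):
    """H(X) = -p log2(p) - (1-p) log2(1-p), in bits."""
    if p in (0.0, 1.0):          # by convention, 0 log 0 = 0
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit: maximum uncertainty
print(binary_entropy(0.1))   # ~0.469 bits: a biased coin is more predictable
```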
Entropy of a binary random variable
Plot: entropy H(X) of a binary random variable as a function of p; H(X) is 0 at p = 0 and p = 1 and reaches its maximum of 1 bit at p = 1/2.
Joint entropy, conditional entropy, mutual information
H(X,Y) – joint entropy of X and Y
H(X|Y) – conditional entropy of X given Y
H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
I(X;Y) = H(X) – H(X|Y) – mutual information
I(X;Y) is a measure of the dependency between random variables X and Y.
H(X) = H(X|Y) + I(X;Y)
H(Y) = H(Y|X) + I(X;Y)
H(X,Y) = H(X) + H(Y) – I(X;Y)
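These identities can be checked numerically. Below is a minimal Python sketch that computes H(X), H(Y), H(X,Y), and I(X;Y) from a joint distribution; the dictionary representation and function name are assumptions made for illustration.

```python
from math import log2

def entropies(joint):
    """joint: dict mapping (x, y) to p(x, y). Returns H(X), H(Y), H(X,Y), I(X;Y)."""
    px, py = {}, {}
    for (x, y), p in joint.items():      # marginal distributions
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    H = lambda probs: -sum(p * log2(p) for p in probs if p > 0)
    HX, HY, HXY = H(px.values()), H(py.values()), H(joint.values())
    return HX, HY, HXY, HX + HY - HXY    # I(X;Y) = H(X) + H(Y) - H(X,Y)

# Two independent fair bits: H(X) = H(Y) = 1, H(X,Y) = 2, I(X;Y) = 0.
print(entropies({(0, 0): .25, (0, 1): .25, (1, 0): .25, (1, 1): .25}))
```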
Diagram: the relationship among H(X), H(Y), H(X,Y), H(X|Y), H(Y|X), and I(X;Y); the conditional entropies H(X|Y) and H(Y|X) are the non-overlapping parts of H(X) and H(Y), and the mutual information I(X;Y) is their overlap.
Noiseless and noisy binary symmetric channels
Diagram (a): noiseless binary symmetric channel between source and destination; input x=0 is always received as y=0 and x=1 as y=1.
Diagram (b): noisy binary symmetric channel; each input is received correctly with probability 1-p and flipped with probability p.
Noisy binary symmetric channel
Each of the two input symbols, 0 and 1, is altered with probability p and received as 1 and 0, respectively.
Then
I(X;Y) = H(Y) - H(Y|X) =
= H(Y) + p log(p) + (1-p) log(1-p)
I(X;Y) is maximized when H(Y) = 1, i.e., when the channel output is equiprobable.
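A short Python sketch of this computation; the function name and its parameters p (crossover probability) and q = P(X=1) are illustrative.

```python
from math import log2

def bsc_mutual_information(p, q):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel."""
    h = lambda t: 0.0 if t in (0.0, 1.0) else -t * log2(t) - (1 - t) * log2(1 - t)
    py1 = q * (1 - p) + (1 - q) * p      # P(Y = 1)
    return h(py1) - h(p)                 # H(Y|X) = h(p) for every input

# Uniform input (q = 1/2) makes H(Y) = 1, the maximum:
print(bsc_mutual_information(p=0.1, q=0.5))   # ~0.531 bits per symbol
```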
Encoding
Encoding is used to:
Make transmission resilient to errors (error detection and error correction)
Reduce the amount of information transmitted through a communication channel (compression)
Ensure information confidentiality (encryption)
Source Encoding
Channel Encoding
Diagram (a): source coding alone. The source encoder maps each symbol to a 2-bit word (A→00, B→10, C→01, D→11), the word crosses the binary communication channel, and the source decoder inverts the mapping at the receiver (00→A, 10→B, 01→C, 11→D).
Diagram (b): source coding plus channel coding. A channel encoder/decoder pair is inserted around the channel; each 2-bit source word is mapped to a 5-bit channel code word (00→00000, 10→10110, 01→01011, 11→11101) before transmission, and decoded back after.
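A minimal Python sketch of the two-stage encoding in diagram (b); the table entries are those read off the figure, and the function names are illustrative.

```python
# Source code from diagram (a) and channel code from diagram (b).
SOURCE_CODE  = {'A': '00', 'B': '10', 'C': '01', 'D': '11'}
CHANNEL_CODE = {'00': '00000', '10': '10110', '01': '01011', '11': '11101'}

# Decoders are the inverse mappings.
SOURCE_DECODE  = {v: k for k, v in SOURCE_CODE.items()}
CHANNEL_DECODE = {v: k for k, v in CHANNEL_CODE.items()}

def encode(symbol):
    """Symbol -> 2-bit source word -> 5-bit channel code word."""
    return CHANNEL_CODE[SOURCE_CODE[symbol]]

def decode(codeword):
    """5-bit channel code word -> 2-bit source word -> symbol."""
    return SOURCE_DECODE[CHANNEL_DECODE[codeword]]

print(encode('B'))          # 10110
print(decode('10110'))      # B
```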
Channel capacity and Shannon’s theorem
Given a channel with input X and output Y, the channel capacity is the highest rate at which information can be transmitted through the channel:
C = max I(X;Y)
where the maximum is taken over all input distributions.
Shannon’s theorem
The effect of the signal-to-noise ratio (S/N) for a channel of bandwidth B:
C = B log2(1 + S/N)
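A one-line check of the formula in Python; the 3 kHz / 30 dB telephone-channel example is illustrative, not from the slides.

```python
from math import log2

def shannon_capacity(B, snr):
    """C = B log2(1 + S/N); B in Hz, snr as a linear power ratio."""
    return B * log2(1 + snr)

# 3 kHz channel at 30 dB signal-to-noise ratio (S/N = 1000):
print(shannon_capacity(B=3000, snr=1000))   # ~29,902 bits/s
```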
Error detection and error correction
Error detection: a parity bit can be used to detect any odd number of errors.
Error correction
Code: a set of code words
Block codes:
m – information symbols
k – parity check symbols
n = m + k
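A minimal Python sketch of even-parity encoding, illustrating why a single parity bit detects any odd number of errors (function names are illustrative):

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """True iff the word has even parity; any odd number of flipped
    bits makes the parity odd, so the error is detected."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
print(parity_ok(word))            # True: no errors
word[2] ^= 1                      # flip a single bit in transit
print(parity_ok(word))            # False: the error is detected
```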
Diagram: Source → Source Encoder → Channel Encoder → Communication Channel → Channel Decoder → Source Decoder → Receiver. The source encoder emits the source-encoded message (k-tuples), the channel encoder emits the channel-encoded message (n-tuples), and the two decoders recover the original message at the receiver.
Hamming distance
The number of positions in which two binary code words differ.
Hamming distance is a metric:
Non-negative
Symmetric
Triangle inequality
Example: d(10110, 01011) = 4
The distance of a code: the minimum Hamming distance between any two distinct code words.
Nearest neighbor decoding: decode a received word as the code word closest to it in Hamming distance.
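A short Python sketch of these definitions, reusing the 5-bit code from the earlier encoding slide; the function names are illustrative.

```python
from itertools import combinations

def hamming(u, v):
    """Number of positions in which code words u and v differ."""
    return sum(a != b for a, b in zip(u, v))

print(hamming('10110', '01011'))   # 4, as in the example above

# Distance of a code: the minimum distance over all pairs of code words.
code = ['00000', '10110', '01011', '11101']
print(min(hamming(u, v) for u, v in combinations(code, 2)))   # 3
```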
Error correction and error detection capabilities of a code
If C is an [n,M] code with an odd distance
d = 2e + 1
then C can:
correct e errors, or
detect 2e errors.
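A sketch of nearest neighbor decoding on the same 5-bit code, whose distance is d = 3 = 2·1 + 1, so it corrects e = 1 error; the names are illustrative.

```python
def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def nearest_neighbor_decode(received, code):
    """Decode to the code word at minimum Hamming distance."""
    return min(code, key=lambda c: hamming(c, received))

code = ['00000', '10110', '01011', '11101']    # distance d = 3, so e = 1
print(nearest_neighbor_decode('10100', code))  # '10110': single error corrected
```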
Diagram: spheres of radius e drawn around code words. When d(c1,c2) > 2e the spheres around c1 and c2 are disjoint, so any received word r within distance e of a code word decodes uniquely; code words c3 and c4 illustrate the boundary case d(c3,c4) = 2e+1.
The Hamming bound
What is the minimum number of parity check symbols necessary to correct one error?
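For single-error correction, the k parity check symbols must generate enough distinct syndromes to name either "no error" or an error in any one of the n = m + k positions, so 2^k >= m + k + 1 is required. A small Python sketch of this bound (the function name is illustrative):

```python
def min_parity_bits(m):
    """Smallest k with 2**k >= m + k + 1: the syndrome must distinguish
    'no error' from a single error in any of the n = m + k positions."""
    k = 1
    while 2 ** k < m + k + 1:
        k += 1
    return k

for m in (4, 11, 26):
    print(m, min_parity_bits(m))   # 4 -> 3, 11 -> 4, 26 -> 5 (Hamming codes)
```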