The Transmission-Switching Duality of Communication Networks
Tony T. Lee
Shanghai Jiao Tong University
The Chinese University of Hong Kong
Xidian University, June 21, 2011
A Mathematical Theory of Communication BSTJ, 1948
C. E. Shannon
Contents
Introduction
Routing and Channel Coding
Scheduling and Source Coding
Reliable Communication
- Circuit switching network: reliable communication requires noise-tolerant transmission
- Packet switching network: reliable communication requires both noise-tolerant transmission and contention-tolerant switching
Quantization of Communication Systems
- Transmission, from analog channel to digital channel: Sampling Theorem of Bandlimited Signals (Whittaker, 1915; Nyquist, 1928; Kotelnikov, 1933; Shannon, 1948)
- Switching, from circuit switching to packet switching: Doubly Stochastic Traffic Matrix Decomposition (Hall, 1935; Birkhoff-von Neumann, 1946)
Noise vs. Contention
- Transmission channel with noise: source information is a function f(t) of time; errors are corrected by providing more signal space. Noise is tamed by error-correcting codes.
- Packet switching with contention: source information f(i) is a function of space; errors are corrected by providing more time. Contention is tamed by delay (buffering) or deflection.
[Figure: transmission example, message 0101 received with bit errors; switching example, connection request f(i) = j delayed by buffering or deflection]
Transmission vs. Switching
[Figure: Shannon's general communication system: Source → Transmitter → Channel (capacity C) → Receiver; the message is encoded as a signal, and the received signal is decoded]
Temporal information source: function f(t) of time t
Spatial information source: function f(i) of space i=0,1,…,N-1
Clos network C(m,n,k)
[Figure: three-stage Clos network with k input modules (n×m), m central modules (k×k), and k output modules (m×n); internal contention plays the role of noise, and the channel capacity equals m]
Clos Network ↔ Communication Channel
- Contention ↔ Noise
- Routing ↔ Channel Coding
- Scheduling ↔ Source Coding
Apple vs. Orange
- 350 mg Vitamin C, 1.5 g/100 g sugar
- 500 mg Vitamin C, 2.5 g/100 g sugar
Contents
Introduction
Routing and Channel Coding
Scheduling and Source Coding
Rate Allocation
Boltzmann Principle of Networking
Output Contention and Carried Load
- Nonblocking switch with uniformly distributed destination addresses
- ρ: offered load; ρ′: carried load
ρ′ = 1 − (1 − ρ/N)^N → 1 − e^{−ρ} as N → ∞
- The difference between offered load and carried load reflects the degree of contention
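The limit above is easy to check numerically. A minimal sketch (the function name `carried_load` is ours, not from the slides):

```python
import math

def carried_load(rho: float, N: int) -> float:
    """Carried load of an N x N nonblocking switch with offered load
    rho per input and uniformly distributed destination addresses."""
    return 1.0 - (1.0 - rho / N) ** N

# As N grows, the carried load approaches 1 - e^{-rho};
# the gap rho - rho' measures the degree of output contention.
rho = 0.8
for N in (2, 8, 64, 1024):
    print(N, carried_load(rho, N))
print("limit:", 1.0 - math.exp(-rho))
```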
Proposition on Signal Power of a Switch
- (V. Benes, 1963) The energy of a connecting network is the number of calls in progress (the carried load)
- The signal power S_p of an N×N crossbar switch is the number of packets carried by its outputs, and the noise power is N_p = N − S_p
Pseudo Signal-to-Noise Ratio (PSNR)
PSNR = E[S_p]/E[N_p] = E[S_p]/(N − E[S_p]) = ρ′/(1 − ρ′)
Boltzmann Statistics
- Packet: energy quantum; output ports: particles
- Energy level of an output = number of packets destined for it; n_i = number of outputs with energy level i
- Example micro state: 8 outputs, 4 packets a, b, c, d, with n_0 = 5, n_1 = 2 (a and d alone), n_2 = 1 (b and c together)
- Since packets are distinguishable, the total number of states is
W = [N!/(n_0! n_1! ⋯ n_r!)] · [M!/((0!)^{n_0} (1!)^{n_1} ⋯ (r!)^{n_r})]
- Number of outputs: N = n_0 + n_1 + ⋯ + n_r
- Total number of packets (total energy): M = 0·n_0 + 1·n_1 + 2·n_2 + ⋯ + r·n_r
Boltzmann Statistics (cont'd)
- From the Boltzmann entropy equation S = C ln W (S: entropy, W: number of states, C: Boltzmann constant)
- Maximize the entropy by Lagrange multipliers:
f(n_i) = ln W + α(N − Σ_i n_i) + β(M − Σ_i i·n_i)
- Using Stirling's approximation for factorials, ln N! ≈ N ln N − N
- Taking the derivative with respect to n_i yields n_i = e^{−α−βi}/i!
Boltzmann Statistics (cont'd)
- If the offered load on each input is ρ, then under the uniform loading condition M/N = ρ, and solving the constraints for the multipliers gives
n_i/N = ρ^i e^{−ρ}/i!, i = 0, 1, 2, …
- Probability that there are i packets destined for an output:
P_i = n_i/N = ρ^i e^{−ρ}/i! (Poisson distribution)
- Carried load of an output: ρ′ = 1 − P_0 = 1 − e^{−ρ}
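The Poisson limit can also be checked by simulation. A hedged sketch (names and parameters are illustrative): each input independently sends a packet with probability ρ to a uniformly chosen output, and the empirical occupancy of one output is compared with the Poisson law.

```python
import math
import random

def destination_counts(N: int, rho: float, trials: int, seed: int = 1):
    """Empirical distribution of the number of packets destined for
    output 0 of an N x N switch in one time slot."""
    rng = random.Random(seed)
    hist = [0] * (N + 1)
    for _ in range(trials):
        k = sum(1 for _ in range(N)
                if rng.random() < rho and rng.randrange(N) == 0)
        hist[k] += 1
    return [h / trials for h in hist]

def poisson(i: int, rho: float) -> float:
    return rho ** i * math.exp(-rho) / math.factorial(i)

# For large N the binomial occupancy approaches the Poisson law above.
emp = destination_counts(N=64, rho=0.9, trials=20000)
for i in range(4):
    print(i, round(emp[i], 3), round(poisson(i, 0.9), 3))
```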
Clos Network C(m,n,k)
- D = nQ + R, where D is the destination address
- Q = ⌊D/n⌋: output module in the output stage
- R = D mod n: output link in the output module
- G: central module
- Routing tag: (G, Q, R)
[Figure: input stage (k modules, n×m), middle stage (m modules, k×k), output stage (k modules, m×n); a packet from source S reaches destination D via central module G, output module Q, output link R]
- Slepian-Duguid condition: m ≥ n
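The routing tag computation is just integer division and remainder. A small sketch (the choice of G is left to the route-assignment step, so it is passed in here):

```python
def routing_tag(D: int, n: int, G: int):
    """Routing tag (G, Q, R) for destination address D in a Clos
    network C(m, n, k): Q = floor(D / n) is the output module and
    R = D mod n is the output link; the central module G is chosen
    by the route-assignment (edge-coloring) step."""
    Q, R = divmod(D, n)
    return (G, Q, R)

# D = 6 with n = 2: output module 3, output link 0.
print(routing_tag(6, 2, G=1))  # (1, 3, 0)
```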
Clos Network as a Noisy Channel
- Source state is a perfect matching; central modules are randomly assigned to input packets
- Offered load on each input link of a central module: n/m
- Carried load on each output link of a central module:
ρ′ = 1 − (1 − n/(mk))^k → 1 − e^{−n/m} as k → ∞
- Pseudo signal-to-noise ratio, with E[S_p] = kmρ′:
PSNR = E[S_p]/E[N_p] = kmρ′/(km − kmρ′) = ρ′/(1 − ρ′)
- Hence n = m ln(1 + E[S_p]/E[N_p])
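Using the limiting form ρ′ = 1 − e^{−n/m}, the identity n = m ln(1 + PSNR) can be verified directly. A sketch under that large-k assumption (function name ours):

```python
import math

def psnr(n: int, m: int) -> float:
    """Pseudo signal-to-noise ratio of C(m, n, k) in the large-k
    limit, where the carried load on a central-module output link
    is rho' = 1 - exp(-n/m)."""
    rho_c = 1.0 - math.exp(-n / m)
    return rho_c / (1.0 - rho_c)

# Inverting PSNR = e^{n/m} - 1 recovers n = m * ln(1 + PSNR).
n, m = 8, 16
print(psnr(n, m), m * math.log(1.0 + psnr(n, m)))
```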
Noisy Channel Capacity Theorem
- Capacity of the additive white Gaussian noise channel: the maximum data rate C that can be sent through a channel subject to Gaussian noise is
C = W log(1 + S/N)
- C: channel capacity in bits per second; W: bandwidth of the channel in hertz; S/N: signal-to-noise ratio
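For concreteness, a one-line evaluation of the formula (the 3 kHz / 30 dB telephone-channel example is ours, not from the slides):

```python
import math

def awgn_capacity(W: float, snr: float) -> float:
    """C = W log2(1 + S/N) in bits per second."""
    return W * math.log2(1.0 + snr)

# A 3 kHz channel with S/N = 1000 (30 dB) carries about 29.9 kbit/s.
print(awgn_capacity(3000, 1000))
```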
Clos Network with Deflection Routing
- Route the packets in C(n,n,k) and C(k,k,n) alternately
- Encoding output port addresses in C(n,n,k): destination D = nQ₁ + R₁, output module number Q₁ = ⌊D/n⌋, output port number R₁ = D mod n
- Encoding output port addresses in C(k,k,n): destination D = kQ₂ + R₂, output module number Q₂ = ⌊D/k⌋, output port number R₂ = D mod k
- Routing tag = (Q₁, R₁, Q₂, R₂)
[Figure: cascade of C(n,n,k) and C(k,k,n) with k×k, n×n, n×n, k×k stages]
Loss Probability versus Network Length
- The loss probability of the deflection Clos network is an exponential function of the network length L:
P_loss ≤ c·a^{−L}, with c ≈ 1.2906 and a ≈ 1.4285 for ρ = 1
Shannon's Noisy Channel Coding Theorem
Given a noisy channel with information capacity C and information transmitted at rate R
If R<C, there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small.
If R>C, the probability of error at the receiver increases without bound.
Binary Symmetric Channel
- The binary symmetric channel (BSC) with cross probability q = 1 − p < ½ has capacity
C = 1 + p log p + (1 − p) log(1 − p)
- There exist encoding and decoding functions E: {0,1}^k → {0,1}^n and D: {0,1}^n → {0,1}^k
- If the rate R = k/n = C − δ for some δ > 0, the error probability is bounded by P_e ≤ c·a^{−n}, with c ≥ 1, a > 1
- If R = k/n = C + δ for some δ > 0, the error probability is unbounded
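The BSC capacity is a one-line formula, C = 1 − H(q). A sketch written in terms of the cross probability q, with the endpoint cases guarded:

```python
import math

def bsc_capacity(q: float) -> float:
    """Capacity C = 1 - H(q) of a binary symmetric channel with cross
    probability q; equivalently 1 + p log p + (1-p) log(1-p), p = 1-q."""
    if q in (0.0, 1.0):
        return 1.0  # H(0) = H(1) = 0: a deterministic channel
    return 1.0 + q * math.log2(q) + (1.0 - q) * math.log2(1.0 - q)

print(bsc_capacity(0.0))   # noiseless: C = 1
print(bsc_capacity(0.5))   # useless channel: C = 0
print(bsc_capacity(0.11))  # about 0.5
```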
Parallels Between Noise and Contention
- Binary Symmetric Channel ↔ Deflection Clos Network
- Cross probability q < ½ ↔ Deflection probability q < ½
- Random coding ↔ Deflection routing
- R ≤ C ↔ R ≤ n
- Exponential error probability P_e ≤ c·a^{−n} ↔ Exponential loss probability P_loss ≤ c·a^{−L}
- Complexity increases with code length n ↔ Complexity increases with network length L
- Typical set decoding ↔ Equivalent set of outputs
Edge Coloring of Bipartite Graph
- A regular bipartite graph G with vertex degree m satisfies Hall's condition:
Let A ⊆ V_I be a set of inputs and N_A = {b | (a,b) ∈ E, a ∈ A}. Since the edges terminating on vertices in A must terminate on N_A at the other end, m|N_A| ≥ m|A|, so |N_A| ≥ |A|.
- Example: the degree-3 edge-multiplicity matrix
1 2 0 0
1 0 1 1
0 1 1 1
1 0 1 1
decomposes into three permutation matrices (colors):
0 1 0 0   0 1 0 0   1 0 0 0
1 0 0 0   0 0 1 0   0 0 0 1
0 0 1 0   0 0 0 1   0 1 0 0
0 0 0 1   1 0 0 0   0 0 1 0
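A decomposition like the one above can be reproduced by repeatedly extracting a perfect matching; Hall's condition guarantees one exists at every step. A sketch using Kuhn's augmenting-path algorithm (function names are ours):

```python
def perfect_matching(A):
    """Find a perfect matching on the positive entries of a square
    matrix A via augmenting paths. Returns perm with perm[row] = col."""
    n = len(A)
    match = [-1] * n  # column -> matched row

    def augment(r, seen):
        for c in range(n):
            if A[r][c] > 0 and not seen[c]:
                seen[c] = True
                if match[c] == -1 or augment(match[c], seen):
                    match[c] = r
                    return True
        return False

    for r in range(n):
        if not augment(r, [False] * n):
            raise ValueError("no perfect matching")
    perm = [-1] * n
    for c, r in enumerate(match):
        perm[r] = c
    return perm

def edge_color(A, m):
    """Decompose an integer matrix with all row/column sums m into m
    permutations (color classes); each extraction keeps the remaining
    graph regular, so Hall's condition continues to hold."""
    A = [row[:] for row in A]
    colors = []
    for _ in range(m):
        perm = perfect_matching(A)
        for r, c in enumerate(perm):
            A[r][c] -= 1
        colors.append(perm)
    return colors

A = [[1, 2, 0, 0],
     [1, 0, 1, 1],
     [0, 1, 1, 1],
     [1, 0, 1, 1]]
print(edge_color(A, 3))
```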
Route Assignment in Clos Network
Computation of the routing tag (G, Q, R), here with n = 2: Q = ⌊D/n⌋, R = D mod n
S = Input:           0 1 2 3 4 5 6 7
D = Output:          1 3 2 0 6 4 7 5
G = Central module:  0 2 0 2 2 1 0 2
Q = ⌊D/n⌋:           0 1 1 0 3 2 3 2
R = D mod n:         1 1 0 0 0 0 1 1
[Figure: the corresponding 8×8 Clos network with the assigned routes through central modules 0, 1, 2]
Rearrangeable Clos Network and Channel Coding Theorem
- (Slepian-Duguid) Every Clos network with m ≥ n is rearrangeably nonblocking: the bipartite graph with degree n can be edge-colored by m colors if m ≥ n, so there is a route assignment for any permutation {(i, π(i)) | i = 1, …, N}
- Shannon's noisy channel coding theorem: it is possible to transmit information without error up to a limit C
LDPC Codes
- Low-Density Parity Check codes (Gallager, 1960)
- Bipartite graph representation (Tanner, 1981)
- Approaching the Shannon limit (Richardson, 1999)
- V_L: n variables; V_R: m constraints
- Example with (x0, …, x7) = (0, 1, 0, 0, 1, 1, 0, 1):
x1 + x3 + x4 + x7 = 1, unsatisfied
x0 + x1 + x2 + x5 = 0, satisfied
x2 + x5 + x6 + x7 = 0, satisfied
x0 + x3 + x4 + x6 = 1, unsatisfied
- The codewords are closed under mod-2 addition (+)₂
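The four parity checks above can be evaluated as a syndrome computation. A small sketch (the matrix H simply lists which variables enter each check):

```python
# Each row of H picks out the variables of one parity-check constraint.
H = [[0, 1, 0, 1, 1, 0, 0, 1],   # x1 + x3 + x4 + x7
     [1, 1, 1, 0, 0, 1, 0, 0],   # x0 + x1 + x2 + x5
     [0, 0, 1, 0, 0, 1, 1, 1],   # x2 + x5 + x6 + x7
     [1, 0, 0, 1, 1, 0, 1, 0]]   # x0 + x3 + x4 + x6

def syndrome(H, x):
    """Mod-2 parity checks: 0 = satisfied, 1 = unsatisfied."""
    return [sum(h * b for h, b in zip(row, x)) % 2 for row in H]

x = [0, 1, 0, 0, 1, 1, 0, 1]
print(syndrome(H, x))  # [1, 0, 0, 1]: checks 1 and 4 unsatisfied
```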
Benes Network
[Figure: 8×8 Benes network and the bipartite graph G(V_L × V_R, E) of call requests; each call i is represented by a binary variable x_i]
- Input module constraints: x1 + x2 = 1, x3 + x4 = 1, x5 + x6 = 1, x7 + x8 = 1
- Output module constraints: x1 + x3 = 1, x6 + x8 = 1, x4 + x7 = 1, x2 + x5 = 1
- A module constraint is satisfied if its mod-2 sum is 1, unsatisfied if 0
- Unlike a parity-check code, the solution set is not closed under +
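For an instance this small, the constraints can be solved by brute force. A sketch (it also shows the solution set is just a pair of complementary assignments, not a linear space):

```python
from itertools import product

# Each pair lists the (1-based) variables whose mod-2 sum must be 1.
constraints = [(1, 2), (3, 4), (5, 6), (7, 8),   # input modules
               (1, 3), (6, 8), (4, 7), (2, 5)]   # output modules

solutions = [x for x in product((0, 1), repeat=8)
             if all((x[a - 1] + x[b - 1]) % 2 == 1 for a, b in constraints)]
print(solutions)
```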
Flip Algorithm
- Assign x1 = 0, x2 = 1, x3 = 0, x4 = 1, … to satisfy all input module constraints initially
- Unsatisfied vertices divide each cycle into segments; label them α and β alternately and flip the values of all variables in the α segments
- Initial assignment (x1, …, x8) = (0, 1, 0, 1, 0, 1, 0, 1):
input module constraints x1+x2 = 1, x3+x4 = 1, x5+x6 = 1, x7+x8 = 1 are satisfied;
output module constraints x4+x7 = 1 and x2+x5 = 1 are satisfied, but x1+x3 = 0 and x6+x8 = 0 are unsatisfied
Bipartite Matching and Route Assignments
[Figure: call requests of the Benes network; the assignment (x1, …, x8) = (0, 1, 1, 0, 0, 1, 1, 0) splits the calls into two matchings, one per subnetwork: bipartite matching is equivalent to edge coloring]
Contents
Introduction
Routing and Channel Coding
Scheduling and Source Coding
Concept of Path Switching
- Traffic signal at a crossroad: use predetermined conflict-free states in a cyclic manner
- The duration of each state in a cycle is determined by the traffic loading
- Distributed control
- Traffic loading: NS: 2ρ, EW: ρ; the cycle allocates NS traffic and EW traffic accordingly
- Call requests for a 9×9 Clos network:
inputs:  0 1 2 3 4 5 6 7 8
outputs: 2 4 1 7 5 3 8 6 0
Connection Matrix
[Figure: connection matrices of the 9×9 Clos network realizing the call requests in time slot 1 and time slot 2]
Path Switching of Clos Network
Capacity of Virtual Path
- Capacity equals the average number of edges
- Time slot 0 (G1): e_ij(1) =
2 0 1
1 2 0
0 1 2
- Time slot 1 (G2): e_ij(2) =
1 1 1
0 2 1
2 0 1
- Virtual path (G1 ∪ G2): c_ij = [e_ij(1) + e_ij(2)]/2 =
1.5 0.5 1
0.5 2   0.5
1   0.5 1.5
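The capacity matrix is just the time average of the two connection matrices; a quick check in plain Python:

```python
e1 = [[2, 0, 1],
      [1, 2, 0],
      [0, 1, 2]]
e2 = [[1, 1, 1],
      [0, 2, 1],
      [2, 0, 1]]

# Capacity of the virtual path = average number of edges per time slot.
c = [[(a + b) / 2 for a, b in zip(r1, r2)] for r1, r2 in zip(e1, e2)]
print(c)
```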
Contention-free Clos Network
[Figure: input modules are input-queued switches with buffers and schedulers, central modules are nonblocking switches with a predetermined connection pattern in every time slot, and output modules are output-queued switches with output buffers]
- A virtual path with rate λ_ij connects the source (input module i) to the destination (output module j)
- Scheduling to combat channel noise; buffering to combat source noise
Complexity Reduction of Permutation Space
- Reduce the complexity of the permutation space from N! to K:
C = Σ_{i=1}^{K} φ_i P_i, with Σ_{i=1}^{K} φ_i = 1
- C lies in the convex hull of doubly stochastic matrices, in the subspace spanned by the K base states {P_i}
- K ≤ min{F, N² − 2N + 2}, the base dimension of C
BvN Capacity Decomposition and Sampling Theorems
Packet switching vs. digital transmission:
- Network environment: time-slotted switching system ↔ time-slotted transmission system
- Bandwidth limitation: capacity-limited traffic matrix
Σ_{i=1}^{N} c_ij ≤ m, Σ_{j=1}^{N} c_ij ≤ m
↔ bandwidth-limited signal function
f(t) = (1/2π) ∫_{−2πW}^{2πW} F(ω) e^{jωt} dω, with F(ω) = 0 for |ω| ≥ 2πW
- Samples: complete matchings, (0,1) permutation matrices ↔ entropy, (0,1) binary sequences
- Expansion: Birkhoff decomposition (Hall's matching theorem)
C = Σ_{n=1}^{K} φ_n P_n, where F = l.c.m. of the denominators of the c_ij
↔ Fourier series (sampling expansion)
f(t) = Σ_n f(n/2W) · sin π(2Wt − n)/[π(2Wt − n)], sampling interval T = 1/2W
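The transmission-side expansion can be demonstrated with truncated sinc interpolation. A sketch with our own example (a 1 Hz sine oversampled at 4 Hz, i.e. W = 2 and T = 0.25); truncating to finitely many samples leaves a small truncation error:

```python
import math

def reconstruct(samples, T, t):
    """Shannon interpolation f(t) ~= sum_n f(nT) sinc((t - nT)/T),
    truncated to the available samples."""
    total = 0.0
    for n, fn in samples.items():
        u = math.pi * (t / T - n)
        total += fn * (1.0 if abs(u) < 1e-12 else math.sin(u) / u)
    return total

T = 0.25  # sampling interval 1/(2W) with W = 2 Hz
samples = {n: math.sin(2 * math.pi * n * T) for n in range(-2000, 2001)}
t = 0.1
print(reconstruct(samples, T, t), math.sin(2 * math.pi * t))
```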
BvN Capacity Decomposition and Sampling Theorems (cont'd)
- Inversion: reconstruction of the capacity by a weighted running sum of the samples ↔ reconstruction of the signal by interpolation
- Complexity reduction: reduce the number of permutations from N! to O(N²); to O(N) if bandwidth is limited; to a constant F if a truncation error of order O(1/F) is acceptable ↔ reduce the infinite-dimensional signal space to a finite number 2tW of dimensions in any duration t
- QoS: buffering and scheduling, capacity guarantee, delay bound ↔ pulse code modulation (PCM), error-correcting codes, data compression, DSP
Source Coding and Scheduling
- Source coding: a mapping from the code book to source symbols to reduce redundancy
- Scheduling: a mapping from predetermined connection patterns to incoming packets to reduce delay jitter

Smoothness of Scheduling
- Scheduling of the set of permutation matrices generated by the decomposition C = Σ_{i=1}^{K} φ_i P_i with frame size F
- The sequence x_1^{(i)}, x_2^{(i)}, …, x_{n_i}^{(i)} of inter-state distances of state P_i within a period of F satisfies x_1^{(i)} + x_2^{(i)} + ⋯ + x_{n_i}^{(i)} = F
- Smoothness of state P_i: L_i = (1/n_i) Σ_{k=1}^{n_i} log x_k^{(i)} ≤ log E[x^{(i)}]
Entropy of Decomposition and Smoothness of Scheduling
- Any scheduling of the capacity decomposition C = Σ_{i=1}^{K} φ_i P_i satisfies Σ_{i=1}^{K} 2^{−L_i} ≤ 1 (Kraft's inequality)
- Entropy inequality: H = Σ_{i=1}^{K} φ_i log(1/φ_i) ≤ L = Σ_{i=1}^{K} φ_i L_i
- The equality holds when x_k^{(i)} = 1/φ_i for all k
Smoothness of Scheduling
- A special case: if K = F, φ_i = 1/F, and n_i = 1 for all i, then x_1^{(i)} = F for all i = 1, …, F, and
L = Σ_{i=1}^{F} (1/F) log F = log F = H
- Another example: the input set φ_i = {1/2, 1/4, 1/8, 1/8}, with K = 4, F = 8, H = 1.75, and n_i = {4, 2, 1, 1}
- The expected optimal schedule P1 P2 P1 P3 P1 P2 P1 P4 has inter-state distances x^{(i)} = {2, 4, 8, 8} and L_i = {1, 2, 3, 3}, so
L = (1/2)·1 + (1/4)·2 + (1/8)·3 + (1/8)·3 = 1.75 = H
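The worked example can be verified mechanically. A sketch computing each L_i from the cyclic inter-state distances of the schedule:

```python
import math

phi = {"P1": 0.5, "P2": 0.25, "P3": 0.125, "P4": 0.125}
schedule = ["P1", "P2", "P1", "P3", "P1", "P2", "P1", "P4"]  # frame F = 8
F = len(schedule)

def smoothness(schedule, state):
    """L_i: average log2 of the cyclic inter-state distances."""
    pos = [t for t, s in enumerate(schedule) if s == state]
    dist = [(pos[(k + 1) % len(pos)] - pos[k]) % F or F
            for k in range((len(pos)))]
    return sum(math.log2(x) for x in dist) / len(dist)

L = sum(phi[s] * smoothness(schedule, s) for s in phi)
H = sum(p * math.log2(1 / p) for p in phi.values())
print(L, H)  # both 1.75: this schedule meets the entropy bound
```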
Optimal Smoothness of Scheduling
- Smoothness of random scheduling: L = Σ_{i=1}^{K} φ_i L_i exceeds the entropy H by a Kullback-Leibler distance, which reaches its maximum when φ_1 = φ_2 = ⋯ = φ_K = 1/K
- It is always possible to devise a scheduling within 1/2 of the entropy (by Kraft's inequality):
H ≤ L ≤ H + 1/2
Source Coding Theorem
- Necessary and sufficient condition to prefix-encode values x_1, x_2, …, x_N of X with respective lengths n_1, n_2, …, n_N:
Σ_{i=1}^{N} 2^{−n_i} ≤ 1 (Kraft's inequality)
- Any prefix code that assigns n_i bits to x_i satisfies
L = Σ_{i=1}^{N} n_i p(x_i) ≥ Σ_{i=1}^{N} p(x_i) log(1/p(x_i)) = H(X)
- It is always possible to devise a prefix code within 1 of the entropy: H(X) ≤ L ≤ H(X) + 1
Huffman Round Robin (HuRR) Algorithm
- Step 1: Initially let the root be the temporary node Px, and let S = Px…Px be the temporary sequence.
- Step 2: Apply WFQ to the two successors of Px to produce a sequence T, and substitute T for the subsequence Px…Px of S.
- Step 3: If there is no intermediate node in the sequence S, terminate the algorithm; otherwise select an intermediate node Px appearing in S and go to Step 2.
[Figure: Huffman tree with leaf probabilities P1 = 0.5, P2 = P3 = P4 = P5 = 0.125 and intermediate nodes PX = PY = 0.25, PZ = 0.5; successive substitutions refine the sequence down to the leaves]
- Huffman code: P1, P2, P3, P4, P5 = 0, 100, 101, 110, 111
- Logarithm of the inter-state time = length of the Huffman code
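The code lengths for this example follow from a standard Huffman construction. A sketch using Python's heapq (the tie-breaking counter is our addition, needed so equal probabilities compare cleanly):

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Code length assigned to each symbol by a Huffman code."""
    tick = count()  # tie-breaker so the heap never compares lists
    heap = [(p, next(tick), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    length = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # merging deepens every symbol below by 1
            length[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tick), s1 + s2))
    return length

# P1..P5 from the example: code 0, 100, 101, 110, 111.
print(huffman_lengths([0.5, 0.125, 0.125, 0.125, 0.125]))
```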
Performance of Scheduling Algorithms
P1 P2 P3 P4 Random WFQ WF2Q HuRR Entropy
0.1 0.1 0.1 0.7 1.628 1.575 1.414 1.414 1.357
0.1 0.1 0.2 0.6 1.894 1.734 1.626 1.604 1.571
0.1 0.1 0.3 0.5 2.040 1.784 1.724 1.702 1.686
0.1 0.2 0.2 0.5 2.123 1.882 1.801 1.772 1.761
0.1 0.1 0.4 0.4 2.086 1.787 1.745 1.745 1.722
0.1 0.2 0.3 0.4 2.229 1.903 1.903 1.884 1.847
0.2 0.2 0.2 0.4 2.312 2.011 1.980 1.933 1.922
0.1 0.3 0.3 0.3 2.286 1.908 1.908 1.908 1.896
0.2 0.2 0.3 0.3 2.370 2.016 2.016 1.980 1.971
Better Performance
Routing vs. Coding
Clos network ↔ transmission channel:
- Random routing ↔ noisy channel capacity theorem
- Deflection routing ↔ noisy channel coding theorem
- Route assignment ↔ error-correcting code
- BvN decomposition ↔ sampling theorem
- Path switching ↔ noiseless channel
- Scheduling ↔ noiseless coding theorem
Transmission-Switching Duality
[Figure: duality diagram linking the Boltzmann equation S = k log W and the entropy H = −Σ_i P_i log P_i; the Clos network corresponds to the noisy channel, route assignment (Hall's matching theorem, BvN decomposition) to channel coding, and scheduling and buffering (bandlimited sampling theorem) to source coding; the permutation matrix plays the role of the communication system]
Law of Probability
- The input signal to a transmission channel is a function of time; the main theorem on noisy channel coding is proved by the law of large numbers
- The input signal to a switch is a function of space; both theorems, on deflection routing and on smoothness of scheduling, are proved by randomness
Thank You!