The Gallager Converse
Abbas El Gamal
Director, Information Systems Laboratory
Department of Electrical Engineering
Stanford University
Gallager’s 75th Birthday 1
Information Theoretic Limits
• Establishing information theoretic limits, e.g., channel capacity, requires:
◦ Finding a single-letter expression for the limit
◦ Proving achievability
◦ Proving a converse
• While achievability tells us about how to improve system design,
converse is necessary to prove optimality
• Proving a converse is typically harder and there are very few tools
available, e.g., Fano’s inequality, data processing inequality, convexity
• Gallager’s identification of the auxiliary random variable:
◦ Crucial step in proof of the degraded broadcast channel converse
◦ Has been used in the proof of almost all subsequent converses in
multi-user information theory
◦ I used it many times in my papers
Broadcast Channel
T. M. Cover, “Broadcast Channels,” IEEE Trans. Info. Theory,
vol. IT-18, pp. 2-14, Jan. 1972
• Discrete memoryless (DM) broadcast channel (X , p(y1, y2|x),Y1,Y2):
[Figure: encoder maps (W1, W2) to X^n; the channel p(y1, y2|x) outputs Y_1^n to decoder 1 and Y_2^n to decoder 2, which produce the estimates Ŵ1 and Ŵ2]
• Send independent messages Wj ∈ [1, 2^{nRj}] to receiver j = 1, 2
• Average probability of error: P_e^{(n)} = P{Ŵ1 ≠ W1 or Ŵ2 ≠ W2}
• (R1, R2) achievable if there exists a sequence of codes with P_e^{(n)} → 0
• The capacity region is the closure of the set of achievable rates
Example 1: Binary Symmetric BC
• Assume that p1 ≤ p2 < 1/2; H(a), a ∈ [0, 1], is the binary entropy function
[Figure: X corrupted by additive noise Z1 ∼ Bern(p1) and Z2 ∼ Bern(p2), producing Y1 and Y2]
[Figure: the (R1, R2) region, with corner points (1 − H(p1), 0) and (0, 1 − H(p2)) joined by the time-sharing line]
Example 1: Binary Symmetric BC
• Assume that p1 ≤ p2 < 1/2
[Figure: the codeword space {0, 1}^n partitioned into clouds of radius ≈ nβ, each with a cloud center U^n and satellite codewords X^n]
• Superposition coding: Use random coding. Let 0 ≤ β ≤ 1/2;
U ∼ Bern(1/2) and X′ ∼ Bern(β) independent, X = U ⊕ X′
◦ Generate 2^{nR2} i.i.d. U^n(w2), w2 ∈ [1, 2^{nR2}] (cloud centers)
◦ Generate 2^{nR1} i.i.d. X′^n(w1), w1 ∈ [1, 2^{nR1}]
◦ Send X^n(w1, w2) = U^n(w2) ⊕ X′^n(w1) (satellite codeword)
Example 1: Binary Symmetric BC
• Assume that p1 ≤ p2 < 1/2
[Figure: the (R1, R2) region, with corner points (C1, 0) and (0, C2) joined by the time-sharing line]
• Decoding:
◦ Decoder 2 decodes U^n(W2) ⇒ R2 < 1 − H(β ∗ p2)
◦ Decoder 1 first decodes U^n(W2), subtracts it off, then decodes
X′^n(W1) ⇒ R1 < H(β ∗ p1) − H(p1)
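The two decoding rates are easy to evaluate numerically; a minimal sketch (my own illustration, not from the talk) that traces the boundary of the superposition region as β sweeps from 0 to 1/2:

```python
import math

def Hb(a):
    """Binary entropy function H(a) in bits."""
    if a in (0.0, 1.0):
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

def conv(a, b):
    """Binary convolution a * b = a(1 - b) + b(1 - a)."""
    return a * (1 - b) + b * (1 - a)

def bsc_bc_rates(beta, p1, p2):
    """Superposition-coding rate pair for the binary symmetric BC."""
    R2 = 1 - Hb(conv(beta, p2))       # decoder 2 decodes the cloud center
    R1 = Hb(conv(beta, p1)) - Hb(p1)  # decoder 1 then decodes the satellite
    return R1, R2

# Sweep beta to trace the boundary of the achievable region
for beta in [0.0, 0.1, 0.25, 0.5]:
    R1, R2 = bsc_bc_rates(beta, p1=0.1, p2=0.2)
    print(f"beta={beta:.2f}: R1={R1:.3f}, R2={R2:.3f}")
```

At β = 0 all of the rate goes to receiver 2 (R2 = 1 − H(p2)), and at β = 1/2 all of it goes to receiver 1 (R1 = 1 − H(p1)), recovering the two corner points of the region.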
Example 2: AWGN BC
• Assume N1 ≤ N2, average power constraint P on X
[Figure: X corrupted by additive noise Z1 ∼ N(0, N1) and Z2 ∼ N(0, N2), producing Y1 and Y2]
• Superposition coding: Let α ∈ [0, 1], U ∼ N(0, (1 − α)P),
X′ ∼ N(0, αP) independent, X = U + X′
◦ Decoder 2 decodes the cloud center ⇒ R2 ≤ C((1 − α)P/(αP + N2))
◦ Decoder 1 decodes the cloud center, subtracts it off, then decodes the
satellite codeword ⇒ R1 ≤ C(αP/N1)
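As a quick sanity check (an illustration of the power split, not part of the talk), the rate pair can be swept over α, with the Gaussian capacity function C(x) = (1/2) log2(1 + x):

```python
import math

def C(snr):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x) bits/transmission."""
    return 0.5 * math.log2(1 + snr)

def awgn_bc_rates(alpha, P, N1, N2):
    """Superposition-coding rate pair for the AWGN BC (assumes N1 <= N2)."""
    R2 = C((1 - alpha) * P / (alpha * P + N2))  # cloud center: sees satellite as noise
    R1 = C(alpha * P / N1)                      # satellite: cloud center subtracted off
    return R1, R2

# Sweep the power split between the cloud center and the satellite
for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    R1, R2 = awgn_bc_rates(alpha, P=10.0, N1=1.0, N2=4.0)
    print(f"alpha={alpha:.2f}: R1={R1:.3f}, R2={R2:.3f}")
```

The endpoints α = 0 and α = 1 give the single-user capacities C(P/N2) and C(P/N1), respectively.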
Degraded Broadcast Channel
• Stochastically degraded BC [Cover 1972]: There exists p(y2|y1) such
that
p(y2|x) = ∑_{y1} p(y1|x) p(y2|y1)
Special case of channel inclusion in:
C. E. Shannon, “A Note on a Partial Ordering for Communication
Channels,” Information and Control, vol. 1, pp. 390-397, 1958
• Physically degraded version [Bergmans 73]:
[Figure: X → p(y1|x) → Y1 → p(y2|y1) → Y2]
• Since the capacity region of any BC depends only on marginals p(y1|x),
p(y2|x) ⇒ The capacity region of the degraded broadcast channel is the
same as that of the physically degraded version
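Stochastic degradedness is a concrete matrix identity, and for the binary symmetric BC it can be verified numerically. A sketch of mine (not from the talk), with the cascade crossover q solved from p1 ∗ q = p2:

```python
import numpy as np

def bsc(p):
    """Transition matrix of a BSC with crossover probability p (rows: x, cols: y)."""
    return np.array([[1 - p, p], [p, 1 - p]])

p1, p2 = 0.1, 0.2
# Crossover q of the cascade channel p(y2|y1) that degrades Y1 into Y2:
# p1 * q = p1(1 - q) + q(1 - p1) = p2  =>  q = (p2 - p1) / (1 - 2 p1)
q = (p2 - p1) / (1 - 2 * p1)

lhs = bsc(p2)           # p(y2|x) directly
rhs = bsc(p1) @ bsc(q)  # sum over y1 of p(y1|x) p(y2|y1)
print(np.allclose(lhs, rhs))  # the BSC BC is stochastically degraded
```

The same matrix-product check applies to any pair of discrete channel matrices, which is how one would test degradedness in general.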
Degraded Broadcast Channel
• Cover conjectured that the capacity region is the set of all (R1, R2) such that
R2 ≤ I(U; Y2),
R1 ≤ I(X; Y1|U),
for some p(u, x)
• First time an auxiliary random variable was used in characterizing an
information theoretic limit
Achievability
P. P. Bergmans, “Random Coding Theorem for Broadcast Channels
with Degraded Components,” IEEE Trans. Info. Theory, vol. IT-19,
pp. 197-207, Mar. 1973
• Use superposition coding: Fix p(u)p(x|u)
[Figure: cloud centers U^n, each surrounded by satellite codewords X^n]
◦ Decoder 2 decodes the cloud center U^n ⇒ R2 ≤ I(U; Y2)
◦ Decoder 1 first decodes the cloud center, then the
satellite codeword X^n ⇒ R1 ≤ I(X; Y1|U)
The Converse
A. Wyner, “A Theorem on the Entropy of Certain Binary Sequences
and Applications: Part II,” IEEE Trans. Info. Theory, vol. IT-19,
pp. 772-777, Nov. 1973
◦ Proved weak converse for binary symmetric broadcast channel
(used Mrs. Gerber’s Lemma)
P.P. Bergmans, “A Simple Converse for Broadcast Channels with
Additive White Gaussian Noise (Corresp.),” IEEE Trans. Info. Theory,
vol. IT-20, pp. 279-280, Mar. 1974
◦ Proved the converse for the AWGN BC (used the Entropy Power Inequality, very similar to Wyner’s proof for the binary symmetric case)
R. Gallager, “Capacity and Coding for Degraded Broadcast Channels,”
Problemy Peredachi Informatsii, vol. 10, no. 3, pp. 3-14, July-Sept 1974
◦ Proved the weak converse for the discrete-memoryless degraded
BC — identification of auxiliary random variable
◦ Established bound on cardinality of the auxiliary random variable
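Mrs. Gerber’s Lemma states that f(v) = H(H^{-1}(v) ∗ p) is convex in v, which is the key to Wyner’s converse. A quick numerical check of this convexity (my own sketch, not from the talk):

```python
import math

def Hb(a):
    """Binary entropy function in bits."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

def Hb_inv(v):
    """Inverse of Hb on [0, 1/2], by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if Hb(mid) < v:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def conv(a, b):
    """Binary convolution a * b = a(1 - b) + b(1 - a)."""
    return a * (1 - b) + b * (1 - a)

p = 0.1
f = [Hb(conv(Hb_inv(v / 100), p)) for v in range(101)]
# Convexity <=> nonnegative second differences (up to numerical error)
second_diffs = [f[i - 1] - 2 * f[i] + f[i + 1] for i in range(1, 100)]
print(min(second_diffs))  # nonnegative, as Mrs. Gerber's Lemma predicts
```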
Weak Converse for Shannon Channel Capacity
• Shannon capacity: C = max_{p(x)} I(X; Y) (only uses “channel” variables)
• Weak converse: Show that for any sequence of (n, R) codes with
P_e^{(n)} → 0, R ≤ C
◦ For each code form the empirical joint pmf:
(W, X^n, Y^n) ∼ p(w) p(x^n|w) ∏_{i=1}^n p(y_i|x_i)
◦ Fano’s inequality: H(W|Y^n) ≤ nR·P_e^{(n)} + H(P_e^{(n)}) = nε_n
◦ Now consider
nR = I(W; Y^n) + H(W|Y^n)
   ≤ I(W; Y^n) + nε_n
   ≤ I(X^n; Y^n) + nε_n                 (data processing inequality)
   = H(Y^n) − H(Y^n|X^n) + nε_n
   ≤ ∑_{i=1}^n I(X_i; Y_i) + nε_n       (convexity, memorylessness)
   ≤ n(C + ε_n)
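The single-letter quantity C = max_{p(x)} I(X; Y) is also computable. A minimal Blahut-Arimoto sketch (a standard algorithm, not part of the talk), checked against the BSC where C = 1 − H(p) in closed form:

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10000):
    """Compute C = max_{p(x)} I(X; Y) (bits) for channel matrix W[x, y] = p(y|x)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)  # start from the uniform input distribution
    for _ in range(max_iter):
        q = p @ W              # induced output distribution
        # d[x] = D(W(.|x) || q), divergence of each channel row from the output
        ratio = np.divide(W, q, out=np.ones_like(W), where=W > 0)
        d = (W * np.log2(ratio)).sum(axis=1)
        p_new = p * 2.0 ** d   # multiplicative Blahut-Arimoto update
        p_new /= p_new.sum()
        if np.abs(p_new - p).max() < tol:
            p = p_new
            break
        p = p_new
    q = p @ W
    ratio = np.divide(W, q, out=np.ones_like(W), where=W > 0)
    d = (W * np.log2(ratio)).sum(axis=1)
    return float(p @ d), p     # at the optimum, C = sum_x p(x) D(W(.|x) || q)

# Sanity check against the BSC with crossover 0.1
W = np.array([[0.9, 0.1], [0.1, 0.9]])
cap, p_opt = blahut_arimoto(W)
print(cap)  # approximately 1 - H(0.1)
```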
Gallager Converse
• Show that given a sequence of (n, R1, R2) codes with P_e^{(n)} → 0,
R1 ≤ I(X; Y1|U), R2 ≤ I(U; Y2) for some p(u, x) such that
U → X → (Y1, Y2)
• Key is identifying U
◦ The converses for the binary symmetric and AWGN BCs are direct and
do not explicitly identify U
• As before, by Fano’s inequality
H(W1|Y_1^n) ≤ nR1·P_e^{(n)} + 1 = nε_{1n}
H(W2|Y_2^n) ≤ nR2·P_e^{(n)} + 1 = nε_{2n}
So, we have
nR1 ≤ I(W1; Y_1^n) + nε_{1n},
nR2 ≤ I(W2; Y_2^n) + nε_{2n}
• A first attempt: U = W2 (satisfies U → X_i → (Y_{1i}, Y_{2i}))
I(W1; Y_1^n) ≤ I(W1; Y_1^n|W2)
            = I(W1; Y_1^n|U)
            = ∑_{i=1}^n I(W1; Y_{1i}|U, Y_1^{i−1})                    (chain rule)
            ≤ ∑_{i=1}^n ( H(Y_{1i}|U) − H(Y_{1i}|U, Y_1^{i−1}, W1) )
            ≤ ∑_{i=1}^n ( H(Y_{1i}|U) − H(Y_{1i}|U, Y_1^{i−1}, W1, X_i) )
            = ∑_{i=1}^n I(X_i; Y_{1i}|U)
So far so good. Now let’s try the second inequality
I(W2; Y_2^n) = ∑_{i=1}^n I(W2; Y_{2i}|Y_2^{i−1}) = ∑_{i=1}^n I(U; Y_{2i}|Y_2^{i−1})
But I(U; Y_{2i}|Y_2^{i−1}) is not necessarily ≤ I(U; Y_{2i}), so U = W2 does not
work
• Gallager: OK, let’s try U_i = (W2, Y_1^{i−1})  (U_i → X_i → (Y_{1i}, Y_{2i})), so
I(W1; Y_1^n|W2) ≤ ∑_{i=1}^n I(X_i; Y_{1i}|U_i)
Now, let’s consider the other term
I(W2; Y_2^n) ≤ ∑_{i=1}^n I(W2, Y_2^{i−1}; Y_{2i})
            ≤ ∑_{i=1}^n I(W2, Y_2^{i−1}, Y_1^{i−1}; Y_{2i})
            = ∑_{i=1}^n ( H(Y_{2i}) − H(Y_{2i}|W2, Y_2^{i−1}, Y_1^{i−1}) )
But H(Y_{2i}|W2, Y_2^{i−1}, Y_1^{i−1}) is not in general equal to H(Y_{2i}|W2, Y_1^{i−1})
• Key insight: Since the capacity region is the same as that of the physically
degraded channel, we can assume X → Y1 → Y2, thus Y_2^{i−1} → (W2, Y_1^{i−1}) → Y_{2i}, and
I(W2; Y_2^n) ≤ ∑_{i=1}^n I(U_i; Y_{2i})    Eureka!
• Note: The proof also works with U_i = (W2, Y_2^{i−1}) or U_i = (W2, Y_1^{i−1}, Y_2^{i−1}) (both
satisfy U_i → X_i → (Y_{1i}, Y_{2i})). Let’s try U_i = (W2, Y_2^{i−1})
First consider
I(W2; Y_2^n) ≤ ∑_{i=1}^n I(W2, Y_2^{i−1}; Y_{2i}) = ∑_{i=1}^n I(U_i; Y_{2i})
Now consider
I(W1; Y_1^n|W2) = ∑_{i=1}^n I(W1; Y_{1i}|W2, Y_1^{i−1})
              ≤ ∑_{i=1}^n I(X_i; Y_{1i}|W2, Y_1^{i−1})
              = ∑_{i=1}^n ( H(Y_{1i}|W2, Y_1^{i−1}) − H(Y_{1i}|X_i) )
              = ∑_{i=1}^n ( H(Y_{1i}|W2, Y_1^{i−1}, Y_2^{i−1}) − H(Y_{1i}|X_i, U_i) )
              ≤ ∑_{i=1}^n ( H(Y_{1i}|W2, Y_2^{i−1}) − H(Y_{1i}|X_i, U_i) )
              = ∑_{i=1}^n I(X_i; Y_{1i}|U_i)
Thus we finally have
nR1 ≤ ∑_{i=1}^n I(X_i; Y_{1i}|U_i) + nε_{1n}
nR2 ≤ ∑_{i=1}^n I(U_i; Y_{2i}) + nε_{2n}
By convexity and letting n → ∞, it follows that there exists
U → X → Y1 → Y2 such that
R1 ≤ I(X; Y1|U),
R2 ≤ I(U; Y2)
• But there is still a problem here; U can have unbounded cardinality, so
this is not a computable expression
• Gallager then argues that |U| ≤ min{|X |, |Y1|, |Y2|} suffices
• Remark: Proof extends to more than 2 receivers
Subsequent Broadcast Channel Results
• Direct application of Gallager converse:
◦ Physically degraded BC with feedback [El Gamal 1978]
◦ Product and sum of degraded and reversely degraded BC
[Poltyrev 1977, El Gamal 1980]
• Gallager converse + Csiszár lemma:
◦ Less noisy [Korner, Marton 1975]: I(U; Y2) ≤ I(U; Y1) for all
U → X → (Y1, Y2)
◦ BC with degraded message sets [Korner, Marton 1977]
◦ More capable [El Gamal 1979] (I(X ; Y1) ≥ I(X ; Y2) for all p(x))
◦ Semi-deterministic BC [Gelfand-Pinsker 1980]
◦ Nair-El Gamal outer bound for BC [2006]
• All BC converses have used the Gallager converse
Degraded Broadcast Channel With Feedback
• Assume at time i: X_i = f_i(W1, W2, Y_1^{i−1}, Y_2^{i−1})
• Capacity region can now depend on joint pmf p(y1, y2|x) ⇒ capacity of
stochastically degraded BC with feedback not necessarily equal to that
of physically degraded counterpart
• My first result was showing that feedback doesn’t increase capacity of
physically degraded BC [El Gamal 78]
• Converse:
nR2 ≤ I(W2; Y_2^n) + nε_n
    ≤ ∑_{i=1}^n I(W2, Y_2^{i−1}; Y_{2i}) + nε_n
    ≤ ∑_{i=1}^n I(W2, Y_1^{i−1}, Y_2^{i−1}; Y_{2i}) + nε_n
But because of feedback, Y_2^{i−1}, (W2, Y_1^{i−1}), Y_{2i} do not necessarily form a
Markov chain, thus we cannot in general get rid of Y_2^{i−1}
Let U_i = (W2, Y_1^{i−1}, Y_2^{i−1})  (U_i → X_i → Y_{1i} → Y_{2i})
Now consider
nR1 ≤ I(W1; Y_1^n|W2) + nε_n
    ≤ I(W1; Y_1^n, Y_2^n|W2) + nε_n
    = ∑_{i=1}^n I(W1; Y_{1i}, Y_{2i}|W2, Y_1^{i−1}, Y_2^{i−1}) + nε_n
    = ∑_{i=1}^n I(W1; Y_{1i}, Y_{2i}|U_i) + nε_n
    ≤ ∑_{i=1}^n I(X_i; Y_{1i}, Y_{2i}|U_i) + nε_n
    = ∑_{i=1}^n I(X_i; Y_{1i}|U_i) + ∑_{i=1}^n I(X_i; Y_{2i}|Y_{1i}, U_i) + nε_n
    = ∑_{i=1}^n I(X_i; Y_{1i}|U_i) + nε_n    (since U_i → X_i → Y_{1i} → Y_{2i})
• In general, feedback can enlarge the capacity region of degraded BC
• Capacity region of degraded BC with feedback is not known
Broadcast Channel with Degraded Message Sets
• Consider a general DM broadcast channel (X , p(y1, y2|x),Y1,Y2)
• Common message W0 is to be sent to both Y1 and Y2, while message W1
is to be sent only to Y1
• Korner-Marton 77: The capacity region is given by the set C1 of all
(R0, R1) such that:
R0 ≤ I(U ; Y2)
R1 ≤ I(X ; Y1|U )
R0 + R1 ≤ I(X ; Y1), for some p(u, x)
• Achievability: Superposition coding
• Converse: Gallager-type converse doesn’t seem to work:
◦ The first inequality holds with either U_i = (W0, Y_2^{i−1}) or
U_i = (W0, Y_1^{i−1}, Y_2^{i−1}), but neither works for the second inequality
◦ The second inequality works with U_i = (W0, Y_1^{i−1}), but the first
doesn’t
BC with Degraded Message Sets
• A Gallager type converse works, but needs a new ingredient [El Gamal]
• Define the set C2 of all (R0, R1) such that:
R0 ≤ min {I(U ; Y1), I(U ; Y2)}
R0 + R1 ≤ min {I(X ; Y1), I(X ; Y1|U ) + I(U ; Y2)} , for some p(u, x)
• It is easy to show that C2 ⊆ C1
• Weak converse for C2: Define U_i = (W0, Y_1^{i−1}, Y_{2,i+1}^n) !!
• It is not difficult to show that
R0 ≤ (1/n) min{ ∑_{i=1}^n I(U_i; Y_{1i}), ∑_{i=1}^n I(U_i; Y_{2i}) } + ε_n,
R0 + R1 ≤ (1/n) ∑_{i=1}^n I(X_i; Y_{1i}) + ε_n
• The difficult step is to show that
R0 + R1 ≤ (1/n) ∑_{i=1}^n ( I(X_i; Y_{1i}|U_i) + I(U_i; Y_{2i}) ) + ε_n
• To show this, consider
n(R0 + R1) ≤ I(W1; Y_1^n|W0) + I(W0; Y_2^n) + nε_n
    = ∑_{i=1}^n ( I(W1; Y_{1i}|W0, Y_1^{i−1}) + I(W0; Y_{2i}|Y_{2,i+1}^n) ) + nε_n
    ≤ ∑_{i=1}^n ( I(W1, Y_{2,i+1}^n; Y_{1i}|W0, Y_1^{i−1}) + I(W0, Y_{2,i+1}^n; Y_{2i}) ) + nε_n
    ≤ ∑_{i=1}^n ( I(X_i; Y_{1i}|U_i) + I(Y_{2,i+1}^n; Y_{1i}|W0, Y_1^{i−1}) )
      + ∑_{i=1}^n ( I(U_i; Y_{2i}) − I(Y_1^{i−1}; Y_{2i}|W0, Y_{2,i+1}^n) ) + nε_n
• Now using the Csiszár lemma [Csiszár-Korner 1978]
∑_{i=1}^n I(Y_1^{i−1}; Y_{2i}|W0, Y_{2,i+1}^n) = ∑_{i=1}^n I(Y_{2,i+1}^n; Y_{1i}|W0, Y_1^{i−1}),
the second and fourth terms cancel and the converse is proved
• Note: This problem is open for more than two users
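The Csiszár sum identity holds for every joint pmf, which is easy to confirm numerically. An illustrative check of mine (not from the talk), here for n = 3 binary sequences without the W0 conditioning:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
# Arbitrary joint pmf over (Y_11,...,Y_1n, Y_21,...,Y_2n), all binary
p = rng.random((2,) * (2 * n))
p /= p.sum()

def H(axes):
    """Joint entropy (bits) of the variables on the given axes."""
    axes = set(axes)
    if not axes:
        return 0.0
    drop = tuple(i for i in range(2 * n) if i not in axes)
    m = p.sum(axis=drop).ravel()
    m = m[m > 0]
    return float(-(m * np.log2(m)).sum())

def I(X, Y, Z=()):
    """Conditional mutual information I(X; Y | Z) via joint entropies."""
    X, Y, Z = set(X), set(Y), set(Z)
    return H(X | Z) + H(Y | Z) - H(X | Y | Z) - H(Z)

Y1 = list(range(n))          # axes of Y_1^n
Y2 = list(range(n, 2 * n))   # axes of Y_2^n

# sum_i I(Y_1^{i-1}; Y_2i | Y_{2,i+1}^n) == sum_i I(Y_{2,i+1}^n; Y_1i | Y_1^{i-1})
lhs = sum(I(Y1[:i], [Y2[i]], Y2[i + 1:]) for i in range(n))
rhs = sum(I(Y2[i + 1:], [Y1[i]], Y1[:i]) for i in range(n))
print(abs(lhs - rhs) < 1e-9)  # the identity holds for any joint pmf
```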
Tightest Bounds on Capacity of Broadcast Channel
• Inner bound [Marton 79]: Set of rate triples (R0, R1, R2) such that
R0 ≤ min {I(W ; Y1), I(W ; Y2)} ,
R0 + R1 ≤ I(W,U ; Y1),
R0 + R2 ≤ I(W,V ; Y2),
R0 + R1 + R2 ≤ min{I(W, U; Y1) + I(V; Y2|W), I(U; Y1|W) + I(W, V; Y2)} − I(U; V|W),
for some p(w, u, v, x)
• Outer bound [Nair-El Gamal 2006]: Set of rate triples (R0, R1, R2) such that
R0 ≤ min {I(W ; Y1), I(W ; Y2)}
R0 + R1 ≤ I(W,U ; Y1)
R0 + R2 ≤ I(W,V ; Y2)
R0 + R1 + R2 ≤ min{I(W, U; Y1) + I(V; Y2|W, U), I(W, V; Y2) + I(U; Y1|W, V)},
for some p(u, v, w, x) = p(u)p(v)p(w|u, v)p(x|u, v, w)
Tightest Bounds for Independent Messages
• Inner bound: Set of all rate pairs (R1, R2) such that
R1 ≤ I(W, U ; Y1)
R2 ≤ I(W, V ; Y2)
R1 + R2 ≤ min{I(W, U; Y1) + I(V; Y2|W), I(W, V; Y2) + I(U; Y1|V)} − I(U; V|W),
for some p(u, v, w, x) = p(w)p(u, v|w)p(x|u, v, w)
• Outer bound: Set of all rate pairs (R1, R2) such that
R1 ≤ I(W, U ; Y1)
R2 ≤ I(W, V ; Y2)
R1 + R2 ≤ min{I(W, U ; Y1) + I(V ; Y2|W, U ), I(W, V ; Y2) + I(U ; Y1|W,V )},
for some joint distribution p(u, v, w, x) = p(u)p(v)p(w|u, v)p(x|u, v, w)
Broadcast Channel: Summary
• Marton region is tight for all classes of BC channels where capacity is
known and contains all other achievable rate regions
• Nair-El Gamal region is tight for all classes of BC channels where
capacity is known
◦ Uses the Gallager converse + Csiszár lemma
• Capacity of BC is still not known in general
◦ Is there a “single-letter” characterization?
◦ Does it use auxiliary random variables?
◦ If so, then the Gallager converse is again likely to play a key role
Gallager Converse: Other Multi-User Channels/Sources
• Lossless source coding with side information [Ahlswede-Korner 1975]
• Lossy source coding with side information [Wyner-Ziv 1976]
• Channel with state known at the encoder
[Gelfand-Pinsker 1980], uses the Gallager converse + Csiszár lemma
• MAC with partially cooperating encoders [Willems 1983]
• MAC with cribbing encoders [Willems-van der Meulen, 1985]
• Rate distortion when side information may be absent
[Heegard-Berger 1985]
• Multi-terminal source coding with one distortion criterion
[Berger-Yeung 1989]
• Relay without delay [El Gamal-Hassanpour 2005]
• and, many others . . .
Conclusion
• Gallager converse has had a huge impact on the development of
multi-user information theory
• Gallager converse has had a huge impact on my work
• But, Bob, what does the identification U_i = (W2, Y_1^{i−1}) really mean?