Interactive Communication Jie Ren 2012/8/14 ASPITRG, Drexel University


Page 1: interactive communication

Interactive Communication

Jie Ren

2012/8/14

ASPITRG, Drexel University

Page 2: interactive communication

Outline

Problem Setup (Two-Way Source Coding)

Interaction in Lossless Expected Length Case

Interaction in Zero-error Worst Length Case

Interaction in Function Computation

Page 3: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 4: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 5: interactive communication

Two-way Source Coding model

Two-terminal distributed source coding problem

Reconstruct X/Y on both sides

Alternating messages scheme (concurrent scheme)

Page 6: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 7: interactive communication

Why Interested in It?

Recall the Wyner-Ziv problem.

The question is: can interaction save more rate?

Page 8: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 9: interactive communication

Mathematical Description

A K-round scheme of two-way source coding

Round 1: the X codec starts by sending R_X1 bits,

then the Y codec replies with R_Y1 bits

The process repeats K times
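As a sketch, the alternating scheme above can be written as the loop below. The `enc_x`/`enc_y` callables are hypothetical stand-ins for the rate-constrained encoders; each sees its own source and the full transcript so far.

```python
# Sketch of a K-round alternating message-passing scheme. enc_x / enc_y
# are hypothetical stand-ins for the rate-constrained encoders; each one
# sees its own source together with all earlier messages.

def run_k_round_scheme(x, y, enc_x, enc_y, K):
    transcript = []                      # all messages exchanged so far
    for _ in range(K):
        z = enc_x(x, tuple(transcript))  # forward message Z_k (X to Y)
        transcript.append(z)
        w = enc_y(y, tuple(transcript))  # backward message W_k (Y to X)
        transcript.append(w)
    return transcript

# Toy run: each side sends a label, the round index, and a bit of its source.
transcript = run_k_round_scheme(
    x=5, y=3,
    enc_x=lambda x, past: ("Z", len(past) // 2 + 1, x % 2),
    enc_y=lambda y, past: ("W", len(past) // 2 + 1, y % 2),
    K=3)
```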

Page 10: interactive communication

Mathematical Description

An example of 3-round scheme

Page 11: interactive communication

Mathematical Description

Denote Z_k as the kth-round forward message (X to Y)

Denote W_k as the kth-round backward message (Y to X)

Page 12: interactive communication

Mathematical Description

The kth step of forward/backward message passing

Both the encoder and the decoder use all previous messages together with their own source

Page 13: interactive communication

Mathematical Description

X, Y, Z_{1:K}, W_{1:K} form Markov chains as follows:
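The chain diagrams did not survive extraction; under the model where each message is a function of the sender's source and all past messages, the chains take the following form (an assumed reconstruction):

```latex
\begin{aligned}
Z_k &\;\text{--}\; (X,\, Z_{1:k-1},\, W_{1:k-1}) \;\text{--}\; Y, \qquad k = 1,\dots,K \\
W_k &\;\text{--}\; (Y,\, Z_{1:k},\, W_{1:k-1}) \;\text{--}\; X, \qquad k = 1,\dots,K
\end{aligned}
```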

Page 14: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 15: interactive communication

Sum-rate-distortion Function

Page 16: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 17: interactive communication

Proof(Achievability)

Recall Achievability proof of Wyner-Ziv problem

Page 18: interactive communication

Proof(Achievability)

Recall Achievability proof of Wyner-Ziv problem

Joint Strong Typicality

“Bin” method

Encoder:

Decoder:

See figure in the next slide

Page 19: interactive communication

Proof(Achievability)

Page 20: interactive communication

Proof(Achievability)

Similar to Wyner-Ziv’s proof

A codebook tree instead of a single codebook

Page 21: interactive communication

Proof(Achievability)

Consider one single step of message passing

Page 22: interactive communication

Proof(Achievability)

The random variables X, Y, Z, W satisfy the Markov property

Can show

Page 23: interactive communication

Proof (Converse)

Recall converse proof of Wyner-Ziv problem

Page 24: interactive communication

Proof (Converse)

Recall converse proof of Wyner-Ziv problem

One can prove

By the convexity of mutual information

Page 25: interactive communication

Proof (Converse)

Given an achievable point s = (r_x, r_y, d_x, d_y), prove that

Page 26: interactive communication

Proof (Converse)

There exists a system

specified by the encoding functions

and decoding functions F, G satisfying

Page 27: interactive communication

Proof (Converse)

Can show:

Denote X^- = X_{1:i-1} and Y^+ = Y_{i+1:n}

Then can show:
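The displayed identity is lost in this transcript; the standard tool for single-letterizing such converses is the Csiszár sum identity, which states:

```latex
\sum_{i=1}^{n} I\!\left(Y_{i+1:n};\, X_i \mid X_{1:i-1}\right)
\;=\;
\sum_{i=1}^{n} I\!\left(X_{1:i-1};\, Y_i \mid Y_{i+1:n}\right)
```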

Page 28: interactive communication

Proof (Converse)

Define auxiliary random variables

We have

Page 29: interactive communication

Proof (Converse)

Prove:

(1)

(2)

Page 30: interactive communication

Outline

Problem Setup
  Two-way Source Coding Model
  Why Interested in It
  Mathematical Description
  Sum-Rate-Distortion Function
  Proof (Achievability and Converse)
  Problems Remain Open

Interaction in Lossless Expected Length Case
Interaction in Zero-error Worst Length Case
Interaction in Function Computation

Page 31: interactive communication

Problems Remain Open

1. Does interaction strictly improve the rate-distortion function?

2. Does unbounded K help?

3. The existence of an optimal K* s.t. K* < ∞

4. Zero-error worst-length case

5. Probability of block error for lossless reproduction

6. How many bits can we save?

7. Interaction in the function computation case

Page 32: interactive communication

Problems Remain Open

An example of interaction in zero-error case

Page 33: interactive communication

Problems Remain Open

An example of interaction in function computation

X ~ Uniform{1…L}, Y ~ Ber(p)

f_A(x, y) := 0, f_B(x, y) := xy

The benefit of interaction can be arbitrarily large

Page 34: interactive communication

Conclusion

Idea of Interaction

K-round Scheme of Two-Way Source Coding

Sum-Rate-Distortion Function

Achievability and Converse Proof

Some questions remain open

Page 35: interactive communication

Problems Remain Open

1. Does interaction strictly improves rate-distortion function?

2. Does an unbounded K helps?

3. The existence of an optimal K* s.t. K*<∞

4. Zero-error worst-length case

5. Probability of block error for lossless reproduction

6. How many bits we can save?

7. Interaction in function computation case

Page 36: interactive communication

Interaction Improves Rate-Distortion Function

Page 37: interactive communication

Interaction Improves Rate-Distortion Function

Page 38: interactive communication

Interaction Improves Rate-Distortion Function

Page 39: interactive communication

Interaction Improves Rate-Distortion Function

We have

Question:

Is the inequality strict?

Page 40: interactive communication

Interaction Improves Rate-Distortion Function

Key tool : rate reduction functionals

Definition:

Page 41: interactive communication

Interaction Improves Rate-Distortion Function

Lemma 1:

The following two conditions are equivalent

(1)

(2)

Page 42: interactive communication

Interaction Improves Rate-Distortion Function

Lemma 2: Let f(p) be a function differentiable around p=0 such that f(0)=0 and f’(0)>0. Then

This can be proved by l'Hôpital's rule
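The limit statement itself is missing from the transcript; given f(0) = 0 and differentiability around 0, the natural reading (an assumption here) is:

```latex
\lim_{p \to 0^{+}} \frac{f(p)}{p} \;=\; f'(0) \;>\; 0,
```

which follows by l'Hôpital's rule, or directly from the definition of the derivative.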

Page 43: interactive communication

Interaction Improves Rate-Distortion Function

Theorem 1: There exists a distortion function d, a joint distribution p_XY, and a distortion level D for which

Lemmas 1 and 2 are used in the proof of Theorem 1

Page 44: interactive communication

Interaction Improves Rate-Distortion Function

Let d be the binary erasure distortion function:

  d(x, x̂)    x̂ = 0   x̂ = 1   x̂ = e
  x = 0         0       ∞       1
  x = 1         ∞       0       1

Page 45: interactive communication

Interaction Improves Rate-Distortion Function

Let (X, Y) ~ DSBS(p)

(the joint pmf can be written compactly using a Kronecker delta function)

Marginal distributions: X ~ Ber(1/2), Y ~ Ber(1/2)

Joint pmf:

  (x, y) = (0, 0): 0.5(1 − p)
  (x, y) = (1, 1): 0.5(1 − p)
  (x, y) = (0, 1): 0.5p
  (x, y) = (1, 0): 0.5p
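As a quick consistency check of the DSBS(p) pmf above, the sketch below (p = 0.1 is an arbitrary choice for the demo) verifies that both marginals are Ber(1/2) and that the crossover probability P(X ≠ Y) equals p:

```python
from itertools import product

# The DSBS(p) joint pmf from the slide, checked for consistency.
p = 0.1
pmf = {(0, 0): 0.5 * (1 - p), (1, 1): 0.5 * (1 - p),
       (0, 1): 0.5 * p,       (1, 0): 0.5 * p}

px = {x: sum(pmf[(x, y)] for y in (0, 1)) for x in (0, 1)}   # marginal of X
py = {y: sum(pmf[(x, y)] for x in (0, 1)) for y in (0, 1)}   # marginal of Y
crossover = sum(pmf[(x, y)]                                   # P(X != Y)
                for x, y in product((0, 1), repeat=2) if x != y)
```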

Page 46: interactive communication

Interaction Improves Rate-Distortion Function

By Lemma 1, it suffices to prove that there exist p_{Y,1} and p_{Y,2} such that

This can be proved via the following five propositions

Page 47: interactive communication

Interaction Improves Rate-Distortion Function

Proposition 1

  d(x, x̂)    x̂ = 0   x̂ = 1   x̂ = e
  x = 0         0       ∞       1
  x = 1         ∞       0       1

Page 48: interactive communication

Interaction Improves Rate-Distortion Function

Proposition 2

Where,

Page 49: interactive communication

Interaction Improves Rate-Distortion Function

Proposition 3: The rate reduction functionals reduce to a compact expression for the binary erasure distortion and a DSBS source

Page 50: interactive communication

Interaction Improves Rate-Distortion Function

Proposition 4

Holds for

Where

Page 51: interactive communication

Interaction Improves Rate-Distortion Function

Proposition 5: For all q ∈ (0, 1/2) and all ∈ (0, 1), there exists p ∈ (0, 1) such that the strict inequality

holds for

This completes the proof of Theorem 1.

Page 52: interactive communication

Interaction Improves Rate-Distortion Function

Theorem 2: If d is the binary erasure distortion and p_XY is the joint pmf of a DSBS with parameter p, then for all L > 0 there exists an admissible two-message rate-distortion tuple (R1, R2, D) such that

Page 53: interactive communication

In which cases does interaction improve the rate?

                           LOSSY   LOSSLESS   ZERO-ERROR
  Source Reconstruction     Yes      No          Yes
  Function Computation      Yes      Yes         Yes

Page 54: interactive communication

Outline

Problem Setup (Two-Way Source Coding)

Interaction in Lossless Expected Length Case

Interaction in Zero-error Worst Length Case

Interaction in Function Computation

Page 55: interactive communication

Lossless Expected Length Case

Knowing Y at the encoder does not improve the rate

Page 56: interactive communication

Outline

Problem Setup (Two-Way Source Coding)

Interaction in Lossless Expected Length Case

Interaction in Zero-error Worst Length Case

Interaction in Function Computation

Page 57: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 58: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 59: interactive communication

Problem Setup

Page 60: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 61: interactive communication

Definitions and Properties

Support set of (X,Y)

Transmission length of the input (x,y)

Page 62: interactive communication

Definitions and Properties

Worst-case complexity of a protocol

M-message complexity of (X,Y)

Page 63: interactive communication

Definitions and Properties

C_m(X|Y) is a nonincreasing function of m, since an empty message is always allowed

C1={C1,0,0}

Can define C∞(X|Y)

Also

Page 64: interactive communication

Definitions and Properties

Define

Y’s ambiguity set

Page 65: interactive communication

Definition and Properties

Ambiguity

Maximum ambiguity

Page 66: interactive communication

Definitions and Properties

Separate-transmissions property

Page 67: interactive communication

Definitions and Properties

Implicit-termination property

Page 68: interactive communication

Definitions and Properties

Correct-decision property

Page 69: interactive communication

Hypergraph G(V,E)

Ordered pair (V,E)

Adjacent

Coloring of the hypergraph

A coloring is valid if V1 and V2 receive different colors whenever V1 and V2 are adjacent

K-colorable

Chromatic number
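The definitions above can be sketched in a few lines of code. This assumes one common convention (an assumption, since the slide's formula is lost): two vertices are adjacent iff some hyperedge contains both, and a valid coloring gives adjacent vertices different colors.

```python
from itertools import combinations, product

# Hypergraph coloring sketch: adjacency = sharing some hyperedge; a valid
# coloring must give adjacent vertices different colors.

def is_valid_coloring(edges, color):
    return all(color[u] != color[v]
               for e in edges
               for u, v in combinations(e, 2))

def chromatic_number(vertices, edges):
    """Brute-force chromatic number (fine only for tiny examples)."""
    for k in range(1, len(vertices) + 1):
        for assignment in product(range(k), repeat=len(vertices)):
            color = dict(zip(vertices, assignment))
            if is_valid_coloring(edges, color):
                return k
    return len(vertices)

V = ["a", "b", "c", "d"]
E = [("a", "b", "c"), ("c", "d")]   # hyperedge {a,b,c} and edge {c,d}
print(chromatic_number(V, E))       # a, b, c are mutually adjacent
```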

Page 70: interactive communication

Hypergraph G(V,E)

K-colorable (K=3,4,5…)

Chromatic number

Page 71: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 72: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 73: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 74: interactive communication

Results

One-Way Complexity

The one-way complexity is ⌈log ω⌉ bits, where ω is the chromatic number of the characteristic hypergraph of (X, Y)

Page 75: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 76: interactive communication

Results

The Limits of Interaction

The minimum number of bits we need to reconstruct X with zero-error

Page 77: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 78: interactive communication

Results

Two Messages are Optimal

In some cases, two messages are enough to achieve the bound.

See Example 1 (English League)

Page 79: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 80: interactive communication

Results

Two Messages are Almost Optimal

In the general case, we can prove

Two messages: log-reduction

More than two messages: linear-reduction

Page 81: interactive communication

Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 82: interactive communication

Results

Two Messages are Not Optimal

In some cases, two messages are not optimal.

See Example 2 (Playoffs)

Page 83: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 84: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 85: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 86: interactive communication

One-Way Complexity

Page 87: interactive communication

One-Way Complexity

Define ω(G(X|Y)) as the chromatic number of G

Then C_1(X|Y) = ⌈log ω(G(X|Y))⌉

Page 88: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 89: interactive communication

The Limits of Interaction

For all nontrivial (X,Y) pairs

Here we first prove a result that is one bit weaker

Page 90: interactive communication

The Limits of Interaction

High-level idea of the proof

X sends a subgraph whose edges contain the vertex x

Y decodes x based on the edge determined by Y = y

Page 91: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 92: interactive communication

Two Messages are Optimal

Two messages are optimal when

the hypergraph degenerates to a graph (Example 1)

For a given Y = y

(team i vs. team j)

Page 93: interactive communication

Example 1 English League

t clubs in the English league (t = 16); two random teams play against each other.

Source X:

Jayant knows Chelsea won

Source Y:

I know Chelsea Vs MU

Aim:

I know Chelsea won

(Reconstruct X on Y side)
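The example above admits the classic two-message scheme, sketched below under the assumption that each of the t teams is labeled by a ceil(log2 t)-bit index (the function names and interfaces are illustrative):

```python
# Two-message protocol for the English League example. Y knows the pair
# of teams that played; X knows the winner; Y wants to learn the winner.

def y_message(i, j, nbits):
    """Y sends a bit position where the labels i and j differ."""
    diff = i ^ j
    return next(k for k in range(nbits) if (diff >> k) & 1)

def x_message(x, pos):
    """X sends the winner's bit at the requested position (1 bit)."""
    return (x >> pos) & 1

def y_decode(i, j, pos, bit):
    """Y recovers the winner: the team whose bit at pos matches."""
    return i if ((i >> pos) & 1) == bit else j

t, nbits = 16, 4
i, j = 3, 9                       # the match, known to Y
winner = 9                        # known to X
pos = y_message(i, j, nbits)      # about log(log t) bits
bit = x_message(winner, pos)      # 1 bit
assert y_decode(i, j, pos, bit) == winner
```

Y's first message costs about log log t bits and X's reply costs 1 bit, which is how two messages can beat the one-way log t bits.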

Page 94: interactive communication

Two Messages are Optimal

High-level idea of the proof

Construct a communication scheme as shown in Example 1

Page 95: interactive communication

Two Messages are Optimal

We only need to show

by constructing a protocol

Page 96: interactive communication

Two Messages are Optimal

X and Y agree on an ω(G(X|Y))-coloring and on a log ω(G(X|Y))-bit encoding of the colors

Y transmits a bit location at which the two colors' encodings differ

X transmits its color's value at that location

Page 97: interactive communication

Two Messages are Optimal

General Scheme

Y transmits a subgraph that needs only 2 colors

X sends back its color

Page 98: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 99: interactive communication

Two Messages are Almost Optimal

Page 100: interactive communication

Two Messages are Almost Optimal

For all nontrivial (X,Y) pairs with

We have

Then we can show

Page 101: interactive communication

Two Messages are Almost Optimal

High-level idea of the proof

Y transmits a sub-hypergraph using bits

The chromatic number of each sub-hypergraph is b

(b > 2)

X gives back the color

The idea of perfect hash functions is used here

Page 102: interactive communication

Proof of the Results

One-Way Complexity

The Limits of Interaction

Two Messages are Optimal

Two Messages are Almost Optimal

Two Messages are Not Optimal

Page 103: interactive communication

Two Messages are Not Optimal

High-level idea of the proof

Chromatic-decomposition number

Prove that in some cases

Page 104: interactive communication

Two Messages are Not Optimal

Chromatic-decomposition number

Define edge cover:

Define chromatic-decomposition number:

Page 105: interactive communication

Two Messages are Not Optimal

Edge Cover:

E1 = {e1, e2, e3}, E2 = {e4, e5, e6}

Page 106: interactive communication

Two Messages are Not Optimal

Chromatic-decomposition

ω(E1)=2 ω(E2)=3

Page 107: interactive communication

Two Messages are Not Optimal

We will show, in Example 2 (Playoffs):

Page 108: interactive communication

Example 2 Playoffs

l sub-leagues, t teams in each sub-league.

l·t teams in total in the overall association

The top 2 teams of each sub-league enter the playoffs

Source X:

Jayant knows the result (champion / canceled)

Source Y:

I know the 2l teams in the playoffs

Aim: Reconstruct X on the Y side (I know the result)

Page 109: interactive communication

Example 2 Playoffs

e.g., t = 3, l = 2

Page 110: interactive communication

Characteristic Table

Page 111: interactive communication

Example 2 Playoffs

Chromatic number in Example 2

Any two teams belong to a common edge

(no two teams can share a color, so l·t colors are needed)

"Canceled" belongs to all edges

(one additional color is needed for "canceled")

Page 112: interactive communication

Outline

Interaction in Zero-error Worst-Length Case

Problem Setup

Definitions and Properties

Results in Zero-error Worst-Length Case

Proof of the Results

Conclusion

Page 113: interactive communication

Outline

Problem Setup (Two-Way Source Coding)

Interaction in Lossless Expected Length Case

Interaction in Zero-error Worst Length Case

Interaction in Function Computation

Page 114: interactive communication

Interaction in Function Computation

Page 115: interactive communication

Interaction in Function Computation

Graph entropy

Example 1

Benefit can be arbitrarily large

Example 2

Achievable Infinite-message sum-rate

Conclusion

Page 116: interactive communication

Interaction in Function Computation

Graph entropy

Example 1

Benefit can be arbitrarily large

Example 2

Achievable Infinite-message sum-rate

Conclusion

Page 117: interactive communication

Graph Entropy

Maximum independent sets of G(V,E)

Page 118: interactive communication

Graph Entropy

Define a random variable W,

Page 119: interactive communication

Graph Entropy

Graph Entropy:
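The formula on this slide did not survive extraction; Körner's graph entropy (the standard definition, presumably what the slide showed) is:

```latex
H_G(X) \;=\; \min_{W:\; X \in W \in \Gamma(G)} I(W; X),
```

where Γ(G) denotes the collection of independent sets of G and the minimum is over joint pmfs of (W, X) such that X ∈ W with probability 1.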

Page 120: interactive communication

Graph Entropy

Optimal rate for function computation satisfies:


Page 121: interactive communication

Graph Entropy

By the definition of the characteristic graph G, we have

Compare with our chromatic number result

Page 122: interactive communication

Interaction in Function Computation

Graph entropy

Example 1

Benefit can be arbitrarily large

Example 2

Achievable Infinite-message sum-rate

Conclusion

Page 123: interactive communication

Example 1

The “buffer” example

X wants to send a message to a buffer Y

The buffer outputs the message if it is not full,

but throws away any newly arriving message if it is full

X ~ Uniform{1…L}, Y ~ Ber(p)

f_A(x, y) := 0, f_B(x, y) := xy

Page 124: interactive communication

Example 1

Scheme 1: X directly sends message to Y

Page 125: interactive communication

Example 1

Scheme 2: Y tells X if it’s full or not first

Page 126: interactive communication

Example 1

Scheme 1: X directly sends message to Y

Scheme 2: Y tells X if it’s full or not first

Page 127: interactive communication

Example 1

For fixed L, R_sum,1 / R_sum,2 can be arbitrarily large

e.g., L = 1024
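The comparison can be sketched numerically. This assumes (since the slides' rate expressions are lost) that scheme 1 always sends X in log2(L) bits, while in scheme 2 Y's full/not-full answer is compressed to its entropy h(p) and X replies only when needed, i.e. with probability p:

```python
from math import log2

# Assumed sum-rate models for the two buffer-example schemes.

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def r_sum_1(L, p):
    return log2(L)                 # X talks first: log2(L) bits, always

def r_sum_2(L, p):
    return h2(p) + p * log2(L)     # Y speaks first, X replies w.p. p

L = 1024
for p in (1e-1, 1e-3, 1e-5):
    print(p, r_sum_1(L, p) / r_sum_2(L, p))   # ratio grows as p -> 0
```

With L fixed, the ratio blows up as p → 0; with p fixed, the difference grows with L, matching the two claims on these slides.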

Page 128: interactive communication

Example 1

For fixed p, R_sum,1 − R_sum,2 can be arbitrarily large

e.g., p = 1e-4

Page 129: interactive communication

Interaction in Function Computation

Graph entropy

Example 1

Benefit can be arbitrarily large

Example 2

Achievable Infinite-message sum-rate

Conclusion

Page 130: interactive communication

Example 2

An achievable infinite-message sum-rate expressed as a definite integral, using infinitesimal-rate messages

X~Ber(p) Y~Ber(q)

X,Y independent

f_A(x, y) = f_B(x, y) = x ∧ y (the AND function)

Page 131: interactive communication

Example 2

High-level idea of the design:

Define a real-valued auxiliary random variable pair

Use real multiplication instead of AND

Sum-rate changes to a definite integral

Define a rate allocation curve to minimize the sum-rate

Page 132: interactive communication

Example 2

Sum-rate changes to a definite integral

Define a rate allocation curve to minimize the sum-rate

Page 133: interactive communication

Example 2

Optimize by the rate allocation curve

Can have

Compare with

Page 134: interactive communication

Example 2

e.g., p = 0.5, q = 0.5

Page 135: interactive communication

Interaction in Function Computation

Graph entropy

Example 1

Benefit can be arbitrarily large

Example 2

Achievable Infinite-message sum-rate

Conclusion

Page 136: interactive communication

Reference

[1] Amiram H. Kaspi, “Two-Way Source Coding with a Fidelity Criterion”

[2] Abbas El Gamal, Young-Han Kim, “Network Information Theory”, Chapters 20-21

[3] Alon Orlitsky, “Worst-Case Interactive Communication I: Two Messages Are Almost Optimal”

[4] Alon Orlitsky, “Worst-Case Interactive Communication II: Two Messages Are Not Optimal”

[5] Nan Ma, Prakash Ishwar, “Interaction Strictly Improves the Wyner-Ziv Rate-Distortion Function”

[6] Nan Ma, Prakash Ishwar, “Distributed Source Coding for Interactive Function Computation”

[7] Nan Ma, Prakash Ishwar, “Infinite-Message Distributed Source Coding for Two-Terminal Interactive Computing”