
Copyright © 2002, Dr. Dharma P. Agrawal and Dr. Qing-An Zeng. All rights reserved.

Chapter 4

Channel Coding


Outline

FEC (Forward Error Correction)
  Block Codes
  Convolutional Codes
  Interleaving
  Information Capacity Theorem
  Turbo Codes
CRC (Cyclic Redundancy Check)
ARQ (Automatic Repeat Request)
  Stop-and-wait ARQ
  Go-back-N ARQ
  Selective-repeat ARQ


Channel Coding in Digital Communication Systems

[Figure: block diagram of a digital communication system. Transmitter side: information to be transmitted -> source coding -> channel coding -> modulation -> channel. Receiver side: channel -> demodulation -> channel decoding -> source decoding -> information received.]


Forward Error Correction (FEC)

The key idea of FEC is to transmit enough redundant data to allow the receiver to recover from errors all by itself. No sender retransmission is required.

The major categories of FEC codes are:

  Block codes
  Cyclic codes
  Reed-Solomon codes (not covered here)
  Convolutional codes
  Turbo codes, etc.


Block Codes

Information is divided into blocks of length k.
r parity bits or check bits are added to each block (total length n = k + r).
Code rate R = k/n.
The decoder looks for the codeword closest to the received vector (code vector + error vector).
Tradeoffs between:
  Efficiency
  Reliability
  Encoding/decoding complexity


Block Codes: Linear Block Codes

The code word c(x) or C of a linear block code is

c(x) = m(x) g(x)  or  C = m G

where m(x) or m is the information (message) block, g(x) is the generator polynomial, and G is the generator matrix.

G = [P | I],

where row p_i of P = Remainder of [x^(n-k+i-1)/g(x)] for i = 1, 2, ..., k, and I is the identity (unit) matrix.

The parity check matrix is

H = [P^T | I], where P^T is the transpose of the matrix P.


Block Codes: Linear Block Codes

The parity check matrix H is used to detect errors in the received code by using the fact that c * H^T = 0 (null vector).

Let x = c + e be the received message, where c is the correct code and e is the error.

Compute the syndrome S = x * H^T = (c + e) * H^T = c * H^T + e * H^T = e * H^T.

If S is 0 then the message is correct; otherwise there are errors in it, and the correct message can be decoded from commonly known error patterns.

Operations of the generator matrix and the parity check matrix:

  Message vector m  *  Generator matrix G       ->  Code vector C
  Code vector C     *  Parity check matrix H^T  ->  Null vector 0
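To make these relations concrete, here is a minimal sketch in Python/NumPy using a tiny (3, 1) repetition code as an assumed example (the matrices are illustrative, not taken from the slides): it forms C = m G, verifies c H^T = 0, and computes the syndrome of a corrupted vector.

```python
import numpy as np

# Minimal sketch: syndrome checking for a tiny (3, 1) repetition code
# (an assumed toy example; all arithmetic is modulo 2).
P = np.array([[1, 1]])                      # parity part, k x (n-k)
G = np.hstack([P, np.eye(1, dtype=int)])    # G = [P | I]  ->  [1 1 1]
H = np.hstack([P.T, np.eye(2, dtype=int)])  # H = [P^T | I]

m = np.array([1])                 # message vector
c = m @ G % 2                     # codeword C = m G
assert not (c @ H.T % 2).any()    # valid codewords satisfy c H^T = 0

e = np.array([0, 1, 0])           # single-bit error pattern
x = (c + e) % 2                   # received vector x = c + e
S = x @ H.T % 2                   # syndrome S = x H^T = e H^T
print("syndrome:", S)             # nonzero -> error detected
```

The nonzero syndrome depends only on the error pattern e, which is what makes table-based correction of common error patterns possible.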


Block Codes: Example

Example: Find the linear block code encoder G if the code generator polynomial is g(x) = 1 + x + x^3 for a (7, 4) code.

We have n = total number of bits = 7, k = number of information bits = 4, r = number of parity bits = n - k = 3.

G = [P | I_k] =
    | p_1   1 0 ... 0 |
    | p_2   0 1 ... 0 |
    | ...       ...   |
    | p_k   0 0 ... 1 |

where p_i = Remainder of [x^(n-k+i-1)/g(x)] for i = 1, 2, ..., k.


Block Codes: Example (Continued)

p_1 = Remainder of [x^3/(1 + x + x^3)] = 1 + x        ->  1 1 0
p_2 = Remainder of [x^4/(1 + x + x^3)] = x + x^2      ->  0 1 1
p_3 = Remainder of [x^5/(1 + x + x^3)] = 1 + x + x^2  ->  1 1 1
p_4 = Remainder of [x^6/(1 + x + x^3)] = 1 + x^2      ->  1 0 1

G = | 1 1 0 1 0 0 0 |
    | 0 1 1 0 1 0 0 |
    | 1 1 1 0 0 1 0 |
    | 1 0 1 0 0 0 1 |
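The remainders p_1 ... p_4 and the matrix G above can be reproduced by polynomial division over GF(2). A minimal sketch (the helper name gf2_remainder is illustrative; polynomials are coefficient lists, lowest degree first):

```python
# Sketch: build G = [P | I_k] for the (7, 4) code with g(x) = 1 + x + x^3.
# Polynomials are coefficient lists over GF(2), lowest degree first.

def gf2_remainder(dividend, divisor):
    """Remainder of dividend / divisor with XOR (mod-2) arithmetic."""
    r = dividend[:]
    for shift in range(len(r) - len(divisor), -1, -1):
        if r[shift + len(divisor) - 1]:
            for j, d in enumerate(divisor):
                r[shift + j] ^= d
    return r[:len(divisor) - 1]

n, k = 7, 4
g = [1, 1, 0, 1]                          # g(x) = 1 + x + x^3
P = []
for i in range(1, k + 1):                 # p_i = Rem[x^(n-k+i-1) / g(x)]
    x_power = [0] * (n - k + i - 1) + [1]
    P.append(gf2_remainder(x_power, g))

G = [p + [1 if j == i else 0 for j in range(k)] for i, p in enumerate(P)]
for row in G:
    print(row)   # expected rows: 1101000, 0110100, 1110010, 1010001
```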


Cyclic Codes

A cyclic code is a block code which uses a shift register to perform encoding and decoding. The code word with n bits is expressed as

c(x) = c1 x^(n-1) + c2 x^(n-2) + ... + cn

where each ci is either a 1 or a 0. The code word is generated as

c(x) = m(x) x^(n-k) + cp(x)

where cp(x) = remainder from dividing m(x) x^(n-k) by the generator g(x).

The received signal is c(x) + e(x), where e(x) is the error. To check whether the received signal is error free, the remainder from dividing c(x) + e(x) by g(x) is obtained (the syndrome). If this is 0, the received signal is considered error free; otherwise the error pattern is detected from known error syndromes.
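A minimal sketch of this encode-and-check procedure for an assumed (7, 4) cyclic code with g(x) = 1 + x + x^3, representing polynomials as Python integers (bit i is the coefficient of x^i):

```python
# Sketch: systematic cyclic encoding with g(x) = 1 + x + x^3 for a (7, 4) code
# (assumed example). Polynomials are integers; bit i = coefficient of x^i.

def poly_mod(a, g):
    """Remainder of a(x) / g(x) over GF(2)."""
    dg = g.bit_length() - 1
    while a.bit_length() - 1 >= dg:
        a ^= g << (a.bit_length() - 1 - dg)
    return a

n, k = 7, 4
g = 0b1011                      # 1 + x + x^3  (bit 0 = constant term)
m = 0b1010                      # assumed message m(x) = x + x^3
shifted = m << (n - k)          # m(x) * x^(n-k)
cp = poly_mod(shifted, g)       # parity polynomial cp(x)
c = shifted ^ cp                # codeword c(x) = m(x) x^(n-k) + cp(x)

assert poly_mod(c, g) == 0      # error-free codewords divide evenly by g(x)
e = 1 << 2                      # error e(x) = x^2
print(bin(poly_mod(c ^ e, g)))  # nonzero syndrome -> error detected
```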


Cyclic Redundancy Check (CRC)

Using parity, some errors are masked; careful choice of bit combinations can lead to better detection. Binary (n, k) CRC codes can detect the following error patterns:

1. All error bursts of length n-k or less.
2. All combinations of dmin - 1 or fewer errors, where dmin is the minimum Hamming distance.
3. All error patterns with an odd number of errors, if the generator polynomial g(x) has an even number of nonzero coefficients.


Common CRC Codes

Code         Generator polynomial g(x)           Parity check bits
CRC-12       1 + x + x^2 + x^3 + x^11 + x^12     12
CRC-16       1 + x^2 + x^15 + x^16               16
CRC-CCITT    1 + x^5 + x^12 + x^16               16
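For illustration, the CRC-16 check bits from the table can be computed by straightforward bitwise long division. This sketch applies the textbook division only; it deliberately omits the bit-reflection and preset conventions used by many deployed CRC-16 variants, so its output is not meant to match any particular industry implementation.

```python
# Sketch: CRC-16 check bits via bitwise long division with
# g(x) = 1 + x^2 + x^15 + x^16 (0x18005).  Plain textbook division,
# with no input/output reflection and no initial register preset.

CRC16_POLY = 0x18005           # coefficients of g(x), bit i = coefficient of x^i
R = 16                         # number of parity check bits (degree of g)

def crc_remainder(data: bytes, poly: int = CRC16_POLY, r: int = R) -> int:
    reg = 0
    for byte in data:
        for bit in range(7, -1, -1):              # MSB-first bit stream
            reg = (reg << 1) | ((byte >> bit) & 1)
            if reg & (1 << r):                    # degree reached -> subtract g(x)
                reg ^= poly
    for _ in range(r):                            # append r zeros (multiply by x^r)
        reg <<= 1
        if reg & (1 << r):
            reg ^= poly
    return reg                                    # r-bit remainder = check bits

msg = b"channel coding"
check = crc_remainder(msg)
print(f"CRC-16 check bits: {check:04x}")
# Appending the check bits makes the whole codeword divisible by g(x):
assert crc_remainder(msg + check.to_bytes(2, "big")) == 0
```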


Convolutional Codes

Encoding of the information stream rather than information blocks.
The value of a certain information symbol also affects the encoding of the next M information symbols, i.e., memory M.
Easy implementation using a shift register, assuming k inputs and n outputs.
Decoding is mostly performed by the Viterbi algorithm (not covered here).


Convolutional Codes: (n=2, k=1, M=2) Encoder

[Figure: (n=2, k=1, M=2) convolutional encoder. The input x feeds a two-stage shift register (registers D1 and D2); the two outputs y1 and y2 are multiplexed into the code sequence c. The example sequences below are consistent with the connections y1 = x + D1 + D2 and y2 = x + D2 (modulo-2 addition).]

Input:  1 1 1 0 0 0 ...    Output: 11 01 10 01 11 00 ...
Input:  1 0 1 0 0 0 ...    Output: 11 10 00 10 11 00 ...
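The two example sequences can be reproduced by simulating the encoder directly. A minimal sketch, assuming the generator connections y1 = x ⊕ D1 ⊕ D2 and y2 = x ⊕ D2 implied by the example outputs:

```python
# Sketch of the (n=2, k=1, M=2) encoder, assuming y1 = x ^ D1 ^ D2 and
# y2 = x ^ D2, with the registers shifting D2 <- D1, D1 <- x each step.

def conv_encode(bits):
    d1 = d2 = 0
    out = []
    for x in bits:
        y1 = x ^ d1 ^ d2
        y2 = x ^ d2
        out.append(f"{y1}{y2}")
        d1, d2 = x, d1          # shift the input into the register chain
    return " ".join(out)

print(conv_encode([1, 1, 1, 0, 0, 0]))   # 11 01 10 01 11 00
print(conv_encode([1, 0, 1, 0, 0, 0]))   # 11 10 00 10 11 00
```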


State Diagram

[Figure: state diagram of the (2, 1, 2) encoder. The four register states 00, 10, 01, 11 are connected by transitions, each labeled with the encoder output pair and the corresponding input bit (e.g., 00/0, 11/1, 01/0, 10/1).]


Tree Diagram

[Figure: code tree of the (2, 1, 2) encoder. At each node the next input bit (0 or 1) selects one of two branches, and each branch is labeled with the two output bits, so any input sequence traces a path whose branch labels give the output sequence. The slide traces the path for the input sequence ... 1 1 0 0 1 and reads off the corresponding output pairs (first input and first output marked).]


Trellis

[Figure: trellis diagram of the (2, 1, 2) encoder. The four states 00, 10, 01, 11 are repeated at each time step; branches between successive stages are labeled with the encoder output bits, and the highlighted path corresponds to the example input/output sequence.]


Interleaving

Input Data: a1, a2, a3, a4, a5, a6, a7, a8, a9, ...

Interleaving (write row by row, read column by column):

  a1,  a2,  a3,  a4
  a5,  a6,  a7,  a8
  a9,  a10, a11, a12
  a13, a14, a15, a16

Transmitting Data: a1, a5, a9, a13, a2, a6, a10, a14, a3, ...

De-Interleaving (write column by column, read row by row):

  a1,  a2,  a3,  a4
  a5,  a6,  a7,  a8
  a9,  a10, a11, a12
  a13, a14, a15, a16

Output Data: a1, a2, a3, a4, a5, a6, a7, a8, a9, ...


Interleaving (Example)

Transmitting Data: 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, ...   (burst error)

De-Interleaving (write column by column, read row by row):

  0, 1, 0, 0
  0, 1, 0, 0
  0, 1, 0, 0
  1, 0, 0, 0

Output Data: 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, ...   (discrete errors)
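A minimal sketch of the 4 x 4 block interleaver used in these two slides, showing how a burst of four channel errors is dispersed into isolated single errors after de-interleaving (the 4 x 4 depth matches the figures; everything else is illustrative):

```python
# Sketch: 4x4 block interleaver.  Write row by row, read column by column at
# the transmitter; undo that ordering at the receiver.

DEPTH = 4

def interleave(block):                       # block of DEPTH*DEPTH symbols
    rows = [block[i*DEPTH:(i+1)*DEPTH] for i in range(DEPTH)]
    return [rows[r][c] for c in range(DEPTH) for r in range(DEPTH)]

def deinterleave(block):
    cols = [block[i*DEPTH:(i+1)*DEPTH] for i in range(DEPTH)]
    return [cols[c][r] for r in range(DEPTH) for c in range(DEPTH)]

data = [0] * 16                              # all-zero data block
tx = interleave(data)
tx[3:7] = [1, 1, 1, 1]                       # burst error of length 4 on the channel
rx = deinterleave(tx)
print(rx)   # burst dispersed: 0,1,0,0, 0,1,0,0, 0,1,0,0, 1,0,0,0
```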


Information Capacity Theorem (Shannon Limit)

The information capacity (or channel capacity) C of a continuous channel with bandwidth B Hertz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by

C = B log2(1 + P/(N0 B))  bits/second

where P is the average transmitted power, P = Eb Rb (for an ideal system, Rb = C), Eb is the transmitted energy per bit, and Rb is the transmission rate.
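A quick numerical sketch of the capacity formula, which also recovers the -1.6 dB Shannon limit shown on the next slide (the bandwidth and SNR values are illustrative assumptions):

```python
import math

# Sketch: Shannon capacity C = B log2(1 + P / (N0 B)) and the Shannon limit.

def capacity(bandwidth_hz, power_w, n0):
    return bandwidth_hz * math.log2(1 + power_w / (n0 * bandwidth_hz))

# Illustrative numbers (assumptions): B = 1 MHz, P/(N0*B) equal to 15 dB SNR.
B = 1e6
snr = 10 ** (15 / 10)
print(f"C = {capacity(B, snr * B, 1.0):.3e} bits/second")

# For an ideal system Rb = C and P = Eb*Rb, so with eta = C/B:
#   Eb/N0 = (2**eta - 1) / eta, which tends to ln 2 as eta -> 0.
for eta in (2.0, 1.0, 0.1, 0.001):
    ebn0 = (2 ** eta - 1) / eta
    print(f"eta = {eta:5}:  Eb/N0 = {10 * math.log10(ebn0):6.2f} dB")
# The last value approaches the Shannon limit of about -1.6 dB.
```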


Shannon Limit

[Figure: normalized rate Rb/B (0.1 to 20, logarithmic scale) versus Eb/N0 (0 to 30 dB), showing the capacity boundary Rb = C, the region for which Rb < C, the region for which Rb > C, and the Shannon limit at Eb/N0 = -1.6 dB.]


Turbo Codes

A brief history of turbo codes: the turbo code concept was first introduced by C. Berrou in 1993. Today, turbo codes are considered the most efficient coding schemes for FEC.
A scheme with known components (simple convolutional or block codes, interleaver, soft-decision decoder, etc.).
Performance close to the Shannon limit (Eb/N0 = -1.6 dB as Rb/B approaches 0) at modest complexity!
Turbo codes have been proposed for low-power applications such as deep-space and satellite communications, as well as for interference-limited applications such as third-generation cellular, personal communication services, ad hoc, and sensor networks.


Turbo Codes: Encoder

[Figure: turbo encoder. The data source output X is transmitted directly and also fed to Convolutional Encoder 1, which produces the redundancy Y1; an interleaved copy of X is fed to Convolutional Encoder 2, which produces the redundancy Y2. The transmitted redundancy is Y = (Y1, Y2). X: information; Yi: redundancy information.]
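A toy sketch of this parallel-concatenation structure, reusing the earlier (2, 1, 2) encoder as a stand-in constituent encoder and a fixed pseudo-random interleaver. Real turbo codes use recursive systematic convolutional (RSC) encoders and much longer interleavers; the sketch only illustrates how X, Y1, and Y2 are produced.

```python
import random

# Toy sketch of the turbo-encoder wiring: systematic bits X are sent as-is,
# encoder 1 produces redundancy Y1 from X, and encoder 2 produces redundancy Y2
# from an interleaved copy of X.  The constituent encoder is the earlier
# non-recursive (2,1,2) encoder, used only to show the structure.

def conv_encode_parity(bits):
    d1 = d2 = 0
    parity = []
    for x in bits:
        parity.append(x ^ d1 ^ d2)   # one parity stream per constituent encoder
        d1, d2 = x, d1
    return parity

def turbo_encode(x, permutation):
    y1 = conv_encode_parity(x)
    y2 = conv_encode_parity([x[i] for i in permutation])  # interleaved copy
    return x, y1, y2                                       # (X, Y1, Y2)

x = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(x)))
random.Random(0).shuffle(perm)        # fixed pseudo-random interleaver
X, Y1, Y2 = turbo_encode(x, perm)
print("X :", X)
print("Y1:", Y1)
print("Y2:", Y2)
```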


Turbo Codes: Decoder

[Figure: turbo decoder. Convolutional Decoder 1 operates on X and Y1; its output is interleaved and passed, together with Y2, to Convolutional Decoder 2, whose output is de-interleaved and exchanged with Decoder 1 through the interleaver. The final de-interleaved decision X' is the output. X': decoded information.]


Automatic Repeat Request (ARQ)

[Figure: ARQ system. Source -> encoder -> transmit controller -> modulation -> channel -> demodulation -> decoder -> transmit controller -> destination, with an acknowledge path from the receiver back to the transmitter.]


Stop-And-Wait ARQ (SAW ARQ)

[Figure: stop-and-wait ARQ timing. The transmitter sends block 1 and waits for an ACK before sending block 2; when a block is received in error, the receiver returns a NAK and the block is retransmitted before transmission continues.]

ACK: Acknowledge
NAK: Negative ACK


Stop-And-Wait ARQ (SAW ARQ)

Throughput:

S = (1/T) * (k/n) = [(1 - Pb)^n / (1 + D * Rb/n)] * (k/n)

where T is the average transmission time in terms of a block duration:

T = (1 + D * Rb/n) * PACK + 2 * (1 + D * Rb/n) * PACK * (1 - PACK)
  + 3 * (1 + D * Rb/n) * PACK * (1 - PACK)^2 + ...
  = (1 + D * Rb/n) / (1 - Pb)^n

where n = number of bits in a block, k = number of information bits in a block, D = round-trip delay, Rb = bit rate, Pb = BER of the channel, and PACK = (1 - Pb)^n.
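Evaluating the stop-and-wait throughput formula directly (the block length, delay, and bit-rate values below are illustrative assumptions):

```python
# Sketch: stop-and-wait ARQ throughput S = [(1 - Pb)^n / (1 + D*Rb/n)] * (k/n).
# All parameter values are illustrative assumptions.

def saw_throughput(n, k, D, Rb, Pb):
    p_ack = (1 - Pb) ** n                 # probability a block arrives intact
    T = (1 + D * Rb / n) / p_ack          # average time per delivered block
    return (1 / T) * (k / n)

n, k = 1024, 1000                         # block and information lengths (bits)
D, Rb = 10e-3, 1e6                        # 10 ms round trip, 1 Mbit/s
for Pb in (1e-6, 1e-5, 1e-4):
    print(f"Pb = {Pb:.0e}:  S = {saw_throughput(n, k, D, Rb, Pb):.3f}")
```

Even at low bit error rates the throughput stays well below k/n, because the channel sits idle for D * Rb / n block durations after every block.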


Go-Back-N ARQ (GBN ARQ)

[Figure: go-back-N ARQ timing. Blocks are transmitted continuously; when the receiver detects an error (e.g., in block 3) it returns a NAK, and the transmitter goes back and retransmits the erroneous block together with all blocks sent after it (go-back 3 and go-back 5 in the example).]


Go-Back-N ARQ (GBN ARQ)

Throughput:

S = (1/T) * (k/n)
  = [(1 - Pb)^n / ((1 - Pb)^n + N * (1 - (1 - Pb)^n))] * (k/n)

where

T = 1 * PACK + (N + 1) * PACK * (1 - PACK) + (2N + 1) * PACK * (1 - PACK)^2 + ...
  = 1 + (N * [1 - (1 - Pb)^n]) / (1 - Pb)^n


Selective-Repeat ARQ (SR ARQ)

[Figure: selective-repeat ARQ timing. Blocks are transmitted continuously; when the receiver detects an error (e.g., in blocks 3 and 7) it returns a NAK, and only the erroneous block is retransmitted. Correctly received blocks are held in a buffer and reordered before being delivered as output data.]


Selective-Repeat ARQ (SR ARQ)

Throughput:

S = (1/T) * (k/n)
  = (1 - Pb)^n * (k/n)

where

T = 1 * PACK + 2 * PACK * (1 - PACK) + 3 * PACK * (1 - PACK)^2 + ...
  = 1 / (1 - Pb)^n
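For comparison, the three ARQ throughput formulas can be evaluated side by side; the parameter values, including the go-back window N, are illustrative assumptions.

```python
# Sketch: compare SAW, GBN and SR ARQ throughput as a function of channel BER.

def saw(n, k, D, Rb, Pb):
    p = (1 - Pb) ** n
    return p / (1 + D * Rb / n) * (k / n)

def gbn(n, k, N, Pb):
    p = (1 - Pb) ** n
    return p / (p + N * (1 - p)) * (k / n)

def sr(n, k, Pb):
    return (1 - Pb) ** n * (k / n)

n, k = 1024, 1000
D, Rb = 10e-3, 1e6          # 10 ms round trip, 1 Mbit/s (assumed values)
N = 12                      # go-back window, roughly the round trip in blocks
for Pb in (1e-6, 1e-5, 1e-4):
    print(f"Pb={Pb:.0e}  SAW={saw(n,k,D,Rb,Pb):.3f}  "
          f"GBN={gbn(n,k,N,Pb):.3f}  SR={sr(n,k,Pb):.3f}")
# Selective repeat does best because only the erroneous block is resent.
```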