Coding For Wireless Channel
Presented by: Salah Zeyad Radwan
Presented to: Dr. Mohab Mangoud
Outline
1. Introduction
2. Linear Block Codes
   2.1 Binary Linear Block Codes
   2.2 Generator Matrix
   2.3 Syndrome Testing & LUT
3. Convolutional Codes
   3.1 Mathematical Representation
   3.2 Tree Diagram
   3.3 Trellis Diagram
   3.4 Viterbi Algorithm
4. Interleaver
5. Concatenated Codes
6. Conclusion
7. References
1. Introduction
Block diagram of a general communication system: the information to be transmitted passes through source encoding, channel encoding, and modulation in the transmitter; through the channel; and then through demodulation, channel decoding, and source decoding in the receiver.
1. Introduction
• Channel encoding: the addition of redundant symbols that can be used to correct data errors.
• Modulation: conversion of symbols to a waveform for transmission.
• Demodulation: conversion of the waveform back to symbols.
• Decoding: using the redundant symbols to correct errors.
1. Introduction
What is Coding?
• Coding is the conversion of information to another form for some purpose.
• Source coding: the purpose is lowering the redundancy in the information (e.g. ZIP, JPEG, MPEG-2).
• Channel coding: the purpose is to combat channel noise.
1. Introduction
• Channel coding dates from 1948, when Claude Shannon, working on the information transmission capacity of a communication channel, showed that the capacity depends on the signal-to-noise ratio (SNR):
C = W log2 (1 + S/N)
C: capacity of the channel (bits/s)   W: bandwidth of the channel (Hz)   S/N: signal-to-noise ratio
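As a quick numerical check of the capacity formula, here is a minimal sketch in Python (the function name and example figures are illustrative, not from the original slides):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity in bits/s: C = W * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 1 MHz channel at a linear SNR of 15 (about 11.8 dB)
c = shannon_capacity(1e6, 15)
print(c)  # 4000000.0 bits/s, since log2(1 + 15) = 4
```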
1. Introduction
• Channel coding can be partitioned into two study areas:
1. Waveform coding: transforming waveforms into better waveforms, making the detection process less subject to errors.
2. Structured sequences: transforming data sequences into better sequences having structured redundancy, whose redundant bits can be used to detect and correct errors.
• In this study I am going to describe codes designed for AWGN channels, which requires a background in block and convolutional codes.
2. Linear Block Code
• Linear block codes are an extension of single-bit parity-check codes for error detection, and are characterized by the (n, k) notation.
• A single-parity-check code uses one parity bit in a block of n bits to indicate whether the number of 1s in the block is odd or even.
• A linear block code uses a larger number of parity bits to detect or correct more than one error.

Codeword (n bits):  b1, b2, …, bn-k | m1, m2, …, mk
                    parity bits (n-k) | message bits (k)
2. Linear Block Code
• The information bit stream is chopped into blocks of k bits.
• Each block is encoded into a larger block of n bits.
• The coded bits are modulated and sent over the channel.
• The reverse procedure is done at the receiver.

Data block (k bits) → Channel encoder → Codeword (n bits)
2. Linear Block Code
• Encoding process: the block encoder maps each k-symbol block of the uncoded data stream to an n-symbol block of the coded data stream.
2. Linear Block Code
• Generator matrix
The encoder maps a message m to a codeword c through the generator matrix G. This mapping may be written in a compact form using matrix notation:
m = [m0, m1, m2, …, mk-1]    message vector, 1-by-k
b = [b0, b1, b2, …, bn-k-1]    parity vector, 1-by-(n-k)
c = [c0, c1, c2, …, cn-1]    code vector, 1-by-n
Note: all three vectors are row vectors.
2. Linear Block Code
b = mP   (1)
where P is the k-by-(n-k) coefficient matrix

        | p11  p12  …  p1,n-k |
        | p21  p22  …  p2,n-k |
P =     |  .    .   …    .    |
        | pk1  pk2  …  pk,n-k |
2. Linear Block Code
c = [b m]   (2)
Substituting (1) into (2) we get
c = m [P Ik]   (3)
where Ik is the k-by-k identity matrix. The generator matrix is defined as
G = [P Ik]   (4)
Substituting (4) into (3) we get
c = mG   (5)
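Equation (5) is easy to exercise numerically. The sketch below encodes a message with a (7,4) code; the particular P matrix is one common choice for the Hamming (7,4) code and is an assumption, not taken from the slides:

```python
import numpy as np

# (7,4) code: k = 4 message bits, n - k = 3 parity bits.
# P is one common 4x3 coefficient matrix for the Hamming (7,4) code.
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([P, np.eye(4, dtype=int)])   # G = [P  Ik], equation (4)

def encode(m):
    """c = mG over GF(2), equation (5)."""
    return np.array(m) @ G % 2

c = encode([1, 0, 1, 1])
print(c)   # parity bits b = mP = [1, 0, 0] followed by the message [1, 0, 1, 1]
```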
2. Linear Block Code
• Syndrome decoding
The received vector can be represented as
R = c + e
where e is the error pattern. The decoder recovers the code vector from the received vector by using what we call the syndrome, which depends only upon the error pattern:
s = R HT
where H is the parity-check matrix.
2. Linear Block Code
We then list the 2^(n-k) correctable error patterns e (a 2^(n-k)-by-n matrix) and compute the syndrome e × HT of each to build the LUT.
The LUT lets us locate the bit position of the error in the received word; we then correct it by flipping that bit ("if the erroneous bit is 1, change it to 0; if it is 0, change it to 1") to recover the correct codeword.
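The syndrome/LUT procedure can be sketched for the Hamming (7,4) code. The P matrix below is an assumed (common) choice, and H = [I P^T] is the parity-check matrix that matches a generator of the form G = [P Ik]:

```python
import numpy as np

P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
n = 7
H = np.hstack([np.eye(3, dtype=int), P.T])   # parity-check matrix, H G^T = 0

# Build the LUT: syndrome e H^T -> correctable error pattern e
# (here the zero pattern plus the 7 single-bit error patterns).
lut = {(0, 0, 0): np.zeros(n, dtype=int)}
for i in range(n):
    e = np.zeros(n, dtype=int)
    e[i] = 1
    lut[tuple(e @ H.T % 2)] = e

def correct(R):
    """Compute s = R H^T, look up the error pattern, flip the bad bit."""
    s = tuple(np.array(R) @ H.T % 2)
    return (np.array(R) + lut[s]) % 2

# Valid codeword [1,0,0,1,0,1,1] received with bit 5 flipped:
print(correct([1, 0, 0, 1, 0, 0, 1]))   # recovers [1 0 0 1 0 1 1]
```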
3. Convolutional Codes
• Convolutional codes are fundamentally different from block codes: it is not possible to separate the code into independent blocks. Instead, each code bit depends on a certain number of previous data bits.
• The encoding can be implemented with a structure consisting of a shift register, a set of exclusive-OR (XOR) gates, and a multiplexer.
3. Convolutional Codes
Rate-1/2 convolutional encoder: the data bits enter a two-stage shift register (D1, D2); XOR gates form the two output bits x1 and x2, which a multiplexer combines into the code sequence.
Input: 1 1 1 0 0 0 … → Output: 11 01 10 01 11 00 …
Input: 1 0 1 0 0 0 … → Output: 11 10 00 10 11 00 …
3. Convolutional Codes
The previous convolutional encoder can be represented mathematically by its generator polynomials:
g(1)(D) = 1 + D2
g(2)(D) = 1 + D + D2
The message sequence (1001) can be represented as m(D) = 1 + D3.
Hence the output polynomials of paths 1 and 2 are
c(1)(D) = g(1)(D) m(D) = 1 + D2 + D3 + D5
c(2)(D) = g(2)(D) m(D) = 1 + D + D2 + D3 + D4 + D5
By multiplexing the two output sequences we get the code sequence
c = (11, 01, 11, 11, 01, 11)
which is nonsystematic.
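The polynomial view of the encoder can be reproduced directly: multiply m(D) by each generator over GF(2) and multiplex the two coefficient sequences. A minimal sketch (function name is illustrative):

```python
def poly_mul_gf2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (D^0 first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj   # addition over GF(2) is XOR
    return out

g1 = [1, 0, 1]       # g1(D) = 1 + D^2
g2 = [1, 1, 1]       # g2(D) = 1 + D + D^2
m  = [1, 0, 0, 1]    # m(D)  = 1 + D^3

c1 = poly_mul_gf2(g1, m)   # 1 + D^2 + D^3 + D^5
c2 = poly_mul_gf2(g2, m)   # 1 + D + D^2 + D^3 + D^4 + D^5
code = [f"{a}{b}" for a, b in zip(c1, c2)]   # multiplex the two outputs
print(code)   # ['11', '01', '11', '11', '01', '11'], matching the slide
```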
3. Convolutional Codes
Systematic convolutional encoder: the data bit is passed to the output directly (g(1)(D) = 1), while the parity bit is formed from the shift register (D1, D2) by g(2)(D) = 1 + D + D2; the multiplexer combines data and parity into the code sequence.
3. Convolutional Codes
The structural properties of a convolutional encoder are portrayed in graphical form by using any one of:
1. Code tree
2. Trellis
3. State diagram
Tree diagram
Code tree of the encoder: from each node, one branch corresponds to input 0 and the other to input 1, and each branch is labeled with the two output bits. Example traced in the figure (first input/output on the right): input … 1 1 0 0 1 → output … 10 11 11 01 11.
3. Convolutional Codes
Trellis for the convolutional encoder. States (binary description): a = 00, b = 10, c = 01, d = 11. Each trellis section shows the branches between states, labeled with the corresponding output bits (00, 11, 01, 10, …).
3. Convolutional Codes
We may collapse the code tree in the previous slides into a new form called a trellis.
Trellis for the convolutional encoder, with states a = 00, b = 10, c = 01, d = 11 and branches labeled with the output bits.
For incoming data 1001, the generated code sequence becomes 11 01 11 11.
3. Convolutional Codes
Viterbi algorithm
• To decode the all-zero sequence when it is received as 01 00 10 00 00, the decoder steps through the trellis one received pair at a time.
• At each stage, the Hamming distance between the received pair and each branch output is added to the path metric of the path entering that state (the figures show the accumulated metrics, e.g. 1, 1 after the first stage).
• When two paths merge at a state, only the survivor with the smaller metric is kept; the other is discarded.
• After the final stage, the surviving path with the smallest metric (metric 2 at state a, versus 3 at the other states) is traced back, giving the decoded sequence 00 00 00 00 00: both channel errors have been corrected.
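The survivor-selection procedure can be sketched as a small hard-decision Viterbi decoder. This is an illustrative sketch: the state is assumed to be (m(t-1), m(t-2)), and the output bit order is chosen so the encoder reproduces the slides' example outputs:

```python
def step(state, bit):
    """Encoder transition: returns (next_state, output_pair)."""
    d1, d2 = state
    out = (bit ^ d1 ^ d2, bit ^ d2)   # g2 = 1+D+D^2, g1 = 1+D^2
    return (bit, d1), out

def viterbi(received):
    """Hard-decision Viterbi decoding with Hamming-distance path metrics."""
    paths = {(0, 0): (0, [])}   # start in the all-zero state
    for r in received:
        new = {}
        for s, (metric, bits) in paths.items():
            for b in (0, 1):
                ns, out = step(s, b)
                m = metric + (out[0] ^ r[0]) + (out[1] ^ r[1])
                if ns not in new or m < new[ns][0]:
                    new[ns] = (m, bits + [b])   # keep only the survivor
        paths = new
    return min(paths.values())[1]   # best surviving path over final states

rx = [(0, 1), (0, 0), (1, 0), (0, 0), (0, 0)]   # received 01 00 10 00 00
print(viterbi(rx))   # [0, 0, 0, 0, 0] -- both channel errors corrected
```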
3. Convolutional Codes
State diagram of the convolutional code.
• A portion of the central part of the trellis for the encoder: the left nodes represent the four possible current states (a, b, c, d) of the encoder, and the right nodes represent the next states, with branches labeled by the output bits (00, 11, 01, 10).
• We can coalesce the left and right nodes to get the state diagram.
3. Convolutional Codes
State diagram: states (binary description) a = 00, b = 10, c = 01, d = 11, with each transition labeled by the corresponding output bits (00, 11, 01, 10).
4. Interleaver
• The simplest type of interleaver is sometimes known as a rectangular or block interleaver.
• An interleaver is commonly denoted by the Greek letter π and its corresponding de-interleaver by π⁻¹.
• The original order can then be restored by the corresponding de-interleaver.
4. Interleaver
Worked example: the input sequence 1 2 3 … 20 is written row by row into a 4-by-5 array:
 1  2  3  4  5
 6  7  8  9 10
11 12 13 14 15
16 17 18 19 20
The interleaver (π) reads the array out column by column, giving the interleaved data
1 6 11 16 2 7 12 17 3 …
The de-interleaver (π⁻¹) writes the received sequence back into the array column by column and reads it out row by row, restoring the original order 1 2 3 … 20.
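The rectangular interleaver above is a few lines of code: write row-wise, read column-wise, and invert by swapping the roles. A minimal sketch (function names are illustrative):

```python
def interleave(data, rows, cols):
    """Write row-wise into a rows x cols array, read out column-wise."""
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows, cols):
    """Inverse: write column-wise, read row-wise (swap rows and cols)."""
    return interleave(data, cols, rows)

seq = list(range(1, 21))              # 1 .. 20, as in the slides
il = interleave(seq, 4, 5)
print(il[:9])                         # [1, 6, 11, 16, 2, 7, 12, 17, 3]
print(deinterleave(il, 4, 5) == seq)  # True: original order restored
```

A burst of channel errors hitting consecutive interleaved symbols is spread across distant positions after de-interleaving, which is exactly why the interleaver helps the channel decoder.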
5. Concatenated Codes
• We have seen that the power of FEC codes increases with length k and approaches the Shannon bound only at very large k, but also that decoding complexity increases very rapidly with k.
• This suggests that it would be desirable to build a long, complex code out of much shorter component codes, which can be decoded much more easily.
5. Concatenated Codes
• The principle is to feed the output of one encoder (called the outer encoder) to the input of another encoder, and so on, as required.
• The final encoder before the channel is known as the inner encoder.
• The resulting composite code is clearly much more complex than any of the individual codes.
5. Concatenated Codes
General block diagram for serial concatenated codes:
outer encoder → interleaver → inner encoder → channel → inner decoder → de-interleaver → outer decoder
6. Conclusion
We have reviewed the basic principles of channel coding, including linear block codes, convolutional codes, and concatenated coding, showing how they achieve such remarkable performance.
7. References
• Andrea Goldsmith, "Wireless Communications"
• Simon Haykin, "Digital Communications"
• Bernard Sklar, "Digital Communications"