Convolutional Codes
Presented by: Abdulaziz Al mabrok Al tagawy
Course: Coding Theory - Feb/2016

Contents:
Introduction
Convolutional Codes and Encoders
Description in the D-Transform Domain
Convolutional Encoder Representations
  Representation of Connections
  State Diagram Representation
  Trellis Diagram
Convolutional Codes in Systematic Form
Minimum Free Distance of a Convolutional Code
Maximum Likelihood Detection
Decoding of Convolutional Codes: The Viterbi Algorithm
Catastrophic Generator Matrix
Punctured Convolutional Codes
The Most Widely Used Convolutional Codes
Practical Examples of Convolutional Codes
Advantages of Convolutional Codes

Introduction

Introduction
Convolutional codes were first introduced by P. Elias in 1955. The structure of convolutional codes is quite different from that of block codes. During each unit of time, the input to a convolutional encoder is a k-bit message block and the corresponding output is an n-bit coded block, with k < n. Each n-bit output block depends not only on the corresponding k-bit message block at the same time unit but also on the m previous message blocks. Thus the encoder has k input lines, n output lines, and a memory of order m.

Introduction
Each message (or information) sequence is encoded into a code sequence. The set of all possible code sequences produced by the encoder is called an (n, k, m) convolutional code. The parameters k and n are normally small, say 1 ≤ k ≤ 8 and 2 ≤ n ≤ 9. The ratio R = k/n is called the code rate; typical values are 1/2, 1/3, and 2/3. The parameter m is called the memory order of the code. Note that the number of redundant (or parity) bits in each coded block is small; however, more redundancy can be added by increasing the memory order m while holding k and n fixed.

An (n,k,m) convolutional code encoder

Convolutional Codes and Encoders

Convolutional Codes and Encoders
A convolutional encoder is built from shift registers (D flip-flops) and combinational logic that performs modulo-two addition.

Convolutional Codes and Encoders
We'll also need an output selector to toggle between the two modulo-two adders. The output selector (SEL A/B block) cycles through two states: in the first state, it selects and outputs the output of the upper modulo-two adder; in the second state, it selects and outputs the output of the lower modulo-two adder.
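To make the datapath concrete, here is a minimal sketch (not taken from the slides) of a rate-1/2, (2, 1, 2) encoder, assuming the generator sequences g1 = (1,1,1) and g2 = (1,0,1) that match the encoding example used later in this presentation; the output selector is modeled by simply emitting the upper adder's bit before the lower adder's bit.

```python
# Sketch of a (2, 1, 2) convolutional encoder (assumed generators 111 and 101).
def conv_encode(message_bits, flush=True):
    g1 = (1, 1, 1)      # taps feeding the upper modulo-two adder
    g2 = (1, 0, 1)      # taps feeding the lower modulo-two adder
    state = [0, 0]      # two D flip-flops -> memory order m = 2
    bits = list(message_bits) + ([0] * len(state) if flush else [])
    out = []
    for u in bits:
        window = [u] + state                                 # input bit + register contents
        v1 = sum(w * g for w, g in zip(window, g1)) % 2      # upper adder output
        v2 = sum(w * g for w, g in zip(window, g2)) % 2      # lower adder output
        out += [v1, v2]                                      # SEL A/B: upper bit, then lower bit
        state = [u] + state[:-1]                             # shift the register
    return out

# Matches the worked example later in the slides:
print(conv_encode([1, 0, 1]))   # -> [1, 1, 1, 0, 0, 0, 1, 0, 1, 1], i.e. 11 10 00 10 11
```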

Convolutional Codes and Encoders
Generator Matrix in the Time Domain
Example: generator matrix of the binary (2, 1, 2) convolutional code:

Convolutional Codes and Encoders

Convolutional Codes and Encoders
Generator Matrix in the Time Domain:
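The matrix itself appeared only as a figure in these slides; the following is a hedged reconstruction for the (2, 1, 2) code with generators g(1) = (1,1,1) and g(2) = (1,0,1), written with the interleaved output pairs g0 = 11, g1 = 10, g2 = 11 as the building blocks of the semi-infinite matrix.

```latex
% Hedged reconstruction (assumed generators 111 and 101); 00 denotes a zero pair.
\[
\mathbf{G} =
\begin{bmatrix}
11 & 10 & 11 & 00 & 00 & \cdots \\
00 & 11 & 10 & 11 & 00 & \cdots \\
00 & 00 & 11 & 10 & 11 & \cdots \\
   &    &    & \ddots & \ddots &
\end{bmatrix},
\qquad \mathbf{v} = \mathbf{u}\,\mathbf{G}.
\]
% For u = (1, 0, 1), adding rows 1 and 3 gives v = (11\;10\;00\;10\;11),
% which agrees with the encoding example used later in the slides.
```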

Description in the D-Transform Domain

Description in the D-Transform Domain
The D-transform is a function of the indeterminate D (the delay operator) and is defined as:
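The defining formula was shown only as an image on the slide; the following is the standard definition, restated here as an assumption about what the slide contained.

```latex
% Standard D-transform of a sequence x = (x_0, x_1, x_2, ...):
\[
X(D) \;=\; \sum_{i \ge 0} x_i\,D^{\,i} \;=\; x_0 + x_1 D + x_2 D^{2} + \cdots
\]
```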

Description in the D-Transform Domain
The convolution property of the D-transform, D{u ∗ g} = U(D)G(D), is used to turn the convolution of the input sequence with the generator sequences into a multiplication in the D domain.
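As a worked illustration (assuming the same (2, 1, 2) encoder with generators 111 and 101 used elsewhere in these slides), the multiplication in the D domain reproduces the code sequence of the later encoding example.

```latex
% Assumed generator matrix in the D domain and a worked multiplication.
\[
\mathbf{G}(D) = \begin{bmatrix} 1 + D + D^{2} & \; 1 + D^{2} \end{bmatrix},
\qquad
U(D) = 1 + D^{2} \;\;\text{for } \mathbf{u} = (1\,0\,1).
\]
\[
\mathbf{V}(D) = U(D)\,\mathbf{G}(D)
= \begin{bmatrix} 1 + D + D^{3} + D^{4} & \; 1 + D^{4} \end{bmatrix}.
\]
% Interleaving the two output streams gives v = (11\;10\;00\;10\;11).
```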

Convolutional Encoder Representations

Convolutional Encoder Representations
Representation of Connections:

Convolutional Encoder Representations
State Diagram Representation:

At each time unit, the output block depends on both the current input block and the encoder state.
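A short sketch (assuming the same (2, 1, 2) encoder with generators 111 and 101 as in the encoding example) that enumerates the state diagram as a transition table; each printed line corresponds to one edge of the diagram.

```python
# Enumerate the state diagram of the assumed (2, 1, 2) encoder as a table:
# (current state, input bit) -> (output pair, next state).
g1, g2 = (1, 1, 1), (1, 0, 1)
for s1, s2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:    # current register contents
    for u in (0, 1):                               # input bit
        window = (u, s1, s2)
        v1 = sum(w * g for w, g in zip(window, g1)) % 2
        v2 = sum(w * g for w, g in zip(window, g2)) % 2
        print(f"state {s1}{s2}, input {u} -> output {v1}{v2}, next state {u}{s1}")
```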

Convolutional Encoder Representations
Trellis Diagram:

Convolutional Encoder Representations
At time m, the trellis reaches its steady state.

Trellis Diagram:

Convolutional Encoder Representations
Trellis Diagram:

Convolutional Encoder Representations
Termination of a Trellis: after m zeros have been shifted into the encoder, the trellis converges into a single vertex (the all-zero state). During the termination process, the number of reachable states is reduced by half as each 0 is shifted into the encoder register.

Convolutional Encoder Representations
Termination of a Trellis:

Trellis Diagram:

Convolutional Codes in Systematic Form

Convolutional Codes in Systematic Form
In a systematic code, message information can be seen and directly extracted from the encoded information.

Convolutional Codes in Systematic Form
In the case of systematic convolutional codes, there is no need for an inverse transfer function at the decoder to recover the input sequence, because it is read directly from the code sequence.
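As an illustration (this specific encoder is an assumption, not necessarily the one shown on the slides), a systematic rate-1/2 generator matrix has a 1 in its first position, so the message stream appears verbatim in the code sequence.

```latex
% Illustrative systematic rate-1/2 encoder (assumed example).
\[
\mathbf{G}_{\mathrm{sys}}(D) = \begin{bmatrix} 1 & \; 1 + D + D^{2} \end{bmatrix},
\qquad
\mathbf{V}(D) = U(D)\,\mathbf{G}_{\mathrm{sys}}(D)
= \begin{bmatrix} U(D) & \; U(D)\,(1 + D + D^{2}) \end{bmatrix}.
\]
% The first output stream is the message itself, so no inverse transfer
% function is needed at the decoder.
```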

Minimum Free Distance of a Convolutional Code

Minimum Free Distance of a Convolutional Code
Since convolutional codes are linear, we can use the all-zero path 0 as our reference for studying the distance structure of convolutional codes without loss of generality.
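The defining formula appeared only as an image; the following restates the standard definition, together with the value for the (2, 1, 2) code with generators (111, 101) used in the examples of these slides.

```latex
% Standard definition of the minimum free distance of a convolutional code:
\[
d_{\mathrm{free}}
= \min\{\, d(\mathbf{v}',\mathbf{v}'') : \mathbf{u}' \neq \mathbf{u}'' \,\}
= \min\{\, w(\mathbf{v}) : \mathbf{u} \neq \mathbf{0} \,\},
\]
% i.e. the minimum Hamming weight over all nonzero code sequences (linearity
% lets us compare every pair of paths against the all-zero path). For the
% (2,1,2) code with generators (111, 101), d_free = 5.
```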

Maximum Likelihood Detection

Maximum Likelihood Detection
If the input message sequences are equally likely, the optimum decoder, which minimizes the probability of error, is the maximum likelihood (ML) decoder. Maximum likelihood decoding means finding the path through the code trellis that was most likely to have been transmitted. The decoder therefore measures distance using either the Hamming distance for hard-decision decoding or the Euclidean distance for soft-decision decoding. The likelihood of the received sequence given a candidate code sequence is then:
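The likelihood formula on the slide was an image; a hedged reconstruction of the standard form for a memoryless channel is given below.

```latex
% ML metric for a memoryless channel: the likelihood of the received sequence Z
% given the m-th candidate code sequence U^(m) factors over the code symbols.
\[
P\!\left(\mathbf{Z} \mid \mathbf{U}^{(m)}\right)
= \prod_{i} P\!\left(Z_i \mid U_i^{(m)}\right),
\qquad
\log P\!\left(\mathbf{Z} \mid \mathbf{U}^{(m)}\right)
= \sum_{i} \log P\!\left(Z_i \mid U_i^{(m)}\right).
\]
% The ML decoder selects the U^(m) that maximizes this metric; with hard
% decisions this is equivalent to minimizing the Hamming distance d(Z, U^(m)).
```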

Decoding of Convolutional Codes: The Viterbi Algorithm

Decoding of Convolutional Codes: The Viterbi Algorithm
The Viterbi algorithm performs maximum likelihood decoding but reduces the computational complexity by taking advantage of the special structure of the code trellis. It was first introduced by A. Viterbi in 1967. It was recognized by G. D. Forney in 1973 that it is an ML decoding algorithm for convolutional codes.

Decoding of Convolutional Codes: The Viterbi Algorithm
Basic Concepts:
Generate the code trellis at the decoder. The decoder proceeds through the code trellis level by level in search of the transmitted code sequence. At each level of the trellis, the decoder computes and compares the metrics of all the partial paths entering a node. The decoder stores the partial path with the best metric and eliminates all the other partial paths; the stored partial path is called the survivor.
Decoding of Convolutional Codes: The Viterbi Algorithm

Procedure:

Decoding of Convolutional Codes: The Viterbi Algorithm
Example:
m = (101)              source message
U = (11 10 00 10 11)   codeword to be transmitted
Z = (11 10 11 10 01)   received sequence

Decoding of Convolutional Codes: The Viterbi Algorithm
Example: label all the branches with the branch metric (Hamming distance).

Decoding of Convolutional Codes: The Viterbi Algorithm
Example (continued): trellis stages i = 2, 3, 4, 5, 6, with the survivors kept at each node, followed by the final result (figures).
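The trellis figures for this example did not survive the transcription, so here is a minimal hard-decision Viterbi decoder sketch for the assumed (2, 1, 2) code with generators 111 and 101. The received word in the usage line is a two-error pattern chosen for illustration (not necessarily the error pattern on the slides), which the decoder corrects back to m = (101).

```python
# Sketch of hard-decision Viterbi decoding (Hamming branch metric) for the
# assumed rate-1/2 (2, 1, 2) code with generators g1 = (1,1,1), g2 = (1,0,1).

def branch_output(state, u, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Encoder output pair for input bit u taken from state (s1, s2)."""
    window = (u,) + state
    v1 = sum(w * g for w, g in zip(window, g1)) % 2
    v2 = sum(w * g for w, g in zip(window, g2)) % 2
    return (v1, v2)

def viterbi_decode(received_pairs):
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    # Path metric and survivor (input bits so far) per state; start in state 00.
    metric = {s: (0 if s == (0, 0) else float("inf")) for s in states}
    survivor = {s: [] for s in states}
    for r in received_pairs:
        new_metric = {s: float("inf") for s in states}
        new_survivor = {s: [] for s in states}
        for s in states:
            if metric[s] == float("inf"):
                continue
            for u in (0, 1):                           # extend by both input bits
                v = branch_output(s, u)
                d = (v[0] != r[0]) + (v[1] != r[1])    # Hamming branch metric
                nxt = (u, s[0])                        # shift-register update
                if metric[s] + d < new_metric[nxt]:    # add-compare-select
                    new_metric[nxt] = metric[s] + d
                    new_survivor[nxt] = survivor[s] + [u]
        metric, survivor = new_metric, new_survivor
    # Terminated trellis: keep the survivor ending in state 00 and drop the
    # two flushing zeros.
    return survivor[(0, 0)][:-2]

# Codeword 11 10 00 10 11 with two bit errors injected (illustrative pattern):
Z = [(1, 1), (1, 0), (0, 0), (0, 0), (0, 1)]
print(viterbi_decode(Z))   # -> [1, 0, 1]
```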

Catastrophic Generator Matrix
A catastrophic generator matrix maps some information sequences of infinite Hamming weight to code sequences of finite Hamming weight. A code whose generator matrix has this catastrophic error propagation property is called a catastrophic code. For a catastrophic code, a finite number of transmission errors can cause an infinite number of errors in the decoded information sequence; hence, these codes should be avoided in practice. It is easy to see that systematic generator matrices are never catastrophic.
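A standard illustrative example (assumed here, not necessarily the one on the slides): a rate-1/2 feedforward encoder is catastrophic when its generator polynomials share a common factor other than a power of D.

```latex
% Classic catastrophic example: gcd(1 + D, 1 + D^2) = 1 + D, a non-trivial
% common factor, since 1 + D^2 = (1 + D)^2 over GF(2).
\[
\mathbf{G}(D) = \begin{bmatrix} 1 + D & \; 1 + D^{2} \end{bmatrix}.
\]
% The infinite-weight input U(D) = 1/(1+D) = 1 + D + D^2 + \cdots produces the
% finite-weight output V(D) = [ 1 , 1 + D ], so a few channel errors can cause
% an unlimited number of decoded information errors.
```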

Punctured Convolutional Codes

The Most Widely Used Convolutional Codes

Practical Examples of Convolutional Codes

Advantages of Convolutional Codes
Convolutional coding is a popular error-correcting coding method used in digital communications. The convolution operation encodes redundant information into the transmitted signal, allowing transmission errors to be detected and corrected at the receiver. Convolutional encoding with Viterbi decoding is a powerful FEC technique that is particularly suited to channels in which the transmitted signal is corrupted mainly by additive white Gaussian noise (AWGN). It is simple and offers good performance at low implementation cost.

THANK YOU