ECE-V Information Theory & Coding [10EC55] Assignment


    Information Theory & Coding 10EC55 

Dept. of ECE/SJBIT

    Assignment Questions 

    UNIT 1: 

1.  Explain the terms (i) Self-information (ii) Average information (iii) Mutual information.

2.  Discuss the reason for using a logarithmic measure for measuring the amount of information.

3.  Explain the concept of the amount of information associated with a message. Also explain what infinite information and zero information are.

4.  A binary source emits an independent sequence of 0's and 1's with probabilities p and (1 - p) respectively. Plot the entropy of the source.

5.  Explain the concepts of information, average information, information rate and redundancy as referred to information transmission.

6.  Let X represent the outcome of a single roll of a fair die. What is the entropy of X?
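The entropy computations in questions 4 and 6 can be sketched numerically. The helper `entropy` below is illustrative, not part of the syllabus:

```python
import math

def entropy(probs):
    # H = -sum p*log2(p), skipping zero-probability symbols
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Q6: a fair die has six equiprobable outcomes, so H = log2(6)
die_H = entropy([1/6] * 6)           # ≈ 2.585 bits

# Q4: binary source entropy H(p) = -p log2 p - (1-p) log2(1-p),
# which peaks at 1 bit when p = 1/2
binary_H = entropy([0.5, 0.5])       # = 1 bit
```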

7.  A code is composed of dots and dashes. Assume that the dash is 3 times as long as the dot and has one-third the probability of occurrence. (i) Calculate the information in a dot and that in a dash; (ii) Calculate the average information in the dot-dash code; and (iii) Assuming that a dot lasts for 10 ms and this same time interval is allowed between symbols, calculate the average rate of information transmission.

8.  What do you understand by the term extension of a discrete memoryless source? Show that the entropy of the nth extension of a DMS is n times the entropy of the original source.

9.  A card is drawn from a deck of playing cards. (a) You are informed that the card you drew is a spade. How much information did you receive in bits? (b) How much information did you receive if you are told that the card you drew is an ace? (c) How much information did you receive if you are told that the card you drew is the ace of spades? Is the information content of the message "ace of spades" the sum of the information contents of the messages "spade" and "ace"?

10. A black-and-white TV picture consists of 525 lines of picture information. Assume that each line consists of 525 picture elements and that each element can have 256 brightness levels. Pictures are repeated at the rate of 30 per second. Calculate the average rate of information conveyed by a TV set to a viewer.
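Question 10 is direct arithmetic; a sketch of the computation, assuming all brightness levels are independent and equiprobable:

```python
import math

lines = 525
elements_per_line = 525
levels = 256
pictures_per_sec = 30

bits_per_element = math.log2(levels)            # 8 bits per element
bits_per_picture = lines * elements_per_line * bits_per_element
rate = bits_per_picture * pictures_per_sec      # bits per second
```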


11. A zero-memory source has a source alphabet S = {S1, S2, S3} with P = {1/2, 1/4, 1/4}. Find the entropy of the source. Also determine the entropy of its second extension and verify that H(S²) = 2H(S).
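Question 11 can be verified numerically: the second extension has nine symbol pairs whose probabilities are products of the original probabilities. A sketch, not a substitute for the analytical proof:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

P = [1/2, 1/4, 1/4]
H1 = entropy(P)                        # H(S) = 1.5 bits
# second extension: all ordered pairs (Si, Sj) with probability Pi*Pj
P2 = [pi * pj for pi in P for pj in P]
H2 = entropy(P2)                       # H(S^2) = 3.0 bits = 2 H(S)
```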

12. Show that the entropy is maximum when the source transmits symbols with equal probability. Plot the entropy of this source versus p (0 ≤ p ≤ 1).


    UNIT 2:

1.  What do you mean by source encoding? Name the functional requirements to be satisfied in the development of an efficient source encoder.

2.  For a binary communication system, a '0' or '1' is transmitted. Because of noise on the channel, a '0' can be received as a '1' and vice versa. Let m0 and m1 represent the events of transmitting '0' and '1' respectively, and let r0 and r1 denote the events of receiving '0' and '1' respectively. Let P(m0) = 0.5, P(r1/m0) = p = 0.1, P(r0/m1) = q = 0.2.
i.  Find P(r0) and P(r1).
ii.  If a '0' was received, what is the probability that a '0' was sent?
iii.  If a '1' was received, what is the probability that a '1' was sent?
iv.  Calculate the probability of error.
v.  Calculate the probability that the transmitted symbol is read correctly at the receiver.
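The probabilities in question 2 follow from the total-probability theorem and Bayes' rule; a numeric sketch:

```python
p_m0, p_m1 = 0.5, 0.5
p, q = 0.1, 0.2                          # P(r1|m0) and P(r0|m1)

p_r0 = p_m0 * (1 - p) + p_m1 * q         # total probability of receiving '0'
p_r1 = p_m0 * p + p_m1 * (1 - q)         # total probability of receiving '1'
p_m0_given_r0 = p_m0 * (1 - p) / p_r0    # Bayes: P(m0|r0)
p_m1_given_r1 = p_m1 * (1 - q) / p_r1    # Bayes: P(m1|r1)
p_error = p_m0 * p + p_m1 * q            # either symbol flipped
p_correct = 1 - p_error
```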

3.  State the Shannon-Hartley law. Derive an equation showing the efficiency of a system in terms of the information rate per unit bandwidth. How is the efficiency of the system related to the bandwidth?

4.  For a discrete memoryless source of entropy H(S), show that the average code-word length for any distortionless source encoding scheme is bounded as L ≥ H(S).

5.  Calculate the capacity of a standard 4 kHz telephone channel working in the range of 200 Hz to 3300 Hz with an S/N ratio of 30 dB.
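Question 5 is a direct application of the Shannon-Hartley law C = B log2(1 + S/N); a sketch, assuming the 200-3300 range is in Hz so that B = 3100 Hz:

```python
import math

B = 3300 - 200                  # bandwidth in Hz
snr_db = 30
snr = 10 ** (snr_db / 10)       # 30 dB -> S/N = 1000
C = B * math.log2(1 + snr)      # ≈ 30.9 kbits/s
```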

6.  What is the meaning of the term communication channel? Briefly explain the data communication channel, the coding channel and the modulation channel.

7.  Obtain the communication capacity of a noiseless channel transmitting n discrete messages per second.

    8.  Explain extremal property and additivity property.

9.  Suppose that S1 and S2 are two memoryless sources with symbol probabilities p1, p2, p3, ..., pn for source S1 and q1, q2, ..., qn for source S2. Show that the entropy of the source S1 satisfies

H(S1) ≤ ∑ (k=1 to n) pk log (1/qk)
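The inequality in question 9 (the Gibbs inequality) can be spot-checked numerically for any pair of distributions; a sketch with arbitrary example values:

```python
import math

p = [0.5, 0.25, 0.25]            # source S1
q = [0.4, 0.3, 0.3]              # source S2 (arbitrary example values)

H1 = sum(pk * math.log2(1 / pk) for pk in p)                  # entropy of S1
cross = sum(pk * math.log2(1 / qk) for pk, qk in zip(p, q))
# H(S1) <= sum pk log(1/qk), with equality iff p == q
```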

    10. Explain the concept of B/W and S/N trade-off with reference to the communication channel.


    UNIT 3:

1.  What are the important properties of codes?

2.  What are the disadvantages of variable-length coding?

3.  Explain uniquely decodable codes with examples.

4.  Explain instantaneous codes with examples.

5.  Explain the Shannon-Fano coding procedure for the construction of an optimum code.

6.  Explain clearly the procedure for the construction of a compact Huffman code.

7.  A discrete source transmits six message symbols with probabilities 0.3, 0.2, 0.2, 0.15, 0.1 and 0.05. Devise suitable Fano and Huffman codes for the messages and determine the average length and efficiency of each code.
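The Huffman average length and efficiency in question 7 can be cross-checked with a small routine. The helper `huffman_lengths` below is our own sketch; any valid Huffman code for these probabilities gives the same average length:

```python
import heapq, math

def huffman_lengths(probs):
    # Repeatedly merge the two least probable groups; every symbol in a
    # merged group gains one bit of codeword length.
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, g1 = heapq.heappop(heap)
        p2, g2 = heapq.heappop(heap)
        for i in g1 + g2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, g1 + g2))
    return lengths

probs = [0.3, 0.2, 0.2, 0.15, 0.1, 0.05]
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))  # average length
H = -sum(p * math.log2(p) for p in probs)                      # source entropy
efficiency = H / L                                             # ≈ 98.3 %
```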

8.  Consider the messages given by the probabilities 1/16, 1/16, 1/8, 1/4, 1/2. Calculate H. Use the Shannon-Fano algorithm to develop an efficient code and, for that code, calculate the average number of bits/message compared with H.

9.  Consider a source with an alphabet of 8 symbols and respective probabilities as shown:

Symbol: A    B    C    D    E    F    G    H
Prob:   0.20 0.18 0.15 0.10 0.08 0.05 0.02 0.01

Construct the binary Huffman code for this source. Construct the quaternary Huffman code and show that the efficiency of this code is worse than that of the binary code.

    10. Define Noiseless channel and deterministic channel.

11. A source produces symbols X, Y, Z with equal probabilities at a rate of 100/sec. Owing to noise on the channel, the probabilities of correct reception of the various symbols are as shown:

P(j/i)  X    Y    Z
X       3/4  1/4  0
Y       1/4  1/2  1/4
Z       0    1/4  3/4

    Determine the rate at which information is being received.
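Question 11 asks for the rate R = r · I(X;Y); a numerical sketch using I(X;Y) = H(Y) - H(Y|X) with the transition matrix above:

```python
import math

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist if p > 0)

P = [[3/4, 1/4, 0],      # rows: transmitted X, Y, Z
     [1/4, 1/2, 1/4],    # columns: received X, Y, Z
     [0, 1/4, 3/4]]
px = [1/3] * 3                                               # equiprobable inputs
py = [sum(px[i] * P[i][j] for i in range(3)) for j in range(3)]
I = entropy(py) - sum(px[i] * entropy(P[i]) for i in range(3))
rate = 100 * I           # 100 symbols/sec times I bits/symbol
```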


12. Determine the rate of transmission I(X,Y) through a channel whose noise characteristic is shown in the figure. P(A1) = 0.6, P(A2) = 0.3, P(A3) = 0.1.

[Figure: transition diagram from transmitted symbols A1, A2, A3 (T) to received symbols B1, B2, B3 (R); each branch shown carries a transition probability of 0.5.]

13. For a discrete memoryless source of entropy H(S), show that the average code-word length for any distortionless source encoding scheme is bounded as L ≥ H(S).

    14. Briefly discuss the classification of codes.

    15. Show that H(X,Y) = H(X/Y)+H(Y).

16. State the properties of mutual information.

    17. Show that I(X,Y) = I(Y,X) for a discrete channel.

18. Show that for a discrete channel I(X,Y) ≥ 0.

19. Determine the capacity of the channel shown in the figure.

[Figure: channel transition diagram with input x1 and outputs y1, y2; a transition probability of 1/2 is shown.]


20. For the binary erasure channel shown in the figure below, find the following:

i.  The average mutual information in bits
ii.  The channel capacity
iii.  The values of p(x1) and p(x2) for maximum mutual information

[Figure: binary erasure channel with inputs x1, x2 and outputs y1, y2 plus an erasure output e; each input is received correctly with probability 1 - p and erased with probability p.]

21. A DMS has an alphabet of seven symbols whose probabilities of occurrence are described here:

Symbol: S0 S1 S2 S3 S4 S5 S6
Prob: 0.25 0.125 0.125 0.125 0.125 0.0625 0.0625

Compute the Huffman code for this source, moving a combined symbol as high as possible. Explain why the computed source code has an efficiency of 100%.

22. Consider a binary block code with 2^n code words, each of the same length n. Show that the Kraft inequality is satisfied for such a code.
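Question 22 can be sanity-checked numerically: with 2^n codewords all of length n, the Kraft sum is exactly 1. A sketch for n = 4:

```python
n = 4
lengths = [n] * (2 ** n)                  # 2^n codewords, each of length n
kraft_sum = sum(2 ** -l for l in lengths)
# Kraft inequality: the sum of 2^-l over all codewords must be <= 1;
# here it equals 1 exactly, so the inequality is satisfied with equality.
```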

    23. Write short notes on the following:

    Binary Symmetric Channels (BSC), Binary Erasure Channels (BEC)

    Uniform Channel, Cascaded channels


    UNIT 4: 

1. Show that for an AWGN channel the capacity in the limit of infinite bandwidth is C∞ = (S/η) log2 e ≈ 1.44 S/η, where η/2 is the noise power spectral density in watts/Hz.
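The infinite-bandwidth limit in question 1 can be checked numerically: B log2(1 + S/(ηB)) approaches (S/η) log2 e as B grows. The values of S and η below are arbitrary assumptions for illustration:

```python
import math

S, eta = 1.0, 1e-3                         # signal power and noise density (assumed)
cap = lambda B: B * math.log2(1 + S / (eta * B))
limit = (S / eta) * math.log2(math.e)      # C_inf = 1.4427 * S/eta

# capacity at a very large bandwidth is already essentially at the limit
approx = cap(1e9)
```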

2. Consider an AWGN channel of 4 kHz bandwidth with noise power spectral density η/2 watts/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of the channel.

3.  Given I(xi, yj) = I(xi) - I(xi/yj), prove that I(xi, yj) = I(yj) - I(yj/xi).

4.  Design a single-error-correcting code with a message block size of 11 and show by an example that it can correct a single error.

5.  If Ci and Cj are two code vectors in an (n,k) linear block code, show that their sum is also a code vector.

6.  Show that CH^T = 0 for a linear block code.

7.  Prove that the minimum distance of a linear block code is the smallest weight of the non-zero code vectors in the code.


    UNIT 5: 

1.  Design a single-error-correcting code with a message block size of 11 and show by an example that it can correct a single error.

2.  If Ci and Cj are two code vectors in an (n,k) linear block code, show that their sum is also a code vector.

3.  Show that CH^T = 0 for a linear block code.

4.  Prove that the minimum distance of a linear block code is the smallest weight of the non-zero code vectors in the code.

5.  What is error control coding? Which functional blocks of a communication system accomplish this? Indicate the function of each block. What is the effect of error detection and correction on the performance of a communication system?

6.  Explain briefly the following: Golay codes and BCH codes.

    7.  Explain the methods of controlling errors

    8.  List out the properties of linear codes.

9.  Explain the importance of Hamming codes and how they can be used for error detection and correction.

10. Write a standard array for the (7,4) code.
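For questions 9 and 10, the single-error-correcting behaviour of the (7,4) Hamming code can be sketched in code. The parity submatrix P below is one common systematic choice, not necessarily the one used in class:

```python
# Systematic (7,4) Hamming code: codeword = [4 message bits | 3 parity bits].
P = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]]   # one valid parity submatrix
# Columns of H = [P^T | I3]; the syndrome equals the column at the error position.
H_cols = [tuple(row) for row in P] + [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

def encode(m):
    # append the three parity bits computed from the message bits
    return m + [sum(m[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]

def correct(r):
    # syndrome s = H r^T (mod 2); nonzero s points at the flipped bit
    s = tuple(sum(r[i] * H_cols[i][j] for i in range(7)) % 2 for j in range(3))
    if s != (0, 0, 0):
        r = r[:]
        r[H_cols.index(s)] ^= 1
    return r
```

Flipping any single bit of a codeword and running `correct` recovers it, since the seven columns of H are distinct and nonzero.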


    UNIT 6: 

1.  Write a standard array for a systematic cyclic code.

    2.  Explain the properties of binary cyclic codes.

3.  With neat diagrams, explain binary cyclic encoding and decoding.

4.  Explain how a Meggitt decoder can be used for decoding cyclic codes.

    5.  Write short notes on BCH codes

6.  Draw the general block diagram of an encoding circuit using an (n-k)-bit shift register and explain its operation.

7.  Draw the general block diagram of a syndrome calculation circuit for cyclic codes and explain its operation.

    UNIT 7: 

    1. What are RS codes? How are they formed?

    2. Write down the parameters of RS codes and explain those parameters with an example.

    3. List the applications of RS codes.

4. Explain why the Golay code is called a perfect code.

    5. Explain the concept of shortened cyclic code.

    6. What are burst error controlling codes?

    7. Explain clearly the interlacing technique with a suitable example.

8. What are Cyclic Redundancy Check (CRC) codes?


    UNIT 8: 

1. What are convolutional codes? How are they different from block codes?

2. Explain encoding of convolutional codes using the time-domain approach with an example.

3. Explain encoding of convolutional codes using the transform-domain approach with an example.
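The time-domain encoding of questions 2 and 3 amounts to mod-2 convolution of the input sequence with each generator sequence. A sketch for a rate-1/2, constraint-length-3 encoder with the common (7, 5) octal generators; this is an assumed example, not necessarily the one from class:

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    # Shift-register encoder: each input bit produces two output bits,
    # the mod-2 inner products of (current bit + register) with g1 and g2.
    state = [0, 0]                       # two-stage shift register
    out = []
    for b in bits:
        window = [b] + state
        out.append(sum(x * y for x, y in zip(window, g1)) % 2)
        out.append(sum(x * y for x, y in zip(window, g2)) % 2)
        state = [b] + state[:-1]         # shift the new bit in
    return out
```

For the input 1011 this encoder emits 11 10 00 01, which can also be obtained by polynomial multiplication in the transform domain.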

4. What do you understand by the state diagram of a convolutional encoder? Explain clearly.

5. What do you understand by the code tree of a convolutional encoder? Explain clearly.

6. What do you understand by the trellis diagram of a convolutional encoder? Explain clearly.