Entropy Coding MCQ Quiz in Telugu - Objective Questions with Answers for Entropy Coding - Free [PDF] Download
Last updated on Apr 23, 2025
Top Entropy Coding MCQ Objective Questions
Entropy Coding Question 1:
A source generates 5 symbols A = {a1, …, a5} with P(ai) = {0.2, 0.4, 0.2, 0.1, 0.1} at a rate of 3000 symbols per second. If all the symbols are generated independently, the average bit rate of the most efficient source encoder is ____.
Answer (Detailed Solution Below)
Entropy Coding Question 1 Detailed Solution
Concept:
Huffman coding gives the most efficient (minimum average length) source code.
Bit rate = average codeword length × symbol rate
Calculations:
| Letter | Probability | Code word |
|--------|-------------|-----------|
| a2     | 0.4         | 1         |
| a1     | 0.2         | 01        |
| a3     | 0.2         | 000       |
| a4     | 0.1         | 0010      |
| a5     | 0.1         | 0011      |
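The table above can be reproduced with a short Python sketch (heapq-based merging; the dictionary layout is an assumption for illustration). The exact codewords depend on how ties are broken, but the average codeword length, and hence the bit rate, comes out the same:

```python
import heapq

# Minimal Huffman construction for Question 1 (sketch).
probs = {"a1": 0.2, "a2": 0.4, "a3": 0.2, "a4": 0.1, "a5": 0.1}

# Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)

while len(heap) > 1:
    p1, _, group1 = heapq.heappop(heap)   # two least probable groups
    p2, _, group2 = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in group1.items()}
    merged.update({s: "1" + c for s, c in group2.items()})
    heapq.heappush(heap, (p1 + p2, tie, merged))
    tie += 1

codes = heap[0][2]
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)                        # codewords may differ from the table (tie-breaking)
print("average length:", avg_len)   # 2.2 bits/symbol
print("bit rate:", avg_len * 3000)  # 6600 bits/second
```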
Average code word length:
L = (0.4 × 1) + (0.2 × 2) + (0.2 × 3) + (0.1 × 4) + (0.1 × 4)
= 2.2 bits/symbol
Bit rate = 2.2 × 3000
= 6600 bits/second
Entropy Coding Question 2:
In order to permit the selection of 1 out of 16 equi-probable events, what is the number of bits required?
Answer (Detailed Solution Below)
Entropy Coding Question 2 Detailed Solution
Explanation:
Number of Bits Required to Select 1 Out of 16 Equi-Probable Events
Definition: The number of bits required to select 1 out of 16 equi-probable events follows from information theory: it is the binary logarithm of the total number of events, log2(number of events). Each bit represents one binary decision (0 or 1), so the total number of bits is the amount of information needed to identify a single event uniquely.
Calculation:
To determine the number of bits required:
- The number of equi-probable events is given as 16.
- The formula for calculating the number of bits is: Number of bits = log2(number of events)
- Substituting the value of 16: Number of bits = log2(16)
- Since 16 is a power of 2 (2⁴ = 16), the binary logarithm is straightforward: log2(16) = 4
- Therefore, 4 bits are required to select 1 out of 16 equi-probable events.
Correct Answer: Option 2 (4 bits)
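For a quick numerical check, a minimal Python sketch using the standard math module:

```python
import math

# Bits needed to uniquely identify 1 of N equiprobable events.
N = 16
print(math.log2(N))   # 4.0
# If N were not a power of two, round up: math.ceil(math.log2(N))
```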
Entropy Coding Question 3:
For a source with M equiprobable symbols, what is the formula for calculating entropy (H)?
Answer (Detailed Solution Below)
Entropy Coding Question 3 Detailed Solution
Concept:
Entropy is a measure of the uncertainty or randomness in a set of data or information. In the context of information theory, it quantifies the average amount of information produced by a stochastic source of data. For a source with M equiprobable symbols, the entropy (H) is calculated using the formula:
H = log2(M) bits/symbol
where M is the number of equiprobable symbols.
This is because each symbol has an equal probability of occurring, and the entropy measures the average amount of information per symbol. When all symbols are equiprobable, the entropy is simply the logarithm of the number of symbols to the base 2, which represents the average number of bits needed to encode each symbol.
Therefore, the correct answer is option 2.
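A small numerical sketch (M = 8 is assumed purely for illustration) shows the entropy sum collapsing to log2(M) when all symbols are equiprobable:

```python
import math

# For M equiprobable symbols, sum(p * log2(1/p)) reduces to log2(M).
M = 8
p = 1.0 / M
H = sum(p * math.log2(1 / p) for _ in range(M))
print(H, math.log2(M))   # 3.0 3.0
```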
Entropy Coding Question 4:
In order to permit the selection of 1 out of 16 equi-probable events, what is the number of bits required?
Answer (Detailed Solution Below)
Entropy Coding Question 4 Detailed Solution
To determine the number of bits required to select 1 out of 16 equi-probable events, we can use the formula:
Number of bits = log₂(Number of events)
Given that there are 16 events:
Number of bits = log₂(16)
Since 16 is 2 raised to the 4th power (2⁴):
log₂(16) = log₂(2⁴) = 4
Therefore, 4 bits are required to uniquely select 1 out of 16 equi-probable events.
This calculation aligns with standard information theory, where the number of bits needed to represent a given number of distinct states is determined by the base-2 logarithm of the total number of states.
Therefore, the correct answer is:
Option 2: 4
Entropy Coding Question 5:
Given below are two statements:
Statement I: Turbo codes exhibit performance, in terms of bit error probability, that is very close to the Shannon limit and can be efficiently implemented for high-speed use.
Statement II: Convolutional codes provide the worst performance in noisy channels where a high proportion of the bits are in error.
In the light of the above statements, choose the correct answer from the options given below:
Answer (Detailed Solution Below)
Entropy Coding Question 5 Detailed Solution
Concept:
- Error-correcting codes are sequences of numbers generated by specific algorithms for detecting and removing errors in data that has been transmitted over noisy channels.
- In linear block codes, the information bits are immediately followed by the parity bits. Block codes take k input bits and produce n output bits, where k and n can be very large.
- In convolutional codes, the message comprises data streams of arbitrary length, and a sequence of output bits is generated by the sliding application of Boolean functions to the data stream. Encoding of the current state depends on the previous state and past elements; thus the encoder has a memory element for storing previous state information.
- Turbo codes are forward error-correction (FEC) codes. A convolutional code and an interleaver are combined in a turbo code to achieve random-like encoding. In convolutional codes, the information bits are not immediately followed by parity bits; instead, the parity bits are spread along the sequence.
- Turbo codes closely approach the maximum channel capacity, or Shannon limit, the theoretical maximum code rate at which reliable communication is possible for a given noise level.
Solution:
Turbo codes are a class of convolutional codes whose performance in terms of bit error rate (BER) is close to the Shannon limit. Turbo codes are used in 3G/4G mobile communications and in satellite communications, and hence they can be efficiently implemented for high-speed use. Therefore, Statement I is correct.
Convolutional codes perform better than block codes at the higher error probabilities encountered in noisy channels. Hence, Statement II is incorrect.
Additional Information
Turbo Codes Encoder:
The interleaver permutes the input bits so that the two constituent encoders operate on the same set of input bits, but in different sequences.
Drawbacks of Turbo codes:
- High decoding complexity
- High latency due to interleaving and iterative decoding
Entropy Coding Question 6:
A memoryless source has symbols S = {−3, −1, 0, 1, 3} with corresponding probabilities {0.3, 0.2, 0.1, 0.2, 0.2}. The entropy of the source is:
Answer (Detailed Solution Below)
Entropy Coding Question 6 Detailed Solution
Concept:
Entropy is the average information content of a source.
For a source X, the entropy is
H(X) = -Σ Pi log2(Pi) bits/symbol
where Pi is the probability of occurrence of the i-th symbol.
Calculation:
H(X) = -0.3 log2(0.3) - 0.2 log2(0.2) - 0.1 log2(0.1) - 0.2 log2(0.2) - 0.2 log2(0.2)
H(X) ≈ 2.246 bits/symbol
Additional Information
To convert log10 into log2: calculate the logarithm in base 10 and multiply the result by 3.32 (since 1/log10(2) ≈ 3.32).
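Both the exact entropy and the log10 shortcut can be checked with a short Python sketch:

```python
import math

# Entropy of the source in Question 6.
p = [0.3, 0.2, 0.1, 0.2, 0.2]
H = -sum(pi * math.log2(pi) for pi in p)
print(round(H, 3))                          # about 2.246 bits/symbol

# log10 -> log2 shortcut: log2(x) ~ 3.32 * log10(x)
H_approx = -sum(pi * 3.32 * math.log10(pi) for pi in p)
print(round(H_approx, 3))                   # close to the exact value
```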
Entropy Coding Question 7:
Consider a source with four symbols. The entropy of the source will be maximum when probabilities of occurrence of symbols are:
Answer (Detailed Solution Below)
Entropy Coding Question 7 Detailed Solution
Concept:
Entropy is the average information content of a source.
For a source X with m events,
H(X) = Σ Pi log2(1/Pi) bits/symbol
Case 1: all m events are equiprobable, each with probability p (so m = 1/p):
H(X) = log2(m) bits/symbol
Case 2: any one event is certain (probability = 1):
H(X) = 0 bits/symbol
In general, 0 ≤ H(X) ≤ log2(m).
The entropy of a source is maximum only when all events are equiprobable.
Explanation:
Four symbols: m = 4
For maximum entropy, all symbols must be equiprobable, i.e. m = 1/p:
4 = 1/p
p = 1/4
Hence option "2" is correct.
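A quick comparison in Python (the skewed distribution is an arbitrary example, chosen only for illustration) confirms that the uniform case gives the largest entropy for four symbols:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]    # all four symbols equiprobable
skewed = [0.5, 0.25, 0.125, 0.125]    # any non-uniform alternative

print(entropy(uniform))   # 2.0 bits/symbol = log2(4), the maximum
print(entropy(skewed))    # 1.75 bits/symbol, strictly less
```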
Entropy Coding Question 8:
Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?
Answer (Detailed Solution Below)
Entropy Coding Question 8 Detailed Solution
Given:
X is a discrete random variable taking K possible distinct real values.
Analysis:
If the K values are equally likely, the entropy is maximum: H(X)max = log2(K).
If the K values have different probabilities: H(X) < log2(K).
So H(X) ≤ log2(K), and option (1) is correct.
As per option (2): H(X) ≤ H(2X)
Let Y = 2X. The map x ↦ 2x is one-to-one, so distinct X values give distinct Y values and H(Y) = H(X). Thus H(X) = H(2X), and option (2) is correct.
As per option (3): H(X) ≤ H(X²)
Let Y = X². The map x ↦ x² is not one-to-one: distinct X values (for example +1 and -1) can give the same Y value, so their probabilities merge and H(X²) ≤ H(X), with strict inequality possible. Hence option (3) is incorrect.
As per option (4): H(X) ≤ H(2^X)
Let Y = 2^X. Here distinct X values give distinct Y values (x ↦ 2^x is one-to-one), so H(X) = H(Y), i.e. H(X) = H(2^X). Option (4) is correct.
NOTE: Options 1, 2, and 4 are correct.
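The effect of the three transformations can be checked numerically. The sketch below uses an example distribution chosen for illustration (it includes a +1/-1 pair so the squaring effect is visible; this distribution is an assumption, not part of the question), and shows that the one-to-one maps preserve entropy while squaring can reduce it:

```python
import math
from collections import Counter

def entropy(pmf):
    # Entropy in bits of a pmf given as {value: probability}.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def transformed(pmf, f):
    # PMF of f(X); probabilities of values mapping to the same output add up.
    out = Counter()
    for x, p in pmf.items():
        out[f(x)] += p
    return dict(out)

pX = {-1: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}   # K = 4 distinct values

print(entropy(pX))                                  # 2.0 = log2(4)
print(entropy(transformed(pX, lambda x: 2 * x)))    # 2.0  (one-to-one)
print(entropy(transformed(pX, lambda x: 2 ** x)))   # 2.0  (one-to-one)
print(entropy(transformed(pX, lambda x: x * x)))    # 1.5  (+1 and -1 merge)
```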
Entropy Coding Question 9:
Directions: The item consists of two statements, one labeled as the ‘Assertion (A)’ and the other as ‘Reason (R)’.
You are to examine these two statements carefully and select the answer to the item using the codes given below:
Assertion (A): Source produces two symbols A and B with probability 3/4 and 1/4 respectively. For error-free transmission, this source should be coded using Shannon-Fano code.
Reason (R): For better transmission efficiency, source and channel must be matched.
Answer (Detailed Solution Below)
A is false but R is true
Entropy Coding Question 9 Detailed Solution
Concept:
- Shannon-Fano coding can produce error-free transmission when more than two symbols are transmitted.
- If the source and channel are matched, we observe higher entropy, i.e., higher average information (H).
Efficiency η = H / L, where L is the average codeword length.
Conclusion:
Higher entropy gives higher transmission efficiency.
Entropy Coding Question 10:
A discrete memoryless source emits a symbol U that takes 5 different values U1, U2, U3, U4, U5 with probabilities 0.25, 0.25, 0.25, 0.125, 0.125 respectively. A binary Shannon-Fano code consists of codewords of lengths:
Answer (Detailed Solution Below)
Entropy Coding Question 10 Detailed Solution
The length ωi of the codeword for ui is given by ωi = ⌈logD(1/Pi)⌉, where D = 2 for binary codes.
Substituting the probabilities:
ω1 = ω2 = ω3 = log2(1/0.25) = 2
ω4 = ω5 = log2(1/0.125) = 3
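The same lengths follow from a one-line computation in Python (a minimal sketch of the length rule above; the symbol names and dictionary layout follow the problem statement):

```python
import math

# Codeword lengths from the length rule above (D = 2 for a binary code).
p = {"U1": 0.25, "U2": 0.25, "U3": 0.25, "U4": 0.125, "U5": 0.125}
lengths = {u: math.ceil(math.log2(1 / pi)) for u, pi in p.items()}
print(lengths)   # {'U1': 2, 'U2': 2, 'U3': 2, 'U4': 3, 'U5': 3}
```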