Information Theory MCQ Quiz - Objective Questions with Answers for Information Theory

Last updated on Mar 8, 2025

Latest Information Theory MCQ Objective Questions

Information Theory Question 1:

In the context of the Channel Capacity Theorem, what role does channel bandwidth play in determining the capacity of a communication channel?

  1. Channel Bandwidth has no effect on channel capacity. 
  2. Increasing channel bandwidth decreases channel capacity.
  3. Channel Bandwidth is directly proportional to channel capacity.
  4. Channel Bandwidth is inversely proportional to channel capacity.

Answer (Detailed Solution Below)

Option 3 : Channel Bandwidth is directly proportional to channel capacity.

Information Theory Question 1 Detailed Solution

Concept

The Channel Capacity Theorem, also known as the Shannon-Hartley theorem, is a fundamental principle in telecommunications and information theory. It determines the maximum rate at which information can be transmitted over a communication channel without error, given the channel's bandwidth and the signal-to-noise ratio.

According to the Shannon-Hartley theorem, the channel capacity is directly proportional to the channel bandwidth. This means that increasing the channel bandwidth will increase the channel capacity, allowing more information to be transmitted over the channel without error.

The theorem is expressed as:

\(C = B \log_2(1 + \frac{S}{N})\)

where:

  • C is the channel capacity in bits per second (bps).
  • B is the channel bandwidth in hertz (Hz).
  • S is the average signal power.
  • N is the average noise power.
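As a quick numerical illustration, here is a minimal Python sketch of the formula; the bandwidth and SNR values are hypothetical, chosen only to show that C scales linearly with B.

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of a band-limited AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical values: a 3 kHz channel with a linear SNR of 1000 (30 dB).
print(channel_capacity(3000, 1000))  # ~29,902 bps
print(channel_capacity(6000, 1000))  # doubling B doubles C: ~59,803 bps
```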

 

Information Theory Question 2:

For a source with M equiprobable symbols, what is the formula for calculating entropy (H)?

  1. H = M log(M)
  2. H = log2(M)
  3. H = -log2(M)
  4. H = -M log2(M)

Answer (Detailed Solution Below)

Option 2 : H = log2(M)

Information Theory Question 2 Detailed Solution

Concept

Entropy is a measure of the uncertainty or randomness in a set of data or information. In the context of information theory, it quantifies the average amount of information produced by a stochastic source of data. For a source with M equiprobable symbols, the entropy (H) is calculated using the formula:

\(H = \log_2(M)\)

where M is the number of equiprobable symbols.

This is because each symbol has an equal probability of occurring, and the entropy measures the average amount of information per symbol. When all symbols are equiprobable, the entropy is simply the logarithm of the number of symbols to the base 2, which represents the average number of bits needed to encode each symbol.

Therefore, the correct answer is option 2.
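A minimal Python sketch (with a hypothetical M = 8) shows that the general entropy formula reduces to log2(M) when all symbols are equiprobable:

```python
import math

M = 8              # hypothetical number of equiprobable symbols
p = [1.0 / M] * M  # each symbol occurs with probability 1/M

# General formula: H = sum over i of p_i * log2(1 / p_i)
H = sum(pi * math.log2(1.0 / pi) for pi in p)
print(H, math.log2(M))  # both print 3.0
```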

Information Theory Question 3:

__________ refers to the informal networks of communication that intersect several paths, circumvent rank or authority, and can link organizational members in any combination or direction.

  1. Grapevine 
  2. Informational role 
  3. Hierarchy 
  4. Integration

Answer (Detailed Solution Below)

Option 1 : Grapevine 

Information Theory Question 3 Detailed Solution

The correct answer is Grapevine.
 Key Points
  • Grapevine is a casual business communication channel.
  • Its name stems from the fact that it extends in all directions throughout the company, regardless of authority levels.
  • The phrase "heard through the grapevine" refers to information obtained through rumors or gossip that is informal and unofficial.
  • The typical interpretation is that the information was spread orally among friends or coworkers, sometimes in a private manner.
  • Even if there are official channels in an organization, informal channels usually arise from interactions between members of the organization.

 Important Points

  • In informational roles, managers acquire, create, and disseminate information among staff members and senior colleagues in order to achieve goals.
  • All designs, including e-learning design, should adhere to the universal design concept of the hierarchy of information.
  • Combining data from disparate sources with various conceptual, contextual, and typographic representations is known as information integration (II). 

Information Theory Question 4:

Noise factor of a system is defined as:

  1. Ratio of input signal to output signal
  2. Ratio of input S/N ratio to output S/N ratio 
  3. Ratio of output S/N ratio to input S/N ratio
  4. Ratio of output signal to noise ratio

Answer (Detailed Solution Below)

Option 2 : Ratio of input S/N ratio to output S/N ratio 

Information Theory Question 4 Detailed Solution

Noise factor (F) and noise figure (NF) are measures of the degradation of the signal-to-noise ratio (SNR) caused by components in a signal chain. The noise factor is the linear ratio:

\(F = \frac{{{{\left( {SNR} \right)}_{i/p}}}}{{{{\left( {SNR} \right)}_{o/p}}}}\)

Expressed in dB, it is called the noise figure:

\((NF)_{dB} = [(SNR)_{i/p}]_{dB} - [(SNR)_{o/p}]_{dB}\)

Important Points

In terms of noise resistance, the noise factor is given as:

\(F = 1 + \frac{{{R_{eq}}}}{{{R_s}}}\) ------(1)

Req = equivalent noise resistance of the receiver

Rs = noise (source) resistance of the antenna
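A minimal Python sketch of these definitions, using hypothetical input and output SNRs:

```python
import math

snr_in = 100.0  # hypothetical input SNR as a linear power ratio (20 dB)
snr_out = 50.0  # hypothetical output SNR as a linear power ratio (~17 dB)

F = snr_in / snr_out        # noise factor (linear ratio)
NF_dB = 10 * math.log10(F)  # noise figure in dB
print(F, NF_dB)             # 2.0, ~3.01 dB
```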

Information Theory Question 5:

An image of a document that is scanned and sent electronically over a telephone line:

  1. Modem
  2. Fax
  3. Message
  4. E - mail

Answer (Detailed Solution Below)

Option 2 : Fax

Information Theory Question 5 Detailed Solution

Faxes are specifically designed to transmit scanned images of documents over telephone lines.

Key Points

  • The scanned image is converted into a signal that can be transmitted through the phone line and then converted back into an image at the receiving end.
  • This makes them a good option for sending documents that need to be preserved in their original format, such as signed papers or contracts.

Hint

  • Modem: A modem is a device that allows a computer to connect to the internet or another computer using a telephone line.
  • Message: This is a broad term that could refer to any type of communication, including text, voice, or video. 
  • E-mail: E-mail is a digital communication system that uses computers and networks to send and receive messages. 

Hence, the correct answer is Fax.

Top Information Theory MCQ Objective Questions

1 dB corresponds to ______ change in power level.

  1. 35%
  2. 26%
  3. 50%
  4. 14%

Answer (Detailed Solution Below)

Option 2 : 26%

Information Theory Question 6 Detailed Solution


The correct option is 2

Concept:

Decibels are used to express the ratio of two power values. The decibel scale is a logarithmic scale, which is a more useful way to compare ratios than a plain arithmetic scale.

The decibel (dB) is defined as 10 times the base-10 logarithm of the power ratio:

\(dB = 10\log_{10}\left(\frac{P_2}{P_1}\right)\)

To get the power ratio when the dB value is given, the formula is rearranged as:

\(\frac{P_2}{P_1} = 10^{dB/10}\)

So for a change of 1 dB, the power ratio is:

\(10^{1/10} \approx 1.25892\)

This is about a 26% increase in power level, since 1.25892 − 1 = 0.25892, which is roughly 26%.

So, a 1 dB increase corresponds to approximately a 26% increase in power level, while a 1 dB decrease corresponds to about a 21% reduction (\(10^{-1/10} \approx 0.794\)).
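A two-line Python check of both directions of a 1 dB change:

```python
# Power ratios for a +/-1 dB change, from P2/P1 = 10^(dB/10).
ratio_up = 10 ** (1 / 10)     # +1 dB -> ~1.259, about a 26% increase
ratio_down = 10 ** (-1 / 10)  # -1 dB -> ~0.794, about a 21% decrease
print(ratio_up, ratio_down)
```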

Let (X1, X2) be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 and variance 4. The mutual information I(X1; X2) between X1 and X2 in bits is _______.

Answer (Detailed Solution Below) 0

Information Theory Question 7 Detailed Solution


Concept:

The mutual information of two random variables measures how much knowing one of them tells us about the other.

It is mathematically defined as:

I(X1; X2) = H(X1) – H(X1|X2)

Application:

Since X1 and X2 are independent, we can write:

H(X1|X2) = H(X1)

I(X1; X2) = H(X1) – H(X1)

= 0
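A minimal Python sketch using the equivalent sum form \(I(X_1;X_2) = \sum p(x_1,x_2)\log_2\frac{p(x_1,x_2)}{p(x_1)p(x_2)}\); the two marginal distributions are hypothetical, and independence makes the joint distribution factorize, so every log term is zero:

```python
import numpy as np

p1 = np.array([0.5, 0.5])    # hypothetical distribution of X1
p2 = np.array([0.25, 0.75])  # hypothetical distribution of X2

# Independence: the joint distribution factorizes, p(x1, x2) = p(x1) p(x2)
joint = np.outer(p1, p2)

# I(X1; X2) = sum of p(x1, x2) * log2( p(x1, x2) / (p(x1) * p(x2)) )
I = np.sum(joint * np.log2(joint / np.outer(p1, p2)))
print(I)  # 0.0
```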

Maximum data rate of a channel for a noiseless 2-kHz binary channel is -

  1. 2000 bps
  2. 1000 bps
  3. 4000 bps
  4. 3000 bps

Answer (Detailed Solution Below)

Option 3 : 4000 bps

Information Theory Question 8 Detailed Solution


The correct option is 3

Concept:

The maximum data rate of a noiseless channel is determined by the Nyquist formula, which states that the maximum data rate (in bits per second) is:

\(C = 2B\log_2 L\)

where B is the bandwidth and L is the number of signal levels.

For a binary channel there are only two signal levels (0 and 1), so \(\log_2 L = \log_2 2 = 1\).

So, for a 2-kHz binary channel, substituting the bandwidth into the formula gives: 2 × 2000 Hz × 1 = 4000 bps.
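A minimal Python sketch of the same calculation:

```python
import math

B = 2000  # bandwidth in Hz, from the question
L = 2     # binary channel: two signal levels

C = 2 * B * math.log2(L)  # Nyquist formula for a noiseless channel
print(C)                  # 4000.0 bps
```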

An event has two possible outcomes with probabilities \(P_1=\frac{1}{2}\) and \(P_2=\frac{1}{64}\). The rate of information with 16 outcomes per second is:

  1. (38/4) bits/sec
  2. (38/64) bits/sec
  3. (38/2) bits/sec
  4. (38/32) bits/sec

Answer (Detailed Solution Below)

Option 1 : (38/4) bits/sec

Information Theory Question 9 Detailed Solution


Concept:

Information associated with the event is “inversely” proportional to the probability of occurrence.

Entropy: The average amount of information is called the “Entropy”.

\(H = \;\mathop \sum \limits_i {P_i}{\log _2}\left( {\frac{1}{{{P_i}}}} \right)\;bits/symbol\)

Rate of information = r.H

Calculation:

Given: r = 16 outcomes/sec

\(P_1=\frac{1}{2}\), and \(P_2=\frac{1}{64}\) 

\(H = \frac{1}{2}\log_2 2 + \frac{1}{64}\log_2 64\)

\(H=\frac{19}{32}\) bits/outcome

∴ Rate of information = r.H

R = 16 × 19/32

R = 19/2 = 38/4 bits/sec
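A minimal Python sketch reproducing this calculation with the probabilities as given in the question:

```python
import math

r = 16                   # outcomes per second
probs = [1 / 2, 1 / 64]  # probabilities as given in the question

# H = sum of P_i * log2(1 / P_i), in bits per outcome
H = sum(p * math.log2(1 / p) for p in probs)
print(H, r * H)  # 0.59375 (= 19/32) and 9.5 (= 38/4) bits/sec
```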

The channel capacity is measured in terms of:

  1. bits per channel 
  2. number of input channels connected 
  3. calls per channel 
  4. number of output channels connected 

Answer (Detailed Solution Below)

Option 1 : bits per channel 

Information Theory Question 10 Detailed Solution


Channel Capacity theorem: 

It states that the channel capacity C is the theoretical upper bound on the rate at which information can be communicated at an arbitrarily low error rate, using an average received signal power S, over an analog communication channel subject to additive white Gaussian noise (AWGN) of power N.

The capacity of a band-limited AWGN channel is given by the formula:

\(C = B{\log _2}\left( {1 + \frac{S}{N}} \right)\)

C = maximum achievable data rate, in bits/sec when the channel input and output are continuous in nature; for a discrete input/output, bits/channel is used.

B = channel bandwidth

\(\frac{S}{N}\) = signal-to-noise power ratio (with S and N in W)

Note: In the expression of channel capacity, S/N is expressed in Watts and not in dB.
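A minimal Python sketch of that conversion, with a hypothetical 30 dB SNR over a hypothetical 4 kHz bandwidth:

```python
import math

snr_db = 30                        # hypothetical SNR in dB
snr_linear = 10 ** (snr_db / 10)   # convert to a linear ratio: 1000

B = 4000                           # hypothetical bandwidth in Hz
C = B * math.log2(1 + snr_linear)  # the formula needs the linear ratio
print(C)                           # ~39,869 bps
```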

A (7, 4) block code has a generator matrix as shown.

\(G = \left[ {\begin{array}{*{20}{c}} 1&0&0&0&1&1&0\\ 0&1&0&0&0&1&1\\ 0&0&1&0&1&1&{1}\\ 0&0&0&1&1&0&1 \end{array}} \right]\)

If there is error in the 7th Bit then syndrome for the same will be

  1. 001
  2. 010
  3. 100
  4. 011

Answer (Detailed Solution Below)

Option 1 : 001

Information Theory Question 11 Detailed Solution

Download Solution PDF

The generator matrix is given by:

\(G = \left[ {{I_K}{P^T}} \right]\)

\({P^T} = \left[ {\begin{array}{*{20}{c}} 1&1&0\\ 0&1&1\\ 1&1&1\\ 1&0&1 \end{array}} \right]\)

The parity check matrix is given by:

\(H = \left[ {P\;\;{I_{n - k}}} \right]\)

Syndrome

\(S = eH^T\)

\({H^T} = \left[ {\begin{array}{*{20}{c}} {{P^T}}\\ {{I_{n - k}}} \end{array}} \right]\)

\({H^T} = \left[ {\begin{array}{*{20}{c}} 1&1&0\\ 0&1&1\\ 1&1&1\\ 1&0&1\\ 1&0&0\\ 0&1&0\\ 0&0&1 \end{array}} \right]\)

\(S = eH^T\)

For an error in the 7th bit:

e = [0 0 0 0 0 0 1]

\(S = \left[ {000\;000\;1} \right]\left[ {\begin{array}{*{20}{c}} 1&1&0\\ 0&1&1\\ 1&1&1\\ 1&0&1\\ 1&0&0\\ 0&1&0\\ 0&0&1 \end{array}} \right]\)

S = [ 0 0 1]
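A minimal Python/NumPy sketch of the syndrome computation \(S = eH^T \pmod 2\), using \(H^T\) exactly as given above:

```python
import numpy as np

# H^T as derived in the solution: P^T stacked over the 3x3 identity.
H_T = np.array([
    [1, 1, 0],
    [0, 1, 1],
    [1, 1, 1],
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
])

e = np.array([0, 0, 0, 0, 0, 0, 1])  # error in the 7th bit
S = (e @ H_T) % 2                    # syndrome, computed modulo 2
print(S)                             # [0 0 1]
```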

Extra information:

Syndrome for all possible errors

Error Pattern    Syndrome
0000000          000
0000001          001
0000010          010
0000100          100
0001000          101
0010000          111
0100000          011
1000000          110

________ is the average amount of information that must be delivered in order to resolve the uncertainty about the outcome of a trial.

  1. Bandwidth
  2. Entropy
  3. Quantum
  4. Loss

Answer (Detailed Solution Below)

Option 2 : Entropy

Information Theory Question 12 Detailed Solution


Entropy: It is the average amount of information that must be delivered in order to resolve the uncertainty about the outcome of a trial.

In other words, the entropy of a probability distribution is the average amount of information gained when drawing from that distribution.

It is calculated as:

\(H = \mathop \sum \limits_{i = 1}^M {p_i}{\log _2}\frac{1}{{{p_i}}}\) bits

\(p_i\) is the probability of the occurrence of a symbol.

The number of outcomes = M.

Notes:

Bandwidth: The bandwidth of a communication channel is defined as the difference between the highest and lowest frequencies that the channel allows to pass through it (i.e., its passband).

Quantum: A quantum is defined as the minimum amount of any physical entity involved in an interaction. For example, a photon is a single quantum of light.

If the probability of a message is 1/4, then the information in bits is:

  1. 8 bit
  2. 4 bit
  3. 2 bit
  4. 1 bit

Answer (Detailed Solution Below)

Option 3 : 2 bit

Information Theory Question 13 Detailed Solution


Concept:

Information associated with the event is “inversely” proportional to the probability of occurrence.

Mathematically, this is defined as:

\(I = {\log _2}\left( {\frac{1}{P}} \right)\;bits\)

P and I represent the probability and information associated with the event

Calculation:

With P = 1/4, the information associated with it will be:

\(I = {\log _2}\left( {\frac{1}{1/4}} \right)\;bits\)

\(I = {\log _2}\left( {4} \right)\;bits\)

\(I = \log_2(2^2)\) bits

Since \(\log_x y^n = n\log_x y\), the above can be written as:

\(I = 2\log_2 2\)

I = 2 bits


Entropy: The average amount of information is called the “Entropy”. It is given by:

\(H = \;\mathop \sum \limits_i {P_i}{\log _2}\left( {\frac{1}{{{P_i}}}} \right)\;bits/symbol\)

Information is:

  1. the synonym of probability
  2. not related to the probability of information
  3. inversely proportional to the probability of information
  4. directly proportional to the probability of information

Answer (Detailed Solution Below)

Option 3 : inversely proportional to the probability of information

Information Theory Question 14 Detailed Solution


Concept:

Information associated with the event is “inversely” proportional to the probability of occurrence.

Mathematically, this is defined as:

\(I = {\log _2}\left( {\frac{1}{P}} \right)\;bits\)

P and I represent the probability and information associated with the event

Example

Sun rises in the east: The probability of this event occurring is 1, so the information associated with it is 0.

Sun rises in the west: P = 0, and I → ∞ (as the probability of an event approaches zero, the information associated with it grows without bound).


Entropy: The average amount of information is called the “Entropy”. It is given by:

\(H = \;\mathop \sum \limits_i {P_i}{\log _2}\left( {\frac{1}{{{P_i}}}} \right)\;bits/symbol\)

