Apply the Shannon-Fano algorithm to a source with M = 8 emitting messages A, B, C, D, E, F, G, H with probabilities P(A) = 1/2, P(B) = P(C) = 1/8, P(D) = P(E) = P(F) = 1/16, P(G) = P(H) = 1/32. Calculate the entropy, the average code length, and the efficiency of the coding.
The probabilities of the eight messages are P(A) = 1/2, P(B) = P(C) = 1/8, P(D) = P(E) = P(F) = 1/16, P(G) = P(H) = 1/32.
Applying the Shannon-Fano algorithm, we obtain
Table 1
| Message | Probability | Step 1 | Step 2 | Step 3 | Step 4 | Step 5 | Code |
|---------|-------------|--------|--------|--------|--------|--------|-------|
| A | 0.5 | 0 | | | | | 0 |
| B | 0.125 | 1 | 0 | 0 | | | 100 |
| C | 0.125 | 1 | 0 | 1 | | | 101 |
| D | 0.0625 | 1 | 1 | 0 | 0 | | 1100 |
| E | 0.0625 | 1 | 1 | 0 | 1 | | 1101 |
| F | 0.0625 | 1 | 1 | 1 | 0 | | 1110 |
| G | 0.03125 | 1 | 1 | 1 | 1 | 0 | 11110 |
| H | 0.03125 | 1 | 1 | 1 | 1 | 1 | 11111 |
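As a cross-check of Table 1, the splitting rule (divide the probability-sorted list into two groups of nearly equal total probability, assign 0 to the upper group and 1 to the lower group, then recurse within each group) can be sketched in a few lines of Python. The function shannon_fano() and the data layout below are illustrative assumptions, not part of the original solution.

```python
# Minimal sketch of the Shannon-Fano splitting behind Table 1.
# shannon_fano() and the data layout are illustrative assumptions.

def shannon_fano(symbols):
    """symbols: list of (name, probability) sorted in descending probability.
    Returns a dict mapping each name to its binary code string."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Find the split point that makes the two groups' probabilities most equal.
    best_split, best_diff = 1, float("inf")
    for i in range(1, len(symbols)):
        upper_sum = sum(p for _, p in symbols[:i])
        diff = abs(2 * upper_sum - total)
        if diff < best_diff:
            best_split, best_diff = i, diff
    # The upper (more probable) group gets prefix '0', the lower group gets '1'.
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:best_split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[best_split:]).items()})
    return codes

source = [("A", 1/2), ("B", 1/8), ("C", 1/8),
          ("D", 1/16), ("E", 1/16), ("F", 1/16),
          ("G", 1/32), ("H", 1/32)]

codes = shannon_fano(source)
for name, _ in source:
    print(name, codes[name])
# A 0, B 100, C 101, D 1100, E 1101, F 1110, G 11110, H 11111
```

Running the sketch reproduces the codewords in the Code column of Table 1.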
The entropy of the source is
\(H\left(X\right)=\sum_{i=1}^{M}P\left(X_i\right)\log_2\frac{1}{P\left(X_i\right)}\) bits/symbol
\(H\left(X\right)=\frac{1}{2}\log_2 2+2\times\frac{1}{8}\log_2 8+3\times\frac{1}{16}\log_2 16+2\times\frac{1}{32}\log_2 32\)
\(H\left(X\right)=\frac{1}{2}+\frac{3}{4}+\frac{3}{4}+\frac{5}{16}=\frac{37}{16}=2.3125\) bits/symbol
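The arithmetic above can be verified numerically with a short Python sketch (the variable names are illustrative):

```python
# Quick numeric check of H(X) = sum of P(Xi) * log2(1/P(Xi)).
from math import log2

probs = [1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32]
H = sum(p * log2(1 / p) for p in probs)
print(H)  # 2.3125 bits/symbol
```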
The average code length is
\(L=\sum_{i=1}^{M}P\left(X_i\right)l_i\), where \(l_i\) is the length of the codeword for message \(X_i\):
L = 0.5 × 1 + (0.125 × 3) × 2 + (0.0625 × 4) × 3 + (0.03125 × 5) × 2
L = 0.5 + 0.75 + 0.75 + 0.3125 = 2.3125 bits/symbol
The efficiency of the coding is \(\eta=\frac{H\left(X\right)}{L}=\frac{2.3125}{2.3125}=1=100\%\)
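A similar sketch checks L and the efficiency, with the code lengths read off Table 1 (again, the names below are illustrative):

```python
# Checking the average code length L and the coding efficiency.
probs   = {"A": 1/2, "B": 1/8, "C": 1/8, "D": 1/16,
           "E": 1/16, "F": 1/16, "G": 1/32, "H": 1/32}
lengths = {"A": 1, "B": 3, "C": 3, "D": 4, "E": 4, "F": 4, "G": 5, "H": 5}

L = sum(probs[s] * lengths[s] for s in probs)
H = 2.3125                       # entropy computed above
print(L)                         # 2.3125 bits/symbol
print(f"{H / L:.0%} efficient")  # 100% efficient
```

The efficiency is exactly 100% because every probability is a negative power of two, so each codeword length equals the self-information of its message.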
Write short notes on:
What is angle modulation?
Derive the expression for overall noise figure of a cascaded system.
Draw and explain the square-law modulator.
What are scattering parameters? Why are they used in microwave networks?