What is meant by entropy of a source?
In a practical communication system, we usually transmit long sequences of symbols from an information source. Thus we are more interested in the average information that a source produces than in the information content of a single symbol. To determine this, we take note of the fact that the flow of information in a system can fluctuate widely because of the randomness involved in the selection of the symbols. Hence we need to talk about the average information content of the symbols in a long message.
The following assumptions are made in defining this average information:
The source is stationary, so that the symbol probabilities remain constant with time.
The successive symbols are statistically independent and come from the source at an average rate of r symbols per second.
The mathematical representation of entropy is given below:
\(H\left(X\right)=E\left[I\left(X_i\right)\right]=\sum_{i=1}^{m}P\left(X_i\right)I\left(X_i\right)=-\sum_{i=1}^{m}P\left(X_i\right)\log_2 P\left(X_i\right)\ \text{bits/symbol}\)
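As a worked illustration, the minimal Python sketch below computes \(H(X)\) directly from the formula above for a discrete memoryless source. The function name `entropy` and the example probability table are our own choices for illustration, not part of the question text.

```python
import math

def entropy(probabilities):
    """Entropy H(X) in bits/symbol of a discrete memoryless source.

    Terms with P(x_i) = 0 contribute nothing, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source (probabilities must sum to 1).
P = [0.5, 0.25, 0.125, 0.125]
print(entropy(P))          # 1.75 bits/symbol

# A binary source with equally likely symbols gives the familiar 1 bit/symbol.
print(entropy([0.5, 0.5])) # 1.0
```

Note that entropy is maximized when all \(m\) symbols are equally likely, giving \(H(X)=\log_2 m\) bits/symbol.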
Explain the FM stereo \(T_x/R_x\) (transmitter/receiver) system with block schematic diagrams.
Write short notes on
What is a stripline?