  Question:
Published on: 25 September, 2022

What is meant by entropy of a source?

In a practical communication system, we usually transmit long sequences of symbols from an information source. Thus we are more interested in the average information that a source produces than in the information content of a single symbol. In determining this average, we take note of the fact that the flow of information in a system can fluctuate widely because of the randomness involved in the selection of symbols. Hence we need to consider the average information content of the symbols in a long message.

To define this average information, we make the following assumptions:

1. The source is stationary, so that the symbol probabilities remain constant with time.

2. The successive symbols are statistically independent and are emitted by the source at an average rate of r symbols per second.

Mathematically, the entropy of the source is given by:

$$H\left(X\right)=E\left[I\left(X_i\right)\right]=\sum_{i=1}^{m}P\left(X_i\right)I\left(X_i\right)=-\sum_{i=1}^{m}P\left(X_i\right)\log_2 P\left(X_i\right)\ \text{bits/symbol}$$
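The formula above can be sketched directly in Python; the function name `entropy` and the example probability distributions are illustrative choices, not part of the original answer:

```python
import math

def entropy(probabilities):
    """Entropy H(X) = -sum p_i * log2(p_i), in bits/symbol.

    Terms with p_i == 0 are skipped, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A source with four equally likely symbols attains the maximum
# entropy for m = 4 symbols: log2(4) = 2 bits/symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed source produces less information per symbol on average.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

With the symbol rate r from assumption 2, the average information rate of the source follows as R = r H(X) bits per second.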
