ENTROPY_HUFFMAN_CODING.pptx

HrdRay · 10 slides · May 01, 2024

About This Presentation

Entropy and Huffman coding problem


Slide Content

MEASURE OF INFORMATION
The probability of occurrence of an event is a measure of its unexpectedness and, hence, is related to its information content: the amount of information received from a message is directly related to its uncertainty, i.e., inversely related to its probability of occurrence. The more unexpected the event, the greater the surprise, and hence the more information it conveys.

Average Information per Message: Entropy of a Source
A memoryless source is one in which each message emitted is independent of the previous message(s). If the probability of occurrence of message m_i is P_i, then the information content of message m_i is I_i, given by the expression below. The average information per message of a source m is called its entropy, H(m).
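The standard expressions behind these statements, using base-2 logarithms so that information is measured in bits, are

I_i = \log_2 \frac{1}{P_i}

H(m) = \sum_{i=1}^{n} P_i I_i = \sum_{i=1}^{n} P_i \log_2 \frac{1}{P_i} \ \text{bits per message}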

Since the entropy is a measure of uncertainty, the probability distribution that generates the maximum uncertainty has the maximum entropy.
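For a source of n messages, the uniform distribution P_i = 1/n is this maximizing case:

H_{max}(m) = \sum_{i=1}^{n} \frac{1}{n} \log_2 n = \log_2 n \ \text{bits per message}

so, for example, a source of four equally likely messages attains the maximum entropy of 2 bits per message.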

SOURCE ENCODING
Efficiency, average code-word length, and redundancy.
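These three quantities are conventionally defined as follows, where L_i is the length of the codeword assigned to message m_i:

\bar{L} = \sum_{i=1}^{n} P_i L_i \quad \text{(average code-word length)}

\eta = \frac{H(m)}{\bar{L}} \quad \text{(efficiency)}, \qquad \gamma = 1 - \eta \quad \text{(redundancy)}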

Huffman Code
The source encoding theorem says that to encode a source with entropy H(m) we need, on average, a minimum of H(m) binary digits per message. The number of digits in a codeword is the length of the codeword, and the average word length of an optimum code is H(m). It is not desirable to use very long code sequences, since they cause transmission delay and add to equipment complexity.
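The slides do not include an implementation, but the greedy construction behind the Huffman code can be sketched in Python as follows; the function name huffman_code and the five-symbol source at the end are illustrative assumptions, not part of the original material.

import heapq
import math

def huffman_code(probabilities):
    # Build a binary Huffman code for a dict {symbol: probability}.
    # Greedy construction: repeatedly merge the two least probable subtrees,
    # prefixing '0' to the codewords of one and '1' to the other.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    tie = len(heap)  # tie-breaker so entries with equal probability never compare dicts
    if len(heap) == 1:
        return {sym: "0" for sym in heap[0][2]}  # a single-symbol source still needs one bit
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)  # least probable subtree
        p2, _, codes2 = heapq.heappop(heap)  # next least probable subtree
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical source used only to exercise the function.
probs = {"a": 0.4, "b": 0.2, "c": 0.2, "d": 0.1, "e": 0.1}
code = huffman_code(probs)
avg_length = sum(p * len(code[s]) for s, p in probs.items())
entropy = sum(p * math.log2(1 / p) for p in probs.values())

For these assumed probabilities the code has an average length of 2.2 binary digits per message against an entropy of about 2.12 bits, i.e., an efficiency of roughly 96%.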

Worked example: a table of source symbols and their probabilities, followed by the entropy of the source and the average length of the compact (Huffman) code.
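As a hypothetical illustration of such a table (the probabilities are assumed here, not taken from the original slide), consider a four-symbol source with probabilities 1/2, 1/4, 1/8, 1/8. Its entropy is

H(m) = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + \tfrac{1}{8}\log_2 8 + \tfrac{1}{8}\log_2 8 = 1.75 \ \text{bits per message}

A Huffman code for this source assigns the codewords 0, 10, 110, 111, so the average length of the compact code is

\bar{L} = \tfrac{1}{2}(1) + \tfrac{1}{4}(2) + \tfrac{1}{8}(3) + \tfrac{1}{8}(3) = 1.75 \ \text{binary digits per message}

and the efficiency is \eta = H(m)/\bar{L} = 1 (zero redundancy), which is possible here because every probability is a negative power of two.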

The merit of any code is measured by its average length in comparison with H(m), the minimum possible average length; this comparison defines the code efficiency and the redundancy. The Huffman code is a variable-length code. The entropy H(m) of the source is given by H(m) = \sum_i P_i \log_2(1/P_i).