CHAPTER 13.pdf
durraizshuaib · 34 slides · Sep 29, 2024

COMMUNICATION SYSTEMS
INTRODUCTION TO INFORMATION THEORY

INTRODUCTION TO INFORMATION THEORY
•Chapter No. 13 of the recommended textbook.
•This lecture will be uploaded on your Slate accounts as well, for your convenience.
•But I suggest you attend the online lectures on time to avoid any loss in understanding the particular topic.
•For any query, you can contact me at
[email protected]

Information Theory
•Information is the source of a communication system, whether it is analog or digital. Information theory is a mathematical approach to the study of the coding of information, along with the quantification, storage, and communication of information.
•In information theory, information is the intelligence, idea, or message.
•In a communication system, information is transmitted from source to destination.

Measure of Information
•It is the information content of a message.
•Consider a source that emits messages m1, m2, …, mN with probabilities p1, p2, …, pN respectively. (The total probability of the source must be equal to 1.)
•The amount of information can be calculated as:
    I_k = log_r (1 / p_k) units
where r is the base of the number system and k denotes the message number.

Measure of Information
•Consider a binary source that emits messages m1, m2, m3 with probabilities 0.2, 0.5, 0.3 respectively.
    I_k = log_2 (1 / p_k)
For message m1:
    I_1 = log_2 (1 / p_1) = log_2 (1 / 0.2) = 2.3219 bits
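The information measure above can be checked numerically; a minimal sketch (the function name `information` is mine, not from the slides):

```python
import math

def information(p, r=2):
    """Information content I = log_r(1/p) of a message with probability p."""
    return math.log(1.0 / p, r)

# Binary source from the example: probabilities 0.2, 0.5, 0.3
for p in (0.2, 0.5, 0.3):
    print(f"p = {p}: I = {information(p):.4f} bits")
```

For p = 0.2 this reproduces the slide's value of about 2.3219 bits, and it also shows the trend used later: the rarer the message, the larger its information.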

Uncertainty
•Uncertainty can be defined as the unpredictability of an event.
•Suppose a system emits messages {m1, m2, …, mN} with probabilities p1, p2, …, pN respectively.
•Uncertainty increases with a decrease in probability, and uncertainty decreases with an increase in probability.
    If we consider a message m0 with p0 = 0 or p0 = 1: no uncertainty.
    If we consider a message m0 with p0 = 0.2 or p0 = 0.7: uncertainty.

Properties of Information
1. If uncertainty is more, then information is more.
2. If the receiver knows the message (i.e., the event is sure to occur), then the information is zero.
3. If a source emits messages m1, m2, …, mN with probabilities p1, p2, …, pN, then their combined information is the sum of the informations of all the messages.

1. If uncertainty is more, then information is more {p1 < p2, then I1 > I2}
Suppose we have p1 = 1/4 and p2 = 1/2.
As we know, uncertainty increases with a decrease in probability and decreases with an increase in probability, so:
    I1 > I2

1. If uncertainty is more, then information is more {p1 < p2, then I1 > I2}
For p1 = 1/4:
    I_1 = log_2 (1 / (1/4)) = log_2 4
    I_1 = log_2 2^2 = 2 log_2 2
    I_1 = 2 bits
For p2 = 1/2:
    I_2 = log_2 (1 / (1/2)) = log_2 2
    I_2 = 1 bit
Hence I1 > I2.

2. If the receiver knows the message, information is zero.
In this case, the probability of the event is 1. So,
    I = log_2 (1 / p)
    I = log_2 (1 / 1)
    I = 0 bits

3. Combined information of a source is the sum of the informations of all messages.
Suppose we have p1 = 1/4 and p2 = 1/2. Then,
    I_1 = log_2 (1 / p_1)
    I_2 = log_2 (1 / p_2)
Their combined information will be I_T. As m1 and m2 are independent, their combined probability is p1 · p2. In this case the combined information is:
    I_T = log_2 (1 / (p_1 · p_2))
    I_T = log_2 (1 / p_1) + log_2 (1 / p_2)
    I_T = I_1 + I_2

Entropy
•The average information per message of a source is called entropy. It is denoted by H(m). Hence,
    H(m) = Σ_{k=1}^{N} p_k log_r (1 / p_k) units
•The entropy of a source is a function of the message probabilities.
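The definition above translates directly into code; a minimal sketch (the helper name `entropy` is mine):

```python
import math

def entropy(probs, r=2):
    """Average information per message: H = sum(p_k * log_r(1/p_k))."""
    return sum(p * math.log(1.0 / p, r) for p in probs if p > 0)

# Source from the earlier example: probabilities 0.2, 0.5, 0.3
print(f"H = {entropy([0.2, 0.5, 0.3]):.4f} bits/message")
```

Terms with p = 0 are skipped, matching the convention that an impossible message contributes no information.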

Coding Theory
•Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory.
•In signal processing, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation.
•Any process that generates successive messages can be considered a source of information.
•A memory-less source is one in which each message is an independent, identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints.

Huffman Coding
Huffman coding uses a specific method for choosing the representation for each symbol, resulting in a prefix code (sometimes called a "prefix-free code": the bit string representing some particular symbol is never a prefix of the bit string representing any other symbol). Huffman coding is such a widespread method for creating prefix codes that the term "Huffman code" is widely used as a synonym for "prefix code" even when such a code is not produced by Huffman's algorithm.

Steps to obtain Huffman Code
•Arrange the messages in descending order of probabilities.
•Combine the last r messages to obtain a new message.
•Re-arrange the probabilities in descending order.
•Repeat the procedure until we get exactly r messages.
•Start from the last column, assign each of the r messages a unique digit, and, moving in the backward direction, build the code-word for each message.
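For the binary case (r = 2), the steps above amount to repeatedly merging the two least-probable messages; a minimal sketch using a heap instead of the column-by-column table (the function and message names are mine):

```python
import heapq
from itertools import count

def huffman(probs, symbols=None):
    """Binary Huffman code: repeatedly combine the two least-probable messages
    and prepend a distinguishing digit to every code word in each group."""
    if symbols is None:
        symbols = [f"m{i + 1}" for i in range(len(probs))]
    tiebreak = count()  # keeps tuple comparison away from the symbol lists
    heap = [(p, next(tiebreak), [s]) for p, s in zip(probs, symbols)]
    heapq.heapify(heap)
    codes = {s: "" for s in symbols}
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        for s in g1:                      # group absorbed first gets digit 0
            codes[s] = "0" + codes[s]
        for s in g2:                      # the other group gets digit 1
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p1 + p2, next(tiebreak), g1 + g2))
    return codes

# Six-message source from the worked example
codes = huffman([0.1, 0.3, 0.15, 0.12, 0.08, 0.25])
for s in sorted(codes):
    print(s, codes[s])
```

The code-word lengths match the worked example that follows (two messages of length 2, four of length 3); the exact digits depend on tie-breaking.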

Example
A memory-less source emits six messages with probabilities {0.1, 0.3, 0.15, 0.12, 0.08, 0.25}. Obtain the compact binary Huffman code.
Solution:
    r = 2
Binary:  r = 2 → digits {0, 1}
Base 3:  r = 3 → digits {0, 1, 2}
Base 4:  r = 4 → digits {0, 1, 2, 3}

Binary Huffman code: (00 10 010 011 110 111)_2 = (0010010011110111)_2

Reduction table (probabilities re-sorted at each step; codes assigned from the last column backward):
    m2  0.30 00  | 0.30 00  | 0.30 00 | 0.43 1  | 0.57 0
    m6  0.25 10  | 0.25 10  | 0.27 01 | 0.30 00 | 0.43 1
    m3  0.15 010 | 0.18 11  | 0.25 10 | 0.27 01
    m4  0.12 011 | 0.15 010 | 0.18 11
    m1  0.10 110 | 0.12 011
    m5  0.08 111

Probabilities and code words:
    m2  0.30  00
    m6  0.25  10
    m3  0.15  010
    m4  0.12  011
    m1  0.10  110
    m5  0.08  111

r = 3 0 1 2
(120102000001)
3??????????????????0.30 1 0.30 1 0.450
??????????????????0.25 2 0.25 2 0.301
??????????????????0.15 0 1 0.1800 0.252
??????????????????0.
12 0 2 0.150 1
??????????????????
????????????.????????????????????????00 00.120 2
??????????????????????????????.????????????????????????00 1
????????????7 0 00 2301

Ternary Huffman Code/ 3- ary Huffman code
(120102000001)
3
probabilities Code words
?????????????????? 0.30 1
?????????????????? 0.25 2
?????????????????? 0.15 0 1
?????????????????? 0.12 0 2
?????????????????? 0
.10 0 0 0
?????????????????? 0.08 0 0 1302

r = 4 0 1 2 3
(0123031 32)
4??????????????????0.301 ????????????.????????????????????????0
??????????????????0.252 0.301
??????????????????0.153 0.2
52
??????????????????????????????.????????????????????????00 0.153
??????????????????
????????????.????????????????????????01
??????????????????????????????.????????????????????????02
????????????7 0 03303

Quaternary / 4- ary Huffman code
(0123031 32)
4
probabilities Code words
?????????????????? 0.30 0
?????????????????? 0.25 1
?????????????????? 0.15 2
?????????????????? 0.12 30
?????????????????? 0.1
0 31
?????????????????? 0.08 32304

Length of the Code-word
The length of a code word is the number of bits in the code. It is denoted by l.
For example:
•If the code word for m1 is {101101}, then the length of the code-word is l = 6.
•If the code word for m2 is {1011}, then the length of the code-word is l = 4.

Average Code Length
The average length of the code can be computed as:
    L_avg = Σ_{k=1}^{N} p_k · l_k
where
    p_k = probability of the k-th message
    l_k = length of the k-th code-word

Average Code Length
Example:
    L_avg = Σ_{k=1}^{N} p_k · l_k
    L_avg = 0.4 × 2 + 0.6 × 3 = 2.6

        Probability  Code-word
    m1  0.4          01
    m2  0.6          111
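The same computation in code; a minimal sketch (the function name is mine):

```python
def average_code_length(probs, codewords):
    """Average code length L_avg = sum(p_k * l_k)."""
    return sum(p * len(cw) for p, cw in zip(probs, codewords))

# Example from the slide: m1 -> "01", m2 -> "111"
L = average_code_length([0.4, 0.6], ["01", "111"])
print(f"L_avg = {L:.1f}")  # L_avg = 2.6
```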

Code Efficiency and Redundancy
•Code efficiency is denoted by η and given by:
    η = H(m) / L_avg
•Redundancy is denoted by R and given by:
    R = 1 − η
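Combining entropy and average length gives both quantities; a minimal sketch (the function name is mine; the probabilities and lengths reuse the average-length example above, which was not designed to be an efficient code):

```python
import math

def code_efficiency(probs, lengths, r=2):
    """Code efficiency eta = H(m) / L_avg, with H in base-r units."""
    H = sum(p * math.log(1.0 / p, r) for p in probs if p > 0)
    L_avg = sum(p * l for p, l in zip(probs, lengths))
    return H / L_avg

eta = code_efficiency([0.4, 0.6], [2, 3])
print(f"efficiency = {eta:.3f}, redundancy = {1 - eta:.3f}")
```

A perfectly matched code (e.g. two equiprobable messages with one-bit code words) gives η = 1 and R = 0.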

Example
A memory-less source emits messages {m1, m2} with probabilities {0.8, 0.2}. Obtain the compact binary Huffman code as well as its second- and third-order extensions. Find the code efficiency for each case.
Solution:
To obtain the binary Huffman code:
    m1  0.8  0
    m2  0.2  1

η = H(m) / L_avg
    H(m) = Σ_{k=1}^{N} p_k log_2 (1 / p_k)
    H(m) = 0.8 log_2 (1 / 0.8) + 0.2 log_2 (1 / 0.2) = 0.72 bits
    L_avg = Σ_{k=1}^{N} p_k · l_k = 1
    η = 0.72

For second order extension (N = 2)
    m1 m1  0.8 × 0.8 = 0.64  0   | 0.64 0  | 0.64 0
    m1 m2  0.8 × 0.2 = 0.16  11  | 0.20 10 | 0.36 1
    m2 m1  0.2 × 0.8 = 0.16  100 | 0.16 11
    m2 m2  0.2 × 0.2 = 0.04  101
As we are obtaining a binary Huffman code, r = 2; N does not depend on r.

For second order extension (N = 2)
           Probability  Code word  Length
    m1 m1  0.64         0          1
    m1 m2  0.16         11         2
    m2 m1  0.16         100        3
    m2 m2  0.04         101        3

For second order extension (N = 2)
    L_avg = Σ_{k=1}^{N} p_k · l_k
    L_avg = 0.64 × 1 + 0.16 × 2 + 0.16 × 3 + 0.04 × 3 = 1.56
    L = L_avg / N = 1.56 / 2 = 0.78
    η = H(m) / L = 0.72 / 0.78 = 0.923

For third order extension (N = 3)
{Although N = 3, always combine r messages at a time; in this case r = 2.}
              Probability  Code word  Length
    m1 m1 m1  0.512        0          1
    m1 m1 m2  0.128        100        3
    m1 m2 m1  0.128        101        3
    m2 m1 m1  0.128        110        3
    m1 m2 m2  0.032        11100      5
    m2 m1 m2  0.032        11101      5
    m2 m2 m1  0.032        11110      5
    m2 m2 m2  0.008        11111      5

For third order extension (N = 3)
    L_avg = Σ_{k=1}^{N} p_k · l_k
    L_avg = 0.512 × 1 + 0.128 × 3 + 0.128 × 3 + 0.128 × 3 + 0.032 × 5 + 0.032 × 5 + 0.032 × 5 + 0.008 × 5
    L_avg = 2.184
    L = L_avg / N = 2.184 / 3 = 0.728
    η = H(m) / L = 0.72 / 0.728 = 0.989

•Hence it can be seen from the example that the code efficiency for each case is:
    N = 1: η = 0.72
    N = 2: η = 0.923
    N = 3: η = 0.989
Therefore, code efficiency increases with the order of the extension.
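The three efficiencies can be reproduced end to end; a minimal sketch (helper names are mine; the extension alphabet is built as all length-N blocks of source messages):

```python
import heapq
import math
from itertools import count, product

def huffman_lengths(probs):
    """Binary Huffman code-word lengths, via repeated merging of the two
    least-probable entries (a heap replaces the slide's column table)."""
    tiebreak = count()
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        for i in g1 + g2:
            lengths[i] += 1  # every merge prepends one digit
        heapq.heappush(heap, (p1 + p2, next(tiebreak), g1 + g2))
    return lengths

source = [0.8, 0.2]
H = sum(p * math.log2(1 / p) for p in source)  # ~0.72 bits per source symbol

for N in (1, 2, 3):
    # N-th order extension: one super-message per block of N source messages
    ext = [math.prod(block) for block in product(source, repeat=N)]
    L = sum(p * l for p, l in zip(ext, huffman_lengths(ext))) / N
    print(f"N = {N}: L = {L:.3f}, efficiency = {H / L:.3f}")
```

Using the unrounded entropy (≈ 0.7219 bits) gives efficiencies of about 0.722, 0.926, and 0.992; the slides' 0.72, 0.923, and 0.989 use H rounded to 0.72.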