ch8Bayes.ppt



Slide Content

Naïve Bayes Classifier
Ke Chen
http://intranet.cs.man.ac.uk/mlo/comp20411/
Extended by Longin Jan Latecki
[email protected]
COMP20411 Machine Learning

Outline
•Background
•Probability Basics
•Probabilistic Classification
•Naïve Bayes
•Example: Play Tennis
•Relevant Issues
•Conclusions

Background
•There are three methods to establish a classifier
a) Model a classification rule directly
Examples: k-NN, decision trees, perceptron, SVM
b) Model the probability of class memberships given input data
Example: multi-layered perceptron with the cross-entropy cost
c) Make a probabilistic model of data within each class
Examples: naïve Bayes, model-based classifiers
•a) and b) are examples of discriminative classification
•c) is an example of generative classification
•b) and c) are both examples of probabilistic classification

Probability Basics

•Prior, conditional and joint probability
–Prior probability: P(X)
–Conditional probability: P(X1|X2), P(X2|X1)
–Joint probability: X = (X1, X2), P(X) = P(X1, X2)
–Relationship: P(X1, X2) = P(X2|X1)P(X1) = P(X1|X2)P(X2)
–Independence: P(X2|X1) = P(X2), P(X1|X2) = P(X1), P(X1, X2) = P(X1)P(X2)
•Bayesian Rule
P(C|X) = P(X|C)P(C) / P(X)
Posterior = Likelihood × Prior / Evidence
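As a quick numerical illustration of the rule above (not part of the original slides; the numbers are made up), a minimal Python sketch:

# Bayes' rule: posterior P(C|X) = likelihood P(X|C) * prior P(C) / evidence P(X)
prior = {"C1": 0.3, "C2": 0.7}                                   # P(C), assumed values
likelihood = {"C1": 0.8, "C2": 0.1}                              # P(X=x | C), assumed values
evidence = sum(likelihood[c] * prior[c] for c in prior)          # P(X=x)
posterior = {c: likelihood[c] * prior[c] / evidence for c in prior}
print(posterior)                                                 # {'C1': 0.774..., 'C2': 0.225...}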

[Example slides by Dieter Fox: figures only, not reproduced here]

Probabilistic Classification

•Establishing a probabilistic model for classification
–Discriminative model: P(C|X), C = c1, …, cL, X = (X1, …, Xn)
–Generative model: P(X|C), C = c1, …, cL, X = (X1, …, Xn)

•MAP classification rule
–MAP: Maximum A Posteriori
–Assign x to c* if P(C = c*|X = x) > P(C = c|X = x), c ≠ c*, c = c1, …, cL
•Generative classification with the MAP rule
–Apply Bayesian rule to convert: P(C|X) = P(X|C)P(C) / P(X) ∝ P(X|C)P(C)

[Figure: Feature Histograms, histograms of feature value x for classes C1 and C2 with density P(x) on the vertical axis. Slide by Stephen Marsland]

[Figure: Posterior Probability, P(C|x) plotted against x and ranging between 0 and 1. Slide by Stephen Marsland]

Naïve Bayes

•Bayes classification
P(C|X) ∝ P(X|C)P(C) = P(X1, …, Xn|C)P(C)
Difficulty: learning the joint probability P(X1, …, Xn|C)
•Naïve Bayes classification
–Making the assumption that all input attributes are conditionally independent given the class:
P(X1, X2, …, Xn|C) = P(X1|X2, …, Xn; C) P(X2, …, Xn|C)
                   = P(X1|C) P(X2, …, Xn|C)
                   = P(X1|C) P(X2|C) ⋯ P(Xn|C)
–MAP classification rule:
[P(x1|c*) ⋯ P(xn|c*)] P(c*) > [P(x1|c) ⋯ P(xn|c)] P(c), c ≠ c*, c = c1, …, cL

Naïve Bayes

•Naïve Bayes Algorithm (for discrete input attributes)
–Learning Phase: Given a training set S,
  For each target value ci (ci = c1, …, cL)
    P̂(C = ci) ← estimate P(C = ci) with examples in S;
    For every attribute value ajk of each attribute xj (j = 1, …, n; k = 1, …, Nj)
      P̂(Xj = ajk|C = ci) ← estimate P(Xj = ajk|C = ci) with examples in S;
  Output: conditional probability tables; for xj, Nj × L elements
–Test Phase: Given an unknown instance X’ = (a1’, …, an’),
  Look up tables to assign the label c* to X’ if
  [P̂(a1’|c*) ⋯ P̂(an’|c*)] P̂(c*) > [P̂(a1’|c) ⋯ P̂(an’|c)] P̂(c), c ≠ c*, c = c1, …, cL
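A minimal Python sketch of the two phases (not from the original slides; the training set is assumed to be a list of (attribute-dict, label) pairs, and the function names are illustrative):

from collections import Counter, defaultdict

def learn(examples):
    """Learning phase: estimate P(C = ci) and P(Xj = ajk | C = ci) by relative frequency."""
    class_counts = Counter(label for _, label in examples)
    prior = {c: n / len(examples) for c, n in class_counts.items()}
    value_counts = defaultdict(Counter)                       # (attribute, class) -> value counts
    for attributes, label in examples:
        for attr, value in attributes.items():
            value_counts[(attr, label)][value] += 1
    likelihood = {key: {v: n / class_counts[key[1]] for v, n in counts.items()}
                  for key, counts in value_counts.items()}
    return prior, likelihood

def classify(attributes, prior, likelihood):
    """Test phase: MAP rule, pick the class c maximising P(c) * prod_j P(aj | c)."""
    scores = {}
    for c, p_c in prior.items():
        score = p_c
        for attr, value in attributes.items():
            score *= likelihood.get((attr, c), {}).get(value, 0.0)   # 0 for unseen values
        scores[c] = score
    return max(scores, key=scores.get)

Trained on the 14 Play-Tennis examples used in the following slides, learn should reproduce the probability tables shown there.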

Example

•Example: Play Tennis

Example

•Learning Phase
Outlook      Play=Yes  Play=No
Sunny        2/9       3/5
Overcast     4/9       0/5
Rain         3/9       2/5

Temperature  Play=Yes  Play=No
Hot          2/9       2/5
Mild         4/9       2/5
Cool         3/9       1/5

Humidity     Play=Yes  Play=No
High         3/9       4/5
Normal       6/9       1/5

Wind         Play=Yes  Play=No
Strong       3/9       3/5
Weak         6/9       2/5

P(Play=Yes) = 9/14    P(Play=No) = 5/14

Example

•Test Phase
–Given a new instance,
x’=(Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
–Look up tables
–MAP rule
P(Outlook=Sunny|Play=No) = 3/5
P(Temperature=Cool|Play=No) = 1/5
P(Humidity=High|Play=No) = 4/5
P(Wind=Strong|Play=No) = 3/5
P(Play=No) = 5/14
P(Outlook=Sunny|Play=Yes) = 2/9
P(Temperature=Cool|Play=Yes) = 3/9
P(Humidity=High|Play=Yes) = 3/9
P(Wind=Strong|Play=Yes) = 3/9
P(Play=Yes) = 9/14
P(Yes|x’) ∝ [P(Sunny|Yes)P(Cool|Yes)P(High|Yes)P(Strong|Yes)]P(Play=Yes) = 0.0053
P(No|x’) ∝ [P(Sunny|No)P(Cool|No)P(High|No)P(Strong|No)]P(Play=No) = 0.0206
Since P(Yes|x’) < P(No|x’), we label x’ as “No”.
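The two scores can be recomputed directly from the looked-up values; a one-off Python check (values taken from the tables above):

# MAP scores for x' = (Sunny, Cool, High, Strong)
p_yes = (2/9) * (3/9) * (3/9) * (3/9) * (9/14)   # [P(x'|Yes)] * P(Play=Yes)
p_no  = (3/5) * (1/5) * (4/5) * (3/5) * (5/14)   # [P(x'|No)]  * P(Play=No)
print(round(p_yes, 4), round(p_no, 4))           # 0.0053 0.0206  -> predict "No"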

Relevant Issues

•Violation of Independence Assumption
–For many real-world tasks, P(X1, …, Xn|C) ≠ P(X1|C) ⋯ P(Xn|C)
–Nevertheless, naïve Bayes works surprisingly well anyway!
•Zero conditional probability problem
–If no example contains the attribute value Xj = ajk, then P̂(Xj = ajk|C = ci) = 0
–In this circumstance, P̂(x1|ci) ⋯ P̂(ajk|ci) ⋯ P̂(xn|ci) = 0 during test
–For a remedy, conditional probabilities are estimated with the m-estimate:
P̂(Xj = ajk|C = ci) = (nc + m·p) / (n + m)
nc: number of training examples for which Xj = ajk and C = ci
n: number of training examples for which C = ci
p: prior estimate (usually p = 1/t for t possible values of Xj)
m: weight given to the prior (number of “virtual” examples, m ≥ 1)
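A small Python sketch of the m-estimate, with the quantities named as in the legend above (the function itself is illustrative, not from the slides):

def m_estimate(n_c, n, p, m):
    """m-estimate of P(Xj = ajk | C = ci): (nc + m*p) / (n + m)."""
    return (n_c + m * p) / (n + m)

# Outlook=Overcast never occurs with Play=No (0/5 in the earlier tables);
# with t = 3 outlook values (p = 1/3) and m = 3, the zero estimate becomes:
print(m_estimate(0, 5, 1/3, 3))   # 0.125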

Relevant Issues

•Continuous-valued Input Attributes
–Numberless (continuous) values for an attribute
–Conditional probability modeled with the normal distribution:
P̂(Xj|C = ci) = (1 / (√(2π) σji)) exp(−(Xj − μji)² / (2σji²))
μji: mean (average) of attribute values Xj of examples for which C = ci
σji: standard deviation of attribute values Xj of examples for which C = ci
–Learning Phase: for X = (X1, …, Xn), C = c1, …, cL
Output: n × L normal distributions and P(C = ci), i = 1, …, L
–Test Phase: for X’ = (X1’, …, Xn’)
•Calculate conditional probabilities with all the normal distributions
•Apply the MAP rule to make a decision
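A minimal Python sketch of this Gaussian variant, assuming each training example is a tuple of n floats plus a class label (the function names are illustrative, not from the slides):

import math
from statistics import mean, stdev

def normal_density(x, mu, sigma):
    """Normal density used as the estimate of P(Xj = x | C = ci)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def learn_gaussian(examples):
    """Learning phase: one (mu, sigma) pair per attribute and class, plus the class priors."""
    by_class = {}
    for values, label in examples:
        by_class.setdefault(label, []).append(values)
    prior = {c: len(rows) / len(examples) for c, rows in by_class.items()}
    params = {c: [(mean(col), stdev(col)) for col in zip(*rows)]   # needs >= 2 examples per class
              for c, rows in by_class.items()}                     # n x L normal distributions
    return prior, params

def classify_gaussian(values, prior, params):
    """Test phase: MAP rule with the normal densities as conditional probabilities."""
    return max(prior, key=lambda c: prior[c] *
               math.prod(normal_density(v, mu, s) for v, (mu, s) in zip(values, params[c])))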

Conclusions
•Naïve Bayes is based on the conditional independence assumption
–Training is very easy and fast; it only requires considering each attribute in each class separately
–Testing is straightforward; it only requires looking up tables or calculating conditional probabilities with normal distributions
•A popular generative model
–Performance is competitive with most state-of-the-art classifiers, even when the independence assumption is violated
–Many successful applications, e.g., spam mail filtering
–Apart from classification, naïve Bayes can do more…