Naive Bayes Classification in Detail with Example

IRONSKULLGAMING 10 views 15 slides Aug 31, 2025

Slide Content

Naïve Bayes Theorem in Data Mining
• Theory
• Formula
• Example
• Applications

Introduction

What is Naïve Bayes?
• Naïve Bayes is a probabilistic classifier based on Bayes' Theorem.
• It assumes independence among predictors.
• It is simple, fast, and effective for large datasets.
• Commonly used for text classification, spam filtering, and sentiment analysis.

Bayes Theorem

Formula
Bayes' Theorem:

P(A|B) = [ P(B|A) * P(A) ] / P(B)

Where:
• P(A|B): probability of hypothesis A given evidence B (the posterior)
• P(B|A): probability of evidence B given hypothesis A (the likelihood)
• P(A): prior probability of A
• P(B): probability of evidence B
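The formula above can be evaluated directly. A minimal sketch with made-up illustrative numbers (the 1%, 80%, and 10% figures are assumptions, not from the slides):

```python
# Bayes' Theorem with illustrative numbers: suppose 1% of emails are spam,
# 80% of spam emails contain the word "offer", and 10% of all emails do.
p_spam = 0.01               # P(A): prior probability of spam
p_offer_given_spam = 0.80   # P(B|A): likelihood of "offer" given spam
p_offer = 0.10              # P(B): overall probability of "offer"

# P(A|B) = P(B|A) * P(A) / P(B)
p_spam_given_offer = p_offer_given_spam * p_spam / p_offer  # ≈ 0.08
```

Note how a strong likelihood (80%) still yields a modest posterior (≈ 8%) because the prior is small.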

Naïve Bayes Classifier

Classifier Formula
Naïve Bayes Classifier:

P(C|X) = [ P(x1|C) * P(x2|C) * ... * P(xn|C) * P(C) ] / P(X)

Where:
• C = class
• X = feature vector (x1, x2, …, xn)
• Independence is assumed among the features xi
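One practical detail worth noting: multiplying many likelihoods underflows to zero, so implementations usually sum logarithms instead. A sketch of the per-class score (the probability values are arbitrary assumptions; P(X) is dropped because it is identical for every class):

```python
import math

def nb_log_score(prior, feature_likelihoods):
    """Unnormalized log-posterior for one class:
    log P(C) + sum_i log P(xi|C).
    P(X) is omitted since it is the same for every class."""
    return math.log(prior) + sum(math.log(p) for p in feature_likelihoods)

# Compare two hypothetical classes on the same feature vector:
score_a = nb_log_score(0.4, [0.7, 0.6])    # log(0.4 * 0.7 * 0.6)
score_b = nb_log_score(0.6, [0.1, 0.05])   # log(0.6 * 0.1 * 0.05)
```

The class with the larger log-score is the prediction; exponentiating is never necessary for the argmax.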

Example

Worked Example: Spam Email Classification
Suppose we want to classify whether an email is Spam or Not Spam based on words such as 'offer', 'win', and 'buy'.
Steps:
1. Calculate the prior probabilities P(Spam) and P(Not Spam).
2. Calculate the likelihoods P(word|Spam) and P(word|Not Spam).
3. Apply the Naïve Bayes formula to each class.
4. Choose the class with the higher posterior probability.
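The four steps above can be walked through in code. All probabilities below are hypothetical placeholders (the slides do not supply numbers), chosen only to make the mechanics concrete:

```python
# Step 1: hypothetical prior probabilities.
priors = {"Spam": 0.4, "NotSpam": 0.6}

# Step 2: hypothetical likelihoods P(word | class).
likelihoods = {
    "Spam":    {"offer": 0.7, "win": 0.6, "buy": 0.5},
    "NotSpam": {"offer": 0.1, "win": 0.05, "buy": 0.2},
}

email = ["offer", "win"]  # words observed in the incoming email

# Step 3: apply the formula; P(X) is skipped because it is equal for both classes.
scores = {}
for cls, prior in priors.items():
    score = prior
    for word in email:
        score *= likelihoods[cls][word]
    scores[cls] = score  # proportional to P(class | words)

# Step 4: choose the class with the higher score.
prediction = max(scores, key=scores.get)
```

With these numbers, Spam scores 0.4 × 0.7 × 0.6 = 0.168 versus 0.6 × 0.1 × 0.05 = 0.003 for NotSpam, so the email is classified as Spam.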

Applications

Where is Naïve Bayes Used?
• Text classification
• Spam email filtering
• Sentiment analysis
• Medical diagnosis
• Document categorization
• Recommendation systems

Advantages & Limitations

Pros and Cons
Advantages:
• Simple and easy to implement
• Works well with large datasets
• Effective for text and categorical data
Limitations:
• Assumes independence among features (rarely true in practice)
• Zero-probability problem: a feature value never seen with a class makes the whole product zero
• Continuous numerical features need preprocessing (e.g. discretization) before use
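The zero-probability limitation is commonly addressed with Laplace (add-alpha) smoothing, which adds a small pseudo-count to every word. A minimal sketch (the counts 0, 100, and vocabulary size 50 are illustrative assumptions):

```python
def smoothed_likelihood(word_count, class_total, vocab_size, alpha=1):
    """Laplace (add-alpha) smoothed estimate of P(word | class):
    (count + alpha) / (total + alpha * vocab_size)."""
    return (word_count + alpha) / (class_total + alpha * vocab_size)

# A word never observed in the class still gets a small nonzero probability,
# so a single unseen word no longer zeroes out the whole product.
p_unseen = smoothed_likelihood(word_count=0, class_total=100, vocab_size=50)
```

Here the unseen word gets probability 1/150 instead of 0, keeping the classifier usable on vocabulary absent from the training data.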

Summary

Key Takeaways
• Naïve Bayes applies Bayes' Theorem with a feature-independence assumption.
• Formula: P(C|X) = P(C) * Π P(xi|C) / P(X)
• Useful for classification tasks such as spam filtering and sentiment analysis.
• Pros: simple, fast, scalable.
• Cons: strong independence assumption; zero-probability problem (mitigated by smoothing).