This bayesian belief network ppt ....pptx

shivangisingh564490 7 views 13 slides Aug 30, 2025


Slide Content

Rule-based classifiers make their class decision by applying a set of "if…then" rules. Because the rules are easy to interpret, these classifiers are often used to build descriptive models. The condition attached to the "if" is called the antecedent, and the predicted class of each rule is called the consequent.

Basic rule format:
IF (Condition1 AND Condition2 ...) THEN Class
IF (Temperature > 38°C AND Cough = True) THEN Diagnosis = "Flu"

Key components:
- Antecedent (condition): a logical expression over the features.
- Consequent (class): the predicted label.

Properties of a Rule-Based Classifier: Coverage and Accuracy

Rule Triggering: When is a Rule "Fired"?
A rule is triggered (fired) if its antecedent evaluates to True for a given instance.

Rule: IF (Age < 30 AND Income > $50K) THEN "Loan Approved"
- Instance {Age: 25, Income: $60K} → rule FIRES (output: "Loan Approved")
- Instance {Age: 35, Income: $60K} → rule DOES NOT FIRE (the Age condition is false)

What if multiple rules fire for the same instance? This leads to conflicts.
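The firing check described above can be sketched in Python. The `fires` function and the encoding of a rule as a list of condition functions are illustrative assumptions, not part of the slides.

```python
# Minimal sketch of rule triggering: a rule fires only when every
# condition in its antecedent is True for the given instance.

def fires(antecedent, instance):
    """Return True if all antecedent conditions hold for the instance."""
    return all(cond(instance) for cond in antecedent)

# Rule: IF (Age < 30 AND Income > $50K) THEN "Loan Approved"
loan_rule = [lambda x: x["Age"] < 30,
             lambda x: x["Income"] > 50_000]

print(fires(loan_rule, {"Age": 25, "Income": 60_000}))  # True  -> rule fires
print(fires(loan_rule, {"Age": 35, "Income": 60_000}))  # False -> Age condition fails
```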

Conflict Resolution Strategies
Why conflicts occur: multiple rules may overlap in their conditions.
Rule1: IF (Sunny AND Windy) THEN "Play Golf"
Rule2: IF (Sunny) THEN "Stay Home"
Instance {Sunny: True, Windy: True} → both rules fire!

Resolution strategies:
1. Priority-based: assign static priorities (e.g., Rule1 > Rule2).
   Example: Rule1 (priority = 1): "Play Golf"; Rule2 (priority = 2): "Stay Home".
   Result: Rule1 wins.
2. Specificity-based: prefer the most specific rule (the one with the most conditions).
   Example: Rule1 has 2 conditions (Sunny + Windy); Rule2 has 1 condition (Sunny).
   Result: Rule1 wins (more specific).
3. Order-based (first-hit): use the first rule in the list that fires.
   Example: Rule2 appears before Rule1 → "Stay Home" wins.
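The three strategies above can be compared side by side in a small sketch. The `Rule` dataclass and the encoding of conditions as functions are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    conditions: List[Callable[[dict], bool]]  # antecedent
    label: str                                # consequent
    priority: int                             # lower number = higher priority

def fired_rules(rules, x):
    """Return the rules whose antecedent is fully satisfied by instance x."""
    return [r for r in rules if all(c(x) for c in r.conditions)]

rule1 = Rule([lambda x: x["Sunny"], lambda x: x["Windy"]], "Play Golf", priority=1)
rule2 = Rule([lambda x: x["Sunny"]], "Stay Home", priority=2)

x = {"Sunny": True, "Windy": True}
fired = fired_rules([rule2, rule1], x)  # rule2 listed first; both fire

by_priority    = min(fired, key=lambda r: r.priority).label         # "Play Golf"
by_specificity = max(fired, key=lambda r: len(r.conditions)).label  # "Play Golf"
first_hit      = fired[0].label                                     # "Stay Home"
```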

Voting/majority: rules "vote" for a class; the majority wins.
   Example: 3 rules fire; 2 say "Fraud", 1 says "Not Fraud" → "Fraud" wins.
Weighted rules: rules carry confidence weights; the class with the highest weight sum wins.
   Example: Rule1 (weight = 0.9): "Spam"; Rule2 (weight = 0.7): "Not Spam".
   Result: "Spam" (0.9 > 0.7).

Key Takeaways
- Triggering: a rule fires only if all its conditions are met.
- Conflicts: resolved via priority, specificity, order, voting, or weights.
- Design choice: depends on the domain (e.g., medical systems use weights; fraud detection uses specificity).
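Both voting schemes from the slide can be sketched with the standard library; the vote lists below just restate the slide's examples.

```python
from collections import Counter, defaultdict

# Majority voting: three fired rules, two say "Fraud", one says "Not Fraud".
votes = ["Fraud", "Fraud", "Not Fraud"]
majority = Counter(votes).most_common(1)[0][0]
print(majority)  # Fraud

# Weighted voting: sum each class's confidence weights; highest sum wins.
fired = [("Spam", 0.9), ("Not Spam", 0.7)]
scores = defaultdict(float)
for label, weight in fired:
    scores[label] += weight
weighted_winner = max(scores, key=scores.get)
print(weighted_winner)  # Spam
```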

Class Activity: Which rules will fire?
R1: IF age = youth AND student = yes → buys_computer = yes
R2: IF income = medium AND student = yes AND credit_rating = fair → buys_computer = yes
R3: IF age = senior AND credit_rating = excellent → buys_computer = no
R4: IF student = no AND income = high → buys_computer = no
Here is the customer we want to classify:
X = (age = youth, income = medium, student = yes, credit_rating = fair)
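One way to check the activity programmatically; the dictionary encoding of rules is an assumption, and running the snippet reveals the answer.

```python
X = {"age": "youth", "income": "medium", "student": "yes", "credit_rating": "fair"}

# Each rule's antecedent as attribute = value tests (encoding is illustrative).
rules = {
    "R1": {"age": "youth", "student": "yes"},
    "R2": {"income": "medium", "student": "yes", "credit_rating": "fair"},
    "R3": {"age": "senior", "credit_rating": "excellent"},
    "R4": {"student": "no", "income": "high"},
}

fired = [name for name, conds in rules.items()
         if all(X[attr] == value for attr, value in conds.items())]
print(fired)  # ['R1', 'R2']
```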

Properties of Rule-Based Classifiers:
Coverage: the percentage of records that satisfy the antecedent conditions of a rule.
Accuracy: of the records covered by a rule, the percentage whose class matches the rule's consequent.
The rules generated by rule-based classifiers are generally not mutually exclusive, i.e. several rules can cover the same record.
The rules may not be exhaustive, i.e. some records may not be covered by any rule.
The decision boundaries they create are linear, but the overall boundary can be much more complex than a decision tree's, because many rules can be triggered by the same record.
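Coverage, and the rule accuracy that usually accompanies it, can be sketched as follows; the record encoding and the toy `buys_computer` data are assumptions for illustration.

```python
def covered(rule_conds, records):
    """Records satisfying every antecedent condition of the rule."""
    return [r for r in records
            if all(r[attr] == value for attr, value in rule_conds.items())]

def coverage(rule_conds, records):
    """Fraction of all records that satisfy the rule's antecedent."""
    return len(covered(rule_conds, records)) / len(records)

def accuracy(rule_conds, consequent, records, target):
    """Among covered records, the fraction whose class equals the consequent."""
    hits = covered(rule_conds, records)
    if not hits:
        return 0.0
    return sum(r[target] == consequent for r in hits) / len(hits)

records = [
    {"student": "yes", "buys_computer": "yes"},
    {"student": "yes", "buys_computer": "no"},
    {"student": "no",  "buys_computer": "no"},
    {"student": "yes", "buys_computer": "yes"},
]
rule = {"student": "yes"}
print(coverage(rule, records))                          # 0.75 (3 of 4 records covered)
print(accuracy(rule, "yes", records, "buys_computer"))  # 2 of the 3 covered are "yes"
```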

A question that comes to mind, once we know that the rules are not mutually exclusive, is how the class is decided when different rules with different consequents cover the same record. There are two solutions:
- The rules can be ordered, i.e. the class of the highest-priority rule that is triggered is taken as the final class.
- Otherwise, the rules remain unordered and each class accumulates votes according to the weights of the rules that fire.

Class      Cap Shape  Cap Surface  Bruises  Odour    Stalk Shape  Population  Habitat
edible     flat       scaly        yes      anise    tapering     scattered   grasses
poisonous  convex     scaly        yes      pungent  enlargening  several     grasses
edible     convex     smooth       yes      almond   enlargening  numerous    grasses
edible     convex     scaly        yes      almond   tapering     scattered   meadows
edible     flat       fibrous      yes      anise    enlargening  several     woods
edible     flat       fibrous      no       none     enlargening  several     urban
poisonous  conical    scaly        yes      pungent  enlargening  scattered   urban
edible     flat       smooth       yes      anise    enlargening  numerous    meadows
poisonous  convex     smooth       yes      pungent  enlargening  several     urban

Rule Extraction from a Decision Tree
Goal: convert a decision tree into a rule-based classifier, handle rule pruning, and address conflicts.
Pipeline: Decision Tree → Path Tracing → Raw Rules → Pruning → Conflict Resolution → Final Rule Set

Why extract rules from decision trees?
- Decision trees (DTs) are easy to train, but can grow large and complex, and are hard to interpret when deep (e.g., 20+ levels).
- Rule-based classifiers (RBCs) represent knowledge as IF-THEN rules, which are often more compact and human-readable than trees.
Extraction: trace every root-to-leaf path in the DT; each path yields one IF-THEN rule.

Step-by-Step Conversion
1. Identify all paths from the root to each leaf.
2. For each path:
   - Antecedent (IF): combine all split conditions along the path with AND.
   - Consequent (THEN): use the leaf's class label.

Decision tree splits:
- Root: age = {youth, middle_aged, senior}
- Branch age = youth → student = {yes, no}
- Branch age = senior → credit_rating = {excellent, fair}

Extracted rules:
R1: IF age = youth AND student = no THEN buys_computer = no
R2: IF age = youth AND student = yes THEN buys_computer = yes
R3: IF age = middle_aged THEN buys_computer = yes
R4: IF age = senior AND credit_rating = excellent THEN buys_computer = yes
R5: IF age = senior AND credit_rating = fair THEN buys_computer = no
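The path-tracing procedure above can be sketched recursively. The nested-dict tree encoding and the `buys_computer` label follow the slide's example, but the code itself is an illustrative assumption.

```python
# Each internal node is {attribute: {value: subtree_or_leaf}}; leaves are labels.
tree = {
    "age": {
        "youth": {"student": {"no": "no", "yes": "yes"}},
        "middle_aged": "yes",
        "senior": {"credit_rating": {"excellent": "yes", "fair": "no"}},
    }
}

def extract_rules(node, path=()):
    """Trace every root-to-leaf path; each path becomes one IF-THEN rule."""
    if not isinstance(node, dict):            # leaf: emit the accumulated rule
        conds = " AND ".join(f"{attr} = {value}" for attr, value in path)
        return [f"IF {conds} THEN buys_computer = {node}"]
    (attr, branches), = node.items()          # internal node: single split attribute
    rules = []
    for value, child in branches.items():
        rules += extract_rules(child, path + ((attr, value),))
    return rules

for rule in extract_rules(tree):
    print(rule)
```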