Knowledge based reasoning using AIML


About This Presentation

Artificial Intelligence and Machine learning foundations


Slide Content

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning - approaches and issues of knowledge reasoning - knowledge-based agents
•Logic basics - logic - propositional logic: syntax, semantics and inferences - propositional logic: reasoning patterns
•Unification and resolution - knowledge representation using rules - knowledge representation using semantic nets
•Knowledge representation using frames - inferences
•Uncertain knowledge and reasoning - methods - Bayesian probability and belief networks
•Probabilistic reasoning - probabilistic reasoning over time
•Other uncertainty techniques - data mining - fuzzy logic - Dempster-Shafer theory

Knowledge Representation & Reasoning
•The second most important concept in AI
•If we are going to act rationally in our environment, then we must have some way of describing that environment and drawing inferences from that representation.
•How do we describe what we know about the world?
•How do we describe it concisely?
•How do we describe it so that we can get hold of the right piece of knowledge when we need it?
•How do we generate new pieces of knowledge?
•How do we deal with uncertain knowledge?

Knowledge: Declarative and Procedural
•Declarative knowledge deals with factoid questions ("What is the capital of India?", etc.)
•Procedural knowledge deals with "how"
•Procedural knowledge can be embedded in declarative knowledge

Planning
Given a set of goals, construct a sequence of actions that achieves
those goals:
•often very large search space
•but most parts of the world are independent of most other
parts
•often start with goals and connect them to actions
•no necessary connection between order of planning and order
of execution
•what happens if the world changes as we execute the plan
and/or our actions don’t produce the expected results?

Learning
•If a system is going to act truly appropriately, then it must be able to change its actions in the light of experience:
•how do we generate new facts from old?
•how do we generate new concepts?
•how do we learn to distinguish different situations in new environments?

What is knowledge representation?
•Knowledge representation and reasoning (KR, KRR) is the part of Artificial Intelligence concerned with how AI agents think and how thinking contributes to their intelligent behaviour.
•It is responsible for representing information about the real world so that a computer can understand it and can utilize this knowledge to solve complex real-world problems, such as diagnosing a medical condition or communicating with humans in natural language.
•It also describes how we can represent knowledge in artificial intelligence. Knowledge representation is not just storing data in a database; it also enables an intelligent machine to learn from that knowledge and experience so that it can behave intelligently like a human.

What to Represent?
The following kinds of knowledge need to be represented in AI systems:
•Objects: all the facts about objects in our world domain, e.g. guitars have strings, trumpets are brass instruments.
•Events: the actions which occur in our world.
•Performance: knowledge about behaviour, i.e. how to do things.
•Meta-knowledge: knowledge about what we know.
•Facts: the truths about the real world and what we represent.
•Knowledge base: the central component of a knowledge-based agent, written KB. A knowledge base is a set of sentences (here "sentence" is used as a technical term; it is not identical to a sentence in English).

Approaches to Knowledge Representation
•Representational adequacy: the ability to represent all of the kinds of knowledge that are needed in that domain.
•Inferential adequacy: the ability to manipulate the representation structures in such a way as to derive new structures corresponding to new knowledge inferred from old.
•Inferential efficiency: the ability to incorporate into the knowledge structure additional information that can be used to focus the attention of the inference mechanism in the most promising directions.
•Acquisitional efficiency: the ability to acquire new information easily. The simplest case involves direct insertion by a person of new knowledge into the database.

Knowledge Representation Issues
•Particular knowledge representation models allow for more specific, more powerful problem-solving mechanisms that operate on them.
•We examine specific techniques that can be used for representing and manipulating knowledge within programs.
•Representation and mapping:
•Facts: truths in some relevant world. These are the things we want to represent.
•Representations of facts in some chosen formalism. These are the things we actually manipulate.
•These entities are structured at two levels: the knowledge level, at which facts (including each agent's behaviour and current goals) are described, and the symbol level, at which the representations themselves are manipulated.
•(Mapping diagram: facts map to internal representations; English understanding and English generation map between facts and English representations.)


A KNOWLEDGE-BASED AGENT
•A knowledge-based agent includes a knowledge base and an inference system.
•A knowledge base is a set of representations of facts of the world.
•Each individual representation is called a sentence.
•The sentences are expressed in a knowledge representation language.
•The agent operates as follows:
1. It TELLs the knowledge base what it perceives.
2. It ASKs the knowledge base what action it should perform.
3. It performs the chosen action.
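As a minimal sketch of this TELL/ASK cycle (the KnowledgeBase class and kb_agent function below are illustrative names, not part of the slides, and the ASK step is only a placeholder for a real inference procedure):

# Illustrative knowledge-based agent skeleton (Python).
class KnowledgeBase:
    def __init__(self):
        self.sentences = set()            # sentences in the representation language

    def tell(self, sentence):
        self.sentences.add(sentence)      # add a new sentence to the KB

    def ask(self, query):
        # Placeholder: a real agent would run an inference mechanism here.
        return query in self.sentences

def kb_agent(kb, percept, t):
    kb.tell(("percept", percept, t))          # 1. TELL the KB what it perceives
    action = kb.ask(("best_action", t))       # 2. ASK the KB what action to perform
    kb.tell(("did", action, t))               # record the chosen action
    return action                             # 3. perform (here: return) the action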

Requirements for a Knowledge-Based Agent
1. "what it already knows" [McCarthy '59]
A knowledge base of beliefs.
2. "it must first be capable of being told" [McCarthy '59]
A way to put new beliefs into the knowledge base.
3. "automatically deduces for itself a sufficiently wide class of immediate consequences" [McCarthy '59]
A reasoning mechanism to derive new beliefs from ones already in the knowledge base.

ARCHITECTURE OF A KNOWLEDGE-BASED AGENT
•Knowledge Level.
•The most abstract level: describe the agent by saying what it knows.
•Example: A taxi agent might know that the Golden Gate Bridge connects San Francisco with Marin County.
•Logical Level.
•The level at which the knowledge is encoded into sentences.
•Example: Links(GoldenGateBridge, SanFrancisco, MarinCounty).
•Implementation Level.
•The physical representation of the sentences at the logical level.
•Example: '(links goldengatebridge sanfrancisco marincounty)

THE WUMPUS WORLD ENVIRONMENT
•The Wumpus computer game
•The agent explores a cave consisting of rooms connected by passageways.
•Lurking somewhere in the cave is the Wumpus, a beast that eats any agent that enters its room.
•Some rooms contain bottomless pits that trap any agent that wanders into the room.
•Occasionally, there is a heap of gold in a room.
•The goal is to collect the gold and exit the world without being eaten.

A TYPICAL WUMPUS WORLD
•The agent always starts in the field [1,1].
•The task of the agent is to find the gold, return to the field [1,1] and climb out of the cave.

AGENT IN A WUMPUS WORLD: PERCEPTS
•The agent perceives
•a stench in the square containing the Wumpus and in the adjacent squares (not diagonally)
•a breeze in the squares adjacent to a pit
•a glitter in the square where the gold is
•a bump, if it walks into a wall
•a woeful scream everywhere in the cave, if the Wumpus is killed
•The percepts are given as a five-symbol list. If there is a stench and a breeze, but no glitter, no bump, and no scream, the percept is
[Stench, Breeze, None, None, None]

WUMPUS WORLD ACTIONS
•go forward
•turn right 90 degrees
•turn left 90 degrees
•grab: pick up an object that is in the same square as the agent
•shoot: fire an arrow in a straight line in the direction the agent is facing. The arrow continues until it either hits and kills the Wumpus or hits the outer wall. The agent has only one arrow, so only the first Shoot action has any effect.
•climb is used to leave the cave. This action is only effective in the start square.
•die: this action automatically and irretrievably happens if the agent enters a square with a pit or a live Wumpus.

ILLUSTRATIVE EXAMPLE: WUMPUS WORLD
•Performance measure
•gold +1000,
•death -1000 (falling into a pit or being eaten by the Wumpus)
•-1 per step, -10 for using the arrow
•Environment
•Rooms / squares connected by doors.
•Squares adjacent to the Wumpus are smelly
•Squares adjacent to a pit are breezy
•Glitter iff gold is in the same square
•Shooting kills the Wumpus if you are facing it
•Shooting uses up the only arrow
•Grabbing picks up gold if in the same square
•Releasing drops the gold in the same square
•Randomly generated at the start of the game. The Wumpus only senses the current room.
•Sensors: Stench, Breeze, Glitter, Bump, Scream [perceptual inputs]
•Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot

WUMPUS WORLD CHARACTERIZATION
Fully observable? No - only local perception
Deterministic? Yes - outcomes exactly specified
Static? Yes - Wumpus and pits do not move
Discrete? Yes
Single-agent? Yes - the Wumpus is essentially a "natural feature"

EXPLORING A WUMPUS WORLD
Percept (Stench, Breeze, Glitter, Bump, Scream): None, none, none, none, none
The knowledge base of the agent consists of the rules of the Wumpus world plus the percept "nothing" in [1,1].
Boolean percept feature values: <0, 0, 0, 0, 0>

EXPLORING A WUMPUS WORLD (T = 0)
Percept (Stench, Breeze, Glitter, Bump, Scream): None, none, none, none, none
At T = 0 the KB of the agent consists of the rules of the Wumpus world plus the percept "nothing" in [1,1].
By inference, the agent's knowledge base also has the information that [2,1] and [1,2] are okay. Added as propositions.
World "known" to the agent at time = 0.

EXPLORING A WUMPUS WORLD (T = 1)
Percept (Stench, Breeze, Glitter, Bump, Scream) at T = 1: None, breeze, none, none, none
What follows? Pit(2,2) or Pit(3,1). Where next?
(Grid legend: A - agent, V - visited, B - breeze, P? - possible pit.)

EXPLORING A WUMPUS WORLD (T = 3)
Percept (Stench, Breeze, Glitter, Bump, Scream) at T = 3: Stench, none, none, none, none
Where is the Wumpus? The Wumpus cannot be in (1,1) or in (2,2) (why?), so the Wumpus is in (1,3).
There is no breeze in (1,2), so there is no pit in (2,2); but we know there is a pit in (2,2) or (3,1), so the pit is in (3,1).
(The 4x4 grid now shows W at (1,3), P at (3,1), and S marking the stench squares.)
Stench, Breeze, Glitter, Bump, Scream

P
W
P
We reasoned about the possible states the Wumpusworld can be in,
given our percepts and our knowledge of the rules of the Wumpus
world.
I.e., the content of KB at T=3.
Essence of logical reasoning:
Given all we know, Pit_in_(3,1) holds.
(“The world cannot be different.”)
What follows is what holds true in all those worlds that satisfy what is
known at that time T=3 about the particular Wumpusworld we are in.
Models(KB) Models(P_in_(3,1))
Example property: P_in_(3,1)
EXPLORING A WUMPUS WORLD
17-03-2021 2518CSC305J_AI_UNIT3

NO INDEPENDENT ACCESS TO THE WORLD
•The reasoning agent often gets its knowledge about the facts of the world as a sequence of logical sentences and must draw conclusions only from them, without independent access to the world.
•Thus it is very important that the agent's reasoning is sound!

SUMMARY OF KNOWLEDGE-BASED AGENTS
•Intelligent agents need knowledge about the world for making good decisions.
•The knowledge of an agent is stored in a knowledge base in the form of sentences in a knowledge representation language.
•A knowledge-based agent needs a knowledge base and an inference mechanism. It operates by storing sentences in its knowledge base, inferring new sentences with the inference mechanism, and using them to deduce which actions to take.
•A representation language is defined by its syntax and semantics, which specify the structure of sentences and how they relate to the facts of the world.
•The interpretation of a sentence is the fact to which it refers. If this fact is part of the actual world, then the sentence is true.


What is a Logic?
•A language with concrete rules
•No ambiguity in representation (there may be other errors!)
•Allows unambiguous communication and processing
•Very unlike natural languages, e.g. English
•Many ways to translate between languages
•A statement can be represented in different logics
•And perhaps differently in the same logic
•Expressiveness of a logic
•How much can we say in this language?
•Not to be confused with logical reasoning
•Logics are languages, reasoning is a process (which may use logic)

Syntax and Semantics
•Syntax
•Rules for constructing legal sentences in the logic
•Which symbols we can use (English: letters, punctuation)
•How we are allowed to combine symbols
•Semantics
•How we interpret (read) sentences in the logic
•Assigns a meaning to each sentence
•Example: “All lecturers are seven foot tall”
•A valid sentence (syntax)
•And we can understand the meaning (semantics)
•This sentence happens to be false (there is a counterexample)

Propositional Logic
•Syntax
•Propositions, e.g. "it is wet"
•Connectives: and, or, not, implies, iff (equivalent)
•Brackets, T (true) and F (false)
•Semantics (classical, also known as Boolean)
•Define how connectives affect truth
•"P and Q" is true if and only if P is true and Q is true
•Use truth tables to work out the truth of statements
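A quick illustration of the truth-table idea (plain Python written for this text; the sentence checked here, (P and Q) implies (not P or Q), is an arbitrary example):

from itertools import product

def implies(a, b):
    return (not a) or b                     # material implication

# Enumerate every assignment of P and Q and evaluate the compound sentence.
for p, q in product([True, False], repeat=2):
    print(p, q, implies(p and q, (not p) or q))
# All four rows print True, so this particular sentence is valid.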

Predicate Logic
•Propositional logic combines atoms
•An atom contains no propositional connectives
•Atoms have no structure (today_is_wet, john_likes_apples)
•Predicates allow us to talk about objects
•Properties: is_wet(today)
•Relations: likes(john, apples)
•True or false
•In predicate logic each atom is a predicate
•e.g. first order logic, higher-order logic

First Order Logic
•More expressive logic than propositional
•Used in this course (Lecture 6 on representation in FOL)
•Constants are objects: john, apples
•Predicates are properties and relations:
•likes(john, apples)
•Functions transform objects:
•likes(john, fruit_of(apple_tree))
•Variables represent any object: likes(X, apples)
•Quantifiers qualify values of variables
•True for all objects (universal): ∀X. likes(X, apples)
•Exists at least one object (existential): ∃X. likes(X, apples)

Example: FOL Sentence
•"Every rose has a thorn"
•For all X,
•if (X is a rose)
•then there exists Y such that
•(X has Y) and (Y is a thorn)
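Written with the quantifier notation above (the predicate names rose, has and thorn are chosen here just for illustration), this reading is:

∀X. rose(X) → (∃Y. has(X, Y) ∧ thorn(Y))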

Example: FOL Sentence
•“On Mondays and Wednesdays I go to John’s house for dinner”
⚫Note the change from “and” to “or”
–Translating is problematic

Higher Order Logic
•More expressive than first order
•Functions and predicates are also objects
•Described by predicates: binary(addition)
•Transformed by functions: differentiate(square)
•Can quantify over both
•E.g. define red functions as having zero at 17
•Much harder to reason with

Beyond True and False
•Multi-valued logics
•More than two truth values
•e.g., true, false & unknown
•Fuzzy logic uses degrees of truth, with truth values in [0,1]
•Modal logics
•Modal operators define a mode for propositions
•Epistemic logics (belief)
•e.g. □p (necessarily p), ◇p (possibly p), ...
•Temporal logics (time)
•e.g. □p (always p), ◇p (eventually p), ...


Propositional logic
•Propositional logic consists of:
•The logical values true and false (T and F)
•Propositions: "sentences," which
•are atomic (that is, they must be treated as indivisible units, with no internal structure), and
•have a single logical value, either true or false
•Operators, both unary and binary; when applied to logical values, they yield logical values
•The usual operators are and, or, not, and implies

Truth tables
•Logic, like arithmetic, has operators, which apply to one, two, or more values (operands)
•A truth table lists the results for each possible arrangement of operands
•Order is important: x op y may or may not give the same result as y op x
•The rows in a truth table list all possible sequences of truth values for n operands, and specify a result for each sequence
•Hence, there are 2^n rows in a truth table for n operands

Unary operators
•There are four possible unary operators:
•Only the last of these (negation) is widely used (and has a symbol, ¬, for the operation)

  X | Constant true (T)     X | Constant false (F)     X | Identity (X)     X | Negation (¬X)
  T | T                     T | F                      T | T                T | F
  F | T                     F | F                      F | F                F | T

Combined table for unary operators

  X | Constant T | Constant F | Identity | ¬X
  T |     T      |     F      |    T     | F
  F |     T      |     F      |    F     | T

Binary operators
•There are sixteen possible binary operators:
•All these operators have names, but I haven't tried to fit them in
•Only a few of these operators are normally used in logic

  X Y | the sixteen possible results
  T T | T T T T T T T T F F F F F F F F
  T F | T T T T F F F F T T T T F F F F
  F T | T T F F T T F F T T F F T T F F
  F F | T F T F T F T F T F T F T F T F

Useful binary operators
•Here are the binary operators that are traditionally used:
•Notice in particular that material implication (→) only approximately means the same as the English word "implies"
•All the other operators can be constructed from a combination of these (along with unary not, ¬)

  X Y | AND (X ∧ Y) | OR (X ∨ Y) | IMPLIES (X → Y) | BICONDITIONAL (X ↔ Y)
  T T |      T      |     T      |        T        |          T
  T F |      F      |     T      |        F        |          F
  F T |      F      |     T      |        T        |          F
  F F |      F      |     F      |        T        |          T

Logical expressions
•All logical expressions can be computed with some combination of and (∧), or (∨), and not (¬) operators
•For example, logical implication can be computed this way:
•Notice that ¬X ∨ Y is equivalent to X → Y

  X Y | ¬X | ¬X ∨ Y | X → Y
  T T |  F |   T    |   T
  T F |  F |   F    |   F
  F T |  T |   T    |   T
  F F |  T |   T    |   T

Another example
•Exclusive or (xor) is true if exactly one of its operands is true
•Notice that (¬X ∧ Y) ∨ (X ∧ ¬Y) is equivalent to X xor Y

  X Y | ¬X | ¬Y | ¬X ∧ Y | X ∧ ¬Y | (¬X ∧ Y) ∨ (X ∧ ¬Y) | X xor Y
  T T |  F |  F |   F    |   F    |          F          |    F
  T F |  F |  T |   F    |   T    |          T          |    T
  F T |  T |  F |   T    |   F    |          T          |    T
  F F |  T |  T |   F    |   F    |          F          |    F
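This equivalence can be confirmed mechanically with a few lines of Python (written for this text, not taken from the slides):

from itertools import product

# Check that (not X and Y) or (X and not Y) agrees with xor on every assignment.
for x, y in product([True, False], repeat=2):
    lhs = (not x and y) or (x and not y)
    assert lhs == (x != y)          # on booleans, x != y is exclusive or
print("equivalent on all four rows")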

World
•A world is a collection of propositions and logical expressions relating those propositions
•Example:
•Propositions: JohnLovesMary, MaryIsFemale, MaryIsRich
•Expression: MaryIsFemale ∧ MaryIsRich → JohnLovesMary
•A proposition "says something" about the world, but since it is atomic (you can't look inside it to see component parts), propositions tend to be very specialized and inflexible

Models
A model is an assignment of a truth value to each proposition, for example:
•JohnLovesMary: T, MaryIsFemale: T, MaryIsRich: F
•An expression is satisfiable if there is a model for which the expression is true
•For example, the above model satisfies the expression
MaryIsFemale ∧ MaryIsRich → JohnLovesMary
•An expression is valid if it is satisfied by every model
•This expression is not valid:
MaryIsFemale ∧ MaryIsRich → JohnLovesMary
because it is not satisfied by this model:
JohnLovesMary: F, MaryIsFemale: T, MaryIsRich: T
•But this expression is valid:
MaryIsFemale ∧ MaryIsRich → MaryIsFemale

Inference rules in propositional logic
•Here are just a few of the rules you can apply when reasoning in propositional logic; two of them, implication elimination and conjunction elimination, are spelled out on the following slides.

Implication elimination
•A particularly important rule allows you to get rid of the implication operator, →:
•X → Y  ≡  ¬X ∨ Y
•We will use this later on as a necessary tool for simplifying logical expressions
•The symbol ≡ means "is logically equivalent to"

Conjunction elimination
•Another important rule for simplifying logical expressions allows you to get rid of the conjunction (and) operator, ∧:
•This rule simply says that if you have an and operator at the top level of a fact (logical expression), you can break the expression up into two separate facts:
•MaryIsFemale ∧ MaryIsRich
•becomes:
•MaryIsFemale
•MaryIsRich

Inference by computer
•To do inference (reasoning) by computer is basically a search process, taking logical expressions and applying inference rules to them
•Which logical expressions to use?
•Which inference rules to apply?
•Usually you are trying to "prove" some particular statement
•Example:
•it_is_raining ∨ it_is_sunny
•it_is_sunny → I_stay_dry
•it_is_raining → I_take_umbrella
•I_take_umbrella → I_stay_dry
•To prove: I_stay_dry


Reasoning Patterns
•Inference in propositional logic is NP-complete!
•However, inference in propositional logic shows monotonicity:
•Adding more rules to a knowledge base does not affect earlier inferences

Forward and backward reasoning
•Situation: you have a collection of logical expressions (premises), and you are trying to prove some additional logical expression (the conclusion)
•You can:
•Do forward reasoning: start applying inference rules to the logical expressions you have, and stop if one of your results is the conclusion you want
•Do backward reasoning: start from the conclusion you want, and try to choose inference rules that will get you back to the logical expressions you have
•With the tools we have discussed so far, neither is feasible

Example
•Given:
•it_is_raining ∨ it_is_sunny
•it_is_sunny → I_stay_dry
•it_is_raining → I_take_umbrella
•I_take_umbrella → I_stay_dry
•You can conclude:
•it_is_sunny ∨ it_is_raining
•I_take_umbrella ∨ it_is_sunny
•I_stay_dry ∨ I_take_umbrella
•Etc., etc. ... there are just too many things you can conclude!

Predicate calculus
•Predicate calculus is also known as "First Order Logic" (FOL)
•Predicate calculus includes:
•All of propositional logic
•Logical values: true, false
•Variables: x, y, a, b, ...
•Connectives: ¬, ∧, ∨, →, ↔
•Constants: KingJohn, 2, Villanova, ...
•Predicates: Brother, >, ...
•Functions: Sqrt, MotherOf, ...
•Quantifiers: ∀, ∃

Constants, functions, and predicates
•A constant represents a "thing" -- it has no truth value, and it does not occur "bare" in a logical expression
•Examples: DavidMatuszek, 5, Earth, goodIdea
•Given zero or more arguments, a function produces a constant as its value:
•Examples: motherOf(DavidMatuszek), add(2, 2), thisPlanet()
•A predicate is like a function, but produces a truth value
•Examples: greatInstructor(DavidMatuszek), isPlanet(Earth), greater(3, add(2, 2))

Universal quantification
•The universal quantifier, ∀, is read as "for each" or "for every"
•Example: ∀x, x² ≥ 0 (for all x, x² is greater than or equal to zero)
•Typically, → is the main connective with ∀:
∀x, at(x, Villanova) → smart(x)
means "Everyone at Villanova is smart"
•Common mistake: using ∧ as the main connective with ∀:
∀x, at(x, Villanova) ∧ smart(x)
means "Everyone is at Villanova and everyone is smart"
•If there are no values satisfying the condition, the result is true
•Example: ∀x, isPersonFromMars(x) → smart(x) is true

Existential quantification
•The existential quantifier, ∃, is read "for some" or "there exists"
•Example: ∃x, x² < 0 (there exists an x such that x² is less than zero)
•Typically, ∧ is the main connective with ∃:
∃x, at(x, Villanova) ∧ smart(x)
means "There is someone who is at Villanova and is smart"
•Common mistake: using → as the main connective with ∃:
∃x, at(x, Villanova) → smart(x)
This is true if there is someone at Villanova who is smart...
...but it is also true if there is someone who is not at Villanova
By the rules of material implication, the result of F → T is T

Properties of quantifiers
•∀x ∀y is the same as ∀y ∀x
•∃x ∃y is the same as ∃y ∃x
•∃x ∀y is not the same as ∀y ∃x
•∃x ∀y Loves(x, y)
•"There is a person who loves everyone in the world"
•More exactly: ∃x ∀y (person(x) ∧ person(y) → Loves(x, y))
•∀y ∃x Loves(x, y)
•"Everyone in the world is loved by at least one person"
•Quantifier duality: each can be expressed using the other
•∀x Likes(x, IceCream)  ≡  ¬∃x ¬Likes(x, IceCream)
•∃x Likes(x, Broccoli)  ≡  ¬∀x ¬Likes(x, Broccoli)

Parentheses
•Parentheses are often used with quantifiers
•Unfortunately, everyone uses them differently, so don't be upset at any usage you see
•Examples:
•(∀x) person(x) → likes(x, iceCream)
•(∀x) (person(x) → likes(x, iceCream))
•(∀x) [ person(x) → likes(x, iceCream) ]
•∀x, person(x) → likes(x, iceCream)
•∀x (person(x) → likes(x, iceCream))
•I prefer parentheses that show the scope of the quantifier
•∃x (x > 0) ∧ ∃x (x < 0)

More rules
•Now there are numerous additional rules we can apply!
•Here are two exceptionally important rules:
•¬∀x, p(x) → ∃x, ¬p(x)
"If not every x satisfies p(x), then there exists an x that does not satisfy p(x)"
•¬∃x, p(x) → ∀x, ¬p(x)
"If there does not exist an x that satisfies p(x), then all x do not satisfy p(x)"
•In any case, the search space is just too large to be feasible
•This was the case until 1965, when J. A. Robinson introduced resolution


Logic by computer was infeasible
•Why is logic so hard?
•You start with a large collection of facts (predicates)
•You start with a large collection of possible transformations (rules)
•Some of these rules apply to a single fact to yield a new fact
•Some of these rules apply to a pair of facts to yield a new fact
•So at every step you must:
•Choose some rule to apply
•Choose one or two facts to which you might be able to apply the rule
•If there are n facts
•There are n potential ways to apply a single-operand rule
•There are n * (n - 1) potential ways to apply a two-operand rule
•Add the new fact to your ever-expanding fact base
•The search space is huge!

The magic of resolution
•Here's how resolution works:
•You transform each of your facts into a particular form, called a clause (this is the tricky part)
•You apply a single rule, the resolution principle, to a pair of clauses
•Clauses are closed with respect to resolution -- that is, when you resolve two clauses, you get a new clause
•You add the new clause to your fact base
•So the number of facts you have grows linearly
•You still have to choose a pair of facts to resolve
•You never have to choose a rule, because there's only one

The fact base
•A fact base is a collection of "facts," expressed in predicate calculus, that are presumed to be true (valid)
•These facts are implicitly "anded" together
•Example fact base:
•seafood(X) → likes(John, X)   (where X is a variable)
•seafood(shrimp)
•pasta(X) → likes(Mary, X)   (where X is a different variable)
•pasta(spaghetti)
•That is,
•(seafood(X) → likes(John, X)) ∧ seafood(shrimp) ∧ (pasta(Y) → likes(Mary, Y)) ∧ pasta(spaghetti)
•Notice that we had to change some Xs to Ys
•The scope of a variable is the single fact in which it occurs

Clause form
•A clause is a disjunction ("or") of zero or more literals, some or all of which may be negated
•Example:
sinks(X) ∨ dissolves(X, water) ∨ ¬denser(X, water)
•Notice that clauses use only "or" and "not" -- they do not use "and," "implies," or either of the quantifiers "for all" or "there exists"
•The impressive part is that any predicate calculus expression can be put into clause form
•Existential quantifiers, ∃, are the trickiest ones

Unification
•From the pair of facts (not yet clauses, just facts):
•seafood(X) → likes(John, X)   (where X is a variable)
•seafood(shrimp)
•We ought to be able to conclude
•likes(John, shrimp)
•We can do this by unifying the variable X with the constant shrimp
•This is the same "unification" as is done in Prolog
•This unification turns seafood(X) → likes(John, X) into seafood(shrimp) → likes(John, shrimp)
•Together with the given fact seafood(shrimp), the final deductive step is easy

The resolution principle
•Here it is:
•From  X ∨ someLiterals
and  ¬X ∨ someOtherLiterals
----------------------------------------------
conclude:  someLiterals ∨ someOtherLiterals
•That's all there is to it!
•Example:
•broke(Bob) ∨ well-fed(Bob)
¬broke(Bob) ∨ ¬hungry(Bob)
--------------------------------------
well-fed(Bob) ∨ ¬hungry(Bob)

A common error
•You can only do one resolution at a time
•Example:
•broke(Bob) ∨ well-fed(Bob) ∨ happy(Bob)
¬broke(Bob) ∨ ¬hungry(Bob) ∨ ¬happy(Bob)
•You can resolve on broke to get:
•well-fed(Bob) ∨ happy(Bob) ∨ ¬hungry(Bob) ∨ ¬happy(Bob)  ≡  T
•Or you can resolve on happy to get:
•broke(Bob) ∨ well-fed(Bob) ∨ ¬broke(Bob) ∨ ¬hungry(Bob)  ≡  T
•Note that both legal resolutions yield a tautology (a trivially true statement, containing X ∨ ¬X), which is correct but useless
•But you cannot resolve on both at once to get:
•well-fed(Bob) ∨ ¬hungry(Bob)

Contradiction
•A special case occurs when the result of a resolution (the resolvent) is empty, or "NIL"
•Example:
•hungry(Bob)
¬hungry(Bob)
----------------
NIL
•In this case, the fact base is inconsistent
•This will turn out to be a very useful observation in doing resolution theorem proving

A first example
•"Everywhere that John goes, Rover goes. John is at school."
•at(John, X) → at(Rover, X)   (not yet in clause form)
•at(John, school)   (already in clause form)
•We use implication elimination to change the first of these into clause form:
•¬at(John, X) ∨ at(Rover, X)
•at(John, school)
•We can resolve these on at(-, -), but to do so we have to unify X with school; this gives:
•at(Rover, school)

Refutation resolution
•The previous example was easy because it had very few clauses
•When we have a lot of clauses, we want to focus our search on the thing we would like to prove
•We can do this as follows:
•Assume that our fact base is consistent (we can't derive NIL)
•Add the negation of the thing we want to prove to the fact base
•Show that the fact base is now inconsistent
•Conclude the thing we want to prove

Example of refutation resolution
•"Everywhere that John goes, Rover goes. John is at school. Prove that Rover is at school."
1. ¬at(John, X) ∨ at(Rover, X)
2. at(John, school)
3. ¬at(Rover, school)   (this is the added clause)
•Resolve #1 and #3 (unifying X with school):
4. ¬at(John, school)
•Resolve #2 and #4:
5. NIL
•Conclude the negation of the added clause: at(Rover, school)
•This seems a roundabout approach for such a simple example, but it works well for larger problems

A second example
•Start with:
•it_is_raining ∨ it_is_sunny
•it_is_sunny → I_stay_dry
•it_is_raining → I_take_umbrella
•I_take_umbrella → I_stay_dry
•Convert to clause form:
1. it_is_raining ∨ it_is_sunny
2. ¬it_is_sunny ∨ I_stay_dry
3. ¬it_is_raining ∨ I_take_umbrella
4. ¬I_take_umbrella ∨ I_stay_dry
•Prove that I stay dry; add the negated goal:
5. ¬I_stay_dry
•Proof:
6. (5, 2) ¬it_is_sunny
7. (6, 1) it_is_raining
8. (5, 4) ¬I_take_umbrella
9. (8, 3) ¬it_is_raining
10. (9, 7) NIL
▪Therefore, ¬(¬I_stay_dry)
▪I_stay_dry
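A small propositional resolution prover can reproduce this refutation mechanically. The sketch below is an illustration written for this text (clauses are sets of string literals, with "~" marking negation), not a standard library:

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (each clause is a frozenset of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refutes(clauses):
    """Saturate the clause set by resolution; True if the empty clause (NIL) appears."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True           # derived NIL: the clause set is inconsistent
                new.add(r)
        if new <= clauses:
            return False                  # nothing new and no NIL: no contradiction
        clauses |= new

kb = [{"it_is_raining", "it_is_sunny"},
      {"~it_is_sunny", "I_stay_dry"},
      {"~it_is_raining", "I_take_umbrella"},
      {"~I_take_umbrella", "I_stay_dry"}]
# KB plus the negated goal is inconsistent, so I_stay_dry is entailed.
print(refutes(kb + [{"~I_stay_dry"}]))    # True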

Converting sentences to CNF
1. Eliminate all ↔ connectives:
(P ↔ Q)  becomes  ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives:
(P → Q)  becomes  (¬P ∨ Q)
3. Reduce the scope of each negation symbol to a single predicate:
¬¬P  becomes  P
¬(P ∨ Q)  becomes  ¬P ∧ ¬Q
¬(P ∧ Q)  becomes  ¬P ∨ ¬Q
¬(∀x)P  becomes  (∃x)¬P
¬(∃x)P  becomes  (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable name

Converting sentences to clausal form: Skolem constants and functions
5. Eliminate existential quantification by introducing Skolem constants/functions:
(∃x)P(x)  becomes  P(c)
c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence)
(∀x)(∃y)P(x, y)  becomes  (∀x)P(x, f(x))
since ∃ is within the scope of a universally quantified variable, use a Skolem function f to construct a new value that depends on the universally quantified variable
f must be a brand-new function name not occurring in any other sentence in the KB.
E.g., (∀x)(∃y)loves(x, y)  becomes  (∀x)loves(x, f(x))
In this case, f(x) specifies the person that x loves

Converting sentences to clausal form
6. Remove universal quantifiers by (1) moving them all to the left end; (2) making the scope of each the entire sentence; and (3) dropping the "prefix" part
Ex: (∀x)P(x)  becomes  P(x)
7. Put into conjunctive normal form (conjunction of disjunctions) using the distributive and associative laws
(P ∧ Q) ∨ R  becomes  (P ∨ R) ∧ (Q ∨ R)
(P ∨ Q) ∨ R  becomes  (P ∨ Q ∨ R)
8. Split conjuncts into separate clauses
9. Standardize variables so each clause contains only variable names that do not occur in any other clause
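For the propositional (quantifier-free) parts of this procedure, a CNF converter can be used to check steps 1-3 and 7; the sketch below assumes the sympy library is available and is only an illustration:

from sympy import symbols
from sympy.logic.boolalg import to_cnf, Equivalent

P, Q, R = symbols("P Q R")

print(to_cnf(Equivalent(P, Q)))   # eliminates <->, equivalent to (P | ~Q) & (Q | ~P)
print(to_cnf((P & Q) >> R))       # eliminates ->,  equivalent to ~P | ~Q | R
print(to_cnf((P & Q) | R))        # distributes,    equivalent to (P | R) & (Q | R)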

An example
(∀x)(P(x) → ((∀y)(P(y) → P(f(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
6. Drop universal quantification symbols
(¬P(x) ∨ ((¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))

Example (continued)
7. Convert to a conjunction of disjunctions
(¬P(x) ∨ ¬P(y) ∨ P(f(x,y))) ∧ (¬P(x) ∨ Q(x,g(x))) ∧ (¬P(x) ∨ ¬P(g(x)))
8. Create separate clauses
¬P(x) ∨ ¬P(y) ∨ P(f(x,y))
¬P(x) ∨ Q(x,g(x))
¬P(x) ∨ ¬P(g(x))
9. Standardize variables
¬P(x) ∨ ¬P(y) ∨ P(f(x,y))
¬P(z) ∨ Q(z,g(z))
¬P(w) ∨ ¬P(g(w))

Running example
•All Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy
•∀x, [ Roman(x) ∧ know(x, Marcus) ] →
[ hate(x, Caesar) ∨
(∀y, ∃z, hate(y, z) → thinkCrazy(x, y)) ]

Step 1: Eliminate implications
•Use the fact that x → y is equivalent to ¬x ∨ y
•∀x, [ Roman(x) ∧ know(x, Marcus) ] →
[ hate(x, Caesar) ∨
(∀y, ∃z, hate(y, z) → thinkCrazy(x, y)) ]
becomes
•∀x, ¬[ Roman(x) ∧ know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨
(∀y, ¬(∃z, hate(y, z)) ∨ thinkCrazy(x, y)) ]

Step 2: Reduce the scope of ¬
•Reduce the scope of negation to a single term, using:
•¬(¬p) ≡ p
•¬(a ∧ b) ≡ (¬a ∨ ¬b)
•¬(a ∨ b) ≡ (¬a ∧ ¬b)
•¬∀x, p(x) ≡ ∃x, ¬p(x)
•¬∃x, p(x) ≡ ∀x, ¬p(x)
•∀x, ¬[ Roman(x) ∧ know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨
(∀y, ¬(∃z, hate(y, z)) ∨ thinkCrazy(x, y)) ]
becomes
•∀x, [ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨
(∀y, ∀z, ¬hate(y, z) ∨ thinkCrazy(x, y)) ]

Step 3: Standardize variables apart
•∀x, P(x) ∨ ∀x, Q(x)
becomes
∀x, P(x) ∨ ∀y, Q(y)
•This is just to keep the scopes of variables from getting confused
•Not necessary in our running example

Step 4: Move quantifiers
•Move all quantifiers to the left, without changing their relative positions
•∀x, [ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨
(∀y, ∀z, ¬hate(y, z) ∨ thinkCrazy(x, y)) ]
becomes
•∀x, ∀y, ∀z, [ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨
(¬hate(y, z) ∨ thinkCrazy(x, y)) ]

Step 5: Eliminate existential quantifiers
•We do this by introducing Skolem functions:
•If ∃x, p(x), then just pick one; call it x'
•If the existential quantifier is under control of a universal quantifier, then the picked value has to be a function of the universally quantified variable:
•If ∀x, ∃y, p(x, y), then write ∀x, p(x, y(x))
•Not necessary in our running example

Step 6: Drop the prefix (quantifiers)
•∀x, ∀y, ∀z, [ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkCrazy(x, y)) ]
•At this point, all the quantifiers are universal quantifiers
•We can just take it for granted that all variables are universally quantified
•[ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkCrazy(x, y)) ]

Step 7: Create a conjunction of disjuncts
•[ ¬Roman(x) ∨ ¬know(x, Marcus) ] ∨
[ hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkCrazy(x, y)) ]
becomes
¬Roman(x) ∨ ¬know(x, Marcus) ∨ hate(x, Caesar) ∨ ¬hate(y, z) ∨ thinkCrazy(x, y)

Step 8: Create separate clauses
•Every place we have an ∧, we break our expression up into separate pieces
•Not necessary in our running example

Step 9: Standardize apart
•Rename variables so that no two clauses have the same variable
•Not necessary in our running example
•Final result:
¬Roman(x) ∨ ¬know(x, Marcus) ∨ hate(x, Caesar) ∨ ¬hate(y, z) ∨ thinkCrazy(x, y)
•That's it! It's a long process, but easy enough to do mechanically

Resolution
•Resolution is a sound and complete inference procedure for FOL
•Reminder: the resolution rule for propositional logic:
•P1 ∨ P2 ∨ ... ∨ Pn
•¬P1 ∨ Q2 ∨ ... ∨ Qm
•Resolvent: P2 ∨ ... ∨ Pn ∨ Q2 ∨ ... ∨ Qm
•Examples
•P and ¬P ∨ Q : derive Q (Modus Ponens)
•(¬P ∨ Q) and (¬Q ∨ R) : derive ¬P ∨ R
•P and ¬P : derive False [contradiction!]
•(P ∨ Q) and (¬P ∨ ¬Q) : derive True

Resolution in first-order logic
•Given sentences in conjunctive normal form:
P1 ∨ ... ∨ Pn
Q1 ∨ ... ∨ Qm
•each Pi and Qi is a literal, i.e., a positive or negated predicate symbol with its terms
•if Pj and ¬Qk unify with substitution list θ, then derive the resolvent sentence:
subst(θ, P1 ∨ ... ∨ Pj-1 ∨ Pj+1 ∨ ... ∨ Pn ∨ Q1 ∨ ... ∨ Qk-1 ∨ Qk+1 ∨ ... ∨ Qm)
•Example
•from clause P(x, f(a)) ∨ P(x, f(y)) ∨ Q(y)
•and clause ¬P(z, f(a)) ∨ ¬Q(z)
•derive the resolvent P(z, f(y)) ∨ Q(y) ∨ ¬Q(z)
•using θ = {x/z}

Resolution refutation
•Given a consistent set of axioms KB and a goal sentence Q, show that KB |= Q
•Proof by contradiction: add ¬Q to KB and try to prove false,
i.e., (KB |- Q) ↔ (KB ∧ ¬Q |- False)
•Resolution is refutation complete: it can establish that a given sentence Q is entailed by KB, but it can't (in general) be used to generate all logical consequences of a set of sentences
•Also, it cannot be used to prove that Q is not entailed by KB.
•Resolution won't always give an answer, since entailment is only semidecidable
•And you can't just run two proofs in parallel, one trying to prove Q and the other trying to prove ¬Q, since KB might not entail either one

Refutation resolution proof tree
Clauses:
1. ¬allergies(w) ∨ sneeze(w)
2. ¬cat(y) ∨ ¬allergic-to-cats(z) ∨ allergies(z)
3. cat(Felix)
4. allergic-to-cats(Lise)
5. ¬sneeze(Lise)   (negated query)
Proof tree (one resolution per step):
•1 and 2, with {w/z}: ¬cat(y) ∨ sneeze(z) ∨ ¬allergic-to-cats(z)
•with 3, using {y/Felix}: sneeze(z) ∨ ¬allergic-to-cats(z)
•with 4, using {z/Lise}: sneeze(Lise)
•with 5: false

We need answers to the following questions
•How to convert FOL sentences to conjunctive normal form (a.k.a. CNF, clause form): normalization and skolemization
•How to unify two argument lists, i.e., how to find their most general unifier (mgu) θ: unification
•How to determine which two clauses in the KB should be resolved next (among all resolvable pairs of clauses): resolution (search) strategy

Unification
•Unification is a "pattern-matching" procedure
•It takes two atomic sentences, called literals, as input
•It returns "Failure" if they do not match, and a substitution list, θ, if they do
•That is, unify(p, q) = θ means subst(θ, p) = subst(θ, q) for two atomic sentences p and q
•θ is called the most general unifier (mgu)
•All variables in the given two literals are implicitly universally quantified
•To make literals match, replace (universally quantified) variables by terms

Unification algorithm
procedure unify(p, q, θ)
    Scan p and q left-to-right and find the first corresponding
    terms where p and q "disagree" (i.e., p and q not equal)
    If there is no disagreement, return θ (success!)
    Let r and s be the terms in p and q, respectively,
    where disagreement first occurs
    If variable(r) then {
        Let θ = union(θ, {r/s})
        Return unify(subst(θ, p), subst(θ, q), θ)
    } else if variable(s) then {
        Let θ = union(θ, {s/r})
        Return unify(subst(θ, p), subst(θ, q), θ)
    } else return "Failure"
end

Unification: Remarks
•Unify is a linear-time algorithm that returns the most general unifier (mgu), i.e., the shortest-length substitution list that makes the two literals match.
•In general, there is not a unique minimum-length substitution list, but unify returns one of minimum length
•A variable can never be replaced by a term containing that variable
Example: x/f(x) is illegal.
•This "occurs check" should be done in the above pseudo-code before making the recursive calls

Unification examples
•Example:
•parents(x, father(x), mother(Bill))
•parents(Bill, father(Bill), y)
•{x/Bill, y/mother(Bill)}
•Example:
•parents(x, father(x), mother(Bill))
•parents(Bill, father(y), z)
•{x/Bill, y/Bill, z/mother(Bill)}
•Example:
•parents(x, father(x), mother(Jane))
•parents(Bill, father(y), mother(y))
•Failure
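The pseudocode above translates into a short working unifier. The sketch below is an illustrative encoding written for this text (terms are nested tuples of the form (functor, arg1, ...), and short lowercase strings are treated as variables); it reproduces the first and third examples:

def is_variable(t):
    # Convention for this sketch only: variables are short lowercase names.
    return isinstance(t, str) and t.islower() and len(t) <= 2

def occurs(v, t):
    return v == t or (isinstance(t, tuple) and any(occurs(v, a) for a in t[1:]))

def substitute(theta, t):
    if is_variable(t):
        return substitute(theta, theta[t]) if t in theta else t
    if isinstance(t, tuple):                       # compound term (functor, args...)
        return (t[0],) + tuple(substitute(theta, a) for a in t[1:])
    return t

def unify(p, q, theta=None):
    theta = {} if theta is None else theta
    p, q = substitute(theta, p), substitute(theta, q)
    if p == q:
        return theta
    if is_variable(p):
        return None if occurs(p, q) else {**theta, p: q}    # occurs check
    if is_variable(q):
        return None if occurs(q, p) else {**theta, q: p}
    if isinstance(p, tuple) and isinstance(q, tuple) and len(p) == len(q) and p[0] == q[0]:
        for a, b in zip(p[1:], q[1:]):
            theta = unify(a, b, theta)
            if theta is None:
                return None                         # disagreement on non-variables: failure
        return theta
    return None

# parents(x, father(x), mother(Bill)) vs parents(Bill, father(Bill), y)
print(unify(("parents", "x", ("father", "x"), ("mother", "Bill")),
            ("parents", "Bill", ("father", "Bill"), "y")))
# -> {'x': 'Bill', 'y': ('mother', 'Bill')}

# parents(x, father(x), mother(Jane)) vs parents(Bill, father(y), mother(y))
print(unify(("parents", "x", ("father", "x"), ("mother", "Jane")),
            ("parents", "Bill", ("father", "y"), ("mother", "y"))))
# -> None (failure)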

Practice example: Did Curiosity kill the cat?
•Jack owns a dog. Every dog owner is an animal lover. No animal lover kills an animal. Either Jack or Curiosity killed the cat, who is named Tuna. Did Curiosity kill the cat?
•These can be represented as follows:
A. (∃x) Dog(x) ∧ Owns(Jack, x)
B. (∀x) ((∃y) Dog(y) ∧ Owns(x, y)) → AnimalLover(x)
C. (∀x) AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x, y))
D. Kills(Jack, Tuna) ∨ Kills(Curiosity, Tuna)
E. Cat(Tuna)
F. (∀x) Cat(x) → Animal(x)
G. Kills(Curiosity, Tuna)   (GOAL)

•Convert to clause form:
A1. Dog(D)
A2. Owns(Jack, D)
B. ¬Dog(y) ∨ ¬Owns(x, y) ∨ AnimalLover(x)
C. ¬AnimalLover(a) ∨ ¬Animal(b) ∨ ¬Kills(a, b)
D. Kills(Jack, Tuna) ∨ Kills(Curiosity, Tuna)
E. Cat(Tuna)
F. ¬Cat(z) ∨ Animal(z)
•Add the negation of the query:
¬G. ¬Kills(Curiosity, Tuna)
(D is a Skolem constant)

•The resolution refutation proof:
R1: ¬G, D, {}: Kills(Jack, Tuna)
R2: R1, C, {a/Jack, b/Tuna}: ¬AnimalLover(Jack) ∨ ¬Animal(Tuna)
R3: R2, B, {x/Jack}: ¬Dog(y) ∨ ¬Owns(Jack, y) ∨ ¬Animal(Tuna)
R4: R3, A1, {y/D}: ¬Owns(Jack, D) ∨ ¬Animal(Tuna)
R5: R4, A2, {}: ¬Animal(Tuna)
R6: R5, F, {z/Tuna}: ¬Cat(Tuna)
R7: R6, E, {}: FALSE

•The proof tree (abbreviating Kills as K, AnimalLover as AL, Animal as A, Dog as D, Owns as O, Cat as C, Jack as J, Tuna as T):
¬G with D, {}: R1: K(J,T)
R1 with C, {a/J, b/T}: R2: ¬AL(J) ∨ ¬A(T)
R2 with B, {x/J}: R3: ¬D(y) ∨ ¬O(J,y) ∨ ¬A(T)
R3 with A1, {y/D}: R4: ¬O(J,D) ∨ ¬A(T)
R4 with A2, {}: R5: ¬A(T)
R5 with F, {z/T}: R6: ¬C(T)
R6 with E, {}: R7: FALSE


Production Rules
•Condition-Action Pairs
•IF this condition (or premise or antecedent) occurs, THEN some action (or result, or conclusion, or consequence) will (or should) occur
•IF the traffic light is red AND you have stopped, THEN a right turn is OK

Production Rules
•Each production rule in a knowledge base represents an autonomous chunk of expertise
•When combined and fed to the inference engine, the set of rules behaves synergistically
•Rules can be viewed as a simulation of the cognitive behaviour of human experts
•Rules represent a model of actual human behaviour
•Predominant technique used in expert systems, often in conjunction with frames

Forms of Rules
•IF premise, THEN conclusion
•IF your income is high, THEN your chance of being audited by the Inland Revenue is high
•Conclusion, IF premise
•Your chance of being audited is high, IF your income is high

Forms of Rules
•Inclusion of ELSE
•IF your income is high, OR your deductions are unusual, THEN your chance of being audited is high, OR ELSE your chance of being audited is low
•More complex rules
•IF credit rating is high AND salary is more than £30,000, OR assets are more than £75,000, AND pay history is not "poor," THEN approve a loan up to £10,000, and list the loan in category "B."
•The action part may have more information: THEN "approve the loan" and "refer to an agent"

Characteristics of Rules
            First Part                            Second Part
Names       Premise, Antecedent, Situation, IF    Conclusion, Consequence, Action, THEN
Nature      Conditions, similar to declarative    Resolutions, similar to procedural
            knowledge                             knowledge
Size        Can have many IFs                     Usually only one conclusion
Statement   AND statements: all conditions must be true for a conclusion to be true
            OR statements: if any condition is true, the conclusion is true

Rule-based Inference
•Production rules are typically used as part of a production system
•Production systems provide pattern-directed control of the reasoning process
•Production systems have:
•Productions: a set of production rules
•Working Memory (WM): a description of the current state of the world
•A recognise-act cycle

Production Systems
(Diagram: a set of production rules C1 → A1, C2 → A2, C3 → A3, ..., Cn → An is matched against Working Memory; the activated rules form the Conflict Set, conflict resolution selects one rule to fire, and the fired action updates Working Memory and the Environment.)

Recognise-Act Cycle
•Patterns in WM are matched against production rule conditions
•Matching (activated) rules form the conflict set
•One of the matching rules is selected (conflict resolution) and fired
•The action of the rule is performed
•The contents of WM are updated
•The cycle repeats with the updated WM
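A toy recognise-act loop might look like the sketch below (the rule set, fact names and first-match conflict resolution are invented for illustration; a real production system such as CLIPS adds proper pattern matching and richer conflict-resolution strategies):

# Each rule: (name, condition facts, facts added by the action).
rules = [
    ("r1", {"it_is_raining"}, {"take_umbrella"}),
    ("r2", {"take_umbrella"}, {"stay_dry"}),
]
working_memory = {"it_is_raining"}

while True:
    # Recognise: rules whose conditions hold and whose action would add something new.
    conflict_set = [r for r in rules
                    if r[1] <= working_memory and not r[2] <= working_memory]
    if not conflict_set:
        break
    name, _, adds = conflict_set[0]      # conflict resolution: just take the first match
    working_memory |= adds               # act: update working memory
    print("fired", name, "->", sorted(working_memory))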

Conflict Resolution
•Reasoning in a production system can be viewed as a type of search
•The selection strategy for rules from the conflict set controls the search
•The production system maintains the conflict set as an agenda
•An ordered list of activated rules (those with their conditions satisfied) which have not yet been executed
•The conflict resolution strategy determines where a newly-activated rule is inserted

Salience
•Rules may be given a precedence order by assigning a salience value
•Newly activated rules are placed in the agenda above all rules of lower salience, and below all rules of higher salience
•Rules with higher salience are executed first
•The conflict resolution strategy applies between rules of the same salience
•If salience and the conflict resolution strategy can't determine which rule is to be executed next, a rule is chosen at random from the most highly ranked rules

Conflict Resolution Strategies
•Depth-first: newly activated rules are placed above other rules in the agenda
•Breadth-first: newly activated rules are placed below other rules
•Specificity: rules are ordered by the number of conditions in the LHS (simple-first or complex-first)
•Least recently fired: fire the rule that was last fired the longest time ago
•Refraction: don't fire a rule unless the WM patterns that match its conditions have been modified
•Recency: rules are ordered by the timestamps on the facts that match their conditions

Salience
•Salience facilitates the modularization of expert systems in which modules work at different levels of abstraction
•Over-use of salience can complicate a system
•It imposes an explicit ordering on rule execution
•It makes the behaviour of modified systems less predictable
•Rule of thumb: if two rules have the same salience, are in the same module, and are activated concurrently, then the order in which they are executed should not matter

Common Types of Rules
•Knowledge rules, or declarative rules, state all the facts and relationships about a problem
•Inference rules, or procedural rules, advise on how to solve a problem, given that certain facts are known
•Inference rules contain rules about rules (metarules)
•Knowledge rules are stored in the knowledge base
•Inference rules become part of the inference engine

Major Advantages of Rules
•Easy to understand (natural form of knowledge)
•Easy to derive inference and explanations
•Easy to modify and maintain
•Easy to combine with uncertainty
•Rules are frequently independent

Major Limitations of Rules
•Complex knowledge requires many rules
•Search limitations in systems with many rules


Semantic Networks
•A semantic network is a structure for representing knowledge as a pattern of interconnected nodes and arcs
•Nodes in the net represent concepts of entities, attributes, events, values
•Arcs in the network represent relationships that hold between the concepts

Semantic Networks
•Semantic networks can show inheritance
•Relationship types: is-a, has-a
•Semantic nets are a visual representation of relationships
•They can be combined with other representation methods

Semantic Networks
(Example network: Animal - can breathe, can eat, has skin. Bird is-a Animal - can fly, has wings, has feathers. Canary is-a Bird - can sing, is yellow. Ostrich is-a Bird - runs fast, cannot fly, is tall. Fish is-a Animal - can swim, has fins, has gills. Salmon is-a Fish - swims upstream, is pink, is edible.)
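Such an is-a hierarchy is easy to sketch as a data structure with property inheritance (the dictionary encoding below is illustrative, not a standard library):

# Each node: its is-a parent (or None) and its locally attached properties.
net = {
    "Animal":  {"is_a": None,     "props": {"can breathe", "can eat", "has skin"}},
    "Bird":    {"is_a": "Animal", "props": {"can fly", "has wings", "has feathers"}},
    "Canary":  {"is_a": "Bird",   "props": {"can sing", "is yellow"}},
    "Ostrich": {"is_a": "Bird",   "props": {"runs fast", "cannot fly", "is tall"}},
}

def properties(node):
    """Collect properties up the is-a chain (inheritance)."""
    collected = set()
    while node is not None:
        collected |= net[node]["props"]
        node = net[node]["is_a"]
    return collected

print("has feathers" in properties("Canary"))    # True, inherited from Bird
print("can breathe" in properties("Ostrich"))    # True, inherited from Animal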

Semantic Networks
(Example network: is-a links connect BEAGLE to HOUND, HOUND to DOG, COLLIE and SHEEPDOG to DOG, and DOG to ANIMAL; instance links connect SNOOPY to BEAGLE and LASSIE to COLLIE. Attribute links include: ANIMAL moves and breathes, DOG barks and has a tail, HOUND tracks, size: medium, size: small, SHEEPDOG works sheep. SNOOPY is a friend of CHARLIE BROWN, and both SNOOPY and CHARLIE BROWN are instances of FICTIONAL CHARACTER.)

Semantic Networks
What does or should a node represent?
•A class of objects?
•An instance of a class?
•The canonical instance of a class?
•The set of all instances of a class?

Semantic Networks
•The semantics of links: some links define new objects and some relate existing objects, particularly those dealing with 'intrinsic' characteristics of a given object
•How does one deal with the problems of comparison between objects (or classes of objects) through their attributes?
•Essentially the problem of comparing object instances
•What mechanisms are there to handle quantification in semantic network formalisms?

Transitive inference, but...
•Clyde is an elephant, an elephant is a mammal: Clyde is a mammal.
•The US President is elected every 4 years, Bush is US President: Bush is elected every 4 years
•My car is a Ford, Ford is a car company: my car is a car company


Frames
•A frame is a knowledge representation formalism based on the idea of a frame of reference.
•A frame is a data structure that includes all the knowledge about a particular object.
•Frames are organised in a hierarchy; a form of object-oriented programming for AI and ES.
•Each frame describes one object.
•Special terminology.
M. Minsky (1974) A Framework for Representing Knowledge, MIT-AI Laboratory Memo 306
17-03-2021 13218CSC305J_AI_UNIT3

Frames
•There are two types of frame:
•Class Frame
•Individual or Instance Frame
•A frame carries with it a set of slots that can represent objects that are normally associated with the subject of the frame.
17-03-2021 13318CSC305J_AI_UNIT3

Frames
•The slots can then point to other slots or frames. That gives frame systems the ability to carry out inheritance and simple kinds of data manipulation.
•The use of procedures - also called demons in the literature - helps in the incorporation of substantial amounts of procedural knowledge into a particular frame-oriented knowledge base
17-03-2021 13418CSC305J_AI_UNIT3

Frame-based model of semantic memory
•Knowledge is organised in a data structure
•Slots in the structure are instantiated with particular values for a given instance of data
•...translation to OO terminology:
•frames == classes or objects
•slots == variables/methods
17-03-2021 13518CSC305J_AI_UNIT3

General Knowledge as Frames
DOG
Fixed
legs: 4
Default
diet: carnivorous
sound: bark
Variable
size:
colour:
COLLIE
Fixed
breed of: DOG
type: sheepdog
Default
size: 65cm
Variable
colour:
17-03-2021 13618CSC305J_AI_UNIT3

MAMMAL:
subclass: ANIMAL
has_part: head
ELEPHANT:
subclass: MAMMAL
colour: grey
size: large
Nellie:
instance: ELEPHANT
likes: apples
17-03-2021 13718CSC305J_AI_UNIT3
General Knowledge as Frames

Logic underlies Frames
•∀x mammal(x) ⇒ has_part(x, head)
•∀x elephant(x) ⇒ mammal(x)
•elephant(clyde)
Therefore:
mammal(clyde)
has_part(clyde, head)
17-03-2021 13818CSC305J_AI_UNIT3

MAMMAL:
subclass: ANIMAL
has_part: head
*furry: yes
ELEPHANT:
subclass: MAMMAL
has_trunk: yes
*colour: grey
*size: large
*furry: no
Clyde:
instance: ELEPHANT
colour: pink
owner: Fred
Nellie:
instance: ELEPHANT
size: small
17-03-2021 13918CSC305J_AI_UNIT3
Logic underlies Frames

Frames (Contd.)
•Can represent subclass and instance relationships (both sometimes called ISA or "is a")
•Properties (e.g. colour and size) can be referred to as slots, and slot values (e.g. grey, large) as slot fillers
•Objects can inherit all properties of the parent class (therefore Nellie is grey and large)
•But they can also inherit properties which are only typical (usually called defaults, here starred), and these can be overridden
•For example, a mammal is typically furry, but this is not so for an elephant; a worked sketch of this inheritance is given after this list
17-03-2021 14018CSC305J_AI_UNIT3
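A minimal sketch (an illustration, not the formalism itself) of slot lookup with inheritance and default (*) slots for the MAMMAL/ELEPHANT example, assuming simple Python dictionaries as frames.

```python
# Sketch of frame slot lookup with inheritance; '*' marks default slots that
# more specific frames or instances may override (as in the example above).
frames = {
    "MAMMAL":   {"subclass": "ANIMAL", "has_part": "head", "*furry": "yes"},
    "ELEPHANT": {"subclass": "MAMMAL", "has_trunk": "yes",
                 "*colour": "grey", "*size": "large", "*furry": "no"},
    "Clyde":    {"instance": "ELEPHANT", "colour": "pink", "owner": "Fred"},
    "Nellie":   {"instance": "ELEPHANT", "size": "small"},
}

def get_slot(name, slot):
    """Look for an exact or default (*) slot locally, then climb the hierarchy."""
    while name in frames:
        frame = frames[name]
        if slot in frame:
            return frame[slot]
        if "*" + slot in frame:
            return frame["*" + slot]
        name = frame.get("instance") or frame.get("subclass")
    return None

print(get_slot("Clyde", "colour"))    # pink  (local value overrides the default grey)
print(get_slot("Nellie", "colour"))   # grey  (inherited default)
print(get_slot("Nellie", "furry"))    # no    (ELEPHANT overrides MAMMAL's default yes)
print(get_slot("Clyde", "has_part"))  # head  (inherited from MAMMAL)
```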

•Provide a concise, structural representation of knowledge in a natural manner
•A frame encompasses complex objects, entire situations or a management problem as a single entity
•Frame knowledge is partitioned into slots
•A slot can describe declarative knowledge or procedural knowledge
•Hierarchy of Frames: Inheritance
17-03-2021 14118CSC305J_AI_UNIT3
Frames (Contd.)

Capabilities of Frames
•Ability to clearly document information about a domain model; for example, a plant's machines and their associated attributes
•Related ability to constrain the allowable values of an attribute
•Modularity of information, permitting ease of system expansion and maintenance
•More readable and consistent syntax for referencing domain objects in the rules
17-03-2021 14218CSC305J_AI_UNIT3

Capabilities of Frames
•Platform for building a graphic interface with object graphics
•Mechanism to restrict the scope of facts considered during forward or backward chaining
•Access to a mechanism that supports the inheritance of information down a class hierarchy
•Used as the underlying model in standards for accessing KBs (Open Knowledge Base Connectivity - OKBC)
17-03-2021 14318CSC305J_AI_UNIT3

Summary
•Frames have been used in conjunction with other, less well-grounded, representation formalisms, like production systems, when used to build pre-operational or operational expert systems
•Frames cannot be used efficiently to organise a whole computation
17-03-2021 14418CSC305J_AI_UNIT3

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 145

Types of Inference
•Deduction
•Induction
•Abduction
17-03-2021 15CS401-Artificial Intelligence 146

Deduction
•Deriving a conclusion from given axioms and facts
•Also called logical inference or truth preservation
Axiom –All kids are naughty
Fact/Premise –Riya is a kid
Conclusion –Riya is naughty
17-03-2021 15CS401-Artificial Intelligence 147

Induction
•Deriving general rule or axiom from background knowledge and
observations
•Riya is a kid
•Riya is naughty
General axiom which is derived is:
Kids are naughty
17-03-2021 15CS401-Artificial Intelligence 148

Abduction
•A premise is derived from a known axiom and some observations
•All kids are naughty
•Riya is naughty
Inference
Riya is a kid
17-03-2021 15CS401-Artificial Intelligence 149

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 150

Uncertain knowledge and reasoning
•In real life, it is not always possible to determine the state of the environment, as it might not be clear. Due to partially observable or non-deterministic environments, agents may need to handle uncertainty and deal with it.
•Uncertain data: data that is missing, unreliable, inconsistent or noisy
•Uncertain knowledge: when the available knowledge has multiple causes leading to multiple effects, or there is incomplete knowledge of causality in the domain
•Uncertain knowledge representation: representations which provide a restricted model of the real system, or have limited expressiveness
•Inference: in case of incomplete or default reasoning methods, conclusions drawn might not be completely accurate. Let's understand this better with the help of an example:
•IF primary infection is bacteriacea
•AND site of infection is sterile
•AND entry point is gastrointestinal tract
•THEN organism is bacteroid (0.7).
•In such uncertain situations, the agent does not guarantee a solution but acts on its own assumptions and probabilities and gives some degree of belief that it will reach the required solution.
17-03-2021 18CSC305J_AI_UNIT3 151

•For example, in the case of medical diagnosis consider the rule Toothache ⇒ Cavity. This is not complete, as not all patients having toothache have cavities. So we can write a more generalized rule Toothache ⇒ Cavity ∨ Gum problems ∨ Abscess ... To make this rule complete, we would have to list all the possible causes of toothache. But this is not feasible for the following reasons:
•Laziness - it will require a lot of effort to list the complete set of antecedents and consequents to make the rules complete.
•Theoretical ignorance - medical science does not have a complete theory for the domain.
•Practical ignorance - it might not be practical that all tests have been or can be conducted for the patients.
•Such uncertain situations can be dealt with using:
Probability theory
Truth Maintenance systems
Fuzzy logic.
17-03-2021 18CSC305J_AI_UNIT3 152
Uncertain knowledge and reasoning

Probability
•Probability is the degree of likeliness that an event will occur. It provides a certain degree of belief in case of uncertain situations. It is defined over a set of events U and assigns a value P(e), i.e. the probability of occurrence of event e, in the range [0, 1]. Here each sentence is labeled with a real number in the range of 0 to 1; 0 means the sentence is false and 1 means it is true.
•Conditional probability or posterior probability is the probability of event A given that B has already occurred.
•P(A|B) = (P(B|A) * P(A)) / P(B)
•For example, P(It will rain tomorrow | It is raining today) represents the conditional probability of it raining tomorrow given that it is raining today.
•P(A|B) + P(NOT(A)|B) = 1
•Joint probability is the probability of two events happening simultaneously, like rolling two dice or tossing two coins together; for independent events it is the product of the individual probabilities. For example, the probability of getting 2 on one die and 6 on the other is 1/36. Joint probability has wide use in various fields such as physics and astronomy, and comes into play when there are two independent events. The full joint probability distribution specifies the probability of each complete assignment of values to random variables.
17-03-2021 18CSC305J_AI_UNIT3 153
Uncertain knowledge and reasoning

Bayes Theorem
•The naive Bayes classifier built on it assumes that every pair of features being classified is independent of each other. It calculates the probability P(A|B), where A is the class of possible outcomes and B is the given instance which has to be classified.
•P(A|B) = P(B|A) * P(A) / P(B)
•P(A|B) = Probability that A is happening, given that B has
occurred (posterior probability)
•P(A) = prior probability of class
•P(B) = prior probability of predictor
•P(B|A) = likelihood
17-03-2021 18CSC305J_AI_UNIT3 154
Uncertain knowledge and reasoning

Consider the following data.
Depending on the weather
(sunny, rainy or overcast), the
children will play(Y) or not
play(N).
Here, the total number of
observations = 14
Probability that children will
play given that weather is
sunny :
P( Yes| Sunny) = P(Sunny |
Yes) * P(Yes) / P(Sunny)
= 0.33 * 0.64 / 0.36
= 0.59
17-03-2021 18CSC305J_AI_UNIT3 155
Uncertain knowledge and reasoning
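The same calculation can be checked with a few lines of Python; the raw counts used here are an assumption consistent with the probabilities quoted above (9 of the 14 observations are Yes, 5 are Sunny, 3 are both).

```python
# Bayes' rule check for P(Yes | Sunny) = P(Sunny | Yes) * P(Yes) / P(Sunny).
# Counts are an assumption consistent with the slide's figures:
# 9 of 14 observations are Yes, 5 are Sunny, 3 are both Sunny and Yes.
total, yes, sunny, sunny_and_yes = 14, 9, 5, 3

p_yes = yes / total                      # ≈ 0.64
p_sunny = sunny / total                  # ≈ 0.36
p_sunny_given_yes = sunny_and_yes / yes  # ≈ 0.33

print(round(p_sunny_given_yes * p_yes / p_sunny, 2))
# 0.6 exactly; the slide's 0.59 comes from using the rounded inputs 0.33, 0.64, 0.36.
```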

It is a probabilistic graphical model for representing an uncertain domain and for reasoning under uncertainty. It consists of nodes representing variables and arcs representing direct connections between them, called causal correlations. It represents conditional dependencies between random variables through a Directed Acyclic Graph (DAG). A belief network consists of:
1. A DAG with nodes labeled with variable names,
2. A domain for each random variable,
3. A set of conditional probability tables for each variable given its parents, including prior probabilities for nodes with no parents.
17-03-2021 18CSC305J_AI_UNIT3 156
Uncertain knowledge and reasoning

•Let's have a look at the steps followed.
1. Identify the nodes, which are the random variables, and the possible values they can have from the probability domain. The nodes can be boolean (True/False), have ordered values or integral values.
2. Structure - used to represent causal relationships between the variables. Two nodes are connected if one affects or causes the other, and the arc points towards the effect. For instance, if it is windy or cloudy, it rains. There is a direct link from Windy/Cloudy to Rains. Similarly, from Rains to Wet grass and Leave, i.e., if it rains, the grass will be wet and leave is taken from work.
17-03-2021 18CSC305J_AI_UNIT3 157
Uncertain knowledge and reasoning

3. Probability - quantifying the relationship between nodes.
Conditional probability:
•P(A∧B) = P(A|B) * P(B)
•P(A|B) = P(B|A) * P(A) / P(B)
•P(B|A) = P(A|B) * P(B) / P(A)
•Joint probability: the full joint distribution over all the variables factorises into a product of such conditional probabilities, one per node given its parents.
4. Markov property - Bayesian belief networks require the assumption of the Markov property, i.e., all direct dependencies are shown by using arcs. Here there is no direct connection between it being Cloudy and taking a leave, but there is one via Rains. Belief networks which have the Markov property are also called independence maps.
17-03-2021 18CSC305J_AI_UNIT3 158
Uncertain knowledge and reasoning

Inference in Belief Networks
•Bayesian networks provide various types of representations of the probability distribution over their variables. They can be conditioned over any subset of their variables, using any direction of reasoning.
•For example, one can perform diagnostic reasoning: if the grass is wet, or leave is taken from work, one can update one's belief about whether it has rained. In this case reasoning occurs in the opposite direction to the network arcs. Or one can perform predictive reasoning, i.e., reasoning from new information about causes to new beliefs about effects, following the direction of the arcs: if it rains, the grass will get wet and leave may be taken. Another form of reasoning involves reasoning about the mutual causes of a common effect; this is called intercausal reasoning.
•There are two possible causes of an effect, represented in the form of a 'V'. For example, the common effect 'Rains' can be caused by two reasons, 'Windy' and 'Cloudy.' Initially the two causes are independent of each other, but if it rains, it will increase the probability of both causes. Assume that we also know it was windy: this information explains the rainfall and lowers the probability that it was cloudy.
17-03-2021 18CSC305J_AI_UNIT3 159
Uncertain knowledge and reasoning

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 160

Bayesian probability and belief network
•A Bayesian belief network is a key computer technology for dealing with probabilistic events and for solving problems which have uncertainty. We can define a Bayesian network as:
•"A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."
•It is also called a Bayes network, belief network, decision network, or Bayesian model.
17-03-2021 18CSC305J_AI_UNIT3 161

Bayesian probability and belief network
•Bayesian networks are probabilistic, because these networks are built from a probability distribution, and also use probability theory for prediction and anomaly detection.
•Real world applications are probabilistic in nature, and to represent the relationship between multiple events, we need a Bayesian network. It can also be used in various tasks including prediction, anomaly detection, diagnostics, automated insight, reasoning, time series prediction, and decision making under uncertainty.
•A Bayesian network can be used for building models from data and experts' opinions, and it consists of two parts:
Directed Acyclic Graph
Table of conditional probabilities.
•The generalized form of Bayesian network that represents and solves decision problems under uncertain knowledge is known as an Influence diagram.
17-03-2021 18CSC305J_AI_UNIT3 162

Bayesian probability and belief network
Directed Acyclic Graph
A Bayesian network graph is made up of nodes and arcs (directed links), where:
Each node corresponds to a random variable, and a variable can be continuous or discrete.
Arcs or directed arrows represent the causal relationships or conditional probabilities between random variables. These directed links or arrows connect pairs of nodes in the graph.
These links represent that one node directly influences the other node; if there is no directed link, the nodes are independent of each other.
In the diagram above, A, B, C, and D are random variables represented by the nodes of the network graph.
If we are considering node B, which is connected to node A by a directed arrow, then node A is called the parent of node B.
Node C is independent of node A.
17-03-2021 18CSC305J_AI_UNIT3 163

CONDITIONAL PROBABILITY
•The Bayesian network has mainly two components:
Causal Component
Actual numbers
•Each node in the Bayesian network has a conditional probability distribution P(Xi | Parent(Xi)), which determines the effect of the parent on that node.
•A Bayesian network is based on the joint probability distribution and conditional probability. So let's first understand the joint probability distribution:
17-03-2021 18CSC305J_AI_UNIT3 164
Bayesian probability and belief network

Joint probability distribution:
•If we have variables x1, x2, x3, ....., xn, then the probabilities of the different combinations of x1, x2, x3, .., xn are known as the joint probability distribution.
•P[x1, x2, x3, ....., xn] can be written as follows in terms of the joint probability distribution:
= P[x1 | x2, x3, ....., xn] P[x2, x3, ....., xn]
= P[x1 | x2, x3, ....., xn] P[x2 | x3, ....., xn] .... P[xn-1 | xn] P[xn]
•In general, for each variable Xi we can write the equation as:
•P(Xi | Xi-1, ........., X1) = P(Xi | Parents(Xi))
17-03-2021 18CSC305J_AI_UNIT3 165
Bayesian probability and belief network

Explanation of Bayesian network:
•Let's understand the Bayesian network through an example by creating a directed acyclic graph:
Example: Harry installed a new burglar alarm at his home to detect burglary. The alarm reliably responds at detecting a burglary but also responds to minor earthquakes. Harry has two neighbors, David and Sophia, who have taken the responsibility to inform Harry at work when they hear the alarm. David always calls Harry when he hears the alarm, but sometimes he gets confused with the phone ringing and calls at that time too. On the other hand, Sophia likes to listen to loud music, so sometimes she misses hearing the alarm. Here we would like to compute the probability of the Burglary Alarm.
17-03-2021 18CSC305J_AI_UNIT3 166
Bayesian probability and belief network

Problem:
•Calculate the probability that the alarm has sounded, but there is neither a burglary nor an earthquake, and both David and Sophia called Harry.
Solution:
•The Bayesian network for the above problem is given below. The network structure shows that burglary and earthquake are the parent nodes of the alarm and directly affect the probability of the alarm going off, whereas David's and Sophia's calls depend on the alarm probability.
•The network represents that our assumptions do not directly perceive the burglary and do not notice the minor earthquake, and they also do not confer before calling.
•The conditional distributions for each node are given as a conditional probabilities table, or CPT.
•Each row in the CPT must sum to 1 because all the entries in the table represent an exhaustive set of cases for the variable.
•In a CPT, a boolean variable with k boolean parents contains 2^k probabilities. Hence, if there are two parents, then the CPT will contain 4 probability values.
17-03-2021 18CSC305J_AI_UNIT3 167
Bayesian probability and belief network

List of all events occurring in this network:
•Burglary (B)
•Earthquake(E)
•Alarm(A)
•David Calls(D)
•Sophia calls(S)
We can write the events of the problem statement in the form of probability: P[D, S, A, B, E]. We can rewrite this probability statement using the joint probability distribution:
•P[D, S, A, B, E]= P[D | S, A, B, E]. P[S, A, B, E]
•=P[D | S, A, B, E]. P[S | A, B, E]. P[A, B, E]
•= P [D| A]. P [ S| A, B, E]. P[ A, B, E]
•= P[D | A]. P[ S | A]. P[A| B, E]. P[B, E]
•= P[D | A ]. P[S | A]. P[A| B, E]. P[B |E]. P[E]
17-03-2021 18CSC305J_AI_UNIT3 168
Bayesian probability and belief network

Let's take the observed probabilities for the Burglary and Earthquake components:
P(B = True) = 0.002, which is the probability of burglary.
P(B = False) = 0.998, which is the probability of no burglary.
P(E = True) = 0.001, which is the probability of a minor earthquake.
P(E = False) = 0.999, which is the probability that an earthquake did not occur.
17-03-2021 18CSC305J_AI_UNIT3 169
Bayesian probability and belief network

We can provide the conditional probabilities as per the tables below.
Conditional probability table for Alarm A:
The conditional probability of Alarm A depends on Burglary and Earthquake:
B      E      P(A=True)  P(A=False)
True   True   0.94       0.06
True   False  0.95       0.05
False  True   0.31       0.69
False  False  0.001      0.999
17-03-2021 18CSC305J_AI_UNIT3 170
Bayesian probability and belief network

Conditional probability table for David Calls:
The conditional probability that David will call depends on the probability of the Alarm.
A      P(D=True)  P(D=False)
True   0.91       0.09
False  0.05       0.95
17-03-2021 18CSC305J_AI_UNIT3 171
Bayesian probability and belief network

Conditional probability table for Sophia Calls:
The conditional probability that Sophia calls depends on its parent node "Alarm."
A      P(S=True)  P(S=False)
True   0.75       0.25
False  0.02       0.98
17-03-2021 18CSC305J_AI_UNIT3 172
Bayesian probability and belief network

•From the formula of the joint distribution, we can write the problem statement in the form of a probability distribution:
•P(S, D, A, ¬B, ¬E) = P(S|A) * P(D|A) * P(A|¬B ∧ ¬E) * P(¬B) * P(¬E)
= 0.75 * 0.91 * 0.001 * 0.998 * 0.999
= 0.00068045
Hence, a Bayesian network can answer any query about the domain by using the joint distribution.
•The semantics of Bayesian networks:
•There are two ways to understand the semantics of the Bayesian network, which are given below:
1. To understand the network as the representation of the joint probability distribution.
•It is helpful to understand how to construct the network.
2. To understand the network as an encoding of a collection of conditional independence statements.
•It is helpful in designing inference procedures.
17-03-2021 18CSC305J_AI_UNIT3 173
Bayesian probability and belief network
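A short sketch of how this joint-probability query can be evaluated in Python from the CPTs above; the dictionary layout is just one possible encoding.

```python
# Sketch: evaluate P(S, D, A, ¬B, ¬E) for the burglar-alarm network from the CPTs above.
p_b = {True: 0.002, False: 0.998}                 # P(Burglary)
p_e = {True: 0.001, False: 0.999}                 # P(Earthquake)
p_a = {(True, True): 0.94, (True, False): 0.95,   # P(Alarm=True | B, E)
       (False, True): 0.31, (False, False): 0.001}
p_d = {True: 0.91, False: 0.05}                   # P(David calls=True | Alarm)
p_s = {True: 0.75, False: 0.02}                   # P(Sophia calls=True | Alarm)

def joint(d, s, a, b, e):
    """P(D, S, A, B, E) = P(D|A) P(S|A) P(A|B,E) P(B) P(E)."""
    pa = p_a[(b, e)] if a else 1 - p_a[(b, e)]
    pd = p_d[a] if d else 1 - p_d[a]
    ps = p_s[a] if s else 1 - p_s[a]
    return pd * ps * pa * p_b[b] * p_e[e]

print(joint(d=True, s=True, a=True, b=False, e=False))  # ≈ 0.00068045
```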

Bayes' theorem in Artificial intelligence
Bayes' theorem:
•Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning, which determines the probability of an event with uncertain knowledge.
•In probability theory, it relates the conditional probability and marginal probabilities of two random events.
•Bayes' theorem was named after the British mathematician Thomas Bayes. Bayesian inference is an application of Bayes' theorem, which is fundamental to Bayesian statistics.
•It is a way to calculate the value of P(B|A) with the knowledge of P(A|B).
•Bayes' theorem allows updating the probability prediction of an event by observing new information of the real world.
17-03-2021 18CSC305J_AI_UNIT3 174

Example: If cancer corresponds to one's age, then by using Bayes' theorem we can determine the probability of cancer more accurately with the help of age.
•Bayes' theorem can be derived using the product rule and the conditional probability of event A with known event B:
•As from the product rule we can write:
•P(A ⋀ B) = P(A|B) P(B)
•Similarly, the probability of event B with known event A:
•P(A ⋀ B) = P(B|A) P(A)
•Equating the right hand sides of both equations, we get:
P(A|B) = P(B|A) P(A) / P(B)    ...(a)
The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.
•It shows the simple relationship between joint and conditional probabilities. Here,
•P(A|B) is known as the posterior, which we need to calculate; it is read as the probability of hypothesis A when evidence B has occurred.
•P(B|A) is called the likelihood: assuming the hypothesis is true, we calculate the probability of the evidence.
•P(A) is called the prior probability: the probability of the hypothesis before considering the evidence.
•P(B) is called the marginal probability: the pure probability of the evidence.
•In equation (a), in general, we can write P(B) = Σi P(Ai) * P(B|Ai); hence Bayes' rule can be written as:
•P(Ai|B) = P(B|Ai) * P(Ai) / Σk P(Ak) * P(B|Ak)
•where A1, A2, A3, ........, An is a set of mutually exclusive and exhaustive events.
17-03-2021 18CSC305J_AI_UNIT3 175
Bayes' theorem in Artificial intelligence

Applying Bayes' rule:
•Bayes' rule allows us to compute the single term P(B|A) in terms of P(A|B), P(B), and P(A). This is very useful
in cases where we have a good probability of these three terms and want to determine the fourth one.
Suppose we want to perceive the effect of some unknown cause and want to compute that cause; then Bayes' rule becomes P(cause|effect) = P(effect|cause) * P(cause) / P(effect).
Example-1:
Question: what is the probability that a patient has diseases meningitis with a stiff neck?
•Given Data:
A doctor is aware that disease meningitis causes a patient to have a stiff neck, and it occurs 80% of
the time. He is also aware of some more facts, which are given as follows:
The Known probability that a patient has meningitis disease is 1/30,000.
The Known probability that a patient has a stiff neck is 2%.
Let a be the proposition that the patient has a stiff neck and b be the proposition that the patient has meningitis. Then we can calculate the following:
P(a|b) = 0.8
P(b) = 1/30000
P(a) = 0.02
P(b|a) = P(a|b) * P(b) / P(a) = (0.8 * 1/30000) / 0.02 = 1/750 ≈ 0.00133
•Hence, we can assume that 1 patient out of 750 patients has meningitis disease with a stiff neck.
17-03-2021 18CSC305J_AI_UNIT3 176
ApplyingBayes' theorem in Artificial intelligence

ApplyingBayes' theorem in Artificial
intelligence
Example-2:
Question: From a standard deck of playing cards, a single card is drawn. The probability that the
card is king is 4/52, then calculate posterior probability P(King|Face), which means the drawn
face card is a king card.
Solution:
P(King): probability that the card is a King = 4/52 = 1/13
P(Face): probability that a card is a face card = 12/52 = 3/13
P(Face|King): probability of a face card given that it is a king = 1
Putting all values in equation (i) we get:
P(King|Face) = P(Face|King) * P(King) / P(Face) = 1 * (1/13) / (3/13) = 1/3
17-03-2021 18CSC305J_AI_UNIT3 177
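Both worked examples can be reproduced with a small Bayes' rule helper; this is only a sketch of the arithmetic above.

```python
# Bayes' rule P(A|B) = P(B|A) * P(A) / P(B), applied to the two examples above.
def bayes(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

# Example 1: meningitis given a stiff neck.
print(bayes(0.8, 1 / 30000, 0.02))   # ≈ 0.00133, i.e. about 1 patient in 750

# Example 2: king given a face card.
print(bayes(1.0, 4 / 52, 12 / 52))   # 0.333..., i.e. 1/3
```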

Application of Bayes' theorem in Artificial
intelligence:
Following are some applications of Bayes' theorem:
•It is used to calculate the next step of the robot when the already
executed step is given.
•Bayes' theorem is helpful in weather forecasting.
•It can solve the Monty Hall problem.
17-03-2021 18CSC305J_AI_UNIT3 178

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 179

Probabilistic reasoning
Uncertainty:
•Till now, we have learned knowledge representation using first-order logic and propositional logic with certainty, which means we were sure about the predicates. With this knowledge representation we might write A→B, which means if A is true then B is true; but consider a situation where we are not sure whether A is true or not. Then we cannot express this statement, and this situation is called uncertainty.
•So to represent uncertain knowledge, where we are not sure about the predicates, we need uncertain reasoning or probabilistic reasoning.
Causes of uncertainty:
Following are some leading causes of uncertainty to occur in the real world.
•Information occurred from unreliable sources.
•Experimental Errors
•Equipment fault
•Temperature variation
•Climate change.
17-03-2021 18CSC305J_AI_UNIT3 180

Probabilistic reasoning:
•Probabilistic reasoning is a way of knowledge representation where we apply the concept of probability to indicate the uncertainty in knowledge. In probabilistic reasoning, we combine probability theory with logic to handle the uncertainty.
•We use probability in probabilistic reasoning because it provides a way to handle the uncertainty that is the result of someone's laziness and ignorance.
•In the real world, there are lots of scenarios where the certainty of something is not confirmed, such as "It will rain today," "behavior of someone for some situations," "a match between two teams or two players." These are probable sentences for which we can assume that they will happen but are not sure about them, so here we use probabilistic reasoning.
Need of probabilistic reasoning in AI:
•When there are unpredictable outcomes.
•When specifications or possibilities of predicates become too large to handle.
•When an unknown error occurs during an experiment.
In probabilistic reasoning, there are two ways to solve problems with uncertain knowledge:
•Bayes' rule
•Bayesian Statistics
17-03-2021 18CSC305J_AI_UNIT3 181
Probabilistic reasoning

As probabilistic reasoning uses probability and related terms, before understanding probabilistic reasoning let's understand some common terms:
Probability: Probability can be defined as a chance that an uncertain event will occur. It is the numerical measure of the likelihood that an event will occur. The value of probability always remains between 0 and 1.
0 ≤ P(A) ≤ 1, where P(A) is the probability of an event A.
P(A) = 0 indicates total uncertainty in an event A.
P(A) = 1 indicates total certainty in an event A.
We can find the probability of an uncertain event by using the formula below.
P(¬A) = probability of the event not happening.
P(¬A) + P(A) = 1.
•Event: Each possible outcome of a variable is called an event.
•Sample space: The collection of all possible events is called sample space.
•Random variables: Random variables are used to represent the events and objects in the real world.
•Prior probability: The prior probability of an event is the probability computed before observing new information.
•Posterior probability: The probability that is calculated after all evidence or information has been taken into account. It is a combination of prior probability and new information.
17-03-2021 18CSC305J_AI_UNIT3 182
Probabilistic reasoning

Conditional probability:
•Conditional probability is the probability of an event occurring when another event has already happened.
Let's suppose we want to calculate the probability of event A when event B has already occurred, "the probability of A under the conditions of B"; it can be written as:
P(A|B) = P(A⋀B) / P(B)
•Where P(A⋀B) = joint probability of A and B
•P(B) = marginal probability of B.
•If the probability of A is given and we need to find the probability of B, then it will be given as: P(B|A) = P(A⋀B) / P(A)
•It can be explained by using a Venn diagram, where B is the event that has occurred, so the sample space is reduced to set B, and now we can only calculate event A given that event B has already occurred, by dividing the probability of P(A⋀B) by P(B).
17-03-2021 18CSC305J_AI_UNIT3 183
Probabilistic reasoning

Example:
In a class, 70% of the students like English and 40% of the students like both English and Mathematics. What percentage of the students who like English also like Mathematics?
Solution:
Let A be the event that a student likes Mathematics and B be the event that a student likes English.
P(A|B) = P(A⋀B) / P(B) = 0.4 / 0.7 = 0.57 (approximately)
Hence, 57% of the students who like English also like Mathematics.
17-03-2021 18CSC305J_AI_UNIT3 184
Probabilistic reasoning
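The arithmetic, as a one-line check in Python:

```python
# Conditional probability P(A|B) = P(A ∧ B) / P(B) for the English/Mathematics example.
p_b = 0.70        # P(student likes English)
p_a_and_b = 0.40  # P(student likes English and Mathematics)
print(round(p_a_and_b / p_b, 2))  # 0.57 -> about 57%
```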

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 185

Probabilistic reasoning over time
Definition
•Probabilistic reasoning is the representation of knowledge where the concept of probability is applied to indicate the uncertainty
in knowledge.
Reasons to use Probabilistic Reasoning in AI
•Some reasons to use this way of representing knowledge are given below:
•When we are unsure of the predicates.
•When the possibilities of predicates become too large to list down.
•When during an experiment, it is proven that an error occurs.
•Probability of a given event = chances of that event occurring / total number of events.
Notations and Properties
•Consider the statement S: March will be cold.
•Probability is often denoted as P(predicate).
•Considering the chances of March being cold is only 30%, therefore P(S) = 0.3
•Probability always takes a value between 0 and 1. If the probability is 0, then the event will never occur and if it is 1, then it will
occur for sure.
•Then, P(¬S) = 0.7
•This means, the probability of March not being cold is 70%.
•Property 1: P(S) + P(¬S) = 1
17-03-2021 18CSC305J_AI_UNIT3 186

Consider the statement T: April will be cold.
•Then, P(S∧T) means the probability of S AND T, i.e., the probability of March and April being cold.
•P(S∨T) means the probability of S OR T, i.e., the probability of March or April being cold.
•Property 2: P(S∨T) = P(S) + P(T) - P(S∧T)
•Proofs for the properties are not given here and you can work them out by yourselves using Venn diagrams.
Conditional Property
•Conditional probability is defined as the probability of a given event given another event. It is denoted by P(B|A) and is read as: "Probability of B given probability of A."
•Property 3: P(B|A) = P(B∧A) / P(A).
Bayes' Theorem
•Given P(A), P(B) and P(A|B), then
•P(B|A) = P(A|B) x P(B) / P(A)
17-03-2021 18CSC305J_AI_UNIT3 187
Probabilistic reasoning over time

Bayesian Network
When designing a Bayesian network, we keep the local probability table at each node.
Bayesian Network - Example
Consider a Bayesian network as given below:
This Bayesian network tells us the reason a particular person cannot study. It may be either because of no electricity or because of his lack of interest. The corresponding probabilities are written in front of the causes.
17-03-2021 18CSC305J_AI_UNIT3 188
Probabilistic reasoning over time

No Electricity  Not Interested  P(Cannot Study)
F               F               P(No electricity = F) x P(Not Interested = F) = 0.8 x 0.7 = 0.56
F               T               P(No electricity = F) x P(Not Interested = T) = 0.8 x 0.3 = 0.24
T               F               P(No electricity = T) x P(Not Interested = F) = 0.2 x 0.7 = 0.14
T               T               P(No electricity = T) x P(Not Interested = T) = 0.2 x 0.3 = 0.06
Now, as you can see, no cause is dependent on the other and they directly contribute to the person's inability to study. To plot the third table, we consider four cases. Since the causes are independent, their corresponding probabilities can be multiplied directly, as in the sketch below.
17-03-2021 18CSC305J_AI_UNIT3 189
Probabilistic reasoning over time
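A small sketch reproducing the table above by multiplying the independent cause probabilities:

```python
# Sketch: joint probabilities of the two independent causes in the 'cannot study' network,
# using the priors above (P(No electricity=T) = 0.2, P(Not interested=T) = 0.3).
p_no_electricity = {True: 0.2, False: 0.8}
p_not_interested = {True: 0.3, False: 0.7}

for ne in (False, True):
    for ni in (False, True):
        # The causes are independent, so their probabilities multiply directly.
        print(ne, ni, round(p_no_electricity[ne] * p_not_interested[ni], 2))
# Prints 0.56, 0.24, 0.14 and 0.06 -- the four entries of the table above.
```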

The updated Bayesian Network is:
17-03-2021 18CSC305J_AI_UNIT3 190
Probabilistic reasoning over time

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge
reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax ,semantics and
inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-
Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences-
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and
belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 191

Data Mining
•In artificial intelligence and machine learning, data mining, or knowledge discovery in databases, is the nontrivial extraction of implicit, previously unknown and potentially useful information from data. Statistical methods are used that enable trends and other relationships to be identified in large databases.
•The major reason that data mining has attracted attention is the wide availability of vast amounts of data, and the need for turning such data into useful information and knowledge. The knowledge gained can be used for applications ranging from risk monitoring, business management, production control, market analysis, engineering, and science exploration.
In general, three types of data mining techniques are used: association, regression, and classification.
Association analysis
•Association analysis is the discovery of association rules showing attribute-value conditions that occur frequently together in a given set of data. Association analysis is widely used to identify the correlation of individual products within shopping carts.
Regression analysis
•Regression analysis creates models that explain dependent variables through the analysis of independent variables. As an example, the prediction for a product's sales performance can be created by correlating the product price and the average customer income level.
Classification and prediction
•Classification is the process of designing a set of models to predict the class of objects whose class label is unknown. The derived model may be represented in various forms, such as if-then rules, decision trees, or mathematical formulas.
17-03-2021 18CSC305J_AI_UNIT3 192

•A decision tree is a flow-chart-like tree structure where each node denotes a test on an attribute value, each branch represents an outcome of the test, and each tree leaf represents a class or class distribution. Decision trees can be converted to classification rules (a minimal sketch follows this slide).
•Classification can be used for predicting the class label of data objects. Prediction encompasses the identification of distribution trends based on the available data.
The data mining process consists of an iterative sequence of the following steps:
•Data coherence and cleaning to remove noise and inconsistent data
•Data integration such that multiple data sources may be combined
•Data selection where data relevant to the analysis are retrieved
•Data transformation where data are consolidated into forms appropriate for mining
•Pattern recognition and statistical techniques are applied to extract patterns
•Pattern evaluation to identify interesting patterns representing knowledge
•Visualization techniques are used to present mined knowledge to users
17-03-2021 18CSC305J_AI_UNIT3 193
Data Mining
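A minimal classification sketch with a decision tree, assuming scikit-learn is available; the tiny dataset and feature encoding are invented purely for illustration.

```python
# Minimal classification sketch with a decision tree (assumes scikit-learn is installed).
# The tiny dataset and its encoding are invented for illustration only.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[0, 0], [0, 1], [1, 0], [1, 1]]    # e.g. [price_level, income_level]
y = ["no-buy", "buy", "no-buy", "buy"]  # class labels to predict

model = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(model.predict([[1, 1]]))  # predicted class for an object with unknown label
print(export_text(model))       # the learned tree rendered as if-then style rules
```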

Limits of Data Mining
•GIGO (garbage in, garbage out) is almost always referenced with respect to data mining, as the quality of the knowledge gained through data mining is dependent on the quality of the historical data. We know data inconsistencies and dealing with multiple data sources represent large problems in data management.
•Data cleaning techniques are used to deal with detecting and removing errors and inconsistencies to improve data quality; however, detecting these inconsistencies is extremely difficult. How can we identify a transaction that is incorrectly labeled as suspicious? Learning from incorrect data leads to inaccurate models.
•Another limitation of data mining is that it only extracts knowledge limited to the specific set of historical data, and answers can only be obtained and interpreted with regard to previous trends learned from the data.
•This limits one's ability to benefit from new trends. Because the decision tree is trained specifically on the historical dataset, it does not account for personalization within the tree. Additionally, data mining models (decision trees, rules, clusters) are non-incremental and do not adapt while in production.
17-03-2021 18CSC305J_AI_UNIT3 194
Data Mining

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax, semantics and inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 195

196
Operation of Fuzzy System
Crisp Input → Fuzzification (input membership functions) → Fuzzy Input → Rule Evaluation (rules / inferences) → Fuzzy Output → Defuzzification (output membership functions) → Crisp Output

197
Building Fuzzy Systems
⚫Fuzzification
⚫Inference
⚫Composition
⚫Defuzzification

198
Fuzzification
⚫Establishes the fact base of the fuzzy system. It identifies the input and output of the system, defines appropriate IF-THEN rules, and uses raw data to derive a membership function.
⚫Consider an air conditioning system that determines the best circulation level by sampling temperature and moisture levels. The inputs are the current temperature and moisture level. The fuzzy system outputs the best air circulation level: "none", "low", or "high". The following fuzzy rules are used:
1. If the room is hot, circulate the air a lot.
2. If the room is cool, do not circulate the air.
3. If the room is cool and moist, circulate the air slightly.
⚫A knowledge engineer determines membership functions that map temperatures to fuzzy values and map moisture measurements to fuzzy values.

199
Inference
⚫Evaluates all rules and determines their truth values. If an input does not precisely correspond to an IF-THEN rule, partial matching of the input data is used to interpolate an answer.
⚫Continuing the example, suppose that the system has measured temperature and moisture levels and mapped them to the fuzzy values of 0.7 and 0.1 respectively. The system now infers the truth of each fuzzy rule.
⚫To do this a simple method called MAX-MIN is used. This method sets the fuzzy value of the THEN clause to the fuzzy value of the IF clause. Thus, the method infers fuzzy values of 0.7, 0.1, and 0.1 for rules 1, 2, and 3 respectively.

200
Composition
⚫Combines all fuzzy conclusions obtained by inference into a single conclusion. Since different fuzzy rules might have different conclusions, consider all rules.
⚫Continuing the example, each inference suggests a different action:
⚫rule 1 suggests a "high" circulation level
⚫rule 2 suggests turning off air circulation
⚫rule 3 suggests a "low" circulation level.
⚫A simple MAX-MIN method of selection is used where the maximum fuzzy value of the inferences is used as the final conclusion. So, composition selects a fuzzy value of 0.7 since this was the highest fuzzy value associated with the inference conclusions.

201
Defuzzification
⚫Converts the fuzzy value obtained from composition into a "crisp" value. This process is often complex since the fuzzy set might not translate directly into a crisp value. Defuzzification is necessary, since controllers of physical systems require discrete signals.
⚫Continuing the example, composition outputs a fuzzy value of 0.7. This imprecise value is not directly useful since the air circulation levels are "none", "low", and "high". The defuzzification process converts the fuzzy output of 0.7 into one of the air circulation levels. In this case it is clear that a fuzzy output of 0.7 indicates that the circulation should be set to "high".

202
Defuzzification
⚫There are many defuzzification methods. Two of the more common techniques are the centroid and maximum methods.
⚫In the centroid method, the crisp value of the output variable is computed by finding the variable value of the center of gravity of the membership function for the fuzzy value.
⚫In the maximum method, one of the variable values at which the fuzzy subset has its maximum truth value is chosen as the crisp value for the output variable.
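A compact sketch of the four stages for the air-circulation example. The fuzzified values 0.7 and 0.1 are the ones quoted on the slides, and the rule for "cool" is assumed to evaluate to 0.1 as stated; a real system would obtain these values from membership functions.

```python
# Fuzzy pipeline sketch for the air-circulation example.
# 1. Fuzzification: values quoted on the slides (hot = 0.7, moist = 0.1; 'cool' is
#    assumed to evaluate to 0.1, matching the stated rule strengths).
is_hot, is_cool, is_moist = 0.7, 0.1, 0.1

# 2. Inference (MAX-MIN: each THEN clause takes the fuzzy value of its IF clause).
rules = {
    "high": is_hot,                  # rule 1: hot -> circulate the air a lot
    "none": is_cool,                 # rule 2: cool -> do not circulate
    "low":  min(is_cool, is_moist),  # rule 3: cool and moist -> circulate slightly
}                                    # -> {'high': 0.7, 'none': 0.1, 'low': 0.1}

# 3. Composition: keep the conclusion with the maximum fuzzy value.
level, strength = max(rules.items(), key=lambda kv: kv[1])

# 4. Defuzzification (maximum method): report the crisp circulation level.
print(level, strength)  # high 0.7
```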

Example: Design of Fuzzy Expert System – Washing Machine
203

Fuzzification
Given inputs x1 and x2, find the weight values associated with each input membership function.
(Input membership function labels: NM, NS, Z, PS, PM. For input X1 the two non-zero memberships are 0.2 and 0.7, giving W = [0, 0, 0.2, 0.7, 0].)

Fuzzy Rules (washing time as a function of dirt and grease)
              Not Greasy   Medium   Greasy
Small Dirt    Very Short   Medium   Long
Medium Dirt   Short        Medium   Long
Large Dirt    Medium       Long     Very Long

Defuzzification
Output membership functions for washing time: Very Short, Short, Medium, Long, Very Long (defined over roughly 5 to 60 minutes, with breakpoints around 10, 20, 30 and 40).
Washing Time Medium = (Y - 20)/(30 - 20)
Washing Time Long = (Y - 30)/(40 - 30)
With rule strength X1 and X2 = 0.5:
(Y - 20)/(30 - 20) = 0.5
Y - 20 = 0.5 * 10 = 5
Y = 25 mins

Knowledge and Reasoning
Table of Contents
•Knowledge and reasoning-Approaches and issues of knowledge reasoning-Knowledge base agents
•Logic Basics-Logic-Propositional logic-syntax, semantics and inferences-Propositional logic-Reasoning patterns
•Unification and Resolution-Knowledge representation using rules-Knowledge representation using semantic nets
•Knowledge representation using frames-Inferences
•Uncertain Knowledge and reasoning-Methods-Bayesian probability and belief network
•Probabilistic reasoning-Probabilistic reasoning over time
•Other uncertain techniques-Data mining-Fuzzy logic-Dempster-Shafer theory
17-03-2021 18CSC305J_AI_UNIT3 207

Dempster Shafer Theory
•Dempster-Shafer Theory was given by Arthur P. Dempster in 1967 and his student Glenn Shafer in 1976.
This theory was released for the following reasons:
•Bayesian theory is only concerned about single evidences.
•Bayesian probability cannot describe ignorance.
•DST is an evidence theory; it combines all possible outcomes of the problem. Hence it is used to solve problems where there may be a chance that different evidence will lead to a different result.
The uncertainty in this model is given by:
•Consider all possible outcomes.
•Belief will lead to believing in some possibility by bringing out some evidence.
•Plausibility will make evidence compatible with possible outcomes.
17-03-2021 18CSC305J_AI_UNIT3 208

For eg:-
Let us consider a room where four persons are present: A, B, C, D. Suddenly the lights go out, and when the lights come back, B has died due to a stab in his back with a knife. No one came into the room, no one left the room, and B did not commit suicide. Then we have to find out who the murderer is.
To solve this, there are the following possibilities:
•Either {A} or {C} or {D} has killed him.
•Either {A, C} or {C, D} or {A, D} have killed him.
•Or the three of them killed him, i.e. {A, C, D}.
•None of them killed him {ø} (let us say).
These will be the possible evidences by which we can find the murderer by measure of plausibility.
Using the above example we can say:
Set of possible conclusions (P): {p1, p2, ..., pn}
where P is the set of possible conclusions; it must be exhaustive, meaning at least one pi must be true, and the pi must be mutually exclusive.
The power set will contain 2^n elements, where n is the number of elements in the possible set.
For eg:-
If P = {a, b, c}, then the power set is given as
{ø, {a}, {b}, {c}, {a, b}, {b, c}, {a, c}, {a, b, c}} = 2^3 = 8 elements.
17-03-2021 18CSC305J_AI_UNIT3 209
Dempster Shafer Theory

•Mass function m(K): It is an interpretation of m({K or B}), i.e., it means there is evidence for {K or B} which cannot be divided among more specific beliefs for K and B.
•Belief in K: The belief in an element K of the power set is the sum of the masses of elements which are subsets of K. This can be explained through an example:
Let's say K = {a, b, c}
Bel(K) = m(a) + m(b) + m(c) + m(a,b) + m(a,c) + m(b,c) + m(a,b,c)
•Plausibility in K: It is the sum of the masses of sets that intersect with K,
i.e., Pl(K) = m(a) + m(b) + m(c) + m(a,b) + m(b,c) + m(a,c) + m(a,b,c)
Characteristics of Dempster-Shafer Theory:
•It handles the ignorance part such that the probabilities of all events aggregate to 1.
•Ignorance is reduced in this theory by adding more and more evidences.
•The combination rule is used to combine various types of possibilities.
17-03-2021 18CSC305J_AI_UNIT3 210
Dempster Shafer Theory

Advantages:
•As we add more information, uncertainty interval reduces.
•DST has much lower level of ignorance.
•Diagnose Hierarchies can be represented using this.
•Person dealing with such problems is free to think about evidences.
Disadvantages:
•In this, the computation effort is high, as we have to deal with 2^n sets.
17-03-2021 18CSC305J_AI_UNIT3 211
Dempster Shafer Theory

Example: 4 people (B, J, S and K) are locked in a room when the light goes out.
When the light comes on, K is dead, stabbed with a knife.
Not suicide (stabbed in the back).
No one entered the room.
Assume only one killer.
Θ = {B, J, S}
P(Θ) = {ø, {B}, {J}, {S}, {B,J}, {B,S}, {S,J}, {B,J,S}}
Detectives, after examining the crime scene, assign mass probabilities to various elements of the power set:
Event                      Mass
No one is guilty           0
B is guilty                0.1
J is guilty                0.2
S is guilty                0.1
Either B or J is guilty    0.1
Either B or S is guilty    0.1
Either S or J is guilty    0.3
One of the 3 is guilty     0.1
17-03-2021 18CSC305J_AI_UNIT3 212
Dempster Shafer Problem

Belief in A:
The belief in an element A of the power set is the sum of the masses of elements which are subsets of A (including A itself).
Ex: Given A = {q1, q2, q3},
Bel(A) = m(q1) + m(q2) + m(q3) + m(q1,q2) + m(q2,q3) + m(q1,q3) + m(q1,q2,q3)
Ex: Given the above mass assignments,
Bel(B) = m(B) = 0.1
Bel(B,J) = m(B) + m(J) + m(B,J) = 0.1 + 0.2 + 0.1 = 0.4
RESULT:
A:       {B}   {J}   {S}   {B,J}  {B,S}  {S,J}  {B,J,S}
m(A):    0.1   0.2   0.1   0.1    0.1    0.3    0.1
Bel(A):  0.1   0.2   0.1   0.4    0.3    0.6    1.0
17-03-2021 18CSC305J_AI_UNIT3 213
Dempster Shafer Problem

Plausibility of A: Pl(A)
The plausibility of an element A, Pl(A), is the sum of all the masses of the sets that intersect with the set A.
Ex: Pl(B,J) = m(B) + m(J) + m(B,J) + m(B,S) + m(J,S) + m(B,J,S) = 0.9
All results:
A:      {B}   {J}   {S}   {B,J}  {B,S}  {S,J}  {B,J,S}
m(A):   0.1   0.2   0.1   0.1    0.1    0.3    0.1
Pl(A):  0.4   0.7   0.6   0.9    0.8    0.9    1.0
17-03-2021 18CSC305J_AI_UNIT3 214
Dempster Shafer Problem
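A short sketch that recomputes Bel and Pl (and hence the belief intervals shown on the next slides) from the mass assignment above:

```python
# Sketch: recompute Bel(A) and Pl(A) from the masses of the murder example above.
masses = {
    frozenset("B"): 0.1, frozenset("J"): 0.2, frozenset("S"): 0.1,
    frozenset("BJ"): 0.1, frozenset("BS"): 0.1, frozenset("SJ"): 0.3,
    frozenset("BJS"): 0.1,
}

def bel(a):
    a = frozenset(a)
    return sum(m for s, m in masses.items() if s <= a)  # masses of subsets of A

def pl(a):
    a = frozenset(a)
    return sum(m for s, m in masses.items() if s & a)   # masses of sets intersecting A

for subset in ("B", "J", "S", "BJ", "BS", "SJ", "BJS"):
    print(subset, round(bel(subset), 2), round(pl(subset), 2))
# e.g. B -> Bel 0.1, Pl 0.4; BJ -> Bel 0.4, Pl 0.9; BJS -> Bel 1.0, Pl 1.0
```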

Disbelief (or Doubt) in A: Dis(A)
The disbelief in A is simply Bel(¬A).
It is calculated by summing all masses of elements which do not intersect with A.
Dis(A) = 1 - Pl(A), or equivalently Pl(A) = 1 - Dis(A)
Belief Interval of A:
The certainty associated with a given subset A is defined by the belief interval [Bel(A), Pl(A)].
Ex: The belief interval of {B,S} is [0.3, 0.8]
A:       {B}   {J}   {S}   {B,J}  {B,S}  {S,J}  {B,J,S}
m(A):    0.1   0.2   0.1   0.1    0.1    0.3    0.1
Bel(A):  0.1   0.2   0.1   0.4    0.3    0.6    1.0
Pl(A):   0.4   0.7   0.6   0.9    0.8    0.9    1.0
Dis(A):  0.6   0.3   0.4   0.1    0.2    0.1    0.0
17-03-2021 18CSC305J_AI_UNIT3 215
Dempster Shafer Problem

Belief intervals and probability
The probability of A falls somewhere between Bel(A) and Pl(A).
- Bel(A) represents the evidence we have for A directly, so P(A) cannot be less than this value.
- Pl(A) represents the maximum share of the evidence we could possibly have, if, for all sets that intersect with A, the intersecting part is actually valid. So, Pl(A) is the maximum possible value of P(A).
Belief intervals allow Dempster-Shafer theory to reason about the degree of certainty or uncertainty of our beliefs.
A small difference between belief and plausibility shows that we are certain about our belief.
A large difference shows that we are uncertain about our belief.
However, even with a zero-width interval, this does not mean we know which conclusion is right.
17-03-2021 18CSC305J_AI_UNIT3 216
Dempster Shafer Problem
A:               {B}        {J}        {S}        {B,J}      {B,S}      {S,J}      {B,J,S}
m(A):            0.1        0.2        0.1        0.1        0.1        0.3        0.1
Bel(A):          0.1        0.2        0.1        0.4        0.3        0.6        1.0
Pl(A):           0.4        0.7        0.6        0.9        0.8        0.9        1.0
Belief interval: [0.1,0.4]  [0.2,0.7]  [0.1,0.6]  [0.4,0.9]  [0.3,0.8]  [0.6,0.9]  [1,1]

Thank You
17-03-2021 18CSC305J_AI_UNIT3 217