Basic Concept Of Probability

guest45a926 83,267 views 49 slides Nov 27, 2007

Slide Content

IBS-09-SL RM 501 – Ranjit Goswami
Slide 1
Basic Probability

Slide 2
Introduction
•Probability is the study of randomness and uncertainty.
•In the early days, probability was associated with games of chance (gambling).

Slide 3
Simple Games Involving Probability
Game: A fair die is rolled. If the result is 2, 3, or 4, you win $1; if it is 5, you win $2; but if it is 1 or 6, you lose $3.
Should you play this game?
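One way to decide is to compute the expected winnings per game; a minimal Python sketch (not part of the slides):

```python
from fractions import Fraction

# Payoff for each face of a fair die:
# 2, 3, 4 -> win $1; 5 -> win $2; 1, 6 -> lose $3.
payoff = {1: -3, 2: 1, 3: 1, 4: 1, 5: 2, 6: -3}

# Each face has probability 1/6.
expected_value = sum(Fraction(1, 6) * v for v in payoff.values())

print(expected_value)  # -1/6: on average you lose about 17 cents per game
```

Since the expected value is negative, you should not play this game (in the long run).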

Slide 4
Random Experiment
•A random experiment is a process whose outcome is uncertain.
Examples:
•Tossing a coin once or several times
•Picking a card or cards from a deck
•Measuring the temperature of patients
•...

Slide 5
Events & Sample Spaces
Sample Space
The sample space Ω is the set of all possible outcomes.
Simple Events
The individual outcomes are called simple events.
Event
An event is any collection of one or more simple events.

Slide 6
Example
Experiment: Toss a coin 3 times.
•Sample space
Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
•Examples of events include
•A = {HHH, HHT, HTH, THH} = {at least two heads}
•B = {HTT, THT, TTH} = {exactly two tails}.
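The sample space and both events can be enumerated directly; a short Python sketch:

```python
from itertools import product

# All outcomes of tossing a coin 3 times
sample_space = [''.join(t) for t in product('HT', repeat=3)]
assert len(sample_space) == 8

# A = at least two heads, B = exactly two tails
A = {s for s in sample_space if s.count('H') >= 2}
B = {s for s in sample_space if s.count('T') == 2}

print(sorted(A))  # ['HHH', 'HHT', 'HTH', 'THH']
print(sorted(B))  # ['HTT', 'THT', 'TTH']
```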

Slide 7
Basic Concepts (from Set Theory)
•The union of two events A and B, A ∪ B, is the event consisting of all outcomes that are either in A or in B or in both events.
•The complement of an event A, Aᶜ, is the set of all outcomes in Ω that are not in A.
•The intersection of two events A and B, A ∩ B, is the event consisting of all outcomes that are in both events.
•When two events A and B have no outcomes in common, they are said to be mutually exclusive, or disjoint, events.

Slide 8
Example
Experiment: Toss a coin 10 times and observe the number of heads.
•Let A = {0, 2, 4, 6, 8, 10}.
•B = {1, 3, 5, 7, 9}, C = {0, 1, 2, 3, 4, 5}.
•A ∪ B = {0, 1, …, 10} = Ω.
•A ∩ B contains no outcomes. So A and B are mutually exclusive.
•Cᶜ = {6, 7, 8, 9, 10}, A ∩ C = {0, 2, 4}.
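This example can be checked with Python's built-in set operations (a sketch, not from the slides):

```python
omega = set(range(11))        # possible head counts in 10 tosses: 0..10
A = {0, 2, 4, 6, 8, 10}
B = {1, 3, 5, 7, 9}
C = {0, 1, 2, 3, 4, 5}

assert A | B == omega                 # union covers the whole sample space
assert A & B == set()                 # disjoint: A and B are mutually exclusive
assert omega - C == {6, 7, 8, 9, 10}  # complement of C
assert A & C == {0, 2, 4}             # intersection
```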

Slide 9
Rules
•Commutative Laws:
•A ∪ B = B ∪ A, A ∩ B = B ∩ A
•Associative Laws:
•(A ∪ B) ∪ C = A ∪ (B ∪ C)
•(A ∩ B) ∩ C = A ∩ (B ∩ C)
•Distributive Laws:
•(A ∪ B) ∩ C = (A ∩ C) ∪ (B ∩ C)
•(A ∩ B) ∪ C = (A ∪ C) ∩ (B ∪ C)
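These set identities can be spot-checked on randomly generated sets; an illustrative Python sketch:

```python
import random

random.seed(0)
universe = set(range(20))

def rand_set():
    # A random subset of the universe
    return {x for x in universe if random.random() < 0.5}

for _ in range(100):
    A, B, C = rand_set(), rand_set(), rand_set()
    assert A | B == B | A and A & B == B & A      # commutative laws
    assert (A | B) | C == A | (B | C)             # associative law (union)
    assert (A & B) & C == A & (B & C)             # associative law (intersection)
    assert (A | B) & C == (A & C) | (B & C)       # distributive law 1
    assert (A & B) | C == (A | C) & (B | C)       # distributive law 2
```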

Slide 10
Venn Diagram
[Figure: Venn diagram of two overlapping events A and B; the shaded overlap is A ∩ B.]

Slide 11
Probability
•A probability is a number assigned to each subset (event) of a sample space Ω.
•Probability distributions satisfy the following rules:

Slide 12
Axioms of Probability
•For any event A, 0 ≤ P(A) ≤ 1.
•P(Ω) = 1.
•If A₁, A₂, …, Aₙ is a partition of A, then
P(A) = P(A₁) + P(A₂) + … + P(Aₙ)
(A₁, A₂, …, Aₙ is called a partition of A if A₁ ∪ A₂ ∪ … ∪ Aₙ = A and A₁, A₂, …, Aₙ are mutually exclusive.)

Slide 13
Properties of Probability
•For any event A, P(Aᶜ) = 1 − P(A).
•If A ⊆ B, then P(A) ≤ P(B).
•For any two events A and B,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
For three events A, B, and C,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

Slide 14
Example
•In a certain population, 10% of the people are rich, 5% are famous, and 3% are both rich and famous. A person is randomly selected from this population. What is the chance that the person is
•not rich?
•rich but not famous?
•either rich or famous?
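A short sketch computing the three answers with the complement rule and inclusion-exclusion (the percentages come from the example above):

```python
p_rich, p_famous, p_both = 0.10, 0.05, 0.03

p_not_rich = 1 - p_rich                        # complement rule
p_rich_not_famous = p_rich - p_both            # rich minus the overlap
p_rich_or_famous = p_rich + p_famous - p_both  # inclusion-exclusion

print(round(p_not_rich, 2))          # 0.9
print(round(p_rich_not_famous, 2))   # 0.07
print(round(p_rich_or_famous, 2))    # 0.12
```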

Slide 15
Intuitive Development (agrees with axioms)
•Intuitively, the probability of an event a could be defined as:
P(a) = lim (n → ∞) N(a)/n,
where N(a) is the number of times event a happens in n trials.

Slide 16
Here We Go Again: Not So Basic Probability

Slide 17
More Formal:
•Ω is the Sample Space:
•Contains all possible outcomes of an experiment
•ω in Ω is a single outcome
•A in Ω is a set of outcomes of interest

Slide 18
Independence
•The probability of independent events A, B and C is given by:
P(A,B,C) = P(A)P(B)P(C)
A and B are independent if knowing that A has happened does not say anything about B happening.
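Independence can be verified exactly on a small sample space; an illustrative sketch using two fair dice (an example invented here, not taken from the slides):

```python
from itertools import product
from fractions import Fraction

# Sample space: ordered pairs from rolling two fair dice
omega = list(product(range(1, 7), repeat=2))

A = {(i, j) for i, j in omega if i % 2 == 0}  # first die is even
B = {(i, j) for i, j in omega if j >= 5}      # second die is 5 or 6

def P(event):
    # Equally likely outcomes: probability = favorable / total
    return Fraction(len(event), len(omega))

# Independence: P(A ∩ B) = P(A) * P(B)
assert P(A & B) == P(A) * P(B)
print(P(A), P(B), P(A & B))  # 1/2 1/3 1/6
```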

Slide 19
Bayes Theorem
•Provides a way to convert a-priori probabilities to a-posteriori probabilities:
P(A | B) = P(B | A) P(A) / P(B)

Slide 20
Conditional Probability
•One of the most useful concepts!
P(A | B) = P(A ∩ B) / P(B)
[Figure: Venn diagram of events A and B illustrating conditioning on B.]


Slide 21
Bayes Theorem
•Provides a way to convert a-priori probabilities to a-posteriori probabilities:
P(A | B) = P(B | A) P(A) / P(B)
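As an illustration (reusing the rich-and-famous numbers from the earlier example; this computation is not on the slides), Bayes' theorem recovers the probability of being rich given that a person is famous:

```python
p_rich, p_famous, p_both = 0.10, 0.05, 0.03

# Conditional probability from the definition: P(famous | rich) = P(both) / P(rich)
p_famous_given_rich = p_both / p_rich

# Bayes' theorem: P(rich | famous) = P(famous | rich) P(rich) / P(famous)
p_rich_given_famous = p_famous_given_rich * p_rich / p_famous

print(round(p_rich_given_famous, 2))  # 0.6
```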

Slide 22
Using Partitions:
•If events Aᵢ are mutually exclusive and partition Ω:
P(B) = Σᵢ P(B | Aᵢ) P(Aᵢ)

Slide 23
Random Variables
•A (scalar) random variable X is a function that maps the outcome of a random event into real scalar values:
ω ↦ X(ω)

Slide 24
Random Variables Distributions
•Cumulative Probability Distribution (CDF):
F_X(x) = P(X ≤ x)
•Probability Density Function (PDF):
f_X(x) = dF_X(x)/dx

Slide 25
Random Distributions:
•From the two previous equations:
P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁) = ∫ from x₁ to x₂ of f_X(x) dx

Slide 26
Uniform Distribution
•A R.V. X that is uniformly distributed between x₁ and x₂ has density function:
f_X(x) = 1/(x₂ − x₁) for x₁ ≤ x ≤ x₂, and 0 otherwise
[Figure: rectangular density of height 1/(x₂ − x₁) between x₁ and x₂.]

Slide 27
Gaussian (Normal) Distribution
•A R.V. X that is normally distributed has density function:
f_X(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²))
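A minimal sketch evaluating this density (assuming the standard normal form above, with mean μ and standard deviation σ):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) evaluated at x."""
    coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# Peak of the standard normal density: 1/sqrt(2*pi)
print(round(normal_pdf(0.0), 4))  # 0.3989
```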


Slide 28
Statistical Characterizations
•Expectation (Mean Value, First Moment):
E[X] = ∫ x f_X(x) dx
•Second Moment:
E[X²] = ∫ x² f_X(x) dx

Slide 29
Statistical Characterizations
•Variance of X:
Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²
•Standard Deviation of X:
σ_X = √Var(X)

Slide 30
Mean Estimation from Samples
•Given a set of N samples {xᵢ} from a distribution, we can estimate the mean of the distribution by:
x̄ = (1/N) Σᵢ xᵢ

Slide 31
Variance Estimation from Samples
•Given a set of N samples from a distribution, we can estimate the variance of the distribution by (the unbiased estimator):
s² = (1/(N − 1)) Σᵢ (xᵢ − x̄)²
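Both estimators can be sketched together. The sample below is drawn from Uniform(0, 1), whose true mean is 0.5 and true variance is 1/12 ≈ 0.0833 (an illustration, not from the slides):

```python
import random

random.seed(42)

# Draw N samples from Uniform(0, 1)
samples = [random.random() for _ in range(100_000)]
N = len(samples)

# Sample mean: (1/N) * sum(x_i)
mean_hat = sum(samples) / N

# Sample variance with the 1/(N - 1) (unbiased) normalization
var_hat = sum((x - mean_hat) ** 2 for x in samples) / (N - 1)

print(round(mean_hat, 2))  # close to 0.5
print(round(var_hat, 2))   # close to 0.08
```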

Slide 32
Pattern Classification

Slide 33
Chapter 1: Introduction to Pattern Recognition (Sections 1.1-1.6)
• Machine Perception
• An Example
• Pattern Recognition Systems
• The Design Cycle
• Learning and Adaptation
• Conclusion

Slide 34
Machine Perception
•Build a machine that can recognize patterns:
•Speech recognition
•Fingerprint identification
•OCR (Optical Character Recognition)
•DNA sequence identification

Slide 35
An Example
•“Sorting incoming fish on a conveyor according to species using optical sensing”
[Figure: decision between two species, sea bass and salmon.]

Slide 36
•Problem Analysis
•Set up a camera and take some sample images to extract
features
•Length
•Lightness
•Width
•Number and shape of fins
•Position of the mouth, etc…
•This is the set of all suggested features to explore for use in our
classifier!

Slide 37
• Preprocessing
•Use a segmentation operation to isolate fishes from one
another and from the background
•Information from a single fish is sent to a feature
extractor whose purpose is to reduce the data by
measuring certain features
•The features are passed to a classifier

Slide 38

Slide 39
•Classification
•Select the length of the fish as a possible feature for
discrimination

Slide 40

Slide 41
The length is a poor feature alone!
Select the lightness as a possible feature.

Slide 42

Slide 43
•Threshold decision boundary and cost relationship
•Move our decision boundary toward smaller values of lightness in order to minimize the cost (reduce the number of sea bass that are classified as salmon!)
Task of decision theory
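The cost-sensitive threshold choice can be sketched in Python. The lightness readings, the cost values, and the threshold grid below are all invented for illustration; the point is only that an asymmetric cost pulls the decision boundary toward smaller lightness:

```python
# Hypothetical lightness readings (illustrative, not the book's data)
sea_bass = [4.0, 4.5, 5.0, 5.5, 6.0, 6.5]
salmon = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5]

# Misclassifying a sea bass as salmon is assumed more costly than the reverse
COST_BASS_AS_SALMON = 5.0
COST_SALMON_AS_BASS = 1.0

def cost(threshold):
    # Decision rule: lightness below threshold -> salmon, otherwise sea bass
    bass_errors = sum(1 for x in sea_bass if x < threshold)
    salmon_errors = sum(1 for x in salmon if x >= threshold)
    return COST_BASS_AS_SALMON * bass_errors + COST_SALMON_AS_BASS * salmon_errors

# Sweep candidate thresholds and keep the cheapest one
candidates = [t / 10 for t in range(15, 70)]
best = min(candidates, key=cost)
print(best, cost(best))
```

With the asymmetric costs above, the cheapest threshold sits below the overlap region of the two classes, so few sea bass are ever labeled salmon.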

Slide 44
•Adopt the lightness and add the width of the fish:
Fish xᵀ = [x₁, x₂]
(x₁ = lightness, x₂ = width)

Slide 45

Slide 46
•We might add other features that are not correlated with the ones we already have. Care should be taken not to reduce the performance by adding such “noisy features”.
•Ideally, the best decision boundary should be the one which provides an optimal performance, such as in the following figure:

Slide 47

Slide 48
•However, our satisfaction is premature because the central aim of designing a classifier is to correctly classify novel input.

→ Issue of generalization!

Slide 49