Topic – 7: Uncertainty
Fall 2018
Tajim Md. Niamat Ullah Akhund
Lecturer
Department of Computer Science and Engineering
Daffodil International University
Email: tajim.cse@diu.edu.bd
Topic Contents
Acting under Uncertainty
Basic Probability Notation
The Axioms of Probability
Inference Using Full Joint Distributions
Independence
Bayes’ Rule and Its Use
Autonomous Agents
Real World Driving Agent
Sensors: camera, tachometer, engine status, temperature
Effectors: accelerator, brakes, steering
Reasoning & Decision Making
Model of: vehicle location & status, road status
Actions: change speed, change steering
Prior Knowledge: physics of movement, rules of road
Goals: drive home
Uncertainty in the World Model
The agent can never be completely certain about the state of the external world, since there is ambiguity and uncertainty.
Why?
sensors have limited precision
e.g. camera has only so many pixels to capture an image
sensors have limited accuracy
e.g. tachometer’s estimate of velocity is approximate
there are hidden variables that sensors can’t “see”
e.g. vehicle behind large truck or storm clouds approaching
the future is unknown and uncertain,
i.e. the agent cannot foresee all possible future events that may happen
Rules and Uncertainty
Say we have a rule:
if toothache then problem is cavity
But not all patients have toothaches due to cavities
so we could set up rules like:
if toothache and not(gum disease) and not(filling) and ...
then problem = cavity
This gets complicated; a better method would be:
if toothache then problem is cavity with 0.8 probability
or P(cavity | toothache) = 0.8
i.e. the probability of a cavity is 0.8, given that a toothache is all that is known
Uncertainty in the World Model
True uncertainty: rules are probabilistic in nature
e.g. rolling dice, flipping a coin
Laziness: too hard to determine exceptionless rules
takes too much work to determine all of the relevant factors
too hard to use the enormous rules that result
Theoretical ignorance: don't know all the rules
problem domain has no complete theory (medical diagnosis)
Practical ignorance: do know all the rules BUT
haven't collected all relevant information for a particular case
Logics
Logics are characterized by
what they commit to as "primitives".
Logic              | What Exists in World             | Knowledge States
Propositional      | facts                            | true/false/unknown
First-Order        | facts, objects, relations        | true/false/unknown
Temporal           | facts, objects, relations, times | true/false/unknown
Probability Theory | facts                            | degree of belief 0..1
Fuzzy              | degree of truth                  | degree of belief 0..1
Probability Theory
Probability theory serves as a formal means
for representing and reasoning with uncertain knowledge, and
for manipulating degrees of belief in a proposition (event, conclusion, diagnosis, etc.)
Utility Theory
Every state has a degree of usefulness or utility and
the agent will prefer states with higher utility.
Decision Theory
An agent is rational if and only if it chooses the action
that yields the highest expected utility, averaged over all
the possible outcomes of the action.
Decision theory = probability theory + utility theory
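The expected-utility idea can be made concrete with a short sketch; the actions, outcome probabilities, and utilities below are hypothetical, purely for illustration:

# Expected-utility decision: pick the action whose outcomes, weighted by
# their probabilities, give the highest average utility.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

actions = {
    "drive_highway": [(0.9, 100), (0.1, -500)],    # fast, small crash risk
    "drive_backroad": [(0.99, 60), (0.01, -500)],  # slower, safer
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
for a, outcomes in actions.items():
    print(a, expected_utility(outcomes))   # 40.0 vs 54.4
print("rational choice:", best)            # drive_backroad

The max over actions is exactly the decision-theoretic notion of rationality stated above.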
Kolmogorov's Axioms of Probability
1. 0 ≤ P(a) ≤ 1
   probabilities are between 0 and 1 inclusive
2. P(true) = 1, P(false) = 0
   probability 1 for propositions believed to be absolutely true;
   probability 0 for propositions believed to be absolutely false
3. P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
   the probability of a disjunction is the sum of the two probabilities minus the probability of their conjunction (“intersection”)
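A quick numerical check of the third axiom on a made-up two-proposition world (the four atomic-event probabilities below are hypothetical and sum to 1):

# Verify P(a or b) = P(a) + P(b) - P(a and b) on a tiny discrete world.
p = {
    (True, True): 0.2,   # a and b
    (True, False): 0.3,  # a and not b
    (False, True): 0.1,  # not a and b
    (False, False): 0.4, # neither
}

P_a = p[(True, True)] + p[(True, False)]                 # 0.5
P_b = p[(True, True)] + p[(False, True)]                 # 0.3
P_a_and_b = p[(True, True)]                              # 0.2
P_a_or_b = sum(v for (a, b), v in p.items() if a or b)   # 0.6

assert abs(P_a_or_b - (P_a + P_b - P_a_and_b)) < 1e-12
print(P_a_or_b)  # 0.6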
Inference Using Full Joint Distribution
Start with the joint probability distribution:
Inference by Enumeration
Start with a full joint probability distribution for the
Toothache, Cavity, Catch world:
P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
Inference by Enumeration
Start with the joint probability distribution:
Can also compute conditional probabilities:
P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
= (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.08 / 0.2 = 0.4
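The same enumeration can be scripted; the sketch below uses only the four toothache-column entries of the joint distribution quoted above (0.108, 0.012, 0.016, 0.064):

# Full-joint entries for the toothache column of the Toothache/Cavity/Catch
# world, as quoted on the slide: P(cavity, toothache, catch) = 0.108, etc.
joint_toothache = {
    ("cavity", "catch"): 0.108,
    ("cavity", "no_catch"): 0.012,
    ("no_cavity", "catch"): 0.016,
    ("no_cavity", "no_catch"): 0.064,
}

# Marginal: sum out Cavity and Catch.
p_toothache = sum(joint_toothache.values())                 # 0.2

# Conditional: P(no_cavity | toothache) = P(no_cavity, toothache) / P(toothache)
p_no_cavity_and_toothache = sum(
    v for (cav, _), v in joint_toothache.items() if cav == "no_cavity"
)                                                           # 0.08

print(p_toothache)                                          # 0.2
print(p_no_cavity_and_toothache / p_toothache)              # 0.4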
Independence
Independence between propositions a and b can be
written as:
P(a | b) = P(a) or P(b | a) = P(b) or P(a ∧ b) = P(a) P(b)
Independence assertions are usually based on knowledge of the domain.
As we have seen, they can dramatically reduce the
amount of information necessary to specify the full joint
distribution.
If the complete set of variables can be divided into
independent subsets, then the full joint can be factored
into separate joint distributions on those subsets.
For example, the joint distribution on the outcome of n
independent coin flips, P(C1, . . . , Cn), can be represented
as the product of n single-variable distributions P(Ci).
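A minimal sketch of that factorization for the coin-flip case: the joint probability of any particular sequence of n independent flips is just the product of the individual per-flip probabilities.

from functools import reduce

# Each flip is an independent variable with P(heads) = p_heads, so the full
# joint over n flips factors into n single-variable distributions.
def sequence_probability(flips, p_heads=0.5):
    """Probability of one specific outcome sequence, e.g. ['H', 'T', 'H']."""
    per_flip = [p_heads if f == "H" else 1 - p_heads for f in flips]
    return reduce(lambda a, b: a * b, per_flip, 1.0)

print(sequence_probability(["H", "T", "H"]))  # 0.125 = 0.5 ** 3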
Bayes’ Theorem
P(A ∧ B) = P(A|B) P(B)
P(B ∧ A) = P(B|A) P(A)
P(B|A) P(A) = P(A|B) P(B)
P(B|A) = P(A|B) P(B) / P(A)
Bayes’ theorem (also called Bayes’ law or Bayes’ rule) is fundamental to
probabilistic reasoning in AI!
Bayes’ in Action
Bayes’ rule requires three terms - a conditional probability
and two unconditional probabilities - just to compute one
conditional probability.
Bayes’ rule is useful in practice because there are many
cases where we do have good probability estimates for
these three numbers and need to compute the fourth.
In a task such as medical diagnosis, we often have
conditional probabilities on causal relationships and want
to derive a diagnosis.
Bayes’ in Action
Example:
A doctor knows that the disease meningitis causes the patient
to have a stiff neck, say, 50% of the time. The doctor also
knows some unconditional facts: the prior probability that a
patient has meningitis is 1/50,000, and the prior probability
that any patient has a stiff neck is 1/20. Let s be the
proposition that the patient has a stiff neck and m be the
proposition that the patient has meningitis.
Solution:
This question can be answered by using the well-known
Bayes’ theorem.
Bayes’ in Action
Solution:
P(s|m) = 0.5
P(m) = 1/50000
P(s) = 1/20
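Plugging these three numbers into Bayes’ theorem gives the posterior (a minimal sketch of the arithmetic):

# P(m | s) = P(s | m) P(m) / P(s)
p_s_given_m = 0.5        # stiff neck given meningitis
p_m = 1 / 50000          # prior probability of meningitis
p_s = 1 / 20             # prior probability of a stiff neck

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)       # 0.0002, i.e. about 1 in 5000 stiff-neck patients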
Bayes’ Theorem
Example:
Consider a football game between two rival teams: Team 0 and
Team 1. Suppose Team 0 wins 65% of the time and Team 1 wins
the remaining matches. Among the games won by Team 0, only
30% of them come from playing on Team 1's football field. On
the other hand, 75% of the victories for Team 1 are obtained
while playing at home. If Team 1 is to host the next match
between the two teams, which team will most likely emerge as
the winner?
Solution:
This question can be answered by using the well-known Bayes’
theorem.
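A minimal sketch of the calculation, assuming the event names win0, win1 (which team wins) and host1 (the match is played on Team 1's field):

# Priors and conditionals taken from the problem statement.
p_win0 = 0.65
p_win1 = 0.35
p_host1_given_win0 = 0.30   # Team 0 victories that happen on Team 1's field
p_host1_given_win1 = 0.75   # Team 1 victories that happen at home

# Bayes' theorem with the total-probability denominator.
p_host1 = p_host1_given_win0 * p_win0 + p_host1_given_win1 * p_win1
p_win1_given_host1 = p_host1_given_win1 * p_win1 / p_host1
p_win0_given_host1 = p_host1_given_win0 * p_win0 / p_host1

print(round(p_win1_given_host1, 3))  # ~0.574
print(round(p_win0_given_host1, 3))  # ~0.426 -> Team 1 is the likelier winner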
Using Bayes’ Theorem More Realistically
Bayesian updating
P(Cavity | Toothache ∧ Catch) = P(Toothache ∧ Catch | Cavity) P(Cavity) / P(Toothache ∧ Catch)
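The posterior that this update produces can equivalently be read off the joint entries quoted on the inference-by-enumeration slide; 0.108 and 0.016 are the two entries that have both toothache and catch:

# P(Cavity | toothache, catch) computed by normalizing over the two
# joint entries in which both pieces of evidence hold.
p_cavity_toothache_catch = 0.108
p_no_cavity_toothache_catch = 0.016

norm = p_cavity_toothache_catch + p_no_cavity_toothache_catch  # P(toothache, catch)
print(p_cavity_toothache_catch / norm)  # ~0.871: both symptoms together
                                        # make a cavity very likely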
THANKS…
COURTESY:
Md. Tarek Habib
Assistant Professor
Daffodil International University
********** ********** If You Need Me ********** **********
Mail: tajim.cse@diu.edu.bd
Website: https://www.tajimiitju.blogspot.com
ORCID: https://orcid.org/0000-0002-2834-1507
LinkedIn: https://www.linkedin.com/in/tajimiitju
ResearchGate: https://www.researchgate.net/profile/Tajim_Md_Niamat_Ullah_Akhund
YouTube: https://www.youtube.com/tajimiitju?sub_confirmation=1
SlideShare: https://www.slideshare.net/TajimMdNiamatUllahAk
Facebook: https://www.facebook.com/tajim.mohammad
GitHub: https://github.com/tajimiitju
Google+: https://plus.google.com/+tajimiitju
Gmail: tajim.mohammad.3@gmail.com
Twitter: https://twitter.com/Tajim53
Thank you
Tajim Md. Niamat Ullah Akhund
