Bayesian Epistemology

      Kareem Khalifa
  Department of Philosophy
    Middlebury College
 The twin pillars of Bayesianism
• Inductive reasoning can be represented in
  terms of probability theory:
  – Probability places coherence constraints on
    rational degrees of belief
  – Inductive reasoning proceeds through a
    probabilistic rule of inference called
    conditionalization
• A rule of reasoning is justified if it passes
  the “pragmatic self-defeat” test (Dutch
  Book)
          What is probability?
• Several interpretations
• Logical interpretation
  – Probability: approximation of deductive entailment/
    truth
• Frequency interpretation
  – Probability: how frequently something happens
• Propensity interpretation
  – Probability: a physical system’s disposition to produce
    a certain outcome
• Subjective interpretation = Bayesian
  – Probability: Degree of belief
                       Coherence
• Property of a synchronic set of beliefs
   – Synchronic set of beliefs = beliefs held at one time
      • Ex. A theory
• Criterion of rationality
   – Coherent set of beliefs = rational
   – Incoherent set of beliefs = irrational
• Probabilistic coherence: does not lead to bad
  wagering policies (‘pragmatic self-defeat’) in the
  long run
            Conditionalization
• Bayes Theorem: P(h/e) = probability of h given e
  – Ex. P (There is fire / There is smoke) = 0.9 means it’s
    generally true that where there’s smoke, there’s fire.
• Simple Rule of Conditionalization: Pf(h) = Pi(h/e)
  – Pi = prior probability = the degree of belief that one
    has in a claim prior to collecting evidence e
  – Pf = posterior probability = the degree of belief that
    one has in a claim after collecting evidence e
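A minimal Python sketch of the Simple Rule; the function name is an illustrative assumption, not from the slides:

```python
# Simple Rule of Conditionalization: once e is learned with certainty,
# the new unconditional degree of belief in h equals the old
# conditional degree of belief in h given e.

def simple_conditionalize(prior_h_given_e: float) -> float:
    """Return Pf(h) = Pi(h/e)."""
    if not 0.0 <= prior_h_given_e <= 1.0:
        raise ValueError("degrees of belief must lie in [0, 1]")
    return prior_h_given_e

print(simple_conditionalize(0.9))  # 0.9, as in the smoke/fire example above
```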
  Conditionalization: “3 Stages”
• Throw yourself into a situation; assign
  prior probabilities
• Gather evidence; assign a posterior
  probability of 1 to your evidence
• Conditionalize to find posterior of
  hypothesis
   Conditionalization: “Stage 1”
• Stage 1: You are thrown into a world with
  degrees of belief in various claims (priors),
  including:
  – Pi(h/e) = probability a hypothesis is true given
    a certain piece of evidence
  – Ex. For an early Australian explorer
     • Pi (Platypuses exist / Platypuses are observed) =
       0.9
   Conditionalization: “Stage 2”
• Stage 2: Gather the evidence. For all
  evidence gathered, assign a posterior
  probability Pf(e) = 1 (i.e., you are now
  certain that e is true)
  – Ex. You observe a platypus. Pf(Platypuses
    are observed) = 1
  Conditionalization: “Stage 3”
• Ascertain how well this new evidence
  confirms your hypothesis using the rule of
  conditionalization.
• Pf(h) = Pi(h/e)
• Ex. Pf (Platypuses exist) = Pi (Platypuses
  exist / Platypuses are observed) = 0.9
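A hedged Python sketch of the three stages for the platypus example; the variable names are illustrative assumptions:

```python
# Stage 1: start with priors, including the conditional prior Pi(h/e).
prior_exist_given_observed = 0.9   # Pi(Platypuses exist / Platypuses are observed)

# Stage 2: gather evidence; the observation is now certain, Pf(e) = 1.
posterior_observed = 1.0

# Stage 3: conditionalize: Pf(h) = Pi(h/e).
posterior_exist = prior_exist_given_observed
print(posterior_exist)  # 0.9 = Pf(Platypuses exist)
```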
    Conditionalization and Modus Ponens
• Modus Ponens: Deductive Rule of Inference:
  – If e, then h
  – e
  – Therefore h
• Conditionalization: Inductive Rule of Inference
  – If e, then probably h, i.e., Pi(h/e) = p.
  – e, i.e., Pf(e) = 1.
  – Therefore probably h, i.e., Pf(h) = Pi(h/e) = p.
       Bayesian Confirmation
• Pf(h) = Pi(h/e) = [Pi(e/h) x Pi(h)] / Pi(e)
  – Corollary of Simple Conditionalization
• Admits of degrees of confirmation
• Three stages still obtain:
  – Throw yourself into a situation; assign prior
    probabilities
  – Gather evidence; assign a posterior
    probability of 1 to your evidence
  – Conditionalize to find posterior of hypothesis
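A minimal Python sketch of conditionalization in its Bayes' theorem form, Pf(h) = [Pi(e/h) x Pi(h)] / Pi(e); the function name and the guard on Pi(e) are assumptions added for illustration:

```python
def bayes_posterior(likelihood: float, prior_h: float, prior_e: float) -> float:
    """Return Pf(h) = Pi(e/h) * Pi(h) / Pi(e).

    likelihood = Pi(e/h), prior_h = Pi(h), prior_e = Pi(e).
    Presupposes that e has been learned with certainty, i.e. Pf(e) = 1.
    """
    if prior_e <= 0.0:
        raise ValueError("Pi(e) must be positive")
    return likelihood * prior_h / prior_e
```

The platypus example below applies exactly this calculation.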
Bayesian confirmation: Stage 1
• More priors to be assigned:
• Pi(e/h) = probability of evidence e given
  the truth of the hypothesis h
  – Called the likelihood of h on e
  – The converse of the prior examined in Simple
    Conditionalization
• Pi(e) = probability that evidence obtains
• Pi(h) = probability that hypothesis is true
      Bayesian confirmation: Stages 2 & 3
• Stage 2 = Gather evidence, assign Pf(e) = 1
  – Just as with Simple Conditionalization
• Stage 3: Calculate Pf(h) = [Pi(e/h) x Pi(h)] / Pi(e)
  – Like Simple Conditionalization, but more
    interesting
  – Less like Modus Ponens
       The Platypus Example
• Stage 1: Assign priors
  – Pi(Platypus observed / Platypus exists) = .6
  – Pi(Platypus observed) = .4
  – Pi(Platypus exists) = .5
• Stage 2: Gather evidence
  – Pf(Platypus observed) = 1
• Stage 3: Calculate posterior for hypothesis
  – Pf(Platypus exists) = (.6)(.5)/(.4) = .75
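As a quick check of the slide's arithmetic, the same assumed numbers in Python:

```python
likelihood = 0.6  # Pi(Platypus observed / Platypus exists)
prior_e = 0.4     # Pi(Platypus observed)
prior_h = 0.5     # Pi(Platypus exists)

posterior_h = likelihood * prior_h / prior_e
print(round(posterior_h, 2))  # 0.75: the observation raises Pi(h) = 0.5 to Pf(h) = 0.75
```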
       Some Bayesian truisms
• Evidence e confirms hypothesis h if and only if
  Pi(h/e) > Pi(h)
• Evidence e disconfirms hypothesis h if and only
  if Pi(h/e) < Pi(h)
• If h deductively entails e, then e confirms h and
  ~e disconfirms h (by reducing Pf(h) to 0)
  – Echoes HD and PF
• Comparing hypotheses:
  – Pf(ha)/Pf(hb) = [Pi(e/ha) x Pi(ha)] / [Pi(e/hb) x Pi(hb)]
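A brief Python sketch of the confirmation test and the comparison ratio, with made-up numbers and illustrative helper names:

```python
def confirms(prior_h_given_e: float, prior_h: float) -> bool:
    """e confirms h iff Pi(h/e) > Pi(h); disconfirms iff Pi(h/e) < Pi(h)."""
    return prior_h_given_e > prior_h

def posterior_ratio(lik_a: float, prior_a: float,
                    lik_b: float, prior_b: float) -> float:
    """Pf(ha)/Pf(hb) = [Pi(e/ha) x Pi(ha)] / [Pi(e/hb) x Pi(hb)];
    Pi(e) cancels because it is the same for both hypotheses."""
    return (lik_a * prior_a) / (lik_b * prior_b)

print(confirms(0.75, 0.5))                            # True: this e confirms h
print(round(posterior_ratio(0.6, 0.5, 0.2, 0.5), 2))  # 3.0: ha ends up 3x as probable as hb
```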
      Intuitive consequences of Bayesianism
• The higher the likelihood, the better
  confirmed the hypothesis
• The higher the prior of the hypothesis, the
  better confirmed the hypothesis
  – Gives higher confirmation to hypotheses that
    are already strongly believed
• The lower the prior of the evidence, the
  better confirmed the hypothesis
  – Privileges bold predictions
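A small Python illustration of all three consequences, using assumed (probabilistically coherent) numbers:

```python
def posterior(likelihood: float, prior_h: float, prior_e: float) -> float:
    """Pf(h) = Pi(e/h) * Pi(h) / Pi(e)."""
    return likelihood * prior_h / prior_e

base = posterior(0.6, 0.5, 0.5)           # roughly 0.6
print(posterior(0.8, 0.5, 0.5) > base)    # True: higher likelihood Pi(e/h)
print(posterior(0.6, 0.7, 0.5) > base)    # True: higher prior Pi(h)
print(posterior(0.6, 0.5, 0.4) > base)    # True: lower prior Pi(e), i.e. a bolder prediction
```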
          Bayesian virtues
• Demarcation
• Induction
  – Underdetermination
  – Raven Paradox
  – Grue
• Induction and Decision
• Vagueness
      Bayesian Demarcation
• Both laypeople and scientists can be
  treated as Bayesians
• Differences between laypeople and
  scientists:
  – Evidence
  – Testing
  – How priors are assigned (possibly)
Bayesians vs. Underdetermination
• Underdetermination of theory by data:
  Many hypotheses can entail the same set
  of data (such hypotheses are empirically
  equivalent)
• However, not all of them are equally
  probable given that data
 Bayesians vs. Raven Paradox
• Bayesian solution: difference in likelihoods
• Pi(non-black, non-raven / all ravens are
  black) would be low
• Pi(black raven / all ravens are black) would
  be high
• So non-black, non-ravens don’t have
  equal confirmatory power for the
  hypothesis that all ravens are black.
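A hedged numerical illustration of this point, with entirely made-up probabilities: both observations confirm the hypothesis, but the black raven confirms it far more strongly:

```python
# h = "all ravens are black"; every number below is an assumption for illustration.
prior_h = 0.5

evidence = [
    ("black raven",         0.3, 0.2),    # (name, assumed Pi(e/h), assumed Pi(e))
    ("non-black non-raven", 0.1, 0.098),  # likelihood low, and barely above Pi(e)
]

for name, likelihood, prior_e in evidence:
    posterior_h = likelihood * prior_h / prior_e
    print(name, round(posterior_h, 2))
# black raven 0.75          -> strong confirmation of h
# non-black non-raven 0.51  -> only a tiny boost over Pi(h) = 0.5
```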
             Bayesian Holism
• Scientific testing consists of a complex
  conjunction of theoretical (“core”) and auxiliary
  hypotheses about the workings of instruments,
  measurements, definitions of concepts, etc.
• Theoretical and auxiliary hypotheses (even
  empirical statements) can be:
   – Probabilistically dependent upon each other
   – Of differing probabilities
• Thus, an experiment may more strongly
  disconfirm one statement than another
         Bayesians vs. Grue
• Let h1= All emeralds are green.
• Let h2= All emeralds are grue, i.e., green
  now, but will turn blue on May 10, 3776
  (my 2000th birthday!)
• Why choose h1 over h2?
• The higher the prior of the hypothesis, the
  better confirmed the hypothesis.
  – Our prior for h1 is higher than our gruesome
    h2 hypothesis.
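A hedged Python sketch with made-up priors: the green emeralds observed so far are equally likely on both hypotheses (both entail them), so the comparison ratio reduces to the ratio of the priors:

```python
# e = "all emeralds observed so far are green"; both h1 and h2 entail e.
lik_e_given_h1 = 1.0    # Pi(e / all emeralds are green)
lik_e_given_h2 = 1.0    # Pi(e / all emeralds are grue)

prior_h1 = 0.1          # assumed prior for h1
prior_h2 = 0.0001       # assumed, much lower, prior for the gruesome h2

ratio = (lik_e_given_h1 * prior_h1) / (lik_e_given_h2 * prior_h2)
print(round(ratio))  # 1000: h1 comes out far better confirmed than h2
```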
         Bayesian Precision
• Popper, Kuhn, and Thagard all suffered
  from vagueness in their core concepts
  (falsification, paradigm, and the theoretical
  virtues)
• Bayesianism does not appear to have this
  problem

								