Please hand in ONLY ONE homework assignment: 3a or else 3b, but NOT BOTH!
Homework assignment 3a
The swan-hypothesis H is a universal if-then sentence and says that all swans are white. H is logically
equivalent to the universal if-then sentence H′ that everything that is not white is not a swan.
REVIEW OF PROBABILITY
In probability, three items form a probability space: (W, A, Pr).
Step 0: Find W.
Steps 1-3: Find A.
The algebra A is a subset of the powerset of W and satisfies:
1. W is an element of the algebra (W is a proposition)
2. If B is an element of the algebra, then so is its complement: (W \ B) is an element of the algebra
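The closure conditions above can be checked mechanically on small finite examples. The following Python sketch is my own illustration (the function names and example worlds are hypothetical); it also checks closure under union, which, together with the two conditions above, completes the standard definition of an algebra:

```python
from itertools import combinations

def powerset(ws):
    """All subsets of ws, as frozensets."""
    ws = list(ws)
    return [frozenset(c) for r in range(len(ws) + 1)
            for c in combinations(ws, r)]

def is_algebra(A, W):
    """Check the closure conditions for an algebra A of propositions over W."""
    W = frozenset(W)
    if W not in A:                      # condition 1: W is a proposition
        return False
    for B in A:
        if W - B not in A:              # condition 2: closed under complement
            return False
    for B in A:
        for C in A:
            if B | C not in A:          # condition 3: closed under union
                return False
    return True

W = {1, 2, 3}
trivial = {frozenset(), frozenset(W)}   # the smallest algebra over W
full = set(powerset(W))                 # the largest algebra over W
print(is_algebra(trivial, W))           # True
print(is_algebra(full, W))              # True
print(is_algebra({frozenset(W)}, W))    # False: missing the empty set
```

Both the trivial algebra {∅, W} and the full powerset pass; a family containing W but not its complement ∅ fails condition 2.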
Homework assignment 5
A probability space is a triple (W, A, Pr) consisting of an arbitrary, non-empty set of worlds or possibilities
W, an algebra A of propositions over W, and a probability measure Pr(·): A → R. (The dot in Pr(·)
indicates that the function takes propositions from A as its arguments.)
PHL246 FINAL Review Notes
Proving Theorems
1. Conditional Derivation: To prove that the consequent follows from the antecedent.
a. Example: If A confirms B, then A confirms B ∧ C, for any C.
2. Direct Derivation: Make both sides look the same to show they are equal.
Homework assignment 1
Intuitively, a set is a collection of objects (things, entities). For instance, the set C of Canadian cities with
a population of more than 1 million is the set containing Toronto, Montreal, and Calgary. We use curly
brackets { and } to list the members of a set.
UNIVERSITY OF TORONTO, Faculty of Arts and Science
Mid-Term Exam, PHL246, Duration 50 minutes, No Aids Allowed
Please keep your answers to the following 8 questions (each worth 2.5 percentage points) as short as possible.
Question 1: What does the Principle of Induction say?
Homework assignment 4
A partition P of an arbitrary, non-empty set W is a set of subsets of W, P ⊆ ℘(W), such that any two
members B and C of P are mutually exclusive (have no members in common), B ∩ C = ∅, and the
members of P are jointly exhaustive (every member of W is a member of some member of P).
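The two defining conditions translate directly into a small Python check; this is a sketch of my own (function name and example sets are hypothetical):

```python
def is_partition(P, W):
    """Check the two defining conditions: the members of P are mutually
    exclusive and jointly exhaustive over W."""
    cells = [frozenset(B) for B in P]
    # mutually exclusive: any two distinct members have no members in common
    for i, B in enumerate(cells):
        for C in cells[i + 1:]:
            if B & C:
                return False
    # jointly exhaustive: every member of W is a member of some member of P
    return frozenset().union(*cells) == frozenset(W)

W = {1, 2, 3, 4}
print(is_partition([{1, 2}, {3, 4}], W))      # True
print(is_partition([{1, 2}, {2, 3, 4}], W))   # False: 2 is in two cells
print(is_partition([{1, 2}], W))              # False: 3 and 4 are left out
```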
Outline
1. Induction
2. Probability
Carnap on Hempel
Carnap (1962) defines qualitative confirmation as positive
probabilistic relevance or incremental confirmation, and
quantitative confirmation as conditional probability or
absolute confirmation.
Recall Hempel's Entailment Condition.
Induction
1.5 Kolmogorov and the probability calculus
Franz Huber
PHL 246: Probability and Inductive Logic
Kolmogorov
In his Grundbegriffe der Wahrscheinlichkeitsrechnung
(1933) (Foundations of the Theory of Probability), Kolmogorov gave the axioms of the probability calculus.
Carnap and the problem of induction
The justification of induction boils down to the justification
of the axioms characterizing the probability measure.
The reasons are based upon our intuitive judgments
concerning inductive validity.
Logical probability
Logical interpretation of probability as the basis for an
inductive logic.
Example
State descriptions: s1: Pa ∧ Pb, s2: Pa ∧ ¬Pb, s3: ¬Pa ∧ Pb, s4: ¬Pa ∧ ¬Pb
Structure descriptions: z0 = {s4}: there are 0 P and 2 ¬P; z1 = {s2, s3}: there is 1 P and 1 ¬P; z2 = {s1}: there are 2 P and 0 ¬P
m*(Pa ∧ Pb) = 1/3,
m*(Pa ∧ ¬Pb) = m*(¬Pa ∧ Pb) = 1/6,
m*(¬Pa ∧ ¬Pb) = 1/3
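These m*-values can be recomputed mechanically: give each structure description equal weight, then split that weight equally among its state descriptions. The Python sketch below is my own illustration of this recipe for two individuals and one predicate:

```python
from itertools import product
from fractions import Fraction

# State descriptions for individuals a, b and one predicate P:
# each is an assignment of True ("P") or False ("not P") to each individual.
individuals = ["a", "b"]
states = list(product([True, False], repeat=len(individuals)))

# Structure descriptions group state descriptions by how many individuals are P.
structures = {}
for s in states:
    structures.setdefault(sum(s), []).append(s)

# Carnap's m*: divide probability equally among the structure descriptions,
# then equally among the state descriptions within each structure description.
m_star = {}
for members in structures.values():
    for s in members:
        m_star[s] = Fraction(1, len(structures)) / len(members)

print(m_star[(True, True)])    # Pa & Pb    -> 1/3
print(m_star[(True, False)])   # Pa & ~Pb   -> 1/6
print(m_star[(False, True)])   # ~Pa & Pb   -> 1/6
print(m_star[(False, False)])  # ~Pa & ~Pb  -> 1/3
```

The mixed state descriptions share one structure description, so each gets half of 1/3.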
Elementary consequences
Pr(∅) = 0 and Pr(W \ A) = 1 − Pr(A)
Pr(A ∩ B) = Pr(B) · Pr(A | B)
Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B) (picture!)
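These elementary consequences can be verified on a small finite model. In this Python sketch the worlds and weights are hypothetical numbers of my own choosing:

```python
from fractions import Fraction

W = frozenset({1, 2, 3, 4})
# A toy probability measure given by weights on the worlds.
weight = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 8), 4: Fraction(1, 8)}

def Pr(X):
    return sum(weight[w] for w in X)

def Pr_given(X, Y):
    return Pr(X & Y) / Pr(Y)   # defined only when Pr(Y) > 0

A = frozenset({1, 2})
B = frozenset({2, 3})

print(Pr(frozenset()) == 0)                      # Pr(empty set) = 0
print(Pr(W - A) == 1 - Pr(A))                    # complement rule
print(Pr(A & B) == Pr(B) * Pr_given(A, B))       # multiplication rule
print(Pr(A | B) == Pr(A) + Pr(B) - Pr(A & B))    # addition rule (the "picture")
```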
The probability calculus
A function Pr: A → R on a (σ-)algebra A of
propositions over W into the real numbers R is a
(σ-additive) probability measure on A iff for all
A, B, Ai ∈ A, i ∈ N:
1. Pr(A) ≥ 0 (non-negativity)
2. Pr(W) = 1 (normalization)
3. Pr(A ∪ B) = Pr(A) + Pr(B) if A ∩ B = ∅ (additivity; σ-additivity requires Pr(∪i Ai) = Σi Pr(Ai) for pairwise disjoint Ai)
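For a finite algebra the axioms can be checked directly; the following Python sketch is my own (the example measure is hypothetical), and for a finite algebra finite additivity is all there is to check:

```python
from fractions import Fraction

def is_probability_measure(Pr, A, W):
    """Check Kolmogorov's axioms for Pr on a finite algebra A over W."""
    W = frozenset(W)
    if any(Pr[B] < 0 for B in A):      # 1. non-negativity
        return False
    if Pr[W] != 1:                     # 2. normalization
        return False
    for B in A:
        for C in A:
            if not (B & C) and Pr[B | C] != Pr[B] + Pr[C]:
                return False           # 3. additivity for disjoint propositions
    return True

W = frozenset({1, 2})
A = [frozenset(), frozenset({1}), frozenset({2}), W]
Pr = {frozenset(): Fraction(0), frozenset({1}): Fraction(2, 3),
      frozenset({2}): Fraction(1, 3), W: Fraction(1)}
print(is_probability_measure(Pr, A, W))   # True

bad = {**Pr, W: Fraction(1, 2)}           # breaks normalization
print(is_probability_measure(bad, A, W))  # False
```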
Goodman's view
According to Goodman the same is true of inductive logic:
"An inductive inference, too, is justified by conformity to
general rules, and a general rule by conformity to accepted
inductive inferences." (65)
Objection
So on purely syntactic accounts one and the same piece of
evidence can be used to confirm two logically inconsistent
hypotheses and, indeed, just about any (or no) hypothesis!
Kexin Wang
1001742153
PHL246 Exercises 6-10
Exercise 6: Show that, in set theory, the following is true of all sets P and Q (1
point):
P ∩ Q = Q ∩ P
1. P ∩ Q = {x: x ∈ (P ∩ Q)}
from Extensionality
2. P ∩ Q = {x: (x ∈ P) ∧ (x ∈ Q)}
from 1. and the definition of ∩
3. P ∩ Q = {x: (x ∈ Q) ∧ (x ∈ P)}
from 2. and the commutativity of ∧
4. P ∩ Q = Q ∩ P
from 3., the definition of ∩, and Extensionality
Exercise 16: We are considering an algebra A over a non-empty set of possible worlds W.
Show that the intersection of A and B is a proposition if both A and B are propositions, i.e.
(A ∩ B) ∈ A if A ∈ A and B ∈ A. (1 point)
Show (A ∩ B) ∈ A if A ∈ A and B ∈ A.
1. W is a proposition: W ∈ A.
PHL246 Exercises 11-15
Exercise 11: Show that HD-confirmation satisfies the converse consequence condition CCC,
where you may suppress the background assumption in the definition of HD-confirmation so that
sentence E HD-confirms sentence H just in case H logically implies E.
Exercise 31: Consider two objects or individuals, viz. today a and tomorrow b, and one
property S they can have, viz. whether or not the sun rises on them. List the four state
descriptions and three structure descriptions that the two individual constants
SKIP Milne's theorem
M4a: c(H, E ∧ F, B) − c(H, E ∧ G, B) is determined by
c(H, E, B) and c(H, F, E ∧ B) − c(H, G, E ∧ B).
M4b: If c(H, E ∧ F, B) = 0, then
c(H, E, B) + c(H, F, E ∧ B) = 0.
SKIP Catch-all counterparts
The catch-all counterpart of (M3a) is L3a: If
Pr(E | H ∧ B) < Pr(F | H ∧ B) and
Pr(E | ¬H ∧ B) = Pr(F | ¬H ∧ B), then
c(H, E, B) < c(H, F, B). Here we hold fixed the catch-alls
Pr(E, F | ¬H ∧ B) rather than
Measures of incremental confirmation
Earman (Bayes or Bust?, 1992): difference measure
d(H, E, B) = Pr(H | E ∧ B) − Pr(H | B)
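The difference measure is easy to compute on a toy model. In this Python sketch the four worlds and their weights are hypothetical numbers of my own, chosen only so that the evidence raises the probability of the hypothesis (B is taken to be the whole of W):

```python
from fractions import Fraction

# Worlds are (hypothesis, evidence) combinations with toy weights.
weight = {("H", "E"): Fraction(3, 8), ("H", "~E"): Fraction(1, 8),
          ("~H", "E"): Fraction(1, 8), ("~H", "~E"): Fraction(3, 8)}

def Pr(pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(p for w, p in weight.items() if pred(w))

H = lambda w: w[0] == "H"
E = lambda w: w[1] == "E"

pr_H = Pr(H)                                        # Pr(H | B) = 1/2
pr_H_given_E = Pr(lambda w: H(w) and E(w)) / Pr(E)  # Pr(H | E & B) = 3/4
d = pr_H_given_E - pr_H
print(d)  # 1/4 > 0, so E incrementally confirms H on the difference measure
```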
Hájek's illustration of Bronfman's objection
Sophia lives in Pasadena. All indices agree that she
should move to Canada.
The big-city index tells her to move to Toronto (but to stay
in Pasadena before moving to Whistler or
If both A and B are false, then so is A ∨ B. I give you 0
CAD for the first and second bet, and receive 0 CAD from
you for the third bet.
In all three cases I leave with a gain of r + s − t > 0 CAD.
If r + s < t I offer you
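The bookie's guaranteed gain can be tabulated case by case. This Python sketch is my own reconstruction under the assumptions that A and B are incompatible, each bet has a 1 CAD stake, and the betting quotients r, s, t (hypothetical numbers below) satisfy r + s > t:

```python
from fractions import Fraction

r, s, t = Fraction(2, 5), Fraction(2, 5), Fraction(1, 2)  # r + s > t

def bookie_gain(a, b):
    """The bookie sells the bets on A and on B (collecting r and s) and buys
    the bet on A-or-B (paying t); each bet pays 1 CAD if its proposition
    turns out true."""
    a_or_b = a or b
    return (r - int(a)) + (s - int(b)) + (int(a_or_b) - t)

# A and B are incompatible, so the case (True, True) cannot arise.
for a, b in [(True, False), (False, True), (False, False)]:
    print(bookie_gain(a, b))  # r + s - t = 3/10 in every case
```

Whatever happens, the bookie pockets r + s − t, which is exactly the sure loss exploited in the Dutch book argument.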
Gradational Accuracy
Joyce ("A Nonpragmatic Vindication of Probabilism", 1998):
the inaccuracy of the agent S's degree of belief b(A) in the
proposition A in the possible world w is the distance
between b(A) and the truth value of A in w.
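One standard way to measure that distance is the squared distance summed over propositions (the Brier score); the fragment above is cut off, so this Python sketch of mine uses that choice with hypothetical credences:

```python
from fractions import Fraction

def inaccuracy(b, truth):
    """Brier-style inaccuracy of the credence function b at a world:
    the sum of squared distances between b(A) and A's truth value there."""
    return sum((b[A] - truth[A]) ** 2 for A in b)

# Hypothetical credences in two propositions, evaluated at a world
# where A is true and B is false.
b = {"A": Fraction(4, 5), "B": Fraction(3, 10)}
w = {"A": 1, "B": 0}
print(inaccuracy(b, w))     # (1/5)^2 + (3/10)^2 = 13/100

# A second credence function for comparison at the same world.
b2 = {"A": Fraction(1, 2), "B": Fraction(1, 2)}
print(inaccuracy(b2, w))    # (1/2)^2 + (1/2)^2 = 1/2
```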
2.1 Subjective probabilities
Subjective probabilities
Modern Bayesian confirmation theory has evolved out of
Carnap's project of an inductive logic, but frees itself from a
logical interpretation of probability.
Probabilities are interpreted as the subjective degrees of belief of an agent.
Exercises 26-28: In these three exercises you show that conditional probabilities are
probabilities, i.e. that every conditional probability measure is a probability measure. Let C be
a fixed proposition in A with positive probability, Pr(C) > 0. Show that Pr(· | C) is a probability measure on A.
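The claim of Exercises 26-28 can also be checked mechanically on a small finite model. The Python sketch below is my own illustration (the four worlds, the uniform weights, and the choice of C are hypothetical); it verifies the three axioms for Pr(· | C) on the full powerset algebra:

```python
from fractions import Fraction
from itertools import combinations

W = [1, 2, 3, 4]
# Uniform toy measure on the worlds.
weight = {w: Fraction(1, 4) for w in W}
# The full powerset algebra over W.
A = [frozenset(c) for r in range(len(W) + 1) for c in combinations(W, r)]

def Pr(X):
    return sum(weight[w] for w in X)

C = frozenset({1, 2, 3})            # a fixed proposition with Pr(C) > 0

def Pr_C(X):
    """The conditional measure Pr(. | C) = Pr(. & C) / Pr(C)."""
    return Pr(X & C) / Pr(C)

full = frozenset(W)
print(all(Pr_C(X) >= 0 for X in A))                  # non-negativity
print(Pr_C(full) == 1)                               # normalization
print(all(Pr_C(X | Y) == Pr_C(X) + Pr_C(Y)
          for X in A for Y in A if not (X & Y)))     # additivity
```

All three checks come out true, as the exercises predict.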
Homework assignment 8
Suppose your background assumptions B are such that your degree of belief that all swans are white
is independent of a being a swan:
Pr(a is a swan|all swans are white, and B) = Pr(a is a swan|B)
Suppose further that Pr(a is white|a
A Logical Introduction to Probability and
Induction
Franz Huber
University of Toronto
PHL 246 (Probability and Inductive Logic)
Fall 2016
This is a draft of a textbook for students at the University of Toronto that still
includes numerous flaws and still
PHL 246 FALL Exercises 11-15
11.
1. A HD-confirms B, and C logically implies B.
assumption for CP
as well as for UG, where A, B, and C are arbitrary sentences that have not occurred prior to 1.
2. B logically implies A.
from 1. and the definition of HD-confirmation with the background assumption suppressed
PHL246 Fall 2016 Exercises 31-35
31
Four state descriptions:
s1: S(a) ∧ S(b)
s2: S(a) ∧ ¬S(b)
s3: ¬S(a) ∧ S(b)
s4: ¬S(a) ∧ ¬S(b)
Three structure descriptions:
z0 = s1 = S(a) ∧ S(b)
there are 2 S and 0 ¬S
z1 = s2 ∨ s3 = (S(a) ∧ ¬S(b)) ∨ (¬S(a) ∧ S(b))
there is 1 S and 1 ¬S
z2 = s4 = ¬S(a) ∧ ¬S(b)
there are 0 S and 2 ¬S
PHL246 Fall 2016 Exercises 26-30
26-28
1. Let C be a fixed proposition in A with Pr(C) > 0, and let Pr(· | C) be the function with domain A and range R.
by assumption
2. Let A be any proposition in A.
3. Pr(A | C) = Pr(A ∩ C) / Pr(C)
from 1., 2., and the definition of conditional probability (the identities used below are proved in the elementary consequences)
Exercise 21: Show that for all propositions A in A: if Pr(A) > 0, then Pr(A | A) = 1. (1
point)
1. Pr(A) > 0
assumption for CP
2. Pr(A | A) = Pr(A ∩ A) / Pr(A)
from the definition of conditional probability,
which applies because Pr(A) > 0 according to 1.
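The result of Exercise 21 can be sanity-checked numerically; in this Python sketch of mine the worlds, weights, and the proposition A are hypothetical:

```python
from fractions import Fraction

# Toy weights on three worlds.
weight = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}

def Pr(X):
    return sum(weight[w] for w in X)

A = frozenset({1, 3})
assert Pr(A) > 0            # so the definition of Pr(. | A) applies
print(Pr(A & A) / Pr(A))    # Pr(A | A) = Pr(A)/Pr(A) = 1
```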