
Priors

In induction, skepticism on 16/06/2011 at 5:52 pm

Assign a prior probability to a hypothesis h that takes into account our ignorance of the truth or falsity of the hypothesis. Even though a good deal of the predictions (p) of h, in conjunction with some basic statements and initial conditions, may be true, the hypothesis itself is either true or false. Outcomes of testing are transmitted back to the hypothesis: corroborating evidence e is consistent with h being true (for true hypotheses will always have their predictions corroborated), while evidence e that conflicts with prediction p, a logical consequence of h, implies that h is false.
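The asymmetry can be put in symbols (my notation, not the original post's; b stands for the background statements and initial conditions assumed throughout):

\[
h \wedge b \models p, \quad e = \neg p \;\Rightarrow\; \neg h \ (\text{given } b), \qquad\qquad h \wedge b \models p, \quad e = p \;\not\Rightarrow\; h.
\]

A failed prediction refutes the hypothesis by modus tollens; a successful prediction merely fails to refute it.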

Imagine that there is a true hypothesis h and a false hypothesis h′. The logical consequences of the hypotheses, taking into account the assumed identical background knowledge, give different predictions: h predicts p; h′ predicts ¬p.

Since h and h′ cannot both be true, the disjunction h ∨ h′ must have a subjective or epistemic probability that does not exceed 1: no one knows in advance which of the two theories is true, so the prior probability of h must be reduced (even by the smallest amount) so that P(h) and P(h′) together do not exceed 1. However, h and h′ do not exhaust the possible theories. We now have to spread the prior probability over all possible theories that attempt to explain the phenomena.
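To make the worry concrete (a sketch on my part, assuming an even distribution over mutually exclusive rivals h_1, h_2, …, h_n):

\[
P(h_1) + P(h_2) + \cdots + P(h_n) \le 1, \qquad \text{so } P(h_i) = \tfrac{1}{n} \longrightarrow 0 \text{ as } n \longrightarrow \infty.
\]

The more rivals admitted into the space of possible explanations, the less prior probability any one of them can claim without some reason for weighting it above the rest.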

In response, the defender might point out that the distribution need not be equal: set h at .9, h′ at .09, and so on. This will not do, since each is equally corroborated. The evidence does not lead to an unequal distribution; the choice to prefer one hypothesis over the other leads to the unequal distribution, which is a truism.
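The same point in Bayesian terms (my sketch, assuming both rivals entail the evidence): the likelihoods are then equal, so conditioning on e leaves the ratio of the two probabilities exactly where the priors put it.

\[
\frac{P(h \mid e)}{P(h' \mid e)} \;=\; \frac{P(e \mid h)\,P(h)}{P(e \mid h')\,P(h')} \;=\; \frac{P(h)}{P(h')} \qquad \text{when } P(e \mid h) = P(e \mid h') = 1.
\]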

In short, the prior probability of a hypothesis, without some sort of a priori method of weighting preferred hypotheses over others, is vanishingly small.

This is not a problem if, through a process of eliminative induction, we can increase the posterior probabilities assigned to our preferred hypotheses by eliminating a sufficient number of alternate theories; however, this route doesn’t pan out: no matter how many hypotheses we rule out for giving false predictions, there remain an infinite number of false theories that aren’t ruled out by e, for they are all presently corroborated by e.
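Why elimination does not help, in the same notation (a sketch, assuming each rival either entails e or entails ¬e): conditioning on e zeroes out the refuted rivals and renormalizes over the survivors, but the surviving class is still infinite.

\[
P(h_i \mid e) \;=\; \frac{P(h_i)}{\sum_{j \,:\, h_j \models e} P(h_j)} \quad \text{if } h_i \models e, \qquad P(h_i \mid e) = 0 \text{ otherwise},
\]

and since infinitely many rivals still entail e, elimination alone does not concentrate the probability on any single survivor.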

When we assign a prior probability to hypothesis h and accumulate corroborating evidence e, we arrive at a posterior probability that is greater than the prior. However, since we have an infinite number of possible theories at hand, our assigned prior probabilities are vanishingly small, and to assign a larger prior probability to any one theory at the expense of the others would beg the question. The same problem recurs for posterior probabilities: without some a priori way to eliminate a great number of hypotheses from consideration, there is no apparent way to raise the probability of any hypothesis appreciably.
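Put compactly (again my sketch): Bayes’ theorem does lift a corroborated hypothesis above its prior, but only by the constant factor 1/P(e), so a vanishingly small prior still yields a vanishingly small posterior.

\[
P(h \mid e) \;=\; \frac{P(e \mid h)\,P(h)}{P(e)} \;=\; \frac{P(h)}{P(e)} \;\ge\; P(h) \qquad \text{when } h \models e.
\]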

Is there a way to solve this problem? I think it hinges on some sort of a priori method or rule that may offer reasons (simplicity, predictive power, etc.) for favoring some hypotheses over others.

  1. Is simplicity an option? No. Goodman’s riddle of induction comes into play: every hypothesis of the form “All X before time t are Y, but after time t are Z” can be translated into a language in which it is expressible in the form “All X′ are Y′” (see the sketch after this list). Thus, simplicity cannot be the solution, since any number of such hypotheses can be expressed, in some language, just as simply as our preferred hypotheses are in our own language.
  2. Does predictive power pass any sort of critical examination? No. It is more than conceivable that a false hypothesis may have more predictive power than a true hypothesis.
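To illustrate the point in item 1 with the standard grue-style construction (a reconstruction on my part, not a quotation from Goodman): define the primed predicate relative to the time t, and the bent hypothesis becomes syntactically as simple as the straight one.

\[
Y'(x) \;:=\; \bigl(x \text{ is examined before } t \wedge Y(x)\bigr) \;\vee\; \bigl(x \text{ is not examined before } t \wedge Z(x)\bigr),
\]

so that “All X before time t are Y, but after time t are Z” reads, in the primed vocabulary, simply as “All X are Y′”.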
