The Meaning of Probability

This entry is based on material from the book "How the Mind Works" by Steven Pinker.

Yogi Berra famously said that "it is hard to make predictions, especially about the future". However, as Pinker says, "in a world with any regularities at all, decisions informed by the past are better than decisions made at random". If this statement seems like a truism, then why do people who follow the advice so often appear to flout the elementary canons of probability theory?

Why do basketball fans believe in players having a "hot hand", even though players' strings of hits and misses are indistinguishable from coin flips? Why do people feel that if a roulette wheel has stopped at black six times in a row, it is due to stop at red, even though the wheel has no memory and every spin is independent?

On the other hand, let's imagine you are on vacation. It has been raining all week; your vacation is fast approaching its end; everyone is unhappy; everyone blames you for the weather; everyone wants to pack and go home. This is when you decide to boldly predict sunny weather for the remaining two days of your stay. Are you nuts? As far as basic probability theory is concerned, you are. This situation looks no different from the "hot hand" or roulette examples above. They all seem to belong to the same notorious class of problems called the "gambler's fallacy": expecting that a run of heads increases the chance of a tail. Basic probability theory tells you that the coin, the basketball shot, the roulette wheel, and the weather (at least as far as your knowledge of weather prediction is concerned) have no memory and no desire to be fair. Hence, if you bet money on a "hot" player or on red at the next spin of the roulette wheel, you are very likely to lose.

How about your very bold and seemingly mindless weather prediction? Well, the next day the sun appears. All is forgotten; your kids are happy; your wife spends more time with you than with her book. Life is good. So were you merely lucky, despite ignoring the basic tenets of probability theory, or did you actually make a decision informed by the past? Consider the latter. As Pinker remarks, "rain clouds aren't removed from the sky at day's end and replaced with new ones the next morning. A cloud cover has some average size, speed, and direction, and it should not be a surprise that a week of clouds should predict that the trailing edge was near and the sun was about to show up, just as the hundredth railroad car on a passing train portends the caboose with greater likelihood than the third car. Many events work like that. They have a characteristic life history and a changing probability of occurring over time (which statisticians call a hazard function)."
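
To put numbers on this distinction, here is a minimal simulation sketch in Python (the sample sizes, streak lengths, and the toy model of rainy spells are illustrative assumptions of mine, not figures from Pinker): a fair coin's chance of coming up heads stays near 0.5 no matter how long the current run of heads, while a rainy spell with a characteristic length becomes more likely to end the longer it has already lasted, which is exactly what a hazard function captures.

```python
import random

random.seed(1)

# Fair coin: the chance of heads after a run of k heads stays ~0.5 for every k.
flips = [random.random() < 0.5 for _ in range(1_000_000)]
for k in (1, 3, 5):
    hits = total = 0
    for i in range(k, len(flips)):
        if all(flips[i - k:i]):          # the previous k flips were all heads
            total += 1
            hits += flips[i]
    print(f"coin: P(heads | {k} heads in a row) ~ {hits / total:.3f}")

# Toy weather model (assumed): rainy spells last 1-6 days, uniformly at random.
# The longer it has already rained, the more likely the spell is about to end.
spell_lengths = [random.randint(1, 6) for _ in range(500_000)]
for k in (2, 4, 6):
    raining_on_day_k = sum(1 for L in spell_lengths if L >= k)
    raining_next_day = sum(1 for L in spell_lengths if L >= k + 1)
    print(f"rain: P(rain on day {k + 1} | rained on days 1..{k}) ~ "
          f"{raining_next_day / raining_on_day_k:.3f}")
```

The coin's conditional probabilities hover around 0.5 regardless of the streak, while the rain's drop as the streak grows; betting on red after six blacks is the gambler's fallacy, but betting on sun after a week of rain need not be.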

The above examples also illustrate the different meanings of probability. One is single-event probability: as Pinker observes, "the probability that a penny will land heads is 0.5" would mean that, on a scale of 0 to 1, your confidence that the next flip will be heads is halfway between certainty that it will happen and certainty that it won't. Another is relative frequency in the long run: "the probability that a penny will land heads is 0.5" would mean that in a hundred coin flips, fifty will be heads.
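
A short sketch of the frequency reading (the numbers below come from a simulation, not from the book): over more and more flips the relative frequency of heads settles near 0.5, yet the very next flip simply comes up heads or tails, which is why a single-event probability has to fall back on subjective confidence.

```python
import random

random.seed(7)

# Relative frequency in the long run: the fraction of heads converges toward 0.5.
for n in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} flips: relative frequency of heads = {heads / n:.4f}")

# ...but the next single flip is just one outcome, not a frequency.
print("the next flip:", "heads" if random.random() < 0.5 else "tails")
```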

As Pinker notes, numbers referring to the probability of a single event, "which only make sense as estimates of subjective confidence", are common today. For example, there is a 70 percent chance my flight will leave on time tomorrow, or the odds of the Red Sox winning tonight are three to two. Says Pinker:

The interesting question is: what does the probability of a single event even mean? A colleague tells me that there is a ninety-five percent chance he will show up at a meeting tomorrow. He doesn't come. Was he lying?

You may be thinking: granted, a single-event probability is just subjective confidence, but isn't it rational to calibrate confidence by relative frequency? Ah, but the relative frequency of what? To count frequencies you have to decide on a class of events to count up, and a single event belongs to an infinite number of classes.

Richard von Mises, a probability theorist, gives an example.

In a sample of American women between the ages of 35 and 50, 4 out of 100 develop breast cancer within a year. Does Mrs. Smith, a 49-year-old American woman, therefore have a 4% chance of getting cancer in the next year? There is no answer. Suppose that in a sample of women between the ages of 45 and 90 - a class to which Mrs. Smith also belongs - 11 out of 100 develop breast cancer in a year. Are Mrs. Smith's chances 4%, or are they 11%? Suppose that her mother had breast cancer, and 22 out of 100 women between 45 and 90 whose mother had the disease will develop it. Are her chances 4%, 11%, or 22%? She also smokes, lives in California, had two children before the age of 25 and one after 40, is of Greek descent ... What group should we compare her with to figure out the "true" odds? You might think, the more specific the class, the better - but the more specific the class, the smaller its size and the less reliable the frequency. If there were only two people in the world very much like Mrs. Smith, and one developed breast cancer, would anyone say that Mrs. Smith's chances are 50%? In the limit, the only class that is truly comparable with Mrs. Smith in all her details is the class containing Mrs. Smith herself. But in a class of one, "relative frequency" makes no sense.
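
To put rough numbers on the trade-off von Mises describes, here is a small sketch. The rates (4%, 11%, 22%, 50%) are the ones quoted above; the class sizes are made-up placeholders of mine, and the intervals are crude normal approximations, so the output is only an illustration of how narrowing the reference class inflates the uncertainty.

```python
import math

# (rate quoted above, hypothetical class size)
classes = [
    ("US women aged 35-50",                        0.04, 1_000_000),
    ("women aged 45-90",                           0.11,   200_000),
    ("aged 45-90, mother had the disease",         0.22,     5_000),
    ("very much like Mrs. Smith in every respect", 0.50,         2),
]

for name, rate, n in classes:
    se = math.sqrt(rate * (1 - rate) / n)           # standard error of a proportion
    lo, hi = max(0.0, rate - 1.96 * se), min(1.0, rate + 1.96 * se)
    print(f"{name:<45} rate={rate:.2f}  n={n:>9,}  ~95% interval = ({lo:.2f}, {hi:.2f})")
```

The narrowest class may be the most relevant to Mrs. Smith, but its estimate is also the least reliable; in the limiting class of one, the interval tells you nothing at all.
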
As Pinker observes:

These philosophical questions about the meaning of probability are not purely academic; they affect every decision we make. During the murder trial of O.J. Simpson in 1995, the lawyer Alan Dershowitz said on television that among men who batter their wives, only one-tenth of one percent go on to murder them. A statistician then pointed out that among men who batter their wives and whose wives are then murdered by someone, more than half are the murderers.
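
Why the two figures can both be true becomes clear with a hedged back-of-the-envelope calculation. The one-in-a-thousand rate is the figure quoted above; the rate at which a battered woman is murdered by someone other than her husband is not given in the text, so a few illustrative values are tried below. The point is that the courtroom question conditions on the murder having already happened.

```python
# p_husband is the quoted one-tenth-of-one-percent figure; p_other, the chance that
# a battered wife is murdered by someone OTHER than her husband, is NOT in the text,
# so several assumed values are tried. The two causes are treated as mutually exclusive.
p_husband = 0.001                          # P(battering husband murders his wife)

for p_other in (0.0005, 0.0001, 0.00005):  # assumed P(murdered by someone else)
    # Among battered wives who are murdered, the share killed by the husband:
    share = p_husband / (p_husband + p_other)
    print(f"assumed P(other) = {p_other:.5f}  ->  "
          f"P(husband did it | battered and murdered) ~ {share:.2f}")
```

Under any of these assumed rates the share comes out above one half, consistent with the statistician's observation.
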
Another interesting element of the concept of probability is the belief in a stable world. As Pinker notes, "a probabilistic inference is a prediction today based on frequencies gathered yesterday. But that was then, this is now. How do you know that the world hasn't changed in the interim?" Again, the question is not of a purely academic or philosophical character. Think of the following situations, as Pinker describes them:

A person avoids buying a car after hearing that a neighbor's model broke down yesterday. Another person avoids letting his child play in the river with no previous fatalities after hearing that a neighbor's child was attacked there by a crocodile that morning. The difference between the scenarios (aside from the drastic consequences) is that we judge that the car world is stable (in the US this was true until last year), so the old statistics apply, but the river world has changed, so the old statistics are moot.

This just goes to show that probability is a very tricky and complex concept. Every time one thinks one understands it, a new wrinkle appears. It is like the old joke about intelligence that Pinker quotes: "the average man's IQ is 107, the average trout's IQ is 4. So why can't a man catch a trout?"
