## “With Bayes’s Rule, it’s Elementary”, says Sherlock

In my last post, *Thomas Bayes vs. Sherlock Holmes: The Case of Who’s Laughing Now?*, I asked the following riddle:

Mr. and Mrs. Green had very different senses of humor and somewhat distinctive laughs. Only one of them ever laughs at a time. But they both laugh by saying “hee” or “haw”, sometimes using a mix of the two sounds in succession, such as “hee hee haw hee haw”. Over time, Sherlock has observed that when Mr. Green laughs, 20% of the utterances are “hee” and 80% are “haw”; Mrs. Green is more ladylike, with 60% “hee” and only 40% “haw”.

One day, Sherlock was walking by the Green house, and heard the laugh “hee haw haw” from a window. He had no knowledge of whether Mr. or Mrs. Green was more likely to be laughing, but knew it had to be one of them.

What odds should Sherlock post to create a fair bet that the laugh was Mr. Green’s?

Rich W. calculated the answer in a response to the last post, but made a calculation mistake that I didn’t catch. Thanks to Andraz Tori for the correction in the comments; I’ve updated the values here so they’re right:

```
p(hee haw haw|Mr)  = p(hee|Mr)  * p(haw|Mr)  * p(haw|Mr)
                   = .2 * .8 * .8 = 1/5 * 4/5 * 4/5
                   = 16/125

p(hee haw haw|Mrs) = p(hee|Mrs) * p(haw|Mrs) * p(haw|Mrs)
                   = .6 * .4 * .4 = 3/5 * 2/5 * 2/5
                   = 12/125
```
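These likelihoods are easy to check mechanically. Here’s a quick sketch (not from the original post) that multiplies per-utterance probabilities using exact fractions, so the results match the values above rather than suffering floating-point round-off:

```python
from fractions import Fraction

# Per-utterance probabilities from the puzzle.
P_UTTERANCE = {
    "Mr":  {"hee": Fraction(1, 5), "haw": Fraction(4, 5)},
    "Mrs": {"hee": Fraction(3, 5), "haw": Fraction(2, 5)},
}

def likelihood(laugh, speaker):
    """p(laugh | speaker), assuming utterances are independent."""
    p = Fraction(1)
    for utterance in laugh.split():
        p *= P_UTTERANCE[speaker][utterance]
    return p

print(likelihood("hee haw haw", "Mr"))   # 16/125
print(likelihood("hee haw haw", "Mrs"))  # 12/125
```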

Let me show the rest of the work. What we want to calculate is the odds, which are defined by:

```
odds for Mrs. Green
  = p(Mrs|hee haw haw) / p(Mr|hee haw haw)
```

As we’ll see below, the result is 3/4, for odds of 3:4 for Mrs. Green. Here we already see the potential for confusion. As Rich W. noted, if we were gambling, the fair payoff (meaning an expected winnings of 0 for both sides) would be the inverse of the odds. So if the odds are 3:4 for Mrs. Green, the fair payoff will be \$4 for every \$3 bet on Mrs. Green.

So now we only need to calculate `p(Mrs|hee haw haw)` and similarly for `Mr`. Here’s where Bayes’s rule:

```
p(A|B) = p(B|A) p(A) / p(B)
```

comes into play. It lets us define the posterior probability `p(A|B)` as the likelihood `p(B|A)` times the prior `p(A)` divided by the marginal `p(B)`. In our case, we have:

```
p(Mrs|hee haw haw) = p(hee haw haw|Mrs) p(Mrs)
                     / p(hee haw haw)

p(Mr|hee haw haw)  = p(hee haw haw|Mr)  p(Mr)
                     / p(hee haw haw)
```
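If we wanted the full posteriors rather than just the odds, we’d also need the marginal `p(hee haw haw)`, which is the sum of likelihood-times-prior over both speakers. A sketch, assuming the equal priors from the puzzle:

```python
from fractions import Fraction

p_mr = p_mrs = Fraction(1, 2)     # equal priors
lik_mr  = Fraction(16, 125)       # p(hee haw haw | Mr)
lik_mrs = Fraction(12, 125)       # p(hee haw haw | Mrs)

# Marginal: p(hee haw haw) = sum over speakers of likelihood * prior.
marginal = lik_mr * p_mr + lik_mrs * p_mrs

# Bayes's rule for each speaker.
post_mr  = lik_mr  * p_mr  / marginal
post_mrs = lik_mrs * p_mrs / marginal
print(post_mr)   # 4/7
print(post_mrs)  # 3/7
```

Note that the posteriors sum to 1, and their ratio 3/7 ÷ 4/7 = 3/4 is the odds we’re after; the marginal cancels in that ratio, which is why the derivation below never has to compute it.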

Now let’s plug that into the odds formula:

```
p(Mrs|hee haw haw) / p(Mr|hee haw haw)

  = [ p(hee haw haw|Mrs) p(Mrs) / p(hee haw haw) ]
    / [ p(hee haw haw|Mr)  p(Mr)  / p(hee haw haw) ]

  = p(hee haw haw|Mrs) p(Mrs)
    / [ p(hee haw haw|Mr) p(Mr) ]
```

In Bayesian parlance, this derivation shows that the posterior odds (`p(Mrs|hee haw haw)/p(Mr|hee haw haw)`) equal the prior odds (`p(Mrs)/p(Mr)`) times the likelihood ratio (`p(hee haw haw|Mrs)/p(hee haw haw|Mr)`).

When the puzzle said there’s no reason ahead of time to assume it was either Mr. or Mrs. Green who was laughing, I meant to imply equal priors, so that `p(Mr) = p(Mrs) = 0.5`. That makes the prior odds 1, so they drop out:

```
p(Mrs|hee haw haw) / p(Mr|hee haw haw)

  = p(hee haw haw|Mrs) p(Mrs)
    / [ p(hee haw haw|Mr) p(Mr) ]

  = p(hee haw haw|Mrs) / p(hee haw haw|Mr)

  = (12/125) / (16/125)

  = 3/4
```
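Since the puzzle actually asked for odds on Mr. Green, they’re just the reciprocal, matching Andraz’s 4:3 in the comments. A one-line check (my own sketch):

```python
from fractions import Fraction

# Posterior odds for Mrs. Green: likelihood ratio, since the priors cancel.
odds_mrs = Fraction(12, 125) / Fraction(16, 125)
odds_mr = 1 / odds_mrs   # odds for Mr. Green are the reciprocal
print(odds_mrs)  # 3/4
print(odds_mr)   # 4/3
```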

So far, we’ve only stipulated the likelihood function. In our next installment, Sherlock’s going to have to do some estimation.

OK — I don’t think I’ve gotten the hang of this yet. This should all be told with a story setup (why does Sherlock want to know who’s laughing) and from Sherlock’s point of view. He needs to round up the subjects and then lecture them.

### 3 Responses to ““With Bayes’s Rule, it’s Elementary”, says Sherlock”

1. Andraz Tori Says:

You have a calculating error there…
.2*.8*.8 = 0.128 = 16/125 as opposed to 8/125

Making the end result 4:3 for Mr. Green, which is a bit more intuitive.

Otherwise you are writing a fantastic blog with great insights and humor!

bye
Andraz Tori, Zemanta

2. lingpipe Says:

Doh! I fixed the entry and added the fractions as I should’ve done earlier: .2 * .8 * .8 = 1/5 * 4/5 * 4/5 = 16/125.

3. Rich W Says:

Thanks for the save :)