Bayesian statistics is widely used in fields like robotics and machine learning. I've been playing around with the theory over the past few months (sorry for the lack of posts, by the way), and recently came across something with a high (worth sharing)/(time to write a post) ratio.

It's called the cookie problem:

Suppose there are two full bowls of cookies. Bowl #1 has 10 chocolate chip and 30 plain cookies, while bowl #2 has 20 of each. Our friend Fred picks a bowl at random, and then picks a cookie at random. We may assume there is no reason to believe Fred treats one bowl differently from another, likewise for the cookies. The cookie turns out to be a plain one. How probable is it that Fred picked it out of bowl #1?

If you're like me, it feels obvious that Fred *more* likely picked the plain cookie out of the bowl with the higher percentage of plain cookies, yet I didn't know how to quantify it... sort of embarrassing for such a simple example.
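It turns out Bayes' theorem quantifies it directly: the posterior for each bowl is proportional to the prior (1/2 each, since Fred picks a bowl at random) times the likelihood of drawing a plain cookie from that bowl. A minimal sketch in Python, using only the numbers from the problem statement (the dictionary names are my own):

```python
# Prior: Fred picks each bowl with equal probability
prior = {"bowl1": 0.5, "bowl2": 0.5}

# Likelihood of drawing a plain cookie from each bowl:
# Bowl 1 has 30 plain out of 40 cookies; Bowl 2 has 20 out of 40
likelihood = {"bowl1": 30 / 40, "bowl2": 20 / 40}

# Bayes' theorem: posterior ∝ prior × likelihood, then normalize
unnorm = {bowl: prior[bowl] * likelihood[bowl] for bowl in prior}
total = sum(unnorm.values())
posterior = {bowl: p / total for bowl, p in unnorm.items()}

print(posterior["bowl1"])  # → 0.6
```

So the plain cookie shifts the odds from 50/50 to 60/40 in favor of bowl #1, matching the intuition that the bowl with more plain cookies is the better bet.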

The Wikipedia entry for Bayesian inference has a breakdown of the solution, but I found it much more valuable to watch the following talk from PyCon (the speaker is Allen Downey).

The example starts at 10:15.