Probability and maximum entropy

Posted on: May 26th, 2013 by

Information theorists may already be familiar with this, but it's an interesting puzzle for non-information-theorists. Here are two questions:

First, an easy one: suppose you flip a coin 1000 times. What is the conditional probability that the first flip comes up heads, given that you observe 700 heads out of the 1000 flips?
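For the first question, exchangeability gives the answer directly: every sequence with exactly 700 heads is equally likely, so the conditional probability is 700/1000. Here is a quick sanity check by direct counting (my own sketch, not part of the original post):

```python
from fractions import Fraction
from math import comb

# P(first flip = H | exactly 700 heads in 1000 flips)
# = (# sequences starting with H that have 699 heads among the other 999)
#   / (# sequences with 700 heads among all 1000)
p = Fraction(comb(999, 699), comb(1000, 700))
print(p)  # 7/10
```

The binomial coefficients cancel to exactly 700/1000, matching the symmetry argument.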

Next, a hard one: suppose you roll a die (faces numbered 1 through 6) 1000 times. What is the conditional probability that the first roll lands on a 2, given that the average of the 1000 rolls is 2? You may use approximations, assuming 1000 is large.
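For the second question, the maximum-entropy idea in the title is the key: by the conditional limit theorem (the Gibbs conditioning principle), given that the sample average is 2, the distribution of a single roll is approximately the maximum-entropy distribution on {1, ..., 6} with mean 2, which has the exponential form p(k) proportional to exp(-λk). A minimal numeric sketch (my own illustration, solving for λ by bisection):

```python
import math

FACES = range(1, 7)

def tilted_mean(lam):
    # Mean of the distribution p(k) ∝ exp(-lam * k) on faces 1..6
    w = [math.exp(-lam * k) for k in FACES]
    return sum(k * wk for k, wk in zip(FACES, w)) / sum(w)

# Bisection for lam such that the tilted mean equals 2
# (the mean decreases monotonically as lam increases).
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if tilted_mean(mid) > 2:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

w = [math.exp(-lam * k) for k in FACES]
Z = sum(w)
p = [wk / Z for wk in w]
print(p[1])  # p(2) ≈ 0.255
```

So the answer is not 1/6, and not 1 either: conditioning on an average of 2 skews the single-roll distribution geometrically toward the small faces, and the probability of a 2 comes out around 0.255.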
