Die and pi

Posted on: March 31st, 2014 by

Estimate the value of pi using a 6 sided die.

- via Goldman Sachs interview, reposted in CSE blog

3 Responses to Die and pi

  1. Dileep Reddy had this to say about that:

    Use central limit theorem.

    A six-sided die takes one of six values, with mean 3.5 and variance 35/12 ≈ 2.91667. By the central limit theorem, the sum of N dice rolls tends to a normal distribution for large N. Perform N*M rolls (for large M) and form M sums of N rolls each; these M sums are approximately samples from a normal distribution with variance N * 35/12. The fraction of sums landing exactly on the mean (the sums are integers) estimates the peak of that density, 1/sqrt(2 pi N sigma^2), the reciprocal of the normalization factor. Knowing M, N, and the single-die variance, one can extract the value of sqrt(2 pi), and by extension, pi.

    • Sid Hollander had this to say about that:

      That's good. I was thinking along similar lines. With only one die, I was going to suggest forming sums of groups of throws to simulate more dice: e.g., you could sum in groups of 6 to get sums from 6 to 36, then plot many of these sums to make a smoother Gaussian curve. Applying the distribution function of that curve would get you to the objective.
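One way to turn the CLT suggestion above into code: count how often the sum of N rolls lands exactly on its integer mean; that fraction approximates the peak of the normal density, 1/sqrt(2 pi N sigma^2), from which pi can be solved for. A minimal Python sketch (the function name and parameter choices are illustrative, not from the thread):

```python
import random

def estimate_pi_clt(n_dice=50, m_samples=50_000, seed=0):
    """Estimate pi via the central limit theorem.

    The sum of n_dice fair die rolls is approximately normal with
    mean 3.5 * n_dice and variance (35/12) * n_dice.  Since the sums
    are integers, the fraction that hit the (integer) mean exactly
    approximates the density peak 1/sqrt(2 * pi * var), so
    pi ~= 1 / (2 * var * p^2).
    """
    rng = random.Random(seed)
    var = (35 / 12) * n_dice       # variance of a single die is 35/12
    mean = round(3.5 * n_dice)     # an integer when n_dice is even
    hits = 0
    for _ in range(m_samples):
        s = sum(rng.randint(1, 6) for _ in range(n_dice))
        if s == mean:
            hits += 1
    p = hits / m_samples           # estimates 1/sqrt(2 * pi * var)
    return 1 / (2 * var * p * p)
```

The estimate is noisy: the relative error in pi is roughly twice the relative error in the hit count, so M must be large for a tight answer.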

  2. Nishal Shah had this to say about that:

    Another way would be to convert dice throws into a uniform random number generator on (0, 1) by interpreting multiple throw outcomes as the base-6 digits of a number (each outcome is one digit after the radix point).

    Then one can generate two numbers x and y, drawn independently and uniformly from (0, 1). Since (x, y) is uniformly distributed in the unit square, the probability that x^2 + y^2 <= 1 is the area of the quarter circle, pi/4.

    This probability can be estimated by repeating the whole process many times and taking the fraction of trials in which x^2 + y^2 <= 1 holds; multiplying that fraction by 4 gives an estimate of pi.
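A minimal Python sketch of this scheme (helper names like `die_to_uniform` are illustrative; ten base-6 digits give resolution 6^-10, far below the Monte Carlo noise):

```python
import random

def die_to_uniform(rng, digits=10):
    """Build an approximately uniform [0, 1) number from die rolls
    read as base-6 digits after the radix point."""
    x = 0.0
    scale = 1.0
    for _ in range(digits):
        scale /= 6
        x += (rng.randint(1, 6) - 1) * scale  # map roll 1..6 to digit 0..5
    return x

def estimate_pi_mc(trials=100_000, seed=0):
    """Monte Carlo: the fraction of points (x, y) in the unit square
    with x^2 + y^2 <= 1 estimates the quarter-circle area pi/4."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(trials)
        if die_to_uniform(rng) ** 2 + die_to_uniform(rng) ** 2 <= 1
    )
    return 4 * inside / trials
```

Because the success probability is pi/4, multiplying the observed fraction by 4 recovers pi; the standard error shrinks like 1/sqrt(trials).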
