Binomial Calculator

Input

(Interactive input fields from the original web calculator; for the worked example shown below the inputs are n = 20, y = 14, p = 0.8.)

Formulas and Output

\[p(y) = \left(\begin{array}{c}n\\ y\end{array}\right) p^{y}(1-p)^{n-y} = \left(\begin{array}{c} 20 \\ 14 \end{array}\right) \times 0.8^{14}(1-0.8)^{20-14} = 0.1091\]
\[p(y < 14) = 1 - p(y \geq 14) = 1 - 0.91331 = 0.08669\]
\[p(y \leq 14) = 1 - p(y > 14) = 1 - 0.80421 = 0.19579\]
\[p(y > 14) = 1 - p(y \leq 14) = 1 - 0.19579 = 0.80421\]
\[p(y \geq 14) = 1 - p(y < 14) = 1 - 0.08669 = 0.91331\]
\[E(Y) = np = 20 \times 0.8 = 16\]
\[V(Y) = np(1-p) = 20 \times 0.8 \times 0.2 = 3.2\]

Probability Distribution - Binomial

When do you use the binomial distribution?

The binomial distribution is used to find the probability of a given number of successes in an experiment that is repeated a fixed number of times, where each trial has the same probability of success. If the sampling is done with replacement, you want to use the binomial rather than the hypergeometric distribution.

- If n > 10 and the probability of success is < 0.1, the binomial can be approximated by the Poisson distribution with mean λ = np.
- If np(1-p) > 5, the binomial can be approximated by the normal distribution with mean np and variance np(1-p).
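These rules of thumb are easy to check numerically. Below is a minimal Python sketch (assuming Python 3.8+ for `math.comb`; the helper names and parameter choices are mine, not from the calculator) comparing exact binomial probabilities against the two approximations:

```python
import math

def binom_pmf(y, n, p):
    """Exact binomial probability: C(n, y) * p^y * (1-p)^(n-y)."""
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

def poisson_pmf(y, lam):
    """Poisson probability with mean lam = n*p."""
    return math.exp(-lam) * lam**y / math.factorial(y)

def normal_cdf(x, mu, sigma):
    """Normal CDF computed via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Poisson approximation: large n, small p (here n = 100, p = 0.05, lambda = 5)
n, p = 100, 0.05
exact_pmf = binom_pmf(3, n, p)
approx_pmf = poisson_pmf(3, n * p)
print(f"P(Y = 3): exact {exact_pmf:.4f}, Poisson {approx_pmf:.4f}")

# Normal approximation with continuity correction: n = 20, p = 0.8
n, p = 20, 0.8
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
exact_cdf = sum(binom_pmf(y, n, p) for y in range(15))   # P(Y <= 14)
approx_cdf = normal_cdf(14.5, mu, sigma)                 # continuity correction
print(f"P(Y <= 14): exact {exact_cdf:.4f}, normal {approx_cdf:.4f}")
```

Both approximations land within a couple of hundredths of the exact values for these parameter choices.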

Example 1. The probability that a patient recovers from a stomach disease is 0.8. Suppose 20 people are known to have contracted this disease. What is the probability that:
- exactly 14 recover?
- less than 14 recover?
- more than 14 recover?
- at least 14 recover?
- at most 14 recover?
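If you'd rather check these by code than by calculator, the five questions are just sums of the probability mass function. A minimal sketch, assuming Python 3.8+ for `math.comb` (the helper name `binom_pmf` is mine):

```python
import math

def binom_pmf(y, n, p):
    """P(Y = y) for a binomial with n trials and success probability p."""
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 20, 0.8

exactly_14   = binom_pmf(14, n, p)                          # P(Y = 14)
less_than_14 = sum(binom_pmf(y, n, p) for y in range(14))   # P(Y < 14)
at_most_14   = less_than_14 + exactly_14                    # P(Y <= 14)
more_than_14 = 1 - at_most_14                               # P(Y > 14)
at_least_14  = 1 - less_than_14                             # P(Y >= 14)

print(exactly_14, less_than_14, more_than_14, at_least_14, at_most_14)
```

These reproduce the calculator output above: 0.1091, 0.08669, 0.80421, 0.91331, and 0.19579.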

Example 2. The famous Minecraft speedrunner Dream, with 14 million YouTube subscribers, was accused of cheating on his Minecraft speedruns by modifying his client to get higher drop rates on crucial items. The accuser's evidence was that an item with a drop probability of 4.7% dropped for Dream 42 times over his 242 runs. According to the accuser this is very unlikely, since the expected number of drops in 242 trials with a 4.7% chance of success is 11.37. How unlikely was it for Dream to get at least 42 successes in 242 runs, given that the probability of success was 0.047?

To solve this question, you use n = 242 trials, p = 0.047, and y = 42 successes, and compute P(Y ≥ 42).
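The exact tail probability is again just a sum of pmf values, so no approximation is needed. A minimal sketch (assuming Python 3.8+ for `math.comb`):

```python
import math

def binom_pmf(y, n, p):
    """P(Y = y) for a binomial with n trials and success probability p."""
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

n, p, k = 242, 0.047, 42

expected = n * p                                               # about 11.37 drops
p_at_least_k = 1 - sum(binom_pmf(y, n, p) for y in range(k))   # P(Y >= 42)

print(f"E(Y) = {expected:.2f}")
print(f"P(Y >= {k}) = {p_at_least_k:.3e}")
```

The tail probability comes out vanishingly small, far below one in a billion, which is precisely the accuser's point.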

Proof that the expected value E(Y) of the binomial distribution is equal to np

The formula for the expected value of a discrete probability distribution is:
\[E(Y) = \sum y\,p(y)\]
The formula for p(y) is:
\[p(y) = \left(\begin{array}{c}n\\ y\end{array}\right)p^y(1-p)^{n-y}\]
Thus we multiply y with p(y) and sum over all possible values:
\[E(Y) = \sum_{y=0}^n y \left(\begin{array}{c}n\\ y\end{array}\right)p^y(1-p)^{n-y}\]
Now we rewrite y! and n! as
\[y! = y(y-1)!\]
\[n! = n(n-1)!\]
Since the y = 0 term is zero, the sum can start at y = 1, and the formula now looks like this:
\[\sum_{y=1}^n y\frac{n(n-1)!}{y(y-1)!(n-y)!}p^y(1-p)^{n-y}\]
You can now cancel out y and get this:
\[\sum_{y=1}^n \frac{n(n-1)!}{(y-1)!(n-y)!}p^y(1-p)^{n-y}\]
Since n is a constant you can move it in front of the sum. In addition, you want to move one factor of p in front of the sum, so the exponent of p drops from y to y-1. We now get the following formula:
\[np\sum_{y=1}^n \frac{(n-1)!}{(y-1)!(n-y)!}p^{y-1}(1-p)^{n-y}\]
You can rewrite (n-y)! as ((n-1)-(y-1))! because n-y = (n-1)-(y-1); they are exactly the same value, but this form makes the next steps easier. The formula is now:
\[np\sum_{y=1}^n \frac{(n-1)!}{(y-1)!((n-1)-(y-1))!}p^{y-1}(1-p)^{(n-1)-(y-1)}\]
To make it look cleaner we can assign n-1 and y-1 to the single values m and z:
\[m = n-1\] \[z = y-1\]
Substituting z and m into the formula gives:
\[np\sum_{z=0}^m \frac{m!}{z!(m-z)!}p^z(1-p)^{m-z}\]
\[np\sum_{z=0}^m\left(\begin{array}{c}m\\ z\end{array}\right)p^z(1-p)^{m-z}\]
Using the binomial expansion you can now get rid of the summation; the binomial theorem states that:
\[\sum_{i=0}^n\left(\begin{array}{c}n\\ i\end{array}\right)x^{i}y^{n-i} = (x+y)^n\]
The formula we found looks exactly like the left-hand side of the binomial theorem with:
\[x = p\] \[y = (1-p)\]
so we can get rid of the summation:
\[np\sum_{z=0}^m\left(\begin{array}{c}m\\ z\end{array}\right)p^z(1-p)^{m-z} = np(p+(1-p))^m\]
Since (p+(1-p)) is equal to 1 you get:
\[np(p+(1-p))^m = np(1)^m = np\]
Hurray! You have now proven that the expected value of the binomial distribution is equal to np.
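The result is easy to sanity-check numerically: compute E(Y) directly from the definition and compare it with np. A quick sketch, assuming Python 3.8+ for `math.comb`:

```python
import math

n, p = 20, 0.8

# E(Y) from the definition: sum over all y of y * p(y)
ey = sum(y * math.comb(n, y) * p**y * (1 - p)**(n - y) for y in range(n + 1))

print(ey, n * p)   # both should equal 16, up to floating-point rounding
```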

Proof that the variance V(Y) of the binomial distribution is equal to np(1-p)

For this proof we assume that you already know that E(Y) is equal to np.
We will also use the fact that the variance can be written as the expected value of the squared observations minus the square of the expected value:
\[V(Y) = E(Y^2)-E(Y)^2\]
If E(Y) = np, then E(Y)^2 is equal to (np)^2 = n^2p^2, so we only need to find the value of E(Y^2) for the binomial distribution:
\[E(Y^2) = \sum_{y=0}^n y^2 \left(\begin{array}{c}n\\ y\end{array}\right)p^y(1-p)^{n-y}\]
\[\left(\begin{array}{c}n\\ y\end{array}\right)=\frac{n!}{y!(n-y)!}\]
Now we rewrite y! and n! as
\[y! = y(y-1)!\] \[n! = n(n-1)!\]
The y^2 factor is hard to get rid of directly. To make it easier we instead find the expected value E(Y^2-Y):
\[E(Y^2-Y) = E\left[Y(Y-1)\right]\]
Since the y = 0 and y = 1 terms are zero, the sum can start at y = 2, and the formula for E[Y(Y-1)] looks like this:
\[E\left[Y(Y-1)\right] = \sum_{y=2}^n y(y-1) \frac{n(n-1)!}{y(y-1)!(n-y)!}p^y(1-p)^{n-y}\]
And with some hot math magic you can now cancel out y and get the following:
\[\sum_{y=2}^n (y-1)\frac{n(n-1)!}{(y-1)!(n-y)!}p^y(1-p)^{n-y}\]
(y-1)! can be rewritten as (y-1)(y-2)!:
\[\sum_{y=2}^n (y-1)\frac{n(n-1)!}{(y-1)(y-2)!(n-y)!}p^y(1-p)^{n-y}\]
You can now cancel (y-1) and get the following formula:
\[\sum_{y=2}^n \frac{n(n-1)!}{(y-2)!(n-y)!}p^y(1-p)^{n-y}\]
Now we have (y-2)! in the denominator; it would be cool if we also made the numerator contain (n-2)!, so use the following:
\[n(n-1)! = n(n-1)(n-2)!\]
You can now move n and (n-1) in front of the sum because they are constants, and we are left with (n-2)! in the numerator:
\[n(n-1)\sum_{y=2}^n \frac{(n-2)!}{(y-2)!(n-y)!}p^y(1-p)^{n-y}\]
It would also be cool if we made (n-y)! equal to ((n-2)-(y-2))!; conveniently,
\[(n-y)! = ((n-2)-(y-2))!\]
since n-y = (n-2)-(y-2). You can verify that if you don't trust me.

Now the formula looks like this:
\[n(n-1)\sum_{y=2}^n \frac{(n-2)!}{(y-2)!((n-2)-(y-2))!}p^y(1-p)^{n-y}\]
Don't forget to change the exponent n-y on the right to (n-2)-(y-2):
\[n(n-1)\sum_{y=2}^n \frac{(n-2)!}{(y-2)!((n-2)-(y-2))!}p^y(1-p)^{(n-2)-(y-2)}\]
Now we have n-2 and y-2 almost everywhere; however, there is one mean boy left, which is p^y; we need him to become p^{y-2}. To do so we remove 2 from the exponent and move p^2 in front of the sum:
\[n(n-1)p^2\sum_{y=2}^n \frac{(n-2)!}{(y-2)!((n-2)-(y-2))!}p^{y-2}(1-p)^{(n-2)-(y-2)}\]
Now we have n-2 and y-2 everywhere. To make everything look simpler we can assign the following values:
\[m = n-2\] \[z = y-2\]
The equation now looks like this:
\[n(n-1)p^2\sum_{z=0}^m \frac{m!}{z!(m-z)!}p^{z}(1-p)^{m-z}\]
Does this remind you of something? Hint:
\[\sum_{i=0}^n\left(\begin{array}{c}n\\ i\end{array}\right)x^{i}y^{n-i} = (x+y)^n\]
Using the binomial expansion with
\[x = p\] \[y = (1-p)\]
we can get rid of the summation:
\[n(n-1)p^2\sum_{z=0}^m\left(\begin{array}{c}m\\ z\end{array}\right)p^z(1-p)^{m-z} = n(n-1)p^2(p+(1-p))^m\]
Since (p+(1-p)) is equal to 1 you get:
\[n(n-1)p^2(p+(1-p))^m = n(n-1)p^2\]
Now we have found the following:
\[E(Y^2-Y)=n(n-1)p^2\]
From the properties of expected values we know that:
\[E(Y^2-Y)=E(Y^2) - E(Y)\]
From the definition of the variance V(Y) we know that:
\[V(Y) = E(Y^2)-E(Y)^2\]
But we computed E(Y^2-Y) instead of E(Y^2) to keep the proof simple. Now we need to get rid of the -Y in E(Y^2-Y)... Luckily it is easy: just add E(Y) = np back:
\[E(Y^2) = E(Y^2-Y) + E(Y)\]
\[E(Y^2) = n(n-1)p^2+np\]
Now that we have finally found what E(Y^2) is equal to, we can plug it into the variance formula:
\[V(Y) = E(Y^2)-E(Y)^2\]
\[V(Y) = n(n-1)p^2+np-n^2p^2\]
Well, now we need to simplify it...

Step 1: factor np out of the whole expression:
\[n(n-1)p^2+np-n^2p^2 = np \times ((n-1)p + 1 - np)\]
Step 2: multiply (n-1) with p to get np-p, which leads to the following expression:
\[np \times (np-p+1-np)\]
Step 3: the positive np and the negative np cancel out:
\[np \times (-p+1)\]
Step 4: (-p+1) is the same as (1-p), thus we get:
\[np(1-p)\]
Hurray! You have now successfully proven that the variance of the binomial distribution is equal to np(1-p).
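As with the expected value, each step of this proof (E(Y^2-Y) = n(n-1)p^2, E(Y^2) = n(n-1)p^2 + np, and V(Y) = np(1-p)) can be sanity-checked numerically. A quick sketch, assuming Python 3.8+ for `math.comb`:

```python
import math

n, p = 20, 0.8
pmf = [math.comb(n, y) * p**y * (1 - p)**(n - y) for y in range(n + 1)]

ey  = sum(y * pmf[y] for y in range(n + 1))        # E(Y)
ey2 = sum(y * y * pmf[y] for y in range(n + 1))    # E(Y^2)
var = ey2 - ey**2                                  # V(Y) = E(Y^2) - E(Y)^2

print(ey2 - ey, n * (n - 1) * p**2)   # E(Y^2 - Y) vs n(n-1)p^2
print(var, n * p * (1 - p))           # both should equal 3.2
```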