The random experiment is to toss a coin \(n\) times, where the probability of heads is \(p\). Initially, \(p\) is modeled by random variable \(P_0\), which has a prior beta distribution with left parameter \(a\) and right parameter \(b\). The density function of \(P_0\) is shown in the graph on the left. The prior point estimate of \(p\) is \[U_0 = \E(P_0) = \frac{a}{a + b}\] Random variable \(Y\) is the number of heads, and after the tosses, \(p\) is modeled by random variable \(P_1\), which has the posterior beta probability density function with left parameter \(a + Y\) and right parameter \(b + n - Y\). The density function of \(P_1\) is shown in the graph on the right. On both graphs, the true value of \(p\) is shown as a blue dot on the horizontal axis. The Bayesian (posterior) point estimate of \(p\) is \[U_1 = \E(P_1) = \frac{a + Y}{a + b + n}\] The parameters \(n\), \(p\), \(a\), and \(b\) can be varied with input controls.
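The experiment above can be sketched in code: simulate \(n\) tosses of a \(p\)-coin, count the heads \(Y\), and compute the prior estimate \(U_0 = a/(a+b)\) and the posterior estimate \(U_1 = (a+Y)/(a+b+n)\). This is a minimal illustration, not part of the app itself; the function name `beta_coin_experiment` and the fixed seed are assumptions made here for reproducibility.

```python
import random

def beta_coin_experiment(n, p, a, b, seed=0):
    """Simulate n tosses of a coin with P(heads) = p and return
    (Y, U0, U1): the number of heads, the prior point estimate,
    and the posterior (Bayesian) point estimate of p under a
    beta prior with left parameter a and right parameter b.

    Note: the function name and seed argument are illustrative,
    not part of the applet being described."""
    rng = random.Random(seed)
    y = sum(rng.random() < p for _ in range(n))  # number of heads Y
    u0 = a / (a + b)            # prior estimate  U0 = E(P0)
    u1 = (a + y) / (a + b + n)  # posterior estimate U1 = E(P1)
    return y, u0, u1

# Example: n = 100 tosses, true p = 0.3, symmetric Beta(2, 2) prior.
y, u0, u1 = beta_coin_experiment(100, 0.3, 2, 2)
```

With a symmetric prior (\(a = b\)), \(U_0 = 1/2\); as \(n\) grows, \(U_1\) is dominated by the data and concentrates near the true \(p\).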