The sine distribution and the sine-square distribution are simple continuous distributions based on the sine function. The sine distribution is also known as Gilbert's sine distribution, named for the American geologist Grove Karl (GK) Gilbert who used the distribution in 1892 to study craters on the moon. Both the sine and sine-square distributions arise in the study of finite paths in the context of abstract reliability theory.
The (standard) sine distribution is a continuous distribution on \( [0, 1] \) with probability density function \(f\) given by \[f(x) = \frac{\pi}{2} \sin(\pi x), \quad x \in [0, 1] \]
From simple calculus, \( f \) is a probability density function: \( \sin(\pi x) \ge 0 \) for \( x \in [0, 1] \) and \[ \int_0^1 \sin(\pi x) \, dx = \frac{2}{\pi} \] The qualitative properties also follow from basic calculus: \( f \) is symmetric about \( x = \frac{1}{2} \), increases on \( \left[0, \frac{1}{2}\right] \) and decreases on \( \left[\frac{1}{2}, 1\right] \) with mode at \( x = \frac{1}{2} \), and is concave, since \begin{align} f^\prime(x) & = \frac{\pi^2}{2} \cos(\pi x), \quad x \in [0, 1] \\ f^{\prime \prime}(x) & = -\frac{\pi^3}{2} \sin(\pi x), \quad x \in [0, 1] \end{align}
Open the special distribution simulator and select the sine distribution. Note the shape and location of the probability density function. Run the simulation 1000 times and compare the empirical density function to the probability density function.
Since the density function is bounded on a bounded interval, the rejection method of simulation can be used.
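For readers who want to experiment outside the apps, here is a minimal sketch of the rejection method for the standard sine density, assuming NumPy is available; the function name rejection_sample_sine and the sample size are just for illustration.

```python
import numpy as np

rng = np.random.default_rng()

def rejection_sample_sine(n):
    """Sample n values from the standard sine distribution by rejection.

    Proposal points are uniform on the rectangle [0, 1] x [0, pi/2],
    which encloses the graph of f(x) = (pi/2) sin(pi x).
    """
    samples = []
    while len(samples) < n:
        x = rng.uniform(0.0, 1.0, size=n)        # candidate values
        y = rng.uniform(0.0, np.pi / 2, size=n)  # uniform heights
        samples.extend(x[y <= (np.pi / 2) * np.sin(np.pi * x)].tolist())
    return np.array(samples[:n])

sample = rejection_sample_sine(1000)
print(sample.mean(), sample.std())  # roughly 1/2 and 0.2176
```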
Open the rejection method app and select the sine distribution. Note the shape and location of the probability density function. Run the simulation 1000 times and compare the empirical density function to the probability density function.
The distribution function \(F\) is given by \( F(x) = \frac{1}{2} [1 - \cos(\pi x)]\) for \( x \in [0, 1] \).
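In more detail, the distribution function is obtained by integrating the density: \[ F(x) = \int_0^x \frac{\pi}{2} \sin(\pi t) \, dt = \left[ -\frac{1}{2} \cos(\pi t) \right]_0^x = \frac{1}{2}\left[1 - \cos(\pi x)\right], \quad x \in [0, 1] \]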
The quantile function \(F^{-1}\) is given by \( F^{-1}(p) = \frac{1}{\pi} \arccos(1 - 2 p) \) for \( p \in [0, 1] \).
Open the quantile app and select the sine distribution. Note the shape of the distribution function. Compute the quantiles of order 0.1 and 0.9.
Suppose that \( X \) has the standard sine distribution. The moment generating function can be expressed in closed form.
The moment generating function \( m \) of \( X \) is given by \[ m(t) = \E\left(e^{t X}\right) = \frac{\pi^2 (1 + e^t)}{2(t^2 + \pi^2)}, \quad t \in \R \]
Note first that \[ m(t) = \frac{\pi}{2} \int_0^1 e^{t x} \sin(\pi x) \, dx \] Integrating by parts with \( u = e^{t x} \) and \( dv = \sin(\pi x) \, dx \) gives \[ m(t) = \frac{1}{2} (1 + e^t) + \frac{t}{2} \int_0^1 e^{t x} \cos(\pi x) \, dx \] Integrating by parts again with \( u = e^{t x} \) and \( dv = \cos(\pi x) \, dx \) gives \[ m(t) = \frac{1}{2} (1 + e^t) - \frac{t^2}{\pi^2} m(t) \] Solving for \( m(t) \) gives the result.
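As a quick numerical sanity check on the closed form (not part of the proof), it can be compared with direct numerical integration of the defining integral; the sketch below assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.integrate import quad

def mgf_integral(t):
    # E(e^{tX}) by numerically integrating e^{tx} (pi/2) sin(pi x) over [0, 1]
    value, _ = quad(lambda x: np.exp(t * x) * (np.pi / 2) * np.sin(np.pi * x), 0.0, 1.0)
    return value

def mgf_closed_form(t):
    # m(t) = pi^2 (1 + e^t) / (2 (t^2 + pi^2))
    return np.pi**2 * (1 + np.exp(t)) / (2 * (t**2 + np.pi**2))

for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    print(t, mgf_integral(t), mgf_closed_form(t))  # the two values should agree
```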
The moments of all orders exist, but a general formula is complicated and involves special functions. However, the mean and variance are easy to compute.
The mean and variance of \( X \) are
(a) \( \E(X) = \frac{1}{2} \)
(b) \( \var(X) = \frac{1}{4} - \frac{2}{\pi^2} \)
Part (a) follows from the symmetry of the distribution about \( x = \frac{1}{2} \); part (b) follows from the usual computational formula for variance since \( \E(X^2) = \frac{\pi}{2} \int_0^1 x^2 \sin(\pi x) \, dx = \frac{1}{2} - \frac{2}{\pi^2} \).
Of course, the mean and variance could also be obtained by differentiating the MGF.
Numerically, \( \sd(X) \approx 0.2176 \).
Open the special distribution simulator and select the sine distribution. Note the position and size of the mean \(\pm \) standard deviation bar. Run the simulation 1000 times and compare the empirical mean and standard deviation to the distribution mean and standard deviation.
The skewness and kurtosis of \(X\) are \( \skw(X) = 0 \) and \[ \kur(X) = \frac{\pi^4 - 48 \pi^2 + 384}{(\pi^2 - 8)^2} \] The skewness is 0 by the symmetry of the distribution about \( x = \frac{1}{2} \).
Numerically, \( \kur(X) \approx 2.1938 \).
Since the distribution function and the quantile function have closed form representations, the standard sine distribution has the usual connection to the standard uniform distribution via the distribution and quantile functions.
Connections to the standard uniform distribution:
(a) If \(U\) has the standard uniform distribution, then \(X = F^{-1}(U) = \frac{1}{\pi} \arccos(1 - 2 U)\) has the standard sine distribution.
(b) If \(X\) has the standard sine distribution, then \(U = F(X) = \frac{1}{2}\left[1 - \cos(\pi X)\right]\) has the standard uniform distribution.
Part (a) of course leads to the random quantile method of simulation.
Open the random quantile simulator and select the sine distribution. Note the shape of the distribution and density functions. Run the simulation 1000 times and note the random quantiles. Compare the empirical density function to the probability density function.
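For readers working outside the app, here is a minimal sketch of the random quantile method for the standard sine distribution, assuming NumPy is available.

```python
import numpy as np

rng = np.random.default_rng()

# Random quantile method: if U is uniformly distributed on [0, 1], then
# X = F^{-1}(U) = (1/pi) arccos(1 - 2U) has the standard sine distribution.
u = rng.uniform(0.0, 1.0, size=1000)
x = np.arccos(1.0 - 2.0 * u) / np.pi

print(x.mean(), x.std())  # compare with mean 1/2 and standard deviation 0.2176
```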
The (standard) sine-square distribution is a continuous distribution on \([0, 1]\) with probability density function \(f\) given by \[ f(x) = 2 \sin^2(\pi x) = 1 - \cos(2 \pi x), \quad x \in [0, 1]\]
The two versions of \(f\) follow from a standard trig identity. Using the second version, \(\int_0^1 f(x) \, dx = 1\). The shape properties follow from simple calculus: \(f\) is symmetric about \(x = \frac{1}{2}\), increases on \(\left[0, \frac{1}{2}\right]\) and decreases on \(\left[\frac{1}{2}, 1\right]\) with mode at \(x = \frac{1}{2}\), and has inflection points at \(x = \frac{1}{4}\) and \(x = \frac{3}{4}\), since \begin{align*} f^\prime(x) &= 2 \pi \sin(2 \pi x), \quad x \in [0, 1] \\ f^{\prime\prime}(x) &= 4 \pi^2 \cos(2 \pi x), \quad x \in [0, 1] \end{align*}
Open the special distribution simulator and select the sine-square distribution. Note the shape and location of the probability density function. Run the simulation 1000 times and compare the empirical density function to the probability density function.
Since the density function is bounded on a bounded interval, the rejection method of simulation can be used.
Open the rejection method app and select the sine-square distribution. Note again the shape and location of the probability density function. Run the simulation 1000 times and compare the empirical density function to the probability density function.
The distribution function \(F\) is given by \[F(x) = x - \frac{1}{2 \pi} \sin(2 \pi x), \quad x \in [0, 1]\]
This follows from simple calculus and the second version of the PDF, \(f(x) = 1 - \cos(2 \pi x)\) for \(x \in [0, 1]\).
The quantile function does not have a simple, closed-form representation, but by symmetry, the median is \(\frac 1 2\). The first quartile is approximately \(0.3676\) and the third quartile is approximately \(0.6324\).
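Although there is no closed-form quantile function, the quantiles are easy to compute numerically by solving \(F(x) = p\); here is a minimal sketch assuming SciPy is available (the function names are just for illustration).

```python
import numpy as np
from scipy.optimize import brentq

def sine_square_cdf(x):
    # F(x) = x - sin(2 pi x) / (2 pi) for x in [0, 1]
    return x - np.sin(2 * np.pi * x) / (2 * np.pi)

def sine_square_quantile(p):
    # Solve F(x) = p on [0, 1] with a bracketing root finder (0 < p < 1)
    return brentq(lambda x: sine_square_cdf(x) - p, 0.0, 1.0)

for p in [0.1, 0.25, 0.5, 0.75, 0.9]:
    print(p, round(sine_square_quantile(p), 4))
# the quartiles should come out near 0.3676, 0.5, and 0.6324
```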
Open the quantile app and select the sine-square distribution. Note the shape of the distribution function. Compute approximate quantiles of order 0.1 and 0.9.
Suppose that \(X\) has the standard sine-square distribution. The mean and variance of \(X\) are
(a) \(\E(X) = \frac{1}{2}\)
(b) \(\var(X) = \frac{1}{12} - \frac{1}{2 \pi^2}\)
Part (a) follows from symmetry. Part (b) follows from the usual computational formula for variance and standard calculus since \[\E(X^2) = \int_0^1 x^2 [1 - \cos(2 \pi x)] \, dx = \frac 1 3 - \frac{1}{2 \pi^2}\]
Numerically, \( \sd(X) \approx 0.1808 \).
Open the special distribution simulator and select the sine-square distribution. Note the position and size of the mean \(\pm \) standard deviation bar. Run the simulation 1000 times and compare the empirical mean and standard deviation to the distribution mean and standard deviation.
The skewness and kurtosis of \(X\) are
(a) \(\skw(X) = 0\)
(b) \(\kur(X) = \frac{9 (\pi^4 - 20 \pi^2 + 120)}{5 (\pi^2 - 6)^2}\)
Part (a) follows from symmetry. Part (b) follows from the usual computational formula for kurtosis and calculus since \begin{align*} \E(X^3) &= \int_0^1 x^3 [1 - \cos(2 \pi x)] \, dx = \frac{1}{4} - \frac{3}{4 \pi^2} \\ \E(X^4) &= \int_0^1 x^4 [1 - \cos(2 \pi x)] \, dx = \frac{1}{5} - \frac{1}{\pi^2} + \frac{3}{2 \pi^4} \end{align*}
\(X\) has moment generating function \(m\) given by \[m(t) = \E\left(e^{t X}\right) = \frac{4 \pi^2 (e^t - 1)}{t (4 \pi^2 + t^2)}, \quad t \in \R\]
This follows from routine (but tedious) calculus since \[m(t) = \int_0^1 e^{t x}[1 - \cos(2 \pi x)] \, dx \]
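In more detail, \(\int_0^1 e^{t x} \, dx = \frac{e^t - 1}{t}\) and, using the standard antiderivative of \(e^{t x} \cos(2 \pi x)\), \[\int_0^1 e^{t x} \cos(2 \pi x) \, dx = \frac{t \left(e^t - 1\right)}{t^2 + 4 \pi^2}\] Subtracting the two integrals and simplifying gives the closed form above.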
As with so many other standard distributions, the standard sine and sine-square distributions are generalized by adding location and scale parameters.
Suppose that \(X\) has the standard sine distribution (sine-square distribution). For \(a \in \R\) and \( b \in (0, \infty) \), random variable \( Y = a + b X \) has the sine distribution (sine-square distribution) with location parameter \(a\) and scale parameter \(b\).
In both cases, \(Y\) has a continuous distribution on \([a, a + b]\). The mathematics of the general distributions follows easily from the mathematics of the standard distributions in the subsections above, and basic results for location-scale families. We give a quick review. For the rest of this subsection, suppose that \(X\) and \(Y\) have the distributions in the definition above.
Let \(f\) and \(g\) denote the density functions of \(X\) and \(Y\), respectively. Then \[g(y) = \frac 1 b f\left(\frac{y - a}{b}\right), \quad y \in [a, a + b]\]
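Explicitly, substituting the standard densities gives \[g(y) = \frac{\pi}{2 b} \sin\left(\pi \frac{y - a}{b}\right), \quad y \in [a, a + b]\] for the sine family and \[g(y) = \frac{2}{b} \sin^2\left(\pi \frac{y - a}{b}\right), \quad y \in [a, a + b]\] for the sine-square family.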
Pure scale transformations (\( a = 0 \) and \( b \gt 0 \)) are particularly common, since \( X \) often represents a random angle. The scale transformation with \( b = \pi \) gives the angle in radians. In this case the probability density function of the sine distribution is \( f(x) = \frac{1}{2} \sin(x) \) for \( x \in [0, \pi] \) and the density function of the sine-square distribution is \(f(x) = \frac{2}{\pi} \sin^2(x)\) for \(x \in [0, \pi]\). Since the radian is the standard angle unit, these distributions could also be considered the standard ones. The scale transformation with \( b = 90 \) gives the angle in degrees. In this case, the probability density function of the sine distribution is \( f(x) = \frac{\pi}{180} \sin\left(\frac{\pi}{90} x\right) \) for \( x \in [0, 90] \). This was Gilbert's original formulation.
Let \(F\) and \(G\) denote the distribution functions of \(X\) and \(Y\), respectively. Then \[G(y) = F\left(\frac{y - a}{b}\right), \quad y \in [a, a + b]\]
Let \(F^{-1}\) and \(G^{-1}\) denote the quantile functions of \(X\) and \(Y\), respectively. Then \[G^{-1}(p) = a + b F^{-1}(p), \quad p \in [0, 1]\]
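In particular, for the sine distribution, substituting the standard quantile function gives \[G^{-1}(p) = a + \frac{b}{\pi} \arccos(1 - 2 p), \quad p \in [0, 1]\] which again leads to the random quantile method of simulation.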
However, the quantile function of the general sine-square distribution, like the standard one, does not have a simple, closed form.
The moments of \(X\) and \(Y\) are related as follows: \(\E(Y) = a + b \E(X)\), \(\var(Y) = b^2 \var(X)\), and the skewness and kurtosis of \(Y\) are the same as the skewness and kurtosis of \(X\).
Let \(m\) and \(M\) denote the moment generating functions of \(X\) and \(Y\), respectively. Then \[M(t) = e^{a t} m(b t), \quad t \in \R\]
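For example, combining this with the standard moment generating functions above, the general sine distribution has \[M(t) = e^{a t} \, \frac{\pi^2 \left(1 + e^{b t}\right)}{2 \left(b^2 t^2 + \pi^2\right)}, \quad t \in \R\] and the general sine-square distribution has \[M(t) = e^{a t} \, \frac{4 \pi^2 \left(e^{b t} - 1\right)}{b t \left(4 \pi^2 + b^2 t^2\right)}, \quad t \in \R\]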