\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\var}{\text{var}}\) \(\newcommand{\sd}{\text{sd}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\R}{\mathbb{R}}\) \( \newcommand{\Li}{\text{Li}} \) \( \newcommand{\bs}{\boldsymbol} \) \( \newcommand{\skw}{\text{skew}} \)

The Exponential-Logarithmic Distribution

The exponential-logarithmic distribution arises when the rate parameter of the exponential distribution is randomized by the logarithmic distribution. The exponential-logarithmic distribution has applications in reliability theory in the context of devices or organisms that improve with age, due to hardening or immunity.

The Standard Exponential-Logarithmic Distribution

Distribution Functions

Random variable \( X \) has the standard exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) if \( X \) has a continuous distribution on \( [0, \infty) \) with probability density function \( g \) given by \[ g(x) = -\frac{(1 - p) e^{-x}}{\ln(p)[1 - (1 - p) e^{-x}]}, \quad x \in [0, \infty) \]

\( g \) really is a probability density function for a continuous distribution on \( [0, \infty) \).

  1. \( g \) is decreasing on \( [0, \infty) \) with mode \( x = 0 \).
  2. \( g \) is concave upward on \( [0, \infty) \).
Proof:

Substituting \( u = (1 - p) e^{-x} \), \( du = -(1 - p) e^{-x} dx \) gives \[ \int_0^\infty \frac{(1 - p) e^{-x}}{1 - (1 - p) e^{-x}} dx = \int_0^{1-p} \frac{du}{1 - u} = -\ln(p) \] The shape of the graph follows from \begin{align} g^\prime(x) & = \frac{(1 - p) e^{-x}}{\ln(p) [1 - (1 - p) e^{-x}]^2}, \quad x \in [0, \infty) \\ g^{\prime\prime}(x) & = -\frac{(1 - p) e^{-x} [1 + (1 - p) e^{-x}]}{\ln(p) [1 - (1 - p) e^{-x}]^3}, \quad x \in [0, \infty) \end{align}
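The normalization argument can also be checked numerically. The following Python sketch (the function names are our own) implements \( g \) and verifies that it integrates to 1 for several values of \( p \), using a simple trapezoidal sum over \( [0, 50] \) (the tail beyond 50 is negligible):

```python
import math

def g(x, p):
    # PDF of the standard exponential-logarithmic distribution
    u = (1.0 - p) * math.exp(-x)
    return -u / (math.log(p) * (1.0 - u))

def integrate(f, a, b, n=100_000):
    # simple trapezoidal rule on [a, b]
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

for p in (0.1, 0.5, 0.9):
    total = integrate(lambda x: g(x, p), 0.0, 50.0)
    print(f"p = {p}: integral over [0, 50] = {total:.6f}")
```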

Open the special distribution simulator and select the exponential-logarithmic distribution. Vary the shape parameter and note the shape of the probability density function. For selected values of the shape parameter, run the simulation 1000 times and compare the empirical density function to the probability density function.

\( X \) has distribution function \( G \) given by \[ G(x) = 1 - \frac{\ln\left[1 - (1 - p) e^{-x}\right]}{\ln(p)}, \quad x \in [0, \infty) \]

Proof:

This follows from the same integral substitution used above.

\( X \) has quantile function \( G^{-1} \) given by \[ G^{-1}(u) = \ln\left(\frac{1 - p}{1 - p^{1 - u}}\right) = \ln(1 - p) - \ln\left(1 - p^{1 - u}\right), \quad u \in [0, 1) \]

  1. The first quartile is \( q_1 = \ln(1 - p) - \ln\left(1 - p^{3/4}\right) \).
  2. The median is \( q_2 = \ln(1 - p) - \ln\left(1 - p^{1/2}\right) = \ln\left(1 + \sqrt{p}\right)\).
  3. The third quartile is \( q_3 = \ln(1 - p) - \ln\left(1 - p^{1/4}\right)\).
Proof:

The formula for \( G^{-1} \) follows from the distribution function above by solving \(u = G(x) \) for \( x \) in terms of \( u \).
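As a quick numerical sanity check, the Python sketch below (with our own helper names) verifies that \( G^{-1} \) inverts \( G \), and that the median reduces to \( \ln\left(1 + \sqrt{p}\right) \):

```python
import math

def G(x, p):
    # distribution function of the standard exponential-logarithmic distribution
    return 1.0 - math.log(1.0 - (1.0 - p) * math.exp(-x)) / math.log(p)

def G_inv(u, p):
    # quantile function
    return math.log(1.0 - p) - math.log(1.0 - p ** (1.0 - u))

p = 0.3
for u in (0.25, 0.5, 0.75):
    x = G_inv(u, p)
    print(f"u = {u}: quantile = {x:.5f}, G(quantile) = {G(x, p):.5f}")

# the median simplifies to ln(1 + sqrt(p))
assert math.isclose(G_inv(0.5, p), math.log(1.0 + math.sqrt(p)))
```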

Open the special distribution calculator and select the exponential-logarithmic distribution. Vary the shape parameter and note the shape of the distribution and probability density functions. For selected values of the shape parameter, compute a few values of the distribution function and the quantile function.

\( X \) has reliability function \( G^c \) given by \[ G^c(x) = \frac{\ln\left[1 - (1 - p) e^{-x}\right]}{\ln(p)}, \quad x \in [0, \infty) \]

Proof:

This follows trivially from the distribution function above since \( G^c = 1 - G \).

The standard exponential-logarithmic distribution has decreasing failure rate.

The failure rate function \( r \) of \( X \) is given by \[ r(x) = -\frac{(1 - p) e^{-x}}{\left[1 - (1 - p) e^{-x}\right] \ln\left[1 - (1 - p) e^{-x}\right]}, \quad x \in [0, \infty) \]

  1. \( r \) is decreasing on \( [0, \infty) \).
  2. \( r \) is concave upward on \( [0, \infty) \).
Proof:

Recall that \( r(x) = g(x) \big/ G^c(x) \) so the formula follows from the probability density function and the distribution function given above.

The Polylogarithm

The moments of the standard exponential-logarithmic distribution cannot be expressed in terms of the usual elementary functions, but can be expressed in terms of a special function known as the polylogarithm.

For \( n \in \N \) the polylogarithm of order \( n \) is defined by \[ \Li_n(x) = \sum_{k=1}^\infty \frac{x^k}{k^n}, \quad x \in (-1, 1) \]

Thus, the polylogarithm is a power series, and we will show below that the radius of convergence is 1 for each \( n \).

The polylogarithm functions of orders 0, 1, 2, and 3 are as follows:

  1. The polylogarithm of order 0 is \[ \Li_0(x) = \sum_{k=1}^\infty x^k = \frac{x}{1 - x}, \quad x \in (-1, 1) \]
  2. The polylogarithm of order 1 is \[ \Li_1(x) = \sum_{k=1}^\infty \frac{x^k}{k} = -\ln(1 - x), \quad x \in (-1, 1) \]
  3. The polylogarithm of order 2 is known as the dilogarithm.
  4. The polylogarithm of order 3 is known as the trilogarithm.

Thus, the polylogarithm of order 0 is a simple geometric series, and the polylogarithm of order 1 is the standard power series for the natural logarithm. Note that the probability density function of \( X \) can be written in terms of the polylogarithms of orders 0 and 1: \[ g(x) = -\frac{\Li_0\left[(1 - p) e^{-x}\right]}{\ln(p)} = \frac{\Li_0\left[(1 - p) e^{-x}\right]}{\Li_1(1 - p)}, \quad x \in [0, \infty) \] The most important property of the polylogarithm is given in the following theorem:
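The closed forms for orders 0 and 1 are easy to confirm numerically from the series. A minimal Python sketch, using a truncated partial sum (`polylog` is our own helper, not a library function):

```python
import math

def polylog(n, x, terms=200):
    # partial sum of the series Li_n(x) = sum_{k >= 1} x^k / k^n, for |x| < 1
    return sum(x ** k / k ** n for k in range(1, terms + 1))

x = 0.5
print(polylog(0, x), x / (1.0 - x))        # geometric series x / (1 - x)
print(polylog(1, x), -math.log(1.0 - x))   # -ln(1 - x)
```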

The polylogarithm satisfies the following recursive integral formula: \[ \Li_{n+1}(x) = \int_0^x \frac{\Li_n(t)}{t} dt; \quad n \in \N, \; x \in (-1, 1) \] Equivalently, \( x \, \Li_{n+1}^\prime(x) = \Li_n(x) \) for \( x \in (-1, 1) \) and \( n \in \N \).

Proof:

Recall that a power series may be integrated term by term, and the integrated series has the same radius of convergence. Hence for \(n \in \N \), \[ \int_0^x \frac{\Li_n(t)}{t} dt = \sum_{k=1}^\infty \int_0^x \frac{t^{k-1}}{k^n} dt = \sum_{k=1}^\infty \frac{x^k}{k^{n+1}} = \Li_{n+1}(x), \quad x \in (-1, 1) \] We know that \( \Li_0 \) (the geometric series) has radius of convergence 1, and hence \( \Li_n \) has radius of convergence 1 for each \( n \in \N \).

When \( n \gt 1 \), the polylogarithm series converges at \( x = 1 \) also, and \[ \Li_n(1) = \zeta(n) = \sum_{k=1}^\infty \frac{1}{k^n} \] where \( \zeta \) is the Riemann zeta function, named for Bernhard Riemann. The polylogarithm can be extended to complex orders and defined for complex \( z \) with \( |z| \lt 1 \), but the simpler version suffices for our work here.

Moments

We assume again that \( X \) has the standard exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \).

The moments of \( X \) (about 0) are \[ \E(X^n) = -n! \frac{\Li_{n+1}(1 - p)}{\ln(p)} = n! \frac{\Li_{n+1}(1 - p)}{\Li_1(1 - p)}, \quad n \in \N \]

  1. \( \E(X^n) \to 0 \) as \( p \downarrow 0 \)
  2. \( \E(X^n) \to n! \) as \( p \uparrow 1 \)
Proof:

As noted earlier in the discussion of the polylogarithm, the PDF of \( X \) can be written as \[ g(x) = -\frac{1}{\ln(p)} \sum_{k=1}^\infty (1 - p)^k e^{-kx}, \quad x \in [0, \infty) \] Hence \[ \E(X^n) = -\frac{1}{\ln(p)} \int_0^\infty \sum_{k=1}^\infty (1 - p)^k x^n e^{-k x} dx = -\frac{1}{\ln(p)} \sum_{k=1}^\infty (1 - p)^k \int_0^\infty x^n e^{-k x} dx \] But \( \int_0^\infty x^n e^{-k x} dx = n! \big/ k^{n + 1} \) and hence \[ \E(X^n) = -\frac{1}{\ln(p)} n! \sum_{k=1}^\infty \frac{(1 - p)^k}{k^{n+1}} = - n! \frac{\Li_{n+1}(1 - p)}{\ln(p)}\]

  1. As \( p \downarrow 0 \), the numerator in the last expression for \( \E(X^n) \) converges to \( -n! \zeta(n + 1) \) while the denominator \( \ln(p) \) diverges to \( -\infty \), so the ratio converges to 0.
  2. As \( p \uparrow 1 \), the expression for \( \E(X^n) \) has the indeterminate form \( \frac{0}{0} \). An application of L'Hospital's rule and the derivative rule above gives \[ \lim_{p \uparrow 1} \E(X^n) = \lim_{p \uparrow 1} n! p \frac{\Li_n(1 - p)}{1 - p} \] But from the series definition of the polylogarithm above, \( \Li_n(x) \big/ x \to 1 \) as \( x \to 0 \), so the limit is \( n! \).
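The moment formula can be checked against direct numerical integration. In the Python sketch below (the helper names are our own), the polylogarithm is approximated by a truncated series and the integral \( \int_0^\infty x^n g(x) \, dx \) by a trapezoidal sum:

```python
import math

def polylog(n, x, terms=500):
    # partial sum of the series Li_n(x), adequate for |x| < 1
    return sum(x ** k / k ** n for k in range(1, terms + 1))

def moment(n, p):
    # E(X^n) = -n! Li_{n+1}(1 - p) / ln(p)
    return -math.factorial(n) * polylog(n + 1, 1.0 - p) / math.log(p)

def moment_numeric(n, p, grid=200_000, upper=60.0):
    # trapezoidal approximation of the integral of x^n g(x) over [0, upper]
    def g(x):
        u = (1.0 - p) * math.exp(-x)
        return -u / (math.log(p) * (1.0 - u))
    h = upper / grid
    s = 0.5 * (upper ** n) * g(upper)  # x^n g(x) vanishes at x = 0 for n >= 1
    for i in range(1, grid):
        x = i * h
        s += x ** n * g(x)
    return s * h

p = 0.4
for n in (1, 2):
    print(n, moment(n, p), moment_numeric(n, p))
```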

We will get some additional insight into the asymptotics below when we consider the limiting distribution as \( p \downarrow 0 \) and \( p \uparrow 1 \). The mean and variance of the standard exponential-logarithmic distribution follow easily from the general moment formula.

The mean and variance of \( X \) are

  1. \( \E(X) = - \Li_2(1 - p) \big/ \ln(p) \)
  2. \( \var(X) = -2 \Li_3(1 - p) \big/ \ln(p) - \left[\Li_2(1 - p) \big/ \ln(p)\right]^2 \)

From the asymptotics of the general moments given above, note that \( \E(X) \to 0 \) and \( \var(X) \to 0 \) as \( p \downarrow 0 \), and \( \E(X) \to 1 \) and \( \var(X) \to 1 \) as \( p \uparrow 1 \).

Open the special distribution simulator and select the exponential-logarithmic distribution. Vary the shape parameter and note the size and location of the mean \( \pm \) standard deviation bar. For selected values of the shape parameter, run the simulation 1000 times and compare the empirical mean and standard deviation to the distribution mean and standard deviation.

Related Distributions

The standard exponential-logarithmic distribution has the usual connections to the standard uniform distribution by means of the distribution and quantile functions.

Suppose that \( p \in (0, 1) \).

  1. If \( U \) has the standard uniform distribution then \[ X = \ln\left(\frac{1 - p}{1 - p^U}\right) = \ln(1 - p) - \ln\left(1 - p^U \right) \] has the standard exponential-logarithmic distribution with shape parameter \( p \).
  2. If \( X \) has the standard exponential-logarithmic distribution with shape parameter \( p \) then \[ U = \frac{\ln\left[1 - (1 - p) e^{-X}\right]}{\ln(p)} \] has the standard uniform distribution.
Proof:
  1. Recall that if \( U \) has the standard uniform distribution, then \( G^{-1}(U) \) has the standard exponential-logarithmic distribution with shape parameter \( p \). But \( 1 - U \) also has the standard uniform distribution and hence \( X = G^{-1}(1 - U) \) also has the standard exponential-logarithmic distribution with shape parameter \( p \).
  2. Similarly, if \( X \) has the exponential-logarithmic distribution with shape parameter \( p \) then \( G(X) \) has the standard uniform distribution. Hence \( U = 1 - G(X) \) also has the standard uniform distribution.

Since the quantile function of the standard exponential-logarithmic distribution has a simple closed form, the distribution can be simulated using the random quantile method.
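For example, here is a minimal Python simulation using the random quantile method (the names and the seed are our own choices), comparing the empirical mean with \( -\Li_2(1 - p) \big/ \ln(p) \):

```python
import math
import random

def sample(p, rng):
    # random quantile method: X = ln(1 - p) - ln(1 - p^U), with U uniform on (0, 1)
    u = rng.random()
    return math.log(1.0 - p) - math.log(1.0 - p ** u)

rng = random.Random(17)
p = 0.5
xs = [sample(p, rng) for _ in range(100_000)]

li2 = sum((1.0 - p) ** k / k ** 2 for k in range(1, 200))  # Li_2(1 - p), truncated
print(sum(xs) / len(xs), -li2 / math.log(p))  # empirical vs. exact mean
```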

Open the random quantile experiment and select the exponential-logarithmic distribution. Vary the shape parameter and note the shape of the distribution and probability density functions. For selected values of the parameter, run the simulation 1000 times and compare the empirical density function to the probability density function.

As the name suggests, the standard exponential-logarithmic distribution arises from the exponential distribution and the logarithmic distribution via a certain type of randomization.

Suppose that \( \bs{T} = (T_1, T_2, \ldots) \) is a sequence of independent random variables, each with the standard exponential distribution. Suppose also that \( N \) has the logarithmic distribution with parameter \( 1 - p \in (0, 1) \) and is independent of \( \bs{T} \). Then \( X = \min\{T_1, T_2, \ldots, T_N\} \) has the standard exponential-logarithmic distribution with shape parameter \( p \).

Proof:

It's best to work with reliability functions. For \( n \in \N_+ \), \( \min\{T_1, T_2, \ldots, T_n\} \) has the exponential distribution with rate parameter \( n \), and hence \( \P(\min\{T_1, T_2, \ldots T_n\} \gt x) = e^{-n x} \) for \( x \in [0, \infty) \). Recall also that \( \P(N = n) = -\frac{(1 - p)^n}{n \ln(p)} \) for \( n \in \N_+ \). Hence, using the polylogarithm of order 1 (the standard power series for the logarithm), \[ \P(X \gt x) = \E[\P(X \gt x \mid N)] = -\frac{1}{\ln(p)} \sum_{n=1}^\infty e^{-n x} \frac{(1 - p)^n}{n} = -\frac{1}{\ln(p)} \sum_{n=1}^\infty \frac{\left[e^{-x}(1 - p)\right]^n}{n} = \frac{\ln\left[1 - e^{-x} (1 - p)\right]}{\ln(p)}\] As a function of \( x \), this is the reliability function of the standard exponential-logarithmic distribution with shape parameter \( p \).
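This representation can be checked by simulation. The Python sketch below (the sampling helpers are our own) draws \( N \) from the logarithmic distribution by inverse transform, takes the minimum of \( N \) standard exponentials, and compares the empirical survival probability at one point with the reliability function:

```python
import math
import random

def sample_logarithmic(q, rng):
    # inverse transform for P(N = n) = -q^n / (n ln(1 - q)), n = 1, 2, ...
    u = rng.random()
    n, cdf = 1, 0.0
    while True:
        cdf += -q ** n / (n * math.log(1.0 - q))
        if u <= cdf:
            return n
        n += 1

def sample_min(p, rng):
    # X = min(T_1, ..., T_N), T_i standard exponential, N ~ logarithmic(1 - p)
    n = sample_logarithmic(1.0 - p, rng)
    return min(rng.expovariate(1.0) for _ in range(n))

rng = random.Random(3)
p = 0.4
xs = [sample_min(p, rng) for _ in range(50_000)]

x0 = 1.0
empirical = sum(1 for x in xs if x > x0) / len(xs)
theoretical = math.log(1.0 - (1.0 - p) * math.exp(-x0)) / math.log(p)
print(empirical, theoretical)  # should agree to about two decimal places
```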

Also of interest, of course, are the limiting distributions of the standard exponential-logarithmic distribution as \(p \downarrow 0\) and as \( p \uparrow 1 \).

The standard exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) converges to

  1. Point mass at 0 as \( p \downarrow 0 \).
  2. The standard exponential distribution as \( p \uparrow 1 \).
Proof:

It's slightly easier to work with the reliability function \( G^c \) given above rather than the ordinary (left) distribution function \( G \).

  1. Note that \( G^c(0) = 1 \) for every \( p \in (0, 1) \). On the other hand, if \( x \gt 0 \) then \( G^c(x) \to 0 \) as \( p \downarrow 0 \).
  2. \( G^c(x) \) has the indeterminate form \( \frac{0}{0} \) as \( p \uparrow 1 \). An application of L'Hospital's rule shows that \[ \lim_{p \uparrow 1} G^c(x) = \lim_{p \uparrow 1} \frac{p e^{-x}}{1 - (1 - p) e^{-x}} = e^{-x}, \quad x \in [0, \infty) \] As a function of \( x \), this is the reliability function of the standard exponential distribution.
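The convergence in part (b) is easy to observe numerically; a brief Python sketch evaluating the reliability function at \( x = 1 \) as \( p \uparrow 1 \):

```python
import math

def surv(x, p):
    # reliability function of the standard exponential-logarithmic distribution
    return math.log(1.0 - (1.0 - p) * math.exp(-x)) / math.log(p)

for p in (0.9, 0.99, 0.999, 0.9999):
    print(p, surv(1.0, p))  # approaches e^{-1} = 0.36787...
```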

The General Exponential-Logarithmic Distribution

The standard exponential-logarithmic distribution is generalized, like so many distributions on \( [0, \infty) \), by adding a scale parameter.

If \( Z \) has the standard exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) and \( b \in (0, \infty) \), then \( X = b Z \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \).

As with the exponential distribution, \( 1/b \) is called the rate parameter.

Distribution Functions

Suppose that \( X \) has the exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) and scale parameter \( b \in (0, \infty) \).

\( X \) has probability density function \( f \) given by \[ f(x) = -\frac{(1 - p) e^{-x / b}}{b \ln(p)[1 - (1 - p) e^{-x / b}]}, \quad x \in [0, \infty) \]

  1. \( f \) is decreasing on \( [0, \infty) \) with mode \( x = 0 \).
  2. \( f \) is concave upward on \( [0, \infty) \).
Proof:

Recall that \( f(x) = \frac{1}{b}g\left(\frac{x}{b}\right) \) for \( x \in [0, \infty) \) where \( g \) is the PDF of the standard exponential-logarithmic distribution with shape parameter \( p \), given above.

Open the special distribution simulator and select the exponential-logarithmic distribution. Vary the shape and scale parameters and note the shape and location of the probability density function. For selected values of the parameters, run the simulation 1000 times and compare the empirical density function to the probability density function.

\( X \) has distribution function \( F \) given by \[ F(x) = 1 - \frac{\ln\left[1 - (1 - p) e^{-x / b}\right]}{\ln(p)}, \quad x \in [0, \infty) \]

Proof:

Recall that \( F(x) = G(x / b) \) for \( x \in [0, \infty) \) where \( G \) is the CDF of the standard exponential-logarithmic distribution with shape parameter \( p \), given above.

\( X \) has quantile function \( F^{-1} \) given by \[ F^{-1}(u) = b \ln\left(\frac{1 - p}{1 - p^{1 - u}}\right) = b \left[\ln(1 - p) - \ln\left(1 - p^{1 - u}\right)\right], \quad u \in [0, 1) \]

  1. The first quartile is \( q_1 = b \left[\ln(1 - p) - \ln\left(1 - p^{3/4}\right)\right] \).
  2. The median is \( q_2 = b \left[\ln(1 - p) - \ln\left(1 - p^{1/2}\right)\right] = b \ln\left(1 + \sqrt{p}\right)\).
  3. The third quartile is \( q_3 = b \left[\ln(1 - p) - \ln\left(1 - p^{1/4}\right) \right]\).
Proof:

Recall that \( F^{-1}(u) = b G^{-1}(u) \) where \( G^{-1} \) is the quantile function of the standard exponential-logarithmic distribution, given above.

Open the special distribution calculator and select the exponential-logarithmic distribution. Vary the shape and scale parameters and note the shape and location of the probability density and distribution functions. For selected values of the parameters, compute a few values of the distribution function and the quantile function.

\( X \) has reliability function \( F^c \) given by \[ F^c(x) = \frac{\ln\left[1 - (1 - p) e^{-x / b}\right]}{\ln(p)}, \quad x \in [0, \infty) \]

Proof:

This follows trivially from the distribution function above since \( F^c = 1 - F \).

The exponential-logarithmic distribution has decreasing failure rate.

The failure rate function \( R \) of \( X \) is given by \[ R(x) = -\frac{(1 - p) e^{-x / b}}{b \left[1 - (1 - p) e^{-x / b}\right] \ln\left[1 - (1 - p) e^{-x / b}\right]}, \quad x \in [0, \infty) \]

  1. \( R \) is decreasing on \( [0, \infty) \).
  2. \( R \) is concave upward on \( [0, \infty) \).
Proof:

Recall that \( R(x) = \frac{1}{b} r\left(\frac{x}{b}\right) \) for \( x \in [0, \infty) \), where \( r \) is the failure rate function of the standard exponential-logarithmic distribution given above. Alternately, \( R(x) = f(x) \big/ F^c(x) \).
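As a consistency check, the following Python sketch (the function names are ours) confirms numerically that the formula for \( R \) agrees with \( f \big/ F^c \) at several points:

```python
import math

p, b = 0.25, 2.0

def f(x):
    # probability density function
    u = (1.0 - p) * math.exp(-x / b)
    return -u / (b * math.log(p) * (1.0 - u))

def surv(x):
    # reliability function F^c
    return math.log(1.0 - (1.0 - p) * math.exp(-x / b)) / math.log(p)

def R(x):
    # failure rate function, as stated above
    u = (1.0 - p) * math.exp(-x / b)
    return -u / (b * (1.0 - u) * math.log(1.0 - u))

for x in (0.1, 1.0, 5.0):
    print(x, R(x), f(x) / surv(x))  # the two columns agree
```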

Moments

Suppose again that \( X \) has the exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) and scale parameter \( b \in (0, \infty) \). The moments of \( X \) can be computed easily from the representation \( X = b Z \) where \( Z \) has the standard exponential-logarithmic distribution.

The moments of \( X \) (about 0) are \[ \E(X^n) = -b^n n! \frac{\Li_{n+1}(1 - p)}{\ln(p)}, \quad n \in \N \]

  1. \( \E(X^n) \to 0 \) as \( p \downarrow 0 \)
  2. \( \E(X^n) \to b^n n! \) as \( p \uparrow 1 \)
Proof:

These results follow from basic properties of expected value and the corresponding results for the standard distribution given above. We can write \( X = b Z \) where \( Z \) has the standard exponential-logarithmic distribution with shape parameter \( p \). Hence \( \E(X^n) = b^n \E(Z^n) \).

The mean and variance of \( X \) are

  1. \( \E(X) = - b \Li_2(1 - p) \big/ \ln(p) \)
  2. \( \var(X) = b^2 \left(-2 \Li_3(1 - p) \big/ \ln(p) - \left[\Li_2(1 - p) \big/ \ln(p)\right]^2 \right)\)

From the general moment result above, note that \( \E(X) \to 0 \) and \( \var(X) \to 0 \) as \( p \downarrow 0 \), while \( \E(X) \to b \) and \( \var(X) \to b^2 \) as \( p \uparrow 1 \).

Open the special distribution simulator and select the exponential-logarithmic distribution. Vary the shape and scale parameters and note the size and location of the mean \( \pm \) standard deviation bar. For selected values of the parameters, run the simulation 1000 times and compare the empirical mean and standard deviation to the distribution mean and standard deviation.

Related Distributions

Since the exponential-logarithmic distribution is a scale family for each value of the shape parameter, it is trivially closed under scale transformations.

If \( X \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \), and if \( c \in (0, \infty) \), then \( Y = c X \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b c \).

Once again, the exponential-logarithmic distribution has the usual connections to the standard uniform distribution by means of the distribution and quantile functions.

Suppose that \( p \in (0, 1) \) and \( b \in (0, \infty) \).

  1. If \( U \) has the standard uniform distribution then \[ X = b \left[\ln\left(\frac{1 - p}{1 - p^U}\right)\right] = b \left[\ln(1 - p) - \ln\left(1 - p^U \right)\right] \] has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \).
  2. If \( X \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \), then \[ U = \frac{\ln\left[1 - (1 - p) e^{-X / b}\right]}{\ln(p)} \] has the standard uniform distribution.
Proof:

These results follow from the corresponding result above and the representation \(X = b Z \), where \( Z \) has the standard exponential-logarithmic distribution with shape parameter \( p \).

Again, since the quantile function of the exponential-logarithmic distribution has a simple closed form, the distribution can be simulated using the random quantile method.

Open the random quantile experiment and select the exponential-logarithmic distribution. Vary the shape and scale parameters and note the shape and location of the distribution and probability density functions. For selected values of the parameters, run the simulation 1000 times and compare the empirical density function to the probability density function.

Suppose that \( \bs{T} = (T_1, T_2, \ldots) \) is a sequence of independent random variables, each with the exponential distribution with scale parameter \( b \in (0, \infty) \). Suppose also that \( N \) has the logarithmic distribution with parameter \( 1 - p \in (0, 1) \) and is independent of \( \bs{T} \). Then \( X = \min\{T_1, T_2, \ldots, T_N\} \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \).

Proof:

Note that \( V_i = T_i / b \) has the standard exponential distribution. Hence by the corresponding result above, \( Z = \min\{V_1, V_2, \ldots, V_N\} \) has the standard exponential-logarithmic distribution with shape parameter \( p \). Therefore \( X = b Z \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \).

The limiting distributions as \( p \downarrow 0 \) and as \( p \uparrow 1 \) also follow easily from the corresponding results for the standard case.

For fixed \( b \in (0, \infty) \), the exponential-logarithmic distribution with shape parameter \( p \in (0, 1) \) and scale parameter \( b \) converges to

  1. Point mass at 0 as \( p \downarrow 0 \).
  2. The exponential distribution with scale parameter \( b \) as \( p \uparrow 1 \).
Proof:

Suppose that \( X \) has the exponential-logarithmic distribution with shape parameter \( p \) and scale parameter \( b \), so that \( X = b Z \) where \( Z \) has the standard exponential-logarithmic distribution with shape parameter \( p \). Using the results above,

  1. The distribution of \( Z \) converges to point mass at 0 as \( p \downarrow 0 \) and hence so does the distribution of \( X \).
  2. The distribution of \( Z \) converges to the standard exponential distribution as \( p \uparrow 1 \) and hence the distribution of \( X \) converges to the exponential distribution with scale parameter \( b \).