A large number of multivariate reliability models, and in particular multivariate exponential distributions, have been formulated over the years. Our goal in this section and the next is not an exhaustive review of this literature, but rather to consider a few models that can be generalized to, or have special connections to, the semigroup and graph theory in this text. We start once again with the standard space \(([0, \infty), +, \lambda)\) where \(\lambda\) is Lebesgue measure on the \(\sigma\)-algebra \(\ms B\) of Borel subsets of \([0, \infty)\). The corresponding graph, of course, is \(([0, \infty), \le)\). For \(n \in \N_+\), let \(\left([0, \infty)^n, +, \lambda^n\right)\) denote the power space of order \(n\). The corresponding graph \(\left([0, \infty)^n, \le^n\right)\) is the power graph of \(([0, \infty), \le)\) of order \(n\). Since \(\le\) is a total order on \([0, \infty)\), our usual lattice notation becomes \(x \wedge y = \min\{x, y\}\) and \(x \vee y = \max\{x, y\}\) for \(x, \, y \in [0, \infty)\).
In the most general sense, a random vector \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a multivariate exponential distribution on \([0, \infty)^n\) if \(X_i\) has an exponential distribution on \(([0, \infty), +)\) for each \(i \in \{1, 2, \ldots, n\}\).
That is, \(X_i\) has an ordinary exponential distribution on \([0, \infty)\) for each \(i \in \{1, 2, \ldots, n\}\). Of course, the statement that \(\bs{X}\) has a multivariate exponential distribution on \([0, \infty)^n\) does not mean that \(\bs{X}\) has an exponential distribution on the semigroup \( \left([0, \infty)^n, +\right) \), except of course when \(n = 1\). We know the full story:
Random vector \(\bs{X}\) has an exponential distribution on \( \left([0, \infty)^n, +\right) \) if and only if \(X_1, X_2, \ldots, X_n\) are independent and \(X_i\) has an exponential distribution on \(([0, \infty), +)\) for each \(i \in \{1, 2, \ldots, n\}\).
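To illustrate one direction of this characterization, recall that a vector of independent exponential coordinates with rates \(r_1, \ldots, r_n\) has reliability function \(H(x_1, \ldots, x_n) = \exp\left(-\sum_i r_i x_i\right)\), which factors over the semigroup operation: \(H(x_1 + y_1, \ldots, x_n + y_n) = H(x_1, \ldots, x_n) H(y_1, \ldots, y_n)\). The following is a minimal numerical sketch of this factorization, assuming NumPy; the rates and test points are arbitrary illustrative values.

```python
import numpy as np

# Independent exponential coordinates with rates r_1, ..., r_n have joint
# reliability function H(x) = exp(-(r_1 x_1 + ... + r_n x_n)), so that
# H(x + y) = H(x) H(y).  The rates below are arbitrary illustrative values.
r = np.array([1.0, 2.0, 0.5])

def H(x):
    return np.exp(-np.dot(r, x))

x = np.array([0.3, 1.1, 0.2])
y = np.array([0.7, 0.4, 1.5])
print(H(x + y), H(x) * H(y))   # the two values agree
```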
The definition above is too weak, and the proposition too strong, to yield an interesting multivariate reliability model. Other conditions need to be imposed on \(\bs{X}\), often multivariate memoryless conditions of some sort. Among the best known of the multivariate exponential distributions are the Marshall-Olkin distributions. These distributions were discussed in Section 2.7, since they can be defined in the setting of a general positive semigroup \((S, \cdot)\) whose partial order graph \((S, \preceq)\) is a lattice. In this section we review the results for the standard setting above (the original setting of Marshall and Olkin), including results that do not make sense in the general semigroup setting. We start with the bivariate case, where the notation and results are simplest, before moving on to the general multivariate case.
Suppose that \(U, \, V, \, W\) are independent and have exponential distributions on \(([0, \infty), +)\), with rates \(\alpha, \, \beta, \, \delta \in (0, \infty)\) respectively. Let \(X = U \wedge W\) and \(Y = V \wedge W\). Then \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \(\alpha, \, \beta, \, \delta\).
The motivation for the definition is a two-component system. Random variable \(U\) is the arrival time of a shock
that is fatal for component 1, but not component 2; random variable \(V\) is the arrival time of a shock that is fatal for component 2, but not component 1; and random variable \(W\) is the arrival time of an event that is fatal for both components.
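The construction is easy to simulate. The following is a minimal sketch, assuming NumPy; the rates \(\alpha = 1\), \(\beta = 2\), \(\delta = 1/2\) and the sample size are chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, delta = 1.0, 2.0, 0.5   # illustrative rates
m = 100_000

# Independent exponential shock times: U is fatal for component 1 only,
# V for component 2 only, W for both.  NumPy uses scale = 1/rate.
U = rng.exponential(1 / alpha, m)
V = rng.exponential(1 / beta, m)
W = rng.exponential(1 / delta, m)

# Marshall-Olkin lifetimes
X = np.minimum(U, W)
Y = np.minimum(V, W)

# X = U ^ W is exponential with rate alpha + delta, Y = V ^ W with rate beta + delta
print(X.mean(), 1 / (alpha + delta))
print(Y.mean(), 1 / (beta + delta))
```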
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \( \alpha, \, \beta, \, \delta \in (0, \infty) \). Then each of the following has an exponential distribution on \( ([0, \infty), +) \).
In particular, \( (X, Y) \) has a bivariate exponential distribution in the sense of the definition above.
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then \((X, Y)\) has reliability function \(H\) for \( \left([0, \infty)^2, +\right) \) given by \[H(x, y) = \exp[-\alpha x - \beta y - \delta (x \vee y)], \quad (x, y) \in [0, \infty)^2 \]
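As a quick numerical check of this formula, continuing the simulation sketch above (the evaluation point \((x_0, y_0)\) is arbitrary):

```python
import math

def H(x, y, alpha=1.0, beta=2.0, delta=0.5):
    """Bivariate Marshall-Olkin reliability function, same rates as above."""
    return math.exp(-alpha * x - beta * y - delta * max(x, y))

x0, y0 = 0.5, 0.25                      # arbitrary test point
print(H(x0, y0))                        # formula
print(np.mean((X > x0) & (Y > y0)))     # empirical P(X > x0, Y > y0)
```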
Suppose that \((X, Y)\) has reliability function \(H\) on \( \left([0, \infty)^2, + \right) \). Then \((X, Y)\) has a Marshall-Olkin distribution if and only if \((X, Y)\) has a bivariate exponential distribution as in the definition above and satisfies the partial memoryless property
\[ H(t + x, t + y) = H(t, t) H(x, y), \quad x, \, y, \, t \in [0, \infty) \]
Let \(\Delta = \{(x, x): x \in [0, \infty)\}\) denote the diagonal of \([0, \infty)^2\). The sub-semigroup \((\Delta, +)\) is isomorphic to \(([0, \infty), +)\) with invariant measure \(\lambda_1\) given by \(\lambda_1(A) = \lambda(A_1)\) for measurable \(A \subseteq \Delta\) where \(A_1 = \{x \in [0, \infty): (x, x) \in A\}\). A Marshall-Olkin distribution on \([0, \infty)^2\) has a mixed distribution, with positive probability on \(\Delta\).
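Returning to the partial memoryless property: it follows from the form of \(H\) above together with the identity \((t + x) \vee (t + y) = t + (x \vee y)\). A spot check, reusing the function \(H\) from the previous sketch with arbitrary test values:

```python
x, y, t = 1.3, 0.7, 2.1        # arbitrary values
print(H(t + x, t + y))         # equals the product below
print(H(t, t) * H(x, y))
```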
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \([0, \infty)^2\) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then
This follows from standard results on independent exponential variables, by considering the 6 possible orderings of \((U, V, W)\):
So in particular \[\P(X = Y) = \frac{\delta}{\alpha + \beta + \delta}\]
Suppose again that \((X, Y)\) has the Marshall-Olkin distribution on \(([0, \infty)^2, +)\) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then
Parts (a) and (b) follow immediately from the marginal distributions above. Part (c) can be computed using the (mixed) density function, but a conditioning argument, as in the proof of the previous result, also works: \begin{align*} \E(XY, U \lt V \lt W) &= \frac{\alpha^2 \beta + 3 \alpha \beta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\beta + \delta)^2} \\ \E(XY, V \lt U \lt W) &= \frac{\alpha \beta^2 + 3 \alpha^2 \beta + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\alpha + \delta)^2} \\ \E(XY, U \lt W \lt V) &= \frac{\alpha^2 \delta + 3 \alpha \delta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\beta + \delta)^2} \\ \E(XY, V \lt W \lt U) &= \frac{\beta^2 \delta + 3 \beta \delta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\alpha + \delta)^2} \\ \E(XY, W \lt U \wedge V) &= \frac{2 \delta}{(\alpha + \beta + \delta)^3} \end{align*} Summing and simplifying gives \[\E(X Y) = \frac{1}{\alpha + \beta + \delta} \left(\frac{1}{\alpha + \delta} + \frac{1}{\beta + \delta}\right)\] and then part (c) follows.
It's interesting that \(\cor(X, Y) = \P(X = Y)\).
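Both facts are easy to check empirically, continuing the bivariate simulation sketch above; with the illustrative rates used there, \(\delta / (\alpha + \beta + \delta) = 1/7\).

```python
# X = Y exactly when the common shock W arrives first, so ties occur with
# positive probability and can be counted by exact equality of the floats.
print(np.mean(X == Y))                    # ~ 1/7
print(np.corrcoef(X, Y)[0, 1])            # also ~ 1/7
print(delta / (alpha + beta + delta))     # = 0.142857...
```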
The extension of the Marshall-Olkin distribution to higher dimensions is a bit complicated and requires some additional notation to state the definition and results cleanly, just as in Section 2.7 in the abstract setting. For this subsection, fix \(n \in \N_+\). Let \(B_n\) denote the set of bit strings of length \(n\), excluding the all 0 string.
Suppose that \(\{Z_b: b \in B_n\}\) is a collection of independent variables, and that \(Z_b\) has the exponential distribution on \(([0, \infty), +)\) with rate \(\alpha_b \in (0, \infty)\). Define \[ X_i = \min\{Z_b: b \in B_n, b_i = 1\}, \quad i \in \{1, 2, \ldots, n\} \] Then \((X_1, X_2, \ldots, X_n)\) has the Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b: b \in B_n\}\).
So a collection of \(2^n - 1\) independent, exponential variables on \(([0, \infty), +)\) is required for the construction of the Marshall-Olkin variable on \(([0, \infty)^n, +)\). The marginal distributions are of the same type.
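A minimal sketch of this construction, with \(n = 3\), bit strings represented as tuples of 0s and 1s, and arbitrary illustrative rates \(\alpha_b\) (assuming NumPy). The final loop checks the one-dimensional marginal distributions described in the next result.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 100_000

# B_n: bit strings of length n, excluding the all-zero string
B = [b for b in itertools.product((0, 1), repeat=n) if any(b)]

# Arbitrary illustrative rates alpha_b
alpha = {b: 0.5 + 0.25 * sum(b) for b in B}

# Independent exponential variables Z_b with rate alpha_b (scale = 1/rate)
Z = {b: rng.exponential(1 / alpha[b], m) for b in B}

# X_i = min of Z_b over the bit strings b with b_i = 1
X = np.column_stack([np.min([Z[b] for b in B if b[i] == 1], axis=0)
                     for i in range(n)])

# Each X_i is exponential with rate sum of alpha_b over b with b_i = 1
for i in range(n):
    rate_i = sum(alpha[b] for b in B if b[i] == 1)
    print(X[:, i].mean(), 1 / rate_i)
```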
Suppose that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\), and that \((j_1, j_2, \ldots, j_k)\) is a subsequence of \((1, 2, \ldots, n)\). Then
In particular, \(X_i\) has an exponential distribution on \(([0, \infty), +)\) with rate \(\sum\{\alpha_b: b \in B_n, b_i = 1\}\), so \(\bs{X}\) has a multivariate exponential distribution in the sense of the definition above. From part (c), \(\bs{X}\) has positive probability on each diagonal hyperplane in \([0, \infty)^n\).
Suppose that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\). Let \(H\) denote the reliability function of \(\bs{X}\) on \(([0, \infty)^n, +)\). Then \[H(x_1, x_2, \ldots, x_n) = \exp\left(-\sum_{b \in B_n} \alpha_b \, \max\{x_i: b_i = 1\}\right), \quad (x_1, x_2, \ldots, x_n) \in [0, \infty)^n \]
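The reliability function is a direct transcription of this formula into code. The sketch below continues the multivariate example above, comparing the formula with an empirical survival probability at an arbitrary point.

```python
def H(x, alpha):
    """Marshall-Olkin reliability function on [0, inf)^n; alpha maps each
    bit string b (a tuple) to its rate alpha_b."""
    return np.exp(-sum(a * max(x[i] for i in range(len(x)) if b[i] == 1)
                       for b, a in alpha.items()))

x0 = (0.5, 1.0, 0.25)                               # arbitrary test point
print(H(x0, alpha))                                 # formula
print(np.mean(np.all(X > np.array(x0), axis=1)))    # empirical P(X_1 > x_1, ..., X_n > x_n)
```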
The generalization of the partial memoryless property is straightforward.
Suppose again that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with reliability function \(H\). Then \(\bs{X}\) has the partial memoryless property \[ H(t + x_1, t + x_2, \ldots, t + x_n) = H(t, t, \ldots, t) H(x_1, x_2, \ldots, x_n), \quad t \in [0, \infty), \, (x_1, x_2, \ldots, x_n) \in [0, \infty)^n \]
Suppose again that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\). For distinct \(i, \, j \in \{1, 2, \ldots, n\}\), \[\cor(X_i, X_j) = \P(X_i = X_j) = \frac{\sum\left\{\alpha_b: b \in B_n, b_i = b_j = 1\right\}} {\sum\left\{\alpha_b: b \in B_n, b_i = 1 \text{ or } b_j = 1\right\}}\]
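Finally, continuing the same multivariate sketch, the pairwise correlation formula can be checked against empirical values.

```python
def pair_cor(i, j, alpha):
    """cor(X_i, X_j) = P(X_i = X_j) from the formula above."""
    both = sum(a for b, a in alpha.items() if b[i] == 1 and b[j] == 1)
    either = sum(a for b, a in alpha.items() if b[i] == 1 or b[j] == 1)
    return both / either

print(pair_cor(0, 1, alpha))                   # formula
print(np.corrcoef(X[:, 0], X[:, 1])[0, 1])     # empirical correlation
print(np.mean(X[:, 0] == X[:, 1]))             # empirical P(X_1 = X_2)
```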