As in Section 1, we have three discrete, positive semigroups that are isomorphic to each other: \((R, +)\), \((S, \cdot)\), and \((T, +)\), based on a fixed, countable set \(I\). Recall that \(R\) is the collection of functions \(\bs{n} = (n_i: i \in I)\) from \(I\) into \(\N\) such that \(\sum_{i \in I} n_i \lt \infty\) (or equivalently \(n_i = 0\) for all but finitely many \(i \in I\)). If \(I\) is finite then \(R = \N^I\), the collection of all functions from \(I\) into \(\N\). The operation on \(R\) is ordinary pointwise addition of functions. The set \(S\) is the collection of commutative products of the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\). The representation is unique up to the ordering of the factors, so \((S, \cdot)\) is the arithmetic semigroup with \(I\) as the set of prime elements. Finally, \(T\) is the collection of finite multisets with elements in \(I\). A multiset \(a \in T\) is uniquely determined by its multiplicity function \(\bs n = (n_i: i \in I) \in R\), so that \(n_i\) is the number of times that \(i \in I\) occurs in \(a\). The operation on \(T\) is multiset addition. Associated with each positive semigroup is the corresponding discrete partial order graph: \((R, \le)\), \((S, \preceq)\), and \((T, \subseteq)\). In this section, we will also need the set \[E = \{\bs n \in R: n_i \in \{0, 1\} \text{ for all } i \in I\} = \{x \in S: x \text{ is square free} \} = \{a \in T: a \text{ is an ordinary set} \} \] Our goal in this section is to study probability distributions on the various spaces. The most important special case is when \(I = \{2, 3, 5, 7, \ldots\}\), the ordinary set of prime numbers, so that \(S = \N_+\) and \(\cdot\) is ordinary multiplication. This semigroup \((\N_+, \cdot)\) is the standard arithmetic semigroup. In Section 4, we will study probability distributions on \((S, \cdot)\) that are based on Dirichlet series, which in turn requires the norm structure on \((S, \cdot)\).
As usual, we assume that we have a probability space \((\Omega, \ms F, \P)\) in the background. Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\), so that \(N_i\) is a random variable in \(\N\) for \(i \in I\), and \(\sum_{i \in I} N_i \lt \infty\) with probability 1. Of course, \(\bs N\) defines a random product of prime elements \(X = \prod_{i \in I} i^{N_i}\) in \(S\), and a random, finite multiset \(A\) in \(T\) with multiplicity function \(\bs N\). Our first step is to give a condition so that \(\bs N\), and hence \(X\) and \(A\), are well defined. We start with a standard result from analysis.
Suppose that \(p_i \in (0, 1)\) for \(i \in I\). Then \[\prod_{i \in I} p_i \gt 0 \text{ if and only if } \sum_{i \in I} (1 - p_i) \lt \infty\] Let \(\ms P\) denote the set of all functions \(\bs{p} = (p_i: i \in I)\) from \(I\) into \((0, 1)\) such that the two equivalent conditions are satisfied.
Of course, the condition above is automatically satisfied if \(I\) is finite, so in that case \(\ms P = (0, 1)^I\), the set of all functions from \(I\) into \((0, 1)\).
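The equivalence above is easy to explore numerically. Here is a minimal Python sketch with two illustrative sequences of our own choosing (not from the text): one whose deficiencies \(1 - p_k\) are summable, so the partial products stay away from zero, and one whose deficiencies have a harmonic tail, so the partial products tend to zero.

```python
def partial_product(p, n):
    """Product of p(1) * p(2) * ... * p(n)."""
    prod = 1.0
    for k in range(1, n + 1):
        prod *= p(k)
    return prod

def p_summable(k):
    # 1 - p_k = q^k / (1 + q^k) with q = 1/2: summable, so the product is positive
    return 1 / (1 + 0.5 ** k)

def p_not_summable(k):
    # 1 - p_k = 1 / (k + 1): a harmonic tail, so the product tends to zero
    return 1 - 1 / (k + 1)

for name, p in [("summable", p_summable), ("not summable", p_not_summable)]:
    tail = sum(1 - p(k) for k in range(1, 10_001))
    print(f"{name}: sum of 1 - p_k over 10^4 terms = {tail:.4f}, "
          f"partial product = {partial_product(p, 10_000):.6f}")
```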
Suppose that \(N_i\) is a random variable in \(\N\) for \(i \in I\), and let \(p_i = \P(N_i = 0)\) for \(i \in I\).
From now on, we will assume that a random variable \(\bs N = (N_i: i \in I)\) in \(R\) satisfies the condition in part (a) above, so that \(\bs N\) is well defined, as are the corresponding random variables \(X\) in \(S\) and \(A\) in \(T\). Part (a) of the following result gives a simple interpretation of this condition.
Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) and that \(X\) and \(A\) are the corresponding random variables in \(S\) and \(T\) respectively. Let \(A^\prime = \{i \in I: N_i \gt 0\}\) so that \(A^\prime\) is the (ordinary) set of distinct elements of \(A\) and is the set of prime divisors of \(X\). As above, let \(p_i = \P(N_i = 0)\) for \(i \in I\).
If \(\bs N\) is an independent sequence then
Suppose again that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) and that \(X\) and \(A\) are the corresponding random variables in \(S\) and \(T\) respectively. Let \(M = \sum_{i \in I} N_i\) so that \(M\) is the sum of the prime exponents of \(X\) and is the total number of elements of \(A\). Then
If \(\bs N\) is an independent sequence then
By definition, \(M\) is finite with probability 1, but of course we could still have \(\E(M) = \infty\). Similarly, we could have \(\var(M) = \infty\) even if \(\E(M) \lt \infty\). Next are some simple results involving the set \(E\). We state the results in terms of the random set \(A\).
Suppose that random variable \(\bs N = (N_i: i \in I)\) in \(R\) is an independent sequence and let \(p_i = \P(N_i = 0)\) and \(q_i = \P(N_i = 1)\) for \(i \in I\). Then \begin{align*} \P(A \in E) & = \prod_{i \in I} (p_i + q_i) \\ \P(A = a \mid A \in E) & = \prod_{i \in a} \frac{q_i}{p_i + q_i} \prod_{j \in a^c} \frac{p_j}{p_j + q_j}, \quad a \in E \\ \P(a \subseteq A \mid A \in E) & = \prod_{i \in a} \frac{q_i}{p_i + q_i}, \quad a \in E \end{align*}
The first equation is obvious since \(\P(N_i \in \{0, 1\}) = p_i + q_i\) for \(i \in I\) and the variables are independent. For the second equation, note that \(\P(A = a \mid A \in E) = \P(A = a) / \P(A \in E)\) for \(a \in E\), and \[\P(A = a) = \P(N_i = 1 \text{ for } i \in a \text{ and } N_j = 0 \text{ for } j \in a^c) = \prod_{i \in a} q_i \prod_{j \in a^c} p_j, \quad a \in E\] Similarly, for the third equation, \(\P(a \subseteq A \mid A \in E) = \P(a \subseteq A, A \in E) / \P(A \in E)\) for \(a \in E\), and \[\P(a \subseteq A, A \in E) = \P(N_i = 1 \text{ for } i \in a \text{ and } N_j \in \{0, 1\} \text{ for } j \in a^c) = \prod_{i \in a} q_i \prod_{j \in a^c}(p_j + q_j), \quad a \in E\]
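As a sanity check of these formulas, here is a small Monte Carlo sketch. The finite index set \(I = \{2, 3, 5\}\), the geometric choice for the exponents (so \(q_i = p_i (1 - p_i)\)), and the parameter values are all illustrative assumptions, not part of the result.

```python
import random

random.seed(1)
I = [2, 3, 5]
p = {2: 0.6, 3: 0.7, 5: 0.8}           # illustrative: p_i = P(N_i = 0)
q = {i: p[i] * (1 - p[i]) for i in I}  # q_i = P(N_i = 1) for geometric exponents

def geom(pi):
    """Geometric on N (failures before first success) with success parameter pi."""
    n = 0
    while random.random() > pi:
        n += 1
    return n

trials, in_E, a_hits = 100_000, 0, 0
a = {2}                                # test set: n_2 = 1, n_3 = n_5 = 0
for _ in range(trials):
    N = {i: geom(p[i]) for i in I}
    if all(N[i] <= 1 for i in I):      # the event that A is in E
        in_E += 1
        if all((N[i] == 1) == (i in a) for i in I):
            a_hits += 1

PE = 1.0
for i in I:
    PE *= p[i] + q[i]
cond = (q[2] / (p[2] + q[2])) * (p[3] / (p[3] + q[3])) * (p[5] / (p[5] + q[5]))
print("P(A in E):          ", in_E / trials, "vs", PE)
print("P(A = {2} | A in E):", a_hits / in_E, "vs", cond)
```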
Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) so that \(X = \prod_{i \in I} i^{N_i}\) is the corresponding random variable in \(S\) and \(A\) is the corresponding random multiset in \(T\) with multiplicity function \(\bs N\). We have the usual representations for the reliability functions of the random variables relative to the semigroups. So let \(\bs n = (n_i: i \in I) \in R\) and let \(x = \prod_{i \in I} i^{n_i} \in S\) and \(a \in T\) with multiplicity function \(\bs n\). Then \begin{align*} \P(\bs n \le \bs N) &= \sum_{\bs n \le \bs m} \P(\bs N = \bs m) = \sum_{\bs k \in R} \P(\bs N = \bs n + \bs k) \\ \P(x \preceq X) &= \sum_{x \preceq y} \P(X = y) = \sum_{z \in S} \P(X = x z) \\ \P(a \subseteq A) &= \sum_{a \subseteq b} \P(A = b) = \sum_{c \in T} \P(A = a + c) \end{align*} and in addition of course, \(\P(\bs n \le \bs N) = \P(x \preceq X) = \P(a \subseteq A)\) and \(\P(\bs n \le \bs N) = \P(n_i \le N_i \text{ for } i \in I)\). Going forward, most of our results will be stated in terms of the arithmetic semigroup \((S, \cdot)\). In particular, if \(F_i\) denotes the reliability function of \(N_i\) for the standard discrete graph \((\N, +)\) and \(F\) the reliability function of \(X\) for \((S, \cdot)\) then \(F(i^n) = F_i(n)\) for \(i \in I\) and \(n \in \N\). If \(\bs N = (N_i: i \in I)\) is an independent sequence then \[F(x) = \prod_{i \in I} F_i(n_i), \quad x = \prod_{i \in I} i^{n_i} \in S\]
By a basic result in Section 1.3, the partial order graphs are stochastic. That is, the reliability function of a probability distribution on the base set uniquely determines the distribution. Under a mild condition, we can recover the density function from the reliability function via Möbius inversion.
Suppose that \(\mu\) is the Möbius function, and that \(F\) is a reliability function for \((S, \cdot)\). If \[\sum_{x \in S} F(x) \lt \infty\] then the probability density function \(f\) is given by \[f(x) = \sum_{x \preceq y} \mu(x^{-1} y) F(y) = \sum_{z \in S} \mu(z) F(x z), \quad x \in S\]
This follows from general results in Section 1.3.
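To make the inversion concrete, here is a minimal sketch for a hypothetical finite index set \(I = \{2, 3, 5\}\), using the exponential reliability function \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\) that appears later in this section. For this semigroup, \(\mu(z) = (-1)^k\) if \(z\) is a product of \(k\) distinct primes and \(\mu(z) = 0\) otherwise, so only square-free \(z\) contribute to the sum.

```python
from itertools import combinations

I = [2, 3, 5]
p = {2: 0.5, 3: 0.6, 5: 0.7}  # illustrative exponential parameters

def exponents(x):
    """Prime exponents n_i of x, assuming x factors over I."""
    n = {}
    for i in I:
        n[i] = 0
        while x % i == 0:
            x //= i
            n[i] += 1
    assert x == 1, "x must factor over I"
    return n

def F(x):
    """Reliability function of the exponential distribution: prod (1-p_i)^{n_i}."""
    n, out = exponents(x), 1.0
    for i in I:
        out *= (1 - p[i]) ** n[i]
    return out

def f_direct(x):
    n, out = exponents(x), 1.0
    for i in I:
        out *= p[i] * (1 - p[i]) ** n[i]
    return out

def f_mobius(x):
    """f(x) = sum over square-free z of mu(z) F(x z)."""
    total = 0.0
    for k in range(len(I) + 1):
        for sub in combinations(I, k):
            z = 1
            for i in sub:
                z *= i
            total += (-1) ** k * F(x * z)
    return total

x = 12  # = 2^2 * 3
print(f_direct(x), f_mobius(x))  # the two values should agree
```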
Next we give the standard moment result for the arithmetic semigroup. As before, let \(\tau_k\) denote the left walk function of order \(k \in \N\) for \((S, \cdot)\), which in this context is best thought of as the divisor function of order \(k\).
Suppose that \(X\) is a random variable in \(S\) with reliability function \(F\) for \((S, \preceq)\). Then \[\sum_{x \in S} \tau_k(x) F(x) = \E[\tau_{k + 1}(X)], \quad k \in \N\] In particular, \(\sum_{x \in S} F(x) = \E[\tau(X)]\), the expected number of divisors of \(X\).
When the prime powers are independent, there is a simple moment result for a completely multiplicative function of \(X\) in terms of the probability generating functions.
Suppose that \(X = \prod_{i \in I} i^{N_i} \) is a random variable in \(S\) where \(\bs N = (N_i: i \in I)\) is the corresponding random variable in \(R\). Let \(P_i\) denote the probability generating function of \(N_i\) for \(i \in I\). If \((N_i: i \in I)\) is an independent sequence and \(g\) is completely multiplicative then \[\E[g(X)] = \prod_{i \in I} P_i[g(i)]\]
By independence and the completely multiplicative property, \[\E[g(X)] = \E\left[\prod_{i \in I} g(i)^{N_i}\right] = \prod_{i \in I} \E\left[g(i)^{N_i}\right] = \prod_{i \in I} P_i[g(i)]\]
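For a concrete check, take \(g(x) = x^{-s}\), which is completely multiplicative, with independent geometric exponents; then \(P_i[g(i)] = p_i / (1 - (1 - p_i) i^{-s})\). The finite index set and parameter values below are illustrative assumptions.

```python
import random

random.seed(2)
I = [2, 3, 5]
p = {2: 0.5, 3: 0.6, 5: 0.7}  # illustrative geometric parameters
s = 1.5

def geom(pi):
    n = 0
    while random.random() > pi:
        n += 1
    return n

def pgf(pi, t):
    """Probability generating function of the geometric distribution on N."""
    return pi / (1 - (1 - pi) * t)

formula = 1.0
for i in I:
    formula *= pgf(p[i], i ** (-s))

trials = 200_000
mc = sum(
    1.0 / (2 ** geom(p[2]) * 3 ** geom(p[3]) * 5 ** geom(p[5])) ** s
    for _ in range(trials)
) / trials
print(f"E[g(X)]: formula {formula:.5f}, Monte Carlo {mc:.5f}")
```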
Of course, we are particularly interested in exponential distributions for \((S, \cdot)\). In this section we will first characterize the exponential distributions in terms of the random prime powers, and then in Section 4, with the addition of the norm structure, in terms of Dirichlet distributions.
Random variable \(X\) has an exponential distribution on \((S, \cdot)\) if and only if \(X\) has a probability density function \(f\) of the form \[f(x) = \prod_{i \in I} p_i (1 - p_i)^{n_i}, \quad x = \prod_{i \in I} i^{n_i} \in S\] where \(\bs p = (p_i: i \in I) \in \ms P\). The rate constant is \(\prod_{i \in I} p_i\).
From results in Section 2.5, \(X\) has an exponential distribution on \((S, \cdot)\) if and only if the reliability function \(F\) of \(X\) satisfies (a) \(F(x y) = F(x) F(y)\) for all \(x, y \in S\), and (b) \(\sum_{x \in S} F(x) = 1 / \alpha \lt \infty\) for some \(\alpha \in (0, \infty)\),
and then the rate constant is \(\alpha\). Condition (a) is the memoryless condition and means that \(F\) is completely multiplicative for \((S, \cdot)\). As usual, we write \(x \in S\) in the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\). So the memoryless property holds if and only if \[F(x) = \prod_{i \in I}[F(i)]^{n_i}, \quad x \in S\] Let \(p_i = 1 - F(i)\) for \(i \in I\), so that \(p_i\) is the probability that \(i\) is not a prime factor of \(X\). From our support assumption, \(p_i \in (0, 1)\) for each \(i \in I\) and so \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\) for \(x \in S\). Hence \begin{align*} \sum_{x \in S} F(x) &= \sum_{\bs n \in R} F\left(\prod_{i \in I} i^{n_i}\right) = \sum_{\bs n \in R} \prod_{i \in I} (1 - p_i)^{n_i} \\ &= \prod_{i \in I} \sum_{n = 0}^\infty (1 - p_i)^n = \prod_{i \in I} \frac{1}{p_i} \end{align*} So (b) is satisfied if and only if \(\prod_{i \in I} p_i \gt 0\), in which case the density function \(f\) is given by \[f(x) = \prod_{i \in I} p_i (1 - p_i)^{n_i}, \quad x \in S\]
For the standard arithmetic semigroup \((\N_+, \cdot)\), the exponential property has the following interpretation: The conditional distribution of \(X / x\) given that \(x\) divides \(X\) is the same as the distribution of \(X\). Thus, knowledge of one divisor of \(X\) does not help in finding other divisors of \(X\), a property that may have some practical applications. The exponential distribution above, of course, corresponds to independent, geometric distributions on the prime exponents.
Random variable \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p = (p_i: i \in I) \in \ms P\) if and only if \[X = \prod_{i \in I} i^{N_i}\] where \(\bs N = (N_i: i \in I)\) is a sequence of independent random variables and \(N_i\) has the geometric distribution on \(\N\) with success parameter \(p_i\) for each \(i \in I\).
That is, \(N_i\) has the exponential distribution on \((\N, +)\) with rate \(p_i\) for each \(i \in I\). This characterization could also be obtained from the isomorphism between \((S, \cdot)\) and the semigroup \((R, +)\) discussed in Section 1. Here is the corresponding result for multisets:
Suppose that \(A\) is a random multiset and that \(N_i\) is the number of times that \(i\) occurs in \(A\) for \(i \in I\). Then \(A\) has an exponential distribution for \((T, +)\) if and only if \(\bs N = (N_i: i \in I)\) is a sequence of independent random variables and \(N_i\) has the geometric distribution on \(\N\) with success parameter \(p_i\) for \(i \in I\), with \(\bs p = (p_i: i \in I) \in \ms P\).
The app below simulates the exponential distribution on the arithmetic semigroup with \(I = \{2, 3, 5, 7\}\) as the set of prime elements. The parameters \(p_2\), \(p_3\), \(p_5\) and \(p_7\) can be varied with scrollbars. The geometric variables \(N_2\), \(N_3\), \(N_5\) and \(N_7\) and the exponential variable \(X = 2^{N_2} \, 3^{N_3} \, 5^{N_5} \, 7^{N_7}\) are displayed in the data table.
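For readers without the app, here is a minimal Python sketch of the same simulation; the parameter values are illustrative.

```python
import random

random.seed(3)
p = {2: 0.5, 3: 0.6, 5: 0.7, 7: 0.8}  # illustrative values of p_2, ..., p_7

def geom(pi):
    """Geometric on N with success parameter pi (failures before first success)."""
    n = 0
    while random.random() > pi:
        n += 1
    return n

for _ in range(5):
    N = {i: geom(p[i]) for i in p}
    X = 1
    for i, n in N.items():
        X *= i ** n
    print(N, "X =", X)
```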
The basic moment result simplifies as usual.
If \(X\) has the exponential distribution with parameter \(\bs p = (p_i: i \in I) \in \ms P\) then \[\E[\tau_k(X)] = \prod_{i \in I} \frac{1}{p_i^k}, \quad k \in \N\]
This follows from the general theory in Section 1.5 since the rate constant is \(\prod_{i \in I} p_i\).
Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\), as above. Let \(M = \sum_{i \in I} N_i\) so that \(M\) is the sum of the prime exponents of \(X\). Recall also that \(E\) is the set of square-free elements of \(S\). Then \begin{align*} \E(M) &= \sum_{i \in I} \frac{1 - p_i}{p_i} \\ \var(M) &= \sum_{i \in I} \frac{1 - p_i}{p_i^2} \\ \E\left(t^M\right) &= \prod_{i \in I} \frac{p_i}{1 - (1 - p_i) t}, \quad |t| \lt \inf\left\{\frac{1}{1 - p_i}: i \in I\right\} \\ \P(X \in E) &= \prod_{i \in I} p_i (2 - p_i) \end{align*}
These are standard results for sums of independent, geometrically distributed random variables.
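Here is a short Monte Carlo confirmation of the mean, variance, and square-free formulas, using an illustrative finite index set and parameters.

```python
import random
import statistics

random.seed(4)
I = [2, 3, 5]
p = {2: 0.5, 3: 0.6, 5: 0.7}  # illustrative parameters

def geom(pi):
    n = 0
    while random.random() > pi:
        n += 1
    return n

Ms, sqfree, trials = [], 0, 100_000
for _ in range(trials):
    N = [geom(p[i]) for i in I]
    Ms.append(sum(N))
    sqfree += all(n <= 1 for n in N)  # the event that X is square free

print("E(M):    ", statistics.mean(Ms), "vs", sum((1 - p[i]) / p[i] for i in I))
print("var(M):  ", statistics.variance(Ms), "vs", sum((1 - p[i]) / p[i] ** 2 for i in I))
prod = 1.0
for i in I:
    prod *= p[i] * (2 - p[i])
print("P(X in E):", sqfree / trials, "vs", prod)
```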
Of course, it's possible that \(\E(M) = \infty\) or that \(\E(M) \lt \infty\) and \(\var(M) = \infty\). It's also possible that the probability generating function converges only on the interval \([-1, 1]\).
Suppose that \(I\) is infinite and that the elements of \(I\) are enumerated by \(\N_+\), so that \(I = \{i_1, i_2, \ldots\}\). In the case of the standard arithmetic semigroup \((\N_+, \cdot)\), we could simply list the prime numbers in order. In the following simple example, some of the results above can be expressed in closed form.
In the setting just described, suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p = \left(p_{i_k}: k \in \N_+\right) \in \ms P\) where \(p_{i_k} = 1 / (1 + q^k)\) for \(k \in \N_+\), and where \(q \in (0, 1)\). Then \begin{align*} \E(M) &= \frac{q}{1 - q} \\ \var(M) &= \frac{q}{1 - q} + \frac{q^2}{1 - q^2} \\ \E(t^M) &= \prod_{k = 1}^\infty \frac{1}{1 + q^k (1 - t)}, \quad |t| \lt \frac{1 + q}{q} \\ \P(X \in E) &= \prod_{k = 1}^\infty \frac{1 + 2 q^k}{(1 + q^k)^2} \end{align*}
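A quick numeric check of the closed forms, truncating the infinite sums and products at \(K\) terms; the value of \(q\) is illustrative. Note that \((1 - p_{i_k}) / p_{i_k} = q^k\), which is what makes the sums geometric.

```python
q, K = 0.4, 200  # illustrative q and truncation point
p = [1 / (1 + q ** k) for k in range(1, K + 1)]

EM = sum((1 - pk) / pk for pk in p)
VM = sum((1 - pk) / pk ** 2 for pk in p)
PE = 1.0
for pk in p:
    PE *= pk * (2 - pk)                 # P(N_k <= 1) for each geometric exponent
PE2 = 1.0
for k in range(1, K + 1):
    PE2 *= (1 + 2 * q ** k) / (1 + q ** k) ** 2

print("E(M):     ", EM, "vs", q / (1 - q))
print("var(M):   ", VM, "vs", q / (1 - q) + q ** 2 / (1 - q ** 2))
print("P(X in E):", PE, "vs", PE2)
```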
So for the distribution in the example above, \(M\) has moments of all orders. Our next result concerns entropy. Once again, the result in this section is stated in terms of the parameters of the prime exponents, and in Section 4 in terms of Dirichlet distributions.
Suppose again that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). Then \(X\) maximizes entropy over all random variables \(Y = \prod_{i \in I} i^{M_i}\) in \(S\) with \(\E(M_i) = (1 - p_i) / p_i\) for each \(i \in I\). The maximum entropy is \[H(X) = -\sum_{i \in I} \left[\ln(p_i) + \ln(1 - p_i) \frac{1 - p_i}{p_i}\right]\]
Let \(F\) denote the reliability function of \(X\). From the main entropy result in Section 1.5, \(X\) maximizes entropy over all random variables \(Y\) in \(S\) with \(\E[\ln F(Y)] = \E[\ln F(X)]\), and then the maximum entropy is \(H(X) = -\ln \alpha - \E[\ln F(X)]\) where \(\alpha \in (0, \infty)\) is the rate constant. If \(x = \prod_{i \in I} i^{n_i} \in S\) then \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\), and \(X = \prod_{i \in I} i^{N_i}\) where \(\bs N = (N_i: i \in I)\) is an independent sequence and \(N_i\) has the geometric distribution on \(\N\) with success parameter \(p_i\). Hence \[\E[\ln F(X)] = \E\left[\sum_{i \in I} N_i \ln (1 - p_i)\right] = \sum_{i \in I} \ln (1 - p_i) \E(N_i) = \sum_{i \in I} \ln(1 - p_i) \frac{1 - p_i}{p_i}\] Similarly, \(\E[\ln F(Y)] = \sum_{i \in I} \ln (1 - p_i) \E(M_i)\). So if \(\E(M_i) = \E(N_i) = (1 - p_i) / p_i\) for each \(i \in I\) then \(\E[\ln F(Y)] = \E[\ln F(X)]\). Finally, the rate constant for \(X\) is \(\prod_{i \in I} p_i\).
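Since entropy adds across the independent prime exponents, the formula can be checked one geometric factor at a time. The sketch below compares the direct series \(-\sum_n f(n) \ln f(n)\) with the closed form, for illustrative parameters.

```python
import math

p = [0.5, 0.6, 0.7]  # p_i for a hypothetical finite I

def geom_entropy_direct(pi, terms=200):
    """-sum f(n) ln f(n) for the geometric density f(n) = pi * (1 - pi)^n."""
    h = 0.0
    for n in range(terms):
        fn = pi * (1 - pi) ** n
        h -= fn * math.log(fn)
    return h

def geom_entropy_formula(pi):
    return -(math.log(pi) + math.log(1 - pi) * (1 - pi) / pi)

H_direct = sum(geom_entropy_direct(pi) for pi in p)
H_formula = sum(geom_entropy_formula(pi) for pi in p)
print(H_direct, H_formula)  # should agree to many decimal places
```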
We give another representation of an exponential variable for \((S, \cdot)\) in terms of independent Poisson variables. This was obtained by Lin, but we give an alternate derivation based on the exponential distribution. We give the result here in terms of the parameters of the prime powers, and in Section 4 in terms of Dirichlet distributions.
Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). For \(x \in S_+\), let \(\lambda_x = (1 - p_i)^n / n\) if \(x = i^n\) for some \(i \in I\) and \(n \in \N_+\), and let \(\lambda_x = 0\) otherwise. Then \(X\) can be written in the form \[X = \prod_{x \in S_+} x^{V_x}\] where \((V_x: x \in S_+)\) is a sequence of independent variables and \(V_x\) has the Poisson distribution with parameter \(\lambda_x\) for \(x \in S_+\).
We start with the representation of \(X\) in terms of independent geometric prime exponents given above. By results in Section 4.1, we can write the geometrically distributed prime exponents in the form \[N_i = \sum_{n = 1}^\infty n V_{i, n}, \quad i \in I\] where \(\{V_{i, n}: i \in I, \, n \in \N_+\}\) are independent and \(V_{i, n}\) has the Poisson distribution with parameter \((1 - p_i)^n / n\) for \(i \in I\) and \(n \in \N_+\). Substituting we have \[X = \prod_{i \in I} i^{N_i} = \prod_{i \in I} \prod_{n = 1}^\infty i^{n V_{i, n}} = \prod_{i \in I} \prod_{n = 1}^\infty (i^n)^{V_{i, n}}\] Now, for \(x \in S_+\), let \(V_x = V_{i, n}\) if \(x = i^n\) for some \(i \in I\) and \(n \in \N_+\), and let \(V_x = 0\) otherwise. Then \((V_x: x \in S_+)\) is a sequence of independent variables, \(V_x\) has the Poisson distribution with parameter \(\lambda_x\) for \(x \in S_+\), and \[X = \prod_{x \in S_+} x^{V_x}\]
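The representation can be checked by simulation for a single prime. The sketch below samples truncated Poisson variables \(V_{i, n}\) and compares the empirical distribution of \(N_i = \sum_n n V_{i, n}\) with the geometric density; the parameter, truncation point, and the choice of Poisson sampler are illustrative assumptions.

```python
import math
import random

random.seed(5)
pi, K, trials = 0.4, 60, 50_000  # illustrative parameter and truncation

def poisson(lam):
    """Knuth's multiplicative method; adequate for the small rates used here."""
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

counts = {}
for _ in range(trials):
    # N = sum over n of n * V_n with V_n ~ Poisson((1 - pi)^n / n), truncated at K
    N = sum(n * poisson((1 - pi) ** n / n) for n in range(1, K + 1))
    counts[N] = counts.get(N, 0) + 1

for n in range(5):
    print(n, counts.get(n, 0) / trials, "vs", pi * (1 - pi) ** n)
```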
The exponential distribution on \((S, \cdot)\) is a compound Poisson distribution. Once again we give the result in terms of the parameters of the prime powers, and in Section 4 in terms of Dirichlet distributions.
Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p = (p_i: i \in I) \in \ms P\). Then \(X\) has a compound Poisson distribution. Specifically \[X = V_1 V_2 \cdots V_M\] where \(\bs V = (V_1, V_2, \ldots)\) is a sequence of independent copies of \(V\), where in turn \(V\) takes values in the set of prime powers \(\{i^n: i \in I, \, n \in \N_+\}\) and has probability density function \[\P(V = i^n) = -\frac{(1 - p_i)^n}{n \sum_{j \in I} \ln p_j}, \quad i \in I, \, n \in \N_+\] The random index \(M\) is independent of \(\bs V\) and has the Poisson distribution with parameter \(-\sum_{i \in I} \ln p_i\).
Consider the representation given in the theorem. For \(k \in \N_+\) and \(i \in I\), random variable \(V_k\) is a power of \(i\) with probability \(\ln p_i / \sum_{j \in I} \ln p_j\), independently over \(k\). It follows that the number of factors \(M_i\) that are powers of \(i\) has the Poisson distribution with parameter \(-\ln p_i\). Moreover, for \(k \in \N_+\) and \(i \in I\), the conditional distribution of \(V_k\) given that \(V_k\) is a power of \(i\) is the logarithmic distribution with parameter \(p_i\): \[ \P(V_k = i^n \mid V_k \text{ is a power of } i) = - \frac{(1 - p_i)^n}{n \ln p_i}, \quad n \in \N_+ \] By grouping the factors according to the prime powers, it follows that \(X\) has the same distribution as \(\prod_{i \in I} i^{N_i}\) where \(N_i = U_{i,1} + U_{i,2} + \cdots + U_{i,M_i}\) for \(i \in I\) with the following properties satisfied:
It follows that \(\bs U_i = (U_{i,1}, U_{i,2}, \ldots)\) is an independent sequence and that \(M_i\) is independent of \(\bs U_i\). By the standard compound result in Section 4.1, \(N_i\) has the geometric distribution with success parameter \(p_i\) for \(i \in I\) and \((N_i: i \in I)\) is an independent sequence. Hence \(X\) has the exponential distribution with parameter \(\bs p = (p_i: i \in I)\).
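The compound Poisson construction can also be run forward as a simulation. The sketch below uses a hypothetical finite index set, samples the Poisson count \(M\), the prime choices, and the logarithmic exponents, and compares the empirical density of \(X\) at a few points with the exponential density \(f(x) = \prod_{i \in I} p_i (1 - p_i)^{n_i}\).

```python
import math
import random

random.seed(6)
I = [2, 3, 5]
p = {2: 0.5, 3: 0.6, 5: 0.7}         # illustrative parameters
w = {i: -math.log(p[i]) for i in I}  # -ln p_i, the per-prime Poisson rates
lam = sum(w.values())                # Poisson parameter for the number of factors M

def poisson(l):
    """Knuth's multiplicative method; adequate for small rates."""
    L, k, prod = math.exp(-l), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

def pick_prime():
    """Choose prime i with probability ln(p_i) / sum_j ln(p_j) = w_i / lam."""
    u = random.random() * lam
    acc = 0.0
    for i in I:
        acc += w[i]
        if u < acc:
            return i
    return I[-1]

def log_exponent(pi):
    """Logarithmic distribution: P(n) = -(1 - pi)^n / (n ln pi), n = 1, 2, ..."""
    u, n, cum = random.random(), 1, 0.0
    while True:
        cum += -((1 - pi) ** n) / (n * math.log(pi))
        if u < cum or n > 1000:
            return n
        n += 1

trials, counts = 100_000, {}
for _ in range(trials):
    X = 1
    for _ in range(poisson(lam)):
        i = pick_prime()
        X *= i ** log_exponent(p[i])
    counts[X] = counts.get(X, 0) + 1

f1 = p[2] * p[3] * p[5]              # f(1) = prod p_i
print("P(X = 1):", counts.get(1, 0) / trials, "vs", f1)
print("P(X = 2):", counts.get(2, 0) / trials, "vs", f1 * (1 - p[2]))
```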
Suppose now that \(\bs X = (X_1, X_2, \ldots)\) is a sequence of independent copies of \(X\), where \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). Let \(Y_n = X_1 \cdots X_n\) for \(n \in \N_+\) so that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \cdot)\) associated with \(X\).
For \(n \in \N_+\), \[Y_n = \prod_{i \in I} i^{M_{n, i}}\] where \(\bs M_n = (M_{n, i}: i \in I)\) is an independent sequence of variables and \(M_{n, i}\) has the negative binomial distribution on \(\N\) with stopping parameter \(n\) and success parameter \(p_i\) for \(i \in I\).
Hence \(Y_n\) has density function \(f_n\) given by \[f_n(x) = \prod_{i \in I} \binom{n + m_i - 1}{m_i} (1 - p_i)^{m_i} p_i^n, \quad x = \prod_{i \in I} i^{m_i} \in S\]
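As a quick check of the negative binomial claim, the sketch below compares the empirical density of the exponent of a single prime in \(Y_n\) (a sum of \(n\) independent geometrics) with the stated formula; the parameters are illustrative.

```python
import math
import random

random.seed(7)
pi, n, trials = 0.5, 3, 100_000  # illustrative: one prime, walk step n = 3

def geom(p):
    k = 0
    while random.random() > p:
        k += 1
    return k

counts = {}
for _ in range(trials):
    m = sum(geom(pi) for _ in range(n))  # exponent of the prime in Y_n = X_1 ... X_n
    counts[m] = counts.get(m, 0) + 1

for m in range(5):
    formula = math.comb(n + m - 1, m) * (1 - pi) ** m * pi ** n
    print(m, counts.get(m, 0) / trials, "vs", formula)
```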
The exponential distribution on \((S, \cdot)\) has constant rate not only for \((S, \preceq)\) but also for the other graphs naturally associated with \((S, \preceq)\): the strict partial order graph \((S, \prec)\) of \((S, \preceq)\), the covering graph \((S, \upa)\) of \((S, \preceq)\), and the reflexive closure \((S, \Upa)\) of \((S, \upa)\). We have seen this type of result before, in Chapter 4 on the standard discrete spaces and in Chapter 5 on rooted trees and free semigroups. Once again, we give the results here in terms of the geometric parameters of the prime powers, and in Section 4 in terms of the Dirichlet formulation.
Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). As usual, we write \(x \in S\) in the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\).
Recall that for the partial order graph \((S, \preceq)\), \(X\) has reliability function \(F\) given by \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\) for \(x = \prod_{i \in I} i^{n_i} \in S\) and has constant rate \(\prod_{i \in I} p_i\).
Other constant rate distributions can be obtained for any of the graphs by mixing two or more constant rate distributions of the types given in the theorem, with the same rate.
Characterize all constant rate distributions for each of the four graphs.