\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\var}{\text{var}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\Upa}{\Uparrow}\) \(\newcommand{\bs}{\boldsymbol}\)

2. Probability

Basics

Review

As in Section 1, we have three discrete, positive semigroups that are isomorphic to each other: \((R, +)\), \((S, \cdot)\), and \((T, +)\), based on a fixed, countable set \(I\). Recall that \(R\) is the collection of functions \(\bs{n} = (n_i: i \in I)\) from \(I\) into \(\N\) such that \(\sum_{i \in I} n_i \lt \infty\) (or equivalently \(n_i = 0\) for all but finitely many \(i \in I\)). If \(I\) is finite then \(R = \N^I\), the collection of all functions from \(I\) into \(\N\). The operation on \(R\) is ordinary pointwise addition of functions. The set \(S\) is the collection of products of the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\). The representation is unique up to the ordering of the factors, so \((S, \cdot)\) is the arithmetic semigroup with \(I\) as the set of prime elements. Finally, \(T\) is the collection of finite multisets with elements in \(I\). A multiset \(a \in T\) is uniquely determined by its multiplicity function \(\bs n = (n_i: i \in I) \in R\), so that \(n_i\) is the number of times that \(i \in I\) occurs in \(a\). The operation on \(T\) is multiset addition. Associated with each positive semigroup is the corresponding discrete partial order graph: \((R, \le)\), \((S, \preceq)\), and \((T, \subseteq)\). We also assume that we have a fixed norm \(|\cdot|\) for \((S, \cdot)\). Our goal in this section is to study probability distributions on the various spaces from different points of view: multisets, prime factorizations, and the norm structure. The most important special case is when \(I = \{2, 3, 5, 7, \ldots\}\), the ordinary set of prime numbers, so that \(S = \N_+\) and \(\cdot\) is ordinary multiplication. This semigroup \((\N_+, \cdot)\) is the standard arithmetic semigroup.

Distributions

As usual, we assume that we have a probability space \((\Omega, \ms F, \P)\) in the background. Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\), so that \(N_i\) is a random variable in \(\N\) for \(i \in I\), and \(\sum_{i \in I} N_i \lt \infty\) with probability 1. Of course, \(\bs N\) defines a random product of prime elements \(X = \prod_{i \in I} i^{N_i}\) in \(S\), and a random, finite multiset \(A\) in \(T\) with multiplicity function \(\bs N\). Our first step is to give a condition so that \(\bs N\) and hence \(X\) and \(A\) are well defined. We start with a standard result from analysis.

Suppose that \(p_i \in (0, 1)\) for \(i \in I\). Then \[\prod_{i \in I} p_i \gt 0 \text{ if and only if } \sum_{i \in I} (1 - p_i) \lt \infty\] Let \(\ms P\) denote the set of all functions \(\bs{p} = (p_i: i \in I)\) from \(I\) into \((0, 1)\) such that the two equivalent conditions are satisfied.
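For a quick numerical illustration, here is a minimal Python sketch (the two sequences are illustrative choices, not part of the theory): the partial products of \(p_i = 1 - 2^{-i}\) stabilize at a positive limit since \(\sum_i 2^{-i} \lt \infty\), while the partial products of \(p_i = 1 - 1/(i + 1)\) decay to 0.

    def partial_product(p, n):
        """Product of p(1), ..., p(n)."""
        prod = 1.0
        for i in range(1, n + 1):
            prod *= p(i)
        return prod

    # sum of (1 - p_i) = sum of 2^(-i) < infinity: positive limit
    p_summable = lambda i: 1.0 - 2.0 ** (-i)
    # sum of (1 - p_i) = sum of 1/(i + 1) = infinity: limit 0
    p_divergent = lambda i: 1.0 - 1.0 / (i + 1)

    for n in [10, 100, 1000]:
        print(n, partial_product(p_summable, n), partial_product(p_divergent, n))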

Of course, the condition above is automatically satisfied if \(I\) is finite, so in that case \(\ms P = (0, 1)^I\), the set of all functions from \(I\) into \((0, 1)\).

Suppose that \(N_i\) is a random variable in \(\N\) for \(i \in I\), and let \(p_i = \P(N_i = 0)\) for \(i \in I\).

  1. If \(\bs p = (p_i: i \in I) \in \ms P\) then \(\bs N = (N_i: i \in I)\) is a well-defined random variable in \(R\).
  2. If \((N_i: i \in I)\) is an independent sequence, the converse is true.
Details:
  1. By the first Borel-Cantelli lemma, if \[\sum_{i \in I} (1 - p_i) = \sum_{i \in I} \P(N_i \gt 0) \lt \infty \] then with probability 0, \(N_i \gt 0\) for infinitely many \(i \in I\), or equivalently, \(N_i = 0\) for all but finitely many \(i \in I\) with probability 1.
  2. If \(\bs N = (N_i: i \in I)\) is a sequence of independent variables, then the converse is true by the second Borel-Cantelli lemma. That is, if \(\bs N = (N_i: i \in I)\) is a well-defined random variable in \(R\) then with probability \(1\), \(N_i = 0\) for all but finitely many \(i \in I\). By the second Borel-Cantelli lemma and independence, this forces \(\sum_{i \in I} (1 - p_i) \lt \infty\), and hence \(\bs p \in \ms P\). Note in this case that \(\P(\bs N = \bs 0) = \prod_{i \in I} p_i\).

There is a simple interpretation of the sum condition above.

Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) and that \(X\) and \(A\) are the corresponding random variables in \(S\) and \(T\) respectively. Let \(K = \sum_{i \in I} \bs{1}(N_i \gt 0)\), so that \(K\) is the number of distinct prime divisors of \(X\) and the number of distinct elements of \(A\). As before, let \(p_i = \P(N_i = 0)\) for \(i \in I\).

  1. \(\E(K) = \sum_{i \in I} (1 - p_i)\)
  2. If \(\bs N\) is an independent sequence then \(\var(K) = \sum_{i \in I} p_i (1 - p_i)\)
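Here is a minimal simulation sketch in Python (numpy) checking the two formulas for \(K\); the truncation of \(I\) to 20 indices and the choice \(p_i = 1 - 2^{-i}\) are purely illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    idx = np.arange(1, 21)          # truncate I to 20 indices for illustration
    p = 1.0 - 0.5 ** idx            # p_i = P(N_i = 0), summable tails

    # N_i geometric on {0, 1, 2, ...} with success parameter p_i;
    # numpy's geometric counts trials in {1, 2, ...}, so subtract 1
    N = rng.geometric(p, size=(100_000, len(idx))) - 1
    K = (N > 0).sum(axis=1)

    print("E(K):   simulated", K.mean(), "exact", (1 - p).sum())
    print("var(K): simulated", K.var(), "exact", (p * (1 - p)).sum())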

Suppose again that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) and that \(X\) and \(A\) are the corresponding random variables in \(S\) and \(T\) respectively. Let \(M = \sum_{i \in I} N_i\) so that \(M\) is the sum of the prime exponents of \(X\) and is the total number of elements of \(A\). Then trivially \[ \E(M) = \sum_{i \in I} \E(N_i)\] and if \(\bs N\) is an independent sequence then \[\var(M) = \sum_{i \in I} \var(N_i)\] By definition, \(M\) is finite with probability 1, but of course we could still have \(\E(M) = \infty\). Similarly we could have \(\var(M) = \infty\) even if \(\E(M) \lt \infty\). Next is a very simple result involving an event of interest.

Let \[ E = \{\bs n \in R: n_i \in \{0, 1\} \text{ for all } i \in I\} = \{x \in S: x \text{ is square free} \} = \{a \in T: a \text{ is an ordinary set} \} \] Suppose that the random variable \(\bs N = (N_i: i \in I)\) in \(R\) is an independent sequence and let \(p_i = \P(N_i = 0)\) and \(q_i = \P(N_i = 1)\) for \(i \in I\). Then \[ \P(\bs N \in E) = \prod_{i \in I} (p_i + q_i) \]

Dirichlet Distributions

With the norm structure in place, we can also define probability distributions in terms of Dirichlet series, defined in Section 1.

Suppose that \(g\) is a positive arithmetic function for \((S, \cdot)\) and that the corresponding Dirichlet series \(G\) converges on the interval \((t_0, \infty)\) for some \(t_0 \in (0, \infty)\). The Dirichlet distribution on \((S, \cdot)\) corresponding to \(g\) (or \(G\)) with parameter \(t \in (t_0, \infty)\) has probability density function \(f\) given by \[f(x) = \frac{g(x)}{|x|^t G(t)}, \quad x \in S\]

Details:

Recall that \[ G(t) = \sum_{x \in S} \frac{g(x)}{|x|^t}, \quad t \in (t_0, \infty) \]

So \(f\) is proportional to the function \(x \mapsto g(x) / |x|^t\), and of course the normalizing constant must then be \(G(t)\). A given positive arithmetic function defines a one parameter family of Dirichlet distributions. The most famous Dirichlet distribution corresponds to \(g(x) = 1\) for all \(x \in S\).

The zeta distribution for \((S, \cdot)\) with parameter \(t\) in the interval of convergence \((t_0, \infty)\) has probability density function \(f\) given by \[f(x) = \frac{1}{|x|^t \zeta(t)}, \quad x \in S\]

For the standard arithmetic semigroup \((\N_+, \cdot)\), the Dirichlet distributions, and in particular the zeta distribution, are the standard ones.
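For the standard semigroup, the zeta distribution with parameter \(t\) is available in scipy as the zipf distribution, whose probability mass function is exactly \(x^{-t} / \zeta(t)\). A minimal sketch (the value \(t = 2.5\) is an illustrative choice):

    from scipy.stats import zipf
    from scipy.special import zeta

    t = 2.5
    sample = zipf.rvs(t, size=100_000, random_state=0)

    # empirical frequencies versus f(x) = x^(-t) / zeta(t)
    for x in range(1, 6):
        print(x, (sample == x).mean(), x ** (-t) / zeta(t))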

Reliability Functions

Suppose that \(\bs N = (N_i: i \in I)\) is a random variable in \(R\) so that \(X = \prod_{i \in I} i^{N_i}\) is the corresponding random variable in \(S\) and \(A\) is the corresponding random multiset in \(T\) with multiplicity function \(\bs N\). We have the usual representations for the reliability functions of the random variables relative to the semigroups. So let \(\bs n = (n_i: i \in I) \in R\) and let \(x = \prod_{i \in I} i^{n_i} \in S\) and \(a \in T\) with multiplicity function \(\bs n\). Then \begin{align*} \P(\bs n \le \bs N) &= \sum_{\bs n \le \bs m} \P(\bs N = \bs m) = \sum_{\bs k \in R} \P(\bs N = \bs n + \bs k) \\ \P(x \preceq X) &= \sum_{x \preceq y} \P(X = y) = \sum_{z \in S} \P(X = x z) \\ \P(a \subseteq A) &= \sum_{a \subseteq b} \P(A = b) = \sum_{c \in T} \P(A = a + c) \end{align*} and in addition of course, \(\P(\bs n \le \bs N) = \P(x \preceq X) = \P(a \subseteq A)\) and \(\P(\bs n \le \bs N) = \P(n_i \le N_i \text{ for } i \in I)\). Going forward, most of our results will be stated in terms of the arithmetic semigroup \((S, \cdot)\). In particular, if \(F_i\) denotes the reliability function of \(N_i\) for the standard discrete graph \((\N, +)\) and \(F\) the reliability function of \(X\) for \((S, \cdot)\) then \(F(i^n) = F_i(n)\) for \(i \in I\) and \(n \in \N\). If \(\bs N = (N_i: i \in I)\) is an independent sequence then \[F(x) = \prod_{i \in I} F_i(n_i), \quad x = \prod_{i \in I} i^{n_i} \in S\]

By a basic result in Section 1.3, the partial order graphs are stochastic. That is, the reliability function of a probability distribution on the base set uniquely determines the distribution. Under a mild condition, we can recover the density function from the reliability function via Möbius inversion.

Suppose that \(\mu\) is the Möbius function, and that \(F\) is a reliability function for \((S, \cdot)\). If \[\sum_{x \in S} F(x) \lt \infty\] then the probability density function \(f\) is given by \[f(x) = \sum_{x \preceq y} \mu(x^{-1} y) F(y) = \sum_{z \in S} \mu(z) F(x z), \quad x \in S\]

Details:

This follows from general results in Section 1.3.
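Here is a minimal sketch of the inversion for the standard semigroup \((\N_+, \cdot)\), assuming the zeta reliability function \(F(x) = x^{-t}\) and computing the Möbius function from sympy's factorint. The inversion sum is truncated, so the output only approximates \(f(x) = x^{-t} / \zeta(t)\).

    from sympy import factorint
    from scipy.special import zeta

    def mobius(n):
        """Mobius function via prime factorization."""
        f = factorint(n)
        if any(e > 1 for e in f.values()):
            return 0
        return (-1) ** len(f)

    t = 2.0
    F = lambda x: x ** (-t)   # reliability function of the zeta(t) distribution

    def density(x, terms=5_000):
        # f(x) = sum over z of mu(z) F(x z), truncated at `terms`
        return sum(mobius(z) * F(x * z) for z in range(1, terms + 1))

    for x in [1, 2, 6]:
        print(x, density(x), x ** (-t) / zeta(t))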

Moment Results

Next we give the standard moment result for the arithmetic semigroup. As before, let \(\tau_k\) denote the path function of order \(k \in \N\) for \((S, \cdot)\), which in this context is best thought of as the divisor function of order \(k\).

Suppose that \(X\) is a random variable in \(S\) with reliability function \(F\) for \((S, \preceq)\). Then \[\sum_{x \in S} \tau_k(x) F(x) = \E[\tau_{k + 1}(X)], \quad k \in \N\] In particular, \(\sum_{x \in S} F(x) = \E[\tau(X)]\), the expected number of divisors of \(X\).

When the prime powers are independent, there is a simple moment result for a completely multiplicative function of \(X\) in terms of the probability generating functions.

Suppose that \(X = \prod_{i \in I} i^{N_i} \) is a random variable in \(S\) where \(\bs N = (N_i: i \in I)\) is the corresponding random variable in \(R\). Let \(P_i\) denote the probability generating function of \(N_i\) for \(i \in I\). If \((N_i: i \in I)\) is an independent sequence and \(g\) is completely multiplicative then \[\E[g(X)] = \prod_{i \in I} P_i[g(i)]\]

Details:

By independence and the completely multiplicative property, \[\E[g(X)] = \E\left[\prod_{i \in I} g(i)^{N_i}\right] = \prod_{i \in I} \E\left[g(i)^{N_i}\right] = \prod_{i \in I} P_i[g(i)]\]
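A simulation sketch checking this identity on a three-prime semigroup, with illustrative parameters, geometric exponents as an illustrative choice of independent \(N_i\), and the completely multiplicative function \(g(x) = 1/x\):

    import numpy as np

    rng = np.random.default_rng(1)
    primes = np.array([2, 3, 5])
    p = np.array([0.6, 0.7, 0.8])    # p_i = P(N_i = 0), illustrative values

    N = rng.geometric(p, size=(200_000, 3)) - 1      # geometric exponents on N
    X = (primes.astype(float) ** N).prod(axis=1)

    g = lambda x: 1.0 / x                            # completely multiplicative
    pgf = lambda pi, s: pi / (1 - (1 - pi) * s)      # geometric pgf on N

    print("simulated:", g(X).mean())
    print("exact:    ", np.prod([pgf(pi, g(q)) for pi, q in zip(p, primes)]))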

Exponential Distributions

Prime Exponents

Of course, we are particularly interested in exponential distributions for \((S, \cdot)\). We will first characterize the exponential distributions in terms of the random prime powers, and then in terms of Dirichlet distributions.

Random variable \(X\) has an exponential distribution on \((S, \cdot)\) if and only if \(X\) has a probability density function \(f\) of the form \[f(x) = \prod_{i \in I} p_i (1 - p_i)^{n_i}, \quad x = \prod_{i \in I} i^{n_i} \in S\] where \(\bs p = (p_i: i \in I) \in \ms P\). The rate constant is \(\prod_{i \in I} p_i\).

Details:

From results in Section 2.5, \(X\) has an exponential distribution on \((S, \cdot)\) if and only if the reliability function \(F\) of \(X\) satisfies

  1. \(F(x y) = F(x) F(y)\) for \(x, \, y \in S\)
  2. \(1 / \alpha := \sum_{x \in S} F(x) \lt \infty\)
and then the rate constant is \(\alpha\). Condition (a) is the memoryless condition and means that \(F\) is completely multiplicative for \((S, \cdot)\). As usual, we write \(x \in S\) in the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\). So the memoryless property holds if and only if \[F(x) = \prod_{i \in I}[F(i)]^{n_i}, \quad x \in S\] Let \(p_i = 1 - F(i)\) for \(i \in I\), so that \(p_i\) is the probability that \(i\) is not a prime factor of \(X\). From our support assumption, \(p_i \in (0, 1)\) for each \(i \in I\) and so \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\) for \(x \in S\). Hence \begin{align*} \sum_{x \in S} F(x) &= \sum_{\bs n \in R} F\left(\prod_{i \in I} i^{n_i}\right) = \sum_{\bs n \in R} \prod_{i \in I} (1 - p_i)^{n_i} \\ &= \prod_{i \in I} \sum_{n = 0}^\infty (1 - p_i)^n = \prod_{i \in I} \frac{1}{p_i} \end{align*} So (b) is satisfied if and only if \(\prod_{i \in I} p_i \gt 0\), in which case the density function \(f\) is given by \[f(x) = \prod_{i \in I} p_i (1 - p_i)^{n_i}, \quad x \in S\]

The basic moment result simplifies as usual.

If \(X\) has the exponential distribution with parameter \(\bs p = (p_i: i \in I) \in \ms P\) then \[\E[\tau_k(X)] = \prod_{i \in I} \frac{1}{p_i^k}, \quad k \in \N\]

For the standard arithmetic semigroup \((\N_+, \cdot)\), the exponential property has the following interpretation: the conditional distribution of \(X / x\) given that \(x\) divides \(X\) is the same as the distribution of \(X\). Thus, knowledge of one divisor of \(X\) does not help in finding other divisors of \(X\), a property that may have some practical applications. The exponential distribution above, of course, corresponds to independent, geometric distributions on the prime exponents.

Random variable \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\) if and only if \[X = \prod_{i \in I} i^{N_i}\] where \(\bs N = (N_i: i \in I)\) is a sequence of independent random variables and \(N_i\) has the geometric distribution on \(\N\) with success parameter \(p_i\) for each \(i \in I\), with \(\bs p = (p_i: i \in I) \in \ms P\).

That is, \(N_i\) has the exponential distribution on \((\N, +)\) with rate \(p_i\) for each \(i \in I\). This characterization could also be obtained from the isomorphism between \((S, \cdot)\) and the semigroup \((R, +)\) discussed in Section 1.
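The following sketch (Python/numpy, with four primes and illustrative parameters) samples from the exponential distribution via independent geometric exponents, and illustrates the divisor interpretation of the memoryless property: the conditional distribution of \(X/2\) given \(2 \preceq X\) matches the distribution of \(X\).

    import numpy as np

    rng = np.random.default_rng(2)
    primes = np.array([2, 3, 5, 7])
    p = np.array([0.5, 0.6, 0.7, 0.8])   # illustrative parameters

    # independent geometric exponents on {0, 1, 2, ...}
    N = rng.geometric(p, size=(200_000, 4)) - 1
    X = (primes ** N).prod(axis=1)

    # memoryless: X / 2 given (2 divides X) is distributed as X
    quotient = X[X % 2 == 0] // 2
    for x in [1, 2, 3, 6]:
        print(x, (X == x).mean(), (quotient == x).mean())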

Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\), as above. Let \(M = \sum_{i \in I} N_i\) so that \(M\) is the sum of the prime exponents of \(X\), and let \(E\) be the set of square-free elements of \(S\) as before. Then \begin{align*} \E(M) &= \sum_{i \in I} \frac{1 - p_i}{p_i} \\ \var(M) &= \sum_{i \in I} \frac{1 - p_i}{p_i^2} \\ \E\left(t^M\right) &= \prod_{i \in I} \frac{p_i}{1 - (1 - p_i) t}, \quad |t| \lt \inf\left\{\frac{1}{1 - p_i}: i \in I\right\} \\ \P(X \in E) &= \prod_{i \in I} p_i (2 - p_i) \end{align*}

Details:

These are standard results for sums of independent, geometrically distributed random variables.

Of course, it's possible that \(\E(M) = \infty\) or that \(\E(M) \lt \infty\) and \(\var(M) = \infty\). It's also possible that the probability generating function converges only on the interval \([-1, 1]\). As always we can restate our results in terms of multisets.

Suppose that \(A\) is a random multiset and that \(N_i\) is the number of times that \(i\) occurs in \(A\) for \(i \in I\). Then \(A\) has an exponential distribution for \((T, +)\) if and only if \(\bs N = (N_i: i \in I)\) is a sequence of independent random variables and \(N_i\) has the geometric distribution on \(\N\) with success parameter \(p_i\) for \(i \in I\), with \(\bs p = (p_i: i \in I) \in \ms P\).

Open the simulation of the exponential distribution on the four primes semigroup, that is, the arithmetic semigroup with \(I = \{2, 3, 5, 7\}\). The parameters \(p_2\), \(p_3\), \(p_5\) and \(p_7\) can be varied with scrollbars. The geometric variables \(N_2\), \(N_3\), \(N_5\) and \(N_7\) and the exponential variable \[X = 2^{N_2} 3^{N_3} 5^{N_5} 7^{N_7}\] are displayed in the data table. Run the simulation for various values of the parameters to gain some insight.

Suppose that \(I\) is infinite and that the elements of \(I\) are enumerated by \(\N_+\), so that \(I = \{i_1, i_2, \ldots\}\). In the case of the standard arithmetic semigroup \((\N_+, \cdot)\), we could simply list the prime numbers in order. In the following simple example, some of the results above can be expressed in closed form.

In the setting just described, suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p = \left(p_{i_k}: k \in \N_+\right) \in \ms P\) where \(p_{i_k} = 1 / (1 + q^k)\) for \(k \in \N_+\), and where \(q \in (0, 1)\). Then \begin{align*} \E(M) &= \frac{q}{1 - q} \\ \var(M) &= \frac{q}{1 - q} + \frac{q^2}{1 - q^2} \\ \E(t^M) &= \prod_{k = 1}^\infty \frac{1}{1 + q^k (1 - t)}, \quad |t| \lt \frac{1 + q}{q} \\ \P(X \in E) &= \prod_{k = 1}^\infty \frac{1 + 2 q^k}{(1 + q^k)^2} \end{align*}

Details:

These results follow from the general formulas above, since \(1 - p_{i_k} = q^k / (1 + q^k)\), \(\left(1 - p_{i_k}\right) / p_{i_k} = q^k\), and \(\left(1 - p_{i_k}\right) / p_{i_k}^2 = q^k (1 + q^k)\).

So, for example, \(M\) has moments of all orders.
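A short numerical check of these formulas (Python sketch; the value \(q = 1/2\) is illustrative, and the infinite sums and products are truncated at 60 terms):

    import numpy as np

    q = 0.5
    k = np.arange(1, 61)

    EM = np.sum(q ** k)                       # sum of (1 - p)/p = q^k
    varM = np.sum(q ** k * (1 + q ** k))      # sum of (1 - p)/p^2
    PE = np.prod((1 + 2 * q ** k) / (1 + q ** k) ** 2)

    print(EM, q / (1 - q))
    print(varM, q / (1 - q) + q ** 2 / (1 - q ** 2))
    print("P(X in E):", PE)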

Dirichlet Distributions

Using the norm structure, our next characterization is in terms of Dirichlet distributions.

Random variable \(X\) has an exponential distribution on \((S, \cdot)\) if and only if \(X\) has a Dirichlet distribution corresponding to a completely multiplicative arithmetic function \(g\). The rate constant is \(1 / G(t)\) where \(G\) is the Dirichlet series and \(t\) is the parameter of the distribution.

Details:

Suppose that \(g\) is a completely multiplicative arithmetic function for \((S, \cdot)\) and that \(X\) has the Dirichlet distribution corresponding to \(g\) with parameter \(t\) in the interval of convergence of the Dirichlet series \(G\). That is, the density function \(f\) of \(X\) is given by \(f(x) = g(x) / [|x|^t G(t)]\) for \(x \in S\). So the reliability function \(F\) of \(X\) is given by \begin{align*} F(x) &= \P(X \succeq x) = \sum_{y \succeq x} \frac{g(y)}{|y|^t G(t)} = \sum_{z \in S} \frac{g(x z)}{|x z|^t G(t)} = \sum_{z \in S} \frac{g(x) g(z)}{|x|^t |z|^t G(t)} \\ &= \frac{g(x)}{|x|^t G(t)} \sum_{z \in S} \frac{g(z)}{|z|^t} = \frac{g(x)}{|x|^t G(t)} G(t) = \frac{g(x)}{|x|^t}, \quad x \in S \end{align*} Hence \(X\) has constant rate \(1 / G(t)\). Also, \(X\) is memoryless, since \(g\) is completely multiplicative: \[F(x y) = \P(X \succeq x y) = \frac{g(x y)}{|x y|^t} = \frac{g(x) g(y)}{|x|^t |y|^t} = \frac{g(x)}{|x|^t} \frac{g(y)}{|y|^t} = \P(X \succeq x)\P(X \succeq y)\] Therefore \(X\) has an exponential distribution. Conversely, suppose that \(X\) has an exponential distribution with reliability function \(F\). For fixed \(t \in (0, \infty)\), let \(g(x) = |x|^t F(x)\) for \(x \in S\), and let \(G(s) = \sum_{x \in S} g(x) / |x|^s\). Then \(g\) is a completely multiplicative function and \(G\) is the corresponding Dirichlet series. Moreover, \(t\) is in the interval of convergence since \(\sum_{x \in S} g(x) / |x|^t = \sum_{x \in S} F(x) \lt \infty\). The probability density function \(f\) of \(X\) is given by \[f(x) = \P(X = x) = \frac{g(x)}{|x|^t G(t)}, \quad x \in S\] and so \(X\) has the Dirichlet distribution corresponding to \(g\) with parameter \(t\). Note that since \(g\) is completely multiplicative, all members of this Dirichlet family are exponential, by the first part of the proof.

Thus, a Dirichlet distribution with completely multiplicative coefficient function (in particular, the zeta distribution) has the representation given above. The geometric parameters of the random prime powers are related to the arithmetic function by \[1 - p_i = \P(X \succeq i) = \frac{g(i)}{|i|^t}, \quad i \in I\] For the standard case \((\N_+, \cdot)\), this result was considered surprising in the paper by Lin, but is quite natural in the context of positive semigroups. As a corollary, we get a probabilistic proof of the product formula for the Dirichlet series of a completely multiplicative function.

Suppose that \(g\) is a positive, completely multiplicative function for \((S, \cdot)\) and that the corresponding Dirichlet series converges on \((t_0, \infty)\) for some \(t_0 \in (0, \infty)\). Then \[G(t) = \prod_{i \in I} \frac{|i|^t}{|i|^t - g(i)}, \quad t \in (t_0, \infty)\] In particular \[\zeta(t) = \prod_{i \in I} \frac{|i|^t}{|i|^t - 1}, \quad t \in (t_0, \infty)\]
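For the standard semigroup this is the classical Euler product. A quick numerical check (Python sketch; truncating the product at the primes below 10,000 is an arbitrary illustrative choice):

    from sympy import primerange
    from scipy.special import zeta

    t = 2.0
    prod = 1.0
    for q in primerange(2, 10_000):
        prod *= q ** t / (q ** t - 1)

    print(prod, zeta(t))   # both close to pi^2 / 6 = 1.6449...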

Open the simulation of the standard zeta distribution on \(\N_+\). Vary \(t\) with the scrollbar and note the shape of the probability density function and the value of the rate constant \(\alpha = 1 / \zeta(t)\). Run the simulation and compare the empirical density function to the probability density function.

The standard moment results can be rephrased as follows:

If \(X\) has the Dirichlet distribution for \((S, \cdot)\) with completely multiplicative function \(g\) and parameter \(t\) in the interval of convergence, then \(\E[\tau_k(X)] = G^k(t)\) for \(k \in \N\). In particular, if \(X\) has the zeta distribution for \((S, \cdot)\) with parameter \(t\) in the interval of convergence then \(\E[\tau_k(X)] = \zeta^k(t)\) for \(k \in \N\).

Next we obtain a result from the paper by Lin. Our proof is better because it takes advantage of the general theory of positive semigroups. Let \(L\) denote the adjacency kernel of \((S, \cdot)\). Suppose that \(g: S \to [0, \infty)\) is a nonnegative arithmetic function, not identically zero, and let \(G\) be the corresponding Dirichlet series.

Suppose that \(X\) has the zeta distribution for \((S, \cdot)\) with parameter \(t\) in the interval of convergence of \(\zeta\) and \(G\). Then \[\E[ (g L)(X)] = G(t)\]

Details:

It follows immediately from the basic moment result that \[\E[(g L)(X)] = \zeta(t) \E[g(X)]\] since \(1 / \zeta(t)\) is the rate constant of the exponential distribution of \(X\). But \[\zeta(t) \E[g(X)] = \zeta(t) \sum_{x \in S} \frac{g(x)}{|x|^t \zeta(t)} = \sum_{x \in S} \frac{g(x)}{|x|^t} = G(t)\]

Connections to the Poisson Distribution

We give another representation in terms of independent Poisson variables. This was obtained by Lin, but we give an alternate derivation based on the exponential distribution. We give the result first in terms of the parameters of the prime powers.

Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). For \(x \in S_+\), let \(\lambda_x = (1 - p_i)^n / n\) if \(x = i^n\) for some \(i \in I\) and \(n \in \N_+\), and let \(\lambda_x = 0\) otherwise. Then \(X\) can be written in the form \[X = \prod_{x \in S_+} x^{V_x}\] where \((V_x: x \in S_+)\) is a sequence of independent variables and \(V_x\) has the Poisson distribution with parameter \(\lambda_x\) for \(x \in S_+\).

Details:

We start with the representation above. By results in Section 4.1, we can write the geometrically distributed prime exponents in the form \[N_i = \sum_{n = 1}^\infty n V_{i n}, \quad i \in I\] where \(\{V_{i n}: i \in I, \, n \in \N_+\}\) are independent and \(V_{i n}\) has the Poisson distribution with parameter \((1 - p_i)^n / n\) for \(i \in I\) and \(n \in \N_+\). Substituting we have \[X = \prod_{i \in I} i^{N_i} = \prod_{i \in I} \prod_{n = 1}^\infty i^{n V_{i n}} = \prod_{i \in I} \prod_{n = 1}^\infty (i^n)^{V_{i n}}\] Now, for \(x \in S_+\), let \(V_x = V_{i n}\) if \(x = i^n\) for some \(i \in I\) and \(n \in \N_+\), and let \(V_x = 0\) otherwise. Then \((V_x: x \in S_+)\) is a sequence of independent variables, \(V_x\) has the Poisson distribution with parameter \(\lambda_x\) for \(x \in S_+\), and \[X = \prod_{x \in S_+} x^{V_x}\]
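A simulation sketch of the key step, checking that \(\sum_{n=1}^\infty n V_n\) is geometric when \(V_n\) has the Poisson distribution with parameter \((1 - p)^n / n\) (the parameter \(p = 0.4\) and the truncation at \(n = 50\) are illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    p = 0.4
    n = np.arange(1, 51)
    lam = (1 - p) ** n / n                    # Poisson parameters

    V = rng.poisson(lam, size=(200_000, len(n)))
    N = (n * V).sum(axis=1)                   # should be geometric on {0, 1, ...}

    for k in range(5):
        print(k, (N == k).mean(), p * (1 - p) ** k)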

Here is a restatement of this result in terms of Dirichlet distributions. This version uses the Mangoldt function \(\Lambda\) defined in Section 1.

Suppose that \(X\) has the Dirichlet distribution for \((S, \cdot)\) with completely multiplicative coefficient function \(g\) and parameter \(t\). Then \[X = \prod_{x \in S_+} x^{V_x}\] where \((V_x: x \in S_+)\) is a sequence of independent variables, and for \(x \in S_+\), \(V_x\) has the Poisson distribution with parameter \[\lambda_x = \frac{g(x) \Lambda(x)}{|x|^t \ln |x|}\]

The exponential distribution on \((S, \cdot)\) is a compound Poisson distribution. Once again we give the result first in terms of the parameters of the prime powers.

Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p = (p_i: i \in I) \in \ms P\). Then \(X\) has a compound Poisson distribution. Specifically \[X = V_1 V_2 \cdots V_M\] where \(\bs V = (V_1, V_2, \ldots)\) is a sequence of independent copies of \(V\), where in turn \(V\) takes values in the set of prime powers \(\{i^n: i \in I, \, n \in \N_+\}\) and has probability density function \[\P(V = i^n) = -\frac{(1 - p_i)^n}{n \sum_{j \in I} \ln p_j}, \quad i \in I, \, n \in \N_+\] The random index \(M\) is independent of \(\bs V\) and has the Poisson distribution with parameter \(-\sum_{i \in I} \ln p_i\).

Details:

Consider the representation given in the theorem. For \(k \in \N_+\) and \(i \in I\), random variable \(V_k\) is a power of \(i\) with probability \(\ln p_i / \sum_{j \in I} \ln p_j\), independently over \(k\). It follows that the number of factors \(M_i\) that are powers of \(i\) has the Poisson distribution with parameter \(-\ln p_i\). Moreover, for \(k \in \N_+\) and \(i \in I\), the conditional distribution of \(V_k\) given that \(V_k\) is a power of \(i\) is the logarithmic distribution with parameter \(p_i\): \[ \P(V_k = i^n \mid V_k \text{ is a power of } i) = - \frac{(1 - p_i)^n}{n \ln p_i}, \quad n \in \N_+ \] By grouping the factors according to the prime powers, it follows that \(X\) has the same distribution as \(\prod_{i \in I} i^{N_i}\) where \(N_i = U_{i,1} + U_{i,2} + \cdots + U_{i,M_i}\) for \(i \in I\) with the following properties satisfied:

  1. \(U_{i,k}\) has the logarithmic distribution with parameter \(p_i\) for \(i \in I\) and \(k \in \N_+\), given above.
  2. \(M_i\) has the Poisson distribution with parameter \(-\ln p_i\) for \(i \in I\).
  3. The random variables \(\{U_{i,k}, M_i: i \in I, \, k \in \N_+\}\) are independent.

It follows that \(\bs U_i = (U_{i,1}, U_{i,2}, \ldots)\) is an independent sequence and that \(M_i\) is independent of \(\bs U_i\). By the standard compound result in Section 4.1, \(N_i\) has the geometric distribution with success parameter \(p_i\) for \(i \in I\) and \((N_i: i \in I)\) is an independent sequence. Hence \(X\) has the exponential distribution with parameter \(\bs p = (p_i: i \in I)\).
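Here is a simulation sketch of the compound Poisson construction on a two-prime semigroup (illustrative parameters; the logarithmic support is truncated at exponent 40), checking that the exponent of the prime 2 in the product comes out geometric with success parameter \(p_2\):

    import numpy as np

    rng = np.random.default_rng(4)
    p = np.array([0.5, 0.7])        # parameters for the primes 2 and 3

    # pmf of V on prime powers i^n, truncating n at 40 for simplicity
    ns = np.arange(1, 41)
    denom = -np.log(p).sum()        # Poisson parameter of M
    items, probs = [], []
    for j, pj in enumerate(p):
        for n in ns:
            items.append((j, n))    # V = (prime j) ** n
            probs.append((1 - pj) ** n / (n * denom))
    probs = np.array(probs) / np.sum(probs)   # renormalize after truncation

    N2 = []
    for _ in range(100_000):
        m = rng.poisson(denom)
        ks = rng.choice(len(items), size=m, p=probs)
        # exponent of 2 in X = V_1 V_2 ... V_M
        N2.append(sum(items[k][1] for k in ks if items[k][0] == 0))
    N2 = np.array(N2)

    for k in range(4):
        print(k, (N2 == k).mean(), 0.5 * 0.5 ** k)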

Here is a restatement of this result in terms of Dirichlet distributions.

Suppose that \(X\) has the Dirichlet distribution on \((S, \cdot)\) corresponding to a positive, completely multiplicative function \(g\) and Dirichlet series \(G\), with parameter \(t\) in the interval of convergence. Then \(X\) has a compound Poisson distribution. Specifically \[X = V_1 V_2 \cdots V_M\] where \(\bs V = (V_1, V_2, \ldots)\) is a sequence of independent copies of \(V\), and where in turn \(V\) takes values in the set of prime powers \(\{i^n: i \in I, \, n \in \N_+\}\) and has probability density function \[\P(V = i^n) = \frac{g^n(i)}{n |i|^{n t} \ln G(t)}, \quad i \in I, \, n \in \N_+\] The random index \(M\) is independent of \(\bs V\) and has the Poisson distribution with parameter \(\ln G(t)\).

Entropy

Our next two results concern entropy. Once again, the first result is stated in terms of the parameters of the prime exponents.

Suppose again that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). Then \(X\) maximizes entropy over all random variables \(Y = \prod_{i \in I} i^{M_i}\) in \(S\) with \((M_i: i \in I)\) independent and \(\E(M_i) = (1 - p_i) / p_i\) for each \(i \in I\). The maximum entropy is \[H(X) = -\sum_{i \in I} \left[\ln(p_i) + \ln(1 - p_i) \frac{1 - p_i}{p_i}\right]\]

Here is a restatement of this result in terms of Dirichlet distributions.

Suppose that \(X\) has the Dirichlet distribution on \((S, \cdot)\) corresponding to a positive, completely multiplicative function \(g\). Then \(X\) maximizes entropy over all random variables \(Y \in S\) with \(\E(\ln |Y|) = \E(\ln |X|)\) and \(\E[\ln g(Y)] = \E[\ln g(X)]\).

From the paper by Gut, \[\E(\ln |X|) = \frac{1}{G(t)} \sum_{x \in S} \ln(|x|) g(x) |x|^{-t} = -\frac{G^\prime (t)}{G(t)}\] where \(G\) is the Dirichlet series corresponding to \(g\).

Moments of the Norm

Our next results concern moments of the norm of an exponential variable.

Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). Suppose also that \(k \in (0, \infty)\) and that \(p_i \gt 1 - 1 / |i|^k\) for \(i \in I\). Then \[\E\left(|X|^k\right) = \prod_{i \in I} \frac{p_i}{1 - |i|^k (1 - p_i)}\]

Details:

This follows from the moment result above, since \(x \mapsto |x|^k\) is completely multiplicative, and the probability generating function of the geometric distribution with success parameter \(p \in (0, 1)\) is \(t \mapsto p / [1 - t(1 - p)]\) for \(|t| \lt 1 / (1 - p)\).
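A simulation check on a three-prime semigroup with \(k = 1\) (Python sketch; the parameters are illustrative and chosen so that \(|X|\) even has a finite second moment, which keeps the sample mean stable):

    import numpy as np

    rng = np.random.default_rng(5)
    primes = np.array([2, 3, 5])
    p = np.array([0.8, 0.9, 0.97])   # p_i > 1 - 1/i, so E(X) < infinity

    N = rng.geometric(p, size=(500_000, 3)) - 1
    X = (primes.astype(float) ** N).prod(axis=1)

    exact = np.prod(p / (1 - primes * (1 - p)))   # k = 1, with |x| = x
    print("simulated E(X):", X.mean(), "exact:", exact)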

Here is a restatement of this result in terms of Dirichlet distributions.

Suppose that \(X\) has the Dirichlet distribution on \((S, \cdot)\) corresponding to a positive, completely multiplicative function \(g\) and Dirichlet series \(G\), with parameter \(t\) in the interval of convergence \((t_0, \infty)\). If \(k \lt t - t_0\) then \[\E(|X|^k) = \frac{G(t - k)}{G(t)}\]

Details:

In terms of the Dirichlet formulation, the density function \(f\) of \(X\) is given by \(f(x) = g(x) / [|x|^t G(t)]\) for \(x \in S\). Hence \[\E(|X|^k) = \sum_{x \in S} |x|^k \frac{g(x)}{|x|^t G(t)} = \frac{1}{G(t)} \sum_{x \in S} \frac{g(x)}{|x|^{t-k}} = \frac{G(t - k)}{G(t)}\] assuming that \(t - k \gt t_0\). The result also follows from the previous theorem and the product expansion of the Dirichlet series \(G\), since \(p_i = 1 - g(i) / |i|^t\) for \(i \in I\).

Random Walks

Suppose now that \(\bs X = (X_1, X_2, \ldots)\) is a sequence of independent copies of \(X\), where \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). Let \(Y_n = X_1 \cdots X_n\) for \(n \in \N_+\) so that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \cdot)\) associated with \(X\).

For \(n \in \N_+\), \[Y_n = \prod_{i \in I} i^{M_{n i}}\] where \(\bs M_n = \{M_{n i}: i \in I\}\) is an independent sequence of variables and \(M_{n i}\) has the negative binomial distribution on \(\N\) with parameters \(n\) and \(p_i\) for \(i \in I\).

Hence \(Y_n\) has density function \(f_n\) given by \[f_n(x) = \prod_{i \in I} \binom{n + m_i - 1}{m_i} (1 - p_i)^{m_i} p_i^n, \quad x = \prod_{i \in I} i^{m_i} \in S\]

Suppose now that \(\bs X = (X_1, X_2, \ldots)\) is a sequence of independent copies of \(X\) where \(X\) has the Dirichlet distribution for \((S, \cdot)\) corresponding to the completely multiplicative function \(g\), Dirichlet series \(G\), and parameter \(t\) in the interval of convergence. Let \(Y_n = X_1 X_2 \cdots X_n\) for \(n \in \N_+\) so that once again, \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \cdot)\) associated with \(X\).

For \(n \in \N_+\), \(Y_n\) has density function \(f_n\) given by \[f_n(x) = \frac{\tau_{n - 1}(x) g(x)}{G^n(t) |x|^t}, \quad x \in S\]

Thus, \(Y_n\) also has a Dirichlet distribution for \(n \in \N_+\), but one corresponding to a multiplicative coefficient function rather than a completely multiplicative one. As a corollary, we have the following result in analytic number theory that seemingly has nothing to do with probability.

Suppose that \(g\) is a completely multiplicative arithmetic function for \((S, \cdot)\) with Dirichlet series \(G\) that has interval of convergence \((t_0, \infty)\). Then \[\sum_{x \in S} \frac{\tau_{n - 1}(x) g(x)}{|x|^t} = G^n(t), \quad t \in (t_0, \infty), \, n \in \N_+\]

In the special case of the zeta distribution for \((S, \cdot)\) with parameter \(t \gt 1\), the density function \(f_n\) of \(Y_n\) is given by \[f_n(x) = \frac{\tau_{n - 1}(x)}{\zeta^n(t) |x|^t}, \quad x \in S\] for \(n \in \N_+\), so in this special case it follows that \[\sum_{x \in S} \frac{\tau_{n - 1}(x)}{|x|^t} = \zeta^n(t), \quad n \in \N_+\]
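A numerical check of the case \(n = 2\) for the standard semigroup, where \(\tau_1 = \tau\) is the ordinary divisor-count function (Python sketch; truncating the sum at 20,000 terms is an illustrative choice):

    from sympy import divisor_count
    from scipy.special import zeta

    t = 3.0
    s = sum(divisor_count(x) / x ** t for x in range(1, 20_001))
    print(s, zeta(t) ** 2)   # sum of tau(x) / x^t versus zeta(t)^2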

In both formulations, the density function of \(Y_n\) is a special case of the general theory in Section 2.5. That is, for \(n \in \N_+\) and \(x \in S\), \(f_n(x)\) is the product of the \(n\)th power of the rate constant, the path function of order \(n - 1\) at \(x\), and the reliability function at \(x\).

Related Graphs

The exponential distribution on \((S, \cdot)\) has constant rate not only for \((S, \preceq)\) but also for the other graphs naturally associated with \((S, \preceq)\): the strict partial order graph \((S, \prec)\) of \((S, \preceq)\), the covering graph \((S, \upa)\) of \((S, \preceq)\), and the reflexive closure \((S, \Upa)\) of \((S, \upa)\). We have seen this type of result before, in Chapter 4 on the standard discrete spaces and in Chapter 5 on rooted trees and free semigroups. Once again, we give the results in terms of the geometric parameters of the prime powers and in terms of the Dirichlet formulation.

Suppose that \(X\) has the exponential distribution on \((S, \cdot)\) with parameter \(\bs p \in \ms P\). As usual, we write \(x \in S\) in the form \(x = \prod_{i \in I} i^{n_i}\) where \(\bs n = (n_i: i \in I) \in R\).

  1. For the graph \((S, \prec)\), \(X\) has constant rate \(\prod_{i \in I} p_i \big/ \left(1 - \prod_{i \in I} p_i\right)\) and reliability function \(F_1\) given by \[F_1(x) = \left[1 - \prod_{i \in I} p_i\right] \prod_{i \in I} (1 - p_i)^{n_i}, \quad x \in S\]
  2. For the graph \((S, \upa)\), \(X\) has constant rate \(1 \big/ \sum_{j \in I} (1 - p_j)\) and reliability function \(F_2\) given by \[F_2(x) = \sum_{j \in I} (1 - p_j) \left[\prod_{i \in I} p_i (1 - p_i)^{n_i}\right], \quad x \in S\]
  3. For the graph \((S, \Upa)\), \(X\) has constant rate \(1 \big/ \left[1 + \sum_{j \in I} (1 - p_j)\right]\) and reliability function \(F_3\) given by \[F_3(x) = \left[1 + \sum_{j \in I} (1 - p_j)\right] \left[\prod_{i \in I} p_i (1 - p_i)^{n_i}\right], \quad x \in S\]
Details:

Recall that \(X\) has reliability function \(F\) for \((S, \preceq)\) given by \(F(x) = \prod_{i \in I} (1 - p_i)^{n_i}\) for \(x \in S\) and has constant rate \(\prod_{i \in I} p_i\) for the graph.

  1. This follows from results on reflexive closure in Section 1.6.
  2. For the covering graph \((S, \upa)\), \begin{align*} F_2(x) &= \sum_{j \in I} f(x j) = \sum_{j \in I} \left[\prod_{i \in I - \{j\}} p_i (1 - p_i)^{n_i}\right] p_j (1 - p_j)^{n_j + 1} \\ &= \sum_{j \in I} \left[\prod_{i \in I} p_i (1 - p_i)^{n_i}\right] (1 - p_j) = f(x) \sum_{j \in I} (1 - p_j), \quad x \in S \end{align*}
  3. This follows from (b) and results on reflexive closure.

Suppose that \(X\) has the Dirichlet distribution on \((S, \cdot)\) corresponding to the completely multiplicative function \(g\), Dirichlet series \(G\) and parameter \(t\) in the interval of convergence.

  1. For the graph \((S, \prec)\), \(X\) has constant rate \(1 / [G(t) - 1]\) and reliability function \(F_1\) given by \[F_1(x) = \left[1 - \frac{1}{G(t)}\right] \frac{g(x)}{|x|^t}, \quad x \in S\]
  2. For the graph \((S, \upa)\), \(X\) has constant rate \(1 / \sum_{j \in I} g(j) / |j|^t\) and reliability function \(F_2\) given by \[F_2(x) = \frac{g(x) |x|^{-t}}{G(t)} \sum_{j \in I} g(j) / |j|^t, \quad x \in S\]
  3. For the graph \((S, \Upa)\), \(X\) has constant rate \(1 \big/ \left[1 + \sum_{j \in I} g(j) / |j|^t\right]\) and reliability function \(F_3\) given by \[F_3(x) = \frac{g(x) |x|^{-t}}{G(t)} \left[1 + \sum_{j \in I} g(j) / |j|^t \right], \quad x \in S\]

Other constant rate distributions can be obtained for any of the graphs by mixing two or more constant rate distributions of the types given in the theorem, with the same rate.

Characterize all constant rate distributions for each of the four graphs.