\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\upa}{\uparrow}\)

3. Constant Rate Distributions

It is not known whether a constant rate distribution exists for the graph \((S, \subseteq)\) or for the covering graph \((S, \upa)\). In this section we give some preliminary results.

Basics

As usual, we assume that \(X\) is a random variable in \(S\) with probability density function \(f\). From Section 1 recall that \(A_n = \{x \in S: \#(x) = n\}\) for \(n \in \N\), and that \(A_n(x) = \{y \in S: x \subseteq y, \#(y) = \#(x) + n\}\) for \(x \in S\) and \(n \in \N\).

Suppose that \(X\) has constant rate \(\beta \in (0, \infty)\) for the covering graph \((S, \upa)\). Then

  1. \(X\) has constant rate \(n! \beta^n\) for \((S, \upa^n)\).
  2. \(X\) has constant rate \(e^{-1 / \beta}\) for the graph \((S, \subseteq)\).
  3. \(N = \#(X)\) has the Poisson distribution with parameter \(1 / \beta\).
Details:

For \(n \in \N\), let \(F_n\) denote the reliability function of \(X\) for the graph \((S, \upa^n)\).

  1. We show by induction on \(n\) that \[F_n = \frac{1}{n! \beta^n} f\] The result is trivially true when \(n = 0\) and true by assumption when \(n = 1\). Suppose that the result is true for a given \(n \in \N_+\). Then by the constant rate property, the induction hypothesis, and a result in Section 1, we have \begin{align*} F_{n + 1}(x) &= \frac{1}{n + 1} \sum_{i \notin x} F_n(x \cup \{i\}) = \frac{1}{n + 1} \sum_{i \notin x} \frac{1}{n! \beta^n} f(x \cup \{i\}) \\ &= \frac{1}{(n + 1)! \beta^n} F_1(x) = \frac{1}{(n + 1)! \beta^n} \frac{1}{\beta} f(x) = \frac{1}{(n + 1)! \beta^{n + 1}} f(x), \quad x \in S \end{align*}
  2. Since \(X\) has constant rate \(n! \beta^n\) for \((S, \upa^n)\), from another result in Section 1 we have \[F(x) = \sum_{n = 0}^\infty F_n(x) = \sum_{n = 0}^\infty \frac{1}{n! \beta^n} f(x) = e^{1 / \beta} f(x), \quad x \in S\]
  3. Note that \[\P(N = n) = \P(X \in A_n) = F_n(\emptyset) = \frac{1}{n! \beta^n} f(\emptyset), \quad n \in \N\] Summing over \(n \in \N\) gives \(1 = e^{1 / \beta} f(\emptyset)\), so \(f(\emptyset) = e^{-1 / \beta}\) and hence \(N\) has the Poisson distribution with parameter \(1 / \beta\).
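As a numerical sanity check of part 3 (a sketch of ours, not part of the formal development; the value of \(\beta\) is arbitrary), the masses \(\P(N = n) = f(\emptyset) / (n! \beta^n)\) with \(f(\emptyset) = e^{-1/\beta}\) should reproduce the Poisson distribution with parameter \(1 / \beta\):

```python
# Illustrative check (ours, not from the text): with f(emptyset) = e^{-1/beta},
# the masses P(N = n) = f(emptyset) / (n! beta^n) form the Poisson(1/beta) pmf.
import math

beta = 2.5                              # arbitrary rate parameter
f_empty = math.exp(-1 / beta)           # f(emptyset) = e^{-1/beta}
masses = [f_empty / (math.factorial(n) * beta**n) for n in range(50)]

# The masses sum to 1, so they define a probability distribution ...
assert abs(sum(masses) - 1) < 1e-12

# ... and agree term by term with the Poisson pmf with parameter 1/beta.
for n in range(50):
    poisson_n = math.exp(-1 / beta) * (1 / beta)**n / math.factorial(n)
    assert abs(masses[n] - poisson_n) < 1e-12
```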

If \(X\) has constant rate \(\alpha \in (0, 1)\) for \((S, \subseteq)\) then \[ f(x) = \frac{1}{\alpha} \sum_{n = 0}^\infty (-1)^n \sum_{y \in A_n(x)} f(y), \quad x \in S \]

Details:

This follows from the Möbius inversion result in Section 1: \(f(x) = \sum_{n = 0}^\infty (-1)^n \sum_{y \in A_n(x)} F(y)\) for \(x \in S\), where \(F\) is the reliability function of \(X\) for \((S, \subseteq)\). By the constant rate property, \(F = f / \alpha\), and substituting gives the displayed identity.

The first proposition above shows that if \(X\) has constant rate for \((S, \upa)\) then \(X\) also has constant rate for \((S, \subseteq)\). We do not know whether the converse is true. However, we do know that if \(X\) has constant rate for \((S, \upa)\) then \(N = \#(X)\) has a Poisson distribution. Our next result shows that this is also the case if \(X\) has constant rate for \((S, \subseteq)\).

Suppose that \(X\) has constant rate \(\alpha \in (0, 1)\) for \((S, \subseteq)\). Then \[ \alpha \P(N = k) = \E\left[(-1)^{N + k} \binom{N}{k}\right], \quad k \in \N \] Hence \(N\) has the Poisson distribution with parameter \(-\ln \alpha\).

Details:

Let \(f\) denote the probability density function of \(X\), as above. From the previous proposition, \[\P(N = k) = \sum_{x \in A_k} f(x) = \frac{1}{\alpha} \sum_{n = 0}^\infty (-1)^n \sum_{x \in A_k} \sum_{y \in A_n(x)} f(y)\] The last two sums are over all \(x, y \in S\) with \(\#(y) = n + k\) and \(x \subseteq y\). Interchanging the order of summation gives \begin{align*} \P(N = k) &= \frac{1}{\alpha} \sum_{n = 0}^\infty (-1)^n \sum_{y \in A_{n+k}} \sum \{f(y): x \in A_k, \, x \subseteq y\} \\ &= \frac{1}{\alpha} \sum_{n = 0}^\infty (-1)^n \sum_{y \in A_{n + k}} \binom{n + k}{k} f(y)\\ &= \frac{1}{\alpha} \sum_{n = 0}^\infty (-1)^n \binom{n + k}{k} \P(N = n + k) \end{align*} Equivalently (with the substitution \(j = n + k\)), \[\alpha \P(N = k) = \sum_{j = k}^\infty (-1)^{j - k} \binom{j}{k} \P(N = j)\] With the usual convention on binomial coefficients, namely \(\binom{a}{b} = 0\) if \(b \lt 0\) or \(b \gt a\), we have \[\alpha \P(N = k) = \sum_{j = 0}^\infty (-1)^{j + k} \binom{j}{k} \P(N = j) = \E\left[(-1)^{k + N} \binom{N}{k} \right]\] The fact that \(N\) has a Poisson distribution with parameter \(-\ln \alpha\) now follows from the characterizations in the subsection below.
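The identity \(\alpha \P(N = k) = \E\left[(-1)^{N + k} \binom{N}{k}\right]\) can be checked numerically when \(N\) is Poisson with parameter \(-\ln \alpha\). This is an illustrative sketch of ours; the choice \(\alpha = 0.4\) and the series truncation are arbitrary:

```python
# Illustrative check (ours): if N is Poisson with parameter -ln(alpha), then
# alpha * P(N = k) = E[(-1)^(N + k) * C(N, k)] for each k.
import math

alpha = 0.4
lam = -math.log(alpha)                  # Poisson parameter -ln(alpha)
M = 150                                 # truncation point for the series
pmf = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(M)]

for k in range(10):
    lhs = alpha * pmf[k]
    rhs = sum((-1)**(n + k) * math.comb(n, k) * pmf[n] for n in range(M))
    assert abs(lhs - rhs) < 1e-10
```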

For \(k \in \N\), let \(f_k\) denote the conditional probability density function of \(X\) given that \(N = k\), so that \[f_k(x) = \P(X = x \mid N = k), \quad x \in A_k\]

Suppose that \(N\) has the Poisson distribution with parameter \(\lambda\). Then the condition for \(X\) to have constant rate \(e^{-\lambda}\) for \((S, \subseteq)\) is \[ e^{-\lambda} \frac{f_k(x)}{k!} = \sum_{n = 0}^\infty \frac{(-\lambda)^n}{(n + k)!} \sum_{y \in A_n(x)} f_{n + k}(y), \quad k \in \N, \, x \in A_k \]

Details: From the previous proposition we have \[e^{-2 \lambda} \frac{\lambda^k}{k!} f_k(x) = \sum_{n = 0}^\infty (-1)^n \sum_{y \in A_n(x)} e^{-\lambda} \frac{\lambda^{n + k}}{(n + k)!}f_{n + k}(y), \quad k \in \N, \, x \in A_k\] which is equivalent to the condition stated.

Characterizations of the Poisson Distribution

The equation in the theorem above has several simple variations. The goal of this subsection is to show that each of them characterizes the Poisson distribution.

The binomial function \(B\) defined by \[B(k, n) = \binom{n}{k}, \quad k, \, n \in \N\] is a kernel for the total order graph \((\N, \le)\). Its inverse is given by \[B^{-1}(k, n) = (-1)^{n - k} \binom{n}{k}, \quad k, \, n \in \N\]

Details:

The function \(B\) is a kernel for \((\N, \le)\) since \(B(k, n) = 0\) for \(k \gt n\). It has an inverse since \(B(k, k) = 1 \ne 0\) for \(k \in \N\). The functions \(B\) and \(B^{-1}\) as given above are inverses with respect to the usual product operation: \[\sum_{j=k}^n (-1)^{n-j} \binom{n}{j} \binom{j}{k} = \bs 1(k = n), \quad (k, n) \in \N^2\] As always, the kernel \(I\) given by \(I(k, n) = \bs 1(k = n)\) for \((k, n) \in \N^2\) is the identity.
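The inverse relation can be verified numerically, at least on a finite block of \(\N^2\). The following sketch (ours; the truncation point is arbitrary) checks that the kernel product of \(B\) and \(B^{-1}\) is the identity kernel:

```python
# Illustrative check (ours): the kernels B and B^{-1} below satisfy
# sum_j B(k, j) * B_inv(j, n) = 1(k = n), the identity kernel on (N, <=).
import math

def B(k, n):
    # binomial kernel; math.comb(n, k) is 0 when k > n
    return math.comb(n, k)

def B_inv(k, n):
    # claimed inverse kernel
    return (-1)**(n - k) * math.comb(n, k) if k <= n else 0

N = 12                                  # check the identity on {0, ..., N}^2
for k in range(N + 1):
    for n in range(N + 1):
        s = sum(B(k, j) * B_inv(j, n) for j in range(N + 1))
        assert s == (1 if k == n else 0)
```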

Suppose that \(N\) is a random variable in \(\N\) with probability density function \(f\). Then \(N\) satisfies the condition in the theorem above if and only if \[ f(k) = \alpha \E\left[\binom{N}{k}\right] = \alpha \sum_{n = k}^\infty \binom{n}{k} f(n), \quad k \in \N \] Equivalently, \(f\) is an eigenfunction of the binomial operator \(B\) corresponding to the eigenvalue \(1 / \alpha\).

Details:

The condition in the theorem above is equivalent to \[ \alpha f(k) = \E\left[(-1)^{N - k} \binom{N}{k}\right] = \sum_{n = k}^\infty (-1)^{n - k} \binom{n}{k} f(n), \quad k \in \N \] with the underlying assumption that the series on the right converges absolutely. Equivalently, \(f\) is an eigenfunction of \(B^{-1}\) corresponding to the eigenvalue \(\alpha\), on the space \(\ms L_1\), so the result follows from the Möbius inversion formula discussed in Section 1.2.

We will soon collect other conditions that are equivalent, so we will refer to all of them simply as the equivalent conditions with parameter \(\alpha\). Note that \(\alpha = f(0)\).

Suppose that \(N\) has the Poisson distribution on \(\N\) with parameter \(\beta \in (0, \infty)\). Then \(N\) satisfies the equivalent conditions with parameter \(\alpha = e^{-\beta}\).

Details:

The computation is a familiar one, in a variety of contexts. It suffices to show that the function \(g\) given by \(g(n) = \beta^n / n!\) for \(n \in \N\) is an eigenfunction of \(B\) corresponding to the eigenvalue \(e^\beta\): \[\sum_{n=k}^\infty \binom{n}{k} g(n) = \sum_{n = k}^\infty \binom{n}{k} \frac{\beta^n}{n!} = \frac{\beta^k}{k!} \sum_{n = k}^\infty \frac{\beta^{n - k}}{(n - k)!} = e^\beta \frac{\beta^k}{k!} = e^\beta g(k), \quad k \in \N\]
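The eigenfunction computation above is easy to confirm numerically. This sketch (ours; \(\beta\) and the truncation point are arbitrary) checks that \(B g = e^\beta g\) for \(g(n) = \beta^n / n!\):

```python
# Illustrative check (ours): g(n) = beta^n / n! is an eigenfunction of the
# binomial operator B corresponding to the eigenvalue e^beta.
import math

beta = 1.7
M = 120                                 # truncation point for the series

def g(n):
    return beta**n / math.factorial(n)

for k in range(8):
    s = sum(math.comb(n, k) * g(n) for n in range(M))
    assert abs(s - math.exp(beta) * g(k)) < 1e-9
```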

Our goal is to show that the Poisson distribution is the only distribution on \(\N\) satisfying the equivalent conditions.

Suppose that \(N\) is a random variable in \(\N\) with probability generating function \(P\). Then \(N\) satisfies the equivalent conditions with parameter \(\alpha \in (0, 1)\) if and only if \[ P(t) = \alpha P(t + 1), \quad t \in \R \]

Details:

Let \(f\) denote the probability density function of \(N\). Recall that the probability generating function \(P\) is defined by \[P(t) = \E(t^N) = \sum_{n = 0}^\infty f(n) t^n\] for \(t\) in the interval of absolute convergence, which must include at least \((-1, 1]\), since \(P(1) = 1\). Suppose that \(N\) satisfies the equivalent conditions. Then using the condition in the proposition above, a sum interchange, and the binomial theorem we have \begin{align*} P(t) &= \sum_{k = 0}^\infty t^k f(k) = \sum_{k = 0}^\infty t^k \alpha \sum_{n = k}^\infty \binom{n}{k} f(n) \\ &= \alpha \sum_{n = 0}^\infty f(n) \sum_{k = 0}^n \binom{n}{k} t^k = \alpha \sum_{n = 0}^\infty f(n) (1 + t)^n = \alpha P(t + 1) \end{align*} In addition, \(t + 1\) must be in the interval of convergence, so this interval must be \(\R\). Conversely, suppose that \(P(t) = \alpha P(t + 1)\) for \(t \in \R\). Then using the same tricks as before, \[\sum_{k = 0}^\infty t^k f(k) = \sum_{k = 0}^\infty t^k \alpha \sum_{n = k}^\infty \binom{n}{k} f(n), \quad t \in \R\] Equating coefficients gives \[f(k) = \alpha \sum_{n = k}^\infty \binom{n}{k} f(n), \quad k \in \N\]
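In the Poisson case the functional equation can be checked directly, since the probability generating function has the closed form \(P(t) = e^{\beta(t - 1)}\). A quick sketch (ours; \(\beta\) and the test points are arbitrary):

```python
# Illustrative check (ours): the Poisson pgf P(t) = e^{beta (t - 1)} satisfies
# the functional equation P(t) = alpha P(t + 1) with alpha = e^{-beta}.
import math

beta = 0.9
alpha = math.exp(-beta)

def P(t):
    return math.exp(beta * (t - 1))     # Poisson probability generating function

for t in [-2.0, -0.5, 0.0, 1.0, 3.7]:
    assert abs(P(t) - alpha * P(t + 1)) < 1e-12
```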

The function \(t \mapsto P(t + 1)\) is sometimes called the factorial moment generating function, since its derivatives at 0 give the factorial moments \(\E\left[N^{(k)}\right]\) for \(k \in \N\). So among the equivalent conditions is that the probability generating function be proportional to the factorial moment generating function. The following results will be useful in the proof of our main theorem below.

Suppose that \(f\) is a probability density function on \(\N\) satisfying the equivalent conditions with parameter \(\alpha \in (0, 1)\), and with probability generating function \(P\). Then for every \(r \in \N_+\), the function \(f_r\) defined by \[f_r(n) = \alpha^{r - 1} r^n f(n), \quad n \in \N\] is a probability density function satisfying the equivalent conditions with parameter \(\alpha^r\). The probability generating function \(P_r\) of \(f_r\) is given by \(P_r(t) = \alpha^{r - 1} P(r t)\) for \(t \in \R\).

Details:

The generating function \(P_r\) of \(f_r\) is given by \[P_r(t) = \sum_{n = 0}^\infty f_r(n) t^n = \alpha^{r - 1} \sum_{n = 0}^\infty f(n) r^n t^n = \alpha^{r -1} P(r t), \quad t \in \R \] Iterating the condition in the proposition above gives \(P(t) = \alpha^r P(t + r)\) for \(t \in \R\). In particular, since \(P(0) = f(0) = \alpha\), \[P_r(1) = \alpha^{r - 1} P(r) = \alpha^{r - 1} \frac{1}{\alpha^r} P(0) = \frac{\alpha^r}{\alpha^r} = 1\] Hence \(P_r\) is a probability generating function and so \(f_r\) is a probability density function. Note also that \(P_r(0) = f_r(0) = \alpha^{r - 1} f(0) = \alpha^{r}\). Finally, \[P_r(t) = \alpha^{r - 1} P(r t) = \alpha^{r - 1} \alpha^{r} P(r t + r) = \alpha^r \alpha^{r - 1} P[r (t + 1)] = \alpha^r P_r(t + 1), \quad t \in \R\] so \(f_r\) satisfies the equivalent conditions with parameter \(\alpha^r\).
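The construction of \(f_r\) can be illustrated concretely in the Poisson case. The following sketch (ours; \(\beta\) and the truncation points are arbitrary) starts from a Poisson density \(f\) with parameter \(\beta\), so that \(\alpha = e^{-\beta}\), and checks that \(f_r\) is a probability density, in fact the Poisson density with parameter \(r \beta\):

```python
# Illustrative check (ours): starting from a Poisson density f with parameter
# beta (so alpha = e^{-beta}), the function f_r(n) = alpha^{r-1} r^n f(n)
# sums to 1 and is the Poisson density with parameter r * beta.
import math

beta = 0.8
alpha = math.exp(-beta)

def f(n):
    return math.exp(-beta) * beta**n / math.factorial(n)

for r in [1, 2, 3, 5]:
    f_r = [alpha**(r - 1) * r**n * f(n) for n in range(150)]
    assert abs(sum(f_r) - 1) < 1e-10            # f_r is a probability density
    for n in range(30):                         # and it is Poisson(r * beta)
        poisson_rb = math.exp(-r * beta) * (r * beta)**n / math.factorial(n)
        assert abs(f_r[n] - poisson_rb) < 1e-12
```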

Note that \(f_1 = f\). If any one of these distributions is Poisson, then they all are.

If \(f_r\) is a Poisson density function for some \(r \in \N_+\) then \(f_r\) is a Poisson density function for all \(r \in \N_+\), with parameter \(\beta r\) where \(\beta = -\ln \alpha\).

Details:

Suppose that \(f_r\) is Poisson for some \(r \in \N_+\). Note that \(f_r(0) = \alpha^{r - 1} f(0) = \alpha^r\). Hence the Poisson parameter is \(\beta r\) where \(\beta = - \ln \alpha\). So \[f_r(n) = e^{-r \beta} \frac{(r \beta)^n}{n!} = e^{-(r - 1) \beta} r^n f(n), \quad n \in \N\] It follows that \(f(n) = e^{-\beta} \beta^n / n!\) for \(n \in \N\) so that \(f\) is the Poisson density function with parameter \(\beta\). Then clearly \(f_r\) is the Poisson density with parameter \(\beta r\) for every \(r \in \N_+\).

Suppose again that \(f\) is a probability density function satisfying the equivalent conditions, with parameter \(\alpha \in (0, 1)\), and that \(r \in \N_+\). Suppose that \((X, N)\) is a random variable in \(\N^2\) with probability density function \(g_r\) given by \[ g_r(k, n) = \alpha^r r^k \binom{n}{k} f(n), \quad (k, n) \in \N^2 \]

  1. \(X\) has probability density function \(f_r\).
  2. \(N\) has probability density function \(f_{r + 1}\).
  3. For \(n \in \N\), the conditional distribution of \(X\) given \(N = n\) is binomial with parameters \(n\) and \(r / (r + 1)\).
Details:
  1. From the definition, \[\sum_{n = k}^\infty g_r(k, n) = \alpha^r r^k \sum_{n = k}^\infty \binom{n}{k} f(n) = \alpha^r r^k \frac{1}{\alpha} f(k) = \alpha^{r - 1} r^k f(k) = f_r(k), \quad k \in \N\] It follows that \(g_r\) is a valid probability density function on \(\N^2\) and that \(X\) has probability density function \(f_r\).
  2. Next, \[\sum_{k = 0}^n g_r(k, n) = \alpha^r f(n) \sum_{k = 0}^n \binom{n}{k} r^k = \alpha^r (r + 1)^n f(n) = f_{r + 1}(n), \quad n \in \N\] so \(N\) has probability density function \(f_{r+1}\).
  3. For \(n \in \N\), the conditional density of \(X\) given \(N = n\) is \[\frac{g_r(k, n)}{f_{r + 1}(n)} = \frac{\alpha^r r^k \binom{n}{k} f(n)}{\alpha^r (r + 1)^n f(n)} = \binom{n}{k}\left(\frac{r}{r + 1}\right)^k \left(\frac{1}{r + 1}\right)^{n - k}, \quad k \in \{0, 1, \ldots, n\}\]
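The three properties of the joint density \(g_r\) can be confirmed numerically in the Poisson case. A sketch of ours (the values of \(\beta\), \(r\), and the truncation point are arbitrary):

```python
# Illustrative check (ours) of the joint density g_r: with f Poisson(beta),
# X has density f_r, N has density f_{r+1}, and given N = n, X is
# binomial with parameters n and r / (r + 1).
import math

beta, r = 0.7, 2
alpha = math.exp(-beta)
M = 120                                  # series truncation

def f(n):
    return math.exp(-beta) * beta**n / math.factorial(n)

def g(k, n):
    return alpha**r * r**k * math.comb(n, k) * f(n)

# 1. the marginal density of X is f_r
for k in range(15):
    marginal = sum(g(k, n) for n in range(k, M))
    assert abs(marginal - alpha**(r - 1) * r**k * f(k)) < 1e-12

# 2. the marginal density of N is f_{r+1}
for n in range(15):
    marginal = sum(g(k, n) for k in range(n + 1))
    assert abs(marginal - alpha**r * (r + 1)**n * f(n)) < 1e-12

# 3. the conditional density of X given N = n is binomial(n, r / (r + 1))
n, p = 6, r / (r + 1)
for k in range(n + 1):
    cond = g(k, n) / (alpha**r * (r + 1)**n * f(n))
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(cond - binom) < 1e-12
```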

A probability distribution on \(\N\) satisfies the equivalent conditions if and only if it is Poisson.

Details:

We already showed above that the Poisson distribution satisfies the equivalent conditions, so we just need to show the converse. We will use the Rao-Rubin characterization of the Poisson distribution: suppose that \((X, N)\) takes values in \(\N^2\). If the conditional distribution of \(X\) given \(N\) is binomial with parameters \(N\) and \(p\), and if the distribution of \(X\) is the same as the conditional distribution of \(X\) given \(X = N\), then \(N\) has a Poisson distribution.

Towards that end, suppose that \(f\) satisfies the equivalent conditions and let \(P\) denote the corresponding probability generating function. Let \(r \in \N_+\) and let \((X, N)\) have the probability density function \(g_r\) given above. Then \[\P(X = N) = \sum_{n = 0}^\infty \alpha^r r^n \binom{n}{n} f(n) = \alpha^r P(r) = \alpha^r \frac{1}{\alpha^r} P(0) = \alpha\] Hence \begin{align*} \P(X = k \mid X = N) &= \frac{\P(X = k, X = N)}{\P(X = N)} = \frac{\P(X = k, N = k)}{\P(X = N)}\\ &= \frac{\alpha^r r^k \binom{k}{k} f(k)}{\alpha} = \alpha^{r - 1} r^k f(k) = f_r(k) = \P(X = k), \quad k \in \N \end{align*} It follows that \(N\) has the Poisson distribution. From the previous proposition, \(f\) is the Poisson density function with parameter \(\beta = - \ln \alpha\).
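The two computations in the proof, \(\P(X = N) = \alpha\) and \(\P(X = k \mid X = N) = f_r(k)\), can be checked numerically in the Poisson case. A sketch of ours (the values of \(\beta\), \(r\), and the truncation point are arbitrary):

```python
# Illustrative check (ours) of the Rao-Rubin setup: with f Poisson(beta),
# P(X = N) = alpha, and the conditional density of X given X = N is f_r,
# the (unconditional) density of X.
import math

beta, r = 0.5, 3
alpha = math.exp(-beta)
M = 150                                  # series truncation

def f(n):
    return math.exp(-beta) * beta**n / math.factorial(n)

def g(k, n):
    return alpha**r * r**k * math.comb(n, k) * f(n)

p_diag = sum(g(n, n) for n in range(M))  # P(X = N)
assert abs(p_diag - alpha) < 1e-12

for k in range(12):
    cond = g(k, k) / p_diag              # P(X = k | X = N)
    f_r_k = alpha**(r - 1) * r**k * f(k) # f_r(k) = P(X = k)
    assert abs(cond - f_r_k) < 1e-12
```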

Thus, the conditions in the propositions of this subsection characterize the Poisson distribution. These conditions are analytically more elegant than most characterizations of the Poisson distribution, but the probabilistic interpretation is less clear.