\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\lfa}{\leftarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\lfrta}{\leftrightarrow}\) \(\newcommand{\nea}{\nearrow}\) \(\newcommand{\sea}{\searrow}\) \(\newcommand{\nwa}{\nwarrow}\) \(\newcommand{\swa}{\swarrow}\) \(\newcommand{\Rta}{\Rightarrow}\) \(\newcommand{\bs}{\boldsymbol}\)
Reliability
4. Standard Discrete Spaces

2. Variations

In this section we study some graphs that are closely related to the standard discrete graph \((\N, \le)\) studied in Section 1. Two of the graphs are based on reflexive closure as discussed in Section 1.6.

Graphs related to \((\N, \le)\):

  1. \((\N, \lt)\)
  2. \((\N, \upa)\), the covering graph of \((\N, \le)\).
  3. \((\N, \rta)\), the reflexive closure of \((\N, \upa)\).
  4. \((\N, \upa^n)\) where \(\upa^n\) is the composition power of \(\upa\) of order \(n \in \N\).
Details:
  1. Note that \((\N, \lt)\) is the reflexive reduction of \((\N, \le)\). Also, \((\N, \lt)\) is isomorphic to the graph \((\N_+, \lt)\), with isomorphism \(x \mapsto x + 1\). In turn, \((\N_+, \lt)\) is the graph associated with the strict positive semigroup \((\N_+, +)\).
  2. Recall that \(x \upa y\) if and only if \(y = x + 1\).
  3. Recall that \(x \rta y\) if and only if \(y = x\) or \(y = x + 1\).
  4. Recall that \(x \upa^n y\) if and only if \(y = x + n\).

Basics

As usual, we start with the path functions and the path generating function.

The path function \(\gamma_n\) of order \(n \in \N\) is given as follows:

  1. For the graph \((\N, \lt)\), \(\gamma_n(x) = \binom{x}{n}\) for \(x \in \N\).
  2. For the graph \((\N, \upa)\), \(\gamma_n(x) = \bs 1(x \ge n)\) for \(x \in \N\).
  3. For the graph \((\N, \rta)\), \(\gamma_n(x) = \sum_{k=0}^x \binom{n}{k}\) for \(x \in \N\).
Details:
  1. Note that this expression is correct when \(n = 0\). Assume that the expression is correct for a given \(n \in \N\). Then \[\gamma_{n + 1}(x) = \sum_{z=0}^{x-1} \gamma_n(z) = \sum_{z = 0}^{x - 1} \binom{z}{n} = \binom{x}{n + 1}\] by the hockey stick identity. For a combinatorial proof, to construct a path of length \(n\) in \((\N, \lt)\) ending in \(x\) we need to select a subset of size \(n\) from \(\{0, \ldots, x - 1\}\). The number of ways to do this is \(\binom{x}{n}\).
  2. A path is a sequence of consecutive nonnegative integers. So if \(x \lt n\) there are no paths of length \(n\) ending in \(x\). If \(x \ge n\) there is one such path, namely \((x - n, x - n + 1, \ldots, x - 1, x)\).
  3. This follows from general results on reflexive closure. Note that \(\gamma_n(x) = 2^n\) if \(x \ge n\).
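As a sanity check, the path functions can be computed directly from the recursion \(\gamma_{n+1}(x) = \sum_z \gamma_n(z)\), where the sum is over the left neighbors \(z\) of \(x\). A minimal Python sketch (the function names and test ranges are our choices):

```python
from math import comb

# Path functions computed from the recursion
# gamma_{n+1}(x) = sum of gamma_n(z) over the left neighbors z of x.

def gamma_lt(n, x):
    # graph (N, <): left neighbors of x are 0, 1, ..., x - 1
    if n == 0:
        return 1
    return sum(gamma_lt(n - 1, z) for z in range(x))

def gamma_up(n, x):
    # graph (N, up): the only left neighbor of x is x - 1
    if n == 0:
        return 1
    return gamma_up(n - 1, x - 1) if x >= 1 else 0

def gamma_ra(n, x):
    # graph (N, ->): left neighbors of x are x and x - 1
    if n == 0:
        return 1
    return gamma_ra(n - 1, x) + (gamma_ra(n - 1, x - 1) if x >= 1 else 0)

# compare with the closed forms in parts (a)-(c)
for n in range(6):
    for x in range(8):
        assert gamma_lt(n, x) == comb(x, n)
        assert gamma_up(n, x) == (1 if x >= n else 0)
        assert gamma_ra(n, x) == sum(comb(n, k) for k in range(x + 1))
```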

The path generating function \(\Gamma\) is given as follows:

  1. For the graph \((\N, \lt)\), \[\Gamma(x, t) = (1 + t)^x, \quad x \in \N, \, t \in \R\]
  2. For the graph \((\N, \upa)\), \[\Gamma(x, t) = \frac{1 - t^{x+1}}{1 - t}, \quad x \in \N, \, t \in \R \setminus \{1\}\] Since the series is a finite geometric sum, the formula holds for all \(t \ne 1\).
  3. For the graph \((\N, \rta)\), \[\Gamma(x, t) = \frac{t^{x+1} - (1 - t)^{x+1}}{(1 - t)^{x+1}(2 t - 1)}, \quad x \in \N, \, t \in (-1, 1)\]
Details:
  1. This follows from part (a) of the path function result above and the binomial theorem.
  2. This follows from part (b) of the path function result above and geometric series.
  3. This follows from part (c) of the path function result above, the binomial theorem, and geometric series.
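The generating functions can be checked against the defining series \(\Gamma(x, t) = \sum_{n=0}^\infty \gamma_n(x) t^n\). A sketch with arbitrarily chosen \(x\) and \(t\), truncating the one infinite series:

```python
from math import comb, isclose

x, t = 4, 0.3  # arbitrary choices

# (N, <): the series terminates at n = x and sums to (1 + t)^x
s = sum(comb(x, n) * t**n for n in range(x + 1))
assert isclose(s, (1 + t)**x)

# (N, up): a finite geometric series
s = sum(t**n for n in range(x + 1))
assert isclose(s, (1 - t**(x + 1)) / (1 - t))

# (N, ->): infinite series, truncated; gamma_n(x) grows only
# polynomially in n, so 200 terms are ample for |t| < 1
s = sum(sum(comb(n, k) for k in range(x + 1)) * t**n for n in range(200))
closed = (t**(x + 1) - (1 - t)**(x + 1)) / ((1 - t)**(x + 1) * (2*t - 1))
assert isclose(s, closed)
```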

Probability

Suppose that \(X\) is a random variable in \(\N\) with probability density function \(f\). As usual, when a particular graph is under discussion, we assume that the graph supports the distribution of \(X\).

The reliability function \(F\) and the rate function \(r\) of \(X\) are given in terms of the density function \(f\) as follows. In each case, \(F\) uniquely determines \(f\).

  1. For the graph \((\N, \lt)\), \begin{align*} F(x) &= \sum_{y = x + 1}^\infty f(y), \quad x \in \N \\ r(x) &= \frac{f(x)}{\sum_{y = x + 1}^\infty f(y)}, \quad x \in \N \end{align*}
  2. For the graph \((\N, \upa)\), \begin{align*} F(x) &= f(x + 1), \quad x \in \N \\ r(x) &= \frac{f(x)}{f(x + 1)}, \quad x \in \N \end{align*}
  3. For the graph \((\N, \rta)\), \begin{align*} F(x) &= f(x) + f(x + 1), \quad x \in \N \\ r(x) &= \frac{f(x)}{f(x) + f(x + 1)}, \quad x \in \N \end{align*}
Details:

The formulas for \(F\) and \(r\) follow immediately from the definitions of the graphs.

  1. The density function \(f\) can be recovered from \(F\) by \(f(0) = 1 - F(0)\) and \(f(x) = F(x - 1) - F(x)\) for \(x \in \N_+\).
  2. The density function \(f\) can be recovered from \(F\) by \(f(x) = F(x - 1)\) for \(x \in \N_+\) and then \(f(0) = 1 - \sum_{x=1}^\infty f(x)\).
  3. The density function \(f\) can be recovered from \(F\) by \[f(x) = \sum_{n=0}^{x-1} (-1)^{x - n - 1} F(n) + (-1)^x \left[2 - \sum_{n=0}^\infty F(n)\right], \quad x \in \N\] This follows by induction from \(f(x + 1) = F(x) - f(x)\), together with the identity \(\sum_{n=0}^\infty F(n) = 2 - f(0)\).
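The recovery in part (c) can be checked numerically: since \(F(x) = f(x) + f(x + 1)\), the density satisfies \(f(x + 1) = F(x) - f(x)\), with \(f(0) = 2 - \sum_{n=0}^\infty F(n)\). A sketch using a geometric density (the parameter and truncation point are arbitrary choices):

```python
# Recover f from F for the graph (N, ->), where F(x) = f(x) + f(x + 1)
p = 0.4
N = 200  # truncation point; the geometric tail beyond this is negligible
f = [p * (1 - p)**x for x in range(N)]       # geometric density on N
F = [f[x] + f[x + 1] for x in range(N - 1)]  # reliability for (N, ->)

g = [2 - sum(F)]                 # f(0) = 2 - sum of F(n)
for x in range(N - 2):
    g.append(F[x] - g[x])        # f(x + 1) = F(x) - f(x)

assert all(abs(f[x] - g[x]) < 1e-9 for x in range(50))
```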

Each of the graphs above is stochastic. That is, the \(\sigma\)-algebra associated with the graph (the \(\sigma\)-algebra generated by the collection of right neighbor sets) is \(\ms P(\N)\), and as noted earlier, the reliability function of a distribution on \(\ms P(\N)\) uniquely determines the distribution. The standard moment results as discussed in Section 1.3 are given next.

The standard moment results are as follows:

  1. For the graph \((\N, \lt)\), \[\sum_{x=0}^\infty \binom{x}{n} \P(X \gt x) = \E\left[\binom{X}{n + 1}\right], \quad n \in \N\]
  2. For the graph \((\N, \upa)\), \[\sum_{x=0}^\infty \bs 1(x \ge n) \P(X = x + 1) = \E[\bs 1(X \ge n + 1)], \quad n \in \N\]
  3. For the graph \((\N, \rta)\), \[\sum_{x=0}^\infty \sum_{k=0}^x \binom{n}{k}[\P(X = x) + \P(X = x + 1)] = \E\left[\sum_{k=0}^X \binom{n+1}{k}\right], \quad n \in \N\]
Details:

Suppose that \(X\) is a random variable in \(\N\) with reliability function \(F\) and path function \(\gamma_n\) of order \(n \in \N\) with respect to a graph on \(\N\). The standard moment result is \[\sum_{x=0}^\infty \gamma_n(x) F(x) = \E[\gamma_{n+1}(X)], \quad n \in \N\]

  1. When \(n = 0\) this is a standard result in elementary probability theory: \(\sum_{x=0}^\infty \P(X \gt x) = \E(X)\).
  2. Note that this is equivalent to the obvious result \(\sum_{x=n}^\infty \P(X = x + 1) = \P(X \ge n + 1)\) for \(n \in \N\).
  3. When \(n = 0\), this reduces to the obvious result \(\sum_{x=0}^\infty [\P(X = x) + \P(X = x + 1)] = 1 + \P(X \ge 1)\).
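A numerical spot-check of the moment identity for the graph \((\N, \lt)\), using a geometric distribution with truncated support (the parameter values are arbitrary):

```python
from math import comb

# Check: sum over x of C(x, n) P(X > x) = E[C(X, n + 1)]
p = 0.3
N = 400  # truncation point for the geometric support
f = [p * (1 - p)**x for x in range(N)]
tail = [sum(f[y] for y in range(x + 1, N)) for x in range(N)]  # P(X > x)

for n in range(4):
    lhs = sum(comb(x, n) * tail[x] for x in range(N))
    rhs = sum(comb(x, n + 1) * f[x] for x in range(N))
    assert abs(lhs - rhs) < 1e-8
```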

As usual, constant rate distributions are of special interest.

Random variable \(X\) has constant rate for \((\N, \lt)\) if and only if \(X + 1\) is exponential for \((\N_+, +)\) if and only if \(X + 1\) is memoryless for \((\N_+, +)\). The distribution with constant rate \(\alpha \in (0, \infty)\) for \((\N, \lt)\) has density function \(f\) given by \[f(x) = \frac{\alpha}{1 + \alpha} \left(\frac{1}{1 + \alpha}\right)^x, \quad x \in \N\]

Details:

Let \(F\) denote the reliability function of \(X\) for \((\N, \lt)\) and recall that \(f(0) = 1 - F(0)\) and \(f(x) = F(x - 1) - F(x)\) for \(x \in \N_+\). For \(\alpha \in (0, \infty)\), the constant rate property \(f = \alpha F\) has the unique solution \begin{align*} F(x) &= \left(\frac{1}{1 + \alpha}\right)^{x + 1}, \quad x \in \N \\ f(x) &= \frac{\alpha}{1 + \alpha} \left(\frac{1}{1 + \alpha}\right)^x, \quad x \in \N \end{align*} Let \(Y = X + 1\), so that \(Y\) has values in \(\N_+\). Then \(Y\) has density function \(g\) given by \(g(y) = f(y - 1)\) for \(y \in \N_+\), and the reliability function \(G\) of \(Y\) for \((\N_+, \lt)\) is given by \(G(y) = F(y - 1)\) for \(y \in \N_+\). If \(X\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for \((\N, \lt)\) above, then \(Y\) is memoryless for \((\N_+, +)\) and has constant rate \(\alpha\) for the associated graph \((\N_+, \lt)\). Hence \(Y\) is exponential for \((\N_+, +)\). Conversely, if \(Y\) is memoryless for \((\N_+, +)\) then \(G(y) = (1 - \beta)^y\) for \(y \in \N_+\) and hence \(g(y) = \beta (1 - \beta)^{y-1}\) for \(y \in \N_+\) where \(\beta = 1 - G(1) \in (0, 1)\). Hence \(Y\) has constant rate \(\alpha = \beta / (1 - \beta) \in (0, \infty)\) for \((\N_+, \lt)\) and therefore \(Y\) is exponential for \((\N_+, +)\). Then also \(X = Y - 1\) has constant rate \(\alpha\) for \((\N, \lt)\).
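A quick numerical sketch of the constant rate property \(f = \alpha F\) for \((\N, \lt)\), with an arbitrary rate value and truncation point:

```python
# Check f(x) = alpha * F(x), where F(x) = P(X > x), for the density above
alpha = 1.5  # arbitrary rate
N = 300      # truncation point; the tail beyond this is negligible
f = [alpha / (1 + alpha) * (1 / (1 + alpha))**x for x in range(N)]
assert abs(sum(f) - 1) < 1e-12  # a proper density

for x in range(100):
    F = sum(f[y] for y in range(x + 1, N))  # P(X > x)
    assert abs(f[x] - alpha * F) < 1e-12
```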

Random variable \(X\) has constant rate \(\alpha \in (1, \infty)\) for \((\N, \upa)\) if and only if \(X\) has density function \(f\) given by \[f(x) = \frac{\alpha - 1}{\alpha} \left(\frac{1}{\alpha}\right)^x, \quad x \in \N\] and in this case, \(X\) has constant rate \(\alpha^n\) for \((\N, \upa^n)\) for each \(n \in \N\).

Details:

Let \(f\) denote the density function of \(X\) and let \(F\) denote the reliability function of \(X\) for \((\N, \upa)\). Then \(F(x) = f(x + 1)\) for \(x \in \N\), so the constant rate property is \(f(x) = \alpha f(x + 1)\) for \(x \in \N\). Hence \(f(x) = (1 / \alpha)^x f(0)\) for \(x \in \N\). In order for \(f\) to be a proper density function, we must have \(\alpha \gt 1\), in which case \(f(x) = (1 - 1 / \alpha) (1 / \alpha)^x\) for \(x \in \N\). The total order graph \((\N, \le)\) is uniform, so from results in Section 1.5, \(X\) has constant rate \(\alpha^n\) for \((\N, \upa^n)\) for each \(n \in \N\).
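Both claims are easy to check numerically, since the reliability function for \((\N, \upa^n)\) is \(F(x) = f(x + n)\). A sketch with an arbitrary \(\alpha\):

```python
# Constant rate alpha for (N, up) and alpha^n for (N, up^n)
alpha = 2.0  # arbitrary rate greater than 1

def f(x):
    return (alpha - 1) / alpha * (1 / alpha)**x

for x in range(20):
    for n in range(1, 5):
        # the reliability function for (N, up^n) is F(x) = f(x + n)
        assert abs(f(x) / f(x + n) - alpha**n) < 1e-9
```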

Consider again the graph \((\N, \upa)\) and suppose that \(X\) is a random variable in \(\N\).

  1. The cumulative rate function \(R_n\) of order \(n \in \N_+\) is given by \(R_n(x) = f(x - n) / f(x)\) for \(x \in \{n, n + 1, \ldots\}\) and \(R_n(x) = 0\) for \(x \in \{0, 1, \ldots, n - 1\}\).
  2. If \(X\) has constant average rate (order 1) then \(X\) has constant rate.
  3. If \(n \ge 2\) then there are distributions with constant average rate of order \(n\) that do not have constant rate.
Details:

Suppose that \(X\) has density function \(f\).

  1. This follows from the definition.
  2. The condition for \(X\) to have constant average rate \(\alpha \in (1, \infty)\) reduces to \(f(x) = f(x - 1) / \alpha\) for \(x \in \N_+\). Hence \(f(x) = (1 / \alpha)^x f(0)\) for \(x \in \N_+\). The condition \(\sum_{x = 0}^\infty f(x) = 1\) then gives \[f(x) = \left(1 - \frac{1}{\alpha}\right) \left(\frac{1}{\alpha}\right)^x, \quad x \in \N\] which is the distribution with constant rate \(\alpha\).
  3. Let \(n \in \{2, 3, \ldots\}\) and \(\alpha \in (1, \infty)\). The condition for \(X\) to have constant average rate \(\alpha^n\) of order \(n\) reduces to \(f(x) = f(x - n) / \alpha^n\) for \(x \in \{n, n + 1, \ldots\}\). Hence \[f(m n + k) = \frac{f(k)}{\alpha^{m n}}, \quad m \in \N, \, k \in \{0, 1, \ldots, n - 1\}\] No conditions are imposed on \(f(k)\) for \(k \in \{0, 1, \ldots, n - 1\}\) other than the normalization \(\sum_{x=0}^\infty f(x) = 1\). If we take \(f(k) = (1 - 1 / \alpha) (1 / \alpha)^k\) for \(k \in \{0, 1, \ldots, n - 1\}\) then we get \(f(x) = (1 - 1 / \alpha) (1 / \alpha)^x\) for \(x \in \N\) and hence \(X\) has constant rate \(\alpha\). But there are infinitely many other choices of \(f(k)\), \(k \in \{0, 1, \ldots, n - 1\}\), that do not lead to constant rate distributions.
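Part (c) can be made concrete. A sketch with \(n = 2\) and \(\alpha = 2\), choosing \(f(0), f(1)\) deliberately out of geometric ratio (the particular values are ours; any normalized pair works):

```python
# Constant average rate of order 2 for (N, up) without constant rate:
# choose f(0), f(1) with f(0) + f(1) = 1 - 1/alpha^2, then extend by
# f(x) = f(x - 2) / alpha^2 for x >= 2.
alpha = 2.0
f = [0.6, 0.15]  # sums to 0.75 = 1 - 1/alpha^2, but f(0)/f(1) != alpha
for x in range(2, 200):
    f.append(f[x - 2] / alpha**2)

assert abs(sum(f) - 1) < 1e-9                      # a proper density
assert all(abs(f[x - 2] / f[x] - alpha**2) < 1e-9  # order-2 rate constant
           for x in range(2, 100))
assert abs(f[0] / f[1] - alpha) > 1                # order-1 rate not constant
```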

Random variable \(X\) has constant rate \(\alpha \in (1/2, 1)\) for \((\N, \rta)\) if and only if \(X\) has density function \(f\) given by \[f(x) = \frac{2 \alpha - 1}{\alpha} \left(\frac{1 - \alpha}{\alpha}\right)^x, \quad x \in \N\]

Details:

The proof follows from Section 1.6 on reflexive closure: \(X\) has constant rate \(\alpha\) for \((\N, \rta)\) if and only if \(X\) has constant rate \(\alpha / (1 - \alpha)\) for \((\N, \upa)\).

Of course, the constant rate distributions for the graphs \((\N, \lt)\), \((\N, \upa)\), and \((\N, \rta)\) in the propositions above, as well as the constant rate distribution for \((\N, \le)\) studied in Section 1, are all geometric distributions. In a sequence of Bernoulli trials, the geometric distribution on \(\N\) governs the number of failures before the first success. The distribution with constant rate \(\alpha \in (0, 1)\) for \((\N, \le)\) is geometric with success parameter \(\alpha\). The distribution with constant rate \(\alpha \in (0, \infty)\) for \((\N, \lt)\) is geometric with success parameter \(\alpha / (1 + \alpha)\). The distribution with constant rate \(\alpha \in (1, \infty)\) for \((\N, \upa)\) is geometric with success parameter \((\alpha - 1) / \alpha\). The distribution with constant rate \(\alpha \in (1 / 2, 1)\) for \((\N, \rta)\) is geometric with success parameter \((2 \alpha - 1) / \alpha\). Here is another way of looking at the results.

Suppose that \(X\) has the geometric distribution on \(\N\) with success parameter \(p \in (0, 1)\), so that \(X\) has density function \(f\) given by \(f(x) = p (1 - p)^x\) for \(x \in \N\). Then

  1. \(X\) has constant rate \(p\) for \((\N, \le)\).
  2. \(X\) has constant rate \(p / (1 - p)\) for \((\N, \lt)\).
  3. \(X\) has constant rate \(1 / (1 - p)^n\) for \((\N, \upa^n)\) for each \(n \in \N\).
  4. \(X\) has constant rate \(1 / (2 - p)\) for \((\N, \rta)\).
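The four rate constants are easy to confirm numerically; a sketch with an arbitrary \(p\) and truncated support:

```python
# Rate constants of a geometric distribution on N for the four graphs
p = 0.3
N = 400  # truncation point
f = [p * (1 - p)**x for x in range(N)]
x = 5
F_le = sum(f[y] for y in range(x, N))      # P(X >= x), graph (N, <=)
F_lt = sum(f[y] for y in range(x + 1, N))  # P(X > x),  graph (N, <)

assert abs(f[x] / F_le - p) < 1e-9
assert abs(f[x] / F_lt - p / (1 - p)) < 1e-9
assert abs(f[x] / f[x + 2] - 1 / (1 - p)**2) < 1e-9        # (N, up^2)
assert abs(f[x] / (f[x] + f[x + 1]) - 1 / (2 - p)) < 1e-9  # (N, ->)
```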

Open the simulation of the geometric distribution with success parameter \(p\). The app shows the rate constants

  1. \(\alpha_1\) for \((\N, \le)\)
  2. \(\alpha_2\) for \((\N, \lt)\)
  3. \(\alpha_3\) for \((\N, \upa)\)
  4. \(\alpha_4\) for \((\N, \rta)\)

Vary \(p\) with the scrollbar and note the shape of the probability density function and the values of the rate constants. Run the simulation and compare the empirical density function to the probability density function.

Random Walks

For any of the graphs considered in this section, recall that the random walk \(\bs Y = (Y_1, Y_2, \ldots)\) associated with a random variable \(X\) can be constructed from a sequence \(\bs X = (X_1, X_2, \ldots)\) of independent copies of \(X\) by means of record variables. Recall also that the random walk \(\bs Y\) corresponding to a constant rate distribution is the most random way to put points in the graph in the sense that given \(Y_{n+1} = x\), the sequence \((Y_1, Y_2, \ldots, Y_n)\) is uniformly distributed on the set of paths of length \(n\) in the graph that terminate in \(x\).

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for the graph \((\N, \lt)\) and that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk associated with \(X\).

  1. The transition density \(P\) of \(\bs Y\) is given by \[P(x, y) = \alpha \left(\frac{1}{1 + \alpha}\right)^{y-x}, \quad x, \, y \in \N, \, x \lt y\]
  2. The density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) for \(n \in \N_+\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n \left(\frac{1}{1 + \alpha}\right)^{x_n + 1}, \quad (x_1, x_2, \ldots, x_n) \in \N^n, \, x_1 \lt x_2 \lt \cdots \lt x_n\]
  3. The density function \(f_n\) of \(Y_n\) for \(n \in \N_+\) is given by \[f_n(x) = \alpha^n \binom{x}{n - 1} \left(\frac{1}{1 + \alpha}\right)^{x + 1}, \quad x \in \{n - 1, n , \ldots\}\]
  4. Given \(Y_{n+1} = x \in \{n, n + 1, \ldots\}\), \((Y_1, Y_2, \ldots, Y_n)\) is uniformly distributed on the \(\binom{x}{n}\) points in the set \[\{(x_1, x_2, \ldots, x_n) \in \N^n: x_1 \lt x_2 \lt \cdots \lt x_n \lt x\}\]
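As a check on part (c), summing the joint density in part (b) over the \(\binom{x}{n - 1}\) tuples with \(x_n = x\) gives \(f_n(x)\), which must sum to 1 over \(x\). A numerical sketch (parameters arbitrary):

```python
from math import comb

# Check that f_n(x) = alpha^n C(x, n - 1) (1/(1 + alpha))^(x + 1)
# sums to 1 over x = n - 1, n, ...
alpha = 1.2  # arbitrary rate
N = 400      # truncation point
for n in range(1, 5):
    total = sum(alpha**n * comb(x, n - 1) * (1 / (1 + alpha))**(x + 1)
                for x in range(n - 1, N))
    assert abs(total - 1) < 1e-8
```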

Suppose that \(X\) has constant rate \(\alpha \in (1, \infty)\) for the graph \((\N, \upa)\) and that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk associated with \(X\).

  1. The transition density \(P\) of \(\bs Y\) is given by \(P(x, x + 1) = 1\) for \(x \in \N\).
  2. The density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) for \(n \in \N_+\) is given by \[g_n(x_1, x_1 + 1, \ldots, x_1 + (n - 1)) = \frac{\alpha - 1}{\alpha} \left(\frac{1}{\alpha}\right)^{x_1}, \quad x_1 \in \N\]
  3. The density function \(f_n\) of \(Y_n\) for \(n \in \N_+\) is given by \[f_n(x) = \frac{\alpha - 1}{\alpha} \left(\frac{1}{\alpha}\right)^{x - n + 1}, \quad x \in \{n - 1, n, n + 1, \ldots\}\]
  4. Given \(Y_{n+1} = x \in \{n, n + 1, \ldots\}\), \((Y_1, Y_2, \ldots, Y_n) = (x - n, x - n + 1, \ldots, x - 1)\) with probability 1.

Suppose that \(X\) has constant rate \(\alpha \in (1/2, 1)\) for the graph \((\N, \rta)\) and that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk associated with \(X\).

  1. The transition density \(P\) of \(\bs Y\) is given by \(P(x, x) = \alpha\) and \(P(x, x + 1) = 1 - \alpha\) for \(x \in \N\).
  2. The density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) for \(n \in \N_+\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^{n-1} \frac{2 \alpha - 1}{\alpha} \left(\frac{1 - \alpha}{\alpha}\right)^{x_n}\] for \((x_1, x_2, \ldots, x_n) \in \N^n\) with \(x_{k+1} \in \{x_k, x_k + 1\}\) for each \(k \in \{1, 2, \ldots, n - 1\}\).
  3. The density function \(f_n\) of \(Y_n\) for \(n \in \N_+\) is given by \[f_n(x) = \alpha^{n-1} \frac{2 \alpha - 1}{\alpha} \left(\frac{1 - \alpha}{\alpha}\right)^x \sum_{k=0}^x \binom{n - 1}{k}, \quad x \in \N \]
  4. Given \(Y_{n+1} = x \in \N\), \((Y_1, Y_2, \ldots, Y_n)\) is uniformly distributed on the \(\sum_{k=0}^x \binom{n}{k}\) points in the set \[\{(x_1, x_2, \ldots, x_n) \in \N^n: x_{k+1} \in \{x_k, x_k + 1\} \text{ for } k \in \{1, 2, \ldots, n - 1\} \text{ and } x_n \in \{x - 1, x\}\}\]
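As with the previous walk, \(f_n\) in part (c) must be a proper density for each \(n\). A numerical sketch (parameters arbitrary):

```python
from math import comb

# Check that f_n for (N, ->) is a proper density for each n
alpha = 0.7  # arbitrary rate in (1/2, 1)
r = (1 - alpha) / alpha
N = 400      # truncation point
for n in range(1, 6):
    total = sum(alpha**(n - 1) * (2 * alpha - 1) / alpha * r**x
                * sum(comb(n - 1, k) for k in range(x + 1))
                for x in range(N))
    assert abs(total - 1) < 1e-8
```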

For the random walk \(\bs Y\) on one of the graphs associated with \(X\) recall the point process \(\bs N = \{N_A: A \subseteq \N\}\) where \(N_A = \#\{n \in \N_+: Y_n \in A\}\). As in the last section for \(x \in \N\) and \(A = \{0, 1, \ldots, x\}\) we will abbreviate \(N_A\) by \(N_x\).

\(\E(N_x)\) is given as follows:

  1. For the graph \((\N, \lt)\), \[\E(N_x) = \frac{\alpha}{1 + \alpha} (x + 1), \quad x \in \N\]
  2. For the graph \((\N, \upa)\), \[\E(N_x) = (x + 1) - \frac{\alpha^{x+1} - 1}{\alpha^{x+2} - \alpha^{x+1}}, \quad x \in \N\]
  3. For the graph \((\N, \rta)\), \[\E(N_x) = \frac{x + 1}{1 - \alpha} - \frac{1}{2 \alpha - 1}\left[1 - \left(\frac{1 - \alpha}{\alpha}\right)^{x+1}\right], \quad x \in \N\]
Details:

Recall that \(\E(N_x) = \E[\Gamma(X, \alpha); X \le x]\) for \(x \in \N\) where \(\Gamma\) is the path generating function for the graph. All of the results follow from the generating functions above and sums of geometric series.
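A numerical sketch comparing each closed form with the defining sum \(\E(N_x) = \sum_{y = 0}^x \Gamma(y, \alpha) f(y)\) (the parameter values are arbitrary):

```python
x = 6  # arbitrary

# (N, <): Gamma(y, t) = (1 + t)^y, constant rate alpha > 0
alpha = 1.5
f = [alpha / (1 + alpha) * (1 / (1 + alpha))**y for y in range(x + 1)]
direct = sum((1 + alpha)**y * f[y] for y in range(x + 1))
assert abs(direct - alpha / (1 + alpha) * (x + 1)) < 1e-12

# (N, up): Gamma(y, t) = (1 - t^(y+1))/(1 - t), constant rate alpha > 1
alpha = 2.0
f = [(alpha - 1) / alpha * (1 / alpha)**y for y in range(x + 1)]
direct = sum((1 - alpha**(y + 1)) / (1 - alpha) * f[y] for y in range(x + 1))
closed = (x + 1) - (alpha**(x + 1) - 1) / (alpha**(x + 2) - alpha**(x + 1))
assert abs(direct - closed) < 1e-12

# (N, ->): constant rate alpha in (1/2, 1)
alpha = 0.7
r = (1 - alpha) / alpha
f = [(2 * alpha - 1) / alpha * r**y for y in range(x + 1)]
G = [(alpha**(y + 1) - (1 - alpha)**(y + 1))
     / ((1 - alpha)**(y + 1) * (2 * alpha - 1)) for y in range(x + 1)]
direct = sum(G[y] * f[y] for y in range(x + 1))
closed = (x + 1) / (1 - alpha) - (1 - r**(x + 1)) / (2 * alpha - 1)
assert abs(direct - closed) < 1e-10
```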

Finally, recall the process of thinning the random walk \(\bs Y\). Specifically, suppose that \(N\) is independent of \(\bs Y\) and has the geometric distribution on \(\N_+\) with success probability \(p \in (0, 1)\), so that \(\P(N = n) = p (1 - p)^{n-1}\) for \(n \in \N_+\). As noted above, \(N\) is exponential for the semigroup \((\N_+, +)\), with rate \(p / (1 - p)\). We accept or reject each point in \(\bs Y\), independently, with probabilities \(p\) and \(1 - p\), respectively. So \(Y_N\) is the first accepted point.

The density function \(h\) of \(Y_N\) is given as follows:

  1. For the graph \((\N, \lt)\), \[h(x) = \frac{\alpha p}{1 + \alpha} \left(\frac{1 + \alpha - \alpha p}{1 + \alpha}\right)^x, \quad x \in \N\]
  2. For the graph \((\N, \upa)\), \[h(x) = \frac{(\alpha - 1) p}{\alpha} \frac{[\alpha (1 - p)]^{x+1} - 1}{\alpha (1 - p) - 1} \left(\frac{1}{\alpha}\right)^x, \quad x \in \N\]
  3. For the graph \((\N, \rta)\), \[h(x) = \frac{p (2 \alpha - 1)}{\alpha} \frac{[\alpha (1 - p)]^{x+1} - [1 - \alpha (1 - p)]^{x+1}}{[1 - \alpha (1 - p)]^{x+1} [2 \alpha (1 - p) - 1]} \left(\frac{1 - \alpha}{\alpha}\right)^x, \quad x \in \N\]
Details:

Recall that \(h\) is given by \(h(x) = p \alpha \Gamma[x, (1 - p) \alpha] F(x)\) for \(x \in \N\), where \(\Gamma\) is the path generating function and \(F\) is the reliability function of \(X\) for the graph. So the results follow from the propositions above.
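As a final check, each thinned density \(h\) must sum to 1. A numerical sketch with arbitrary parameter values and truncation:

```python
# Each thinned density h should sum to 1
p = 0.4  # arbitrary thinning parameter
N = 500  # truncation point

# (N, <) with alpha > 0
alpha = 1.5
h = [alpha * p / (1 + alpha) * ((1 + alpha - alpha * p) / (1 + alpha))**x
     for x in range(N)]
assert abs(sum(h) - 1) < 1e-10

# (N, up) with alpha > 1; write a = alpha (1 - p)
alpha = 2.0
a = alpha * (1 - p)
h = [(alpha - 1) * p / alpha * (a**(x + 1) - 1) / (a - 1) * (1 / alpha)**x
     for x in range(N)]
assert abs(sum(h) - 1) < 1e-10

# (N, ->) with alpha in (1/2, 1)
alpha = 0.7
a = alpha * (1 - p)
r = (1 - alpha) / alpha
h = [p * (2 * alpha - 1) / alpha
     * (a**(x + 1) - (1 - a)**(x + 1)) / ((1 - a)**(x + 1) * (2 * a - 1))
     * r**x for x in range(N)]
assert abs(sum(h) - 1) < 1e-10
```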