\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\lfrta}{\leftrightarrow}\)

5. Constant Rate Distributions

This section has the most important definitions and results of the chapter.

Basic Results

Our starting point is a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\). For \(n \in \N_+\) the corresponding product space is \((S^n, \ms S^n, \lambda^n)\). Suppose now that \((S, \rta)\) is a measurable graph. If \(X\) is a random variable in \(S\), recall the definitions of the reliability function \(F\) and the cumulative rate function \(R_n\) of order \(n \in \N_+\) of \(X\) for \((S, \rta)\) given in Section 3. Recall also the path function \(\gamma_n\) of order \(n \in \N_+\) for \((S, \rta)\) defined in Section 1. Here are the main definitions, stated so that we avoid division by 0 problems.

Relative to the graph \((S, \rta)\),

  1. \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) if \(f = \alpha F\) is a probability density function of \(X\).
  2. \(X\) has constant average rate \(\beta_n \in (0, \infty)\) of order \(n \in \N_+\) if \(R_n = \beta_n \gamma_n\).

Here is our first trivial result:

If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N_+\).

Details:

If \(r(x) = \alpha\) for \(x \in S\) then from the definition, the cumulative rate function \(R_n\) of order \(n \in \N_+\) is given by \(R_n(x) = \alpha^n \gamma_n(x)\) for \(x \in S\).

So as the name suggests, the rate function \(r = f / F\) is the constant \(\alpha\). Recall that the reference measure \(\lambda\) is fixed in advance, and is usually a natural measure in some sense for the measurable space \((S, \ms S)\). For example, counting measure is always used for discrete graphs and Lebesgue measure for graphs on measurable subsets of \(\R^n\), or more generally, Hausdorff measure on Euclidean manifolds. Trivially, if \(X\) has constant rate \(\alpha \in (0, \infty)\) with respect to \(\lambda\) and if \(c \in (0, \infty)\) then \(X\) has constant rate \(\alpha / c\) with respect to the measure \(c \lambda\). But more importantly, if we can choose the measure a posteriori, then every distribution has constant rate.

Suppose that \(X\) is supported by \((S, \rta)\) and let \(\mu\) be the measure on \((S, \ms S)\) defined by \[\mu(A) = \E[1 / F(X), X \in A], \quad A \in \ms S\] Then \(X\) has constant rate \(1\) for \((S, \rta)\) with respect to \(\mu\).

Details:

Note first that since \(F\) is positive, \(\mu\) is a \(\sigma\)-finite positive measure. By definition, \(\mu\) has density function \(1 / F\) with respect to the distribution of \(X\), and hence the distribution of \(X\) has density function \(F\) with respect to \(\mu\). That is, \(\P(X \in A) = \int_A F(x) d \mu(x)\) for \(A \in \ms S\).

For the remainder of this section, we assume again that \(\lambda\) is a fixed reference measure. Suppose that \(X\) has density function \(f\) with respect to \(\lambda\), and let \(\mu\) denote the measure defined in the proposition above and \(P\) the distribution of \(X\). In the notation of Radon-Nikodym derivatives, we have \(d \mu = (1 / F) \, d P\) and \(d P = f \, d \lambda\). It follows that \(d \mu = (f / F) \, d \lambda = r \, d \lambda\) where \(r\) is the rate function of \(X\) for \((S, \rta)\) with respect to \(\lambda\). So the proposition is hardly surprising: \(X\) has constant rate 1 with respect to the measure \(\mu\) whose density is the rate function of \(X\) with respect to \(\lambda\).

Let \(L\) denote the adjacency kernel of the graph \((S, \rta)\), considered as an operator on the space \(\ms L_1\).

  1. If \(f\) is the density function of a distribution with constant rate \(\alpha \in (0, \infty)\) then \(1 / \alpha\) is a right eigenvalue of \(L\) and \(f\) a corresponding right eigenfunction.
  2. Conversely, if \(L\) has a right eigenvalue \(\beta \in (0, \infty)\) and a corresponding positive right eigenfunction \(g \in \ms L_1\), then \(f = g / \|g\|_1\) is the probability density function of a distribution with constant rate \(1 / \beta\).
Details:

Recall that if \(f\) is a probability density function then the corresponding reliability function for \((S, \rta)\) is \(F = L f\).

In the finite case, we know the answer to the existence question.

Suppose that \((S, \rta)\) is a finite, strongly connected graph. Then there exists a unique constant rate distribution. The rate constant is the reciprocal of the largest eigenvalue and the probability density function is the normalized positive eigenfunction.

Details:

This is just the Perron-Frobenius theorem in disguise. The largest eigenvalue of the adjacency matrix \(L\) is positive and has multiplicity 1, with a corresponding eigenfunction that is strictly positive.
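The existence result is easy to check numerically. The following sketch (Python with NumPy, not part of the text; the three-vertex strongly connected graph is a hypothetical example) computes the largest eigenvalue of an adjacency matrix and verifies that the normalized Perron eigenfunction has constant rate equal to the reciprocal of that eigenvalue:

```python
import numpy as np

# Hypothetical strongly connected graph on 3 vertices:
# edges 0 -> 1, 1 -> 2, 2 -> 0, 0 -> 2
L = np.array([[0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# By the Perron-Frobenius theorem, the largest eigenvalue is positive
# and simple, with a strictly positive eigenfunction.
w, v = np.linalg.eig(L)
k = np.argmax(w.real)
lam = w[k].real                    # largest eigenvalue
f = np.abs(v[:, k].real)
f /= f.sum()                       # normalized probability density
F = L @ f                          # reliability function F = L f
alpha = 1 / lam                    # rate constant

assert np.allclose(f, alpha * F)   # constant rate: f = alpha F
```

The same computation applies verbatim to any finite, strongly connected graph.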

Suppose that \((S, \rta)\) is a discrete graph with the property that \(x \rta x\) for some \(x \in S\). If there exists a distribution with constant rate \(\alpha\) then \(\alpha \in (0, 1]\).

Details:

Suppose that \(x \rta x\) and that \(X\) has density function \(f\) and has reliability function \(F\) for \((S, \rta)\). If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \[f(x) = \alpha F(x) = \alpha \left[f(x) + \sum_{x \rta y, \, y \ne x} f(y) \right]\] By the support assumption, \(F(x) \gt 0\) and hence \(f(x) \gt 0\). Rearranging the displayed equation we have \[(1 - \alpha) f(x) = \alpha \sum_{x \rta y, \, y \ne x} f(y)\] If \(\alpha \gt 1\), the left side is negative, but of course the right side is nonnegative.

In particular, this result applies to any discrete reflexive graph, and hence to any discrete partial order graph. The following result shows that a mixture of distributions with a common constant rate also has that constant rate. From the point of view of linear algebra, this corresponds to an eigenvalue with multiplicity greater than one.

Suppose that \(F\) and \(G\) are reliability functions for distributions supported by \((S, \rta)\), each with constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). For \(p \in (0, 1)\), the function \(H = p F + (1 - p)G\) is also the reliability function for a distribution with constant rate \(\alpha\) for \((S, \rta)\).

Details:

Recall that if \(f\) and \(g\) are probability density functions with corresponding reliability functions \(F\) and \(G\) for \((S, \rta)\), respectively, and if \(p \in (0, 1)\), then the density function \(h = p f + (1 - p) g\) has reliability function \(H = p F + (1 - p) G\). In this case, \(f = \alpha F\) and \(g = \alpha G\) so \[h = p f + (1 - p) g = p \alpha F + (1 - p) \alpha G = \alpha [p F + (1 - p) G] = \alpha H\]

The following result gives a simple condition for the uniform distribution to have constant rate. Recall that the graph \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) if \(\lambda\{y \in S: x \rta y\} = k\) for all \(x \in S\).

Suppose \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\). If \(\lambda(S) \lt \infty\) then the uniform distribution on \(S\) has constant rate \(1 / k\).

Details:

Suppose that \(\lambda(S) \lt \infty\) and let \(f(x) = 1 / \lambda(S)\) for \(x \in S\) so that \(f\) is the density of the uniform distribution on \(S\). Then the reliability function is \[F(x) = \int_{x \rta y} \frac{1}{\lambda(S)} \, d\lambda(y) = \frac{\lambda\{y \in S: x \rta y\}}{\lambda(S)} = \frac{k}{\lambda(S)}\] and so \(f = \frac{1}{k} F\).

Combining the last two results, we see that if \((S, \rta)\) is a finite, strongly connected, right regular graph, then the uniform distribution on \(S\) is the unique constant rate distribution. More generally, if \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) and we know that the eigenspace of the eigenvalue \(k\) has dimension 1, then \((S, \rta)\) has a unique constant rate distribution (the uniform distribution) if \(\lambda(S) \lt \infty\) and has no constant rate distribution if \(\lambda(S) = \infty\). However, it's easy to see that not every finite graph has a constant rate distribution.

Let \((S, \rta)\) be a finite graph with vertices \(x, \, y \in S\) satisfying the following properties:

  1. \(x \rta x\) and \(x \not \rta z\) for \(z \in S-\{x\}\).
  2. \(y \rta y\) and \(y \rta w\) for some \(w \in S - \{y\}\).

Then \((S, \rta)\) does not support a constant rate distribution.

Details:

Note that there is a loop at \(x\) and this is the only edge from \(x\). There is a loop at \(y\) and at least one other edge from \(y\) (to \(w\)). Suppose \(f\) is a probability density function on \(S\) with constant rate, and let \(F\) denote the reliability function of \(f\) for \((S, \rta)\). Then \(F(x) = f(x)\) so the rate constant must be 1. But then \(f(y) = F(y)\) and also \(F(y) \ge f(y) + f(w)\). So we have \(f(w) = F(w) = 0\) and this contradicts the support assumption that \(F(t) \gt 0\) for all \(t \in S\).

For the following proposition, recall that if \((S, \preceq)\) is a discrete partial order graph with covering relation \(\upa\), then \(\upa^n\) denotes the \(n\)-fold composition power of \(\upa\) for \(n \in \N\), where by convention, \(\upa^0\) is the equality relation \(=\).

Suppose that \((S, \preceq)\) is a discrete, locally finite, uniform partial order graph with the property that if \(x \upa^n y\) then \(\#\{u \in S: x \upa u, u \upa^{n-1} y\} = a_n \in \N_+\) for \(n \in \N_+\), independent of \(x, \, y \in S\). If \(X\) has constant rate \(\beta \in (0, \infty)\) for \((S, \upa)\) then

  1. For \(n \in \N\), \(X\) has constant rate \(\beta^n a_1 a_2 \cdots a_n\) for \((S, \upa^n)\).
  2. \(X\) has constant rate \(\alpha\) for \((S, \preceq)\) where \[\frac{1}{\alpha} = \sum_{n=0}^\infty \frac{1}{\beta^n a_1 a_2 \cdots a_n}\] assuming that the infinite series is finite.
Details:

Let \(f\) denote the density function of \(X\). Let \(F\) denote the reliability function of \(X\) for \((S, \preceq)\) and for \(n \in \N\), let \(G_n\) denote the reliability function of \(X\) for \((S, \upa^n)\).

  1. We prove this by induction on \(n\). Note that \(a_1 = 1\), so the result is true by assumption when \(n = 1\). The result is trivial for \(n = 0\) since \(G_0(x) = f(x)\) for \(x \in S\). Assume that the result holds for a given \(n \in \N\). Then \begin{align*} G_{n+1}(x) &= \sum_{x \upa^{n+1}y} f(y) = \frac{1}{a_{n+1}} \sum_{x \upa u} \sum_{u \upa^n y} f(y) = \frac{1}{a_{n+1}} \sum_{x \upa u} G_n(u) \\ &= \frac{1}{a_{n+1}} \sum_{x \upa u} \frac{1}{\beta^n a_1 \cdots a_n} f(u) = \frac{1}{\beta^n a_1 \cdots a_{n+1}}G_1(x) = \frac{1}{\beta^{n+1} a_1 \cdots a_{n+1}} f(x), \quad x \in S \end{align*}
  2. Recall that since \((S, \preceq)\) is uniform, \(F = \sum_{n=0}^\infty G_n\). Hence \[F(x) = \sum_{n=0}^\infty G_n(x) = \sum_{n=0}^\infty \frac{1}{\beta^n a_1 \cdots a_n} f(x) = f(x) \sum_{n=0}^\infty \frac{1}{\beta^n a_1 \cdots a_n}, \quad x \in S\]

Note that the series in part (b) is finite if \(\beta \gt 1\), since \(a_n \ge 1\) for each \(n \in \N_+\).
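For a concrete check of the proposition, take the standard discrete graph \((\N, \le)\), whose covering relation is \(x \upa y\) if and only if \(y = x + 1\), so that \(a_n = 1\) for every \(n\). The sketch below (Python with NumPy; the truncation of \(\N\) to a finite range and the particular rate are arbitrary choices) verifies that the geometric density has constant rate \(\beta = 1 / (1 - \alpha)\) for the covering graph, and recovers the rate \(\alpha\) for \((\N, \le)\) from the series in part (b):

```python
import numpy as np

a = 0.4                              # rate of the geometric distribution
x = np.arange(2000)                  # finite truncation of N
f = a * (1 - a) ** x                 # geometric density on N

# Covering graph (N, ^): x ^ y iff y = x + 1, so G_1(x) = f(x + 1)
G1 = np.append(f[1:], 0.0)
beta = 1 / (1 - a)                   # constant rate for the covering graph
assert np.allclose(f[:-1], beta * G1[:-1])

# Part (b): with a_n = 1, the rate for (N, <=) is 1 / sum_n beta^(-n)
alpha = 1 / np.sum(beta ** -np.arange(60.0))
F = np.cumsum(f[::-1])[::-1]         # F(x) = sum_{y >= x} f(y)
assert np.allclose(f[:100], alpha * F[:100])
```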

Suppose that \((S, \rta)\) is a transitive graph, and for \(x \in S\) let \(A_x = \{y \in S: x \rta y\}\) (the right neighbor set of \(x\)). If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then the conditional distribution of \(X\) given \(x \rta X\) has constant rate \(\alpha\) for \((A_x, \rta)\) for each \(x \in S\).

Details:

This is an immediate consequence of the result in Section 3 since the rate function of the conditional distribution for \((A_x, \rta)\) is the restriction of the rate function of \(X\) to \(A_x\).

In particular, this proposition holds for a partial order graph \((S, \preceq)\).

Moments

The basic moment result in Section 3 simplifies significantly when \(X\) has constant rate. Once again, \(L\) denotes the adjacency kernel of \((S, \rta)\).

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) and that \(g: S \to [0, \infty)\) is measurable. Then \[\E[(g L^n)(X)] = \frac{1}{\alpha^n} \E[g(X)], \quad n \in \N\]

Details:

By assumption, \(f = \alpha F\) is a density of \(X\), so \[\int_S (g L^n)(x) F(x) \, d\lambda(x) = \frac{1}{\alpha} \int_S (g L^n)(x) f(x) \, d\lambda(x) = \frac{1}{\alpha} \E[(g L^n)(X)]\] Hence from the basic moment result \[\frac{1}{\alpha} \E[(g L^n)(X)] = \E[(g L^{n+1})(X)]\] The result then follows by induction on \(n\), since \(L^0 = I\), the identity kernel.

The result also holds if \(g: S \to \R\) is measurable and \(\E[|g(X)|] \lt \infty\). Recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \[\gamma_n(x) = \bs 1 L^n(x) = \lambda^n\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}, \quad x \in S\] the measure of the initial parts of the paths of length \(n\) that terminate in \(x\).

If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \[\E[\gamma_n(X)] = \frac{1}{\alpha^n}, \quad n \in \N\]

Details:

This follows from the previous result with \(g = \bs 1\).

From this result, note that \[\int_S \alpha^n \gamma_n(x) f(x) \, d\lambda(x) = 1\] so it follows that the function \(f_n\) given by \(f_n(x) = \alpha^n \gamma_n(x) f(x)\) for \(x \in S\) is a probability density function. We will return to this point shortly.
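For example, for the standard continuous graph \(([0, \infty), \le)\) treated below, \(\gamma_n(x) = x^n / n!\), so the moment identity states that \(\E(X^n) = n! / \alpha^n\) for an exponential variable. A quick Monte Carlo sketch (Python with NumPy; the rate, sample size, and seed are arbitrary):

```python
import math
import numpy as np

rng = np.random.default_rng(7)
alpha = 1.5
# Exponential distribution: constant rate alpha on ([0, inf), <=)
X = rng.exponential(1 / alpha, size=1_000_000)

# gamma_n(x) = x^n / n!, so E[gamma_n(X)] should equal 1 / alpha^n
for n in range(1, 5):
    est = np.mean(X ** n) / math.factorial(n)
    assert abs(est - alpha ** -n) < 0.05 * alpha ** -n
```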

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then

  1. \(\gamma_1(x) \lt \infty\) for almost all \(x \in S\).
  2. If \(\gamma_1(x) = c \in (0, \infty)\) for almost all \(x \in S\) then \(\alpha = 1 / c\).
Details:

Recall that the graph \((S, \rta)\) supports the distribution of \(X\).

  1. If \(\lambda\{x \in S: \gamma_1(x) = \infty\} > 0\) then \(\P[\gamma_1(X) = \infty] > 0\) and hence \(\E[\gamma_1(X)] = \infty\). This contradicts the moment result above.
  2. If \(\gamma_1(x) = c\) for almost all \(x \in S\) then \(\P[\gamma_1(X) = c] = 1\) so \(\E[\gamma_1(X)] = \E[\gamma_1(X); \gamma_1(X) = c] = c\). Hence \(c = 1 / \alpha\).

Part (a) means that if a graph has a constant rate distribution, then necessarily the graph is left finite. The assumption in part (b) means that \((S, \rta)\) is left regular of degree \(c\). The generating function result also simplifies when the distribution has constant rate.

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then the generating function \(\Lambda\) of \(X\) is given by \[\Lambda(t) = \frac{\alpha}{\alpha - t}, \quad |t| \lt \alpha\]

Details:

Recall that \(\Lambda(t) = \E[\Gamma(X, t)]\) where \(\Gamma(x, t) = \sum_{n=0}^\infty \gamma_n(x) t^n\) is the generating function of \((S, \rta)\). Hence using the previous result and Fubini's theorem we have \[\Lambda(t) = \E[\Gamma(X, t)] = \sum_{n=0}^\infty \E\left[\gamma_n(X)\right] t^n = \sum_{n=0}^\infty (t / \alpha)^n = \frac{1}{1 - t / \alpha}, \quad |t| \lt \alpha\]
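As a check, for the standard discrete graph \((\N, \le)\) treated below, \(\Gamma(x, t) = 1 / (1 - t)^{x + 1}\) and the geometric distribution with parameter \(\alpha\) has constant rate \(\alpha\). The sketch below (Python with NumPy; the parameters are arbitrary and the series is truncated) confirms that \(\Lambda(t) = \alpha / (\alpha - t)\):

```python
import numpy as np

alpha, t = 0.4, 0.25                 # need |t| < alpha
x = np.arange(400)                   # finite truncation of N
f = alpha * (1 - alpha) ** x         # geometric density, constant rate alpha
Gamma = 1 / (1 - t) ** (x + 1)       # generating function of (N, <=)
Lam = np.sum(f * Gamma)              # Lambda(t) = E[Gamma(X, t)]

assert np.isclose(Lam, alpha / (alpha - t))
```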

Constant rate distributions maximize entropy among distributions satisfying a certain moment condition. This is an indication that a constant rate distribution governs the most random way to put points in a graph, but we will see a stronger expression of that idea shortly.

Suppose that \(X\) has constant rate and reliability function \(F\) for the graph \((S, \rta)\). Then \(X\) maximizes entropy among all random variables \(Y\) on \(S\) satisfying \[\E\left[\ln F(Y) \right] = \E\left[\ln F(X)\right]\]

Details:

Suppose that \(X\) has constant rate \(\alpha\), so that \(f = \alpha F\) is a density of \(X\). Suppose that \(Y\) is a random variable taking values in \(S\), with density function \(g\). From the general entropy inequality in Section 3 we have \begin{align*} H(Y) &= -\int_S g(x) \ln[g(x)] d \lambda(x) \le -\int_S g(x) \ln[f(x)] d \lambda(x) \\ &= -\int_S g(x) \left(\ln \alpha + \ln[F(x)]\right) d \lambda(x) \\ & = -\ln \alpha - \int_S g(x) \ln[F(x)] d \lambda(x) = -\ln \alpha - \E\left(\ln[F(Y)]\right) \\ & = -\ln \alpha - \E\left(\ln[F(X)]\right) \end{align*} Of course, the entropy of \(X\) achieves the upper bound.

Note that since the reliability function \(F\) typically has an exponential form of some sort, \(\E(\ln[F(Y)])\) often reduces to a natural moment condition.

Random Walks

Results related to the random walk also simplify significantly when the underlying distribution has constant rate. For the next two results, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with random variable \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Recall also that \(\gamma_n\) denotes the path function of order \(n \in \N\).

The transition density \(P^n\) of order \(n \in \N_+\) for \(\bs Y\) is given by \[P^n(x, y) = \alpha^n \frac{F(y)}{F(x)} L^n(x, y), \quad (x, y) \in S^2\]

Details:

Recall from Section 4 that \[P^n(x, y) = \frac{f(y)}{F(x)} R^{(n)}(x, y), \quad (x, y) \in S^2\] where \[R^{(n)}(x, y) = \int_{x \rta x_1 \rta \cdots \rta x_{n-1} \rta y} r(x_1) \cdots r(x_{n-1}) d\lambda^{n-1}(x_1, \ldots, x_{n-1}), \quad (x, y) \in S^2\] Since \(X\) has constant rate \(\alpha\), \(f(x) = \alpha F(x)\) and \(r(x) = \alpha\) for \(x \in S\). Hence \[R^{(n)}(x, y) = \alpha^{n-1} \lambda^{n-1}\{(x_1, \ldots, x_{n-1}): x \rta x_1 \rta \cdots \rta x_{n-1} \rta y\} = \alpha^{n-1} L^n(x, y)\]

Let \(n \in \N_+\).

  1. \((Y_1, Y_2, \ldots, Y_n)\) has density function \(g_n\) given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n F(x_n), \quad x_1 \rta x_2 \rta \cdots \rta x_n\]
  2. \(Y_n\) has density function \(f_n\) given by \[f_n(x) = \alpha^n \gamma_{n - 1}(x) F(x), \quad x \in S\]
  3. For \(x \in S\), the conditional distribution of \((Y_1, Y_2, \ldots, Y_n)\) given \(Y_{n + 1} = x\) is uniform on the set \[\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}\]
Details:

The results follow easily from the constant rate property.

  1. This follows since \(r(x) = \alpha\) and \(f(x) = \alpha F(x)\) for \(x \in S\).
  2. This follows since \(R_n(x) = \alpha^n \gamma_n(x)\) and again \(f(x) = \alpha F(x)\) for \(x \in S\). That is, \(X\) has constant average rate \(\alpha^n\) of order \(n\).
  3. The conditional distribution reduces to \[h(x_1, x_2, \ldots, x_n \mid x) = \frac{1}{\gamma_n(x)}, \quad x_1 \rta x_2 \rta \cdots \rta x_n \rta x\]

For \(n \in \N_+\), the distribution of \(Y_n\) depends only on the rate parameter \(\alpha\), the path function \(\gamma_{n - 1}\) and the reliability function \(F\). Although a simple corollary, part (c) is one of our fundamental results, so we will restate it for emphasis:

The random walk \(\bs Y\) associated with a constant rate distribution (if such a distribution exists) is the most random way to put points in the graph \((S, \rta)\).

In the special case of a finite, strongly connected graph \((S, \rta)\), it follows from the existence result above that the normalized eigenfunction corresponding to the largest eigenvalue of the adjacency matrix \(L\) defines a distribution whose random walk is the most random way to put points in the graph. Here is another simple result that follows from our general theory of random walks.

Suppose that \((S, \lfrta)\) is a symmetric graph and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \lfrta)\), with reliability function \(F\). Let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on \((S, \lfrta)\) associated with \(X\). Then

  1. \(F^2\) is an invariant function for \(\bs Y\).
  2. \(h = \alpha F^2 / \E[F(X)]\) is an invariant probability density function for \(\bs Y\).
Details:

From the general theory in Section 4, \(f F\) is invariant for \(\bs Y\), where \(f\) is the density function of \(X\). Since \(X\) has constant rate \(\alpha\) it follows that \(f = \alpha F\) and hence \(\alpha F^2\) is invariant for \(\bs Y\). But then \(F^2\) is also invariant for \(\bs Y\). The normalizing constant is \[c := \int_S F^2(x) \, d\lambda(x) = \frac{1}{\alpha} \E[F(X)]\] Note that \(c \le 1 / \alpha \lt \infty\) since \(F \le 1\), so \(h = F^2 / c\) is an invariant probability density function for \(\bs Y\).
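A numerical sketch (Python with NumPy; the symmetric graph here, a 5-cycle with one chord, is a hypothetical example) finds the constant rate density via the Perron eigenfunction and verifies that \(F^2\), normalized, is invariant for the random walk:

```python
import numpy as np

# Hypothetical symmetric graph: the 5-cycle with chord {0, 2}
L = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]:
    L[i, j] = L[j, i] = 1.0

w, v = np.linalg.eigh(L)           # symmetric, so eigh applies
f = np.abs(v[:, -1])               # Perron eigenfunction (largest eigenvalue)
f /= f.sum()                       # constant rate density
F = L @ f                          # reliability function
P = L * f / F[:, None]             # random walk: P(x, y) = f(y) / F(x)
h = F ** 2 / np.sum(F ** 2)        # claimed invariant density

assert np.allclose(h @ P, h)       # F^2 normalized is invariant
```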

The next result refers to the point process \(\bs N = \{N_A: A \in \ms S\}\) corresponding to the random walk \(\bs Y\). The expected number of points in a set has a simple expression in terms of the generating function when the underlying distribution has constant rate.

Suppose that \((S, \rta)\) has generating function \(\Gamma\) and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Let \(\bs Y\) denote the random walk on \((S, \rta)\) associated with \(X\). For \(A \in \ms S\), \[\E(N_A) = \E[\Gamma(X, \alpha); X \in A]\]

Details:

Recall that \(\E(N_A) = \E[V(X); X \in A]\) where \(V(x) = \sum_{n = 0}^\infty R_n(x)\) for \(x \in S\). Since \(X\) has constant rate \(\alpha\) we have \[V(x) = \sum_{n=0}^\infty R_n(x) = \sum_{n = 0}^\infty \alpha^n \gamma_n(x) = \Gamma(x, \alpha), \quad x \in S\]

Our next topic is thinning the point process. As above, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Let \(N\) have the geometric distribution on \(\N_+\) with success parameter \(p \in (0, 1)\) so that \[\P(N = n) = p(1 - p)^{n - 1}, \quad n \in \N_+\] As we will see in Chapter 4, \(N\) has constant rate \(p\) for the graph \((\N_+, \le)\). Moreover, we assume that \(N\) and \(\bs Y\) are independent. The basic idea is that we accept a point with probability \(p\) and reject the point with probability \(1 - p\), so that \(Y_N\) is the first point accepted. We are interested in the distribution of \(Y_N\). As before, let \(\Gamma\) denote the path generating function of \((S, \rta)\).

The density function \(g\) of \(Y_N\) is given by \[g(x) = p \Gamma[x, \alpha (1 - p)] f(x), \quad x \in S\]

Details:

As usual, let \(\gamma_n\) denote the path function of \((S, \rta)\) of order \(n \in \N\). Then \begin{align*} g(x) &= \E[f_N(x)] = \sum_{n = 1}^\infty p (1 - p)^{n - 1} f_n(x) = \sum_{n = 1}^\infty p (1 - p)^{n - 1} \alpha^n \gamma_{n - 1}(x) F(x) \\ &= \alpha p F(x) \sum_{n = 1}^\infty [\alpha (1 - p)]^{n - 1} \gamma_{n - 1}(x) = \alpha p F(x) \Gamma[x, \alpha (1 - p)] \\ &= p \Gamma[x, \alpha (1 - p)] f(x), \quad x \in S \end{align*}

In general, \(Y_N\) does not have constant rate for \((S, \rta)\).
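For the standard continuous graph, however, the thinned point is again exponential, with rate \(\alpha p\), as noted in the examples below. A Monte Carlo sketch of the thinning construction (Python with NumPy; the parameters, sample size, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, p = 2.0, 0.3
size = 500_000

# N is geometric on {1, 2, ...}; given N = n, the accepted point Y_N is
# the sum of n independent exponentials, i.e. Gamma(n, rate alpha).
N = rng.geometric(p, size=size)
YN = rng.gamma(shape=N, scale=1 / alpha)

# Y_N should be exponential with rate alpha * p
assert abs(YN.mean() - 1 / (alpha * p)) < 0.02
assert abs(np.mean(YN > 1.0) - np.exp(-alpha * p)) < 0.01
```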

Examples and Exercises

Standard Graphs

For the standard continuous graph \(([0, \infty), \le)\), recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \(\gamma_n(x) = x^n / n!\) for \(x \in [0, \infty)\) and the path generating function \(\Gamma\) is given by \(\Gamma(x, t) = e^{x t}\) for \(x \in [0, \infty)\) and \(t \in \R\).

  1. Random variable \(X\) has constant rate \(\alpha \in (0, \infty)\) if and only if \(X\) has the exponential distribution, with density function \(f\) and reliability function \(F\) given by \(f(x) = \alpha e^{-\alpha x}\) and \(F(x) = e^{-\alpha x}\) for \(x \in [0, \infty)\).
  2. The random walk \(\bs Y = (Y_1, Y_2, \ldots)\) associated with \(X\) has transition density \(P^n\) of order \(n \in \N_+\) given by \[P^n(x, y) = \alpha^n \frac{(y - x)^{n - 1}}{(n - 1)!} e^{-\alpha (y - x)}, \quad x \le y\]
  3. For \(n \in \N_+\), the density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n e^{-\alpha x_n}, \quad 0 \le x_1 \le \cdots \le x_n\]
  4. For \(n \in \N_+\), random variable \(Y_n\) has the gamma distribution with parameters \(n\) and \(\alpha\), with density function \(f_n\) given by \[f_n(x) = \alpha^n \frac{x^{n - 1}}{(n - 1)!} e^{-\alpha x}, \quad x \in [0, \infty)\]
  5. \(\E(N_A) = \alpha \lambda(A)\) for measurable \(A \subseteq [0, \infty)\).
  6. With thinning parameter \(p \in (0, 1)\), \(Y_N\) has the exponential distribution with parameter \(\alpha p\), with density function \(g\) given by \[g(x) = p \Gamma[x, \alpha (1 - p)] f(x) = \alpha p e^{-\alpha p x}, \quad x \in [0, \infty)\]

Of course, all of this is well known. The random walk \(\bs Y\) is the sequence of arrival times of the Poisson process with rate \(\alpha\). The standard continuous graph is studied in detail in Chapter 3.

For the standard discrete graph \((\N, \le)\) recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \(\gamma_n(x) = \binom{x + n}{n}\) for \(x \in \N\) and the path generating function \(\Gamma\) is given by \(\Gamma(x, t) = 1 / (1 - t)^{x+1}\) for \(x \in \N\) and \(t \in (-1, 1)\).

  1. Random variable \(X\) has constant rate \(\alpha \in (0, 1)\) if and only if \(X\) has the geometric distribution, with density function \(f\) and reliability function \(F\) given by \(f(x) = \alpha (1 - \alpha)^x\) and \(F(x) = (1 - \alpha)^x\) for \(x \in \N\).
  2. The random walk \(\bs Y = (Y_1, Y_2, \ldots)\) associated with \(X\) has transition density \(P^n\) of order \(n \in \N_+\) given by \[P^n(x, y) = \alpha^n (1 - \alpha)^{y - x} \binom{y - x + n - 1}{n - 1}, \quad x \le y\]
  3. For \(n \in \N_+\), the density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n (1 - \alpha)^{x_n}, \quad 0 \le x_1 \le \cdots \le x_n\]
  4. For \(n \in \N_+\), random variable \(Y_n\) has the negative binomial distribution with parameters \(n\) and \(\alpha\), with density function \(f_n\) given by \[f_n(x) = \alpha^n \binom{x + n - 1}{n - 1} (1 - \alpha)^x, \quad x \in \N\]
  5. \(\E(N_A) = \frac{\alpha}{1 - \alpha} \#(A)\) for \(A \subseteq \N\).
  6. With thinning parameter \(p \in (0, 1)\), \(Y_N\) has the geometric distribution with parameter \(\alpha p / (1 - \alpha + \alpha p)\), with density function \(g\) given by \[g(x) = p \Gamma[x, \alpha (1 - p)] f(x) = \frac{\alpha p}{1 - \alpha + \alpha p} \left(\frac{1 - \alpha}{1 - \alpha + \alpha p}\right)^x, \quad x \in \N\]

These results are also well known. For \(n \in \N_+\), random variable \(Y_n\) is the number of failures before the \(n\)th success in a Bernoulli trials sequence with success parameter \(\alpha\), so \(\bs Y\) can be thought of as the sequence of arrival times in a renewal process. The standard discrete graph is studied in detail in Chapter 4.
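The thinning formula in part (f) can be confirmed numerically: with \(\Gamma(x, t) = 1 / (1 - t)^{x + 1}\), the density \(p \, \Gamma[x, \alpha(1 - p)] f(x)\) reduces algebraically to the stated geometric form. A short check (Python with NumPy; the parameters and the truncation of \(\N\) are arbitrary):

```python
import numpy as np

alpha, p = 0.35, 0.4
x = np.arange(50)                              # finite truncation of N
f = alpha * (1 - alpha) ** x                   # geometric density on N
Gamma = 1 / (1 - alpha * (1 - p)) ** (x + 1)   # path generating function
g = p * Gamma * f                              # thinned density

q = alpha * p / (1 - alpha + alpha * p)        # claimed geometric parameter
assert np.allclose(g, q * (1 - q) ** x)
assert abs(g.sum() - 1) < 1e-3                 # truncation error only
```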

Complete and Equality Graphs

Return to the general setting of a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\).

Let \((S, \equiv)\) denote the complete reflexive graph, so that \(x \equiv y\) for every \(x, \, y \in S\).

  1. If \(\lambda(S) \lt \infty\) then the uniform distribution is the unique constant rate distribution for \((S, \equiv)\), with rate \(1 / \lambda(S)\).
  2. If \(\lambda(S) = \infty\) there are no constant rate distributions for \((S, \equiv)\).
Details:

For any probability distribution on \(S\), the reliability function \(F\) for \((S, \equiv)\) is the constant function 1. Note also that \((S, \equiv)\) is regular of degree \(\lambda(S)\), so the result for right regular graphs applies.

Recall that \((S, \ne)\) is the complete irreflexive graph. Suppose that \(\lambda\{x\} = c \in [0, \infty)\) for all \(x \in S\).

  1. If \(\lambda(S) \lt \infty\) then the uniform distribution is the unique constant rate distribution for \((S, \ne)\), with rate \(1 / [\lambda(S) - c]\).
  2. If \(\lambda(S) = \infty\) there are no constant rate distributions for \((S, \ne)\).
Details:

Suppose that \(f\) is a probability density function on \(S\). The corresponding reliability function \(F\) is given by \[F(x) = \int_{S - \{x\}} f(y) \, d\lambda(y) = 1 - \int_{\{x\}} f(y) \, d\lambda(y) = 1 - \lambda\{x\} f(x) = 1 - c f(x)\] So the condition for the distribution defined by \(f\) to have constant rate \(\alpha \in (0, \infty)\) is \(f(x) = \alpha [1 - c f(x)]\) for \(x \in S\) or equivalently, \(f(x) = \alpha / (1 + \alpha c)\), a constant. If \(\lambda(S) \lt \infty\) then normalization forces \(\alpha = 1 / [\lambda(S) - c]\) and \(f\) is the density of the uniform distribution on \(S\). If \(\lambda(S) = \infty\) then no choice of \(\alpha \in (0, \infty)\) makes \(f\) a valid density function. Note also that \((S, \ne)\) is regular of degree \(\lambda(S) - c\), so the result for right regular graphs applies.

In the context of the last result, if \(S\) is uncountable and the reference measure \(\lambda\) is continuous, then \(c = 0\) and the rate constant in part (a) is \(1 / \lambda(S)\), just as for the complete reflexive graph. If \(S\) is countable with counting measure \(\lambda = \#\), then \(c = 1\). If \(\#(S) = n \in \{2, 3, \ldots\}\), then the rate constant of the uniform distribution in part (a) is \(1 / (n - 1)\).

Suppose that \(S\) is countable, with counting measure \(\#\) as the reference measure. For the equality graph \((S, =)\), every distribution has constant rate 1.

Details:

If \(f\) is a probability density function on \(S\), then trivially, the corresponding reliability function for \((S, =)\) is \(F = f\).

The Diamond Graph

Recall the diamond graph \((S, \lfrta)\) defined in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \lfrta)\).
  2. Find the density function \(f\) and reliability function \(F\) of \(X\).
Details:
  1. The rate constant is \(\alpha = 2 / (1 + \sqrt{17})\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= f(4) = \frac{\sqrt{17} - 3}{4}, \quad f(2) = f(3) = \frac{5 - \sqrt{17}}{4} \\ F(1) &= F(4) = \frac{7 - \sqrt{17}}{4}, \quad F(2) = F(3) = \frac{\sqrt{17} - 3}{2} \end{align*}
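The closed-form answers can be verified numerically. In the sketch below (Python with NumPy), the diamond graph is encoded as \(K_4\) minus one edge, with vertices \(1, 2, 3, 4\) indexed \(0, \ldots, 3\); the missing edge is taken between vertices 2 and 3, consistent with the answers above:

```python
import numpy as np

# Diamond graph: K4 minus the edge {2, 3} (indices 1 and 2)
L = np.ones((4, 4)) - np.eye(4)
L[1, 2] = L[2, 1] = 0.0

w, v = np.linalg.eigh(L)          # symmetric adjacency matrix
lam = w[-1]                       # largest eigenvalue
f = np.abs(v[:, -1])
f /= f.sum()                      # constant rate density
F = L @ f                         # reliability function

s = np.sqrt(17)
assert np.isclose(lam, (1 + s) / 2)
assert np.allclose(f, [(s - 3) / 4, (5 - s) / 4, (5 - s) / 4, (s - 3) / 4])
assert np.allclose(F, [(7 - s) / 4, (s - 3) / 2, (s - 3) / 2, (7 - s) / 4])
assert np.allclose(f, (2 / (1 + s)) * F)   # constant rate alpha
```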

Open the simulation of the constant rate distribution for the diamond graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the diamond graph \((S, \lfrta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find \(\lim_{n \to \infty} f_n\) and verify that the limit is the invariant density function of \(\bs Y\).
Details:
  1. The transition matrix \(P\) is given by \begin{align*} P(1, 2) &= P(1, 3) = P(4, 2) = P(4, 3) = \frac{9 - \sqrt{17}}{16} \\ P(1, 4) &= P(4, 1) = \frac{2}{1 + \sqrt{17}} \\ P(2, 1) &= P(2, 4) = P(3, 1) = P(3, 4) = \frac 1 2 \end{align*}
  2. The density function \(f_n\) of \(Y_n\) is given by \begin{align*} f_n(1) &= f_n(4) = -\left(1 - \frac{4}{\sqrt{17}}\right) \left(\frac{1 - \sqrt{17}}{1 + \sqrt{17}}\right)^{n-1} + \frac{1}{4}\left(1 + \frac{1}{\sqrt{17}}\right) \\ f_n(2) &= f_n(3) = \left(1 - \frac{4}{\sqrt{17}}\right) \left(\frac{1 - \sqrt{17}}{1 + \sqrt{17}}\right)^{n-1} + \frac{1}{4}\left(1 - \frac{1}{\sqrt{17}}\right) \end{align*}
  3. The limiting values are \begin{align*} \lim_{n \to \infty} f_n(1) &= \lim_{n \to \infty} f_n(4) = \frac{1}{4}\left(1 + \frac{1}{\sqrt{17}}\right) \\ \lim_{n \to \infty} f_n(2) &= \lim_{n \to \infty} f_n(3) = \frac{1}{4}\left(1 - \frac{1}{\sqrt{17}}\right) \end{align*} Moreover, \(\lim_{n \to \infty} f_n\) is \(f F\) normalized and hence is the invariant density function.

Open the simulation of the random walk on the diamond graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.

The Directed Diamond Graph

Recall the directed version of the diamond graph \((S, \rta)\) described in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \rta)\).
  2. Find the density function and reliability function of \(X\).
Details:
  1. The rate constant is \(\alpha = 2^{-1/3}\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and the reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= \frac{2^{1/3}}{1 + 2^{1/3} + 2^{2/3}}, \; f(2) = f(3) = \frac{2^{-1/3}}{1 + 2^{1/3} + 2^{2/3}}, \; f(4) = \frac{1}{1 + 2^{1/3} + 2^{2/3}} \\ F(1) &= \frac{2^{2/3}}{1 + 2^{1/3} + 2^{2/3}}, \; F(2) = F(3) = \frac{1}{1 + 2^{1/3} + 2^{2/3}}, \; F(4) = \frac{2^{1/3}}{1 + 2^{1/3} + 2^{2/3}} \end{align*}

Open the simulation of the constant rate distribution on the directed diamond graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the directed diamond graph \((S, \rta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find the invariant probability density function of \(\bs Y\).
  4. Classify the random walk \(\bs Y\) in terms of periodicity.
Details:
  1. The transition probability matrix \(P\) is given by \[P(1, 2) = P(1, 3) = \frac{1}{2}, \, P(2, 4) = P(3, 4) = P(4, 1) = 1 \]
  2. Since \(f_1 = f\) and the walk has period 3, the density function \(f_n\) of \(Y_n\) cycles through three functions: If \(n = 1 \mod 3\) then \(f_n = f\). If \(n = 2 \mod 3\) then \(f_n(x) = 2^{-1 / 3} f(x)\) for \(x \in \{1, 2, 3\}\) and \(f_n(4) = 2^{2 / 3} f(4)\). If \(n = 0 \mod 3\) then \(f_n(x) = 2^{1 / 3} f(x)\) for \(x \in \{1, 4\}\) and \(f_n(x) = 2^{-2 / 3} f(x)\) for \(x \in \{2, 3\}\).
  3. The invariant probability density function \(h\) is given by \(h(1) = h(4) = 1 / 3\) and \(h(2) = h(3) = 1 / 6\).
  4. The random walk \(\bs Y\) is periodic with period 3.
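The following sketch (again a verification aid assuming NumPy, with 0-based vertex indices) confirms the invariance of \(h\), the period-3 cycling of the one-step densities, and the periodicity of the walk:

```python
import numpy as np

# Transition matrix of the random walk on the directed diamond graph
P = np.zeros((4, 4))
P[0, 1] = P[0, 2] = 1 / 2
P[1, 3] = P[2, 3] = P[3, 0] = 1

# Invariant density: h(1) = h(4) = 1/3, h(2) = h(3) = 1/6
h = np.array([1/3, 1/6, 1/6, 1/3])
assert np.allclose(h @ P, h)

# One step from the constant rate density f multiplies f by 2^(-1/3) on
# {1, 2, 3} and by 2^(2/3) at 4
f = np.array([2 ** (1 / 3), 2 ** (-1 / 3), 2 ** (-1 / 3), 1])
f = f / f.sum()
assert np.allclose(f @ P, f * np.array([2 ** (-1 / 3)] * 3 + [2 ** (2 / 3)]))

# Period 3: no return in 1 or 2 steps, positive return probability in 3 steps
P2 = P @ P
P3 = P2 @ P
assert np.all(np.diag(P) == 0) and np.all(np.diag(P2) == 0)
assert np.all(np.diag(P3) > 0)
```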

Open the simulation of the random walk on the directed diamond graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.

The Bull Graph

Recall the bull graph \((S, \lfrta)\) defined in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \lfrta)\).
  2. Find the density function \(f\) and reliability function \(F\) of \(X\).
Details:
  1. The rate constant is \(\alpha = 2 / (1 + \sqrt{13})\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= \frac{5 - \sqrt{13}}{6}, \quad f(2) = f(3) = \frac{\sqrt{13} - 2}{6}, \quad f(4) = f(5) = \frac{5 - \sqrt{13}}{12}\\ F(1) &= \frac{\sqrt{13} - 2}{3}, \quad F(2) = F(3) = \frac{11 - \sqrt{13}}{12}, \quad F(4) = F(5) = \frac{\sqrt{13} - 2}{6} \end{align*}
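These values can be checked with a short NumPy sketch (a verification aid; vertices \(1, \ldots, 5\) are mapped to indices \(0, \ldots, 4\), with the triangle on \(\{1, 2, 3\}\) and horns \(4, 5\) attached to \(2, 3\)):

```python
import numpy as np

# Adjacency matrix of the bull graph
A = np.zeros((5, 5))
for x, y in [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4)]:
    A[x, y] = A[y, x] = 1

lam = np.linalg.eigvalsh(A).max()            # largest eigenvalue (1 + sqrt(13))/2
assert np.isclose(lam, (1 + np.sqrt(13)) / 2)
alpha = 1 / lam                              # rate constant 2/(1 + sqrt(13))

s = np.sqrt(13)
f = np.array([(5 - s) / 6, (s - 2) / 6, (s - 2) / 6, (5 - s) / 12, (5 - s) / 12])
F = A @ f                                    # F(x) = sum of f(y) over neighbors y
assert np.isclose(f.sum(), 1)                # f is a probability density
assert np.allclose(f, alpha * F)             # constant rate property
```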

Open the simulation of the constant rate distribution on the bull graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the bull graph \((S, \lfrta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find \(\lim_{n \to \infty} f_n\) and verify that the limit is the invariant density function of \(\bs Y\).
Details:
  1. The transition probability matrix \(P\) is given by \begin{align*} P(1, 2) &= P(1, 3) = \frac{1}{2} \\ P(2, 1) &= P(3, 1) = \frac{10 - 2 \sqrt{13}}{11 - \sqrt{13}}, \, P(2, 3) = P(3, 2) = \frac{2}{1 + \sqrt{13}}, \, P(2, 4) = P(3, 5) = \frac{7 - \sqrt{13}}{18} \\ P(4, 2) &= P(5, 3) = 1 \end{align*}
  2. Of course \(f_1 = f\). For \(n \in \{2, 3, \ldots\}\), the density function \(f_n\) of \(Y_n\) is given by \begin{align*} f_n(1) &= \left(\frac{14}{3 \sqrt{13}} - \frac{4}{3}\right)\left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{3}\left(1 - \frac{1}{\sqrt{13}}\right) \\ f_n(2) = f_n(3) &= \left(1 - \frac{7}{2 \sqrt{13}}\right) \left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{4}\left(1 + \frac{1}{\sqrt{13}}\right) \\ f_n(4) = f_n(5) &= \left(-\frac{1}{3} + \frac{7}{6 \sqrt{13}}\right) \left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{12}\left(1 - \frac{1}{\sqrt{13}}\right) \end{align*}
  3. The limiting values are \begin{align*} \lim_{n \to \infty} f_n(1) &= \frac{1}{3}\left(1 - \frac{1}{\sqrt{13}}\right) \\ \lim_{n \to \infty} f_n(2) = \lim_{n \to \infty} f_n(3) &= \frac{1}{4}\left(1 + \frac{1}{\sqrt{13}}\right) \\ \lim_{n \to \infty} f_n(4) = \lim_{n \to \infty} f_n(5) &= \frac{1}{12}\left(1 - \frac{1}{\sqrt{13}}\right) \end{align*} Moreover, \(\lim_{n \to \infty} f_n\) is the function \(f F\) normalized to a probability density, and hence is the invariant density function.
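Numerically (a NumPy sketch with vertices \(1, \ldots, 5\) mapped to indices \(0, \ldots, 4\)), the rows of \(P^n\) do converge to the limiting density:

```python
import numpy as np

s = np.sqrt(13)

# Transition matrix of the random walk on the bull graph, P(x, y) = f(y) / F(x);
# in particular P(2, 4) = f(4) / F(2) = (7 - sqrt(13)) / 18
P = np.zeros((5, 5))
P[0, 1] = P[0, 2] = 1 / 2
P[1, 0] = P[2, 0] = (10 - 2 * s) / (11 - s)
P[1, 2] = P[2, 1] = 2 / (1 + s)
P[1, 3] = P[2, 4] = (7 - s) / 18
P[3, 1] = P[4, 2] = 1
assert np.allclose(P.sum(axis=1), 1)        # each row is a probability density

# Limiting (invariant) density from part 3
h = np.array([(1 - 1/s) / 3, (1 + 1/s) / 4, (1 + 1/s) / 4,
              (1 - 1/s) / 12, (1 - 1/s) / 12])
assert np.allclose(h @ P, h)                # h is invariant

# The chain is aperiodic (the graph contains a triangle), so P^n converges
assert np.allclose(np.linalg.matrix_power(P, 200), np.tile(h, (5, 1)))
```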

Open the simulation of the random walk on the bull graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.

The Wheel Graph

For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the wheel graph of order \(k\) as defined in Section 1.

Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).

Details:

As noted in Section 3 the largest eigenvalue of the adjacency matrix \(L\) is \(1 + \sqrt{k + 1}\), and moreover a corresponding eigenfunction \(g\) is given by \(g(0) = \sqrt{k + 1} - 1\) and \(g(x) = 1\) for \(x \in \{1, 2, \ldots, k\}\). Hence the rate constant is \[\alpha = \frac{1}{1 + \sqrt{k + 1}} = \frac{\sqrt{k + 1} - 1}{k}\] The corresponding density function \(f\) is obtained by normalizing \(g\). Since \(c = \sum_{x \in S} g(x) = k - 1 + \sqrt{k + 1}\), \begin{align*} f(0) & = \frac{\sqrt{k + 1} - 1}{k - 1 + \sqrt{k + 1}}\\ f(x) & = \frac{1}{k - 1 + \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
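The formulas for general \(k\) can be checked numerically. The sketch below (assuming NumPy; the hub is vertex 0, and `wheel_check` is an ad hoc helper, not from the text) verifies the eigenvalue and the constant rate property for several values of \(k\):

```python
import numpy as np

def wheel_check(k):
    # Wheel graph of order k: hub 0 joined to each vertex of the k-cycle 1..k
    A = np.zeros((k + 1, k + 1))
    for x in range(1, k + 1):
        A[0, x] = A[x, 0] = 1
        y = x % k + 1                            # next vertex on the rim
        A[x, y] = A[y, x] = 1
    lam = np.linalg.eigvalsh(A).max()
    assert np.isclose(lam, 1 + np.sqrt(k + 1))   # largest eigenvalue
    alpha = 1 / lam                              # rate constant
    c = k - 1 + np.sqrt(k + 1)                   # normalizing constant
    f = np.full(k + 1, 1 / c)
    f[0] = (np.sqrt(k + 1) - 1) / c              # density at the hub
    F = A @ f                                    # reliability function
    assert np.isclose(f.sum(), 1) and np.allclose(f, alpha * F)
    return alpha

for k in range(3, 11):
    wheel_check(k)
```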

Open the simulation of the constant rate distribution on the wheel graph. Vary \(k\) and note the shape of the probability density function and the value of the rate constant. Run the simulation 1000 times and compare the empirical density function with the probability density function.

Find the rate constant \(\alpha\) and the density function \(f\) of the constant rate distribution explicitly for the wheel graph \((\N_3, \lfrta)\).

Details:

From Example , \(\alpha = 1 / 3\) and \(f(x) = 1 / 4\) for \(x \in \N_3\). But we already knew this since \((\N_3, \lfrta)\) is regular with degree \(3\).

Find the rate constant \(\alpha\) and the density function \(f\) of the constant rate distribution explicitly for the wheel graph \((\N_4, \lfrta)\).

Details:

From Example , \(\alpha = \left(\sqrt{5} - 1\right) / 4 \approx 0.309\). For the constant rate density function, \(f(0) = \sqrt 5 - 2\) and \(f(x) = \left(3 - \sqrt 5\right) / 4\) for \(x \in \{1, 2, 3, 4\}\).

Suppose that \(X_k\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\). Show that the distribution of \(X_k / k\) converges to the uniform distribution on \([0, 1]\) as \(k \to \infty\).

Details:

Let \(p_k\) denote the common value of \(f_k(j)\) for \(j \in \{1, 2, \ldots, k\}\) as given above, so that \(f_k(0) = 1 - k p_k\). Then for \(x \in [0, 1]\), \begin{align*} \P(X_k / k \le x) & = \sum_{j = 0}^k \bs{1}(j / k \le x) f_k(j) = f_k(0) + p_k \sum_{j = 1}^k \bs{1}(j / k \le x) \\ & = f_k(0) + k p_k \frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \end{align*} But as \(k \to \infty\), \(f_k(0) \to 0\), \(k p_k \to 1\), and \(\frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \to \int_0^x 1 \, dt = x\).
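The limit can be illustrated numerically. In the sketch below (assuming NumPy; `cdf` is an ad hoc helper, not from the text), \(\P(X_k / k \le x)\) for \(x \in [0, 1)\) is \(f_k(0)\) plus \(p_k\) times the number of rim vertices \(j\) with \(j / k \le x\), which is \(\lfloor k x \rfloor\):

```python
import numpy as np

def cdf(k, x):
    # P(X_k / k <= x) for the constant rate distribution on the wheel graph,
    # valid for x in [0, 1): the hub contributes f_k(0) = (sqrt(k+1) - 1)/c,
    # and each rim vertex j with j/k <= x contributes p_k = 1/c
    c = k - 1 + np.sqrt(k + 1)
    return (np.sqrt(k + 1) - 1) / c + np.floor(k * x) / c

# Maximum deviation from the uniform CDF shrinks as k grows
xs = np.linspace(0, 0.99, 100)
errs = [np.abs(cdf(k, xs) - xs).max() for k in (10**2, 10**4, 10**6)]
assert errs[0] > errs[1] > errs[2]
assert errs[-1] < 0.01
```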

Suppose that \(X\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).

  1. Find the transition matrix \(P\) of \(\bs Y\).
  2. Find the invariant probability density function \(h\) of \(\bs Y\).
Details:
  1. From Example and the definition of \(P\), \begin{align*} P(x, 0) & = \frac{\left(\sqrt{k + 1} - 1\right)^2}{k} \\ P(0, x) & = \frac{1}{k} \\ P(x, y) & = \frac{\sqrt{k + 1} - 1}{k}, \quad y = (x - 1) \mod k, \; y = (x + 1) \mod k \end{align*} Note that \((x - 1) \mod k\) and \((x + 1) \mod k\) are the neighbors of \(x\) on the cycle.
  2. Since the wheel graph is finite and connected, the chain \(\bs Y\) is irreducible and positive recurrent, and the invariant probability density function \(h\) is obtained by normalizing the invariant function \(f^2\). After some algebra, \begin{align*} h(0) &= \frac 1 2 - \frac{1}{2 \sqrt{k + 1}} \\ h(x) &= \frac{1}{2 k} + \frac{1}{2 k \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
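Both parts can be verified numerically. The sketch below (assuming NumPy; `wheel_walk` is an ad hoc helper, not from the text, with the hub at index 0 and rim vertices \(1, \ldots, k\)) builds \(P\) and checks that \(h\) is invariant:

```python
import numpy as np

def wheel_walk(k):
    s = np.sqrt(k + 1)
    # Transition matrix of the random walk on the wheel graph (hub = 0)
    P = np.zeros((k + 1, k + 1))
    for x in range(1, k + 1):
        P[0, x] = 1 / k                      # hub to each rim vertex
        P[x, 0] = (s - 1) ** 2 / k           # rim vertex back to the hub
        for y in (x % k + 1, (x - 2) % k + 1):
            P[x, y] = (s - 1) / k            # to the two rim neighbors of x
    assert np.allclose(P.sum(axis=1), 1)     # each row is a probability density

    # Invariant density h, obtained by normalizing f^2
    h = np.full(k + 1, 1 / (2 * k) + 1 / (2 * k * s))
    h[0] = 1 / 2 - 1 / (2 * s)
    assert np.isclose(h.sum(), 1) and np.allclose(h @ P, h)
    return h

for k in range(3, 11):
    wheel_walk(k)
```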

Open the simulation of the random walk on the wheel graph. Vary \(k\) and note the shape of the invariant probability density function. Run the simulation 1000 times and compare the empirical density function with the invariant probability density function.