\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\lfrta}{\leftrightarrow}\)

5. Constant Rate Distributions

This section has the most important definitions and results of the chapter.

Basic Theory

Definitions and Properties

Our starting point is a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\). For \(n \in \N_+\) the corresponding product space is \((S^n, \ms S^n, \lambda^n)\). Suppose now that \((S, \rta)\) is a measurable graph. If \(X\) is a random variable in \(S\), recall the definitions of the reliability function \(F\) and the cumulative rate function \(R_n\) of order \(n \in \N_+\) of \(X\) for \((S, \rta)\) given in Section 3. Recall also the left walk function \(u_n\) of order \(n \in \N_+\) for \((S, \rta)\) defined in Section 1. Here are the main definitions, stated so as to avoid problems with division by 0.

Relative to the graph \((S, \rta)\),

  1. \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) if \(f = \alpha F\) is a probability density function of \(X\).
  2. \(X\) has constant average rate \(\beta_n \in (0, \infty)\) of order \(n \in \N_+\) if \(R_n = \beta_n u_n\).

Here is our first trivial result:

If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N_+\).

Details:

If \(r(x) = \alpha\) for \(x \in S\) then from the definition, the cumulative rate function \(R_n\) of order \(n \in \N_+\) is given by \(R_n(x) = \alpha^n u_n(x)\) for \(x \in S\).

So as the name suggests, the rate function \(r = f / F\) is the constant \(\alpha\). Recall that the reference measure \(\lambda\) is fixed in advance, and is usually a natural measure in some sense for the measurable space \((S, \ms S)\). For example, counting measure is always used for discrete graphs, and Lebesgue measure for graphs on measurable subsets of \(\R^n\), or more generally, Hausdorff measure on Euclidean manifolds. Trivially, if \(X\) has constant rate \(\alpha \in (0, \infty)\) with respect to \(\lambda\) and if \(c \in (0, \infty)\) then \(X\) has constant rate \(\alpha / c\) with respect to the measure \(c \lambda\). But more importantly, if we can choose the measure a posteriori, then every distribution has constant rate.

Suppose that \(X\) is supported by \((S, \rta)\) and let \(\mu\) be the measure on \((S, \ms S)\) defined by \[\mu(A) = \E[1 / F(X), X \in A], \quad A \in \ms S\] Then \(X\) has constant rate \(1\) for \((S, \rta)\) with respect to \(\mu\).

Details:

Note first that since \(F\) is positive, \(\mu\) is a \(\sigma\)-finite positive measure. By definition, \(\mu\) has density function \(1 / F\) with respect to the distribution of \(X\), and hence the distribution of \(X\) has density function \(F\) with respect to \(\mu\). That is, \(\P(X \in A) = \int_A F(x) d \mu(x)\) for \(A \in \ms S\).

For the remainder of this section, we assume again that \(\lambda\) is a fixed reference measure. Suppose that \(X\) has density function \(f\) with respect to \(\lambda\), and let \(\mu\) denote the measure in the proposition above and \(P\) the distribution of \(X\). In the notation of Radon-Nikodym derivatives, we have \(d \mu = (1 / F) \, d P\) and \(d P = f \, d \lambda\). It follows that \(d \mu = (f / F) \, d \lambda = r \, d \lambda\) where \(r\) is the rate function of \(X\) for \((S, \rta)\) with respect to \(\lambda\). So the result is hardly surprising: \(X\) has constant rate 1 with respect to the measure \(\mu\) whose density is the rate function of \(X\) with respect to \(\lambda\).

Let \(L\) denote the adjacency kernel of the graph \((S, \rta)\), considered as an operator on the space \(\ms L_1\).

  1. If \(f\) is the density function of a distribution with constant rate \(\alpha \in (0, \infty)\) then \(1 / \alpha\) is a right eigenvalue of \(L\) and \(f\) a corresponding right eigenfunction.
  2. Conversely, if \(L\) has a right eigenvalue \(\beta \in (0, \infty)\) and a corresponding positive right eigenfunction \(g \in \ms L_1\), then \(f = g / \|g\|_1\) is the probability density function of a distribution with constant rate \(1 / \beta\).
Details:

Recall that if \(f\) is a probability density function then the corresponding reliability function for \((S, \rta)\) is \(F = L f\).
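The eigenvalue characterization is easy to check numerically. The following sketch (an illustration of mine, assuming NumPy and a hypothetical strongly connected digraph on 4 vertices) finds the Perron eigenvalue and eigenfunction of the adjacency matrix and verifies that the normalized eigenfunction is a constant rate density:

```python
import numpy as np

# Adjacency matrix of a hypothetical strongly connected, aperiodic digraph:
# the directed 4-cycle 0 -> 1 -> 2 -> 3 -> 0 with an extra edge 0 -> 2.
L = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)

w, V = np.linalg.eig(L)
i = np.argmax(w.real)             # the Perron eigenvalue has the largest real part
beta = w[i].real                  # right eigenvalue beta of part (b)
g = np.abs(V[:, i].real)          # positive right eigenfunction
f = g / g.sum()                   # normalized to a probability density
F = L @ f                         # reliability function F = L f

alpha = 1 / beta                  # rate constant
print(np.allclose(f, alpha * F))  # f = alpha F: constant rate alpha
```

Here \(\beta\) is the eigenvalue of part (b), and the check confirms \(f = \alpha F\) with \(\alpha = 1 / \beta\).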

Suppose that \((S, \rta)\) is a discrete graph with the property that \(x \rta x\) for some \(x \in S\). If there exists a distribution with constant rate \(\alpha\) then \(\alpha \in (0, 1]\).

Details:

Suppose that \(x \rta x\) and that \(X\) has density function \(f\) and reliability function \(F\) for \((S, \rta)\), with \(X\) supported by \((S, \rta)\). If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \[f(x) = \alpha F(x) = \alpha \left[f(x) + \sum_{x \rta y, \, y \ne x} f(y) \right]\] By the support assumption, \(F(x) \gt 0\) and hence \(f(x) \gt 0\). Rearranging the displayed equation we have \[(1 - \alpha) f(x) = \alpha \sum_{x \rta y, \, y \ne x} f(y)\] If \(\alpha \gt 1\), the left side is negative, but of course the right side is nonnegative.

In particular, this result applies to any discrete reflexive graph, and hence to any discrete partial order graph. The following result shows that a mixture of distributions with a common constant rate also has that constant rate. From the point of view of linear algebra, this corresponds to an eigenvalue with multiplicity greater than one.

Suppose that \(F\) and \(G\) are reliability functions for distributions supported by \((S, \rta)\), each with constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). For \(p \in (0, 1)\), the function \(H = p F + (1 - p)G\) is also the reliability function for a distribution with constant rate \(\alpha\) for \((S, \rta)\).

Details:

Recall that if \(f\) and \(g\) are probability density functions with corresponding reliability functions \(F\) and \(G\) for \((S, \rta)\), respectively, and if \(p \in (0, 1)\), then the density function \(h = p f + (1 - p) g\) has reliability function \(H = p F + (1 - p) G\). In this case, \(f = \alpha F\) and \(g = \alpha G\) so \[h = p f + (1 - p) g = p \alpha F + (1 - p) \alpha G = \alpha [p F + (1 - p) G] = \alpha H\]

The following result gives a simple condition for the uniform distribution to have constant rate. Recall that the graph \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) if \(\lambda\{y \in S: x \rta y\} = k\) for all \(x \in S\).

Suppose \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\). If \(\lambda(S) \lt \infty\) then the uniform distribution on \(S\) has constant rate \(1 / k\).

Details:

Suppose that \(\lambda(S) \lt \infty\) and let \(f(x) = 1 / \lambda(S)\) for \(x \in S\) so that \(f\) is the density of the uniform distribution on \(S\). Then the reliability function is \[F(x) = \int_{x \rta y} \frac{1}{\lambda(S)} \, d\lambda(y) = \frac{\lambda\{y \in S: x \rta y\}}{\lambda(S)} = \frac{k}{\lambda(S)}\] and so \(f = \frac{1}{k} F\).

Combining this with the Perron-Frobenius result below, we see that if \((S, \rta)\) is a finite, strongly connected, right regular graph, then the uniform distribution on \(S\) is the unique constant rate distribution. More generally, if \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) and we know that the eigenspace of the eigenvalue \(k\) has dimension 1, then \((S, \rta)\) has a unique constant rate distribution (the uniform distribution) if \(\lambda(S) \lt \infty\) and has no constant rate distribution if \(\lambda(S) = \infty\).
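As a quick numerical illustration (a sketch of mine, assuming NumPy and a hypothetical right regular digraph with counting measure):

```python
import numpy as np

# A hypothetical right regular digraph: 5 vertices with counting measure,
# each vertex x with edges to x + 1 and x + 2 mod 5, so right degree k = 2.
n, k = 5, 2
L = np.zeros((n, n))
for x in range(n):
    for j in range(1, k + 1):
        L[x, (x + j) % n] = 1.0

f = np.full(n, 1 / n)            # uniform density on S
F = L @ f                        # reliability function: F(x) = k / lambda(S)
print(np.allclose(f, F / k))     # the uniform distribution has constant rate 1/k
```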

For the following proposition, recall that if \((S, \preceq)\) is a discrete partial order graph with covering relation \(\upa\), then \(\upa^n\) denotes the \(n\)-fold composition power of \(\upa\) for \(n \in \N\), where by convention, \(\upa^0\) is the equality relation \(=\).

Suppose that \((S, \preceq)\) is a discrete, locally finite, uniform partial order graph with the property that if \(x \upa^n y\) then \(\#\{u \in S: x \upa u, u \upa^{n-1} y\} = a_n \in \N_+\) for \(n \in \N_+\), independent of \(x, \, y \in S\). If \(X\) has constant rate \(\beta \in (0, \infty)\) for \((S, \upa)\) then

  1. For \(n \in \N\), \(X\) has constant rate \(\beta^n a_1 a_2 \cdots a_n\) for \((S, \upa^n)\).
  2. \(X\) has constant rate \(\alpha\) for \((S, \preceq)\) where \[\frac{1}{\alpha} = \sum_{n=0}^\infty \frac{1}{\beta^n a_1 a_2 \cdots a_n}\] assuming that the infinite series is finite.
Details:

Let \(f\) denote the density function of \(X\). Let \(F\) denote the reliability function of \(X\) for \((S, \preceq)\) and for \(n \in \N\), let \(G_n\) denote the reliability function of \(X\) for \((S, \upa^n)\).

  1. We prove this by induction on \(n\). Note that \(a_1 = 1\), so the result is true by assumption when \(n = 1\). The result is trivial for \(n = 0\) since \(G_0(x) = f(x)\) for \(x \in S\). Assume that the result holds for a given \(n \in \N\). Then \begin{align*} G_{n+1}(x) &= \sum_{x \upa^{n+1}y} f(y) = \frac{1}{a_{n+1}} \sum_{x \upa u} \sum_{u \upa^n y} f(y) = \frac{1}{a_{n+1}} \sum_{x \upa u} G_n(u) \\ &= \frac{1}{a_{n+1}} \sum_{x \upa u} \frac{1}{\beta^n a_1 \cdots a_n} f(u) = \frac{1}{\beta^n a_1 \cdots a_{n+1}}G_1(x) = \frac{1}{\beta^{n+1} a_1 \cdots a_{n+1}} f(x), \quad x \in S \end{align*}
  2. Recall that since \((S, \preceq)\) is uniform, \(F = \sum_{n=0}^\infty G_n\). Hence \[F(x) = \sum_{n=0}^\infty G_n(x) = \sum_{n=0}^\infty \frac{1}{\beta^n a_1 \cdots a_n} f(x) = f(x) \sum_{n=0}^\infty \frac{1}{\beta^n a_1 \cdots a_n}, \quad x \in S\]

Note that the series in part (b) is finite if \(\beta \gt 1\), since \(a_n \ge 1\) for each \(n \in \N_+\).
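For a concrete check of the proposition, consider the standard discrete graph \((\N, \le)\), whose covering relation is \(x \upa y\) if and only if \(y = x + 1\), so that \(a_n = 1\) for every \(n\). The geometric distribution has constant rate for \((\N, \le)\), as in the examples later in this section; the sketch below verifies parts (a) and (b) with exact rational arithmetic:

```python
from fractions import Fraction

# Standard discrete graph (N, <=): covering relation x ^ y iff y = x + 1,
# so a_n = 1 for all n.  Geometric density f(x) = alpha (1 - alpha)^x.
alpha = Fraction(1, 3)
f = [alpha * (1 - alpha)**x for x in range(60)]

# Constant rate beta for (N, ^): f(x) = beta G_1(x) with G_1(x) = f(x + 1)
beta = f[0] / f[1]
assert all(f[x] == beta * f[x + 1] for x in range(59))

# Part (b): 1 / alpha' = sum_n 1 / beta^n, a geometric series since a_n = 1
inv_alpha = sum(1 / beta**n for n in range(200))
print(float(inv_alpha), float(1 / alpha))   # truncated series is ~ 1 / alpha
```

Here \(\beta = 1 / (1 - \alpha)\), and the series in part (b) recovers the original rate \(\alpha\) for \((\N, \le)\).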

Suppose that \((S, \rta)\) is a transitive graph, and for \(x \in S\) let \(A_x = \{y \in S: x \rta y\}\) (the right neighbor set of \(x\)). If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then the conditional distribution of \(X\) given \(x \rta X\) has constant rate \(\alpha\) for \((A_x, \rta)\) for each \(x \in S\).

Details:

This is an immediate consequence of the result in Section 3 since the rate function of the conditional distribution for \((A_x, \rta)\) is the restriction of the rate function of \(X\) to \(A_x\).

In particular, this proposition holds for a partial order graph \((S, \preceq)\).

Finite, Discrete Graphs

Suppose that \((S, \rta)\) is a finite, discrete, strongly connected graph (so a strongly connected graph in the usual combinatorial sense). In this case, constant rate distributions are closely connected to the Perron-Frobenius theory of linear algebra.

The graph \((S, \rta)\) has a unique constant rate distribution. The rate constant \(\alpha\) is the reciprocal of the largest eigenvalue (the Perron eigenvalue) and the probability density function \(f\) is the corresponding normalized positive eigenfunction.

Details:

This is just the Perron-Frobenius theorem in disguise. Since the graph is strongly connected, the adjacency matrix \(L\) is nonnegative and irreducible. Hence \(L\) has a unique largest positive eigenvalue, with multiplicity 1, and a corresponding (right) eigenfunction that is strictly positive.

Moreover, if the graph is aperiodic, then the constant rate distribution is the limit of the right walk distributions. To review, recall that the right walk distribution of order \(n \in \N\) has density function \(d_n\) given by \(d_n(x) = v_n(x) / w_n\) on \(S\) where \(v_n(x) = \#\{(x_1, x_2, \ldots, x_n) \in S^n: x \rta x_1 \rta x_2 \rta \cdots \rta x_n\}\) is the number of walks of length \(n\) starting in \(x \in S\), and where the normalizing constant \(w_n = \#\{(x_1, x_2, \ldots, x_{n + 1}) \in S^{n + 1}: x_1 \rta x_2 \rta \cdots \rta x_{n + 1}\}\) is the total number of walks of length \(n\).

Suppose that the graph \((S, \rta)\) is aperiodic. Let \(\alpha\) denote the rate constant and \(f\) the density function of the unique constant rate distribution. Then

  1. \(d_n(x) \to f(x)\) as \(n \to \infty\) for \(x \in S\).
  2. \(w_n / w_{n + 1} \to \alpha\) as \(n \to \infty\).
Details:

Since the adjacency matrix \(L\) is aperiodic, it is not only irreducible, but also primitive. As in the previous theorem, \(1 / \alpha\) is the Perron eigenvalue (the unique eigenvalue of largest modulus). From the Perron-Frobenius theory for nonnegative primitive matrices, there exists a nonnegative right eigenfunction \(a\) and a nonnegative left eigenfunction \(b\), both corresponding to the eigenvalue \(1 / \alpha\), such that \(\alpha^n L^n(x, y) \to a(x) b(y)\) as \(n \to \infty\) for \((x, y) \in S^2\). So \[\frac{v_n(x)}{w_n} = \frac{\sum_{y \in S} L^n(x, y)}{\sum_{(t, y) \in S^2} L^n(t, y)}, \quad x \in S\] Multiplying numerator and denominator by \(\alpha^n\) and passing to the limit gives \[\frac{v_n(x)}{w_n} \to \frac{a(x) \sum_{y \in S} b(y)}{\sum_{t \in S} a(t) \sum_{y \in S} b(y)} = \frac{a(x)}{\sum_{t \in S} a(t)} = f(x) \; \text{ as } n \to \infty \text{ for } x \in S\] The proof of part (b) is similar.

In the context of the previous theorem, the corresponding reliability function \(D_n\) is given by \(D_n(x) = v_{n + 1}(x) / w_n\) and the corresponding rate function is \(d_n(x) / D_n(x) = v_n(x) / v_{n + 1}(x)\). So of course, \(v_n(x) / v_{n + 1}(x) \to \alpha\) as \(n \to \infty\) for \(x \in S\). The theorem does not necessarily hold if \((S, \rta)\) is periodic, but a related result holds in a certain average sense.
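The convergence \(v_n(x) / v_{n + 1}(x) \to \alpha\) can be illustrated numerically. The sketch below (assuming NumPy and a small aperiodic, strongly connected digraph chosen for illustration) iterates \(v_{n + 1} = L v_n\) and compares the ratio with the reciprocal of the Perron eigenvalue:

```python
import numpy as np

# Walk counts on a hypothetical aperiodic, strongly connected digraph:
# 0 -> 1 -> 2 -> 0 together with the chord 0 -> 2.
L = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)

v = np.ones(3)                   # v_0(x) = 1: one walk of length 0 from x
for n in range(60):
    v_next = L @ v               # v_{n+1}(x) = sum over x -> y of v_n(y)
    ratio = v / v_next
    v = v_next
print(ratio)                     # each entry approaches the rate constant

w, _ = np.linalg.eig(L)
alpha = 1 / max(w.real)          # reciprocal of the Perron eigenvalue
print(alpha)
```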

Suppose that the graph \((S, \rta)\) is periodic. Let \(\alpha\) denote the rate constant and \(f\) the density function of the unique constant rate distribution. For \(n \in \N\), let \(\hat d_n\) denote the density function on \(S\) defined by \[\hat d_n(x) = \frac{\sum_{j = 0}^n \alpha^j v_j(x)}{\sum_{j = 0}^n \alpha^j w_j}, \quad x \in S\] Then \(\hat d_n(x) \to f(x)\) as \(n \to \infty\) for \(x \in S\).

Details:

Clearly \(\hat d_n\) is a probability density function for \(n \in \N\) since the denominator is the normalizing constant for the function defined by the numerator. From the Perron-Frobenius theory for irreducible matrices, there exists a nonnegative right eigenfunction \(a\) and a nonnegative left eigenfunction \(b\), both corresponding to the Perron eigenvalue \(1 / \alpha\), such that \[\frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(x, y) \to a(x) b(y) \text{ as } n \to \infty \text{ for } (x, y) \in S^2 \] So \[\hat d_n(x) = \frac{\sum_{j = 0}^n \alpha^j v_j(x)}{\sum_{j = 0}^n \alpha^j w_j} = \frac{\sum_{j = 0}^n \alpha^j \sum_{y \in S} L^j(x, y)} {\sum_{j = 0}^n \alpha^j \sum_{(t, y) \in S^2} L^j(t, y)} = \frac{\sum_{y \in S} \frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(x, y)}{\sum_{(t, y) \in S^2} \frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(t, y)}\] Passing to the limit we have \[\hat d_n(x) \to \frac{\sum_{y \in S} a(x) b(y)} {\sum_{(t, y) \in S^2} a(t) b(y)} = \frac{a(x)} {\sum_{t \in S} a(t)} = f(x) \text{ as } n \to \infty \text{ for } x \in S\]

In the context of the previous theorem, the density \(\hat d_n\) is a convex combination of \(\{d_0, d_1, \ldots, d_n\}\) for \(n \in \N\). Specifically, \(\hat d_n(x) = \sum_{j = 0}^n p_j d_j(x)\) for \(n \in \N\) and \(x \in S\), where \[p_j = \frac{\alpha^j w_j}{\sum_{k = 0}^n \alpha^k w_k}, \quad j \in \{0, 1, \ldots, n\}\] so that \(j \mapsto p_j\) is a probability density function on \(\{0, 1, \ldots, n\}\) for \(n \in \N\). The last two theorems are certainly of theoretical interest, but are not of much practical help. It's usually much easier to find the constant rate density of a strongly connected graph directly than it is to find the walk densities and take the appropriate limits.

Moments

The basic moment result in Section 3 simplifies significantly when \(X\) has constant rate. Once again, \(L\) denotes the adjacency kernel of \((S, \rta)\).

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) and that \(g: S \to [0, \infty)\) is measurable. Then \[\E[(g L^n)(X)] = \frac{1}{\alpha^n} \E[g(X)], \quad n \in \N\]

Details:

By assumption, \(f = \alpha F\) is a density of \(X\), so \[\int_S (g L^n)(x) F(x) \, d\lambda(x) = \frac{1}{\alpha} \int_S (g L^n)(x) f(x) \, d\lambda(x) = \frac{1}{\alpha} \E[(g L^n)(X)]\] Hence from the basic moment result \[\frac{1}{\alpha} \E[(g L^n)(X)] = \E[(g L^{n+1})(X)]\] The result then follows since \(L^0 = I\), the identity kernel.

The result also holds if \(g: S \to \R\) is measurable and \(\E[|g(X)|] \lt \infty\). Recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \[u_n(x) = \bs 1 L^n(x) = \lambda^n\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}, \quad x \in S\] the measure of the set of walks of length \(n\) that terminate in \(x\).

If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \[\E[u_n(X)] = \frac{1}{\alpha^n}, \quad n \in \N\]

Details:

This follows from the previous theorem with \(g = \bs 1\).

From the last result, note that \[\int_S \alpha^n u_n(x) f(x) \, d\lambda(x) = 1\] so it follows that the function \(f_n\) given by \(f_n(x) = \alpha^n u_n(x) f(x)\) for \(x \in S\) is a probability density function. We will return to this point shortly.
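The moment identity \(\E[u_n(X)] = 1 / \alpha^n\) is easy to verify numerically. The sketch below uses the standard discrete graph \((\N, \le)\), where \(u_n(x) = \binom{x + n}{n}\) and the geometric distribution has constant rate \(\alpha\) (both facts appear in the examples later in this section); the expectation is truncated at \(x = 500\), where the tail is negligible:

```python
from fractions import Fraction
from math import comb

# Check E[u_n(X)] = 1 / alpha^n on (N, <=) with u_n(x) = C(x + n, n)
# and X geometric with rate alpha (exact rational arithmetic).
alpha = Fraction(1, 4)
for n in range(5):
    m = sum(comb(x + n, n) * alpha * (1 - alpha)**x for x in range(500))
    print(n, float(m), float(1 / alpha**n))   # truncated E[u_n(X)] vs 1/alpha^n
```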

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), and let \(u = u_1\) denote the left walk function of order 1. Then

  1. \(u(x) \lt \infty\) for almost all \(x \in S\).
  2. If \(u(x) = c \in (0, \infty)\) for almost all \(x \in S\) then \(\alpha = 1 / c\).
Details:

Recall that the graph \((S, \rta)\) supports the distribution of \(X\).

  1. If \(\lambda\{x \in S: u(x) = \infty\} > 0\) then \(\P[u(X) = \infty] > 0\) and hence \(\E[u(X)] = \infty\). This contradicts the moment result above.
  2. If \(u(x) = c\) for almost all \(x \in S\) then \(\P[u(X) = c] = 1\) so \(\E[u(X)] = \E[u(X); u(X) = c] = c\). Hence \(c = 1 / \alpha\).

Part (a) means that if a graph has a constant rate distribution, then necessarily the graph is left finite. The assumption in part (b) means that \((S, \rta)\) is left regular of degree \(c\). The generating function result also simplifies when the distribution has constant rate.

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then the left moment generating function \(M\) of \(X\) is given by \[M(t) = \frac{\alpha}{\alpha - t}, \quad |t| \lt \alpha\]

Details:

Recall that \(M(t) = \E[U(X, t)]\) where \(U(x, t) = \sum_{n = 0}^\infty u_n(x) t^n\) is the left generating function of \((S, \rta)\). Hence using the moment result above and Fubini's theorem we have \[M(t) = \E[U(X, t)] = \sum_{n = 0}^\infty \E\left[u_n(X)\right] t^n = \sum_{n = 0}^\infty (t / \alpha)^n = \frac{1}{1 - t / \alpha}, \quad |t| \lt \alpha\]
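As a concrete check (a sketch of mine, using the standard discrete graph, whose walk generating function \(U(x, t) = 1 / (1 - t)^{x + 1}\) appears in the examples later in this section):

```python
from fractions import Fraction

# Left MGF check on (N, <=): U(x, t) = 1 / (1 - t)^(x + 1), X geometric
# with rate alpha, and |t| < alpha, so M(t) should equal alpha / (alpha - t).
alpha, t = Fraction(1, 2), Fraction(1, 3)
M = sum(alpha * (1 - alpha)**x / (1 - t)**(x + 1) for x in range(400))
print(float(M), float(alpha / (alpha - t)))   # truncated sum vs closed form
```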

Constant rate distributions maximize entropy among distributions satisfying a certain moment condition. This is an indication that a constant rate distribution governs the most random way to put points in a graph, but we will see a stronger expression of that idea shortly.

Suppose that \(X\) has constant rate and reliability function \(F\) for the graph \((S, \rta)\). Then \(X\) maximizes entropy among all random variables \(Y\) on \(S\) satisfying \[\E\left[\ln F(Y) \right] = \E\left[\ln F(X)\right]\]

Details:

Suppose that \(X\) has constant rate \(\alpha\), so that \(f = \alpha F\) is a density of \(X\). Suppose that \(Y\) is a random variable taking values in \(S\), with density function \(g\). From the general entropy inequality in Section 3 we have \begin{align*} H(Y) &= -\int_S g(x) \ln[g(x)] d \lambda(x) \le -\int_S g(x) \ln[f(x)] d \lambda(x) \\ &= -\int_S g(x) \left(\ln \alpha + \ln[F(x)]\right) d \lambda(x) \\ & = -\ln \alpha - \int_S g(x) \ln[F(x)] d \lambda(x) = -\ln \alpha - \E\left(\ln[F(Y)]\right) \\ & = -\ln \alpha - \E\left(\ln[F(X)]\right) \end{align*} Of course, the entropy of \(X\) achieves the upper bound.

Note that since the reliability function \(F\) typically has an exponential form of some sort, \(\E(\ln[F(Y)])\) often reduces to a natural moment condition.

Random Walks

Results related to the random walk also simplify significantly when the underlying distribution has constant rate. For the next two results, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with random variable \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Recall also that \(u_n\) denotes the walk function of order \(n \in \N\).

The transition density \(P^n\) of order \(n \in \N_+\) for \(\bs Y\) is given by \[P^n(x, y) = \alpha^n \frac{F(y)}{F(x)} L^n(x, y), \quad (x, y) \in S^2\]

Details:

Recall from Section 4 that \[P^n(x, y) = \frac{f(y)}{F(x)} R^{(n)}(x, y), \quad (x, y) \in S^2\] where \[R^{(n)}(x, y) = \int_{x \rta x_1 \rta \cdots \rta x_{n - 1} \rta y} r(x_1) \cdots r(x_{n - 1}) d\lambda^{n - 1}(x_1, \ldots, x_{n - 1}), \quad (x, y) \in S^2\] Since \(X\) has constant rate \(\alpha\), \(f(x) = \alpha F(x)\) and \(r(x) = \alpha\) for \(x \in S\). Hence \[R^{(n)}(x, y) = \alpha^{n - 1} \lambda^{n - 1}\{(x_1, \ldots, x_{n - 1}): x \rta x_1 \rta \cdots \rta x_{n - 1} \rta y\} = \alpha^{n - 1} L^n(x, y)\] and the result follows since \(f(y) / F(x) = \alpha F(y) / F(x)\).

Let \(n \in \N_+\).

  1. \((Y_1, Y_2, \ldots, Y_n)\) has density function \(g_n\) given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n F(x_n), \quad x_1 \rta x_2 \rta \cdots \rta x_n\]
  2. \(Y_n\) has density function \(f_n\) given by \[f_n(x) = \alpha^n u_{n - 1}(x) F(x), \quad x \in S\]
  3. For \(x \in S\), the conditional distribution of \((Y_1, Y_2, \ldots, Y_n)\) given \(Y_{n + 1} = x\) is uniform on the set of walks of length \(n\) that terminate at \(x\): \[\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}\]
Details:

The results follow easily from the constant rate property.

  1. This follows since \(r(x) = \alpha\) and \(f(x) = \alpha F(x)\) for \(x \in S\).
  2. This follows since \(R_n(x) = \alpha^n u_n(x)\) and again \(f(x) = \alpha F(x)\) for \(x \in S\). That is, \(X\) has constant average rate \(\alpha^n\) of order \(n\).
  3. The conditional distribution reduces to \[h(x_1, x_2, \ldots, x_n \mid x) = \frac{1}{u_n(x)}, \quad x_1 \rta x_2 \rta \cdots \rta x_n \rta x\]
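Part (b) can be checked exactly on the standard discrete graph \((\N, \le)\), where the driving distribution is geometric with rate \(\alpha\), \(u_{n - 1}(x) = \binom{x + n - 1}{n - 1}\), and \(F(x) = (1 - \alpha)^x\) (all from the examples later in this section), so that \(f_n\) is the negative binomial density:

```python
from fractions import Fraction
from math import comb

# Density of Y_n for the random walk on (N, <=) driven by a geometric
# (constant rate alpha) distribution: f_n(x) = alpha^n u_{n-1}(x) F(x)
# with u_{n-1}(x) = C(x + n - 1, n - 1) and F(x) = (1 - alpha)^x.
alpha = Fraction(1, 3)
for n in (1, 2, 5):
    f_n = [alpha**n * comb(x + n - 1, n - 1) * (1 - alpha)**x
           for x in range(800)]
    total = sum(f_n)                               # should be ~ 1
    mean = sum(x * p for x, p in enumerate(f_n))   # ~ n (1 - alpha) / alpha
    print(n, float(total), float(mean))
```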

For \(n \in \N_+\), the distribution of \(Y_n\) depends only on the rate parameter \(\alpha\), the walk function \(u_{n - 1}\) and the reliability function \(F\). Although a simple corollary, part (c) is one of our fundamental results, so we will restate it for emphasis:

The random walk \(\bs Y\) associated with a constant rate distribution (if such a distribution exists) is the most random way to put points in the graph \((S, \rta)\).

In the special case of a finite, strongly connected graph \((S, \rta)\), it follows from the Perron-Frobenius result above that the normalized eigenfunction corresponding to the largest eigenvalue of the adjacency matrix \(L\) defines a distribution whose random walk is the most random way to put points in the graph. Here is another simple result that follows from our general theory of random walks.

Suppose that \((S, \lfrta)\) is a symmetric graph and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \lfrta)\), with reliability function \(F\). Let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on \((S, \lfrta)\) associated with \(X\). Then

  1. \(F^2\) is an invariant function for \(\bs Y\).
  2. \(h = \alpha F^2 / \E[F(X)]\) is an invariant probability density function for \(\bs Y\).
Details:

From the general theory in Section 4, \(f F\) is invariant for \(\bs Y\), where \(f\) is the density function of \(X\). Since \(X\) has constant rate \(\alpha\) it follows that \(f = \alpha F\) and hence \(\alpha F^2\) is invariant for \(\bs Y\). But then \(F^2\) is also invariant for \(\bs Y\). The normalizing constant is \[c := \int_S F^2(x) \, d\lambda(x) = \frac{1}{\alpha} \E[F(X)]\] Note that \(c \lt \infty\), since \(F^2 \le F\) and \(\int_S F(x) \, d\lambda(x) = 1 / \alpha\). Hence \(h = F^2 / c\) is an invariant probability density function for \(\bs Y\).
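A numerical sketch of this result (assuming NumPy, on the diamond graph: the adjacency assumed here, with vertices 1 and 4 adjacent to every other vertex and 2, 3 not adjacent to each other, matches the diamond graph example later in this section):

```python
import numpy as np

# Invariant density check on a symmetric graph: the diamond graph.
L = np.array([[0, 1, 1, 1],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 0]], dtype=float)
w, V = np.linalg.eig(L)
i = np.argmax(w.real)
f = np.abs(V[:, i].real)
f /= f.sum()                          # constant rate density
F = L @ f                             # reliability function

P = L * (f[None, :] / F[:, None])     # transition matrix P(x, y) = f(y) / F(x)
h = F**2 / (F**2).sum()               # candidate invariant density ~ F^2
print(np.allclose(h @ P, h))          # h P = h: h is invariant for Y
```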

The next result refers to the point process \(\bs N = \{N_A: A \in \ms S\}\) corresponding to the random walk \(\bs Y\). The expected number of points in a set has a simple expression in terms of the generating function when the underlying distribution has constant rate.

Suppose that \((S, \rta)\) has left generating function \(U\) and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Let \(\bs Y\) denote the random walk on \((S, \rta)\) associated with \(X\). For \(A \in \ms S\), \[\E(N_A) = \E[U(X, \alpha); X \in A]\]

Details:

Recall that \(\E(N_A) = \E[V(X); X \in A]\) where \(V(x) = \sum_{n = 0}^\infty R_n(x)\) for \(x \in S\). Since \(X\) has constant rate \(\alpha\) we have \[V(x) = \sum_{n = 0}^\infty R_n(x) = \sum_{n = 0}^\infty \alpha^n u_n(x) = U(x, \alpha), \quad x \in S\]

Our next topic is thinning the point process. As above, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Let \(N\) have the geometric distribution on \(\N_+\) with success parameter \(p \in (0, 1)\) so that \[\P(N = n) = p(1 - p)^{n - 1}, \quad n \in \N_+\] As we will see in Chapter 4, \(N\) has constant rate \(p\) for the graph \((\N_+, \le)\). Moreover, we assume that \(N\) and \(\bs Y\) are independent. The basic idea is that we accept a point with probability \(p\) and reject the point with probability \(1 - p\), so that \(Y_N\) is the first point accepted. We are interested in the distribution of \(Y_N\). As before, let \(U\) denote the walk generating function of \((S, \rta)\).

The density function \(g\) of \(Y_N\) is given by \[g(x) = p U[x, \alpha (1 - p)] f(x), \quad x \in S\]

Details:

As usual, let \(u_n\) denote the left walk function of \((S, \rta)\) of order \(n \in \N\). Then \begin{align*} g(x) &= \E[f_N(x)] = \sum_{n = 1}^\infty p (1 - p)^{n - 1} f_n(x) = \sum_{n = 1}^\infty p (1 - p)^{n - 1} \alpha^n u_{n - 1}(x) F(x) \\ &= \alpha p F(x) \sum_{n = 1}^\infty [\alpha (1 - p)]^{n - 1} u_{n - 1}(x) = \alpha p F(x) U[x, \alpha (1 - p)] \\ &= p U[x, \alpha (1 - p)] f(x), \quad x \in S \end{align*}

In general, \(Y_N\) does not have constant rate for \((S, \rta)\).
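On the standard discrete graph, however, thinning a geometric distribution yields another geometric distribution, as stated in the examples below. The sketch verifies this exactly, using \(U(x, t) = 1 / (1 - t)^{x + 1}\):

```python
from fractions import Fraction

# Thinning check on (N, <=): with f geometric (rate alpha) and thinning
# parameter p, g(x) = p U[x, alpha(1 - p)] f(x) should be geometric with
# parameter alpha p / (1 - alpha + alpha p).
alpha, p = Fraction(1, 3), Fraction(1, 4)
t = alpha * (1 - p)
g = [p * alpha * (1 - alpha)**x / (1 - t)**(x + 1) for x in range(8)]

q = alpha * p / (1 - alpha + alpha * p)    # predicted geometric parameter
geom = [q * (1 - q)**x for x in range(8)]
print(g == geom)                           # exact agreement with Fractions
```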

Examples

Trivially, a finite discrete graph that is not strongly connected may not have a constant rate distribution.

Let \(S = \{0, 1\}\) and let \(\rta\) denote the relation on \(S\) defined by \(0 \rta 0\), \(0 \rta 1\) and \(1 \rta 1\). The graph \((S, \rta)\) does not support a constant rate distribution.

Details:

If \(f\) is a probability density function on \(S\) then the corresponding reliability function \(F\) for \((S, \rta)\) is given by \(F(0) = f(0) + f(1)\) and \(F(1) = f(1)\). So if \(f(0) \gt 0\) and \(f(1) \gt 0\) then no \(\alpha \in (0, \infty)\) satisfies \(f = \alpha F\). Note that \((S, \rta)\) is not strongly connected.

The Standard Continuous Graph

For the standard continuous graph \(([0, \infty), \le)\), recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \(u_n(x) = x^n / n!\) for \(x \in [0, \infty)\) and the walk generating function \(U\) is given by \(U(x, t) = e^{x t}\) for \(x \in [0, \infty)\) and \(t \in \R\).

  1. Random variable \(X\) has constant rate \(\alpha \in (0, \infty)\) if and only if \(X\) has the exponential distribution, with density function \(f\) and reliability function \(F\) given by \(f(x) = \alpha e^{-\alpha x}\) and \(F(x) = e^{-\alpha x}\) for \(x \in [0, \infty)\).
  2. The random walk \(\bs Y = (Y_1, Y_2, \ldots)\) associated with \(X\) has transition density \(P^n\) of order \(n \in \N_+\) given by \[P^n(x, y) = \alpha^n \frac{(y - x)^{n - 1}}{(n - 1)!} e^{-\alpha (y - x)}, \quad x \le y\]
  3. For \(n \in \N_+\), the density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n e^{-\alpha x_n}, \quad 0 \le x_1 \le \cdots \le x_n\]
  4. For \(n \in \N_+\), random variable \(Y_n\) has the gamma distribution with parameters \(n\) and \(\alpha\), with density function \(f_n\) given by \[f_n(x) = \alpha^n \frac{x^{n - 1}}{(n - 1)!} e^{-\alpha x}, \quad x \in [0, \infty)\]
  5. \(\E(N_A) = \alpha \lambda(A)\) for measurable \(A \subseteq [0, \infty)\).
  6. With thinning parameter \(p \in (0, 1)\), \(Y_N\) has the exponential distribution with parameter \(\alpha p\), with density function \(g\) given by \[g(x) = p U[x, \alpha (1 - p)] f(x) = \alpha p e^{-\alpha p x}, \quad x \in [0, \infty)\]

Of course, all of this is well known. The random walk \(\bs Y\) is the sequence of arrival times of the Poisson process with rate \(\alpha\). The standard continuous graph is studied in detail in Chapter 3.

The Standard Discrete Graph

For the standard discrete graph \((\N, \le)\) recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \(u_n(x) = \binom{x + n}{n}\) for \(x \in \N\) and the walk generating function \(U\) is given by \(U(x, t) = 1 / (1 - t)^{x+1}\) for \(x \in \N\) and \(t \in (-1, 1)\).

  1. Random variable \(X\) has constant rate \(\alpha \in (0, 1)\) if and only if \(X\) has the geometric distribution, with density function \(f\) and reliability function \(F\) given by \(f(x) = \alpha (1 - \alpha)^x\) and \(F(x) = (1 - \alpha)^x\) for \(x \in \N\).
  2. The random walk \(\bs Y = (Y_1, Y_2, \ldots)\) associated with \(X\) has transition density \(P^n\) of order \(n \in \N_+\) given by \[P^n(x, y) = \alpha^n (1 - \alpha)^{y - x} \binom{y - x + n - 1}{n - 1}, \quad x \le y\]
  3. For \(n \in \N_+\), the density function \(g_n\) of \((Y_1, Y_2, \ldots, Y_n)\) is given by \[g_n(x_1, x_2, \ldots, x_n) = \alpha^n (1 - \alpha)^{x_n}, \quad 0 \le x_1 \le \cdots \le x_n\]
  4. For \(n \in \N_+\), random variable \(Y_n\) has the negative binomial distribution with parameters \(n\) and \(\alpha\), with density function \(f_n\) given by \[f_n(x) = \alpha^n \binom{x + n - 1}{n - 1} (1 - \alpha)^x, \quad x \in \N\]
  5. \(\E(N_A) = \frac{\alpha}{1 - \alpha} \#(A)\) for \(A \subseteq \N\).
  6. With thinning parameter \(p \in (0, 1)\), \(Y_N\) has the geometric distribution with parameter \(\alpha p / [1 - \alpha + \alpha p]\), with density function \(g\) given by \[g(x) = p U[x, \alpha (1 - p)] f(x) = \frac{\alpha p}{1 - \alpha + \alpha p} \left(\frac{1 - \alpha}{1 - \alpha + \alpha p}\right)^x, \quad x \in \N\]

These results are also well known. For \(n \in \N_+\), random variable \(Y_n\) is the number of failures before the \(n\)th success in a Bernoulli trials sequence with success parameter \(\alpha\), so \(\bs Y\) can be thought of as the sequence of arrival times in a renewal process. The standard discrete graph is studied in detail in Chapter 4.

Complete and Equality Graphs

Return to the general setting of a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\).

Let \((S, \equiv)\) denote the complete reflexive graph, so that \(x \equiv y\) for every \(x, \, y \in S\).

  1. If \(\lambda(S) \lt \infty\) then the uniform distribution is the unique constant rate distribution for \((S, \equiv)\), with rate \(1 / \lambda(S)\).
  2. If \(\lambda(S) = \infty\) there are no constant rate distributions for \((S, \equiv)\).
Details:

For any probability distribution on \(S\), the reliability function \(F\) for \((S, \equiv)\) is the constant function 1. Note also that \((S, \equiv)\) is right regular of degree \(\lambda(S)\), so the result on right regular graphs above applies.

Recall that \((S, \ne)\) is the complete irreflexive graph. Suppose that \(\lambda\{x\} = c \in [0, \infty)\) for all \(x \in S\).

  1. If \(\lambda(S) \lt \infty\) then the uniform distribution is the unique constant rate distribution for \((S, \ne)\), with rate \(1 / [\lambda(S) - c]\).
  2. If \(\lambda(S) = \infty\) there are no constant rate distributions for \((S, \ne)\).
Details:

Suppose that \(f\) is a probability density function on \(S\). The corresponding reliability function \(F\) is given by \[F(x) = \int_{S - \{x\}} f(y) \, d\lambda(y) = 1 - \int_{\{x\}} f(y) \, d\lambda(y) = 1 - \lambda\{x\} f(x) = 1 - c f(x) \] So the condition for the distribution defined by \(f\) to have constant rate \(\alpha \in (0, \infty)\) is \(f(x) = \alpha [1 - c f(x)]\) for \(x \in S\) or equivalently, \(f(x) = \alpha / (1 + \alpha c)\). If \(\lambda(S) \lt \infty\) then \(\alpha = 1 / [\lambda(S) - c]\) and \(f\) is the density of the uniform distribution on \(S\). If \(\lambda(S) = \infty\) then no choice of \(\alpha \in (0, \infty)\) makes \(f\) a valid density function. Note also that \((S, \ne)\) is right regular of degree \(\lambda(S) - c\), so the result on right regular graphs above applies.

In the context of , if \(S\) is uncountable and the reference measure \(\lambda\) is continuous, then \(c = 0\) and the rate constant in part (a) is \(1 / \lambda(S)\), just as for the complete reflexive graph in . If \(S\) is countable with counting measure \(\lambda = \#\), then \(c = 1\). If \(\#(S) = n \in \{2, 3, \ldots\}\), then the rate constant of the uniform distribution in part (a) is \(1 / (n - 1)\).
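As a small numerical illustration of part (a) in the countable case, here is a minimal sketch (an assumption-free check of the algebra, with counting measure on a finite set so that \(c = 1\)) that the uniform density has constant rate \(1 / (n - 1)\) for \((S, \ne)\):

```python
import math

# Complete irreflexive graph (S, !=) on n points with counting measure (c = 1).
# Check that the uniform density f satisfies f = alpha * F with alpha = 1/(n-1).
n = 5
f = [1 / n] * n                                        # uniform density
F = [sum(f[y] for y in range(n) if y != x) for x in range(n)]  # reliability
alpha = 1 / (n - 1)
assert all(math.isclose(f[x], alpha * F[x]) for x in range(n))
```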

Suppose that \(S\) is countable, with counting measure \(\#\) as the reference measure. For the equality graph \((S, =)\), every distribution has constant rate 1.

Details:

If \(f\) is a probability density function on \(S\), then trivially, the corresponding reliability function for \((S, =)\) is \(F = f\).

The Diamond Graph

Recall the diamond graph \((S, \lfrta)\) defined in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \lfrta)\).
  2. Find the density function \(f\) and reliability function \(F\) of \(X\).
Details:
  1. The rate constant is \(\alpha = 2 / (1 + \sqrt{17})\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= f(4) = \frac{\sqrt{17} - 3}{4}, \quad f(2) = f(3) = \frac{5 - \sqrt{17}}{4} \\ F(1) &= F(4) = \frac{7 - \sqrt{17}}{4}, \quad F(2) = F(3) = \frac{\sqrt{17} - 3}{2} \end{align*}
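The values above can be checked numerically. The sketch below assumes the labeling in which 1 and 4 are the degree-3 vertices of the diamond (a labeling consistent with the values of \(f\) and \(F\) just given):

```python
import math

# Diamond graph: vertices 1 and 4 adjacent to everything, 2 and 3 adjacent
# to 1 and 4 only (assumed labeling).  Check the constant rate property
# f = alpha * F with alpha = 2 / (1 + sqrt(17)).
s = math.sqrt(17)
adj = {1: [2, 3, 4], 2: [1, 4], 3: [1, 4], 4: [1, 2, 3]}
f = {1: (s - 3) / 4, 2: (5 - s) / 4, 3: (5 - s) / 4, 4: (s - 3) / 4}
alpha = 2 / (1 + s)
F = {x: sum(f[y] for y in adj[x]) for x in adj}        # reliability function
assert math.isclose(sum(f.values()), 1)
assert all(math.isclose(f[x], alpha * F[x]) for x in adj)
```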

The app below is a simulation of the constant rate distribution for the diamond graph.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the diamond graph \((S, \lfrta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find \(\lim_{n \to \infty} f_n\) and verify that the limit is the invariant density function of \(\bs Y\).
Details:
  1. The transition matrix \(P\) is given by \begin{align*} P(1, 2) &= P(1, 3) = P(4, 2) = P(4, 3) = \frac{9 - \sqrt{17}}{16} \\ P(1, 4) &= P(4, 1) = \frac{2}{1 + \sqrt{17}} \\ P(2, 1) &= P(2, 4) = P(3, 1) = P(3, 4) = \frac 1 2 \end{align*}
  2. The density function \(f_n\) of \(Y_n\) is given by \begin{align*} f_n(1) &= f_n(4) = -\left(1 - \frac{4}{\sqrt{17}}\right) \left(\frac{1 - \sqrt{17}}{1 + \sqrt{17}}\right)^{n-1} + \frac{1}{4}\left(1 + \frac{1}{\sqrt{17}}\right) \\ f_n(2) &= f_n(3) = \left(1 - \frac{4}{\sqrt{17}}\right) \left(\frac{1 - \sqrt{17}}{1 + \sqrt{17}}\right)^{n-1} + \frac{1}{4}\left(1 - \frac{1}{\sqrt{17}}\right) \end{align*}
  3. The limiting values are \begin{align*} \lim_{n \to \infty} f_n(1) &= \lim_{n \to \infty} f_n(4) = \frac{1}{4}\left(1 + \frac{1}{\sqrt{17}}\right) \\ \lim_{n \to \infty} f_n(2) &= \lim_{n \to \infty} f_n(3) = \frac{1}{4}\left(1 - \frac{1}{\sqrt{17}}\right) \end{align*} Moreover, \(\lim_{n \to \infty} f_n\) is \(f F\) normalized and hence is the invariant density function.
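Since the transition probabilities are \(P(x, y) = f(y) / F(x)\) for \(y\) a neighbor of \(x\), the limiting (invariant) density can be checked directly. A minimal sketch, again assuming the labeling with 1 and 4 as the degree-3 vertices:

```python
import math

# Random walk on the diamond graph with P(x, y) = f(y) / F(x).
# Check that the density proportional to f * F is invariant, and that it
# matches the stated limit of f_n.
s = math.sqrt(17)
adj = {1: [2, 3, 4], 2: [1, 4], 3: [1, 4], 4: [1, 2, 3]}
f = {1: (s - 3) / 4, 2: (5 - s) / 4, 3: (5 - s) / 4, 4: (s - 3) / 4}
F = {x: sum(f[y] for y in adj[x]) for x in adj}
P = {x: {y: f[y] / F[x] for y in adj[x]} for x in adj}
c = sum(f[x] * F[x] for x in adj)                      # normalizing constant
h = {x: f[x] * F[x] / c for x in adj}                  # f * F, normalized
hP = {y: sum(h[x] * P[x].get(y, 0) for x in adj) for y in adj}
assert all(math.isclose(h[y], hP[y]) for y in adj)     # h is invariant
assert math.isclose(h[1], (1 + 1 / s) / 4)             # stated limit of f_n(1)
```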

The app below is a simulation of the random walk on the diamond graph.

Verify directly the result of theorem for the diamond graph.

Details:

As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(1) = d_n(4) & = \frac{\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 5\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 5\right)} {4 \left[\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 4\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 4\right)\right]} \\ d_n(2) = d_n(3) & = \frac{\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 3\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 3\right)} {4 \left[\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 4\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 4\right)\right]} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(1) & = \lim_{n \to \infty} d_n(4) = \frac{\sqrt{17} - 3}{4} \\ \lim_{n \to \infty} d_n(2) & = \lim_{n \to \infty} d_n(3) = \frac{5 - \sqrt{17}}{4} \end{align*} which agrees with the density function of the constant rate distribution in .

The Directed Diamond Graph

Recall the directed version of the diamond graph \((S, \rta)\) described in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \rta)\).
  2. Find the density function and reliability function of \(X\).
Details:
  1. The rate constant is \(\alpha = 2^{-1/3}\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and the reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= \frac{2^{1/3}}{1 + 2^{1/3} + 2^{2/3}}, \; f(2) = f(3) = \frac{2^{-1/3}}{1 + 2^{1/3} + 2^{2/3}}, \; f(4) = \frac{1}{1 + 2^{1/3} + 2^{2/3}} \\ F(1) &= \frac{2^{2/3}}{1 + 2^{1/3} + 2^{2/3}}, \; F(2) = F(3) = \frac{1}{1 + 2^{1/3} + 2^{2/3}}, \; F(4) = \frac{2^{1/3}}{1 + 2^{1/3} + 2^{2/3}} \end{align*}
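Here is a numerical check of the constant rate property, assuming the edge set \(1 \rta 2\), \(1 \rta 3\), \(2 \rta 4\), \(3 \rta 4\), \(4 \rta 1\) (consistent with the transition matrix in the next exercise):

```python
import math

# Directed diamond graph.  Check f = alpha * F with alpha = 2^(-1/3),
# where F(x) sums f over the out-neighbors of x.
out = {1: [2, 3], 2: [4], 3: [4], 4: [1]}              # assumed edge set
D = 1 + 2 ** (1 / 3) + 2 ** (2 / 3)
f = {1: 2 ** (1 / 3) / D, 2: 2 ** (-1 / 3) / D,
     3: 2 ** (-1 / 3) / D, 4: 1 / D}
alpha = 2 ** (-1 / 3)
F = {x: sum(f[y] for y in out[x]) for x in out}
assert math.isclose(sum(f.values()), 1)
assert all(math.isclose(f[x], alpha * F[x]) for x in out)
```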

The app below is a simulation of the constant rate distribution on the directed diamond graph.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the directed diamond graph \((S, \rta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find the invariant probability density function of \(\bs Y\).
  4. Classify the random walk \(\bs Y\) in terms of periodicity.
Details:
  1. The transition probability matrix \(P\) is given by \(P(1, 2) = P(1, 3) = \frac{1}{2}, \, P(2, 4) = P(3, 4) = P(4, 1) = 1 \).
  2. The density function \(f_n\) of \(Y_n\) is given as follows:
    • If \(n = 1 \bmod 3\) then \(f_n = f\).
    • If \(n = 2 \bmod 3\) then \(f_n(x) = 2^{-1 / 3} f(x)\) for \(x \in \{1, 2, 3\}\) and \(f_n(4) = 2^{2 / 3} f(4)\).
    • If \(n = 0 \bmod 3\) then \(f_n(x) = 2^{1 / 3} f(x)\) for \(x \in \{1, 4\}\) and \(f_n(x) = 2^{-2 / 3} f(x)\) for \(x \in \{2, 3\}\).
  3. The invariant probability density function \(h\) is given by \(h(1) = h(4) = 1 / 3\) and \(h(2) = h(3) = 1 / 6\).
  4. The directed diamond graph is periodic with period 3, and hence so is the random walk \(\bs Y\) on the graph.
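The period-3 behavior and the invariant density can both be seen by iterating the walk density through the transition matrix. A sketch, with the same assumed edge set as above:

```python
import math

# Random walk on the directed diamond graph.  Iterating f_n one step at a
# time exhibits the period-3 cycle; the invariant density h is the average
# of the three densities in the cycle.
out = {1: [2, 3], 2: [4], 3: [4], 4: [1]}              # assumed edge set
D = 1 + 2 ** (1 / 3) + 2 ** (2 / 3)
f = {1: 2 ** (1 / 3) / D, 2: 2 ** (-1 / 3) / D,
     3: 2 ** (-1 / 3) / D, 4: 1 / D}
F = {x: sum(f[y] for y in out[x]) for x in out}
P = {x: {y: f[y] / F[x] for y in out[x]} for x in out}

def step(g):
    """One step of the walk: g -> g P."""
    return {y: sum(g[x] * P[x].get(y, 0) for x in out) for y in out}

f1, f2, f3 = f, step(f), step(step(f))
f4 = step(f3)
assert all(math.isclose(f4[x], f1[x]) for x in out)    # period 3
h = {x: (f1[x] + f2[x] + f3[x]) / 3 for x in out}      # invariant density
assert math.isclose(h[1], 1 / 3) and math.isclose(h[2], 1 / 6)
```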

The app below is a simulation of the random walk on the directed diamond graph.

For the directed diamond graph,

  1. Show that the result in theorem does not hold.
  2. Verify directly the result in theorem .
Details:
  1. As shown in Section 3, the right walk distribution of order \(n \in \N\) for the directed diamond graph depends only on the remainder when \(n\) is divided by 3 (the period of the graph). The density function \(d_n\) is given as follows:
    • If \(n = 0 \bmod 3\) then \(d_n(x) = 1 / 4\) for \(x \in S\), so the distribution is uniform on \(S\).
    • If \(n = 1 \bmod 3\) then \(d_n(1) = 2 / 5\) and \(d_n(2) = d_n(3) = d_n(4) = 1 / 5\).
    • If \(n = 2 \bmod 3\) then \(d_n(1) = d_n(4) = 1 / 3\) and \(d_n(2) = d_n(3) = 1 / 6\).
    In particular, the distribution does not converge as \(n \to \infty\).
  2. Let \(\approx\) denote asymptotic equivalence as \(n \to \infty\): for sequences \((a_n: n \in \N)\) and \((b_n: n \in \N)\), \(a_n \approx b_n\) means that \(a_n / b_n \to 1\) as \(n \to \infty\). Using the walk functions for the directed diamond graph in Section 1 and the rate constant \(\alpha = 2^{-1/3}\) from , \begin{align*} & \sum_{j = 0}^n \alpha^j v_j(1) \approx \frac{n}{3} \left(1 + 2 \cdot 2^{-1/3} + 2 \cdot 2^{-2/3}\right) \\ & \sum_{j = 0}^n \alpha^j v_j(2) = \sum_{j = 0}^n \alpha^j v_j(3) \approx \frac{n}{3} \left(1 + 2^{-1/3} + 2^{-2/3}\right) \\ & \sum_{j = 0}^n \alpha^j v_j(4) \approx \frac{n}{3} \left(1 + 2^{-1/3} + 2 \cdot 2^{-2/3}\right) \\ & \sum_{j = 0}^n \alpha^j w_j \approx \frac{n}{3} \left(4 + 5 \cdot 2^{-1/3} + 6 \cdot 2^{-2/3}\right) \end{align*} So as \(n \to \infty\) \begin{align*} & \hat d_n(1) \to \frac{1 + 2 \cdot 2^{-1/3} + 2 \cdot 2^{-2/3}}{4 + 5 \cdot 2^{-1/3} + 6 \cdot 2^{-2/3}}\\ & \hat d_n(2) = \hat d_n(3) \to \frac{1 + 2^{-1/3} + 2^{-2/3}}{4 + 5 \cdot 2^{-1/3} + 6 \cdot 2^{-2/3}}\\ & \hat d_n(4) \to \frac{1 + 2^{-1/3} + 2 \cdot 2^{-2/3}}{4 + 5 \cdot 2^{-1/3} + 6 \cdot 2^{-2/3}} \end{align*} A little algebra shows that the limiting values agree with the values of the constant rate density \(f\) in .

The Bull Graph

Recall the bull graph \((S, \lfrta)\) defined in Section 1.

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \lfrta)\).
  2. Find the density function \(f\) and reliability function \(F\) of \(X\).
Details:
  1. The rate constant is \(\alpha = 2 / (1 + \sqrt{13})\), the reciprocal of the largest eigenvalue.
  2. The density function \(f\) and reliability function \(F\) of \(X\) are given by \begin{align*} f(1) &= \frac{5 - \sqrt{13}}{6}, \quad f(2) = f(3) = \frac{\sqrt{13} - 2}{6}, \quad f(4) = f(5) = \frac{5 - \sqrt{13}}{12}\\ F(1) &= \frac{\sqrt{13} - 2}{3}, \quad F(2) = F(3) = \frac{11 - \sqrt{13}}{12}, \quad F(4) = F(5) = \frac{\sqrt{13} - 2}{6} \end{align*}
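These values can be verified numerically. The sketch below assumes the labeling in which \(\{1, 2, 3\}\) is the triangle and 4, 5 are the horns attached to 2 and 3 respectively (consistent with the transition matrix in the next exercise):

```python
import math

# Bull graph: triangle 1-2-3 with pendant vertices 4-2 and 5-3 (assumed
# labeling).  Check f = alpha * F with alpha = 2 / (1 + sqrt(13)).
s = math.sqrt(13)
adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5], 4: [2], 5: [3]}
f = {1: (5 - s) / 6, 2: (s - 2) / 6, 3: (s - 2) / 6,
     4: (5 - s) / 12, 5: (5 - s) / 12}
alpha = 2 / (1 + s)
F = {x: sum(f[y] for y in adj[x]) for x in adj}        # reliability function
assert math.isclose(sum(f.values()), 1)
assert all(math.isclose(f[x], alpha * F[x]) for x in adj)
```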

The app below is a simulation of the constant rate distribution on the bull graph.

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the bull graph \((S, \lfrta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix of \(\bs Y\).
  2. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  3. Find \(\lim_{n \to \infty} f_n\) and verify that the limit is the invariant density function of \(\bs Y\).
Details:
  1. The transition probability matrix \(P\) is given by \begin{align*} P(1, 2) &= P(1, 3) = \frac{1}{2} \\ P(2, 1) &= P(3, 1) = \frac{10 - 2 \sqrt{13}}{11 - \sqrt{13}}, \, P(2, 3) = P(3, 2) = \frac{2}{1 + \sqrt{13}}, \, P(2, 4) = P(3, 5) = \frac{7 - \sqrt{13}}{18} \\ P(4, 2) &= P(5, 3) = 1 \end{align*}
  2. The density function \(f_n\) of \(Y_n\) is given by \(f_1 = f\) and, for \(n \in \{2, 3, \ldots\}\), \begin{align*} f_n(1) &= \left(\frac{14}{3 \sqrt{13}} - \frac{4}{3}\right)\left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{3}\left(1 - \frac{1}{\sqrt{13}}\right) \\ f_n(2) = f_n(3) &= \left(1 - \frac{7}{2 \sqrt{13}}\right) \left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{4}\left(1 + \frac{1}{\sqrt{13}}\right) \\ f_n(4) = f_n(5) &= \left(-\frac{1}{3} + \frac{7}{6 \sqrt{13}}\right) \left(\frac{1 - \sqrt{13}}{1 + \sqrt{13}}\right)^{n-2} + \frac{1}{12}\left(1 - \frac{1}{\sqrt{13}}\right) \end{align*}
  3. The limiting values are \begin{align*} \lim_{n \to \infty} f_n(1) &= \frac{1}{3}\left(1 - \frac{1}{\sqrt{13}}\right) \\ \lim_{n \to \infty} f_n(2) = \lim_{n \to \infty} f_n(3) &= \frac{1}{4}\left(1 + \frac{1}{\sqrt{13}}\right) \\ \lim_{n \to \infty} f_n(4) = \lim_{n \to \infty} f_n(5) &= \frac{1}{12}\left(1 - \frac{1}{\sqrt{13}}\right) \end{align*} Moreover, \(\lim_{n \to \infty} f_n\) is \(f F\) normalized and hence is the invariant density function.

The app below is a simulation of the random walk on the bull graph.

Verify directly the result of theorem for the bull graph.

Details:

As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(1) & = \frac{2 \left[\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 2\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 2\right)\right]} {\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)} \\ d_n(2) = d_n(3) & = \frac{3 \left[\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 5\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 5\right)\right]} {2 \left[\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)\right]} \\ d_n(4) = d_n(5) & = \frac{\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 2\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 2\right)} {\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(1) & = \frac{5 - \sqrt{13}}{6} \\ \lim_{n \to \infty} d_n(2) = \lim_{n \to \infty} d_n(3) & = \frac{\sqrt{13} - 2}{6} \\ \lim_{n \to \infty} d_n(4) = \lim_{n \to \infty} d_n(5) & = \frac{5 - \sqrt{13}}{12} \end{align*} which agrees with the constant rate density in .

The Triangle Graph

Recall the triangle graph \((S, \rta)\) defined in Section 1. So \(S = \{1, 2, 3\}\) and \(\rta\) is defined by \(1 \rta 2\), \(2 \rta 1\), \(2 \rta 2\), \(2 \rta 3\), \(3 \rta 1\) and \(3 \rta 3\).

  1. Find the rate constant of the random variable \(X\) that has constant rate for \((S, \rta)\).
  2. Find the density function \(f\) of \(X\).
Details:
  1. From Section 1, the Perron eigenvalue is 2 and so the rate constant is \(\alpha = 1 / 2\).
  2. The corresponding normalized eigenvector is \(f = (1 / 4, 1 / 2, 1 / 4)\).

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the triangle graph \((S, \rta)\) corresponding to the constant rate distribution.

  1. Find the transition probability matrix \(P\) of \(\bs Y\).
  2. Find \(P^n\) for \(n \in \{2, 3, \ldots\}\).
  3. Find the density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
Details:
  1. \[P = \left[\begin{matrix} 0 & 1 & 0 \\ 1 / 4 & 1 / 2 & 1 / 4 \\ 1 / 2 & 0 & 1 / 2 \end{matrix}\right]\]
  2. \[P^n = \left[\begin{matrix} 1 / 4 & 1 / 2 & 1 / 4 \\ 1 / 4 & 1 / 2 & 1 / 4 \\ 1 / 4 & 1 / 2 & 1 / 4 \end{matrix}\right], \quad n \in \{2, 3, \ldots\}\]
  3. \(f_n = f = (1 / 4, 1 / 2, 1 / 4)\) for \(n \in \N_+\).

Verify directly the result of theorem for the triangle graph.

Details:

As shown in Section 3, the right walk density functions are \(d_0 = (1 / 3, 1 / 3, 1 / 3)\), \(d_1 = (1 / 6, 1 / 2, 1 / 3)\) and \(d_n = (1 / 4, 1 / 2, 1 / 4) = f\) for \(n \in \{2, 3, \ldots\}\).
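The triangle computations are small enough to check numerically. A minimal sketch, using the edge set of the triangle graph listed above:

```python
import math

# Triangle graph: 1 -> 2; 2 -> 1, 2, 3; 3 -> 1, 3.  Check the rate 1/2,
# build P from P(x, y) = f(y) / F(x), and verify that P^2 already has all
# rows equal to f (so P^n = P^2 for n >= 2).
out = {1: [2], 2: [1, 2, 3], 3: [1, 3]}
f = {1: 1 / 4, 2: 1 / 2, 3: 1 / 4}
F = {x: sum(f[y] for y in out[x]) for x in out}        # reliability function
assert all(math.isclose(f[x], 0.5 * F[x]) for x in out)  # constant rate 1/2
P = {x: {y: f[y] / F[x] for y in out[x]} for x in out}
P2 = {x: {y: sum(P[x].get(z, 0) * P[z].get(y, 0) for z in out)
          for y in out} for x in out}
assert all(math.isclose(P2[x][y], f[y]) for x in out for y in out)
```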

The Star Graph

For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the star graph of order \(k\) as defined in Section 1. So \(\N_k = \{0, 1, \ldots, k\}\) where \(0\) is the center and \(\{1, 2, \ldots, k\}\) is the set of endpoints.

Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).

Details:

Let \(f\) denote the constant rate density function and \(\alpha\) the rate constant. By symmetry, \(f\) is constant on the endpoints, so let \(p = f(0)\) and let \(q\) denote the common value of \(f(x)\) for \(x \in \{1, 2, \ldots, k\}\). The equations that define \(f\) are \(p = \alpha k q\), \(q = \alpha p\) and \(p + k q = 1\). Solving gives \[\alpha = \frac{1}{\sqrt{k}}\] and then \begin{align*} f(0) & = \frac{\sqrt{k}}{k + \sqrt{k}} \\ f(x) & = \frac{1}{k + \sqrt{k}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
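The defining equations are easy to check numerically for several values of \(k\); a minimal sketch:

```python
import math

# Star graph of order k: center 0 with endpoints 1, ..., k.  The constant
# rate equations are p = alpha * k * q (center, since F(0) = k * q),
# q = alpha * p (endpoint, since F(x) = p), and p + k * q = 1.
for k in (3, 4, 10):
    alpha = 1 / math.sqrt(k)
    p = math.sqrt(k) / (k + math.sqrt(k))              # f(0)
    q = 1 / (k + math.sqrt(k))                         # f(x), x an endpoint
    assert math.isclose(p, alpha * k * q)
    assert math.isclose(q, alpha * p)
    assert math.isclose(p + k * q, 1)
```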

The app below is a simulation of the constant rate distribution on the star of order \(k\). The parameter \(k\) can be varied with the scrollbar.

Suppose that \(X\) has the constant rate distribution on the star graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).

  1. Find the transition matrix \(P\) of \(\bs Y\).
  2. Find the invariant probability density function \(h\) of \(\bs Y\).
Details:
  1. From exercise and the definition of \(P\), \begin{align*} P(0, x) & = \frac{1}{k}, \quad x \in \{1, 2, \ldots, k\} \\ P(x, 0) & = 1, \quad x \in \{1, 2, \ldots, k\} \end{align*} This is also trivial from symmetry.
  2. Since the star is finite and symmetric, \(\bs Y\) is positive recurrent and the invariant probability density function \(h\) is obtained by normalizing the invariant function \(f^2\). So \begin{align*} h(0) &= \frac 1 2 \\ h(x) &= \frac{1}{2 k}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
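The normalization of \(f^2\) can be checked numerically, and shows that \(h\) does not depend on \(k\) at the center; a minimal sketch:

```python
import math

# Star graph: the invariant density h is f^2 normalized.  Since
# f(0) = sqrt(k) * f(x), we get h(0) = 1/2 and h(x) = 1/(2k) for every k.
for k in (3, 7):
    p = math.sqrt(k) / (k + math.sqrt(k))              # f(0)
    q = 1 / (k + math.sqrt(k))                         # f(x), x an endpoint
    c = p ** 2 + k * q ** 2                            # normalizing constant
    assert math.isclose(p ** 2 / c, 1 / 2)
    assert math.isclose(q ** 2 / c, 1 / (2 * k))
```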

The app below is a simulation of the random walk on the star of order \(k\) associated with the constant rate distribution. The parameter \(k\) can be varied with the scrollbar.

For the star graph,

  1. Show that the result in theorem does not hold.
  2. Verify directly the result in theorem .
Details:

Recall that the star graph is periodic with period 2.

  1. As shown in Section 3, the walk distribution of order \(n \in \N\) depends only on whether \(n\) is even or odd. If \(n\) is even, the distribution is uniform on \(\N_k\). If \(n\) is odd, the center 0 has probability \(1 / 2\) while the remaining \(1 / 2\) probability is uniformly distributed among the endpoints. In particular, the walk distribution of order \(n \in \N\) does not converge as \(n \to \infty\).
  2. As before, let \(\approx\) denote asymptotic equivalence as \(n \to \infty\). Using the walk functions for the star graph in Section 1 and the rate constant \(\alpha = 1 / \sqrt{k}\) from , \begin{align*} & \sum_{j = 0}^n \alpha^j v_j(0) \approx \frac{n}{2} \left(1 + \sqrt{k}\right)\\ & \sum_{j = 0}^n \alpha^j v_j(x) \approx \frac{n}{2} \left(1 + \frac{1}{\sqrt{k}}\right), \quad x \in \{1, 2, \ldots, k\} \\ & \sum_{j = 0}^n \alpha^j w_j \approx \frac{n}{2} \left(1 + k + 2 \sqrt{k}\right) \end{align*} So as \(n \to \infty\), \begin{align*} & \hat d_n(0) \to \frac{1 + \sqrt{k}}{1 + k + 2 \sqrt{k}} = \frac{\sqrt{k}}{k + \sqrt{k}} \\ & \hat d_n(x) \to \frac{1 + 1 / \sqrt{k}}{1 + k + 2 \sqrt{k}} = \frac{1}{k + \sqrt{k}}, \quad x \in \{1, 2, \ldots, k\} \end{align*} These limits agree with the constant rate density \(f\) in .

The Wheel Graph

For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the wheel graph of order \(k\) as defined in Section 1. So \(\N_k = \{0, 1, \ldots, k\}\) where \(0\) is the center and \(\{1, 2, \ldots, k\}\) is the set of vertices on the rim.

Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).

Details:

As noted in Section 3 the largest eigenvalue of the adjacency matrix \(L\) is \(1 + \sqrt{k + 1}\), and moreover a corresponding eigenfunction \(g\) is given by \(g(0) = \sqrt{k + 1} - 1\) and \(g(x) = 1\) for \(x \in \{1, 2, \ldots, k\}\). Hence the rate constant is \[\alpha = \frac{1}{1 + \sqrt{k + 1}} = \frac{\sqrt{k + 1} - 1}{k}\] The corresponding density function \(f\) is obtained by normalizing \(g\). Since \(c = \sum_{x \in S} g(x) = k - 1 + \sqrt{k + 1}\), \begin{align*} f(0) & = \frac{\sqrt{k + 1} - 1}{k - 1 + \sqrt{k + 1}}\\ f(x) & = \frac{1}{k - 1 + \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
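A numerical check of the constant rate property, assuming the usual wheel structure (hub 0 joined to every rim vertex, rim vertices \(1, \ldots, k\) in a cycle):

```python
import math

# Wheel graph of order k: hub 0 adjacent to all rim vertices; rim vertex x
# adjacent to the hub and its two cyclic neighbors.  Check f = alpha * F.
k = 5
r = math.sqrt(k + 1)
adj = {0: list(range(1, k + 1))}
for x in range(1, k + 1):
    left = k if x == 1 else x - 1                      # cyclic rim neighbors
    right = 1 if x == k else x + 1
    adj[x] = [0, left, right]
alpha = 1 / (1 + r)
f = {0: (r - 1) / (k - 1 + r)}
f.update({x: 1 / (k - 1 + r) for x in range(1, k + 1)})
F = {x: sum(f[y] for y in adj[x]) for x in adj}        # reliability function
assert math.isclose(sum(f.values()), 1)
assert all(math.isclose(f[x], alpha * F[x]) for x in adj)
```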

The app below is a simulation of the constant rate distribution on the wheel graph of order \(k\). The parameter \(k\) can be varied with the scrollbar.

Note that when \(k = 3\), the rate constant is \(\alpha = 1 / 3\) and the constant rate density is \(f(x) = 1 / 4\) for \(x \in \N_3\). But we already knew this since \((\N_3, \lfrta)\) is regular with degree \(3\).

Suppose that \(X_k\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\). Show that the distribution of \(X_k / k\) converges to the uniform distribution on \([0, 1]\) as \(k \to \infty\).

Details:

Let \(p_k\) denote the common value of \(f_k(j)\) for \(j \in \{1, 2, \ldots, k\}\) as given above. So \(f_k(0) = 1 - k p_k\). Then for \(x \in [0, 1]\), \begin{align*} \P(X_k / k \le x) & = \sum_{j = 0}^k \bs{1}(j / k \le x) f_k(j) = f_k(0) + p_k \sum_{j = 1}^k \bs{1}(j / k \le x) \\ & = f_k(0) + k p_k \frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \end{align*} But as \(k \to \infty\), \(f_k(0) \to 0\), \(k p_k \to 1\), and \(\frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \to \int_0^x 1 \, dt = x\).
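The convergence is easy to illustrate numerically; a minimal sketch (the function name `cdf` is just for this illustration):

```python
import math

# P(X_k / k <= x) for the constant rate distribution on the wheel of
# order k; this should approach x as k grows.
def cdf(k, x):
    r = math.sqrt(k + 1)
    f0 = (r - 1) / (k - 1 + r)                         # density at the hub
    p = 1 / (k - 1 + r)                                # density at rim vertex
    return f0 + p * sum(1 for j in range(1, k + 1) if j / k <= x)

for x in (0.25, 0.5, 0.9):
    assert abs(cdf(10 ** 5, x) - x) < 0.01
```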

Suppose that \(X\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).

  1. Find the transition matrix \(P\) of \(\bs Y\).
  2. Find the invariant probability density function \(h\) of \(\bs Y\).
Details:
  1. From exercise and the definition of \(P\), \begin{align*} P(x, 0) & = \frac{\left(\sqrt{k + 1} - 1\right)^2}{k} \\ P(0, x) & = \frac{1}{k} \\ P(x, y) & = \frac{\sqrt{k + 1} - 1}{k}, \quad y \text{ a rim neighbor of } x \end{align*} Here \(x \in \{1, 2, \ldots, k\}\), and the rim neighbors of \(x\) are \(x - 1\) and \(x + 1\), interpreted cyclically in \(\{1, 2, \ldots, k\}\) (so the rim neighbors of \(1\) are \(k\) and \(2\)).
  2. Since the wheel is finite and symmetric, \(\bs Y\) is positive recurrent and the invariant probability density function \(h\) is obtained by normalizing the invariant function \(f^2\). After some algebra, \begin{align*} h(0) &= \frac 1 2 - \frac{1}{2 \sqrt{k + 1}} \\ h(x) &= \frac{1}{2 k} + \frac{1}{2 k \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
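The algebra behind part (b) can be checked numerically by normalizing \(f^2\) directly; a minimal sketch:

```python
import math

# Wheel graph: the invariant density h is f^2 normalized.  Check the stated
# closed forms for h(0) and h(x).
k = 5
r = math.sqrt(k + 1)
f0 = (r - 1) / (k - 1 + r)                             # f at the hub
p = 1 / (k - 1 + r)                                    # f at a rim vertex
c = f0 ** 2 + k * p ** 2                               # normalizing constant
h0 = f0 ** 2 / c
hx = p ** 2 / c
assert math.isclose(h0, 1 / 2 - 1 / (2 * r))
assert math.isclose(hx, 1 / (2 * k) + 1 / (2 * k * r))
```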

The app below is a simulation of the random walk on the wheel graph of order \(k\). The parameter \(k\) can be varied with the scrollbar.

Verify theorem for the wheel graph of order \(k \in \{3, 4, \ldots\}\).

Details:

As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(0) & = \frac{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1} + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}}{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1}\left(2 + \sqrt{k + 1}\right) + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}\left(2 - \sqrt{k + 1}\right)} \\ d_n(x) & = \frac{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^n + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^n}{k \left[\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1}\left(2 + \sqrt{k + 1}\right) + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}\left(2 - \sqrt{k + 1}\right)\right]}, \quad x \in \{1, 2, \ldots, k\} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(0) & = \frac{1}{2 + \sqrt{k + 1}}\\ \lim_{n \to \infty} d_n(x) & = \frac{1 + \sqrt{k + 1}}{k \left(2 + \sqrt{k + 1}\right)}, \quad x \in \{1, 2, \ldots, k\} \end{align*} which (after some algebra) agrees with the constant rate density function in .