This section has the most important definitions and results of the chapter.
Our starting point is a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\). For \(n \in \N_+\) the corresponding product space is \((S^n, \ms S^n, \lambda^n)\). Suppose now that \((S, \rta)\) is a measurable graph. If \(X\) is a random variable in \(S\), recall the definitions of the reliability function \(F\) and the cumulative rate function \(R_n\) of order \(n \in \N_+\) of \(X\) for \((S, \rta)\) given in Section 3. Recall also the left walk function \(u_n\) of order \(n \in \N_+\) for \((S, \rta)\) defined in Section 1. Here are the main definitions, stated so that we avoid division by zero.
Relative to the graph \((S, \rta)\),
Here is our first trivial result:
If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N_+\).
If \(r(x) = \alpha\) for \(x \in S\) then from the definition, the cumulative rate function \(R_n\) of order \(n \in \N_+\) is given by \(R_n(x) = \alpha^n u_n(x)\) for \(x \in S\).
So as the name suggests, the rate function \(r = f / F\) is the constant \(\alpha\). Recall that the reference measure \(\lambda\) is fixed in advance, and is usually a natural measure in some sense for the measurable space \((S, \ms S)\). For example, counting measure is always used for discrete graphs and Lebesgue measure for graphs on measurable subsets of \(\R^n\), or more generally, Hausdorff measure on Euclidean manifolds. Trivially, if \(X\) has constant rate \(\alpha \in (0, \infty)\) with respect to \(\lambda\) and if \(c \in (0, \infty)\) then \(X\) has constant rate \(\alpha / c\) with respect to the measure \(c \lambda\). But more importantly, if we can choose the measure a posteriori, then every distribution has constant rate.
Suppose that \(X\) is supported by \((S, \rta)\) and let \(\mu\) be the measure on \((S, \ms S)\) defined by \[\mu(A) = \E[1 / F(X), X \in A], \quad A \in \ms S\] Then \(X\) has constant rate \(1\) for \((S, \rta)\) with respect to \(\mu\).
Note first that since \(F\) is positive, \(\mu\) is a \(\sigma\)-finite positive measure. By definition, \(\mu\) has density function \(1 / F\) with respect to the distribution of \(X\), and hence the distribution of \(X\) has density function \(F\) with respect to \(\mu\). That is, \(\P(X \in A) = \int_A F(x) d \mu(x)\) for \(A \in \ms S\).
For the remainder of this section, we assume again that \(\lambda\) is a fixed reference measure. Suppose that \(X\) has density function \(f\) with respect to \(\lambda\), and let \(\mu\) denote the measure in Proposition and \(P\) the distribution of \(X\). In the notation of Radon-Nikodym derivatives, we have \(d \mu = (1 / F) \, d P\) and \(d P = f \, d \lambda\). It follows that \(d \mu = (f / F) \, d \lambda = r \, d \lambda\) where \(r\) is the rate function of \(X\) for \((S, \rta)\) with respect to \(\lambda\). So the theorem is hardly surprising: \(X\) has constant rate 1 with respect to the measure \(\mu\) whose density is the rate function of \(X\) with respect to \(\lambda\).
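The change of measure can be checked numerically on a small discrete graph. The following is a minimal sketch using the graph \((\{0, 1, 2\}, \le)\) with counting measure and an arbitrary positive density; both are illustrative choices, not from the text.

```python
from fractions import Fraction

# Illustrative check of the change of measure: on the discrete graph
# ({0, 1, 2}, <=) with counting measure, re-weighting by 1/F gives
# constant rate 1.  The density f below is an arbitrary positive choice.
S = [0, 1, 2]
edge = lambda x, y: x <= y                        # the relation x -> y

f = {0: Fraction(1, 2), 1: Fraction(3, 10), 2: Fraction(1, 5)}

# reliability function F(x) = sum of f(y) over y with x -> y
F = {x: sum(f[y] for y in S if edge(x, y)) for x in S}

# new measure mu{x} = f(x) / F(x), i.e. d mu = (1/F) dP on atoms
mu = {x: f[x] / F[x] for x in S}

# density of X w.r.t. mu is P(X = x) / mu{x} = F(x)
f_mu = {x: f[x] / mu[x] for x in S}

# reliability w.r.t. mu: F_mu(x) = sum over x -> y of f_mu(y) mu{y}
F_mu = {x: sum(f_mu[y] * mu[y] for y in S if edge(x, y)) for x in S}

rate = {x: f_mu[x] / F_mu[x] for x in S}
print(rate)                                       # every rate equals 1
```

Note that \(f_\mu(y) \mu\{y\} = f(y)\), so the reliability function is unchanged while the density becomes \(F\); the rate with respect to \(\mu\) is then identically 1.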
Let \(L\) denote the adjacency kernel of the graph \((S, \rta)\), considered as an operator on the space \(\ms L_1\).
Recall that if \(f\) is a probability density function then the corresponding reliability function for \((S, \rta)\) is \(F = L f\).
Suppose that \((S, \rta)\) is a discrete graph with the property that \(x \rta x\) for some \(x \in S\). If there exists a distribution with constant rate \(\alpha\) then \(\alpha \in (0, 1]\).
Suppose that \(x \rta x\) and that \(X\) has density function \(f\) and has reliability function \(F\) for \((S, \rta)\). If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \[f(x) = \alpha F(x) = \alpha \left[f(x) + \sum_{x \rta y, \, y \ne x} f(y) \right]\] By the support assumption, \(F(x) \gt 0\) and hence \(f(x) \gt 0\). Rearranging the displayed equation we have \[(1 - \alpha) f(x) = \alpha \sum_{x \rta y, \, y \ne x} f(y)\] If \(\alpha \gt 1\) then the left side is negative while the right side is nonnegative, a contradiction. Hence \(\alpha \le 1\).
In particular, this result applies to any discrete reflexive graph, and hence to any discrete partial order graph. The following result shows that mixtures of distributions with the same constant rate also have the common constant rate. From the point of view of linear algebra, this corresponds to an eigenvalue with multiplicity greater than one.
Suppose that \(F\) and \(G\) are reliability functions for distributions supported by \((S, \rta)\), each with constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). For \(p \in (0, 1)\), the function \(H = p F + (1 - p)G\) is also the reliability function for a distribution with constant rate \(\alpha\) for \((S, \rta)\).
Recall that if \(f\) and \(g\) are probability density functions with corresponding reliability functions \(F\) and \(G\) for \((S, \rta)\), respectively, and if \(p \in (0, 1)\), then the density function \(h = p f + (1 - p) g\) has reliability function \(H = p F + (1 - p) G\). In this case, \(f = \alpha F\) and \(g = \alpha G\) so \[h = p f + (1 - p) g = p \alpha F + (1 - p) \alpha G = \alpha [p F + (1 - p) G] = \alpha H\]
The following result gives a simple condition for the uniform distribution to have constant rate. Recall that the graph \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) if \(\lambda\{y \in S: x \rta y\} = k\) for all \(x \in S\).
Suppose \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\). If \(\lambda(S) \lt \infty\) then the uniform distribution on \(S\) has constant rate \(1 / k\).
Suppose that \(\lambda(S) \lt \infty\) and let \(f(x) = 1 / \lambda(S)\) for \(x \in S\) so that \(f\) is the density of the uniform distribution on \(S\). Then the reliability function is \[F(x) = \int_{x \rta y} \frac{1}{\lambda(S)} \, d\lambda(y) = \frac{\lambda\{y \in S: x \rta y\}}{\lambda(S)} = \frac{k}{\lambda(S)}\] and so \(f = \frac{1}{k} F\).
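A minimal sketch of the proposition: the directed graph on \(\Z_5\) with \(x \rta x + 1\) and \(x \rta x + 2\) (mod 5), with counting measure, is right regular of degree \(k = 2\). This graph is an illustrative choice, not from the text.

```python
from fractions import Fraction

# Illustrative check: the graph on Z_5 with x -> x+1 and x -> x+2 (mod 5)
# is right regular of degree k = 2, so with counting measure as reference
# the uniform distribution should have constant rate 1/k = 1/2.
S = range(5)
succ = {x: [(x + 1) % 5, (x + 2) % 5] for x in S}

f = {x: Fraction(1, 5) for x in S}                 # uniform density
F = {x: sum(f[y] for y in succ[x]) for x in S}     # reliability F = L f
rate = {x: f[x] / F[x] for x in S}
print(rate)                                        # every rate equals 1/2
```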
Combining with we see that if \((S, \rta)\) is a finite, strongly connected, right regular graph, then the uniform distribution on \(S\) is the unique constant rate distribution. More generally, if \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) and we know that the eigenspace of the eigenvalue \(1\) has dimension 1, then \((S, \rta)\) has a unique constant rate distribution (the uniform distribution) if \(\lambda(S) \lt \infty\) and has no constant rate distribution if \(\lambda(S) = \infty\).
For the following proposition, recall that if \((S, \preceq)\) is a discrete partial order graph with covering relation \(\upa\), then \(\upa^n\) denotes the \(n\)-fold composition power of \(\upa\) for \(n \in \N\), where by convention, \(\upa^0\) is the equality relation \(=\).
Suppose that \((S, \preceq)\) is a discrete, locally finite, uniform partial order graph with the property that if \(x \upa^n y\) then \(\#\{u \in S: x \upa u, u \upa^{n-1} y\} = a_n \in \N_+\) for \(n \in \N_+\), independent of \(x, \, y \in S\). If \(X\) has constant rate \(\beta \in (0, \infty)\) for \((S, \upa)\) then
Let \(f\) denote the density function of \(X\). Let \(F\) denote the reliability function of \(X\) for \((S, \preceq)\) and for \(n \in \N\), let \(G_n\) denote the reliability function of \(X\) for \((S, \upa^n)\).
Note that the series in is finite if \(\beta \gt 1\).
Suppose that \((S, \rta)\) is a transitive graph, and for \(x \in S\) let \(A_x = \{y \in S: x \rta y\}\) (the right neighbor set of \(x\)). If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then the conditional distribution of \(X\) given \(x \rta X\) has constant rate \(\alpha\) for \((A_x, \rta)\) for each \(x \in S\).
This is an immediate consequence of the result in Section 3 since the rate function of the conditional distribution for \((A_x, \rta)\) is the restriction of the rate function of \(X\) to \(A_x\).
In particular, this proposition holds for a partial order graph \((S, \preceq)\).
Suppose that \((S, \rta)\) is a finite, discrete, strongly connected graph (so a strongly connected graph in the usual combinatorial sense). In this case, constant rate distributions are closely connected to the Perron-Frobenius theory of linear algebra.
The graph \((S, \rta)\) has a unique constant rate distribution. The rate constant \(\alpha\) is the reciprocal of the largest eigenvalue (the Perron eigenvalue) and the probability density function \(f\) is the corresponding normalized positive eigenfunction.
This is just the Perron-Frobenius theorem in disguise. Since the graph is strongly connected, the adjacency matrix \(L\) is nonnegative and irreducible. Hence \(L\) has a positive eigenvalue of multiplicity 1 that is largest in magnitude, with a corresponding (right) eigenfunction that is strictly positive.
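A concrete sketch of the Perron-Frobenius connection: power iteration on the adjacency matrix recovers the constant rate density as the normalized Perron eigenfunction. We use the triangle graph of Section 1 (\(1 \rta 2\), \(2 \rta 1\), \(2 \rta 2\), \(2 \rta 3\), \(3 \rta 1\), \(3 \rta 3\)); the iteration count is an arbitrary choice.

```python
# Power iteration on the adjacency matrix of the triangle graph from
# Section 1.  The graph is strongly connected and aperiodic, so the
# iteration converges to the constant rate density f, and f = alpha F
# with alpha the reciprocal of the Perron eigenvalue.
L = [[0, 1, 0],
     [1, 1, 1],
     [1, 0, 1]]                  # L[x][y] = 1 iff (x + 1) -> (y + 1)

f = [1 / 3, 1 / 3, 1 / 3]        # arbitrary positive starting density
for _ in range(100):
    v = [sum(L[x][y] * f[y] for y in range(3)) for x in range(3)]   # L f
    s = sum(v)
    f = [vx / s for vx in v]     # renormalize to a density

F = [sum(L[x][y] * f[y] for y in range(3)) for x in range(3)]       # F = L f
alpha = f[0] / F[0]              # rate constant = 1 / (Perron eigenvalue)
print(f, alpha)                  # f -> [0.25, 0.5, 0.25], alpha -> 0.5
```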
Moreover, if the graph is aperiodic, then the constant rate distribution is the limit of the right walk distributions. To review, recall that the right walk distribution of order \(n \in \N\) has density function \(d_n\) given by \(d_n(x) = v_n(x) / w_n\) on \(S\) where \(v_n(x) = \#\{(x_1, x_2, \ldots, x_n) \in S^n: x \rta x_1 \rta x_2 \rta \cdots \rta x_n\}\) is the number of walks of length \(n\) starting in \(x \in S\), and where the normalizing constant \(w_n = \#\{(x_1, x_2, \ldots, x_{n + 1}) \in S^{n + 1}: x_1 \rta x_2 \rta \cdots \rta x_{n + 1}\}\) is the total number of walks of length \(n\).
Suppose that the graph \((S, \rta)\) is aperiodic. Let \(\alpha\) denote the rate constant and \(f\) the density function of the unique constant rate distribution. Then
Since the adjacency matrix \(L\) is aperiodic, it is not only irreducible, but also primitive. As in , \(1 / \alpha\) is the Perron eigenvalue (the largest eigenvalue). From the Perron-Frobenius theory for nonnegative primitive matrices, there exists a nonnegative right eigenfunction \(a\) and a nonnegative left eigenfunction \(b\), both corresponding to the eigenvalue \(1 / \alpha\), such that \(\alpha^n L^n(x, y) \to a(x) b(y)\) as \(n \to \infty\) for \((x, y) \in S^2\). So \[\frac{v_n(x)}{w_n} = \frac{\sum_{y \in S} L^n(x, y)}{\sum_{(t, y) \in S^2} L^n(t, y)}, \quad x \in S\] Multiplying by \(\alpha^n\) and passing to the limit gives \[\frac{v_n(x)}{w_n} \to \frac{a(x) \sum_{y \in S} b(y)}{\sum_{t \in S} a(t) \sum_{y \in S} b(y)} = \frac{a(x)}{\sum_{t \in S} a(t)} = f(x) \; \text{ as } n \to \infty \text{ for } x \in S\] The proof of part (b) is similar.
In the context of , the corresponding reliability function \(D_n\) is given by \(D_n(x) = v_{n + 1}(x) / w_n\) and the corresponding rate function is \(d_n(x) / D_n(x) = v_n(x) / v_{n + 1}(x)\). So of course, \(v_n(x) / v_{n + 1}(x) \to \alpha\) as \(n \to \infty\) for \(x \in S\). The theorem does not necessarily hold if \((S, \rta)\) is periodic, but a related result holds in a certain average sense.
Suppose that the graph \((S, \rta)\) is periodic. Let \(\alpha\) denote the rate constant and \(f\) the density function of the unique constant rate distribution. For \(n \in \N\), let \(\hat d_n\) denote the density function on \(S\) defined by \[\hat d_n(x) = \frac{\sum_{j = 0}^n \alpha^j v_j(x)}{\sum_{j = 0}^n \alpha^j w_j}, \quad x \in S\] Then \(\hat d_n(x) \to f(x)\) as \(n \to \infty\) for \(x \in S\).
Clearly \(\hat d_n\) is a probability density function for \(n \in \N\) since the denominator is the normalizing constant for the function defined by the numerator. From the Perron-Frobenius theory for irreducible matrices, there exists a nonnegative right eigenfunction \(a\) and a nonnegative left eigenfunction \(b\), both corresponding to the Perron eigenvalue \(1 / \alpha\), such that \[\frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(x, y) \to a(x) b(y) \text{ as } n \to \infty \text{ for } (x, y) \in S^2 \] So \[\hat d_n(x) = \frac{\sum_{j = 0}^n \alpha^j v_j(x)}{\sum_{j = 0}^n \alpha^j w_j} = \frac{\sum_{j = 0}^n \alpha^j \sum_{y \in S} L^j(x, y)} {\sum_{j = 0}^n \alpha^j \sum_{(t, y) \in S^2} L^j(t, y)} = \frac{\sum_{y \in S} \frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(x, y)}{\sum_{(t, y) \in S^2} \frac{1}{n + 1} \sum_{j = 0}^n \alpha^j L^j(t, y)}\] Passing to the limit we have \[\hat d_n(x) \to \frac{\sum_{y \in S} a(x) b(y)} {\sum_{(t, y) \in S^2} a(t) b(y)} = \frac{a(x)} {\sum_{t \in S} a(t)} = f(x) \text{ as } n \to \infty \text{ for } x \in S\]
In the context of , the density \(\hat d_n\) is a convex combination of \(\{d_0, d_1, \ldots, d_n\}\) for \(n \in \N\). Specifically, \(\hat d_n(x) = \sum_{j = 0}^n p_j d_j(x)\) for \(n \in \N\) and \(x \in S\), where \[p_j = \frac{\alpha^j w_j}{\sum_{k = 0}^n \alpha^k w_k}, \quad j \in \{0, 1, \ldots, n\}\] so that \(j \mapsto p_j\) is a probability density function on \(\{0, 1, \ldots, n\}\) for \(n \in \N\). Theorems and are certainly of theoretical interest, but are not of much practical help. It's usually much easier to find the constant rate density of a strongly connected graph directly than it is to find the walk densities and take the appropriate limits.
The basic moment result in Section 3 simplifies significantly when \(X\) has constant rate. Once again, \(L\) denotes the adjacency kernel of \((S, \rta)\).
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) and that \(g: S \to [0, \infty)\) is measurable. Then \[\E[(g L^n)(X)] = \frac{1}{\alpha^n} \E[g(X)], \quad n \in \N\]
By assumption, \(f = \alpha F\) is a density of \(X\), so \[\int_S (g L^n)(x) F(x) \, d\lambda(x) = \frac{1}{\alpha} \int_S (g L^n)(x) f(x) \, d\lambda(x) = \frac{1}{\alpha} \E[(g L^n)(X)]\] Hence from the basic moment result \[\frac{1}{\alpha} \E[(g L^n)(X)] = \E[(g L^{n+1})(X)]\] The result then follows since \(L^0 = I\), the identity kernel.
The result also holds if \(g: S \to \R\) is measurable and \(\E[|g(X)|] \lt \infty\). Recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \[u_n(x) = \bs 1 L^n(x) = \lambda^n\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}, \quad x \in S\] the measure of the set of walks of length \(n\) that terminate in \(x\).
If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \[\E[u_n(X)] = \frac{1}{\alpha^n}, \quad n \in \N\]
From note that \[\int_S \alpha^n u_n(x) f(x) \, d\lambda(x) = 1\] so it follows that the function \(f_n\) given by \(f_n(x) = \alpha^n u_n(x) f(x)\) for \(x \in S\) is a probability density function. We will return to this point shortly.
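The moment identity \(\E[u_n(X)] = \alpha^{-n}\) can be checked with exact rational arithmetic on the triangle graph of Section 1, whose constant rate density \(f = (1/4, 1/2, 1/4)\) with \(\alpha = 1/2\) is computed later in this section. A sketch:

```python
from fractions import Fraction

# Exact check of E[u_n(X)] = alpha^{-n} on the triangle graph of Section 1,
# whose constant rate density is f = (1/4, 1/2, 1/4) with alpha = 1/2.
L = [[0, 1, 0], [1, 1, 1], [1, 0, 1]]
f = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]
alpha = Fraction(1, 2)

u = [1, 1, 1]                                    # u_0 = 1
for n in range(6):
    moment = sum(u[x] * f[x] for x in range(3))
    assert moment == alpha ** (-n)               # E[u_n(X)] = alpha^{-n}
    u = [sum(u[t] * L[t][x] for t in range(3)) for x in range(3)]  # u L
print("verified for n = 0..5")
```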
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then
Recall that the graph \((S, \rta)\) supports the distribution of \(X\).
Part (a) means that if a graph has a constant rate distribution, then necessarily the graph is left finite. The assumption in part (b) means that \((S, \rta)\) is left regular of degree \(c\). The generating function result also simplifies when the distribution has constant rate.
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then the left moment generating function \(M\) of \(X\) is given by \[M(t) = \frac{\alpha}{\alpha - t}, \quad |t| \lt \alpha\]
Recall that \(M(t) = \E[U(X, t)]\) where \(U(x, t) = \sum_{n = 0}^\infty u_n(x) t^n\) is the left generating function of \((S, \rta)\). Hence using and Fubini's theorem we have \[M(t) = \E[U(X, t)] = \sum_{n = 0}^\infty \E\left[u_n(X)\right] t^n = \sum_{n = 0}^\infty (t / \alpha)^n = \frac{1}{1 - t / \alpha}, \quad |t| \lt \alpha\]
Constant rate distributions maximize entropy among distributions satisfying a certain moment condition. This is an indication that a constant rate distribution governs the most random way to put points in a graph, but we will see a stronger expression of that idea shortly.
Suppose that \(X\) has constant rate and reliability function \(F\) for the graph \((S, \rta)\). Then \(X\) maximizes entropy among all random variables \(Y\) on \(S\) satisfying \[\E\left[\ln F(Y) \right] = \E\left[\ln F(X)\right]\]
Suppose that \(X\) has constant rate \(\alpha\), so that \(f = \alpha F\) is a density of \(X\). Suppose that \(Y\) is a random variable taking values in \(S\), with density function \(g\). From the general entropy inequality in Section 3 in we have \begin{align*} H(Y) &= -\int_S g(x) \ln[g(x)] d \lambda(x) \le -\int_S g(x) \ln[f(x)] d \lambda(x) \\ &= -\int_S g(x) \left(\ln \alpha + \ln[F(x)]\right) d \lambda(x) \\ & = -\ln \alpha - \int_S g(x) \ln[F(x)] d \lambda(x) = -\ln \alpha - \E\left(\ln[F(Y)]\right) \\ & = -\ln \alpha - \E\left(\ln[F(X)]\right) \end{align*} Of course, the entropy of \(X\) achieves the upper bound.
Note that since the reliability function \(F\) typically has an exponential form of some sort, \(\E(\ln[F(Y)])\) often reduces to a natural moment condition.
Results related to the random walk also simplify significantly when the underlying distribution has constant rate. For the next two results, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with random variable \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Recall also that \(u_n\) denotes the walk function of order \(n \in \N\).
The transition density \(P^n\) of order \(n \in \N_+\) for \(\bs Y\) is given by \[P^n(x, y) = \alpha^n \frac{F(y)}{F(x)} L^n(x, y), \quad (x, y) \in S^2\]
Recall from Section 4 that \[P^n(x, y) = \frac{f(y)}{F(x)} R^{(n)}(x, y), \quad (x, y) \in S^2\] where \[R^{(n)}(x, y) = \int_{x \rta x_1 \rta \cdots \rta x_{n - 1} \rta y} r(x_1) \cdots r(x_{n - 1}) \, d\lambda^{n - 1}(x_1, \ldots, x_{n - 1}), \quad (x, y) \in S^2\] Since \(X\) has constant rate \(\alpha\), \(f(x) = \alpha F(x)\) and \(r(x) = \alpha\) for \(x \in S\). Hence \[R^{(n)}(x, y) = \alpha^{n - 1} \lambda^{n - 1}\{(x_1, \ldots, x_{n - 1}) \in S^{n - 1}: x \rta x_1 \rta \cdots \rta x_{n - 1} \rta y\} = \alpha^{n - 1} L^n(x, y)\] and therefore \(P^n(x, y) = \alpha F(y) \alpha^{n - 1} L^n(x, y) \big/ F(x) = \alpha^n [F(y) / F(x)] L^n(x, y)\).
Let \(n \in \N_+\).
The results follow easily from the constant rate property.
For \(n \in \N_+\), the distribution of \(Y_n\) depends only on the rate parameter \(\alpha\), the walk function \(u_{n - 1}\) and the reliability function \(F\). Although a simple corollary, part (c) is one of our fundamental results, so we will restate it for emphasis:
The random walk \(\bs Y\) associated with a constant rate distribution (if such a distribution exists) is the most random way to put points in the graph \((S, \rta)\).
In the special case of a finite, strongly connected graph \((S, \rta)\), it follows from that the normalized eigenfunction corresponding to the largest eigenvalue of the adjacency matrix \(L\) defines a distribution whose random walk is the most random way to put points in the graph. Here is another simple result that follows from our general theory of random walks.
Suppose that \((S, \lfrta)\) is a symmetric graph and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \lfrta)\), with reliability function \(F\). Let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on \((S, \lfrta)\) associated with \(X\). Then
From the general theory in Section 4, \(f F\) is invariant for \(\bs Y\), where \(f\) is the density function of \(X\). Since \(X\) has constant rate \(\alpha\) it follows that \(f = \alpha F\) and hence \(\alpha F^2\) is invariant for \(\bs Y\). But then \(F^2\) is also invariant for \(\bs Y\). The normalizing constant is \[c := \int_S F^2(x) \, d\lambda(x) = \frac{1}{\alpha} \E[F(X)]\] Note that \(c \le 1 / \alpha \lt \infty\) since \(F \le 1\), so \(h = F^2 / c\) is an invariant probability density function for \(\bs Y\).
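As a sketch, consider the symmetric path graph \(1 \lfrta 2 \lfrta 3\) (an illustrative choice, not from the text). One can check directly, as the code does, that \(f \propto (1, \sqrt 2, 1)\) has constant rate \(\alpha = 1/\sqrt 2\); the code then verifies that \(F^2 / c\) is invariant for the associated transition matrix.

```python
import math

# Illustrative check on the symmetric path graph 1 <-> 2 <-> 3: the density
# f proportional to (1, sqrt(2), 1) has constant rate alpha = 1/sqrt(2),
# and h = F^2 / c is invariant for the associated random walk.
L = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
r2 = math.sqrt(2)
f = [v / (2 + r2) for v in (1, r2, 1)]
F = [sum(L[x][y] * f[y] for y in range(3)) for x in range(3)]       # F = L f
alpha = f[0] / F[0]
assert all(abs(f[x] / F[x] - alpha) < 1e-12 for x in range(3))      # constant rate

# one-step transition density P(x, y) = alpha F(y)/F(x) L(x, y)
P = [[alpha * F[y] / F[x] * L[x][y] for y in range(3)] for x in range(3)]

c = sum(Fx ** 2 for Fx in F)
h = [Fx ** 2 / c for Fx in F]                     # candidate invariant density

hP = [sum(h[x] * P[x][y] for x in range(3)) for y in range(3)]
print(max(abs(hP[y] - h[y]) for y in range(3)))   # ~ 0: h is invariant
```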
The next result refers to the point process \(\bs N = \{N_A: A \in \ms S\}\) corresponding to the random walk \(\bs Y\). The expected number of points in a set has a simple expression in terms of the generating function when the underlying distribution has constant rate.
Suppose that \((S, \rta)\) has left generating function \(U\) and that \(X\) has right constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Let \(\bs Y\) denote the random walk on \((S, \rta)\) associated with \(X\). For \(A \in \ms S\), \[\E(N_A) = \E[U(X, \alpha); X \in A]\]
Recall that \(\E(N_A) = \E[V(X); X \in A]\) where \(V(x) = \sum_{n = 0}^\infty R_n(x)\) for \(x \in S\). Since \(X\) has constant rate \(\alpha\) we have \[V(x) = \sum_{n = 0}^\infty R_n(x) = \sum_{n = 0}^\infty \alpha^n u_n(x) = U(x, \alpha), \quad x \in S\]
Our next topic is thinning the point process. As above, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Let \(N\) have the geometric distribution on \(\N_+\) with success parameter \(p \in (0, 1)\) so that \[\P(N = n) = p(1 - p)^{n - 1}, \quad n \in \N_+\] As we will see in Chapter 4, \(N\) has constant rate \(p\) for the graph \((\N_+, \le)\). Moreover, we assume that \(N\) and \(\bs Y\) are independent. The basic idea is that we accept a point with probability \(p\) and reject the point with probability \(1 - p\), so that \(Y_N\) is the first point accepted. We are interested in the distribution of \(Y_N\). As before, let \(U\) denote the walk generating function of \((S, \rta)\).
The density function \(g\) of \(Y_N\) is given by \[g(x) = p U[x, \alpha (1 - p)] f(x), \quad x \in S\]
As usual, let \(u_n\) denote the left walk function of \((S, \rta)\) of order \(n \in \N\), and let \(f_n\) denote the density function of \(Y_n\), so that \(f_n(x) = \alpha^n u_{n - 1}(x) F(x)\) for \(x \in S\). Then \begin{align*} g(x) &= \E[f_N(x)] = \sum_{n = 1}^\infty p (1 - p)^{n - 1} f_n(x) = \sum_{n = 1}^\infty p (1 - p)^{n - 1} \alpha^n u_{n - 1}(x) F(x) \\ &= \alpha p F(x) \sum_{n = 1}^\infty [\alpha (1 - p)]^{n - 1} u_{n - 1}(x) = \alpha p F(x) U[x, \alpha (1 - p)] \\ &= p U[x, \alpha (1 - p)] f(x), \quad x \in S \end{align*}
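A numerical sketch of the thinning formula on the symmetric path graph \(1 \lfrta 2 \lfrta 3\) (an illustrative choice, as are \(p = 1/2\) and the truncation point): we approximate \(U[x, \alpha(1 - p)]\) by a truncated series and check that \(g\) is a probability density.

```python
import math

# Thinning sketch: path graph 1 <-> 2 <-> 3, constant rate alpha = 1/sqrt(2),
# f proportional to (1, sqrt(2), 1), acceptance probability p = 1/2.  We
# compute g(x) = p U[x, alpha(1-p)] f(x) by truncating the walk generating
# function U(x, t) = sum_n t^n u_n(x) and check that g sums to 1.
L = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
r2 = math.sqrt(2)
f = [v / (2 + r2) for v in (1, r2, 1)]
alpha, p = 1 / r2, 0.5
t = alpha * (1 - p)               # t * (spectral radius) = 1 - p < 1: converges

F = [sum(L[x][y] * f[y] for y in range(3)) for x in range(3)]
assert all(abs(f[x] / F[x] - alpha) < 1e-12 for x in range(3))  # constant rate

U = [0.0, 0.0, 0.0]
u = [1.0, 1.0, 1.0]               # u_0 = 1; then u_n = u_{n-1} L
for n in range(200):
    for x in range(3):
        U[x] += t ** n * u[x]
    u = [sum(u[s] * L[s][x] for s in range(3)) for x in range(3)]

g = [p * U[x] * f[x] for x in range(3)]
print(sum(g))                     # ~ 1.0: g is a probability density
```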
In general, \(Y_N\) does not have constant rate for \((S, \rta)\).
Trivially, a finite discrete graph that is not strongly connected may not have a constant rate distribution.
Let \(S = \{0, 1\}\) and let \(\rta\) denote the relation on \(S\) defined by \(0 \rta 0\), \(0 \rta 1\) and \(1 \rta 1\). The graph \((S, \rta)\) does not support a constant rate distribution.
If \(f\) is a probability density function on \(S\) then the corresponding reliability function \(F\) for \((S, \rta)\) is given by \(F(0) = f(0) + f(1)\) and \(F(1) = f(1)\). So if \(f(0) \gt 0\) and \(f(1) \gt 0\) then no \(\alpha \in (0, \infty)\) satisfies \(f = \alpha F\). Note that \((S, \rta)\) is not strongly connected.
For the standard continuous graph \(([0, \infty), \le)\), recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \(u_n(x) = x^n / n!\) for \(x \in [0, \infty)\) and the walk generating function \(U\) is given by \(U(x, t) = e^{x t}\) for \(x \in [0, \infty)\) and \(t \in \R\).
Of course, all of this is well known. The random walk \(\bs Y\) is the sequence of arrival times of the Poisson process with rate \(\alpha\). The standard continuous graph is studied in detail in Chapter 3.
For the standard discrete graph \((\N, \le)\) recall that the left walk function \(u_n\) of order \(n \in \N\) is given by \(u_n(x) = \binom{x + n}{n}\) for \(x \in \N\) and the walk generating function \(U\) is given by \(U(x, t) = 1 / (1 - t)^{x+1}\) for \(x \in \N\) and \(t \in (-1, 1)\).
These results are also well known. For \(n \in \N_+\), random variable \(Y_n\) is the number of failures before the \(n\)th success in a Bernoulli trials sequence with success parameter \(\alpha\), so \(\bs Y\) can be thought of as the sequence of arrival times in a renewal process. The standard discrete graph is studied in detail in Chapter 4.
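For the standard discrete graph, the geometric density \(f(x) = \alpha (1 - \alpha)^x\) has reliability function \(F(x) = (1 - \alpha)^x\), hence constant rate \(\alpha\), and the moment identity \(\E[u_n(X)] = \alpha^{-n}\) with \(u_n(x) = \binom{x + n}{n}\) can be checked by truncating the series. The value \(\alpha = 1/3\) and the truncation point below are arbitrary choices.

```python
import math

# Sketch on the standard discrete graph (N, <=): the geometric density
# f(x) = alpha (1 - alpha)^x has constant rate alpha, and with
# u_n(x) = C(x + n, n) we expect E[u_n(X)] = alpha^{-n}.
alpha = 1 / 3
f = lambda x: alpha * (1 - alpha) ** x

moments = [sum(math.comb(x + n, n) * f(x) for x in range(2000))
           for n in range(5)]
print(moments)                    # close to [1, 3, 9, 27, 81]
```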
Return to the general setting of a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\).
Let \((S, \equiv)\) denote the complete reflexive graph, so that \(x \equiv y\) for every \(x, \, y \in S\).
Recall that \((S, \ne)\) is the complete irreflexive graph. Suppose that \(\lambda\{x\} = c \in [0, \infty)\) for all \(x \in S\).
Suppose that \(f\) is a probability density function on \(S\). The corresponding reliability function \(F\) is given by \[F(x) = \int_{S - \{x\}} f(y) \, d\lambda(y) = 1 - \int_{\{x\}} f(y) \, d\lambda(y) = 1 - \lambda\{x\} f(x) = 1 - c f(x) \] So the condition for the distribution defined by \(f\) to have constant rate \(\alpha \in (0, \infty)\) is \(f(x) = \alpha [1 - c f(x)]\) for \(x \in S\) or equivalently, \(f(x) = \alpha / (1 + \alpha c)\). If \(\lambda(S) \lt \infty\) then \(\alpha = 1 / [\lambda(S) - c]\) and \(f\) is the density of the uniform distribution on \(S\). If \(\lambda(S) = \infty\) then no choice of \(\alpha \in (0, \infty)\) makes \(f\) a valid density function. Note also that \((S, \ne)\) is regular of degree \(\lambda(S) - c\), so applies.
In the context of , if \(S\) is uncountable and the reference measure \(\lambda\) is continuous, then \(c = 0\) and the rate constant in part (a) is \(1 / \lambda(S)\), just as for the complete reflexive graph in . If \(S\) is countable with counting measure \(\lambda = \#\), then \(c = 1\). If \(\#(S) = n \in \{2, 3, \ldots\}\), then the rate constant of the uniform distribution in part (a) is \(1 / (n - 1)\).
Suppose that \(S\) is countable, with counting measure \(\#\) as the reference measure. For the equality graph \((S, =)\), every distribution has constant rate 1.
If \(f\) is a probability density function on \(S\), then trivially, the corresponding reliability function for \((S, =)\) is \(F = f\).
Recall the diamond graph \((S, \lfrta)\) defined in Section 1.
The app below is a simulation of the constant rate distribution for the diamond graph.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the diamond graph \((S, \lfrta)\) corresponding to the constant rate distribution.
The app below is a simulation of the random walk on the diamond graph.
Verify directly the result of theorem for the diamond graph.
As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(1) = d_n(4) & = \frac{\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 5\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 5\right)} {4 \left[\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 4\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 4\right)\right]} \\ d_n(2) = d_n(3) & = \frac{\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 3\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 3\right)} {4 \left[\left(1 - \sqrt{17}\right)^n \left(\sqrt{17} - 4\right) + \left(1 + \sqrt{17}\right)^n \left(\sqrt{17} + 4\right)\right]} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(1) & = \lim_{n \to \infty} d_n(4) = \frac{\sqrt{17} - 3}{4} \\ \lim_{n \to \infty} d_n(2) & = \lim_{n \to \infty} d_n(3) = \frac{5 - \sqrt{17}}{4} \end{align*} which agrees with the density function of the constant rate distribution in .
Recall the directed version of the diamond graph \((S, \rta)\) described in Section 1.
The app below is a simulation of the constant rate distribution on the directed diamond graph.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the directed diamond graph \((S, \rta)\) corresponding to the constant rate distribution.
The app below is a simulation of the random walk on the directed diamond graph.
For the directed diamond graph,
Recall the bull graph \((S, \lfrta)\) defined in Section 1.
The app below is a simulation of the constant rate distribution on the bull graph.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the bull graph \((S, \lfrta)\) corresponding to the constant rate distribution.
The app below is a simulation of the random walk on the bull graph.
Verify directly the result of theorem for the bull graph.
As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(1) & = \frac{2 \left[\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 2\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 2\right)\right]} {\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)} \\ d_n(2) = d_n(3) & = \frac{3 \left[\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 5\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 5\right)\right]} {2 \left[\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)\right]} \\ d_n(4) = d_n(5) & = \frac{\left(1 - \sqrt{13}\right)^n \left(\sqrt{13} - 2\right) + \left(1 + \sqrt{13}\right)^n \left(\sqrt{13} + 2\right)} {\left(1 - \sqrt{13}\right)^n \left(7 \sqrt{13} - 23\right) + \left(1 + \sqrt{13}\right)^n \left(7 \sqrt{13} + 23\right)} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(1) & = \frac{5 - \sqrt{13}}{6} \\ \lim_{n \to \infty} d_n(2) = \lim_{n \to \infty} d_n(3) & = \frac{\sqrt{13} - 2}{6} \\ \lim_{n \to \infty} d_n(4) = \lim_{n \to \infty} d_n(5) & = \frac{5 - \sqrt{13}}{12} \end{align*} which agrees with the constant rate density in .
Verify directly the result of theorem for the triangle graph.
As shown in Section 3, the right walk density functions are \(d_0 = (1 / 3, 1 / 3, 1 / 3)\), \(d_1 = (1 / 6, 1 / 2, 1 / 3)\) and \(d_n = (1 / 4, 1 / 2, 1 / 4) = f\) for \(n \in \{2, 3, \ldots\}\).
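These walk densities can be recomputed directly from \(v_n = L v_{n - 1}\) (with \(v_0 = \bs 1\)) using the triangle graph's adjacency matrix; a sketch in exact arithmetic:

```python
from fractions import Fraction

# Compute the right walk densities d_n for the triangle graph of Section 1
# from v_n = L v_{n-1}, v_0 = 1, confirming d_0 = (1/3, 1/3, 1/3),
# d_1 = (1/6, 1/2, 1/3), and d_n = (1/4, 1/2, 1/4) for n >= 2.
L = [[0, 1, 0], [1, 1, 1], [1, 0, 1]]
v = [1, 1, 1]                                    # v_0: one walk of length 0 each
for n in range(6):
    w = sum(v)                                   # total number of walks of length n
    d = [Fraction(vx, w) for vx in v]
    print(n, d)
    v = [sum(L[x][y] * v[y] for y in range(3)) for x in range(3)]  # v_{n+1} = L v_n
```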
Recall the triangle graph \((S, \rta)\) defined in Section 1. So \(S = \{1, 2, 3\}\) and \(\rta\) is defined by \(1 \rta 2\), \(2 \rta 1\), \(2 \rta 2\), \(2 \rta 3\), \(3 \rta 1\) and \(3 \rta 3\).
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the triangle graph \((S, \rta)\) corresponding to the constant rate distribution.
For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the star graph of order \(k\) as defined in Section 1. So \(\N_k = \{0, 1, \ldots, k\}\) where \(0\) is the center and \(\{1, 2, \ldots, k\}\) is the set of endpoints.
Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).
Let \(f\) denote the constant rate density function and \(\alpha\) the rate constant. By symmetry, \(f\) is constant on the endpoints, so let \(p = f(0)\) and let \(q\) denote the common value of \(f(x)\) for \(x \in \{1, 2, \ldots, k\}\). The equations that define \(f\) are \(p = \alpha k q\), \(q = \alpha p\) and \(p + k q = 1\). Solving gives \[\alpha = \frac{1}{\sqrt{k}}\] and then \begin{align*} f(0) & = \frac{\sqrt{k}}{k + \sqrt{k}} \\ f(x) & = \frac{1}{k + \sqrt{k}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
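The solution can be checked directly against the defining relation \(f = \alpha F\): the center's neighbors are the \(k\) endpoints, and each endpoint's only neighbor is the center. A sketch over several values of \(k\):

```python
import math

# Verify f = alpha F on the star graph of order k, using the values
# derived above: alpha = 1/sqrt(k), f(0) = sqrt(k)/(k + sqrt(k)),
# f(x) = 1/(k + sqrt(k)) for the endpoints.
for k in range(3, 10):
    rk = math.sqrt(k)
    alpha = 1 / rk
    f0 = rk / (k + rk)              # density at the center
    fx = 1 / (k + rk)               # density at each endpoint

    F0 = k * fx                     # center's neighbors: the k endpoints
    Fx = f0                         # each endpoint's only neighbor: the center

    assert abs(f0 - alpha * F0) < 1e-12
    assert abs(fx - alpha * Fx) < 1e-12
print("constant rate verified for k = 3..9")
```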
The app below is a simulation of the constant rate distribution on the star of order \(k\). The parameter \(k\) can be varied with the scrollbar.
Suppose that \(X\) has the constant rate distribution on the star graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).
The app below is a simulation of the random walk on the star of order \(k\) associated with the constant rate distribution. The parameter \(k\) can be varied with the scrollbar.
For the star graph,
Recall that the star graph is periodic with period 2.
For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the wheel graph of order \(k\) as defined in Section 1. So \(\N_k = \{0, 1, \ldots, k\}\) where \(0\) is the center and \(\{1, 2, \ldots, k\}\) is the set of vertices on the rim.
Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).
As noted in Section 3 the largest eigenvalue of the adjacency matrix \(L\) is \(1 + \sqrt{k + 1}\), and moreover a corresponding eigenfunction \(g\) is given by \(g(0) = \sqrt{k + 1} - 1\) and \(g(x) = 1\) for \(x \in \{1, 2, \ldots, k\}\). Hence the rate constant is \[\alpha = \frac{1}{1 + \sqrt{k + 1}} = \frac{\sqrt{k + 1} - 1}{k}\] The corresponding density function \(f\) is obtained by normalizing \(g\). Since \(c = \sum_{x \in S} g(x) = k - 1 + \sqrt{k + 1}\), \begin{align*} f(0) & = \frac{\sqrt{k + 1} - 1}{k - 1 + \sqrt{k + 1}}\\ f(x) & = \frac{1}{k - 1 + \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
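The eigenfunction claim can be checked by building the wheel's adjacency matrix explicitly (hub \(0\) joined to every rim vertex, rim vertices in a cycle) and testing \(L g = (1 + \sqrt{k + 1}) g\); a sketch:

```python
import math

# Build the adjacency matrix of the wheel graph of order k and check that
# g(0) = sqrt(k+1) - 1, g(x) = 1 on the rim is an eigenfunction for the
# eigenvalue 1 + sqrt(k+1), so that alpha = 1/(1 + sqrt(k+1)).
def check_wheel(k):
    n = k + 1
    L = [[0] * n for _ in range(n)]
    for i in range(1, n):                  # hub <-> each rim vertex
        L[0][i] = L[i][0] = 1
    for i in range(1, n):                  # rim cycle: i <-> next rim vertex
        j = i % k + 1                      # wraps k -> 1
        L[i][j] = L[j][i] = 1
    lam = 1 + math.sqrt(k + 1)
    g = [math.sqrt(k + 1) - 1] + [1.0] * k
    Lg = [sum(L[x][y] * g[y] for y in range(n)) for x in range(n)]
    return max(abs(Lg[x] - lam * g[x]) for x in range(n))

for k in range(3, 8):
    assert check_wheel(k) < 1e-9
print("eigenfunction verified for k = 3..7")
```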
The app below is a simulation of the constant rate distribution on the wheel graph of order \(k\). The parameter \(k\) can be varied with the scrollbar.
Note that when \(k = 3\), the rate constant is \(\alpha = 1 / 3\) and the constant rate density is \(f(x) = 1 / 4\) for \(x \in \N_3\). But we already knew this since \((\N_3, \lfrta)\) is regular with degree \(3\).
Suppose that \(X_k\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\). Show that the distribution of \(X_k / k\) converges to the uniform distribution on \([0, 1]\) as \(k \to \infty\).
Let \(p_k\) denote the common value of \(f_k(j)\) for \(j \in \{1, 2, \ldots, k\}\) as given above. So \(f_k(0) = 1 - k p_k\). Then for \(x \in [0, 1]\), \begin{align*} \P(X_k / k \le x) & = \sum_{j = 0}^k \bs{1}(j / k \le x) f_k(j) = f_k(0) + p_k \sum_{j = 1}^k \bs{1}(j / k \le x) \\ & = f_k(0) + k p_k \frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \end{align*} But as \(k \to \infty\), \(f_k(0) \to 0\), \(k p_k \to 1\), and \(\frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \to \int_0^x 1 dt = x\).
Suppose that \(X\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).
The app below is a simulation of the random walk on the wheel graph of order \(k\). The parameter \(k\) can be varied with the scrollbar.
Verify theorem for the wheel graph of order \(k \in \{3, 4, \ldots\}\).
As shown in Section 3, the right walk density function \(d_n\) of order \(n \in \N\) is given by \begin{align*} d_n(0) & = \frac{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1} + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}}{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1}\left(2 + \sqrt{k + 1}\right) + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}\left(2 - \sqrt{k + 1}\right)} \\ d_n(x) & = \frac{\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^n + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^n}{k \left[\left(1 / 2 + 1 / \sqrt{k + 1}\right) \left(1 + \sqrt{k + 1}\right)^{n - 1}\left(2 + \sqrt{k + 1}\right) + \left(1 / 2 - 1 / \sqrt{k + 1}\right) \left(1 - \sqrt{k + 1}\right)^{n - 1}\left(2 - \sqrt{k + 1}\right)\right]}, \quad x \in \{1, 2, \ldots, k\} \end{align*} Taking limits gives \begin{align*} \lim_{n \to \infty} d_n(0) & = \frac{1}{2 + \sqrt{k + 1}}\\ \lim_{n \to \infty} d_n(x) & = \frac{1 + \sqrt{k + 1}}{k \left(2 + \sqrt{k + 1}\right)}, \quad x \in \{1, 2, \ldots, k\} \end{align*} which (after some algebra) agrees with the constant rate density function in .