In this section, we consider the discrete graph \((\N, \lfrta)\) where \(\lfrta\) is the relation on \(\N\) defined by \(x \lfrta x + 1\) for \(x \in \N\) and \(x \lfrta x - 1\) for \(x \in \N_+\). So \((\N, \lfrta)\) is the undirected simple path \((0, 1, 2, \ldots)\), a basic and important graph. Since the graph is discrete, we have our usual reference space \((\N, \ms P(\N), \#)\) in the background.
The \(\sigma\)-algebra associated with \((\N, \lfrta)\) is \(\ms P(\N)\).
By definition, the \(\sigma\)-algebra associated with the graph is \(\ms A = \sigma\{A_x: x \in \N\}\) where \(A_x = \{y \in \N: x \lfrta y\}\) is the neighbor set of \(x \in \N\). So \(A_x = \{x - 1, x + 1\}\) for \(x \in \N_+\) and \(A_0 = \{1\}\). Hence \(A_x \cap A_{x + 2} = \{x + 1\} \in \ms A\) for \(x \in \N\), so \(\{y\} \in \ms A\) for every \(y \in \N_+\). It then follows that \(\N_+ \in \ms A\) (a countable union of singletons) and hence \(\{0\} = \N \setminus \N_+ \in \ms A\). So \(\ms A\) contains every singleton and therefore \(\ms A = \ms P(\N)\).
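As a quick computational illustration (a minimal Python sketch, not part of the formal development; the helper name `neighbor_set` is our own), the intersections \(A_x \cap A_{x + 2}\) do recover the singletons on any finite range:

```python
# Sketch: the neighbor sets A_x of the path graph generate all singletons.
N = 20

def neighbor_set(x):
    """Neighbor set A_x in (N, <->): A_0 = {1}, A_x = {x - 1, x + 1} for x >= 1."""
    return {x + 1} if x == 0 else {x - 1, x + 1}

# A_x intersect A_{x+2} = {x + 1}, so every singleton {y} with y >= 1 is generated.
for x in range(N):
    assert neighbor_set(x) & neighbor_set(x + 2) == {x + 1}
print("singletons {1}, ..., {%d} recovered" % N)
```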
The walk functions for \((\N, \lfrta)\) are given recursively as follows: (a) \(u_0(x) = 1\) for \(x \in \N\); (b) \(u_{n + 1}(0) = u_n(1)\) for \(n \in \N\); (c) \(u_{n + 1}(x) = u_n(x - 1) + u_n(x + 1)\) for \(n \in \N\) and \(x \in \N_+\).
The generating function \(U(x, t)\) for \((\N, \lfrta)\) satisfies
\begin{align*}
U(0, t) &= 1 + t U(1, t), \quad |t| \lt \frac 1 2 \\
U(x, t) &= 1 + t U(x + 1, t) + t U(x - 1, t), \quad x \in \N_+, \, |t| \lt \frac 1 2
\end{align*}
First, it's easy to see that for \(x \in \N\), the generating function converges for \(|t| \lt \frac 1 2\), since \(u_n(x) \le 2^n\) for \(n \in \N\) (every vertex has at most two neighbors). From part (b) of the proposition above we have \(t^{n + 1} u_n(1) = t^{n + 1} u_{n + 1}(0)\) for \(|t| \lt \frac 1 2\) and \(n \in \N\). Summing over \(n \in \N\) gives
\[t U(1, t) = U(0, t) - u_0(0) = U(0, t) - 1\]
From part (c) of the proposition above we have
\[t^{n + 1} u_n(x + 1) = t^{n + 1} u_{n + 1}(x) - t^{n + 1} u_n(x - 1), \quad n \in \N, \, x \in \N_+, \, |t| \lt \frac 1 2\]
Summing over \(n \in \N\) gives
\[t U(x + 1, t) = U(x, t) - u_0(x) - t U(x - 1, t) = U(x, t) - 1 - t U(x - 1, t), \quad x \in \N_+, \, |t| \lt \frac 1 2 \]
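As a numerical check on the proposition, the following minimal Python sketch (ours; the truncation levels `NMAX`, `XMAX` and the test point \(t = 0.3\) are arbitrary choices) computes the walk functions from the recursion and verifies the generating-function identities:

```python
# Sketch: walk functions u_n(x) via u_0 = 1, u_{n+1}(0) = u_n(1),
# u_{n+1}(x) = u_n(x - 1) + u_n(x + 1); then check the identities
# U(0, t) = 1 + t U(1, t) and U(x, t) = 1 + t U(x + 1, t) + t U(x - 1, t).
NMAX, XMAX = 60, 5
L = XMAX + NMAX + 2        # enough room that truncation never reaches x <= XMAX + 1
u = [[1] * L]              # u[0][x] = u_0(x) = 1
for n in range(NMAX):
    prev = u[-1]
    nxt = [prev[1]] + [prev[x - 1] + prev[x + 1] for x in range(1, L - 1)] + [0]
    u.append(nxt)          # the dummy 0 at the right edge never affects x <= XMAX + 1

def U(x, t):
    """Truncated generating function: sum of u_n(x) t^n for n <= NMAX."""
    return sum(u[n][x] * t ** n for n in range(NMAX + 1))

t = 0.3                    # any |t| < 1/2 works
assert abs(U(0, t) - (1 + t * U(1, t))) < 1e-9
for x in range(1, XMAX + 1):
    assert abs(U(x, t) - (1 + t * U(x + 1, t) + t * U(x - 1, t))) < 1e-9
print("generating function identities verified at t =", t)
```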
For the following result, \(L\) denotes the adjacency kernel of \((\N, \lfrta)\), as usual, and \(P_n\) denotes the polynomial of degree \(n \in \N\) defined in Section 1.
For \(\beta \in \R\), the solution of the equation \(L g = \beta g\) is given by \(g(x) = g(0) P_x(\beta)\) for \(x \in \N\).
For \(\beta \in \R\), the equation \(L g = \beta g\) becomes \begin{align*} g(1) & = \beta g(0) \\ g(x + 1) & = \beta g(x) - g(x - 1), \quad x \in \N_+ \end{align*} Hence \(g(x) = g(0) P_x (\beta)\) for \(x \in \N\) by definition of the polynomials.
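The eigenfunction property is easy to verify numerically. In this minimal Python sketch (ours), the three-term recursion \(P_0(\beta) = 1\), \(P_1(\beta) = \beta\), \(P_{n + 1}(\beta) = \beta P_n(\beta) - P_{n - 1}(\beta)\) is read off from the displayed equations with \(g(0) = 1\):

```python
# Sketch: P_0(b) = 1, P_1(b) = b, P_{n+1}(b) = b P_n(b) - P_{n-1}(b),
# and g(x) = P_x(b) satisfies (L g)(0) = g(1) = b g(0) and
# (L g)(x) = g(x - 1) + g(x + 1) = b g(x) for x >= 1.
def P_values(b, xmax):
    """[P_0(b), ..., P_xmax(b)] from the three-term recursion."""
    vals = [1.0, b]
    for _ in range(xmax - 1):
        vals.append(b * vals[-1] - vals[-2])
    return vals

b = 1.7
g = P_values(b, 50)
assert abs(g[1] - b * g[0]) < 1e-9
for x in range(1, 50):
    assert abs(g[x - 1] + g[x + 1] - b * g[x]) < 1e-9
print("x -> P_x(b) is an eigenfunction for eigenvalue b =", b)
```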
So on the vector space of all functions from \(\N\) to \(\R\), every \(\beta \in \R\) is an eigenvalue of \((\N, \lfrta)\), and a corresponding eigenfunction is given by \(x \mapsto P_x(\beta)\). But usually we are interested in eigenvalues and eigenfunctions of \((\N, \lfrta)\) for the space \(\ms L_1\), where probability density functions live.
Suppose that \(X\) is a random variable in \(\N\) with density function \(f\). The reliability function \(F\) of \(X\) for \((\N, \lfrta)\) is given by \[F(0) = f(1), \quad F(x) = f(x - 1) + f(x + 1), \; x \in \N_+\]
As usual, we have our basic support assumption that \(F(x) \gt 0\) for all \(x \in \N\). So in particular, \(f(1) \gt 0\), and for every \(x \in \N_+\), either \(f(x - 1) \gt 0\) or \(f(x + 1) \gt 0\). The reliability function does in fact determine the distribution, and we can characterize reliability functions:
The graph \((\N, \lfrta)\) is stochastic. Suppose that \(f\) is a probability density function on \(\N\) and that \(F\) is the corresponding reliability function for \((\N, \lfrta)\). Then
(a) \(f(0) = 2 - \sum_{x = 0}^\infty F(x)\)
(b) \(f(2 n + 1) = \sum_{k = 0}^n (-1)^{n - k} F(2 k)\) for \(n \in \N\)
(c) \(f(2 n) = (-1)^n f(0) + \sum_{k = 1}^n (-1)^{n - k} F(2 k - 1)\) for \(n \in \N_+\)
Conversely, if \(F\) is a positive function on \(\N\) such that \(1 \lt \sum_{x = 0}^\infty F(x) \lt 2\) and such that the right side of (b) is positive for \(n \in \N\) and the right side of (c) is positive for \(n \in \N_+\), then \(F\) is the reliability function for \((\N, \lfrta)\) of a distribution on \(\N\) with probability density function given by (a), (b), and (c).
Suppose that \(f\) is a density function on \(\N\) and that \(F\) is the reliability function of \(f\) for \((\N, \lfrta)\). Note that \[\sum_{x = 0}^\infty F(x) = f(1) + \sum_{x = 1}^\infty [f(x - 1) + f(x + 1)] = f(0) + 2 \sum_{x = 1}^\infty f(x) = f(0) + 2[1 - f(0)] = 2 - f(0)\] and hence (a) holds. For the odd-order terms, note first that \(f(1) = F(0)\). Next, \(F(2) = f(1) + f(3)\) so \(f(3) = F(2) - F(0)\). Next, \(F(4) = f(3) + f(5)\) so \[f(5) = F(4) - f(3) = F(4) - F(2) + F(0)\] Continuing in this way gives (b). Similarly, \(F(1) = f(0) + f(2)\), so \(f(2) = F(1) - f(0)\), where \(f(0)\) is given in (a). Next, \(F(3) = f(2) + f(4)\) so \[f(4) = F(3) - f(2) = F(3) - F(1) + f(0)\] Continuing in this way gives (c).
Conversely, suppose that \(F\) is a positive function on \(\N\) such that \(1 \lt \sum_{x=0}^\infty F(x) \lt 2\) and with the right sides of (b) and (c) positive. Define \(f\) by (a), (b), and (c). Then \(f(x) \gt 0\) for \(x \in \N\) and it's easy to see from the definitions that \(F(0) = f(1)\) and \(F(x) = f(x - 1) + f(x + 1)\) for \(x \in \N_+\). So we have \[\sum_{x = 0}^\infty F(x) = f(0) + 2 \sum_{x = 1}^\infty f(x)\] Therefore \[f(0) + \sum_{x = 0}^\infty F(x) = 2 \sum_{x = 0}^\infty f(x)\] But by definition of \(f(0)\), the left side is \(2\). It follows that \(f\) is a probability density function on \(\N\) and \(F\) is the reliability function of \(f\) for \((\N, \lfrta)\).
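Here is a round-trip check of the inversion formulas: a minimal Python sketch (ours; the geometric density with \(p = 0.4\) and the truncation point are arbitrary illustrative choices) computes \(F\) from \(f\) and then recovers \(f\) from \(F\) via (a), (b), and (c):

```python
# Sketch: f -> F -> f round trip on a truncated path, with f geometric.
M, p = 200, 0.4
f = [p * (1 - p) ** x for x in range(M)]
F = [f[1]] + [f[x - 1] + f[x + 1] for x in range(1, M - 1)]

f0 = 2 - sum(F)                                  # (a); truncation error is negligible
f_rec = [f0] * M
for n in range(M // 2 - 1):
    # (b): f(2n + 1) = sum_{k=0}^{n} (-1)^{n-k} F(2k)
    f_rec[2 * n + 1] = sum((-1) ** (n - k) * F[2 * k] for k in range(n + 1))
    if n >= 1:
        # (c): f(2n) = (-1)^n f(0) + sum_{k=1}^{n} (-1)^{n-k} F(2k - 1)
        f_rec[2 * n] = (-1) ** n * f0 + sum(
            (-1) ** (n - k) * F[2 * k - 1] for k in range(1, n + 1))

assert all(abs(f[x] - f_rec[x]) < 1e-8 for x in range(50))
print("f recovered from F via (a), (b), (c)")
```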
Here is our main result:
The graph \((\N, \lfrta)\) does not have a constant rate distribution. However, for each \(p \in (0, 1)\), the geometric distribution on \(\N\) with success parameter \(p\) has constant rate on \(\N_+\), and these are the only such distributions.
Suppose that \(X\) is a random variable with values in \(\N\) having constant rate \(\alpha\). From our support assumptions, \(\P(X = 0) \gt 0\) and \(\P(X \in \N_+) \gt 0\). It follows that \(\E[u(X)] \in (1, 2)\) where as usual, \(u\) is the walk function of order 1. But \(\E[u(X)] = 1 / \alpha\) from the basic moment result in Section 1.5, so \(\alpha \in \left(\frac 1 2, 1\right)\). Suppose next that \(X\) has probability density function \(f\). By the constant rate property, \(L f = \frac 1 \alpha f\), so that \(f(x) = f(0) P_x(1 / \alpha)\) for \(x \in \N\) by the eigenfunction result above. But since \(1 / \alpha \in (1, 2)\), the values \(P_x(1 / \alpha)\) are negative for some \(x \in \N\), so \(f\) cannot be a probability density function.
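The contradiction is easy to see numerically; this minimal Python sketch (ours) uses the same three-term recursion for the polynomials as above:

```python
# Sketch: for a rate a in (1/2, 1), f(x) = f(0) P_x(1/a) must go negative.
def P_values(b, xmax):
    """[P_0(b), ..., P_xmax(b)] from the three-term recursion."""
    vals = [1.0, b]
    for _ in range(xmax - 1):
        vals.append(b * vals[-1] - vals[-2])
    return vals

a = 0.75                         # any candidate rate in (1/2, 1)
vals = P_values(1 / a, 40)
first_neg = next(x for x, v in enumerate(vals) if v < 0)
print("P_x(1/a) first goes negative at x =", first_neg)   # x = 3 when a = 0.75
```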
Returning to the general case, we will consider random walk on \((\N, \lfrta)\) next.
Suppose that \(X\) is a random variable with probability density function \(f\) on \(\N\) and with reliability function \(F\) for \((\N, \lfrta)\). Let \(\bs{Y} = (Y_1, Y_2, \ldots)\) denote the random walk on \((\N, \lfrta)\) associated with \(X\), so that \(Y_1\) has density function \(f\) and \(\bs Y\) has transition kernel \(P\) given by \(P(0, 1) = 1\) and \begin{align*} P(x, x - 1) & = \frac{f(x - 1)}{f(x -1) + f(x + 1)}, \quad x \in \N_+ \\ P(x, x + 1) & = \frac{f(x + 1)}{f(x - 1) + f(x + 1)}, \quad x \in \N_+ \end{align*}
Hence \(\bs Y\) is a birth-death chain on \(\N\) with reflecting boundary point 0.
The random walk \(\bs{Y}\) is positive recurrent and reversible, with invariant probability density function \(g\) given by \(g(0) = f(0) f(1) / c\) and \(g(x) = f(x)[f(x - 1) + f(x + 1)] / c\) for \(x \in \N_+\) where \[c = 2 \sum_{x = 0}^\infty f(x) f(x + 1)\]
This follows from results in Section 1.4. Recall that an invariant function is \( \varphi = f F\). The normalizing constant is \[c = \sum_{x = 0}^\infty \varphi(x) = 2 \sum_{x = 0}^\infty f(x) f(x + 1)\]
Of course, the general theory of birth-death chains would lead to the same conclusions.
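For a concrete check of invariance, the following minimal Python sketch (ours; the geometric choice of \(f\) and the truncation are purely illustrative) verifies that \(\sum_x g(x) P(x, y) = g(y)\) on a truncated path:

```python
# Sketch: verify that g = f F / c is invariant for the random walk kernel P.
M, p = 400, 0.4
f = [p * (1 - p) ** x for x in range(M)]
F = [f[1]] + [f[x - 1] + f[x + 1] for x in range(1, M - 1)]

c = 2 * sum(f[x] * f[x + 1] for x in range(M - 1))
g = [f[0] * f[1] / c] + [f[x] * F[x] / c for x in range(1, M - 1)]

def mass_into(y):
    """sum_x g(x) P(x, y): flow from y + 1 down, plus flow from y - 1 up."""
    down = g[y + 1] * f[y] / F[y + 1]                        # P(y + 1, y)
    if y == 0:
        return down
    up = g[y - 1] * (f[y] / F[y - 1] if y >= 2 else 1.0)     # P(y - 1, y); P(0, 1) = 1
    return down + up

for y in range(100):
    assert abs(mass_into(y) - g[y]) < 1e-12
print("g P = g verified for y = 0, ..., 99")
```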
Suppose that \(X\) has the geometric distribution on \(\N\) with success parameter \(p\) and that \(\bs Y\) is the random walk on \((\N, \lfrta)\) associated with \(X\). Since \(f(x) = p (1 - p)^x\) for \(x \in \N\), the transition kernel above reduces to \(P(0, 1) = 1\) and \[P(x, x - 1) = \frac{1}{1 + (1 - p)^2}, \quad P(x, x + 1) = \frac{(1 - p)^2}{1 + (1 - p)^2}, \quad x \in \N_+\]
The app below simulates the random walk on \((\N, \lfrta)\) corresponding to the geometric distribution with success parameter \(p\), as described above.
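The app itself is not reproduced here; as a minimal stand-in, this Python sketch (ours) simulates the chain and compares the long-run empirical frequencies with the invariant density \(g\) given above:

```python
# Sketch: simulate the geometric random walk on (N, <->) and compare
# empirical occupation frequencies with the invariant density g = f F / c.
import random

p = 0.4
q = 1 - p
up = q ** 2 / (1 + q ** 2)         # P(x, x + 1) = q^2 / (1 + q^2) for x >= 1
c = 2 * p ** 2 * q / (1 - q ** 2)  # c = 2 sum_x f(x) f(x + 1) in closed form

def g(y):
    """Invariant density g(y) = f(y) F(y) / c for the geometric f."""
    fy = p * q ** y
    Fy = p * q if y == 0 else p * q ** (y - 1) + p * q ** (y + 1)
    return fy * Fy / c

random.seed(1)
x, steps, counts = 0, 10 ** 6, {}
for _ in range(steps):
    x = 1 if x == 0 else (x + 1 if random.random() < up else x - 1)  # P(0, 1) = 1
    counts[x] = counts.get(x, 0) + 1

for y in range(5):
    print(y, round(counts.get(y, 0) / steps, 4), round(g(y), 4))
```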
In this sense, the geometric distribution is almost the most random way to put a random point in the infinite path \((\N, \lfrta)\): it has constant rate on \(\N_+\) but not at \(0\).