\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\lfa}{\leftarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\Upa}{\Uparrow}\) \(\newcommand{\lfrta}{\leftrightarrow}\) \(\newcommand{\nea}{\nearrow}\) \(\newcommand{\sea}{\searrow}\) \(\newcommand{\nwa}{\nwarrow}\) \(\newcommand{\swa}{\swarrow}\) \(\newcommand{\Rta}{\Rightarrow}\) \(\newcommand{\bs}{\boldsymbol}\)

3. Probability on Rooted Trees

Preliminaries

As in Section 1, suppose that \((S, \upa)\) is a rooted tree, and let \((S, \preceq)\) denote the corresponding partial order graph, \((S, \prec)\) the corresponding strict partial order graph, and \((S, \Upa)\) the reflexive closure of \((S, \upa)\). If \(X\) is a random variable in \(S\) with density function \(f\), then by definition, the (right) reliability function of \(X\) for one of these graphs is the function that assigns to each vertex the probability of its (right) neighbor set. Recall also that the (right) \(\sigma\)-algebra associated with the graph is the \(\sigma\)-algebra generated by those neighbor sets. Finally, recall that the graph is (right) stochastic if a probability measure on the associated \(\sigma\)-algebra is uniquely determined by its reliability function. We start with the graphs \((S, \preceq)\) and \((S, \Upa)\). In Section 1 we showed that in both cases, the associated \(\sigma\)-algebra is the reference \(\sigma\)-algebra \(\ms P(S)\), the collection of all subsets of \(S\).

The graph \((S, \preceq)\) is stochastic. A function \(F: S \to [0, 1]\) is a reliability function for \((S, \preceq)\) if and only if

  1. \(F(e) = 1\)
  2. \(\sum_{x \upa y} F(y) \le F(x)\) for \(x \in S\).
  3. \(\sum_{e \upa^n x} F(x) \to 0\) as \(n \to \infty\).

In this case, the unique density function is given by \[ f(x) = F(x) - \sum_{x \upa y} F(y), \quad x \in S \]

Details:

The characterization follows from the Möbius inversion formula in Section 1.4, using the Möbius kernel in Section 1, but we will give a direct proof. Suppose first that \(F\) is the reliability function on \((S, \preceq)\) for random variable \(X\) with density function \(f\). Then

  1. \(F(e) = \P(e \preceq X) = 1\)
  2. \(\sum_{x \upa y} F(y) = \P(x \prec X) \le \P(x \preceq X) = F(x)\) for \(x \in S\).
  3. \(\sum_{e \upa^n x} F(x) = \P[d(X) \ge n] \to 0\) as \(n \to \infty\).

and then \[ f(x) = \P(X = x) = \P(X \succeq x) - \P(X \succ x) = \P(X \succeq x) - \sum_{x \upa y} \P(X \succeq y) = F(x) - \sum_{x \upa y} F(y)\] Conversely, suppose that \(F\) satisfies the conditions in the theorem, and let \(f\) be defined by the displayed equation. By (b), \(f(x) \ge 0\) for \(x \in S\). Next, for \(x \in S\), \begin{align*} \sum_{x \preceq y} f(y) &= \sum_{n = 0}^\infty \sum_{x \upa^n y} f(y) = \sum_{n=0}^\infty \sum_{x \upa^n y} \left[F(y) - \sum_{y \upa z} F(z)\right] = \lim_{m \to \infty} \sum_{n = 0}^m \sum_{x \upa^n y} \left[F(y) - \sum_{y \upa z} F(z)\right] \\ &= \lim_{m \to \infty} \left[\sum_{n = 0}^m \sum_{x \upa^n y} F(y) - \sum_{n = 0}^m \sum_{x \upa^n y} \sum_{y \upa z} F(z)\right] = \lim_{m \to \infty} \left[\sum_{n = 0}^m \sum_{x \upa^n y} F(y) - \sum_{n = 0}^m \sum_{x \upa^{n + 1} z} F(z)\right] \\ &= \lim_{m \to \infty} \left[\sum_{n = 0}^m \sum_{x \upa^n y} F(y) - \sum_{n = 1}^{m + 1} \sum_{x \upa^n y} F(y)\right] = \lim_{m \to \infty} \left[F(x) - \sum_{x \upa^{m + 1} y} F(y)\right] = F(x) \end{align*} where the last limit follows from condition (c). Letting \(x = e\) and using condition (a) shows that \(\sum_{x \in S} f(x) = 1\), so that \(f\) is a probability density function. For general \(x \in S\), the displayed computation then shows that \(F\) is the reliability function of \(f\).
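
As a concrete illustration, here is a minimal numerical sketch of the inversion formula on a small finite tree; the tree, the vertex labels, and the density values are hypothetical, chosen only for illustration.

```python
# A minimal sketch: recover the density f from the reliability function F of
# (S, preceq) via f(x) = F(x) - sum over the children y of x of F(y).
# The tree and the density values below are hypothetical, for illustration only.

# children[x] lists the children of x; the root is "e"
children = {
    "e": ["a", "b"],
    "a": ["aa", "ab"],
    "b": ["ba"],
    "aa": [], "ab": [], "ba": [],
}

# an arbitrary probability density function f on the tree
f = {"e": 0.3, "a": 0.2, "b": 0.1, "aa": 0.15, "ab": 0.05, "ba": 0.2}

def descendants(x):
    """All y with x preceq y, that is, x together with its proper descendants."""
    out = [x]
    for c in children[x]:
        out.extend(descendants(c))
    return out

# reliability function for (S, preceq): F(x) = P(X succeq x)
F = {x: sum(f[y] for y in descendants(x)) for x in f}

# recover the density and compare
f_recovered = {x: F[x] - sum(F[y] for y in children[x]) for x in f}
assert all(abs(f_recovered[x] - f[x]) < 1e-12 for x in f)
assert abs(F["e"] - 1) < 1e-12   # condition (a)
```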

The graph \((S, \Upa)\) is stochastic. If \(G\) is the reliability function for \((S, \Upa)\) corresponding to density function \(g\), then \(g\) can be recovered from \(G\) by \[ g(x) = \sum_{n = 0}^\infty (-1)^n \sum_{x \upa^n y} G(y), \quad x \in S\]

Details:

Suppose that \(X\) is a random variable in \(S\) with density \(g\) and with reliability function \(G\) for \((S, \Upa)\). For \(n \in \N\), \[ \sum_{x \upa^n y} G(y) = \sum_{x \upa^n y} \P(X = y \text{ or } y \upa X) = \P[d(x, X) = n] + \P[d(x, X) = n + 1], \quad x \in S\] Hence the partial sum up to \(m \in \N_+\) telescopes: \[\sum_{n = 0}^m (-1)^n \sum_{x \upa^n y} G(y) = \P(X = x) + (-1)^m \P[d(x, X) = m + 1], \quad x \in S\] Letting \(m \to \infty\) gives the result.
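
A similar sketch, on the same hypothetical toy tree and density as before, checks the alternating-sum inversion for \((S, \Upa)\) numerically.

```python
# A minimal check of the alternating-sum inversion for (S, Uparrow), on the same
# hypothetical toy tree and density as in the previous sketch.

children = {
    "e": ["a", "b"],
    "a": ["aa", "ab"],
    "b": ["ba"],
    "aa": [], "ab": [], "ba": [],
}
g = {"e": 0.3, "a": 0.2, "b": 0.1, "aa": 0.15, "ab": 0.05, "ba": 0.2}

# G(x) = P(X = x or x ^ X) = g(x) + sum of g over the children of x
G = {x: g[x] + sum(g[y] for y in children[x]) for x in g}

def generation(x, n):
    """The set of y with x ^^n y: the descendants of x exactly n levels below."""
    nodes = [x]
    for _ in range(n):
        nodes = [c for y in nodes for c in children[y]]
    return nodes

# g(x) = sum_n (-1)^n sum_{x ^^n y} G(y); on a finite tree the outer sum is finite
def g_recovered(x, max_depth=10):
    return sum((-1) ** n * sum(G[y] for y in generation(x, n)) for n in range(max_depth))

assert all(abs(g_recovered(x) - g[x]) < 1e-12 for x in g)
```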

Next, let \(A_x = \{y \in S: x \upa y\}\), the set of children of \(x \in S\) and the neighbor set of \(x\) for the graph \((S, \upa)\). For both the graph \((S, \upa)\) and the graph \((S, \prec)\) we showed in Section 1 that the associated \(\sigma\)-algebra is \(\sigma(\ms A)\) where \(\ms A\) is the collection of subsets that consists of \(\{e\}\) and \(A_x\) for \(x \in S\). This collection of sets partitions \(S\) and so \(\sigma(\ms A)\) consists of all unions of sets in \(\ms A\). Unless \((S, \upa)\) is a directed path, \(\sigma(\ms A)\) is a strict subset of \(\ms P(S)\) and so in general, a reliability function of a distribution on \(\ms P(S)\) does not determine the distribution. Nonetheless, the graphs are stochastic.

The graphs \((S, \upa)\) and \((S, \prec)\) are stochastic.

Details:

Suppose that \(P\) is a probability measure on \(\sigma(\ms A)\). That \((S, \upa)\) is stochastic is trivial, since the reliability function \(F\) of \(P\) for the graph is simply \(F(x) = P(A_x)\) for \(x \in S\). Let \(G\) denote the reliability function of \(P\) for the graph \((S, \prec)\). We simply need to show that \(P(A_x)\) can be written in terms of \(G\) for each \(x \in S\). Let \(B_x = \{y \in S: x \prec y\}\), the neighbor set of \(x\) for the graph \((S, \prec)\), so that \(G(x) = P(B_x)\) for \(x \in S\). Then \(A_x = B_x - \bigcup_{x \upa y} B_y\) for \(x \in S\) and therefore, since the sets in the union are disjoint, \(P(A_x) = G(x) - \sum_{x \upa y}G(y)\) for \(x \in S\).

The standard moment results are next. We start with \((S, \preceq)\), our main graph of interest.

Suppose \(X\) is a random variable in \(S\). Then \[\sum_{x \in S} \binom{n + d(x)}{n} \P(X \succeq x) = \E\left[\binom{n + 1 + d(X)}{n + 1}\right], \quad n \in \N\]

Details:

From the standard moment result recall that \[\sum_{x \in S} u_n(x) F(x) = \E[u_{n + 1}(X)], \quad n \in \N\] where \(u_n\) is the left walk function of order \(n \in \N\) for \((S, \preceq)\) and \(F\) is the reliability function of \(X\) for \((S, \preceq)\). So the result follows from the general result in Section 1.3 and the walk function for \((S, \preceq)\) computed in Section 1.
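
For example, the case \(n = 0\) reduces to \[\sum_{x \in S} \P(X \succeq x) = \E[1 + d(X)]\] since \(\binom{d(x)}{0} = 1\) and \(\binom{1 + d(X)}{1} = 1 + d(X)\): the expected number of ancestors of \(X\) (including \(X\) itself) is \(1 + \E[d(X)]\).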

Give the standard moment result for each of the following graphs:

  1. \((S, \upa)\)
  2. \((S, \Upa)\)
  3. \((S, \prec)\)
Details:

Recall again that \[\sum_{x \in S} u_n(x) F(x) = \E[u_{n + 1}(X)], \quad n \in \N\] where \(u_n\) is the left walk function of order \(n \in \N\) for the graph and \(F\) is the reliability function of \(X\) for the graph.

  1. For the graph \((S, \upa)\), \(u_n(x) = \bs 1[d(x) \ge n]\) and \(F(x) = \P(x \upa X)\) for \(x \in S\) so \[\sum_{d(x) \ge n} \P(x \uparrow X) = \P[d(X) \ge n + 1]\]
  2. For the graph \((S, \Upa)\), \(u_n(x) = \sum_{k = 0}^{d(x)} \binom{n}{k}\) and \(F(x) = \P(x \Upa X) = \P(x = X) + \P(x \uparrow X)\) for \(x \in S\) so \[\sum_{x \in S} \sum_{k = 0}^{d(x)} \binom{n}{k}\P(x \Uparrow X) = \E\left[\sum_{k = 0}^{d(X)} \binom{n + 1}{k}\right]\]
  3. For the graph \((S, \prec)\), \(u_n(x) = \binom{d(x)}{n}\) and \(F(x) = \P(X \succ x)\) for \(x \in S\) so \[\sum_{x \in S} \binom{d(x)}{n} \P(X \succ x) = \E\left[\binom{d(X)}{n + 1}\right]\]
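
For example, setting \(n = 0\) in parts (a) and (c) gives \[\sum_{x \in S} \P(x \upa X) = \P[d(X) \ge 1], \qquad \sum_{x \in S} \P(X \succ x) = \E[d(X)]\] the first because \(X\) has a parent if and only if \(X \ne e\), and the second because \(X\) has exactly \(d(X)\) strict ancestors.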

Constant Rate Distributions

We will identify a class of probability distributions that have constant rate for all of the graphs under discussion, and we will identify the rate constants. A tree with leaves cannot have a constant rate distribution, so we will assume for the rest of this section that \((S, \preceq)\) has no leaves, and hence that \(S\) is countably infinite.

Fix \(\alpha \in (0, 1)\). The following recursive scheme defines a probability density function \(f\) on \(S\):

  1. \(f(e) = \alpha\).
  2. If \(f(x)\) is defined for \(x \in S\), then define \(f(y)\) for \(x \upa y\) arbitrarily, subject to the restrictions
    1. \(f(y) > 0\)
    2. \(\sum_{x \upa y} f(y) = (1 - \alpha) f(x)\)
Details:

Note first that \((S, \preceq)\) is a well-founded partial order graph, and hence the recursive definition makes sense. Since the tree has no leaves, \(\{y \in S: x \upa y\}\) is nonempty for each \(x \in S\), and there is always some way to define \(f(y)\) for \(x \upa y\) so that (a) and (b) are satisfied. In fact, if \(\#\{y \in S: x \upa y\} \ge 2\) then this can be done in infinitely many ways. We show by induction that \[\sum_{e \upa^n x} f(x) = \alpha (1 - \alpha)^n, \quad n \in \N\] First, this holds for \(n = 0\) by definition. If it holds for a given \(n \in \N\), then by construction \[\sum_{e \upa^{n+1} x} f(x) = \sum_{e \upa^n x} \sum_{x \upa y} f(y) = \sum_{e \upa^n x} (1 - \alpha) f(x) = (1 - \alpha) \alpha (1 - \alpha)^n = \alpha (1 - \alpha)^{n + 1}\] So then \[\sum_{x \in S} f(x) = \sum_{n = 0}^\infty \sum_{e \upa^n x} f(x) = \sum_{n = 0}^\infty \alpha(1 - \alpha)^n = 1\]
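
As a concrete illustration, the following is a minimal computational sketch of the construction, assuming for simplicity the complete binary tree (every vertex has exactly two children), truncated at a finite depth; the split of the mass \((1 - \alpha) f(x)\) between the two children of \(x\) is chosen at random, which is one of the infinitely many possible choices. All names in the code are hypothetical.

```python
import random

# A sketch of the recursive construction on the complete binary tree (an
# assumption for illustration), truncated at max_depth. Vertices are tuples of
# bits; the root is the empty tuple. The mass (1 - alpha) * f(x) is split at
# random between the two children of x, subject only to restrictions (a) and (b).

alpha, max_depth = 0.3, 12

f = {(): alpha}                # initial condition: f(e) = alpha
levels = [[()]]
for n in range(max_depth):
    next_level = []
    for x in levels[-1]:
        mass = (1 - alpha) * f[x]          # total mass for the two children of x
        u = random.uniform(0.1, 0.9)       # an arbitrary positive split
        f[x + (0,)] = u * mass
        f[x + (1,)] = (1 - u) * mass
        next_level += [x + (0,), x + (1,)]
    levels.append(next_level)

# check the level sums established by induction above:
# the sum of f(x) over d(x) = n equals alpha * (1 - alpha)^n
for n, level in enumerate(levels):
    assert abs(sum(f[x] for x in level) - alpha * (1 - alpha) ** n) < 1e-12
```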

Suppose that random variable \(X\) in \(S\) has the density function \(f\) with parameter \(\alpha \in (0, 1)\) constructed above. Then

  1. \(X\) has constant rate \(\alpha\) for \((S, \preceq)\).
  2. \(X\) has constant rate \(1 / (1 - \alpha)\) for \((S, \upa)\).
  3. \(X\) has constant rate \(1 / (2 - \alpha)\) for \((S, \Upa)\).
  4. \(X\) has constant rate \(\alpha / (1 - \alpha)\) for \((S, \prec)\).
  5. \(X\) has constant rate \(1 / (1 - \alpha)^n\) for \((S, \upa^n)\) for each \(n \in \N\).
Details:
  1. For \((S, \preceq)\), the reliability function \(F\) is given by \[F(x) = \sum_{x \preceq y} f(y) = \sum_{n=0}^\infty \sum_{x \upa^n y} f(y)\] But by the same induction argument as in the details of the construction, \[\sum_{x \upa^n y} f(y) = f(x) (1 - \alpha)^n\] and hence \(F(x) = \sum_{n = 0}^\infty f(x) (1 - \alpha)^n = f(x) / \alpha\) for \(x \in S\). So \(f = \alpha F\).
  2. For \((S, \uparrow)\), the reliability function \(F\) is given by \(F(x) = \sum_{x \upa y} f(y)\) for \(x \in S\). But by construction, \(F(x) = (1 - \alpha) f(x)\) for \(x \in S\), so \(f = \frac{1}{1 - \alpha} F\).
  3. For \((S, \Upa)\), it follows from (b) and basic results on reflexive closure in Section 1.7 that the distribution has constant rate \[\frac{1 / (1 - \alpha)}{1 + 1 / (1 - \alpha)} = 1 / (2 - \alpha)\]
  4. By (a) and basic results on reflexive closure, the distribution has constant rate \(\alpha / (1 - \alpha)\) for \((S, \prec)\); a direct verification is given below.
  5. This follows from (b).
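
Part (d) can also be verified directly. If \(F\) now denotes the reliability function of \(X\) for \((S, \prec)\), then by the level sums in part (a), \[F(x) = \P(X \succ x) = \sum_{n = 1}^\infty \sum_{x \upa^n y} f(y) = \sum_{n = 1}^\infty f(x) (1 - \alpha)^n = \frac{1 - \alpha}{\alpha} f(x), \quad x \in S\] so that \(f = \frac{\alpha}{1 - \alpha} F\), as claimed.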

Clearly if \(F\) is the reliability function for \((S, \preceq)\) corresponding to a density function \(f\) constructed above, then \(F\) also satisfies the recursive relation, but with the initial condition replaced by \(F(e) = 1\).

Suppose again that \(X\) has density function \(f\) with parameter \(\alpha \in (0, 1)\) constructed above, and let \(N = d(X)\). Then \(N\) has the geometric distribution on \(\N\) with success parameter \(\alpha\).
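
Details:

From the level sums established in the details of the construction, \[\P(N = n) = \P[d(X) = n] = \sum_{e \upa^n x} f(x) = \alpha (1 - \alpha)^n, \quad n \in \N\] which is the geometric density on \(\N\) with success parameter \(\alpha\).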

Recall that the geometric distribution has constant rate for \((\N, \le)\), \((\N, \lt)\), and for the covering graph \((\N, \upa)\). Of course, the last graph is a directed path and hence a rooted tree where each vertex has a single child.

Suppose again that \(X\) is a random variable in \(S\). Then for the graph \((S, \preceq)\), \(X\) has constant rate \(\alpha \in (0, 1)\) if and only if \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N_+\).

Details:

The proof is essentially the same as for \((\N, \le)\). We know that if \(X\) has constant rate \(\alpha\) for a graph then \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N\). Conversely, suppose that \(X\) has constant average rate \(\alpha^n\) of order \(n\) for the graph \((S, \preceq)\), for some \(n \in \N_+\). Let \(r\) and \(R_n\) denote the rate function and the cumulative rate function of order \(n\) of \(X\) for \((S, \preceq)\). By assumption then, \[ R_n(x) = \sum_{x_1 \preceq x_2 \preceq \cdots \preceq x_n \preceq x} r(x_1) r(x_2) \cdots r(x_n) = \alpha^n u_n(x), \quad x \in S \] where, as usual, \(u_n\) is the left walk function of order \(n\). We will show by partial order induction that \(r(x) = \alpha\) for \(x \in S\). First, letting \(x = e\) in the displayed equation gives \(r^n(e) = \alpha^n\) so \(r(e) = \alpha\). Suppose now that \(x \in S_+\) and that \(r(y) = \alpha\) for \(y \prec x\). For \(k \in \{0, 1, \ldots, n\}\) let \(A_k\) denote the set of walks \(\bs x = (x_1, x_2, \ldots, x_n)\) with \(x_1 \preceq x_2 \preceq \cdots \preceq x_n \preceq x\) and with \(x_i \prec x\) for exactly \(k\) indices, so that \(x_i = x\) for the remaining \(n - k\) indices. Let \(x_-\) denote the parent of \(x\). The collection \(\{A_k: k = 0, 1, \ldots, n\}\) partitions the set of walks of length \(n\) terminating in \(x\). Moreover, the elements of \(A_k\) are in one-to-one correspondence with the walks of length \(k\) terminating in \(x_-\), and hence there are \(u_k(x_-)\) such walks. By the induction hypothesis, if \(\bs x \in A_k\) then \(r(x_1) r(x_2) \cdots r(x_n) = \alpha^k r^{n-k}(x)\). Substituting into the displayed equation gives \[\sum_{k = 0}^n u_k(x_-) \alpha^k r^{n - k}(x) = \alpha^n u_n(x) = \sum_{k = 0}^n \alpha ^k \alpha^{n - k} u_k(x_-)\] or equivalently \[\sum_{k = 0}^n \alpha^k u_k(x_-)\left[r^{n-k}(x) - \alpha^{n - k}\right] = 0\] The term with \(k = n\) vanishes, each remaining term has the same sign as \(r(x) - \alpha\), and \(u_0(x_-) = 1\), so it follows that \(r(x) = \alpha\).

Next we consider the random walks on the various graphs associated with the constant rate distributions.

Suppose next that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \preceq)\) associated with \(X\), where \(X\) has density function \(f\) with parameter \(\alpha \in (0, 1)\) constructed above. Then \(Y_n\) has density function \(f_n\) for \(n \in \N_+\) given by \[f_n(x) = \alpha^{n - 1} \binom{n - 1 + d(x)}{d(x)} f(x), \quad x \in S\]

Details:

This follows from the general theory: \(f_n(x) = \alpha^n u_{n - 1}(x) F(x) = \alpha^{n - 1} u_{n - 1}(x) f(x) \) for \(x \in S\).
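
As a check, note that \(f_n\) is indeed a probability density function: by the level sums in the construction, \[\sum_{x \in S} f_n(x) = \alpha^{n - 1} \sum_{m = 0}^\infty \binom{n - 1 + m}{m} \sum_{d(x) = m} f(x) = \alpha^n \sum_{m = 0}^\infty \binom{n - 1 + m}{m} (1 - \alpha)^m = \frac{\alpha^n}{[1 - (1 - \alpha)]^n} = 1\] using the negative binomial series in the last step.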

Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk associated with \(X\) on each of the graphs below, where \(X\) has density function \(f\) with parameter \(\alpha \in (0, 1)\) constructed above. For \(n \in \N_+\), find the density function of \(Y_n\) for each of the following graphs:

  1. \((S, \upa)\)
  2. \((S, \Upa)\)
  3. \((S, \prec)\)
Details:

Once again, the results follow from the general theory: the density function \(f_n\) of \(Y_n\) is given by \(f_n(x) = \beta^n u_{n - 1}(x) F(x) = \beta^{n - 1} u_{n - 1}(x) f(x)\) for \(x \in S\), where \(\beta\) is the rate constant, \(u_{n - 1}\) is the left walk function of order \(n - 1\), and \(F\) is the reliability function of \(X\), each for the graph in question.

  1. For the graph \((S, \upa)\), the rate constant is \(1 / (1 - \alpha)\) and \(u_{n - 1}(x) = \bs 1[d(x) \ge n - 1]\). So \[f_n(x) = \frac{1}{(1 - \alpha)^{n - 1}} f(x), \quad x \in S, \, d(x) \ge n - 1\] Note that \(f_n(x) = 0\) if \(d(x) \lt n - 1\).
  2. For the graph \((S, \Upa)\), the rate constant is \(1 / (2 - \alpha)\) and \(u_{n - 1}(x) = \sum_{k = 0}^{d(x)} \binom{n - 1}{k}\). Hence \[f_n(x) = \frac{1}{(2 - \alpha)^{n - 1}} \left[\sum_{k = 0}^{d(x)} \binom{n - 1}{k}\right] f(x), \quad x \in S\] Note that if \(d(x) \ge n - 1\) then \(f_n(x) = \left(\frac{2}{2 - \alpha}\right)^{n - 1} f(x)\).
  3. For the graph \((S, \prec)\), the rate constant is \(\alpha / (1 - \alpha)\) and \(u_{n - 1}(x) = \binom{d(x)}{n - 1}\). Hence \[f_n(x) = \left(\frac{\alpha}{1 - \alpha}\right)^{n - 1} \binom{d(x)}{n - 1} f(x), \quad x \in S\] Note that \(f_n(x) = 0\) if \(d(x) \lt n - 1\).
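
As a check for part (a), the level sums in the construction give \[\sum_{x \in S} f_n(x) = \frac{1}{(1 - \alpha)^{n - 1}} \sum_{m = n - 1}^\infty \sum_{d(x) = m} f(x) = \frac{1}{(1 - \alpha)^{n - 1}} \sum_{m = n - 1}^\infty \alpha (1 - \alpha)^m = 1\] so \(f_n\) is indeed a probability density function. The densities in parts (b) and (c) can be checked in the same way, using the binomial theorem and the negative binomial series, respectively.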