In this section, we consider two types of graph products. Throughout we have \(\sigma\)-finite measure spaces \((S, \ms S, \mu)\) and \((T, \ms T, \nu)\). The corresponding product measure space \((S \times T, \ms S \times \ms T, \mu \times \nu)\) is also \(\sigma\)-finite. Of course in the discrete case, the sets are countable, the \(\sigma\)-algebras are the power sets, and measures are counting measure.
Suppose that \((S, \rta)\) and \((T, \upa)\) are measurable graphs.
The direct product of \((S, \rta)\) and \((T, \upa)\) is the graph \((S \times T, \nea)\) where \(\nea\) is defined by \[(x, y) \nea (z, w) \text{ if and only if } x \rta z \text{ and } y \upa w\]
We need to show that the appropriate measurability condition holds. Note that the adjacency kernel \(L\) of \((S \times T, \nea)\) is related to the adjacency kernels \(J\) and \(K\) of \((S, \rta)\) and \((T, \upa)\) by \[L[(x, y), (z, w)] = J(x, z) K(y, w), \quad (x, y), \, (z, w) \in S \times T\] so \(L\) is measurable.
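For finite graphs the kernel identity \(L[(x, y), (z, w)] = J(x, z) K(y, w)\) says that the adjacency matrix of the direct product is the Kronecker product of the component adjacency matrices. Here is a minimal sketch with two assumed toy graphs (a strict order on a three-point set and the equality relation on a two-point set); the specific graphs are illustrative choices, not taken from the text.

```python
# Toy directed graph on S = {0, 1, 2}: the strict order relation x < z
S = range(3)
J = [[1 if x < z else 0 for z in S] for x in S]

# Toy directed graph on T = {0, 1}: the equality relation y == w
T = range(2)
K = [[1 if y == w else 0 for w in T] for y in T]

# Adjacency kernel of the direct product on S x T: pointwise product of J and K
L = {((x, y), (z, w)): J[x][z] * K[y][w]
     for x in S for y in T for z in S for w in T}

# Check against the definition: (x, y) -> (z, w) iff x -> z and y -> w
assert all(L[(x, y), (z, w)] == (1 if (x < z and y == w) else 0)
           for x in S for y in T for z in S for w in T)
```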
Of special interest is the case where the graphs \((S, \rta)\) and \((T, \upa)\) are the same, so that the direct product \((S^2, \nea)\) is the direct power. In the discrete case, the direct product of graphs, as we have defined it, is sometimes referred to as the categorical product or the tensor product. The walk functions are related as follows:
Let \(a_n\), \(b_n\) and \(c_n\) denote the left walk function of order \(n \in \N_+\) for the graphs \((S, \rta)\), \((T, \upa)\) and \((S \times T, \nea)\), respectively. Then \(c_n(x, y) = a_n(x) b_n(y)\) for \((x, y) \in S \times T\).
This follows since \(((x_1, y_1), (x_2, y_2), \ldots, (x_{n + 1}, y_{n + 1}))\) is a walk of length \(n\) in \((S \times T, \nea)\) if and only if \((x_1, x_2, \dots, x_{n + 1})\) and \((y_1, y_2, \dots, y_{n + 1})\) are walks of length \(n\) in \((S, \rta)\) and \((T, \upa)\) respectively.
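The walk-counting argument can be checked numerically for finite graphs: since the product adjacency matrix is the Kronecker product \(J \otimes K\), we have \(L^n = J^n \otimes K^n\), and summing columns gives \(c_n(x, y) = a_n(x) b_n(y)\). A sketch with two assumed toy graphs (a directed 3-cycle and a 2-cycle); the convention that the left walk function counts walks ending at a vertex is taken from the order-1 case used later in this section.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(A, n):
    R = [[1 if i == j else 0 for j in range(len(A))] for i in range(len(A))]
    for _ in range(n):
        R = matmul(R, A)
    return R

def left_walks(A, n):
    # number of walks of length n ending at each vertex: column sums of A^n
    An = matpow(A, n)
    return [sum(row[j] for row in An) for j in range(len(A))]

J = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]   # directed 3-cycle on S = {0, 1, 2}
K = [[0, 1], [1, 0]]                    # directed 2-cycle on T = {0, 1}

# Adjacency matrix of the direct product, indexed by pairs (x, y)
pairs = [(x, y) for x in range(3) for y in range(2)]
L = [[J[x][z] * K[y][w] for (z, w) in pairs] for (x, y) in pairs]

n = 4
a, b, c = left_walks(J, n), left_walks(K, n), left_walks(L, n)
assert all(c[i] == a[x] * b[y] for i, (x, y) in enumerate(pairs))
```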
Note however that the left walk generating function of \((S \times T, \nea)\) has no simple representation in terms of the walk generating functions of \((S, \rta)\) and \((T, \upa)\). The generating functions are the mappings \begin{align*} [(x, y), t] &\mapsto \sum_{n = 0}^\infty c_n(x, y) t^n = \sum_{n = 0}^\infty a_n(x) b_n(y) t^n\\ (x, t) &\mapsto \sum_{n = 0}^\infty a_n(x) t^n, \; (y, t) \mapsto \sum_{n = 0}^\infty b_n(y) t^n \end{align*}
Let \(\ms A\), \(\ms B\), and \(\ms C\) denote the right \(\sigma\)-algebras associated with the graphs \((S, \rta)\), \((T, \upa)\), and \((S \times T, \nea)\), respectively. Then \(\ms C \subseteq \ms A \times \ms B\).
The right neighbor set of \(x \in S\) for \((S, \rta)\) is \(A_x = \{z \in S: x \rta z\}\). The right neighbor set of \(y \in T\) for \((T, \upa)\) is \(B_y = \{w \in T: y \upa w\}\). Hence the right neighbor set of \((x, y) \in S \times T\) for \((S \times T, \nea)\) is \(A_x \times B_y\): \[ \{(z, w) \in S \times T: (x, y) \nea (z, w)\} = \{(z, w) \in S \times T: x \rta z, y \upa w\} = A_x \times B_y\] By definition, \(\ms A = \sigma\{A_x: x \in S\}\), \(\ms B = \sigma\{B_y: y \in T\}\) and \(\ms C = \sigma\{A_x \times B_y: (x, y) \in S \times T\}\). So \[\ms C \subseteq \ms A \times \ms B = \sigma\{A \times B: A \in \ms A, B \in \ms B\}\]
In general, equality does not hold in .
The direct product is the most common way of forming a partial order on a product space from partial orders on the component spaces.
If \((S, \preceq_1)\) and \((T, \preceq_2)\) are partial order graphs then the direct product \((S \times T, \preceq)\) is also a partial order graph.
If \((x, y) \in S \times T\) then \(x \preceq_1 x\) and \(y \preceq_2 y\) so \((x, y) \preceq (x, y)\). Hence \(\preceq\) is reflexive. Suppose \((u, v), \, (x, y) \in S \times T\) and that \((u, v) \preceq (x, y)\) and \((x, y) \preceq (u, v)\). Then \(u \preceq_1 x\), \(x \preceq_1 u\), \(v \preceq_2 y\), and \(y \preceq_2 v\). Hence \(u = x\) and \(v = y\) so \((u, v) = (x, y)\). Thus \(\preceq\) is antisymmetric. Finally, suppose that \((u, v), \, (x, y), \, (z, w) \in S \times T\) and that \((u, v) \preceq (x, y)\) and \((x, y) \preceq (z, w)\). Then \(u \preceq_1 x\), \(x \preceq_1 z\), \(v \preceq_2 y\), and \(y \preceq_2 w\). Hence \(u \preceq_1 z\) and \(v \preceq_2 w\) so \((u, v) \preceq (z, w)\). Thus \(\preceq\) is transitive.
Suppose again that \((S, \preceq_1)\) and \((T, \preceq_2)\) are partial order graphs, with product graph \((S \times T, \preceq)\).
We return to the general case and consider probability distributions next. The following simple results relate the reliability functions.
Suppose that \(X\) and \(Y\) are random variables in \(S\) and \(T\), and let \(F\), \(G\), and \(H\) denote the reliability functions of \(X\), \(Y\), and \((X, Y)\) for the graphs \((S, \rta)\), \((T, \upa)\) and \((S \times T, \nea)\), respectively.
The proofs are trivial.
Note that \(e\) in part (a) and \(\epsilon\) in part (b) are not necessarily unique unless the relations are antisymmetric. But in particular, (a) holds for a partial order graph \((S, \preceq_1)\) with minimum element \(e \in S\) and (b) holds for a partial order graph \((T, \preceq_2)\) with minimum element \(\epsilon\), and these are among our most important special cases.
Random variables \(X\) and \(Y\) are right independent for the graphs \((S, \rta)\) and \((T, \upa)\) if the events \(\{x \rta X\}\) and \(\{y \upa Y\}\) are independent for every \(x \in S\) and \(y \in T\).
Of course, if \(X\) and \(Y\) are independent, then they are right independent. The converse is not generally true. The following result is trivial.
Random variables \(X\) and \(Y\) are right independent for the graphs \((S, \rta)\) and \((T, \upa)\) respectively if and only if \[H(x, y) = F(x) G(y), \quad (x, y) \in S \times T\]
By definition, \(F(x) = \P(x \rta X)\) for \(x \in S\), \(G(y) = \P(y \upa Y)\) for \(y \in T\), and \[H(x, y) = \P[(x, y) \nea (X, Y)] = \P(x \rta X, y \upa Y), \quad (x, y) \in S \times T \]
So right independence implies full independence in spaces where the reliability function completely determines the distribution.
Suppose that \(X\) and \(Y\) are independent random variables in \(S\) and \(T\), respectively. Suppose also that \(X\) has rate function \(p\) for \((S, \rta)\) and that \(Y\) has rate function \(q\) for \((T, \upa)\). Then \((X, Y)\) has rate function \(r\) for \((S \times T, \nea)\) where \[r(x, y) = p(x) q(y), \quad (x, y) \in S \times T\] In particular, if \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) and \(Y\) has constant rate \(\beta \in (0, \infty)\) for \((T, \upa)\) then \((X, Y)\) has constant rate \(\alpha \beta\) for \((S \times T, \nea)\).
By assumption, \(X\) and \(Y\) have density functions \(f\) and \(g\) and then \(p = f / F\) and \(q = g / G\) where as before, \(F\) and \(G\) are the reliability functions of \(X\) and \(Y\) for the graphs \((S, \rta)\) and \((T, \upa)\), respectively. By independence, a density function \(h\) for \((X, Y)\) is given by \(h(x, y) = f(x) g(y)\) for \((x, y) \in S \times T\), and by , \((X, Y)\) has reliability function \(H\) for \((S \times T, \nea)\) where \(H(x, y) = F(x) G(y)\) for \((x, y) \in S \times T\). Hence \((X, Y)\) has rate function \(r\) given by \[r(x, y) = \frac{h(x, y)}{H(x, y)} = \frac{f(x) g(y)}{F(x) G(y)} = p(x) q(y), \quad (x, y) \in S \times T\]
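A numeric sketch of the rate-product result, under assumptions of my own choosing: \(X\) and \(Y\) are independent geometric variables, each of which has constant rate for the graph \((\N, \le)\) (rate \(p\) and \(q\) respectively, since \(f(n) = p(1-p)^n\) and \(F(n) = (1-p)^n\)). The check confirms that \((X, Y)\) has constant rate \(p q\) for the direct product on \(\N^2\).

```python
p, q = 0.3, 0.5

def f(n): return p * (1 - p) ** n   # density of X (geometric)
def g(n): return q * (1 - q) ** n   # density of Y (geometric)
def F(n): return (1 - p) ** n       # reliability P(n <= X) for (N, <=)
def G(n): return (1 - q) ** n       # reliability P(n <= Y) for (N, <=)

# direct product graph on N^2: (x, y) -> (z, w) iff x <= z and y <= w
for x in range(10):
    for y in range(10):
        h = f(x) * g(y)             # joint density, by independence
        H = F(x) * G(y)             # joint reliability, by right independence
        assert abs(h / H - p * q) < 1e-12   # constant rate p * q
```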
Here is a converse that relates to constant rate distributions, our most important topic.
Suppose again that \(X\) and \(Y\) are random variables in \(S\) and \(T\). If \((X, Y)\) has constant rate \(\delta \in (0, \infty)\) for the graph \((S \times T, \nea)\) and if \(X\) and \(Y\) are right independent for \((S, \rta)\) and \((T, \upa)\), then \(X\) and \(Y\) are independent, and \(X\) has constant rate \(\alpha\) for \((S, \rta)\) and \(Y\) has constant rate \(\beta\) for \((T, \upa)\) for some \(\alpha, \, \beta \in (0, \infty)\) with \(\alpha \beta = \delta\).
Let \(F\), \(G\), and \(H\) denote the reliability functions of \(X\), \(Y\), and \((X, Y)\) for the graphs \((S, \rta)\), \((T, \upa)\) and \((S \times T, \nea)\), respectively. Then from the constant rate property and , \((X, Y)\) has density function \(h\) given by \[h(x, y) = \delta H(x, y) = \delta F(x) G(y), \quad x \in S, \, y \in T\] By the factorization theorem, it follows that \(X\) and \(Y\) are independent, and that there are density functions \(f\) and \(g\) of \(X\) and \(Y\) and positive constants \(\alpha\) and \(\beta\) with \(\alpha \beta = \delta\) such that \(f(x) = \alpha F(x)\) for \(x \in S\) and \(g(y) = \beta G(y)\) for \(y \in T\).
The following result is useful in the study of multivariate reliability.
Suppose that \((S, \preceq_0)\) is a lower semi-lattice and let \((S^2, \preceq)\) denote the direct power of order 2. Suppose also that \(X\) and \(Y\) are random variables in \(S\) and let \(H\) denote the reliability function of \((X, Y)\) for \((S^2, \preceq)\). The reliability function \(\Phi\) of \(X \wedge Y\) for \((S, \preceq_0)\) is given by \[\Phi(x) = H(x, x), \quad x \in S\]
Recall that \(x \preceq_0 X \wedge Y\) if and only if \(x \preceq_0 X\) and \(x \preceq_0 Y\) so \[ \P(x \preceq_0 X \wedge Y) = \P(x \preceq_0 X, x \preceq_0 Y) = \P[(x, x) \preceq (X, Y)] = H(x, x), \quad x \in S \]
So in the setting of , if \(X\) and \(Y\) are right independent with reliability functions \(F\) and \(G\) for \((S, \preceq_0)\) then \(X \wedge Y\) has reliability function \(\Phi\) given by \(\Phi(x) = F(x) G(x)\) for \(x \in S\).
For the remainder of this subsection, suppose that \((S, \rta)\) is a general graph, as above, but that \((T, \equiv)\) is the complete reflexive graph so that \(v \equiv y\) for every \(v, \, y \in T\). The direct product \((S \times T, \Rta)\) of \((S, \rta)\) and \((T, \equiv)\) is useful for counterexamples, and many of the results above simplify considerably. We will assume that \(c = \nu(T) \in (0, \infty)\) so that \((T, \equiv)\) is finite.
Consider the graphs \((S, \rta)\), \((T, \equiv)\), and \((S \times T, \Rta)\) as above.
Suppose that \((X, Y)\) is a random variable in \(S \times T\).
We will return to this setting later in Section 2.7 on the direct product of semigroups.
Suppose that \((S, \rta)\) and \((T, \upa)\) are discrete, irreflexive graphs, that is, graphs without loops in the ordinary combinatorial sense.
The Cartesian product of \((S, \rta)\) and \((T, \upa)\) is the graph \((S \times T, \sea)\) where \(\sea\) is defined by \[(x, y) \sea (z, w) \text{ if and only if } (x \rta z \text{ and } y = w) \text{ or } (x = z \text{ and } y \upa w)\]
The Cartesian product is a well-known construction in graph theory. The assumption that the graphs are discrete is necessary since in the continuous case, the conditions \(x = z \in S\) and \(y = w \in T\) would have measure 0 and so the definition would be irrelevant. The two propositions below are our main results.
Suppose that \(X\) and \(Y\) are independent random variables in \(S\) and \(T\) with probability density functions \(f\) and \(g\), respectively. Let \(F\) and \(G\) denote the reliability functions of \(X\) and \(Y\) with respect to the graphs \((S, \rta)\) and \((T, \upa)\), respectively. Then \((X, Y)\) has reliability function \(H\) with respect to \((S \times T, \sea)\) given by \[H(x, y) = F(x) g(y) + f(x) G(y), \quad (x, y) \in S \times T\]
By definition of the relation and by independence, \begin{align*} H(x, y) &= \P[(x, y) \sea (X, Y)] = \P(x \rta X, Y = y) + \P(X = x, y \upa Y) \\ &= \P(x \rta X) \P(Y = y) + \P(X = x) \P(y \upa Y) = F(x) g(y) + f(x) G(y), \quad (x, y) \in S \times T \end{align*} We have used the fact that the events \(\{x \rta X, Y = y\}\) and \(\{X = x, y \upa Y\}\) are disjoint, since the graphs are irreflexive.
Suppose again that \(X\) and \(Y\) are independent random variables in \(S\) and \(T\), and that \(X\) has rate function \(p\) for \((S, \rta)\) and that \(Y\) has rate function \(q\) for \((T, \upa)\). Then \((X, Y)\) has rate function \(r\) for \((S \times T, \sea)\) given by \[r(x, y) = \frac{p(x)q(y)}{p(x) + q(y)}, \quad (x, y) \in S \times T\] In particular, if \(X\) has constant rate \(\alpha\) for \((S, \rta)\) and \(Y\) has constant rate \(\beta\) for \((T, \upa)\) then \((X, Y)\) has constant rate \(\alpha \beta / (\alpha + \beta)\) for \((S \times T, \sea)\).
As before, let \(f\), \(g\), and \(h\) denote the density functions of \(X\), \(Y\), and \((X, Y)\), and let \(F\), \(G\), and \(H\) denote the reliability functions of \(X\), \(Y\), and \((X, Y)\) in terms of their respective graphs. Then \[r(x, y) = \frac{h(x, y)}{H(x, y)} = \frac{f(x) g(y)}{F(x) g(y) + f(x) G(y)}\] Dividing the numerator and denominator by \(F(x) G(y)\) gives the result.
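A numeric sketch of the Cartesian-product rate formula, again with assumed geometric examples: for the strict order \((\N, \lt)\) a geometric variable with success parameter \(p\) has reliability \(F(n) = (1-p)^{n+1}\) and hence constant rate \(\alpha = p/(1-p)\). The check confirms that \((X, Y)\) has constant rate \(\alpha \beta / (\alpha + \beta)\) for the Cartesian product.

```python
p, q = 0.3, 0.5
alpha, beta = p / (1 - p), q / (1 - q)   # constant rates for (N, <)

def f(n): return p * (1 - p) ** n        # density of X (geometric)
def g(n): return q * (1 - q) ** n        # density of Y (geometric)
def F(n): return (1 - p) ** (n + 1)      # reliability P(n < X) for (N, <)
def G(n): return (1 - q) ** (n + 1)      # reliability P(n < Y) for (N, <)

for x in range(8):
    for y in range(8):
        H = F(x) * g(y) + f(x) * G(y)    # reliability for the Cartesian product
        r = f(x) * g(y) / H              # rate function of (X, Y)
        assert abs(r - alpha * beta / (alpha + beta)) < 1e-12
```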
In the context and language of spectral graph theory, parts of these results are well known. Suppose again that \((S, \rta)\) and \((T, \upa)\) are discrete, irreflexive graphs. Suppose again also that \(X\) and \(Y\) are independent random variables in \(S\) and \(T\) respectively, and that \(X\) has constant rate \(\alpha\) for \((S, \rta)\) and \(Y\) has constant rate \(\beta\) for \((T, \upa)\). Combining the results in this subsection and the last, we have the interesting result that \((X, Y)\) has constant rate \(\alpha \beta\) for the direct product \((S \times T, \nea)\) and has constant rate \(\alpha \beta / (\alpha + \beta)\) for the Cartesian product \((S \times T, \sea)\).
Suppose now that \((S, \rta)\) is a discrete, irreflexive graph (so a graph without loops in the ordinary combinatorial sense) and that \((T, \upa)\) is a general measurable graph with respect to the \(\sigma\)-finite measure space \((T, \ms T, \nu)\).
The lexicographic product of \((S, \rta)\) and \((T, \upa)\) is the graph \((S \times T, \sea)\) where
\[(x, y) \sea (z, w) \text{ if and only if } x \rta z \text{ or } (x = z \text{ and } y \upa w)\]
The measurability of \((S \times T, \sea)\) is clear since \(\sea\) as a subset of \((S \times T)^2\) can be written as a countable union of measurable sets.
The lexicographic product is a common construction in ordinary combinatorial graph theory. It is also common in the study of partial orders, and it is in this context that the graph gets its name.
Suppose that \((S, \prec_1)\) is a discrete, strict partial order graph.
The proofs are simple from the various cases in the definition.
Returning to the general setting, there is a simple relationship between the walk functions of order 1. The relationship is more complicated for higher orders, but the order 1 relationship is important for the constant rate distributions that we will discuss below.
Let \(a\), \(b\), and \(c\) denote the walk functions of order 1 for \((S, \rta)\), \((T, \upa)\), and \((S \times T, \sea)\), respectively. Then \(c(x, y) = a(x) \nu(T) + b(y)\) for \((x, y) \in S \times T\).
This follows from the definition of the walk function. Let \(\lambda = \# \times \nu\) denote the product measure. \begin{align*} c(x, y) &= \lambda \{(u, v) \in S \times T: (u, v) \sea (x, y)\}\\ &= \lambda \{(u, v) \in S \times T: u \rta x\} + \lambda \{(x, v): v \in T, \, v \upa y\}\\ &= \sum_{u \rta x} \nu(T) + b(y) = a(x) \nu(T) + b(y), \quad (x, y) \in S \times T \end{align*}
So the lexicographic product \((S \times T, \sea)\) is left finite if the graphs \((S, \rta)\) and \((T, \upa)\) are left finite and \(\nu(T) \lt \infty\).
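The order 1 walk-function identity can be checked by direct enumeration when both graphs are finite. A minimal sketch with assumed toy graphs: the strict order on a four-point set \(S\), the ordinary order on a three-point set \(T\), and counting measure as the reference measure on \(T\).

```python
S, T = range(4), range(3)

def lex(u, v, x, y):
    # lexicographic relation: (u, v) related to (x, y) iff u < x, or u = x and v <= y
    return u < x or (u == x and v <= y)

def a(x): return sum(1 for u in S if u < x)    # order 1 walk function on (S, <)
def b(y): return sum(1 for v in T if v <= y)   # order 1 walk function on (T, <=)

def c(x, y):
    # order 1 walk function on the lexicographic product, with counting measure on T
    return sum(1 for u in S for v in T if lex(u, v, x, y))

# c(x, y) = a(x) * nu(T) + b(y), where nu(T) = #(T) here
assert all(c(x, y) == a(x) * len(T) + b(y) for x in S for y in T)
```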
Suppose now that \((X, Y)\) is a random variable in \(S \times T\). Let \(F\) denote the reliability function of \(X\) for \((S, \rta)\), and for \(x \in S\), let \(G(\cdot \mid x)\) denote the conditional reliability function of \(Y\) for \((T, \upa)\), given that \(X = x\). Let \(H\) denote the reliability function of \((X, Y)\) for \((S \times T, \sea)\). Finally, let \(f\) denote the density function of \(X\), so that \(f(x) = \P(X = x)\) for \(x \in S\). In addition to our usual support assumptions, we assume that \(f(x) \gt 0\) for \(x \in S\).
The reliability functions are related as follows: \[H(x, y) = F(x) + f(x) G(y \mid x), \quad x \in S, \, y \in T_x\]
This follows from the definition of the lexicographic relation \(\sea\): \begin{align*} H(x, y) &= \P[(x, y) \sea (X, Y)] = \P(x \rta X) + \P(X = x, y \upa Y) \\ &=\P(x \rta X) + \P(X = x) \P(y \upa Y \mid X = x) = F(x) + f(x) G(y \mid x), \quad (x, y) \in S \times T \end{align*} Note that since the discrete graph \((S, \rta)\) is irreflexive, the events \(\{x \rta X\}\) and \(\{X = x\}\) are disjoint.
The reason for requiring that the graph \((S, \rta)\) be discrete is clear: If \(X\) has a continuous distribution on \(S\) then \(\P(X = x) = 0\) for \(x \in S\) and so the lexicographic relation is irrelevant from a probabilistic viewpoint.
If \(X\) and \(Y\) are independent then \[H(x, y) = F(x) + f(x) G(y), \quad x \in S, \, y \in T\] where \(G\) is the reliability function of \(Y\) for \((T, \upa)\).
The following result on the density function and rate function of \((X, Y)\) is a simple corollary:
Suppose that \(Y\) has conditional density function \(g(\cdot \mid x)\) on \(T\) given \(X = x \in S\).
As usual, we have special interest in constant rate distributions. Recall that the existence of a constant rate distribution for a graph implies that the walk function is finite.
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), and that given \(X = x \in S\), the conditional density function of \(Y\) and the conditional reliability function of \(Y\) for \((T, \upa)\) are related by \[g(y \mid x) = \beta[1 + \alpha G(y \mid x)], \quad y \in T\] for some \(\beta \in (0, \infty)\). Then \((X, Y)\) has constant rate \(\alpha \beta\) for \((S \times T, \sea)\).
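A sketch of why this works: since \(X\) has constant rate \(\alpha\), its density satisfies \(f = \alpha F\), and the joint density is \(h(x, y) = f(x) g(y \mid x)\). Substituting the assumed relation between \(g(\cdot \mid x)\) and \(G(\cdot \mid x)\), and then using the reliability identity \(H(x, y) = F(x) + f(x) G(y \mid x)\) from above, gives

```latex
\begin{align*}
h(x, y) &= f(x) g(y \mid x)
         = \alpha F(x) \cdot \beta [1 + \alpha G(y \mid x)] \\
        &= \alpha \beta [F(x) + \alpha F(x) G(y \mid x)]
         = \alpha \beta [F(x) + f(x) G(y \mid x)]
         = \alpha \beta H(x, y)
\end{align*}
```

so \((X, Y)\) has constant rate \(\alpha \beta\).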
Here is a simple special case:
Suppose that \(T\) is finite and consider the lexicographic product \((S \times T, \sea)\) of \((S, \rta)\) and \((T, =)\). Suppose also that \(X\) has constant rate \(\alpha\) for \((S, \rta)\), random variable \(Y\) is uniformly distributed on \(T\), and \(X\) and \(Y\) are independent. Then \((X, Y)\) has constant rate \(\alpha / [\#(T) + \alpha]\) for \((S \times T, \sea)\).
Note that \((T, =)\) is also discrete and so the reference measure \(\nu\) becomes counting measure \(\#\). Note also that \(G = g\). From , \((X, Y)\) has rate function \(r\) given by \[r(x, y) = \frac{\alpha F(x) / \#(T)} {F(x) + \alpha F(x) / \#(T)} = \frac{\alpha}{\#(T) + \alpha}, \quad (x, y) \in S \times T\]
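This special case can also be checked numerically, again under an assumed geometric example: \(X\) geometric with constant rate \(\alpha = p/(1-p)\) for \((\N, \lt)\), and \(Y\) uniform on a three-point set, independent of \(X\).

```python
p = 0.4
alpha = p / (1 - p)                  # constant rate of geometric X for (N, <)
m = 3                                # T = {0, ..., m - 1}, Y uniform on T

def f(n): return p * (1 - p) ** n    # density of X
def F(n): return (1 - p) ** (n + 1)  # reliability P(n < X) for (N, <)

for x in range(8):
    for y in range(m):
        h = f(x) / m                 # joint density, by independence
        H = F(x) + f(x) / m          # reliability for the lexicographic product
        assert abs(h / H - alpha / (m + alpha)) < 1e-12
```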
Let's return to the general case of a lexicographic product \((S \times T, \sea)\) of \((S, \rta)\) and \((T, \upa)\) and consider consequences of a constant rate distribution for the product graph. The graph must be left finite, so we assume that \(\nu(T) \lt \infty\). Recall that \(a\) and \(b\) denote the walk functions of order 1 for \((S, \rta)\) and \((T, \upa)\) respectively.
Suppose that \((X, Y)\) has constant rate \(\alpha \in (0, \infty)\) for \((S \times T, \sea)\). Then
By assumption, \(h = \alpha H\) is a probability density function of \((X, Y)\).
Here is a simple corollary:
Suppose that \((X, Y)\) has constant rate \(\alpha\) for \((S \times T, \sea)\) and that \(X\) and \(Y\) are independent.
Lexicographic products of graphs will be generalized to lexicographic sums in Section 9.
The standard continuous graph has the structure studied in this section and can provide some helpful insight.
The standard graph \(([0, \infty), \le)\) is isomorphic to the lexicographic product \((\N \times [0, 1), \preceq)\) of \((\N, \lt)\) and \(([0, 1), \le)\) where \([0, \infty)\) and \([0, 1)\) have the usual Lebesgue measure structure.
In the notation above, note that \(\nu(T) = 1\), \(a(n) = n\) for \(n \in \N\), and \(b(y) = y\) for \(y \in [0, 1)\).
Suppose now that \(X\) has the exponential distribution on \([0, \infty)\) with parameter \(\alpha \in (0, \infty)\).
Of course, part (a) is well known, and part (b) follows immediately from part (a) since the spaces are isomorphic. But the following exercises derive the result directly. First let's explore the distribution of \(N\) on the graph \((\N, \lt)\).
Next let's explore the distribution of \(Y\) on the graph \(([0, 1), \le)\).
Consider again the setting of .
For part (b) note that \[G(y) = \int_y^1 \frac{\alpha}{1 - e^{-\alpha}} e^{-\alpha t} \, dt = \frac{e^{-\alpha y} - e^{-\alpha}}{1 - e^{-\alpha}}, \quad y \in [0, 1)\]
Next let's explore the joint distribution of \((N, Y)\) on the product graph \((\N \times [0, 1), \preceq)\).
Consider again the setting of .
Finally, we can verify some of the other results in the general theory above directly.
Consider again the setting of .
We will return to this basic example, with additional structure, in Section 2.8 on quotient spaces.