The matching experiment is a random experiment that can be formulated in a number of colorful ways. Let \(n \in \N_+\).
These experiments are equivalent from a mathematical point of view, and correspond to selecting a random permutation \(\bs{X} = (X_1, X_2, \ldots, X_n)\) of the population \(D_n = \{1, 2, \ldots, n\}\).
Our modeling assumption, of course, is that \(\bs{X}\) is uniformly distributed on the set of permutations of \(D_n\). The number of objects \(n\) is the basic parameter of the experiment. We will also consider the case of sampling with replacement from the population \(D_n\), because the analysis is much easier but still provides insight. In this case, \(\bs{X}\) is a sequence of independent random variables, each uniformly distributed over \(D_n\).
A match occurs at position \(j \in \{1, 2, \ldots, n\}\) if \(X_j = j\). So the number of matches is the random variable \(N_n\) defined mathematically by \[ N_n = \sum_{j=1}^n I_j\] where \(I_j = \bs{1}(X_j = j)\) is the indicator variable for the event of a match at position \(j\).
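As a concrete illustration (a minimal Python sketch that is not part of the original text; the function name `count_matches` is ours), the following generates a random permutation of \(D_n\) and counts the matches.

```python
import random

def count_matches(n, rng=random):
    """Generate a random permutation of {1, ..., n} and count the positions j with X_j = j."""
    perm = list(range(1, n + 1))
    rng.shuffle(perm)
    # I_j = 1 exactly when the value in position j equals j
    return sum(1 for j, x in enumerate(perm, start=1) if x == j)

# One realization of N_10
print(count_matches(10))
```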
Our problem is to compute the probability distribution of the number of matches. This is an old and famous problem in probability that was first considered by Pierre Rémond de Montmort; it is sometimes referred to as Montmort's matching problem in his honor.
First let's solve the matching problem in the easy case, when the sampling is with replacement. Of course, this is not the way that the matching game is usually played, but the analysis will give us some insight.
\((I_1, I_2, \ldots, I_n)\) is a sequence of \(n\) Bernoulli Trials, with success probability \(\frac{1}{n}\).
The variables are independent since the sampling is with replacement. Since \(X_j\) is uniformly distributed, \(\P(I_j = 1) = \P(X_j = j) = \frac{1}{n}\).
The number of matches \(N_n\) has the binomial distribution with trial parameter \(n\) and success parameter \(\frac{1}{n}\). \[ \P(N_n = k) = \binom{n}{k} \left(\frac{1}{n}\right)^k \left(1 - \frac{1}{n}\right)^{n-k}, \quad k \in \{0, 1, \ldots, n\} \]
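As a quick numerical check (a sketch of ours, not part of the text; note that `math.comb` requires Python 3.8 or later), the binomial density above can be tabulated directly.

```python
from math import comb

def pmf_with_replacement(n):
    """Binomial(n, 1/n) density of the number of matches when sampling with replacement."""
    p = 1 / n
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# P(N_5 = k) for k = 0, ..., 5 under sampling with replacement
print([round(q, 4) for q in pmf_with_replacement(5)])
```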
The mean and variance of the number of matches are \(\E(N_n) = n \frac{1}{n} = 1\) and \(\var(N_n) = n \frac{1}{n}\left(1 - \frac{1}{n}\right) = \frac{n-1}{n}\).
The distribution of the number of matches converges to the Poisson distribution with parameter 1 as \(n \to \infty\): \[ \P(N_n = k) \to \frac{e^{-1}}{k!} \text{ as } n \to \infty \text{ for } k \in \N \]
This is a special case of the convergence of the binomial distribution to the Poisson. For a direct proof, note that \[ \P(N_n = k) = \frac{1}{k!} \frac{n^{(k)}}{n^k} \left(1 - \frac{1}{n}\right)^{n-k} \] But \(\frac{n^{(k)}}{n^k} \to 1\) as \(n \to \infty\) and \(\left(1 - \frac{1}{n}\right)^{n-k} \to e^{-1}\) as \(n \to \infty\) by a famous limit from calculus.
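To see the convergence numerically (an illustrative sketch of ours, not from the text), one can compare the binomial density with the Poisson limit for increasing \(n\).

```python
from math import comb, exp, factorial

def binom_pmf(n, k):
    """P(N_n = k) under sampling with replacement: binomial(n, 1/n)."""
    p = 1 / n
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Absolute difference from the Poisson(1) limit e^{-1}/k! for k = 0, 1, 2, 3
for n in (5, 20, 100):
    diffs = [abs(binom_pmf(n, k) - exp(-1) / factorial(k)) for k in range(4)]
    print(n, [f"{d:.4f}" for d in diffs])
```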
Now let's consider the case of real interest, when the sampling is without replacement, so that \(\bs{X}\) is a random permutation of the elements of \(D_n = \{1, 2, \ldots, n\}\).
To find the probability density function of \(N_n\), we need to count the number of permutations of \(D_n\) with a specified number of matches. This will turn out to be easy once we have counted the number of permutations with no matches; these are called derangements of \(D_n\). We will denote the number of permutations of \(D_n\) with exactly \(k\) matches by \(b_n(k) = \#\{N_n = k\}\) for \(k \in \{0, 1, \ldots, n\}\). In particular, \(b_n(0)\) is the number of derangements of \(D_n\).
The number of derangements is \[ b_n(0) = n! \sum_{j=0}^n \frac{(-1)^j}{j!} \]
By the complement rule for counting measure \(b_n(0) = n! - \#(\bigcup_{i=1}^n \{X_i = i\})\). From the inclusion-exclusion formula, \[ b_n(0) = n! - \sum_{j=1}^n (-1)^{j-1} \sum_{J \subseteq D_n, \; \#(J) = j} \#\{X_i = i \text{ for all } i \in J\} \] But if \(J \subseteq D_n\) with \(\#(J) = j\) then \(\#\{X_i = i \text{ for all } i \in J\} = (n - j)!\). Finally, the number of subsets \(J\) of \(D_n\) with \(\#(J) = j\) is \(\binom{n}{j}\). Substituting into the displayed equation and simplifying gives the result.
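The inclusion-exclusion count can be checked by brute force for small \(n\); here is a short sketch of ours (the function names are hypothetical, and the exhaustive check is feasible only for small \(n\)).

```python
from itertools import permutations
from math import factorial

def derangements_formula(n):
    """b_n(0) = n! * sum_{j=0}^{n} (-1)^j / j!, computed exactly with integers."""
    return sum((-1)**j * factorial(n) // factorial(j) for j in range(n + 1))

def derangements_brute(n):
    """Count permutations of {1, ..., n} with no fixed points by enumeration."""
    return sum(all(x != j for j, x in enumerate(p, start=1))
               for p in permutations(range(1, n + 1)))

for n in range(1, 8):
    assert derangements_formula(n) == derangements_brute(n)
print([derangements_formula(n) for n in range(1, 8)])  # [0, 1, 2, 9, 44, 265, 1854]
```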
The number of permutations with exactly \(k\) matches is \[ b_n(k) = \frac{n!}{k!} \sum_{j=0}^{n-k} \frac{(-1)^j}{j!}, \quad k \in \{0, 1, \ldots, n\} \]
The following is a two-step procedure that generates all permutations with exactly \(k\) matches: First select the \(k\) integers that will match. The number of ways of performing this step is \(\binom{n}{k}\). Second, select a permutation of the remaining \(n - k\) integers with no matches. The number of ways of performing this step is \(b_{n-k}(0)\). By the multiplication principle of combinatorics it follows that \(b_n(k) = \binom{n}{k} b_{n-k}(0)\). Substituting the formula for the number of derangements and simplifying gives the result.
The probability density function of the number of matches is \[ \P(N_n = k) = \frac{1}{k!} \sum_{j=0}^{n-k} \frac{(-1)^j}{j!}, \quad k \in \{0, 1, \ldots, n\} \]
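The density formula is easy to evaluate; the following is a small sketch of ours (the function name `match_pmf` is hypothetical).

```python
from math import factorial

def match_pmf(n):
    """Exact density P(N_n = k) = (1/k!) * sum_{j=0}^{n-k} (-1)^j / j!, for k = 0, ..., n."""
    return [sum((-1)**j / factorial(j) for j in range(n - k + 1)) / factorial(k)
            for k in range(n + 1)]

pmf5 = match_pmf(5)
print([round(q, 4) for q in pmf5])  # [0.3667, 0.375, 0.1667, 0.0833, 0.0, 0.0083]
print(round(sum(pmf5), 12))         # 1.0 (the probabilities sum to 1)
```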
In the matching experiment, vary the parameter \(n\) and note the shape and location of the probability density function. For selected values of \(n\), run the simulation 1000 times and compare the empirical density function to the true probability density function.
\(\P(N_n = n - 1) = 0\), since it is impossible to have exactly \(n - 1\) matches: if \(n - 1\) positions match, then the remaining position must match as well.
The distribution of the number of matches converges to the Poisson distribution with parameter 1 as \(n \to \infty\): \[ \P(N_n = k) \to \frac{e^{-1}}{k!} \text{ as } n \to \infty, \quad k \in \N \]
The convergence is remarkably rapid.
In the matching experiment, increase \(n\) and note how the probability density function stabilizes rapidly. For selected values of \(n\), run the simulation 1000 times and compare the relative frequency function to the probability density function.
The mean and variance of the number of matches could be computed directly from the distribution. However, it is much better to use the representation in terms of indicator variables. The exchangeable property is an important tool in this section.
\(\E(I_j) = \frac{1}{n}\) for \(j \in \{1, 2, \ldots, n\}\).
\(X_j\) is uniformly distributed on \(D_n\) for each \(j\) so \(\P(I_j = 1) = \P(X_j = j) = \frac{1}{n}\).
\(\E(N_n) = 1\) for each \(n \in \N_+\).
This follows from the previous result and the additive property of expected value.
So the expected number of matches is 1, regardless of \(n\), just as when the sampling is with replacement.
\(\var(I_j) = \frac{n-1}{n^2}\) for \(j \in \{1, 2, \ldots, n\}\).
This follows since \(I_j\) is an indicator variable with \(\P(I_j = 1) = \frac{1}{n}\), so \(\var(I_j) = \frac{1}{n}\left(1 - \frac{1}{n}\right) = \frac{n-1}{n^2}\).
A match in one position would seem to make it more likely that there would be a match in another position. Thus, we might guess that the indicator variables are positively correlated.
For distinct \(j, \, k \in \{1, 2, \ldots, n\}\), \(\cov(I_j, I_k) = \frac{1}{n^2 (n - 1)}\), and the correlation is \(\frac{1}{(n - 1)^2}\).
Note that \(I_j I_k\) is the indicator variable of the event of a match in position \(j\) and a match in position \(k\). Hence by the exchangeable property \(\P(I_j I_k = 1) = \P(I_j = 1) \P(I_k = 1 \mid I_j = 1) = \frac{1}{n} \frac{1}{n-1}\). As before, \(\P(I_j = 1) = \P(I_k = 1) = \frac{1}{n}\). The results now follow from standard computational formulas for covariance and correlation.
Note that when \(n = 2\), the event that there is a match in position 1 is perfectly correlated with the event that there is a match in position 2. This makes sense, since there will either be 0 matches or 2 matches.
\(\var(N_n) = 1\) for every \(n \in \{2, 3, \ldots\}\). This follows from the variance and covariance results above: \[ \var(N_n) = \sum_{j=1}^n \var(I_j) + 2 \sum_{j \lt k} \cov(I_j, I_k) = n \frac{n - 1}{n^2} + 2 \binom{n}{2} \frac{1}{n^2 (n - 1)} = \frac{n - 1}{n} + \frac{1}{n} = 1 \]
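The computation can be verified exactly with rational arithmetic; this is a sketch of ours, with the variance and covariance values taken from the results above.

```python
from fractions import Fraction

def var_matches(n):
    """var(N_n) from the indicator decomposition: n * var(I_j) + 2 * C(n, 2) * cov(I_j, I_k)."""
    var_I = Fraction(n - 1, n**2)                 # var(I_j) = (n - 1) / n^2
    cov_I = Fraction(1, n**2 * (n - 1))           # cov(I_j, I_k) = 1 / (n^2 (n - 1)), n >= 2
    return n * var_I + n * (n - 1) * cov_I        # 2 * C(n, 2) = n (n - 1)

print([var_matches(n) for n in range(2, 7)])      # all equal to Fraction(1, 1)
```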
In the matching experiment, vary the parameter \(n\) and note the shape and location of the mean \( \pm \) standard deviation bar. For selected values of the parameter, run the simulation 1000 times and compare the sample mean and standard deviation to the distribution mean and standard deviation.
For distinct \(j, \, k \in \{1, 2, \ldots, n\}\), \(\cov(I_j, I_k) \to 0\) as \(n \to \infty\).
So the event that a match occurs in position \(j\) is nearly independent of the event that a match occurs in position \(k\) if \(n\) is large. For large \(n\), the indicator variables behave nearly like \(n\) Bernoulli trials with success probability \(\frac{1}{n}\), which, of course, is what happens when the sampling is with replacement.
In this subsection, we will give an alternate derivation of the distribution of the number of matches, in a sense by embedding the experiment with parameter \(n\) into the experiment with parameter \(n + 1\).
The probability density function of the number of matches satisfies the following recursion relation and initial condition: \[ \P(N_{n+1} = k + 1) = \frac{1}{k + 1} \P(N_n = k), \quad k \in \{0, 1, \ldots, n\} \] with \(\P(N_1 = 1) = 1\).
First, consider the random permutation \((X_1, X_2, \ldots, X_n, X_{n+1})\) of \(D_{n+1}\). Note that \((X_1, X_2, \ldots, X_n)\) is a random permutation of \(D_n\) if and only if \(X_{n+1} = n + 1\) if and only if \(I_{n+1} = 1\). It follows that \[ \P(N_n = k) = \P(N_{n+1} = k + 1 \mid I_{n+1} = 1), \quad k \in \{0, 1, \ldots, n\} \] From the definition of conditional probability we have \[ \P(N_n = k) = \P(N_{n+1} = k + 1) \frac{\P(I_{n+1} = 1 \mid N_{n+1} = k + 1)}{\P(I_{n+1} = 1)}, \quad k \in \{0, 1, \ldots, n\} \] But \(\P(I_{n+1} = 1) = \frac{1}{n+1}\) and \(\P(I_{n+1} = 1 \mid N_{n+1} = k + 1) = \frac{k+1}{n+1}\). Substituting into the last displayed equation gives the recurrence relation. The initial condition is obvious, since if \(n = 1\) we must have one match.
This result can be used to obtain the probability density function of \(N_n\) recursively for any \(n\); at each step, \(\P(N_{n+1} = 0)\) is obtained from the complement rule, since the probabilities must sum to 1.
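Here is a short sketch of ours implementing the recursion with exact rational arithmetic (the function name `match_pmf_recursive` is hypothetical); the probability of no matches at each stage is filled in by the complement rule.

```python
from fractions import Fraction

def match_pmf_recursive(n):
    """Compute [P(N_n = 0), ..., P(N_n = n)] via P(N_{m+1} = k+1) = P(N_m = k) / (k + 1)."""
    pmf = [Fraction(0), Fraction(1)]             # P(N_1 = 0) = 0, P(N_1 = 1) = 1
    for m in range(1, n):
        new = [Fraction(0)] * (m + 2)
        for k in range(m + 1):
            new[k + 1] = pmf[k] / (k + 1)
        new[0] = 1 - sum(new[1:])                # complement rule for P(N_{m+1} = 0)
        pmf = new
    return pmf

print([float(q) for q in match_pmf_recursive(5)])  # agrees with the exact formula for n = 5
```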
Next recall that the probability generating function of \(N_n\) is given by \[ G_n(t) = \E\left(t^{N_n}\right) = \sum_{j=0}^n \P(N_n = j) t^j, \quad t \in \R \]
The family of probability generating functions satisfies the following differential equations and ancillary conditions: \[ G_{n+1}^\prime(t) = G_n(t), \quad t \in \R, \; n \in \N_+ \] with \(G_n(1) = 1\) for \(n \in \N_+\).
Note also that \(G_1(t) = t\) for \(t \in \R\). Thus, the system of differential equations can be used to compute \(G_n\) for any \(n \in \N_+\).
In particular, for \(t \in \R\), \(G_2(t) = \frac{1}{2} + \frac{1}{2} t^2\), \(G_3(t) = \frac{1}{3} + \frac{1}{2} t + \frac{1}{6} t^3\), and \(G_4(t) = \frac{3}{8} + \frac{1}{3} t + \frac{1}{4} t^2 + \frac{1}{24} t^4\).
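These polynomials can be generated mechanically by integrating the differential equation and choosing the constant term so that \(G_n(1) = 1\); the following sketch of ours represents \(G_n\) by its coefficient list (whose entries are exactly \(\P(N_n = k)\)).

```python
from fractions import Fraction

def pgf_coeffs(n):
    """Coefficients of G_n(t), built from G_1(t) = t by integrating G_{m+1}' = G_m with G_{m+1}(1) = 1."""
    g = [Fraction(0), Fraction(1)]                                   # G_1(t) = t
    for _ in range(1, n):
        h = [Fraction(0)] + [c / (i + 1) for i, c in enumerate(g)]   # antiderivative of G_m
        h[0] = 1 - sum(h)                                            # constant so that G_{m+1}(1) = 1
        g = h
    return g

print(pgf_coeffs(4))  # [3/8, 1/3, 1/4, 0, 1/24], i.e. G_4(t) = 3/8 + t/3 + t^2/4 + t^4/24
```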
For \(k, \, n \in \N_+\) with \(k \lt n\), \[ G_n^{(k)}(t) = G_{n-k}(t), \quad t \in \R \]
For \(n \in \N_+\), \[ \P(N_n = k) = \frac{1}{k!} \P(N_{n-k} = 0), \quad k \in \{0, 1, \ldots, n - 1\} \]
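The identity can be checked against a brute-force enumeration for small \(n\); this sketch is ours and is feasible only for small \(n\).

```python
from itertools import permutations
from math import factorial

def pmf_brute(n):
    """[P(N_n = 0), ..., P(N_n = n)] by exhaustive enumeration of permutations."""
    counts = [0] * (n + 1)
    for perm in permutations(range(1, n + 1)):
        counts[sum(x == j for j, x in enumerate(perm, start=1))] += 1
    return [c / factorial(n) for c in counts]

n = 7
pmf = pmf_brute(n)
# P(N_n = k) = P(N_{n-k} = 0) / k! for k = 0, ..., n - 1
print(all(abs(pmf[k] - pmf_brute(n - k)[0] / factorial(k)) < 1e-12 for k in range(n)))  # True
```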
A secretary randomly stuffs 5 letters into 5 envelopes. Find the number of permutations with exactly \(k\) matches, and the probability density function of the number of matches.
\(k\) | 0 | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|---|
\(b_5(k)\) | 44 | 45 | 20 | 10 | 0 | 1 |
\(k\) | 0 | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|---|
\(\P(N_5 = k)\) | 0.3667 | 0.3750 | 0.1667 | 0.0833 | 0 | 0.0083 |
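These table values can be reproduced with the counting formula \(b_n(k) = \binom{n}{k} b_{n-k}(0)\); a small sketch of ours follows.

```python
from math import comb, factorial

def b(n, k):
    """Number of permutations of {1, ..., n} with exactly k matches: C(n, k) * b_{n-k}(0)."""
    return comb(n, k) * sum((-1)**j * factorial(n - k) // factorial(j) for j in range(n - k + 1))

total = factorial(5)
print([b(5, k) for k in range(6)])                    # [44, 45, 20, 10, 0, 1]
print([round(b(5, k) / total, 4) for k in range(6)])  # [0.3667, 0.375, 0.1667, 0.0833, 0.0, 0.0083]
```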
Ten married couples are randomly paired for a dance. Find the probability density function of the number of matches, that is, the number of couples paired with their own spouses.
\(k\) | \(\P(N_{10} = k)\) |
---|---|
0 | \( \frac{16\,481}{44\,800} \approx 0.3678795\) |
1 | \(\frac{16\,687}{45\,360} \approx 0.3678792\) |
2 | \(\frac{2119}{11\,520} \approx 0.1839410\) |
3 | \(\frac{103}{1680} \approx 0.06130952\) |
4 | \( \frac{53}{3456} \approx 0.01533565 \) |
5 | \( \frac{11}{3600} \approx 0.003055556 \) |
6 | \( \frac{1}{1920} \approx 0.0005208333 \) |
7 | \( \frac{1}{15\,120} \approx 0.00006613757\) |
8 | \(\frac{1}{80\,640} \approx 0.00001240079\) |
9 | 0 |
10 | \(\frac{1}{3\,628\,800} \approx 2.755732 \times 10^{-7}\) |
In the matching experiment, set \(n = 10\). Run the experiment 1000 times and compare the empirical density function, mean, and standard deviation of the number of matches to the corresponding distribution values.
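For readers without access to the applet, the comparison can also be carried out with a short simulation; this sketch is ours (the function names and the seed are arbitrary).

```python
import random
from math import factorial

def exact_pmf(n):
    """P(N_n = k) from the exact formula."""
    return [sum((-1)**j / factorial(j) for j in range(n - k + 1)) / factorial(k)
            for k in range(n + 1)]

def empirical_pmf(n, runs, seed=0):
    """Relative frequencies of the number of matches over repeated random permutations."""
    rng = random.Random(seed)
    counts = [0] * (n + 1)
    for _ in range(runs):
        perm = list(range(1, n + 1))
        rng.shuffle(perm)
        counts[sum(x == j for j, x in enumerate(perm, start=1))] += 1
    return [c / runs for c in counts]

n, runs = 10, 1000
emp, exact = empirical_pmf(n, runs), exact_pmf(n)
for k in range(6):
    print(f"k = {k}: empirical {emp[k]:.3f}, exact {exact[k]:.4f}")
mean_emp = sum(k * emp[k] for k in range(n + 1))
print(f"empirical mean {mean_emp:.3f} (exact mean 1, exact standard deviation 1)")
```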