Joint Distributions

Reading time: ~15 min

The distribution of a random variable is sometimes called its marginal distribution, with the term marginal emphasizing that the distribution includes information only about a single random variable. If we are interested in two random variables X and Y, it is often important to consider their joint distribution, which captures probabilistic information about where the pair (X,Y) falls in \mathbb{R}^2.

Definition
If X and Y are two random variables defined on the same probability space, then the joint distribution of X and Y is the measure on \mathbb{R}^2 which assigns to each set A \subset \mathbb{R}^2 the value \mathbb{P}((X,Y) \in A).

If X and Y are discrete random variables, then we can find the probability mass function of (X,Y) by (i) finding all of the pairs (x,y) \in \mathbb{R}^2 with the property that the event \{X = x\} \cap \{Y = y\} has positive probability, and (ii) finding the probability of each such event.
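This two-step procedure is easy to sketch in Python. The example below is an illustrative assumption, not part of the text: X is the value of the first of two fair dice and Y is the sum of the two dice, and all names (`omega`, `prob`, `joint_pmf`) are my own.

```python
from collections import defaultdict
from fractions import Fraction
from itertools import product

# Sample space for two fair dice, with the uniform probability measure.
omega = list(product(range(1, 7), repeat=2))
prob = {w: Fraction(1, 36) for w in omega}

X = lambda w: w[0]          # value of the first die
Y = lambda w: w[0] + w[1]   # sum of the two dice

# Steps (i) and (ii): find each pair (x, y) that occurs with positive
# probability and accumulate the mass of the event {X = x} ∩ {Y = y}.
joint_pmf = defaultdict(Fraction)
for w, p in prob.items():
    joint_pmf[(X(w), Y(w))] += p

print(joint_pmf[(4, 7)])    # 1/36, since only the outcome (4, 3) contributes
```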

Example
Consider the two-fair-coin-flip experiment, and let X_1 be the number of heads in the first flip and X_2 the number of heads in the second flip. Let Y_1 be the number of tails in the first flip.

Show that X_1, X_2, and Y_1 all have the same marginal distribution, but that (X_1, X_2) and (X_1, Y_1) have different joint distributions.

Solution. The random variables X_1, X_2, Y_1 all have the same distribution because each can be 1 or 0 with probability \frac{1}{2}. On the other hand, (X_1, X_2) can take the values \{(0, 0), (0, 1), (1, 0), (1, 1)\} with equal probability \frac{1}{4}, while (X_1, Y_1) can only be either (0, 1) or (1,0) with probability \frac{1}{2}.
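The same enumeration can be carried out in code. The following Python sketch is a hedged check of the solution above: the helper `joint` and the variable names are assumptions of mine, but the printed output agrees with the reasoning in the solution.

```python
from collections import defaultdict
from fractions import Fraction
from itertools import product

def joint(prob, U, V):
    """Joint pmf of the pair (U, V) under the probability measure prob."""
    pmf = defaultdict(Fraction)
    for w, p in prob.items():
        pmf[(U(w), V(w))] += p
    return dict(pmf)

omega = list(product(["H", "T"], repeat=2))   # two fair coin flips
prob = {w: Fraction(1, 4) for w in omega}

X1 = lambda w: 1 if w[0] == "H" else 0   # heads on the first flip
X2 = lambda w: 1 if w[1] == "H" else 0   # heads on the second flip
Y1 = lambda w: 1 if w[0] == "T" else 0   # tails on the first flip

print(joint(prob, X1, X2))   # (0,0), (0,1), (1,0), (1,1), each with mass 1/4
print(joint(prob, X1, Y1))   # only (0, 1) and (1, 0), each with mass 1/2
```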

This exercise shows that the joint distribution of two random variables provides information not present in the marginal distributions alone. Conversely, the marginal distributions of two random variables may be recovered from their joint distribution: in the discrete case, the probability mass function of X is obtained by summing the joint probability mass function over all values of Y,
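\begin{align*}\mathbb{P}(X = x) = \sum_{y} \mathbb{P}(X = x, \, Y = y),\end{align*}

and similarly for Y. This is the identity used in the solution of the following exercise.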

Exercise
Consider a computer program which rolls two virtual dice and returns roll results with probabilities shown in the table.

The probability that die 1 shows 4 is \frac{11}{72}.

Solution. The event that the first die shows 4 can be written as a disjoint union of the events \{\text{Die 1} = 4\} \cap \{\text{Die 2} = j\}, where j ranges over the integers 1 to 6. We get

\begin{align*}\mathbb{P}(\text{Die 1} = 4) &= \sum_{j=1}^6 \mathbb{P}(\text{Die 1} = 4, \text{ Die 2} = j) \\ &= \frac{1}{36} + \frac{1}{36} + \frac{1}{72} + \frac{1}{36} + \frac{1}{36} + \frac{1}{36} \\ &= \frac{11}{72}.\end{align*}
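As a quick check, here is a Python sketch of the same computation. The full table from the exercise is not reproduced in the text, so only the row with Die 1 = 4 is encoded, using the six probabilities that appear in the solution; the dictionary and its name are assumptions of mine.

```python
from fractions import Fraction

# Assumed excerpt of the joint table: only the row with Die 1 = 4,
# with the probabilities quoted in the solution above.
row_die1_is_4 = {
    (4, 1): Fraction(1, 36),
    (4, 2): Fraction(1, 36),
    (4, 3): Fraction(1, 72),
    (4, 4): Fraction(1, 36),
    (4, 5): Fraction(1, 36),
    (4, 6): Fraction(1, 36),
}

# Marginal probability that Die 1 shows 4: sum over all values of Die 2.
print(sum(row_die1_is_4[(4, j)] for j in range(1, 7)))   # 11/72
```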

Exercise
Determine which of the following joint distributions on (X,Y) has the property that X and Y have the same marginal distribution. (Note: each disk indicates a probability mass at a point, with the size of the disk proportional to the mass at that point.)

The first one
The second one
The third one

Solution. We find the distribution of X by summing the joint distribution along vertical lines, and we obtain the distribution of Y by summing along horizontal lines. Only for the third distribution do these two procedures give the same results.
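The point masses in the three figures are not reproduced in the text, but the summing procedure itself can be sketched in Python. The function and the small example distribution below are illustrative assumptions; the example is symmetric about the line y = x, so the two marginals come out equal.

```python
from collections import defaultdict
from fractions import Fraction

def marginals(joint_pmf):
    """Sum a joint pmf along vertical lines (marginal of X) and horizontal lines (marginal of Y)."""
    pmf_X, pmf_Y = defaultdict(Fraction), defaultdict(Fraction)
    for (x, y), p in joint_pmf.items():
        pmf_X[x] += p   # fix x, sum over y: a vertical line
        pmf_Y[y] += p   # fix y, sum over x: a horizontal line
    return dict(pmf_X), dict(pmf_Y)

# Made-up point masses standing in for one of the figures.
example = {(1, 1): Fraction(1, 2), (1, 2): Fraction(1, 4), (2, 1): Fraction(1, 4)}
pmf_X, pmf_Y = marginals(example)
print(pmf_X == pmf_Y)   # True: X and Y have the same marginal distribution
```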

Exercise
For each of the three joint distributions in the previous exercise, the probability that X + Y > 0 is equal to 1. The distribution for which \mathbb{P}(Y > X) is the largest is the first one.

Solution. Since all of the probability mass is in the first quadrant, both X and Y are positive with probability 1. The probability that Y > X is the total amount of probability mass in the region in the plane above the line y = x. The figure with the most mass in that region is the first one.
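With the same point-mass representation, both probabilities are one-line sums. Again, the example distribution below is made up, since the figures themselves are not included in the text; `prob_event` is a hypothetical helper.

```python
from fractions import Fraction

def prob_event(joint_pmf, event):
    """Probability that the pair (X, Y) satisfies the given predicate."""
    return sum(p for (x, y), p in joint_pmf.items() if event(x, y))

# Made-up point masses, all in the first quadrant.
example = {(1, 2): Fraction(1, 2), (2, 1): Fraction(1, 4), (3, 3): Fraction(1, 4)}

print(prob_event(example, lambda x, y: x + y > 0))   # 1: every point has x + y > 0
print(prob_event(example, lambda x, y: y > x))       # 1/2: the mass strictly above y = x
```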
