
Marginal distribution

Let X and Y be two random variables, and p(x, y) their joint probability distribution.

By definition, the marginal distribution of X is simply the distribution of X alone, Y being ignored (with a similar definition for the marginal distribution of Y).

-----

The reason this concept was introduced is that it often happens that the joint probability distribution of the pair {X, Y} is known, while the individual distributions of X and Y are not. It is then possible to derive these individual distributions from the joint distribution, as we show now (and as is illustrated by the animation below).

A nice example of such a situation is found when we calculate the generating function of a random sum.

# Discrete case

Suppose that X and Y are discrete random variables. Their joint probability mass function is:

p(xi, yj) = P{X = xi, Y = yj}

#### What is the marginal distribution of X?

X = xi if and only if one of the following mutually exclusive events occurs:

* {X = xi and Y = y1}

* {X = xi and Y = y2}

* {X = xi and Y = y3}

* ...

The probability P{X = xi} is therefore the sum of the probabilities of these events, and we have:

 P{X = xi} = Σj p(xi, yj)

#### Then pi. = P{X = xi}, the sum of the probabilities in the ith row of the table, may be visualized as being written in the right margin of the table, hence the name "marginal" distribution.

Similarly, P{Y = yj} = p.j is the sum of the probabilities in the jth column.

This illustration assumes that X can take n values and Y can take m values, but the above result remains true even if X or Y (or both) can take an enumerably infinite number of values, as is the case, for example, for the Poisson or negative binomial distributions.
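As a quick sketch, the row and column sums can be computed directly from a table of joint probabilities. The table below is a made-up example (not one from the text), with X taking two values and Y three:

```python
# Hypothetical joint probability mass function p(xi, yj), stored as a dict.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.20, (1, 1): 0.10, (1, 2): 0.30,
}

# pi. = P{X = xi} is the sum over j (row sums); p.j is the sum over i (column sums).
p_x = {}
p_y = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

print(p_x)  # marginal of X: {0: 0.4, 1: 0.6} up to floating-point rounding
print(p_y)  # marginal of Y
```

Each marginal sums to 1, as any probability distribution must.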

# Continuous case

#### Suppose that X and Y are continuous variables and that their joint distribution can be represented by their joint probability density f(x, y). An informal argument can be developed as in the discrete case.

The probability for a realization of (X, Y) to be equal to (x, y) within dx and dy is f(x, y) dx dy. For a given value x, the probability for X to be equal to x within dx is the sum over y of these infinitesimal probabilities. Therefore, the marginal probability density fX(x) of X is given by:

 fX(x) = ∫ f(x, y) dy

with a similar result for Y.

-----

It is common to say that the marginal distribution of one variable is obtained by "integrating the other variable out of the joint distribution".
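A minimal numerical sketch of this operation, assuming the (hypothetical) joint density f(x, y) = x + y on the unit square, whose marginal is fX(x) = x + 1/2:

```python
# "Integrate the other variable out" numerically, for the assumed density
# f(x, y) = x + y on [0, 1] x [0, 1] (it integrates to 1, so it is a valid density).

def f(x, y):
    return x + y

def marginal_x(x, n=10_000):
    """Midpoint-rule approximation of the integral of f(x, y) over y in [0, 1]."""
    h = 1.0 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

print(marginal_x(0.3))  # close to 0.3 + 0.5 = 0.8
```

For this linear density the midpoint rule is essentially exact; for a general density the same code gives an approximation that improves with n.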

-----
Another convenient way of calculating a marginal distribution is to appeal to the properties of multivariate moment generating functions (see here).

# The multivariate case

We described the marginal distributions of a bivariate distribution, but the concept is straightforwardly generalized to the joint distributions of any number of variables.

#### Let x = {X1, X2, ..., Xp} be a set of p random variables with joint probability distribution p(x1, x2, ..., xp). Then, for any subset of these variables, one can define a marginal probability distribution of the complete set as the joint probability distribution of this subset of variables, the other variables of the set being ignored.

So a p-variate distribution has 2^p − 2 marginal distributions (ignoring the empty and the complete subsets of variables).

* The most famous example of such a set of marginal distributions is that of the multivariate normal distribution.

* We calculate here the joint probability distribution of two order statistics. This joint distribution is one marginal distribution of the joint probability distribution of the complete set of order statistics.

# Marginal distributions are not characteristic properties of a multivariate distribution

Many different joint distributions may have the same set of marginals. For example:

* We know that the marginals of the standard bivariate normal distribution are standard normal.

* But we give here two examples of bivariate distributions that have standard normal marginals, and yet that are not bivariate normal.

-----

So knowing the marginals of a multivariate distribution is not enough to characterize the distribution. This is because knowing the marginals says nothing about the coupling between these marginals. Only the full joint distribution accounts for these couplings.
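One classic construction of this kind (our own illustration, not necessarily one of the site's two examples) can be checked by simulation: take X standard normal and flip its sign whenever |X| exceeds a threshold. The result Y is still standard normal by symmetry, yet (X, Y) is not bivariate normal, because X + Y has an atom at 0, which no (non-degenerate) normal variable can have.

```python
import random

# Counterexample (an assumption of ours, not from the text): X ~ N(0, 1),
# Y = X when |X| <= 1 and Y = -X otherwise.  Y is standard normal by symmetry,
# but X + Y = 0 whenever |X| > 1, so X + Y is not normal and neither is (X, Y).
random.seed(42)

n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x if abs(x) <= 1.0 else -x for x in xs]

mean_y = sum(ys) / n
var_y = sum(y * y for y in ys) / n - mean_y ** 2
frac_zero = sum(1 for x, y in zip(xs, ys) if x + y == 0.0) / n

print(mean_y, var_y)  # close to 0 and 1: the marginal of Y looks standard normal
print(frac_zero)      # about 0.32: the atom P{X + Y = 0} = P{|X| > 1}
```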

# Marginals and independence

There is but one exception to the above remark. It can be shown that:

* If the variables X and Y are independent, then their joint probability distribution is the product of the (marginal) distributions of these two variables.

* Conversely, if a joint probability distribution is equal to the product of its marginal distributions, then the variables are independent.

 f(x, y) = fX(x) fY(y)   if and only if X and Y are independent

This result provides a very powerful method for proving the independence of two random variables. It generalizes to any number of variables.
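The factorization criterion is easy to check mechanically in the discrete case. The two small tables below are made-up examples: in the first, p(x, y) = pX(x) pY(y) everywhere, so X and Y are independent; in the second, the factorization fails:

```python
# Check whether a discrete joint distribution factors into its marginals.

def marginals(joint):
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return p_x, p_y

def is_independent(joint, tol=1e-9):
    p_x, p_y = marginals(joint)
    return all(abs(p - p_x[x] * p_y[y]) < tol for (x, y), p in joint.items())

independent = {(0, 0): 0.12, (0, 1): 0.28, (1, 0): 0.18, (1, 1): 0.42}
dependent   = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.50}

print(is_independent(independent))  # True:  e.g. 0.12 = 0.4 * 0.3
print(is_independent(dependent))    # False: 0.30 differs from 0.4 * 0.4
```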

# Animation

Calculating the probability distribution of a random variable A can often be most conveniently achieved:

* By first calculating the joint distribution of A and some other suitably chosen r.v. B.

* And then by considering the distribution of A as one of the marginals of this joint distribution.

We now illustrate this indirect, yet powerful method for calculating distributions with the following animation.

_______

Let X and Y be two independent random variables, both following the uniform distribution in [0, 1].

One considers the two seemingly unrelated and difficult-looking problems:

1) What is the distribution of the r.v. U = X/Y?

2) What is the distribution of the r.v. V = XY?

In the Tutorial below, we show that the answers come from:

* First calculating the joint probability distribution of {U, V},

* And then calculating the distribution of U = X/Y as one of the two marginal distributions of this joint distribution, with a similar approach for V = XY.


* Click repeatedly on "Next". Each new click generates a realization of the pair {X, Y}, which is then displayed as a pair of values of {U, V} (red dot in the main frame).

* The black roof-shaped line represents the limits of the domain in which the joint probability distribution of {U, V} takes non-zero values. In other words, a red dot will never fall beyond this boundary.

* The red line in the lower gray-background frame represents the marginal density of the r.v. U (the ratio X/Y).

* The red line in the left gray-background frame represents the marginal density of the r.v. V (the product XY).

____________________

* Click on "Go". Observations are now automatically drawn from the uniform distribution. Observe:

- The build-up of the histograms of the densities of U and V.

- The joint probability density of {U, V}.

These distributions certainly deserve comment, which is developed in the Tutorial below.

___________

Note that we have been ambitious in attempting to calculate the distributions of both XY and X/Y in one sweep. You may want to first try to solve the two simpler problems separately on your own:

* Calculate the distribution of XY.

* Calculate the distribution of X/Y.
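The animation can be mimicked by a short simulation. The checks below rely on the standard closed forms for these densities, which the Tutorial establishes: f_U(u) = 1/2 for u ≤ 1 and 1/(2u²) for u > 1, and f_V(v) = −ln v for 0 < v < 1.

```python
import math
import random

# Draw pairs (X, Y) uniform on [0, 1], form U = X/Y and V = XY, and compare
# a few empirical quantities with their known theoretical values.
random.seed(0)

n = 200_000
us, vs = [], []
for _ in range(n):
    x, y = random.random(), random.random()
    us.append(x / y)
    vs.append(x * y)

# P{U <= 1} = P{X <= Y} = 1/2, and E[V] = E[X] E[Y] = 1/4.
p_u_le_1 = sum(1 for u in us if u <= 1.0) / n
mean_v = sum(vs) / n

# P{V <= 1/2} = integral of -ln v from 0 to 1/2 = (1 + ln 2) / 2, about 0.847.
p_v_le_half = sum(1 for v in vs if v <= 0.5) / n

print(p_u_le_1)     # close to 0.5
print(mean_v)       # close to 0.25
print(p_v_le_half)  # close to (1 + ln 2) / 2
```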

# Other examples

You'll find on this site several other examples of calculating a distribution as a marginal distribution of a joint distribution. In particular, this method is used here:

* For calculating the density of Student's t distribution.

* For calculating the density of Fisher's F distribution.

_____________________________________________________________

 Tutorial

Let X and Y be two independent random variables, both uniformly distributed in [0, 1].

In this Tutorial, we calculate the distributions:

* Of the ratio U = X/Y,

* And of the product V = XY.

We do it by :

* First calculating the joint probability distribution of U and V.

* And then by calculating the distributions of U and V as the two marginal distributions of this joint distribution.

-----

Although the results themselves are of little practical use, they are beyond the reach of intuition (as illustrated by the above animation) and could hardly have been obtained by a more direct method.

The method we describe is powerful and of general use, and this demonstration can be considered as a template for calculating the probability distributions of random variables in many circumstances where direct methods fail.

DISTRIBUTIONS OF THE PRODUCT AND RATIO OF TWO INDEPENDENT UNIFORM VARIABLES

The Tutorial proceeds through the following steps:

* The inverse transformation
* The transformation
* The limits (limits of u, limits of v)
* The Jacobian
* The joint probability distribution
* The probability distribution of XY
* The probability distribution of X/Y (case u < 1, case u > 1)
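The core of that computation can be sketched as follows (our summary; the Tutorial's own slides are authoritative):

```latex
% Change of variables U = X/Y, V = XY, for X and Y i.i.d. uniform on [0, 1].
% Inverse transformation and Jacobian:
\[
x = \sqrt{uv}, \qquad y = \sqrt{v/u}, \qquad
\left|\frac{\partial(x,y)}{\partial(u,v)}\right| = \frac{1}{2u}.
\]
% Since f_{X,Y} = 1 on the unit square, the joint density of (U, V) is
\[
f_{U,V}(u,v) = \frac{1}{2u}, \qquad 0 < v \le \min(u, 1/u)
\]
% (the roof-shaped region of the animation), and the marginals follow by
% integrating the other variable out:
\[
f_U(u) = \int_0^{\min(u,\,1/u)} \frac{dv}{2u}
       = \begin{cases} 1/2, & 0 < u \le 1,\\ 1/(2u^2), & u > 1, \end{cases}
\qquad
f_V(v) = \int_v^{1/v} \frac{du}{2u} = -\ln v, \quad 0 < v < 1.
\]
```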

_______________________________________________________

Related readings:

 Transformation of random variables