Slutsky's theorem

The name in fact covers several theorems, all pertaining to the limit distribution of a converging sequence of random vectors (or of random variables as a special case).

# The general Slutsky's theorem

## Simple form

Let:

* {Xn} be a sequence of p-variate random vectors converging in distribution to the random vector X,

* and f(.) be a function from Rᵖ to Rᵏ.

A natural question is:

"What is the limit distribution of the sequence of random vectors {f(Xn)} as n tends to infinity?"

In its most general form, Slutsky's theorem states that if f(.) is continuous, then {f(Xn)} converges in distribution to f(X).
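This result is often referred to as the continuous mapping theorem. It can be illustrated by simulation (a sketch, not a proof; the particular choices below are ours): Xn is the standardized mean of n uniform variables, which converges in distribution to X ~ N(0, 1) by the Central Limit Theorem, so applying the continuous function f(x) = x² should yield a Chi-square distribution with 1 degree of freedom.

```python
import numpy as np

rng = np.random.default_rng(0)

def standardized_mean(n, reps):
    """reps realizations of Xn = sqrt(n)*(mean of n U(0,1) draws - 1/2)/sigma."""
    u = rng.random((reps, n))
    return np.sqrt(n) * (u.mean(axis=1) - 0.5) / np.sqrt(1 / 12)

# Xn converges in distribution to X ~ N(0, 1) by the CLT.
xn = standardized_mean(n=2000, reps=50_000)

# f(x) = x^2 is continuous, so {f(Xn)} converges in distribution to X^2,
# i.e. a Chi-square with 1 degree of freedom (mean 1, variance 2).
fxn = xn ** 2
print(fxn.mean(), fxn.var())
```

The empirical mean and variance of f(Xn) come out close to 1 and 2, the moments of the Chi-square(1) limit.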

-----

The proof of this intuitive result is quite difficult and beyond the scope of this Glossary.

## Complete form: discontinuity points of f(.)

The condition that f(.) be continuous is in fact stronger than necessary. The theorem still holds if f(.) is not continuous, provided that X lies in the set of continuity points of f(.) with probability 1.

This condition may sound like a fine point of detail, but it is not. If it were omitted, many important results (e.g. the limit distribution of the sequence of reciprocals of a converging sequence of random variables) could not be obtained. Conversely, failure to satisfy the condition explains why some anticipated convergences do not occur.
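The reciprocal example just mentioned can be sketched numerically: f(x) = 1/x is discontinuous at 0, but when the limit X is N(0, 1), X falls on that discontinuity with probability 0, so {1/Xn} still converges in distribution to 1/X (the construction of Xn below is our illustrative choice).

```python
import numpy as np

rng = np.random.default_rng(1)

# Xn = X + Z/n converges in distribution to X ~ N(0, 1).
n = 1000
x = rng.standard_normal(100_000)
xn = x + rng.standard_normal(100_000) / n

# f(x) = 1/x is discontinuous at 0, but P(X = 0) = 0,
# so Slutsky's theorem still gives {1/Xn} -> 1/X in distribution.
recip = 1.0 / xn

# Check against the limit: P(|1/X| > 1) = P(|X| < 1), about 0.6827.
frac = np.mean(np.abs(recip) > 1)
print(frac)
```

The empirical tail probability matches the one computed from the limit variable 1/X, as the theorem predicts.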

# Sequences of random variables

The theorem of course applies when the vectors Xn have a single component. The sequence {Xn} is then just a sequence of random variables.

Slutsky's theorem then states that:

* If a sequence {Xn} of random variables converges in distribution to a rv X,

* and if f(.) is a continuous function (with the above condition about the continuity points),

then

* {f(Xn)} converges in distribution to f(X).

# Convergence to a constant vector

A frequently encountered situation is one where the random vectors Xn can be partitioned into two sub-vectors:

Xn = (Yn , Zn)

with:

* {Yn} converging in distribution to a random vector Y.

* {Zn} converging in distribution (or, equivalently, in probability) to a constant vector C.

A version of Slutsky's theorem then states that the sequence {(Yn , Zn)} converges in distribution to (Y, C).

Caveat: the convergence of {Zn} to a constant is essential. It is in general not true that:

* if {Yn} converges in distribution to a random vector Y,

* and if {Zn} converges in distribution to a random vector Z,

then {(Yn , Zn)} converges in distribution to (Y , Z). The sequence {(Yn , Zn)} may not even converge at all, as will be shown in a counter-example.
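One classical counter-example of this kind (our choice, for illustration) takes Yn = Z for every n and Zn = (−1)ⁿZ, with Z ~ N(0, 1). Both marginal sequences then converge in distribution to N(0, 1), yet the pair (Yn, Zn) keeps oscillating between two different joint distributions. A numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(100_000)   # Z ~ N(0, 1)

# Yn = Z for every n, and Zn = (-1)^n * Z.
# Each marginal converges in distribution: Yn ~ N(0,1) trivially,
# and Zn ~ N(0,1) for every n because the normal is symmetric.
# But the pair (Yn, Zn) alternates between (Z, Z) and (Z, -Z):
corr_even = np.corrcoef(z, z)[0, 1]    # n even: correlation +1
corr_odd = np.corrcoef(z, -z)[0, 1]    # n odd:  correlation -1
print(corr_even, corr_odd)

# Consequently Yn + Zn alternates between 2Z (variance about 4) and 0:
print((z + z).var(), (z - z).var())
```

Since the joint distribution never settles, {(Yn, Zn)} has no limit, even though each coordinate sequence does.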

# Most common version of Slutsky's theorem

From the above results, it is easy to show that:

* If the sequence of random vectors {Yn} converges in distribution to Y,

* If the sequence of random vectors {Zn} converges in distribution (or, equivalently, in probability) to a constant vector C,

then

* {Yn + Zn} converges in distribution to Y + C,

* {Yn.Zn} converges in distribution to CY,

which is the most common version of Slutsky's theorem, especially when Yn and Zn are just random variables.

In loose terms, the theorem states that if a rv converges to a constant, then it essentially behaves as a constant for addition and multiplication.
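These two convergences can be sketched numerically (the choices Y ~ N(0, 1) and C = 2 below are ours, for illustration; the sum should then approach N(2, 1) and the product N(0, 4)):

```python
import numpy as np

rng = np.random.default_rng(3)
reps, n = 100_000, 1000

y = rng.standard_normal(reps)                      # Y ~ N(0, 1), the limit of {Yn}
zn = 2 + rng.standard_normal(reps) / np.sqrt(n)    # Zn -> C = 2 in probability

s = y + zn    # converges in distribution to Y + 2 ~ N(2, 1)
p = y * zn    # converges in distribution to 2Y  ~ N(0, 4)
print(s.mean(), s.var())
print(p.mean(), p.var())
```

The empirical moments of the sum and the product match those of Y + C and CY, just as if Zn had already been replaced by the constant 2.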

We give below a direct proof of these two special but important results.

# Asymptotically equivalent sequences

Two sequences of random vectors {Xn} and {Yn} are said to be asymptotically equivalent if the sequence {Xn - Yn} converges to 0 (in probability).

A result related to Slutsky's theorem states that:

* If {Xn} converges in distribution to X,

* And if {Xn} and {Yn} are asymptotically equivalent,

then

* {Yn} also converges in distribution to X.
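A simulation sketch of this result (the particular distributions below are our own illustrative choices): Yn is built from Xn by adding a perturbation that vanishes in probability, and its empirical moments then match the limit N(0, 1) of {Xn}.

```python
import numpy as np

rng = np.random.default_rng(4)
reps, n = 100_000, 1000

xn = rng.standard_normal(reps)                     # Xn -> X ~ N(0, 1) in distribution
yn = xn + rng.standard_normal(reps) / np.sqrt(n)   # Xn - Yn -> 0 in probability

# The two sequences are asymptotically equivalent:
frac_apart = np.mean(np.abs(xn - yn) > 0.1)   # P(|Xn - Yn| > 0.1), already small
print(frac_apart)

# So {Yn} inherits the limit distribution N(0, 1) of {Xn}:
print(yn.mean(), yn.var())
```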

# Extensions of Slutsky's theorem

The reader will have noticed that Slutsky's theorem refers to convergence in distribution. It can be shown that all of the above still holds if "in distribution" is replaced by "in probability". So, for example, the basic form would now read:

If {Xn} converges in probability to X, and if f(.) is continuous, then {f(Xn)} converges in probability to f(X).

In fact, the result about the convergence of a partitioned vector can then be replaced by the following stronger theorem:

* Let {Xn} be a sequence of random vectors converging in probability (and not just in distribution) to a random vector X,

* And let {Yn} be a sequence of random vectors converging in probability (and not just in distribution) to a random vector Y,

then

* The sequence of random vectors {(Xn, Yn)} converges in probability to the random vector (X, Y).

-----

The results are still true if "convergence in probability" is replaced by "almost sure convergence".
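The stronger partitioned-vector theorem can be sketched by simulation (below, Xn and Yn are constructed, for our illustration, as fixed targets plus noise that shrinks with n); the fraction of realizations where (Xn, Yn) stays far from (X, Y) vanishes as n grows:

```python
import numpy as np

rng = np.random.default_rng(5)
reps = 100_000

x = rng.standard_normal(reps)   # target X
y = rng.standard_normal(reps)   # target Y (no independence from X is required)

fracs = []
for n in (10, 100, 1000):
    xn = x + rng.standard_normal(reps) / n   # Xn -> X in probability
    yn = y + rng.standard_normal(reps) / n   # Yn -> Y in probability
    # Distance between (Xn, Yn) and (X, Y) in the max norm:
    dist = np.maximum(np.abs(xn - x), np.abs(yn - y))
    fracs.append(np.mean(dist > 0.05))       # P(||(Xn, Yn) - (X, Y)|| > 0.05)

print(fracs)   # decreases toward 0 as n grows
```

Note that no joint assumption on (X, Y) is needed: coordinate-wise convergence in probability already controls the joint distance.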

_________________________________________________

Tutorial 1

In this Tutorial, we demonstrate two special cases of Slutsky's theorem.

In both cases, it is assumed that:

* {Xn} is a sequence of random variables converging in distribution to a rv X.

* {Yn} is a sequence of random variables converging in probability to a constant c.

1) The first case states that the sequence {Xn + Yn} converges in distribution to X + c.

2) The second case states that the sequence {Xn.Yn} converges in distribution to cX.

Somewhat unexpectedly, the demonstration of the "." case will turn out to be a bit simpler than that of the "+" case.

TWO SPECIAL CASES OF SLUTSKY'S THEOREM

* Slutsky's theorem for sums
  * Constant chosen to be equal to 0
  * Outline of the proof
  * Lower bound (first term, second term, the lower bound)
  * Upper bound (first term, second term, the upper bound)
  * Limits of the lower and upper bounds
  * Convergence of Yn and end of the proof
* Slutsky's theorem for products
  * Constant chosen to be equal to 0
  * The proof (upper bound, limit of the upper bound)

____________________________________________

Tutorial 2

In this Tutorial, we first examine some examples of applications of Slutsky's theorem, in particular to illustrate the importance of the condition on the continuity of the transformation function f(.): it is not really required that f(.) be continuous, only that the limit variable X lie in the set of continuity points of f(.) with probability 1. This will allow us to exhibit cases where:

* Slutsky's theorem applies, even though f(.) is not continuous.

* Slutsky's theorem does not apply because the above condition is not satisfied. Thus, we'll exhibit an example where the distribution of the limit of a converging sequence of rvs is not equal to the limit of the corresponding sequence of distributions.

In the same vein, we'll give an example of a sequence {(Xn, Yn)} of random vectors with both {Xn} and {Yn} converging in distribution, and yet with the sequence {(Xn, Yn)} not converging to anything. This will stress the importance of {Yn} converging to a constant vector for Slutsky's theorem to apply.

-----

We'll then move on to two fundamental results.

* We'll first show that the sample variance sn² is a consistent estimator of the distribution variance σ², a not altogether surprising result, but one that nevertheless needs to be established on firm ground.

* We'll then show that the limit distribution of the sample standardized mean is the standard normal distribution N(0, 1). Note that this is not quite the Central Limit Theorem, for which the standardization factor involves the true distribution variance σ², not the sample variance sn².

As a special case, it will appear that Student's t distribution converges to N(0, 1) as the sample size grows without limit.

This last result is also established here by showing directly that, as n tends to infinity, the probability density function of Student's distribution converges to the pdf of the standard normal distribution.
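That pointwise convergence of densities can be checked numerically from the closed form of Student's pdf (the sketch below uses log-gamma to avoid overflow for large degrees of freedom; it illustrates the check, it is not the Tutorial's proof):

```python
import math

def t_pdf(x, nu):
    """Density of Student's t with nu degrees of freedom (closed form)."""
    log_c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
             - 0.5 * math.log(nu * math.pi))
    return math.exp(log_c) * (1 + x * x / nu) ** (-(nu + 1) / 2)

def normal_pdf(x):
    """Density of the standard normal N(0, 1)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Largest pointwise gap between the two densities on a grid over [-5, 5]:
for nu in (1, 10, 100, 1000):
    gap = max(abs(t_pdf(x / 10, nu) - normal_pdf(x / 10)) for x in range(-50, 51))
    print(nu, gap)
```

The gap shrinks steadily as the number of degrees of freedom grows, in line with the convergence of Student's t to N(0, 1).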

SLUTSKY'S THEOREM: EXAMPLES

* Examples of applications of Slutsky's theorem
  * Convergence to a Chi-square distribution
  * Sequence of reciprocals of random variables
  * Sequence of ratios, denominator converges to a constant
* Counter-examples
  * Probability of the set of discontinuity points
  * Convergence to a non constant vector
* Sample variance is a consistent estimator
* Student's t converges to standard normal
  * Asymptotic normality of standardized mean: general case
  * Special case: the t distribution

______________________________________________________