
 ESTIMATION AND TESTS :  Ancillary statistic, Bartlett's test (homogeneity of variances), Basu's Theorem, Bias, Bias-variance tradeoff, Complete statistic, Confidence intervals, Cramér-Rao lower bound, Exponential family, Lehmann-Scheffé theorem, Likelihood Ratio Tests (LRT), Maximum Likelihood estimation, Mean Square Error, Minimal sufficient statistic, Rao-Blackwell theorem, Sufficient statistic, Uniformly Minimum Variance Unbiased Estimators (UMVUE), ANOVA (one-way), Chi-square tests, Dunnett's test, Fisher-Irwin test, Goodness-of-fit tests, Kruskal-Wallis test, Mann-Whitney test, Newman-Keuls test, Neyman-Pearson lemma, t tests.

 Tutorial 1 Shortcomings of unbiased estimators :        * Existence.        * Strange behaviors.        * Biased but better estimators.
 Animation Estimation of θ of the uniform distribution U[0, θ] :        * Twice the sample mean.        * UMVUE.        * A lower MSE estimator.
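The three estimators listed above are easy to compare by simulation. A minimal Python sketch (the sample size, number of replications, and variable names are illustrative, not from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 1.0, 5, 200_000
x = rng.uniform(0.0, theta, size=(reps, n))
m = x.max(axis=1)                       # sufficient statistic max(X)

t1 = 2.0 * x.mean(axis=1)               # unbiased : twice the sample mean
t2 = (n + 1) / n * m                    # UMVUE
t3 = (n + 2) / (n + 1) * m              # biased, but lowest MSE among multiples of max(X)

mse = {name: np.mean((t - theta) ** 2)
       for name, t in [("2*mean", t1), ("UMVUE", t2), ("(n+2)/(n+1)*max", t3)]}
print(mse)
```

The theoretical MSEs are θ²/(3n), θ²/(n(n + 2)) and θ²/(n + 1)² respectively, so the simulated values should rank the three estimators in that order.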

 Confidence intervals for means of the normal distribution 1 One sample confidence intervals.    Two sample confidence intervals :       * Paired samples       * Independent samples (variances known, unknown but         equal, unknown and not equal).
 Approximate confidence intervals on means 2 Asymptotic interval  (no demonstration).    Welch's approximation.
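The one-sample interval and Welch's approximation above can be sketched in Python (scipy is assumed only for Student's t quantiles; the data and function names are illustrative):

```python
import numpy as np
from scipy import stats

def t_interval(x, conf=0.95):
    """One-sample confidence interval for the mean, variance unknown."""
    n, m, s = len(x), np.mean(x), np.std(x, ddof=1)
    h = stats.t.ppf(0.5 + conf / 2, df=n - 1) * s / np.sqrt(n)
    return m - h, m + h

def welch_interval(x, y, conf=0.95):
    """Difference of two means, variances unknown and not assumed equal."""
    n1, n2 = len(x), len(y)
    v1, v2 = np.var(x, ddof=1) / n1, np.var(y, ddof=1) / n2
    df = (v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))  # Welch's approximate d.o.f.
    h = stats.t.ppf(0.5 + conf / 2, df=df) * np.sqrt(v1 + v2)
    d = np.mean(x) - np.mean(y)
    return d - h, d + h

rng = np.random.default_rng(1)
x = rng.normal(5.0, 2.0, 25)
y = rng.normal(4.0, 3.0, 40)
print(t_interval(x))
print(welch_interval(x, y))
```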

 Tutorial 1 MSE of a parameter estimator.    Best estimate of a random variable X.    Best estimate of X when a second r.v. Y is available.    Properties of Minimum Mean Square Error estimators.
 Animation MSE of a model :        * As a function of the position of the measurement           point.        * As a function of the model complexity (bias-           variance tradeoff).

 Tutorial 1 Estimating the variance of the normal distribution.    MSE of the corrected (unbiased) sample variance.    MSE of the uncorrected (biased) sample variance.    An even better estimator of the variance.    Comparing the properties of the three estimators.
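The three variance estimators compared above differ only in the divisor applied to the sum of squared deviations: n − 1 (unbiased), n (maximum likelihood), and n + 1 (minimum MSE for the normal distribution). A small Python simulation (sample size and seed are illustrative) recovers their MSE ordering:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps, sigma2 = 5, 200_000, 1.0
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)  # sum of squared deviations

# divisors n-1 (unbiased), n (MLE), n+1 (minimum MSE)
mse = {d: np.mean((ss / d - sigma2) ** 2) for d in (n - 1, n, n + 1)}
print(mse)
```

The theoretical values are 2σ⁴/(n − 1), (2n − 1)σ⁴/n² and 2σ⁴/(n + 1), so the simulated MSEs should decrease as the divisor grows from n − 1 to n + 1.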

 Tutorials 1 A Uniformly Minimum Variance Unbiased Estimator is unique.    An unbiased estimator is a UMVUE if and only if it is uncorrelated with all unbiased estimators of 0.    A UMVUE is a function of any sufficient statistic. 2 Normal distribution :    Mean, variance.    UMVUE of integer powers of the mean, of any power of the standard deviation, of µ / σ.    UMVUE of the distribution function. 3 UMVUEs for other distributions :       * Poisson distribution : squared parameter, analytic function of the parameter.       * Uniform distribution U[0, θ] : parameter θ, any differentiable function of the parameter θ.       * Uniform distribution U[θ, θ + 1] : no function of the parameter admits a UMVUE.
 Tutorials 4 Power series distributions :       * Complete statistic.       * UMVUE of integer powers of the parameter. 5 UMVUEs of :         * Variance of the Bernoulli distribution.         * Mean and variance of the negative binomial            distribution (with the geometric distribution as a            special case).         * Parameter p of the geometric distribution.         * Some integer powers of the parameter p of the            negative binomial distribution (does not apply to the            geometric distribution).

 Tutorials 1 Justification of the definition of a sufficient statistic.    A necessary and sufficient condition for a statistic to be sufficient.    Application to the Bernoulli and to the normal distributions. 2 Sufficient statistics for some distributions :       * Bernoulli b(p),       * Binomial B(n, p),       * Poisson P(λ),       * Uniform U[0, θ],       * Normal N(µ, σ²),       * Truncated exponential exp(θ - x),    from the definition only.
 Tutorials 3 The Factorization Theorem.    Functions of sufficient statistics.    Improving a sufficient statistic. 4 Examples of applications of the Factorization Theorem :       * Bernoulli, uniform (2), Poisson, mean of normal (two methods), variance of normal, mean and variance of normal (bidimensional sufficient statistic), exponential, Gamma (shape and dispersion parameters), Beta. 5 Maximum Likelihood Estimator and Sufficiency.    If a MLE is unique and sufficient, then it is minimal sufficient.

 Tutorial 1 A condition for a sufficient statistic to be minimal sufficient.    Examples of minimal sufficient statistics.

 Tutorial 1 Basic properties of complete statistics.    Examples of complete statistics.

 Tutorial 1 The Lehmann-Scheffé theorem.    Corollary.    Examples of applications of the corollary of the Lehmann-Scheffé theorem.

 Tutorial 1 Examples of ancillary statistics :      * Of location families.       * Of scale families.    A sufficient condition for a statistic to be ancillary.    Basu's Theorem and three examples of applications.

 Tutorials 1 Expectation and variance of the score.    "Basic" Cramér-Rao inequality.    The two operational forms of the Cramér-Rao inequality.   Regularity conditions. 2 A necessary and sufficient condition for the existence    of an efficient estimator.    Variance of the estimator.    Efficient estimator and Maximum Likelihood.   Efficient estimator and Sufficient statistic.
 Tutorial 3 Examples of applications of the Cramér-Rao lower bound :       * Mean and variance of the normal distribution.       * Mean of the exponential distribution.       * Parameter of the Bernoulli distribution.       * Mean of the Poisson distribution.    -----    Parameter of the uniform distribution :       * Cramér-Rao does not apply.       * An unbiased estimator "better" than the CR lower bound.
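For the Poisson case listed above, the sample mean is efficient: its variance attains the Cramér-Rao lower bound λ/n (the per-observation Fisher information is 1/λ). A quick Python check with illustrative values:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 3.0, 50, 100_000
xbar = rng.poisson(lam, size=(reps, n)).mean(axis=1)  # MLE of lam

cr_bound = lam / n          # Cramér-Rao lower bound for unbiased estimators of lam
print(np.var(xbar), cr_bound)   # the two values nearly coincide
```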

 Tutorial 1 The parameter admits a sufficient statistic if and only if     the distribution belongs to the exponential family.     There exists a function of the parameter that can be     efficiently estimated if and only if the distribution     belongs to the exponential family.     -----     Mean and variance of the natural exponential family.     Variance function.
 Tutorial 2 Examples of exponential families :       * Binomial, negative binomial, Poisson, Gamma, Beta,          normal.    For each one, when possible :       * General, canonical and natural forms.       * Canonical statistic. Efficient estimation.

 Tutorial 1 Reducing the variance of a statistic    while preserving its expectation.    The Rao-Blackwell theorem :       * Creating the new statistic.       * Preserving the expectation.       * Reducing the variance.
 Tutorials 2 First example of blackwellization :       * Estimating the probability for X = 0 for a Poisson          distribution with parameter unknown. 3 Second example of blackwellization :       * Estimating the probability for X > t for an exponential          distribution with parameter unknown.
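The first example above can be reproduced numerically. The crude unbiased estimator of P(X = 0) = e^(−λ) is the indicator 1{X₁ = 0}; conditioning on the sufficient statistic S = ΣXᵢ gives the blackwellized estimator ((n − 1)/n)^S. A Python sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, reps = 1.0, 10, 100_000
x = rng.poisson(lam, size=(reps, n))

crude = (x[:, 0] == 0).astype(float)   # unbiased for P(X = 0) = exp(-lam)
s = x.sum(axis=1)                      # sufficient statistic
rb = ((n - 1) / n) ** s                # E[crude | S] : blackwellized estimator

print(crude.mean(), rb.mean(), np.exp(-lam))  # all close to exp(-1)
print(crude.var(), rb.var())                  # the variance drops sharply
```

Both estimators are unbiased; Rao-Blackwell guarantees the conditional one has no larger variance, and here the reduction is dramatic.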

 Tutorials 1 A Maximum Likelihood estimator is consistent. 2 A Maximum Likelihood estimator is :       * Asymptotically normal.       * Asymptotically efficient.
 Tutorial 3 The MLE of a transformed parameter is the transformed     MLE of the parameter.

__________________________________________________________________________________________________

 Tutorial 1 Demonstration of the Neyman-Pearson lemma.    First consequences on :       * Power and significance level.       * Probabilities of Type I and Type II errors.
 Tutorial 2 Mean of normal distributions.     Location parameter of the Cauchy distribution.     Example in which no parameters are involved.     Neyman-Pearson and sufficient statistic :          * New expression of the likelihood ratio.          * Mean of normal distribution revisited.

 Tutorials 1 Examples of Likelihood Ratio Tests :       * Mean of normal distribution, variance is known.       * Mean of normal distribution, variance is unknown.       * Variance of normal distribution, mean is unknown. 2 A LRT on the normal distribution with no classical equivalent.    Transformation of the test statistic.    Pathological behavior of the distribution of the test statistic. 3 More examples of Likelihood Ratio Tests :       * Parameter of the uniform distribution.       * Parameter of the shifted exponential distribution.       * Identity test on exponential distributions. 4 Asymptotic distribution of the test statistic :       * Example : the Poisson distribution.       * General case : asymptotic Chi-square distribution.    Likelihood Ratio Test and sufficient statistic.
 Tutorials 5 Goodness-of-fit LRT for the multinomial distribution.    Asymptotic equivalence with the Chi-square test.
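For the first example above (mean of a normal distribution, variance known), the LRT statistic reduces exactly to the squared z statistic, −2 log Λ = n(x̄ − μ₀)²/σ², which is χ²(1) under H₀. A short Python verification on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(5)
mu0, sigma, n = 0.0, 1.0, 30
x = rng.normal(0.2, sigma, n)

def loglik(mu):
    """Log-likelihood of N(mu, sigma^2), sigma known."""
    return -0.5 * np.sum((x - mu) ** 2) / sigma**2 - n * np.log(sigma * np.sqrt(2 * np.pi))

xbar = x.mean()                                # MLE of mu
lrt = -2 * (loglik(mu0) - loglik(xbar))        # -2 log(Lambda)
z2 = n * (xbar - mu0) ** 2 / sigma**2
print(lrt, z2)                                 # identical up to rounding
```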

 Animation Identity Likelihood Ratio Test on exponential distributions.    Asymptotic distribution of the test statistic.

 Animation Comparison of the behaviors and performances of :        * Pearson's Chi-square statistic,        * And Wilks' G² statistic.

_________________________________________________________________________________________________

PARAMETRIC TESTS

 Tutorials 1 Reminder : the goal of ANOVA.    The principle of ANOVA. 2 Variance decomposition.    Total Sum of Squares (SST) :       * Decomposition of SST.       * Factorial Sum of Squares (SSF).       * Residual Sum of Squares (SSR).    The "variance decomposition" equation.

 Tutorials 3 Total Sum of Squares :       * Distribution.    Residual Sum of Squares :       * Distribution.    A premature attempt.    Factorial Sum of Squares (no demonstration). 4 The ANOVA statistic :       * Fisher's F statistic.       * Mean Squares.    The F test.    ANOVA table.
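The decomposition and F statistic outlined above can be checked numerically; a Python sketch with made-up group data (scipy's f_oneway is used only as a cross-check):

```python
import numpy as np
from scipy import stats

groups = [np.array([4.2, 5.0, 4.8, 5.3]),
          np.array([6.1, 5.8, 6.4]),
          np.array([5.0, 4.6, 5.2, 5.5, 4.9])]

grand = np.concatenate(groups).mean()
N, k = sum(len(g) for g in groups), len(groups)

ssf = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)  # factorial (between groups)
ssr = sum(((g - g.mean()) ** 2).sum() for g in groups)       # residual (within groups)
sst = sum(((g - grand) ** 2).sum() for g in groups)          # total : SST = SSF + SSR

f = (ssf / (k - 1)) / (ssr / (N - k))                        # Fisher's F (ratio of mean squares)
print(sst, ssf + ssr, f)
```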

 Tutorials 1 What does confidence depend on ?    The T statistic.    The assumptions.    Variance known or unknown.    Student's t distribution.    Degrees of freedom. 2 The "Reference value" t test.    The "Paired samples" t test.    The "Independent samples" t test.
 Tutorial 3 Reading the results of a t test :       * Standard error.       * Degrees of freedom.       * Significance and p-value.
 Animation Distribution of the test statistic in the case of two independent samples.
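The "independent samples" t test with equal variances can be written out in a few lines; a Python sketch with made-up data (scipy's ttest_ind serves only as a cross-check):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = rng.normal(10.0, 2.0, 12)
y = rng.normal(11.5, 2.0, 15)
n1, n2 = len(x), len(y)

# pooled variance, assuming equal (unknown) variances
sp2 = ((n1 - 1) * x.var(ddof=1) + (n2 - 1) * y.var(ddof=1)) / (n1 + n2 - 2)
t = (x.mean() - y.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))
df = n1 + n2 - 2
p = 2 * stats.t.sf(abs(t), df)              # two-sided p-value
print(t, df, p)
```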

 Tutorial 1 Bartlett's test is a Likelihood Ratio Test.    Improvements of the standard LRT statistic.

 Tutorial and case study 1 The goal of Dunnett's test.    Conditions of use.    Dunnett's test :       * Principle of the test.       * Dunnett's statistic.       * Dunnett's table of critical values.       * Special case : equal group sizes.    Case study.

 Tutorial 1 Testing the identity of two Bernoulli populations.    The statistic of the Fisher-Irwin test is hypergeometric.

 Tutorial 1 Conditions of use    The Newman-Keuls statistic.    Distribution of the Newman-Keuls statistic.    Large samples and normal approximation.    Examples.

__________________________________________________

NON PARAMETRIC TESTS

 Tutorials 1 The Chi-square test of independence.    Functional relationship between two variables.    Contributions to the Chi-square statistic. 2 Other Chi-square tests :       * Symmetry of a joint probability distribution.       * Identity of the marginal distributions of a joint probability distribution.       * Identity of the distributions of several populations.
 Tutorial 3 2x2 tables :       * Special form of the Chi-square statistic.       * Exact tests :            - Fisher-Irwin.            - Fisher's exact test.       * McNemar test.
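The test of independence listed above amounts to comparing observed counts with the counts expected under independence. A Python sketch on a made-up 2x2 table (scipy's chi2_contingency as cross-check, Yates correction disabled):

```python
import numpy as np
from scipy import stats

obs = np.array([[30, 10],
                [20, 40]])                        # observed 2x2 contingency table

row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
exp = row * col / obs.sum()                       # expected counts under independence
chi2 = ((obs - exp) ** 2 / exp).sum()             # Pearson's Chi-square statistic
df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
p = stats.chi2.sf(chi2, df)
print(chi2, df, p)
```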

 Tutorial 1 The distribution of the Kolmogorov statistic is distribution free.    The Cramér-von Mises statistic can be expressed as a sum.   The Anderson-Darling statistic can be expressed as a sum.

 Tutorial 1 The Kruskal-Wallis statistic.    Rationale of the test.    The two forms of the Kruskal-Wallis statistic.    The Chi-square approximation.    Two examples (small and large samples).
 Tutorial 2 Ties.    The influence of ties on the Kruskal-Wallis statistic.    -----    Multiple comparisons between treatments.    Multiple comparisons with a reference group.
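The second form of the Kruskal-Wallis statistic, H = 12/(N(N + 1)) · Σ Rᵢ²/nᵢ − 3(N + 1), is easy to compute directly. A Python sketch on made-up tie-free data (scipy's kruskal as cross-check):

```python
import numpy as np
from scipy import stats

g1 = np.array([2.9, 3.0, 2.5, 2.6, 3.2])
g2 = np.array([3.8, 2.7, 4.0, 2.4])
g3 = np.array([2.8, 3.4, 3.7, 2.2, 2.0])
groups = [g1, g2, g3]

pooled = np.concatenate(groups)
N = len(pooled)
ranks = stats.rankdata(pooled)                    # no ties in this example

h, start = 0.0, 0
for g in groups:                                  # accumulate R_i^2 / n_i over the groups
    r = ranks[start:start + len(g)].sum()
    h += r**2 / len(g)
    start += len(g)
h = 12.0 / (N * (N + 1)) * h - 3.0 * (N + 1)
print(h)
```

With no ties, scipy's tie-corrected statistic coincides with this direct computation.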

 Tutorial 1 Rationale of the test.     The Wilcoxon and the Mann-Whitney statistics.     Distribution of the Wilcoxon statistic.     Large samples and normal approximation.     Examples.
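The two equivalent statistics named above are linked by U = W − n₁(n₁ + 1)/2, where W is the sum of the ranks of the first sample in the pooled data and U counts the pairs with xᵢ > yⱼ. A Python sketch on made-up tie-free data (scipy's mannwhitneyu as cross-check):

```python
import numpy as np
from scipy import stats

x = np.array([1.2, 3.4, 2.2, 5.1, 4.0])
y = np.array([0.8, 2.5, 1.9, 3.0])

u = sum(xi > yj for xi in x for yj in y)          # Mann-Whitney U : count of pairs x_i > y_j

ranks = stats.rankdata(np.concatenate([x, y]))    # ranks in the pooled sample (no ties)
w = ranks[:len(x)].sum()                          # Wilcoxon rank-sum statistic for x
print(u, w, w - len(x) * (len(x) + 1) / 2)        # last value equals U
```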

___________________________________________________________________________________