Negative binomial distribution

Definition of the negative binomial distribution

You're playing Heads or Tails. You decide to play until "Heads" has turned up exactly k times, and then stop the game. How many times do you have to toss the coin?

If, after the first game, you decide to play another one, you certainly expect the number of tosses needed to win this second game to be different from the number of tosses needed to win the first one. So the number of tosses needed to win a game is a random variable, whose distribution is known as the negative binomial distribution.

The geometric distribution (the distribution of the number of tosses needed to obtain the first "Heads") therefore appears as the special case of the negative binomial distribution for k = 1.

A negative binomial distribution is defined by two parameters:

* p, the probability for the coin to land on Heads,

* k, the number of "Heads" you want to get before stopping the game. This parameter is sometimes known as the "size" of the distribution.
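As a quick illustration, here is a minimal Python sketch of this coin-tossing experiment (the function name `tosses_until_k_heads` is our own, not part of any library):

```python
import random

def tosses_until_k_heads(p, k, rng):
    """Toss a coin with P(Heads) = p until the k-th Heads appears.

    Returns the total number of tosses: one draw from the
    negative binomial distribution with parameters p and k."""
    tosses = heads = 0
    while heads < k:
        tosses += 1
        if rng.random() < p:
            heads += 1
    return tosses

rng = random.Random(42)
draws = [tosses_until_k_heads(0.5, 3, rng) for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to the mean k/p = 6
print(min(draws))               # never less than k = 3
```

Repeating the experiment many times, as above, builds up an empirical picture of the distribution.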

We also give below a slightly different definition of the negative binomial distribution, with somewhat different properties.

Animation

This animation illustrates the Negative Binomial distribution.


Reminder

In this animation, the definition of the Negative Binomial distribution is as follows:

* "Distribution of the number of tosses needed to obtain k "Heads"",

and not:

* "Distribution of the number of "Tails" before k "Heads" have been obtained".

The probability p

With your mouse, slide the boundary between the white and the gray areas of the upper rectangle. By so doing, you change the value of the probability p, which is equal to the ratio of the length of the white area to that of the whole rectangle. The overall shape of the distribution changes with p:

* For low values of p, the distribution gets wider and flatter. The mode and the mean move to the right (larger values), and the distribution looks like a (discrete) bell-shaped curve that becomes more and more symmetrical as p gets lower.

* For larger values of p, the distribution shrinks to the left, the mode gets higher, and the distribution gets more and more skewed. For a certain value of p (can you determine which one?), the first two non-zero positions have the same height. For values of p larger than this limit value, the distribution is always decreasing.

Lower frame

The lower frame shows the Negative Binomial distribution for the probability p and the size k you have selected. Note that the first k − 1 positions are empty, as at least k tosses are needed to obtain k "Heads". The blue vertical line is the distribution mean. You may change k with the "Size" buttons. As k gets larger:

* The number of empty positions obviously gets larger.

* The mode of the distribution shifts to the right and becomes lower.

* The distribution looks more and more like a (discretized) gaussian.

For k = 1, the distribution is the geometric distribution.

Animation

Click on "Go" and observe the build-up of the histogram of the Negative Binomial distribution NB(p, k). Click on "Pause", then on "Next". A sample is built up toss after toss. The process goes on as long as the number of red points ("Heads") is less than k. It stops right after the k-th red point has been drawn. Click on "Next" again: a new sample is drawn, and so on. Click on "Resume" to launch the animation again.

Basic properties of the negative binomial distribution

Here are some basic properties of the negative binomial distribution.

Probability mass function

The probability Pk{X = n} of having to toss the coin n times before obtaining k "Heads" is:

Pk{X = n} = C(n − 1, k − 1) p^k q^(n − k)

with n = k, k + 1, k + 2, ...

where q = 1 − p and where:

C(A, B) = A! / (B! (A − B)!)

is the number of combinations of B objects among A.

-----

Note that for k = 1, one obtains the probability mass function of the geometric distribution.
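The probability mass function is straightforward to compute; a Python sketch (the helper `nbinom_pmf` is our own name), which also checks the geometric special case:

```python
from math import comb

def nbinom_pmf(n, k, p):
    """P{X = n}: the k-th Heads occurs exactly on toss n."""
    return comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)

p = 0.3

# For k = 1 the formula reduces to the geometric pmf p * q**(n - 1):
assert abs(nbinom_pmf(5, 1, p) - p * (1 - p)**4) < 1e-12

# The probabilities over the support n = k, k + 1, ... sum to 1
# (the sum is truncated; the neglected tail is negligible):
total = sum(nbinom_pmf(n, 3, p) for n in range(3, 500))
print(total)  # ≈ 1.0
```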

Mean

We have:

µ = E(X) = k/p

which is just k times the mean 1/p of the geometric distribution for the same value of p.

-----

We calculate here :

* The UMVUE of the mean k/p of the negative binomial distribution,

* As well as the UMVUE of p^r, with r a positive integer in a limited range.

Variance

We have:

σ² = Var(X) = kq/p²

which is just k times the variance q/p² of the geometric distribution for the same value of p.
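Both the mean and the variance formulas can be checked numerically against the probability mass function; a Python sketch (helper name is our own):

```python
from math import comb

def nbinom_pmf(n, k, p):
    """P{X = n} = C(n-1, k-1) p^k q^(n-k)."""
    return comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)

k, p = 4, 0.4
q = 1 - p
support = range(k, 400)  # truncated support; the neglected tail is negligible

mean = sum(n * nbinom_pmf(n, k, p) for n in support)
var = sum(n**2 * nbinom_pmf(n, k, p) for n in support) - mean**2

print(mean, k / p)        # both ≈ 10.0
print(var, k * q / p**2)  # both ≈ 15.0
```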

-----

We calculate here the UMVUE of the variance of the negative binomial distribution.

_________________________

The simple relationships between the basic parameters of the geometric and the negative binomial distributions are justified below.

Negative binomial variables as sum of independent geometric variables

Let G1, G2, ..., Gk be k independent geometric variables, all with the same value of the parameter p. Denote by L the sum of these variables:

L = G1 + G2 + ... + Gk

We'll show that L follows a negative binomial distribution of parameter p and size k.

This provides a simple explanation for the values of the mean and variance of the negative binomial distribution.
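This characterization is easy to verify by simulation; a Python sketch (the helper `geometric_draw` is our own name):

```python
import random

def geometric_draw(p, rng):
    """Number of tosses until the first Heads (geometric, support 1, 2, ...)."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(1)
k, p = 3, 0.5
q = 1 - p

# Each draw of L is the sum of k independent geometric variables:
draws = [sum(geometric_draw(p, rng) for _ in range(k)) for _ in range(100_000)]

m = sum(draws) / len(draws)
v = sum(d * d for d in draws) / len(draws) - m * m
print(m, k / p)         # both ≈ 6: the negative binomial mean
print(v, k * q / p**2)  # both ≈ 6: the negative binomial variance
```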

Negative binomial distribution and exponential families

We show here that the family of negative binomial distributions NBk(p), for a given value of k, forms a 1-parameter exponential family. As a consequence :

1) The statistic T = Σi Xi is minimal sufficient and complete for the parameter p.

2) The distribution mean µ is efficiently estimated by the sample mean.

Alternative definition of the negative binomial distribution

Another definition of the negative binomial distribution may be used, depending on the problem at hand.

Suppose Head is considered as a "success", while Tail is considered as a "failure". Then you may ask: "How many failures will have to be overcome before finally getting k successes?". The random variable is then F, the number of failures (Tails) in a winning game. A winning game is now made of f failures and k successes, for a total of n = k + f tosses, with the last toss being a success. We then have:

P{F = f} = C(k + f − 1, f) p^k q^f

with f = 0, 1, 2, ...

The mean is:

E(F) = kq/p

This value is smaller by a factor q than the mean k/p of the first definition.

We also have:

Var(F) = kq/p²

The variance has the same value as for the first definition.

Note that the variance is now always larger than the mean (by a factor 1/p). A distribution with such a property is said to be "overdispersed". This result is to be compared with that for the Poisson distribution, whose mean λ is always equal to its variance: the Poisson distribution thus appears as a "limit case" of overdispersion.
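Under this second definition, the pmf, mean, and variance can again be checked numerically; a Python sketch (the helper name is our own — note that this failure-count convention is also the one used by `scipy.stats.nbinom`):

```python
from math import comb

def nbinom_failures_pmf(f, k, p):
    """P{F = f}: exactly f Tails occur before the k-th Heads."""
    q = 1 - p
    return comb(k + f - 1, f) * p**k * q**f

k, p = 3, 0.25
q = 1 - p
support = range(0, 2000)  # truncated support; the neglected tail is negligible

mean = sum(f * nbinom_failures_pmf(f, k, p) for f in support)
var = sum(f * f * nbinom_failures_pmf(f, k, p) for f in support) - mean**2

print(mean, k * q / p)    # both ≈ 9.0
print(var, k * q / p**2)  # both ≈ 36.0
print(var / mean)         # ≈ 1/p = 4: the overdispersion factor
```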

These basic results relative to this second definition are not demonstrated in the Tutorial below, but they can easily be obtained by following exactly the same path that we use for demonstrating the results relative to the first definition.

____________________________________________________________

 Tutorial

Here is the Table of Content of the Tutorial on the Negative Binomial Distribution.

THE NEGATIVE BINOMIAL DISTRIBUTION

Probability mass function
    Why "negative binomial"?
    Taking the derivatives of the Maclaurin expansion
        First derivative
        Second derivative
Mean µ
    Mean of the translated distribution
    Mean of the original distribution
Variance σ²
Sum of independent geometric variables
    Additivity
Moment generating function M(t)
    The mgf
    Mean
    Variance
TUTORIAL

____________________________________________________________

Geometric distribution
Poisson distribution