Discrete Uniform Distribution
If a random variable X assumes the values x_1, x_2, \ldots, x_k with equal probability, then X follows a discrete uniform distribution. The probability mass function of X is then given by:
f(x) = \frac{1}{k}, \quad x = x_1, x_2, \ldots, x_k
Theorem 2
Suppose X follows the discrete uniform distribution with R_{X} = \{x_1, \ldots, x_k\}. The expectation of X is then given by:
\mu = E(X) = \frac{1}{k} \sum_{i=1}^{k} x_i
The variance of X is then given by:
\sigma^2 = Var(X) = \frac{1}{k} \sum_{i=1}^{k} (x_i - \mu)^2
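The formulas in Theorem 2 can be checked numerically; a minimal Python sketch (the R snippets in these notes only cover built-in probability lookups) for a fair six-sided die, a discrete uniform on \{1, \ldots, 6\}:

```python
# Mean and variance of a discrete uniform random variable (fair six-sided die).
values = [1, 2, 3, 4, 5, 6]
k = len(values)

mu = sum(values) / k                          # E(X) = (1/k) * sum of x_i
var = sum((x - mu) ** 2 for x in values) / k  # Var(X) = (1/k) * sum of (x_i - mu)^2

print(mu)   # 3.5
print(var)  # 35/12, about 2.9167
```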
Bernoulli Trials
Bernoulli trial
A random experiment with only two possible outcomes.
The outcomes are generally coded:
- 0: failure
- 1: success
Bernoulli Random Variable
Let X be the number of successes in a Bernoulli trial. Then X has only two possible values, 0 or 1, and is called a Bernoulli random variable. Let p denote the probability of success for a Bernoulli trial. Then X has the probability mass function:
f(0) = 1 - p, \quad f(1) = p
This can also be written:
f(x) = p^x (1-p)^{1-x}, \quad x = 0, 1
Notation
We denote a Bernoulli random variable by X \sim Bernoulli(p), and write q = 1 - p. Then, the probability mass function becomes:
f(x) = p^x q^{1-x}, \quad x = 0, 1
Theorem 5
For a Bernoulli random variable X,
E(X) = p, \quad Var(X) = p(1-p) = pq
In certain instances, the variance is written as pq, with q = 1 - p denoting the probability of failure. Thus, for the Bernoulli distribution, the parameter is p.
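Theorem 5 can be verified directly from the definitions of expectation and variance; a minimal Python sketch with an illustrative value p = 0.3:

```python
# Bernoulli(p) pmf and moments, computed directly from the definition.
p = 0.3
pmf = {0: 1 - p, 1: p}  # f(x) = p^x (1-p)^(1-x), x = 0, 1

mean = sum(x * pmf[x] for x in pmf)               # should equal p
var = sum((x - mean) ** 2 * pmf[x] for x in pmf)  # should equal p(1-p)

print(mean)  # 0.3
print(var)   # about 0.21
```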
Bernoulli Process
A sequence of repeatedly performed independent and identical Bernoulli trials.
Generates a sequence of independent and identically distributed Bernoulli random variables.
Binomial Distribution
Binomial Random Variable
A binomial random variable counts the number of successes in n trials of a Bernoulli process. Suppose we have n trials where:
- the probability of success for each trial is p
- the trials are independent.
Then, the number of successes, denoted by X, has a binomial distribution, X \sim Bin(n, p). The probability of getting exactly x successes is given by:
f(x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad x = 0, 1, \ldots, n
It can be shown that E(X) = np and Var(X) = np(1-p).
Remark
When n = 1, the probability mass function for the binomial random variable reduces to that of a Bernoulli random variable:
f(x) = p^x (1-p)^{1-x}, \quad x = 0, 1
# computing binoms in R
pbinom(x, n, p) #P(X <= x);
pbinom(x, n, p, lower.tail = F) #P(X > x);
dbinom(x, n, p) #P(X = x)
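The binomial pmf can also be computed from scratch; a minimal Python sketch mirroring what R's dbinom returns, with illustrative values n = 10, p = 0.3:

```python
from math import comb

# Binomial pmf from the formula f(x) = C(n, x) * p^x * (1-p)^(n-x).
def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

print(binom_pmf(3, 10, 0.3))  # P(X = 3), about 0.2668

# Sanity check: the pmf sums to 1 over x = 0, 1, ..., n.
print(sum(binom_pmf(x, 10, 0.3) for x in range(11)))
```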
Negative Binomial Distribution
Consider a Bernoulli process, where the variable of interest is the number of trials needed until the k-th success occurs.
Negative Binomial Distribution
The random variable X is defined as the number of independent and identically distributed Bernoulli(p) trials needed until the k-th success occurs. Then, X follows a Negative Binomial distribution, denoted by X \sim NB(k, p). The probability mass function of X is given by:
f(x) = \binom{x-1}{k-1} p^k (1-p)^{x-k}, \quad x = k, k+1, k+2, \ldots
Then,
E(X) = \frac{k}{p}, \quad Var(X) = \frac{k(1-p)}{p^2}
The derivation of the probability mass function is as follows. The event \{X = x\} occurs if and only if:
- There are k - 1 successes in the first x - 1 trials (define this event A)
- There is a success on the x-th trial (define this event B)
Then, by independence of the trials,
P(X = x) = P(A)P(B) = \binom{x-1}{k-1} p^{k-1} (1-p)^{x-k} \cdot p = \binom{x-1}{k-1} p^k (1-p)^{x-k}
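The derived pmf can be checked numerically; a minimal Python sketch with illustrative values k = 3, p = 0.5, also confirming E(X) = k/p by a truncated sum:

```python
from math import comb

# Negative binomial pmf from the derivation:
# P(X = x) = C(x-1, k-1) * p^k * (1-p)^(x-k), for x = k, k+1, ...
def nbinom_pmf(x, k, p):
    return comb(x - 1, k - 1) * p ** k * (1 - p) ** (x - k)

# Probability the 3rd success lands exactly on trial 5, with p = 0.5:
print(nbinom_pmf(5, 3, 0.5))  # C(4,2) * 0.5^5 = 6/32 = 0.1875

# Mean check: E(X) = k/p = 6 (sum truncated, so only approximate).
approx_mean = sum(x * nbinom_pmf(x, 3, 0.5) for x in range(3, 200))
print(approx_mean)  # about 6.0
```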
# computing negative binoms in R
# R's *nbinom functions count failures before the k-th success, hence x - k
dnbinom(x - k, k, p) #P(X = x)
pnbinom(x - k, k, p) #P(X <= x)
pnbinom(x - k, k, p, lower.tail = F) #P(X > x)
Geometric Distribution
Consider a Bernoulli process where the random variable of interest is the number of Bernoulli(p) trials needed until the first success occurs.
Geometric distribution
Let X be the number of independent and identically distributed Bernoulli(p) trials needed until the first success occurs. Then X follows a Geometric distribution, denoted by X \sim Geom(p). The probability mass function of X is given by:
f(x) = (1-p)^{x-1} p, \quad x = 1, 2, 3, \ldots
It can then be shown that:
E(X) = \frac{1}{p}, \quad Var(X) = \frac{1-p}{p^2}
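Again the pmf and mean can be checked numerically; a minimal Python sketch with an illustrative value p = 0.2:

```python
# Geometric pmf: f(x) = (1-p)^(x-1) * p, for x = 1, 2, 3, ...
def geom_pmf(x, p):
    return (1 - p) ** (x - 1) * p

p = 0.2
print(geom_pmf(3, p))  # 0.8^2 * 0.2, about 0.128

# Mean check: E(X) = 1/p = 5 (sum truncated, so only approximate).
approx_mean = sum(x * geom_pmf(x, p) for x in range(1, 500))
print(approx_mean)  # about 5.0
```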
Poisson Distribution
Poisson random variable
The Poisson random variable X denotes the number of events occurring in a fixed period of time or fixed region. X \sim Poisson(\lambda) is used to denote the distribution, where \lambda is the expected number of occurrences during the given period/region. The probability mass function is:
f(x) = \frac{e^{-\lambda} \lambda^x}{x!}, \quad x = 0, 1, 2, \ldots
It can be shown that E(X) = Var(X) = \lambda.
Poisson process
A continuous time process, where the number of occurrences within a given interval of time is counted.
The defining properties, with rate parameter \lambda:
- the expected number of occurrences in an interval of length T is \lambda T
- there are no simultaneous occurrences
- the numbers of occurrences in disjoint time intervals are independent
The number of occurrences in an interval of length T then follows a Poisson(\lambda T) distribution.
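The Poisson(\lambda T) count for an interval can be evaluated directly from the pmf; a minimal Python sketch with an illustrative rate \lambda = 3 per hour over T = 2 hours:

```python
from math import exp, factorial

# Poisson pmf: f(x) = exp(-lam) * lam^x / x!
def pois_pmf(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

# Rate 3 per hour, interval length T = 2 hours -> Poisson(lambda * T) = Poisson(6).
print(pois_pmf(4, 3 * 2))  # P(X = 4), about 0.1339
```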
Poisson Approximation to Binomial
The Poisson random variable can be used to approximate a binomial random variable under certain conditions.
Poisson Approximation to Binomial
Let X \sim Bin(n, p). Suppose that n \to \infty and p \to 0 such that \lambda = np remains constant. Then, approximately, X \sim Poisson(np). The approximation is good when:
n \geq 20, p \leq 0.05 or n \geq 100, np \leq 10
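The quality of the approximation can be seen side by side; a minimal Python sketch with n = 100, p = 0.02 (inside the stated region, so \lambda = np = 2):

```python
from math import comb, exp, factorial

# Compare exact binomial probabilities with the Poisson approximation
# for n = 100, p = 0.02 (lambda = np = 2).
n, p = 100, 0.02
lam = n * p

for x in range(5):
    exact = comb(n, x) * p ** x * (1 - p) ** (n - x)   # Bin(n, p) pmf
    approx = exp(-lam) * lam ** x / factorial(x)       # Poisson(lambda) pmf
    print(x, round(exact, 4), round(approx, 4))
```

The two columns agree to roughly three decimal places across these values of x.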
# computing poisson probabilities
dpois(x, lambda) # P(X = x)
ppois(x, lambda) # P(X <= x)
ppois(x, lambda, lower.tail = F) # P(X > x)