Conditional Expectation: 7 Facts You Should Know


Random variables that depend on one another require the calculation of conditional probabilities, which we have already discussed. Now we discuss some further parameters for such random variables, namely the conditional expectation and the conditional variance, for the different types of random variables.

Conditional Expectation

The conditional probability mass function of a discrete random variable X given Y is

p_{X|Y}(x|y) = P(X = x \mid Y = y) = \frac{p(x,y)}{p_Y(y)}

where p_Y(y) > 0, so the conditional expectation of the discrete random variable X given Y = y, when p_Y(y) > 0, is

E[X \mid Y = y] = \sum_{x} x \, p_{X|Y}(x|y)

in which the probability used in the expectation is the conditional probability.

Similarly, if X and Y are continuous, the conditional probability density function of the random variable X given Y is

f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)}

where f(x, y) is the joint probability density function and f_Y(y) > 0, so the conditional expectation of the random variable X given Y = y is

E[X \mid Y = y] = \int_{-\infty}^{\infty} x \, f_{X|Y}(x|y) \, dx

for all y with f_Y(y) > 0.

Just as all the properties of probability apply to conditional probability, all the properties of mathematical expectation are satisfied by conditional expectation. For example, the conditional expectation of a function of a random variable is

E[g(X) \mid Y = y] = \sum_{x} g(x)\, p_{X|Y}(x|y)

in the discrete case (with the sum replaced by an integral in the continuous case), and the conditional expectation of a sum of random variables is

E\left[\sum_{i=1}^{n} X_i \,\middle|\, Y = y\right] = \sum_{i=1}^{n} E[X_i \mid Y = y]

Conditional Expectation for the sum of binomial random variables

To find the conditional expectation of the sum of independent binomial random variables X and Y, each with parameters n and p, recall that X + Y is again a binomial random variable, with parameters 2n and p. For the random variable X given X + Y = m, the conditional expectation is obtained by first calculating the probability

P(X = k \mid X + Y = m) = \frac{P(X = k)\,P(Y = m-k)}{P(X + Y = m)} = \frac{\binom{n}{k}\binom{n}{m-k}}{\binom{2n}{m}}

since we know that this is a hypergeometric distribution, whose mean is m/2; thus the conditional expectation of X given X + Y = m is

E[X \mid X + Y = m] = \frac{m}{2}
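As a quick sanity check, the result E[X | X + Y = m] = m/2 can be verified numerically. The following is a minimal simulation sketch; the sample size and parameter values are arbitrary choices made for illustration, not taken from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 10, 0.3, 8          # illustrative parameters
trials = 200_000

X = rng.binomial(n, p, trials)
Y = rng.binomial(n, p, trials)

# keep only the trials where the sum equals m and average X over them
mask = (X + Y) == m
print("estimated E[X | X+Y=m]:", X[mask].mean())   # close to m/2 = 4.0
```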

Example:

Find the conditional expectation

image 8

if the joint probability density function of continuous random variables X and Y is given as

image 9

Solution:

To calculate the conditional expectation we require conditional probability density function, so

image 10

since for the continuous random variable the conditional expectation is

image 11

hence for the given density function the conditional expectation would be

image 12

Expectation by conditioning

We can calculate the mathematical expectation of X with the help of the conditional expectation of X given Y as

E[X] = E\big[\,E[X \mid Y]\,\big]

For discrete random variables this reads

E[X] = \sum_{y} E[X \mid Y = y]\, P(Y = y)

which can be obtained as

\sum_{y} E[X \mid Y = y]\, P(Y = y) = \sum_{y}\sum_{x} x\, P(X = x \mid Y = y)\, P(Y = y) = \sum_{x} x \sum_{y} P(X = x, Y = y) = \sum_{x} x\, P(X = x) = E[X]

and for continuous random variables we can similarly show

E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\, dy

Example:

A person is trapped underground in his building because the entrance is blocked by a heavy load. Fortunately there are three pipelines through which he can try to get out: the first pipe takes him safely outside after 3 hours, the second brings him back to the same spot after 5 hours, and the third brings him back after 7 hours. If each time he chooses any of these pipelines equally likely, what is the expected time until he gets outside safely?

Solution:

Let X be the random variable denoting the time in hours until the person gets out safely, and let Y denote the pipe he chooses initially. Since each pipe is chosen with probability 1/3,

E[X] = \frac{1}{3}\Big( E[X \mid Y=1] + E[X \mid Y=2] + E[X \mid Y=3] \Big)

Since the first pipe leads directly to safety,

E[X \mid Y = 1] = 3

If the person chooses the second pipe, he spends 5 hours in it and returns to his starting point, so from that moment his expected remaining time is again E[X]; similarly for the third pipe:

E[X \mid Y = 2] = 5 + E[X], \qquad E[X \mid Y = 3] = 7 + E[X]

so the expectation satisfies

E[X] = \frac{1}{3}\big( 3 + 5 + E[X] + 7 + E[X] \big) \implies E[X] = 15
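A brief simulation confirms the answer of 15 hours; this sketch assumes the three-pipe setup described above (pipe 1 exits in 3 hours, pipes 2 and 3 return after 5 and 7 hours).

```python
import random

def escape_time():
    """Simulate one escape: pipe 1 exits, pipes 2 and 3 return to the start."""
    total = 0.0
    while True:
        pipe = random.choice([1, 2, 3])
        if pipe == 1:
            return total + 3
        total += 5 if pipe == 2 else 7

random.seed(1)
samples = [escape_time() for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to the theoretical value 15
```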

Expectation of sum of random number of random variables using conditional expectation

Let N be a random number of random variables and let the sum of the random variables be \sum_{i=1}^{N} X_i, where the X_i are independent and identically distributed and independent of N; then the expectation is

E\left[\sum_{i=1}^{N} X_i\right] = E\!\left[ E\!\left[\sum_{i=1}^{N} X_i \,\middle|\, N \right] \right]

since

E\left[\sum_{i=1}^{N} X_i \,\middle|\, N = n\right] = E\left[\sum_{i=1}^{n} X_i\right] = n\, E[X]

as the X_i are independent of N; thus

E\left[\sum_{i=1}^{N} X_i\right] = E\big[N\, E[X]\big] = E[N]\, E[X]
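To illustrate the identity E[Σ X_i] = E[N] E[X], here is a small simulation sketch; taking N Poisson and the X_i exponential is an assumption made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
lam, mu = 4.0, 2.0              # E[N] = 4, E[X] = 2 (illustrative values)
trials = 100_000

totals = np.empty(trials)
for t in range(trials):
    n = rng.poisson(lam)                      # random number of summands
    totals[t] = rng.exponential(mu, n).sum()  # sum of n iid exponentials

print(totals.mean())      # close to E[N] * E[X] = 8
```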

Correlation of bivariate distribution

If the probability density function of the bivariate random variable X and Y is

image 23

where

image 24

then the correlation between the random variables X and Y for this bivariate distribution is found as follows. The correlation is defined as

\operatorname{Corr}(X,Y) = \frac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}} = \frac{E[XY] - E[X]E[Y]}{\sigma_x \sigma_y}

since the expectation E[XY] can be computed using conditional expectation as

E[XY] = E\big[\,E[XY \mid Y]\,\big]

and for the bivariate normal distribution the conditional distribution of X given Y = y has mean

E[X \mid Y = y] = \mu_x + \rho\,\frac{\sigma_x}{\sigma_y}\,(y - \mu_y)

now the expectation of XY given Y is

E[XY \mid Y] = Y\, E[X \mid Y] = Y\left(\mu_x + \rho\,\frac{\sigma_x}{\sigma_y}(Y - \mu_y)\right)

this gives

E[XY] = \mu_x E[Y] + \rho\,\frac{\sigma_x}{\sigma_y}\left(E[Y^2] - \mu_y E[Y]\right) = \mu_x\mu_y + \rho\,\sigma_x\sigma_y

hence

\operatorname{Corr}(X,Y) = \frac{\rho\,\sigma_x\sigma_y}{\sigma_x\sigma_y} = \rho

Variance of geometric distribution

In the geometric distribution we perform successive independent trials, each resulting in success with probability p. If N denotes the trial of the first success, the variance of N is by definition

\operatorname{Var}(N) = E[N^2] - (E[N])^2

Let the random variable Y = 1 if the first trial results in success and Y = 0 if the first trial results in failure. To find E[N^2] we apply conditional expectation:

E[N^2] = E\big[\,E[N^2 \mid Y]\,\big] = E[N^2 \mid Y=1]\,p + E[N^2 \mid Y=0]\,(1-p)

since

E[N^2 \mid Y = 1] = 1, \qquad E[N^2 \mid Y = 0] = E\big[(1+N)^2\big]

If success occurs on the first trial then N = 1 and N² = 1; if the first trial is a failure, then to obtain the first success the total number of trials has the same distribution as 1 (the failed first trial) plus the number of additional trials still needed, that is, 1 + N, so

E[N^2 \mid Y = 0] = E\big[(1+N)^2\big] = 1 + 2E[N] + E[N^2]

Thus the expectation is

E[N^2] = p + (1-p)\big(1 + 2E[N] + E[N^2]\big)

since the expectation of the geometric distribution is E[N] = 1/p, this becomes

E[N^2] = 1 + \frac{2(1-p)}{p} + (1-p)E[N^2]

hence

E[N^2] = \frac{2-p}{p^2}

and

E[N] = \frac{1}{p}

so the variance of the geometric distribution is

\operatorname{Var}(N) = E[N^2] - (E[N])^2 = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2}
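A quick numerical check of Var(N) = (1-p)/p²; this simulation sketch uses an arbitrary p for illustration (numpy's geometric generator counts the trial of the first success, matching N here).

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.3
N = rng.geometric(p, size=500_000)   # number of trials until the first success

print("sample variance:", N.var())
print("theory (1-p)/p^2:", (1 - p) / p**2)
```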

Expectation of Minimum of sequence of uniform random variables

Let U₁, U₂, … be a sequence of independent uniform random variables over the interval (0, 1), and define N as

N = \min\left\{ n : \sum_{i=1}^{n} U_i > x \right\}

that is, the number of uniform random variables that must be added for their sum to exceed x. To find the expectation of N for any x ∈ [0, 1], we set

m(x) = E[N]

To find this expectation we use the definition of conditional expectation for a continuous random variable, conditioning on the first term of the sequence:

m(x) = E[N] = \int_0^1 E[N \mid U_1 = y]\, dy

here we get

E[N \mid U_1 = y] = \begin{cases} 1 & \text{if } y > x \\ 1 + m(x-y) & \text{if } y \leq x \end{cases}

because when the first uniform value is y ≤ x, the number of further uniform random variables required has the same distribution as the number needed, starting afresh, for a sum to surpass x − y.

so using this value of the conditional expectation, the integral becomes

m(x) = 1 + \int_0^x m(x-y)\, dy = 1 + \int_0^x m(u)\, du

if we differentiate this equation we get

m'(x) = m(x)

and

\frac{m'(x)}{m(x)} = 1

now integrating this gives

\log m(x) = x + c

hence

m(x) = k e^{x}

since m(0) = 1, the constant is k = 1, so

m(x) = e^{x}

and m(1) = e; that is, the expected number of uniform random variables over the interval (0, 1) that need to be added until their sum surpasses 1 is equal to e.
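This classic result (the expected count is e ≈ 2.718) is easy to check by simulation; the sketch below is illustrative only.

```python
import random

def count_until_sum_exceeds_one():
    """Count how many Uniform(0,1) draws are needed for the running sum to exceed 1."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

random.seed(3)
trials = 200_000
print(sum(count_until_sum_exceeds_one() for _ in range(trials)) / trials)  # ≈ e ≈ 2.718
```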

Probability using conditional expectation

We can also find probabilities by conditioning, just as we found expectations with conditional expectation. To see this, consider an event A and define a random variable X as

X = \begin{cases} 1 & \text{if } A \text{ occurs} \\ 0 & \text{if } A \text{ does not occur} \end{cases}

From the definition of this random variable and of expectation, clearly

E[X] = P(A), \qquad E[X \mid Y = y] = P(A \mid Y = y)

now by conditional expectation, in either the discrete or the continuous sense, we have

P(A) = \sum_{y} P(A \mid Y = y)\, P(Y = y) \quad \text{(discrete)}, \qquad P(A) = \int_{-\infty}^{\infty} P(A \mid Y = y)\, f_Y(y)\, dy \quad \text{(continuous)}

Example:

Compute the probability mass function of the random variable X, if U is a uniform random variable on the interval (0,1) and the conditional distribution of X given U = p is binomial with parameters n and p.

Solution:

Conditioning on the value of U, the probability is

P(X = i) = \int_0^1 P(X = i \mid U = p)\, f_U(p)\, dp = \int_0^1 \binom{n}{i} p^{i}(1-p)^{n-i}\, dp

we have the result

\int_0^1 p^{i}(1-p)^{n-i}\, dp = \frac{i!\,(n-i)!}{(n+1)!}

so we get

P(X = i) = \frac{1}{n+1}, \qquad i = 0, 1, \ldots, n

that is, X is uniformly distributed over the values 0, 1, …, n.
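The fact that P(X = i) = 1/(n+1) for every i can also be seen numerically; a small illustrative sketch (n and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(11)
n, trials = 5, 300_000

U = rng.uniform(0, 1, trials)        # U ~ Uniform(0,1)
X = rng.binomial(n, U)               # X | U=p ~ Binomial(n, p)

counts = np.bincount(X, minlength=n + 1) / trials
print(counts)                        # each entry close to 1/(n+1) ≈ 0.1667
```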

Example:

What is the probability that X < Y, if X and Y are continuous random variables with probability density functions fX and fY respectively?

Solution:

By conditioning on Y and using conditional probability,

P(X < Y) = \int_{-\infty}^{\infty} P(X < Y \mid Y = y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} P(X < y)\, f_Y(y)\, dy

as

P(X < Y) = \int_{-\infty}^{\infty} F_X(y)\, f_Y(y)\, dy, \qquad \text{where } F_X(y) = \int_{-\infty}^{y} f_X(x)\, dx

Example:

Calculate the distribution of sum of continuous independent random variables X and Y.

Solution:

To find the distribution of X + Y we compute the probability of the sum by conditioning on Y as follows

F_{X+Y}(a) = P(X + Y \leq a) = \int_{-\infty}^{\infty} P(X + Y \leq a \mid Y = y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} F_X(a - y)\, f_Y(y)\, dy

Conclusion:

The conditional expectation for discrete and continuous random variables was discussed with different examples, considering several types of random variables, independent random variables and joint distributions under various conditions. How to find expectations and probabilities using conditional expectation was also explained with examples. If you require further reading, go through the books below, or for more articles on probability please follow our Mathematics pages.

https://en.wikipedia.org/wiki/probability_distribution

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Moment Generating Functions: 13 Important Facts

Moment generating function    

The moment generating function is a very important function that generates the moments of a random variable, which involve the mean, standard deviation, variance and so on; with the help of the moment generating function alone we can find the basic as well as the higher moments. In this article we will see moment generating functions for different discrete and continuous random variables. The moment generating function (MGF) is defined with the help of mathematical expectation, denoted by M(t), as

M(t) = E\left[e^{tX}\right]

and using the definition of expectation for discrete and continuous random variables this function becomes

M(t) = \begin{cases} \sum_{x} e^{tx}\, p(x) & \text{if } X \text{ is discrete with mass function } p(x) \\[4pt] \int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx & \text{if } X \text{ is continuous with density } f(x) \end{cases}

Evaluating derivatives of this function at t = 0 generates the respective moments. For example, the first moment, the mean, is obtained by differentiating once:

M'(t) = \frac{d}{dt} E\left[e^{tX}\right]

Since differentiation can be interchanged with the expectation, we can write

M'(t) = E\left[\frac{d}{dt}\, e^{tX}\right] = E\left[X e^{tX}\right]

and

M''(t) = E\left[X^{2} e^{tX}\right]

if t = 0 the above moments become

M'(0) = E[X]

and

M''(0) = E[X^{2}]

In general we can say that

M^{(n)}(t) = E\left[X^{n} e^{tX}\right]

hence

M^{(n)}(0) = E[X^{n}]
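The relation M⁽ⁿ⁾(0) = E[Xⁿ] can be illustrated symbolically; this sketch uses sympy with a small example distribution chosen only for illustration (a fair three-valued die on {1, 2, 3}).

```python
import sympy as sp

t = sp.symbols('t')
values = [1, 2, 3]
probs = [sp.Rational(1, 3)] * 3

# moment generating function M(t) = sum of e^{t x} p(x)
M = sum(p * sp.exp(t * x) for x, p in zip(values, probs))

mean = sp.diff(M, t).subs(t, 0)          # M'(0)  = E[X]
second = sp.diff(M, t, 2).subs(t, 0)     # M''(0) = E[X^2]
print(mean, second, second - mean**2)    # 2, 14/3, variance 2/3
```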

Moment generating function of the Binomial distribution | Mean and variance of the Binomial distribution

The moment generating function of a random variable X that is binomially distributed with parameters n and p follows from the binomial probability mass function:

M(t) = E\left[e^{tX}\right] = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^{k}(1-p)^{n-k} = \sum_{k=0}^{n} \binom{n}{k} \left(p e^{t}\right)^{k}(1-p)^{n-k} = \left(p e^{t} + 1 - p\right)^{n}

which is the result of the binomial theorem; now differentiating and putting t = 0,

M'(t) = n\left(p e^{t} + 1 - p\right)^{n-1} p e^{t}, \qquad M'(0) = E[X] = np

which is the mean, or first moment, of the binomial distribution. Similarly the second moment is

M''(0) = E[X^{2}] = n(n-1)p^{2} + np

so the variance of the binomial distribution is

\operatorname{Var}(X) = E[X^{2}] - (E[X])^{2} = n(n-1)p^{2} + np - n^{2}p^{2} = np(1-p)

which are the standard mean and variance of the binomial distribution; the higher moments can be found from this moment generating function in the same way.

Moment generating function of the Poisson distribution | Mean and variance of the Poisson distribution

If the random variable X is Poisson distributed with parameter λ, then the moment generating function of this distribution is

M(t) = E\left[e^{tX}\right] = \sum_{n=0}^{\infty} e^{tn}\, \frac{e^{-\lambda}\lambda^{n}}{n!} = e^{-\lambda} \sum_{n=0}^{\infty} \frac{(\lambda e^{t})^{n}}{n!} = e^{\lambda(e^{t}-1)}

now differentiating this gives

M'(t) = \lambda e^{t}\, e^{\lambda(e^{t}-1)}, \qquad M''(t) = \left(\lambda e^{t}\right)^{2} e^{\lambda(e^{t}-1)} + \lambda e^{t}\, e^{\lambda(e^{t}-1)}

this gives

E[X] = M'(0) = \lambda, \qquad E[X^{2}] = M''(0) = \lambda^{2} + \lambda, \qquad \operatorname{Var}(X) = \lambda

which are indeed the mean and variance of the Poisson distribution.

Moment generating function of the Exponential distribution | Mean and variance of the Exponential distribution

The moment generating function of an exponential random variable X, following the definition, is

M(t) = E\left[e^{tX}\right] = \int_{0}^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \lambda \int_{0}^{\infty} e^{-(\lambda - t)x}\, dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda

here the value of t must be less than the parameter λ; now differentiating gives

M'(t) = \frac{\lambda}{(\lambda - t)^{2}}, \qquad M''(t) = \frac{2\lambda}{(\lambda - t)^{3}}

which provides the moments

E[X] = M'(0) = \frac{1}{\lambda}, \qquad E[X^{2}] = M''(0) = \frac{2}{\lambda^{2}}

clearly

\operatorname{Var}(X) = E[X^{2}] - (E[X])^{2} = \frac{1}{\lambda^{2}}

which are the mean and variance of the exponential distribution.

Moment generating function of the Normal distribution | Mean and variance of the Normal distribution

The moment generating function for continuous distributions is defined in the same way as for discrete ones, so the moment generating function for the standard normal distribution, with standard probability density function, is

M_Z(t) = E\left[e^{tZ}\right] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tz}\, e^{-z^{2}/2}\, dz

this integration we can solve by completing the square:

M_Z(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{(z-t)^{2}}{2} + \frac{t^{2}}{2}}\, dz = e^{t^{2}/2}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{(z-t)^{2}}{2}}\, dz

since the value of the remaining integral is 1. Thus the moment generating function of the standard normal variate is

M_Z(t) = e^{t^{2}/2}

From this we can find the moment generating function of any general normal random variable X = μ + σZ by using the relation

M_X(t) = E\left[e^{tX}\right] = E\left[e^{t(\mu + \sigma Z)}\right] = e^{\mu t} M_Z(\sigma t)

thus

M_X(t) = e^{\frac{\sigma^{2} t^{2}}{2} + \mu t}

so differentiation gives us

M'_X(t) = \left(\mu + \sigma^{2} t\right) e^{\frac{\sigma^{2}t^{2}}{2} + \mu t}, \qquad M''_X(t) = \left[\left(\mu + \sigma^{2} t\right)^{2} + \sigma^{2}\right] e^{\frac{\sigma^{2}t^{2}}{2} + \mu t}

thus

E[X] = M'_X(0) = \mu, \qquad E[X^{2}] = M''_X(0) = \mu^{2} + \sigma^{2}

so the variance will be

\operatorname{Var}(X) = E[X^{2}] - (E[X])^{2} = \sigma^{2}

Moment generating function of Sum of random variables

The moment generating function of a sum of independent random variables has the important property that it equals the product of the individual moment generating functions: for independent random variables X and Y, the moment generating function of the sum X + Y is

M_{X+Y}(t) = E\left[e^{t(X+Y)}\right] = E\left[e^{tX} e^{tY}\right] = E\left[e^{tX}\right] E\left[e^{tY}\right] = M_X(t)\, M_Y(t)

where the factorization of the expectation uses the independence of X and Y. In what follows we find the moment generating functions of sums for different distributions.

Sum of Binomial random variables

If the random variables X and Y are binomially distributed with parameters (n, p) and (m, p) respectively, then the moment generating function of their sum X + Y is

M_{X+Y}(t) = M_X(t)\, M_Y(t) = \left(p e^{t} + 1 - p\right)^{n} \left(p e^{t} + 1 - p\right)^{m} = \left(p e^{t} + 1 - p\right)^{n+m}

so the sum is binomial with parameters (n + m, p).

Sum of Poisson random variables

For independent random variables X and Y that are Poisson distributed with respective means λ₁ and λ₂, the moment generating function of the sum is

M_{X+Y}(t) = M_X(t)\, M_Y(t) = e^{\lambda_1(e^{t}-1)}\, e^{\lambda_2(e^{t}-1)} = e^{(\lambda_1 + \lambda_2)(e^{t}-1)}

where

\lambda_1 + \lambda_2

is the mean of the Poisson random variable X + Y.

Sum of Normal random variables

Consider independent normal random variables X and Y with parameters

(\mu_1, \sigma_1^{2}) \quad \text{and} \quad (\mu_2, \sigma_2^{2})

then the sum X + Y is normal with parameters

\left(\mu_1 + \mu_2,\; \sigma_1^{2} + \sigma_2^{2}\right)

so the moment generating function will be

M_{X+Y}(t) = M_X(t)\, M_Y(t) = e^{\frac{\sigma_1^{2}t^{2}}{2} + \mu_1 t}\, e^{\frac{\sigma_2^{2}t^{2}}{2} + \mu_2 t} = e^{\frac{(\sigma_1^{2}+\sigma_2^{2})t^{2}}{2} + (\mu_1+\mu_2)t}

which is the moment generating function of a normal distribution whose mean and variance are the sums of the individual means and variances.

Sum of random number of random variables

To find the moment generating function of the sum of a random number of random variables, consider the random variable

Y = \sum_{i=1}^{N} X_i

where X₁, X₂, … is a sequence of independent and identically distributed random variables and N is a nonnegative integer-valued random variable independent of this sequence; then the moment generating function is

M_Y(t) = E\left[e^{tY}\right] = E\left[E\left[e^{tY} \mid N\right]\right] = E\left[\left(M_X(t)\right)^{N}\right]

since, given N = n, e^{tY} is the product of n independent copies of e^{tX}.

Differentiating the moment generating function of Y gives

M_Y'(t) = E\left[N \left(M_X(t)\right)^{N-1} M_X'(t)\right]

hence

E[Y] = M_Y'(0) = E\big[N\, E[X]\big] = E[N]\, E[X]

in a similar way, differentiating twice gives

M_Y''(t) = E\left[N(N-1)\left(M_X(t)\right)^{N-2}\left(M_X'(t)\right)^{2} + N\left(M_X(t)\right)^{N-1} M_X''(t)\right]

which gives

E[Y^{2}] = M_Y''(0) = E\left[N(N-1)\right](E[X])^{2} + E[N]\, E[X^{2}]

thus the variance will be

\operatorname{Var}(Y) = E[N]\operatorname{Var}(X) + (E[X])^{2}\operatorname{Var}(N)
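These two identities, E[Y] = E[N]E[X] and Var(Y) = E[N]Var(X) + (E[X])²Var(N), can be checked by simulation; the sketch below uses N Poisson and X_i exponential purely as an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, mu = 3.0, 2.0      # E[N]=Var(N)=3, E[X]=2, Var(X)=4 (illustrative)
trials = 200_000

Y = np.array([rng.exponential(mu, rng.poisson(lam)).sum() for _ in range(trials)])

print("mean:", Y.mean(), "theory:", lam * mu)                    # 6
print("var :", Y.var(), "theory:", lam * mu**2 + mu**2 * lam)    # 3*4 + 4*3 = 24
```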

Example of Chi-square random variable

Calculate the moment generating function of the Chi-squared random variable with n-degree of freedom.

Solution: Consider the chi-squared random variable with n degrees of freedom

\chi^{2} = Z_1^{2} + Z_2^{2} + \cdots + Z_n^{2}

where Z₁, …, Zₙ is a sequence of independent standard normal variables; then the moment generating function is

M(t) = E\left[e^{t(Z_1^{2}+\cdots+Z_n^{2})}\right] = \left(E\left[e^{tZ^{2}}\right]\right)^{n}

so it gives, for a single standard normal Z,

E\left[e^{tZ^{2}}\right] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tz^{2}} e^{-z^{2}/2}\, dz = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{z^{2}}{2\sigma^{2}}}\, dz = \sigma, \qquad \sigma^{2} = (1-2t)^{-1}

because the normal density with mean 0 and variance σ² integrates to 1; hence

M(t) = (1 - 2t)^{-n/2}, \qquad t < \tfrac{1}{2}

which is the required moment generating function of the chi-squared distribution with n degrees of freedom.

Example of Uniform random variable

Find the moment generating function of a random variable X whose conditional distribution given Y = p is binomial with parameters n and p, where Y is uniformly distributed on the interval (0,1).

Solution: To find the moment generating function of X we condition on Y; given Y = p,

E\left[e^{tX} \mid Y = p\right] = \left(p e^{t} + 1 - p\right)^{n}

using the binomial moment generating function. Since Y is uniform on the interval (0,1),

E\left[e^{tX}\right] = \int_0^1 \left(p e^{t} + 1 - p\right)^{n}\, dp = \frac{1}{e^{t}-1} \int_1^{e^{t}} y^{n}\, dy = \frac{1}{n+1}\, \frac{e^{t(n+1)} - 1}{e^{t} - 1}

Joint moment generating function

The joint moment generating function of n random variables X₁, X₂, …, Xₙ is

M(t_1, t_2, \ldots, t_n) = E\left[e^{t_1 X_1 + t_2 X_2 + \cdots + t_n X_n}\right]

where t₁, t₂, …, tₙ are real numbers. From the joint moment generating function we can recover the individual moment generating functions as

M_{X_i}(t) = E\left[e^{tX_i}\right] = M(0, \ldots, 0, t, 0, \ldots, 0)

with t in the i-th position.

Theorem: The random variables X₁, X₂, …, Xₙ are independent if and only if the joint moment generating function factors as

M(t_1, t_2, \ldots, t_n) = M_{X_1}(t_1)\, M_{X_2}(t_2) \cdots M_{X_n}(t_n)

Proof: First assume that the given random variables X₁, X₂, …, Xₙ are independent; then

M(t_1, \ldots, t_n) = E\left[e^{t_1 X_1} \cdots e^{t_n X_n}\right] = E\left[e^{t_1 X_1}\right] \cdots E\left[e^{t_n X_n}\right] = M_{X_1}(t_1) \cdots M_{X_n}(t_n)

Conversely, assume that the joint moment generating function satisfies this factorization. To prove that the random variables X₁, X₂, …, Xₙ are independent, we use the result that the joint moment generating function uniquely determines the joint distribution (this is another important result, which requires its own proof); the factored form is precisely the joint moment generating function of independent random variables with these marginals, so the joint distribution must be the one under which the random variables are independent. Hence the condition is both necessary and sufficient.

Example of Joint Moment generating function

1. Calculate the joint moment generating function of the random variables X + Y and X − Y, where X and Y are independent normal random variables, each with mean μ and variance σ².

Solution: The joint moment generating function of X + Y and X − Y is

E\left[e^{t(X+Y) + s(X-Y)}\right] = E\left[e^{(t+s)X}\right] E\left[e^{(t-s)Y}\right] = e^{\mu(t+s) + \frac{\sigma^{2}(t+s)^{2}}{2}}\, e^{\mu(t-s) + \frac{\sigma^{2}(t-s)^{2}}{2}} = e^{2\mu t + \sigma^{2}t^{2}}\, e^{\sigma^{2}s^{2}}

As this joint moment generating function factors into a function of t alone times a function of s alone, and the moment generating function determines the joint distribution, we conclude that X + Y and X − Y are independent random variables.

2. Suppose that the number of events X in an experiment is Poisson distributed with mean λ, and that each event is counted, independently, with probability p. Show that the numbers of counted and uncounted events are independent Poisson random variables with respective means λp and λ(1−p).

Solution: Let X be the number of events and X_c the number of counted events, so the number of uncounted events is X − X_c. The joint moment generating function will generate the moments

E\left[e^{s X_c + t(X - X_c)}\right] = E\left[E\left[e^{s X_c + t(X - X_c)} \mid X\right]\right]

and by the moment generating function of the binomial distribution, since given X = n the number of counted events is binomial with parameters n and p,

E\left[e^{s X_c + t(X - X_c)} \mid X = n\right] = e^{tn}\, E\left[e^{(s-t)X_c} \mid X = n\right] = \left(p e^{s} + (1-p)e^{t}\right)^{n}

and taking the expectation of this over the Poisson random variable X gives

E\left[e^{s X_c + t(X - X_c)}\right] = e^{\lambda\left(p e^{s} + (1-p)e^{t} - 1\right)} = e^{\lambda p (e^{s}-1)}\, e^{\lambda(1-p)(e^{t}-1)}

which factors into the moment generating functions of Poisson random variables with means λp and λ(1−p), so the counted and uncounted events are independent with those means.

Conclusion:

Using the standard definition of the moment generating function, the moments of different distributions such as the binomial, Poisson and normal distributions were discussed, and the moment generating functions of sums of discrete or continuous random variables, as well as the joint moment generating function, were obtained with suitable examples; if you require further reading, go through the books below.

For more articles on Mathematics, please see our Mathematics page.

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Covariance, Variance Of Sums: 7 Important Facts

COVARIANCE, VARIANCE OF SUMS, AND CORRELATIONS OF RANDOM VARIABLES

The statistical parameters of random variables of different natures are easy to obtain and understand using the definition of the expectation of a random variable; in what follows we find several such parameters with the help of mathematical expectation.

Moments of the number of events that occur

So far we know that the expectation of different powers of a random variable gives the moments of that random variable, and how to find the expectation of a random variable counting how many of a collection of events occur; now we are interested in the expectation associated with pairs of events occurring. If X represents the number of events that occur, then for the events A₁, A₂, …, Aₙ define the indicator variable Iᵢ as

gif

the expectation of X in discrete sense will be

gif

because the random variable X is

gif

now, to work with the number of pairs of events that occur, we use the combination

\binom{X}{2} = \sum_{i<j} I_i I_j

this gives the expectation as

E\left[\binom{X}{2}\right] = \frac{E[X(X-1)]}{2} = \sum_{i<j} E[I_i I_j] = \sum_{i<j} P(A_i A_j)

from this we get the expectation of X² and hence the value of the variance by

E[X^{2}] = E[X] + 2\sum_{i<j} P(A_i A_j), \qquad \operatorname{Var}(X) = E[X^{2}] - (E[X])^{2}

Using this discussion we now focus on different kinds of random variables to find such moments.

Moments of binomial random variables

If p is the probability of success in each of n independent trials, let Aᵢ denote the event that trial i is a success; then

gif
gif
gif
gif

and hence the variance of binomial random variable will be

gif

because

gif

if we generalize for k events

gif
gif

such expectations can be obtained successively for higher values of k; let us find it for k = 3:

gif
gif

gif
gif

using this iteration we can get

gif

Moments of hypergeometric random variables

We will understand the moments of this random variable with the help of an example: suppose n pens are randomly selected from a box containing N pens, of which m are blue. Let Aᵢ denote the event that the i-th selected pen is blue; then X, the number of blue pens selected, equals the number of the events A₁, A₂, …, Aₙ that occur, and since the i-th selected pen is equally likely to be any of the N pens, of which m are blue,

gif

and so

P(A_i A_j) = \frac{m}{N}\, \frac{m-1}{N-1}
gif
gif

this gives

gif

so the variance of hypergeometric random variable will be

gif
gifgif

in similar way for the higher moments

gif
gif

hence

gif

Moments of the negative hypergeometric random variables

Consider the example of a package containing n + m vaccines, of which n are special and m are ordinary; the vaccines are removed one at a time, each new removal being equally likely to be any of the vaccines remaining in the package. Let the random variable Y denote the number of vaccines that need to be withdrawn until a total of r special vaccines have been removed; Y has a negative hypergeometric distribution, which relates to the hypergeometric distribution in the same way that the negative binomial relates to the binomial. To find the probability mass function, note that the k-th draw gives a special vaccine when the first k−1 draws give r−1 special and k−r ordinary vaccines, so

gif

now the random variable Y

Y=r+X

for the events Ai

gif
gif

as

gif

hence to find the variance of Y we must know the variance of X so

gif
gif
gif
gif

hence

gif

COVARIANCE             

The relationship between two random variables can be represented by the statistical parameter called covariance; before defining the covariance of two random variables X and Y, recall that for independent random variables X and Y the expectation of a product of functions g(X) and h(Y) factors:

gif
gif
gif
gif
gif

using this relation of expectation we can define covariance as

"The covariance between random variable X and random variable Y, denoted by cov(X, Y), is defined as

\operatorname{Cov}(X, Y) = E\big[(X - E[X])(Y - E[Y])\big]"

using the definition of expectation and expanding, we get

\operatorname{Cov}(X,Y) = E\big[XY - X E[Y] - Y E[X] + E[X]E[Y]\big] = E[XY] - E[X]\,E[Y]

it is clear that if the random variables X and Y are independent then

E[XY] = E[X]\,E[Y] \implies \operatorname{Cov}(X, Y) = 0

but the converse is not true; for example, let X be a random variable with

P(X = 0) = P(X = 1) = P(X = -1) = \frac{1}{3}

and define the random variable Y as

Y = \begin{cases} 0 & \text{if } X \neq 0 \\ 1 & \text{if } X = 0 \end{cases}

so

XY = 0 \implies E[XY] = 0, \qquad E[X] = 0 \implies \operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y] = 0

here clearly X and Y are not independent, but the covariance is zero.

Properties of covariance

The covariance between random variables X and Y has the following properties:

\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)

\operatorname{Cov}(X, X) = \operatorname{Var}(X)

\operatorname{Cov}(aX, Y) = a\operatorname{Cov}(X, Y)

\operatorname{Cov}\left(\sum_{i=1}^{n} X_i,\; \sum_{j=1}^{m} Y_j\right) = \sum_{i=1}^{n}\sum_{j=1}^{m} \operatorname{Cov}(X_i, Y_j)

Using the definition of the covariance, the first three properties are immediate, and the fourth property follows by considering

E\left[\sum_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} \mu_i, \qquad E\left[\sum_{j=1}^{m} Y_j\right] = \sum_{j=1}^{m} v_j

now by definition

covariance

Variance of the sums

The important result from these properties is

\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2\sum_{i<j} \operatorname{Cov}(X_i, X_j)

as

gif
gif
gif
gif

If the Xᵢ's are pairwise independent, then all the covariance terms vanish and

\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \operatorname{Var}(X_i)

Example: Variance of a binomial random variable

  If X is the random variable

gif

where Xi are the independent Bernoulli random variables such that

gif

 then find the variance of a binomial random variable X with parameters n and p.

Solution:

since

gif
gif

so for single variable we have

gif
gif
gif

so the variance is

\operatorname{Var}(X) = \sum_{i=1}^{n} \operatorname{Var}(X_i) = np(1-p)

Example

For independent random variables Xᵢ with respective means and variances, define a new random variable (the sample variance) in terms of the deviations from the sample mean as

gif

then compute

gif

solution:

By using the above property and definition we have

gif
gif
gif

now for the random variable S

COVARIANCE

take the expectation

gif

Example:

Find the covariance of indicator functions for the events A and B.

Solution:

for the events A and B the indicator functions are

gif
gif

so the expectation of these are

gif
gif
gif
gif

thus the covariance is

\operatorname{Cov}(I_A, I_B) = E[I_A I_B] - E[I_A]E[I_B] = P(A \cap B) - P(A)P(B) = P(B)\big[P(A \mid B) - P(A)\big]
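A quick numerical illustration of Cov(I_A, I_B) = P(A∩B) − P(A)P(B), using two overlapping events defined on a uniform draw (the specific events are an arbitrary choice for this sketch):

```python
import numpy as np

rng = np.random.default_rng(9)
U = rng.uniform(0, 1, 500_000)

I_A = (U < 0.6).astype(float)          # event A: U < 0.6, so P(A) = 0.6
I_B = (U > 0.4).astype(float)          # event B: U > 0.4, so P(B) = 0.6, P(A∩B) = 0.2

print("sample covariance:", np.cov(I_A, I_B, bias=True)[0, 1])
print("P(AB) - P(A)P(B):", 0.2 - 0.6 * 0.6)    # = -0.16
```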

Example:

Show that

\operatorname{Cov}\left(X_i - \bar{X},\; \bar{X}\right) = 0

where X₁, …, Xₙ are independent random variables with common variance σ² and \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i.

Solution:

The covariance using the properties and definition will be

gif
gif
gif
gif

Example:

Calculate the mean and variance of the random variable S, the sum of n sampled values, where there is a set of N people each of whom has an opinion about a certain subject, measured by a real number v representing that person's "strength of feeling" about the subject. Let vᵢ represent the strength of feeling of person i, which is unknown; to collect information, a random sample of n of the N people is taken, and these n people are questioned so that their values vᵢ are obtained.

Solution

let us define the indicator function as

gif

thus we can express S as

gif

and its expectation as

gif

this gives the variance as

gif
gif

since

gif
gif

we have

gif
gif
gif
gif
gif

we know the identity

gif

so

gif
gif
gif
gif

so the mean and variance for the said random variable will be

gif
gif

Conclusion:

The covariance between two random variables was defined, and using the covariance the variance of sums of random variables was obtained; the covariance and various moments were derived with the help of the definition of expectation. If you require further reading, go through

https://en.wikipedia.org/wiki/Expectation

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH.

For more post on mathematics, please follow our Mathematics page

Conditional Variance & Predictions: 7 Important Facts


In this article we will discuss conditional variance and prediction using conditional expectation for different kinds of random variables, with some examples.

Conditional Variance

The conditional variance of a random variable X given Y is defined, in the same way as the conditional expectation of X given Y, as

Var(X|Y) = E[(X - E[X|Y])²|Y]

that is, the conditional variance is the conditional expectation, given Y, of the squared difference between X and its conditional expectation.

The relation between the conditional variance and the conditional expectation is

Var(X|Y) = E[X²|Y] – (E[X|Y])²

E[Var(X|Y)] = E[E[X²|Y]] – E[(E[X|Y])²] = E[X²] – E[(E[X|Y])²]

and, since E[E[X|Y]] = E[X], we have

Var(E[X|Y]) = E[(E[X|Y])²] – (E[X])²

this is analogous to the relation between the unconditional variance and expectation, which was

Var(X) = E[X²] – (E[X])²

and adding the two identities above shows that we can find the variance with the help of the conditional variance, as

Var(X) = E[Var(X|Y)] + Var(E[X|Y])

Example of conditional variance

Find the mean and variance of the number of travelers who board a bus, if the people arriving at the bus depot form a Poisson process with rate λ (so the number arriving by time t has mean λt), and the bus arrives at a time uniformly distributed over the interval (0, T), independently of the arrivals.

Solution:

To find the mean and variance, for any time t let Y be the random variable for the time the bus arrives and N(t) the number of arrivals by time t; then

E[N(Y)|Y = t] = E[N(t)|Y = t] = E[N(t)] = λt

by the independence of Y and N(t), since N(t) is Poisson with mean λt. Hence

E[N(Y)|Y] = λY

so taking expectations gives

E[N(Y)] = λE[Y] = λT/2

To obtain Var(N(Y)), we use the conditional variance formula

Var(N(Y)) = E[Var(N(Y)|Y)] + Var(E[N(Y)|Y])

thus, since N(t) is Poisson,

Var(N(Y)|Y) = λY, and E[N(Y)|Y] = λY

Hence, from the conditional variance formula,

Var(N(Y)) = E[λY] + Var(λY) = λT/2 + λ²T²/12

where we have used the fact that Var(Y) = T²/12.
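The two answers λT/2 and λT/2 + λ²T²/12 can be checked with a short simulation; the parameter values below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, T, trials = 3.0, 4.0, 300_000

Y = rng.uniform(0, T, trials)          # bus arrival time, uniform on (0, T)
N = rng.poisson(lam * Y)               # passengers arrived by time Y

print("mean:", N.mean(), "theory:", lam * T / 2)                          # 6
print("var :", N.var(), "theory:", lam * T / 2 + lam**2 * T**2 / 12)      # 18
```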

Variance of a sum of a random number of random variables

Consider a sequence of independent and identically distributed random variables X₁, X₂, X₃, … and another random variable N independent of this sequence; we find the variance of the sum of the first N terms of this sequence as

CodeCogsEqn 92

using

lagrida latex editor 48

which follows from the definition of variance and the conditional variance formula applied to the sum of the sequence of random variables; hence

\operatorname{Var}\left(\sum_{i=1}^{N} X_i\right) = E[N]\operatorname{Var}(X) + (E[X])^{2}\operatorname{Var}(N)

Prediction

In prediction, the value of one random variable is predicted on the basis of an observation of another random variable. For predicting the random variable Y when the observed random variable is X, we use a function g(X) as the predicted value, and we try to choose g(X) close to Y; the best choice is g(X) = E[Y|X], because it minimizes the mean squared error, which follows from the inequality

lagrida latex editor 49

This inequality we can get as

lagrida latex editor 22

However, given X, E[Y|X]-g(X), being a function of X, can be treated as a constant. Thus,

lagrida latex editor 23

which gives the required inequality

lagrida latex editor 50

Examples on Prediction

1. It is observed that a father's height is six feet; what is the prediction of his son's full-grown height, if the height of the son of a father who is x inches tall is normally distributed with mean x + 1 and variance 4?

Solution: Let X be the random variable denoting the height of the father and Y the random variable for the height of the son; then the random variable Y is

Y = X + e + 1

where e represents a normal random variable, independent of X, with mean zero and variance four.

So the prediction of the son's height is

E[Y | X = 72] = E[X + e + 1 | X = 72] = 72 + 0 + 1 = 73

that is, the predicted height of the son is 73 inches.

2. Consider an example of sending signals from location A to location B: if the signal value s is sent from A, then the value received at B is normally distributed with mean s and variance 1, while the signal S sent from A is itself normally distributed with mean μ and variance σ². How should we predict the signal value S that was sent from location A, given that the value r is received at location B?

Solution: The signal values S and R denote here the random variables distributed normally, first we find the conditional density function S given R as

lagrida latex editor 25

where K does not depend on s; now

lagrida latex editor 26

here also C1 and C2 do not depend on s, so the conditional density function is

WhatsApp Image 2022 09 10 at 11.02.40 PM

C also does not depend on s. Thus, given that the value r is received at location B, the signal sent from location A is normally distributed with mean and variance

lagrida latex editor 27

and the mean square error for this situation is

lagrida latex editor 28

Linear Predictor

We cannot always find the joint probability density function, even when the means, variances and the correlation between two random variables are known; in such a situation the linear predictor of one random variable with respect to another is very helpful. For the linear predictor of the random variable Y with respect to the random variable X, we take constants a and b to minimize

lagrida latex editor 29

Now differentiate partially with respect to a and b we will get

lagrida latex editor 26 1

solving these two equations for a and b we get

lagrida latex editor 31

thus minimizing this expectation gives the linear predictor as

\hat{Y} = \mu_y + \frac{\rho\,\sigma_y}{\sigma_x}\,(X - \mu_x)

where μx and μy are the respective means of the random variables X and Y; the error of the linear predictor is obtained from the expectation

E\left[\left(Y - \mu_y - \frac{\rho\,\sigma_y}{\sigma_x}(X - \mu_x)\right)^{2}\right] = \sigma_y^{2}\left(1 - \rho^{2}\right)

This error is close to zero when the correlation is perfectly positive or perfectly negative, that is, when the correlation coefficient is either +1 or −1.
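The linear predictor can be computed directly from data using sample moments; a minimal numpy sketch on synthetic data (all numbers below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
X = rng.normal(10, 2, n)
Y = 3 * X + rng.normal(0, 5, n)        # Y correlated with X plus noise

mu_x, mu_y = X.mean(), Y.mean()
rho = np.corrcoef(X, Y)[0, 1]
slope = rho * Y.std() / X.std()        # rho * sigma_y / sigma_x

x0 = 12.0
print("linear prediction at x0:", mu_y + slope * (x0 - mu_x))
print("true regression value  :", 3 * x0)   # close, since the underlying model is linear
```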

Conclusion

The conditional variance for discrete and continuous random variables was discussed with different examples; one of the important applications of conditional expectation, prediction, was also explained with suitable examples including the best linear predictor. If you require further reading, go through the links below.

For more post on Mathematics, please refer to our Mathematics Page

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

11 Facts On Mathematical Expectation & Random Variable


Mathematical Expectation and random variable    

Mathematical expectation plays a very important role in probability theory; we already discussed its basic definition and basic properties in some previous articles. After discussing various distributions and types of distributions, in this article we will get familiar with some more advanced properties of mathematical expectation.

Expectation of sum of random variables | Expectation of function of random variables | Expectation of Joint probability distribution

We know that the mathematical expectation of a discrete random variable is

E[X] = \sum_{x} x\, p(x)

and for a continuous one it is

E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx

Now, for random variables X and Y, if they are discrete with joint probability mass function p(x, y), the expectation of a function of the random variables X and Y is

E[g(X, Y)] = \sum_{y}\sum_{x} g(x, y)\, p(x, y)

and if they are continuous with joint probability density function f(x, y), the expectation of a function of X and Y is

E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy

If g is the sum of these two random variables, then in the continuous case

E[X + Y] = \iint (x + y)\, f(x, y)\, dx\, dy = \iint x\, f(x, y)\, dx\, dy + \iint y\, f(x, y)\, dx\, dy = E[X] + E[Y]

and if for the random variables X and Y we have

X \geq Y

then the expectations satisfy

E[X] \geq E[Y]

Example

A Covid-19 hospital is located at a point X uniformly distributed along a road of length L, and a vehicle carrying oxygen for the patients is at a location Y, also uniformly distributed along the road. Find the expected distance between the Covid-19 hospital and the oxygen-carrying vehicle, if X and Y are independent.

Solution:

To find the expected distance between X and Y we have to calculate E { | X-Y | }

Now the joint density function of X and Y is

f(x, y) = \frac{1}{L^{2}}, \qquad 0 < x < L,\; 0 < y < L

since X and Y are independent and each is uniform on (0, L); following this we have

E\big[|X - Y|\big] = \frac{1}{L^{2}} \int_0^L\!\!\int_0^L |x - y|\, dx\, dy

now the value of integral will be

14.0
15.0
16.0

Thus the expected distance between these two points is

E\big[|X - Y|\big] = \frac{L}{3}
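The value L/3 is easy to confirm numerically; a tiny sketch with an illustrative L:

```python
import numpy as np

rng = np.random.default_rng(6)
L, n = 9.0, 1_000_000

X = rng.uniform(0, L, n)
Y = rng.uniform(0, L, n)
print(np.abs(X - Y).mean(), "vs L/3 =", L / 3)   # both close to 3.0
```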

Expectation of Sample mean

  As the sample mean of the sequence of random variables X1, X2, ………, Xn with distribution function F and expected value of each as μ is

18.0

so the expectation of this sample mean will be

19.0
20.0
71.0
22.0

which shows the expected value of sample mean is also μ.

Boole’s Inequality

Boole's inequality can be obtained with the help of the properties of expectation. Suppose the random variable X is defined as

23.0 1

where

24.0

where the Aᵢ are random events; this means the random variable X represents the number of the events Aᵢ that occur. Define another random variable Y as

25.0

clearly

X \geq Y

and hence

E[X] \geq E[Y]

now, computing the expectations of the random variables X and Y, we obtain

28.0

and

29.0

substituting these expectation in the above inequality we will get Boole’s inequality as

30.0

Expectation of Binomial random variable | Mean of Binomial random variable

We know that a binomial random variable counts the number of successes in n independent trials, with probability of success p and of failure q = 1 − p, so if

X=X1 + X2+ …….+ Xn

Where

31.0

here these Xi ‘s are the Bernoulli and the expectation will be

32.0

so the expectation of X will be

33.0

Expectation of Negative binomial random variable | Mean of Negative binomial random variable

Let X be a random variable representing the number of trials needed to collect r successes; such a random variable is known as a negative binomial random variable, and it can be expressed as

34.0

here each Xi denote the number of trials required after the (i-1)st success  to obtain the total of i successes.

Since each of these Xi represent the geometric random variable and we know the expectation for the geometric random variable is

35.0

so

36.0

which is the expectation of negative binomial random variable.

Expectation of hypergeometric random variable | Mean of hypergeometric random variable

We will obtain the expectation, or mean, of a hypergeometric random variable with the help of a simple real-life example: if n books are randomly selected from a shelf containing N books, of which m are mathematics books, then to find the expected number of mathematics books, let X denote the number of mathematics books selected, so that we can write X as

37.0

where

38.0

so

39.0
40.0

=n/N

which gives

41.0

which is the mean of such a hypergeometric random variable.

Expected number of matches

This is a very popular problem related to expectation. Suppose that in a room there are N people who throw their hats into the middle of the room; the hats are mixed up, and then each person randomly chooses one hat. The expected number of people who select their own hat can be obtained by letting X be the number of matches, so

42.0

Where

43.0

since each person is equally likely to select any of the N hats,

44.0

so

45.0

which means that, on average, exactly one person chooses his own hat.
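This expected value of exactly one match, regardless of N, shows up immediately in simulation; an illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(8)
N, trials = 20, 200_000

matches = np.empty(trials)
for t in range(trials):
    hats = rng.permutation(N)                  # hat hats[i] goes to person i
    matches[t] = np.sum(hats == np.arange(N))  # count people who got their own hat

print(matches.mean())   # close to 1 regardless of N
```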

The probability of a union of events

     Let us obtain the probability of the union of the events with the help of expectation so for the events Ai

46.0

with this we take

47.0

so the expectation of this will be

48.0

and expanding using expectation property as

49.0

since we have

Mathematical Expectation
Mathematical Expectation: The probability of a union of events

and

51.0

so

52.0

this implies the probability of union as

52.0 1

Bounds from Expectation using Probabilistic method

Suppose S is a finite set, f is a function on the elements of S, and

m = \max_{s \in S} f(s)

here we can obtain a lower bound for m from the expectation of f(s), where s is a random element of S whose expectation we can calculate, since

54.0
55.0 1

here we get expectation as the lower bound for the maximum value

Maximum-Minimum identity

The maximum–minimum identity expresses the maximum of a set of numbers in terms of the minimums of subsets of those numbers: for any numbers xᵢ,

\max_i x_i = \sum_i x_i - \sum_{i<j} \min(x_i, x_j) + \sum_{i<j<k} \min(x_i, x_j, x_k) - \cdots + (-1)^{n+1} \min(x_1, \ldots, x_n)

To show this, first restrict the xᵢ to the interval [0,1]; suppose U is a uniform random variable on the interval (0,1), and define the events Aᵢ by saying that Aᵢ occurs when U is less than xᵢ, that is,

57.0

since at least one of these events occurs whenever U is less than at least one of the xᵢ, i.e. whenever U is less than their maximum,

58.0

and

59.0

Clearly we know

60.0

and all the events will occur if U is less than all the variables and

62.0 1

the probability gives

62.0

we have the result of probability of union as

63.0

following this inclusion exclusion formula for the probability

64.0

consider

65.0

this gives

66.0

since

67.0

which means

68.0
hence we can write it as
69.0

taking expectation we can find expected values of maximum and partial minimums as

70.0

Conclusion:

The expectation for various distributions and the connection of expectation with several probability theory concepts were the focus of this article, which shows the use of expectation as a tool to obtain expected values of different kinds of random variables; if you require further reading, go through the books below.

For more articles on Mathematics, please see our Mathematics page.

https://en.wikipedia.org/wiki/Expectation

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Conditional Distribution: 7 Interesting Facts To Know


Conditional distribution

It is very interesting to discuss the conditional case of a distribution, when two random variables follow a distribution in which one is conditioned on the other; we first briefly look at the conditional distribution for both cases of random variables, discrete and continuous, and then, after studying some prerequisites, we focus on conditional expectations.

Discrete conditional distribution

With the help of the joint probability mass function of a joint distribution, we define the conditional distribution of the discrete random variable X given Y, using conditional probability, as the distribution with probability mass function

1
2.PNG
3.PNG

provided the denominator probability is greater than zero; in a similar way we can write this as

4.PNG
5.PNG

if X and Y are independent random variables, then the joint probability factors and this turns into

6.PNG
7.PNG
8.PNG

so the discrete conditional distribution, or conditional distribution of the discrete random variable X given Y, is the random variable with the above probability mass function; in a similar way we can define the distribution of Y given X.

Example on discrete conditional distribution

1. Find the probability mass function of the random variable X given Y = 1, if the joint probability mass function of the random variables X and Y has the values

p(0,0)=0.4 , p(0,1)=0.2, p(1,0)= 0.1, p(1,1)=0.3

Now first of all for the value Y=1 we have

9.PNG

so using the definition of probability mass function

10.PNG
11.PNG
12.PNG

we have

13.PNG

and

14.PNG
2. Obtain the conditional distribution of X given X + Y = n, where X and Y are independent Poisson random variables with parameters λ1 and λ2.

Since the random variables X and Y are independent, so the conditional distribution will have probability mass function as

15.PNG
16.PNG
17.PNG

since the sum of independent Poisson random variables is again Poisson,

18.PNG
19.PNG
20.PNG

thus the conditional distribution with the above probability mass function is the conditional distribution for such Poisson distributions; in fact it is binomial with parameters n and λ1/(λ1+λ2). The above case can be generalized to more than two random variables.
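That the conditional distribution of X given X + Y = n is binomial with parameters n and λ1/(λ1+λ2) can also be checked empirically; an illustrative sketch:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(10)
lam1, lam2, n, trials = 2.0, 3.0, 6, 2_000_000

X = rng.poisson(lam1, trials)
Y = rng.poisson(lam2, trials)

cond = X[(X + Y) == n]                      # values of X on the event X+Y = n
freq = np.bincount(cond, minlength=n + 1) / len(cond)

p = lam1 / (lam1 + lam2)
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(np.round(freq, 3))
print(np.round(binom, 3))                   # the two rows should agree closely
```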

Continuous conditional distribution

The continuous conditional distribution of the random variable X given Y = y is the continuous distribution with the probability density function

21.PNG

where the denominator density is greater than zero; for this continuous density function, the corresponding probability statement is

22.PNG
23.PNG

thus the probability for such conditional density function is

24.PNG

In a similar way to the discrete case, if X and Y are independent continuous random variables, then also

25.PNG

and hence

px 26
px 28 Copy 1

so we can write it as

px 29 Copy 1

Example on Continuous conditional distribution

  1. Calculate conditional density function of random variable X given Y if the joint probability density function with the open interval (0,1) is given by
px 30 Copy 1

If for the random variable X given Y within (0,1) then by using the above density function we have

px 31
px 32
px 33
px 34
px 35
2. Calculate the conditional probability
px 36

if the joint probability density function is given by

px 37

To find the conditional probability first we require the conditional density function so by the definition it would be

px 38
px 39
px 40

now using this density function in the probability the conditional probability is

100
101
px 41

Conditional distribution of bivariate normal distribution

We know that the bivariate normal distribution of the normal random variables X and Y, with their respective means and variances as parameters, has the joint probability density function

Conditional distribution
Conditional distribution of bivariate normal distribution

so the conditional distribution of X given Y for such a bivariate normal distribution is found by applying the definition of the conditional density function of a continuous random variable to the above joint density function; we have

Conditional distribution
Conditional distribution of bivariate normal distribution

By observing this we can say that this is normally distributed with the mean

px 42

and variance

px 43

in a similar way, the conditional density function of Y given X is obtained by simply interchanging the positions of the parameters of X and Y.

The marginal density function for X we can obtain from the above conditional density function by using the value of the constant

Conditional distribution
Conditional distribution of bivariate normal distribution

let us substitute in the integral

px 44

the density function will be now

Image3 1

since the total value of

Image4

is 1 by the definition of probability (it is a normal density), the density function now becomes

Image5

which is nothing but the density function of random variable X with usual mean and variance as the parameters.

Joint Probability distribution of function of random variables

So far we know the joint probability distribution of two random variables; now, if we have functions of such random variables, what is the joint probability distribution of those functions, and how do we calculate their density and distribution functions? Real-life situations frequently involve functions of random variables.

If Y1 and Y2 are functions of the random variables X1 and X2 respectively, which are jointly continuous, then the joint density function of these two functions is

px 45

where the Jacobian is

J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[6pt] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0

and Y1 = g1(X1, X2) and Y2 = g2(X1, X2) for some functions g1 and g2, which are assumed to be continuous and to have continuous partial derivatives.

Now the probability for such functions of random variables will be

Image7

Examples on Joint Probability distribution of function of random variables

1. Find the joint density function of the random variables Y1 = X1 + X2 and Y2 = X1 − X2, where X1 and X2 are jointly continuous with a given joint probability density function; also discuss the result for different distributions.

Here we first we will check Jacobian

px 47

since g1(x1, x2)= x1 + x2  and g2(x1, x2)= x1 – x2 so

px 48

solving Y1 = X1 + X2 and Y2 = X1 − X2 gives X1 = (Y1 + Y2)/2 and X2 = (Y1 − Y2)/2, so

px 49

if these random variables are independent uniform random variables

px 50

or if these random variables are independent exponential random variables with usual parameters

Image10

or if these random variables are independent normal random variables then

px 51
px 52
px 53
2. If X and Y are the independent standard normal variables as given
Conditional distribution

calculate the joint distribution for the respective polar coordinates.

We will convert by usual conversion X and Y into r and θ as

px 54

so the partial derivatives of these function will be

px 55
px 56
px 57
px 58

so the Jacobian using this functions is

px 59

if both of the random variables X and Y are greater than zero, then the joint density function of the polar coordinates is

px 60

now the conversion of cartesian coordinate to the polar coordinate using

px 61

so the probability density function for the positive values will be

px 62

for the different combinations of X and Y the density functions in similar ways are

px 63
px 64
px 65

now from the average of the above densities we can state the density function as

px 66

and the marginal density function from this joint density of polar coordinates over the interval (0, 2π)

px 67
3. Find the joint density function for the functions of random variables

U=X+Y and V=X/(X+Y)

where X and Y are gamma distributed with parameters (α, λ) and (β, λ) respectively.

Using the definition of gamma distribution and joint distribution function the density function for the random variable X and Y will be

px 68
px 69

consider the given functions as

g1 (x,y) =x+y , g2 (x,y) =x/(x+y),

so the differentiation of these function is

px 70
px 71
px 72

now the Jacobian is

px 73

after simplifying the given equations, with x = uv and y = u(1 − v), the probability density function is

px 74
px 75

we can use the relation

px 76
px 77
4. Calculate the joint probability density function for

Y1 =X1 +X2+ X3 , Y2 =X1– X2 , Y3 =X1 – X3

where the random variables X1 , X2, X3 are the standard normal random variables.

Now let us calculate the Jacobian by using partial derivatives of

Y1 =X1 +X2+ X3 , Y2 =X1– X2 , Y3 =X1 – X3

as

px 78

simplifying for variables X1 , X2 and X3

X1 = (Y1 + Y2 + Y3)/3 , X2 = (Y1 – 2Y2 + Y3)/3 , X3 = (Y1 + Y2 -2 Y3)/3

we can generalize the joint density function as

px 79

so we have

px 80

for the normal variable the  joint probability density function is

px 81

hence

px 82

where the index is

px 83
px 84

5. Compute the joint density function of Y1, …, Yn and the marginal density function of Yn, where

px 85

and Xi are independent identically distributed exponential random variables with parameter λ.

for the random variables of the form

Y1 =X1 , Y2 =X1 + X2 , ……, Yn =X1 + ……+ Xn

the Jacobian will be of the form

Image11

and hence its value is one, and the joint density function for the exponential random variable

px 86

and the values of the variable Xi ‘s will be

px 87

so the joint density function is

px 88
px 89
px 90
px 91

Now, to find the marginal density function of Yn, we integrate out the other variables one by one as

px 92
px 93

and

px 94 1
px 94 2

likewise

px 96

if we continue this process we will get

px 97

which is the marginal density function.

Conclusion:

The conditional distribution for discrete and continuous random variables was discussed with different examples, considering several types of random variables, in which independent random variables play an important role; in addition, the joint distribution of functions of jointly continuous random variables was explained with suitable examples. If you require further reading, go through the links below.

For more post on Mathematics, please refer to our Mathematics Page

https://en.wikipedia.org/wiki/joint_probability_distribution/

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Jointly Distributed Random Variables: 11 Important Facts


Jointly distributed random variables

Jointly distributed random variables are two or more random variables whose probabilities are considered jointly; in other words, in experiments where different outcomes share a common probability structure, the variables involved are known as jointly distributed random variables, and their distribution as a joint distribution. Such situations occur frequently when dealing with problems of chance.

Joint distribution function | Joint Cumulative probability distribution function | joint probability mass function | joint probability density function

    For the random variables X and Y the distribution function or joint cumulative distribution function is

gif

where the nature of the joint probability depends on whether the random variables X and Y are discrete or continuous, and the individual distribution functions for X and Y can be obtained from this joint cumulative distribution function as

gif

similarly for Y as

gif

these individual distribution functions of X and Y are known as marginal distribution functions when a joint distribution is under consideration, and they are very helpful for computing probabilities involving X or Y alone.

In addition, the joint probability mass function of the random variables X and Y is defined as

gif

the individual probability mass or density functions for X and Y can be obtained with the help of this joint probability mass or density function, for example in terms of discrete random variables as

gif

and in terms of continuous random variable the joint probability density function will be

gif

where C is any two-dimensional region in the plane, and the joint distribution function for continuous random variables is

image 60

the probability density function from this distribution function can be obtained by differentiating

gif

and the marginal probability from the joint probability density function

gif

as

gif

and

gif

with respect to the random variables X and Y respectively

Examples on Joint distribution

1. Find the joint probabilities of the random variables X and Y, representing the numbers of mathematics and statistics books, when 3 books are taken randomly from a set containing 3 mathematics, 4 statistics and 5 physics books. Since each selection of 3 books is equally likely,

p(i, j) = P(X = i, Y = j) = \frac{\binom{3}{i}\binom{4}{j}\binom{5}{3-i-j}}{\binom{12}{3}}, \qquad \binom{12}{3} = 220

2. Find the joint probability mass function of the numbers of boys and girls in a family chosen randomly from a sample in which 15% of families have no children, 20% have 1 child, 35% have 2 children and 30% have 3 children, where each child is independently equally likely to be a boy or a girl.

The joint probability we will find by using the definition as

Jointly distributed random variables
Jointly distributed random variables : Example

and this we can illustrate in the tabular form as follows

Jointly distributed random variables
Jointly distributed random variables : Example of joint distribution
3. Calculate the probabilities
gif

if for the random variables X and Y the joint probability density function is given by

gif

with the help of definition of joint probability for continuous random variable

gif

and the given joint density function the first probability for the given range will be

gif
gif
gif
gif

in the similar way the probability

gif
gif
gif
gif

and finally

gif
gif
gif
4. Find the joint density function for the quotient X/Y of the random variables X and Y, if their joint probability density function is
gif

To find the probability density function of X/Y we first find its distribution function and then differentiate the result;

so by the definition of joint distribution function and given probability density function we have

F_{X/Y}(a) = P\left\{ \frac{X}{Y} \leq a \right\}
gif
gif
gif
gif

thus by differentiating this distribution function with respect to a we will get the density function as

gif

where a is within zero to infinity.

Independent random variables and joint distribution

In a joint distribution, two random variables X and Y are said to be independent if

gif

where A and B are any real sets; as we already know in terms of events, independent random variables are those whose associated events are independent.

Thus for any values of a and b

gif

and the joint distribution or cumulative distribution function for the independent random variables X and Y will be

gif

if we consider the discrete random variables X and Y then

gif

since

gif
gif
gif
gif

similarly for the continuous random variable also

gif

Example of independent joint distribution

1. If on a specific day the number of patients entering a hospital is Poisson distributed with parameter λ, and each patient is male with probability p and female with probability (1−p), show that the numbers of male and female patients entering the hospital are independent Poisson random variables with parameters λp and λ(1−p).

Denote the numbers of male and female patients by the random variables X and Y; then

gif
gif

since X + Y is the total number of patients entering the hospital, which is Poisson distributed,

gif

and since the probability of a male patient is p and of a female patient is (1−p), the number of males out of a fixed total of patients follows a binomial probability, so

gif

using these two values we will get the above joint probability as

gif
gif
gif

thus probability of male and female patients will be

gif
gif

and

gif

which shows both of them are poisson random variables with the parameters λp and λ(1-p).

2. Find the probability that a person has to wait more than ten minutes at a meeting for a client, if the person and the client each arrive at a time uniformly distributed between 12 pm and 1 pm, independently of each other.

Consider the random variables X and Y denoting the arrival times of the person and the client between 12 and 1, so the joint probability for X and Y is

image 61
gif
gif
gif
gif

3. Calculate

gif

where X, Y and Z are uniform random variables over the interval (0,1).

here the probability will be

gif

for the uniform distribution the density function

gif

for the given range so

gif
gif
gif
gif

SUMS OF INDEPENDENT RANDOM VARIABLES BY JOINT DISTRIBUTION

For the sum of independent continuous random variables X and Y with probability density functions fX and fY, the cumulative distribution function of X + Y is

F_{X+Y}(a) = P(X + Y \leq a) = \iint_{x+y \leq a} f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty}\int_{-\infty}^{a-y} f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} F_X(a-y)\, f_Y(y)\, dy

differentiating this cumulative distribution function gives the probability density function of the sum of these independent variables:

f_{X+Y}(a) = \frac{d}{da}\int_{-\infty}^{\infty} F_X(a-y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(a-y)\, f_Y(y)\, dy

Using these two results, we now look at some continuous random variables and the distributions of their sums.

sum of independent uniform random variables

For random variables X and Y uniformly distributed over the interval (0,1), the probability density function of each of these independent variables is

gif

so for the sum X+Y we have

gif

for any value of a between zero and one,

gif

if instead a lies between one and two, it will be

gif

this gives the triangular shape density function

gif

if we generalize to n independent uniform random variables, their distribution function, by mathematical induction, will be

gif

sum of independent Gamma random variables

If we have two independent gamma random variables with their usual density functions,

gif

then, taking shape parameter s for X and t for Y (with a common λ) and following the convolution formula, the density of the sum of the independent gamma random variables is

f_{X+Y}(a) = \frac{1}{\Gamma(s)\, \Gamma(t)} \int_{0}^{a} \lambda e^{-\lambda (a - y)} (\lambda (a - y))^{s - 1}\; \lambda e^{-\lambda y} (\lambda y)^{t - 1}\, dy

= K e^{-\lambda a} \int_{0}^{a} (a - y)^{s - 1} y^{t - 1}\, dy

= K e^{-\lambda a} a^{s + t - 1} \int_{0}^{1} (1 - x)^{s - 1} x^{t - 1}\, dx \qquad (\text{substituting } y = ax)

= \frac{\lambda e^{-\lambda a} (\lambda a)^{s + t - 1}}{\Gamma(s + t)}

which shows that the sum of independent gamma random variables with parameters (s, λ) and (t, λ) is again a gamma random variable, with parameters (s + t, λ).
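The following is a small verification sketch of this closure property (NumPy and SciPy assumed; the shapes s, t and rate λ are arbitrary choices), comparing simulated sums against the claimed Gamma(s + t, λ) distribution with a Kolmogorov–Smirnov test:

```python
# Check that Gamma(s, lam) + Gamma(t, lam) behaves like Gamma(s + t, lam).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
s, t, lam = 2.0, 3.5, 1.5
x = rng.gamma(shape=s, scale=1 / lam, size=200_000)
y = rng.gamma(shape=t, scale=1 / lam, size=200_000)

# Large p-value suggests the sum is consistent with Gamma(s + t, lam).
print(stats.kstest(x + y, stats.gamma(a=s + t, scale=1 / lam).cdf))
```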

sum of independent exponential random variables

    In the similar way as gamma random variable the sum of independent exponential random variables we can obtain density function and distribution function by just specifically assigning values of gamma random variables.

Sum of independent normal random variable | sum of independent Normal distribution

If we have n independent normal random variables Xᵢ, i = 1, 2, …, n, with respective means μᵢ and variances σᵢ², then their sum is also a normal random variable, with mean Σμᵢ and variance Σσᵢ².

We first show this for two independent normal random variables: X with parameters 0 and σ², and Y with parameters 0 and 1. Let us find the probability density function of the sum X + Y using

gif

in the joint distribution density function

gif

with the help of definition of density function of normal distribution

gif
gif

thus the density function will be

gif
gif
gif

which is nothing but the density function of a normal distribution with mean 0 and variance (1 + σ²). Following the same argument, we can say that

X_{1} + X_{2} \sim N(\mu_{1} + \mu_{2},\; \sigma_{1}^{2} + \sigma_{2}^{2})

with the usual means and variances. Expanding and observing, the sum is normally distributed with mean equal to the sum of the respective means and variance equal to the sum of the respective variances;

thus, in the same way, the sum of n such variables is a normally distributed random variable with mean Σμᵢ and variance Σσᵢ².
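A minimal simulation sketch of this result (NumPy assumed; the particular means, variances and seed are arbitrary): the sum of independent normals should have mean Σμᵢ and variance Σσᵢ².

```python
# Sum of independent normals: mean and variance add.
import numpy as np

rng = np.random.default_rng(4)
mu = np.array([1.0, -2.0, 0.5])
sigma = np.array([1.0, 2.0, 0.5])

samples = rng.normal(mu, sigma, size=(1_000_000, 3)).sum(axis=1)
print(samples.mean(), mu.sum())           # ≈ -0.5
print(samples.var(), (sigma**2).sum())    # ≈ 5.25
```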

Sums of independent Poisson random variables

If we have two independent Poisson random variables X and Y with parameters λ1 and λ2 then their sum X+Y is also Poisson random variable or Poisson distributed

since X and Y are Poisson distributed and we can write their sum as the union of disjoint events so

P\{X + Y = n\} = \sum_{k=0}^{n} P\{X = k, Y = n - k\}

by using the independence of the random variables,

= \sum_{k=0}^{n} P\{X = k\}\, P\{Y = n - k\} = \sum_{k=0}^{n} e^{-\lambda_{1}} \frac{\lambda_{1}^{k}}{k!}\; e^{-\lambda_{2}} \frac{\lambda_{2}^{\,n-k}}{(n-k)!}

= e^{-(\lambda_{1} + \lambda_{2})} \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} \lambda_{1}^{k} \lambda_{2}^{\,n-k}

= e^{-(\lambda_{1} + \lambda_{2})} \frac{(\lambda_{1} + \lambda_{2})^{n}}{n!}

so we get that the sum X + Y is also Poisson distributed, with mean λ₁ + λ₂.
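The following is a small simulation sketch of the Poisson sum property (NumPy assumed; λ₁, λ₂ and the sample size are arbitrary), comparing the empirical pmf of a sum of two Poisson counts with that of a single Poisson(λ₁ + λ₂) count:

```python
# Sum of independent Poisson counts behaves like a single Poisson count.
import numpy as np

rng = np.random.default_rng(5)
lam1, lam2, n = 2.0, 3.0, 1_000_000
z = rng.poisson(lam1, n) + rng.poisson(lam2, n)
w = rng.poisson(lam1 + lam2, n)

for k in range(8):                      # compare empirical pmfs
    print(k, np.mean(z == k), np.mean(w == k))
```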

Sums of independent binomial random variables

If we have two independent binomial random variables X and Y with parameters (n, p) and (m, p), then their sum X + Y is also a binomial random variable, binomially distributed with parameters (n + m, p).

Let us use the probability of the sum with the definition of the binomial distribution (writing q = 1 - p):

P\{X + Y = k\} = \sum_{i=0}^{k} P\{X = i\}\, P\{Y = k - i\}

= \sum_{i=0}^{k} \binom{n}{i} p^{i} q^{n - i} \binom{m}{k - i} p^{k - i} q^{m - k + i}

= p^{k} q^{n + m - k} \sum_{i=0}^{k} \binom{n}{i} \binom{m}{k - i}

which, by the combinatorial identity \sum_{i} \binom{n}{i}\binom{m}{k - i} = \binom{n + m}{k}, gives

P\{X + Y = k\} = \binom{n + m}{k} p^{k} q^{n + m - k}

so the sum X+Y is also binomially distributed with parameter (n+m, p).
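A quick check of the binomial sum property by simulation (NumPy and SciPy assumed; the parameters are arbitrary), comparing the empirical pmf of the sum with the exact Binomial(n + m, p) pmf:

```python
# Binomial(n, p) + Binomial(m, p) matches Binomial(n + m, p).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, m, p, reps = 10, 15, 0.3, 500_000
z = rng.binomial(n, p, reps) + rng.binomial(m, p, reps)

for k in (5, 8, 12):
    print(k, np.mean(z == k), stats.binom.pmf(k, n + m, p))
```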

Conclusion:

The concept of jointly distributed random variables, which describes the distribution of more than one variable in a situation, was discussed, along with the basic concept of independent random variables in terms of the joint distribution and the sums of independent variables for several example distributions and their parameters; if you require further reading, go through the books mentioned below. For more posts on mathematics, please click here.

https://en.wikipedia.org

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Gamma Distribution Exponential Family: 21 Important Facts

Content

  1. Special form of Gamma distributions and relationships of Gamma distribution
  2. Gamma distribution exponential family
  3. Relationship between gamma and normal distribution
  4. Poisson gamma distribution | poisson gamma distribution negative binomial
  5. Weibull gamma distribution
  6. Application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics 
  7. Beta gamma distribution | relationship between gamma and beta distribution
  8. Bivariate gamma distribution
  9. Double gamma distribution
  10. Relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution
  11. Fit gamma distribution
  12. Shifted gamma distribution
  13. Truncated gamma distribution
  14. Survival function of gamma distribution
  15. MLE of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution
  16. Gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution
  17. Confidence interval for gamma distribution
  18. Gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma
  19. Gamma distribution quantile function
  20. Generalized gamma distribution
  21. Beta generalized gamma distribution

Special form of Gamma distributions and relationships of Gamma distribution

In this article we will discuss the special forms of the gamma distribution and its relationships with different continuous and discrete random variables; some estimation methods that use the gamma distribution in sampling from a population are also briefly discussed.

Gamma distribution exponential family

The gamma distribution belongs to the two-parameter exponential family, a large and widely applicable family of distributions: most real-life problems can be modelled within the gamma distribution exponential family, and quick, useful calculations within the exponential family can be done easily. In the two-parameter case, if we take the probability density function as

f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, e^{-\lambda x}\, x^{\alpha - 1}, \qquad x > 0

then, if we restrict α (alpha) to a known value, this two-parameter family reduces to a one-parameter exponential family

f(x) = e^{-\lambda x + \alpha \log \lambda}\, \frac{x^{\alpha - 1}}{\Gamma(\alpha)}

and for λ (lambda)

gif

Relationship between gamma and normal distribution

  In the probability density function of gamma distribution if we take alpha nearer to 50 we will get the nature of density function as

Gamma distribution exponential family

As we increase the shape parameter of the gamma distribution, the resulting curve resembles the normal curve more and more: as the shape parameter α tends to infinity the gamma distribution becomes more symmetric and closer to normal. However, the gamma distribution has semi-infinite support (x > 0) while the normal distribution is supported on the whole real line, so even for very large α the gamma distribution becomes symmetric but is never identical to a normal distribution.

poisson gamma distribution | poisson gamma distribution negative binomial

The Poisson and binomial distributions are discrete distributions whose random variables take discrete values, specifically the numbers of successes and failures in Bernoulli trials. The mixture of the Poisson and gamma distributions, also known as the negative binomial distribution, arises from repeated Bernoulli trials; it can be parameterized in different ways. If X is the trial number at which the r-th success occurs, it can be parameterized as

P(X = n) = \binom{n - 1}{r - 1} p^{r} (1 - p)^{n - r}, \qquad n = r, r + 1, \ldots

and if Y counts the number of failures before the r-th success, it can be parameterized as

P(Y = y) = \binom{y + r - 1}{y} p^{r} (1 - p)^{y}, \qquad y = 0, 1, 2, \ldots

and considering the values of r and p

gif
gif

the general form of the parameterization for the negative binomial or poisson gamma distribution is

P(X = x) = \binom{x + r - 1}{x} p^{r} (1 - p)^{x}

and alternative one is

P(X = x) = \binom{x + r - 1}{r - 1} p^{r} (1 - p)^{x}

this distribution is known as the negative binomial because of the coefficient

\binom{x + r - 1}{x} = \frac{(x + r - 1)(x + r - 2)\cdots r}{x!} = (-1)^{x}\, \frac{(-r - (x - 1))(-r - (x - 2))\cdots(-r)}{x!} = (-1)^{x} \binom{-r}{x}

and this negative binomial or Poisson-gamma distribution is well defined, since the total probability of the distribution sums to one:

\sum_{x=0}^{\infty} \binom{x + r - 1}{x} p^{r} (1 - p)^{x} = 1

The mean and variance of this negative binomial or Poisson-gamma distribution are

E[X] = \frac{r(1 - p)}{p}

Var(X) = \frac{r(1 - p)}{p^{2}}

the poisson and gamma relation we can get by the following calculation

P(X = x) = \int_{0}^{\infty} \frac{e^{-\lambda} \lambda^{x}}{x!}\; \frac{\lambda^{r - 1} e^{-\lambda/\beta}}{\Gamma(r)\, \beta^{r}}\, d\lambda

= \frac{1}{x!\, \Gamma(r)\, \beta^{r}} \int_{0}^{\infty} \lambda^{x + r - 1} e^{-\lambda (1 + 1/\beta)}\, d\lambda

= \frac{\Gamma(x + r)}{x!\, \Gamma(r)}\, \frac{\beta^{x}}{(1 + \beta)^{x + r}}

which is the negative binomial probability with p = 1/(1 + β).

Thus the negative binomial is the mixture of the Poisson and gamma distributions, and this distribution is used in modelling everyday problems where a mixture of discrete and continuous behaviour is required.
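A small simulation sketch of this Poisson–gamma mixture (NumPy and SciPy assumed; r, p and the sample size are arbitrary): drawing λ from a gamma distribution with shape r and scale (1 - p)/p and then a Poisson count given λ should reproduce the negative binomial pmf.

```python
# Poisson-gamma mixture reproduces the negative binomial distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
r, p, reps = 3.0, 0.4, 500_000
lam = rng.gamma(shape=r, scale=(1 - p) / p, size=reps)
x = rng.poisson(lam)                     # Poisson count with random mean

for k in (0, 2, 5, 10):
    print(k, np.mean(x == k), stats.nbinom.pmf(k, r, p))
```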


Weibull gamma distribution

There are generalizations of the exponential distribution that involve the Weibull as well as the gamma distribution. The Weibull distribution has probability density function

f(x) = \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k - 1} e^{-(x/\lambda)^{k}}, \qquad x \ge 0

and cumulative distribution function

F(x) = 1 - e^{-(x/\lambda)^{k}}, \qquad x \ge 0

The pdf and cdf of the gamma distribution have already been discussed above. The main connection between the Weibull and gamma distributions is that both are generalizations of the exponential distribution; the difference between them is that when the power of the variable is greater than one the Weibull distribution gives quicker results, while for powers less than one the gamma distribution gives quicker results.

     We will not discuss here generalized Weibull gamma distribution that require separate discussion.

application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics 

  There are number of  application where gamma distribution is used to model the situation such as insurance claim to aggregate, rainfall amount accumulation, for any product its manufacturing and distribution, the crowd on specific web,  and in telecom exchange etc. actually the gamma distribution give the wait time prediction till next event for nth event. There are number of application of gamma distribution in real life.

beta gamma distribution | relationship between gamma and beta distribution

The beta distribution is the distribution of a random variable with probability density function

f(x) = \frac{1}{B(a, b)}\, x^{a - 1} (1 - x)^{b - 1}, \qquad 0 < x < 1

where

B(a, b) = \int_{0}^{1} x^{a - 1} (1 - x)^{b - 1}\, dx

which is related to the gamma function by

B(a, b) = \frac{\Gamma(a)\, \Gamma(b)}{\Gamma(a + b)}

and the beta distribution is related to the gamma distribution as follows: if X and Y are independent gamma random variables with a common scale parameter, X with shape parameter α and Y with shape parameter β, then the random variable X/(X+Y) has a beta distribution.

In other words, if X is Gamma(α, 1) and Y is Gamma(β, 1), independent, then the random variable X/(X+Y) is Beta(α, β).

and also

gif
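A minimal verification sketch of the gamma–beta relationship above (NumPy and SciPy assumed; the shape parameters are arbitrary), testing X/(X + Y) against the Beta(α, β) distribution:

```python
# With independent X ~ Gamma(a, 1) and Y ~ Gamma(b, 1),
# the ratio X / (X + Y) should follow a Beta(a, b) law.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
a, b, reps = 2.0, 5.0, 200_000
x = rng.gamma(a, 1.0, reps)
y = rng.gamma(b, 1.0, reps)
print(stats.kstest(x / (x + y), stats.beta(a, b).cdf))
```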

bivariate gamma distribution

     A two dimensional or bivariate random variable is continuous if there exists a function f(x,y) such that the joint distribution function

gif

where

gif
gif

and the joint probability density function obtained by

gif

there are number of bivariate gamma distribution one of them is the bivariate gamma distribution with probability density function as

gif

double gamma distribution

The double gamma distribution is a bivariate distribution of gamma random variables with shape parameters α₁ and α₂ and scale one, with joint probability density function

f(y_{1}, y_{2}) = \frac{1}{\Gamma(\alpha_{1})\, \Gamma(\alpha_{2})}\, y_{1}^{\alpha_{1} - 1}\, y_{2}^{\alpha_{2} - 1}\, e^{-y_{1} - y_{2}}, \qquad y_{1} > 0,\; y_{2} > 0

this density forms the double gamma distribution with respective random variables and the moment generating function for double gamma distribution is

M(t_{1}, t_{2}) = (1 - t_{1})^{-\alpha_{1}} (1 - t_{2})^{-\alpha_{2}}

relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution

Since the exponential distribution is the distribution with probability density function

f(x) = \lambda e^{-\lambda x}, \qquad x > 0

and the gamma distribution has probability density function

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, \qquad x > 0

clearly, putting α = 1 gives the exponential distribution; that is, the gamma distribution is nothing but a generalization of the exponential distribution, predicting the waiting time until the occurrence of the nth event, while the exponential distribution predicts the waiting time until the occurrence of the next event.

fit gamma distribution

Fitting given data to a gamma distribution means finding the parameters of its probability density function, which may involve shape, location and scale; finding these parameters for a particular application, and then calculating the mean, variance, standard deviation and moment generating function, constitutes fitting the gamma distribution. Since different real-life problems are modelled by the gamma distribution, the available information must be fitted to it, and for this purpose various techniques already exist in various environments, e.g. in R, Matlab, Excel, etc.
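As a minimal fitting sketch (SciPy assumed; the synthetic data and parameter values are only for illustration), the shape and scale of a gamma distribution can be estimated from data by maximum likelihood with scipy.stats.gamma.fit, holding the location parameter at zero:

```python
# Fit a two-parameter gamma distribution (shape and scale) by maximum likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
data = rng.gamma(shape=2.5, scale=1.8, size=5_000)   # synthetic sample

shape, loc, scale = stats.gamma.fit(data, floc=0)    # floc=0 fixes location at zero
print(shape, scale)                                  # ≈ 2.5 and 1.8
```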

shifted gamma distribution

Whenever an application requires shifting the two-parameter gamma distribution, a generalized three-parameter (or other generalized) gamma distribution shifts the shape, location and scale; such a gamma distribution is known as a shifted gamma distribution.

truncated gamma distribution

     If we restrict the range or domain of the gamma distribution for the shape scale and location parameters the restricted gamma distribution is known as truncated gamma distribution based on the conditions.

survival function of gamma distribution

The survival function for the gamma distribution is defined as the function s(x) = 1 - F(x), where F is the cumulative distribution function, that is

s(x) = 1 - \int_{0}^{x} \frac{\lambda e^{-\lambda t} (\lambda t)^{\alpha - 1}}{\Gamma(\alpha)}\, dt, \qquad x > 0

mle of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution

We know that maximum likelihood takes a sample from the population as representative, and uses this sample in the likelihood of the probability density function, which is then maximized over the parameters of the density. Before turning to the gamma distribution, recall some basics: for a random variable X whose probability density function has parameter θ, the likelihood function is

this we can express as

and method of maximizing this likelihood function can be

if such theta satisfy this equation, and as log is monotone function we can write in terms of log

and such a supremum exists if

\frac{\partial}{\partial \theta_{i}} \log L(\theta_{1}, \ldots, \theta_{k}) = 0, \qquad i = 1, \ldots, k

now we apply the maximum likelihood for the gamma distribution function as

gif

the log likelihood of the function will be

gif

so is

gif

and hence

gif

This can be achieved also as

\textbf{L}(\alpha, \beta \mid x) = \left(\frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x_{1}^{\alpha - 1} e^{-\beta x_{1}}\right) \cdots \left(\frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x_{n}^{\alpha - 1} e^{-\beta x_{n}}\right) = \left(\frac{\beta^{\alpha}}{\Gamma(\alpha)}\right)^{n} (x_{1} x_{2} \cdots x_{n})^{\alpha - 1}\, e^{-\beta \sum_{i} x_{i}}

by

gif

and the parameter can be obtained by differentiating

gif
gif
gif

gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution

We can calculate the moments of the population and of the sample using nth-order expectations; the method of moments equates the moments of the distribution to the moments of the sample in order to estimate the parameters. Suppose we have a sample from a gamma random variable with probability density function

f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\lambda x}, \qquad x > 0

we know that the first two moments of this probability density function are

\mu_{1} = \frac{\alpha}{\lambda}, \qquad \mu_{2} = \frac{\alpha(\alpha + 1)}{\lambda^{2}}

so

gif

we will get from the second moment if we substitute lambda

\frac{\mu_{2}}{\mu_{1}^{2}} = \frac{\alpha + 1}{\alpha}

and from this value of alpha is

\alpha = \frac{\mu_{1}^{2}}{\mu_{2} - \mu_{1}^{2}}

and now lambda will be

\lambda = \frac{\mu_{1}}{\mu_{2} - \mu_{1}^{2}}

and the moment estimators using the sample will be

\hat{\alpha} = \frac{\bar{X}^{2}}{\frac{1}{n}\sum_{i} X_{i}^{2} - \bar{X}^{2}}, \qquad \hat{\lambda} = \frac{\bar{X}}{\frac{1}{n}\sum_{i} X_{i}^{2} - \bar{X}^{2}}
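A short numerical sketch of these method-of-moments estimators (NumPy assumed; the true parameters and sample size are arbitrary):

```python
# Method of moments for the gamma distribution:
# alpha_hat = m1**2 / (m2 - m1**2), lambda_hat = m1 / (m2 - m1**2).
import numpy as np

rng = np.random.default_rng(10)
x = rng.gamma(shape=3.0, scale=1 / 2.0, size=10_000)   # true alpha = 3, lambda = 2

m1, m2 = x.mean(), np.mean(x**2)
alpha_hat = m1**2 / (m2 - m1**2)
lambda_hat = m1 / (m2 - m1**2)
print(alpha_hat, lambda_hat)                            # ≈ 3 and 2
```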

confidence interval for gamma distribution

   confidence interval for gamma distribution is the way to estimate the information and its uncertainty which tells the interval is expected to have the true value of the parameter at what percent, this confidence interval is obtained from the observations of random variables, since it is obtained from random it itself is random to get the confidence interval for the gamma distribution there are different techniques in different application that we have to follow.

gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma

The posterior and prior distributions are terminology from Bayesian probability theory; two distributions are conjugate to each other if the posterior obtained from one family of priors belongs to the same family. In terms of θ, let us show that the gamma distribution is a conjugate prior for the exponential distribution.

If the probability density function of the gamma prior distribution, in terms of θ, is

f(\theta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, \theta^{\alpha - 1} e^{-\beta \theta}

and, from the given data x₁, …, xₙ, we assume the likelihood for θ is exponential,

f(x_{1}, \ldots, x_{n} \mid \theta) = \theta^{n}\, e^{-\theta \sum_{i} x_{i}}

so the joint distribution will be

gif

and using the relation

gif

we have

gif
gif
gif

which is

\pi(\theta \mid x_{1}, \ldots, x_{n}) \propto \theta^{\alpha + n - 1}\, e^{-\theta (\beta + \sum_{i} x_{i})}

i.e. a Gamma(α + n, β + Σᵢxᵢ) density,

so the gamma distribution is a conjugate prior for the exponential distribution, since the posterior is again a gamma distribution.
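A minimal sketch of this conjugate update (NumPy assumed; the prior hyperparameters, true rate and sample size are arbitrary): with a Gamma(α, β) prior, in the rate parameterization, on the exponential rate θ, observing x₁, …, xₙ gives the posterior Gamma(α + n, β + Σxᵢ).

```python
# Conjugate update: gamma prior on the exponential rate theta.
import numpy as np

rng = np.random.default_rng(11)
alpha0, beta0 = 2.0, 1.0                       # prior shape and rate
theta_true = 0.7
x = rng.exponential(1 / theta_true, size=50)   # exponential data

alpha_post = alpha0 + len(x)
beta_post = beta0 + x.sum()
print(alpha_post, beta_post, alpha_post / beta_post)   # posterior mean of theta
```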

gamma distribution quantile function

   Qauntile function of gamma distribution will be the function that gives the points in gamma distribution which relate the rank order of the values in gamma distribution, this require cumulative distribution function and for different language different algorithm and functions for the quantile of gamma distribution.

generalized gamma distribution

    As gamma distribution itself is the generalization of exponential family of distribution adding more parameters to this distribution gives us generalized gamma distribution which is the further generalization of this distribution family, the physical requirements gives different generalization one of the frequent one is using the probability density function as

gif

the cumulative distribution function for such generalized gamma distribution can be obtained by

gif

where the numerator represents the incomplete gamma function as

\gamma(a, x) = \int_{0}^{x} t^{a - 1} e^{-t}\, dt

using this incomplete gamma function the survival function for the generalized gamma distribution can be obtained as

gif

another version of this three parameter generalized gamma distribution having probability density function is

gif

where k, β, θ are parameters, all greater than zero; this generalization has convergence issues, and to overcome them the parameters are replaced following the Weibull parameterization.

Using this parameterization, the density function converges, so the more general gamma distribution with convergence is the distribution with probability density function

f(t) = \frac{\left|\lambda\right|}{\sigma\, t\, \Gamma(1/\lambda^{2})}\, \exp\!\left[\frac{\lambda\, \frac{\ln t - \mu}{\sigma} + \ln\!\left(\frac{1}{\lambda^{2}}\right) - e^{\lambda\, \frac{\ln t - \mu}{\sigma}}}{\lambda^{2}}\right], \qquad \lambda \neq 0

Beta generalized gamma distribution

   The gamma distribution involving the parameter beta in the density function because of which sometimes gamma distribution is known as the beta generalized gamma distribution with the density function

gif
gif

with cumulative distribution function as

gif

which is already discussed in detail in the discussion of gamma distribution, the further beta generalized gamma distribution is defined with the cdf as

gif

where B(a,b) is the beta function , and the probability density function for this can be obtained by differentiation and the density function will be

gif

here the G(x) is the above defined cumulative distribution function of gamma distribution, if we put this value then the cumulative distribution function of beta generalized gamma distribution is

F(x) = \frac{1}{B(a, b)} \int_{0}^{\gamma(\beta, x)/\Gamma(\beta)} \omega^{a - 1} (1 - \omega)^{b - 1}\, d\omega

and the probability density function

gif

the remaining properties can be extended for this beta generalized gamma distribution with usual definitions.

Conclusion:

There are different forms and generalization of gamma distribution and Gamma distribution exponential family as per the real life situations so possible such forms and generalizations were covered in addition with the estimation  methods of gamma distribution in population sampling of information, if you require further reading on Gamma distribution exponential family, please go through below link and books. For more topics on Mathematics please visit our page.

https://en.wikipedia.org/wiki/Gamma_distribution

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Inverse Gamma Distribution: 21 Important Facts

Inverse gamma distribution and moment generating function of gamma distribution

      In continuation with gamma distribution we will see the concept of inverse gamma distribution and moment generating function, measure of central tendencies mean, mode and median of gamma distribution by following some of the basic properties of gamma distribution.

gamma distribution properties

Some of the important properties of gamma distribution are enlisted as follows

1. The probability density function for the gamma distribution is

gif

or

gif

where the gamma function is

gif

2. The cumulative distribution function for the gamma distribution is

gif

where f(x) is the probability density function as given above.

3. The mean and variance of the gamma distribution are

E[X] = α/λ

and

Var(X) = α/λ²

respectively, or, in the scale parameterization,

E[X]=α*β

and

Var(X) = α*β²
4. The moment generating function M(t) for the gamma distribution is

M(t) = \left(1 - \frac{t}{\lambda}\right)^{-\alpha}, \qquad t < \lambda

or, in the scale parameterization,

M(t) = (1 - \beta t)^{-\alpha}, \qquad t < \frac{1}{\beta}

5. The curves for the pdf and cdf are shown below.

Inverse gamma distribution

6. The inverse gamma distribution can be defined as the distribution of the reciprocal of a gamma random variable, with the probability density function given in the next section.

7. The sum of independent gamma random variables with a common rate parameter is again gamma distributed, with the shape parameters added.

inverse gamma distribution | normal inverse gamma distribution

If X is a gamma random variable with probability density function

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}

or

f(x) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\beta x}

and we take the reciprocal (inverse) of the variable, Y = 1/X, then the probability density function of Y will be

Thus the random variable with this probability density function is known to be the inverse gamma random variable or inverse gamma distribution or inverted gamma distribution.

f_{Y}(y) = f_{X}(1/y) \left| \frac{d}{dy}\, y^{-1} \right| = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \left(\frac{1}{y}\right)^{\alpha - 1} e^{-\beta / y}\, y^{-2} = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, y^{-\alpha - 1}\, e^{-\beta / y}

The above probability density function can be written with either parameter (λ or β); this density, obtained from the reciprocal of a gamma random variable, is the probability density function of the inverse gamma distribution.

Cumulative distribution function or cdf of inverse gamma distribution

                The cumulative distribution function for the inverse gamma distribution is the distribution function

gif

in which the f(x) is the probability density function of the inverse gamma distribution as

Mean and variance of the inverse gamma distribution

  The mean and variance of the inverse gamma distribution by following the usual definition of expectation and variance will be

E[X] = \frac{\beta}{\alpha - 1}, \qquad \alpha > 1

and

Var(X) = \frac{\beta^{2}}{(\alpha - 1)^{2} (\alpha - 2)}, \qquad \alpha > 2

Mean and variance of the inverse gamma distribution proof

        To get the mean and variance of the inverse gamma distribution using the probability density function

and the definition of expectations, we first find the expectation for any power of x as

E[X^{n}] = \int_{0}^{\infty} x^{n} f(x)\, dx = \frac{\beta^{n}\, \Gamma(\alpha - n)}{\Gamma(\alpha)} = \frac{\beta^{n}}{(\alpha - 1)(\alpha - 2)\cdots(\alpha - n)}

in the above integral we used the density function as

now, for α greater than one and n = 1,

E[X] = \frac{\beta}{\alpha - 1}

similarly, for n = 2 and α greater than 2,

E[X^{2}] = \frac{\beta^{2}}{(\alpha - 1)(\alpha - 2)}

using these expectations gives the value of the variance as

Var(X) = E[X^{2}] - (E[X])^{2} = \frac{\beta^{2}}{(\alpha - 1)^{2}(\alpha - 2)}
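These formulas can be checked numerically; the following is a small sketch (SciPy assumed; α and β are arbitrary) comparing them against scipy.stats.invgamma:

```python
# Inverse gamma mean beta/(alpha-1) and variance beta**2/((alpha-1)**2 (alpha-2)).
from scipy import stats

alpha, beta = 4.0, 3.0
dist = stats.invgamma(a=alpha, scale=beta)
print(dist.mean(), beta / (alpha - 1))
print(dist.var(), beta**2 / ((alpha - 1) ** 2 * (alpha - 2)))
```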

Inverse gamma distribution plot | Inverse gamma distribution graph

                The inverse gamma distribution is the reciprocal of the gamma distribution so while observing the gamma distribution it is good to observe the nature of the curves of inverse gamma distribution having probability density function as

and the cumulative distribution function by following

gif
Inverse gamma distribution
Inverse gamma distribution graph

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 1 and varying the value of β.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 2 and varying the value of β

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 3 and varying the value of β.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 1 and varying the value of α.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 2 and varying the value of α

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 3 and varying the value of α.

moment generating function of gamma distribution

Before understanding the concept of moment generating function for the gamma distribution let us recall some concept of moment generating function

Moments

The moments of a random variable are defined with the help of the expectation as

\mu_{r}' = E[X^{r}]

this is known as the r-th moment of the random variable X; it is the moment about the origin, commonly known as the raw moment.

If we take the r-th moment of the random variable about the mean μ as

\mu_{r} = E[(X - \mu)^{r}]

this moment about the mean is known as the central moment, and the expectation is taken according to the nature of the random variable as

\mu_{r} = \sum_{x} (x - \mu)^{r}\, p(x) \qquad \text{(discrete case)}

\mu_{r} = \int_{-\infty}^{\infty} (x - \mu)^{r}\, f(x)\, dx \qquad \text{(continuous case)}

in the central moments, if we put in the first few values of r we get

\mu_{0} = 1, \qquad \mu_{1} = 0, \qquad \mu_{2} = \sigma^{2}

If we use the binomial expansion in the central moments, we can easily get the relationship between the central and raw moments as

\mu_{r} = E[(X - \mu)^{r}] = \sum_{j=0}^{r} \binom{r}{j} (-\mu)^{j}\, \mu_{r-j}'

some of the initial relationships are as follows

Moment generating function

We can generate the moments with the help of a function known as the moment generating function, defined as

M_{X}(t) = E[e^{tX}]

this function generates the moments with the help of the expansion of the exponential function, in either of the forms

M_{X}(t) = \sum_{x} e^{tx}\, p(x) \qquad \text{or} \qquad M_{X}(t) = \int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx

using the Taylor expansion this becomes

M_{X}(t) = 1 + \mu_{1}' t + \mu_{2}' \frac{t^{2}}{2!} + \cdots + \mu_{r}' \frac{t^{r}}{r!} + \cdots

differentiating this expanded function with respect to t gives the different moments as

\mu_{r}' = \frac{d^{r}}{dt^{r}} M_{X}(t)\Big|_{t=0}

or, in another way, if we take the derivative directly,

\frac{d}{dt} M_{X}(t) = \frac{d}{dt} E[e^{tX}] = E[X e^{tX}]

since for the discrete case

E[X e^{tX}] = \sum_{x} x\, e^{tx}\, p(x)

and for the continuous case we have

E[X e^{tX}] = \int_{-\infty}^{\infty} x\, e^{tx}\, f(x)\, dx

so for t = 0 we get

M_{X}'(0) = E[X]

likewise

M_{X}''(t) = E[X^{2} e^{tX}]

so that

M_{X}''(0) = E[X^{2}]

and in general

M_{X}^{(n)}(0) = E[X^{n}]

There are two important relations for moment generating functions:

M_{aX + b}(t) = e^{bt} M_{X}(at), \qquad M_{X+Y}(t) = M_{X}(t)\, M_{Y}(t) \quad \text{(X and Y independent)}

moment generating function of a gamma distribution | mgf of gamma distribution | moment generating function for gamma distribution

Now for the gamma distribution, the moment generating function M(t) for the pdf

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}

is

M(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, \qquad t < \lambda

and for the pdf

f(x) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}

the moment generating function is

M(t) = (1 - \beta t)^{-\alpha}, \qquad t < \frac{1}{\beta}

gamma distribution moment generating function proof | mgf of gamma distribution proof

Now first take the probability density function in the form

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, \qquad x > 0

and using the definition of the moment generating function M(t) we have

M(t) = E[e^{tX}] = \int_{0}^{\infty} e^{tx}\, \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}\, dx = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, \qquad t < \lambda

We can find the mean and variance of the gamma distribution with the help of the moment generating function, by differentiating it twice with respect to t:

M'(t) = \alpha \lambda^{\alpha} (\lambda - t)^{-\alpha - 1}, \qquad M''(t) = \alpha(\alpha + 1) \lambda^{\alpha} (\lambda - t)^{-\alpha - 2}

if we put t = 0, the first value will be

E[X] = M'(0) = \frac{\alpha}{\lambda}

and

E[X^{2}] = M''(0) = \frac{\alpha(\alpha + 1)}{\lambda^{2}}

Now putting the values of these expectations in

Var(X) = E[X^{2}] - (E[X])^{2} = \frac{\alpha}{\lambda^{2}}

Alternately, for the pdf of the form

f(x) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}

the moment generating function will be

M(t) = \int_{0}^{\infty} e^{tx}\, \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}\, dx = \left(\frac{1}{1 - \beta t}\right)^{\alpha} \int_{0}^{\infty} \frac{y^{\alpha - 1} e^{-y}}{\Gamma(\alpha)}\, dy = (1 - \beta t)^{-\alpha}, \qquad t < \frac{1}{\beta}

and differentiating and putting t = 0 gives the mean and variance as

E[X] = \alpha\beta, \qquad Var(X) = \alpha\beta^{2}

2nd moment of gamma distribution

The second moment of the gamma distribution is obtained by differentiating the moment generating function twice and putting t = 0 in the second derivative, which gives

E[X^{2}] = \alpha(\alpha + 1)\beta^{2}

third moment of gamma distribution

The third moment of the gamma distribution can be found by differentiating the moment generating function three times and putting t = 0 in the third derivative, which gives

E[X^{3}] = \alpha(\alpha + 1)(\alpha + 2)\beta^{3}

or directly by integrating as

E[X^{3}] = \int_{0}^{\infty} x^{3}\, \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}\, dx = \alpha(\alpha + 1)(\alpha + 2)\beta^{3}

 sigma for gamma distribution

The sigma or standard deviation of the gamma distribution is found by taking the square root of its variance, either

\sigma = \sqrt{\frac{\alpha}{\lambda^{2}}} = \frac{\sqrt{\alpha}}{\lambda}

or

\sigma = \sqrt{\alpha\beta^{2}} = \beta\sqrt{\alpha}

for any defined value of alpha, beta and lambda.

characteristic function of gamma distribution | gamma distribution characteristic function

If the variable t in the moment generating function is taken to be purely imaginary, t = iω, then the resulting function is known as the characteristic function of the gamma distribution, denoted and expressed as

\phi_{X}(\omega) = M_{X}(i\omega) = E[e^{i\omega X}]

since, for any random variable, the characteristic function is

\phi_{X}(\omega) = E[e^{i\omega X}] = \int_{-\infty}^{\infty} e^{i\omega x}\, f(x)\, dx

Thus for the gamma distribution, following its pdf, the characteristic function is

\phi(t) = (1 - i\beta t)^{-\alpha}

following

\int_{0}^{\infty} e^{itx}\, \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}\, dx = \frac{1}{\Gamma(\alpha)\, \beta^{\alpha}} \int_{0}^{\infty} x^{\alpha - 1} e^{-x(1/\beta - it)}\, dx = \frac{\Gamma(\alpha)\, \beta^{\alpha} (1 - i\beta t)^{-\alpha}}{\Gamma(\alpha)\, \beta^{\alpha}} = (1 - i\beta t)^{-\alpha}

There is another form of this characteristic function: if we use the rate parameterization, so that

X \sim \text{Gamma}(\alpha, \lambda)

then

\phi(t) = \left(1 - \frac{it}{\lambda}\right)^{-\alpha}

sum of gamma distributions | sum of exponential distribution gamma

  To know the result of sum of gamma distribution we must first of all understand sum of independent random variable for the continuous random variable, for this let us have probability density functions for the continuous random variables X and Y then the cumulative distribution function for the sum of random variables will be

F_{X+Y}(a) = P\{X + Y \le a\} = \int\!\!\int_{x + y \le a} f_{X}(x)\, f_{Y}(y)\, dx\, dy = \int_{-\infty}^{\infty} F_{X}(a - y)\, f_{Y}(y)\, dy

differentiating this convolution of integral for the probability density functions of X and Y will give the probability density function for the sum of random variables as

f_{X+Y}(a) = \frac{d}{da} \int_{-\infty}^{\infty} F_{X}(a - y)\, f_{Y}(y)\, dy = \int_{-\infty}^{\infty} \frac{d}{da} F_{X}(a - y)\, f_{Y}(y)\, dy = \int_{-\infty}^{\infty} f_{X}(a - y)\, f_{Y}(y)\, dy

Now let us prove if X and Y are the gamma random variables with respective density functions then there sum will also be gamma distribution with sum of same parameters

considering the probability density function of the form

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, \qquad x > 0

with α = s for the random variable X and α = t for the random variable Y, and using the density of the sum of random variables, we have

f_{X+Y}(a) = \frac{1}{\Gamma(s)\, \Gamma(t)} \int_{0}^{a} \lambda e^{-\lambda (a - y)} (\lambda (a - y))^{s - 1}\; \lambda e^{-\lambda y} (\lambda y)^{t - 1}\, dy = C e^{-\lambda a} a^{s + t - 1}

here C is independent of a; since the density must integrate to one, the constant works out and the value will be

f_{X+Y}(a) = \frac{\lambda e^{-\lambda a} (\lambda a)^{s + t - 1}}{\Gamma(s + t)}

which is the probability density function of a gamma distribution with parameters (s + t, λ); hence the sum of independent gamma random variables is again gamma distributed, with the respective shape parameters added.

mode of gamma distribution

To find the mode of the gamma distribution, consider the probability density function

f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, \qquad x > 0

now differentiating this pdf with respect to x gives

f'(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, e^{-\lambda x}\, x^{\alpha - 2} \left[(\alpha - 1) - \lambda x\right]

this will be zero for x=0 or x=(α -1)/λ

so these are the only critical points at which the first derivative is zero; if α is greater than or equal to one then x = 0 is not the mode, because the pdf vanishes there, so the mode is (α - 1)/λ,

and for α strictly less than one the density decreases from infinity to zero as x increases from zero to infinity, so there is no interior maximum; hence the mode of the gamma distribution is

\text{mode} = \frac{\alpha - 1}{\lambda}, \qquad \alpha \ge 1
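A quick numerical sketch of the mode (NumPy and SciPy assumed; α and λ are arbitrary), locating the maximum of the gamma pdf on a grid and comparing it with (α - 1)/λ:

```python
# Numerically locate the maximum of the gamma pdf and compare with (alpha - 1)/lambda.
import numpy as np
from scipy import stats

alpha, lam = 3.0, 2.0
x = np.linspace(1e-6, 10, 200_001)
pdf = stats.gamma(a=alpha, scale=1 / lam).pdf(x)
print(x[np.argmax(pdf)], (alpha - 1) / lam)   # both ≈ 1.0
```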

median of gamma distribution

The median of the gamma distribution can be found with the help of inverse gamma distribution as

gif

or

gif

provided

gif

which, for a gamma distribution with shape parameter n and scale 1, gives the asymptotic expansion

\text{median}(n) = n + \frac{2}{3} + \frac{8}{405 n} - \frac{64}{5103 n^{2}} + \cdots

gamma distribution shape

     Gamma distribution takes different shape depending on the shape parameter when shape parameter is one gamma distribution is equal to the exponential distribution but when we vary the shape parameter the skewness of the curve of gamma distribution decreases as the increase in the shape parameter, in another words the shape of the curve of gamma distribution changes as per the standard deviation .

skewness of gamma distribution

The skewness of any distribution can be observed from its probability density function and the skewness coefficient

\text{skewness} = \frac{E[(X - \mu)^{3}]}{\sigma^{3}} = \frac{\mu_{3}}{\sigma^{3}}

for the gamma distribution we have

E(X^{k}) = \frac{(\alpha + k - 1)(\alpha + k - 2)\cdots \alpha}{\lambda^{k}}

so

\text{skewness} = \frac{2}{\sqrt{\alpha}}

this shows the skewness depends on alpha only if alpha increases to infinity curve will be more symmetric and sharp and when alpha goes to zero the gamma distribution density curve positively skewed which can be observed in the density graphs.

generalized gamma distribution | shape and scale parameter in gamma distribution | three parameter gamma distribution | multivariate gamma distribution

f(x) = \frac{\left(\frac{x - \mu}{\beta}\right)^{\gamma - 1} e^{-\frac{x - \mu}{\beta}}}{\beta\, \Gamma(\gamma)}, \qquad x > \mu

where γ, μ and β are the shape, location and scale parameters respectively; by assigning specific values to these parameters we get the two-parameter gamma distribution. Specifically, if we put μ = 0, β = 1, we get the standard gamma distribution

f(x) = \frac{x^{\gamma - 1} e^{-x}}{\Gamma(\gamma)}, \qquad x \ge 0

using this 3 parameter gamma distribution probability density function we can find the expectation and variance by following there definition respectively.

Conclusion:

The concept of reciprocal of gamma distribution that is inverse gamma distribution in comparison with gamma distribution and measure of central tendencies of gamma distribution with the help of moment generating function were the focus of this article, if you require further reading go through suggested books and links. For more post on mathematics, visit our mathematics page.

https://en.wikipedia.org/wiki/Gamma_distribution

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Gamma Distribution: 7 Important Properties You Should Know

Gamma Distribution

The Gamma distribution is one of the continuous distributions: as a continuous random variable deals with continuous values or intervals, so does the Gamma distribution, with a specific probability density function. In the discussion that follows we cover in detail the concept, properties and results, with examples, of the gamma random variable and the gamma distribution.

Gamma random variable or Gamma distribution | what is gamma distribution | define gamma distribution | gamma distribution density function | gamma distribution probability density function | gamma distribution proof

A continuous random variable with probability density function

f(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)} & x \ge 0 \\ 0 & x < 0 \end{cases}

is known as a Gamma random variable or Gamma distribution, where α > 0, λ > 0 and the gamma function is

\Gamma(\alpha) = \int_{0}^{\infty} e^{-y}\, y^{\alpha - 1}\, dy

we have the very frequently used property of the gamma function, obtained by integration by parts, as

\Gamma(\alpha) = \int_{0}^{\infty} e^{-y}\, y^{\alpha - 1}\, dy = \left[-e^{-y} y^{\alpha - 1}\right]_{0}^{\infty} + \int_{0}^{\infty} e^{-y} (\alpha - 1) y^{\alpha - 2}\, dy = (\alpha - 1)\, \Gamma(\alpha - 1)

If we continue the process, starting from an integer n, then

\Gamma(n) = (n - 1)\Gamma(n - 1) = (n - 1)(n - 2)\Gamma(n - 2) = (n - 1)(n - 2)\cdots 3 \cdot 2 \cdot \Gamma(1)

and lastly the value of gamma of one will be

\Gamma(1) = \int_{0}^{\infty} e^{-y}\, dy = 1

thus the value will be

\Gamma(n) = (n - 1)!

cdf of gamma distribution | cumulative gamma distribution | integration of gamma distribution

The cumulative distribution function (cdf) of a gamma random variable, or simply the distribution function of a gamma random variable, is defined as for any continuous random variable, with the gamma probability density function used in place of a generic density, i.e.

gif

here the probability density function is as defined above for the gamma distribution, the cumulative distribution function we can write also as

gif

in both of the above formats the value of pdf is as follows

gif

where the α >0, λ>0 are real numbers.

Gamma distribution formula | formula for gamma distribution | gamma distribution equation | gamma distribution derivation

To find the probability for the gamma random variable the probability density function we have to use for different given α >0 , λ >0 is as

gif


and using the above pdf the distribution for the gamma random variable we can obtain by

gif

Thus the gamma distribution formula require the pdf value and the limits for the gamma random variable as per the requirement.

Gamma distribution example


show that the total probability for the gamma distribution is one with the given probability density function i.e

gif

for λ >0, α>0.
Solution:
using the formula for the gamma distribution

gif
gif

since the probability density function for the gamma distribution is

gif


which is zero for all the value less than zero so the probability will be now

gif
gif

using the definition of gamma function

gif

and substitution we get

gif

thus

gif

Gamma distribution mean and variance | expectation and variance of gamma distribution | expected value and variance of gamma distribution | Mean of gamma distribution | expected value of gamma distribution | expectation of gamma distribution


In the following discussion we will find the mean and variance for the gamma distribution with the help of standard definitions of expectation and variance of continuous random variables,

The expected value or mean of the continuous random variable X with probability density function

gif

or Gamma random variable X will be

gif

mean of gamma distribution proof | expected value of gamma distribution proof

To obtain the expected value or mean of gamma distribution we will follow the gamma function definition and property,
first by the definition of expectation of continuous random variable and probability density function of gamma random variable we have

gif
gif
gif

by cancelling the common factor and using the definition of gamma function

gif

now as we have the property of gamma function

gif

the value of the expectation will be

E[X] = \frac{\alpha\, \Gamma(\alpha)}{\lambda\, \Gamma(\alpha)}

thus the mean or expected value of the gamma random variable or gamma distribution is

E[X] = \frac{\alpha}{\lambda}

variance of gamma distribution | variance of a gamma distribution

The variance for the gamma random variable with the given probability density function

gif

or variance of the gamma distribution will be

gif

variance of gamma distribution proof


As we know that the variance is the difference of the expected values as

gif

for the gamma distribution we already have the value of mean

gif

now first let us calculate the value of E[X2], so by definition of expectation for the continuous random variable we have
since the function f(x) is the probability distribution function of gamma distribution as

gif

so the integral will be from zero to infinity only

gif
gif

so by definition of the gamma function we can write

gif
gif

Thus, using the property of the gamma function, we get

E[X^{2}] = \frac{\alpha(\alpha + 1)}{\lambda^{2}}


Now putting the values of these expectations in

Var(X) = E[X^{2}] - (E[X])^{2} = \frac{\alpha(\alpha + 1)}{\lambda^{2}} - \frac{\alpha^{2}}{\lambda^{2}}

thus, the value of the variance of the gamma distribution or gamma random variable is

Var(X) = \frac{\alpha}{\lambda^{2}}

Gamma distribution parameters | two parameter gamma distribution | 2 variable gamma distribution


The Gamma distribution with the parameters λ>0, α>0 and the probability density function

gif

has statistical parameters mean and variance

E[X] = \frac{\alpha}{\lambda}

and

Var(X) = \frac{\alpha}{\lambda^{2}}

Since λ is a positive real number, to simplify and ease handling, another way is to set λ = 1/β, which gives the probability density function in the form

f(x) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}, \qquad x > 0

in brief the distribution function or cumulative distribution function for this density we can express as

this gamma density function gives the mean and variance as

E[X] = \alpha\beta

and

Var(X) = \alpha\beta^{2}


which is obvious by the substitution.
Both forms are commonly used: either the gamma distribution with parameters α and λ, denoted gamma(α, λ), or the gamma distribution with parameters α and β, denoted gamma(α, β), with the respective statistical parameters mean and variance in each form.
Both are nothing but the same.

Gamma distribution plot | gamma distribution graph| gamma distribution histogram

The nature of the gamma distribution we can easily visualize with the help of graph for some of specific values of the parameters, here we draw the plots for the probability density function and cumulative density function for some values of parameters
let us take probability density function as

gif

then cumulative distribution function will be

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 1 and varying the value of beta.

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 2 and varying the value of beta

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 3 and varying the value of beta

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta  as 1 and varying the value of alpha

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta  as 2 and varying the value of alpha

gamma distribution

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta as 3 and varying the value of alpha.

In general different curves as for alpha varying is

Gamma distribution
Gamma distribution graph

Gamma distribution table | standard gamma distribution table


The numerical value of gamma function

gif


known as incomplete gamma function numerical values as follows

Gamma distribution



The gamma distribution numerical value for sketching the plot for the probability density function and cumulative distribution function for some initial values are as follows

Each row below lists, in order: x | f(x), α=1, β=1 | f(x), α=2, β=2 | f(x), α=3, β=3 | P(x), α=1, β=1 | P(x), α=2, β=2 | P(x), α=3, β=3, for x from 0 to 3 in steps of 0.1.
0100000
0.10.9048374180.023780735611.791140927E-40.095162581960.0012091042746.020557215E-6
0.20.81873075310.04524187096.929681371E-40.18126924690.004678840164.697822176E-5
0.30.74081822070.064553098230.0015080623630.25918177930.010185827111.546530703E-4
0.40.6703200460.081873075310.002593106130.3296799540.017523096313.575866931E-4
0.50.60653065970.097350097880.0039188968750.39346934030.026499021166.812970042E-4
0.60.54881163610.11112273310.0054582050210.45118836390.036936313110.001148481245
0.70.49658530380.12332041570.0071856645830.50341469620.048671078880.001779207768
0.80.44932896410.13406400920.0090776691950.55067103590.061551935550.002591097152
0.90.40656965970.14346633410.011112273310.59343034030.075439180150.003599493183
10.36787944120.15163266490.013269098340.63212055880.090204010430.004817624203
1.10.33287108370.15866119790.015529243520.66712891630.10572779390.006256755309
1.20.30119421190.16464349080.017875201230.69880578810.12190138220.007926331867
1.30.2725317930.16966487750.02029077660.7274682070.13862446830.00983411477
1.40.24659696390.17380485630.022761011240.75340303610.15580498360.01198630787
1.50.22313016010.17713745730.025272110820.77686983990.17335853270.01438767797
1.60.2018965180.17973158570.027811376330.7981034820.19120786460.01704166775
1.70.18268352410.18165134610.030367138940.81731647590.20928237590.01995050206
1.80.16529888820.18295634690.032928698170.83470111180.22751764650.02311528775
1.90.14956861920.18370198610.035486263270.85043138080.24585500430.02653610761
20.13533528320.18393972060.038030897710.86466471680.26424111770.03021210849
2.10.12245642830.18371731830.040554466480.87754357170.28262761430.03414158413
2.20.11080315840.1830790960.043049586250.88919684160.30097072420.03832205271
2.30.10025884370.18206614240.045509578110.89974115630.31923094580.04275032971
2.40.090717953290.18071652720.047928422840.90928204670.33737273380.04742259607
2.50.082084998620.1790654980.050300718580.91791500140.35536420710.052334462
2.60.074273578210.17714566550.052621640730.92572642180.3731768760.05748102674
2.70.067205512740.17498717590.054886904070.93279448730.39078538750.0628569343
2.80.060810062630.17261787480.057092726880.93918993740.40816728650.06845642568
2.90.055023220060.17006345890.059235797090.94497677990.42530279420.07427338744
30.049787068370.16734762010.06131324020.95021293160.44217459960.08030139707
Gamma Distribution Graph

finding alpha and beta for gamma distribution | how to calculate alpha and beta for gamma distribution | gamma distribution parameter estimation


To find α and β for a gamma distribution we take its mean and variance

E[X] = \alpha\beta

and

Var(X) = \alpha\beta^{2}


now we will get the value of β as

\beta = \frac{Var(X)}{E[X]}

so

\alpha = \frac{E[X]}{\beta}

and hence

\alpha = \frac{(E[X])^{2}}{Var(X)}

so, taking only these simple ratios of the mean and variance of the gamma distribution, we obtain the values of α and β.
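A tiny sketch of this calculation in code (plain Python; the mean and variance values are just an example):

```python
# Recover the gamma parameters from the mean and variance:
# mean = alpha * beta and variance = alpha * beta**2, so
# beta = variance / mean and alpha = mean**2 / variance.
mean, variance = 1.5, 0.75
beta = variance / mean        # 0.5
alpha = mean**2 / variance    # 3.0
print(alpha, beta)
```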

gamma distribution problems and solutions | gamma distribution example problems | gamma distribution tutorial | gamma distribution question

1. Suppose the time required to resolve a problem for a customer is gamma distributed, in hours, with mean 1.5 and variance 0.75. What is the probability that the problem-resolution time exceeds 2 hours, and, given that it exceeds 2 hours, what is the probability that the problem is resolved in at least 5 hours?

solution: since the random variable is gamma distributed with mean 1.5 and variance 0.75 so we can find the values of alpha and beta and with the help of these values the probability will be

P(X > 2) = 13e^{-4} ≈ 0.2381

and

P(X > 5 | X > 2) = (61/13)e^{-6} ≈ 0.011631
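A short verification sketch of these two probabilities (SciPy assumed), using the parameters α = 3 and β = 0.5 (rate λ = 2) obtained from the mean 1.5 and variance 0.75:

```python
# Check P(X > 2) = 13*exp(-4) and P(X > 5 | X > 2) = (61/13)*exp(-6)
# for a gamma distribution with shape 3 and scale 0.5.
from math import exp
from scipy import stats

dist = stats.gamma(a=3, scale=0.5)
print(dist.sf(2), 13 * exp(-4))                        # ≈ 0.2381
print(dist.sf(5) / dist.sf(2), (61 / 13) * exp(-6))    # ≈ 0.0116
```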

2. If the negative feedback in week from the users is modelled in gamma distribution with parameters alpha 2 and beta as 4 after the 12 week negative feedback came after restructuring the quality, from this information can restructuring improves the performance ?

solution: As this is modelled in gamma distribution with α=2, β=4

we will find the mean and standard deviation as μ = E(X) = α·β = 2·4 = 8 and σ = √(α·β²) = √32 ≈ 5.66

Since the value X = 12 lies within one standard deviation of the mean, we cannot say whether the restructuring improved performance or not; the information given is insufficient to prove an improvement caused by restructuring the quality.

3. Let X be the gamma distribution with parameters α=1/2, λ=1/2 , find the probability density function for the function Y=Square root of X

Solution: let us calculate the cumulative distribution function of Y as

F_{Y}(y) = P\{\sqrt{X} \le y\} = P\{X \le y^{2}\} = F_{X}(y^{2})

now differentiating this with respect to y gives the probability density function of Y as

f_{Y}(y) = 2y\, f_{X}(y^{2}) = 2y\, \frac{e^{-y^{2}/2}}{\sqrt{2\pi y^{2}}} = \sqrt{\frac{2}{\pi}}\, e^{-y^{2}/2}

and the range for y will be from 0 to infinity.


Conclusion:

The concept of gamma distribution in probability and statistic is the one of the important day to day applicable distribution of exponential family, all the basic to higher level concept were discussed so far related to gamma distribution, if you require further reading, please go through mentioned books. You can also visit out mathematics page for more Topic

https://en.wikipedia.org/wiki/Gamma_distribution
A first course in probability by Sheldon Ross
Schaum’s Outlines of Probability and Statistics
An introduction to probability and statistics by ROHATGI and SALEH