- Content
- Conditional distribution
- Discrete conditional distribution
- Example on discrete conditional distribution
- Continuous conditional distribution
- Example on continuous conditional distribution
- Conditional distribution of the bivariate normal distribution
- Joint probability distribution of functions of random variables
- Examples on the joint probability distribution of functions of random variables
Conditional distribution
When two random variables are jointly distributed, it is natural to ask for the distribution of one of them given the value of the other. We first briefly treat the conditional distribution in both the discrete and the continuous case, and then, after studying some prerequisites, we focus on conditional expectations.
Discrete conditional distribution
With the help of the joint probability mass function of the discrete random variables X and Y, we define the conditional distribution of X given Y, using conditional probability, as the distribution with probability mass function

$$p_{X|Y}(x|y) = P\{X = x \mid Y = y\} = \frac{p(x,y)}{p_Y(y)}$$

provided the denominator probability is greater than zero; equivalently we can write this as

$$p_{X|Y}(x|y) = \frac{P\{X = x,\, Y = y\}}{P\{Y = y\}}$$

If X and Y are independent random variables, the joint probability factors and this turns into

$$p_{X|Y}(x|y) = \frac{P\{X = x\}\,P\{Y = y\}}{P\{Y = y\}} = P\{X = x\}$$

So the discrete conditional distribution, that is, the conditional distribution of the discrete random variable X given Y, is the distribution with the above probability mass function; the conditional distribution of Y given X is defined in the same way with the roles interchanged.
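As a quick check that this is a genuine probability mass function, summing over all values of x gives one, since the joint probabilities sum to the marginal:

$$\sum_x p_{X|Y}(x|y) = \frac{\sum_x p(x,y)}{p_Y(y)} = \frac{p_Y(y)}{p_Y(y)} = 1$$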
Example on discrete conditional distribution
- Find the probability mass function of the random variable X given Y = 1, if the joint probability mass function of the random variables X and Y has the values
p(0,0)=0.4, p(0,1)=0.2, p(1,0)=0.1, p(1,1)=0.3
First of all, for the value Y = 1 we have
$$p_Y(1) = \sum_x p(x,1) = p(0,1) + p(1,1) = 0.2 + 0.3 = 0.5$$
so, using the definition of the conditional probability mass function
$$p_{X|Y}(x|1) = \frac{p(x,1)}{p_Y(1)}$$
we have
$$p_{X|Y}(0|1) = \frac{p(0,1)}{p_Y(1)} = \frac{0.2}{0.5} = \frac{2}{5}$$
and
$$p_{X|Y}(1|1) = \frac{p(1,1)}{p_Y(1)} = \frac{0.3}{0.5} = \frac{3}{5}$$
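Note that the two conditional probabilities sum to one, as they must for a probability mass function:

$$p_{X|Y}(0|1) + p_{X|Y}(1|1) = \frac{2}{5} + \frac{3}{5} = 1$$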
- Obtain the conditional distribution of X given X + Y = n, where X and Y are independent Poisson random variables with parameters λ1 and λ2.
Since the random variables X and Y are independent, the conditional distribution has probability mass function
$$P\{X = k \mid X + Y = n\} = \frac{P\{X = k,\, X + Y = n\}}{P\{X + Y = n\}} = \frac{P\{X = k\}\,P\{Y = n-k\}}{P\{X + Y = n\}}$$
Since the sum of independent Poisson random variables is again Poisson, with parameter λ1 + λ2,
$$P\{X = k \mid X + Y = n\} = \frac{e^{-\lambda_1}\dfrac{\lambda_1^{k}}{k!}\; e^{-\lambda_2}\dfrac{\lambda_2^{\,n-k}}{(n-k)!}}{e^{-(\lambda_1+\lambda_2)}\dfrac{(\lambda_1+\lambda_2)^{n}}{n!}} = \binom{n}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}$$
Thus the conditional distribution of X given X + Y = n is binomial with parameters n and λ1/(λ1 + λ2). The above case can be generalized to more than two random variables.
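For instance, in the symmetric case λ1 = λ2 the success probability is 1/2, so

$$P\{X = k \mid X + Y = n\} = \binom{n}{k}\left(\frac{1}{2}\right)^{n}$$

that is, given the total, X is binomial with parameter 1/2.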
Continuous conditional distribution
The continuous conditional distribution of the random variable X given Y = y is the continuous distribution with probability density function

$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)}$$

provided the denominator density is greater than zero, where the marginal density of the continuous random variable Y is

$$f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx$$

Thus the probability computed with such a conditional density function is

$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x|y)\,dx$$

In a similar way as in the discrete case, if X and Y are independent continuous random variables, then

$$f(x,y) = f_X(x)\,f_Y(y)$$

and hence

$$f_{X|Y}(x|y) = \frac{f_X(x)\,f_Y(y)}{f_Y(y)} = f_X(x)$$

so we can write the conditional probability as

$$P\{X \in A \mid Y = y\} = P\{X \in A\}$$
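Just as in the discrete case, the conditional density integrates to one, since the joint density integrates to the marginal:

$$\int_{-\infty}^{\infty} f_{X|Y}(x|y)\,dx = \frac{\int_{-\infty}^{\infty} f(x,y)\,dx}{f_Y(y)} = \frac{f_Y(y)}{f_Y(y)} = 1$$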
Example on continuous conditional distribution
- Calculate the conditional density function of the random variable X given Y, if the joint probability density function on the open interval (0,1) is given as above.
For X given Y = y with y in (0,1), using the above joint density function we have
$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} = \frac{f(x,y)}{\int_0^1 f(x,y)\,dx}$$
- Calculate the conditional probability
if the joint probability density function is given as above.
To find the conditional probability we first require the conditional density function, which by definition is
$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)}$$
and using this density function, the required conditional probability is
$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x|y)\,dx$$
Conditional distribution of the bivariate normal distribution
We know that the bivariate normal distribution of the normal random variables X and Y, with respective means μX, μY, variances σX², σY², and correlation coefficient ρ as parameters, has the joint probability density function

$$f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right\}$$

To find the conditional distribution of X given Y = y for such a bivariate normal distribution, we follow the definition of the conditional density function of a continuous random variable together with the above joint density function; completing the square in x shows that the result is normally distributed with mean

$$\mu_X + \rho\,\frac{\sigma_X}{\sigma_Y}(y - \mu_Y)$$

and variance

$$\sigma_X^2(1-\rho^2)$$

In a similar way, the conditional density function of Y given X = x is obtained by interchanging the positions of the parameters of X and Y.
The marginal density function for X can be obtained from the above joint density function by integrating over y,
$$f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy$$
Let us substitute in the integral
$$w = \frac{y - \mu_Y}{\sigma_Y}$$
The density function will now be
$$f_X(x) = \frac{1}{2\pi\sigma_X\sqrt{1-\rho^2}}\; e^{-(x-\mu_X)^2/2\sigma_X^2} \int_{-\infty}^{\infty} \exp\left\{-\frac{1}{2(1-\rho^2)}\left(w - \rho\,\frac{x-\mu_X}{\sigma_X}\right)^{2}\right\} dw$$
Since the integrand is a normal density up to its normalizing constant, the total value of the integral is
$$\int_{-\infty}^{\infty} \exp\left\{-\frac{1}{2(1-\rho^2)}\left(w - \rho\,\frac{x-\mu_X}{\sigma_X}\right)^{2}\right\} dw = \sqrt{2\pi(1-\rho^2)}$$
by the definition of probability, so the density function becomes
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\; e^{-(x-\mu_X)^2/2\sigma_X^2}$$
which is nothing but the density function of a normal random variable X with the usual mean and variance as parameters.
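As a consistency check, when ρ = 0 the conditional mean and variance reduce to the marginal parameters,

$$\mu_X + \rho\,\frac{\sigma_X}{\sigma_Y}(y-\mu_Y)\,\Big|_{\rho=0} = \mu_X, \qquad \sigma_X^2(1-\rho^2)\,\Big|_{\rho=0} = \sigma_X^2$$

so the conditional distribution of X given Y = y coincides with the marginal distribution of X, reflecting the fact that jointly normal random variables are independent exactly when they are uncorrelated.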
Joint probability distribution of functions of random variables
So far we know the joint probability distribution of two random variables; now if we have functions of such random variables, what would be the joint probability distribution of those functions, and how do we calculate the density and distribution function? Real-life situations frequently present us with functions of random variables.

If Y1 and Y2 are functions of the random variables X1 and X2, which are jointly continuous, then the joint continuous density function of these two functions is

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1,x_2)\,\big|J(x_1,x_2)\big|^{-1}$$

where the Jacobian is

$$J(x_1,x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[4pt] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0$$

and Y1 = g1(X1, X2) and Y2 = g2(X1, X2) for some functions g1 and g2. Here g1 and g2 satisfy the conditions of the Jacobian: they are continuous and have continuous partial derivatives.

Now the probability for such functions of random variables is

$$P\{(Y_1,Y_2) \in C\} = \iint_{(x_1,x_2):\,(g_1(x_1,x_2),\,g_2(x_1,x_2)) \in C} f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2$$
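In practice, we solve y1 = g1(x1, x2), y2 = g2(x1, x2) for the inverse functions x1 = h1(y1, y2) and x2 = h2(y1, y2), and evaluate the formula at those inverses:

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}\big(h_1(y_1,y_2),\, h_2(y_1,y_2)\big)\,\big|J\big|^{-1}$$

This is the form applied in each of the examples below.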
Examples on the joint probability distribution of functions of random variables
- Find the joint density function of the random variables Y1 = X1 + X2 and Y2 = X1 - X2, where X1 and X2 are jointly continuous with a given joint probability density function. Also discuss the result for distributions of different natures.
Here we first compute the Jacobian: since g1(x1, x2) = x1 + x2 and g2(x1, x2) = x1 - x2,
$$J(x_1,x_2) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2$$
Solving Y1 = X1 + X2 and Y2 = X1 - X2 for the variables gives X1 = (Y1 + Y2)/2 and X2 = (Y1 - Y2)/2, so
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{2}\, f_{X_1,X_2}\!\left(\frac{y_1+y_2}{2},\; \frac{y_1-y_2}{2}\right)$$
If these random variables are independent uniform random variables on (0,1),
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{2}, \qquad 0 < y_1 + y_2 < 2,\; 0 < y_1 - y_2 < 2$$
If they are independent exponential random variables with the usual parameters λ1 and λ2,
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{\lambda_1\lambda_2}{2}\, \exp\left\{-\lambda_1\frac{y_1+y_2}{2} - \lambda_2\frac{y_1-y_2}{2}\right\}, \qquad y_1 + y_2 > 0,\; y_1 - y_2 > 0$$
And if they are independent standard normal random variables, then
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{4\pi}\, e^{-[(y_1+y_2)^2 + (y_1-y_2)^2]/8} = \frac{1}{4\pi}\, e^{-(y_1^2+y_2^2)/4}$$
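Notice that in the normal case the joint density factors into a function of y1 alone times a function of y2 alone,

$$\frac{1}{4\pi}\, e^{-(y_1^2+y_2^2)/4} = \left(\frac{1}{\sqrt{4\pi}}\, e^{-y_1^2/4}\right)\left(\frac{1}{\sqrt{4\pi}}\, e^{-y_2^2/4}\right)$$

so the sum and the difference of two independent standard normal random variables are themselves independent, each normal with mean 0 and variance 2.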
- If X and Y are independent standard normal random variables, calculate the joint distribution of the corresponding polar coordinates.
We convert X and Y into r and θ by the usual transformation
$$r = g_1(x,y) = \sqrt{x^2+y^2}, \qquad \theta = g_2(x,y) = \tan^{-1}\frac{y}{x}$$
The partial derivatives of these functions are
$$\frac{\partial g_1}{\partial x} = \frac{x}{\sqrt{x^2+y^2}}, \quad \frac{\partial g_1}{\partial y} = \frac{y}{\sqrt{x^2+y^2}}, \quad \frac{\partial g_2}{\partial x} = \frac{-y}{x^2+y^2}, \quad \frac{\partial g_2}{\partial y} = \frac{x}{x^2+y^2}$$
so the Jacobian of these functions is
$$J = \frac{x^2}{(x^2+y^2)^{3/2}} + \frac{y^2}{(x^2+y^2)^{3/2}} = \frac{1}{\sqrt{x^2+y^2}}$$
If both the random variables X and Y are greater than zero, the conditional joint density function is
$$f(x,y \mid X>0, Y>0) = \frac{f(x,y)}{P\{X>0, Y>0\}} = \frac{2}{\pi}\, e^{-(x^2+y^2)/2}, \qquad x>0,\; y>0$$
Now, converting Cartesian coordinates to polar coordinates using
$$x = r\cos\theta, \qquad y = r\sin\theta$$
the probability density function for positive values is
$$f(r,\theta \mid X>0, Y>0) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad 0<\theta<\pi/2,\; 0<r<\infty$$
For the other sign combinations of X and Y, the conditional density functions in a similar way are
$$f(r,\theta \mid X<0, Y>0) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad \pi/2<\theta<\pi$$
$$f(r,\theta \mid X<0, Y<0) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad \pi<\theta<3\pi/2$$
$$f(r,\theta \mid X>0, Y<0) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad 3\pi/2<\theta<2\pi$$
Now, since the joint density is the equally weighted average of the above conditional densities, we can state it as
$$f(r,\theta) = \frac{1}{2\pi}\, r\, e^{-r^2/2}, \qquad 0<\theta<2\pi,\; 0<r<\infty$$
and the marginal density function of R, obtained from this joint density of the polar coordinates by integrating θ over the interval (0, 2π), is
$$f_R(r) = r\, e^{-r^2/2}, \qquad 0<r<\infty$$
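Since this joint density also factors, R and Θ are independent, with Θ uniform on (0, 2π) and R following the Rayleigh distribution; the marginal of R is indeed a density, as

$$\int_0^\infty r\, e^{-r^2/2}\,dr = \left[-e^{-r^2/2}\right]_0^\infty = 1$$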
- Find the joint density function of the functions of random variables
U = X + Y and V = X/(X+Y)
where X and Y are gamma distributed with parameters (α, λ) and (β, λ) respectively.
Using the definition of the gamma density and the independence of X and Y, their joint density function is
$$f_{X,Y}(x,y) = \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha-1}}{\Gamma(\alpha)} \cdot \frac{\lambda e^{-\lambda y}(\lambda y)^{\beta-1}}{\Gamma(\beta)}, \qquad x>0,\; y>0$$
Consider the given functions as
g1(x,y) = x + y, g2(x,y) = x/(x+y),
so the partial derivatives of these functions are
$$\frac{\partial g_1}{\partial x} = 1, \quad \frac{\partial g_1}{\partial y} = 1, \quad \frac{\partial g_2}{\partial x} = \frac{y}{(x+y)^2}, \quad \frac{\partial g_2}{\partial y} = -\frac{x}{(x+y)^2}$$
Now the Jacobian is
$$J(x,y) = \begin{vmatrix} 1 & 1 \\[2pt] \dfrac{y}{(x+y)^2} & -\dfrac{x}{(x+y)^2} \end{vmatrix} = -\frac{1}{x+y}$$
After solving the given equations for the variables, x = uv and y = u(1-v), so the probability density function is
$$f_{U,V}(u,v) = f_{X,Y}\big(uv,\, u(1-v)\big)\, u = \frac{\lambda e^{-\lambda u}(\lambda u)^{\alpha+\beta-1}}{\Gamma(\alpha+\beta)} \cdot \frac{v^{\alpha-1}(1-v)^{\beta-1}\,\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}$$
where we have used the relation
$$B(\alpha,\beta) = \int_0^1 v^{\alpha-1}(1-v)^{\beta-1}\,dv = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}$$
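Since this joint density factors into a function of u alone times a function of v alone, U and V are independent: U is a gamma random variable with parameters (α + β, λ), and V is a beta random variable with density

$$f_V(v) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, v^{\alpha-1}(1-v)^{\beta-1}, \qquad 0<v<1$$

For example, when α = β = 1 this reduces to the uniform density on (0, 1).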
- Calculate the joint probability density function of
Y1 = X1 + X2 + X3, Y2 = X1 - X2, Y3 = X1 - X3
where the random variables X1, X2, X3 are independent standard normal random variables.
Now let us calculate the Jacobian, using the partial derivatives of
Y1 = X1 + X2 + X3, Y2 = X1 - X2, Y3 = X1 - X3,
as
$$J = \begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 0 \\ 1 & 0 & -1 \end{vmatrix} = 3$$
Solving for the variables X1, X2 and X3,
X1 = (Y1 + Y2 + Y3)/3, X2 = (Y1 - 2Y2 + Y3)/3, X3 = (Y1 + Y2 - 2Y3)/3
so the general joint density function is
$$f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3) = \frac{1}{3}\, f_{X_1,X_2,X_3}\!\left(\frac{y_1+y_2+y_3}{3},\; \frac{y_1-2y_2+y_3}{3},\; \frac{y_1+y_2-2y_3}{3}\right)$$
For standard normal variables the joint probability density function is
$$f_{X_1,X_2,X_3}(x_1,x_2,x_3) = \frac{1}{(2\pi)^{3/2}}\, e^{-\sum_{i=1}^{3} x_i^2/2}$$
hence
$$f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3) = \frac{1}{3(2\pi)^{3/2}}\, e^{-Q(y_1,y_2,y_3)/2}$$
where the quadratic form in the exponent is
$$Q(y_1,y_2,y_3) = \frac{y_1^2}{3} + \frac{2y_2^2}{3} + \frac{2y_3^2}{3} - \frac{2y_2y_3}{3}$$
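The value of the Jacobian can be verified by expanding the determinant along the first row:

$$J = 1\begin{vmatrix} -1 & 0 \\ 0 & -1 \end{vmatrix} - 1\begin{vmatrix} 1 & 0 \\ 1 & -1 \end{vmatrix} + 1\begin{vmatrix} 1 & -1 \\ 1 & 0 \end{vmatrix} = 1 + 1 + 1 = 3$$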
- Compute the joint density function of Y1, ……, Yn and the marginal density function of Yn, where
Y1 = X1, Y2 = X1 + X2, ……, Yn = X1 + …… + Xn
and the Xi are independent identically distributed exponential random variables with parameter λ.
For random variables of this form the Jacobian is lower triangular with ones on the diagonal,
$$J = \begin{vmatrix} 1 & 0 & 0 & \cdots & 0 \\ 1 & 1 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & 1 & 1 & \cdots & 1 \end{vmatrix}$$
and hence its value is one. The joint density function of the exponential random variables is
$$f_{X_1,\dots,X_n}(x_1,\dots,x_n) = \lambda^n\, e^{-\lambda \sum_{i=1}^{n} x_i}, \qquad x_i > 0$$
and the values of the variables Xi are
$$x_1 = y_1,\quad x_2 = y_2 - y_1,\quad \dots,\quad x_n = y_n - y_{n-1}$$
so the joint density function is
$$f_{Y_1,\dots,Y_n}(y_1,\dots,y_n) = \lambda^n\, e^{-\lambda y_n}, \qquad 0 < y_1 < y_2 < \cdots < y_n$$
Now, to find the marginal density function of Yn, we integrate the variables out one by one:
$$f_{Y_2,\dots,Y_n}(y_2,\dots,y_n) = \int_0^{y_2} \lambda^n\, e^{-\lambda y_n}\,dy_1 = \lambda^n\, e^{-\lambda y_n}\, y_2$$
and
$$f_{Y_3,\dots,Y_n}(y_3,\dots,y_n) = \int_0^{y_3} \lambda^n\, e^{-\lambda y_n}\, y_2\,dy_2 = \lambda^n\, e^{-\lambda y_n}\, \frac{y_3^2}{2}$$
likewise
$$f_{Y_4,\dots,Y_n}(y_4,\dots,y_n) = \lambda^n\, e^{-\lambda y_n}\, \frac{y_4^3}{3!}$$
and if we continue this process we get
$$f_{Y_n}(y_n) = \lambda^n\, e^{-\lambda y_n}\, \frac{y_n^{n-1}}{(n-1)!}, \qquad 0 < y_n < \infty$$
which is the required marginal density function.
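This marginal is precisely the gamma density with parameters (n, λ), consistent with Yn being the sum of n independent exponential random variables with parameter λ; for n = 1 it reduces to the exponential density itself:

$$f_{Y_1}(y) = \lambda\, e^{-\lambda y}, \qquad 0 < y < \infty$$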
Conclusion:
We discussed the conditional distribution for discrete and continuous random variables, with examples covering several types of these random variables, in which independence plays an important role. In addition, the joint distribution of functions of jointly continuous random variables was explained with suitable examples. If you require further reading, go through the links below.
For more posts on mathematics, please refer to our Mathematics Page.
Wikipedia: https://en.wikipedia.org/wiki/joint_probability_distribution/
A First Course in Probability by Sheldon Ross
Schaum's Outlines of Probability and Statistics
An Introduction to Probability and Statistics by Rohatgi and Saleh