Conditional Distribution: 7 Interesting Facts To Know

Conditional distribution

It is very interesting to discuss the conditional case of a distribution, where one random variable follows a distribution given the value of another. We first briefly look at the conditional distribution for both discrete and continuous random variables, and then, after studying some prerequisites, we focus on conditional expectations.

Discrete conditional distribution

With the help of the joint probability mass function of the joint distribution, we define the conditional distribution of the discrete random variable X given Y, using conditional probability, as the distribution with probability mass function

$$p_{X|Y}(x|y) = P\{X = x \mid Y = y\} = \frac{P\{X = x, Y = y\}}{P\{Y = y\}} = \frac{p(x,y)}{p_Y(y)}$$

provided the denominator probability is greater than zero. Similarly, we can write the conditional distribution function of X given Y as

$$F_{X|Y}(x|y) = P\{X \le x \mid Y = y\} = \sum_{a \le x} p_{X|Y}(a|y)$$

If X and Y are independent random variables, then this turns into

$$p_{X|Y}(x|y) = P\{X = x \mid Y = y\} = \frac{P\{X = x\}\, P\{Y = y\}}{P\{Y = y\}} = P\{X = x\}$$

so the discrete conditional distribution, i.e. the conditional distribution of the discrete random variable X given Y, is the distribution with the above probability mass function; the distribution of Y given X is defined in the same way.

Examples on discrete conditional distribution

  1. Find the probability mass function of the random variable X given Y = 1, if the joint probability mass function of the random variables X and Y has the values

p(0,0) = 0.4, p(0,1) = 0.2, p(1,0) = 0.1, p(1,1) = 0.3

Now, first of all, for the value Y = 1 we have

$$p_Y(1) = \sum_{x} p(x,1) = p(0,1) + p(1,1) = 0.2 + 0.3 = 0.5$$

so, using the definition of the conditional probability mass function,

$$p_{X|Y}(x|1) = P\{X = x \mid Y = 1\} = \frac{P\{X = x, Y = 1\}}{P\{Y = 1\}} = \frac{p(x,1)}{p_Y(1)}$$

we have

$$p_{X|Y}(0|1) = \frac{p(0,1)}{p_Y(1)} = \frac{0.2}{0.5} = \frac{2}{5}$$

and

$$p_{X|Y}(1|1) = \frac{p(1,1)}{p_Y(1)} = \frac{0.3}{0.5} = \frac{3}{5}$$
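As a quick numerical check of this example, here is a minimal Python sketch (the dictionary name `joint` and the layout are just for illustration):

```python
# Conditional pmf of X given Y = 1 from the joint pmf table above.
joint = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

# Marginal p_Y(1) = sum over x of p(x, 1)
p_y1 = sum(p for (x, y), p in joint.items() if y == 1)  # 0.5

# Conditional pmf p_{X|Y}(x | 1) = p(x, 1) / p_Y(1)
cond = {x: joint[(x, 1)] / p_y1 for (x, y) in joint if y == 1}
print(cond)  # {0: 0.4, 1: 0.6}, i.e. 2/5 and 3/5
```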
  • Obtain the conditional distribution of X given X + Y = n, where X and Y are independent Poisson random variables with parameters λ1 and λ2.

Since the random variables X and Y are independent, the conditional distribution has probability mass function

$$P\{X = k \mid X + Y = n\} = \frac{P\{X = k, X + Y = n\}}{P\{X + Y = n\}} = \frac{P\{X = k\}\, P\{Y = n - k\}}{P\{X + Y = n\}}$$

Since the sum of independent Poisson random variables is again Poisson, here with parameter λ1 + λ2, this becomes

$$P\{X = k \mid X + Y = n\} = \frac{e^{-\lambda_1} \lambda_1^k / k! \;\cdot\; e^{-\lambda_2} \lambda_2^{n-k} / (n-k)!}{e^{-(\lambda_1 + \lambda_2)} (\lambda_1 + \lambda_2)^n / n!}$$

$$= \frac{n!}{k!\,(n-k)!} \left( \frac{\lambda_1}{\lambda_1 + \lambda_2} \right)^{k} \left( \frac{\lambda_2}{\lambda_1 + \lambda_2} \right)^{n-k}$$

Thus the conditional distribution of X given X + Y = n is binomial with parameters n and λ1/(λ1 + λ2). The above case can be generalized to more than two random variables.
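The binomial result above can also be checked empirically; the following is a Monte Carlo sketch (the parameter values λ1 = 2, λ2 = 3, n = 6, the seed, and the sample size are assumptions chosen for illustration, and NumPy/SciPy are assumed available):

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
lam1, lam2, n = 2.0, 3.0, 6
x = rng.poisson(lam1, 1_000_000)
y = rng.poisson(lam2, 1_000_000)

# Keep only the samples where X + Y = n; their X values should be
# approximately Binomial(n, lam1 / (lam1 + lam2)).
cond = x[x + y == n]
for k in range(n + 1):
    emp = np.mean(cond == k)
    print(k, round(emp, 4), round(binom.pmf(k, n, lam1 / (lam1 + lam2)), 4))
```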

Continuous conditional distribution

The continuous conditional distribution of the random variable X given Y = y is the continuous distribution with probability density function

$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)}$$

provided the denominator density is greater than zero. For small dx and dy, this conditional density can be interpreted as

$$f_{X|Y}(x|y)\, dx = \frac{f(x,y)\, dx\, dy}{f_Y(y)\, dy} \approx P\{x \le X \le x + dx \mid y \le Y \le y + dy\}$$

Thus the probability computed from such a conditional density function is

$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x|y)\, dx$$

As in the discrete case, if X and Y are independent continuous random variables, then

$$f_{X|Y}(x|y) = \frac{f_X(x)\, f_Y(y)}{f_Y(y)} = f_X(x)$$

and hence

$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x|y)\, dx = \int_A f_X(x)\, dx$$

so we can write it as

$$P\{X \in A \mid Y = y\} = P\{X \in A\}$$

Examples on continuous conditional distribution

  1. Calculate the conditional density function of the random variable X given Y, if the joint probability density function on the open interval (0,1) is given by

$$f(x,y) = \begin{cases} \dfrac{12}{5}\, x (2 - x - y) & 0 < x < 1,\ 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}$$

For the random variable X given Y = y with y in (0,1), using the above density function we have

$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} = \frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\, dx}$$

$$= \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\, dx} = \frac{x(2 - x - y)}{\frac{2}{3} - \frac{y}{2}} = \frac{6x(2 - x - y)}{4 - 3y}$$
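A sanity check of this result (a sketch assuming SciPy is available; the value y = 0.3 is an arbitrary illustration): the conditional density should integrate to 1 over 0 < x < 1 for any fixed y in (0, 1).

```python
from scipy.integrate import quad

y = 0.3
f_cond = lambda x: 6 * x * (2 - x - y) / (4 - 3 * y)  # conditional density above
total, _ = quad(f_cond, 0, 1)
print(total)  # approximately 1.0
```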
  • Calculate the conditional probability

$$P\{X > 1 \mid Y = y\}$$

if the joint probability density function is given by

$$f(x,y) = \frac{e^{-x/y}\, e^{-y}}{y}, \qquad 0 < x < \infty,\ 0 < y < \infty$$

To find the conditional probability we first require the conditional density function, which by definition is

$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} = \frac{e^{-x/y}\, e^{-y} / y}{e^{-y} \int_0^{\infty} (1/y)\, e^{-x/y}\, dx} = \frac{1}{y}\, e^{-x/y}$$

Now, using this density function, the conditional probability is

$$P\{X > 1 \mid Y = y\} = \int_1^{\infty} \frac{1}{y}\, e^{-x/y}\, dx = \Big[ -e^{-x/y} \Big]_1^{\infty} = e^{-1/y}$$
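As a numerical check (a sketch assuming NumPy/SciPy; y = 2 is an illustrative choice), integrating the conditional density from 1 to infinity should reproduce e^{-1/y}:

```python
import numpy as np
from scipy.integrate import quad

y = 2.0
val, _ = quad(lambda x: np.exp(-x / y) / y, 1, np.inf)
print(val, np.exp(-1 / y))  # both approximately 0.6065
```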

Conditional distribution of bivariate normal distribution

We know that the bivariate normal distribution of the normal random variables X and Y, with their respective means and variances as parameters, has the joint probability density function

$$f(x,y) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x - \mu_x}{\sigma_x} \right)^2 + \left( \frac{y - \mu_y}{\sigma_y} \right)^2 - \frac{2\rho (x - \mu_x)(y - \mu_y)}{\sigma_x \sigma_y} \right] \right\}$$

To find the conditional distribution of X given Y for such a bivariate normal distribution, we follow the definition of the conditional density function of a continuous random variable; with the above joint density function we get

$$f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} = \frac{1}{\sqrt{2\pi}\, \sigma_x \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2 \sigma_x^2 (1 - \rho^2)} \left[ x - \mu_x - \rho \frac{\sigma_x}{\sigma_y} (y - \mu_y) \right]^2 \right\}$$

By observing this we can say that it is normally distributed with mean

$$\mu_x + \rho \frac{\sigma_x}{\sigma_y} (y - \mu_y)$$

and variance

$$\sigma_x^2 (1 - \rho^2)$$

In the same way, the conditional density function of Y given X is obtained by interchanging the roles of the parameters of X and Y.
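The conditional mean and variance can also be verified by simulation; here is a rough sketch (all parameter values, the window width 0.01, and the seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
mu_x, mu_y, sx, sy, rho = 1.0, -1.0, 2.0, 1.0, 0.6
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=2_000_000)

# Condition on Y falling in a narrow window around y0.
y0 = 0.5
sel = xy[np.abs(xy[:, 1] - y0) < 0.01, 0]
print(sel.mean(), mu_x + rho * sx / sy * (y0 - mu_y))  # conditional mean
print(sel.var(), sx**2 * (1 - rho**2))                 # conditional variance
```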

We can obtain the marginal density function of X from the above joint density function by integrating out y; completing the square in the exponent and pulling the x-dependent factor outside the integral gives

$$f_X(x) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1 - \rho^2}}\, e^{-(x - \mu_x)^2 / 2\sigma_x^2} \int_{-\infty}^{\infty} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( \frac{y - \mu_y}{\sigma_y} - \rho \frac{x - \mu_x}{\sigma_x} \right)^2 \right\} dy$$

Let us substitute in the integral

$$w = \frac{1}{\sqrt{1 - \rho^2}} \left( \frac{y - \mu_y}{\sigma_y} - \rho \frac{x - \mu_x}{\sigma_x} \right), \qquad dy = \sigma_y \sqrt{1 - \rho^2}\, dw$$

The density function now becomes

$$f_X(x) = \frac{1}{2\pi \sigma_x}\, e^{-(x - \mu_x)^2 / 2\sigma_x^2} \int_{-\infty}^{\infty} e^{-w^2/2}\, dw$$

Since, by the definition of probability,

$$\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-w^2/2}\, dw = 1$$

the density function becomes

$$f_X(x) = \frac{1}{\sqrt{2\pi}\, \sigma_x}\, e^{-(x - \mu_x)^2 / 2\sigma_x^2}$$

which is nothing but the density function of a normal random variable X with the usual mean and variance as parameters.
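Numerically, integrating the joint density over y does recover this marginal; a sketch (the parameter values are illustrative assumptions, SciPy assumed available):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, multivariate_normal

mu_x, mu_y, sx, sy, rho = 1.0, -1.0, 2.0, 1.0, 0.6
mvn = multivariate_normal([mu_x, mu_y],
                          [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]])

x = 0.7
marg, _ = quad(lambda y: mvn.pdf([x, y]), -np.inf, np.inf)
print(marg, norm.pdf(x, mu_x, sx))  # the two values agree
```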

Joint probability distribution of functions of random variables

So far we know the joint probability distribution of two random variables. Now, if we have functions of such random variables, what is their joint probability distribution, and how do we calculate its density and distribution function? This matters because we face real-life situations involving functions of random variables.

If Y1 and Y2 are functions of the random variables X1 and X2, which are jointly continuous, then the joint continuous density function of these two functions is

$$f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}(x_1, x_2)\, |J(x_1, x_2)|^{-1}$$

where the Jacobian is

$$J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[6pt] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0$$

and Y1 = g1(X1, X2) and Y2 = g2(X1, X2) for some functions g1 and g2 that satisfy the conditions of the Jacobian: they are continuous and have continuous partial derivatives.

Now the probability for such functions of random variables is

$$P\{Y_1 \le y_1,\, Y_2 \le y_2\} = \iint\limits_{g_1(x_1, x_2) \le y_1,\ g_2(x_1, x_2) \le y_2} f_{X_1, X_2}(x_1, x_2)\, dx_1\, dx_2$$

Examples on joint probability distribution of functions of random variables

  1. Find the joint density function of the random variables Y1 = X1 + X2 and Y2 = X1 − X2, where X1 and X2 are jointly continuous with joint probability density function. Also discuss the result when these random variables have distributions of different types.

Here we will first find the Jacobian. Since g1(x1, x2) = x1 + x2 and g2(x1, x2) = x1 − x2,

$$J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[6pt] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2$$

Solving Y1 = X1 + X2 and Y2 = X1 − X2 for the variables gives X1 = (Y1 + Y2)/2 and X2 = (Y1 − Y2)/2, so

$$f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{2}\, f_{X_1, X_2}\!\left( \frac{y_1 + y_2}{2},\ \frac{y_1 - y_2}{2} \right)$$

If these random variables are independent uniform random variables on (0, 1), then

$$f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{2}, \qquad 0 \le y_1 + y_2 \le 2,\ 0 \le y_1 - y_2 \le 2$$

or, if these random variables are independent exponential random variables with the same parameter λ, then

$$f_{Y_1, Y_2}(y_1, y_2) = \frac{\lambda^2}{2}\, e^{-\lambda y_1}, \qquad y_1 + y_2 \ge 0,\ y_1 - y_2 \ge 0$$

or, if these random variables are independent standard normal random variables, then

$$f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{2} \cdot \frac{1}{2\pi}\, e^{-\left[ (y_1 + y_2)^2/4 + (y_1 - y_2)^2/4 \right] / 2}$$

$$= \frac{1}{4\pi}\, e^{-(y_1^2 + y_2^2)/4} = \frac{1}{\sqrt{4\pi}}\, e^{-y_1^2/4} \cdot \frac{1}{\sqrt{4\pi}}\, e^{-y_2^2/4}$$

so Y1 and Y2 are independent normal random variables, each with mean 0 and variance 2.
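A small empirical sketch of the normal case (the seed and sample size are arbitrary choices): Y1 and Y2 should come out uncorrelated with variance 2 each.

```python
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = rng.standard_normal((2, 1_000_000))
y1, y2 = x1 + x2, x1 - x2
print(y1.var(), y2.var())         # both approximately 2
print(np.corrcoef(y1, y2)[0, 1])  # approximately 0
```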
  • If X and Y are independent standard normal variables with joint density

$$f(x,y) = \frac{1}{2\pi}\, e^{-(x^2 + y^2)/2},$$

calculate the joint distribution for the respective polar coordinates.

We convert X and Y into the polar coordinates r and θ by the usual transformation

$$r = g_1(x,y) = \sqrt{x^2 + y^2}, \qquad \theta = g_2(x,y) = \tan^{-1}\frac{y}{x}$$

so the partial derivatives of these functions are

$$\frac{\partial g_1}{\partial x} = \frac{x}{\sqrt{x^2 + y^2}}, \qquad \frac{\partial g_1}{\partial y} = \frac{y}{\sqrt{x^2 + y^2}}$$

$$\frac{\partial g_2}{\partial x} = \frac{-y}{x^2 + y^2}, \qquad \frac{\partial g_2}{\partial y} = \frac{x}{x^2 + y^2}$$

so the Jacobian using these functions is

$$J(x,y) = \frac{x^2}{(x^2 + y^2)^{3/2}} + \frac{y^2}{(x^2 + y^2)^{3/2}} = \frac{1}{\sqrt{x^2 + y^2}}$$

If both the random variables X and Y are greater than zero, then the conditional joint density function is

$$f(x, y \mid X > 0, Y > 0) = \frac{f(x,y)}{P\{X > 0, Y > 0\}} = \frac{2}{\pi}\, e^{-(x^2 + y^2)/2}, \qquad x > 0,\ y > 0$$

Now, converting from Cartesian to polar coordinates using

$$x = r \cos\theta, \qquad y = r \sin\theta,$$

the probability density function for positive values of X and Y is

$$f(r, \theta) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad 0 < \theta < \pi/2,\ 0 < r < \infty$$

For the other sign combinations of X and Y, the density functions are similarly

$$f(r, \theta) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad \pi/2 < \theta < \pi \quad (X < 0,\ Y > 0)$$

$$f(r, \theta) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad \pi < \theta < 3\pi/2 \quad (X < 0,\ Y < 0)$$

$$f(r, \theta) = \frac{2}{\pi}\, r\, e^{-r^2/2}, \qquad 3\pi/2 < \theta < 2\pi \quad (X > 0,\ Y < 0)$$

Now, averaging the above densities, we can state the overall density function as

$$f(r, \theta) = \frac{1}{2\pi}\, r\, e^{-r^2/2}, \qquad 0 < \theta < 2\pi,\ 0 < r < \infty$$

and the marginal density function of R follows from this joint density of the polar coordinates by integrating θ over the interval (0, 2π):

$$f(r) = \int_0^{2\pi} \frac{1}{2\pi}\, r\, e^{-r^2/2}\, d\theta = r\, e^{-r^2/2}, \qquad 0 < r < \infty$$
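Simulation agrees with this: R should follow the Rayleigh density r e^{-r²/2} and Θ should be uniform on (0, 2π). A sketch (SciPy's `rayleigh` with its default scale has exactly this density; the seed and sample size are arbitrary):

```python
import numpy as np
from scipy.stats import rayleigh, kstest

rng = np.random.default_rng(3)
x, y = rng.standard_normal((2, 100_000))
r = np.hypot(x, y)                      # R = sqrt(X^2 + Y^2)
theta = np.arctan2(y, x) % (2 * np.pi)  # angle mapped to (0, 2*pi)

print(kstest(r, rayleigh.cdf))                        # large p-value expected
print(kstest(theta, 'uniform', args=(0, 2 * np.pi)))  # large p-value expected
```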
  • Find the joint density function of the functions of random variables

U = X + Y and V = X/(X + Y)

where X and Y are gamma random variables with parameters (α, λ) and (β, λ) respectively.

Using the definition of the gamma distribution and the joint density of independent random variables, the joint density function of the random variables X and Y is

$$f_{X,Y}(x,y) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)} \cdot \frac{\lambda e^{-\lambda y} (\lambda y)^{\beta - 1}}{\Gamma(\beta)}$$

$$= \frac{\lambda^{\alpha + \beta}}{\Gamma(\alpha)\Gamma(\beta)}\, e^{-\lambda(x + y)}\, x^{\alpha - 1} y^{\beta - 1}, \qquad x > 0,\ y > 0$$

consider the given functions as

g1(x,y) = x + y, g2(x,y) = x/(x + y),

so the partial derivatives of these functions are

$$\frac{\partial g_1}{\partial x} = 1, \qquad \frac{\partial g_1}{\partial y} = 1$$

$$\frac{\partial g_2}{\partial x} = \frac{y}{(x + y)^2}, \qquad \frac{\partial g_2}{\partial y} = -\frac{x}{(x + y)^2}$$

now the Jacobian is

$$J(x,y) = \begin{vmatrix} 1 & 1 \\[4pt] \dfrac{y}{(x+y)^2} & -\dfrac{x}{(x+y)^2} \end{vmatrix} = -\frac{x}{(x + y)^2} - \frac{y}{(x + y)^2} = -\frac{1}{x + y}$$

Solving the given equations gives x = uv and y = u(1 − v), so the probability density function is

$$f_{U,V}(u,v) = f_{X,Y}\big(uv,\ u(1 - v)\big)\, u$$

$$= \frac{\lambda e^{-\lambda u} (\lambda u)^{\alpha + \beta - 1}}{\Gamma(\alpha + \beta)} \cdot \frac{v^{\alpha - 1} (1 - v)^{\beta - 1}\, \Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}$$

we can use the relation

$$\int_0^1 v^{\alpha - 1} (1 - v)^{\beta - 1}\, dv = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha + \beta)}$$

to conclude that U and V are independent, with U a gamma random variable with parameters (α + β, λ) and V a beta random variable with parameters (α, β).
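This factorization can be checked by simulation; a sketch (α = 2, β = 3, λ = 1.5 and the seed are illustrative choices; note that NumPy parameterizes the gamma distribution by shape and scale = 1/λ):

```python
import numpy as np
from scipy.stats import gamma, beta as beta_dist, kstest

rng = np.random.default_rng(4)
a, b, lam = 2.0, 3.0, 1.5
x = rng.gamma(a, 1 / lam, 500_000)
y = rng.gamma(b, 1 / lam, 500_000)
u, v = x + y, x / (x + y)

print(kstest(u, gamma(a + b, scale=1 / lam).cdf))  # U ~ Gamma(a + b, lam)
print(kstest(v, beta_dist(a, b).cdf))              # V ~ Beta(a, b)
print(np.corrcoef(u, v)[0, 1])                     # approximately 0
```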
  • Calculate the joint probability density function for

Y1 = X1 + X2 + X3, Y2 = X1 − X2, Y3 = X1 − X3

where the random variables X1, X2, X3 are standard normal random variables.

Now let us calculate the Jacobian by using partial derivatives of

Y1 = X1 + X2 + X3, Y2 = X1 − X2, Y3 = X1 − X3

as

$$J = \begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 0 \\ 1 & 0 & -1 \end{vmatrix} = 3$$

Solving for the variables X1, X2 and X3 gives

X1 = (Y1 + Y2 + Y3)/3, X2 = (Y1 − 2Y2 + Y3)/3, X3 = (Y1 + Y2 − 2Y3)/3

so the joint density function is

$$f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3) = \frac{1}{3}\, f_{X_1, X_2, X_3}\!\left( \frac{y_1 + y_2 + y_3}{3},\ \frac{y_1 - 2y_2 + y_3}{3},\ \frac{y_1 + y_2 - 2y_3}{3} \right)$$

Since the Xi are standard normal, we have

$$f_{X_1, X_2, X_3}(x_1, x_2, x_3) = \frac{1}{(2\pi)^{3/2}}\, e^{-\sum_{i=1}^{3} x_i^2 / 2}$$

so for these normal variables the joint probability density function is

$$f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3) = \frac{1}{3}\, \frac{1}{(2\pi)^{3/2}}\, \exp\left\{ -\frac{1}{2} \left[ \left( \frac{y_1 + y_2 + y_3}{3} \right)^2 + \left( \frac{y_1 - 2y_2 + y_3}{3} \right)^2 + \left( \frac{y_1 + y_2 - 2y_3}{3} \right)^2 \right] \right\}$$

hence

$$f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3) = \frac{1}{3 (2\pi)^{3/2}}\, e^{-Q(y_1, y_2, y_3)/2}$$

where the quadratic form in the exponent is

$$Q(y_1, y_2, y_3) = \left( \frac{y_1 + y_2 + y_3}{3} \right)^2 + \left( \frac{y_1 - 2y_2 + y_3}{3} \right)^2 + \left( \frac{y_1 + y_2 - 2y_3}{3} \right)^2 = \frac{y_1^2}{3} + \frac{2 y_2^2}{3} + \frac{2 y_3^2}{3} - \frac{2 y_2 y_3}{3}$$
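A quick linear-algebra check of this quadratic form (a sketch with NumPy): since (Y1, Y2, Y3) = A(X1, X2, X3) with Cov(X) = I, the covariance of Y is AAᵀ, and the coefficient matrix of Q should be its inverse.

```python
import numpy as np

A = np.array([[1, 1, 1], [1, -1, 0], [1, 0, -1]])  # (Y1,Y2,Y3) = A (X1,X2,X3)
cov_y = A @ A.T               # covariance matrix of Y, since Cov(X) = I
print(cov_y)                  # [[3,0,0],[0,2,1],[0,1,2]]

# Inverse matches Q: diagonal 1/3, 2/3, 2/3 and cross term -1/3,
# giving the -2/3 * y2 * y3 contribution in Q.
print(np.linalg.inv(cov_y))
```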

  • Compute the joint density function of Y1, …, Yn and the marginal density function of Yn, where

$$Y_i = X_1 + X_2 + \cdots + X_i, \qquad i = 1, \ldots, n$$

and the Xi are independent, identically distributed exponential random variables with parameter λ.

For random variables of the form

Y1 = X1, Y2 = X1 + X2, …, Yn = X1 + … + Xn

the Jacobian will be of the form

$$J = \begin{vmatrix} 1 & 0 & 0 & \cdots & 0 \\ 1 & 1 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & 1 & 1 & \cdots & 1 \end{vmatrix} = 1$$

and hence its value is one, since the matrix is lower triangular with ones on the diagonal. The joint density function of the exponential random variables is

$$f_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i}, \qquad 0 < x_i < \infty$$

and the values of the variables Xi in terms of the Yi are

$$x_1 = y_1, \quad x_2 = y_2 - y_1, \quad \ldots, \quad x_n = y_n - y_{n-1},$$

so the joint density function is

$$f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = f_{X_1, \ldots, X_n}(y_1,\ y_2 - y_1,\ \ldots,\ y_n - y_{n-1})$$

$$= \lambda^n e^{-\lambda [y_1 + (y_2 - y_1) + \cdots + (y_n - y_{n-1})]}$$

$$= \lambda^n e^{-\lambda y_n}, \qquad 0 < y_1 < y_2 < \cdots < y_n$$

Now, to find the marginal density function of Yn, we integrate out the other variables one at a time, as

$$f_{Y_2, \ldots, Y_n}(y_2, \ldots, y_n) = \int_0^{y_2} \lambda^n e^{-\lambda y_n}\, dy_1 = \lambda^n y_2\, e^{-\lambda y_n}$$

and

$$f_{Y_3, \ldots, Y_n}(y_3, \ldots, y_n) = \int_0^{y_3} \lambda^n y_2\, e^{-\lambda y_n}\, dy_2 = \lambda^n \frac{y_3^2}{2}\, e^{-\lambda y_n}$$

likewise

$$f_{Y_4, \ldots, Y_n}(y_4, \ldots, y_n) = \lambda^n \frac{y_4^3}{3!}\, e^{-\lambda y_n}$$

If we continue this process we get

$$f_{Y_n}(y_n) = \lambda^n \frac{y_n^{n-1}}{(n-1)!}\, e^{-\lambda y_n} = \lambda e^{-\lambda y_n} \frac{(\lambda y_n)^{n-1}}{(n-1)!}, \qquad 0 < y_n < \infty$$

which is the marginal density function of Yn, i.e. the gamma density with parameters n and λ.
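This gamma marginal is easy to confirm by simulation; a sketch (n = 5, λ = 2 and the seed are illustrative assumptions):

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(5)
n, lam = 5, 2.0
# Each row is (X1, ..., Xn); Y_n is the row sum.
yn = rng.exponential(1 / lam, (200_000, n)).sum(axis=1)

print(yn.mean(), n / lam)                       # means agree: 2.5
print(kstest(yn, gamma(n, scale=1 / lam).cdf))  # large p-value expected
```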

Conclusion:

The conditional distribution for discrete and continuous random variables was discussed with different examples, considering some types of these random variables, where independence plays an important role. In addition, the joint distribution for functions of jointly continuous random variables was explained with suitable examples. If you require further reading, go through the links below.

For more posts on Mathematics, please refer to our Mathematics Page.

Wikipedia: https://en.wikipedia.org/wiki/joint_probability_distribution/

A First Course in Probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh