ELEC 321

Continuous Random Vectors

Updated 2017-12-04

Continuous Random Vectors

For example, for a continuous random vector $(X_1, \dots, X_n)$, the corresponding joint distribution function is the multivariable function given by $F(x_1, \dots, x_n) = P(X_1 \le x_1, \dots, X_n \le x_n)$.

By the FTC, we obtain the joint density function: $f(x_1, \dots, x_n) = \dfrac{\partial^n}{\partial x_1 \cdots \partial x_n} F(x_1, \dots, x_n)$.
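
As a quick illustration of this relationship, here is a minimal sketch that differentiates an assumed joint CDF symbolically; the CDF $F(x, y) = (1 - e^{-x})(1 - e^{-y})$ (two independent unit exponentials) is an illustrative choice, not one from the notes.

```python
# A minimal sketch: recover a joint density from a joint CDF by mixed
# partial differentiation (the FTC result above). The CDF below is an
# assumed example: two independent Exp(1) variables on x, y >= 0.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))   # assumed joint CDF

# f(x, y) = d^2 F / (dx dy)
f = sp.simplify(sp.diff(F, x, y))
print(f)   # exp(-x - y), the product of two Exp(1) densities
```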

Bivariate Normal (More in module 6)

The main model is given by the joint density
$$f(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\!\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right\},$$
with means $\mu_X, \mu_Y$, standard deviations $\sigma_X, \sigma_Y$, and correlation $\rho$.
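
A small numerical sketch of this density follows, assuming illustrative values for the means, standard deviations, and correlation (none of these parameters come from the notes):

```python
# A minimal sketch: evaluate a bivariate normal density for assumed
# parameters using scipy. All numbers here are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([0.0, 1.0])                  # assumed means (mu_X, mu_Y)
sigma_x, sigma_y, rho = 1.0, 2.0, 0.5      # assumed std devs and correlation
cov = np.array([[sigma_x**2,              rho * sigma_x * sigma_y],
                [rho * sigma_x * sigma_y, sigma_y**2]])

rv = multivariate_normal(mean=mu, cov=cov)
print(rv.pdf([0.5, 1.5]))                  # density evaluated at (0.5, 1.5)
```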

Marginal Density Function

Suppose we have the joint density function $f(x_1, \dots, x_n)$ for random variables $X_1, \dots, X_n$.

Similar to discrete random variables, the marginal density function of one random variable is obtained by integrating the other random variables out; for example, in the bivariate case, $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$.
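
A numerical version of the same idea, using an assumed joint density $f(x, y) = x + y$ on $[0, 1]^2$ (a textbook-style example, not one from the notes):

```python
# A minimal sketch: obtain a marginal density numerically by integrating
# the other variable out of an assumed joint density f(x, y) = x + y on
# the unit square.
from scipy.integrate import quad

def f_joint(x, y):
    # assumed joint density, valid for 0 <= x, y <= 1
    return x + y if (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0) else 0.0

def f_X(x):
    # f_X(x) = integral over y of f(x, y)
    val, _ = quad(lambda y: f_joint(x, y), 0.0, 1.0)
    return val

print(f_X(0.3))   # analytically x + 1/2 = 0.8
```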

Conditional Density Function

The conditional densities work identically to the discrete version:

$$f_{Y \mid X}(y \mid x) = \frac{f(x, y)}{f_X(x)},$$

where $f(x, y)$ is the joint density and $f_X(x)$ is the marginal density of $X$.

It follows that given the conditional and marginal densities, we can obtain the joint density: $f(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x)$.
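
Continuing the assumed joint density $f(x, y) = x + y$ on $[0, 1]^2$ from the previous sketch, the conditional density can be formed directly from the ratio above and checked to integrate to one:

```python
# A minimal sketch: build f_{Y|X}(y | x) = f(x, y) / f_X(x) for the same
# assumed joint density f(x, y) = x + y on the unit square.
from scipy.integrate import quad

def f_joint(x, y):
    return x + y if (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0) else 0.0

def f_X(x):
    val, _ = quad(lambda y: f_joint(x, y), 0.0, 1.0)
    return val

def f_Y_given_X(y, x):
    return f_joint(x, y) / f_X(x)

# Any conditional density must integrate to 1 over y.
total, _ = quad(lambda y: f_Y_given_X(y, 0.3), 0.0, 1.0)
print(total)   # ~1.0
```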

Conditional Mean

Let $E[Y \mid X = x]$ denote the mean of $Y$ given some realization $X = x$. It is defined as
$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y \mid X}(y \mid x)\, dy.$$
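
A numerical check of this definition for the assumed joint density $f(x, y) = x + y$ on $[0, 1]^2$ used above (for that density, $f_{Y \mid X}(y \mid x) = (x + y)/(x + 1/2)$):

```python
# A minimal sketch: compute E[Y | X = x] by numerical integration for the
# assumed joint density f(x, y) = x + y on the unit square.
from scipy.integrate import quad

def f_Y_given_X(y, x):
    # conditional density f(x, y) / f_X(x) for this assumed example
    return (x + y) / (x + 0.5)

def cond_mean(x):
    val, _ = quad(lambda y: y * f_Y_given_X(y, x), 0.0, 1.0)
    return val

print(cond_mean(0.3))   # analytically (x/2 + 1/3)/(x + 1/2), about 0.604
```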

Conditional Variance

The variance of $Y$ given some realization $X = x$ is defined as
$$\operatorname{Var}(Y \mid X = x) = \int_{-\infty}^{\infty} \left(y - E[Y \mid X = x]\right)^2 f_{Y \mid X}(y \mid x)\, dy = E[Y^2 \mid X = x] - \left(E[Y \mid X = x]\right)^2.$$
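
The same assumed example, now using the shortcut form $E[Y^2 \mid X = x] - (E[Y \mid X = x])^2$:

```python
# A minimal sketch: compute Var(Y | X = x) numerically via conditional
# moments, continuing the assumed joint density f(x, y) = x + y.
from scipy.integrate import quad

def f_Y_given_X(y, x):
    return (x + y) / (x + 0.5)

def cond_moment(x, k):
    # E[Y^k | X = x]
    val, _ = quad(lambda y: y**k * f_Y_given_X(y, x), 0.0, 1.0)
    return val

def cond_var(x):
    return cond_moment(x, 2) - cond_moment(x, 1) ** 2

print(cond_var(0.3))
```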

Independence

If the continuous random variables in the random vector are independent, then the joint density is the product of all the marginal densities: $f(x_1, \dots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$.

Thus, any conditional density reduces to the corresponding marginal density, e.g. $f_{Y \mid X}(y \mid x) = f_Y(y)$.

Furthermore, in its covariance matrix, all off-diagonal elements are 0. In particular, $\operatorname{Cov}(X_i, X_j) = 0$ for $i \ne j$.
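
A quick simulation sketch of the covariance claim, assuming two independent standard normal components (an illustrative choice):

```python
# A minimal sketch: for independent components the off-diagonal entries of
# the covariance matrix are zero. Two independent N(0, 1) columns are
# simulated and the sample covariance matrix is close to the identity.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(size=(100_000, 2))   # independent standard normals

print(np.cov(X, rowvar=False))               # off-diagonal entries near 0
```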

Example:

Suppose and that . What are the mean and variance of ? What fraction of the total variance is explained by ?

Recall exponential random variables: for rate $\lambda$, the expected value is $1/\lambda$ and the variance is $1/\lambda^2$. Since , the expected value of , an exponential random variable, is

Recall uniform random variables: for $U \sim \text{Uniform}(a, b)$, the expected value is $(a + b)/2$ and the variance is $(b - a)^2 / 12$. Plugging in , we get and .

Putting everything together with the law of total expectation, $E[Y] = E\!\left[E[Y \mid X]\right]$, we get

We compute the total variance using the law of total variance, $\operatorname{Var}(Y) = E[\operatorname{Var}(Y \mid X)] + \operatorname{Var}(E[Y \mid X])$:

To compute the percentage of explained variance, we take the explained variance, which is $\operatorname{Var}(E[Y \mid X])$, and divide it by the total variance, which gives
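
As a sanity check of the decomposition, the sketch below runs a Monte Carlo version of the same calculation under an assumed stand-in model, $X \sim \text{Uniform}(1, 3)$ and $Y \mid X = x$ exponential with mean $x$; both the model and the resulting numbers are purely illustrative.

```python
# A minimal sketch: Monte Carlo check of the decomposition
#   Var(Y) = E[Var(Y | X)] + Var(E[Y | X])
# and of the explained fraction Var(E[Y | X]) / Var(Y), under an ASSUMED
# stand-in model: X ~ Uniform(1, 3), Y | X = x ~ Exponential with mean x.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

X = rng.uniform(1.0, 3.0, size=n)        # hypothetical X
Y = rng.exponential(scale=X)             # Y | X = x has mean x, variance x^2

explained = np.var(X)                    # Var(E[Y | X]) since E[Y | X] = X
unexplained = np.mean(X**2)              # E[Var(Y | X)] since Var(Y | X) = X^2
total = np.var(Y)

print(total, explained + unexplained)    # agree up to Monte Carlo error
print(explained / total)                 # fraction of variance explained by X
```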

Functions of Continuous Random Vectors

Suppose we have a bijective function $g$; then for random vectors $Y$ and $X$ with $Y = g(X)$, we have $X = g^{-1}(Y)$.

Given the density $f_X$ of $X$, the density of $Y$ is
$$f_Y(y) = f_X\!\left(g^{-1}(y)\right) \lvert J \rvert,$$

where $J = \det\!\left(\partial g^{-1}(y) / \partial y\right)$ is the determinant of the Jacobian matrix of the inverse transformation, which is what makes the change of variables work.
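
A symbolic sketch of the formula, assuming the bijection $g(x_1, x_2) = (x_1 + x_2,\ x_1 - x_2)$ applied to two independent standard normals (an illustrative choice):

```python
# A minimal sketch: apply the change-of-variables formula
#   f_Y(y) = f_X(g^{-1}(y)) |J|
# for the assumed bijection g(x1, x2) = (x1 + x2, x1 - x2) and a joint
# density of two independent standard normals.
import sympy as sp

x1, x2, y1, y2 = sp.symbols("x1 x2 y1 y2")

# Inverse map g^{-1}(y1, y2) = ((y1 + y2)/2, (y1 - y2)/2)
inv = sp.Matrix([(y1 + y2) / 2, (y1 - y2) / 2])
J = inv.jacobian([y1, y2])
abs_det_J = sp.Abs(J.det())              # |J| = 1/2

# Joint density of two independent standard normals
f_X = sp.exp(-(x1**2 + x2**2) / 2) / (2 * sp.pi)

f_Y = sp.simplify(f_X.subs({x1: inv[0], x2: inv[1]}) * abs_det_J)
print(f_Y)   # exp(-(y1**2 + y2**2)/4) / (4*pi): two independent N(0, 2)s
```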

Linear Transformations

Suppose we have the random vector $Y = AX + b$, where $X$ is an $n$-dimensional random vector, $A$ is an $m \times n$ matrix, and $b$ is an $m$-dimensional vector.

Then the expected value and covariance transform as follows:
$$E[Y] = A\,E[X] + b, \qquad \operatorname{Cov}(Y) = A\,\operatorname{Cov}(X)\,A^{T}.$$
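
A short numerical sketch of these two identities with assumed illustrative values for $A$, $b$, $E[X]$, and $\operatorname{Cov}(X)$:

```python
# A minimal sketch: how the mean and covariance transform under Y = AX + b,
# i.e. E[Y] = A E[X] + b and Cov(Y) = A Cov(X) A^T. All values are assumed.
import numpy as np

mu_X = np.array([1.0, 2.0])                  # assumed E[X]
Sigma_X = np.array([[2.0, 0.5],
                    [0.5, 1.0]])             # assumed Cov(X)

A = np.array([[1.0, 1.0],
              [0.0, 2.0]])                   # assumed m x n matrix
b = np.array([3.0, -1.0])                    # assumed m-vector

mu_Y = A @ mu_X + b                          # E[Y] = A E[X] + b
Sigma_Y = A @ Sigma_X @ A.T                  # Cov(Y) = A Cov(X) A^T

print(mu_Y)
print(Sigma_Y)
```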