ELEC 321
# Continuous Random Vectors

Updated 2017-12-04

- All the entries are *continuous* random variables.
- The joint behavior is determined by the *continuous joint density function* $f$, a function that maps $\mathbb{R}^n \to \mathbb{R}$, where $n$ is the number of entries in the vector.
- $f$ satisfies the following:
  - $f(x_1, \dots, x_n) \geq 0$ for all $(x_1, \dots, x_n)$
  - $\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, \dots, x_n)\, dx_1 \cdots dx_n = 1$
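As a quick numerical sanity check of the two conditions above, here is a minimal sketch. The density $f(x, y) = e^{-x-y}$ for $x, y \geq 0$ (two independent Exp(1) variables) is a hypothetical choice, not one from the notes:

```python
import numpy as np
from scipy.integrate import dblquad

# Hypothetical joint density (an assumption, not from the notes):
# f(x, y) = e^(-x - y) for x, y >= 0  (two independent Exp(1) variables).
def f(x, y):
    return np.exp(-x - y) if (x >= 0 and y >= 0) else 0.0

# Condition 1: f >= 0 everywhere (true by construction here).
# Condition 2: the double integral over the whole plane equals 1.
total, _err = dblquad(lambda y, x: f(x, y), 0, np.inf, 0, np.inf)
print(round(total, 6))  # -> 1.0
```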

Take, for instance, a pair of continuous random variables $(X, Y)$. The corresponding joint distribution function is a multivariable function given by $F(x, y) = P(X \leq x, Y \leq y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\, dv\, du$.

By the *FTC*, we obtain the joint density function: $f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y)$
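A small symbolic sketch of this step, assuming a hypothetical joint CDF $F(x, y) = (1 - e^{-x})(1 - e^{-y})$ (that of two independent Exp(1) variables):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Hypothetical joint CDF (an assumption): F(x, y) = (1 - e^-x)(1 - e^-y).
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))

# Joint density via the FTC: f(x, y) = d^2 F / (dx dy)
f = sp.simplify(sp.diff(F, x, y))
print(f)  # equals exp(-x) * exp(-y), up to sympy's printing
```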

The main model is given by:

Suppose we have the joint density function $f(x_1, \dots, x_n)$ for random variables $X_1, \dots, X_n$.

Similar to discrete random variables, the marginal density function of one random variable is obtained by integrating the other random variables out:

$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, x_2, \dots, x_n)\, dx_2 \cdots dx_n$$
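A numerical sketch of integrating a variable out, assuming a hypothetical joint density $f(x, y) = 2e^{-x - 2y}$ for $x, y \geq 0$ (whose $X$-marginal should be $e^{-x}$):

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical joint density (an assumption): f(x, y) = 2*e^(-x - 2y), x, y >= 0.
# Integrating y out should recover the marginal f_X(x) = e^(-x).
def joint(x, y):
    return 2 * np.exp(-x - 2 * y)

def marginal_x(x):
    val, _err = quad(lambda y: joint(x, y), 0, np.inf)
    return val

for x in (0.5, 1.0, 2.0):
    print(round(marginal_x(x), 6), round(np.exp(-x), 6))  # pairs should match
```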

The conditional densities work identically to the discrete version:

$$f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}$$

where $f_{X,Y}(x, y)$ is the joint density and $f_X(x)$ is the marginal density.

It follows that given the conditional and marginal densities, we can recover the joint density:

$$f_{X,Y}(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x)$$

Let $E[Y \mid X = x]$ denote the mean of $Y$ given some realization $x$ of $X$. It is defined as

$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y \mid X}(y \mid x)\, dy$$

The variance of $Y$ given some $X = x$ is defined as

$$\operatorname{Var}(Y \mid X = x) = \int_{-\infty}^{\infty} \left(y - E[Y \mid X = x]\right)^2 f_{Y \mid X}(y \mid x)\, dy$$
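The two definitions above can be evaluated numerically. As a sketch, assume the hypothetical conditional density $f(y \mid x) = x e^{-xy}$ (i.e. $Y \mid X = x \sim$ Exp with rate $x$), so the conditional mean should be $1/x$ and the conditional variance $1/x^2$:

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical conditional density (an assumption): given X = x > 0,
# Y | X = x ~ Exponential with rate x, i.e. f(y | x) = x * e^(-x*y).
def cond_density(y, x):
    return x * np.exp(-x * y)

def cond_mean(x):
    val, _err = quad(lambda y: y * cond_density(y, x), 0, np.inf)
    return val

def cond_var(x):
    m = cond_mean(x)
    val, _err = quad(lambda y: (y - m) ** 2 * cond_density(y, x), 0, np.inf)
    return val

x = 2.0
print(round(cond_mean(x), 6), round(cond_var(x), 6))  # Exp(rate 2): mean 1/2, var 1/4
```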

If the continuous random variables in the random vector are independent, then the joint density is the product of the marginal densities:

$$f(x_1, \dots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$$

Thus, any conditional density reduces to the corresponding marginal density: $f_{Y \mid X}(y \mid x) = f_Y(y)$.

Furthermore, all off-diagonal elements of its covariance matrix are 0. In particular, $\operatorname{Cov}(X_i, X_j) = 0$ for $i \neq j$.
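A quick Monte Carlo sketch of the diagonal covariance claim, using an arbitrary hypothetical vector of independent components:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a vector of independent components (hypothetical choices):
# X1 ~ Exp(1), X2 ~ Uniform(0, 1), X3 ~ Normal(0, 1).
n = 200_000
X = np.column_stack([
    rng.exponential(1.0, n),
    rng.uniform(0.0, 1.0, n),
    rng.normal(0.0, 1.0, n),
])

C = np.cov(X, rowvar=False)
# Under independence the off-diagonal entries should be close to 0.
off_diag = C - np.diag(np.diag(C))
print(np.max(np.abs(off_diag)) < 0.02)  # -> True
```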

Example: Suppose and that . What are the mean and variance of ? What fraction of the total variance is explained by ?

Recall exponential random variables: the expected value is $1/\lambda$ and the variance is $1/\lambda^2$. Since , the expected value of , an exponential random variable, is

Recall uniform random variables: the expected value is $(a + b)/2$ and the variance is $(b - a)^2/12$. Plugging , we get and .

Putting everything together, we get

We compute the total variance:

To compute the percentage of explained variance, we take the explained variance, which is , and divide it by the total variance, which gives
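The exact distributions in the example above are missing from these notes, so as a stand-in, here is a simulation sketch of the same variance decomposition under a hypothetical setup: $X \sim \mathrm{Uniform}(0, 1)$ and, given $X = x$, $Y \sim$ Exponential with mean $x$. By the law of total variance, $\operatorname{Var}(Y) = E[\operatorname{Var}(Y \mid X)] + \operatorname{Var}(E[Y \mid X]) = E[X^2] + \operatorname{Var}(X) = 1/3 + 1/12 = 5/12$, and the fraction explained by $X$ is $(1/12)/(5/12) = 0.2$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup (an assumption, not the notes' example):
# X ~ Uniform(0, 1); given X = x, Y ~ Exponential with mean x.
# Then E[Y|X] = X and Var(Y|X) = X^2.
n = 1_000_000
X = rng.uniform(0.0, 1.0, n)
Y = rng.exponential(X)          # per-sample exponential means

explained = np.var(X)           # Var(E[Y|X]), since E[Y|X] = X
total = np.var(Y)               # should be close to 5/12
print(round(total, 2), round(explained / total, 2))
```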

Suppose we have a bijective function $g$ relating random vectors $X$ and $Y$ by $Y = g(X)$, i.e. $X = g^{-1}(Y)$.

Given the density $f_X$ of $X$, the density of $Y$ is

$$f_Y(y) = f_X\left(g^{-1}(y)\right)\, \left|\det J_{g^{-1}}(y)\right|$$

where $J_{g^{-1}}$ is the **Jacobian** matrix of the inverse transformation, which is used for such changes of variables.
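A symbolic sketch of computing the Jacobian factor, for the hypothetical transformation $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$ (whose inverse is $X_1 = (Y_1 + Y_2)/2$, $X_2 = (Y_1 - Y_2)/2$):

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2')

# Hypothetical transformation (an assumption): Y1 = X1 + X2, Y2 = X1 - X2.
# Its inverse maps (y1, y2) back to x-space.
inverse = sp.Matrix([(y1 + y2) / 2, (y1 - y2) / 2])

# Jacobian matrix of the inverse map and its determinant.
J = inverse.jacobian([y1, y2])
det = sp.Abs(J.det())
print(J.det(), det)  # determinant -1/2, so |det J| = 1/2
```

So here $f_Y(y_1, y_2) = f_X\big((y_1 + y_2)/2,\, (y_1 - y_2)/2\big) \cdot \tfrac{1}{2}$.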

Suppose we have the random vector $Y = AX + b$, where $X$ is an $n$-dimensional random vector, $A$ is an $m \times n$ matrix, and $b$ is an $m$-dimensional vector.

Then the expected value and covariance transform as follows:

$$E[Y] = A\, E[X] + b, \qquad \operatorname{Cov}(Y) = A\, \operatorname{Cov}(X)\, A^{\top}$$
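These two formulas can be checked by simulation. A sketch with hypothetical parameters (all of $\mu$, $\Sigma$, $A$, $b$ below are arbitrary choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical parameters (assumptions) for Y = A X + b:
mu = np.array([1.0, -1.0, 0.5])           # E[X]
Sigma = np.array([[2.0, 0.3, 0.0],        # Cov(X)
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 0.5]])
A = np.array([[1.0, 2.0, 0.0],            # 2x3 matrix
              [0.0, 1.0, -1.0]])
b = np.array([0.5, -0.5])

# Simulate X and transform; empirical moments should match
#   E[Y] = A mu + b   and   Cov(Y) = A Sigma A^T.
X = rng.multivariate_normal(mu, Sigma, size=500_000)
Y = X @ A.T + b

print(np.allclose(Y.mean(axis=0), A @ mu + b, atol=0.02))                 # -> True
print(np.allclose(np.cov(Y, rowvar=False), A @ Sigma @ A.T, atol=0.05))   # -> True
```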