Transformation of Variables
Let $X$ be a continuous random variable with range $A$, and let $g$ be a differentiable and invertible real function on $A$. Then the p.d.f. of $Y = g(X)$ is $f_Y(y) = f_X(g^{-1}(y))\left|\frac{d}{dy}g^{-1}(y)\right|$ for $y \in g(A)$.
Proof If $g$ is increasing, $g^{-1}$ is also increasing. The c.d.f. is $F_Y(y) = P(g(X) \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y))$, so $f_Y(y) = f_X(g^{-1}(y))\frac{d}{dy}g^{-1}(y)$.
If $g$ is decreasing, then the c.d.f. is $F_Y(y) = P(g(X) \le y) = P(X \ge g^{-1}(y)) = 1 - F_X(g^{-1}(y))$, so $f_Y(y) = -f_X(g^{-1}(y))\frac{d}{dy}g^{-1}(y)$. In both cases the derivative factor appears with the sign that makes it positive, which gives the absolute value in the statement.
Example If $X \sim U(0,1)$ and $Y = -\ln X$, then $Y \sim \text{Exp}(1)$.
Solution Since $Y \le y$ if and only if $X \ge e^{-y}$, we have $F_Y(y) = P(X \ge e^{-y}) = 1 - e^{-y}$ for $y > 0$, so $f_Y(y) = e^{-y}$. Equivalently, apply the theorem with the decreasing function $g(x) = -\ln x$: $f_Y(y) = f_X(e^{-y})\cdot\left|-e^{-y}\right| = e^{-y}$.
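As a quick numerical sanity check of the theorem, here is a minimal Monte Carlo sketch in Python, assuming the $U(0,1)$, $Y = -\ln X$ instance above (the seed and sample size are arbitrary):

```python
# Minimal Monte Carlo check of the transformation theorem:
# X ~ U(0,1), Y = -ln X should follow Exp(1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(size=100_000)
y = -np.log(x)

# Theorem: f_Y(y) = f_X(g^{-1}(y)) * |d/dy g^{-1}(y)| = 1 * e^{-y},
# the Exp(1) density; compare the sample against Exp(1).
print(stats.kstest(y, "expon").pvalue)  # large p-value: consistent with Exp(1)
```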
Suppose we have continuous random variables $X_1, \dots, X_n$ with joint p.d.f. $f_{\mathbf{X}}$, and $g : \mathbb{R}^n \to \mathbb{R}^n$ is 1-1. Then the change of variables formula tells us that $\mathbf{Y} = g(\mathbf{X})$ has joint p.d.f. $f_{\mathbf{Y}}(\mathbf{y}) = f_{\mathbf{X}}(g^{-1}(\mathbf{y}))\,|J|$, where $J$ is the Jacobian determinant of $g^{-1}$ at $\mathbf{y}$.
Example Suppose $X_1 \sim N(0,1)$ and $X_2 \sim N(0,1)$ are independent, and let $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$. What is the joint p.d.f. of $(Y_1, Y_2)$ and the marginal p.d.f. of each?
Solution The joint p.d.f. of $(X_1, X_2)$ is $f_{\mathbf{X}}(x_1, x_2) = \frac{1}{2\pi}e^{-(x_1^2 + x_2^2)/2}$. The inverse map is $x_1 = \frac{y_1 + y_2}{2}$, $x_2 = \frac{y_1 - y_2}{2}$.
The Jacobian is $J = \det\begin{pmatrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{pmatrix} = -\frac{1}{2}$ for all $(y_1, y_2)$, so $f_{\mathbf{Y}}(y_1, y_2) = f_{\mathbf{X}}\!\left(\frac{y_1+y_2}{2}, \frac{y_1-y_2}{2}\right)\cdot\frac{1}{2} = \frac{1}{4\pi}e^{-(y_1^2+y_2^2)/4}$. So the marginal p.d.f. of $Y_1$ can be calculated as $f_{Y_1}(y_1) = \int_{-\infty}^{\infty} f_{\mathbf{Y}}(y_1, y_2)\,dy_2 = \frac{1}{\sqrt{4\pi}}e^{-y_1^2/4}$, i.e. $Y_1 \sim N(0,2)$, and by symmetry $Y_2 \sim N(0,2)$.
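The bivariate example can be checked the same way; this sketch assumes the $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$ instance above and tests the two $N(0,2)$ marginals:

```python
# Check: X1, X2 i.i.d. N(0,1); Y1 = X1 + X2 and Y2 = X1 - X2
# should each be N(0,2), and uncorrelated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x1, x2 = rng.standard_normal((2, 200_000))
y1, y2 = x1 + x2, x1 - x2

print(stats.kstest(y1, "norm", args=(0, np.sqrt(2))).pvalue)  # Y1 ~ N(0,2)
print(stats.kstest(y2, "norm", args=(0, np.sqrt(2))).pvalue)  # Y2 ~ N(0,2)
print(np.corrcoef(y1, y2)[0, 1])  # near 0, consistent with independence
```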
Definition If a sequence of r.v.'s $X_1, \dots, X_n$ are independent and identically distributed (i.i.d.), then they are called a random sample.
So if $X_1, \dots, X_n$ is a random sample from a distribution, the joint p.d.f. is $f(x_1, \dots, x_n) = \prod_{i=1}^n f(x_i)$, where $f$ is the p.d.f. of any $X_i$ in the random sample.
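For a concrete instance of this product formula, here is a tiny sketch evaluating the joint p.d.f. of a hypothetical $N(0,1)$ random sample at one arbitrary sample point:

```python
# Joint p.d.f. of an i.i.d. sample = product of the marginal p.d.f.s,
# illustrated for a N(0,1) sample at the point (0.3, -1.2, 0.8).
import numpy as np
from scipy import stats

x = np.array([0.3, -1.2, 0.8])
joint_pdf = np.prod(stats.norm.pdf(x))  # f(x1) * f(x2) * f(x3)
print(joint_pdf)
```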
Definition Suppose we have a random variable $X$ with p.d.f. $f(x; \theta)$, where $\theta$ is an unknown vector. We then call $\theta$ a parameter, and the set of $\theta$'s possible values, denoted $\Omega$, is called the parameter space.
Example If $X \sim N(\mu, \sigma^2)$, its p.d.f. has parameters $\mu$ and $\sigma^2$, with parameter space $\Omega = \{(\mu, \sigma^2) : \mu \in \mathbb{R},\ \sigma^2 > 0\}$.
Example If $X \sim \text{Poisson}(\lambda)$, $X$'s p.d.f. has parameter $\lambda$, with parameter space $\Omega = \{\lambda : \lambda > 0\}$.
Definition For a random sample $X_1, \dots, X_n$, any function $T = T(X_1, \dots, X_n)$ that does not depend on the parameter $\theta$ is called a statistic.
Example $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$ is called the sample mean and $S^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$ is called the sample variance, and they are both statistics.
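Both statistics are one-liners in NumPy; in this sketch (arbitrary data), note that `ddof=1` gives the $\frac{1}{n-1}$ divisor in the definition of $S^2$:

```python
# Sample mean and sample variance of a simulated sample.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=2.0, size=50)
xbar = x.mean()        # sample mean
s2 = x.var(ddof=1)     # sample variance with the 1/(n-1) divisor
print(xbar, s2)
```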
Theorem Two random variables $X, Y$ are independent if and only if $M_{X,Y}(s,t) = M_X(s)\,M_Y(t)$.
Proof If $X, Y$ are independent, then $M_{X,Y}(s,t) = E[e^{sX+tY}] = \int\!\!\int e^{sx}e^{ty}\,dF_X(x)\,dF_Y(y) = \int e^{sx}\,dF_X(x)\int e^{ty}\,dF_Y(y) = M_X(s)M_Y(t)$. Note that this argument handles both the continuous and the discrete case at once, which shows the usefulness of the Riemann–Stieltjes integral.
Now if $M_{X,Y}(s,t) = M_X(s)M_Y(t)$, then we will use the fact that an m.g.f. uniquely determines a distribution. Since
$$\iint e^{sx+ty}f_{X,Y}(x,y)\,dx\,dy = M_{X,Y}(s,t) = M_X(s)M_Y(t) = \iint e^{sx+ty}f_X(x)f_Y(y)\,dx\,dy,$$
the first integral is the m.g.f. of the density $f_{X,Y}$ and the last is the m.g.f. of the density $f_X f_Y$, so by uniqueness $f_{X,Y}(x,y) = f_X(x)f_Y(y)$, i.e. $X$ and $Y$ are independent.
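The factorization criterion can be illustrated empirically by comparing the sample analogue of $M_{X,Y}(s,t)$ with the product of the marginal sample m.g.f.s; this sketch assumes independent normal and exponential inputs and an arbitrary $(s,t)$:

```python
# Empirical m.g.f. factorization for independent X and Y:
# mean(e^{sX+tY}) should approximate mean(e^{sX}) * mean(e^{tY}).
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(500_000)
y = rng.exponential(size=500_000)

s, t = 0.4, 0.3
joint = np.mean(np.exp(s * x + t * y))                     # ~ M_{X,Y}(s,t)
product = np.mean(np.exp(s * x)) * np.mean(np.exp(t * y))  # ~ M_X(s) M_Y(t)
print(joint, product)  # approximately equal
```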
Now let's introduce a more general notion of independence, for random vectors. We say $\mathbf{X} = (X_1, \dots, X_m)$ and $\mathbf{Y} = (Y_1, \dots, Y_n)$ are independent if $f_{\mathbf{X},\mathbf{Y}}(\mathbf{x}, \mathbf{y}) = f_{\mathbf{X}}(\mathbf{x})f_{\mathbf{Y}}(\mathbf{y})$, where $f_{\mathbf{X},\mathbf{Y}}$, $f_{\mathbf{X}}$ and $f_{\mathbf{Y}}$ are the joint p.d.f.s of $(\mathbf{X}, \mathbf{Y})$, the $X_i$'s and the $Y_j$'s respectively.
If $\mathbf{X}$ and $\mathbf{Y}$ are independent, then $u(\mathbf{X})$ and $v(\mathbf{Y})$ are independent for any functions $u$ and $v$. Some consequences then follow:
Example If $X_1, \dots, X_n$ are independent chi-square random variables with $X_i \sim \chi^2(r_i)$, then $\sum_{i=1}^n X_i \sim \chi^2(r_1 + \dots + r_n)$.
Proof Find the m.g.f. of $\sum_{i=1}^n X_i$ and we are done: by independence, $M_{\sum X_i}(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n (1-2t)^{-r_i/2} = (1-2t)^{-(r_1 + \dots + r_n)/2}$ for $t < \frac{1}{2}$, which is the m.g.f. of $\chi^2(r_1 + \dots + r_n)$.
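A quick simulation sketch of this additivity (the degrees of freedom 2 and 3 are arbitrary choices):

```python
# X1 ~ chi^2(2) and X2 ~ chi^2(3) independent => X1 + X2 ~ chi^2(5).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x1 = rng.chisquare(df=2, size=200_000)
x2 = rng.chisquare(df=3, size=200_000)
print(stats.kstest(x1 + x2, "chi2", args=(5,)).pvalue)  # consistent with chi^2(5)
```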
Example If $X \sim N(0,1)$, then $Y = X^2 \sim \chi^2(1)$.
Proof Now this doesn't look like something you can solve with the m.g.f., so we appeal to the c.d.f. Note that if $f$ is an even function, then $\int_{-a}^{a} f(x)\,dx = 2\int_0^a f(x)\,dx$. For $y > 0$,
$$F_Y(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = 2\int_0^{\sqrt{y}} \frac{1}{\sqrt{2\pi}}e^{-x^2/2}\,dx.$$
So the p.d.f. of $Y$ is $f_Y(y) = F_Y'(y) = \frac{1}{\sqrt{2\pi}}y^{-1/2}e^{-y/2}$ for $y > 0$, which is the $\chi^2(1)$ density.
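And a matching simulation check that squaring a standard normal produces $\chi^2(1)$:

```python
# Z ~ N(0,1) => Z^2 ~ chi^2(1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
z = rng.standard_normal(200_000)
print(stats.kstest(z**2, "chi2", args=(1,)).pvalue)  # consistent with chi^2(1)
```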
Suppose that $X_1, \dots, X_n$ are independent. We know that $(X_1, \dots, X_k)$ and $(X_{k+1}, \dots, X_n)$ are independent, so $u(X_1, \dots, X_k)$ and $v(X_{k+1}, \dots, X_n)$ are independent. But in general, two functions of all of $X_1, \dots, X_n$, such as $\bar{X}$ and $S^2$, are not independent.
Theorem If $X_1, \dots, X_n$ are a random sample from $N(\mu, \sigma^2)$, then
- The sample mean $\bar{X} \sim N(\mu, \sigma^2/n)$.
- The sample variance $S^2$ and sample mean $\bar{X}$ are independent.
- $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$.
Proof
- This can be proved by m.g.f. The m.g.f. of $\bar{X}$ is $M_{\bar{X}}(t) = E\!\left[e^{\frac{t}{n}\sum_i X_i}\right] = \prod_{i=1}^n M_{X_i}(t/n) = \left[\exp\!\left(\frac{\mu t}{n} + \frac{\sigma^2 t^2}{2n^2}\right)\right]^n = \exp\!\left(\mu t + \frac{(\sigma^2/n)t^2}{2}\right)$, which is the m.g.f. of $N(\mu, \sigma^2/n)$.
- To show that $\bar{X}$ and $S^2$ are independent, we need only show that $\bar{X}$ and $(X_1 - \bar{X}, \dots, X_n - \bar{X})$ are independent, since $S^2$ is a function of the latter. The joint m.g.f. of $\bar{X}$ and $(X_1 - \bar{X}, \dots, X_n - \bar{X})$ is
$$M(s, t_1, \dots, t_n) = E\!\left[\exp\!\left(s\bar{X} + \sum_i t_i(X_i - \bar{X})\right)\right] = E\!\left[\exp\!\left(\sum_i a_i X_i\right)\right], \qquad a_i = \frac{s}{n} + t_i - \bar{t},$$
where $\bar{t} = \frac{1}{n}\sum_i t_i$. And since the $X_i$ are independent $N(\mu, \sigma^2)$, with $\sum_i a_i = s$ and $\sum_i a_i^2 = \frac{s^2}{n} + \sum_i (t_i - \bar{t})^2$,
$$M(s, t_1, \dots, t_n) = \prod_i \exp\!\left(\mu a_i + \frac{\sigma^2 a_i^2}{2}\right) = \exp\!\left(\mu s + \frac{\sigma^2 s^2}{2n}\right)\exp\!\left(\frac{\sigma^2}{2}\sum_i (t_i - \bar{t})^2\right),$$
a function of $s$ alone times a function of $(t_1, \dots, t_n)$ alone, so the two are independent.
- First we need the following equality: $\sum_{i=1}^n (X_i - \mu)^2 = \sum_{i=1}^n (X_i - \bar{X})^2 + n(\bar{X} - \mu)^2$, because the cross term $2(\bar{X} - \mu)\sum_i(X_i - \bar{X}) = 0$. Dividing both sides by $\sigma^2$,
$$\sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 = \frac{(n-1)S^2}{\sigma^2} + \left(\frac{\bar{X} - \mu}{\sigma/\sqrt{n}}\right)^2.$$
Now $\sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2(n)$ and $\left(\frac{\bar{X} - \mu}{\sigma/\sqrt{n}}\right)^2 \sim \chi^2(1)$. And since $\bar{X}$ and $S^2$ are independent, taking m.g.f.s of both sides gives $(1-2t)^{-n/2} = M_{(n-1)S^2/\sigma^2}(t)\,(1-2t)^{-1/2}$.
Note that the above equation holds for all $t < \frac{1}{2}$, so it must be the case that $M_{(n-1)S^2/\sigma^2}(t) = (1-2t)^{-(n-1)/2}$, i.e. $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$, by the 1-1 correspondence between distributions and m.g.f.s.
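All three parts of the theorem can be checked by simulation; this sketch (arbitrary parameters) draws many samples of size $n$, then tests the distribution of $\bar{X}$, the near-zero correlation between $\bar{X}$ and $S^2$, and the $\chi^2(n-1)$ distribution of $(n-1)S^2/\sigma^2$:

```python
# Sampling-distribution checks for a N(mu, sigma^2) random sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
mu, sigma, n, reps = 1.0, 2.0, 10, 100_000
samples = rng.normal(mu, sigma, size=(reps, n))

xbar = samples.mean(axis=1)        # sample means
s2 = samples.var(axis=1, ddof=1)   # sample variances

print(stats.kstest(xbar, "norm", args=(mu, sigma / np.sqrt(n))).pvalue)    # N(mu, sigma^2/n)
print(np.corrcoef(xbar, s2)[0, 1])                                         # near 0
print(stats.kstest((n - 1) * s2 / sigma**2, "chi2", args=(n - 1,)).pvalue) # chi^2(n-1)
```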