AP.STATS: VAR‑5 (EU), VAR‑5.E (LO), VAR‑5.E.2 (EK), VAR‑5.E.3 (EK)

Video transcript

What I want to do in this video is build up some tools in our toolkit for dealing with sums and differences of random variables. So let's say that we have two random variables, X and Y, and they are completely independent. Let me go over a little bit of notation first. The expected value of the random variable X, E(X), is the same thing as the mean of X, μ_X. Likewise, the expected value of Y, E(Y), is the same thing as the mean of Y, μ_Y. The variance of the random variable X is the expected value of the squared distance between X and its mean, E[(X - μ_X)²], and you can also use the notation σ²_X for it. You can do the same thing with the random variable Y: the variance of Y is the expected value of the squared difference between Y and its mean, E[(Y - μ_Y)²], which is the same thing as σ²_Y. This is just a review of things we already know, but I want to reintroduce it because I'll use it to build up some of our tools.

Now, you may or may not already know these properties of expected values and variances. I'll reintroduce them without a rigorous proof; I actually think they're fairly easy to digest. The first is this: if I define a third random variable, Z, to be the random variable X plus the random variable Y, what is the expected value of Z going to be? The expected value of Z is the expected value of X plus Y, and, by a property of expected values, that is the expected value of X plus the expected value of Y. Another way to think about this is that the mean of Z is the mean of X plus the mean of Y. Similarly, say I define another random variable, A, to be X minus Y. What is its expected value going to be? The expected value of A is the expected value of X minus Y, which you can view either as the expected value of X plus the expected value of negative Y, or as the expected value of X minus the expected value of Y, which is the same thing as the mean of X minus the mean of Y. So that's what the mean of our random variable A would be. All of this is review, and I'm going to use it when we start talking about distributions that are sums and differences of other distributions.
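In symbols, the two expectation facts just stated look like this (this is only a restatement of the properties above, with μ_X and μ_Y denoting the means defined earlier):

```latex
% Linearity of expectation, applied to the sum Z = X + Y and the difference A = X - Y
\begin{aligned}
E(Z) &= E(X + Y) = E(X) + E(Y) = \mu_X + \mu_Y \\
E(A) &= E(X - Y) = E(X) - E(Y) = \mu_X - \mu_Y
\end{aligned}
```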
Now let's think about what the variance of the random variable Z is, and what the variance of the random variable A is. To keep coming back to the intuition: it makes sense that if X is completely independent of Y, and I have some random variable that is the sum of the two, then the expected value of that new variable is the sum of the expected values of the other two, because they are unrelated. If my expected value here is 5 and my expected value here is 7, it's completely reasonable that my expected value here is 12, assuming they are completely independent.

So what is the variance of my random variable Z? Once again, I'm not going to do a rigorous proof here; this is really just a property of variances, but I'm going to use it to establish what the variance of our random variable A is. If this squared distance is, on average, some variance, and this other variable is completely independent and its squared distance is, on average, some variance, then the variance of their sum is going to be the sum of their variances. So the variance of Z is equal to the variance of the random variable X plus the variance of the random variable Y; in other words, the variance of Z, which is the same thing as the variance of X plus Y, is equal to the variance of X plus the variance of Y. Hopefully that makes some sense. I'm not proving it rigorously, but you'll see it in a lot of statistics books.

Now what I want to show you is that the variance of the random variable A is actually this exact same thing, and that's the interesting part, because you might say, hey, why wouldn't it be the difference? We had a difference over here. So let's experiment with this a little bit. The variance of the random variable A is the same thing as the variance of X minus Y, which you could view as the variance of X plus negative Y; these are equivalent statements. So, just using the property above, this is equal to the sum of two variances: the variance of X plus the variance of negative Y. What I need to show you is that the variance of the negative of that random variable is the same thing as the variance of Y.
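Written out symbolically, the two variance facts just stated (for independent X and Y) are:

```latex
% For independent X and Y, variances add, for both the sum and the difference
\begin{aligned}
\operatorname{Var}(Z) &= \operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) \\
\operatorname{Var}(A) &= \operatorname{Var}(X - Y) = \operatorname{Var}\bigl(X + (-Y)\bigr) = \operatorname{Var}(X) + \operatorname{Var}(-Y)
\end{aligned}
```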
So what is the variance of negative Y? It's the expected value of the squared difference between negative Y and the expected value of negative Y; that's all the variance actually is. Let me factor out a negative 1: what's inside the parentheses is the exact same thing as (-1)² times the quantity Y plus the expected value of negative Y, and that whole thing is squared. And the expected value of the negative of a random variable is just the negative of the expected value of that random variable, so E(-Y) = -E(Y). Since (-1)² is just 1, we can rewrite the whole thing as the expected value of (Y - E(Y))²; in symbols, Var(-Y) = E[(-Y - E(-Y))²] = E[(-1)²(Y - E(Y))²] = E[(Y - E(Y))²]. Notice that this last expression is, by definition, exactly the variance of Y.

So we just showed that the variance of the difference of two independent random variables is equal to the sum of their variances: it's the variance of the first one plus the variance of the negative of the second one, and that second variance is the same thing as the variance of the positive version of that variable. That makes sense: your distance from the mean doesn't depend on whether you take the positive or the negative of the variable; you only care about the absolute distance, so those two quantities are going to be the same thing.

Now, the whole reason I went through this exercise is for the important takeaways. The first is that the mean of the difference of two random variables is the same thing as the difference of their means. The second, which I'm going to build on in the next few videos, is that if I define a new random variable as the difference of two other independent random variables, the variance of that new random variable is the sum of the variances of the two random variables. These are the two important takeaways we'll use to build on in future videos. Anyway, hopefully that wasn't too confusing, and if it was, you can just accept these at face value and assume they're tools you can use.
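As a quick numerical check of the two takeaways (this is not part of the video), here is a small simulation sketch in Python with NumPy; the normal distributions and the particular means and standard deviations are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000_000

# Two independent random variables; the normal distributions here are an arbitrary choice
x = rng.normal(loc=5.0, scale=2.0, size=n)
y = rng.normal(loc=7.0, scale=3.0, size=n)

a = x - y  # the difference A = X - Y

# Mean of the difference should be close to the difference of the means
print(a.mean(), x.mean() - y.mean())   # both near 5 - 7 = -2

# Variance of the difference should be close to the SUM of the variances
print(a.var(), x.var() + y.var())      # both near 2^2 + 3^2 = 13
```

With a million samples, each pair of printed numbers should agree to a couple of decimal places, matching the two takeaways above.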
AP® is a registered trademark of the College Board, which has not reviewed this resource.