Combining random variables
Variance of sum and difference of random variables
- So, we've defined two random variables here. The first random variable, X, is the weight of the cereal in a random closed box of our favorite cereal, Mathies. And we know a few other things about it. We know the expected value of X: it is equal to 16 ounces. In fact, they tell it to us on the box, they say, you know, net weight, 16 ounces. Now, when you see that on a cereal box, it doesn't mean that every box is going to be exactly 16 ounces. Remember, you have a discrete number of these flakes in here; they might have slightly different densities, slightly different shapes, depending on how they get packed into this volume, so there is some variation, which you can measure with standard deviation. So, the standard deviation, let's just say for the sake of argument, for the random variable X is 0.8 ounces. And just to build our intuition a little bit later in this video, let's say that the random variable X always stays constrained within a range, so that if a box goes above a certain weight or below a certain weight, the company that produces it just throws out that box. And so, let's say that our random variable X is always greater than or equal to 15 ounces and always less than or equal to 17 ounces, just for the sake of argument. This'll help us build our intuition later on. Now, separately, let's consider a bowl, and we're always gonna consider the same size bowl. Let's call it a four-ounce bowl, because the expected value of Y, the weight of the Mathies when someone fills that bowl, is going to be four ounces. But once again, there's going to be some variation: it depends on who filled it, how it's packed in, did they shake it while they were filling it? There could be all sorts of things that could cause some variation here.
And so, for the sake of argument, let's say that variation can be measured by a standard deviation of 0.6 ounces. And let's say whoever the bowl fillers are, they also don't like bowls that are too heavy or too light, so they'll also throw out bowls. So we can say that the maximum value Y will ever take on is five ounces, and the minimum value it could ever take on, let's say, is three ounces. So, given all of this information, what I wanna do is take a random box of Mathies and a random filled bowl, and think about the combined weight of the closed box and the filled bowl. So, what I wanna think about is, really, X plus Y. I wanna think about the sum of the random variables. From previous videos, we already know that the expected value of this is just gonna be the sum of the expected values of each of the random variables. So, it would be the expected value of X plus the expected value of Y, and so it'd be 16 plus four ounces, which in this case would be equal to 20 ounces. But what about the variation? Can we just add up the standard deviations? If I wanna figure out the standard deviation of X plus Y, how can I do this? Well, it turns out that you can't just add up the standard deviations, but you can add up the variances. So, it is the case that the variance of X plus Y is equal to the variance of X plus the variance of Y. And actually, this assumes independent random variables. So, it assumes X and Y are independent, I'm gonna write it in caps. In a future video, I'm going to give you hopefully a better intuition for why they must be independent in order to make this claim right over here. I'm not going to prove it in this video, but we can build a little bit of intuition.
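The claim above can be checked with a quick simulation. This is only a sketch: the video never specifies the distributions of X and Y, so modeling them as normal with the stated means (16 and 4 ounces) and standard deviations (0.8 and 0.6 ounces) is an assumption made here for illustration.

```python
import random
import statistics

random.seed(42)

N = 200_000
# Hypothetical model: the video only gives means and standard deviations,
# so normal distributions are assumed here for the simulation.
xs = [random.gauss(16.0, 0.8) for _ in range(N)]   # box weights X
ys = [random.gauss(4.0, 0.6) for _ in range(N)]    # bowl weights Y, drawn independently of X

sums = [x + y for x, y in zip(xs, ys)]

mean_sum = statistics.fmean(sums)
var_sum = statistics.pvariance(sums)

print(f"E[X+Y]   ~ {mean_sum:.3f}  (theory: 16 + 4 = 20)")
print(f"Var(X+Y) ~ {var_sum:.3f}  (theory: 0.8^2 + 0.6^2 = 1.0)")
```

With 200,000 samples, both estimates land very close to the theoretical values, matching the rule that variances (not standard deviations) add for independent variables.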
Here, for each of these random variables, we have a range of two ounces over which the random variable can vary, and that's true for both of them. But what about the sum? Well, X plus Y, what's the maximum value that it could take on? Well, if you get a heavy version of each of these, then it's going to be 17 plus five. So, X plus Y has to be less than or equal to 22 ounces. And what's the lightest possible scenario? Well, if you get a 15-ouncer here and a three-ouncer here, that's 18 ounces, so X plus Y is greater than or equal to 18 ounces. And so, notice, now the variation for the sum is larger. This thing can take on a range of four, while the range for each of these was just two. Or, another way you could think about it, the upper and lower ends of this range are further from the mean than the upper and lower ends of the individual ranges were from their respective means. So, hopefully, this gives you an intuition for why this makes sense. Let me ask you another question: what about the variance of X minus Y? What would this be? Would you subtract the variances of the random variables? Well, let's just do the exact same exercise. Let's take X minus Y and think about it. What would be the lowest value that X minus Y could take on? Well, the lowest value is if you have a low X and a high Y, so it'd be 15 minus five, which is 10 right over here; that would be the lowest value it could take on. And what would be the highest value? Well, the highest value is if you have a high X and a low Y, so 17 minus three is 14. So, notice, just as we saw in the case of the sum, even in the difference, your variability seems to have increased. The extremes are still further from the mean of the difference, and the mean of the difference would be 16 minus four, which is 12.
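The same simulation idea applies to the difference. Again, this sketch assumes the hypothetical normal model (X with mean 16 and standard deviation 0.8, Y with mean 4 and standard deviation 0.6, drawn independently), since the video does not specify distributions:

```python
import random
import statistics

random.seed(0)

N = 200_000
# Same hypothetical normal model as the video's parameters (an assumption):
xs = [random.gauss(16.0, 0.8) for _ in range(N)]
ys = [random.gauss(4.0, 0.6) for _ in range(N)]

diffs = [x - y for x, y in zip(xs, ys)]

mean_diff = statistics.fmean(diffs)
var_diff = statistics.pvariance(diffs)

print(f"E[X-Y]   ~ {mean_diff:.3f}  (theory: 16 - 4 = 12)")
print(f"Var(X-Y) ~ {var_diff:.3f}  (theory: 0.64 + 0.36 = 1.0, NOT 0.64 - 0.36)")
```

The simulated variance of the difference comes out near 1.0, not 0.28, confirming that subtracting the variables still adds their variances.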
These extreme values are two away from 12. And this is just to give us an intuition; once again, it's not a rigorous proof. So, it actually turns out that in either case, whether you're taking the variance of X plus Y or of X minus Y, you would sum the variances, assuming X and Y are independent random variables. Now, with that out of the way, let's just calculate the standard deviation of X plus Y. Well, we know this, let me just write it using this sigma notation: another way of writing the variance of X plus Y is to write the standard deviation of X plus Y, squared, and that's going to be equal to the variance of X plus the variance of Y. Now, what is the variance of X? Well, that's the standard deviation of X squared, 0.8 squared, which is 0.64. The standard deviation of Y is 0.6; you square it to get the variance, that's 0.36. You add these two up and you are going to get one. So, the variance of the sum is one, and then if you take the square root of both sides, you get that the standard deviation of the sum is also going to be one. And that just happened to work out neatly because we're dealing with a scenario where the variance is one, and the square root of one is, well, one. So, this hopefully builds your intuition: whether we are adding or subtracting two independent random variables, the variability of the sum or the difference will increase. In the next video, we'll go into some depth, talking about getting an intuition for why independence is an important condition for making this claim.
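The arithmetic from this calculation, plus the pitfall of adding standard deviations directly, can be written out in a few lines (using the video's numbers, 0.8 and 0.6 ounces):

```python
import math

sd_x, sd_y = 0.8, 0.6                 # standard deviations from the video

var_sum = sd_x**2 + sd_y**2           # 0.64 + 0.36 = 1.0 (add variances)
sd_sum = math.sqrt(var_sum)           # sqrt(1.0) = 1.0 ounce

# Adding the standard deviations themselves would overstate the spread:
naive = sd_x + sd_y                   # 1.4, which is NOT the true 1.0

print(f"sd(X+Y) = {sd_sum:.2f} ounces; naively adding SDs gives {naive:.2f}")
```

It only looks like the standard deviations "work out" here because the total variance happens to be 1; with any other numbers, the naive sum of standard deviations would disagree with the correct answer.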