Mean of sum and difference of random variables
Want to join the conversation?
- Hi, can someone please clarify a basic confusion? Say I have two hypothetical independent random variables X and Y, like:
X: 1, 2, 3
Y: 4, 5, 6
Now if I combine these two variables, what will the resulting X + Y be? Will it be {1, 2, 3, 4, 5, 6} or {5, 7, 9}?
- Neither, exactly. You don't combine the two lists of values directly; X + Y is a new random variable whose value is one outcome of X plus one outcome of Y, so it can take any value from 5 to 9. What you can add directly are the means: if each listed value is equally likely, E(X + Y) = E(X) + E(Y) = 2 + 5 = 7. And because X and Y are independent, the variances add too, giving a standard deviation of about 1.1547.
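A minimal sketch makes that answer concrete. Assuming each listed value has probability 1/3 and that X and Y are independent (the question doesn't fully specify either, so both are assumptions), we can enumerate the entire distribution of X + Y:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

x_vals = [1, 2, 3]  # assumed equally likely, probability 1/3 each
y_vals = [4, 5, 6]  # assumed equally likely, probability 1/3 each

# Under independence, each (x, y) pair has probability 1/9.
dist = Counter()
for x, y in product(x_vals, y_vals):
    dist[x + y] += Fraction(1, 9)

print(dict(dist))  # {5: 1/9, 6: 2/9, 7: 1/3, 8: 2/9, 9: 1/9}

mean = sum(v * p for v, p in dist.items())
sd = float(sum((v - mean) ** 2 * p for v, p in dist.items())) ** 0.5
print(float(mean), sd)  # 7.0 and about 1.1547
```

Note that X + Y ranges over {5, 6, 7, 8, 9}, not either of the sets in the question, and that the 1.1547 requires independence; the mean of 7 does not.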
- I am really looking forward to your proof of this! There are many proofs that I have seen, but I haven't found one that is accessible to the majority of my students. I also find it challenging to clearly (and concisely) explain what the sum of two random variables is. Note: I am going through the probability content with honors Algebra 2 students to prepare them for AP Stats (and for their future studies).
- Where is the proof of this? Can someone share a link?
- Look man, all I'm saying is you would definitely see more dogs in a day than cats, on average. You're more likely to see people with dogs in their cars, walking them in neighborhoods, or just hanging out outside. Cats don't go in cars often, usually don't get walked, and don't hang out in easy-to-find places when they are outside. Still love learning from your site, though.
- If you like maths so much, name every number.
- If we define a description of a number as a finite string of symbols that uses a finite alphabet of symbols, then there are only countably many descriptions. However, there are uncountably many real numbers. So almost all real numbers are indescribable!
- I still don't know what a sum is.
- Is the expected value always a mean? A sample mean? A population mean? And the expected value is the sum of each outcome multiplied by its probability? Weird.
- So is there a difference between an expected value and a mean? The mean can be a decimal like 5.6, but can you expect to see 5.6 cats? Would the expected value be 6, because you'd round upwards, or 5, because that's an outcome you could actually see? Or am I overthinking this, and the expected value is just equal to the mean, 5.6?
- Expected value is the average value. So seeing 5.6 cats could very well be the expected value, even though it's definitely not the expected outcome of any given trial.
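A small sketch of that last point. The binomial model below is a made-up example (nothing in the thread specifies a distribution); it just shows that a variable whose outcomes are all whole numbers can still have a non-integer expected value, computed as a probability-weighted sum of outcomes:

```python
# E[X] for X ~ Binomial(8, 0.7): the number of cats seen is always a whole
# number, yet the probability-weighted average of the outcomes is 5.6.
from math import comb

n, p = 8, 0.7
mean = sum(k * comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1))
print(mean)  # 5.6 (up to floating-point rounding)
```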
- If the standard deviation of X is 5 and the standard deviation of Y is 8, what is the standard deviation of X + Y?
- Prove that the arithmetic mean of the sum of two or more variables is equal to the sum of their means.
- The definition of expected value for a continuous RV X with density p is E(X) = ∫ x p(x) dx. For a pair of RVs X and Y with joint density p(x, y), E(X + Y) = ∫∫ (x + y) p(x, y) dx dy, which splits into ∫∫ x p(x, y) dy dx + ∫∫ y p(x, y) dx dy = ∫ x p_X(x) dx + ∫ y p_Y(y) dy = E(X) + E(Y), where p_X and p_Y are the marginal densities. The key step is that the integral is a linear operator; independence is never used. The same is true of discrete RVs, which are defined with a sum rather than an integral, and a sum is likewise a linear operator.
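To make the "independence is never used" point concrete, here is a minimal numerical sketch (an illustration added here, not part of the comment above): it checks E(X + Y) = E(X) + E(Y) on a small made-up joint pmf in which X and Y are deliberately dependent:

```python
import numpy as np

# Made-up joint pmf over X in {0, 1, 2} and Y in {0, 1}; it is deliberately
# NOT the product of its marginals, so X and Y are dependent.
joint = np.array([[0.10, 0.20],
                  [0.05, 0.25],
                  [0.30, 0.10]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])

# E[X + Y] computed directly from the joint distribution.
e_sum = sum(joint[i, j] * (x_vals[i] + y_vals[j])
            for i in range(3) for j in range(2))

# E[X] and E[Y] from the marginal distributions.
e_x = x_vals @ joint.sum(axis=1)
e_y = y_vals @ joint.sum(axis=0)

print(e_sum, e_x + e_y)  # both 1.65, even though X and Y are dependent
```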
Video transcript
- [Instructor] Let's say that I have a random variable X, which is equal to the number of dogs that I see in a day, and a random variable Y, which is equal to the number of cats that I see in a day. Let's say I also know the mean of each of these random variables, the expected value. The expected value of X, which I could also denote as the mean of the random variable X, is three: I expect to see three dogs a day. And similarly for the cats, the expected value of Y, which I could also denote as the mean of Y, is four: I expect to see four cats a day. We've already defined how you take the mean, or expected value, of a random variable.

What we're going to think about now is the expected value of X plus Y, or, another way of saying that, the mean of the sum of these two random variables. Well, it turns out, and I'm not proving it just yet, that the mean of a sum of random variables is equal to the sum of the means. So E(X + Y) is equal to the mean of random variable X plus the mean of random variable Y. In this particular case, if I were to ask what's the expected number of dogs and cats that I would see in a given day, I would just add the two means: three plus four, which is equal to seven.

Similarly, if I were to ask how many more cats than dogs I would expect to see in a given day, that is the expected value of Y minus X. What would that be? Well, intuitively you might say that if the expected value of a sum is the sum of the expected values, then the expected value, or mean, of a difference will be the difference of the means, and that is absolutely true. So the mean of Y minus X is equal to the mean of Y minus the mean of X, which in this particular case is four minus three, equal to one. Another way of thinking about this intuitively: I would expect to see, on a given day, one more cat than dog.

The example I just used involves discrete random variables; on a given day I wouldn't see 2.2 dogs or pi dogs. The expected value itself does not have to be a whole number, though, because you could of course average over many days. But the same idea holds in general: the mean of a sum is the same thing as the sum of the means, and the mean of a difference of random variables is the same as the difference of the means. In a future video, I'll do a proof of this.
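As a quick numerical check of the transcript's claims, here is a simulation sketch. The video fixes only the two means (3 dogs and 4 cats per day); the Poisson distributions and the independence assumption below are added purely for illustration, and the means of the sum and difference don't depend on either choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 1_000_000

# Hypothetical models: the video gives only the means, so Poisson is an assumption.
dogs = rng.poisson(lam=3, size=n_days)  # X: dogs seen per day, E[X] = 3
cats = rng.poisson(lam=4, size=n_days)  # Y: cats seen per day, E[Y] = 4

print((dogs + cats).mean())  # close to E[X] + E[Y] = 7
print((cats - dogs).mean())  # close to E[Y] - E[X] = 1
```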