
Mean of sum and difference of random variables

AP.STATS: VAR‑5 (EU), VAR‑5.E (LO), VAR‑5.E.1 (EK)

Want to join the conversation?

  • rautchetan2993
    Hi, can someone please clarify my basic confusion? Let's say I have two hypothetical independent random variables X and Y, like:

    X : 1,2,3
    Y : 4,5,6

    Now if I combine these two variables, what will the resultant X + Y be?
    Will it be {1,2,3,4,5,6}
    or {5,7,9}?
    (10 votes)
  • Henry, Eric
    I am really looking forward to your proof of this! There are many proofs that I have seen but I haven't found a way to prove it that is accessible to the majority of my students.

    I find it challenging to clearly (and concisely) explain what the sum of two random variables is.

    Note: I am going through the probability content with honors algebra 2 students to prepare them for AP Stats (and for their future studies).
    (7 votes)
  • Rishabh Chopra
    Where is the proof of this? Could someone share a link?
    (5 votes)
  • Jack Miller
    Look, man, all I'm saying is you would definitely see more dogs in a day than cats, on average. You're more likely to see people with dogs in their car, walking them in neighborhoods, or just hanging out outside. Cats don't go in cars often, usually don't get walked, and don't hang out in easy-to-find places if they are outside. Still love learning from your site, though.
    (3 votes)
  • Janeyh
    I still don't know what a sum is
    (2 votes)
  • Mez Cooper
    The expected value is always a mean? A sample mean? A population mean?

    Expected value is the sum of each outcome multiplied by its probability? Weird.
    (1 vote)
  • leena kiyumi
    So is there a difference between an expected value and a mean? Like, can the mean be a decimal like 5.6? But can you expect to see 5.6 cats? So would the expected value be 6, because you would round that number upward? Or would it be 5, because that would give you a more accurate expected value? Or am I thinking too much, and the expected value would actually just be equal to the mean and therefore be 5.6?
    (1 vote)
  • Ping
    if you like maths so much name every number
    (0 votes)
    • Ian Pulizzotto
      If we define a description of a number as a finite string of symbols that uses a finite alphabet of symbols, then there are only countably many descriptions. However, there are uncountably many real numbers. So almost all real numbers are indescribable!
      (3 votes)
  • halimajaman18
    Prove that the mean of the sum of two or more random variables is equal to the sum of their means.
    (0 votes)
    • sayhitoteebee
      The math definition of expected value for a continuous RV X is E(X) = ∫ x·p(x) dx. For two RVs X and Y with joint density p(x,y), E(X+Y) = ∫∫ (x+y)·p(x,y) dx dy. The integral is a linear operator, so this splits as ∫∫ x·p(x,y) dx dy + ∫∫ y·p(x,y) dx dy = E(X) + E(Y). The same is true for discrete RVs: expectation is defined as a sum rather than an integral, and a sum is likewise a linear operator.
      (1 vote)
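Since several commenters ask for the proof, here is the discrete-case argument from the reply above written out explicitly. This is a sketch in standard notation (not from the page itself): p(x, y) denotes the joint probability mass function of X and Y, and the sums run over all values the variables take.

```latex
\begin{aligned}
E(X+Y) &= \sum_{x}\sum_{y} (x+y)\,p(x,y) \\
       &= \sum_{x}\sum_{y} x\,p(x,y) + \sum_{x}\sum_{y} y\,p(x,y) \\
       &= \sum_{x} x \underbrace{\sum_{y} p(x,y)}_{p_X(x)} + \sum_{y} y \underbrace{\sum_{x} p(x,y)}_{p_Y(y)} \\
       &= E(X) + E(Y).
\end{aligned}
```

Note that the derivation never uses independence: linearity of expectation holds for any two random variables with finite means.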

Video transcript

- [Instructor] Let's say that I have a random variable X, which is equal to the number of dogs that I see in a day. And random variable Y is equal to the number of cats that I see in a day. Let's say I also know the mean of each of these random variables, the expected value. So the expected value of X, which I could also denote as the mean of our random variable X, let's say I expect to see three dogs a day. And similarly for the cats, the expected value of Y, which I could also denote as the mean of Y, let's say I expect to see four cats a day. We've already defined how you take the mean, or expected value, of a random variable. What we're going to think about now is: what would be the expected value of X plus Y, or, another way of saying that, the mean of the sum of these two random variables?

Well it turns out, and I'm not proving it just yet, that the mean of the sum of random variables is equal to the sum of the means. So this is going to be equal to the mean of random variable X plus the mean of random variable Y. And so in this particular case, if I were to ask what's the expected number of dogs and cats that I would see in a given day, I would add these two means: three plus four, which is equal to seven.

And similarly, if I were to ask you about the difference, if I were to say how many more cats than dogs I would expect to see in a given day, so the expected value of Y minus X, what would that be? Well, intuitively you might say, hey, if the expected value of the sum is the sum of the expected values, then the expected value, or the mean, of the difference will be the difference of the means, and that is absolutely true. So the mean of Y minus X is equal to the mean of Y minus the mean of X. And in this particular case, it would be equal to four minus three, which is equal to one. So another way of thinking about this intuitively: I would expect to see, on a given day, one more cat than dog.

The example that I just used involves discrete random variables: on a given day I wouldn't see 2.2 dogs or pi dogs. The expected value itself does not have to be a whole number, though, because you could of course average over many days. But the same idea holds: the mean of a sum is the same thing as the sum of the means, and the mean of a difference of random variables is the same as the difference of the means. In a future video I'll do a proof of this.
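The two identities from the transcript, E(X+Y) = E(X) + E(Y) and E(Y−X) = E(Y) − E(X), are easy to check numerically. Below is a minimal simulation sketch: the daily dog and cat counts are modeled as Poisson variates with the video's example means of 3 and 4 (the Poisson choice is an assumption for illustration, the video only specifies the means).

```python
import random
from statistics import mean

random.seed(42)
N = 100_000  # number of simulated days

def poisson(lam):
    """Draw one Poisson(lam) variate by Knuth's inversion method."""
    threshold = 2.718281828459045 ** (-lam)  # e^{-lam}
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# X = dogs seen per day (mean 3), Y = cats seen per day (mean 4)
xs = [poisson(3) for _ in range(N)]
ys = [poisson(4) for _ in range(N)]

print(mean(xs))                             # close to 3
print(mean(ys))                             # close to 4
print(mean(x + y for x, y in zip(xs, ys)))  # close to 7 = E(X) + E(Y)
print(mean(y - x for x, y in zip(xs, ys)))  # close to 1 = E(Y) - E(X)
```

Note that the sample mean of the sums equals the sum of the sample means exactly (up to floating point), regardless of whether X and Y are independent; independence is not needed for linearity of expectation.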