I'm interested in finding the relationship between people's height in inches and their weight in pounds. So I'm randomly sampling a bunch of people, measuring their heights, measuring their weights, and then for each person I'm plotting a point that represents their height-and-weight combination. For example, say I measure someone who is 60 inches tall (that would be five feet) and who weighs 100 pounds. I'd go to 60 inches and then 100 pounds, right over there. So that point right over there is the point (60, 100). One way to think about it: height is being plotted along our x-axis, and weight along our y-axis, so the point for this person is (60, 100), representing 60 inches and 100 pounds.

So far I've done this for one, two, three, four, five, six, seven, eight, nine people, and I could keep going. But even with this, I could say: look, there's a roughly linear relationship here, and it looks positive; generally speaking, as height increases, so does weight. Maybe I could try to put in a line that approximates this trend. Let me try to do that with my line tool. I could imagine a bunch of lines. Something like this doesn't seem right, because most of the data is below the line. I could try something like this instead, but that doesn't seem like a good fit either, because most of the data is above the line. Once again, I'm just eyeballing it here; in the future you'll learn better methods of finding a fit. But something like this looks about right.

You could view that line as a regression line. We could write it as y = mx + b, where we would have to figure out the slope and the y-intercept, and we could figure them out from what I just drew. Or we could even write it as: weight is equal to our slope times height,
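The eyeballed model can be sketched in a few lines of code. The slope and intercept below are hypothetical values (not read off the video's plot), chosen only so that the line predicts about 150 pounds for a 60-inch height:

```python
# A hypothetical eyeballed regression line: weight = m * height + b.
# These parameter values are illustrative assumptions, not fitted values.
m = 2.5   # slope in pounds per inch (assumed)
b = 0.0   # weight-axis intercept (assumed)

def predicted_weight(height_inches):
    """Weight in pounds that the line predicts for a given height in inches."""
    return m * height_inches + b

print(predicted_weight(60))  # -> 150.0
```

Any other (m, b) pair describes a different candidate line; the rest of the lesson is about how to judge which candidate fits the data best.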
times height, plus whatever our y-intercept is. Or, since the vertical axis is the weight axis, you could think of that intercept as a weight intercept. Either way, this is the model I got by eyeballing: my regression line, something I'm trying to fit to these points. But clearly one line won't be able to go through all of these points; for many of them there's going to be some difference between the actual value and what the line would have predicted. That difference, between the actual value for a point and what would have been predicted given, say, the height, is called a residual. Let me write that down: there's a residual for each of these data points.

For example, if I call this point here point 1, the residual for point 1 is computed like this: for our height variable of 60 inches, the actual weight here is 100 pounds, and from that we subtract what would be predicted. What would be predicted is right over here; I can just substitute 60 into the equation, so it's m times 60 plus b, which I could write as 60m + b. Once again, I just take the 60 inches, put it into my model, and ask what weight it would have predicted. Just for the sake of having a number, let me get my line tool out and draw a straight line up from that point... it looks like the model would have predicted about 150 pounds. So the residual here is 100 minus 150, which is negative 50. A negative residual is when your actual is below your predicted, so this residual, r1, is a negative residual. If you tried to find, let's say, this residual right over here for this
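The residual calculation for point 1 can be written out directly. The 150-pound prediction is the eyeballed estimate from the lesson, treated here as a given number:

```python
# Residual = actual - predicted.
# Point 1 from the lesson: a 60-inch person who actually weighs 100 pounds,
# while the eyeballed line predicts about 150 pounds at that height.
actual = 100.0     # observed weight in pounds
predicted = 150.0  # weight predicted by the line (eyeballed estimate)

residual = actual - predicted
print(residual)  # -> -50.0, negative because the actual is below the line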
point, this r2, it would be a positive residual, because the actual is larger than what the model would have predicted.

So a residual is good for seeing how well your line, your regression, your model fits a given data point, or how a given data point compares to the model. But what you probably want to do is think about some combination of all the residuals and try to minimize it. Now, you might say: well, I'll just add up all the residuals and try to minimize that. But that gets tricky, because some are positive and some are negative, so a big negative residual could counterbalance a big positive residual; they would add up to zero, and it would look like there's no residual at all.

So you could instead add up the absolute values: take the sum of the absolute values of all the residuals, and then change m and b for your line to minimize that sum. That would be one technique for creating a regression line. But another way to do it, and this is actually the most typical way you will see in statistics, is to take the sum of the squares of the residuals. When you square something, whether it's negative or positive, the result is positive, so that takes care of the issue of negatives and positives canceling out. And when you square, things with large residuals become even larger, relatively speaking. Think about the regular numbers 1, 2, 3, 4: they are all one apart from each other, but if I square them I get 1, 4, 9, 16, and those get further and further apart. So the larger a residual is, the bigger the proportion of the sum its square is going to represent. What we'll see in future videos is that there is a technique called least squares
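The three ways of combining residuals can be compared side by side. The residual values below are made up purely for illustration:

```python
# Made-up residuals for three data points (not from the lesson's plot).
residuals = [-50.0, 30.0, 20.0]

# A plain sum lets positives and negatives cancel:
print(sum(residuals))                  # -> 0.0, misleadingly "perfect"

# Absolute values avoid the cancellation:
print(sum(abs(r) for r in residuals))  # -> 100.0

# Squaring also avoids it, and magnifies the large residual:
print(sum(r**2 for r in residuals))    # -> 3800.0 (2500 + 900 + 400)

# Squaring spreads values apart: 1, 2, 3, 4 are each one apart,
# but their squares grow further and further apart.
print([x**2 for x in [1, 2, 3, 4]])    # -> [1, 4, 9, 16]
```

Notice that the single −50 residual contributes half of the absolute-value sum but about two thirds of the squared sum, which is exactly the outlier-sensitivity the lesson describes.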
regression, where you can find an m and a b for a given set of data that minimizes the sum of the squares of the residuals. That's valuable, and the reason this is used most is that it really takes into account significant outliers, things that sit pretty far away from the model. With least squares regression, something like this point is going to be weighted a little heavier, because when you square its residual it becomes an even bigger factor in the sum. But this is just a conceptual introduction; in future videos we'll do things like calculate residuals, and we'll actually derive the formula for how you figure out an m and a b for a line that minimizes the sum of the squares of the residuals.
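As a preview of where those future videos are headed, the least-squares m and b have a standard closed-form solution, shown here with made-up height/weight data (the derivation itself comes later):

```python
# Minimal least-squares fit from scratch, using the standard closed-form
# solution. The five data points below are invented for illustration.
heights = [60.0, 62.0, 65.0, 68.0, 70.0]   # inches
weights = [100.0, 130.0, 140.0, 155.0, 170.0]  # pounds

n = len(heights)
mean_x = sum(heights) / n
mean_y = sum(weights) / n

# Slope that minimizes the sum of squared residuals:
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(heights, weights)) \
    / sum((x - mean_x) ** 2 for x in heights)

# The best-fit line always passes through (mean_x, mean_y):
b = mean_y - m * mean_x

print(m, b)
```

One nice design consequence of this formula: once the slope is known, the intercept falls out immediately from the fact that the fitted line must pass through the point of means.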