
# Introduction to residuals and least squares regression


## Want to join the conversation?

• I am so confused. Which one is the actual y-value and which one is the predicted y-value? Why is 100 the actual value? And also, at , how did he get that point? He just said the predicted value was right there, but he did not explain how he got it.
• 100 is the actual weight because he measured someone who was 60" tall and that person weighed 100 pounds. He plotted that on the graph at (60, 100). He created a line by "eyeballing" the data points for what looked like a best fit. He then used that diagonal line to predict a person's weight from their height. Using the line, a person who is 60" tall is predicted to weigh 150 pounds. You can find that by drawing a line straight up from the x-axis at 60 and seeing where it meets the diagonal line. Draw a horizontal line from that point to the y-axis and you can read off the y-value, which is the weight predicted by the line.
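The residual described above is just actual minus predicted. A minimal sketch in Python, assuming a hypothetical eyeballed line `predicted_weight = 2.5 * height` (chosen only so that a 60" person is predicted to weigh 150 lb, matching the numbers in the answer):

```python
# Hypothetical "eyeballed" line; the slope 2.5 is an assumption made
# so that a height of 60" maps to a predicted weight of 150 lb.
def predicted_weight(height):
    return 2.5 * height

actual = 100                        # measured weight of the 60" person
predicted = predicted_weight(60)    # 150.0, the value read off the line
residual = actual - predicted       # actual minus predicted = -50.0
print(predicted, residual)
```

A negative residual means the point lies below the line; a positive one means it lies above.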
• Is a residual the same as variance in machine learning?
• this confused me even more.
• Since the sum of squared residuals is more sensitive to outliers (squaring gives the outlier a larger share of the sum), why is the sum of absolute residuals used less often in regression?
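The sensitivity the question mentions is easy to see numerically. A small sketch with made-up residuals, where one point (30) is an outlier:

```python
residuals = [1, -2, 1, 30]   # made-up residuals; 30 is the outlier

sse = sum(r ** 2 for r in residuals)    # 1 + 4 + 1 + 900 = 906
sae = sum(abs(r) for r in residuals)    # 1 + 2 + 1 + 30  = 34

# The outlier's share of each total:
print(900 / sse)   # ~0.99 — squaring lets one point dominate the sum
print(30 / sae)    # ~0.88
```

As for why squared residuals are still preferred: squaring gives a smooth (differentiable) objective with a closed-form solution, while the absolute-value objective is not differentiable at zero and generally requires iterative methods.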
• So what is the easiest way of doing and understanding this? The way my math teacher explained it, it's hard.
• Is this pretty much finding the slope, y = mx + b?
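Close: least squares regression does produce a line y = mx + b, but instead of eyeballing it, m and b are chosen to minimize the sum of squared residuals. A minimal sketch using the standard closed-form formulas (the data points here are made up for illustration):

```python
# Made-up height (inches) and weight (pounds) data:
xs = [60, 61, 62, 63, 65]
ys = [100, 110, 115, 125, 140]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least-squares slope: sum of co-deviations over sum of squared x-deviations.
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)

# The least-squares line always passes through (mean_x, mean_y):
b = mean_y - m * mean_x

print(m, b)
```

In practice you would use a library routine (e.g. `numpy.polyfit(xs, ys, 1)` or `scipy.stats.linregress`) rather than hand-rolling the formulas.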