Machine Learning A-Z™: Hands-On Python & R In Data Science

23. Simple Linear Regression Intuition - Step 2

SIMPLE LINEAR REGRESSION 2

Ordinary Least Squares

  • To find the best-fitting line, you take the vertical distance from each data point to the line (the residual), square each of those distances, and take the sum of the squares.

  • The goal is then to find the line that minimizes this sum of squares.

  • The criterion is $\min \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$, where $y_i$ is the observed value and $\hat{y}_i$ is the value the line predicts.

  • So what simple linear regression essentially does is draw lots and lots of candidate lines.

  • This is a simplistic way of imagining linear regression: it draws all possible trend lines through the dots and computes the sum of squared residuals for every single one.

  • It then looks for the minimum sum of squares, i.e. the line with the smallest sum of squared residuals possible.

  • That line is the best-fitting line, and this procedure is called the ordinary least squares (OLS) method; see the sketch below.
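To make the intuition concrete, here is a minimal Python sketch (the data points, grid ranges, and variable names are made up for illustration, not taken from the course dataset). It mimics the "draw lots of lines" picture with a brute-force grid search over intercepts and slopes, then compares the winner against the closed-form OLS solution:

```python
import numpy as np

# Toy data, made up for illustration: e.g. years of experience vs. salary
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([39.0, 46.0, 52.0, 61.0, 67.0])

def sum_of_squares(b0, b1):
    """Sum of squared residuals for the line y_hat = b0 + b1 * x."""
    y_hat = b0 + b1 * x
    return np.sum((y - y_hat) ** 2)

# "Draw lots and lots of lines": try a grid of candidate intercepts (b0)
# and slopes (b1), keeping the line with the smallest sum of squares.
best = None
for b0 in np.linspace(20, 40, 201):
    for b1 in np.linspace(0, 15, 151):
        ssr = sum_of_squares(b0, b1)
        if best is None or ssr < best[0]:
            best = (ssr, b0, b1)

print("Grid search : SSR=%.2f, intercept=%.2f, slope=%.2f" % best)

# Closed-form OLS estimates, which find the same minimum directly
b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_hat = y.mean() - b1_hat * x.mean()
print("Closed form : SSR=%.2f, intercept=%.2f, slope=%.2f"
      % (sum_of_squares(b0_hat, b1_hat), b0_hat, b1_hat))
```

In practice nobody grid-searches: the OLS formulas (or a library such as scikit-learn's LinearRegression) compute the minimizing line directly. The grid search just makes visible what "smallest sum of squares" means.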
