Ordinary Least Squares (OLS) is a method for estimating the unknown parameters of a linear regression model. How does it estimate those parameters? By minimizing the sum of squared residuals: it fits a line through the data points such that the sum of the squared differences between the observed values and the corresponding fitted values is as small as possible. You can see this in the figure below:
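To make the idea concrete, here is a minimal sketch of fitting a line by least squares on a small made-up dataset (the data points are assumptions for illustration, not the post's dataset). The least-squares solution minimizes the sum of squared residuals:

```python
import numpy as np

# Tiny sketch: fit y = b0 + b1*x by minimizing the sum of squared residuals.
# The data below is made up purely for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

X = np.column_stack([np.ones_like(x), x])      # design matrix with an intercept column
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares estimates

residuals = y - (b0 + b1 * x)
print(b0, b1, (residuals ** 2).sum())  # the fitted line makes this sum minimal
```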
In this post we will first compare the Linear Regression model with the Ordinary Least Squares model, looking at the coefficients and the intercept each one estimates. The main advantage of the Ordinary Least Squares model is that it gives us a summary report of the model: a comprehensive view of how well the model fits, the main parameters to look at, and the statistical tests performed to judge whether a feature is necessary for the model. All of these important details are covered in the Ordinary Least Squares summary report.
Multicollinearity in Ordinary Least Squares (OLS)
Multicollinearity is a condition in which the independent features of a regression model are correlated with one another. Independent features should not be correlated: if the degree of correlation is high, it affects both the fitting of the model and the interpretation of its results. Multicollinearity is therefore a potential problem when building regression models.
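A simple way to spot this condition is to check the correlation between predictors. A minimal sketch, using made-up data where one feature is nearly a copy of another:

```python
import numpy as np

# Made-up example: x2 is almost a copy of x1, so the two predictors
# carry nearly the same information
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)

# Pearson correlation between the two predictors
corr = np.corrcoef(x1, x2)[0, 1]
print(round(corr, 3))  # close to 1 -- a red flag for multicollinearity
```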
Now let's go ahead and implement the OLS model, check whether it has a multicollinearity problem, and see how to interpret the Ordinary Least Squares summary report for the regression model.
Also refer to this post to understand multicollinearity using the Variance Inflation Factor.