Simple linear regressions vs multiple linear regression model scaling

Time:12-26

I read somewhere that when there are multiple features (a multiple linear regression model), no feature scaling is needed because the coefficients take care of it,

but that for a single feature (simple linear regression), feature scaling is needed.

Is this how Python's scikit-learn works, or did I misread something?

I'd like an answer from someone who has tested simple linear regression both with and without feature scaling.

CodePudding user response:

Feature scaling maps the features into a comparable range. Whether it is needed depends on the algorithm, not on whether the regression is simple or multiple. For ordinary least-squares linear regression (scikit-learn's `LinearRegression`), scaling does not change the predictions in either case: the fitted coefficients simply absorb the change of scale. Scaling does matter for distance-based methods such as k-NN and k-means, for gradient-descent solvers such as `SGDRegressor` (where it speeds up convergence), and for regularized models such as Ridge and Lasso, where the penalty term is sensitive to feature scale. Also note that standardization only rescales the data; it does not by itself turn an arbitrary distribution into a normal one.
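You can check this yourself with a small experiment. The sketch below uses NumPy's least-squares solver to stand in for ordinary least squares (the same estimator `LinearRegression` fits), on a hypothetical toy dataset with two features on very different scales; the data and function names are mine, not from the question:

```python
import numpy as np

# Toy data: two features on very different scales.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0, 1, 50), rng.uniform(0, 1000, 50)])
y = 3.0 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.1, 50)

def ols_predict(X_train, y_train, X_new):
    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(len(X_train)), X_train])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    return np.column_stack([np.ones(len(X_new)), X_new]) @ coef

# Standardize the features (z-score), as StandardScaler would.
mu, sigma = X.mean(axis=0), X.std(axis=0)
X_scaled = (X - mu) / sigma

pred_raw = ols_predict(X, y, X)
pred_scaled = ols_predict(X_scaled, y, X_scaled)

# Predictions agree to numerical precision; only the coefficients differ.
print(np.allclose(pred_raw, pred_scaled))  # True
```

The same holds if you drop the second feature and repeat the comparison with one feature, which is the simple-linear-regression case the question asks about.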
