
Linear Regression

Linear regression is one of the basic approaches to regression. It is used for prediction when the input data X and the target value y are linearly related, which is where the name comes from.

 

For example, real asset prices, weather forecasts, and movie revenues can all be predicted with linear regression.

 

Once the problem is defined, we build a linear model by combining weights and a bias to make predictions; in other words, we fit a linear function to the data. When using linear regression for modelling, it also helps to remove highly correlated variables to improve the model. A small sketch of fitting such a model is shown below.
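Here is a minimal sketch of a linear model, y_hat = X @ w + b, fitted with scikit-learn's LinearRegression. The toy data below is made up purely for illustration.

```python
# Minimal sketch: fit a linear model (weight w and bias b) on toy 1-D data.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])   # input data X
y = np.array([2.1, 4.0, 6.2, 7.9])           # target value y (roughly 2 * x)

model = LinearRegression()
model.fit(X, y)

print("weight (slope):", model.coef_)         # learned weight w
print("bias (intercept):", model.intercept_)  # learned bias b
print("prediction for x=5:", model.predict([[5.0]]))
```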

 

As mentioned in the previous post, the data is split into training and test sets so that we can measure how well the trained model fits and validate it. Most commonly, the Mean Squared Error (MSE) is used for this measurement.
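The following sketch shows this train/test split and the MSE measurement, again on made-up toy data, using scikit-learn's train_test_split and mean_squared_error.

```python
# Sketch: split data, fit on the training set, measure MSE on the test set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X.ravel() + 1.0 + rng.normal(0, 1, size=100)  # y ~ 3x + 1 plus noise

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
mse = mean_squared_error(y_test, model.predict(X_test))
print("test MSE:", mse)
```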

 

Training the model means finding the parameters (θ) that minimize the error. The parameters are the key to a good model, and it is up to us to update them; this process is called optimization. The representative optimization method for linear regression is gradient descent. Because the MSE loss of linear regression is convex, gradient descent can reach the global minimum, and the resulting parameters give predictions closest to the targets.
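Below is a from-scratch sketch of gradient descent for linear regression. The learning rate and the number of steps are arbitrary choices for this toy problem, not values from the post.

```python
# Sketch: minimize MSE for y_hat = w*x + b by gradient descent on toy data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 1.0 + rng.normal(0, 1, size=100)   # true weight 3, true bias 1

w, b = 0.0, 0.0          # parameters (theta), initialized at zero
lr = 0.01                # learning rate (assumed value)
n = len(X)

for step in range(2000):
    y_pred = w * X + b
    error = y_pred - y
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
    grad_w = (2.0 / n) * np.dot(error, X)
    grad_b = (2.0 / n) * error.sum()
    # Step against the gradient to reduce the loss
    w -= lr * grad_w
    b -= lr * grad_b

print("learned weight:", w)   # should approach 3
print("learned bias:", b)     # should approach 1
```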

 

 

 

 

 
