Linear Regression 

Linear Regression is a Machine Learning algorithm that comes under supervised learning. It is the process of predicting a continuous value by modelling the linear relationship between one or more predictors and the target.

1. Simple Linear Regression

Predicting a response using a single feature.

It is a method to predict the dependent variable (Y) from the value of a single independent variable/feature (X). Assuming the two variables are linearly related, the aim is to find the linear function y = mx + c that predicts the output value (y) from the feature or independent variable (x) with as little error as possible.

How to find the best-fit/most suitable line?

In simple linear regression, the model tries to minimize the prediction errors by finding the "best-fit line", i.e. the line for which the errors are smallest. In other words, the goal is to minimize the difference between y_actual and y_predicted.
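As an illustration, here is a minimal sketch (using made-up example data, not from this page) of how the best-fit line y = mx + c can be computed with ordinary least squares and how the error being minimized is measured:

    import numpy as np

    # Made-up example data: a single feature x and a continuous target y
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y_actual = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # Closed-form least-squares estimates of slope (m) and intercept (c)
    m = np.sum((x - x.mean()) * (y_actual - y_actual.mean())) / np.sum((x - x.mean()) ** 2)
    c = y_actual.mean() - m * x.mean()

    # Predictions from the fitted line and the error the model tries to minimize
    y_predicted = m * x + c
    sse = np.sum((y_actual - y_predicted) ** 2)   # sum of squared errors

    print(f"best-fit line: y = {m:.2f}x + {c:.2f}, squared error = {sse:.3f}")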

2. Multiple Linear Regression

Multiple Linear Regression is an algorithm used to model the relationship between two or more features and a response by fitting a linear-regression equation to the observed data. The steps for performing Multiple Linear Regression are similar to those of Simple Linear Regression; the main difference lies in the evaluation.
Multiple Linear Regression lets you find out how the response depends on each feature and which factors have the strongest impact on the predicted output.
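For instance, a small sketch (with invented data and feature meanings, not taken from this page) of how the fitted coefficients hint at how strongly each feature drives the prediction:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Invented data: two features (e.g. area and age) and one continuous target (price)
    X = np.array([[50, 10], [60, 5], [80, 2], [100, 8], [120, 1]])
    y = np.array([150, 200, 280, 310, 420])

    model = LinearRegression()
    model.fit(X, y)

    # Each coefficient shows how much the prediction changes per unit of that feature,
    # which indicates which factor impacts the output most (when features are on comparable scales).
    print("coefficients:", model.coef_)
    print("intercept:", model.intercept_)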

To Remember:
While designing a Machine Learning model with many features, keep in mind that having too many features can reduce the model's accuracy, especially if some features have no effect on the outcome or have a disproportionately large effect on the other variables.
There are several methods to select the most relevant variables from a large dataset (see the sketch after this list):

  • Forward Selection

  • Backward Selection

  • Bi-directional Comparison
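One possible way to apply forward or backward selection in practice is a sketch along these lines, assuming scikit-learn's SequentialFeatureSelector (available from version 0.24 onward) and its bundled diabetes dataset:

    from sklearn.datasets import load_diabetes
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LinearRegression

    # Example dataset bundled with scikit-learn (10 numeric features)
    X, y = load_diabetes(return_X_y=True)

    # Forward selection: start with no features and add the most useful one at a time
    selector = SequentialFeatureSelector(LinearRegression(),
                                         n_features_to_select=5,
                                         direction="forward")   # use "backward" for backward selection
    selector.fit(X, y)

    print("selected feature indices:", selector.get_support(indices=True))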

To start building a model with multiple features, the following steps can be taken into consideration:

Step 1: Data Preprocessing
It includes importing the libraries and the dataset, checking for missing values, and properly visualizing the data.
The next task is to deal with categorical data. For this, encode the categorical data using LabelEncoder() and create dummy variables if required.
Feature scaling should also be taken care of so that the features are on comparable scales, which helps accuracy.
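A condensed preprocessing sketch along these lines (the CSV file name and column names are placeholders, not part of this page):

    import pandas as pd
    from sklearn.preprocessing import LabelEncoder, StandardScaler

    # Hypothetical dataset; 'data.csv' and all column names below are assumptions
    df = pd.read_csv("data.csv")
    print(df.isnull().sum())                        # check for missing values

    # Encode a categorical column with LabelEncoder ...
    df["furnishing"] = LabelEncoder().fit_transform(df["furnishing"])
    # ... and create dummy variables for a nominal column if required
    df = pd.get_dummies(df, columns=["city"], drop_first=True)

    # Feature scaling on the numeric feature columns
    numeric_cols = ["area", "age"]
    df[numeric_cols] = StandardScaler().fit_transform(df[numeric_cols])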
 
Step 2: Fitting the model with the training set.

After the data is cleaned and finalized, the features and the target must be separated from the dataset, which is then split into training and testing sets. The instantiated model object is fitted to the training data using the fit() method of LinearRegression. The LinearRegression class covers both simple and multiple linear regression.
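A sketch of this step, continuing from the hypothetical DataFrame df above and assuming a "price" target column:

    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LinearRegression

    # Separate features and target from the finalized dataset (column name is an assumption)
    X = df.drop(columns=["price"])
    y = df["price"]

    # Split into training and testing sets
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Fit the instantiated model object on the training data
    regressor = LinearRegression()
    regressor.fit(X_train, y_train)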

Step 3: Predicting the output
To predict the outcome, we use the predict() method of the LinearRegression class on the regressor that was fitted earlier.
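Continuing the sketch above, the fitted regressor can then generate predictions for the test set and, optionally, be scored:

    from sklearn.metrics import r2_score

    # Predict the outcome for the unseen test data
    y_pred = regressor.predict(X_test)

    # Optional: evaluate how well the predictions match the actual values
    print("R^2 on test set:", r2_score(y_test, y_pred))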

Here Linear Regression ends; now let's get back to the ML Codes page.

  • CREATED BY ANMOL VARSHNEY & PALAK GUPTA