# Regression Analysis: Types, Importance and Limitations


## Meaning of Regression Analysis

Regression analysis is a statistical method for studying the relationship between a dependent variable (the target) and one or more independent variables (the predictors). It makes it easy to determine the strength of the relationship between these two types of variables and to model their future relationship. Regression analysis explains variation in the target in terms of changes in selected predictors. It is widely used in investment and finance: finance and investment managers apply it to value assets, estimate the cost of capital and understand relationships between variables such as commodity prices and the stocks of businesses dealing in those commodities. Businesses also use regression analysis to predict sales volume on the basis of past growth, GDP growth, weather and many other factors.

## Types of Regression Analysis

The main types of regression analysis are given below:

1. Linear Regression.

Linear regression is the simplest form of regression analysis, in which the dependent variable is continuous and has a linear relationship with the independent variable. A best-fit straight line, known as the regression line, is used to establish the relationship between the two variables. The equation used for linear regression is Y = a + b*X + e, where a is the intercept, b is the slope of the line and e is the error term.
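The equation above can be fitted with ordinary least squares. A minimal sketch in pure Python (the data points and function name are illustrative, not from the original text):

```python
# Ordinary least squares fit for y = a + b*x.
def fit_linear(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x  # intercept passes the line through the means
    return a, b

# Points lying exactly on y = 2 + 3x recover a = 2 and b = 3.
a, b = fit_linear([1, 2, 3, 4], [5, 8, 11, 14])
```

In practice the error term e is the residual left over after the best-fit line is subtracted from each observation.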

2. Logistic Regression.

Logistic regression is used when the dependent variable is binary; the independent variables can be continuous or binary. It is a form of binomial regression that estimates the parameters of a logistic model, and it is the standard choice for data with two possible outcomes.
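One common way to estimate the parameters of a logistic model is gradient descent on the log-loss. A minimal single-feature sketch (the toy data, learning rate and epoch count are illustrative assumptions):

```python
import math

# Logistic regression with one feature, fitted by gradient descent.
def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid probability
            gw += (p - y) * x  # gradient of log-loss w.r.t. the weight
            gb += (p - y)      # gradient w.r.t. the bias
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    # Classify as 1 when the predicted probability reaches 0.5.
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5 else 0

# Separable toy data: the class switches around x = 2.5.
w, b = fit_logistic([0, 1, 2, 3, 4, 5], [0, 0, 0, 1, 1, 1])
```

The model outputs a probability between 0 and 1, which is then thresholded to produce a binary prediction.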

3. Polynomial Regression.

Polynomial regression is used when the power of the independent variable is greater than 1. This model is deployed when the relationship between the dependent and independent variables is non-linear. The best-fit line in polynomial regression is a curve rather than a straight line.
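Fitting the curve reduces to ordinary least squares on powers of x, solved via the normal equations. A self-contained sketch in pure Python (the small linear solver and toy data are illustrative assumptions):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_poly(xs, ys, degree):
    # Least squares via the normal equations (X^T X) c = X^T y,
    # where column j of the design matrix X holds x**j.
    cols = degree + 1
    XtX = [[sum(x ** (i + j) for x in xs) for j in range(cols)] for i in range(cols)]
    Xty = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(cols)]
    return solve(XtX, Xty)

# Points on y = 1 + 2x + 3x^2 recover the coefficients [1, 2, 3].
coeffs = fit_poly([0, 1, 2, 3, 4], [1, 6, 17, 34, 57], 2)
```

Note that the model is still linear in its coefficients; only the features (the powers of x) are non-linear, which is why the same least-squares machinery applies.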

4. Ridge Regression.

Ridge regression is widely used when there is high correlation between the independent variables. With such multicollinear data, the least squares estimates are unbiased, but their variances are so large that observed values can deviate far from the true values. Ridge regression reduces the standard errors by adding a degree of bias to the regression estimates.
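The bias is introduced by an L2 penalty on the coefficients. For a single centered feature with no intercept, the effect is easy to see in closed form, since the penalty simply inflates the denominator of the ordinary least squares slope (a sketch under those simplifying assumptions):

```python
# Ridge slope for one centered feature: the penalty lam shrinks
# the OLS slope sum(x*y) / sum(x*x) toward zero.
def ridge_slope(xs, ys, lam):
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [-2, -1, 1, 2]
ys = [-4, -2, 2, 4]                  # generated exactly from y = 2x
b_ols = ridge_slope(xs, ys, 0.0)     # lam = 0 reproduces ordinary least squares
b_ridge = ridge_slope(xs, ys, 10.0)  # a positive lam shrinks the estimate
```

The shrunk estimate is biased, but its variance is smaller, which is the trade the passage above describes.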

5. Bayesian Linear Regression.

Bayesian linear regression employs Bayes' theorem to determine the values of the regression coefficients. Instead of computing least squares estimates, it finds the posterior distribution of the coefficients. Bayesian linear regression is more stable than simple linear regression.
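For one feature with no intercept, known Gaussian noise and a zero-mean Gaussian prior on the slope, the posterior is Gaussian and available in closed form. A minimal sketch under those assumptions (the variances sigma2 and tau2 and the toy data are illustrative):

```python
# Posterior over the slope with noise variance sigma2 and a
# Gaussian prior on the slope of mean 0 and variance tau2.
def posterior_slope(xs, ys, sigma2=1.0, tau2=1.0):
    # Posterior precision combines the data precision and the prior precision.
    precision = sum(x * x for x in xs) / sigma2 + 1.0 / tau2
    mean = (sum(x * y for x, y in zip(xs, ys)) / sigma2) / precision
    return mean, 1.0 / precision  # posterior mean and variance

xs = [-2, -1, 1, 2]
ys = [-4, -2, 2, 4]  # generated from y = 2x
mean, var = posterior_slope(xs, ys)
```

The prior pulls the posterior mean slightly below the least squares slope of 2, and the posterior variance quantifies the remaining uncertainty; both shrink as more data arrives, which is the sense in which the Bayesian estimate is more stable.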