
Building and Regularizing Linear Regression Models in Scikit-learn

Contributor: QuantInsti

Excerpt

We will discuss the concept of regularization, examples of it (Ridge, Lasso, and Elastic Net regularization) and how these can be implemented in Python using the scikit-learn library.

Linear regression problems also fall under supervised learning, where the goal is to construct a "model" or "estimator" that can predict a continuous dependent variable (y) given a set of values for the features (X).

One of the underlying assumptions of any linear regression model is that the dependent variable (y) is (at least to some degree!) a linear function of the independent variables (X_i). That is, we can estimate y using the mathematical expression:

y = b0 + b1*X1 + b2*X2 + b3*X3 + ⋯ + bn*Xn,

where the b_i are the coefficients that are to be estimated by the model.

The cost function that we seek to minimize is the sum of squared errors (also referred to as residuals). This methodology is called the ordinary least squares (OLS) approach. In other words, we want the optimal values of the b_i that minimize ∑(y − ŷ)², where ŷ is the value of y predicted by the model.
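As a concrete sketch of the OLS approach, scikit-learn's LinearRegression estimates exactly these coefficients by minimizing the sum of squared residuals. The synthetic data below (random features and hand-picked "true" coefficients) is purely illustrative and not from the article:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative synthetic data: 100 samples, 3 features
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
true_b = np.array([1.5, -2.0, 0.5])                      # hand-picked "true" b1..b3
y = 3.0 + X @ true_b + rng.normal(scale=0.1, size=100)   # b0 = 3.0, plus noise

# Fit by ordinary least squares: minimizes sum((y - y_hat)**2)
ols = LinearRegression().fit(X, y)

print(ols.intercept_)  # estimate of b0
print(ols.coef_)       # estimates of b1..bn
```

With low noise and 100 samples, the fitted intercept and coefficients land close to the values used to generate the data.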

Although the OLS approach works well in many cases, it has its drawbacks when the data contains far-flung outliers or when the predictor variables (X_i) are correlated with each other. This can have a significant impact on the overall prediction accuracy of the model, especially for out-of-sample or new data. In such cases, some form of regularization is helpful. Two of the most popular regularization techniques are Ridge regression and Lasso regression, which we will discuss in this blog.
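Scikit-learn exposes these regularized variants through the same fit/predict interface as LinearRegression. The sketch below uses made-up, nearly collinear data to show where OLS struggles; the alpha values are arbitrary illustrations, not tuned choices:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

# Illustrative data with two nearly collinear predictors
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)   # almost a copy of x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + rng.normal(scale=0.5, size=200)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)                      # penalty: alpha * sum(b_i**2)
lasso = Lasso(alpha=0.1).fit(X, y)                      # penalty: alpha * sum(|b_i|)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)    # blend of the two penalties

# Ridge shrinks the coefficient vector relative to unregularized OLS
print(np.linalg.norm(ols.coef_), np.linalg.norm(ridge.coef_))
```

With collinear predictors, the OLS coefficients can blow up in opposite directions; the Ridge penalty pulls them back toward zero, and the Lasso penalty tends to zero out redundant predictors entirely.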

Importing Libraries

We will need some commonly used libraries such as pandas, numpy and matplotlib, along with scikit-learn itself:

import numpy as np                 # numerical arrays
import pandas as pd                # tabular data handling
import matplotlib.pyplot as plt    # plotting
%matplotlib inline
import sklearn                     # machine learning library

That’s it!

In the next section, we will get some data to build our models on.

Visit QuantInsti Blog to download the ready-to-use code:
https://blog.quantinsti.com/linear-regression-models-scikit-learn/

Disclosure: Interactive Brokers

Information posted on IBKR Traders’ Insight that is provided by third-parties and not by Interactive Brokers does NOT constitute a recommendation by Interactive Brokers that you should contract for the services of that third party. Third-party participants who contribute to IBKR Traders’ Insight are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from QuantInsti and is being posted with permission from QuantInsti. The views expressed in this material are solely those of the author and/or QuantInsti and IBKR is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to sell or the solicitation of an offer to buy any security. To the extent that this material discusses general market activity, industry or sector trends or other broad based economic or political conditions, it should not be construed as research or investment advice. To the extent that it includes references to specific securities, commodities, currencies, or other instruments, those references do not constitute a recommendation to buy, sell or hold such security. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.
