
L1-regularized (Logistic) regression #310

Closed
mikhail-barg opened this issue Sep 27, 2016 · 2 comments

Comments

@mikhail-barg
Contributor

I've been looking into adding L1 regularization to LogisticRegression. It has a nice property: it's said to be useful for feature selection (see for example this).

As far as I understand, the IterativeReweightedLeastSquares.Regularization parameter only provides L2 regularization, and L1 is not currently supported. Is this correct?

It would be nice to have it added (for example, like this), and maybe even to be able to specify both L1 and L2 penalties at the same time.
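
For reference (not quoted from this thread), the standard L1-regularized logistic regression objective used by LIBLINEAR-style coordinate descent solvers replaces the squared L2 penalty with an absolute-value penalty, which drives many coefficients exactly to zero and is what makes it useful for feature selection:

$$\min_{w}\; \lVert w \rVert_1 \;+\; C \sum_{i=1}^{n} \log\!\left(1 + e^{-y_i\, w^\top x_i}\right)$$

Specifying both penalties at once would amount to an elastic-net variant, i.e. adding a $\lambda_2 \lVert w \rVert_2^2$ term on top of this objective.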

@cesarsouza
Member

Hi there,

Thanks for opening the issue! It is currently possible to train an L1-regularized logistic regression through ProbabilisticCoordinateDescent. Although this method was originally intended for learning L1-regularized linear support vector machines, a linear probabilistic SVM is exactly the same model as a logistic regression.

After you learn an L1-regularized SVM using this method, one possible way to convert it to a LogisticRegression is to call the SVM's ToWeights() method and pass the result to LogisticRegression's FromWeights() method. I will update the documentation with a concrete example of how this can be done.
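
A minimal sketch of this workflow, not taken from the thread: it assumes the Accord.NET 3.x learning API (the Learn() call, the Complexity and Tolerance properties, and the Coefficients/Probability members), so names and overloads may need adjusting for your version.

```csharp
// Sketch: L1-regularized logistic regression via ProbabilisticCoordinateDescent.
// Assumes Accord.NET 3.x; exact overloads may differ between versions.
using Accord.MachineLearning.VectorMachines;
using Accord.MachineLearning.VectorMachines.Learning;
using Accord.Statistics.Models.Regression;

class L1LogisticRegressionSketch
{
    static void Main()
    {
        // Tiny toy dataset: two features, binary labels.
        double[][] inputs =
        {
            new double[] { 0, 1 },
            new double[] { 1, 0 },
            new double[] { 1, 1 },
            new double[] { 0, 0 },
        };
        bool[] outputs = { true, true, true, false };

        // L1-regularized probabilistic SVM learning by coordinate descent.
        var teacher = new ProbabilisticCoordinateDescent()
        {
            Complexity = 1.0, // the C parameter; smaller values give sparser solutions
            Tolerance = 1e-8
        };

        // Learn a linear probabilistic SVM (the same model as a logistic regression).
        SupportVectorMachine svm = teacher.Learn(inputs, outputs);

        // Convert it using the ToWeights()/FromWeights() pair mentioned above.
        double[] weights = svm.ToWeights();
        LogisticRegression regression = LogisticRegression.FromWeights(weights);

        // Coefficients driven exactly to zero by the L1 penalty mark discarded features.
        double[] coefficients = regression.Coefficients;
        double probability = regression.Probability(new double[] { 1, 1 });
    }
}
```

Because the conversion goes through a plain weight vector, any coefficient the coordinate descent drove to zero stays zero in the resulting LogisticRegression, which is what makes this route usable for feature selection.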

Regards,
Cesar

cesarsouza added a commit that referenced this issue Oct 4, 2016
Adding documentation examples on how to train L1 logistic regression models through SVM learning.
@cesarsouza
Member

Integrated in release 3.4.0.
