This repository has been archived by the owner on Nov 19, 2020. It is now read-only.
I've been looking into adding L1 regularization to LogisticRegression. It has a nice property: it's said to be useful for feature selection (see, for example, this).
As far as I understand, the IterativeReweightedLeastSquares.Regularization parameter applies only L2 regularization, and L1 is not currently supported. Is this correct?
It would be nice to have it added (for example, like this), and perhaps even to be able to specify both L1 and L2 at the same time.
Thanks for opening the issue! It is currently possible to train an L1-regularized logistic regression through ProbabilisticCoordinateDescent. Although this method was originally intended for learning L1-regularized linear support vector machines, a linear probabilistic SVM is exactly the same model as a logistic regression.
After you learn an L1-regularized SVM using this method, one way to convert it to a LogisticRegression is to use the SVM's ToWeights() method and pass the result to LogisticRegression's FromWeights() method. I will update the documentation with a concrete example of how this can be done.
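For readers who just want to see the feature-selection effect of L1-regularized logistic regression in action, here is a minimal sketch in Python using scikit-learn (not this library): its `liblinear` solver implements the same coordinate-descent approach for L1-regularized logistic regression discussed above. The dataset, penalty strengths, and all parameter values below are illustrative choices, not anything prescribed by this issue.

```python
# Hedged illustration (scikit-learn, not Accord.NET): an L1 penalty
# drives many logistic-regression coefficients to exactly zero,
# which is why it is useful for feature selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 20 features, only 4 of which are informative,
# so L1 should be able to zero out many of the remaining ones.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=4, n_redundant=0,
                           random_state=0)

# Same solver family, same regularization strength; only the penalty differs.
l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
l2 = LogisticRegression(penalty="l2", solver="liblinear", C=0.1).fit(X, y)

print("zero coefficients with L1:", int(np.sum(l1.coef_ == 0)))
print("zero coefficients with L2:", int(np.sum(l2.coef_ == 0)))
```

The L1 model ends up with a sparse coefficient vector (many exact zeros), while the L2 model merely shrinks coefficients toward zero without eliminating them; the indices of the nonzero L1 coefficients are the selected features.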