[python-package] support sub-classing scikit-learn estimators #6783
base: master
Conversation
…htGBM into python/sklearn-subclassing
Could you please set up an RTD build for this branch? I'd like to see how …
Sure, here's a first build: https://readthedocs.org/projects/lightgbm/builds/26983170/
Great simplification, thanks for working on it!
I don't have any serious comments, just want to get some answers before approving.
importance_type=importance_type,
**kwargs,
)
super().__init__(**kwargs)

_base_doc = LGBMClassifier.__init__.__doc__
I think it's a little better for users to see all the parameters right here, instead of having to click over to another page.
This is what XGBoost is doing too: https://xgboost.readthedocs.io/en/stable/python/python_api.html#xgboost.XGBRFRegressor
But I do also appreciate that it could look confusing.
If we don't do it this way, then I'd recommend we add a link in the docs for `**kwargs` in these estimators, like this:

**kwargs
    Other parameters for the model. These can be any of the keyword arguments for LGBMModel or any other LightGBM parameters documented at https://lightgbm.readthedocs.io/en/latest/Parameters.html.
I have a weak preference for keeping it as-is (the signature in the docs not showing all parameters, but the docstring listing all of them), but I'm happy to change it if you think that's confusing.
Thanks for clarifying your opinion!
I love your suggestion for the `**kwargs` description. But my preference is also weak 🙂
I think we need a third opinion on this question.
Either way, I'm approving this PR!
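For readers following along, here is a rough, hypothetical sketch of the docstring-reuse pattern behind the `_base_doc` line discussed above, combined with the `**kwargs` wording suggested in this thread. The `DaskLGBMClassifier` class name is only an illustration; this is not the exact code in this PR.

```python
from lightgbm import LGBMClassifier


class DaskLGBMClassifier(LGBMClassifier):
    """Hypothetical wrapper, used here only to illustrate the pattern."""

    def __init__(self, **kwargs):
        # Forward every LightGBM parameter to the base estimator.
        super().__init__(**kwargs)


# Reuse the base class's parameter documentation instead of duplicating it,
# then describe **kwargs separately (wording borrowed from the suggestion above).
_base_doc = LGBMClassifier.__init__.__doc__ or ""
DaskLGBMClassifier.__init__.__doc__ = _base_doc + (
    "\n**kwargs\n"
    "    Other parameters for the model. These can be any of the keyword arguments\n"
    "    for LGBMModel or any other LightGBM parameters documented at\n"
    "    https://lightgbm.readthedocs.io/en/latest/Parameters.html.\n"
)
```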
Co-authored-by: Nikita Titov <[email protected]>
Thank you very much!
I recently saw a Stack Overflow post ("Why can't I wrap LGBM?") expressing the same concerns from #4426 ... it's difficult to sub-class `lightgbm`'s `scikit-learn` estimators.

It doesn't have to be! Look how minimal the code is for `XGBRFRegressor`: https://github.com/dmlc/xgboost/blob/45009413ce9f0d2bdfcd0c9ea8af1e71e3c0a191/python-package/xgboost/sklearn.py#L1869

This PR proposes borrowing some patterns I learned while working on `xgboost`'s `scikit-learn` estimators to make it easier to sub-class `lightgbm` estimators. This also has the nice side effect of simplifying the `lightgbm.dask` code 😁
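As a rough illustration of the kind of sub-class this change is meant to enable, here is a hypothetical estimator with one extra constructor parameter (`my_custom_param` is made up; the exact signature rules depend on the final state of this PR):

```python
from lightgbm import LGBMClassifier


class MyClassifier(LGBMClassifier):
    """Hypothetical sub-class adding one custom constructor parameter."""

    def __init__(self, *, my_custom_param="default-value", **kwargs):
        # Store the custom parameter so scikit-learn's get_params() can see it ...
        self.my_custom_param = my_custom_param
        # ... and forward everything else to LGBMClassifier unchanged.
        super().__init__(**kwargs)


# Any regular LightGBM parameter still passes straight through as a keyword argument.
clf = MyClassifier(my_custom_param="abc", n_estimators=10)
```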
Notes for Reviewers

Why make the breaking change of requiring keyword args?

As part of this PR, I'm proposing immediately switching the constructors for `scikit-learn` estimators here (including those in `lightgbm.dask`) to only supporting keyword arguments.

Why I'm proposing this instead of a deprecation cycle:

- `scikit-learn` itself does this (`HistGradientBoostingClassifier` example)

I posted a related answer to that Stack Overflow question: https://stackoverflow.com/a/79344862/3986677
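To make the proposed breaking change concrete, a small sketch of what keyword-only constructors mean for callers (the parameter names below are real LightGBM parameters; the error message is illustrative):

```python
from lightgbm import LGBMRegressor

# Keyword arguments keep working exactly as before ...
reg = LGBMRegressor(boosting_type="gbdt", num_leaves=31)

# ... but positional arguments would be rejected once constructors are keyword-only:
#
#   LGBMRegressor("gbdt", 31)
#   TypeError: LGBMRegressor.__init__() takes 1 positional argument but 3 were given
```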