Relax `check_X_y` for `DecayEstimator` #580
Conversation
I'm also OK with removing the entire check, mainly because the internal method should already be doing that, and this way we'll need to make fewer assumptions.
To be completely pedantic, now both approaches feel wrong and have shortcomings.
Opinions?
Ah, I forgot about that. That's a fair point. I guess the simplest solution is to offer a flag to the end user and to defer the responsibility of applying the check to them? That said, I think most of the time
That's reasonable!
With such a default the scikit-learn tests would fail. I will add a big warning in the docstring 😁
There are a lot of components in this library that fail the standard checks. That's OK. It's not ideal, but given how "non-standard" some of these ideas are, it's fine to turn a test off sometimes. Adding a warning to the docstring is fair, though!
sklego/meta/decay_estimator.py (Outdated)

```diff
     The DecayEstimator will use exponential decay to weight the parameters.

         w_{t-1} = decay * w_{t}
     """

-    def __init__(self, model, decay: float = 0.999, decay_func="exponential"):
+    def __init__(
+        self, model, decay: float = 0.999, decay_func="exponential", check_X_y=False
```
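The exponential decay rule shown in the docstring above can be sketched as follows. This is a hypothetical illustration of the weighting scheme only, not sklego's actual implementation; the function name is made up:

```python
# Illustrative sketch of the rule w_{t-1} = decay * w_{t}: the most recent
# sample gets weight 1, and each step back in time multiplies by `decay`.
import numpy as np

def exponential_decay_weights(n_samples: int, decay: float = 0.999) -> np.ndarray:
    # Oldest sample first: decay**(n_samples - 1), ..., newest last: decay**0 == 1.
    return decay ** np.arange(n_samples - 1, -1, -1)

weights = exponential_decay_weights(5, decay=0.5)
# weights -> [0.0625, 0.125, 0.25, 0.5, 1.0]
```

Weights like these could then be passed as `sample_weight` to an inner estimator's `fit`.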
This is a bit of a nitpick.
```diff
-        self, model, decay: float = 0.999, decay_func="exponential", check_X_y=False
+        self, model, decay: float = 0.999, decay_func="exponential", check_input=False
```
My reasoning here is that some estimators may only have a `check_X`. This is the case when the component is a scikit-learn transformer. But it'd be nice to have a consistent name for checking the internal thing. So maybe this name change?
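The suggested flag could gate validation along these lines. This is a minimal sketch of the idea under discussion, not the actual sklego code; the class name and `fit` body are illustrative:

```python
# Illustrative sketch: an opt-in `check_input` flag that gates scikit-learn's
# strict validation inside fit, deferring checks to the inner model by default.
from sklearn.utils.validation import check_X_y


class DecayEstimatorSketch:
    def __init__(self, model, decay: float = 0.999, decay_func="exponential",
                 check_input: bool = False):
        self.model = model
        self.decay = decay
        self.decay_func = decay_func
        self.check_input = check_input

    def fit(self, X, y):
        if self.check_input:
            # Strict scikit-learn validation, opt-in so that estimators
            # that do not need `X` features still work by default.
            X, y = check_X_y(X, y, estimator=self)
        # The inner model performs its own validation either way.
        self.model.fit(X, y)  # decay-based sample weighting omitted for brevity
        return self
```

With `check_input=False` (the default in the suggestion), all responsibility for validating `X` and `y` rests with the wrapped estimator.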
Once that change is in/discussed I think we can hit merge :)
That's very reasonable 👌 let me adjust that!
Left one comment related to the name of a variable
LGTM!
Description
Relaxes the check for `X` in `DecayEstimator` by delegating the actual checks to the given estimator. This fixes #573 and allows estimators that do not need `X` features.

Remark: as discussed in the issue, it may be worth it to completely remove the check.
Type of change
Checklist: