Fix CI #185
Conversation
@c-bata Could you review this PR?
How about handling all positional arguments like below?
```diff
@@ -641,7 +641,7 @@ def decision_function(
         return self.best_estimator_.decision_function(X, **kwargs)

-    def inverse_transform(self, X: TwoDimArrayLikeType) -> TwoDimArrayLikeType:
+    def inverse_transform(self, X: TwoDimArrayLikeType, **kwargs: Any) -> TwoDimArrayLikeType:
```
```diff
-    def inverse_transform(self, X: TwoDimArrayLikeType, **kwargs: Any) -> TwoDimArrayLikeType:
+    def inverse_transform(self, X: TwoDimArrayLikeType, *args: Any, **kwargs: Any) -> TwoDimArrayLikeType:
```
```diff
@@ -652,7 +652,7 @@ def inverse_transform(self, X: TwoDimArrayLikeType) -> TwoDimArrayLikeType:
         self._check_is_fitted()

-        return self.best_estimator_.inverse_transform(X)
+        return self.best_estimator_.inverse_transform(X, **kwargs)
```
```diff
-        return self.best_estimator_.inverse_transform(X, **kwargs)
+        return self.best_estimator_.inverse_transform(X, *args, **kwargs)
```
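The suggested change is the usual delegation pattern: forwarding both `*args` and `**kwargs` keeps the wrapper's signature agnostic to whatever the wrapped estimator accepts. A minimal sketch with hypothetical `Wrapper`/`Inner` classes (not code from this PR):

```python
from typing import Any


class Inner:
    # Stand-in for a wrapped estimator whose method takes extra arguments.
    def inverse_transform(self, X, factor=1):
        return [x * factor for x in X]


class Wrapper:
    def __init__(self, inner: Any) -> None:
        self.inner = inner

    def inverse_transform(self, X: Any, *args: Any, **kwargs: Any) -> Any:
        # Extra positional and keyword arguments pass straight through,
        # so the wrapper never needs to know the inner signature.
        return self.inner.inverse_transform(X, *args, **kwargs)


print(Wrapper(Inner()).inverse_transform([1, 2], factor=10))  # → [10, 20]
print(Wrapper(Inner()).inverse_transform([1, 2], 3))          # → [3, 6]
```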
@c-bata
Looks almost good to me. I left two minor suggestions, though.
Note that we support the argument detailed in
https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.inverse_transform
I think we could simply remove these lines since all positional / keyword arguments are supported. What do you think?
```diff
-    Note that we support the argument detailed in
-    https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.inverse_transform
```
Note that we support the argument detailed in
https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.transform
ditto.
```diff
-    Note that we support the argument detailed in
-    https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.transform
```
@c-bata I changed the doc-strings accordingly!
```diff
@@ -494,8 +494,7 @@ def sample_train_set(self) -> None:
     def tune_feature_fraction(self, n_trials: int = 7) -> None:
         param_name = "feature_fraction"
-        param_values = [0.4 + 0.6 * i / (n_trials - 1) for i in range(n_trials)]
+        param_values = cast(list, np.linspace(0.4, 1.0, n_trials).tolist())
```
This change was to go back to the original implementation.
https://github.com/optuna/optuna-integration/pull/184/files#diff-a12e6b555e010ad81b969adb35d8f3e60dcabf6c4f3719d824db2e5012497e7e
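For what it's worth, the restored `np.linspace` expression and the removed hand-built list comprehension produce the same grid of values up to floating-point rounding, so the revert does not change behavior. A quick check:

```python
import numpy as np

n_trials = 7

# Hand-built grid, as in the removed list comprehension.
by_hand = [0.4 + 0.6 * i / (n_trials - 1) for i in range(n_trials)]

# The same grid via np.linspace, as in the restored implementation.
via_linspace = np.linspace(0.4, 1.0, n_trials).tolist()

# Values agree to within floating-point rounding error.
print(all(abs(a - b) < 1e-12 for a, b in zip(by_hand, via_linspace)))  # → True
```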
LGTM.
Motivation

This PR fixes the CI.

In principle, the errors in CI happen because of the following: from scikit-learn v1.6.0, it seems that `check_is_fitted` calls `hasattr(estimator, "transform")`, which triggers the getter of the `OptunaSearchCV.transform` property. This, in turn, calls `self._check_is_fitted()` inside the property, and we end up in an infinite loop of `check_is_fitted`.
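The loop can be reproduced in isolation. Below is a minimal sketch with hypothetical class names and a toy stand-in for scikit-learn's `check_is_fitted`, not the real library code:

```python
def check_is_fitted(estimator):
    # Mimics scikit-learn >= 1.6.0, which probes the attribute:
    # `hasattr` evaluates `estimator.transform`, firing any property getter.
    if not hasattr(estimator, "transform"):
        raise TypeError("estimator is not a transformer")


class SearchCVWithProperty:
    # Toy stand-in for a search-CV class exposing `transform` as a property.
    @property
    def transform(self):
        check_is_fitted(self)  # the getter re-enters the check -> recursion
        return lambda X: X


try:
    check_is_fitted(SearchCVWithProperty())
except RecursionError:
    print("infinite loop: check_is_fitted <-> transform getter")
```

`hasattr` only swallows `AttributeError`, so the mutual recursion surfaces as a `RecursionError` instead of silently returning `False`.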
Description of the changes

To avoid the problem above, I made `inverse_transform` and `transform` methods of the `OptunaSearchCV` object.

Note that I confirmed the possible arguments of these methods here.
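The shape of the fix, in a self-contained sketch (hypothetical names; the real change is in `OptunaSearchCV`): with `transform` as a plain method, `hasattr` only checks that the attribute exists and never executes its body, so the fitted-check runs only on an actual call.

```python
class SearchCVFixed:
    # Toy stand-in: `transform` is a regular method, not a property.
    def __init__(self):
        self.best_estimator_ = None  # set by fit() in the real class

    def _check_is_fitted(self):
        if self.best_estimator_ is None:
            raise RuntimeError("not fitted yet")

    def transform(self, X, *args, **kwargs):
        self._check_is_fitted()  # only runs when the method is called
        return self.best_estimator_.transform(X, *args, **kwargs)


est = SearchCVFixed()
# hasattr sees a bound method without executing it: no recursion, no error.
print(hasattr(est, "transform"))  # → True
```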