
Fix CI #185

Merged: 5 commits merged into optuna:main on Dec 17, 2024
Conversation

@nabenabe0928 (Collaborator) commented on Dec 16, 2024:

Motivation

This PR fixes the CI.

In principle, the CI errors happen because of the following:

```python
class C:
    @property
    def f(self):
        self.f()  # accessing self.f re-enters this getter

hasattr(C(), "f")
# RecursionError: maximum recursion depth exceeded
```

From scikit-learn v1.6.0, it seems that check_is_fitted calls hasattr(estimator, "transform"), which triggers the getter of the OptunaSearchCV.transform property.
The getter, in turn, calls self._check_is_fitted(), so check_is_fitted and the property getter re-enter each other indefinitely.
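To see why the property is the culprit, here is a minimal sketch (toy classes, not the actual OptunaSearchCV code) showing that hasattr actually runs a property getter, whereas it merely looks up a plain method without calling it:

```python
class WithProperty:
    @property
    def transform(self):
        # Accessing self.transform re-enters this getter forever.
        return self.transform


class WithMethod:
    def transform(self, X):
        return X


# hasattr() on a property invokes the getter, so the self-referential
# property blows the stack (hasattr only suppresses AttributeError).
try:
    hasattr(WithProperty(), "transform")
    outcome = "no error"
except RecursionError:
    outcome = "RecursionError"
print(outcome)  # RecursionError

# hasattr() on a plain method only looks the name up; the body never runs.
print(hasattr(WithMethod(), "transform"))  # True
```

This is exactly why converting the properties to methods makes the hasattr probe in check_is_fitted harmless.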

Description of the changes

To avoid the problem above, I converted inverse_transform and transform from properties to regular methods of the OptunaSearchCV object.
Note that I confirmed the possible arguments of these methods here

@nabenabe0928 (Collaborator, Author) commented:

@c-bata Could you review this PR?

@c-bata (Member) left a comment:

How about handling all positional arguments like below?

```diff
@@ -641,7 +641,7 @@ def decision_function(
     return self.best_estimator_.decision_function(X, **kwargs)

-def inverse_transform(self, X: TwoDimArrayLikeType) -> TwoDimArrayLikeType:
+def inverse_transform(self, X: TwoDimArrayLikeType, **kwargs: Any) -> TwoDimArrayLikeType:
```
@c-bata (Member) commented:

Suggested change

```diff
-def inverse_transform(self, X: TwoDimArrayLikeType, **kwargs: Any) -> TwoDimArrayLikeType:
+def inverse_transform(self, X: TwoDimArrayLikeType, *args: Any, **kwargs: Any) -> TwoDimArrayLikeType:
```

```diff
@@ -652,7 +652,7 @@ def inverse_transform(self, X: TwoDimArrayLikeType) -> TwoDimArrayLikeType:
     self._check_is_fitted()

-    return self.best_estimator_.inverse_transform(X)
+    return self.best_estimator_.inverse_transform(X, **kwargs)
```
@c-bata (Member) commented:

Suggested change

```diff
-return self.best_estimator_.inverse_transform(X, **kwargs)
+return self.best_estimator_.inverse_transform(X, *args, **kwargs)
```
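The delegation pattern these suggestions converge on can be sketched as follows. SearchCVSketch and Doubler are hypothetical stand-ins for illustration, not the real OptunaSearchCV API:

```python
from typing import Any


class Doubler:
    """Toy fitted estimator whose inverse_transform halves its inputs."""

    def inverse_transform(self, X: list, scale: float = 1.0) -> list:
        return [x / (2.0 * scale) for x in X]


class SearchCVSketch:
    """Sketch of the wrapper: a plain method forwarding *args and **kwargs."""

    def __init__(self, best_estimator: Any) -> None:
        self.best_estimator_ = best_estimator

    def inverse_transform(self, X: Any, *args: Any, **kwargs: Any) -> Any:
        # Every positional and keyword argument reaches the fitted estimator,
        # so the wrapper never has to mirror the estimator's signature.
        return self.best_estimator_.inverse_transform(X, *args, **kwargs)


search = SearchCVSketch(Doubler())
print(search.inverse_transform([2.0, 4.0], scale=1.0))  # [1.0, 2.0]
```

Because the forwarding method accepts anything, it stays compatible with whatever arguments the underlying estimator's inverse_transform supports.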

@nabenabe0928 (Collaborator, Author) commented:

@c-bata
I applied your suggestion, PTAL!

@c-bata (Member) left a comment:

Looks almost good to me. I left two minor suggestions though.

Comment on lines 651 to 652
Note that we support the argument detailed in
https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.inverse_transform
@c-bata (Member) commented:

I think we could simply remove these lines since all positional / keyword arguments are supported. What do you think?

Suggested change

```diff
-Note that we support the argument detailed in
-https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.inverse_transform
```

Comment on lines 719 to 720
Note that we support the argument detailed in
https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.transform
@c-bata (Member) commented:

ditto.

Suggested change

```diff
-Note that we support the argument detailed in
-https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.FunctionTransformer.html#sklearn.preprocessing.FunctionTransformer.transform
```

@nabenabe0928 (Collaborator, Author) commented:

@c-bata I changed the doc-strings accordingly!

```diff
@@ -494,8 +494,7 @@ def sample_train_set(self) -> None:

 def tune_feature_fraction(self, n_trials: int = 7) -> None:
     param_name = "feature_fraction"
-    param_values = [0.4 + 0.6 * i / (n_trials - 1) for i in range(n_trials)]
-
+    param_values = cast(list, np.linspace(0.4, 1.0, n_trials).tolist())
```
@nabenabe0928 (Collaborator, Author) commented on Dec 17, 2024.
@c-bata c-bata added the bug Something isn't working label Dec 17, 2024
@c-bata c-bata added this to the v4.2.0 milestone Dec 17, 2024
@c-bata (Member) left a comment:
LGTM.

@c-bata c-bata merged commit 05006b3 into optuna:main Dec 17, 2024
30 checks passed
Labels: bug (Something isn't working)
2 participants