`configure_optimizers` must include a `monitor` when a `ReduceLROnPlateau` scheduler is used. For example: `{"optimizer": optimizer, "lr_scheduler": scheduler, "monitor": "metric_to_track"}` #94
I am facing the same problem; the environment seems to have the packages at the proper versions: absl-py 2.1.0
I managed to solve the problem based on the answer provided in #79 (comment). I added some changes to that answer, since the variables should be defined using `self`. The fix is to modify `spacetimeformer_model.py` (under `spacetimeformer/spacetimeformer_model`), in the function `configure_optimizers`, so that it now looks like this:
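A minimal sketch of the pattern the error message asks for, assuming PyTorch Lightning's dict return format for `configure_optimizers`. The class name `TinyModel` and the metric name `"val/loss"` are placeholders, not from the spacetimeformer source; substitute whichever metric your training loop actually logs.

```python
import torch

class TinyModel(torch.nn.Module):
    """Stand-in for the LightningModule; only configure_optimizers matters here."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        # ReduceLROnPlateau needs a metric to watch, so Lightning requires
        # a "monitor" key alongside the scheduler in the returned dict.
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min")
        return {
            "optimizer": optimizer,
            "lr_scheduler": scheduler,
            # Placeholder metric name -- use the one your model logs.
            "monitor": "val/loss",
        }

config = TinyModel().configure_optimizers()
print(sorted(config.keys()))  # ['lr_scheduler', 'monitor', 'optimizer']
```

Returning the scheduler alone (or a bare `(optimizers, schedulers)` tuple) is what triggers the `MisconfigurationException` in the title when the scheduler is a `ReduceLROnPlateau`.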
When running `python ./spacetimeformer/train.py spacetimeformer mnist --embed_method spatio-temporal --local_self_attn full --local_cross_attn full --global_self_attn full --global_cross_attn full --run_name mnist_spatiotemporal --context_points 10 --gpus 0`, this problem occurs. How can I deal with it?