a problem #9
Hi,
The message you posted here is just a warning. You can ignore it.
Regards,
Kalpesh
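For reference, the warning disappears once `optimizer.step()` is called before `lr_scheduler.step()` inside the training loop. A minimal sketch (the model, optimizer, and data here are placeholders, not the repository's actual training code):

```python
import torch
import torch.nn as nn

# Toy model and optimizer purely for illustration.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).pow(2).mean()
    loss.backward()
    optimizer.step()    # step the optimizer first...
    scheduler.step()    # ...then the scheduler (PyTorch >= 1.1.0 order)

# The second warning: query the current LR with get_last_lr(), not get_lr().
final_lr = scheduler.get_last_lr()
```

With `gamma=0.5` and `step_size=1`, `final_lr` is the initial 0.1 halved once per epoch.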
Thank you very much for your prompt response. For your work, could I try swapping in a different dataset? I tried using a dataset of black-and-white images. For this dataset, I intend to downsample one part as LR and use the other part as HR; the two then form an unpaired LR-HR dataset for training. Am I doing the right thing? Because I am new to unsupervised super-resolution, I may have misunderstood something. I would be very grateful for your answer.
Hi,
Actually, in an unsupervised SR task you have an unpaired LR-HR dataset. For example, treat images captured by a low-end mobile device as LR and high-quality images downloaded from the internet as HR. If you apply SR to this type of data, it is called unsupervised SR.
But if you created the LR images from some HR data, then you have paired LR-HR data, and in that case you can use any supervised method for better results.
I hope this clears things up.
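To make the distinction concrete, here is a hedged sketch (the tensors and pools are hypothetical stand-ins, not this repository's data loader): downsampling an HR image yourself yields an aligned pair, whereas the unpaired setting draws LR and HR from unrelated sources with no per-pixel ground truth:

```python
import torch
import torch.nn.functional as F

def make_paired_sample(hr: torch.Tensor, scale: int = 4):
    """Bicubic-downsample an HR tensor (C, H, W) to get its matching LR.
    Because LR is derived from this exact HR, the pair is supervised data."""
    lr = F.interpolate(hr.unsqueeze(0), scale_factor=1 / scale,
                       mode="bicubic", align_corners=False).squeeze(0)
    return lr, hr  # aligned pair -> supervised SR

# Unpaired setting: LR and HR come from different images/sources,
# so no LR input has a pixel-aligned HR ground truth.
hr_pool = [torch.rand(3, 96, 96) for _ in range(4)]  # e.g. web photos
lr_pool = [torch.rand(3, 24, 24) for _ in range(4)]  # e.g. phone captures

lr, hr = make_paired_sample(hr_pool[0])
```

The point of the reply above is exactly this: if your LR comes out of `make_paired_sample`, a supervised method applies; only when LR and HR are drawn from the two separate pools is the task genuinely unsupervised.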
Thank you very much for your reply, I understand now.
Hello author, I am very interested in your work. I tried to reproduce it but ran into a problem; could you tell me how to solve it?
/home/student/student9/miniconda3/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:131: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
/home/student/student9/miniconda3/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:418: UserWarning: To get the last learning rate computed by the scheduler, please use `get_last_lr()`.