Hi, Adan is an excellent optimizer; thank you for your work.
However, when I recently tried instruction tuning with Adan, the loss curve looked great, but downstream performance (GSM-8K) fell short of expectations. With the same data processing and evaluation, AdamW reaches about 9.63 while Adan only gets around 5.08.
AdamW hyperparameters: weight_decay 0.01, lr 2e-5
Adan hyperparameters: weight_decay 0.02; following the repo's recommendation I tried lr 2e-4 and 1e-4, and GSM-8K was low in both cases
Both runs use the same lr scheduler: warm up over the first 3% of steps to the peak lr, then decay to 0 (a sketch of this schedule follows below)
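A minimal sketch of that schedule, for clarity. The post only says the lr rises for the first 3% of steps and then decays to 0, so the linear decay shape and the helper name `make_warmup_decay_scheduler` are assumptions:

```python
import torch

def make_warmup_decay_scheduler(optimizer, total_steps, warmup_frac=0.03):
    """Linear warmup over the first 3% of steps, then linear decay to 0."""
    warmup_steps = max(1, int(total_steps * warmup_frac))

    def lr_lambda(step):
        # LambdaLR multiplies the base lr by this factor at each step.
        if step < warmup_steps:
            return step / warmup_steps  # ramp 0 -> 1
        # linear decay 1 -> 0 over the remaining steps (shape assumed)
        return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

    return torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
```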
AdamW training loss curve:
Adan training loss curve:
The code used:
```python
from adan import Adan

# foreach/fused select the multi-tensor and fused CUDA implementations
optimizer = Adan(model.parameters(), lr=args.lr, weight_decay=0.02,
                 foreach=True, fused=True)
```
I'd like to know whether you have any recommended hyperparameter settings for instruction tuning?
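For reference, a minimal sketch of the AdamW baseline with the hyperparameters quoted above (`model` is assumed to be defined as in the Adan snippet):

```python
import torch

# AdamW baseline from the comparison above: lr 2e-5, weight_decay 0.01
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)
```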
Hi,
Hi, here is what I tried:
Following your suggestions, the results did indeed improve. Do you have any further suggestions?