Hi there! I would like to fine-tune the pre-trained model on a custom dataset. If I understand correctly, I have to train a new word2vec model and a new skip-instructions model on the custom dataset. Using these models, I create the custom HDF5 file. Then I can fine-tune the pre-trained model by training it on the custom HDF5 file with the parameter `-finetune 1`.
Is this the correct way to do it? I'm asking because I'm not sure whether I should train the word2vec and skip-instructions models on the custom dataset only, or on a concatenation of Recipe1M and the custom dataset. Additionally, main.lua contains the line `opts.finetune = opts.finetune ~= 0`, but I have not been able to figure out how this parameter is used during training. Thanks!
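For what it's worth, the line I'm asking about appears to just coerce the numeric command-line flag into a Lua boolean. A minimal sketch of what I understand it to do (the `if opts.finetune then` branch is my assumption about how such a flag would typically be consumed, not something I found in the repo):

```lua
-- Simulate the value parsed from the -finetune CLI option (a number, not a boolean).
opts = { finetune = 1 }

-- The line from main.lua: "x ~= 0" evaluates to true for 1 and false for 0,
-- so the numeric flag becomes a proper boolean.
opts.finetune = opts.finetune ~= 0

-- Hypothetical downstream usage: branch on the boolean to decide whether to
-- load pre-trained weights instead of initializing from scratch.
if opts.finetune then
  print("fine-tuning from pre-trained weights")
else
  print("training from scratch")
end
```

So `-finetune 1` yields `opts.finetune == true` and `-finetune 0` yields `false`; what the training loop actually does with that boolean is exactly the part I'm unsure about.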