I was impressed by your excellent work, and I have a question. To accomplish the task you proposed, it seems that the special token must appear in the output sequence. I am working on a similar task, but even though I included the token in the training data, the model does not emit it in its responses. Did you use any special method to force generation of the token, or were you able to achieve this simply by fine-tuning on the training data?
Thank you for reading my question.
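For context, one workaround I have been considering (rather than relying on fine-tuning alone) is biasing the decoder's logits so the special token must be emitted at a chosen step, in the spirit of the logits-processor hooks that common generation APIs expose. Here is a minimal, framework-free sketch of that idea; the token id, step choice, and function names are all hypothetical:

```python
import math

FORCE_TOKEN_ID = 7  # hypothetical id of the special token in the vocabulary

def force_token(logits, step, force_at_step=0, token_id=FORCE_TOKEN_ID):
    """At the chosen decoding step, mask every logit except the special
    token, so that greedy (or sampled) decoding must emit it."""
    if step == force_at_step:
        return [0.0 if i == token_id else -math.inf for i in range(len(logits))]
    return logits

def greedy_decode(step_logits):
    """Greedy decoding over a list of per-step logit vectors,
    applying the forcing processor at every step."""
    out = []
    for step, logits in enumerate(step_logits):
        processed = force_token(logits, step)
        out.append(max(range(len(processed)), key=lambda i: processed[i]))
    return out

# Two decoding steps over a toy 10-token vocabulary: the raw logits
# would never select token 7, but the processor forces it at step 0.
fake_logits = [[1.0] * 10, [0.0] * 9 + [5.0]]
print(greedy_decode(fake_logits))  # → [7, 9]
```

In a real setup this logic would live inside the generation loop (or a logits-processor callback) rather than fine-tuning alone, which is why I am curious whether you needed anything like this.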