No EOS token appended #10
Comments
Hi @ZhiyuanChen @tBai1994, the reason for this issue is that we do not append the eos token, which is consistent with MSA Transformer. If you would like to append this special token, you can do so via the setting at lines 166 to 172 in 43d3d93.
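A minimal sketch of the manual workaround, for anyone who prefers not to change that setting: it appends the `<eos>` id (2 in the vocab below) to an already-tokenized batch. The `tokens` tensor here is a stand-in built from the vocab ids, not output from this repository's tokenizer:

```python
import torch

eos_idx = 2                                   # <eos> in the vocab below
tokens = torch.tensor([[0] + [4] * 22])       # stand-in batch: <cls> + 22 bases, shape (1, 23)

# Append one <eos> column so every sequence in the batch ends with the special token.
eos_col = torch.full((tokens.size(0), 1), eos_idx, dtype=tokens.dtype)
tokens = torch.cat([tokens, eos_col], dim=1)  # shape (1, 24)
```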
Hi,
Thank you for this wonderful work!
While trying to reproduce your results, I ran into some trouble getting a minimal working example to run.
The length of the sequence is 22, so the input should have 24 tokens (with `<cls>` and `<eos>`), but the tokenized input only has 23.
Since the vocab is
`Vocab({'<cls>': 0, '<pad>': 1, '<eos>': 2, '<unk>': 3, 'A': 4, 'G': 5, 'C': 6, 'U': 7, 'X': 8, 'N': 9, '-': 10, '<mask>': 11})`,
it appears the `<eos>` token is not appended by the vocab.
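For reference, a self-contained sketch of the count check described above, using the printed vocab; the sequence is an arbitrary 22-nt placeholder, and the tokenizer is reduced to a dict lookup rather than the repository's actual code:

```python
vocab = {'<cls>': 0, '<pad>': 1, '<eos>': 2, '<unk>': 3, 'A': 4, 'G': 5,
         'C': 6, 'U': 7, 'X': 8, 'N': 9, '-': 10, '<mask>': 11}

seq = "GGGAAACCCUUUGGGAAACCCU"                        # placeholder 22-nt sequence
tokens = [vocab['<cls>']] + [vocab[c] for c in seq]   # <cls> prepended, no <eos>

print(len(tokens))                    # 23 -- the reported count; 24 was expected
print(tokens[-1] == vocab['<eos>'])   # False -- the last token is a base, not <eos>
```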