I'm wondering whether the perplexity calculation is correct.
You compute the total loss for every batch, average it over the (average) sequence length of that batch, and at the end take the mean over all batches.
As far as I understand, the sum of all batch losses and the total number of target tokens should instead both be accumulated and divided only once after the epoch is done (see the sketch below).
Could you please tell me why it is calculated this way? Is it just an approximation? The perplexities computed this way also seem rather small.
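A minimal sketch of the difference I mean, in plain Python with hypothetical variable names (not taken from the repository), contrasting per-batch averaging with token-weighted averaging over the whole epoch:

```python
import math

def ppl_batch_averaged(batch_losses, batch_token_counts):
    # What the current code appears to do: divide each batch's summed loss by
    # that batch's token count, then average those per-batch means.
    per_batch_means = [loss / n for loss, n in zip(batch_losses, batch_token_counts)]
    return math.exp(sum(per_batch_means) / len(per_batch_means))

def ppl_token_weighted(batch_losses, batch_token_counts):
    # What I expected: accumulate the total summed loss and the total number of
    # target tokens across the epoch, then divide once at the end.
    return math.exp(sum(batch_losses) / sum(batch_token_counts))

# Example with uneven batches (hypothetical numbers): the two values differ
# unless every batch contains the same number of target tokens.
losses = [120.0, 20.0]   # summed NLL per batch
tokens = [60, 5]         # target-token counts per batch
print(ppl_batch_averaged(losses, tokens))  # exp((2.0 + 4.0) / 2) ≈ 20.09
print(ppl_token_weighted(losses, tokens))  # exp(140 / 65)       ≈ 8.62
```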
Thanks