Discrepancy in Inference Cost for LLaMA2-7B on GSM8K Dataset #24

Open
sofan110 opened this issue Jan 18, 2025 · 0 comments
Problem Description

While reading the paper, I noticed that the average inference cost for 32 rollouts of the LLaMA2-7B model on the GSM8K dataset is reported as 166. However, in our own tests, the average inference cost already reaches about 600 with only 8 rollouts, a significant discrepancy from the results reported in the paper. Could you kindly advise whether there are any specific settings, configurations, or considerations we might have overlooked?
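For reference, here is a minimal sketch of how we compute the figure we are comparing against the paper's 166. It assumes "average inference cost" means the total number of generated tokens across all rollouts, averaged over questions; the function name and the illustrative token counts are ours, not from the paper, so please correct us if the paper measures cost differently (e.g. per rollout, or counting prompt tokens).

```python
def average_inference_cost(token_counts_per_question):
    """Average total generated tokens per question.

    token_counts_per_question: list of lists, one inner list per
    question, holding the generated-token count of each rollout.
    """
    totals = [sum(rollouts) for rollouts in token_counts_per_question]
    return sum(totals) / len(totals)

# Illustrative numbers only: 2 questions, 8 rollouts each,
# 75 generated tokens per rollout.
counts = [[75] * 8, [75] * 8]
print(average_inference_cost(counts))  # -> 600.0
```

If the paper instead reports cost per rollout (or in some other unit), that alone could explain much of the gap we are seeing.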
