
Input size in tokens might be climbing steadily even if I write relatively small GM responses #7

Open
maxwelljoslyn opened this issue Apr 17, 2024 · 1 comment
maxwelljoslyn commented Apr 17, 2024

Looking at my Anthropic logs:

[Screenshot of Anthropic usage logs, taken 2024-04-16 22:17 PDT]

maxwelljoslyn self-assigned this Apr 17, 2024
maxwelljoslyn commented Apr 17, 2024

I can't tell whether this is because "input" means the entire conversation so far, which will obviously grow over time, or whether I'm accidentally sending too much with each GM response, i.e. appending too many pieces of information to what gets sent to the LLM on each turn.

It should be easy to determine which case it is (see the sketch below). If it's the former, there's nothing I can do.
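
A minimal sketch of one way to check, assuming the Anthropic Python SDK's Messages API; the `history` list and `run_turn` function here are illustrative, not this project's actual code. The Messages API is stateless, so every request resends the full conversation, and `input_tokens` will grow even when each new GM reply is small. If the per-turn growth is roughly the size of the last exchange, it's just the accumulating history; if it jumps by more, extra material is being appended to each request.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
history = []  # full conversation so far; resent in every request

def run_turn(player_input: str) -> str:
    history.append({"role": "user", "content": player_input})
    response = client.messages.create(
        model="claude-3-opus-20240229",
        max_tokens=1024,
        messages=history,  # every prior turn counts toward this request's input tokens
    )
    gm_reply = response.content[0].text
    history.append({"role": "assistant", "content": gm_reply})
    # Log per-turn usage to see how input size climbs across turns.
    print(f"turn {len(history) // 2}: "
          f"input={response.usage.input_tokens} output={response.usage.output_tokens}")
    return gm_reply
```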
