
prompt is too long: 200099 tokens > 199999 maximum #893

Closed
orkhanart opened this issue Oct 24, 2024 · 6 comments

Comments

@orkhanart

Describe the bug

I created a lot of code over the last few days and built an amazing product, but now I can't continue. It says:
prompt is too long: 200099 tokens > 199999 maximum. What's your advice?

Link to the Bolt URL that caused the error

https://bolt.new/~/sb1-pwqxrd

Steps to reproduce

Ask for any new code implementation

Expected behavior

to code next feature

Screen Recording / Screenshot

No response

Platform

  • OS: [e.g. macOS, Windows, Linux]
  • Browser: [e.g. Chrome, Safari, Firefox]
  • Version: [e.g. 91.1]

Additional context

No response

@lcgarza10

Do you still have the issue? I'm in the same situation. It started last night and appeared just once, but now it's happening again. I thought it was because I had less than 10k tokens, so I made a recharge and now have 10M, but I still can't use it and keep getting the same message.

@orkhanart
Author

Yes, I have over 50 million tokens, so I believe it is tied to the scale of the project. It seems that beyond a certain volume of work, the AI struggles to manage and retain information effectively for reasoning. In my case, after a while, it began to exhibit hallucinations and even deleted elements it had previously generated.
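
For reference, the error comes from the underlying model's fixed context window (roughly 200k tokens, per the error message): once the accumulated project history exceeds it, every new request is rejected regardless of token balance. Below is a minimal client-side sketch of one mitigation, trimming the oldest messages until the transcript fits a budget. The `ChatMessage` shape, the `trimToBudget` helper, and the 4-characters-per-token heuristic are illustrative assumptions, not Bolt's actual code.

```ts
// Hypothetical sketch: keep a chat transcript under a fixed token budget by
// dropping the oldest non-system messages first. The names and the
// 4-chars-per-token heuristic are assumptions for illustration only.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Rough token estimate: ~4 characters per token for English text and code.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Remove the oldest non-system messages until the transcript fits the budget.
function trimToBudget(messages: ChatMessage[], maxTokens: number): ChatMessage[] {
  const kept = [...messages];
  let total = kept.reduce((sum, m) => sum + estimateTokens(m.content), 0);

  while (total > maxTokens) {
    const idx = kept.findIndex((m) => m.role !== 'system');
    if (idx === -1) break; // only system prompts left; nothing more to drop
    total -= estimateTokens(kept[idx].content);
    kept.splice(idx, 1);
  }
  return kept;
}

// Example: stay safely below a ~200k-token context window.
const conversation: ChatMessage[] = [
  { role: 'system', content: 'You are a coding assistant.' },
  { role: 'user', content: 'Add a login form to the app.' },
  // ...many more messages accumulated over days of work
];
const trimmed = trimToBudget(conversation, 180_000);
```

Summarizing the dropped messages instead of discarding them outright would preserve more project context, at the cost of an extra model call.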

@lcgarza10

I don't see any answer from them, or any update saying they are working on it. We can't just leave our projects like this; I was almost finished with mine. What other competitors can we use instead of this one?

@lcgarza10

What is indeed working flawlessly is the token usage... it keeps reducing my tokens every time it gives me an incomplete answer.

@lcgarza10

Fix xxxx Type Conversion

src/app/services/xxx.service.ts
This message didn't complete its response.

@kc0tlh
Collaborator

kc0tlh commented Nov 1, 2024

@orkhanart thanks for your report, and we appreciate your patience as we are a small team working to support all of the new users! The context on this error, workaround ideas, and future updates on the R&D we are doing on this issue are being tracked in #1322 going forward, so please subscribe there!

@kc0tlh kc0tlh closed this as completed Nov 1, 2024
@stackblitz stackblitz deleted a comment from lcgarza10 Nov 5, 2024
@stackblitz stackblitz deleted a comment from lcgarza10 Nov 5, 2024