
[object Object] as ERROR #1469

Open
j2l opened this issue Mar 6, 2025 · 3 comments
Labels: question (Further information is requested)

Comments

j2l commented Mar 6, 2025

Describe the bug

Here's the error in terminal (not browser console):

 INFO   stream-text  Sending llm call to HuggingFace with model Qwen/Qwen2.5-Coder-32B-Instruct
 ERROR   api.chat  [object Object]
 DEBUG   api.chat  usage {"promptTokens":null,"completionTokens":null,"totalTokens":null}
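
For context on the log line above: in JavaScript/TypeScript, interpolating an Error or a plain object into a template string prints the literal text [object Object]. The sketch below (not bolt.diy's actual logging code; the names are illustrative) shows how that happens and how serializing the value would keep the detail visible in the terminal:

  // Illustrative only (not bolt.diy's real logger): coercing an object to a
  // string is what produces "[object Object]".
  const apiError: unknown = { statusCode: 422, message: 'Input validation error' };

  console.error(`ERROR api.chat ${apiError}`);
  // -> ERROR api.chat [object Object]

  // Reading .message on a real Error, or JSON.stringify-ing anything else,
  // keeps the detail readable:
  const detail = apiError instanceof Error ? apiError.message : JSON.stringify(apiError);
  console.error(`ERROR api.chat ${detail}`);
  // -> ERROR api.chat {"statusCode":422,"message":"Input validation error"}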

I read here that others could use Qwen2.5-Coder-32B-Instruct; would you please share how you do it?
Thanks!

Link to the Bolt URL that caused the error

http://localhost:5173/chat/1?rewindTo=d3cy3j0zj9b

Steps to reproduce

  1. select HuggingFace
  2. select Qwen2.5-Coder-32B-Instruct
  3. enter prompt
  4. hit enter

Expected behavior

code 😄

Screen Recording / Screenshot

No response

Platform

  • OS: Ubuntu 22.04
  • Browser: Chrome
  • Version: 134.0.6998.35 (Official Build) (64-bit)

Provider Used

HF

Model Used

Qwen2.5-Coder-32B-Instruct

Additional context

Running bolt.diy with npm run dev

leex279 (Collaborator) commented Mar 8, 2025

Take a look here, maybe it helps: https://thinktank.ottomator.ai/t/how-to-use-hugging-faces/6231/2

leex279 added the question label on Mar 8, 2025
j2l (Author) commented Mar 10, 2025

@leex279 this was not a question.
ERROR api.chat [object Object] is not helpful as an error 😸
My API key has full write access, so that's not the issue.

Redoing it, I have another message in the browser:
There was an error processing your request: Custom error: Input validation error: inputs tokens + max_new_tokens must be <= 16000. Given: 146038 inputs tokens and 8000 max_new_tokens

Still, this error should also appear in the terminal.
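
For reference, the browser message describes a hard budget on the HuggingFace endpoint: input tokens plus max_new_tokens must stay within 16,000, and this request sent about 146k input tokens with max_new_tokens set to 8,000. A hypothetical pre-flight check along those lines (not bolt.diy code; the limit and the numbers are taken from the error message above) could look like:

  // Hypothetical guard, not part of bolt.diy: the endpoint rejects requests
  // where inputTokens + maxNewTokens exceeds its 16,000-token budget.
  const CONTEXT_LIMIT = 16_000;   // limit quoted in the error above
  const MAX_NEW_TOKENS = 8_000;   // completion budget used in the failing request

  function assertFitsContext(inputTokens: number, maxNewTokens: number): void {
    if (inputTokens + maxNewTokens > CONTEXT_LIMIT) {
      // Throwing a real Error (and logging error.message) would also make this
      // failure show up in the terminal instead of "[object Object]".
      throw new Error(
        `Prompt too large: ${inputTokens} input tokens + ${maxNewTokens} max_new_tokens ` +
          `exceeds the ${CONTEXT_LIMIT}-token limit; shorten the chat history or lower max_new_tokens`,
      );
    }
  }

  assertFitsContext(146_038, MAX_NEW_TOKENS); // throws for the request in this report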

leex279 (Collaborator) commented Mar 10, 2025

@thecodacus can you please take a look?
