
No response when using Cloud provided LLMs... #1491

Open
arhanjain11 opened this issue Mar 10, 2025 · 3 comments

Comments


arhanjain11 commented Mar 10, 2025

Describe the bug

When using a cloud-provided LLM like DeepSeek, Anthropic, OpenRouter, etc., the API key just doesn't seem to work. Many providers fail to fetch at all. The OpenRouter API key does fetch, but even then there is no response from bolt.diy and it keeps loading forever. I have set the API key from both the UI and the .env file. Locally installed LLMs are fine, though.

Steps to reproduce

  1. Run bolt.diy.
  2. Set your API key from the UI or .env, or both, your choice (a quick sanity check for the key itself is sketched after this list).
  3. Choose the key provider and select a model on the main screen.
  4. Once you type your request, you never get a response. It just keeps on loading forever.
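(A minimal way to sanity-check the key itself, independent of bolt.diy, assuming OpenRouter's OpenAI-compatible chat completions endpoint and Node 18+ with built-in fetch; the env variable name and model slug here are just examples:)

```ts
// sanity-check.ts — run with: npx tsx sanity-check.ts
// Verifies an OpenRouter key outside bolt.diy.
// Assumes the key is exported in the shell as OPENROUTER_API_KEY (example name).
const apiKey = process.env.OPENROUTER_API_KEY;
if (!apiKey) throw new Error('OPENROUTER_API_KEY is not set');

const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${apiKey}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'qwen/qwq-32b', // any model you have access to
    messages: [{ role: 'user', content: 'Say hi' }],
  }),
});

console.log(res.status); // 200 means the key itself is fine
console.log(await res.text());
```

If this returns 200 but bolt.diy still hangs, the key is good and the problem is in how bolt.diy talks to the provider.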

Expected behavior

It should respond when using a cloud provider; instead it gets stuck.

Platform

  • OS: Windows 11
  • Browser: Chrome
  • Version: 134.0.6998.35

Provider Used

Just any provider

Model Used

Just any model

leex279 (Collaborator) commented Mar 11, 2025

@arhanjain11 Can you please provide logs and a screenshot of bolt, as well as the dev-console and terminal logs? There must be a problem/error somewhere.

arhanjain11 (Author) commented Mar 11, 2025

@leex279
Sure, buddy!

[screenshot]

It keeps loading forever.

[screenshot]

I even double-checked the API key being used, both in .env.example (renamed to .env) and in the UI.
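(A minimal sketch for checking that the key is actually being loaded from .env at startup, without leaking its value; the OPEN_ROUTER_API_KEY name is an assumption based on a typical .env.example, so adjust it to whatever your file uses:)

```ts
// env-check.ts — run with: npx tsx env-check.ts
import 'dotenv/config'; // loads .env from the project root

const key = process.env.OPEN_ROUTER_API_KEY; // variable name assumed from .env.example
console.log('OPEN_ROUTER_API_KEY set:', Boolean(key));
console.log('length:', key?.length ?? 0); // 0 means the .env rename/loading failed
```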

Even when I use a model like qwen/qwq-32b from OpenRouter, it gets stuck right at the start with a long error message:

[screenshot]

Note - I have not tested all the models and providers yet, but most of the ones I have tested simply do not work. A few, like qwen qwq 32b and qwen qwq 32b (free), work halfway and then get stuck most of the time. I managed to get qwen qwq 32b to work once.

[screenshot]
As you can see here, this model and provider combination somewhat works (it still got stuck).

arhanjain11 (Author) commented Mar 11, 2025

I just noticed something...

When I run pnpm run dev for the first time after cloning bolt.diy, localhost doesn't load (blank screen on localhost:5173) and I get these yellow/orange messages in my terminal (see images below). After I stop it and run pnpm run dev a second time, those yellow/orange messages disappear for good and localhost loads the bolt.diy screen normally (the model problem described in the previous comment is obviously still there):
[screenshots]

Note: the unocss "failed to load icon" messages and the "indexedDB is not available in this environment" messages appear every time:
[screenshot]
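(For what it's worth, the indexedDB warning usually just means some client-side persistence code ran during server-side rendering, where the indexedDB global doesn't exist. A sketch of the kind of guard that produces exactly this message, hypothetical rather than bolt.diy's actual code:)

```ts
// A typical browser-only guard; hypothetical, not bolt.diy's actual implementation.
function openDatabase(name: string): Promise<IDBDatabase> | undefined {
  // During SSR there is no `indexedDB` global, so bail out instead of throwing.
  if (typeof indexedDB === 'undefined') {
    console.warn('indexedDB is not available in this environment');
    return undefined;
  }
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(name, 1);
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}
```

If that's what is happening here, the warning is harmless on the server and the database still opens normally in the browser.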
