
Chat Request failing on Imported Folder/ Cloned Repo #1062

Closed
tgey opened this issue Jan 10, 2025 · 11 comments
tgey commented Jan 10, 2025

Describe the bug

Unable to get a response from a chat request for imported folders or cloned repositories. Bolt loads for a while (5-10 seconds) before timing out with the following message:

There was an error processing your request: An error occurred.

The web console is mostly clean, except for one warning on 'https://w-corp-staticblitz.com/fetch.worker.1b4252dd.js': preloaded using link preload but not used within a few seconds from the window's load event.

Link to the Bolt URL that caused the error

http://localhost:5173/chat/1

Steps to reproduce

  1. Import Folder / Clone repo (private or public)
  2. Start a chat

Expected behavior

The expected behavior should be a chat response.

Screen Recording / Screenshot

[screenshot]

[screenshot]

Platform

Windows 11 - Google Chrome Canary
Windows 11 - Firefox
Ubuntu 20.04 - Google Chrome Canary
Ubuntu 20.04 - Firefox

Provider Used

Google / OpenAI / Deepseek

Model Used

Gemini 2.0 Flash / GPT-4o / GPT -4o-mini / Deepseek-Chat

Additional context

These models are working on a blank chat. The problem seems to be recurrent for chat with imported data.

@dorianbodnariuc

I get the same error on a regular basis with different providers:
"There was an error processing your request: An error occurred"
Changing the provider sometimes fixes the error, but it's hit and miss.
Sometimes changing the model on the same provider fixes it (different models on OpenRouter).
Could this be because of the chat's length?
If yes, what would be some ways to mitigate it?
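One hedged way to mitigate chat length (a sketch only; the `Message` shape and the character budget are illustrative assumptions, not bolt.diy's actual API) is to trim the oldest messages to a rough size budget before sending the request to the provider:

```typescript
// Sketch: keep only the most recent messages that fit a rough character
// budget, so long chats do not overflow the model's context window.
// The Message shape and the 24_000-character default are assumptions.
interface Message {
  role: string;
  content: string;
}

function trimToBudget(messages: Message[], budgetChars = 24_000): Message[] {
  const kept: Message[] = [];
  let used = 0;
  // Walk from newest to oldest so the most recent context survives.
  for (let i = messages.length - 1; i >= 0; i--) {
    const len = messages[i].content.length;
    if (used + len > budgetChars) break;
    kept.unshift(messages[i]);
    used += len;
  }
  return kept;
}
```

A character budget is only a crude proxy for tokens; a real implementation would count tokens with the provider's tokenizer, but the trimming logic stays the same.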

@thecodacus
Collaborator

I believe the project is too big to load into context all at once.

@yaniboum

I have the same errors

@yaniboum

I believe the project is too big to load into context all at once.

What's the solution, then?

Author

tgey commented Jan 11, 2025

I cloned a project built on bolt.new, so I believe the repo size is not the main problem. If it is, could we add some logging or more explicit alerting?

@dadebulba

Hey there, I have the same problem. I imported a Nuxt project with Vuetify, created from scratch and without too many files, and now the chat keeps saying "There was an error processing your request: An error occurred". I'm using Claude 3.5 Sonnet as the LLM.

@thecodacus
Collaborator

I cloned a project built on bolt.new, so I believe the repo size is not the main problem. If it is, could we add some logging or more explicit alerting?

can you give me the repo link you are trying to import ?

@dadebulba

Hi @thecodacus, in my case these are the steps to reproduce the error:

@radiovisual

The same happened for me. The steps to reproduce the error in my case:

  1. Create a project on bolt.new. In my case I created a sample app with the prompt "Create a Slack-inspired chat application in Express and React".
  2. Export the project files from bolt.new as a .zip archive
  3. Set up bolt.diy (I used the git clone option) and start bolt.diy via pnpm run dev
  4. Unzip the project created on bolt.new and import it into bolt.diy
  5. Start the chat with your imported project; the project should be running and loaded into the preview browser
  6. Type a new prompt to edit the code; the error then appears: "There was an error processing your request: An error occurred."

@thecodacus
Collaborator

The same happened for me. The steps to reproduce the error in my case:

  1. Create a project on bolt.new. In my case I created a sample app with the prompt "Create a Slack-inspired chat application in Express and React".
  2. Export the project files from bolt.new as a .zip archive
  3. Set up bolt.diy (I used the git clone option) and start bolt.diy via pnpm run dev
  4. Unzip the project created on bolt.new and import it into bolt.diy
  5. Start the chat with your imported project; the project should be running and loaded into the preview browser
  6. Type a new prompt to edit the code; the error then appears: "There was an error processing your request: An error occurred."

I tried to replicate the issue.
It turns out the folder import option is not excluding the package-lock.json file, and it's too big for the LLM to put into the context.

The workaround is to delete the package-lock.json file before uploading the folder.
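The exclusion described above could also be automated on the import path. A minimal sketch, assuming nothing about bolt.diy's actual code (the `IGNORE_PATTERNS` list and the function name are illustrative):

```typescript
// Sketch: filter out lock files and other large generated artifacts from an
// imported folder before its contents are placed into the LLM context.
// The pattern list is an assumption, not bolt.diy's real ignore config.
const IGNORE_PATTERNS: RegExp[] = [
  /(^|\/)package-lock\.json$/,
  /(^|\/)yarn\.lock$/,
  /(^|\/)pnpm-lock\.yaml$/,
  /(^|\/)node_modules\//,
  /(^|\/)\.git\//,
];

function filterImportedFiles(paths: string[]): string[] {
  // Keep only paths that match none of the ignore patterns.
  return paths.filter((p) => !IGNORE_PATTERNS.some((re) => re.test(p)));
}
```

For example, `filterImportedFiles(["src/app.ts", "package-lock.json"])` would keep only `src/app.ts`, so the user never has to delete the lock file by hand.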

@thecodacus pinned this issue Jan 12, 2025
@radiovisual

I tried to replicate the issue. It turns out the folder import option is not excluding the package-lock.json file, and it's too big for the LLM to put into the context.

The workaround is to delete the package-lock.json file before uploading the folder.

I can confirm that this workaround solved the issue for me. Thanks @thecodacus!
