Describe the bug
At some point in a multi-turn conversation Bolt stops editing files or running commands.
The LLM continues to output artifacts the same way but Bolt stops processing them and taking action.
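For context, bolt.diy turns model output into file writes and shell commands by extracting `<boltArtifact>`/`<boltAction>` blocks from the response. The real implementation is a streaming parser; the regex sketch below is only an illustration of the failure mode reported here: if matching stops working for a response, the extracted action list is empty and Bolt does nothing, even though the text still renders in the chat.

```python
import re

# Simplified, non-streaming sketch of boltAction extraction (NOT the actual
# bolt.diy parser, which is a streaming state machine). It illustrates how a
# response can display fine as text yet yield zero actions if the tags are
# never matched.
ACTION_RE = re.compile(
    r'<boltAction\s+type="(?P<type>\w+)"(?:\s+filePath="(?P<path>[^"]*)")?\s*>'
    r'(?P<content>.*?)</boltAction>',
    re.DOTALL,
)

def extract_actions(response: str):
    """Return the list of actions found in a model response."""
    return [
        {"type": m["type"], "filePath": m["path"], "content": m["content"].strip()}
        for m in ACTION_RE.finditer(response)
    ]

ok = ('<boltArtifact id="demo" title="Demo">'
      '<boltAction type="file" filePath="index.js">console.log(1)</boltAction>'
      '</boltArtifact>')
print(len(extract_actions(ok)))                       # 1 action found
print(len(extract_actions("plain prose, no tags")))   # 0 -> Bolt does nothing
```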
Link to the Bolt URL that caused the error
http://localhost:5173/chat/3
Steps to reproduce
Start a new Bolt chat.
Keep revising the code for ~20-30+ turns.
At some point Bolt stops taking action on the code.
Exact dialog used is attached.
chat-2025-03-11T23_50_37.061Z.json
Expected behavior
Bolt keeps making changes to files and executing terminal commands after every response from the LLM.
Screen Recording / Screenshot
Screenshots showing working normally (checklist of actions taken) which at some point in the conversation is replaced by output showing no actions taken.
Platform
OS: Arch Linux
Browser: Chrome Canary 136.0.7054.0 (Official Build) canary (64-bit)
Bolt Version: 0.7
Bolt Install Method: Git / pnpm start
Provider Used
Ollama
Model Used
qwq:32b-q8_0 / qwen2.5-coder:32b-instruct-q6k / qwen2.5-coder:32b
Additional context
Starting a new project with any of these models works as expected: the models return correctly formatted artifacts, and Bolt takes action to create / modify files and run terminal commands.
Forking the long conversation at the last point where Bolt took action still results in Bolt taking no action (no files updated, no terminal commands run).
AFAIK, Ollama silently discards context that exceeds the model's configured context window, which presumably degrades responses as the project grows past that window. The output format should remain the same, though, so it is unclear why Bolt stops applying the changes at some point.
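As a possible workaround for the silent truncation (a sketch, not a confirmed fix for this bug): Ollama's default context window can be raised per model with a Modelfile `PARAMETER num_ctx` setting, so long Bolt conversations fit without being discarded. The model name and `num_ctx` value below are examples; pick what your VRAM allows.

```
FROM qwen2.5-coder:32b
PARAMETER num_ctx 32768
```

Build the variant with `ollama create qwen2.5-coder-32k -f Modelfile` and point Bolt's Ollama provider at the new model name.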
Additional discussion on this issue can be found here: https://thinktank.ottomator.ai/t/bolt-diy-not-modifying-files/5031/35