Running self-hosted refact.ai without GPU? #712
gitwittidbit asked this question in Q&A (unanswered)
So, I have set up my own local refact.ai instance in a GPU-enabled Docker container. I already have a local Ollama instance with various coding-capable LLMs, so I don't want to download those models again into refact.ai. I have also managed to connect refact.ai to Ollama, so that refact.ai uses Ollama to answer questions.
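For reference, the GPU-enabled setup looks roughly like this (a sketch based on my reading of the project's self-hosting docs; the volume name is arbitrary, 11434 is Ollama's default API port, and the model name in the Ollama check is just an example):

```bash
# Self-hosted refact.ai with GPU access (image name and port 8008 as per
# the self-hosting docs; the volume name is arbitrary)
docker run -d --rm --gpus all \
  -p 8008:8008 \
  -v refact-perm-storage:/perm_storage \
  smallcloud/refact_self_hosting

# Sanity-check the local Ollama instance that refact.ai is pointed at
# (11434 is Ollama's default port; the model name is only a placeholder)
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen2.5-coder", "prompt": "fn main() {", "stream": false}'
```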
My question is: is it also possible to run refact.ai in a container without GPU support? (I want to run it on a different server that doesn't have a GPU.) I have actually tried, but it just keeps complaining that there is no GPU. Is there maybe a switch somewhere to tell refact.ai that it shouldn't look for a GPU?
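This is roughly the CPU-only variant I tried on the server without a GPU (same image as above, just without `--gpus all`; I haven't found any documented flag or environment variable to disable the GPU check, which is why I'm asking):

```bash
# Same container on the GPU-less server: simply omit --gpus all.
# The container starts, but the server keeps complaining that no GPU
# is available.
docker run -d --rm \
  -p 8008:8008 \
  -v refact-perm-storage:/perm_storage \
  smallcloud/refact_self_hosting
```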