I'm new to Docker, but I expect many others will also already have Ollama running in a separate Docker container.
Could you please provide instructions on how to have this container interact with a separate Ollama container?
Separate 'Open Source Swarm' instructions may be needed, since the docker run -it -v ./:/app --rm -p 7860:7860 -e OPENAI_API_KEY=<YourOpenAIKey> vrsen/agency-swarm
command seems somewhat redundant when OPENAI_API_KEY is not being used.
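One possible approach is a user-defined Docker bridge network shared by both containers. The sketch below is an assumption, not a confirmed recipe: it assumes the Ollama container is named `ollama`, that it serves an OpenAI-compatible API at `/v1` on port 11434 (which recent Ollama versions do), and that agency-swarm honors an `OPENAI_BASE_URL` override — that last point in particular would need confirming by the maintainers.

```shell
# Hypothetical sketch: join both containers to a shared bridge network
# so agency-swarm can reach Ollama by its container name.
docker network create swarm-net

# Attach the already-running Ollama container (assumed name: "ollama")
docker network connect swarm-net ollama

# Run agency-swarm on the same network. OPENAI_BASE_URL pointing at
# Ollama's OpenAI-compatible endpoint is an assumption; the API key is
# a dummy value since Ollama does not check it.
docker run -it -v ./:/app --rm -p 7860:7860 \
  --network swarm-net \
  -e OPENAI_BASE_URL=http://ollama:11434/v1 \
  -e OPENAI_API_KEY=dummy \
  vrsen/agency-swarm
```

If agency-swarm does not read a base-URL variable, an OpenAI-compatible proxy (e.g. LiteLLM) in between would be the fallback.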
Thanks, and thanks for your enthusiasm ;)
kartguru changed the title from "Ollama running in separate Docker Container" to "Open Source Swarm: Ollama running in separate Docker Container" on May 30, 2024.
My efforts to run and interact using Docker networks haven't been fruitful. However, while trying to run LiteLLM, I noticed that my separate Open WebUI container appears to offer a way to manage this under Settings > Models > Manage LiteLLM Models. Can the Open WebUI container be used instead?
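For comparison, a standalone LiteLLM proxy pointed at the Ollama container might look like the sketch below. The network name, container hostname, and model name are all assumptions, and whether agency-swarm can then be pointed at the proxy's port is an open question for the maintainers.

```shell
# Hypothetical sketch: run a LiteLLM proxy on a user-defined network
# that the Ollama container has also joined (assumed name: "my-net"),
# translating OpenAI-style requests into calls to the Ollama API.
docker run --rm -p 4000:4000 --network my-net \
  ghcr.io/berriai/litellm:main-latest \
  --model ollama/llama3 \
  --api_base http://ollama:11434
```

Open WebUI's "Manage LiteLLM Models" panel manages Open WebUI's own bundled proxy, so whether it can serve a second client like agency-swarm likely depends on exposing that proxy's port on the shared network.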