Releases: av/harbor
v0.3.7 - Tools
In this release, Harbor gets access to the MCP and OpenAPI tools ecosystems with the following new services:
# Bring your MCP tools to Open WebUI
harbor up mcpo metamcp
More details in the Harbor Tools guide
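As a quick sanity check, here's a sketch assuming mcpo keeps its default FastAPI behaviour and that harbor url resolves the service address the same way it does for other services:
# Print the local URL Harbor assigned to mcpo
harbor url mcpo
# The proxied MCP tools should then expose interactive OpenAPI docs under /docs at that address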
Additionally, I'm launching av/tools to simplify containerized tool use in general.
Full Changelog: v0.3.6...v0.3.7
v0.3.6 - LibreTranslate
Free and Open Source Machine Translation API, entirely self-hosted.
# [Optional] pre-pull the image
harbor pull libretranslate
# Start the service
harbor up libretranslate
Pulls translation models on the first start; this may take a while. Monitor progress with
harbor logs libretranslate
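Once the models are in place, LibreTranslate exposes its standard HTTP API. A minimal sketch, assuming harbor url resolves the service address:
# Translate a phrase via LibreTranslate's /translate endpoint
curl -s "$(harbor url libretranslate)/translate" \
  -H "Content-Type: application/json" \
  -d '{"q": "Hello, world!", "source": "en", "target": "es", "format": "text"}'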
Misc
app
- bundle docs in GH action
- attempt to fix #64
boost
- basic support for proxying tool calls
- logging revamp
- request ID tracking
n8n
- fix typo in the docs preventing initial workflow import
Full Changelog: v0.3.5...v0.3.6
v0.3.5 - in-app docs
Harbor App now has dedicated service pages. There are plenty of plans for them, but we're starting simple: you can perform the same actions as on the Home page, plus view the service wiki right in the app.
Misc
- harbor dev app helper
- App - bumping Tauri deps
Full Changelog: v0.3.4...v0.3.5
v0.3.4 - llama-swap
llama-swap
llama-swap is a lightweight, transparent proxy server that provides automatic model swapping to llama.cpp's server.
# [Optional] pre-pull the image
harbor pull llamaswap
# Edit the swap config
open $(harbor home)/llamaswap/config.yaml
# Run the service
harbor up llamaswap
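For reference, here's a minimal sketch of a swap config, assuming llama-swap's upstream models/cmd/proxy layout; the model name, file path, and port below are placeholders to adjust for your setup:
# Write a minimal example config (placeholders - adjust before running)
cat > "$(harbor home)/llamaswap/config.yaml" <<'EOF'
models:
  "llama3":
    cmd: llama-server --port 9091 -m /models/llama3.gguf
    proxy: http://127.0.0.1:9091
EOF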
Misc
boost
- docs revamp
- fixing plain proxy without modules
tgi
- HF cache normalised
raglite
- adding missing traefik config
Full Changelog: v0.3.3...v0.3.4
v0.3.3 - oterm, RAGLite
oterm
the text-based terminal client for Ollama.
harbor up ollama
harbor oterm
RAGLite
⚠️ Unfortunately, the current integration is not fully compatible with Ollama. See the docs for more info and this issue for details.
RAGLite is a Python toolkit for Retrieval-Augmented Generation (RAG) with PostgreSQL or SQLite. Harbor's integration is centered around the provided Chainlit WebUI.
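The usual flow applies; a sketch, assuming the service handle is raglite:
# Start the service (Harbor exposes the Chainlit UI)
harbor up raglite
# Open the UI in the browser
harbor open raglite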
ROCm Ollama
Automatic AMD GPU detection and ROCm capability support by @cedstrom in #143
DND Module for Boost
(demo video: dnd-skill-check.mp4)
This module makes the LLM pass a skill check before generating a reply to your last message.
App - service names, tooltips
App now displays actual Service names and tooltips with service info. Huge thanks to @cedstrom for the inspiration 🙌
Misc
boost
- There's now a starter repo for standalone use of Harbor Boost
- Removed incompatible num_ctx from default LLM params
- Fixed streaming for incomplete/merged chunks (seen in Azure/Groq APIs)
n8n
- fixed persistence for custom module installation (broken workspace path)
docs
- service index now generated from app service metadata (now has tags)
ollama
- configure default ctx length with harbor ctx
webtop
- restored Harbor App functionality
New Contributors
Full Changelog: v0.3.2...v0.3.3
v0.3.2
This is a very minor bugfix release
boost
- now correctly handles incomplete chunks from downstream APIs by attempting buffering and then parsing (tested with Groq API)
Full Changelog: v0.3.1...v0.3.2
v0.3.1
This is a maintenance release with a few fixes, nothing exciting
harbor dev docs
- fixes relative URLs so that Boost README links now finally work
README
- revamp, supporters
boost
- fixed mismatch between docs and actual env vars
r0
- workflow for R1-like reasoning chains for any LLM (including older ones, like Llama 2)
markov
- Open WebUI-only, serves an artifact showing a token graph for the current completion
docs
- numerous tweaks and adjustments
n8n
- fixed missing EOF preventing harbor env n8n from working as expected
txtai
- restored functionality with a monkey patch until this PR is merged
Full Changelog: v0.3.0...v0.3.1
v0.3.0 - Routines, Traefik, Latent Scope
Routines
We now have more than 200 compose files, and docker compose itself is slow at merging them (upwards of ~5s even on my powerful dev machine). To overcome this, we've moved the core logic that powers the (now legacy) version of the CLI into dedicated routines. v0.3.0 is also a step towards having a native Harbor CLI in the future.
The new routines setup is based on the distroless flavor of Deno. In typical Harbor fashion, you don't need to install anything; you only pay the disk-space tax (~150 MB in this instance). Harbor caches the dependencies and everything else it needs after the first cold start. The PyPI, native, and NPM installation paths will continue to function as before. Deno was chosen over Bun, Node.js, and Python because it brings the most value within those 150 MB, with a path towards native binaries in the future, and Harbor was already using Deno for some lightweight automation. I also experimented with Rust and Go; despite the better end binaries, the cost of creating and maintaining a CLI there is much higher, so choosing Deno means more time to update and improve the project.
You can return to the legacy behavior by setting a config option:
harbor config set legacy_cli true
Traefik
Harbor now includes traefik as its reverse HTTP proxy. Automatic configuration is limited to local deployments; however, it can be reconfigured manually as needed - see the service wiki for more details.
Latent Scope
One of those tools that leave you with a "woah": it allows exploring a given dataset's representation in latent space.
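Getting it running should follow the usual pattern; a sketch, assuming the service handle is latentscope:
# Start Latent Scope and open it in the browser
harbor up latentscope
harbor open latentscope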
Misc
perplexideez
- fixed usage without access to modern compose
release
- script seeds programmatic files
seed-cdi
- EOF fixes
seed-traefik
- dev script to create .x.traefik. cross-files for related services
boost
- revised r0 module
qrgen
- fixed build (affects harbor qr, harbor tunnel)
New Contributors
- @heronsouzamarques made their first contribution in #134 💪
- @Tien-Cheng helped restore the qrgen functionality in #137
Full Changelog: v0.2.28...v0.3.0
v0.2.28
This is mostly a maintenance release with a small new frontend and an exciting new feature for Harbor Boost.
Mikupad
LLM frontend in a single HTML file
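As with the other frontends, a sketch assuming the standard service handle:
# Start Mikupad and open it in the browser
harbor up mikupad
harbor open mikupad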
Misc
boost
- support for interactive artifacts (specifically for Open WebUI),
- dedicated README
- multiple new experimental modules (undocumented, see source)
- relaxing CORS policy
- cosmetic fixes to dev scripts
librechat
- fixing after MeiliSearch update to v1.12.3
promptfoo
- access to Harbor's configured API keys
- example eval based on Misguided Attention (unfinished)
webui
- now sees local time correctly on supported systems
harbor logs, harbor down and other commands relying on "*"
- fixed incorrect detection of CDI capability
New Contributors
- @ColumbusAI made their first contribution in #131 🙌
- @kianmeng made their first contribution in #124 💪
Full Changelog: v0.2.27...v0.2.28
v0.2.27 - Morphic, SQL Chat, gptme, Kokoro v1
Three new services are here to add even more value to your local LLM setup!
Morphic
An AI-powered search engine with a generative UI.
SQL Chat
SQL Chat is a chat-based SQL client that uses natural language to perform database operations such as querying, modifying, adding, and deleting data.
gptme
A terminal assistant with tools, so it can use the shell, run code, edit files, and much more.
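All of them follow the usual Harbor flow; a sketch, assuming the standard service handles morphic and sqlchat (gptme is a terminal client, so it is used interactively, much like oterm):
# Start the new web services
harbor up morphic sqlchat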
Speaches - now supports Kokoro v1
Harbor patched its installation of speaches to support Kokoro v1 models (ahead of official support from the project itself)
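For a quick test of the new voices, here's a sketch assuming speaches' OpenAI-compatible speech endpoint; the model and voice identifiers are placeholders - check the speaches service docs for the exact Kokoro v1 names:
# Request speech synthesis (model/voice are placeholders)
curl -s "$(harbor url speaches)/v1/audio/speech" \
  -H "Content-Type: application/json" \
  -d '{"model": "<kokoro-v1-model>", "voice": "<voice-id>", "input": "Hello from Harbor"}' \
  -o kokoro.mp3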
Misc
- CDI - Harbor detects and enables CDI Nvidia driver on compatible systems (kudos to @FrantaNautilus)
harbor dev
- alias to run dev-related scripts from .scripts
harbor dev scaffold
- unwanted prefix newline
boost
- extras for chat/chat node APIs for custom modules
cex
- experiment on automatic context expansion by paraphrasing
stcl
- continued experiments on "side" reasoning
- Plenty of clarifications and extra examples for Ollama/WebUI docs and more
- Experimental requirements.sh to install Harbor's dependencies automatically on Linux (undocumented, untested)
New Contributors
- @FrantaNautilus made their first contribution in #119 🎉
Full Changelog: v0.2.26...v0.2.27