v0.2.20 - OptiLLM, Langfuse v3
OptiLLM
optillm is an optimising LLM proxy, similar to Harbor Boost, with a lot of advanced reasoning and planning workflows.
# Will build and start the service
# [--tail] is optional to automatically follow service logs after start
harbor up optillm --tail
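Once the service is running, optillm exposes an OpenAI-compatible API, and the optimisation workflow is chosen by prefixing the model name with an approach slug (for example "moa-" for Mixture of Agents). A minimal sketch, assuming an Ollama-served llama3.1:8b model; the endpoint is resolved with harbor url:
# Resolve the URL Harbor exposes for the optillm proxy
OPTILLM_URL=$(harbor url optillm)
# The "moa-" prefix selects the approach; the model itself is an assumption,
# use one that your backend actually serves
curl "$OPTILLM_URL/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "moa-llama3.1:8b",
    "messages": [{ "role": "user", "content": "Which is larger, 9.9 or 9.11?" }]
  }'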
optillm is connected to all inference backends in Harbor out of the box (although not all of them have been tested). See the compatibility guide for making it work with Open WebUI.
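As a rough starting point, assuming the default ollama backend, you can bring the backend, the proxy, and Open WebUI up together and tail the proxy logs to confirm it reaches the backend:
# ollama is an assumption - swap in whichever backend you run
harbor up ollama optillm webui
# Follow the proxy logs to verify the connection to the backend
harbor logs optillm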
Misc
langfuse was updated to v3.
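If you already run the service, restarting it should bring up the v3 stack; a sketch using the stock Harbor commands:
# (Re)start the service on the updated version
harbor up langfuse
# Open the Langfuse UI in the browser
harbor open langfuse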