Does LocalAI support Chat Templates for completions? #1551
-
I have been using llama-cpp-python, but I am very interested in being able to use bert.cpp for embeddings in a local OpenAI-compatible endpoint. llama-cpp-python supports chat templates, and I submitted a PR to add universal HuggingFace chat templating there, so that swapping models transparently handles the EOS token and the chat template supplied by the model, and I don't need to modify my completion payloads every time I try a new model. Does LocalAI support chat templates for chat completion? The PR: abetlen/llama-cpp-python#790
-
LocalAI has quite an advanced templating system, so yes. It is documented here: https://localai.io/advanced/#prompt-templates. You can also find more advanced examples in my upcoming PR here: https://github.com/mudler/LocalAI/pull/1532/files#diff-a1b3f4dafaeae34ff87ab6fa77d2eaed8604a5cafdd9b708f5335a88805b5c1a
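For context, a minimal sketch of how this looks in practice (names and token markers here are illustrative, not taken from the thread): LocalAI model configs are YAML files that point at Go-template files for the chat prompt, so the template travels with the model config rather than with the request payload.

```yaml
# model config (e.g. my-model.yaml) -- hypothetical model and file names
name: my-model
parameters:
  model: my-model.gguf
template:
  # names of .tmpl files next to the model, without the extension
  chat: chat
  chat_message: chat_message
```

The referenced `chat_message.tmpl` would then render each message with Go template syntax, for example (assuming a ChatML-style model):

```
<|im_start|>{{.RoleName}}
{{.Content}}<|im_end|>
```

Because the template lives server-side, clients keep sending plain OpenAI-style `messages` arrays and LocalAI applies the model's prompt format for them.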