Replies: 1 comment
- Oh wait, maybe that's on by default.
- Does llama-server support prompt caching between requests, similar to llama-cli's prompt file? I have a use case where the prefix stays the same between requests and only a few characters change at the end.
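For context, llama.cpp's HTTP server accepts a `cache_prompt` field in the `/completion` request body, which asks the server to keep the evaluated prompt in the KV cache so a follow-up request only re-evaluates the changed suffix. A minimal sketch of building such a request body (the helper name and the assumption that `cache_prompt` defaults to on in recent builds are mine, not from the thread):

```python
import json

# Hypothetical helper: build a /completion request body for llama-server.
# With "cache_prompt": true the server tries to reuse the KV-cache state
# of the longest matching prompt prefix from the previous request, so only
# the differing tail needs to be re-evaluated.
def completion_body(prefix: str, suffix: str, n_predict: int = 32) -> str:
    return json.dumps({
        "prompt": prefix + suffix,       # shared prefix + small varying tail
        "n_predict": n_predict,
        "cache_prompt": True,            # reuse the matching prompt prefix
    })

# Two requests sharing the same long prefix; only the suffix differs.
first = completion_body("A long, fixed instruction prefix. ", "Query A")
second = completion_body("A long, fixed instruction prefix. ", "Query B")
```

The body would then be POSTed to the running server (e.g. `curl http://localhost:8080/completion -d "$BODY"`); whether caching actually kicks in depends on the server keeping the same slot between requests.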