
Conversation

@nhamanasu
Contributor

Added scripts for chatting with llama2 using llama.cpp in server mode.

  • examples/server-llama2-13B.sh
    • Starts the server with the llama2 13B model (see the server-start sketch after this list).
  • examples/server/chat-llama2.sh
    • Chats with llama2 while the server is running (see the chat-loop sketch below). Changes from chat.sh:
      • Added support for the llama2-chat prompt format.
      • Fixed an issue where newlines (\n) in llama2 output were not displayed correctly in the terminal.
      • Changed user input to appear in green (like ./main).
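For reference, a minimal sketch of what a server-start script along the lines of examples/server-llama2-13B.sh could look like. The model filename, thread count, and context size below are placeholder assumptions, not the exact values from this PR; the flags themselves (--model, --threads, --ctx-size, --host, --port) are standard options of the llama.cpp server example.

```sh
#!/bin/bash
# Sketch: launch the llama.cpp server with a llama2 13B chat model.
# MODEL path and tuning values are placeholders, not the PR's exact settings.

cd "$(dirname "$0")/.." || exit 1

MODEL="${MODEL:-./models/llama-2-13b-chat.ggmlv3.q4_0.bin}"  # assumed model filename
N_THREAD="${N_THREAD:-8}"

./server \
    --model "$MODEL" \
    --threads "$N_THREAD" \
    --ctx-size 4096 \
    --host 127.0.0.1 \
    --port 8080
```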
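And a rough, non-streaming sketch of the chat loop illustrating the three changes listed above: the llama2-chat prompt format ([INST] ... [/INST] with a <<SYS>> system block), printing the response so escaped newlines render, and showing user input in green like ./main. The helper name, payload fields, and jq-based parsing here are illustrative assumptions, not the PR's exact code; the server's /completion endpoint returning a "content" field matches the server example of that time.

```sh
#!/bin/bash
# Sketch of a chat loop against a running llama.cpp server; details are illustrative.

SERVER="${SERVER:-http://127.0.0.1:8080}"
SYSTEM="You are a helpful assistant."

# llama2-chat prompt format: system prompt wrapped in <<SYS>>, user turn in [INST] ... [/INST]
format_prompt() {
    printf '[INST] <<SYS>>\n%s\n<</SYS>>\n\n%s [/INST]' "$SYSTEM" "$1"
}

while true; do
    # Show the user's input in green, like ./main (ANSI escape codes)
    printf '\033[0;32m> \033[0m'
    read -r -e USER_INPUT || break

    PROMPT="$(format_prompt "$USER_INPUT")"

    # Query the server's /completion endpoint and extract the generated text
    ANSWER="$(curl -s "$SERVER/completion" \
        -H 'Content-Type: application/json' \
        --data "$(jq -n --arg p "$PROMPT" '{prompt: $p, n_predict: 256}')" \
        | jq -r '.content')"

    # Print with %b so escaped \n sequences render as real newlines
    # (one possible way to address the newline display issue)
    printf '%b\n' "$ANSWER"
done
```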

@ggerganov ggerganov merged commit 34ae1ca into ggml-org:master Jul 28, 2023
@nhamanasu nhamanasu deleted the feature/llama2-server-chat branch July 28, 2023 18:26
