UPSTREAM PR #18053: CLI: llama-cli and llama-completion cosmetics#608

Open
loci-dev wants to merge 1 commit into main from
upstream-PR18053-branch_andrew-aladev-feature/llama-cli-and-completion-cosmetics
Conversation

@loci-dev
Mirrored from ggml-org/llama.cpp#18053

Hello, I've just finished what I think is the last PR regarding the separation of llama-cli and llama-completion in docs and scripts. Here I tried to fix the places where the choice between llama-cli and llama-completion is entirely obvious, so the updates are safe to make.

  1. Added llama-completion to .dockerignore.
  2. Added llama-completion to the GitHub issue template.
  3. Fixed the docs in the Copilot instructions.
  4. Fixed the bash-completion write-up in the README (because `>` implies non-interactive mode, llama-cli cannot be used there for now).
  5. Used llama-completion in docs/backend/SYCL (the `llm_load_tensors: ...` lines are output of llama-completion).
  6. Added llama-completion to docs/backend/hexagon/README.
  7. Used llama-completion in docs/backend/hexagon/developer (the `llama_model_loader: ...` lines are output of llama-completion).
  8. Used llama-completion in scripts/fetch_server_test_models.py, because it runs in non-interactive mode.
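Item 4 rests on a standard POSIX fact: redirecting stdout with `>` replaces the terminal on file descriptor 1 with a regular file, which interactive tools detect with a tty check. A minimal sketch of that check (illustrative shell only, not llama.cpp code):

```shell
#!/bin/sh
# Sketch: how a CLI can notice that `>` put it in non-interactive mode.
# `[ -t 1 ]` asks whether file descriptor 1 (stdout) is a terminal.
mode() {
  if [ -t 1 ]; then
    echo "interactive"
  else
    echo "non-interactive"
  fi
}

mode > /tmp/mode.txt   # stdout is now a file, so the tty test fails
cat /tmp/mode.txt
```

This is why writing a generated completion script with `> completions.bash` forces the non-interactive path, making llama-completion the appropriate binary there.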

PS: I found other places where I couldn't tell whether llama-cli or llama-completion should be used; I think the maintainers of those docs can update them later.

Related PRs: #17993, #18003

@loci-dev force-pushed the main branch 29 times, most recently from 7ceec3c to c8dcfe6 on December 21, 2025, 10:08
@loci-dev force-pushed the main branch 30 times, most recently from 799071d to dba3ea5 on December 27, 2025, 20:09