Misc. bug: test-chat fails on x86 windows builds but works everywhere else #14112

@max-krasnyansky

Description
Name and Version

maxk\src\llama.cpp\build-win> .\bin\llama-cli.exe --version
register_backend: registered backend CPU (1 devices)
register_device: registered device CPU (AMD EPYC 7763 64-Core Processor )
version: 5629 (d1660c8)
built with MSVC 19.42.34433.0 for Windows AMD64

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

Test code

Command line

bin\test-chat.exe

Problem description & steps to reproduce

macOS and Linux builds run test-chat without issue.
A clean Windows x86-64 build (with BUILD_SHARED_LIBS=OFF) aborts in test-chat on mistralai-Mistral-Nemo-Instruct-2407.jinja and some other templates.
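For reference, the failing build was a plain static configuration. A minimal repro sketch, assuming the usual llama.cpp CMake flow with MSVC (exact output paths may differ between generators):

```shell
# Clean static (non-shared) Windows x86-64 build, then run the failing test.
cmake -B build-win -DBUILD_SHARED_LIBS=OFF
cmake --build build-win --config Release

# From the build directory, as in the version output above:
cd build-win
bin\test-chat.exe
```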

@slaren, it looks like yours was the last commit; could you check this on your setup?

First Bad Commit

No response

Relevant log output

See output here.
https://github.com/ggml-org/llama.cpp/actions/runs/15570861515/job/43846147913?pr=14003