
Misc. bug: The model can no longer continue answering when switching to another message (for example, to read it). #16133

@DigitalKO

Description


Name and Version

$./llama-server --version
version: 6503 (62c3b64)
built with cc (Debian 14.2.0-19) 14.2.0 for x86_64-linux-gnu

Up to version 6295 (86076f9), the model could continue answering while switching to another message (for example, to read it).

However, with this version (6503), leaving the message while the model is answering interrupts it and stops the response abruptly. It is no longer possible to switch back to the message being answered after leaving it to read older messages.

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-server

Command line

Problem description & steps to reproduce

Let the model answer, then switch to another message (for example, to read it) while it is still responding. A minimal reproduction sketch follows below.
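A minimal sketch for reproducing, assuming the built-in web UI is used and a local GGUF model is available; the model path and port below are placeholders, not taken from this report:

$ ./llama-server -m ./model.gguf --port 8080
# Open http://localhost:8080 in a browser and start a chat.
# While the model is streaming its reply, click an older message to read it.
# Up to version 6295 (86076f9) the reply kept streaming in the background;
# with version 6503 (62c3b64) the response stops abruptly.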

First Bad Commit

No response

Relevant log output
