Bug: llama-server web UI resets the text selection during inference on every token update #9608

@mashdragon

Description

What happened?

When using llama-server, the output in the web UI can't be selected or copied until text generation finishes. This appears to be because the script replaces all of the DOM nodes for the current generation on every new token, which resets any active text selection.

Ideally, the existing text content shouldn't be replaced during generation, so that the text can be selected and copied while output is still being produced.
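
For illustration, a minimal sketch of the two rendering strategies; the element id and function names here are hypothetical, not taken from the llama-server web UI source:

```ts
// Hypothetical container for the streamed completion (not the actual
// llama-server web UI code).
const output = document.getElementById("completion")!;

// Pattern the issue describes: re-rendering the whole message on every
// token. Replacing the text node discards any selection the user had made.
function renderByReplacing(fullText: string): void {
  output.textContent = fullText; // swaps out the existing node each time
}

// Selection-friendly alternative: append only the newly received token,
// leaving already-rendered nodes (and any selection inside them) intact.
function renderByAppending(tokenDelta: string): void {
  output.appendChild(document.createTextNode(tokenDelta));
}
```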

Name and Version

version: 3755 (822b632)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu

What operating system are you seeing the problem on?

No response

Relevant log output

No response

Labels

    bug (Something isn't working)
    good first issue (Good for newcomers)
    help wanted (Needs help from the community)
    low severity (Used to report low severity bugs in llama.cpp, e.g. cosmetic issues, non-critical UI glitches)
    server/webui
