Mirror of https://github.com/ggml-org/llama.cpp.git (synced 2026-05-11 19:44:06 +00:00)
* webui: add LLM title generation option
* webui: use chat_template_kwargs for title gen + fix conversation check
* webui: capture firstUserMessage before async streamChatCompletion to fix race condition
* webui: extract LLM title generation into separate method
* webui: use constants and ChatService for LLM generated titles
* webui: rebuild static output
* webui: add LLM title generation setting to new settings location
* webui: use sendMessage in generateTitle
* webui: rebuild static output
* webui: fix formatting
* webui: configurable title prompt, remove think tag regexes, fix TS error
* webui: group title constants into TITLE object, use TruncatedText for CSS truncation and fix race condition
* webui: rebuild static output
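The race-condition fix above (capturing `firstUserMessage` before the async `streamChatCompletion` call) can be sketched as follows. This is a minimal illustrative example, not llama.cpp's actual webui code: the `Message` type, the `TITLE` constants object, the `generateTitle` signature, and the injected `complete` callback are all assumptions made for the sketch.

```typescript
// Illustrative sketch: snapshot the first user message into a local
// variable *before* any await, so concurrent mutations of the
// conversation cannot change which message the title is derived from.
type Message = { role: 'user' | 'assistant'; content: string };

// Hypothetical grouping of title-related constants into one TITLE
// object, in the spirit of the "group title constants" commit.
const TITLE = {
  PROMPT: 'Summarize the following message as a short conversation title:',
  MAX_LENGTH: 50,
};

async function generateTitle(
  messages: Message[],
  complete: (prompt: string) => Promise<string>
): Promise<string> {
  // Capture before the first await: later pushes or unshifts on
  // `messages` by other tasks cannot race this read.
  const firstUserMessage =
    messages.find((m) => m.role === 'user')?.content ?? '';
  const raw = await complete(`${TITLE.PROMPT}\n${firstUserMessage}`);
  return raw.trim().slice(0, TITLE.MAX_LENGTH);
}
```

Because the capture happens synchronously before the completion request is awaited, a message inserted into the conversation mid-stream has no effect on the generated title.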