sdgoij/llama.cpp
mirror of https://github.com/ggml-org/llama.cpp.git synced 2026-05-12 12:04:08 +00:00
Commit: 3802d3c78f5791f77ee9706e8ef440ca61d12ea3
Path: llama.cpp/tools/server/webui/src/lib
Latest commit: 3802d3c78f by Hendrik Erz, 2026-01-21 18:46:01 +01:00
fix: Use `tabular-nums` for chat message statistics (#18915)

* fix: Use `tabular-nums` for chat message statistics
* fix: Rebuild WebUI
Directory    Last commit                                                                     Date
components   fix: Use tabular-nums for chat message statistics (#18915)                      2026-01-21 18:46:01 +01:00
constants    sampling : add support for backend sampling (#17004)                            2026-01-04 22:22:16 +02:00
enums        webui: display prompt processing stats (#18146)                                 2025-12-18 17:55:03 +01:00
hooks        webui: fix prompt progress ETA calculation (#18468)                             2025-12-29 21:42:11 +01:00
markdown     webui: Fix selecting generated output issues during active streaming (#18091)   2025-12-18 11:13:52 +01:00
services     sampling : add support for backend sampling (#17004)                            2026-01-04 22:22:16 +02:00
stores       sampling : add support for backend sampling (#17004)                            2026-01-04 22:22:16 +02:00
types        sampling : add support for backend sampling (#17004)                            2026-01-04 22:22:16 +02:00
utils        Webui/file upload (#18694)                                                      2026-01-09 16:45:32 +01:00