llama.cpp/examples
Georgi Gerganov f64d56bcd8 llama-server-simulator : replace Flask with stdlib http.server
- Use HTTPServer + BaseHTTPRequestHandler instead of Flask
- RequestHandler handles POST /v1/chat/completions
- Server runs in daemon thread with clean Ctrl+C shutdown
- Remove flask and unused asdict imports

Assisted-by: llama.cpp:local pi
2026-05-10 20:47:08 +03:00