mirror of https://github.com/ggml-org/llama.cpp.git
synced 2026-03-17 16:44:07 +00:00
* gguf-py: Bump sentencepiece version

  A newer version has been out for a while that addresses the issues mentioned in https://github.com/ggml-org/llama.cpp/pull/14200. There is a long chain of reasons I would like this change, but the short version is that it lets people who use both `sentencepiece` and `gguf` take advantage of those fixes. On conda-forge, the version is currently locked (since there is no notion of optional dependencies there). Regardless, I don't think this should be too controversial.

* review feedback
8 lines
109 B
Plaintext
numpy~=1.26.4
sentencepiece>=0.1.98,<0.3.0
transformers>=4.57.1,<5.0.0
gguf>=0.1.0
protobuf>=4.21.0,<5.0.0
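As a minimal sketch of what a specifier like `sentencepiece>=0.1.98,<0.3.0` means in practice, the hypothetical helper below (not part of gguf-py, and simpler than a real PEP 440 parser: it assumes plain dotted-integer versions) checks whether an installed version falls inside the pinned range:

```python
# Hypothetical helper: check a version against the sentencepiece pin
# from the requirements file above. Assumes simple X.Y.Z version strings,
# unlike a full PEP 440 implementation such as the `packaging` library.

def parse(version: str) -> tuple:
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str = "0.1.98", upper: str = "0.3.0") -> bool:
    """Equivalent to the specifier sentencepiece>=0.1.98,<0.3.0."""
    return parse(lower) <= parse(version) < parse(upper)

print(satisfies("0.2.0"))   # → True  (inside the range)
print(satisfies("0.1.97"))  # → False (below the lower bound)
print(satisfies("0.3.0"))   # → False (upper bound is exclusive)
```

A real resolver such as pip applies the same logic via `packaging.specifiers`; the point of the bump is that the lower bound now admits the fixed releases while the `<0.3.0` cap still guards against future breaking changes.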