author | Lars Grammel <lars.grammel@gmail.com> | 2024-01-07 21:24:11 +0100
---|---|---
committer | GitHub <noreply@github.com> | 2024-01-07 22:24:11 +0200
commit | b7e7982953f80a656e03feb5cfb17a17a173eb26 (patch) |
tree | 7acf2ed95055fdabc2d2715b03235b669eee254a |
parent | 226460cc0d5b185bc6685fb76f418fd9418d7add (diff) |
readme : add lgrammel/modelfusion JS/TS client for llama.cpp (#4814)
-rw-r--r-- | README.md | 1 |
1 file changed, 1 insertion(+), 0 deletions(-)
@@ -118,6 +118,7 @@ as the main playground for developing new features for the [ggml](https://github
 - Python: [abetlen/llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
 - Go: [go-skynet/go-llama.cpp](https://github.com/go-skynet/go-llama.cpp)
 - Node.js: [withcatai/node-llama-cpp](https://github.com/withcatai/node-llama-cpp)
+- JS/TS (llama.cpp server client): [lgrammel/modelfusion](https://modelfusion.dev/integration/model-provider/llamacpp)
 - Ruby: [yoshoku/llama_cpp.rb](https://github.com/yoshoku/llama_cpp.rb)
 - Rust: [mdrokz/rust-llama.cpp](https://github.com/mdrokz/rust-llama.cpp)
 - C#/.NET: [SciSharp/LLamaSharp](https://github.com/SciSharp/LLamaSharp)
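The entry added by this commit points to a JS/TS client library for the llama.cpp server. For context, a minimal sketch of what such a client does, assuming a llama.cpp server running locally on its default port (8080) and its `POST /completion` endpoint; the helper names here are illustrative, not part of any library:

```typescript
// Minimal sketch of a llama.cpp server client in TypeScript.
// Assumes a llama.cpp server started locally, e.g.:
//   ./server -m model.gguf
// which by default listens on http://localhost:8080 and exposes
// a POST /completion endpoint taking a JSON body.

// Build the JSON request body for /completion.
// `prompt` and `n_predict` are standard llama.cpp server fields.
export function buildCompletionRequest(prompt: string, nPredict = 64): string {
  return JSON.stringify({ prompt, n_predict: nPredict });
}

// Send a completion request to the local server (requires it to be running).
export async function complete(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:8080/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildCompletionRequest(prompt),
  });
  const data = await res.json();
  return data.content; // the generated text
}
```

Libraries such as modelfusion wrap this HTTP interface behind a higher-level text-generation API rather than exposing the raw endpoint.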