author | Xuan Son Nguyen <thichthat@gmail.com> | 2024-03-16 16:42:08 +0100
---|---|---
committer | GitHub <noreply@github.com> | 2024-03-16 17:42:08 +0200
commit | dfbfdd60f90207404039c6578d709231496831d9 (patch) |
tree | e9138410b45e5977fb1aa319ddadb5ade1f7b11a |
parent | 15961ec04dbd59d21d8984d42e4c0f7e7e7d320a (diff) |
readme : add wllama as a wasm binding (#6100)
-rw-r--r-- | README.md | 1
1 file changed, 1 insertion, 0 deletions
```diff
@@ -134,6 +134,7 @@ Typically finetunes of the base models below are supported as well.
 - Node.js: [withcatai/node-llama-cpp](https://github.com/withcatai/node-llama-cpp)
 - JS/TS (llama.cpp server client): [lgrammel/modelfusion](https://modelfusion.dev/integration/model-provider/llamacpp)
 - JavaScript/Wasm (works in browser): [tangledgroup/llama-cpp-wasm](https://github.com/tangledgroup/llama-cpp-wasm)
+- Typescript/Wasm (nicer API, available on npm): [ngxson/wllama](https://github.com/ngxson/wllama)
 - Ruby: [yoshoku/llama_cpp.rb](https://github.com/yoshoku/llama_cpp.rb)
 - Rust (nicer API): [mdrokz/rust-llama.cpp](https://github.com/mdrokz/rust-llama.cpp)
 - Rust (more direct bindings): [utilityai/llama-cpp-rs](https://github.com/utilityai/llama-cpp-rs)
```