author    Bryan Honof <bryanhonof@gmail.com>  2024-06-17 17:37:55 +0200
committer GitHub <noreply@github.com>         2024-06-17 09:37:55 -0600
commit    b473e95084c286780165568cf0f385f21141d68d (patch)
tree      7de2f3df93ecf30265cd290c67998147ca09a2e8
parent    99052cd227c7182fcf53343d2e7d33bfa180a9cf (diff)
Add Nix and Flox install instructions (#7899)
-rw-r--r--  README.md  24
1 file changed, 24 insertions, 0 deletions
@@ -387,6 +387,30 @@
 brew install llama.cpp
 ```
 The formula is automatically updated with new `llama.cpp` releases. More info: https://github.com/ggerganov/llama.cpp/discussions/7668
+### Nix
+
+On Mac and Linux, the Nix package manager can be used via
+```
+nix profile install nixpkgs#llama-cpp
+```
+for flake-enabled installs,
+
+or
+```
+nix-env --file '<nixpkgs>' --install --attr llama-cpp
+```
+for non-flake-enabled installs.
+
+This expression is automatically updated within the [nixpkgs repo](https://github.com/NixOS/nixpkgs/blob/nixos-24.05/pkgs/by-name/ll/llama-cpp/package.nix#L164).
+
+#### Flox
+
+On Mac and Linux, Flox can be used to install llama.cpp within a Flox environment via
+```
+flox install llama-cpp
+```
+Flox follows the nixpkgs build of llama.cpp.
+
 ### Metal Build
 
 On MacOS, Metal is enabled by default. Using Metal makes the computation run on the GPU.
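The flake vs. non-flake split in the added instructions can be scripted. A minimal sketch that picks the right install command by checking whether the `flakes` experimental feature is enabled — assuming a working `nix` on `PATH`, and `nix config show`, which requires Nix ≥ 2.20 (older releases expose the same output as `nix show-config`):

```shell
#!/usr/bin/env sh
# Sketch: install llama-cpp from nixpkgs, choosing the flake or non-flake
# command based on the enabled experimental features. Assumes `nix` is on
# PATH; `nix config show` needs Nix >= 2.20 (older: `nix show-config`).
if nix config show 2>/dev/null | grep -q 'experimental-features.*flakes'; then
  # Flakes enabled: install into the user profile via the flake reference.
  nix profile install nixpkgs#llama-cpp
else
  # Flakes disabled: fall back to the classic nix-env invocation.
  nix-env --file '<nixpkgs>' --install --attr llama-cpp
fi
```

The `grep` on the config output is the only branching logic; either branch runs exactly one of the two commands quoted in the diff.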