llama-cpp: remove workaround for local-ai
The `local-ai` package is marked broken. It uses `overrideAttrs` to override the `llama-cpp` package, which is not advised and puts a maintenance burden on the `llama-cpp` maintainers. If and when `local-ai` is unbroken, it should stop using `overrideAttrs` on `llama-cpp`; if it still needs to build against a non-standard `llama-cpp` revision, it should maintain its own `llama-cpp` derivation instead.
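For context, the discouraged pattern looks roughly like this (a hypothetical sketch, not the actual `local-ai` expression; the attribute values are placeholders):

```nix
# Hypothetical sketch of the discouraged pattern: local-ai pins a
# non-standard llama-cpp revision by overriding the shared derivation.
{ llama-cpp, fetchFromGitHub, callPackage }:

callPackage ./local-ai {
  llama-cpp = llama-cpp.overrideAttrs (old: {
    version = "unstable";        # placeholder version
    src = fetchFromGitHub {
      owner = "ggerganov";
      repo = "llama.cpp";
      rev = old.src.rev;         # placeholder: a pinned non-standard rev
      hash = old.src.hash;       # placeholder hash
    };
  });
}
```

The problem with this shape is that every change to the `llama-cpp` derivation must keep the override working, effectively making the `llama-cpp` maintainers responsible for `local-ai`'s pin. Maintaining a separate `llama-cpp` derivation inside the `local-ai` package tree avoids that coupling.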