5 points | by dcreater 10 hours ago
4 comments
Georgi's relevant comment: https://github.com/ggml-org/llama.cpp/pull/19324#issuecommen...
...and use the original llama.cpp directly. It's infinitely easier to set up and use now.
Setting up ollama is 2 steps:
1. yay -S ollama
2. systemctl enable --now ollama
How is llama.cpp "infinitely easier" to set up?
Infinitely easier relative to what it used to be.
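For comparison, a modern llama.cpp setup can be about as short as the ollama steps above. This is a sketch, not an endorsed install path: the package names are assumptions (llama.cpp is available via Homebrew and in the AUR, but names vary by distro), and the Hugging Face repo shown is just an illustrative example of the `-hf` shorthand that `llama-server` supports.

```shell
# Install a prebuilt llama.cpp (package name varies by platform; these are examples):
brew install llama.cpp        # macOS/Linux via Homebrew
# or: yay -S llama.cpp        # Arch (AUR package name may differ)

# Start an OpenAI-compatible server, pulling a GGUF model straight from
# Hugging Face with -hf (model repo below is illustrative):
llama-server -hf ggml-org/gemma-3-1b-it-GGUF
```

Once running, `llama-server` exposes a local HTTP API (default port 8080) that OpenAI-style clients can talk to, which is the main thing ollama's daemon provides as well.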