Portable LLMs with llamafile
Posted May 15, 2024 9:59 UTC (Wed) by snajpa (subscriber, #73467)
In reply to: Portable LLMs with llamafile by snajpa
Parent article: Portable LLMs with llamafile
Btw, @ single exec file: this has been a done thing for a while now. There's ollama, which also packs it all into one place (and if it doesn't support all three platforms yet, that's certainly their goal), but unlike _this_ llama.cpp fork, that project has real added value. It genuinely makes running LLMs easy: it abstracts away llama.cpp's rough edges to present a smooth workflow to people who have never touched any of this before. It's also original code that *includes* llama.cpp, rather than just "rebranding" it.
