Portable LLMs with llamafile
Posted May 15, 2024 14:26 UTC (Wed) by flussence (guest, #85566)
In reply to: Portable LLMs with llamafile by taladar
Parent article: Portable LLMs with llamafile
Does *anything* work in ROCm? My impression of it over the past 5-10 years, judging by the volume of complaints about it on the internet, is that it's the best advertising campaign Nvidia could have wished for.
