Portable LLMs with llamafile
Posted May 15, 2024 8:13 UTC (Wed) by taladar (subscriber, #68407)Parent article: Portable LLMs with llamafile
GPU support does not seem to work for me on AMD; it produces an assertion failure in HIP on Gentoo:
llava-v1.5-7b-q4.llamafile: /var/tmp/portage/dev-util/hip-5.7.1-r2/work/clr-rocm-5.7.1/rocclr/os/os_posix.cpp:310: static void amd::Os::currentStackInfo(unsigned char**, size_t*): Assertion `Os::currentStackPtr() >= *base - *size && Os::currentStackPtr() < *base && "just checking"' failed.
error: Uncaught SIGABRT (SI_TKILL) at 0x3e8001df83d on <hostname removed>
./llava-v1.5-7b-q4.llamafile
File exists
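A possible workaround (untested against this particular assertion, and assuming llamafile's documented `--gpu` and `-ngl` options) is to force CPU-only inference so the ROCm/HIP code path is never entered:

```shell
# Sketch of a CPU-only fallback: --gpu DISABLE tells llamafile not to
# attempt GPU offload, and -ngl 0 keeps all model layers on the CPU.
./llava-v1.5-7b-q4.llamafile --gpu DISABLE -ngl 0
```

This only sidesteps the crash rather than fixing the underlying stack-check assertion in rocclr's os_posix.cpp.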
