
How powerful a computer does this need? It would be useful to see, for one thing, the minimum RAM requirements for these models.



llama.cpp needs about 40 GB of RAM for the 65B model (due to int4 quantization)

RamNeeded(other_size) ≈ 40 GB * other_size / 65B
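
As a rough back-of-the-envelope sketch of that rule of thumb (the linear-scaling assumption and the function name are mine, not from llama.cpp), the estimates for the common LLaMA sizes come out to:

    # Rough RAM estimate for int4-quantized models, scaled linearly
    # from the ~40 GB reported above for the 65B model.
    # Illustrative only; estimated_ram_gb is a made-up helper name.
    def estimated_ram_gb(params_billions: float) -> float:
        return 40.0 * params_billions / 65.0

    for size in (7, 13, 30, 65):
        print(f"{size}B: ~{estimated_ram_gb(size):.1f} GB")
    # 7B: ~4.3 GB, 13B: ~8.0 GB, 30B: ~18.5 GB, 65B: ~40.0 GB

Actual usage will be a bit higher once you account for the KV cache and context length, so treat these as lower bounds.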




