
Can't wait for an LLM that can be installed locally and has no stupid restrictions.

StableGPT when?




The folks making Open Assistant [1] (an open-source ChatGPT clone) have gathered enough data to start initial training, so hopefully there will be something to play with soon.

[1] https://github.com/LAION-AI/Open-Assistant


We just need denser RAM.


Why? My Threadripper desktop PC supports a maximum of 1 TB of RAM, and it doesn't take up much space at all.


Most don't; we need commodity RAM to get denser and cheaper -- at $2/GB for cheap RAM, a terabyte is still around $2,000, and it'll require an expensive motherboard to support it.
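
A rough back-of-the-envelope check of that figure (the $2/GB price is just the number quoted above, not a measured market price):

    # Quick sanity check on the cost claim: 1 TB of RAM at $2/GB.
    price_per_gb = 2          # USD per GB -- assumed figure from the comment above
    capacity_gb = 1024        # 1 TB expressed in GB
    total = price_per_gb * capacity_gb
    print(f"1 TB at ${price_per_gb}/GB is about ${total:,}")   # -> about $2,048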

Otherwise, fewer people have access and progress is slower.

See also: the commoditization of GPUs; once 3D graphics hardware was no longer an SGI product and regular people could get it for $200, all kinds of GPGPU stuff started happening.



