Hacker News

The option to have 128GB RAM for stable diffusion and local LLMs.



Since 8GB unified memory is "equivalent to 16GB on a non-Mac", is a 128GB unified-memory MacBook basically a 256GB Nvidia GPU (or several)?


They are not equivalent - it's pure marketing BS.


Not in the machine learning world ;-) You'd need to switch to marketing workloads to make 8GB = 16GB.


I think that claim is mostly marketing, but there's a grain of truth to it: the combined effect of unified memory (which is fast as hell) and macOS memory compression does stretch things. Still, I wouldn't put 8 GB past being similar to 10-12 GB on another system without these features.


Say I want to launch a VM with 8 GB of reserved memory. On a 10-12 GB machine I would have 2-4 GB left to actually allocate to the host system. I don't see how unified memory magically fixes this. It's actually worse, because even more of that memory has to be shared with the GPU.

This is like saying an 8GB disk is actually like a 10-12GB disk elsewhere.
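The arithmetic in that argument can be sketched in a few lines. This is only an illustration of the commenter's point; the 1.5 GB GPU carve-out is an assumed figure, not something Apple specifies, and real memory accounting (compression, wired pages) is messier than a subtraction.

```python
def host_headroom_gb(total_gb, vm_reserved_gb, gpu_shared_gb=0.0):
    """Memory left for the host OS after a fixed VM reservation.

    On a unified-memory machine the GPU's share comes out of the same
    pool, so it is subtracted too. A negative result means the
    reservation cannot be satisfied at all.
    """
    return total_gb - vm_reserved_gb - gpu_shared_gb

# The commenter's scenario: reserving 8 GB for a VM on various hosts.
print(host_headroom_gb(12, 8))      # 4.0 GB left on a 12 GB machine
print(host_headroom_gb(10, 8))      # 2.0 GB left on a 10 GB machine
print(host_headroom_gb(8, 8, 1.5))  # negative: 8 GB unified, ~1.5 GB assumed GPU share
```

Compression can reclaim some of the shortfall for compressible pages, but a hard reservation like a VM's guest RAM is exactly the kind of allocation it can't conjure up.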


> Since 8GB unified memory is "equivalent to a 16GB non Mac"

Yeah, right.




