> The general public isn’t asking for a hundred gigs, but I’d love to see the baseline rise up a bit. It doesn’t feel like we’ve budged meaningfully here for years. Or is that just me?
While the baseline of 8GB hasn't risen, I do think the floor of what we'd consider "unnecessary" has risen. I remember in 2018 I was building a new desktop and I spent a pretty penny on 32GB of RAM; folks on r/buildapc said that was a waste. Nowadays I feel like I've seen a lot of higher-end builds that feature 32GB or even 64GB.
Just my 2c; I don't have stats to back this up or anything...
My primary workstation has 128GB and it is for sure unnecessary. Even with multiple projects running each of which spins up a dozen containers in k3s, and a background game of Rimworld as well as all the electron apps that life requires these days, I rarely ever breach 64GB much less 100GB.
The only real use is writing horrifying malloc experiments and having a couple extra seconds to kill them before OOMing.
I routinely use this much RAM for work. And it's not malloc experiments lol. I need 200G to 500G for my research work. Most of our systems have 384G and this is just enough. If I could only have a laptop with that much...
Containers are relatively resource-efficient; if you need to run a bunch of actual VMs for testing (eg, Windows), you can easily find ways to use 128GB.
I wonder if something like vista is needed to move the needle on consumer RAM again. Pre-vista, windows required 64MB of RAM, and you could fudge it a bit even lower if you knew what you were doing. Vista _required_ 1GB of RAM, and recommended 2.
OEMs were selling 64MB desktops right up until vista was released.
Today, windows 11 requires 4GB of RAM. If windows 12 (or whatever they're going to call the next windows, they're not too good at counting) required the same-sized jump as between XP and Vista, it'd require 64GB of RAM.
Vista required 512 MB. In practice, that probably sucked, but those were the paper specs.
XP might have required 64 MB on paper but it crawled and was practically useless on that spec (like Windows 95 was on 4 MB). Never saw anyone do it. A common spec in 2001 would be 256 MB or at an absolute minimum, 128.
Vista came out in 2007 and absolutely no one was selling 64 MB desktop computers at that point. 64 MB is 1997-1998 spec -- in 2007 it would commonly be 512 MB or 1 GB.
It is true, however, that Vista had astronomically high minimum specs for the time -- some machines sold at the time could just barely run it -- and that it probably drove a lot of upgrades.
Hm... I had a Celeron machine running XP ~2004-2007 that had 512MB of RAM, and it wasn't hard to run out, eventually upgraded to 768MB but I was still jealous of my friend running XP with 1GB.
Then, I built a Vista machine in 2007 with 2GB to start, and it was clearly not enough, immediately filled the other 2 slots to go to 4GB.
A bit of bullshit. uBlock Origin plus git://bitreich.org/privacy-haters; enable that config in either Firefox or Chrom*-based browsers. Under Windows you can set the env vars properly, or put them on the desktop shortcuts as plain arguments to the exe.
Seriously, I tried with a >10-year-old Celeron netbook and it was perfectly usable once I set up ZRAM/zswap and uBlock Origin. Oh, and some tweaks in about:flags to force GL acceleration. It's only OpenGL 2.1 capable, go figure, and yet browsing is snappy on the Celeron.
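For anyone curious, the ZRAM half of that is only a few lines of setup. A minimal sketch, assuming a Linux kernel with the zram module and root privileges (the zstd algorithm and 4G size are just example choices, not what the netbook above actually used):

```python
# Sketch: create a compressed-in-RAM swap device via the zram module.
import subprocess

subprocess.run(["modprobe", "zram"], check=True)

# The compression algorithm must be set before the disk size.
with open("/sys/block/zram0/comp_algorithm", "w") as f:
    f.write("zstd")  # older kernels may only offer lzo/lz4
with open("/sys/block/zram0/disksize", "w") as f:
    f.write("4G")

subprocess.run(["mkswap", "/dev/zram0"], check=True)
subprocess.run(["swapon", "--priority", "100", "/dev/zram0"], check=True)
```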
Does it though? I made another comment but for home use I can't even max out 64 GBs.
The only thing I can think of that'd ever max out my RAM is some sort of training task (even though I'd expect to run out of VRAM first). But those are the kinds of tasks that do best on distributed systems since you don't really need to care about them, just spin it up, run your task, and tear it back down
Google recommends 64GiB to build the Android kernel. That's a thing you could technically do at home. And if you want to do anything else at the same time, you're gonna need to go to 128.
Funny you should say that... my entire career has been built on embedded Android and I've built a lot of images from "scratch" (or as close as you get with random SoC supplier provided garbage tacked on)
The first time I built an AOSP image from scratch was on some dinky office workstation that had been freshly upgraded with a whopping 16GBs so you wouldn't come back in the next morning to a random OOM
These days I get to open a PR and some random monster of a machine on a build farm does the heavy lifting, but I can still say from a lot of experience that 64GB is truly more than plenty for Android OS builds and definitely won't be what keeps you from doing other stuff... IO and CPU usage might make it an interesting proposition, but not RAM.
When Google says 64GB it's for the whole machine: the build process will use a lot of it when configured properly, but not so much that you can't run anything else
(Also again, Android builds are a perfect example of where a remote machine pays off in spades. The spin up is such a small fraction of what's needed you don't have to start messing with containerization if you don't want to, and you can get access to some insanely large instance for a few hours then stop paying for it as soon as it's done.
It just seems unlikely to have a task that warrants that much RAM that isn't just about throwing resources at an otherwise "stateless" task that's a great match for cloud compute)
There are various analytic APIs/libraries that will map data files into memory when there is surplus RAM available. That can really speed up processes which would otherwise be IO bound.
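A hedged sketch of what that pattern looks like; numpy's memmap is just one such API, and the file name and dtype here are made up:

```python
# Sketch: map a large binary file of float64 samples instead of reading it
# up front. With surplus RAM the OS keeps the whole file in the page cache,
# so repeated passes over the data stop being IO bound.
import numpy as np

data = np.memmap("measurements.f64", dtype=np.float64, mode="r")

# First pass faults pages in from disk; later passes hit the page cache.
print(data.mean(), data.std())
```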
> The general public isn’t asking for a hundred gigs, but I’d love to see the baseline rise up a bit. It doesn’t feel like we’ve budged meaningfully here for years. Or is that just me?
Then, less than 3 comments in, somehow we're justifying 256 GBs of RAM?
256 GBs of RAM can be useful in some cases, on some computers, in some places, but that's not really a meaningful inference? 1TB of RAM can be useful depending on the workload, any given number could be.
The question is can it be useful for anything even vaguely resembling personal computing, and the answer is for all intents and purposes: No. It's not.
You're getting upset and going all caps over your own misunderstanding....
> The general public isn’t asking for a hundred gigs, but I’d love to see the baseline rise up a bit. It doesn’t feel like we’ve budged meaningfully here for years. Or is that just me?
This was the comment that kicked off the thread. Some people felt 32 GBs was the new baseline, and then out of left field comes _256 GBs_
For any amount of RAM, someone somewhere will be able to use it. But that's the kind of deep observation I expect from a toddler.
If we're going past kiddie pool deep observations of plain fact, no, the baseline wouldn't be anywhere near 256 GBs of RAM based on how people use computers.
(And before you attack me for your own poor understanding of language again: People as in the human collective. "People don't need" is not the same as "no one needs".)
I didn't. At least, not to those of us who can deal with some flexibility in interpreting written communication, and use a little tool called context.
But then again, there are definitely people out there who need every. single. nuance. of the most basic statement spelled out for them, as if they're a biological GPT-3 endpoint (and this site certainly does feel like it's drowning in those people these days), but I don't write comments for them.
Instead I write comments for people who are interested in actual conversation over browbeating everyone in sight because they assumed the most useless interpretation of a statement was the correct one.