My minikube dev environment (microservices with lots of independent databases) can be crammed into about 24 GB of RAM and mimics our production environment almost 1:1; there are a number of different databases (couchbase, elastic, redis, rabbit, etc). If a developer is limited to 16 GB they have to run a much more cramped and far less similar dev environment. Instead of using one database per service like we do in production, we have to cram multiple services into one database (say couchbase with a bucket for each service). I can chunk this down to 13-14 GB, but if the user has 16 GB max, that leaves them with 2-3 GB of RAM for their IDE, Chrome, Spotify, etc.
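For a rough idea of what that consolidation looks like, here's a minimal sketch (the memory caps, bucket names, and credentials below are illustrative, not our actual config):

    # Cap the minikube VM so the whole cluster stays under ~14 GB on a 16 GB laptop
    minikube start --memory=14g --cpus=6

    # Instead of one Couchbase cluster per service, run a single shared cluster
    # and carve out a bucket per service, roughly:
    couchbase-cli bucket-create -c localhost:8091 -u Administrator -p password \
      --bucket orders --bucket-type couchbase --bucket-ramsize 256
    couchbase-cli bucket-create -c localhost:8091 -u Administrator -p password \
      --bucket inventory --bucket-type couchbase --bucket-ramsize 256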
It has severely limited how much freedom we have with our dev environment, and we're constantly fighting to stay within that 16GB spec. Do we deviate heavily from our staging/prod environments?
What's the sweet spot? A bunch of us have built hackintosh desktops at this point so we can have 32-64+ GB and more cores, so we're not constantly fighting resource contention with all of the docker containers we need to run.
It is really irritating that Apple doesn't offer 256GB iMac Pros given current 64GB LR-DIMM prices (and that capacity being officially supported by the Intel CPU), especially since even the current XNU kernel can handle 252GB.
While not a true solution because you need to break open your computer, the newest iMac Pro has socketed DDR4. iFixIt’s teardown[0] suggests the max it’ll support is 4x32 GB for a total of 128 GB.