Not true. I spent a summer not too long ago doing quantum Monte Carlo calculations for condensed-matter physics, and IIRC Valgrind showed that the whole thing was under 5 MB. Not only did it not require 1GB of RAM, but code + data fit in the L2 cache of each node. (We were seriously CPU bound, not memory bound.) I can't speak to how much of a market there is for something with 1GB RAM, but you certainly can do "serious computation" with far less than that.
Well yeah, the 1 GB figure is an exaggeration; things like brute-force hash reversing can run in kilobytes (see the sketch below). Still, you have to admit that such problems are in the minority, even in materials science.
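To make the kilobytes point concrete, here's a minimal sketch of that kind of workload (FNV-1a is just a stand-in for whatever hash you'd actually attack, and the search space is kept tiny). The entire working set is a handful of stack variables, so memory stays flat no matter how long the search runs:

```c
/* Brute-force preimage search: try all short lowercase strings until
 * one matches a target hash. Memory footprint is a few dozen bytes of
 * stack -- no tables, no heap -- however long the search takes. */
#include <stdint.h>
#include <stdio.h>

/* 32-bit FNV-1a, standing in for a real hash function */
static uint32_t fnv1a(const char *s) {
    uint32_t h = 2166136261u;
    for (; *s; s++) {
        h ^= (uint8_t)*s;
        h *= 16777619u;
    }
    return h;
}

int main(void) {
    const uint32_t target = fnv1a("qmc");  /* pretend only the hash is known */
    char buf[8] = {0};

    /* enumerate all lowercase strings of length 1..4 */
    for (int len = 1; len <= 4; len++) {
        int idx[8] = {0};
        for (;;) {
            for (int i = 0; i < len; i++) buf[i] = (char)('a' + idx[i]);
            buf[len] = '\0';
            if (fnv1a(buf) == target) {
                printf("preimage found: %s\n", buf);
                return 0;
            }
            /* odometer-style increment over the candidate space */
            int i = 0;
            while (i < len && ++idx[i] == 26) idx[i++] = 0;
            if (i == len) break;  /* all candidates of this length tried */
        }
    }
    puts("no preimage in search space");
    return 0;
}
```

The same shape holds for the real thing: the state is just the current candidate plus counters, which is why these searches are CPU bound, not memory bound.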
Also note that some Linux distros in GUI mode can eat roughly half of that on their own (and that's the household-experimentation-with-parallelism use case they advocate in the proposal).