The result is a very narrow definition of "virtually worthless". It suggests gaming is far enough into the diminishing-returns curve that a good gaming PC is no longer at the highest end of the desktop computing spectrum.
The last three words of the title are a clear indication of the context of the discussion. It's perfectly legitimate to discuss tech as it applies to a specific (and popular) use.
More generally, games really don't tax computers very much. You can get a 400 Gbps network interface, but it won't speed up your game. It does require a PCIe 5.0 x16 slot to hit that speed, though. It's the same for SSDs: nobody is going to notice the difference between 10µs and 9µs 4K random-read latency when launching Grand War United: Cyber 99, but it could substantially speed up the critical path of a database query. Games are not pushing the frontiers here.
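As a back-of-the-envelope check on why such a NIC needs a PCIe 5.0 x16 slot (the per-generation figures below are approximate usable bandwidths, not exact spec numbers):

```python
# Rough check: which PCIe x16 generation can carry a 400 Gbps NIC?
nic_gbps = 400
nic_gbs = nic_gbps / 8  # 400 Gbps line rate is 50 GB/s

# Approximate usable bandwidth of an x16 slot per generation, in GB/s.
pcie_x16 = {"PCIe 3.0": 15.75, "PCIe 4.0": 31.5, "PCIe 5.0": 63.0}

for gen, bw in pcie_x16.items():
    fits = "yes" if bw >= nic_gbs else "no"
    print(f"{gen} x16: {bw} GB/s - fits a 400 Gbps NIC? {fits}")
```

Only the 5.0 row clears the 50 GB/s bar, which is the point: no game comes anywhere near needing that.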
Not anymore, at least. Top-tier "GPUs" these days don't even have ports for monitors. You won't see much of an improvement by maxing out the memory, either: anything beyond 16GB makes very little difference for gaming.
You might want to explore RAID-1 arrays to reduce that 9µs random-read latency to 4.5µs or less (with more SSDs), but there is little benefit in transferring 8K UHD content faster than the monitor's frame rate lets you watch it. I no longer see storage like that in the wild (such things get abstracted away in cloud platforms), but I still remember RAID-1 "readzillas" and RAID-0 "writezillas" with a dozen or more fast disks (spinning metal, when I last saw them). These are especially useful for backups, as they can ingest a truckload of data and stream it to tape at the tape's maximum speed (avoiding a lot of wear and tear).
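The RAID-1 latency figure above comes from spreading random reads across mirrors. A toy model of that scaling, assuming perfect load balancing (real controllers, queueing, and hot spots make it far less clean):

```python
# Toy model: average random-read latency for a RAID-1 set whose
# mirrors serve reads in parallel. Assumes reads are spread evenly
# across all n mirrors - an idealization, not real-array behavior.
def effective_read_latency(base_us: float, mirrors: int) -> float:
    """Ideal per-request latency with n mirrors sharing the read load."""
    return base_us / mirrors

print(effective_read_latency(9.0, 2))  # two-way mirror: 4.5 µs
print(effective_read_latency(9.0, 4))  # four-way mirror: 2.25 µs
```

Hence "4.5µs or less (with more SSDs)": each extra mirror divides the ideal figure further, though writes still have to land on every mirror.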