As a counter-argument, I ran Arch Linux + Nvidia GPUs + Intel CPUs between 2012 and 2020, and still run Arch + Nvidia (now with an AMD CPU) to this day. I won't say it has been bug-free by any means, but it generally works pretty well. If you find a problem in Arch that you cannot fix without reinstalling, you do not sufficiently understand the problem or Arch itself. "Installing" Arch is refreshingly manual and "simple" compared to the magic of other Linux distros or the closed-source OSes.
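For anyone who hasn't been through it, the core of a manual Arch install really is just a few commands run from the live ISO. A rough sketch (partitioning, locale, and bootloader steps omitted; /dev/sda1 is a placeholder device name):

    mount /dev/sda1 /mnt                       # mount your prepared root partition
    pacstrap /mnt base linux linux-firmware    # install the base system into it
    genfstab -U /mnt >> /mnt/etc/fstab         # generate fstab from current mounts
    arch-chroot /mnt                           # chroot in to configure the new system

Everything other installers hide behind a wizard is right there in front of you.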
I tried using an Nvidia card with OBS to record my screen, and it more or less freezes when anything is running in Wine. I switched from X11 to Wayland, and now Wine shows horizontal lines (!) and performs like crap.
Even my years-old 4GB RX 570 gives a better experience doing this. You just install OBS from Flathub, Wayland works, and everything runs without any setup or tinkering. You click record and you get your gameplay footage.
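With Flatpak already set up, the whole "setup" is one command (com.obsproject.Studio is OBS Studio's actual Flathub app ID):

    flatpak install flathub com.obsproject.Studio   # install OBS Studio from Flathub
    flatpak run com.obsproject.Studio               # launch it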
I'm sure I could have fixed it, but I gave up after spending multiple evenings on it. Have you ever spent hours debugging a system exclusively in text mode? It isn't fun. Reinstalling the OS takes less than 30 minutes. It's a clear choice for me.
Yes, in fact, I have spent hours debugging a system from the console; text-mode browsers like links/lynx are a godsend. I agree, though: reinstalling is certainly easier. This is more of a philosophical argument than a practical one. I installed Arch to really learn Linux, not just to get work done. If I just wanted to get work done, I'd have used Fedora, Ubuntu, or Debian.
I ran a laptop with a switchable setup, a dedicated Nvidia GPU plus integrated Intel graphics, for a decade with no issues. I used something called Bumblebee to swap between them depending on workload, and it actually worked surprisingly well given the circumstances. Eventually I dropped back to integrated-only when I stopped doing anything intensive with the machine.
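For anyone curious, Bumblebee's day-to-day usage looked roughly like this (from memory; package names varied by distro, but optirun and primusrun were its real wrapper commands):

    optirun glxgears      # run one program on the discrete Nvidia GPU
    primusrun glxgears    # same idea via the later, faster primus backend

Anything not launched through the wrapper stayed on the integrated Intel GPU, which is what made it work so well for battery life.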