Sound cards were important a decade ago, when CPUs were not yet capable of rendering complex audio. If you wanted to (for example) remix ten mono channels to 5.1 surround in real time, the only option was a separate dedicated processor.
Since then, CPUs have become massively more capable; any reasonable audio processing can be done on a budget desktop. The only real reason to get a separate sound card any more is specialized connectors, but even that's becoming less true -- even my normal, non-fancy motherboard has an actual S/PDIF jack!
Of course, even if a third-party chip's output were not measurably different from an integrated chip's, there would still be hordes of idiot audiophiles rushing to drop a few grand on one.
I agree, roughly. At the audiophile level there is some improvement to be expected with S/PDIF in particular (less cable noise). But the situation today is nothing like the old days of on-board sound, when boards used the cheapest generic AC'97 chips and the output sounded like a tin can.
If you're recording or need low-latency playback, there's still tons of room for improvement by getting a sound card. For good recordings, you need a preamp stage, you need <50 ms latency, and you (often) need more than one track. And the entry price for these features is still relatively high, between $100 and $250.
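For a rough sense of where a figure like <50 ms comes from, here's a minimal back-of-the-envelope sketch. It assumes the usual buffer-based audio pipeline, where each buffer in the chain adds buffer_frames / sample_rate seconds of delay; the buffer sizes and the two-buffer round trip are illustrative, not taken from any specific driver:

    # Latency contributed by fixed-size audio buffers.
    # Assumption: each buffer in the chain delays the signal by
    # buffer_frames / sample_rate seconds.

    def buffer_latency_ms(buffer_frames: int, sample_rate: int = 44100) -> float:
        """Delay contributed by one buffer of audio, in milliseconds."""
        return buffer_frames / sample_rate * 1000

    # Hypothetical round trip: one input buffer plus one output buffer.
    # A real driver stack may add more stages, so treat this as a lower bound.
    for frames in (64, 256, 1024, 4096):
        one_way = buffer_latency_ms(frames)
        print(f"{frames:5d} frames: {one_way:6.1f} ms one way, "
              f"{2 * one_way:6.1f} ms round trip")

At 44.1 kHz, a 256-frame buffer costs about 5.8 ms each way, while a 4096-frame buffer already costs about 93 ms one way, which is why large default buffer sizes are unusable for monitoring while recording.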