Because real CS degrees require a lot of it, and math exams are the main reason people wash out in the first two years. (There's some controversy over whether that should be the case, but it is what it is.) And it's not a watered-down version for CS students: in the universities I know, the first ~4 math courses are taken jointly by maths and CS students. I've heard "CS is applied math" a lot, and I agree with it. (If you just want to become a programmer, you don't need to go to university.)
Not saying you can't coast through while avoiding math if you're somewhat picky about which courses you take, but it would be a significant restriction. And for me, the realization that most CS problems can be approached in a very rigorous, mathy way was one of the main takeaways from all that time studying.
Agreed that OS is less mathy, but even for operating systems you need statistics the moment you want to talk intelligently about performance (look how many people average percentiles, fall into coordinated omission, or insist on reporting a single average instead of looking at the distribution). Btw, random webdevs wiring up A/B testing tools would benefit as well. We're talking MSc? OK, a basic OS class won't occupy you for long; it's only about 5% of the credits for an MSc (actual number from one university). Let's say you stay in the field... queueing theory is fairly mathy, and advanced data structures need experience with proofs, which is taught in maths.
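To make the statistics point concrete, here's a quick toy sketch in Python (my own made-up numbers, not from any real system): the mean hides the slow path almost completely, and averaging per-shard p99s badly understates the real tail.

    # Toy latency data: nine healthy shards (~10 ms) and one degraded shard (~200 ms).
    import random
    import statistics

    random.seed(42)

    def percentile(data, p):
        s = sorted(data)
        return s[round(p / 100 * (len(s) - 1))]

    shards = [[random.gauss(10, 1) for _ in range(1000)] for _ in range(9)]
    shards.append([random.gauss(200, 20) for _ in range(1000)])
    combined = [x for shard in shards for x in shard]

    print("mean of all samples      :", round(statistics.mean(combined), 1), "ms")
    print("true p99 of all samples  :", round(percentile(combined, 99), 1), "ms")
    print("average of per-shard p99s:",
          round(statistics.mean(percentile(s, 99) for s in shards), 1), "ms")
    # Roughly: mean ~29 ms, true p99 ~225 ms, averaged p99s ~36 ms -- the
    # averaged number drastically understates the tail your users actually see.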
Maybe other fields are safer?
Computer graphics and its subfields need lots of linear algebra (quick toy example below).
Crypto? Symmetric crypto needs probability theory, asymmetric crypto needs number theory, and side-channel attacks need linear algebra.
Robotics, signals and systems, sensor fusion, and electrical networks all need lots of linear algebra.
(I've taken all of the above courses at some point, but not all the exams.)
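To give the linear algebra claim a concrete flavour (my own toy example, nothing taken from those courses): in graphics, moving a vertex around is just a matrix-vector multiply.

    # Rotate the vertex (1, 0, 0) by 90 degrees around the z-axis -> roughly (0, 1, 0).
    import math

    def mat_vec(m, v):
        return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

    def rotation_z(theta):
        c, s = math.cos(theta), math.sin(theta)
        return [[c, -s, 0],
                [s,  c, 0],
                [0,  0, 1]]

    print([round(x, 3) for x in mat_vec(rotation_z(math.pi / 2), [1, 0, 0])])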
Set theory and graph theory came up often in my undergraduate operating systems and computer organization classes. I imagine they'd come up even more in graduate-level classes.
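One concrete example of the graph theory (again a toy sketch of mine, not from any particular class or kernel): deadlock detection is essentially cycle detection in a wait-for graph.

    # wait_for maps each process to the processes it is waiting on.
    def has_cycle(wait_for):
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {p: WHITE for p in wait_for}

        def dfs(p):
            color[p] = GRAY
            for q in wait_for.get(p, ()):
                if color.get(q, WHITE) == GRAY:   # back edge = circular wait = deadlock
                    return True
                if color.get(q, WHITE) == WHITE and dfs(q):
                    return True
            color[p] = BLACK
            return False

        return any(color[p] == WHITE and dfs(p) for p in wait_for)

    print(has_cycle({"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}))  # True: circular wait
    print(has_cycle({"P1": ["P2"], "P2": ["P3"], "P3": []}))      # False: no cycle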