Agreed. CSS optimization is a drop in the ocean compared to the pile of third-party scripts the marketing team will load through GTM anyway.
Readability is key; the performance improvement is a red herring.
Reducing the overall size of your CSS by cutting repetition isn't that relevant either, given how well gzip handles repetition. Unless your CSS is freakishly huge, selector count is unlikely to have any perceptible impact.
I'm completely with you on the third-party stuff too. If I run uBlock on the relatively JS-heavy React site I work on, load times come in at sub-second; I reckon I could get that down to half a second by hammering the code and applying some smoke and mirrors, but there are obviously diminishing returns on client-side optimisation.
Disable uBlock and that load time triples, likely due in no small part to the three (3!!!) different versions of jQuery that the adverts and miscellaneous third-party scripts are currently loading.
I can't help but wonder if the secret to fixing the internet for the less tech-savvy general public isn't messing about with stuff like AMP, offline-first, whatever, but forcing code that goes through DFP and GTM to respect the browser and the host site a little more.
That's a pretty succinct summary of the gzip thing.
It's still worth mentioning the cost of executing the code once it's decompressed, though: gzip saves bytes on the wire, not parse or lookup time. So properly scoping a variable is always better value for money than reaching through global.ancestor.foo.bar on every access.
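A quick sketch of that scoping point. The `root.ancestor.foo.bar` object below is entirely made up to mirror the comment's example; the contrast is between walking a deep property chain on every iteration versus resolving it once into a local variable.

```javascript
// Hypothetical deeply nested object, standing in for global.ancestor.foo.bar.
const root = { ancestor: { foo: { bar: (x) => x * 2 } } };

// Repeated deep lookup: the chain is walked on every iteration.
function sumDeep(values) {
  let total = 0;
  for (const v of values) total += root.ancestor.foo.bar(v);
  return total;
}

// Locally scoped alias: the chain is resolved exactly once.
function sumScoped(values) {
  const bar = root.ancestor.foo.bar;
  let total = 0;
  for (const v of values) total += bar(v);
  return total;
}
```

Both produce the same result; the local alias just does the property walk once, and as a bonus it's immune to the chain being mutated mid-loop. (Modern JIT engines often optimise hot chains anyway, so treat this as cheap insurance rather than a guaranteed win.)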