If I want a voltage divider, it's a lot easier to just use 1% resistors and forward-calculate the expected output (rather than doing a calibration), as long as you're happy with 1-2% error from the resistors and your ADC or the like. Adding software and test hardware to do a full-on calibration is a lot of work.
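To make the forward-calculation idea concrete, here's a minimal sketch of computing a divider's nominal ratio plus its worst-case spread from 1% resistor tolerances; the 10k/10k values and function names are just illustrative, not from any specific design.

```python
# Sketch: forward-calculate a divider's output ratio and worst-case
# error from 1% resistor tolerances, instead of calibrating each unit.
# The 10k/10k values (nominal 1/2 divider) are hypothetical.

def divider_ratio(r_top, r_bottom):
    """Vout/Vin for a resistive divider."""
    return r_bottom / (r_top + r_bottom)

def worst_case_ratio(r_top, r_bottom, tol=0.01):
    """Min/max ratio when both resistors are off by +/- tol
    in opposite directions (the worst pairing)."""
    lo = divider_ratio(r_top * (1 + tol), r_bottom * (1 - tol))
    hi = divider_ratio(r_top * (1 - tol), r_bottom * (1 + tol))
    return lo, hi

nominal = divider_ratio(10_000, 10_000)
lo, hi = worst_case_ratio(10_000, 10_000)
print(nominal, lo, hi)  # about 0.5, 0.495, 0.505 -- i.e. roughly +/-1%
```

Note the worst case of two 1% resistors in a divider is about 1% ratio error (not 2%), since the tolerances partially cancel in the ratio.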
But yeah, for digital signals, oftentimes 1k vs 100k makes no difference.
I definitely agree that 1% or better resistors are easier than calibration, but that doesn't mean you need values outside of E3 most of the time.
I might want an accurate 1/10 divider or something, but a 1/12 divider would probably be fine too, as long as it's consistent. If it doesn't vary between devices, it's just one line of code to change.
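That "one line of code" is typically just the divider constant in the ADC scaling math. A hedged sketch, with made-up names and a hypothetical 12-bit ADC at 3.3 V:

```python
# Sketch: scaling an ADC reading back through a known divider ratio.
# If the design ends up with a 1/12 divider instead of 1/10, only
# DIVIDER_RATIO changes. All names and values here are hypothetical.

ADC_MAX = 4095         # full-scale count of a 12-bit ADC
V_REF = 3.3            # ADC reference voltage
DIVIDER_RATIO = 1 / 12  # the one line to change per design

def adc_to_volts(counts):
    """Input voltage ahead of the divider, from raw ADC counts."""
    return (counts / ADC_MAX) * V_REF / DIVIDER_RATIO

print(adc_to_volts(4095))  # full-scale reading maps to V_REF / ratio
```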
I've done stuff that needs high-precision resistors, but usually the specific value isn't that important, just that it's a known, repeatable value.