Reindexing a page is dynamic, based on noteworthiness and volatility IIRC, but individual URLs can be reindexed on the fly since the move to Percolator. The 90-day number was from an old system in which the index was broken into shards that had to be swapped out wholesale.
I don't mean reindexing, I mean "hiding from the index" ("Remove URLs" in GSC). That works instantly, but only for a limited time, after which the URL reappears unless you've actually gotten it out of the index in the meantime (via 410, noindex or disallow). Since those signals don't always work, if you're unlucky and want a URL to stay gone, you have to hide it again (and again and again). I've had clients who were hacked and had spammy content injected into their sites, and it took (literally!) years for that content to get removed (we tried combinations of 404, 410, noindex and disallow).
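For concreteness, here's a minimal sketch of serving those removal signals, assuming a Flask app; the path prefixes are hypothetical stand-ins for wherever the spam was injected:

```python
# Minimal sketch: answer removed paths with 410 Gone plus an explicit
# noindex directive. Flask and the path prefixes are assumptions for
# illustration, not the actual setup described above.
from flask import Flask, Response, request

app = Flask(__name__)

# Hypothetical prefixes where spammy content was injected.
REMOVED_PREFIXES = ("/hacked-spam/", "/injected-pharma/")

@app.before_request
def gone_for_removed_paths():
    if request.path.startswith(REMOVED_PREFIXES):
        # 410 signals "gone for good" (a stronger hint than 404), and the
        # X-Robots-Tag header asks crawlers to drop the URL from the index.
        resp = Response("Gone", status=410)
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp  # short-circuits the request; no view is called

if __name__ == "__main__":
    app.run()
```

One gotcha worth keeping in mind when combining signals: a robots.txt Disallow on the same paths stops the crawler from fetching them at all, so it never sees the 410 or the noindex header, which may be one reason those combinations behave so inconsistently.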
Exactly, there is no guaranteed way to remove anything: HTTP status codes, meta tags, headers, and robots.txt are all advisory. They're usually honored when a resource is first crawled, but once it's in the index, "keeping the result available" seems to be a top priority. I do understand the idea - it might still be a useful result for a user - but on the other hand, if it returns 410 (or 404 continuously), it's of no use anyway, because the content that was indexed is no longer available (especially in the case of a 410).
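Since the signals are only advisory, the one thing you can verify is that they're actually being served. A quick way to check, sketched below with placeholder URLs, is to fetch each page the way a crawler would and print the status code plus any X-Robots-Tag header (a meta robots tag would sit in the HTML body and isn't checked here):

```python
# Sanity check: what removal signals does each URL actually return?
# The example.com URLs are placeholders.
import urllib.error
import urllib.request

URLS = [
    "https://example.com/hacked-spam/page1",
    "https://example.com/removed-page",
]

for url in URLS:
    req = urllib.request.Request(url, headers={"User-Agent": "removal-check/1.0"})
    try:
        resp = urllib.request.urlopen(req)
        status, headers = resp.status, resp.headers
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses land here; headers are still available.
        status, headers = e.code, e.headers
    print(f"{url}: HTTP {status}, X-Robots-Tag: {headers.get('X-Robots-Tag', '(none)')}")
```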
Granted, these are edge cases; in most circumstances, 410 plus the 90-day hiding means URLs disappear instantly and don't resurface. But these edge cases do make me take Google's official statements on how to deal with this with a grain of salt: bugs exist, and unless you happen to know somebody at Google, there's no way to report them.
Percolator paper ("Large-scale Incremental Processing Using Distributed Transactions and Notifications"): https://ai.google/research/pubs/pub36726