> However, you will not see any information like the meta description on these blocked URLs.
True, but that's not the only thing. If it was ever in the index, it can take forever to be removed, if it gets removed at all. Send a 404 or 410, Disallow it in robots.txt or set it to noindex - you may get lucky, or you may not. You can of course "hide it from search results", but that only works for 90 days (iirc - may be 120, something in that range). Those leftovers typically lose rankings, but they often stay indexed, which is easy to spot with a site: query.
Reindexing a page is dynamic based on noteworthiness and volatility, iirc, but individual links can be reindexed on the fly since the Percolator index. The 90-day number comes from an old system in which the index was broken into shards that had to be swapped out wholesale.
I don't mean reindexing, I mean "hiding from the index" ("Remove URLs" in GSC). It works instantly, but only for a limited time, after which the URL reappears in the index if you haven't actually gotten it removed in the meantime (via 410, noindex or disallow). Since those other methods don't always work, if you're unlucky and want it to stay gone, you have to hide it again (and again, and again). I've had clients who were hacked and had spammy content injected into their sites, and it took (literally!) years for that content to be removed (we tried combinations of 404, 410, noindex and disallow).
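For anyone in the same boat, here's a rough sketch of the 410-plus-noindex-header part of that (plain Python stdlib; the paths are made up, not what we actually served):

    # Toy server: answers known injected URLs with 410 Gone plus an
    # X-Robots-Tag: noindex header; everything else gets a plain 404 here.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    GONE_PATHS = {"/cheap-pills", "/casino-bonus"}  # hypothetical spam paths

    class RemovalHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path in GONE_PATHS:
                self.send_response(410)                      # "Gone": a stronger hint than 404
                self.send_header("X-Robots-Tag", "noindex")  # belt and braces
                self.end_headers()
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("", 8080), RemovalHandler).serve_forever()

None of which is guaranteed to stick, as described above, but at least the signals are unambiguous.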
Exactly - there is no guaranteed way to remove anything: HTTP status codes, meta tags, headers and robots.txt only have advisory status. They are usually respected when a resource is first crawled, but once it's in the index, "keeping the result available" seems to be a top priority. I do understand the idea - it might still be a useful result for a user - but otoh, if it returns 410 (or continuously 404), it won't be of any use, because the content that was indexed is no longer available (especially in the case of 410).
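When chasing these cases down, it at least helps to confirm what a URL is really answering with - the status code and any X-Robots-Tag header - instead of trusting what you think is deployed. A quick stdlib check (the URL below is just a placeholder):

    # Report the HTTP status and X-Robots-Tag header a URL actually returns.
    import urllib.request
    from urllib.error import HTTPError

    def removal_signals(url):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.status, resp.headers.get("X-Robots-Tag")
        except HTTPError as err:  # 4xx/5xx end up here, which is what we want to inspect
            return err.code, err.headers.get("X-Robots-Tag")

    print(removal_signals("https://example.com/removed-page"))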
Granted, these are edge cases; in most circumstances, 410 plus the 90-day hiding means the URLs disappear instantly and don't resurface. These edge cases do make me take Google's official statements on how to deal with this with a grain of salt, though: bugs exist, and unless you happen to know somebody at Google, there's no way to report them.
URLs blocked in robots.txt can still get discovered through links from other pages, and they will then be displayed in the search results.
However, you will not see any information like the meta description on these blocked URLs.
There's a good explanation of this here, including a video from former Googler Matt Cutts: https://yoast.com/prevent-site-being-indexed/
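If you want to sanity-check the crawl-blocking side of this, the Python stdlib has a robots.txt parser (domain and path below are placeholders) - keeping in mind that a disallowed URL can still be listed if other pages link to it:

    # robots.txt only controls crawling, not indexing: a disallowed URL can
    # still appear in results (URL/title only) if something else links to it.
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    blocked = not rp.can_fetch("Googlebot", "https://example.com/private/page")
    print("crawl blocked:", blocked)  # True means Googlebot shouldn't fetch it,
                                      # not that it will stay out of the index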