Due to Rust's safety guarantees there's a perception that software written in Rust is automatically safer than software written in its direct competitors C and C++.
This is not necessarily true: you can write unsafe Rust code and you can write safe C++ code. In practice, though, the guardrails imposed by Rust do seem to help quite a bit in stopping devs from making really stupid mistakes.
That would be the "trustworthiness" implied by the use of Rust.
And a VMM is going to require a lot of unsafe Rust code. There are strategies to minimize that surface and make it easier to audit, but Rust is not a panacea for systems-programming gremlins.
Out of 413,842 non-empty lines of Rust code in the repository (including comments), there are 2,006 instances of "unsafe". Of those, 255 are of the form "unsafe_", which are definitely not unsafe blocks; they are mostly #![forbid(unsafe_code)], an attribute that tells the compiler to reject unsafe code in that crate or module. That leaves 1,751 instances, so slightly fewer than 1,751 actual unsafe blocks, since the count still includes comments, annotations on function-pointer type signatures, and so on, but most of those will be real unsafe blocks. A block can of course span multiple lines of code.
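For anyone unfamiliar with that attribute, a minimal sketch of what it does (the crate contents are invented; the attribute itself is standard Rust):

```rust
// Crate-level attribute: the compiler rejects any `unsafe` block or `unsafe fn`
// in this crate, and `forbid` (unlike `deny`) cannot be overridden further down
// with `#[allow(unsafe_code)]`.
#![forbid(unsafe_code)]

fn main() {
    // Uncommenting the next line would be a compile error under this attribute:
    // let zero: u8 = unsafe { std::mem::zeroed() };
    println!("no unsafe code in this crate");
}
```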
I don't really know what a VMM consists of, so I'm mostly surprised that this project is half a million lines of code.
It's not unsafe that causes unsafety; it's how you wield it.
- Do you know your invariants?
- Have you documented them?
- If using an unsafe block, have you asserted them, or guaranteed that they hold for any input? (See the sketch after this list.)
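To make that concrete, here's a minimal sketch of the pattern (the function and invariant are invented for illustration): document the invariant, check it for arbitrary input in a safe wrapper, and state why the unsafe block is sound.

```rust
/// Returns the first byte of `bytes` without bounds checking.
///
/// # Safety
/// `bytes` must be non-empty.
unsafe fn first_byte_unchecked(bytes: &[u8]) -> u8 {
    // Restate the invariant; checked in debug builds only.
    debug_assert!(!bytes.is_empty());
    // SAFETY: the caller guarantees `bytes` is non-empty, so index 0 is in bounds.
    unsafe { *bytes.get_unchecked(0) }
}

/// Safe wrapper: the invariant is checked here for *any* input.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: we just checked that `bytes` is non-empty.
    Some(unsafe { first_byte_unchecked(bytes) })
}

fn main() {
    assert_eq!(first_byte(b"vmm"), Some(b'v'));
    assert_eq!(first_byte(b""), None);
}
```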
Granted, Rust is kind of mediocre at teaching you this. It raises a warning for an unsafe fn without a safety documentation block, but not when you omit safety comments on unsafe blocks.
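If I remember right, Clippy can close part of that gap: clippy::missing_safety_doc (warn by default) covers the unsafe fn case, and the allow-by-default clippy::undocumented_unsafe_blocks lint can be opted into to flag unsafe blocks that lack a SAFETY comment. A sketch, assuming Clippy is run on the crate:

```rust
// Opt in to the (allow-by-default) Clippy lint that requires a `// SAFETY:`
// comment immediately above every unsafe block.
#![warn(clippy::undocumented_unsafe_blocks)]

fn read_first(bytes: &[u8]) -> u8 {
    assert!(!bytes.is_empty());
    // SAFETY: asserted above that `bytes` is non-empty, so index 0 is in bounds.
    unsafe { *bytes.get_unchecked(0) }
}

fn main() {
    assert_eq!(read_first(b"ok"), b'o');
}
```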
No question, just pointing out where the *perceived* trustworthiness comes from. Whether it actually helps for something like a VMM is a whole other story. Marketing gimmick.
For myself only, there's also an implication that perhaps the authors are a bit more concerned with safety and security in general. (Don't reply with counterexamples. I know them already. I mean that as a trend, not a solid rule.) That is, the sort of person who might pick Rust for its features might correlate with the sort of person who cares more about those properties in the first place. That doesn't mean they're experts who can execute the ideas flawlessly, or that someone slogging in C couldn't carefully build the same project. I do think it means they're more likely to prioritize safety (yes, it's not 100% safe; yes, you can write insecure code in Rust; don't "correct" me) as an inherent design goal than maybe someone cranking it out in assembler.
> For myself only, there's also an implication that perhaps the authors are a bit more concerned with safety and security in general. (Don't reply with counterexamples. I know them already. I mean that as a trend, not a solid rule.) That is, the sort of person who might pick Rust for its features might correlate with the sort of person who cares more about those properties in the first place
I haven't seen any such correlation between software and its creator. A good example: Hans Reiser.