
Many people seem to think it is. I work in the area of secure software stacks, and the #1 “helpful” suggestion I get is “why don’t you use WebAssembly?”, as if that made a significant difference to the attack surface at all.



WebAssembly is designed to provide safety, in the sense that WebAssembly programs are sandboxed and unable to do anything you don't give them permission to do. WebAssembly won't make the runtime behavior of your programs correct, which is what the article seems to be getting at, but that doesn't make the WebAssembly VM model unsafe or insecure from a host perspective.

If you isolate untrusted code in a WebAssembly VM, that should reduce the attack surface for the system as a whole to whatever functions you expose into the WebAssembly VM.
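As a concrete illustration, here is a minimal guest-side sketch in C (hypothetical module and import names; clang's wasm32 target with its import_module/import_name/export_name attributes assumed). The only way out of the sandbox is through the imports the embedder chose to wire up:

    /* Hypothetical guest module; built with something like:
       clang --target=wasm32 -nostdlib -Wl,--no-entry -o guest.wasm guest.c */

    /* The host decides whether to provide "host.log"; nothing else is
       reachable from inside the sandbox. */
    __attribute__((import_module("host"), import_name("log")))
    void host_log(int code);

    __attribute__((export_name("run")))
    void run(void) {
        /* No ambient files, sockets, or syscalls in here: if the embedder
           only exposed host.log, that single import is the attack surface. */
        host_log(42);
    }

How small that surface really is then comes down to what the exposed host functions actually do.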

Personally, my bias is to suggest that using Rust would significantly cut down on (but not eliminate) the attack surface inside the WebAssembly program... but some people don't like to hear that.


The simple version:

WebAssembly does protect the host from a compromised process. WebAssembly doesn't do jack to prevent a process (and whatever data it controls) from being compromised in the first place.


It does in the sense that it totally protects the stack and doesn't allow creating new executable memory. Control flow can be manipulated, but only by changing function pointers to other functions of the same type.
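A hypothetical C sketch of what that constraint looks like in practice (assuming a stock clang/wasm32 toolchain): call_indirect only verifies that the target has the expected function type, so a pointer swapped for another function with the same signature still calls fine.

    /* Both functions have type int(int), so if an overflow of buf overwrites
       ctx->check, the indirect call still passes wasm's call_indirect type
       check -- it just lands in the wrong function. */
    typedef int (*check_fn)(int);

    static int check_password(int token) { return token == 0x1234; }
    static int always_ok(int token)      { (void)token; return 1; }

    struct ctx {
        char     buf[8];   /* an unchecked write here can spill over...   */
        check_fn check;    /* ...into the function pointer sitting nearby */
    };

    int login(struct ctx *c, int token) {
        return c->check(token);   /* compiles to call_indirect */
    }

    int main(void) {
        struct ctx c = { .buf = "user", .check = check_password };
        /* Simulate what linear-memory corruption could do: replace the
           pointer with another function of the same type. */
        c.check = always_ok;
        return login(&c, 0) ? 0 : 1;   /* "authenticated" with a bad token */
    }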


Control flow can also be manipulated via memory corruption, by putting the existing code into invalid semantic states and making it take different decisions.


Sure, but that's even weaker, since you can't even create new control flows, just cause surprising ones.


It is enough to work around authentication checks, for example.
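A minimal, hypothetical sketch of that (any C-to-wasm32 toolchain, run under any Wasm runtime): an out-of-bounds write that stays inside the module's linear memory never trips the sandbox, but it can still flip a flag the module itself trusts.

    #include <stdio.h>
    #include <string.h>

    struct session {
        char name[8];
        int  is_admin;   /* lives right after the buffer in linear memory */
    };

    int main(void) {
        struct session s = { .is_admin = 0 };
        /* 12 bytes into an 8-byte field: undefined behavior in C, but the
           write stays inside linear memory, so Wasm raises no trap -- the
           extra bytes land in is_admin. */
        memcpy(s.name, "AAAAAAAAAAAA", 12);
        if (s.is_admin)
            puts("auth check bypassed without ever leaving the sandbox");
        return 0;
    }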


It doesn’t even do a particularly good job of that, given the size of typical WebAssembly runtime code bases and reasonable bugs-per-kLOC assumptions. If you really want to protect the host, then run a small, probably correct interpreter or JIT that is simplified enough to prove properties about and has an implementation small enough to be potentially bug free.


Well, it certainly could be helpful if part of your stack wants to run inherently untrusted code. I think this is what gets people excited about the idea of putting it into the kernel.

It is a bit surprising, though, that many people do not realize WebAssembly doesn't make C code safe against itself.



