
Imagine thinking we should, literally, police engineering techniques.

If you build a bridge then you are expected to use techniques and systems that provide at least some degree of planned safety for the users of that bridge. It is virtually impossible to write a C++ program of any meaningful complexity that processes untrusted data in an unsandboxed environment that does not expose the owner of the device running that program to harm. To say otherwise is to ignore decades of observation.

Every single person who starts writing a new application in a memory-unsafe language that will deal with untrusted inputs is declaring up front that they are willing to tolerate the inevitable vulnerabilities and exploits caused by that decision.

I think it is very important that our industry develops a path to getting all such programs off of unsafe languages, since it is very clear that techniques like testing, fuzzing, and audits are not sufficient to actually produce safe programs.




I initially disagreed with your viewpoint and after reading your response you've actually changed my mind.

My only real gripe is I would prefer it came from the IEEE or something and not from some government agency; or worse, Oracle or someone trying to get everyone to use Java/their stuff.


I personally don't think that the IEEE would have any capability of really shifting the industry. It isn't like IEEE guidance for privacy preserving programs really moved the needle. You needed legislation like GDPR to do that (and even then it remains incomplete). Ultimately, adopting memory-safe languages for systems programming is going to be very expensive. You need more than just recommendations to make that happen.

I do think there is risk with legislation binding developers too much or forcing them into suboptimal approaches if things aren't written well. One could imagine legislation that does not permit the use of Rust because of the presence of `unsafe`, but that would be a terrible misstep.
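
To make that concrete, here is a minimal sketch (my own illustration, not any particular project's code) of why a per-keyword ban would misfire: the idiomatic pattern is a small `unsafe` block hidden behind a safe function that upholds the invariant, so every caller stays inside the checked subset of the language.

    use std::slice;

    // The same idea as the standard library's split_at_mut: a safe
    // wrapper hands out two non-overlapping mutable views of a slice.
    // The `unsafe` block is needed to express the aliasing argument to
    // the compiler, but the function upholds the invariant (disjoint
    // ranges), so callers never leave safe Rust.
    fn split_halves(v: &mut [u32]) -> (&mut [u32], &mut [u32]) {
        let mid = v.len() / 2;
        let len = v.len();
        let ptr = v.as_mut_ptr();
        unsafe {
            (
                slice::from_raw_parts_mut(ptr, mid),
                slice::from_raw_parts_mut(ptr.add(mid), len - mid),
            )
        }
    }

    fn main() {
        let mut data = [1, 2, 3, 4, 5];
        let (a, b) = split_halves(&mut data);
        a[0] += 10;
        b[0] += 100;
        println!("{:?}", data); // [11, 2, 103, 4, 5]
    }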


You make a very sound argument about the engineering perspective.

Unfortunately, many of the folks writing such software aren't (formally-trained) engineers. Would you suggest that they receive training which allows them to think of software as infrastructure? I'm genuinely curious, not being sarcastic.


> Would you suggest that they receive training which allows them to think of software as infrastructure?

I don't know. I don't know enough about detailed practices in fields like civil engineering to have any idea what would translate. I'm not convinced that "teach every software engineer to use model checking for everything they ever write" is going to be a winning approach. This is why memory safe languages are so valuable. You don't need to teach engineers new techniques. You just outright eliminate an entire class of vulnerability that has persisted despite efforts to eliminate it with other means.


> Every single person who starts writing a new application in a memory-unsafe language that will deal with untrusted inputs is declaring up front that they are willing to tolerate the inevitable vulnerabilities and exploits caused by that decision.

Meanwhile we banished Java and Flash from browsers, yet JavaScript still leads every Pwn2Own contest, because these "memory safe" languages are ultimately still implemented by humans paid to prioritize new features over security. I still haven't seen a website that absolutely needed multithreading; certainly nothing of note broke when it had to be disabled as a Spectre mitigation.


I worked on V8 for almost 7 years. It being written in C++ is a cause of a large number of issues. An even larger number of issues is caused by its absolutely massive complexity and the low-level nature of what it does, particularly the object model and the JIT compiler's complex optimizations. Low-level code is really dangerous and error-prone.

I think every VM should be rewritten in a memory-safe, GC'd language. While there are bugs at the meta-level (i.e. the compiler IR and object representation), making the runtime code itself memory-safe should be table stakes for even talking about a trustworthy implementation.


Memory safety is not a security panacea and you should run far away from anybody who says that it is. What I am arguing is that it is table stakes.

Browsers also have a uniquely difficult security challenge in that they, by design, execute untrusted code and compete based on the performance of their js engines.


I don't think comparing software to buildings is always apt.

If a building collapses, it's likely that people will die.

The consequences of failing software can be mere annoyances depending on the context of its use.

Obviously certain industries that use software have much more dire consequences of failure though (eg. large machinery, transport, health care).

I think one could come up with all sorts of analogies that fit or don't fit, such as applying a similar argument to door locks. Why should it be legal to use ordinary keyed locks on houses when they are so easy to circumvent with basic lockpicks?


Programs written in memory-unsafe languages are riddled with RCE vulns. This is true even for software written by companies that hire the very best security engineers in the world. The consequences of such software processing untrusted input are more than mere annoyance. This sort of vulnerability is the root of RATs operated by both criminals and oppressive states. It does not matter if your program is intended for something as seemingly non-critical as text messaging - it will still be used to cause terrible harm.

I do not think that the lock is a reasonable comparison here, because exploitation of software scales so much more effectively than picking locks. One exploit easily scales to millions of devices. So the harm caused by vulnerable software has a much higher ceiling than the harm caused by a weak lock.


The point of the lock analogy is to point out the absurdity of analogies here.


Then drop the analogy.

If I install software that was written in C++ on a device I own and it processes untrusted content, then I put myself at fairly major risk of all sorts of harm. There are only two resolutions for this problem:

1. No more memory-unsafe languages on security boundaries.

2. Extremely effective sandboxing and process isolation.

#2 has proven very hard. But we know how to do #1. We just need to spend the effort.


Part of the problem is that the actual impact of vulnerabilities in a program is often divorced from its actual purpose. A simple TODO list that allows RCE is one example. The impact also varies widely based on the user - is it just installed on a random personal computer? Or is it on a hospital server?

I don't know that it's particularly possible for a developer to truly understand all the possible impacts of an error in their program.

I'm not sure what the best way to handle that uncertainty is. Assuming all failures are critical would do the job, but certainly isn't free. However, doing something like what is suggested here - somehow requiring safer languages - might be a decent middle ground. The cost of using languages with more built-in safety features is often not very high; in fact, such languages often claim that those features make them cheaper to use.


Your point might make sense for web-facing software, because programs where lives are actually at stake are written in Ada or a subset of C with rigorous static analysis and engineering processes.

Now, it can't be denied that C and C++ are weak from a security perspective and that they should be avoided for network software as much as possible. But the problem with your take is the subtle implication that Rust is "safe" (not just memory-safe) when in fact there is no empirical evidence or track record of Rust being successfully used in anything remotely mission-critical. I mention this because you brought up the bridge example, when it is also possible that, due to language complexity, the new "bridge" built in Rust would turn out to be even more fragile (but just memory-safe).

Just the other day, there was a Rust GUI library posted here. The library uses a convoluted event-handling mechanism of passing enum values as messages, with an additional book-keeping burden, instead of straightforward closures, just so that the compiler can prove the code is safe (just memory-safe, mind you). It is possible that, because of such contortions required to pass the compiler, Rust could fare worse in the "general correctness" area[1]. It is just that we don't know yet. Even the particular safety issue mentioned in the GP comment could be solved by having built-in slice types and mandatory bounds checking (like Zig/Go/D). As usual, C/C++ have terrible defaults.
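
For what it's worth, here is a tiny sketch of what mandatory bounds checking buys (shown in Rust purely because it also has built-in slices; Zig/Go/D behave the same way): an out-of-range index becomes a defined, catchable failure instead of silent memory corruption.

    fn main() {
        let buf = vec![10u8, 20, 30];
        let i = 7; // imagine this index came from untrusted input

        // Plain indexing is always bounds-checked: `buf[i]` here would
        // panic with an out-of-range error rather than read or write
        // past the end of the allocation.

        // The checked accessor turns the same mistake into an ordinary
        // value that the caller has to handle explicitly.
        match buf.get(i) {
            Some(x) => println!("value: {x}"),
            None => println!("index {i} is out of range, rejecting input"),
        }
    }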

Again, I agree that there is a need for secure alternatives to C and C++. But the contention is whether Rust is that. Even Rust is actually far from optimal in the "safe systems language" space. There might be languages in the future that are as fast as Rust but more ergonomic. Microsoft Research, for example, is creating a research language named Verona[2] that aims to be memory-safe and concurrency-safe. There are also other attempts, like Vale[3], that aim at this space. It is premature to think that Rust is the final evolutionary step in the landscape of systems languages and to suggest that everything be moved to Rust ASAP. It often appears like reckless fanaticism.

[1]: There are, in fact, a few pieces of "anecdata" about Rust being less reliable: https://news.ycombinator.com/item?id=24027296 https://dev.to/yujiri8/it-seems-like-rust-software-us-bad-hk...

[2]: https://www.microsoft.com/en-us/research/project/project-ver...

[3]: https://vale.dev/


> But the problem with your take is the subtle implication that Rust is "safe" (not just memory-safe) when in fact there is no empirical evidence or track record of Rust being successfully used in anything remotely mission-critical.

An application built from the ground up in a language like Rust is going to have fewer vulnerabilities than the same application built in C++. I say this as a person who loves C++ and is intimately familiar with the state of the art of securing C++ applications. I am not proposing an immediate rewrite of everything, though I do personally believe that the "well, a rewrite will just introduce more vulns" concern is overblown. I expect papers at ICSE in the not too distant future to be able to validate one of our views.

Rust is not flawless, not even close. There are other alternatives and there can even be new languages in the future, but it has the most mindshare and it is an alternative today. For many years, people would simply say that there was nothing that could compete with C and C++ for systems programming. Rust very nearly handles all of the common use cases. But... I didn't even really mention Rust in my post so I think it is especially difficult to call me an evangelist for it.

Like it or not, ideas in research languages take ages to filter into real world ecosystems. My PhD is in the intersection of static analysis and security. I love this research. But the honest truth is that waiting for MSR to produce the path forward is not a winning strategy. Languages need ecosystems and I think it is more likely that the future will come from industry than directly from academia.


> the subtle implication that Rust is "safe" (not just memory-safe)

It's not a "subtle implication", it's a fact that Rust is also data-race free and thus concurrency-safe in the same sense you're attributing to Verona, although for very different reasons - it can't introduce data races. Verona, unlike Rust, is not in fact a production system; it's an academic toy for pondering new ways to approach concurrency. Perhaps ten years from now its findings will influence future Rust development.

It's certainly interesting that we're still at the place where people are going, "This is only better if you can't afford GC", when even Java is markedly less safe than Rust since it doesn't prevent data races. (Yes, there is ConcurrentModificationException for this; no, Java doesn't promise to raise this exception; and if it does happen, it might already be too late.)
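
A toy sketch of the "can't introduce data races" claim (my own example, not anything from a real codebase): sharing mutable state across threads simply doesn't compile until you opt into some form of synchronization.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        let data = vec![1, 2, 3];

        // Rejected at compile time: the closure would borrow `data`
        // across a thread boundary with no synchronization, so the racy
        // version never builds:
        //
        //   let handle = thread::spawn(|| data.push(4));

        // The compiler forces an explicit choice, e.g. shared ownership
        // plus a lock:
        let data = Arc::new(Mutex::new(data));
        let d2 = Arc::clone(&data);
        let handle = thread::spawn(move || d2.lock().unwrap().push(4));
        data.lock().unwrap().push(5);
        handle.join().unwrap();
        println!("{:?}", data.lock().unwrap()); // e.g. [1, 2, 3, 5, 4]
    }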


Did they edit their comment? I saw nothing about Rust in it


No, that is why I wrote "subtle implication" there. Unfortunately, on online forums the term "memory safety" (which is a well-defined term in computer science) is nowadays almost always used in the context of Rust evangelism. I would be very surprised if the GP's actual intent was that Zoom should have been written in a garbage-collected language rather than Rust. The wider context of this discussion is whether memory-unsafe languages (i.e., C/C++) must be made illegal, with the implicit suggestion that Rust be pushed as the alternative. If C/C++ is made illegal (because of "memory unsafety"), then guess what the legal alternative would be if you can't afford GC overhead. Moreover, for people not using C/C++, the question of memory safety/unsafety doesn't even arise in the first place.


> But the problem with your take is the subtle implication that Rust is "safe" (not just memory-safe) when in fact there is no empirical evidence or track record of Rust being successfully used in anything remotely mission-critical.

If you don't need the performance characteristics or OS-level interaction offered by systems languages, then please use an interpreted language. Please please please please please. But there aren't a lot of new projects started in C or C++ that fit this, since people have known for decades that using something else will be better if you don't need the specific features offered by systems languages.

> The wider context of this discussion is whether memory-unsafe languages (i.e., C/C++) must be made illegal, with the implicit suggestion that Rust be pushed as the alternative.

I never said this and it would be wildly ridiculous for me to suggest this. I mention Rust elsewhere to describe poorly written legislation, not to say that legislation must demand that everybody bow down at the feet of the Rust community and donate their first-born child to the borrow checker.

You are reading way too much into my post.


Fair enough. I felt compelled to post in this thread because I've seen the "ban unsafe languages" sentiment expressed several times here and on Reddit before (especially on r/rust, where I remember reading some comments with a hostile tone written by people who were serious about it). Your initial comment in this thread resembled one of those.

I think you've misunderstood why I mentioned Verona and Vale, though. It was to challenge the notion that no language other than Rust could be more ergonomic, with slightly different trade-offs. Moreover, I agree with your point regarding the ecosystem.


You sound paranoid.


That is a neat attempt at making it appear like I am somehow deluded and am imagining Rust evangelism. The person I replied to made a comment downthread that literally states that Rust must be given a free pass despite `unsafe` blocks in the face of such legislation against unsafe languages. Sounds completely illogical to me.

https://news.ycombinator.com/item?id=28343526


You can disable security features in Java as well. Elsewhere, you mention GC'd languages as an alternative. Would it be appropriate for me to assume that you are a Java evangelist and then criticize you for not considering the harm that can be caused by turning off stack inspection? That's what you are doing to me.

The fact that the default is safe matters. It matters a lot. Heck, if you want to use C++ with a sound static analysis tool then I'd support doing that and I'd hope that legislation would support that too - but I think you'd be working 10x as hard as really necessary.


Yeah, I'm literally saying you are deluded and imagining things. The post you replied to mentions multiple GC and non-GC languages. That you also have a bad opinion about unsafe isn't really important.


Throwing ad hominems at people criticising your language is not a good long-term strategy, though it might appear to work for a while.

Not only was there no mention of "multiple GC and non-GC languages" in the comment I replied to or in the parent comments (except for a single mention of C++), I also don't get why I have a "bad opinion about unsafe" (or where I claimed it was important). Such a friendly community. Now I see why people don't engage with Rust evangelists. Lesson learned. Anyway, have a nice day!



