
Servo is a waste of time. If we want a fast rendering engine, Mozilla already has it.

If we want a secure rendering engine we could leverage code checks.

It's all there. The meme of Rust equals safety (or C equals unsafety) has to go away.




Servo was about parallelism.

Mozilla tried multiple times to parallelize CSS style calculation in Gecko, which is written in C++, and every attempt failed. When they tried again in Servo with Rust, they succeeded on the first attempt.

They integrated the Rust-written parallel CSS style calculation into Gecko. As a result, to this day, Firefox is the only web browser that parallelizes CSS style calculation, and it beats every other browser in CSS style calculation performance.

The meme that Rust is easier to parallelize is true.
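
To make that concrete, here is a minimal sketch of the kind of data-parallel traversal involved. This is not Stylo's actual code, and Node, Style, and compute_style are made-up stand-ins, but it shows how Rayon (the data-parallelism crate Servo's style system builds on) turns a sequential loop into a parallel one while the compiler checks that the closure cannot race on shared data:

    use rayon::prelude::*; // rayon = the data-parallelism crate Stylo uses

    // Made-up stand-ins for illustration only.
    struct Node { text: String }
    struct Style { width: u32 }

    fn compute_style(node: &Node) -> Style {
        Style { width: node.text.len() as u32 }
    }

    fn main() {
        let nodes: Vec<Node> = (0..1_000)
            .map(|i| Node { text: format!("node {i}") })
            .collect();

        // Changing .iter() to .par_iter() parallelizes the traversal; the
        // compiler rejects the change if the closure could race on shared data.
        let styles: Vec<Style> = nodes.par_iter().map(compute_style).collect();
        println!("computed {} styles", styles.len());
    }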


I don't think this has anything to do with the language itself. If anything, you could claim the same for C++, since "easier to parallelize in Rust" derives from the fact that Rust models pretty much everything as a shared-ptr, so many gotchas you would normally hit in multithreaded (but not concurrent) code disappear. Since you have shared-ptrs in C++ as well, you can achieve pretty much the same thing, and quite easily.

So I think that the programming language as the underlying reason is most likely the wrong premise to start from. IMO there's a huge difference between "here's several MLOC with 20 years of legacy/baggage, now make N% of that non-trivial work run faster" and "here's a greenfield project with 0 LOC, no legacy, no baggage, no code to learn, now please write the algorithm from the ground up". I think the latter is much more likely to be closer to the truth.


Rust doesn't model everything as a shared_ptr; it gives you a choice of tools that fit different use cases, just like C++ does. The difference is that if you mess up, Rust is massively more likely to detect it at compile time.
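
As a toy illustration of "detect it at compile time" (not code from Gecko or Servo): sharing a plain mutable value across threads is rejected by the Rust compiler, and the version it does accept forces the shared ownership and locking to be explicit:

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // This version does not compile: the closure would mutate `count`
        // from several threads at once, and the compiler rejects it.
        //
        //   let mut count = 0;
        //   for _ in 0..4 {
        //       thread::spawn(|| count += 1);
        //   }

        // The version the compiler accepts makes sharing and locking explicit.
        let count = Arc::new(Mutex::new(0));
        let handles: Vec<_> = (0..4)
            .map(|_| {
                let count = Arc::clone(&count);
                thread::spawn(move || *count.lock().unwrap() += 1)
            })
            .collect();
        for h in handles {
            h.join().unwrap();
        }
        println!("count = {}", *count.lock().unwrap());
    }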

I agree that starting from scratch can make a huge difference, but if you're starting from scratch anyway, why not use a language that will prevent you from making mistakes in your design?


I did not mean "everything" in the broader sense, but in the context of writing "easy" multithreaded programs. Pretty much everything in that case becomes modeled through shared-ownership or message-passing semantics.
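
For the message-passing half of that, a minimal Rust sketch (the same pattern is of course expressible in C++ with a thread-safe queue): each worker owns its data, and only results cross thread boundaries:

    use std::sync::mpsc;
    use std::thread;

    fn main() {
        let (tx, rx) = mpsc::channel();
        for id in 0..4 {
            let tx = tx.clone();
            // Each worker owns `id` and sends its result back; no shared state.
            thread::spawn(move || tx.send(id * id).unwrap());
        }
        drop(tx); // close the original sender so the receiver iterator ends
        let total: i32 = rx.iter().sum();
        println!("total = {}", total);
    }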

Since those same mechanisms are available in C++ and other languages too, the argument that some specific algorithm re-implementation from scratch was more successful only because it was written in Rust doesn't hold water. It was successful, for some arbitrary definition of success, in major part because it was a greenfield project.

I believe that suggesting otherwise is plain wrong and misleading.


You might be right, but you're stating this without any evidence, so I don't think it's clearly "wrong and misleading". There are many cases of software rewrites failing, so I'm not sure you can take for granted that "greenfield project" implies a higher success rate, and even if you did, I don't see how you can judge how much of this was due to it being rewritten from scratch versus it being in Rust, to claim "major part".


What I said is common sense. It applies across the industry regardless of the programming language used. On the contrary, where's the evidence suggesting that Rust is what made the Gecko rewrite succeed? Has there been any rewrite from scratch in some other programming language?


There were two previous attempts at parallelizing CSS layout in Firefox. Both were in C++. Both were abandoned after being unsuccessful. The Servo folks credited Rust's safety guarantees as the reason why they were able to be successful on the third attempt.

https://www.youtube.com/watch?v=Y6SSTRr2mFU


Were the C++ attempts from scratch, or were they based on changes to the existing codebase?


I mean, the Rust version was also put into the existing codebase, so it's not clear to me what distinction you're making.

But this presentation was made seven years ago, and the attempts it's talking about are even older, and I wasn't involved with them. So I don't know the answer to your question.


The distinction is whether you're rewriting something from scratch, carrying no baggage from the warts of the existing system, or you keep organically growing the existing code to meet new requirements. The latter is usually much, much harder. I hope it's clear now.


Understood. Yeah, I do not know.


No, I don't think there's any concrete evidence either way. I'm not trying to argue that it was Rust that made it succeed - I'm sure in reality it was some mixture of both, as well as other factors.


Having to work through and learn an existing codebase to make substantial improvements often requires substantial effort, sometimes even rewiring the architecture itself. That's enough evidence for me.


Firefox owes quite a bit of its speed to Servo... so that's a weird take.


Honestly, I think Servo is a worthwhile endeavor, if only because it has the potential to introduce some needed variety into the browser engine space.

Especially if they can dial it in with Tauri as a viable Electron/Blink alternative.



