Servo: Inside Mozilla's mission to reinvent the web browser (zdnet.com)
212 points by brson on Feb 25, 2014 | 181 comments



Why are so many people here on HN so eager to throw away html/css/js, just because they're a little icky and they don't like using them?

Do you not realise that means throwing away the entire web, and its history going back to the early 1990's? Backwards compatibility and platform stability are GOOD THINGS that we need more, not less of. Lest we perpetually build content for doomed platforms, and leave no mark, no history on the world.

The better course is evolution, not revolution.

Mozilla has been through every revolution, first hand. They know what they're doing.

edit: I'm flattered but please vote up the comments relevant to the actual post.


In my opinion HTML/CSS/JavaScript is the best interface design technology ever invented. I believe it gets a lot of hate because:

1) Amateurs can use it easily and poorly
2) It is not taught in computer science class
3) HTML/CSS is not programming
4) HTML docs attempt to be responsive/liquid as opposed to traditional screen painters

These are actually good things (and #2 can be fixed) but they represent a significant philosophical shift that most traditional developers don't want to accept.


> In my opinion HTML/CSS/JavaScript is the best interface design technology ever invented

I thoroughly agree with this if you say "interface implementation technology" instead of "interface design technology". I also believe anyone who disagrees has never fully explored the various print/console/mobile/MFC/ActionScript worlds. I mean... have you ever even USED something like PageMaker?


I am an expert in web technology (approaching 15 years of experience) and I absolutely disagree that it's the best we have.

Edit: damn, it's more than 15 years already. I brought home my first HTML spec printout in 1996 and got my first money for a website I built in 1998. I still remember fixing the <center> bug in IE3 when it came out…


I used the web in 1993, in the fall, with mosaic, and grey backgrounds. On a 286. With windows 3.1, and 16 colors.

So that makes it 20 years. (dangit!)

Anyway, I think JavaScript made a mess of it. The point of the web was to link documents. JS makes links obsolete, as they are now relegated to JS-app entry points.

Before I get too far into the weeds, I'm gonna go read the article.


Shouldn't calling something "obsolete" mean it's no longer compatible or practical? Links work! They will always work!


The problem is not that JS is worse than links or forms. It's the most potent hypermedia technique (code-on-demand), but it lies behind the Turing horizon: you can't statically determine what it's doing.


What do you feel is the best that we have? I'm not trying to start an argument, merely asking for information. My own experience with user interface development outside of the web is pretty limited; I've created a few WinForms apps and done some OpenGL stuff in college, but that's basically it. Compared to those, I prefer the web stack.


I don't know about rimantas, but I really wish CSS would adopt Cassowary the way Apple did for Cocoa Auto Layout. Way better than flexbox.

Cassowary is hands down the best tech for interfaces.


Well, or it gets hate because you have to code for multiple competing implementations that aren't good at interoperability.


> In my opinion HTML/CSS/JavaScript is the best interface design technology ever invented.

Given my experience with the dozens of UI frameworks we have enjoyed since the early '80s, I cannot agree.

It cannot be called anything else other than a big hack.


You'd think if something were the most successful content platform since the invention of movable type, the scientific approach would be to figure out why, instead of trying to destroy it.


One huge benefit of HTML/CSS/JS is this: you are downloading the actual source code. This allows different platforms to interpret it as they will. It also allows developers to learn from each other in an open way. Of course, it allows for security audits as well, since JavaScript can be profiled. Overall, it's the most communicative form of programming.


Please breathe, nobody is "destroying" anything.


Of course not. You can't, despite how much you might want to, and I think a lot of people here do.


There is a very large, important difference between "best ever invented (so far)" and "best possible".


If XAML were some open source project made by some hippie group of developers, HTML/JavaScript/CSS would be in the toilet today. It is miles ahead of the barely moving technologies held back by the standards committees.


I'm not sure where you are going with that, but if you think you're going to convince millions of people to rewrite their websites because $NEW_TECH is now on the market, think again.


He is politely giving an example of a superior markup language. Just because developers are effectively forced to eat shit, we shouldn't pretend it's a piece of nice cake. Things like angu-bootstrap-whatever are just a thin layer of icing on top of a big pile of manure.


That's not what parent is saying. Parent is claiming that the only thing preventing us from welcoming our new XAML overlord is that it's made by Microsoft and not an "open source" project made by "some hippie group of developers". That's hogwash.


Is it??? A guaranteed prerequisite for the next advancement in web tech is certainly that it does not come out of one of those 'evil corporations'...


It's hogwash because nobody is going to invest billions to remake the Internet with a non-backward compatible technology, wherever said technology comes from.


I think you're making a great number of assumptions.

There's no reason we can't keep backward compatibility as long as it's desired.

No one is "throwing away the web", whatever that even means.

We can always put the current stuff in a nice museum and pay tribute to the technological importance it achieved during the evolution of the web, rather than endlessly evolving a seriously flawed specific piece of technology.

Also, this really is not a "want"; rather, the real problems are creating real needs.


> There's no reason we can't keep backward compatibility as long as it's desired.

That is exactly what Servo is doing.


"I would have thought ditching HTML, the box model, CSS and JavaScript to be a more important objective that should come sooner than this. Or another way to put this would be: Wouldn't development efficiency/simplicity be more important than running the current junk in parallel?"

Does "ditch" and "junk" mean something other than the common usages of these terms to you? The only assumption I'm making is that you're writing in english.


One man's junk is another man's treasure. Looks like you're offended.

I was suggesting ditching specific technologies, not the "web" as you put it. So it's... what the web means... that I was referring to.


The web is made of these technologies.


But not limited to them.


Right. You can use XHTML 2.0, you can use Flash, you can use Java applets, you can use iOS apps, you can use Android apps. You can use Director, you can use AXR, you can use ActiveX. You can even use XAML, XUL or Silverlight.

Go on. Use them. I dare you. See how long you last.

Or invent your own platform and see how quickly everyone switches over to it. I'm sure the only thing stopping this from happening is that nobody has tried. Sure!


Why are you taking this so personally? When centering a div vertically and horizontally is such a hassle, there can definitely be some improvements in the system. HTML lacks many semantics that would cut down on div.content or .main or any number of similar classes. CSS lacks any sort of inherent structure (like SCSS gives, or taking it further with SMACSS) and can quickly become a huge mess. JS is still a terrible language with no support for basic things like classes or importing.


> HTML lacks many semantics that would cut down on div.content

<section> and <article> are new semantic containers for content.

> or .main or any number of similar classes.

<main>?


Why are you taking it so personally? Why are you reading such things in? It's totally irrelevant.

I didn't say that html/css/js doesn't need improvement. That's changing the subject. The question is: why are the people here so eager to "ditch" (as in totally discard) html/css/js?


ES6 will have classes and modules.


Is it pushed to 2020 now, or just until 2015? If the weak typing is still there, it's loosely interpretable text rather than a language anyway.


> Mozilla has been through every revolution, first hand. They know what they're doing.

Oh yes, I do remember those nice <layers> in NN4 and that CSS wannabe mingled with JS.

What I am grateful for is Firefox (before the rise of Chrome). However, I think the proper mindset left Mozilla together with Blake Ross, and all that's left is posturing and politics.


NN4 is not Mozilla.

Mozilla is making Firefox and helping keep the web open with new APIs and standards. It is helping with the politics side regarding net neutrality. It is helping millions learn how to build web pages through the webmaker.org program that teaches web literacy, and moving onward to teach app making with Mozilla Appmaker. Mozilla is helping bring the next billion people online by working with partners to offer affordable mobile devices in emerging markets where people can't afford computers or high-end devices. That's more than posturing... That's the result of a global community of volunteers, staff and partners working together to make the web and the world a better place for everybody.


> Do you not realise that means throwing away the entire web, and its history going back to the early 1990's?

The web is more than the browser. The network protocols are what matter, not the browser.


That's even worse. By that metric we should have just killed it a long time ago.


HTML etc. get flak because they're badly designed. What do you want? Semantic markup, document layout or distributed programming? Sorry, but the web people got _everything_ wrong. We don't have a platform that does any one thing well. The whole thing is a laughable dog's breakfast with no layering, no foundations, no unity of design. The better course is evolution... for Mozilla, who have been very good at maintaining their market position while endlessly bungling platform design.


And yet despite its flaws it's become the most successful platform for content since movable type. Java failed to do that. iOS, despite its financial success, is failing to do that. Android is failing to do that. Flash failed.

Aren't you the least bit curious why it succeeded despite not being to your taste? Do you think it's just accidental? That the laws of physics and rationality were suspended for one brief glorious moment in the 1990's? Are you a web creationist?

Or do you believe that there's an actual answer?


I think part of it was the wide availability of web browsers, combined with the explosive adoption rates of internet usage... as time moved forward, entrenchment of the existing sites (and their quirks) set in...

If we'd started with XML-compatible markup (all tags must close in order), and no browsers supported a quirks mode... we'd have much cleaner web browser engines and a much more usable web today.

I think that JS has a few quirks as well... so does CSS. JS and CSS came after HTML, and even then have grown/distorted a bit. XHTML broke too many things, so we went pragmatic with HTML5. Just the same, no "new wheel" will get adopted in this space wholesale. People have ditched XHTML and run back to HTML5.

I think a lot of things could be better, and will get better... so long as there are billions of pages/sites out there as-is, quirks mode browsers aren't going away.


Well-formedness? _That's_ the problem with the web? Not slow performance? Not the lack of a realistic offline story? Not a loosely-typed, dynamic language? Well-formedness.

Sigh.


How much of a browser's time is spent in JS vs. rendering? How much overhead is spent on parsing/rendering? JS isn't even 25% of the overhead on most sites, or even in dynamic applications. Rendering, reflows, and other UI elements are. Understanding this in the scope of JS is important, as this is where it gets triggered. Hell, having something like AngularJS out of the box in the browser earlier on would have helped a lot.

Just the same, it started with well-formedness being loose, and continued from there.


Sure, slow rendering/layout/performance would be a perfectly reasonable complaint about the web platform.

The well-formedness thing is, sorry to be blunt, a totally crazy thing to complain about.

Every single platform that has any popularity introduces rough edges like this over time. It's impossible not to because every single bug that introduces relaxations gets baked in as content comes to rely on it. It is impossible to ever remove those relaxations, and really, it's totally fine.

There is a cost to lack of well-formedness, but on the list of problems with the web it is waaaaay waaaay down there.


"Content" is the key there. But I am at loss what exactly iOS and Android are failing to do? I think they do very well with music, movies, books, etc, thank you very much. Hypertext content? Well, we do have web for that.


Oh, iOS and Android are doing just fine, as of this moment. Check back in 5-10 years though.

Just how exactly do you archive that content anyway? Anything you "buy" on those platforms is not something you own. It's something you're licensing for short-term use. Doesn't sound like a great outlet for culture to me. It sounds like a death trap.


Well, how do you explain the fact that the Web did not win on the desktop? I am still puzzled why people think the web should take over mobile for some reason, but never mention the desktop.

> Just how exactly do you archive that content anyway?

Why should I archive anything? I don't archive web pages I visit either.

> Anything you "buy" on those platforms is not something you own.

I don't care if I "own" something. Owning for the sake of owning means nothing to me. If I pay for a book, it is because I want to read it. If I pay for music, it is because I want to listen to it. Even the CDs I do own are now represented by their cloudy ghosts using iTunes Match. Why? Because they are always there. I don't have to walk around with a backpack full of CDs just in case I'd like to listen to a particular song. I can get it on any of the devices I use. Yeah, I don't have an install DVD for every app I bought. I don't care. I change my phone: they are already there. I get a new Mac: I go to the App Store app and just click "Install" for every app I want to have on that machine. Maybe to some it is a death trap, I don't know.


"Well, how do you explain the fact that Web did not win on desktop? I am still puzzled why people think web should take over mobile for some reason, but never mention desktop."

Define "win". From my point of view, the web has not only won desktop, but utterly dominated it, and relegated the rest of the OS to a mere substrate for webpages. The only things it hasn't really replaced are photoshop, final cut and protools. It's only a matter of time.

"Why should I archive anything? I don't archive web pages I visit either"

Oh boy. Paging Jason Scott. Jason Scott on aisle 12.

" I don't care. I change my phone: they are already there. I get new Mac:"

I'm glad you can thrive only on corporately produced content you license for brief periods of time. Many people out in the world are not corporations, and produce things that they care about. Many use computers to do this. Most care about the thing they made, and not about the tech they made it with. And so it ends up in these closed-off little data silos and proprietary formats: not only are these things not backed up, they can't be backed up.

And then those people die, and there is nothing left of that person except what they made on the iPad with iOS 6. The apps the things live inside are not compatible with the newer iOS. When that iPad dies, it's like the father, the lost son, the missing daughter: they die for a second time.

But you know, it's good that owning that stuff doesn't matter to you, and therefore should not matter to anyone else.

ta.


The web did win on the desktop. New desktop apps are web apps, except for clients to sync your files with a cloud service. The popular desktop apps all predate the rise of the web.

Also, having books and movies stuck in your Amazon or Apple account is convenient for consumption, but horrible for creation. You can't DO anything with that content. I've wanted to extract interesting stuff from books I own before and was forced to make screengrabs. If you don't understand what's wrong with the media rental model, go read "The Right to Read" by Stallman.


It is already there... as long as the provider allows it. _That_ is the death trap. You can download a file (and back it up locally), not a stream.

Well, yes, technically, we as tech-savvy people could, but the commoners (not to mention the technophobes), they don't know how.

And that is how it is designed. You don't need backups; it's in the cloud. What happens when it is not anymore?

It is not owning but rather preserving that is the concern. Is everything worth preserving, though? I don't know.


It is not clear what you refer to when you say "buy".

By the way, you have drifted away from discussing the technological aspects of the Web. This line of argument doesn't help establish why HTML/JavaScript/CSS are better.


HTML can be archived. Backed up. Saved. Become a part of history.

Apps cannot.

That's why HTML matters.


The man accusing me of "web creationism" jumps from "HTML etc. suck and were poorly designed from the start" to "there's no explanation for their success". And in the same breath you engage in magical thinking regarding the cause of the popularity of the web (indeed, I would bet you aren't capable of mentally separating "the web" from implementation details like HTML, CSS etc.). Creationism was highly "successful" for the longest time too. Bravo on looking an absolute fool.


Simple, because they have personally experienced better. They are tired of libraries, CSS/JavaScript compilers and hacks to try to drag old clunky technology into new paradigms.


>Do you not realise that means throwing away the entire web, and its history going back to the early 1990's? Backwards compatibility and platform stability are GOOD THINGS that we need more, not less of. Lest we perpetually build content for doomed platforms, and leave no mark, no history on the world.

> The better course is evolution, not revolution.

Candles, oil lamps, and torches were pretty good at providing light at night in the 18th century; then came electricity and the light bulb, and everyone slowly threw away their candles for something better. Many people tried to design better candles instead of switching over: candles that would burn brighter and longer, candles that would not blow out as easily and did not put off as much smoke. But you know what? These were mere evolutionary changes and could not "hold a candle" to the revolutionary benefits of the light bulb.

Sure, it took a lot of work for people to take their oil lamps and candle fixtures off of the walls and ceilings. It was a lot of work to run wiring and put up new electric fixtures. But the time spent was an investment. Eventually it led to a much better solution and a lot less effort in total.

You are one of those guys who was using and evolving candles and oil lamps. You just can't see the bigger picture.

The HNers who are not happy with still being forced to use candles are the real innovators... the technologists. You who are happy with the candle are merely paint-and-canvas artists.

Paint-and-canvas artists in the 80s and 90s started to scoff when computer art and graphics began to gain prominence. These artists lacked the skills to do art with the new technology, and lacked the ambition and passion required to learn something new that would have diversified their artistic talents. They had their tools and they didn't want new ones. Their opposition was actually the result of a fear of being displaced; they wanted to stay in their stagnant comfort zone.

Many web designers today are much the same. You want to make art with your old tools and are not really technologists at all, perhaps adopting new brushing techniques (JS libraries) in some instances, but never any major evolutionary change.

This stagnates innovation; even worse, you are actively supporting the suppression of choice because you are afraid that you may become obsolete in your complacency with the technology behind your art.


This is so cool, and I love that Rust is being so heavily driven by the pragmatic needs of an ambitious project like this. As a systems guy, Rust is the most excited I've been about a new language in years.


Yes, Rust is very promising, especially if you are looking for better alternatives to C++. I wish it would stabilize and someone could write an in-depth and well-structured book on it. I have an easier time learning languages that way, when ideas are explained and well presented, rather than deducing it all from very lacking documentation. In the case of Rust it's even more difficult since it's packed with new ideas.


> I wish it would stabilize

Should be later this year.

> and someone could write an in-depth and well-structured book on it.

Give me some time. ;)


Let me first say I really loved your Rust intro and I'll be buying any new book you produce.

Question: are there any moderate-sized codebases (i.e. not Servo, not Rust itself) that can be used to show idiomatic Rust? In my own experimentation (a simple image utility to resize images and convert to b&w), I ended up writing it a lot like I would C, which I know isn't correct based on snippets I've seen. Or rather, it's correct in that it works, but it's not particularly idiomatic and the Rust elite would sneer at me.

It's like the early days of C++ when people were writing it as "C with classes", complete with pointers everywhere, malloc, etc. A lot of that could have been mitigated with some easily digestible source that clearly demonstrates the idioms. Maybe a small game, a simple text editor, etc.? Does anyone know of anything like this? Or is it too soon for this?
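To make the "C with classes" point concrete, here's a minimal sketch of the kind of shift I mean, written in today's Rust syntax (the 0.9-era syntax discussed here differed, and the function names are made up for illustration):

    // C-style: manual index bookkeeping, the way I first wrote things.
    fn sum_even_squares_c_style(xs: &[i32]) -> i32 {
        let mut total = 0;
        let mut i = 0;
        while i < xs.len() {
            if xs[i] % 2 == 0 {
                total += xs[i] * xs[i];
            }
            i += 1;
        }
        total
    }

    // Idiomatic: iterator adaptors, no manual indexing at all.
    fn sum_even_squares(xs: &[i32]) -> i32 {
        xs.iter()
          .filter(|&&x| x % 2 == 0)
          .map(|&x| x * x)
          .sum()
    }

Both do the same thing; the second just leans on the language's own idioms instead of transliterated C.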


You might be interested in the ongoing reconstruction of Cave Story written in Rust: https://github.com/drbawb/rust-story


Yes, this is the size of project I'm talking about. Thanks for pointing this out.


Thanks. I am currently finishing up "Rails 4 in Action" and "Designing Hypermedia APIs," both of which I've been working on for far too long. In this sense, I don't mind that Rust isn't ready for a full book. ;)

I am not entirely sure that 'idiomatic Rust' is something that's fully figured out yet. Rust itself is _certainly_ not idiomatic, as it's a hodge-podge of several historical coding styles.

As I said when I spoke at the Bay Area Meetup, my position in Rust is 'eternal newbie,' so I'm not actually the best one to ask here. Maybe pcwalton can chime in.


"...but it's not particularly idiomatic and the Rust elite would sneer at me."

OMFG... and in this day and age it is soooooo much more important not to be sneered at by the "elite" than it is to have working code...

Man, do I miss the days when programming was about getting stuff working, not worrying about the political and social impact of your style...


Your comment misses the point entirely.

As you know, Real Programmers can write FORTRAN in any language.

But, do we want to?

No, we want to learn what is new and useful in a new language. We want to learn how to make best use of that language.

That doesn't mean writing code that looks just like the language we used before. What would be the point of using a new language, then?

No one cares what the "Rust elite" think. Didn't you see that that was just a humorous throwaway comment following a much more important point: writing idiomatic code?

Learning to write idiomatic code helps us see what is new and useful in any language.

We won't learn that by writing FORTRAN in the new language.

But seeing how experienced users write their code is a great way to see what is good and interesting about the language.


Working code is far less valuable than maintainable code.

EDIT: not always, of course, but unless it's a one-off tool, it doesn't make a lot of sense to NOT be idiomatic.


I think something was lost in translation here (assuming you aren't a native English speaker). This was a joke.


Thanks, I'm looking forward to it :)


> Yes, Rust is very promising, especially if you are looking for better alternatives to C++.

Go kind of wanted that prize. It didn't quite get it.

Many will say how they care about cool features, built-in concurrency, nice stuff, and in the end, when it comes to releasing a product, if performance isn't there, it won't replace C++.

Some program in C++ because they love the language. They really do. There are people like that; I've met one, once. Quite often others do it because the rest of their world does (libraries, coworkers), and because of performance.

The niche where performance matters (and I understand that is a loaded word, so to speak) is going to be hard to get into.


Go was never supposed to replace C++; it was supposed to replace Java in server applications. Happily, it seems Python and Ruby developers are finding good uses for Go too.

How was a garbage-collected language ever supposed to take the place of a manually memory-managed language? The area that C++ shines in is systems programming, and you can't do that with Go, because of garbage collection. Rust, on the other hand, is precisely made for systems programming.
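To make that concrete, here is a minimal sketch of the ownership model that lets Rust manage memory without a collector (written in today's Rust syntax, which postdates this thread):

    fn main() {
        {
            let buffer = vec![0u8; 1024]; // heap allocation, owned by `buffer`
            println!("buffer holds {} bytes", buffer.len());
        } // `buffer` goes out of scope here: the memory is freed at this
          // statically known point, with no GC pause and no runtime tracing

        let a = String::from("hello");
        let b = a; // ownership moves to `b`; `a` can no longer be used,
                   // so a double free is impossible
        println!("{}", b);
    }

That deterministic, pause-free reclamation is exactly what GC-sensitive domains like games and browser engines care about.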


> Go was never supposed to replace C++; it was supposed to replace Java in server applications.

That is simply not the case. Rob Pike himself describes how he was working on a large C++ application when he thought of creating Go:

http://commandcenter.blogspot.com/2012/06/less-is-exponentia...


I do like C++, have had my bookshelf full of C++ books and magazines since the C++ ARM days, and still follow the C++ standard even if I nowadays tend to spend more time with other languages.


> Many will say how they care about cool features, built-in concurrency, nice stuff, and in the end, when it comes to releasing a product, if performance isn't there, it won't replace C++.

That's why Rust is about zero-cost abstractions, period. Unlike almost every other new language (for example, Go), we don't accept abstractions that compromise performance.

This is an article about Servo. We care about performance for Servo—along with safety, performance is Servo's entire raison d'être. Much of the focus on performance in Rust is motivated by the desire to build an engine that's faster—not just "as fast as" or "within the ballpark of", faster—than the existing C++ engines.
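To illustrate what "zero-cost abstraction" means in practice, here's a small sketch in today's Rust syntax (an illustrative generic helper, not Servo code):

    // Generics are monomorphized: the compiler emits a separate, fully
    // specialized copy of this function per concrete type, so there is
    // no runtime dispatch and no boxing, just the code you'd write by hand.
    fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
        let mut max = items[0];
        for &item in items {
            if item > max {
                max = item;
            }
        }
        max
    }

    fn main() {
        println!("{}", largest(&[3i64, 7, 2]));       // an i64 version is compiled
        println!("{}", largest(&[0.5f64, 0.1, 0.9])); // an f64 version is compiled
    }

The trait bound is checked at compile time and then disappears; the abstraction costs nothing at runtime.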


Go doesn't address the concurrency design deficiencies of C++ in any way, while Rust does. So I don't see Go as a next-generation successor, but Rust really looks that way.

I like C++ personally for its power and for keeping performance accessible (especially with the new developments of C++11 and beyond). And looking at the new alternatives, Rust looks more like a successor to me than Go. As others pointed out, Rust doesn't enforce garbage collection, while Go does. For example, if someone wanted to write a demanding game, they wouldn't do it in a language that enforces garbage collection.


I am hoping Servo is going to increase the use of Rust. There aren't many resources for it at the moment, even though it is at version 0.9. I would think that a language nearing "1.0" would have a wider community by now. Here's hoping!


For such a young and not yet completely stable language, I think there are actually quite a lot of active projects, especially pertaining to gamedev. Here are some:

http://rust-ci.org/projects/


I think this has a lot to do with the complete instability of the Rust API, and I think a lot of others (myself included) are waiting for the 1.0 release before actually trying to build something in the language. Having software built in Rust should be great for the community though; I'm excited about it.


> There aren't many resources for it at the moment, even though it is at version 0.9. I would think that a language nearing "1.0"

FWIW the language is only nearing 1.0 in that 0.9 > 0.8. 0.9 is by no means the final pre-1.0 release.


This conversation resurfaces every time a project gets to an x.10 version number, e.g. KDE SC 4.10. Considering that even the HN audience, which is mostly programmers, finds this versioning scheme confusing, I wish we had settled on another identifier to separate minor and major version numbers. Something that didn't have a mathematical meaning. I personally like : and ::


One could use the pretty standard tuple notation: Rust (0, 9), KDE SC (4, 10). It's pretty common to represent version numbers in ways not dissimilar from that internally…
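As a quick Rust sketch of why the tuple view helps (illustrative, not any project's actual code): tuples compare lexicographically, so version ordering comes out right, whereas comparing the dotted strings does not.

    fn main() {
        // Tuples compare element by element, so version ordering just works:
        assert!((0, 9) < (0, 10));   // 0.9 precedes 0.10
        assert!((4, 10) > (4, 9));   // KDE SC 4.10 follows 4.9
        // Comparing the strings instead gets it wrong:
        assert!("0.9" > "0.10");     // '9' > '1' character-wise, so the
                                     // string order is misleading
    }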


Ah, good point!

And this is Mozilla. They are used to putting out releases often. Firefox Nightly is at version 30.0 :)


I would have thought ditching HTML, the box model, CSS and JavaScript to be a more important objective that should come sooner than this. Or another way to put this would be: Wouldn't development efficiency/simplicity be more important than running the current junk in parallel?


A clean break from the past was tried once already: XHTML 2.0. It failed.

We absolutely plan to work on (and are working on!) new standards that enable greater efficiency and developer productivity. I don't like the fact that floats are used for layout any more than anyone else does, not least because it makes layout hard to parallelize. But at the end of the day we're building a browser, not a from-scratch app platform.


It failed for a number of reasons - timing, lack of compelling new features, IE's lack of support for application/xhtml+xml, Gecko's Yellow Screen of Death, etc. I'm not sure that you can blame any one of them. And it really wasn't a clean break in the sense that whyme is talking about.


I'm pretty sure it failed due to not being backwards compatible. Sometimes you can throw stuff away if it isn't getting much use, but backwards compatibility is too important in the real world.

If your browser won't browse the Web, consumers simply won't use it.


XHTML 2.0 was not a worthy successor. Apologies if my hatred for the current technology is coming through, but hey... At least I'm being honest.


He didn't mention XHTML. He seems to be talking about an entirely new layout language, which is nutty, but not as crazy as XHTML, which was "let's make everything way more complicated for the sake of ideological purity, but not add any new features"


"rebuilding the entire world wide web" is one of the few projects that I'd imagine is _more_ difficult than "write a new web browser from scratch."


Well, the sooner we start, the better. ;)


Why?

What's the pain point here, beyond just wanting to reinvent a round-enough wheel?


It's not a case of the wheel needing to be round enough. The wheel is old and has not been able to keep pace. We have many people walking alongside the wagon, repairing the wheel at every turn. Those people are the current browsers. The wagon is the current technology, and the people going along for the ride can see just how painful this is.

Anyone who has attempted to implement trivial things, like thematic scrollbars or cross-browser capable code, or dealt with DOM thrashing via JavaScript, knows the amount of effort going in (or knowledge required) to do trivial things is ridiculous, considering how long we've been travelling down this road. Why are we still using a wagon? And why do we want to build more of them running in parallel?


> Anyone who has attempted to implement trivial things, like thematic scrollbars or cross-browser capable code, or dealt with DOM thrashing via JavaScript, knows the amount of effort going in (or knowledge required) to do trivial things is ridiculous

That's true, but maybe that's for the better?

In the sense that it leaves a place for desktop and native apps to thrive too.

I would worry if the browser were a great execution environment that could even run Photoshop or Valve or Pro Tools, and every company went that route.

It would mean a total loss of data freedom (they would push you to keep your data in the "cloud"), and a rent model for software purchases, with software being able to completely change under your feet (like Gmail did, for example).


What's the superior alternative? (serious question)


In spite of the firmly stated opinion I left elsewhere in this thread, I do indulge in imagining how all this tech stack could be better.

First things first: I found out the other day that someone actually built the major part of my vision, constraint-oriented stylesheets (using Cassowary).

http://gridstylesheets.org/

This is how iOS now handles layout. It makes easy things easy, and hard things achievable. Exactly what you'd want.

Beyond this, I think that HTML has too many tags. To put it another way, the browser should not generally permit <script> and <style> to exist in the same context that <em> can exist in. onblah= attributes were a terrible mistake that we'll never be able to undo. It would be cool if there were something like a <content> tag which, within it, only permits content-oriented markup, while disabling scripting and styling tags. Likewise, forms, scripting, meta tags and linkages should be in isolated contexts.

And finally, what idiot decided that ampersands should have a prominent position in many URLs and then used & for entity encoding in HTML, a completely different purpose? URLs now have to be http://entity&amp;encoded

Ugh. So stupid. What would be wrong with <amp> or <amp/> instead? Or, since we've freed up & for normal expected use, <gt>, <lt> and <quot>?

Ah well.


> ...what idiot decided that ampersands...

    example.com/?a=1&b=2
    vs.
    example.com/?a=1;b=2
The second example doesn't need to be encoded in HTML.

https://stackoverflow.com/questions/3481664/semicolon-as-url...


And that would be great if it didn't inadvertently break certain browsers, proxies, web servers and parameter parsers. Back in the day the advice was that you could use & or ;. TBL had subtly different semantics in mind for them, though: apparently ; was meant to represent coordinates in some spatial system, while & was the normal query parameter separator.

Somewhere along the line, for some reason, everyone settled on &, and the web ossified around the assumption that that was the only character that ever gets used.


The worst part about ampersands in URLs is that not only do you have to escape them, but so few people know that you need to!


How parallelizable would Cassowary be? It seems it adds a lot of inter-dependencies between elements, no?


I don't know the algorithm well enough to say for certain, but I get the impression it involves a clever workaround for the interdependencies. It converts equalities and inequalities into a system of linear equations. It then converts this system into a matrix form it calls a "tableau". The solver appears to work by repeated applications of matrix operations, which may be parallelizable, but I'm not sure. This is different from how physics engines resolve constraints, one after another. In matrix form they just kind of all get optimised "at once".

The paper describing the algorithm is here:

https://docs.google.com/viewer?url=http://www.cs.washington....
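To give a feel for the setup (a toy example with made-up variable names, not taken from the paper): a "center the child in the parent, at least 100 wide, prefer 200 wide" layout might be expressed as

    required:  x_mid = (x_left + x_right) / 2
    required:  x_right - x_left >= 100
    weak:      x_right - x_left  = 200

The solver rewrites each inequality as an equality with a non-negative slack variable (x_right - x_left - s = 100, with s >= 0), turns each non-required constraint into error variables, and then runs a simplex-style optimisation over the resulting tableau to minimise the weighted sum of the errors.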


At a really high level...

Create an entirely different model. One that acts as a hybrid between a full-blown, inefficient co-ordinate system and, plausibly, a more efficient box model.

We could start by modelling 'movement', which equates to objects having: 1. a reference point, 2. a magnitude and 3. a direction.

It would be beneficial to do this in such a way that one could apply functions to said model, achieving a reactive design that gives you a sense of a full-blown co-ordinate system while providing simpler mechanisms for ease of development and efficiency.

I know that's pretty high-level stuff, but in my mind I can see a fairly clear path to achieving a significant improvement.


Cool, that sounds interesting. I certainly think it's worth experimenting with stuff like that in parallel with projects like Servo that optimize current standards; long-term, it's hard to tell which will be better.


You won't displace entrenched technology with an entirely different model, no matter how good. Sorry, this is just a pipe dream.

The real competition for the web is the walled gardens of iOS and Android app development.


If iOS or Android app development succeeds, then you would be proving yourself wrong, as they would be displacing the entrenched technology. It's that, or they are not real competition.

I have no pipe dreams. Displacing the aforementioned technologies with something better is inevitable. I just gave an example of how easily better ideas can be found.


I always wonder what it would be like if the browser were just a thin virtual machine, something like what Java had hoped to become, and then you could specify the components you wanted to run on top of that. If, for example, you are serving an HTML document, you could point the browser to also download the WebKit renderer, or whatever. If you're building something more akin to an application, you might use some other framework to host your application.


That's certainly been discussed many times. It's an interesting concept. The main downside though is that if each site has its own renderer, with no shared concepts like the current web has, then you lose out on things like easily making accessibility tools, web scraping, etc. etc.

The benefits are also clear though, basically not being limited to current standards.


People have been parroting this nonsense for years. You have WebGL etc. in the browser now. Web "pages" aren't pages anymore, and there is no way to enforce semantically meaningful markup. All that's been accomplished by fighting against the future is to ensure that all these low-level services were implemented poorly. You can have all your high-level, common concepts, but they need to be built on a layered system. What we have now is a laughable mess, and it proves that the browser vendors have no idea what they're doing.


Of course, then Google would have you download WebKit as a renderer and NaCl as an execution environment, which would only be available on the latest version of your OS.

And Apple would shove some Obj-C renderer onto your device, which would support only iOS, MacOS and Windows, not Android and certainly not Linux.

FOSS shops would transmit only a repackaged build system, which would then start downloading source packages and compiling them in your browser. Three minutes later you could actually start using the application.

I really fail to see the advantage here.


I don't think that he knows. No one does. But that does not mean that it isn't something that should be discussed.


Well, of course, I don't think any reasonable person would say it shouldn't be discussed.

I was just hoping there was some interesting project already underway I could read about.


Something like XAML?


Man, it's so much better than crap-tml! I can lay out anything flexibly and reliably in like 5 minutes.


Reads like the original Chromium manifesto (comic), plus goals for Blink.

https://chromium.googlecode.com/files/chromecomicJPGS.zip

http://www.chromium.org/blink


The goals are shared, which is not surprising: everyone wants faster, safer, more parallel, etc.

But the means are extremely different between those projects.



I hope when they are rewriting the browser with a modern architecture, they keep in mind the problems which have kept this bug alive for more than 12 years now (while all the other major browsers fixed it one by one), citing 'foundation' or architectural problems:

https://bugzilla.mozilla.org/show_bug.cgi?id=78414


There are currently no plans to support plugins, so that bug is not relevant to Servo.


You mean that will again be an 'afterthought'?


No, we won't support plugins at all.


Nit: I wouldn't say we won't, but there are currently no plans to.


If Servo ever becomes Firefox's engine, this will be a blocker for Danish users. The government-mandated Nem-ID scheme uses a Java applet, and is used everywhere (taxes, banks, the school system...).


A JavaScript version of Nem-ID is planned for release in 2014 [1].

[1]: http://www.nets.eu/dk-da/Produkter/Sikkerhed/NemID-tjenesteu...


Thank you. Then apt-get purge icedtea-7-plugin icedtea-6-plugin is planned for execution in 2014 as well.


I'm not sure why this is being downvoted?


Will you support add ons?


Yes, we're thinking about how best to do it. This is the kind of thing where ideas are quite welcome :)


I hope there will be no concept of "not restartless" add-ons. ... Thanks :)


Can you expand on this?


Not a contributor, but from what I gather this is the strategy:

    PDF plugin   -> PDF.js

    Flash plugin -> Shumway.js

    Java Plugin  -> Nope. Good riddance.

    Others       -> Nope
Jetpack add-ons, on the other hand, are basically JS with access to some of Fx's private API, and can be ported as soon as hooks for that API are made.


Google Talk (/Hangouts) plugin. Silverlight plugin (Netflix, Amazon streaming). There are hundreds (if not thousands) of others. There are some we use to allow exploring 3D CAD models in the browser. HTML5/JS might be the way to go for the future, but currently it seems a lot of plugin replacements (including Shumway) are not up to the mark.


Tell that to Google, who are actively working on breaking all those plugins in Chrome, for everyone, in an attempt to force people to rewrite plugins for the Chrome-specific PPAPI (which can't do many of the things NPAPI can).


Why would I tell that to Google? I use Firefox :).


WebRTC?

As far as I know Silverlight's primary draw is DRM. Good riddance.


> Mozilla on how its Servo engine will throw away the 20th-century baggage that holds back current browsers and harness the power of modern multi-core smartphones and tablets.

Then I hope they don't intend to support any 32-bit platform. Make a clean start on 64-bit platforms. By the time Servo is out, there should be 64-bit ARM processors cheap enough even for their low-end phones, and on a desktop it should be a no-brainer.


It's too soon, surely. There are plenty of recent or current 64-bit-incapable platforms (PCs running 32-bit Windows, every Android phone to date ...) which you'd hope to at least be able to run an up-to-date web browser on for several years to come.


As a matter of fact, Servo was broken on 32-bit for quite some time, although it isn't at the moment.


No, it's definitely still broken on 32-bit platforms. It does not even finish building right now.


Doesn't it build on Android, which is 32-bit?


Maybe he meant x86 as opposed to ARM (32-bit)?


Correct.


From what I remember from the Rust/Servo ML, they intend to pursue 64-bit as the default and 32-bit as a legacy option.


There's no particular reason not to support 32-bit, at least so far.


Would it be that difficult to just queue up those threads on a single 32-bit processor instead of sending them off to separate cores?

It wouldn't be optimal, but at least they wouldn't have to completely write off older or lower-powered devices.


Distributing threads and processes across available cores is the job of the OS, not the application.


In Rust (right now), you can choose between a 1:1 or an N:M mapping between OS threads and Rust tasks. With N:M threading, the runtime necessarily does (some of) that scheduling internally.


Incidentally, Servo uses both: we use 1:1 for the parallel layout worker threads and M:N for the script and layout tasks.
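For readers unfamiliar with the distinction, here's a minimal sketch of the 1:1 side in today's Rust (std::thread spawns one OS thread per call; the 2014-era task API referred to above, including the M:N green-thread runtime, was different and was later removed from the standard library):

    use std::thread;

    fn main() {
        // Each spawn maps to one OS thread (1:1), so the kernel scheduler
        // is free to place the workers on separate cores.
        let handles: Vec<_> = (0..4)
            .map(|id| thread::spawn(move || id * id))
            .collect();

        for handle in handles {
            println!("got {}", handle.join().expect("worker panicked"));
        }
    }

Under an M:N scheme, by contrast, many lightweight tasks are multiplexed onto a small pool of OS threads by a userspace scheduler.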


> Then I hope they don't intend to support any 32-bit platform.

This type of thinking is so negative... why would you ever make not supporting/doing something a project's goal?

The Servo devs might make speed of development or maintainability a goal, and that might lead to the decision not to support 32-bit (it's not clear to me why it would create that much overhead on either front, but then I don't know Rust).

Sorry if that's tangential, but reasoning/explanations of the form "here's the goal: let's not accomplish __" are broken, and I just needed to vent.


> "It usually only had one core, clock speeds were lower and you had much less memory available to you,"

Yeah, right. Modern browsers already eat up several GIGS of memory, while super-advanced-zomg tablets boast of 2 GB. Most have 1 GB, 80% of which is eaten by the system (and widgets, and other persistent stuff). But not caring about memory is the hype. Because it's the future, technology, and, you know, Moore's law.


It doesn't mean that Servo would use more memory. Current browsers are doing things like multi-process to trade off memory for security and getting less safety in return than a memory-safe language would provide.


Another day and another ambitious project by Mozilla. However, when will Mozilla see these projects through and give them the proper marketing so they have a chance to thrive?


It is open source. We don't get to have it both ways. Reporters see a slide deck from FOSDEM [1] and decide to report on it.

[1] http://video.fosdem.org/2014/UD2218A/Saturday/Servo_building...


But what if they don't turn out as good as hoped? Why should they be marketed then? This is true in general. If they had a crystal ball and saw that technology X was going to be popular at time Y, then they could work on it until then and release. However, things change, the market changes; by the time X is 3/4 done, maybe there is no more demand for X, or X just turns out not to work as well as hoped. Going full steam ahead with it just because of sunk costs might not be the best use of resources.

So they try something else.

> them the proper marketing so they have a chance to thrive?

For mediocre products to thrive, marketing alone is not enough. Marketing is a good accelerator for already great products. If they are not great, marketing is working against the flow. Sure, one can show people the cool Windows 8 tablet commercial with people dancing in it 100 more times and maybe they'll get a few more sales, but that alone is not going to be enough to make it a success.


I wonder if Mozilla plans to merge this back into b2g eventually, and how easy that will be. They have several very exciting projects on the go, but some seem to be heading in different directions.


Yep, I started to hack around the b2g and servo build systems to make that happen. Stay tuned.


You're the best.


It likely would be boot2servo!


I can't help but question whether this will actually realize any improvements. What pages will actually be faster? By how much?


We have significant signs that this will improve performance on many pages. It would be too early to quote numbers, but you can run tests yourself if you're interested.

See "Fast and Parallel Webpage Layout" for some early work by Leo Meyerovich at UC Berkeley: http://www.eecs.berkeley.edu/~lmeyerov/projects/pbrowser/pub...


Thanks for the link! For running the tests, are you just saying to do this in Servo? Is there a preferred "getting started" page out there?


So, the preferred "getting started" of just reading the README on the repo seems to be working out quite well, actually. Looking forward to having fun with this.


[deleted]


Servo inherits asm.js support from SpiderMonkey.


I love how they capitalize iframe as iFrame :)


I cannot help but think of Joel Spolsky's article about reasons not to do a complete rewrite of your project -- a lesson he learned after finally shipping the long-delayed rewrite of Netscape Navigator (progenitor of Mozilla):

http://www.joelonsoftware.com/articles/fog0000000069.html


I hate it when this article of Joel's gets cited as an argument not to rewrite anything.

> With that thinking we never get Linux (NeoMinix), tmux (NeoScreen), Clang (NeoGCC), WebKit (NeoGecko), Subversion (NeoCVS), or Vim itself (NeoVi/Elvis/Stevie/etc).

-- haberman (https://news.ycombinator.com/item?id=7288033)

Stuff is rewritten all the time. Servo is not replacing all of Firefox; it's just an experimental project by a skeleton crew to replace Gecko.


Well, on the other hand, if they hadn't done the rewrite, Mozilla wouldn't be here now, and Netscape would have been an insignificant player like Opera at best.

So, it might not have worked out for Netscape the company, but then nothing would have, and at least it worked well for Mozilla the organisation.


I imagine one has to thank Google for the funding as well.


Which is why this is a small project by a few people and the community.

Even IF it fails, it's still a benefit as a data point. The Rust people are reusing a lot of Mozilla's software stack and plan to do auto-conversion of C++ to Rust (where the benefits are greatest).


> plan to do auto-conversion of C++ to Rust

This... is news to me? I'm pretty sure we have no such plans at all.


https://github.com/mozilla/servo/issues/1289

I think it was this issue that talked about porting using a Java -> Rust and C++ -> Rust translator.


Oh right, that's a special case. The Gecko HTML5 parser is already generated by a specially-targeted Java -> C++ translator. We want to use the same code and make it go Java -> Rust, that's all.


Thanks, didn't know that.


His article was talking about abandoning a large product to do a ground-up rewrite without clear improvement goals beyond "make it easier to understand/work on". Servo meets basically none of those criteria.


Counterexample: Damien Katz and how he did a rewrite of the Lotus formula engine[1]. I guess it depends. For 30k lines of code it should be possible for any good engineer. A browser is of course in another ballpark, but if you have a really good core team it should still be manageable. It also depends on whether you're willing to break things (Apple with OS X).

[1] http://damienkatz.net/2005/01/formula-engine-rewrite.html


It is possible to apply this advice too broadly.


Very nostalgic!


Mozilla seems like a post-capitalist non-profit for the benefit of bored programmers everywhere, sponsoring such useful work as "let's rewrite the browser".

It's not yet another HTTP, HTML, CSS... implementation that is needed; it's making these standards sane and fit for 2014.


> Mozilla seems like a post-capitalist non-profit for the benefit of bored programmers everywhere, sponsoring such useful work as "let's rewrite the browser".

Judging by the number of people who complain that their browsers are slow, it absolutely seems like useful work to me.

> It's not yet another HTTP, HTML, CSS... implementation that is needed; it's making these standards sane and fit for 2014.

A quick search turns up dozens of standards that Mozilla is working on: https://www.google.com/search?q=w3+module+mozilla+corporatio...


As far as I know (and people don't seem to talk about it all that much), the core of the browser is borrowed from a browser project called NetSurf (yes, there are others besides WebKit/Blink). So, technically, they're not really reimplementing HTML/CSS/HTTP/what-have-you.


Nope. I mean, the parser's from there, but we're replacing it. We wrote a CSS parser to replace the NetSurf one we used for a bit, and all the layout code is from scratch. There's a Rust library called rust-http providing us with networking.


I see. Thanks!


We jettisoned most of the NetSurf code a while ago. It was purely a stopgap measure. Only the HTML parser is still NetSurf's, and we're planning to rewrite it.


My bad. Sorry.


Do you feel unhappy about Mozilla and what they're doing for some reason? If not, why the rant?



