And so I'm giving up the Mozilla project - Jamie Zawinski (1999) (jwz.org)
77 points by Andrew-Dufresne on Feb 11, 2011 | 37 comments



Lots of things on the Internet can move quickly but browsers are not Internet software the way that Google or Facebook are. Browsers are still desktop software and things don't move as quickly there thanks to slow PC upgrade cycles and the absolute dominance that software like Windows, Office, and Internet Explorer had a decade ago. Turning that massive ship took longer than many imagined but it is happening.

Jamie gave the open source Mozilla project about a year and a half, including the months of pre-source release preparation (and I think that's being generous.) Brendan Eich and Mitchell Baker didn't give up so easily and thirteen-plus years later they're still giving all they've got to make sure that Mozilla continues to be successful in promoting choice, opportunity, and participation on the Web.

Some things are worth fighting for and I believe that the Web is one of those things. I'm proud to work with some of the founding members of mozilla.org and think it's a phenomenal thing that such talented people are willing to commit their professional lives to the Mozilla mission when there were and still are far sexier opportunities available to all of them.


When Firefox 1.0 came out, it was one of the lightest browsers on the market. What happened between then and version 4.0? Even the latest Firefox beta is one of the slowest browsers available. I’m not talking about the JavaScript engine; I’m talking about the user experience of your product.


A pessimistic answer would be that all the people who were working on Mozilla who liked to add bloat like email and IRC and Palm syncing to a web browser jumped over to Firefox. Or that the guys who wrote Firefox 1.0 have moved on. Or that that's what happens when you write your UI in XML, render it with a browser engine, and load half a million extensions and history databases from disk on startup.

A different answer might be that it does so much more that a slowdown is part and parcel of doing all that's necessary in a modern browser.


I'd say that in the 10 or so years since Firefox 1.0, the average computing power and requirements of the Firefox user have shifted enough that the goalposts have moved.

Firefox 1.0's USP at a time when browsers were bloated was its lack of bloat. IMO that's less of an issue now.


I'm sorry but this mentality is everything that is wrong with software development. How is bloat not an issue now? We expect that hardware is getting better and better, but it is ok for software to become worse and worse just because it can sort of get away with it?

In this case Firefox has even managed to outpace Moore's law: it became slower at a faster rate than computers became faster.


"Worse" in this case is subjective. Perhaps firefox today is objectively not as good for me (as a developer) as it was at 1.0.

My argument is that I'm not necessarily the main target demographic any more, Firefox 3.6 is fast enough and has the features that Joe average wants.


Actually, 6 years and about 3 months.


Due to Firefox's slowness, I have fully transitioned to Chrome on all my computers. I run Adblock Plus and often Flashblock.

Phoenix was fast. I liked that. I should go try to compile it and run it on W7 and see what it does on modern websites.

Speed and ability to render correctly are my killer features for browsers.


Exactly. I'm holding out for Firefox 4, and if it's not substantially faster than Firefox 3, I'll be moving to Chrome and I won't be moving back. Firefox 4 is their last chance to get their shit sorted.


What in particular about the UX of the latest beta do you find slow? We should fix it.


Here are a couple of things:

1. Sync slows down the interface of the application a lot. There should be an option to completely disable Sync. Or Sync should have been an add-on installed by default. If I want to sync only the bookmarks, do just that, nothing else. Sync should also be activated only when I save a new bookmark, not every now and then.

2. SQLite slows down the application and complicates things with those default smart folders. You can't even create a smart folder for bookmarks. Mozilla should have written a faster database for bookmarks and history — they are just some lines of text after all.

3. Live Bookmarks are useless now that the RSS icon has been removed; they were of little use before. I think removing the code for Live Bookmarks and treating bookmarks and history the same way would make things faster.

4. Tab Groups also slows down the interface. It is useless since you can manage multiple tab groups by opening new windows.

5. Moving the status bar into the address bar was a bad idea. It complicates the user experience. The status bar should be shown only when something new happens — a download starts, a new page is loading, etc.

I don't know what happens under the hood, but Firefox is heading in the wrong direction.


(1) Sync is now on asynchronous APIs as of the latest beta (or maybe the next one). You can disable Sync in the Preferences window, and you can choose to only sync bookmarks if you'd like.

(2) Places (the bookmarks database) is now on asynchronous APIs, which means that, unless you're using esoteric features of bookmarks, they never block the UI thread. SQLite is also now in WAL mode. Smart folders are still slow as I understand it, and there's a plan to revamp and/or remove them. (A rough sketch of what the WAL switch looks like at the SQLite level follows this list.)

(3) Bookmarks and history have been treated the same way since Firefox 3 (Places). There is talk of revamping and/or removing Live Bookmarks as well.

(4) Tab Groups is now secondary UI (the icon is gone by default). It is never loaded until you open it.

(5) The status bar is no longer in the address bar.
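
For the curious, here is roughly what the WAL switch mentioned in (2) amounts to at the SQLite level. This is a minimal, hypothetical sketch using the public SQLite C API, not Firefox's actual Places initialization code, and "places.sqlite" is only an illustrative filename:

    // Minimal sketch: turning on write-ahead logging for an SQLite
    // database via the C API. Not Mozilla's real code; the filename
    // is just illustrative.
    #include <sqlite3.h>
    #include <cstdio>

    int main() {
        sqlite3* db = nullptr;
        char* err = nullptr;

        if (sqlite3_open("places.sqlite", &db) != SQLITE_OK) {
            std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            sqlite3_close(db);
            return 1;
        }

        // In WAL mode, readers don't block the writer and the writer
        // doesn't block readers, which helps keep a UI thread from
        // stalling on database work.
        if (sqlite3_exec(db, "PRAGMA journal_mode=WAL;", nullptr, nullptr, &err) != SQLITE_OK) {
            std::fprintf(stderr, "pragma failed: %s\n", err);
            sqlite3_free(err);
        }

        sqlite3_close(db);
        return 0;
    }

Link against the system SQLite (e.g. -lsqlite3); the pragma only needs to be issued once per database file, since WAL mode is persistent.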


I interviewed at Netscape shortly after they announced they would release the browser as open source (June '98, ended up going to Inktomi). I remember being very excited about this, and downloading the source when it first became available. I found out that my (decent) desktop machine was not powerful enough to compile it. When I found a machine that could compile it, it took about 8 hours.

No surprise that it remained a Netscape project for so long.


"When I found a machine that could compile it, it took about 8 hours."

Thank you. It never occurred to me that you need a lot more processing power to compile code than you do to run the software you're creating. You just saved me a lot of heartache in a few months once it's time to start compiling. I love HN.

/srsly


This isn't always the case. It is highly dependent on the language, the compiler used, the size of the code-base and many other factors.

Most of the small to medium sized projects I have worked on compile in what seems like an instant.

However, the medium-to-large code-base at my job takes around 8 minutes to compile on our build server and around 12-15 minutes on my dev machine. Even those speeds seem tremendously slow to me. I can't imagine being a developer on a huge C++ project and having to sit through far longer build times. When your compile times aren't near-instant it really changes how you program. The feedback loop of code-compile-run-repeat becomes more like code-double-check-the-code-code-some-more-compile-surf-the-web-run-repeat.


Ideally, you should not be in this situation where you have to compile large amounts of code to go through the code-test-debug cycle.

Even in a large project, there should be subsystems that a developer owns and works on that can be partitioned or isolated from the rest, and execute within a test harness on a dev machine -- a harness that feeds the subsystem inputs and records outputs.

Successful projects either begin with or evolve to a state where they have the architecture, the toolset, the process and the culture to allow devs to enjoy a fast code/debug cycle.
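
As a concrete (and entirely hypothetical) illustration of the harness idea above: the "subsystem" below is just a stand-in function, but the shape is the point -- build only the piece you own, feed it recorded inputs, and write the outputs somewhere they can be diffed against a known-good run.

    // Toy test harness sketch. classify_url() stands in for whatever
    // subsystem the developer owns; in a real project it would be a
    // separately built library, not an inline function.
    #include <fstream>
    #include <iostream>
    #include <string>

    std::string classify_url(const std::string& url) {
        return url.rfind("https://", 0) == 0 ? "secure" : "insecure";
    }

    int main(int argc, char** argv) {
        if (argc != 3) {
            std::cerr << "usage: harness <inputs.txt> <outputs.txt>\n";
            return 1;
        }
        std::ifstream in(argv[1]);
        std::ofstream out(argv[2]);

        std::string line;
        while (std::getline(in, line)) {
            // Feed each recorded input to the subsystem and log the result;
            // the output file can then be compared against a reference run.
            out << line << '\t' << classify_url(line) << '\n';
        }
        return 0;
    }

The whole thing compiles in a second, so the developer's edit/test loop never touches the rest of the tree.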


Besides splitting things into subsystems, tools like the make and nmake utilities can also reduce compile time.

Make figures out automatically which files it needs to update, based on which source files have changed. It also automatically determines the proper order for updating files, in case one non-source file depends on another non-source file. As a result, if you change a few source files and then run Make, it does not need to recompile all of your program. It updates only those non-source files that depend directly or indirectly on the source files that you changed.


Yes, but large projects often need to use recursive makefiles, which are notoriously flaky at detecting what has changed (if anything).


No project needs to use recursive make.

"Recursive make considered harmful": http://miller.emu.id.au/pmiller/books/rmch/


ccache. Come on guys, this is basic tooling.


ccache only helps so much. On a pretty modern fast machine, typical web browser source takes 5-10 minutes to compile for me even with ccache (depending on what I changed).

Of that, 30-90 seconds is just linking (and no, gold doesn't help all that much).


"It never occurred to me that you need a lot more processing power to compile code than you do to run the software you're creating"

It never occurred to you because it isn't true! You could for example write a trivial raytracer that compiles in seconds and takes hours to render a scene.
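
To make that point concrete, here is a toy sketch (not a real raytracer, just a brute-force ray/sphere intersection loop) that compiles in well under a second, yet runs for as long as the resolution and sample-count constants make it run:

    // Compiles instantly; runtime is set entirely by the constants below.
    // Bare ray/sphere intersection test, not a real raytracer.
    #include <cmath>
    #include <cstdio>

    int main() {
        const int width = 4096, height = 4096, samples = 256;
        double brightness = 0.0;

        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                for (int s = 0; s < samples; ++s) {
                    // Ray from the origin through pixel (x, y) on a plane at z = 1.
                    double dx = (x + 0.5) / width - 0.5;
                    double dy = (y + 0.5) / height - 0.5;
                    double dz = 1.0;
                    double len = std::sqrt(dx * dx + dy * dy + dz * dz);
                    dx /= len; dy /= len; dz /= len;

                    // Intersect with a unit sphere centred at (0, 0, 3).
                    double cz = 3.0;
                    double b = -2.0 * dz * cz;
                    double c = cz * cz - 1.0;
                    double disc = b * b - 4.0 * c;
                    if (disc > 0.0)
                        brightness += dz;  // crude shading term
                }
            }
        }
        std::printf("total brightness: %f\n", brightness);
        return 0;
    }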


...Could you explain? I don't understand what you're saying.


Also remember that compiling code can use a lot of memory. I once had a compilation cause the machine to swap. That build took a while.


This is often the case. My last company had software that took 45 mins to compile on high-end machines in 2010.


Excellent documentary on YouTube covering the year-long open-sourcing of Mozilla, right up to the point of jwz leaving Netscape.

http://www.youtube.com/watch?v=u404SLJj7ig

(via codinghorror: http://www.codinghorror.com/blog/2011/01/lived-fast-died-you... )


Code Rush can be downloaded from http://clickmovement.org/content/code-rush-download

Other links for unused footage and annotated version: http://en.wikipedia.org/wiki/Code_Rush#External_links


Thanks for linking that, I'd never seen it before and it was very fun to watch.

Near the end Jamie says (slightly paraphrasing): "This could all turn into television again... a small number of companies controlling what we see or hear."

And now we basically have Facebook, Twitter and Google as the big entry points for finding things on the web. I realise that we're not living in a closed, push-only system... I just thought there was a slight parallel there.


5 short years later, mozilla.org shipped a product that was finally ready for prime time.


And today that product is used by more than 400 million people around the world -- far more users than Netscape ever had.


Specifically, it was Phoenix (aka Firebird, aka Firefox) that made that happen: a version of Mozilla that threw out everything but the browser, including the mail client that jwz considered essential. It made it smaller to download and faster to run, but most importantly it made it faster to write: they could ship improvements to Firefox way faster than they could to Mozilla, simply because there was so much less of it.


Yup. A classic example of a minimum viable product (although Phoenix was far more than minimal). Phoenix was a breath of fresh air back then. Mozilla was as bloated and cumbersome as Netscape ever was, but Phoenix was fast, streamlined, and a joy to use.

The turning point was when the RSA patent expired and Phoenix could fully support SSL out of the box without any wonky add-ons. At that point there was no longer a good reason not to use Phoenix/Fire{bird,fox} and it started gaining huge amounts of momentum.


The RSA patent expired back in 2000.


I think by Mozilla 1.0 it was finally ready, but it was Firefox's marketing that finally led people to adopt it.

BTW, you can partly blame this petition:

http://archive.webstandards.org/ng.txt


From what I recall, the long delay seemed to have much more to do with XPCOM than with the CSS engine.


"you can divide our industry into two kinds of people: those who want to go work for a company to make it successful, and those who want to go work for a successful company"

Awesome insight. Absolutely breathtaking. Explains so much of what happens at larger companies.


That was a good post and I had forgotten about it over the years.

It prompted me to write up some of my own thoughts on the current day Mozilla Community: http://daniele.livejournal.com/80677.html



