
What are they supposed to do if they want to implement features which are not supported by Firefox?



> What are they supposed to do if they want to implement features which are not supported by Firefox?

If the features they want to use are not available in standards-compliant web browsers, they should not be released on a public-facing production website.

I thought that was blindingly obvious, but the Chrome-effect is evidently taking quite a hold.


Degrade gracefully?

BTW, for those playing at home, the Play Store website loads just fine if you claim to be running Firefox OS instead. If there actually are features the Play Store site uses that work in Firefox OS and not in Firefox for Android, I'm sure Mozilla would like to hear about them (probably via Bugzilla).
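
If you want to try that yourself, a rough sketch (the pref is real, but the exact UA string and version numbers below are illustrative and will vary by release): in about:config, set the user-agent override to a Firefox OS-style string, e.g.

  general.useragent.override = "Mozilla/5.0 (Mobile; rv:18.0) Gecko/18.0 Firefox/18.0"

then reload the Play Store page.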


They could at least not intentionally block the browser and let it render to the best of its abilities. The Google Play site is still functional in desktop Firefox anyway, and it's difficult for me to imagine that it's not at least still mostly functional on Mobile Firefox.

Perhaps they should inform the user that they've designed the site for a particular browser in such a case, though. A little animated GIF that says "This page works best in Google Chrome" would do the trick.


Follow standards, maybe?


That approach is what got us XHTML 2.0 and a "lost decade" for web technology. The things Google adopts (VP8, Dart, SPDY) are things it releases and publishes as open standards, and they are therefore significantly more standard and open than, e.g., JavaScript, or even the <img> tag, were when first introduced.


> are things it releases and publishes as open standards

Publishing working code (even as Open Source) and possibly a white paper does not make something a 'standard.' It may be 'open' but not a 'standard.'


And yet it also completely disproves the argument of anyone claiming that Google is trying to make another ActiveX.

The idea that nobody can do anything cool on the web unless all browsers support it seems like a great way to encourage stagnation.


> The idea that nobody can do anything cool on the web unless all browsers support it seems like a great way to encourage stagnation.

But that's the entire point of open web standards. If you don't like using a runtime that is the lowest common denominator across all platforms, then why are you using the web in the first place?

I really don't understand people who claim to support the web and web standards but then moan about vendor X or Y not implementing this or that. That's the single biggest defining feature of open web standards: things don't happen unless everybody agrees. If you don't like the fact that individual vendors have veto power over things, then you don't like open web standards. If you don't like technology that moves slowly and by consensus, then you don't like open web standards. These are the costs of creating a platform that is defined by open standards.


Arguing definitions is a waste of time. How about this: I like published formats that become standards as and when they gain multiple implementations. If you try to standardize first and then implement, you get CSS2 (or, my first example, XHTML2). The web features we use are there because one vendor or another implemented them experimentally (again I refer you to JavaScript, or the <img> tag), and they became standards some time after that. For a non-web example, consider something like Python: at first, the implementation was the spec; as it matured and things like Jython and PyPy became important, the spec took on more of an independent existence.

This is the model that works, and Google is trying to continue it. Best of luck to them.


> How about this: I like published formats that become standards as and when they gain multiple implementations.

That is how the standards process generally works these days.

> The web features we use are there because one vendor or another implemented them, experimentally (again I refer you to javascript, or the <img> tag), and they became standards some time after that.

That was a long time ago, when there were few browsers and the Web was much smaller. Nowadays, whenever a browser ships anything, content immediately starts relying on it, and it becomes frozen for all time. None of your other examples have billions of pieces of content depending on them; it is almost certain that some content will come to rely on the random corner cases of whatever you ship. That is one of the most important reasons the standards process exists: to give multiple vendors a seat at the table in order to create something that makes sense, as opposed to sitting down, writing a pile of code, having content depend on the random bugs and corner cases in your implementation, and forcing all other vendors to reverse engineer your code. (Being open source does not make reverse engineering free, and doesn't even make it that much easier: the HTML5 parsing algorithm was reverse engineered from IE6 without looking at the source code.)

> This is the model that works, and Google is trying to continue it. Best of luck to them.

It's also what got us quirks mode, the content sniffing algorithm, the gratuitous complexity of HTML5 parsing, marquee, blink, and the incredibly underdocumented and hideously complex tables specification. I could go on.

You're portraying CSS2 as a failure, but CSS2 is actually a great example of something that is implementable by virtue of being standardized. CSS2 only looks bad because you can go to the standard and look at the complexity, but automatic table layout (what we had before) is much worse, being defined by a pile of C++ code that few people in the world know, with corner cases and weirdnesses a mile long. To this day, table layout is essentially implemented by reverse engineering Gecko. As someone who has implemented both features, I much prefer the former.


There's a significant difference between implementing something in your browser, publishing the source, putting out demos, etc., and making it a critical part of a primary business web application to the extent that you explicitly wall off browsers that don't support it.

You want to add new functionality to the web? Great: implement it, make some demos, show us why it's awesome and something we should all implement too. Advocate and demonstrate all you want. But don't make your applications break for anyone whose browser doesn't support your new fanciness.
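
For what it's worth, the usual way to ship the new thing without walling anyone off is plain feature detection with a fallback, rather than user-agent sniffing. A rough sketch, with WebGL standing in for whatever new capability is at issue (the feature and the fallback here are purely illustrative):

  // Feature-detect instead of sniffing the user agent: use the new
  // capability where it exists, fall back to something universal otherwise.
  var canvas = document.createElement('canvas');
  document.body.appendChild(canvas);
  var gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  if (gl) {
    // Enhanced path: browsers that support the feature get the fancy version.
    gl.clearColor(0, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
  } else {
    // Fallback path: everyone else still gets a working page.
    var note = document.createElement('p');
    note.textContent = '3D preview unavailable in this browser; showing screenshots instead.';
    document.body.replaceChild(note, canvas);
  }

Same page for everyone; the extra fanciness only shows up where it's supported.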


Lack of competition for Internet Explorer got us the lost decade – Microsoft was perfectly happy with the web not being competitive with unportable desktop apps.


Was that the "lost decade" when we got LinkedIn, Skype, Second Life, MySpace, Flickr, Facebook, Gmail, Google Maps, YouTube, Pandora, Twitter and many of the other things on which we now depend?


Read the comment I was replying to and note that it says “web technology” rather than “web sites”. While people built some great things, they did so with significant limitations and had to use features which were not standardized. XMLHttpRequest is a great example – it was used by most of the sites you mentioned but wasn't even submitted as a standard until a full 7 years after it first shipped.


See comment from jordanlev above: "Also hints at the idea that the stability of IE6 for a while actually created a good environment for innovation in the web app space to take place"

He came along and made the point I was making.


1. There is no comment from jordanlev in this story. You're thinking of https://news.ycombinator.com/item?id=9034177 in the Memoirs from the Browser Wars thread.

2. Note that immediately before your quote, he made the same point I made: “Microsoft intentionally let IE6 development come to a halt because it was no longer strategically beneficial to them”.

3. In addition to reversing your earlier position on the first point, you never stated anything like his second point – just a tangent from the topic in question. You could have fleshed it out into something similar but never did.

4. There's potentially an interesting discussion about the benefits of API stability, but that's not conclusively proven – there are many confounds – and there's a separate question of actually specifying behaviours and fixing bugs in the various in-the-wild versions. As anyone who was working on the web in that era remembers, even IE6 wasn't reliably a single target, since key features depended on the combination of Windows patches installed on the client. There would have been zero downside had Microsoft more aggressively promoted updates so IE would consistently support HTTP compression, SSL, caching, etc. rather than marking them as minor updates.



