Google discontinues support for IE7 in Google Apps (google.com)
241 points by Yrlec on June 20, 2011 | 142 comments



I'm quite grateful to Google for this. As a web developer, I'd always hoped that some big industry player would start forcing people to use modern browsers, and now it's happening.

As small players, we're trapped in a Nash equilibrium when it comes to dropping support for older browsers. Everyone would benefit if all web developers could agree on requiring modern browsers. We'd all be saved the pain of supporting old browsers, and users would upgrade because every site forces them to. But as an individual web developer, I can't very well just make that call and hope others will follow suit. Because until everyone else does the same, I'm stuck telling my clients they're giving up visitors with older browsers "for the greater good." Not workable. So my current best strategy is to support older browsers, and the same is true for every other developer. Yet as a whole industry, we'd be way better off dropping old browsers.

But a huge player like Google can afford to do it unilaterally. And when they do, they create an opportunity for countless small developers like me to do the same.

I'm not going to jump on the bandwagon just yet. I plan to embrace this cautiously. We still don't know when it will be safe for freelancers and small companies to drop IE 7 and other dinosaurs. But I predict that day will come much sooner thanks to Google.


I disagree; as a web developer, big or small, it is your job to set the standards for what the web should be.

Small freelancers make up a greater percentage of developers than bigger firms do. At the very least you should make it known that supporting old browsers is not a standard service and will incur extra costs to the client. There is plenty of literature to point your clients to in order to back up your point. If we all do this one by one, it will have a greater impact than this move by Google.


I think they should go further. A more Microsoft-esque thing to do would be to require Chrome Frame for anyone using IE.


No, I think they're doing the right thing here. A firm hand on the backs of people who would otherwise never upgrade, not a punch in the face. Frankly, I'd be willing to bet there are plenty of people in Redmond that secretly approve this move. Microsoft has to play the role of stabilizer for their enterprise customers (so therefore updates are optional), but I'm certain that they wish their home users could play with the "new fast native Internet."


> Frankly, I'd be willing to bet there are plenty of people in Redmond that secretly approve this move.

It is no secret. MS themselves have gone as far as putting up "anti"-IE6 sites and not supporting IE6 in their own online office apps. IE7 can't be far behind. IE prior to version 9 is an embarrassment to them compared to IE9 and versions of other browsers from three (or more) years ago and they'd like to sweep them under the carpet and get users running IE9+ ASAP.

MS will draw the line at IE8, where Google (and anyone following a similar pattern) may not: Google may drop IE8 support soon after IE10 is released if it sticks with a "current and previous only" model. MS won't want to lock out XP+IE8 users (at least until April 2014, when XP with SP3 completely falls out of extended support anyway), because that might further encourage a shift to Firefox or Chrome/Chromium, as an upgrade to IE9 is not possible on XP.


So Google should aspire to be more like Microsoft? :/


Agree! Many thanks Google! It is about time this happened. Let Microsoft worry about Skype while the rest of us try to make the WWW more user and developer friendly.


Let's put this into perspective: we are talking about Google Apps, not Search! Its market share is very small, and a company even considering working with GApps will not do so with IE7 anyway.

Wake me up when Google is dropping GMail support for IE7. Now that would be a story.


This includes Gmail.


That's misleading. Google Apps has Google in it, but (unless I missed something) Gmail as a separate product is not what is being talked about here.


According to this help doc (which points to the original blog post announcement), they're moving Gmail to this same support system:

https://mail.google.com/support/bin/answer.py?hl=en&answ...


As a web user, I'm not grateful. I don't think having more pixel-perfect control or slightly faster JavaScript makes the web any better. All websites look slightly better, but the client side dies a little bit every time. In the web circa '96, anyone could write a web spider or web browser, and everyone did. There was tremendous innovation both client-side and server-side. Client-side innovations included things like search engines and Google. With the hyper-AJAXed world, client-side innovation becomes impossible. I don't think this is a net win for the world.


I echo the comments others have made about the wide range of improvements offered by modern browsers. But even putting that aside, the rendering capabilities you refer to make a bigger difference to web users than you might think.

You say "I don't think having more pixel-perfect control or slightly faster JavaScript makes the web any better." But--no disrespect--you probably only feel that way because you're already benefitting from the huge amounts of effort invested in supporting multiple, incompatible browsers. The web looks OK for you now because, without you noticing it, web developers have slaved to make it look OK for your unique combination of OS and browser.

It may seem like that's just a cost we web developers have to bear, with little effect on you. But that's not so. The fact that we have to spend time supporting old browsers increases the cost of everything that's created on the web. And when innovation is more expensive, it happens more slowly. The costs imposed on our industry by older browsers do affect ordinary web users, because those costs translate into a slower pace of innovation.


> I don't think having more pixel-perfect control or slightly faster JavaScript makes the web any better.

If that's all it was, then you'd be right. You can't do geolocation, local storage, audio/video, canvas/svg/webgl, etc. etc. on old browsers. It's not about making things pretty. It's about creating software that can compete with desktop alternatives.


Unspoken assumption: that competing with desktop alternatives is good or desired.

Personally, I can't think of a single desktop application that I use that is better done in a browser with the possible exception of Google Maps, and I say "possible" because I haven't seen a desktop contender.

I use GMail's web interface solely because I can't stand any of the Mac clients (at work) and Outlook isn't available on Linux (which a couple of my PCs at home run). It is cross-platform, which is a plus; it is painfully ugly and slower than a desktop alternative, which is an overwhelming minus.

I greatly prefer my web browser to be for browsing the web. While I'm all for standards in HTML/CSS/JS, I find the "web browser as your OS!" crap to be disheartening. I like things that work, and for the most part, web applications don't.


I share your views. Making a web app is easier for the developer: just one platform to support, no restrictions on implementation, etc. But a desktop app is nicer for the users (the ones we're supposed to care about, right?). In the very best case a web app would be able to function as well as a desktop app. The best case is usually not achieved.

Imagine if a person at your job said they had just made a new client app for your business users' workflow. When you ask how it's implemented, the person says that the GUI itself is just a skin that reads everything from the database. The GUI layout, how the buttons behave, all of it is stored in the database, and the actual GUI is nothing more than a kind of platform for what's in the database. I've actually seen this done, and the team who did it were sacked, their application deprecated. As far as I know it's still running because the team in charge of replacing it still doesn't completely understand it. But this is what web apps are. Model, view, presenter, they're all stored in the same place.

Personally, I prefer having a back end RESTful server with native... shall we say "fit" clients (not fat but not thin either) using it. The browser gets a simplified, default version of the app which has a link to the appropriate native client somewhere visible.


Time = $$$. Universal constant. While your desktop app may be shinier, it takes much more time to create and maintain. This means fewer apps are published and fewer features are added.

Fewer choices and fewer features translate to a negative for the user.

On top of that, native desktop apps are compiled ... the web is open, with html/css/javascript/etc. and this creates an open environment that encourages free software.

Free is always good for the user.

Anyway, re-arguing this sort of thing is pointless. The desktop is in a death spiral. All this stuff has already been set in stone. It's a question of when, not if.


> With the hyper-AJAXed world, client-side innovation becomes impossible.

What?


15 years ago, I could reasonably write a search engine. Myself. 1 person. In a few weeks (modulo bandwidth and server farm). I write a program that grabs a web page, and reads out keywords. Today, if I grab a web page, quite often, that web page has nothing except for JavaScript code. That code grabs the actual content from the server, lays it out, and animates it. To write a web search engine, I need to write a complete JavaScript library.

At the time, we were talking about developing all sorts of agents. Things that would shop for you. Things that would find parts for you. Things that would remember what web sites you visited, and let you search them. Things that would track where in a long set of pages you were (blog, comic, etc.), and let you keep reading from there. It happened for a while, and then it died when the web became too damn hard. Writing anything that can reasonably see and parse web pages now takes many, many person-years. There are only four or five organizations with that kind of resources (WebKit, Mozilla, Opera, IE, and internally, Google). There are countless things we just didn't even imagine.

It's like the DMCA. You notice all the innovations that happen, but you miss all the innovations it made impossible.


>15 years ago, I could reasonably write a search engine.

No, 15 years ago you could reasonably write a search engine for 15 years ago. It would suck by today's standards.

You want to handle Javascript? Easy! There are plenty of tools to choose from now. Run a browser as your crawler, visit the sites, and read the generated source instead of the static source. Shove that into your 15-years-ago search engine, and there's no difference.
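A minimal sketch of that approach, using Selenium as one real tool among several that can drive a browser as a crawler; the URL is a placeholder, and politeness concerns (robots.txt, rate limiting, error handling) are omitted:

    # Drive a real browser, let the page's JavaScript run, then read
    # the generated source instead of the static source.
    from selenium import webdriver

    driver = webdriver.Firefox()
    try:
        driver.get("http://example.com/some-ajax-page")  # placeholder URL
        rendered_html = driver.page_source  # the DOM *after* scripts ran
        # Feed rendered_html to the same keyword extractor a
        # 15-years-ago crawler would have used on static HTML.
        print(rendered_html[:500])
    finally:
        driver.quit()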

>Things that would track where in a long set of pages you were

You mean bookmarks? Add a scroll %, assuming they're not nice enough to use anchor tags / IDs meaningfully, and you're golden.

>Writing anything that can reasonably see and parse web pages...

has become a community effort, instead of a bunch of isolated silos where people reinvented the wheel out of necessity.

The resources required aren't so large just because the web is so much more complex; they're large because it moves so much faster, and you won't survive if you can't compete. How long did we languish with crappy Javascript engines? How much would you need to know to actively compete in that section alone now? It's easy to make a slow-but-functional browser, and if you looked around you'd see some people doing just that. Making a fast-and-resilient one is as hard as making a fast-and-resilient anything, especially where human input (i.e., HTML) is expected to be consumed.


> You mean bookmarks? Add a scroll %, assuming they're not nice enough to use anchor tags / IDs meaningfully, and you're golden.

Bookmarks in books work okay. You move them. Bookmarks in browsers don't. You have to remove the old one, add the new one, and the overall process is too cumbersome to be useful for the application I mentioned.


We actually built a site to solve that problem. If you have a series of pages (blog, comic, book, etc) and want to mark your place in them with a bookmark that moves as you read, try Serialist (https://serialist.net/).


There's an "edit" option as well.

As to the auto-updating bookmarks, would it resolve the issue if I made an extension to do that for you? I can see the use, honestly, and I like it. (seriously, I'm offering, and I'd probably use it myself. It'd be an interesting project. Even if it doesn't resolve the issue - we might just fundamentally disagree here, I'm OK with that.)

But why should that be part of the browser, when modern browsers allow you to do damn near anything by simply leveraging it? Why should we rely on browser makers to tell us what's possible, when we can do it ourselves, because of the changes in the past 15 years?


I'd love to see that extension. If you write it, I will use it. I use Chrome too, so it should work here.

As to what should and shouldn't be part of the browser -- the way to figure that out is experimentation and competition. When you make technologies and standards simple and easy, people will make independent implementations and try things. The vast majority will be dumb, but some (often unanticipated ones) will turn out to be useful, clever, or brilliant. That's how the technology improves.

When you make standards big and cumbersome, progress stops.


If you want to move a bookmark to a different place on a blog / content site, it is probably because you want to read new entries. RSS does this fairly well.

If you want to read through a site's archives, what I do is keep it open in a tab. It is restored when I reopen my browser, saved if I reboot, etc. It's not as handy as a bookmark, but it comes close.


With all the headless Webkit tools coming out nowadays (and all the free and fast JS engines like V8), writing a spider that runs a JS engine and clicks on all kinds of non-<a> elements is not beyond the reach of somebody innovative and motivated enough to create new kinds of spidering robots.

You won't need to write a complete JavaScript library. Look at all the testing suites that automate browser instances, Selenium being the most well-known.
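For what it's worth, here is a hedged sketch of that kind of automation with Selenium; the selector and URL are illustrative only, and a real crawler would need timeouts, duplicate detection, and so on:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    START = "http://example.com"  # placeholder URL
    driver = webdriver.Firefox()
    driver.get(START)

    # Click non-<a> elements that might navigate. Re-query on every
    # iteration: after navigating away, old element handles go stale.
    # (Assumes the page is stable across reloads.)
    n = len(driver.find_elements(By.CSS_SELECTOR, "[onclick]"))
    for i in range(n):
        el = driver.find_elements(By.CSS_SELECTOR, "[onclick]")[i]
        el.click()
        print(driver.current_url)  # record where the click led
        driver.get(START)          # return and continue crawling

    driver.quit()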


15 years ago the thing we call a "web application" hardly existed. If a web page "has nothing except JavaScript" (e.g. Gmail), it is probably a web app, and indexing it makes little sense anyway. If someone misuses JS on a content site, that's another story. And your comment about innovation makes no sense at all. The capabilities of modern browsers (Canvas, geolocation, local storage, offline apps, etc.) offer more opportunities for innovation than the "old web" could even imagine.


I am aware of the current opportunities.

I think you (and most people here) underestimate what the "old web" could imagine, though. We had all sorts of ideas for agents that would go out and grab and analyze data for us in all sorts of clever and interesting ways. Search engines got built, as did one or two other things, and then the web just got too complex.

Hell, even I had a simple app that went out and grabbed all my favorite comics and showed them to me, nicely formatted, and without ads.


You mean ad-filtered RSS/Atom? I assume such a program would be much faster to write these days: have a set of newsfeeds, map() them with a filter function and merge the results.

While the web gets more complex, the tools at hand get better. Much better.
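Roughly, and assuming the feedparser library plus made-up feed URLs, the whole thing could be a sketch like:

    import time
    import feedparser  # a real library; the feed URLs are invented

    FEEDS = ["http://example.com/comic-a/rss",
             "http://example.com/comic-b/atom"]

    def not_an_ad(entry):
        # Stand-in filter; a real one would inspect the entry content.
        return "sponsored" not in entry.get("title", "").lower()

    # map() each feed to its entries, filter out ads, merge newest-first.
    entries = [e for url in FEEDS
                 for e in feedparser.parse(url).entries
                 if not_an_ad(e)]
    entries.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0),
                 reverse=True)

    for e in entries:
        print(e.title, "->", e.link)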


If web page "has nothing except JavaScript" (e.g. GMail) is probably is web app and indexing it makes little sense anyway.

What about JS frameworks like JavascriptMVC or Sammy [1]? Google even created a spec [2] for crawling such sites.

[1] http://sammyjs.org/ [2] http://code.google.com/web/ajaxcrawling/docs/getting-started...
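For the curious, the core of that spec [2] is a URL rewrite: the crawler turns a "#!" fragment into an "_escaped_fragment_" query parameter and asks the server for a static HTML snapshot. A rough sketch of the mapping:

    from urllib.parse import quote

    def escaped_fragment_url(url):
        # http://example.com/page#!state=2
        #   -> http://example.com/page?_escaped_fragment_=state%3D2
        if "#!" not in url:
            return url
        base, fragment = url.split("#!", 1)
        sep = "&" if "?" in base else "?"
        return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

    print(escaped_fragment_url("http://example.com/page#!state=2"))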


Gmail's HTML view works fine in Links. That team has been showing competence and diligence that's increasingly rare, and I wish people wouldn't tar them with the same brush as the clowns who write js-only crap.


> At the time, we were talking about developing all sorts of agents. Things that would shop for you. Things that would find parts for you. Things that would remember what web sites you visited, and let you search them. Things that would track where in a long set of pages you were (blog, comic, etc.), and let you keep reading from there.

The drive toward semantic markup in HTML5 is supposed to help the web get back to those original ideals. Over time, we'll increasingly expect web developers to conform to a subset of possible HTML arrangements, much like book publishers conform to a subset of the possible random arrangements and orientations of letters on a page (odd poetry excepted).


Most people would gladly make it harder for a single person to write a search engine if, in return, it makes it easier for them to make good web pages and web apps.


Your premise that the web is somehow less effective because you can't scrape data from pages easily doesn't make much sense to me.

Have you taken a look recently at the plethora of web APIs for just about every purpose? The modern way of collecting machine-friendly data from a server is through APIs and semantic content (RDFa, microformats, etc.).

Not through HTML / CSS / Javascript formatted pages which are made primarily for human consumption.


Which only works for pre-intended uses, or standards that have substantial market share.

Again, read the literature on agents from the nineties, and see how diverse a technology tree was killed....


I would blame poor/lazy devs inappropriately using JS rather than the evolution of the browser for this. For the average web page, it's unnecessary 90% of the time to require JavaScript for any core functionality (not so much with web applications). I have a hard time understanding why people do this, as it's often much easier to test and develop when you're layering on JS unobtrusively.


Agreed that it's nearly impossible to generally parse web pages now, though if you're screen scraping it's still pretty easy (if not easier than before) to pull out data. Before you had to parse the DOM; now you can often get structured data via JSON APIs. It's more brittle, though.
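To illustrate that point (the endpoint and field names here are invented; in practice you find the real ones in your browser's network inspector):

    import json
    from urllib.request import urlopen

    # Hit the JSON endpoint a page's JavaScript would call, directly.
    with urlopen("http://example.com/api/comments?page=1") as resp:
        data = json.load(resp)

    # Structured data, no DOM parsing needed; but the schema can change
    # under you at any time, which is the brittleness mentioned above.
    for comment in data.get("comments", []):
        print(comment.get("author"), ":", comment.get("text"))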


You're essentially saying we should halt all progress in web design because it will make some programming things harder for you as an individual.

That's not what the web is and that is not the web environment that I want.


Did you post this using the lynx browser?


I think he's saying that it makes scraping harder.

But today JS frameworks like jQuery give us the means to do anything we want javascript-related, in any browser that half-supports javascript. By deprecating IE7 they're just saying they're going to drop all of the extra hacks they had to use to keep IE7 working.

A lot of what newer browsers give us is just better rendering. You can replace a mess of tables and nested divs with things like border-radius, which means less client-side html to wade through.


Not just scraping. Any sort of non-human parsing of web pages. Look at the 1990s literature on web agents for lots of applications.


Are you kidding? The semantic web and using more metadata is making it easier than ever. Nowadays in many cases you not only have the content, it is also tagged with microformats or RDFa.

Try looking at Freebase or DBpedia and tell me where you had such a huge amount of easily parsable, semantic content in the '90s.
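As a small illustration of what microformats buy you, here's a sketch that pulls hRecipe fields out of marked-up HTML with BeautifulSoup (the HTML snippet is invented for the example):

    from bs4 import BeautifulSoup

    html = """
    <div class="hrecipe">
      <h1 class="fn">Pancakes</h1>
      <span class="ingredient">2 eggs</span>
      <span class="ingredient">1 cup flour</span>
    </div>
    """

    soup = BeautifulSoup(html, "html.parser")
    recipe = soup.find(class_="hrecipe")
    # The class names are the machine-readable part: no guessing at layout.
    print("Name:", recipe.find(class_="fn").get_text(strip=True))
    for ing in recipe.find_all(class_="ingredient"):
        print("Ingredient:", ing.get_text(strip=True))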


More and more content is being taken entirely off the open web and siloed behind a server that talks an unstable proprietary protocol, with exactly one blob of javascript in existence that knows how to tunnel requests over HTTP to access shreds of that content and cram them into an utterly non-semantic DOM. We are hurtling backwards into the client-server hell the web had saved us from.


Yeah, I don't see that. I see more and more accessible APIs [1], and pages have more and more incentive to be semantic now that search engines read that data (hRecipe, for example).

Service architectures have also been moving from stuff like SOAP to REST, which is definitely more open and accessible.

And even Ajax-laden webpages are still just a Firebug Network tab away, since they all run over HTTP, and then you have a nicely structured data format instead of having to deal with messy HTML pages.

[1] http://www.programmableweb.com/apis


A JSON (or SOAP) backend is only usable by third parties if its API is kept stable. There are far too many devs who redesign their backend request and response formats at the drop of a hat because they think their js client is the only one that matters (a self-fulfilling prophecy) and they can replace it simultaneously. And their responses tend to look like "here's some more markup to stuff into an arbitrary location in the DOM we're using today", not semantically structured (e.g., Rails now has this built into JavaScriptGenerator). A given site can be reverse-engineered, but anything built on that is going to be fragile and short-lived, much more so than when the typical visual rendering desired for a page determined its structure.


I don't see how that's worse than the unstable, non-semantically structured HTML markup of yore. In the worst cases we're no worse off, and we have much more semantic content nowadays.


Well yeah, the web as a collection of interlinked information is dead. It's now television 2.0 with most innovation going to marketing.


With respect, I think this is the same innovation that occurs in every industry and it's silly to bemoan the increasing complexity of the web.

In transportation:

It used to be that everyone could buy a horse and build a buggy and get around. Then cars came along and it got a lot more complicated and expensive to build a vehicle that was state-of-the-art, but tinkerers could still do it.

Now there are only a few big players who are capable of innovating and building the best and newest vehicles.

Now that I think about it though, there is still space for tinkerers and inventors in the automobile space. But you can't expect those automobiles to compete with those made by, e.g., Toyota.

In the same way, it's still possible to write a spider without a javascript renderer. It just won't be able to compete with Google.

One last point: the state of the web is based on the collective decisions of all internet users. Ultimately, people building things on the web decided more often than not that ajax-ifying things benefited their users.

If users had wanted a web client that would spider the web and shop for them, they would have latched onto it during the time of great innovation that you think is now gone. But they didn't. The things that users wanted are the things we see today, assuming that there isn't some horrible inefficiency in the feedback loop between web-builders and their users.


> I don't think having more pixel-perfect control or slightly faster JavaScript makes the web any better.

I think the point of deprecating IE7 as legacy is more about its hideous bugs in parsing, internal document representation, and rendering.

Those are the things that make proper code impossible to display properly without the tedious work of understanding the dysfunctions and circumventing them. To me, the old IE rendering engines have slowed web innovation by at least several years all by themselves.


In what sense is Google a "client-side innovation"?


The Google Spider grabs pages from other servers. That's a web client. It's a web client that was easy to write when Google was started, but is almost impossible to write today. If search engines hadn't been invented 20 years ago, they'd be impossible to invent today. The only reason they still work is tremendous work on Google's end to have its spider be able to spider complex AJAXy pages, and that content creators engage in SEO and develop to Google.


It is not harder or easier. It is just different than it used to be. Things that used to be hard are easy now. Problems that didn't exist 10 years ago exist today. I develop a spider.

For the most part, the bulk of the web's content is as easily accessible as it was years ago. You make a request and you get a blob of HTML back. If you have special requirements and need to get into all the nooks and crannies you create a DOM implementation and embed a JavaScript engine. Then you parse the page into a DOM and start firing off events. There are quality open source JavaScript engines available. JavaScript and AJAX are a breeze.

Flash is a different story. If you have any requirement to follow links or process content in a Flash movie (you'd be surprised how many sites still have Flash nav) you pretty much have to write your own runtime. Unless you are big enough to have Adobe do it for you.

Depending on what you are doing with the data that your spider collects, chances are writing a spider is far easier than writing a browser. There are at least 4 widely used browser engines and plenty more toy browsers floating around.

I can guarantee that writing a spider that can deal with AJAX is not the biggest challenge of developing a search engine. Scaling it, fighting SPAM, understanding the content, indexing and then being able to provide quick lookups are much, much harder.


I recently looked through the Google developer guidelines and they still recommend not changing the page contents significantly using Javascript. Also, the #! in modern AJAX apps is there to avoid having to run the Javascript in order to crawl the content. Do Google actually do that much with the Javascript on a page even today?


I don't really quite get what point you are making here. Are you saying that the innovation today owes more to the innovation of the mid-nineties? Are you saying that any innovation now with respect to AJAX and JS does not make your life better in any way?

Seemingly your argument could be made for cars: "15 years ago cars were easy to fix and understand. Now they are not, so wake me up when they are like the cars of the mid-nineties."

JS/Ajax help programmers tremendously. Helps speed. Helps functionality.

To view these things through the lens of "I can't write a crawler for them" is a pretty limited view of what today's technology offers.


Indeed I see now what you mean.


I'm glad. Thank you for taking the time to read and understand. Hacker News is starting to go down the decline that hit reddit 2 years ago, where people don't bother to try to understand different viewpoints, and just downvote anything they don't agree with. It's nice to see good people still on here...


The Les Paul guitar worked in IE6. If they can get that to work, everything else should be easy.


> With the hyper-AJAXed world, client-side innovation becomes impossible.

Are you serious? Have you heard of the canvas element? It allows modern browsers to do things that two years ago were only possible in Flash. Have you noticed how there are actual web applications these days, not just collections of linked pages? Have you noticed how, with ubiquitous JavaScript, the usability and ease of websites has improved greatly?


No. I haven't. I've noticed maybe 1 or 2 web apps I want to use (Google Docs and Google Maps). Beyond that, I don't see anything that couldn't be delivered more effectively without JavaScript that I want or need.

Usability is not up. Each web site has its own, custom, non-standard user interface. I could teach my mom to use the web circa '96. I cannot teach her to use it today. It's too damn complex.

Usability would be up if the browser knew more about what to expect. You can look at things like Readability. The browser ought to know more about the content, and be able to present it in a coherent, usable way. The server-side shouldn't dictate presentation.


A big part of usability is not sitting around waiting for entire pages to reload every time you interact with them. AJAX has done great things for users by minimizing this delay. You wouldn't like Google Maps as much if you had to click an arrow and wait for a page refresh for the map to move, like MapQuest circa 2003.

Yes, the proliferation of web apps has created a diversity of user interface paradigms. Some would say this is a good thing, however, since the web has spurred all kinds of new UI philosophies, and the fact that JavaScript and HTML aren't compiled allows people to examine and re-work others' code, so good ideas spread very quickly. I for one don't intend to wait for the HTML5 group to invent every new <input type=""> that I could conceivably need, and then wait some more for browser vendors to implement them all consistently. With JavaScript, you can currently build and deploy just about any kind of 2D client-side interaction imaginable.

In short, the vast majority of users on the internet probably have a different idea of usability than yours, and the numbers tell the rest of that story. You only need to look at the gross casserole of UI paradigms within the applications installed on your mom's PC to see how much users really care about UI standardization.


Thank you.


Did you read what I wrote? I mentioned Google Maps as one of the two places I found AJAX useful.

The applications on my mom's PC do have much better UI standardization than the web does. Microsoft releases UI guidelines. alt-f4 does the same thing in every application I've used, and the menu structure is roughly the same too. Apple is even better.


> Did you read what I wrote? I mentioned Google Maps as one of the two places I found AJAX useful.

Yes, and I was dissecting why you may have found it useful, because the same principle applies to hundreds of other situations that you may not have recognized.

> The applications on my mom's PC do have much better UI standardization than the web does. Microsoft releases UI guidelines. alt-f4 does the same thing in every application I've used

Questionable. About the only key shortcuts you can rely on are the ones that will work in your browser too. Alt-F4 will close your browser; that's what you wanted, right? Cut/copy/paste, print, etc. all work there as well...

> the menu structure is roughly the same too

Ha, you mean the invisible menus on Explorer and IE>8, the mega "office button" menu in Office 2007, the delightfully inconsistent menu bars in WMP>9...


     Microsoft releases UI guidelines
Microsoft and UI guidelines in the same sentence? Something doesn't compile. I would be happy if they used their own guidelines, though.

     the menu structure is roughly the same too
Too bad it is getting reinvented; it happened in Office 2010, and it will happen again as people get tired of File -> Save, and yet again when touch screens on laptops become the norm.

So I'm sorry for your mom, but unless she never upgrades, then she's going to have to learn new things.


I do see a side of what you're saying. In those older days the web was a much simpler platform, so figuring out what to do and what to click was easy. This was true in Windows too: MFC was the library of choice for UIs, meaning a lot of the software was easy to figure out as well.

Today a lot more software and websites have broken the mold and come up with some really different(not siding better or worse because both exist out there) UX patterns. There aren't standards for web UI anymore that are practiced across the board.

I do disagree with your statement that the client should dictate how a site is presented, not the server. The browser should display the content in a standards-compliant way. The days of buttons looking like Windows buttons in IE and Mac buttons in Safari should be a thing of the past, never to return.


Yes, dropping support for IE 7 is fine. But the recently weird-ified FF release schedule is going to make this new Google Apps policy have strange results.

FF 4 was released 3 months ago. It's just now looking confidence-inspiring enough that I'm considering upgrading to it on my Mac this week. But now Google is going to drop support for it when FF 6 is released in a couple of months???

Another issue: if I'm not mistaken, the fact that Ubuntu releases stick with a particular browser version means that the LTS releases (and 10.04 in particular) will stay with a version of FF long after Google Apps has stopped supporting it.

Both of these strike me as serious problems. Perhaps the Google Apps people have not really thought through the ramifications of this policy?


I think we can be confident that Google will "Do the right thing"; they aren't blindly following a "drop support for version X when the current version is X+2" rule.

I'm reasonably certain that by "Two Major Versions" they are referring to "Versions that alter how the platform interacts with the Web." So as long as FF 5 and FF 6 behave reasonably similarly, they'll be treated as the same major version, and Google will support FF4 and FF5/6.

Or it may be the case, after some review, that FF4/FF5/FF6 will all be treated as the same major version, and official support in August 2011 will be for FF3.6 and FF4/5/6. Time will tell.


> I think we can be confident that Google will "Do the right thing"

Well, hopefully. But they've officially announced the dropping of support, so I have to wonder.

> I'm reasonably certain that by "Two Major Versions" they are referring to "Versions that alter how the platform interacts with the Web."

That might be. But that makes me wonder about FF. Apparently the changes from 3.5 to 3.6 are more significant than those from 4 to 6. What's the deal with that?


I seem to recall reading that Ubuntu changed its browser policy on LTS releases to upgrade to the next stable version of Firefox when Mozilla stops releasing security updates for the version used by the LTS release.

This is the best reference I could find with a 30-second search (non-authoritative): http://ubuntuforums.org/showthread.php?t=1551527


Yes, it looks like they might be doing something sensible there. Thanks for the link.


Dropping support ≠ stops working. Dropping support means that if you go to them with a support query their support staff are legitimately allowed to say “use a newer browser”.


Basing support on version numbers is stupid because version numbers mean nothing. The Firefox 6 beta is due out in two weeks' time!

Time based support would be better.


Do you think they'll really drop IE8 once IE10 is released next year? Firefox 5 will be released tomorrow. FF 3.6 support will be discontinued? Actually, it would be great if Google could nudge everyone to keep upgrading.


Or Firefox 4 when Firefox 6 is released in August? I'm curious about this too.

Of course, "not supported" isn't the same as "not working". For Firefox (just like Chrome), I doubt there will be changes breaking enough to cause a problem. The real question is if they'll start using features (e.g. canvas or newer selectors) not available in those earlier browsers for major functionality in their apps.


They can't drop IE8 because too many businesses are still on XP.


The latest builds of Firefox and Chrome both work fine on XP, so they could theoretically force users to upgrade away from IE altogether.


They can't; many businesses rely on Microsoft's browser lockdown support, for example restricting IE from browsing the local disk or preventing settings from being modified. Firefox and Chrome have virtually no lockdown support.


Can't you secure everything on the operating system level?


I am sorry, but adapting to a vendor is not how IT departments run (especially when a vendor aims to replace your IT departments).


They do when they don't have a choice (which is also mostly the only way you can get most people to change at all).


I think a lot of places would drop google apps before they let google dictate what software they use.


And once again, they list specific version numbers for IE, Firefox, and Safari, but just say "Chrome" for Chrome, even though they later group Chrome with the rest under the two-most-recent-versions rule.

I guess it's because they don't know for sure what versions of Chrome will be the two most recent released, but it's still kind of funny.


I'm going to go with the fact that unless you NEVER restart your browser, the likelihood is that you are one or two minor versions behind the latest Chrome code at worst. With Chrome's automagical update system, most users barely notice when an update comes down the pike.


True... however, a moderately large website whose analytics I have access to shows about 5% of Chrome users using something older than 11. It's possible to install Chrome without enabling the auto-updates, so exactly which versions of Chrome they still support is somewhat useful information, I guess.

More likely, since the versions of FFx, IE, and Safari they support are versions of other companies' products, unlike their own, they feel like they have to be more upfront.


> exactly which versions of Chrome they still support is somewhat useful information

The current stable release of Chrome and the version before it.


That's because transitioning users from an older version of IE to a newer one can take years, while I think they said that when they upgrade Chrome, it takes about 10 days to get 99% of the users on the newer version.


They won't as long as they have a significant market share, but IE7's market share is like 6%. I think they can live without that, especially if it means pushing a few more users to upgrade to more modern browsers.


> Firefox 5 will be released tomorrow. FF 3.6 support will be discontinued?

This would cause problems for anyone on Debian squeeze.


Debian Stable is for servers. For desktop, obviously you want Debian Testing or Unstable.

Of course it's a matter of choice, but if you stick to Debian Stable for your desktop, such issues are to be expected.


Sorry, that's just wrong. Stable means stable, be it server or desktop. Testing and Unstable are aptly named by Debian, and that's where you should expect issues to arise.


That's not quite right. Being stable is subjective. Ubuntu and a TON of distros are based on Debian Unstable. Does that mean they are unstable because of that?

Debian unstable and testing are a bit more unstable, of course, but they are still good enough for desktop.


Given how often Ubuntu breaks hilariously on dist-upgrades, yes, I would say they are unstable. Though for slightly different reasons than the ones the grandparent poster mentioned.


"not supported" and "not working" are different things. I can't think of any breaking changes between 3.6 and 4 outside of the extension system.


We're using IceWeasel which will, as per Debian's usual policy, still be supported with security fixes.


"Security fixes" doesn't include support for any new features that Google Apps chooses to use without considering older Firefox versions.


You choose Google Apps. Google Apps should not choose you.


Why shouldn't they? It seems perfectly reasonable to exclude customers if they are unprofitable: in this case, it may be that Google is willing to take a hit in the short term, with a view to innovating strongly in the future.


They'll drop support once the costs become significant relative to the number of users. Right now, older IE versions have a much higher cost due to being built on an antique rendering engine - Firefox 3.6 is generally far less expensive unless you actually depend on certain more recent features because it wasn't missing support for so many basic features (e.g. think of the impact on your CSS without something like inline-block versus, say, box-shadow).


If you think that's bad, we're on Chrome 12 now... So they no longer officially support their own major release from a few months ago. I'm not sure what they're basing this on, but the major revision number doesn't seem likely.

http://blog.chromium.org/2010/07/release-early-release-often...


You'll note that in the link it just says Chrome without a version number. They'll support their own browser differently. Unfair? Some might think it, but considering Chrome's success in forcing version upgrades on people it will work fine.

I think people are putting too much/too little thought into this version number thing. With Firefox's six-weekly release schedule now, they'll define a major release differently from the version number.


In theory I agree with you, but at some point people are going to get tired of the forced upgrades and stop using Google Apps.


Why would they be tired? Is it a pain to upgrade? (I admit I've become used to chrome upgrading in the background).


Sometimes OSes are deemed obsolete for whatever reason, and browsers -- all of them! -- simply stop developing for those OSes. This means that you're at least shelling out the cost for a new OS, which can be (1) painful (if you're using Windows or Linux in some cases) or (2) impossible (e.g. if you're using Mac OS X and Apple has decided that your hardware is "end-of-lifed" -- e.g. PowerBook G4). If you're one of the unlucky ones to be in the latter category, you actually have to shell out enough money for at least some upgraded hardware (perhaps in addition to the OS), or possibly a brand new (or newer but still used) computer, which will likely be a lot more painful than just shelling out for the OS.

The key point is that Google Apps are tied to browser versions, and browser versions are tied to OS versions, and OS versions aren't exactly consistent in their support or end-user coverage. Users who figured their current setup works well enough for now, and who for many years haven't had any reason to upgrade will now be forced to if they want to make use of those Google Apps -- or, they can take the much less costly route and simply not use Google Apps, or deal with "simple HTML mode" or whatever.

The worst part about this that irritates me most, though, is that many of these systems simply don't have anything wrong with them except that someone, somewhere, arbitrarily decided they were "too old". So perfectly functional pieces of equipment now lack functionality because enough people have arbitrarily decided that they "don't have" that functionality anymore, even where it might actually be possible.

On the plus side, it does mean that no one has to support old browsers' quirky and always-varied interpretations of the same chunk of code...


It is if you're in a large organization, or you want to make sure things don't break before you release them to your users.


The upgrade difficulties in large organizations are often brought upon themselves by poor adherence to public standards and insufficient test automation. Last I checked, Google itself is a large organization, and they don't seem to have any worries over the rate at which software gets updated.


Google is a big organization in the business of developing web software. Most organizations are in other businesses. If I'm running a construction company, it doesn't make sense to have IT consultants come in each time a new version of something comes out. If it works, it's best to leave it alone, modulo security fixes.

Do you feel the need to upgrade your house's plumbing system and electrical each time an innovation happens? That's how most people feel about software.


When the electrical grid connecting to your house gets updated and no longer supports your installed system, you upgrade as well or you don't get electricity.


Sure. But that happens maybe every 20-30 years. Not every 6-12 months.


A web browser isn't akin to plumbing, rather it would be closer to a fixture. And I do understand why folks would want to update their fixtures often.


I don't. Why would I want to upgrade my fixtures twice a year if they're working and look fine? I'd be happy to keep the same set for 20 years, myself.


Is it really worth carrying a metaphor out to its absurd conclusion? A web browser does not exactly equate to a plumbing fixture, yet they both have maturation cycles that prompt people to upgrade over time. Browser technology moves faster than plumbing, I don't think anyone would be terribly surprised by that revelation, yet when plumbing technology advances I will upgrade. I will upgrade sooner rather than later because the savings in time and energy pay off over time, just like with browser technology. The further behind the cycle I remain, the more it costs me to modernize in the future and the more it costs me to stick with the old. The literal time period is completely irrelevant.


Coincident with the debut of admin-less ChromeFrame support, this makes sense.


I'm all in favor of anything that puts upgrade pressure on users of IE 6 and 7, but this rolling "current and previous major releases" idea is silly. They should simply establish minimum browser requirements like everybody else and update them when it makes sense to, rather than arbitrarily leaving it to whenever the browser makers release new versions.


"Current and previous major releases" is a peaceful approach. There is no discretion involved - at no point was an active decision made to cut off your IE 7 users.

Plus, your FF 4 users will know exactly when FF support will be dropped - the day FF 6 is released. Simple.


This is excellent news. I think those who don't really understand the importance of upgrading will finally see the light when a company like Google encourages, nay, forces it. This is a great day, especially for app developers.


I like how they don't even put a version number on Chrome.


To me it seems Microsoft needs to take a leaf from Google's book and incorporate automagical browser updating. IE, sadly, has always controlled the browser market because it comes prepackaged, and most users are novices, oblivious to updating and system requirements. Sure, people will say they are copying Google, but in doing this they might finally be able to keep up with web standards as they are released, instead of supporting new standards two years after everyone else does, creating this entire version fiasco to begin with.


Google is only really able to do it because nobody is depending too much on stuff built on top of the browser (yet, anyway). When you've built whole internal apps on top of a browser, as many companies have built on IE, frequent upgrades that change anything can be problematic. Firefox is in an interesting in-between position, where they encourage frequent upgrades, but users who use a lot of add-ons are reluctant to upgrade too often, because upgrades often break add-ons.

Of course, it's not a problem if every upgrade is backwards-compatible with every add-on and intranet app, but that's tough to pull off.


Thank you so much! I love you Google.


A lot of people are talking about what will happen with IE / Firefox support. You know what's a better way to address all these problems (especially for Firefox)? Why not just do the auto-update process Chrome does? I can understand why IE doesn't, but why doesn't Mozilla expend significant effort on just doing that, solving most of these issues of "outdated" Firefox versions?


Mozilla is gonna do frequent updates like Chrome starting from Firefox 5.


Does anyone have statistics regarding browser usage in the wild?


This report was widely discussed, in recent days: http://royal.pingdom.com/2011/06/17/report-the-most-common-w...


This also means they'll be dropping IE8, which is generally considered to have the largest market share at the moment, when IE10 (already in beta) comes out.


Does this mean that if I'm running Google Chrome 10, it won't work with Google Apps?

I notice there is a lack of effective major version numbering at Google. When does Google stop supporting its own browser?

It'd also be interesting to see what they intend to do about mobile browsers. How long will an old OpenWave XHTML browser be supported? How long will an Android 1.5 browser?


Google stops supporting anything below the current shipping version. I think Chrome 12 is the shipping stable version, and Dev is at 14, so in Chrome time, version 10 is archaic. But Chrome has the automatic update feature, which gets something like 90% of installations upgraded to the latest release within a week of its publication, so versions aren't nearly as much of an issue.


is it even possible to not be running the latest version of chrome? who's still running chrome 10?

and despite a lack of official support, there aren't any huge changes between these versions so things will probably still work fine. it just means they aren't testing for it.


The old Microsoft would respond by pushing out 2 major releases in 6 months to reduce Google Apps penetration.

Ballmersoft? Notsomuch.


That's not really a good tactic, because Google could just change its policies.


It could also add significant attention to the issue, and backfire horribly.


Why would it add attention to this issue?

Microsoft would just say they are responding to Chrome's accelerated release schedule and redefining what a "major" release is.



In other news, I've been waiting for Opera support for years and I still don't see it in the list of supported browsers.


This is what really irks me about Google Apps. You don't have any control over this kind of thing. They can (and do) yank the rug out from under you whenever they feel like it. A while back, they forcibly transitioned all accounts to also be normal Google accounts, with no user input. They did this before they figured out how to transition users who already had Google accounts, so now my users are split into first-class and second-class citizens.

It really doesn't feel like it should be that hard for Google to have some infrastructure where the user can control version transitions. Google makes a new version but, barring security issues, keeps all old versions in their cloud. At that point, you transition when you're ready.


Really, that doesn't feel like it should be that hard? As a developer, that sounds like a nightmare. Services like these aren't just a piece of software that can be versioned; they're a system of systems, each one with certain requirements, limits, supported APIs, etc. Wrangling all of that is hard enough just to support one user-facing site, much less keeping every version of the site available and working.


It also doesn't sound all that hard to update your browser.


And in general, it's not. But certain platforms (e.g. OS X on PowerPC) have been artificially restricted to the point where no new browsers are being made for them, so the latest version you can get to on them is probably outside of Google's scope of support; and if it isn't now, it will be shortly. For instance, sure, a PowerBook G4 isn't necessarily powerful by today's standards, but surely it's powerful enough for Google Apps, yet today I don't believe it's supported in the current version of any browser because of its inherent OS X 10.4 or 10.5 cap. Sure, it's possible to switch to Linux, but I don't personally believe the user should have to.

In essence, upgrades are not as simple as you may think due to forced platform incompatibility/vendor lock-in. Forcing an upgrade like this can cost users a non-trivial sum of money on top of what you're already charging for the service!

So while I appreciate the moving into the future, I feel bad for the people whose wallets are going to feel the pain of such a move.


Didn't you also bring that upon yourself by buying a product from that particular vendor (Apple) as opposed to any other? Upgrade treadmills and forced obsolescence are always a risk with software, and users need to learn to account for it.

I recently acquired a Macbook Pro from work, but I'm using Windows 7 on it full-time. I'm far too wary of being forced to upgrade when Apple stops providing security updates to the version of OS X currently installed on it.


Depends on where you work. Banks and financial institutions have notoriously slow upgrade paths in IT.


Banks and financial institutions are subject to extensive document retention and security regulations -- using Google Docs is out of the question. Of course, that is one of the primary reasons why their upgrade paths are so slow.


they have slow update paths because they can get away with slow update paths. if web developers stop catering to lazy IT departments, maybe they won't be so slow anymore.


They have slow update paths because they've spent a lot of money on in-house middleware that they don't feel a need to update, and sometimes those systems don't work so hot on the new browser. Faced with a choice between updating old middleware and not updating the browser, they choose to wait on updating the browser. The longer they do that, the more work the middleware needs to be "up to date", and the worse the problem becomes. Definitely not isolated to banks.



