Hacker News

I'm still skeptical. Is there any precedent for similar features being used today? The closest I can think of is turning off JavaScript, which this honestly seems made to combat. And I doubt that has penetration worth talking about.

And again, just don't put so many giant graphics on a page. Problem mostly solved. With less tech and likely faster results.




Please read the spec. Your accusation of this being intended to defeat turning off JS is dealt with here[0]. It's only a +188 -66 patch and eminently readable. If you are discussing this without reading it, this is going to be an unproductive discussion.

0: https://github.com/whatwg/html/pull/3752/files#diff-36cd38f4...


The size is a non sequitur. This is something that otherwise requires JavaScript. Full stop. No?

Now, I don't actually think this is being done to sidestep people who turn off JavaScript. Mainly because I just don't think that is a market worth worrying about.

But I can't see why this is a feature that we need. Progressive loading, I could almost see. But by and large, high-resolution images are just not compatible with high-speed page loads. I'm not seeing how this feature actually changes that.
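For context, the thing "that otherwise requires JavaScript" is usually implemented with IntersectionObserver and a placeholder attribute. A rough sketch (the `data-src` convention and filenames here are hypothetical, just a common pattern, not anything from the spec):

```html
<img data-src="photo.jpg" alt="A photo">
<script>
  // Swap in the real URL once a placeholder image approaches the
  // viewport; this is the boilerplate the native attribute replaces.
  const io = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        io.unobserve(entry.target);
      }
    }
  });
  document.querySelectorAll('img[data-src]').forEach((img) => io.observe(img));
</script>
```

Note that with scripts disabled, this pattern leaves the images broken entirely, which is exactly the failure mode the spec addresses.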


The link is to the part of the diff that enforces that it behaves identically to current unannotated tags when JavaScript is disabled. Obviously, if it's designed to be no different from the current state when JS is disabled, then it can't combat disabling JS.
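For reference, the annotated markup being specified is just an attribute hint on the element (this eventually shipped as `loading="lazy"`; the filenames below are illustrative):

```html
<!-- Hint: defer fetching until the image nears the viewport. -->
<img src="photo-large.jpg" loading="lazy" alt="A large photo">

<!-- Default behavior, fetched immediately as today. -->
<img src="logo.png" loading="eager" alt="Site logo">
```

With JS disabled, the spec says these behave like plain `<img>` tags, so nothing gets more broken than it already is.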


See, for me, disabling it would mean not eagerly loading anything. This literally puts the site into what is assumed to be the least optimized state if scripts are off. Ironic, as scripts off should typically mean faster.


Why should built-in lazy loading combat turning off JavaScript? It means fewer pages will be broken when you do that.



