Bloated (fabiensanglard.net)
182 points by janvdberg on Sept 27, 2018 | 84 comments



With regards to the "usually a news site" part, somebody put me onto CNN's "lite" page the other day, and it's honestly a freaking joy to use.

http://lite.cnn.com/en

On the other hand, I think there's definitely a case to be made for rich, application-like presentation of content, when it enhances the delivery of information. For instance:

http://www.r2d3.us/visual-intro-to-machine-learning-part-1/

The problem is that everyone dreams of making something cool like the latter, when their actual content is much closer to the former.


>somebody put me onto CNN's "lite" page the other day, and it's honestly a freaking joy to use.

That's neat, but I think pairing this with some lightweight, web-optimized static images would make it even better without being detrimental to lightness.

Sometimes a picture does say a thousand words. A photograph of a person being interviewed is something that adds a lot to an article. Plus there's the whole photojournalism aspect that's lost in only keeping text.

It's not the images, per se, that make heavy pages heavy, after all. It's tons of JS that pulls in FSM knows what.


> A photograph of a person being interviewed is something that adds a lot to an article.

Does it? It seems that it ought to add nothing to the content - unless the content specifically references their appearance.

As well as that, unless it's a person you can google independently (like a head of state, or large company CEO) there's very little chance you can know whether the image is actually the person being interviewed, and even less chance you can tell whether it was /this/ interview. What you do know is that the image will have been posed, chosen from a lot of candidate images, and quite likely edited after that.

It might feel good to recognise and see a human face, and therefore "add a lot" from the publisher side - exploiting that sense to make people click and keep people there longer - but that's not quite the same.


> Does it? It seems that it ought to add nothing to the content

"Add"? It's a part of the content. It's been in paper for more than a century.


A literal handful of CSS styles would make it more readable on a variety of platforms. Failing that, a full-text feed for the articles would be very readable in an RSS reader.


A few images and some tasteful CSS could turn the light page into something fairly respectable looking.

Consider e.g. https://web.archive.org/web/19990208015258/http://www.nytime...


Umm... that page would have been way better with simple images/animations instead of animating some laggy canvas2d crap based on my scroll position. Did they not try their page on mobile? It's horrible.



>http://lite.cnn.com/en

Oh my God. I forgot how nice it is when websites Just Work, without any lag or stuttering or ads.


I miss the days when sites like Slashdot had an "ultramode" version. Text only, 0 bloat.


Also thin.npr.org


This feels like the trillionth time I read a rant like this, but that doesn't make it less true or important.

I really wonder from time to time if it's just me, having become an old grumpy "get off my lawn" grandpa unable to cope with the world turning and moving forward.

"You can't write everything from scratch every time"

"That's just how it's done today"

"Cmon, people aren't running IE5 on a pentium 2 anymore"

"SPAs are so much better UX. Everything should be an SPA!"

heavy breathing with red face

It's funny how this guy is complaining about a 7-year-old laptop; any non-flagship mobile device from 4 years ago is almost unusable on the web.


As long as advertising corporations keep control of your web browser, the experience will always be optimized for ad tech and maybe some other megacorp interests, but definitely not for you. That's the problem.

I imagine a web browser actually made for the user would try hard to extract content from web pages by default, render it with a decent, consistent UX, and not blindly let third parties display and execute any crap they want. There is no reason to render, say, articles with comments in a thousand different ways for a thousand websites. And no reason it can't be blazing fast on a Raspberry Pi.


Agreed, current add-ons are clearly not enough.


> I really wonder from time to time if it's just me, having become an old grumpy "get off my lawn" grandpa...

I don't think it's just you. At the very least, us :-)

Seriously, I think there are different kinds of people who don't care:

- Most people. They don't know it could be done otherwise. For them, bloat is just the way the web is.

- People who like bloat. It's obvious if you look at how they dress or furnish their homes.

- People who work making those websites. There are a lot of them here, so don't expect many upvotes.

- People who repeat what others say: the phrases that make your breathing heavy.

I believe that at some point in the future there will be a swing back to simplicity, if only because of fashion.


Kudos to the author. I think if every technical person with a blog takes the time to reevaluate it will be a good start.

Over the years I've reworked my own blog to not use any third-party dependencies (Bootstrap, jQuery, etc.) and to use very few images. My last task is to get rid of Google Analytics, which Piwik (now Matomo) has so far failed to replace, but hope is not lost. I also discovered that GA does not respect Do Not Track, so I modified my site to not load it in that case.

At the very least, all HNers with blogs should consider doing this.


I don't have any analytics on my blog. Can you explain the benefit of GA over Piwik? What genuinely useful insights does GA give you (specifically for a personal blog) that Piwik does not? (I assume that the info from Piwik would be similar to what you get from computing stats from the web server access log.)


It's not the insights; I think Piwik got that part right. The problem is that when traffic increased, Piwik and the server it was on could no longer keep up.


Agreed, Piwik with lots of traffic is a major problem that can grind the server to a halt (for those of us without a massive budget).


The sad thing is that even if 100% of the people with personal websites did the same, it probably wouldn't change much. And yeah, I'm firmly in the "this old computer should definitely render web pages if it's good enough to run video games" camp, and if the people working on current browsers argue that it's complicated, with all the scrolling and JS and stuff... well, good riddance, web of 2018 - it was absolutely fine in 2008.


> The sad thing is that even if 100% of the people with personal websites did the same, it probably wouldn't change much.

Why not? A significant portion of my web browsing is done on people's personal websites, since they often have cool things to share. If I'm going to get better performance on these websites, I might change my browsing preferences to favor these even more heavily.


I think part of parent's point is that you're probably in the minority. I don't have data to back this up, but I suspect that the users who make up the "long tail" of web traffic tend to visit a higher number of websites than the vast majority of users who mostly stick to the "walled gardens."


Of course I don't have any real data, but I'd bet a lot that most of the internet's non-automated (i.e. human) web traffic goes to big publications and organizations. The "private web" is kind of dead. (Let's assume a generous 20% usage, which is still kinda dead.)

So your web traffic might be to personal websites (mine is, too) - but we're a minority I guess.

So while the goal is laudable, even a 100% change would probably affect only 20-30% of the web - I'd guess even less.


>So your web traffic might be to personal websites (mine is, too) - but we're a minority I guess

But then that would be an improved situation for the two of you, right? (And many others like you, including me.) So it would still make some difference to those who care.

The overall load on the web would still be higher than it could be, though, due to the bloat.

Small improvements should not necessarily be neglected or dismissed, though. Otherwise it becomes like a symptom of the original problem being discussed.


Sure, especially as it won't hurt. I think I've been doing pretty good with the websites I control - but I don't have many visitors (which is fine).

Every little bit helps, and you shouldn't listen to the naysayers (e.g. yours truly) - I'm just a little disappointed because I don't think it will make a big difference. At least my conscience is clear ;)


Got you now, makes sense.


Great initiative. The site loads fast, the text is readable, the adblocker has nothing to block - what more could you want from a website? You might want to look into optimizing the SVG files, as Inkscape by default leaves a lot of unnecessary metadata.


I came here to write the same. svgcleaner [1] will reduce it from 8.4 KiB to 1.3 KiB (compressed with zopfli), which is 85% smaller.
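
(For anyone wanting to reproduce that, the usage is roughly as follows - file names are placeholders:)

  svgcleaner original.svg cleaned.svg   # strips editor/Inkscape metadata
  zopfli cleaned.svg                    # writes cleaned.svg.gz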

And a manually crafted one is just 477 B, or 261 B compressed, which is just 3% of the original.

Now we can talk about bloatedness.

  <svg width="601" height="751" xmlns="http://www.w3.org/2000/svg">
    <rect width="600" height="750" rx="38" fill="white" stroke="black"/>
    <rect x="16" y="16" width="568" height="718" rx="24" fill="none" stroke="black" stroke-width="15"/>
    <text text-anchor="middle" font-family="Arial" font-size="150" font-weight="bold">
        <tspan x="300" y="200">FPS</tspan><tspan x="300" y="350">LIMIT</tspan><tspan x="300" y="660" font-size="350">15</tspan>
    </text>
  </svg>
[1]: https://github.com/RazrFalcon/svgcleaner


The image is 8 kilobytes for 10 characters, and it actually could have been done with just HTML and CSS. Speaking of which, don't look too hard at the page's HTML. Still, I applaud the message.

P.S. Why do people put "index.php" in the URL? It would be so much prettier without it --- and a little less "bloated." ;)


> what more could you want from a website?

I guess once the site is less bloated, then the next step is less bloated content. Could that message have been reduced to the length of a Tweet?

And after that, unpublish it entirely - we'll see the site has changed when we see it.

Fewer websites with less content. Only the most interesting things. Who could argue against that?


This is why I keep using RSS even though many people say it's dead.

I even went ahead and created a curated feed [1] for Hacker News to pair with my RSS reader [2].

I stopped doing casual web browsing years ago, when many site owners started to bloat their websites with an extravagant amount of ads, excessive JavaScript code, and unnecessary "enhancements" (like custom scrolling experiences and the like). Nowadays, websites like Hacker News, the old Reddit interface, Craigslist, and similar are the only ones I find myself visiting, just because they are among the few hundred that still load with server-side generated HTML and a fairly minimal UI [3].

It doesn't help much that the same front-end programmers who helped increase the bloat in today's web are the same group building "modern" desktop software - looking at you, Electron. They may call it convenient and productive all they want, but to me - and many others - it just means they are lazy and cheap. Not long ago I read an article from the author of TablePlus [4] explaining how he decided to build his app using native libraries after weighing the pros and cons of other GUI frameworks, including Electron, and coming to the conclusion that slower development was worth it if the resulting product was performant.

Because in the end, who is paying the bills?

• A customer who enjoys a fast and resource-efficient desktop experience?

• Or the front-end programmer who's happy to be using JavaScript to develop the app?

Think about it.

PS: almost 100% [5] :-)

[1] https://github.com/cixtor/rssfeed

[2] https://reederapp.com/

[3] https://thenextweb.com/dd/2017/06/29/why-are-some-of-the-ugl...

[4] https://tableplus.io

[5] https://developers.google.com/speed/pagespeed/insights/?url=...


> coming to the conclusion that slower development was worth it if the resulting product was performant

What I don't get is where these claims of faster development time come from. Toolkits like Qt make it way faster and easier to build UIs than trying to use HTML/CSS garbage.
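
For comparison, a complete windowed Qt app fits in about a dozen lines of C++ (a minimal sketch, assuming Qt 5 widgets; the button label is arbitrary):

  // A complete Qt Widgets program: one window, one button.
  #include <QApplication>
  #include <QPushButton>
  int main(int argc, char *argv[]) {
      QApplication app(argc, argv);
      QPushButton button("Hello, native UI");
      // Clicking the button quits the application.
      QObject::connect(&button, &QPushButton::clicked,
                       &app, &QApplication::quit);
      button.show();
      return app.exec();
  }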


I switched to VS Code this year, as many Emacs veterans did. It's lightning fast. And lightweight. It's a joy to use. I can't imagine anything much more performant than that.


I use a similar design for my website; it has been like this for about 10 years now. I still tweak some PHP 5 code, and things just work on some free hosting.

The problem with the web is how permissive it is, so it gets bloated, and since the rise of online advertising this trend has only continued.

https://en.wikipedia.org/wiki/Wirth%27s_law

> is a computing adage which states that software is getting slower more rapidly than hardware becomes faster.

What bothers me is how the web is becoming impossible to run on a modest smartphone, and how it burns your battery into oblivion.

There might be solutions to this, like pre-parsing HTML and sending the parse tree, refactoring HTML and CSS rules into something leaner, or just rebuilding a new document format from scratch that is readable by humans - something like Markdown as a syntax - and rendering it directly without HTML.

We're often reminded about Knuth's "95% of the time", but HTML is really a crying example of a badly designed document format, which turned into a black hole of computing complexity once JS gained access to the DOM.

Frankly, web development disgusts me, and I avoid it as a career path as much as I can. We rely on web browsers and search engines way too much.


I think the race to make the web an application platform is responsible for this. Web features are growing every day trying to match native apps, and the media guys leverage them to make us sad.

We really need to split the web stack in two somehow: differentiate consumption sites with some basic interaction from full-blown applications, and handle them completely separately, down to the operating-system integration level. Right now a web application like Facebook runs in a tab (?) or, worse, is shipped with a whole browser instance (e.g. Slack). There's something wrong here.

A Google-AMP like version of the web for things that are not GMail or Facebook (99% of the web, especially in the long tail) is probably good enough.


Well, it's also that the "web app" push only ever concerns itself with features, never with performance.

You'll frequently see things like "native apps can do X, so the browser should, too" where X is camera, opengl, files, etc...

Not necessarily wrong, but you basically never see anyone pushing the boundaries of performance. I can run games at native resolution in excess of 100fps all day long, but I can't scroll a fucking web page at 60fps? And the round-trip communication with a webworker, the only mechanism whatsoever to leverage the other 75-95% of the system's CPU, is measured in 100s of microseconds to milliseconds?

Being an app platform isn't just features, it's also getting the fundamentals to perform well. And the web just... doesn't.


I love my web apps - I vastly prefer them to native apps. They don't need to be installed. Are mostly sandboxed from my native OS. Are trivial to update and even more trivial to use.

I don't even kind of understand the hatred directed at giving people an actual giant, useful, full-featured application that is often only a few megs in size and can be loaded by merely visiting a URL.


WebP only works on ~73% of browsers (https://caniuse.com/#feat=webp, no Firefox/Safari/Edge support), so it's a little early to switch all your images to it. MozJPEG is a good JPEG compressor.


You can use the HTML5 <picture> element to add a fallback JPEG for all browsers that don't currently support WebP.
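
A minimal sketch (file names are placeholders):

  <picture>
    <!-- served to browsers that understand WebP -->
    <source srcset="photo.webp" type="image/webp">
    <!-- fallback for everything else -->
    <img src="photo.jpg" alt="description of the photo">
  </picture>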


Even better: you can have your image server check the accepted formats in the request header and transparently send WebP content instead of JPEG. Then you don't have to change your markup at all.

Browsers don't really care about file extensions.

https://github.com/uhop/grunt-tight-sprite/wiki/Recipe:-serv...
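
In nginx, for instance, the idea sketches out roughly like this (untested; it assumes you pre-generate a .webp file next to each original, named e.g. photo.jpg.webp):

  # Map the Accept header to an optional ".webp" suffix.
  map $http_accept $webp_suffix {
      default        "";
      "~*image/webp" ".webp";
  }
  server {
      location ~* \.(jpe?g|png)$ {
          add_header Vary Accept;                # caches must key on Accept
          try_files $uri$webp_suffix $uri =404;  # prefer the .webp twin
      }
  }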


Oh, thanks! I didn't know that was possible.


The new Reddit site is a major offender. After about 5 pages' worth of scrolling, FF becomes janky (it takes Chrome about 10 pages, but I'm off Chrome now). Once the jank sets in, closing the tab leads to a UI lockup as FF presumably tries to unload all the garbage in memory.

At least they still have old.reddit to fall back to.


I guess the new design is curing the addiction to reddit.


I am dreaming of a solution to bloat where all mainstream browsers would throttle per-tab CPU/memory/bandwidth usage to Pentium III-on-DSL-circa-2001 levels by default.

(The "full speed" option should be there to keep power users from switching, but it should not be obvious to, say, 50% of users.)

This should be pushed as a feature: now your browser helps make your computer work faster; any slowness in page loading is to be blamed on the websites.

This would not affect sites like Hacker News, but it would bring a lot of websites to a standstill - and force them to slim down.


A perhaps more workable solution would be to have per-tab indicators of total bytes downloaded, peak memory used, and total CPU time used (the first one already exists as part of the developer tools, but it's hidden by default). People on fast computers and fast networks do not feel the weight of bloated pages; making it visible to everyone provides an incentive to reduce it.


Perhaps summarised as "the percentage of your battery this tab has consumed".


I really think that you are onto something there. Or maybe a bloat index calculating the ratio between downloaded content and user content: 1 kB of text with 2 MB of JavaScript/CSS etc. would give you an index of 2000, whereas if you only had 23 kB of JavaScript/CSS etc., it would be 23.
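
You can even approximate that index from the browser console today. A rough sketch using the Resource Timing API (transferSize is the resource's size over the wire; the text-size estimate is crude):

  // Rough "bloat index": bytes shipped per byte of visible text.
  const shipped = performance.getEntriesByType('resource')
      .reduce((sum, e) => sum + (e.transferSize || 0), 0);
  const text = document.body.innerText.length;  // ~1 byte per character
  console.log('bloat index:', Math.round(shipped / Math.max(text, 1)));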


With an indicator like "better / worse than X% web pages".


I've always wanted something similar as a devtool. I can throttle a page's bandwidth in Chrome, but I can't easily put limits on how much CPU time it is allocated, and I can't constrain its memory usage or simulate swapping to disk.

Most designers and web developers chuck their stuff together on a brand-new 27" iMac, and it runs (and looks) like trash on a computer an average person would own.


Chrome devtools now has CPU performance throttling. Go to the performance panel and click the red gear in the top right. There's both network and CPU throttling options.


I'll be making heavy use of that; it's a shame it's a relative slowdown, though. I realise that throttling consistently down to an absolute performance level would be difficult to implement, but it would be handy.


I had a similar idea but I couldn't work out how to resolve all the issues that would cause. For example, things that naturally need more performance, like 3D games. I don't think playing games in your web browser would put you in the category of power user.


The key would be a permission prompt like browsers now have for camera access, notifications, geolocation, pop ups, etc.

If a page uses too much CPU, it will get throttled, and the browser will ask the user whether they want to unthrottle it.

Website owners currently assume they’re entitled to as much of my processor time and battery life as they want, just as they once assumed they were entitled to display popup ads just because they could. If this behavior is going to change, it will only happen because browsers stop letting them and instead say “No, this is the user’s CPU time, not yours.”


You can hack this now by limiting CPU per process.

Not ideal, I realize.

https://medium.com/@sbr464/limit-dropbox-and-others-on-macos...
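
For example, with the cpulimit utility (the process name and percentage are just illustrative):

  # Cap the newest matching process at 50% of one core.
  cpulimit --pid $(pgrep -n firefox) --limit 50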


The first browser to do this would see everyone leave, because the others are faster.


That's why it's but a dream.

If all browsers did this at once, though...


[flagged]



This is completely insane and would break huge swaths of the web.


Exactly.


That's the point. That's what it's supposed to do.


Kudos, Fabien! Your site has been endlessly informative on my own graphics programming journey. I think you could include a donate button and maybe raise some funds toward a new laptop or an Nvidia Turing GPU ;)

One site I use multiple times a day that I think has decent information architecture is FinViz, the stock screener: high-resolution quotes, charts, and heatmaps, updated on intervals of a few seconds, with lots of aggregated news and blog links.

This cryptocurrency performance page is a great example. cryptowat.ch and many other tools run high CPU loads, with an emphasis on saturating the information space with real-time depth of book, exchange arbitrage, analytics, etc., when all you really want is a quick snapshot of relative price action.

https://finviz.com/crypto_performance.ashx


Lighthouse (now the Chrome dev tools "Audits" tab) makes it really easy to trim your site down to the bare minimum. I've been using it on my personal blog; it's actually quite fun to work within constraints :)


Disable JavaScript by default. That's it. A single one-button plugin made my web experience incredibly better, and almost all news and blog sites became much cleaner and more responsive.


I agree with his ideas, but I still think I'd rather have his old page back [1].

I think my main issue is the monospaced font, which is good for code but not very pleasant on the eyes when we're talking about a blog post or long paragraphs.

[1] http://fabiensanglard.net/Compile_Like_Its_1992/index.php


I used to work for the companies that build these bloated websites. I spent a year rebuilding the BBC News front page to make it not-awful[1] but it felt like I was fighting against the rising tide.

I've since moved on and now work at SpeedCurve. It's pretty motivating to see so many of our customers making their websites faster and generally better for the people who use them. It's also pretty depressing to know that the web overall is spiralling down the drain, getting heavier and slower every year [2]. What a shit show.

[1] https://wildlyinaccurate.com/introducing-a-faster-bbc-news-f...

[2] https://speedcurve.com/blog/web-performance-page-bloat/


If people actually cared, websites would be lighter. But they don't.

Yes, this community here does care, much like the experienced woodworker who nods approvingly when he sees <some old-fashioned, difficult, and slightly more elegant way to do something... I really have no idea about woodworking>.

Even in cases where people might agree that lighter pages are preferable, it's just not a priority.

I know that might appear wrongheaded. But I assure you there are dozens of things you do every day that would make a subject matter expert's hairs fall out: that's not how you're supposed to brush your teeth ... you have to wait for the engine to cool before measuring oil ...

...and, closer to home: websites should be accessible to people with impaired vision, among other things. This rant, however, scores rather low. It might seem trivial, but not adding an alt description to explain the image joke can make people feel excluded.


> But I assure you there are dozens of things you do every day that would make a subject matter expert's hairs fall out: that's not how you're supposed to brush your teeth ... you have to wait for the engine to cool before measuring oil ...

I don't even have 10,000 visitors each day watching me measure the oil in my car or brush my teeth. Neither are they /all/ paying extra in electricity and battery life and wasted time watching me do it badly.

What in my everyday life do I do so badly that is also an externalised cost on thousands to millions of other people, which I profit a lot more by doing badly?


Yeah... that's kind of my point: your indignation just seems a little excessive, considering data charges do not, in reality, rank in the top 100 problems people have.

If you really can't fathom why this doesn't get a Special Prosecutor: Remember that every year, tens of thousands die in the US alone because their doctor's handwriting was so bad they got the wrong medicine! Ask your pharmacist what they think of people diddling their life away minifying CSS while this atrocity continues unabated!


It’s called “Craftsmanship”: doing things right even where most customers don’t see or care.


Dare I say there's a mini-movement starting? :D https://www.jasonhanley.com/blog/2018/09/making-websites-eff...

Here's hoping...


If Google factored page weight into ranking, it would start to move things in the right direction.


I'm going to take this opportunity to talk again about the new Gmail and how horribly bloated it is... Scrolling skips frames all the time. And now you're forced into the new version; you can't "go back" anymore...


Does adblock + custom element filtering not help against such sites? If every additional feature that takes up more computing resources provides a slightly higher chance to retain the average user and thus make more advertising revenue, I don't see why large sites would not riddle their sites with such features.

Of course, it's sad that this happens, but I think adblock sufficiently solves it.

As a side-note, I recently installed DNS66 on my android device and the internet is much faster. Using the mainstream web without adblock seems to be difficult these days.


I for one absolutely agree with you that the web has gotten too bloated, especially news websites, and I salute your initiative for "debloating" your blog. This is a path that more of us should follow.

I think Google's AMP project is a step in the right direction, but it seems to me that it's not adopted enough.


AMP is a step in the wrong direction, as it requires you to use JS to do things you can do in plain HTML on a non-AMP site.

A step in the right direction would be a proper subset of HTML/CSS that doesn't require more JS to load.


Curious as to why you think AMP is a step in the right direction.


Because the pages load fast and are not bloated, which is the point of this article.

Unfortunately Google has tied it to other questionable practices, like hosting on their domain, but it's still a step in the right direction. We need something like that, plus a way to cache it on edge servers in a portable fashion, so that any CDN, like Cloudflare, could pick the pages up.


AMP is not bound to hosting on Google, Bing now has an AMP cache as well: https://blogs.bing.com/Webmaster-Blog/September-2018/Introdu...

And since you mention Cloudflare, I searched a bit and found https://www.amp.cloudflare.com/

There are problems with AMP, but Google exclusivity is not one of those problems.


@fabien, thank you so much for this. I was half-expecting one of your usual dissections of the architecture, but I don't want anything coming between you and the release of your Doom book, of which I plan on buying at least 2 copies.


In my experience, outline.com does a bang-up job of getting all that clutter out. I am not in any way affiliated with them and don't even know how they monetize, but I hope they stick around for as long as possible.


Kudos

ps: adding to the heartbreaking side of the web: when you use Google cache in text-only mode, you see how often there was (almost) nothing to read.


I love the initiative and the execution. Nice job! I feel like I'm becoming a bigger fan of brutalist web design all the time. :)


I applaud you for doing this for your blog; everyone should at least do their best to slenderize their web properties. Kudos to you!


What about the web browser itself?

AFAIK Chrome and other modern browsers use resource-consuming features like preconnect.




