JavaScript isn't the problem, but it always seems to take the blame. And it's not even slow.

Here are a couple of real culprits:

* Advertising / analytics / social sharing companies. They deliver a boatload of code that does very little for the end-user.

* REST and HTTP/1 in combination. A page needs many different types of data, and REST makes us send a separate request for each kind, often resulting in those 60-80 requests mentioned above. We could be sending a single request for some of them (e.g. GraphQL), or we could get HTTP/2 multiplexing and server push, which would completely fix that problem (see the sketch after this list).

* JSON. It's simple but woefully inadequate: it has no built-in mechanism for encoding multiple references to the same object or for cyclic references. Want to download all the comments with the user info for every user? You have a choice between repeating the user info per comment, requesting the comments first and then the users by id (two requests), or using a serialisation mechanism that supports object references (which isn't pure JSON).

* The DOM. It's slow and full of old cruft that takes up memory and increases execution time.
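
To illustrate the REST point above: a minimal sketch of what the single-request alternative could look like, assuming a hypothetical /graphql endpoint and schema (nothing here is from an actual API):

    // One round-trip replaces separate /comments and /users requests.
    fetch('/graphql', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        query: '{ comments { text time user { id name image } } }'
      })
    })
      .then(res => res.json())
      .then(({ data }) => console.log(data.comments));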




> Want to download all the comments with the user info for every user? You have a choice between repeating the user info per comment, requesting the comments first and then the users by id (two requests), or using a serialisation mechanism that supports object references (which isn't pure JSON).

Or you send it like

    {
      "users": [
        {
          "id": "de305d54-75b4-431b-adb2-eb6b9e546014",
          "name": "Max Mustermann",
          "image": "https://news.ycombinator.com/y18.gif"
        },
        ...
      ],
      "comments": [
        {
          "user": "de305d54-75b4-431b-adb2-eb6b9e546014",
          "time": "2015-10-15T18:51:04.782Z",
          "text": "Or you can do this"
        },
        ...
      ]
    }
That's a standard way to do it, and it works here too. You can have references in JSON; you just have to resolve them yourself.


Alternatively, I would think that gzip does a good job of factoring out repeatedly embedded user objects.


Yes, I agree that gzip will probably fix this single-handedly. The main cost here is very likely network transfer, so gzip will do great.
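
A quick way to sanity-check that claim, as a sketch runnable under Node.js (payload shape borrowed from the example upthread):

    // Embed the same user object in every comment and compare sizes;
    // gzip turns the repeats into cheap back-references.
    const zlib = require('zlib');
    const user = { id: 'de305d54-75b4-431b-adb2-eb6b9e546014', name: 'Max Mustermann' };
    const comments = Array.from({ length: 1000 }, (_, i) =>
      ({ user: user, time: '2015-10-15T18:51:04.782Z', text: 'comment ' + i }));
    const json = JSON.stringify({ comments: comments });
    console.log('raw:', json.length, 'bytes | gzipped:', zlib.gzipSync(json).length, 'bytes');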


Don't forget it takes time to parse large JSON blobs.


And to serialize it from whatever the in-memory representation is.


Yes, obviously – gzip even does that explicitly, replacing repeated text with back-references.

But it still takes more power for your server to go through gzip every time. And it will take more RAM on the client to store those objects.


Something similar has been standardized as well in the JSON API specification; although it adds its own weight to the message, it does address this problem: http://jsonapi.org/format/#document-compound-documents
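
For anyone who doesn't click through: a compound document carries the referenced resources in a top-level "included" array, roughly like this (abridged, values illustrative):

    {
      "data": [{
        "type": "comments",
        "id": "1",
        "attributes": { "text": "Or you can do this" },
        "relationships": {
          "author": { "data": { "type": "users", "id": "9" } }
        }
      }],
      "included": [{
        "type": "users",
        "id": "9",
        "attributes": { "name": "Max Mustermann" }
      }]
    }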


It's a great solution! I'd use dictionaries for faster lookups, though.

But it's not exactly pure JSON. Client-side, you need to attach methods (or getters) to the comment that fetch the user object. I suppose you could just attach a get(reference) method that takes this[reference + '_id'] and looks it up inside `result[reference]`. m:n relations will be harder, though.

Otherwise you can't simply pass each comment to, e.g., a React "Comment" component that renders it; you'd also have to pass the user list (dictionary) to it.


Well, you could process the JSON on client side.

    const usersById = new Map(response.users.map(u => [u.id, u]));          // id -> user index
    response.comments.forEach(comment => comment.user = usersById.get(comment.user));
Preferably, even, you could build that index during rendering so it can be garbage-collected afterwards.


Well, that counts as further deserialisation in my book, at least if you set up a convention to automate it for all kinds of objects. Otherwise you'd have to do it manually for every type of request.
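
A sketch of what such a convention might look like, assuming every collection is keyed by "id" and references use the `<name>_id` form mentioned upthread (the helper itself is hypothetical):

    // Resolve every "<name>_id" field against the top-level "<name>s" collection.
    function hydrate(response) {
      const index = {};
      for (const [collection, items] of Object.entries(response)) {
        index[collection] = new Map(items.map(item => [item.id, item]));
      }
      for (const items of Object.values(response)) {
        for (const item of items) {
          for (const key of Object.keys(item)) {
            if (!key.endsWith('_id')) continue;
            const name = key.slice(0, -3);      // "user_id" -> "user"
            const target = index[name + 's'];   // naive pluralisation
            if (target) item[name] = target.get(item[key]);
          }
        }
      }
      return response;
    }

After that, `response.comments[0].user.name` works, assuming comments carry a `user_id` field. m:n relations would indeed need more than this.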


> * The DOM. It's slow and full of old cruft that takes up memory and increases execution time.

I'm not going to say that the DOM is wonderful but … have you actually measured this to be a significant problem? Almost every time I see claims that the DOM is slow, a benchmark shows that the actual problem is a framework which was marketed as having so much magic performance pixie dust that the developer never actually profiled it.


Good point. Let's see:

http://jsfiddle.net/55Le4ws0/

vs

http://jsfiddle.net/nem6tnv1/

Initialising about 300K invisible DOM nodes, each containing 3 elements with 3-character strings, is ~15 times slower than initialising an array of 300K sub-arrays, each containing 3 elements that are 3-character strings.
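
(For readers who don't want to open the fiddles, they're roughly along these lines; this is a reconstruction, not the original fiddle code:)

    console.time('dom');
    const container = document.createElement('div');
    for (let i = 0; i < 300000; i++) {
      const row = document.createElement('div');
      row.style.display = 'none';   // invisible, so no layout work is needed
      for (let j = 0; j < 3; j++) {
        const cell = document.createElement('span');
        cell.textContent = 'abc';
        row.appendChild(cell);
      }
      container.appendChild(row);
    }
    document.body.appendChild(container);
    console.timeEnd('dom');

    console.time('array');
    const rows = [];
    for (let i = 0; i < 300000; i++) rows.push(['abc', 'abc', 'abc']);
    console.timeEnd('array');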

Additionally, paging through the created nodes 2K at a time by updating their style is just as slow as recreating them (the console.time results say otherwise, but the repaint times are pretty much the same, and this is noticeable on slower computers or mobile).

That's a single benchmark that raises a few questions at best. But I think React put the final nail in the coffin: their VDOM is implemented entirely in JavaScript, and yet it's still often faster to throw away, recreate, and diff entire VDOM trees and apply a minimal set of changes than to make those modifications directly on the DOM nodes...


It's not useful to compare the DOM to a simple array which doesn't do most of the real work which you need to do. Comparing rendering that text using canvas or WebGL would be closer if it also had a layout engine, dynamic sizing, full Unicode support, etc.

Similarly, React is significantly slower – usually integer multiples – than using the DOM. The only times it's faster are cases where the non-React code is inefficiently updating a huge structure using something like innerHTML (which is much slower than using the DOM) and React's diff algorithm is instead updating elements directly. If you're using just the DOM (get*, appending/updating text nodes instead of using innerHTML, etc.), there's no way for React to be faster, because it has to do that work as well, and DOM + scripting overhead is always going to be slower than DOM alone (Amdahl's law).
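
To make the innerHTML comparison concrete, a minimal sketch (element and function names are illustrative):

    const listEl = document.querySelector('ul');

    // Re-parses and rebuilds the entire subtree, even if one item changed:
    function updateViaInnerHTML(items) {
      listEl.innerHTML = items.map(text => '<li>' + text + '</li>').join('');
    }

    // Touches only the single text node that actually changed:
    function updateOneDirectly(i, text) {
      listEl.children[i].firstChild.nodeValue = text;
    }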

The reason to use React is because in many cases the overhead is too low to matter and it helps you write that code much faster, avoid things like write/read layout thrashing, and do so in a way which is easier to maintain.


Most of the elements are hidden via display:none - there is no layout work to be done for them whatsoever. The rest of the overhead seems to lie entirely in allocating a lot of extra data per node.

Also, the total repaint time seems to be the same whether we're recreating the entire 2K rows from scratch or changing the display property of 4K rows. That seems concerning to me.

But yes, a fairer comparison would be to write a WebGL- or Canvas-based layout engine in JS. If it's less than 4 times slower (it's JS, after all), then the DOM is bloated.


Also, remember that my position is that most applications are limited by other factors before you hit the DOM.

I certainly would expect that you could beat the DOM by specializing – e.g. a high-speed table renderer could make aggressive optimizations about how cells are rendered, like table-layout:fixed but more so – but the more general-purpose it becomes the more likely you'd hit similar challenges with flexibility under-cutting performance.

The most interesting direction to me are attempts to radically rethink the DOM implementation – the performance characteristics of something like https://github.com/glennw/webrender should be different in very interesting ways.


> * Advertising / analytics companies. They deliver a boatload of code that does very little for the end-user.

That's not quite fair. They subsidize the content for the end-user. Perhaps that's a crappy status quo, but in many cases without the advertising and analytics the content wouldn't exist in the first place.


I'm increasingly of the mind that:

1. Advertising is the problem.

It's creating technical problems. It's creating UI/UX problems. It's creating gobs of crap content. It's creating massive privacy intrusions and security risks. And for what? Buzzfeed?

2. Ultimately, the problem is the business model for compensating informational goods. Absent some alternative mechanism (broadband tax, federal income tax applied to creative works), I don't see this changing much.

https://www.reddit.com/r/dredmorbius/search?q=broadband+tax&...

3. Micropayments aren't the solution.

http://szabo.best.vwh.net/micropayments.html


Is it not an option to just stop visiting these sites?


Think of the children. I mean authors.

I'd like a system under which creators of quality content would be equitably and fairly compensated. The present system fails this.


What if they already are? I can think of a handful of newsletters that command a three-figure annual subscription price. The people who buy them must regard them as useful and of sufficient quality. It could be that the rabble of news sites drowning in ads are mostly schlock, and that their business model is one of attempting to monetize some of your least-attentive moments.


Perhaps. But maybe, if you add googleads, amazon adsystem, moatads, rubicon project, taboola, scorecardresearch, krdx etc [1], then you start wondering why users are using adblockers, so you add pagefair, and at that point you want to find out what works better so you add optimizely... maybe, just maybe, at that point, you're actually losing money because your page is so damn slow, rather than getting more because of your efforts to perfectly monetise it.

[1]: Just copied them from coldpie's news site screenshot comment. That was about half of them; there were many more.


I remember when I was young, television was sponsored by advertisers: one of my favourite shows was made possible because of advertisers. To bemoan advertisers, in my mind, is to say that part of my childhood should never have existed, so I think greater, more focused criticism is required.

Do you ever wonder why the landscape became the way it did?


If a TV show were five minutes of content, and 55 minutes of advertising, I would stop watching it. And yet, that's the approximate breakdown for many websites between content and advertising. And they wonder why readers complain.


They know why readers complain. They wish you understood their perspective better.

A significant problem is that content costs a certain amount of money to be produced, and web content is unable to command those prices.

Ad fraud is a big part of it, and some of the companies in the best position to solve it (like Google) are benefitting so handsomely from ad fraud that I can't imagine them stopping it.

Ad blockers will hurt legitimate content producers, but because people defrauding advertisers don't use ad blockers, they'll continue to make money.


I'm sorry, but watching an ad isn't just a payment like you'd make with money. It's a rape of your mind. It needs your personal data to rape your brain deeper. So I block ads, not to see content, but because ads don't deserve to exist.


A rape of your mind?

Wow. Well, I'm glad you have your feet anchored down here in reality and aren't exaggerating at all.


> It's a rape of your mind.

This is perhaps the most ridiculous comment I have read this afternoon.


Read more about ads: how our brains perceive them, how they work, and how they are designed to bypass our consciousness. Next, understand the concept of metaphor. And then come back... or not.


There's actually a lot of momentum on this within the IAB (the standards body for online advertising). The goal is to separate content from analytics, which would enable publishers to prioritize loading the content before loading analytics.


Hmm, cool, didn't know they were working on it. That should help.


If it degrades the user experience so much that the page is unusable, it's not serving the intended purpose. Users will block the ads or not bother waiting for the page to load. Some ads/buttons/analytics code is worth it, but most content sites don't seem to think about the tradeoffs at all.



