The viability of JavaScript frameworks on mobile (joreteg.com)
195 points by joeyespo on Oct 20, 2015 | hide | past | favorite | 80 comments



This is good. Many frameworks are just too freaking huge for mobile, but even when you use tiny frameworks, if you dump a bunch of images into them you're going to have similar issues (at least on load).

Honestly, using the DOM API isn't all that hard. Yes, it's awkward, verbose, and sometimes cumbersome, but it's still pretty straightforward. I'm actually really liking React lately, but if I want something done well for mobile I almost never use a framework.

Another thing this article didn't touch on was latency. When I was doing work for vehicles that had poor internet access via satellite every single http call just killed the load (this included fetching css, javascript, etc). I can't stress enough how much better your page can load if you combine as much stuff as possible, even images if you can display them as backgrounds.


One significant bonus in drastically reducing javascript is improved battery life.

> Another thing this article didn't touch on was latency. When I was doing work for vehicles that had poor internet access via satellite every single http call just killed the load (this included fetching css, javascript, etc). I can't stress enough how much better your page can load if you combine as much stuff as possible, even images if you can display them as backgrounds.

This, so much this. Particularly with the growing marketplace outside of first-world countries. Fewer calls of any type are of significant value. Reduce DNS calls, host everything from the same domain where possible, etc.

Use sites like https://developers.google.com/speed/pagespeed/insights/ to help you


This is heartening. How do you deal with cross-browser differences, especially considering you're supporting older mobile devices?


If you don't need to support IE8 (which is at 1.38% global market share according to CanIUse), then cross-browser differences just aren't the problem they were in the 1995-2010 era. The event model, DOM querying, classList, CSS animations & transitions, fast GPU rendering via CSS transforms, box sizing, history management, localStorage, ES5, JSON parsing, geolocation, websockets, all of these are well-supported by standardized native APIs on all major browsers.

http://caniuse.com/#search=event

http://caniuse.com/#search=querySelector

http://caniuse.com/#feat=classlist

http://caniuse.com/#search=animation

http://caniuse.com/#search=transform

http://caniuse.com/#feat=css3-boxsizing

http://caniuse.com/#search=history

http://caniuse.com/#feat=namevalue-storage

http://caniuse.com/#search=es5

http://caniuse.com/#feat=json

http://caniuse.com/#feat=geolocation

http://caniuse.com/#feat=websockets

Even things like the template tag have reasonable support. It's only when you get into bleeding-edge stuff like webcomponents and web animations that browser support starts to drop off, and unless you're using Polymer (which polyfills all that), that's unlikely to matter.


You deal with them one at a time. It's not like there's magic involved.

Our SPA isn't even using jQuery. Load times are fast; the app is very fluid.

Many people seem not to understand that when you query the DOM for a node, you can store that node and reuse it throughout your app.
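A minimal sketch of that idea (the helper name and the mock `root` are mine, purely for illustration): cache looked-up nodes so repeated queries never hit the DOM again.

```javascript
// Cache DOM lookups so each selector only hits the DOM once.
// `root` is whatever exposes querySelector (the document in a browser).
function makeCachedQuery(root) {
  const cache = new Map();
  return function query(selector) {
    if (!cache.has(selector)) {
      cache.set(selector, root.querySelector(selector));
    }
    return cache.get(selector);
  };
}

// Usage in a browser:
//   const $q = makeCachedQuery(document);
//   const header = $q('#header');  // queried once, reused afterwards
```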


This is where small frameworks like Mithril shine.

The site says 12KB for some reason, but minified and gzipped, the latest version is closer to 7 or 8KB IIRC.

http://mithril.js.org/


Just came back to mention that two of the most well-known Mithril-based projects, Flarum.org[1] and Lichess.org[2], are open source and have mobile UIs that you can try out, so if anyone's interested in seeing non-trivial codebases that use the "lighter approach" alluded to in this article, there you go.

[1](https://github.com/flarum/flarum)

[2](https://github.com/veloce/lichobile)


http://flarum.org/ is especially relevant since it is a competitor to Atwood's Discourse which sparked the current discussion.

http://lichess.org/ is an [f|F]ree online chess platform.


Mithril is amazing. I've been using it for over a year and only like it more and more. Its fast initial load time is very relevant in today's world of mobile web apps.


I like riot.js: a React-like library weighing in at ~3.5KB min+gzip.

http://riotjs.com/

Browserify gives you modules. This technology can be used to structure the code & package only the bare essential functionality.

http://browserify.org/


Riot looks a whole lot like Polymer.


They compare it to polymer in the docs - much smaller, for one.


I'm not saying Mithril is bad, but code size is no indicator of performance. Counterexample: the minified Box2D code is ~760KB and impressively performant.


It is an indicator of load time. Most users bounce off pages that take longer than 3 secs to load.

Mithril isn't the fastest of the virtual DOM libraries out there, but it is usually good enough, even on mobile.


Usually latency due to dependent requests is your killer on mobile. Even EDGE throughput wasn't that bad once it finally got going.


Fixed :)


With Microcosm we consciously chose to go the other direction: Do the work server-side and serve HTML.

Basically... the old way of doing things.

An example site https://www.lfgss.com/ is 280KB on first load (and most of that is Mozilla Persona) and subsequent requests are usually around 10KB.

Even though that company is now dead, I'm still working on it and aim to strip out Persona (it's deprecated) and to set a first load goal of 100KB (which should be easy to achieve).

People should feel the power of their devices, and the only way to do that is to have those devices do less, not more.


Persona is not deprecated. It was moved to "community ownership", which no more makes it "deprecated" than the average open source project is deprecated. It's a bit like saying that if Red Hat stopped paying its developers to work on the Linux kernel, then Linux itself must be deprecated.

I wish more sites would use Persona and so I'm heartily disappointed to hear someone expecting to strip it out of an existing usage.

(Aside, I gave a presentation on the subject, for what that is worth. http://blog.worldmaker.net/2015/05/13/mozilla-persona-talk/)


I'd love to continue using it... but I don't want to run an instance; I want to point at someone else's instance. If I have to run my own instance then I need to know I can maintain it and keep it secure, and I am not a Node guy, and I find the code and all of its dependencies to be too much for me to say that I am the man to keep an instance secure and maintained.

The thing is that I want it to perform as well as my application. Just look at Persona's portion of blame for the time, transferred bytes, transfer waterfall, connections: http://www.webpagetest.org/result/151020_X5_17EP/ (and that's with connection preconnects working... older browsers have it worse).

Persona is the biggest performance hit on my web app, and it's holding me back and I do not wish to run my own instance (just to put it behind CloudFlare and make the whole thing fly).

Whilst performance hurts, and it does hurt, and whilst I live under the shadow of "Mozilla aren't owning this and pushing it forward"... I'm very seriously thinking of using https://github.com/go-authboss/authboss and making a centralised front-end for it so that I can achieve a very similar thing in a simplified way that I can support and maintain.

The key thing though... I need performance from it. I don't have that today. I have a web account... but it needs to perform and I've seen no improvements on that front in the entire time I've used it.

PS: I even checked Auth0 and others, but I'm doing so many logins per month that it would cost 100x the costs to run my entire platform to use any of the paid services.


I had some success in loading Mozilla's JS library asynchronously (using RequireJS in that particular application) after initial render/DOM load. That certainly mitigates some of the performance issues. (You'd notice a bit of slowness in the UI picking up that you were already logged in on page load, but that is more and more common on the internet these days for many of the "SSO" authentication systems, so I don't consider it much of a problem.)

Also, from your waterfall you've posted here, most of the Persona stuff is happening in background/parallel anyway: the big slowness in getting to render start certainly appears to me to be your fonts more so than Persona, which is what I would expect to see. That is, I'm wondering if you are scapegoating a bit here as the impression that I get looking at your waterfall is that you won't gain as many milliseconds as you might think by removing Persona.


In terms of interactivity this probably only works for the most basic stuff. Also, only for the apps that need to be "always connected" anyway.


It's a forum, the example given in the article is also a forum. They are directly comparable.


Rule #1 for optimisation: "You can't make code run faster. You can only make it do less." It's true for mobile devices as much as everything else.


Fortunately a lot of what these libraries do is spin bloated wheels so there is a lot of opportunity to do less without losing features. This is why I write mobile web apps with vanilla JS only. It takes a little more time up front, but the result is vastly superior to what any library can offer.


Have you tried using server-side rendering as a viable alternative that lets you use frameworks like React?


Or you can make it do more in the same time frame.


The article is interesting but the Atwood article was mainly about how badly Android javascript performs vs iOS. If you have a 5x difference in performance between platforms it will be impossible to maintain feature parity.


It's relevant; iOS may be faster but the user experience still sucks when web devs rely on heavy JS frameworks. Rather than bemoan how much JS sucks on Android, devs should realize that JS sucks everywhere and use it as sparingly as possible.

This is especially relevant since the fragmentation on Android means this problem won't be going away soon. Even if Google fixed its JS performance tomorrow, so many phone manufacturers bundle their own browser that it will be a long time before those users with "old, slow" JS on Android go away. I don't know how the JS engine is bundled on Android, but if it's bundled with the core OS then the vast majority of phones would just never get updated. Developers have to deal with that, so unless you're ok with excluding a large percentage of mobile devices on the street, dealing with slow JS performance is simply a fact of life on mobile.


It's not JS that sucks; V8 performance is actually pretty competitive with Dalvik, if not faster. It's two other things:

1. There is a cost to download frameworks, and most webdevs ignore that cost. A native app has the Android class libraries already on the device, oftentimes already loaded into memory. A webapp that pulls in 600K of JS code + framework has to read that all over the network, parse it, and JIT it before execution can even begin.

2. The native Android GUI frameworks render views on the GPU. Chromium will only render views on the GPU if you have a CSS transform or opacity property set. If you're not very careful, you can easily trigger an expensive layout + repaint calculation on every frame.
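A hedged sketch of the common workaround for point 2 (the helper name is mine): give an element a no-op transform so Chromium promotes it to its own GPU-composited layer, keeping later transform/opacity animations off the layout + repaint path.

```javascript
// Promote an element to its own compositor layer. A no-op 3D transform
// (or `will-change: transform`, where supported) is the usual hint;
// animating only `transform` and `opacity` afterwards avoids triggering
// layout and repaint on every frame.
function promoteToLayer(el) {
  el.style.transform = 'translateZ(0)';
  el.style.willChange = 'transform';
  return el;
}
```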

It's possible to write webapps that perform just as well as native, with fancy animations and fluid user experiences. I had some existence prototypes when I was still at Google, and some of the research for that went into the current Google Search app experience on Lollipop. But it was a huge pain as far as the developer experience went, and a lot gets lost in translation when productionizing. It basically involved treating the browser as an OpenGL canvas, where certain combinations of CSS + HTML would render text and boxes into a texture and then other combinations would let you move textures around and fade them together in the window.


JS does indeed suck on mobile; since it's single-threaded by design, all those blocking operations (you already mentioned some examples) add up. That's where both issues you mentioned come from: those refreshes are only necessary because the single-threaded nature of JS causes a few dozen event handlers in your framework to fire on every change you make. It also doesn't help that most mobile devices aren't exactly pushing the envelope on single-threaded performance.

And let me be clear: JS being single-threaded isn't a bad thing in itself (the thought of a multi-threaded language in the browser gives me nightmares). It just makes the user experience of JS suck on mobile. And that's not going to change any time soon given the glacial pace of patch/update deployment on Android.

> It basically involved treating the browser as an OpenGL canvas, where certain combinations of CSS + HTML would render text and boxes into a texture and then other combinations would let you move textures around and fade them together in the window.

That approach -- while interesting from a technical standpoint -- kind of defeats the purpose of a browser :) I mean, you're basically building another browser inside the OpenGL window...


Native apps are also effectively single-threaded: all of your code, by default, runs on the UI thread, and you need to use AsyncTask or explicitly spawn a Thread to do work in the background. If you need this functionality on the web, there's Web Workers, so it's functionally the same story.

CSS transitions/animations, BTW, run on a separate thread in Chromium. This is the primary difference between them and requestAnimationFrame, which always runs on the main thread.


You can add your blocking JS code to a Web Worker, which will execute on a background thread. This keeps your UI responsive and your number crunching out of your users' perception.

You can learn more about workers at https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers...


When browsers support stale-while-revalidate we'll have the opportunity to make web apps update in the background the way native apps do.

Weight still matters; I just thought this worth mentioning.
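For reference, stale-while-revalidate is expressed as a Cache-Control extension (RFC 5861). A response header like the sketch below tells the browser it may serve the cached copy immediately and refresh it in the background:

```http
Cache-Control: max-age=600, stale-while-revalidate=86400
```

Here the response is fresh for 10 minutes, and for a day after that it can still be served stale while a background revalidation fetches the update.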


Why doesn't Chromium render more elements on the GPU?


They had (have?) a project to do that:

https://www.chromium.org/developers/design-documents/gpu-acc...

http://www.androidpolice.com/2014/11/19/project-ganesh-demoe...

I don't know what the status is of it. It'll help paint times (probably significantly), but it can't do much for layout, because the way the CSS spec is written, certain CSS properties require global calculations to figure out where every box should be laid out. (Think about floats, where they're supposed to butt up against other floated boxes, even if they don't have the same parent.)


Because rendering on the GPU isn't a panacea. On mobile platforms, it can have negative implications on battery life and performance, and given the sequential-dependent nature of CSS rendering, I'm not sure you'd even get the benefit of parallelization.

In many cases, you'd end up waiting for the main thread most of the time anyway (which you'd probably also get if you just spawned a few threads on another CPU core). It also adds a hell of a lot of complexity, since SoCs have varying levels of capability in their GPUs, which your code now needs to identify, integrate with, and test against.

I'm not claiming there's no benefit, but there's a reason they don't just GPU all the things.


I don't quite get the Atwood article: in both of the benchmarks he shows, Kraken and Octane, the Android device scores better than any iOS device when using the stock browser. It's only when using TMOUS that they do worse, so why does anybody use that?


TMOUS = T-Mobile??? Or something else?


I think that's the key, from the Atwood article:

"In a nutshell, the fastest known Android device available today -- and there are millions of Android devices much slower than that out there -- performs 5× slower than a new iPhone 6s, and a little worse than a 2012 era iPhone 5 in Ember. How depressing."


Most of the performance issues people see on mobile (and desktop!) with JavaScript aren't with JavaScript performance itself anymore (unless you're doing some crazy stuff), but with the relatively expensive repaint/reflow browser operations.


This point is often missed. It isn't JavaScript that's slow; it's the DOM and all its overhead and cruft that's slow. Or it's the connection that's slow. An important distinction.


I think a lot of developers wanted mobile web performance to catch up with desktop and native mobile so badly that we believed it would happen, and much faster than was ever really likely. We're used to things getting better quickly when it comes to technology.

Flagship phones improve things a little each year but, to paraphrase William Gibson, the future is here, it's just unevenly distributed.


> We're used to things getting better quickly when it comes to technology.

It doesn't help that some influential people in the software industry, like Joel Spolsky, told us to bet on the hardware improving fast. See for example:

http://www.joelonsoftware.com/items/2007/09/18.html

Particularly this part:

> a couple of companies, including Microsoft and Apple, noticed (just a little bit sooner than anyone else) that Moore’s Law meant that they shouldn’t think too hard about performance and memory usage… just build cool stuff, and wait for the hardware to catch up. Microsoft first shipped Excel for Windows when 80386s were too expensive to buy, but they were patient. Within a couple of years, the 80386SX came out, and anybody who could afford a $1500 clone could run Excel.

By contrast, he describes Lotus optimizing 1-2-3 so it could run in 640K of memory. Who would want to be today's equivalent of Lotus? So just pump out features and wait for the hardware to catch up, right?


Blame the language and not the platform is what you're essentially saying. OK, so the initial payload can be, what, 600KB per your argument. But at least once I have delivered that, my actual interactions with the server side can be kept to a minimum, instead of re-rendering more HTML and sending it across the wire to the handset.

That being said, yes, you must develop on a crappy computer if you want to be a dev known for making apps snappy. Is there a lot of fat one can cut from these frameworks? Yes... but I think Atwood's initial argument is that Android just sucks with JavaScript. The many cores, single thread per tab just makes the experience anemic with rich web applications.


> Blame the language and not the platform is what you're essentially saying. Ok so initial payload can be what 600kb per your argument. But at least once I have delivered that my actual interactions on the server side can be somewhat a minimum instead of re-rendering more HTML and sending across the wire to the handset.

Unless your mobile browser aggressively purges its cache due to the limited memory on most mobile devices. Then you're going to have to reload the payload every time you bring the page to the foreground; or at a minimum reload the framework (which usually takes a noticeable amount of time as the framework creates a bunch of temporary data structures).


Just to clarify, I'm not suggesting we should do full page refreshes, or that we shouldn't be doing single-page apps.

I'm all for minimal server-side interactions once loaded too.


I think this post on mobile performance is relevant: http://sealedabstract.com/rants/why-mobile-web-apps-are-slow...


Or you could just write regular JavaScript. Browsers seem to be pretty good at handling JavaScript, CSS, and HTML. The default size of vanilla JS is actually 0KB, and that's not even gzipped.


And after reaching a certain level of complexity, what you have is an ad hoc framework of about the same size that is a slow, bug-ridden implementation of half of one of the ones in the OP.


That really depends on the work. There are a lot of APIs in the browser these days that people are not using. For example, there are people adding jQuery to a project just to have the selectors, when querySelector and querySelectorAll have been available for some time.

With good care, you can create libraries suited to your problem and keep code size and complexity down. Don't forget that when adding frameworks you're also adding their complexity to your project. Imagine having to debug the internals of Angular... I shiver just to think of it.

The thing is that using frameworks is cool up to the moment you hit a framework quirk or bug and need to dive into it as well; then you're no longer working in your problem's domain but in generic framework land, and that can be much trickier than building your own, especially if it is a solution made for a single problem.

But yes, I agree with you that if complexity grows enough you start requiring a special type of developer with high skills to keep this bespoke code running well.


That has not been my experience, but if you had a large enough engineering team, lots of churn, and bad management / development practices I could definitely see that developing into an issue.


I know GWT isn't the most popular framework (it isn't really a framework, is it?), but I've been a fan of it for a while. It solves three critical problems that have plagued web development:

1. Cross-browser compatibility.

2. Code performance.

3. Code organization (there are many JavaScript frameworks whose selling point have been making JavaScript more organized).

But in regard to this article, one often-overlooked feature of GWT is that dead code is automatically removed from the compiled JavaScript:

https://www.quora.com/How-fast-is-GWT-compared-to-JavaScript...

Thus you don't have to include the entire Angular/React/Ember/JQuery whatever library just to get a few features for your site.


Sure, but GWT == writing Java, downloading a heavy and complex SDK, and writing imperative UIs. Sure, it comes with widgets, but you can find the same widgets in JS.

Compare with React, AngularJS and co.: way more popular, declarative UIs, and if you need static typing you can use TypeScript, which needs no heavy runtime.

GWT might make sense for some LOB app for desktop but not for a mobile webpage.


Google Closure Compiler [1] can also perform dead code removal, but it requires extra work annotating all public APIs and naturally it can't cope with dynamic/indirect calls.

[1]: https://developers.google.com/closure/compiler/


So does UglifyJS. At least, obviously dead code (e.g., if(false) {} )


I'm not sure I understand, how can a framework "solve" code performance? Do you mean that it's faster than other frameworks?


The article mentions load performance, meaning the time it takes to download and set up the framework. GWT compiles Java to JavaScript and leaves out code that will never run. JavaScript frameworks load functionality in case it is required, even if you don't use it.


> JavaScript frameworks load functionality in case that functionality is required, even if you don't use it.

Unless you use something like webpack to segment your codebase.
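A sketch of what that segmentation looks like in code: bundlers like webpack turn each dynamic `import()` into its own chunk, fetched only when first needed. The `./chart.js` module is hypothetical; a data: URL stands in for it so the sketch is self-contained.

```javascript
// Lazy-load a module on demand; webpack emits it as a separate chunk.
// The data: URL below stands in for a real file like './chart.js'.
const loadChart = () =>
  import('data:text/javascript,export%20const%20renderChart=()=>42');
  // in a real app: import('./chart.js')

async function showChartOnDemand() {
  const { renderChart } = await loadChart(); // fetched here, once, then cached
  return renderChart();
}
```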


many javascript optimisers / minifiers do this


I think Ember is ahead of the curve on this one. Ember 2.0 introduced no new features, and instead only focused on stripping out deprecations, dead code, or platform specific stuff for browsers they don't support any more.

(Disclaimer: I'm heavily invested in Ember.)


The article showed Ember 1.9 coming in dead last because of code size. 2.0 is supposed to be a little better but do you really think it's going to be ahead of the curve? It seems pretty unlikely... I hope you'll say more.


> The article showed Ember 1.9 coming in dead last because of code size. 2.0 is supposed to be a little better but do you really think it's going to be ahead of the curve? It seems pretty unlikely... I hope you'll say more.

To be fair to Ember here - Ember includes Ember itself, Ember Data and jQuery in that payload. React is just React - you'll probably want to include Redux (2KB) or Alt (33KB) and then potentially a whatwg-fetch polyfill or superagent, normalize and other libs to (possibly) fill out some jQuery features (Zepto @ 9.1KB), if not jQuery itself.


Ember 2.0 came out after 1.13, so there's 4 versions between 1.9 and 2.0.

1.13 added a bunch of deprecation warnings for all the features they were going to strip out, and 2.0 only removed code previously marked as deprecated. So maybe not far ahead of the curve, but they're at least taking steps to reduce their footprint.


Web and mobile have many frameworks (JS/native) that can be used to solve similar problems. I am wondering how we can write code that can be re-written quickly on other frameworks as a metric in performance tests. I definitely think that developers should test their code/app on slower/older phones. A performance mindset creates better, more scalable tech, can save the company a lot of money in the long run, and can at the same time be a core part of the user experience.


Even the part of the Facebook mobile web app that has been developed in React, the people search, feels rather slow on all my mobile devices (incl. high-end iOS and Android devices).


IMHO it's still faster than most web apps on mobile. I use an old Alcatel with Android 2.3 to benchmark performance on low-end handsets, and Facebook's performance is OK compared to Gmail, or even Google search, which takes 3+ seconds to load just to display a search box. I think Facebook has managed to do an acceptable job if you compare their app to other web apps.

The truth is, the mobile web space is not as good as developers expected it to be in 2015, unless users visit websites with expensive devices, which most of them do not. Making a website responsive, while an improvement, doesn't automatically make it mobile friendly.


I am curious about how Discourse works. I see several calls to "https://github.com/discourse/discourse/commit/1061a9ed06c500... when scrolling the comments of this article. Anyone have an idea why they are repeating calls to GitHub?


I have been using T3 (http://t3js.org/); very small and very easy.


Would be interesting to see Meteor stats there too


I'd like to see Meteor comparisons as well. I used to develop with Meteor heavily but I was frustrated by the very poor performance on mobile. The intense javascript would sometimes even lock up browsers on older devices.


I've only tested Meteor briefly, but there is a pretty large initial download by default[1]. However, there are solutions using community libraries for pre-rendering and also hosting static assets to a CDN.

[1] https://forums.meteor.com/t/first-visit-loads-are-ridiculous...


The issue in the forum thread you're linking to was not caused by Meteor but by the OP canceling a deploy halfway through the process: https://forums.meteor.com/t/first-visit-loads-are-ridiculous...


How about Sencha Touch? I'm guessing it's a beast.


Sencha Touch is pretty fast. When I was doing Cordova development it was still the fastest JS framework when it comes to UI responsiveness. Does it make sense for developing mobile websites? I don't know. I think sticky headers, footers, transitions and stuff like that make very little sense for a website. What kind of mobile website would you develop with a mobile UI framework?


Maybe logging in to the admin side of one of my applications to check analytics, approve new memberships, and the like? I shouldn't need to package and distribute that as an app, but some of the mobile-friendly widgets are handy.


Kendo UI Mobile Gzipped: 136kB (not including jQuery). It is now called "Kendo UI Mobile" and owned by Telerik.

Heavy base library, so custom build only saved about 5% for us.

Good response rate to support issues, although sometimes valid issues are not fixed so we require custom workarounds to basic problems.

Great for POC, or pilot. Not so great for consumer apps using WebView IMHO.


Honestly, as an android-user, I just run my browser with JS disabled by default. My phone is much faster, and I don't really want 90% of the "features" the JS is trying to give me. Too many sites do it poorly, and it just ruins the experience.


JS frameworks don't have to be huge performance killers. I use Chaplin.js, and it's crazy lightweight and awesome even on my old iPhone 4. The user interactions are basically the speed of a native app, or even faster.


What's the status of asm.js in mobile? It would be nice if there was a front-end framework in some other language that could compile to asm.js.


very interesting



