
Just because client-side rendering did not work well for Twitter, for a variety of reasons, does not mean it is inferior to server-side rendering in all cases. Performance-wise, if server_json_generate + json_download + client_render_json_to_html < server_html_generate + html_download + client_render_html, you can certainly go with client-side rendering.
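
To make the comparison concrete, here is a tiny TypeScript sketch that just adds up the three steps on each side. The step names mirror the terms above; the numbers in the example are illustrative placeholders, not measurements from Twitter or LinkedIn.

    // Total time to a usable page under each strategy, in milliseconds.
    interface ClientSideTimings {
      serverJsonGenerate: number;
      jsonDownload: number;
      clientRenderJsonToHtml: number;
    }

    interface ServerSideTimings {
      serverHtmlGenerate: number;
      htmlDownload: number;
      clientRenderHtml: number;
    }

    // True when the client-side path comes out faster overall.
    function clientSideWins(cs: ClientSideTimings, ss: ServerSideTimings): boolean {
      const clientTotal = cs.serverJsonGenerate + cs.jsonDownload + cs.clientRenderJsonToHtml;
      const serverTotal = ss.serverHtmlGenerate + ss.htmlDownload + ss.clientRenderHtml;
      return clientTotal < serverTotal;
    }

    // Illustrative numbers only: a heavy HTML download tips the balance
    // toward the client-side path.
    console.log(clientSideWins(
      { serverJsonGenerate: 40, jsonDownload: 60, clientRenderJsonToHtml: 30 },
      { serverHtmlGenerate: 50, htmlDownload: 150, clientRenderHtml: 10 },
    )); // true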

In Twitter's case, client-side rendering was slow while server-side rendering wasn't. In LinkedIn's case, html_download was slow while client_render_json_to_html wasn't. Performance aside, there are numerous reasons to go either way, all of which vary from project to project.

Having worked with CoffeeScript + Twitter Bootstrap + BackboneJS/Underscore + LessCSS for some time now, I know I will not willingly go back to the old server-side ways unless there is a major reason to. Building the UI fully client-side and relying on a REST API / JSON is a wonderful experience, especially when the app feels so responsive.
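
For a sense of what that workflow looks like, here is a minimal, framework-agnostic sketch of the client-side flow in TypeScript (rather than the CoffeeScript/Backbone stack named above): fetch JSON from a REST endpoint and build the HTML in the browser. The /api/posts endpoint and the Post fields are made up for illustration.

    interface Post {
      title: string;
      body: string;
    }

    // Fetch JSON from a (hypothetical) REST endpoint and render it client-side.
    async function renderPosts(container: HTMLElement): Promise<void> {
      const response = await fetch("/api/posts");
      const posts: Post[] = await response.json();

      // Real code should HTML-escape user-supplied fields before injecting them.
      container.innerHTML = posts
        .map(post => `<article><h2>${post.title}</h2><p>${post.body}</p></article>`)
        .join("");
    }

    renderPosts(document.getElementById("posts")!).catch(console.error);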




What about SEO? I've noticed that a lot of these client-side-rendered sites start with very little content in the HTML. Doesn't that mean search engines can't read the content?


According to this other LinkedIn blog post, http://engineering.linkedin.com/frontend/leaving-jsps-dust-m... (look for SEO in the comments), they detect cases where the client can't or won't run JS and perform the rendering server-side.


How do they do that? I can't really figure out anything apart from checking the user agent (for web crawlers?).


You can simply check the Accept header.
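
Roughly like this: an Express-style sketch in TypeScript that routes on the Accept header and the user agent, assuming the server keeps a server-rendered path around as a fallback. The route, crawler regex, and helper functions are hypothetical stand-ins, not LinkedIn's actual code.

    import express from "express";

    const app = express();
    const CRAWLER_UA = /googlebot|bingbot|baiduspider/i;

    // Hypothetical stand-ins for real server-side rendering and data loading.
    function renderProfileHtml(id: string): string {
      return `<html><body><h1>Profile ${id}</h1></body></html>`;
    }
    function loadProfileJson(id: string): object {
      return { id };
    }

    app.get("/profile/:id", (req, res) => {
      const acceptsJson = (req.headers.accept ?? "").includes("application/json");
      const isCrawler = CRAWLER_UA.test(req.headers["user-agent"] ?? "");

      if (isCrawler || !acceptsJson) {
        // Crawlers and plain page loads get fully rendered HTML from the server.
        res.send(renderProfileHtml(req.params.id));
      } else {
        // The client-side app asked for JSON and renders the HTML itself.
        res.json(loadProfileJson(req.params.id));
      }
    });

    app.listen(3000);

Note that a normal browser page load also sends Accept: text/html, so in this sketch the first load is server-rendered and the client-side app's subsequent requests (which send Accept: application/json) get JSON.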


Can we assume that degradability isn't a concern for you? Accessibility?


Degradability is not a concern. Accessibility is, and client-side rendering does not mean no accessibility.



