
In 2007 I was writing websites that were mostly rendered on a server, with JavaScript used selectively in some places for interactivity.

I remember writing at the time that moving everything to the client was solving one problem (poor encapsulation of JavaScript-powered front-end components) with another (needlessly rendering everything on the client).

The industry moved toward CSR anyway, and I had to learn it to continue having a job. And now, here we are.

Why do we do this to ourselves as an industry?




Because this isn't the same as what you were writing in 2007. This is an improvement on that. There is some degree of chasing what is new, but I think it's more that all of this is a learning process, one that causes the focus to oscillate between different extremes.


I'm guessing it's a bit of a pendulum; this is seen in so many areas of human society. We move in a direction, someone says "we ought to move in the complete opposite way," and so we do. Then after some time it becomes clear the new approach isn't working out, and we swing back to the other side of the pendulum.

With time, of course, it is likely that we end up in an equilibrium at the middle. In reality, we don't know what we like and what we don't like until we experience it.


While I know it's not trendy, I'm still building plenty of both informational websites and web applications using Umbraco/.NET, with progressive enhancement. On occasion a client will have a requirement that calls for CSR (such as adding an unlimited number of "rows" to a virtual table, where a round-trip would prevent them from getting work done in a timely fashion), but for the most part the server-rendered websites and applications I build are very fast. I still use front-end tooling for minification, bundling, and transpilation, along with caching, to get fast load and execution times.
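The progressive-enhancement pattern described above can be sketched roughly as follows. This is a hypothetical illustration, not the commenter's actual code: the server-rendered form works on its own, and a small feature check (sometimes called a "cuts the mustard" test) decides whether the script upgrades it to avoid the full-page round-trip.

```javascript
// Feature check for progressive enhancement: only enhance when the
// environment supports what the script needs. `env` stands in for the
// browser's global object (i.e. you would pass `window` in a real page).
function shouldEnhance(env) {
  return typeof env.fetch === 'function' &&
         typeof env.addEventListener === 'function';
}

// In a browser, the enhancement itself might look like this (hypothetical
// selector and endpoint; the form still submits normally without JS):
//
// if (shouldEnhance(window)) {
//   document.querySelector('form.add-row').addEventListener('submit', e => {
//     e.preventDefault();                 // skip the full-page round-trip
//     fetch(e.target.action, { method: 'POST', body: new FormData(e.target) })
//       .then(res => res.text())
//       .then(html => { /* patch the new row into the table in place */ });
//   });
// }
```

The point of the check is that a browser failing it simply gets the working server-rendered behavior, rather than a broken page.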


As posted in a similar HN thread: thesis, antithesis, synthesis.

An idea comes up. It solves problems, but it also has shortcomings. An antithesis that fixes those shortcomings comes up and is adopted. It turns out to have shortcomings of its own. The synthesis merges the two together.

And repeat.

(these are the ideas from some thinker whose name I forget)


Hegel


I just thought of a possible reason. Maybe it's because the internet is generally much faster and lower-latency these days? So rendering on a server can provide a good UX today, but it couldn't in 2007. Was "the edge" such a prominent concept back then? I honestly don't know if this is a good explanation; I have no data at hand to prove it, but I think it makes sense.


The "edge" didn't really exist at the time, nor did concepts like cloud and serverless. I seem to remember that even the CDN was still an evolving architectural idea back then.

AWS et al had not yet turned web servers into a commodity, so it wasn't feasible to "just deploy the software to multiple regions" to improve latency.


Indeed, I love how the front-end world is coming full circle back to server-side. It's like how all the cool kids are dressing like it's the 1980s/90s.


Because OOH LOOK A SHINY NEW FRAMEWORK!



