Hacker News

Lessons already learned by older engineers (who went through similar woes with other languages/tools) are being re-learned again and again.

If only that were true! I think the JS (and more generally web development) ecosystem today is more a case of those who do not learn from history being doomed to repeat it. The thing is, if you actually are Google or Facebook or Microsoft or Mozilla or Apple, you can throw huge amounts of resources at problems and produce sufficiently useful solutions regardless, and then you can do the same again a few months later if you need to. These are the kinds of organisations that set the tone for the whole community because of their sheer size and influence. However, their goals are not necessarily aligned with the rest of the web development community, nor will their approach necessarily work well for someone else.




I think I didn't express myself correctly. Younger engineers are ignoring lessons learned and are thus forced to learn them again :)


Ah, I see. In that case, I think we agree!


I think you're both saying something similar, but there's a "why" in there to connect:

> Younger engineers are ignoring lessons learned

Why are they ignoring them? Beyond the 'youth' factor directly, the signaling from the larger companies mentioned (Google, etc.) is that those lessons don't necessarily matter, and... look, Google did XYZ, you can too (ignoring that they threw potentially 3-5x as many people at problem XYZ as you even have, let alone could justify assigning to it).

Maybe neither of you meant that, but that was the connection I just saw between your two comments.


Yes, that's basically what I meant. I think some of the highly visible projects from tech giants today are succeeding despite their approach to software development, not necessarily because of it, but a lot of the new developers coming into the industry lack the experience to realise that and think of these projects as examples of how things should be done.

Every time I see some young developer posting on HN about how the code we write today only has to work for a year or maybe two at most, or how 10,000 lines of code is a large program, or how some high-profile company's web site that is still fundamentally just forms and tables and an API for talking to a database is a complicated UI, I weep a little for the future of our industry and our lack of ambition. Of course things like performance and stability and standardisation and portability don't matter very much if all you ever do is hop from one small throwaway project to the next every few months.

I think the culture around HN is unfortunate in this respect, because from a commercial point of view, it is an alluring idea to build some sort of trivial MVP, try and get crazy amounts of funding, and then if (and only if) you succeed, to throw it all out and start over. As a business strategy for building the next WhatsApp or Instagram, that's rational and perhaps quite effective. But it also contributes to the everything-is-expendable mindset that plagues our industry, and since the overwhelming majority of software development projects don't have the luxury of starting over every five minutes, I think it breeds a lack of respect for professional programming and the skills required to build good software without the benefit of an effectively infinite funding source from benevolent investors (or from the goose that keeps laying golden eggs elsewhere in the organisation).

That might be OK for Facebook or Google, but I wonder how many other startups that fail could still have been very successful, albeit not in unicorn territory, if their developers had paid more attention to some of these issues. I wonder how much time humanity collectively wastes today just because software quality and reliability aren't taken seriously by too much of the industry.


maybe i haven't read this thread closely enough ... the parent to this post mentions "lessons of the past" and its parent mentions typed languages, so uh, what lessons are being discussed here? certainly not "the past teaches that typed languages are best," b/c lisp


For what it's worth, I don't think Lisp is a very convincing example of how useful a language with a limited type system can be, because to a first approximation no-one uses it.

But more generally, the modern web development world is only just waking up to the idea that if you want to build software sensibly beyond a very small scale, you need tools for modularity and composition. Separation of concerns is important to preserve developer sanity and keep maintenance manageable. It is helpful to have well-defined, stable interfaces, whatever the current underlying implementation. Having an expressive language is good, but having an expressive language built on sound foundations, with consistent patterns and clean semantics, is much better.

Standards for protocols and formats are important. Performance matters, in many different ways. Quality matters. Portability and standards compliance matter. Longevity and stability and backward compatibility matter. Having a few good tools that work well and offer substantial benefits is much more valuable than having a million tiny things that change or become unsupported within weeks.

Frameworks can be useful for getting going quickly, but you lose flexibility down the line if your requirements evolve beyond what the framework was designed to support, and that can hurt. Libraries offer more flexibility, but you often incur overheads converting to and from their conventions, and those overheads need to be reasonable if the library is going to be useful. Reuse is good, but not always the best choice.
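To make the point about stable interfaces concrete, here is a minimal sketch (all names here are hypothetical, invented for illustration): callers depend only on an interface, so the underlying implementation can be swapped out without touching them.

```typescript
// Hypothetical sketch: a stable interface for a user store. Callers are
// written against the interface, never against a concrete class, so the
// storage backend can change without any caller code changing.
interface UserStore {
  save(id: string, name: string): void;
  find(id: string): string | undefined;
}

// Today's implementation happens to be an in-memory map; tomorrow it
// could be a database client, behind the same interface.
class InMemoryUserStore implements UserStore {
  private users = new Map<string, string>();
  save(id: string, name: string): void {
    this.users.set(id, name);
  }
  find(id: string): string | undefined {
    return this.users.get(id);
  }
}

// This function only knows about UserStore, not InMemoryUserStore.
function greet(store: UserStore, id: string): string {
  const name = store.find(id);
  return name === undefined ? "Hello, stranger" : `Hello, ${name}`;
}

const store: UserStore = new InMemoryUserStore();
store.save("1", "Ada");
console.log(greet(store, "1")); // → "Hello, Ada"
```

The point is not this particular pattern but the separation it buys: the interface is the contract that stays put while implementations churn.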

These and many other lessons in real world software development were brought to you by the words "developers older than", the number 30, and experience of programming outside the JS bubble. :-)


> to a first approximation

late reply, but something occurred to me to say more as a (hopefully funny) witticism than an actual talking point:

if we approximate usage as a polynomial sum with coefficients given by derivatives with respect to time (a Taylor series, I think, is the word) then one might estimate lisp to be a very much used language
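Spelling the quip out as a sketch (with $u(t)$ as a hypothetical "usage of Lisp" function, purely for illustration): "to a first approximation" keeps only the leading term of the expansion, while Lisp's influence on later languages lives in the derivative terms that the approximation throws away.

```latex
u(t) \;=\; \underbrace{u(t_0)}_{\text{first approximation: } \approx\, 0}
\;+\; u'(t_0)\,(t - t_0)
\;+\; \frac{u''(t_0)}{2!}\,(t - t_0)^2
\;+\; \cdots
```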


I'm mostly lamenting the overall lack of progress in creating reliable, bug-free software.

You would never let a software engineer build a house, would you? For some reason, the discipline of architecting and building houses is far more advanced than that of building software.

Why is that? We keep inventing new programming languages to tackle various aspects of making software better (safe concurrency, preventing memory leaks, etc.), but we keep introducing 5 bugs per function point, and the rate at which we find and remove bugs has stayed pretty much the same over the last 20 years.

There needs to be a fundamental change in how we create software.



