Hacker News

In principle, 'agile' is so weakly defined that it is possible to follow a process that produces quality software and describe it as agile. In practice, insofar as agile stands for anything at all, one of its pillars is what might be called an 'active disinterest' in doing much thinking ahead, and in practice, it is very hard to produce quality software without doing some serious thinking ahead.



Is this really true?

I have the feeling "producing software" is a highly volatile process.

First, software can do almost anything. You have WhatsApp, DOOM, Photoshop, Ableton Live, Google, Linux etc., which are hugely different systems, and that was just the stuff that came to mind in ten seconds.

Second, the requirements change and change. One person thinking ahead and getting a brilliant idea may end up producing complete garbage, because many things have changed since they had the idea.

I think the reason it hasn't been formalized and regulated like, for example, building cars, is simply that it can't be.

Software isn't a car or a house or a ship.

Software is an abstraction layer above this. It's more like the accumulated orders needed to build these things.


> I think the reason it hasn't been formalized and regulated like, for example, building cars, is simply that it can't be.

NASA writes highly reliable software, as do the organizations producing the software for fly-by-wire airplanes, so your assertion is empirically false. You are extrapolating too far from your personal level of knowledge and experience.


> NASA writes highly reliable software, as do the organizations producing the software for fly-by-wire airplanes, so your assertion is empirically false.

What I find interesting is that both of your examples (and many others) are for software which controls physical systems, i.e. which is pretty well defined due to the nature of the machines it controls. Do you have comparable examples which aren't defined by the physical systems they interact with?


Operating systems/kernels as a whole largely deal with abstractions, and some are built to incredibly high, and in some cases mathematically proven, standards. seL4 is an example of a formally verified kernel.
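To make the seL4 point concrete: formal verification means stating a property alongside the code and having a proof checker reject the program until the proof goes through; seL4 does this in Isabelle/HOL for its whole kernel. A toy sketch of the idea in Lean 4 (the function and theorem here are invented for illustration and are nothing like seL4's actual proofs in scale):

```lean
-- A toy definition plus a machine-checked specification.
-- The file does not compile unless the proof is complete,
-- which is the core idea behind formally verified kernels.
def double (n : Nat) : Nat := n + n

theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
  simp only [double]  -- unfold the definition: goal is n + n = 2 * n
  omega               -- linear arithmetic over Nat closes it
```

seL4's proofs run to hundreds of thousands of lines and reach down to the C implementation, but the workflow is the same shape: the specification comes first and the implementation is held to it mechanically.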


So are you arguing that very well-defined and carefully thought-out requirements are, in practice, essential to the development of high-quality software?


That would be quite the no-brainer, wouldn't it?

All the dancing around waterfall, agile, or whatever revolves around the problem that people usually don't know what their requirements are, are not able to formulate what their requirements are, but hopefully can tell, when they see something, whether it helps them or not. If we could just get people to define their requirements in all the detail needed, we would take a great step forward, but looking around... I don't see this happening in the near future.


I.e., let's avoid the question of quality altogether by declaring it an unsolvable problem in the limit.


Sure they do, but how much of that is because of formalization?

I mean, just look at SQLite, which is highly reliable software. They bought that reliability with a gigantic test suite.
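The SQLite comparison is worth unpacking: its reliability comes less from an up-front formal spec than from tests that dwarf the source code itself. A minimal sketch of that style of invariant testing, using Python's built-in sqlite3 bindings (the table and test case here are made up for illustration):

```python
import sqlite3

def test_rollback_preserves_data():
    # SQLite's real test suite hammers on invariants like this one:
    # a failed transaction must leave the database untouched.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
    conn.execute("INSERT INTO t (v) VALUES ('kept')")
    conn.commit()
    try:
        with conn:  # opens a transaction; rolls back on exception
            conn.execute("INSERT INTO t (v) VALUES ('discarded')")
            raise RuntimeError("simulated failure mid-transaction")
    except RuntimeError:
        pass
    rows = [v for (v,) in conn.execute("SELECT v FROM t")]
    assert rows == ["kept"], rows

test_rollback_preserves_data()
```

The point of the grandparent's argument survives either way: this is quality bought by testing against behavior, not by formalizing requirements in advance.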


In what way do you think this supports your claims? What argument are you making here?



