
There are several objective measures for programming language "awfulness". The ones I've seen published the most in papers are:

- Time to identify bugs in code

- Defects in written software in fixed exercises

- Defects in written software in real code

- Time to develop a certain program

I can't say that measuring these is easy - but when we compare old vs. new languages on these metrics, we see change.
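As a sketch of what such a comparison might look like in practice, here is a minimal Python script that summarizes the kind of per-participant data a fixed-exercise study would collect. All the numbers, language names, and the `summarize` helper are hypothetical, invented purely for illustration:

```python
from statistics import mean

# Hypothetical per-participant results from a fixed-exercise study.
# Each entry: (defects found in submitted code, minutes to complete).
results = {
    "old_lang": [(7, 95), (5, 110), (9, 120), (6, 100)],
    "new_lang": [(3, 70), (4, 65), (2, 80), (5, 75)],
}

def summarize(samples):
    """Reduce a list of (defects, minutes) pairs to group means."""
    defects, minutes = zip(*samples)
    return {"mean_defects": mean(defects), "mean_minutes": mean(minutes)}

for lang, samples in results.items():
    print(lang, summarize(samples))
```

Real studies add statistical tests and control for participant experience, but even this toy version shows the two headline numbers (defect count, time to develop) that the papers report.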

One can claim something can't be "objectively good or bad" - in the Popper sense of science caring only about falsifiability, or in the Quine sense of theories "just predicting" - but that becomes a philosophical debate.

In my opinion, since languages are how we express ourselves, and since in many cases we write code in a _very_ roundabout way, languages are in fact objectively awful: the number of defects is huge, development time is long, and developer experience is something few people address.

The literature on developer experience and programming language API design is just starting to emerge, and we are still very early on.




And of course, much of the research on this is pretty weak - run on grad students and other biased samples, with methodology that isn't great - but it's better than nothing, and we should strive to improve it.

If anything, that just shows how bad things are (although I believe they really are improving).


> One can claim something can't be "objectively good or bad" - in the Popper sense of science caring only about falsifiability, or in the Quine sense of theories "just predicting" - but that becomes a philosophical debate.

Yes, one can claim such a thing. However, you don't have to debate it if you have the foresight not to make claims about the subjective qualities of the objective world.



