> What we really need is a proper study, but such studies are notoriously hard.

Here you go: "A large-scale study of programming languages and code quality in GitHub" [0]. I've only read the abstract (is it OK to share Sci-Hub links? [1]). Spoiler alert: according to this research, some languages are more defect-prone than others, but the effect is small.

[0] https://cacm.acm.org/magazines/2017/10/221326-a-large-scale-...

[1] sci-hub.se/10.1145/3126905

I'm aware of this study, and a couple of others. They all look at only one area of productivity, though, because that's the data that's available. In this particular case, there's no indication of how much development time was required to achieve that level of bugginess. Maybe in practice all projects do enough work to reach an average level of reliability, but then what's interesting is how much developer time is required to get there.

It's a total cost analysis that's interesting, and there are a lot of costs: the cost of development (and how that varies with the required time to market), the cost of bugs (fixing them, plus sales lost to customer reliability concerns), and the cost of complexity as the software is maintained over time.
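To make that concrete, here's a back-of-the-envelope sketch of such a total cost model in Python. Every number and variable name is made up for illustration; the point is just that the defect rate is only one term among several:

    # Hypothetical total-cost-of-ownership model; all figures invented.
    dev_hours = 2000            # development time to reach "average" reliability
    hourly_rate = 100           # fully loaded cost per developer-hour
    bug_count = 50              # defects shipped over the product's life
    cost_per_bug = 1500         # triage + fix + regression test, per defect
    lost_sales_per_bug = 500    # revenue lost to reliability concerns, per defect
    maint_hours_per_year = 400  # ongoing complexity/maintenance burden
    years_maintained = 5

    total_cost = (
        dev_hours * hourly_rate                                   # development
        + bug_count * (cost_per_bug + lost_sales_per_bug)         # bugs
        + maint_hours_per_year * years_maintained * hourly_rate   # maintenance
    )
    print(f"Total cost of ownership: ${total_cost:,}")  # -> $500,000

A language that halves the bug term but doubles the development term can easily come out worse overall, which is why the studies that only measure defects can't settle the question.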


> but then what's interesting is how much developer time is required to get there.

Yeah, good point. I read an old study where they tried to measure the "productivity" offered by different languages, but then it became difficult to assess the experience of the different developers...
