
Not quite sure what point the author is trying to make? He agrees that lines of code are a bad measure of productivity, yet claims that the average he computes can help him predict future development times. Then he explicitly points out that different parts of the codebase required different amounts of work, apparently unrelated to their line count, yet doesn't relate this back to the earlier points. Also, what is up with the random comment about code coverage at the end? That doesn't fit in with the rest of the article either...
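
For what it's worth, the estimation the article seems to be gesturing at is just back-of-the-envelope arithmetic. A minimal sketch (the function name and all numbers are made up for illustration, not taken from the article):

    # Hypothetical sketch of a "LoC/day" time forecast.
    def estimate_days(historical_loc: int, historical_days: int, expected_loc: int) -> float:
        """Naively forecast development time from an average LoC-per-day rate."""
        loc_per_day = historical_loc / historical_days
        return expected_loc / loc_per_day

    # e.g. 30,000 lines over 200 working days -> 150 LoC/day;
    # a feature "expected" to be 3,000 lines is forecast at 20 days.
    print(estimate_days(30_000, 200, 3_000))  # 20.0

The obvious catch is the one raised further down the thread: you rarely know expected_loc in advance, and the lines needed vary wildly across parts of a codebase.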



I think they're stuck with the same problem as every dev manager. LoC are pointless/futile/etc., but they're really easy to measure, and they've got to measure something, right?

Like measuring productivity by how many hours people spend at their desks. Utterly pointless, but really easy, so it becomes the default measure.

Trying to explain to professional managers that there is no foolproof way of measuring developer productivity is a really hard conversation that I've had more than once. I'm assuming the OP's target market is exactly these people, so I don't really blame them for succumbing to the pressure.


It looks like the author is showcasing his product, hence the random screenshot of test coverage as well.


Even the assertion that knowing the lines of code per day helps with estimation seems puzzling. How do you know how many lines the finished product will eventually have in advance?


How do you know how many pixels the ball will take up in a sports photograph before capturing it?

Enough to see it, not so many that you can’t see everything else.

Same with lines of code, except that instead of a photographer, there is an observing project manager.


I am glad I am not the only one to scratch my head...



