
It's quite ridiculous for a number of reasons, well documented by research and experience: software engineers can't estimate how long something will take with any kind of accuracy.



But that's exactly why people start using things like t-shirt sizes: to emphasise the point that it's not a time estimate. It's a rough ordering of relative complexity of different tasks, which is something programmers can do.

Of course, the business still needs time estimates, so someone will somehow attempt to turn them into time estimates. But that can't be helped.


> It's a rough ordering of relative complexity of different tasks, which is something programmers can do.

Yeah, when put as an estimate of effort or complexity, we can be good at estimating that. But that isn't how it was put.

> so someone will somehow attempt to turn them into time estimates.

It works until it doesn't. I would estimate manually entering data as "pretty easy" but it won't be done in a day no matter how much you pay me. I can only type so fast. There are many tasks that are easy but take a really long time, and many complex tasks that take a very short amount of time.


I've always felt this is because estimation never gets treated as an exercise which might involve actual software engineering. You get handed a task you've never seen before, for a system you've never seen before, and asked "how long would implementing this take?"

You never get handed a task which is "write as much of a prototype of a system which would do this, so we can estimate how much more work we think is involved".

And then when you do have enough knowledge to reasonably estimate, people just declare with no evidence that it should be quicker anyway and then are surprised when it is not.


It's not just that, but also we tend to estimate in the context of "if I were sitting at a computer working on just this problem, this is how long it would take." The reality is that there are meetings, high priority bugs for unrelated systems, interruptions from the business, coworkers and life, code reviews for other team members, rediscovering what you were doing before being interrupted, etc.

Using time tracking, I was able to discover I only spend 2-3 hours per workday actually programming; the rest was all interruptions and such. Thus I can estimate that one day of programming really equals 3-4 workdays. Then my project manager throws in another 3-4x on top of that to deal with scope creep, rework, bug fixes, etc... and we're usually on-target ~50% of the time.
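A back-of-envelope version of that arithmetic (the 3.5x multipliers below are hypothetical midpoints of the 3-4x ranges mentioned above, not measured values):

```python
# Rough sketch of padding a "heads-down programming" estimate into
# calendar time. Both multipliers are hypothetical midpoints of the
# 3-4x ranges described above.

def calendar_days(focused_days, interruption_factor=3.5, overhead_factor=3.5):
    """Convert days of pure programming into rough calendar workdays."""
    # First padding: meetings, interruptions, context switching.
    # Second padding: scope creep, rework, bug fixes.
    return focused_days * interruption_factor * overhead_factor

print(calendar_days(1))  # 12.25 -- one "day" of coding is nearly two and a half calendar weeks
```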


When I was at Amazon, I read the SDE guidelines from HR, where they describe their view of the role.

An SDE1 was expected to spend 4hr coding a day; an SDE3 about 2.5hr coding a day.

That’s normal for a job, eg, apartment maintenance (my college job) would have us actually working about 4 hours a day, between setup, cleanup, breaks, travel, miscellaneous tasks, etc.

Convincing other SDEs to assign points to stories based on that (4hrs of coding per point) was surprisingly hard.


My friend's advisor in grad school (Physics, not CS) used to ask his students for various project estimates, and then he would double the number and increase the units: 2 hours = 4 days; 1 day = 2 weeks; 2 weeks = 4 months; 2 to 3 months = 4 to 6 years = thesis project. My friend's estimated 3-month project turned into his 4-to-5-year thesis project. I mean, hey, it was experimental physics, and his project ended up using a shipping-container-sized Faraday cage, scanning tunnelling microscopes, working in a clean room in a bunny suit, building stuff himself in the machine shop, and writing software. All this for something that literally had not been done before, where no one was sure it would work or what exactly would need to be done to get there (which starts to sound similar to some aspects of software projects). Plus the usual overhead like teaching ungrateful engineering undergrads (guilty!), hosting movie night at the lab and making liquid nitrogen ice cream, etc.


I’m in the wrong industry. That sounds way more fun than writing software.


> You never get handed a task which is "write as much of a prototype of a system which would do this, so we can estimate how much more work we think is involved".

The non engineer types won't hand you that, but I've had some success with proposing that when there's a lot of uncertainty.


But sorting issues according to their rough size is precisely what makes at least basic sense. A scale of trivial (can make many of those in a day), simple (several of those a day), medium (roughly a day of work) or large (days) makes it possible to have at least a basic conversation around work planning. I’m not entirely sold on calling those shirt sizes, but I’m sure we’re on the better end of the absurdity scale here :)


At that point, you are estimating EFFORT, not time. Software engineers are REALLY GOOD at estimating effort. The fact that these translate to time (simple == several per day) is incidental.


I think that is how Agile is supposed to work. The programmer estimates how hard the task is relative to other tasks they have done.


Right, but they were asked to estimate TIME, not EFFORT.


>Software engineers are REALLY GOOD at estimating effort.

The most common problem with estimates is hidden or forgotten complexity, which makes both time and "effort", whatever that means, go up.


Software engineers, when quoting for fixed priced jobs, learn quite quickly to estimate accurately.

Software engineers, when pressured by managers to provide low estimates and/or to provide estimates quickly, will estimate inaccurately. (It can also be deliberately high as well as low, based on previous experience of having their estimates chopped in half.)

Whether you use SP or T-shirt sizes or whatever, somebody is translating that into days because days (and thus dollars) are what matter to the business. If someone asks me for an estimate, I'll give them a range in days/weeks, and they can turn it into whatever nonsense unit they like.


> software engineers can't estimate how long something will take with any kind of accuracy.

Sure we can, it's always one of:

- A couple of minutes

- Today

- A week or two

- Probably around a month

- I have no idea, could be any of the above or more


At my work "It'll take half a day" has become slang for "I have no idea"


My standard reply is "one to two weeks".


That’s true, but it’s also well documented that biz likes having any estimate over nothing, no matter how unrealistic.


Then ask for an estimate of effort, not time. Let someone else take the responsibility of figuring out how long that will be.


Sizing is useful for one thing: making sure that two people are talking about the same thing.


That sounds a bit extreme? True, estimation is hard, but surely we can differentiate between 1-2 day work, 1-2 week work and a big scary project with a lot of risk. That's what T-shirt sizes are for.


And if a bug in a library stops you from completing your work so you have to develop a workaround, and adds several days to your “1-2 day task”? The estimate is wrong.

There are simply too many unknowns in other libraries and systems to be accurate.


Any recommended readings on the "unable to estimate" claim?


I read Steve McConnell's Software Estimation: Demystifying the Black Art and I can recommend it without hesitation. It is quite old by now so there might be something newer and better out there as well.


That book says the opposite, though: we can definitely make good estimates.

It's just that estimating well needs people with training on how to do that, and then it takes substantial time to make good estimates. And there will still be significant error bars (if the estimate isn't a range, it doesn't count as an estimate). But it's certainly doable.
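One well-known way to produce such ranged estimates is three-point (PERT) estimation (not necessarily the book's exact method, just a sketch of the idea):

```python
# Three-point (PERT) estimate: combine optimistic, most-likely, and
# pessimistic guesses into an expected value with an error bar.

def pert_estimate(optimistic, likely, pessimistic):
    expected = (optimistic + 4 * likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6  # rough one-sigma spread
    return expected, std_dev

# A task guessed at best-case 2 days, most likely 4, worst-case 10:
e, s = pert_estimate(2, 4, 10)
print(f"{e:.1f} +/- {s:.1f} days")  # 4.7 +/- 1.3 days
```

The point is that the output is a range, not a single number; if the estimate isn't a range, it doesn't count as an estimate.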


People who ask for estimates don't consider estimates with huge error bars good. Literally anyone can make an estimate with a decent amount of error.

To narrow down the estimate to what they want would take just as long as just doing the work.


I know, but those people need the training.


There are no good recommendations because it’s wrong. GP has taken “software estimation is hard and very imprecise” and vastly misrepresented it to try to dunk on someone on the internet.

It’s this sort of disinformation that perpetuates the contingent of software developers that cry bloody murder whenever they’re asked to say if something will take a day or a year.


I never wrote what you quoted. Please don't misquote me to "try to dunk on someone on the internet."


Thanks, but this isn’t my first rodeo. In the future, please exercise more discretion when whipping out the snark.

> software engineers can't estimate how long something will take with any kind of accuracy.

This is both irrelevant and wrong.

It’s irrelevant because t-shirt sizes, story points, and other abstract measures are - intentionally - not measures of time. They’re measures of effort, benchmarked against other units of work. Yes, this can sometimes give a vague indication of time. It’s also useful for other reasons, like weeding out whether or not everyone is on the same page with regard to what needs to be done in the first place. All of this is explained in literally any primer on the subject.

You’re wrong in saying that software engineers aren’t capable of estimating effort (or even time) with any degree of accuracy. They can. I can tell you that my Python hello world script will take less time and effort than rewriting the Linux kernel. None of the “research” and “experience” that you so confidently refer to says what you think it does. It says that there are big limitations to the degree to which timelines can be estimated. This is entirely true. But there’s nuance to it. You’re so desperate to find a shortcut to being smart on the internet that you’re spreading blatant disinformation in the process.


I'm not sure whether you're agreeing with me while using disagreeing words, or whether you didn't read what I wrote...

Software engineers can't estimate how long things will take: https://www.sciencedirect.com/science/article/abs/pii/S02637...

They're wrong 60% of the time through overestimation, and when they underestimate, they're so vastly wrong it's terrifying. I remember this one time I merely had to update a component in prod. Everything went fine in staging; then when I pressed the "button" in prod ... all hell broke loose. We spent the next 4 days fixing it.

I never wrote that software engineers can't estimate effort, I said they can't estimate time, but you're accusing me of the former.


I think you've linked to a study of "expert project managers", and we might see similar results in a study of whether "expert project managers" can succeed in tying their own shoelaces.

If you're working with a system where your staging environment is not sufficiently close to your prod environment to be entirely predictive of behaviour, that's a "known unknown" and should be in the estimate.


The reason it failed in prod was entirely unrelated to it being prod. The same could have happened in staging. IIRC, the error was entirely due to a RST packet from some external system during the upgrade. It was a bug in the upgrading system that should have been accounted for, had anyone known it existed. Identifying the root cause of the failure was what took the most time. Had deployments been idempotent, it probably could have been resolved in moments as well ... but here we are, 15 years later with lots of lessons learned.


Sounds annoying, but seems like you found a bug in the upgrading system that could have struck anyone during any change?

The time/work to investigate and fix it probably wasn't considered (or shouldn't have been, at least) part of the work on the component you were changing - that was just delayed, same as it would be in scenarios like "Dave got hit by a bus and he's the only one with the prod password" and "Our CI service suddenly went out of business and we need to migrate everything".


My point is that you can't estimate time with any accuracy. At the end of the day, even this fix and shenanigans was still "easy" once we knew what was going on. The effort never changed and we would have been dead on. The issue is when trying to say, "It will take me two weeks to do this," and it actually takes two weeks -- there are simply too many unknowns for ANY task in our industry for us to actually be confident in that assessment.


Not to the day, but you can estimate a range based on experience. After that deployment issue you may add "release could be delayed by up to a week" to future estimates until you're sure it's fixed.

I've written TV apps and in that world I've often given estimates that are 5 days of actual work but, because Samsung's QA process can take 6 weeks and spurious rejections are common, "deployment" will often take literally months.

Time to release and time for development can be totally different things and it's arguable whether "waiting" time should be included in any individual estimate at all. (You're adding 4 separate features and doing 2 bugfixes in one release, which one gets +2 months? In reality "submit/release" becomes a different ticket/task.)


Tell me, what is a unit of "effort"?

How would we measure that?


HN loves to make this claim but it just doesn't match my experience, from several teams. Estimates are not precise, obviously, but that doesn't mean that they are impossible to make or that they add no value.


I mean, I've routinely seen estimates be 3-5x longer than projected. It's up to you if you think that's an accurate enough estimate or not.


I have personally produced estimates of 2 weeks that took 2 months, and estimates of 2 months that took 2 weeks. For years, I told my boss that implementing a certain feature would be "very hard" and basically wasn't worth it. When we actually pulled the trigger on it, it was done and deployed in a week.

I'm sure some people are significantly better than me at estimates, but I haven't met them. Estimating the unknown without serious research that borders on just doing the job is nigh impossible. And we're just an in-house dev team, so we never, ever do the same thing twice.

Estimating how long to put up another wordpress site is a lot easier than estimating a new project with new requirements and new tools. I typically find that people who think estimating is easy are just doing the same things over and over for new clients, rather than doing new things all the time for the same client/employer.


Sure, so have I. I don't think estimates are good for determining a date for a contract or anything like that. I think they provide data about the tasks when prioritizing, which is valuable. Code coverage is also a really low quality indicator of test quality, but it is still useful.



