Hacker News new | past | comments | ask | show | jobs | submit login
The Parable of the Two Programmers (1985) (uwo.ca)
381 points by ims on Jan 25, 2015 | hide | past | favorite | 160 comments



There was a MOOC on Coursera called "Irrational Behaviour", and one of the stories there is about a locksmith who, at the beginning of his career, used to take more than an hour to fix a door lock, with lots of effort and almost always destroying the door. His clients were happy to pay the $70 he charged for the operation and tipped him most of the time. As time went by and his experience grew, he got to a point where he was fixing door locks in 10 or 15 minutes with virtually no disturbance to the customer. His tips started to fade away, and his customers became outraged at being charged $70 for those 10 minutes of work.

Conclusion: we don't want to pay related to the value we receive for a certain service, but to the amount of effort involved in the delivery of that service.


>Conclusion: we don't want to pay related to the value we receive for a certain service, but to the amount of effort involved in the delivery of that service.

I don't think that's a fair conclusion. There's much more going on here.

Locks are security for people. If I call a locksmith and he takes over an hour to open or fix my lock, I think, "Oh, good, even a professional is going to take some time to get through this. It's pretty secure!" If the locksmith is done in 10 minutes, I think, "Holy shit! With the right tools and know-how, someone can completely bypass my security in ten minutes!" This isn't so shocking around here, because most people on HN know just how vulnerable most security systems can be. Many people will not have that knowledge and be comforted by a locksmith taking longer.

Also, when you hire a locksmith, you usually watch the locksmith do his work- either because you're locked out of something and have to stand around uselessly, or the guy is in your house and it shouldn't take too long. This makes you invested in his effort- if it takes an hour, you watched the locksmith work for an hour. It's hard not to pay, given that you saw how long it took, and how he was working at it.

The same standard doesn't seem to apply elsewhere- if my steak takes an hour rather than twenty minutes, I'm not going to pay more for it, I'm going to complain to the manager. If the oil change takes too long, I'm not going to be happy about that, either. If I buy a cabinet, I usually don't care if the carpenter took 10 hours or 20, I'm going to want to pay based on the quality of the cabinet.

I think your conclusion is true in some situations, but it's definitely not universal to all work.


I think this is mostly true.

However, if you were watching someone prepare the most exquisite steak, and it took, let's say, a full hour to trim the meat, get the pan ready and properly buttered, cook off the edges of the steak, etc., and you were sitting there watching the chef go through this meticulous process, then perhaps you wouldn't be upset that it took so long--and most likely would enjoy the steak more, having seen the entire complex and careful process of cooking the thing. Maybe we just need to see the complexity in an operation for it to have value.

Similarly, if an oil change was a complex operation and you watched the mechanic half-dismantle your car to change whatever needed changing.

Or if you watched a programmer bounce around the screen in vim for 20 hours; coding, testing, tweaking, and debugging.


The smart locksmith would open your door in 10 minutes, then offer to sell you a more expensive lock.


> If I buy a cabinet, I usually don't care if the carpenter took 10 hours or 20, I'm going to want to pay based on the quality of the cabinet.

I don't think that is true if you commissioned the cabinet or watched it being made. I have seen this irrational behaviour in action several times, in different scenarios: painters, landscapers, carpenters.


This is the mechanic story all over: Guy takes his car to a garage because it keeps breaking down, and the mechanic leans in and listens to the engine for a minute. He goes and gets a hammer and listens to the engine again, and then raps sharply on the engine casing. The engine goes back into sync and stops breaking down.

The mechanic says "that'll be £500, please." The guy's outraged: "But all you did was tap it!"

Mechanic replies "it was a pound for the tap, and £499 for knowing where to tap."

Edit: Crap, beaten to it.


The origin of that story is Charles Steinmetz, and it involves Henry Ford, a $10,000 invoice, and placing an X in chalk. http://www.smithsonianmag.com/history/charles-proteus-steinm...


According to Snopes[1]:

"Practically anyone famous for his knowledge can be offered up as the virtuoso in this tale... Nikola Tesla, Thomas Edison, George Washington, the electrical genius Charles Proteus Steinmetz... How long this story has been around is a mystery."

http://www.snopes.com/business/genius/where.asp


> One Friday afternoon in 1921, Steinmetz hopped in his electric car and headed off for a weekend at...

Whu?


http://en.wikipedia.org/wiki/Electric_car#Golden_age

"In 1900, 40% of American automobiles were powered by steam, 38% by electricity, and 22% by gasoline.[19]"


Well, that's absolutely blown my mind - thank you.

Now I just want to know how they were dealing with power storage...



Experience pays in other professions that require competency, though: a surgeon who can proficiently perform a bypass in a short amount of time is far more valuable than a fresh graduate who potters around in there for a while. No one ever complains that a surgeon finished too quickly.


The surgery marketplace is way too ignorant to be a real marketplace. Nobody has any idea how long my mother-in-law was actually under the knife, and nobody involved, other than the surgeon, has any idea what the going rate is for gallbladder removal. Two minutes? Two hours? No one on the "purchase side" of the market can even guess. Without a functioning marketplace, numerical metrics don't mean much.

Another interesting issue: she shopped for a GP a long time ago; the GP referred her to a specialist, who referred her to the surgeon. You can't have much of a marketplace if the participants aren't making any decisions. The only decision she made was, a decade ago, picking the guy who referred her to the guy who referred her to a surgeon.

The craziest part of this crazy "market", or whatever you want to call it, is that it mostly works. Something this screwed up should predictably kill everyone involved, but most of the time it works, other than blowing a lot of money. That's the truly shocking part of the situation, at least from a "free market capitalism is the only system that, in practice, actually works" mantra. Maybe that's not so true, given the observational evidence.


It gets worse, because it's such a probabilistic market. How does any patient tell the difference between a surgeon with a 10% fatality rate and a worse one with an 11% fatality rate? (At the usual estimates of the value of a life in the several millions USD, that 1% is worth a lot.) And if they somehow had that data, how would they adjust for the surgeons' working in different geographies or specializing in slightly different patient types?

This sort of problem is why making hospitals release medical data hasn't been very useful. And if you can't get much reliable signal out of large datasets with rigorous statistical analysis, how are ordinary people and gossip supposed to reach the right answers which could allow market mechanisms to work?
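To put a rough number on the hypothetical above: a minimal back-of-envelope sketch using the standard two-proportion sample-size approximation (the 10% and 11% figures are the illustrative ones from the comment, not real data) suggests each surgeon would need on the order of fifteen thousand comparable operations before the difference is even statistically detectable:

```python
from math import ceil

def n_per_surgeon(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Textbook sample-size approximation for a two-proportion
    z-test: alpha = 0.05 (two-sided), power = 80%."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Distinguishing a 10% fatality rate from an 11% one:
print(n_per_surgeon(0.10, 0.11))  # roughly 14,700 surgeries per surgeon
```

Few surgeons perform that many of any single procedure in a career, which is exactly why the released data carries so little usable signal.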


I have a different experience. Our heating broke on two (unrelated) occasions.

In the first case, the repairman came, fixed the thing in 20 minutes, seemed competent, and I gladly paid him.

In the second case, the company sent an inexperienced employee, who couldn't figure out the problem, and spent 2 hours trying everything before finally deciding the most expensive component needed to be replaced. I was very unsatisfied, especially since they charged me for all the seemingly pointless troubleshooting time.

So maybe it's not only important to make a lot of effort, but to also appear competent while working?


I had a similar experience with a plumber. Normally, I would have no clue how experienced the person was, but this particular person kept "missing" certain parts and had to make 3 trips to the hardware store! He only charged me for the first, but clearly he didn't know what he was doing.


I hit this on HN a couple of years ago:

http://www.90percentofeverything.com/2010/12/16/adding-delay...

I also wrote about it. It's real. Many customers are not happy if you make it seem too easy.


A couple of years ago, I locked myself out of the house. I called a locksmith, who showed up at the house surprisingly quickly.

After spending a number of minutes fiddling with some impressive-looking lock picks and exercising what must be years of learned skill, he couldn't unlock the door. Sheepishly, he put away the specific tools and pulled out a dummy key + a rubber mallet.

The locksmith put the dummy key into the lock tumbler, whacked it once on the head with the rubber mallet, and the door unlocked. From the time he produced the tools to the time I was inside the house was well under sixty seconds.

I thought, Why didn't he just do that in the first place? And concluded much the same as you've said -- $70 for thirty seconds of work might seem steep and some people need the song and dance. Personally, I was grateful to have on-demand service like that available so quickly :)


If you're curious, the "dummy key" is a bump key. It's a relatively crude tool, and hard on the lock compared to traditional picking. There's also a certain level of professional pride, as you observed.

Hitting it with a mallet imparts force on the upper half of the pins, causing them all to briefly jump above the shear line, and at that moment the lock can turn freely without binding a pin.

A vibrating pick gun does essentially the same thing in a somewhat more controlled way.


The Parable of the Handyman's invoice

http://www.snopes.com/business/genius/where.asp


That's because value is a nebulous term, and the common unit of measure which anyone can relate to is time.


> Conclusion: we don't want to pay related to the value we receive for a certain service, but to the amount of effort involved in the delivery of that service.

But is that irrational? This seems to be a personal preference. By refusing to pay the same price for 10 minutes of work or one hour of work, we assert that we do believe in a certain income equality. "Disturbing someone for one hour vs 10 minutes" is not a necessarily irrational way to look at the situation.


It's irrational because the 10 minutes of work that doesn't destroy the door is worth much more than the hour of work that does - replacing a door is expensive and costs the homeowner additional time and aggravation.

Also, it's really more than 10 minutes of work, since the locksmith has to spend time driving to and from the customer. His fee needs to cover his travel time and expenses (gas, wear and tear on his truck, insurance, etc.). So $70 for a 10 minute house call doesn't seem unreasonable.


I'm with you on the first point, but your second seems to undermine the first. The argument isn't "$70 per hour is reasonable, and a 10-minute call takes an hour." As far as I can tell, it's whether "Fixing the lock is not worth less because it took less time, the end result is what you pay for" is true.


Not only that, but you have to also account for the time that the locksmith spent learning the craft. And the opportunity cost of that learning/practicing time.


This. While common in many fields, I feel that this is most obvious with musicians. "What? I'm not paying this band $1000 for a 2 hour performance!" When, in reality, you are paying them for many hours of practice in order to prepare for that 2 hour performance as well.


It seems more rational to look at it as having your door lock fixed, because that is why you call the locksmith. You don't call just to watch someone do his job for the sake of it.


No, people get angry at being charged $70 for 10 minutes of work because they think they're getting cheated, even though in this case they aren't.

You're not going to find your rationalization for socialism here.


And why do they think they're getting cheated?


Because the hourly rate looks very high.


It doesn't even have to be amount of effort. In your example, the locksmith has so much experience that it's easy for him to fix locks, so not much effort is required of him. But we do make this correlation between effort and time; to our brain, more time spent == more effort. That needs to change.


Nick Szabo has some writings on the economic shift to hourly wages and proof of work; because of the local knowledge problem (nobody can be an expert on everything, lots of information is hidden or inaccessible or tacit, and it's extremely difficult to determine value), looking at the labor or work which went into a product or service can be a simple fast heuristic for estimating the value of something - since if it was not at least that valuable, the actors who produced it and have more local or tacit knowledge would stop producing it.


I have read somewhere (it could be apocryphal, although it makes sense) that for this very reason barbers tend to snip their scissors in the air a lot - to give off an impression that there's more work being done than there actually is...


I can confirm this happens in high end hairdressers.

I have never spent a lot on haircuts, and a few years ago when I was going from long to short hair (for a job interview), my mum offered to pay for an expensive haircut for me. The haircut was not really any "better" than many of the cheap ones I had paid for myself, but the hairdresser spent a lot more time, and there seemed to be a lot more faffing about, with no real purpose as far as I could see.

Most of the places I had gone to obviously needed a bit higher throughput to keep the money coming in.


As a friend of mine told me, "I don't get it how my haircut can take 5 minutes and cost 25 [nevermind the currency here], while my wife sits there for 1.5h paying 150" - so I pointed out that HER haircut is actually cheaper :) Profound? Perhaps... On the subject? Quite so! ;)


It isn't totally irrational, it's merely measuring the only metric we can measure objectively, in the usual way it's measured. The fact the measurement function is exactly backwards is not obvious; in fact, it's entirely reasonable to expect to pay more for taking up more of someone's time.

Locks are actually a very good (pathologically good?) example of this, because we usually have no idea how well and how often they succeeded and a single failure can be catastrophic. Therefore, it's very hard to tell how good of a job the locksmith who installed or fixed it did without being a locksmith yourself. Compare cars: With a car, you drive it around pretty often, and you get a feel for how it handles and smells and sounds relative to the baseline established when you first got it. There's a constant evaluation of factors relevant to how the car is actually doing. With a lock, if the lock is put in well and functioning properly, the successful evaluations are mostly unnoticed unless you have a camera watching the lock 24/7.


I think it's more complicated than this locksmith example. We are so used to being ripped off by "experts", many with a lot of spinach behind their names, that we start to think everyone is trying to make the most money they can out of a situation (usually a bad situation). I have no problem paying someone for a job well done--if they do it quickly, all the better. The problem I have encountered is the expert who did the job, but did it half-assed, and whose mistake didn't show up until later. I'm now kind of jaded.

One other thing about experience: I have found it doesn't matter quite as much as I thought in a lot of professions--especially the professions that are a mix of science and art, like medicine and, yes, programming. A little experience, but still double-checks their work, and is staying current--gold mine! The pillar of their profession, with an ego to match--I run!

(I noticed someone brought up the mechanic-and-hammer story. A stuck pinion on a starter sometimes responds to a hammer blow. I used to position a long dowel (broom handle) on the stuck starter through the engine bay and whack the dowel with a hammer.)

Oh yeah, I have found that people who don't overcharge (whatever that even means anymore) are well liked among their peers and customers.


The story ends well because the project was actually simpler than it looked at first. Unfortunately, more often than not, things turn out to be a lot harder than expected.

What happens when, after 2 months of scribbling and playing Space Invaders, Charles realizes the project actually requires 3,500 lines of code? He wants the project to succeed, but now he doesn't have enough time, and he's afraid to ask for help because he knows he's been labeled a lazy and arrogant guy.

So he works long hours to fix the situation, then he burns out.

Source? This has happened to me. Several times.

This story can be true; people like Charles and simple projects exist, but they are exceptions, not the rule. It's easy for a beginner to believe he is that guy and then end up in a death march [1]. Things can go wrong for Alan too, but he has a team to support him and his managers know he is working on something complicated.

I'd like to be Charles one day, but for now I'm Alan.

[1] https://en.wikipedia.org/wiki/Death_march_(project_managemen...


The way I see it: Charles made the problem look simple by spending a few months thinking about the whole program, while Alan made the same problem look complicated by writing a bunch of code and always looking busy.

While I agree that Charles is the exception, I don't believe meritocracy is a valid solution; I'll always bet on Charles:

If the project was actually a "3500 lines of code" problem, then Charles might have taken longer to think about it, but it's my experience that Alan never would have finished.


If this story had happened in real life, and the problem had been more complicated than anticipated, and Charlie had realized this, it would be likely to fail, because he did not have enough leverage toward the upper management to get the resources required to succeed; at best, the project would be severely delayed.

However, Alan was making the problem more complicated by introducing a lot of accidental complexity, and I have often seen that this is done even when the problem is more complicated than anticipated. Such a project could easily create enough work for 10-15 people in the hypothetical scenario in the article if the project had enough necessary complexity to be a four person project.

It is very hard to distinguish between necessary and accidental complexity, and that is precisely the point of the article. Prestige is very often tied to how many subordinates you have rather than to how well you solve a particular task, so making your project artificially complex can be a strategy for climbing toward upper management. This may of course be conscious or unconscious on the part of the employee/manager in question.


Indeed, it's very important to match the structure to the problem.

I've seen real-world situations where the Charlie approach was clearly wrong - a maverick programmer who wrote a module on his own, with only some light testing with end users, no peer review, not even source control. The program was finished in record time (a month or so), looked good, apparently did what was required, users and management were delighted, and he got a huge raise. "Charlie" went on holidays... and then disaster struck. He hadn't considered the impact on other systems, and a bug delayed the monthly accounting closing process, costing thousands of man-hours to correct the errors. Other programmers had to go to his terminal to see the code... and found a huge hardcoded, unstructured mess.

OTOH, in the same company, they hired a Java Senior Architect, "Alan", to lead a module just a little more complex than what "Charlie" had done. "Alan" spent the first few months meeting with all possible stakeholders, writing process diagrams, and selecting a 4-person team, then spent a few more months building a "perfect" software architecture, including an entire ORM layer over the systems he had to connect to. Then they chose a complicated JavaScript framework for the frontend which none of them had experience with. After a year and a half (over a year over budget), they finally launched a first version... which wasn't what users needed. A year and a half after that (3 years total), they finally have a working system.

While he didn't get the credit "Charlie" got, everyone thinks "Alan" is some kind of guru and that he understands "hard" problems, and he's going to be given the lead (again) on an even larger project, which the company is betting several millions on.


> ... and Charlie had realized this, it would be likely to fail, because he did not have enough leverage toward the upper management to get the resources required to succeed

Not to mention, the company will be paying Alan and his team much more than they’re paying Charles.

It’s my suspicion that people will give more weight to people they’re paying more, simply because they’re perceived as more valuable - independent of any other hard data. So despite having already spent more money on Alan, they’d be likely to continue doing so, due to the perception of hard work and the perceived value to the company.


I read a study testing an idea similar to yours, but with wine. They organized a study in the guise of a wine-tasting event with 3 bottles of wine, where 2 bottles were marked up (one from $5 -> $50 and another from $10 -> $90). Even though people didn't know they were tasting the same wine, they said the more expensive one tasted better. In the other half of the study, with the prices removed, people couldn't guess which wine was more expensive. The conclusion from the study was that "Individuals who are unaware of the price do not derive more enjoyment from more expensive wine."

Source: http://www.wired.com/2011/04/should-we-buy-expensive-wine/


Related Hacker News comment: https://news.ycombinator.com/item?id=8941621

From 'Software Requirements & Specifications' [0], by Michael Jackson (not that Michael Jackson, and not the other one either), published in 1995, a parable [1]:

'What do you think?' he asked. He was asking me to tell him my impressions of his operation and his staff. 'Pretty good,' I said. 'You've got some good people there.' Program design courses are hard work; I was very tired; and staff evaluation consultancy is charged extra. Anyway, I knew he really wanted to tell me his own thoughts.

'What did you think of Fred?' he asked. 'We all think Fred is brilliant.' 'He's very clever,' I said. 'He's not very enthusiastic about methods, but he knows a lot about programming.' 'Yes,' said the DP Manager. He swiveled round in his chair to face a huge flowchart stuck to the wall: about five large sheets of line printer paper, maybe two hundred symbols, hundreds of connecting lines. 'Fred did that. It's the build-up of gross pay for our weekly payroll. No one else except Fred understands it.' His voice dropped to a reverent hush. 'Fred tells me that he's not sure he understands it himself.'

'Terrific,' I mumbled respectfully. I got the picture clearly. Fred as Frankenstein, Fred the brilliant creator of the uncontrollable monster flowchart. 'But what about Jane?' I said. 'I thought Jane was very good. She picked up the program design ideas very fast.'

'Yes,' said the DP Manager. 'Jane came to us with a great reputation. We thought she was going to be as brilliant as Fred. But she hasn't really proved herself yet. We've given her a few problems that we thought were going to be really tough, but when she finished it turned out they weren't really difficult at all. Most of them turned out pretty simple. She hasn't really proved herself yet -- if you see what I mean?'

I saw what he meant.

[0] http://www.amazon.co.uk/Requirements-Specifications-Software...

[1] http://www.win.tue.nl/~wstomv/quotes/software-requirements-s...


I've lived this experience, many times over, and the story is true at its core.

Some folks might dismiss this as a fantasy scenario, and sure, it's missing a lot of details. In the real world, the lone programmer doesn't get to write an efficient program for different reasons: the software design is locked down by managers, there is fear of abandoning the poorly written legacy code, newer programming tools aren't allowed because most developers don't know them, or there is a sunk cost of an expensive software license that has already been purchased.

The story is symbolic, but the lessons are universally true, even today. For managers: conventional methods of judging productivity are unreliable. For developers: few organizations will enable you to work optimally.


I don't believe some parts in the story.

1) In my experience people fresh out of college don't deliver well tested code that goes beyond the spec. I'd rather expect something that barely conforms to the spec, neglects a few edge cases, and crashes when you input typical data.

2) A coder that spends a few weeks all by himself to implement a spec and then delivers a perfect product seems implausible even for an experienced coder. Specs are usually ambiguous and often don't even describe the problem that actually needs to be solved. It takes a lot of talking to clients to discover what you actually need to do. Halfway through the project you'll realise that half of the spec should be changed. If you just implement the spec without talking to anybody, you'll end up with code that solves the wrong problem.

But then there are things I can absolutely relate to. Good solutions seem simple and obvious in retrospect. But it takes a lot of effort to come up with simple solutions.


It's explicitly labeled a parable: "a simple story used to illustrate a lesson." It's not meant to be literally true. The second programmer's inexperience is just another reason why his boss judges his work to obviously have been easy.


Yeah, but it's interesting - the parable would definitely work better if Charlie were the super-experienced engineer who knows enough to value simplicity and Alan were the fresh-out-of-school greenhorn still impressed by fancy one-size-fits-all methodologies.


Unrealistic parables teach unrealistic lessons.

That a thing is a parable does not render it immune to disagreement on realism grounds.

Indeed, if the defense is to point out that it's not meant to be realistic, then it's probably not just valueless, but of negative value.

Parables are not a good way to learn, in my opinion.


The point of this parable is that neither company has any clue what the actual value of the work done was. Alan looked like he was working hard and was rewarded for looking like he was working hard. Charles looked like he was goofing off and was rewarded for that.

Yet when you compare the results, Alan took 24 programmer-months to solve the problem, and Charles took 3 to do it better. If the companies had some way to objectively value the work done, Charles is clearly vastly better. But they don't.

It's got nothing to do with the specific techniques involved. It doesn't even weigh in on the reason for the difference -- is Charles brilliant compared to Alan's normal, or did Alan just stumble and turn a simple task hard? That's all irrelevant.

The point is just to illuminate how hard it is to know the value of a developer's work. I know I've run into that repeatedly in my career.


Re 2) I've done this a few times.

The trick is to ignore the detail of the spec, and understand what they actually want better than they do.

If you get that right, then the actual coding becomes insignificant, and everyone is happy


> If you just implement the spec without talking to anybody, you'll end up with code that solves the wrong problem.

Good, then they'll pay you again to write it a second time. As the Demotivator definition of consulting puts it: "if you can't be part of the solution, there's lots of money to be made prolonging the problem."


If it even remotely conforms to the spec I'd say that's gold. More often than not 'barely' translates into 'does not bear any resemblance to'.


> I don't believe some parts in the story.

Cool story, but I have trouble believing parts of it.


  Day         What I Did        TotLOC
  ---  -----------------------  ------
  Mon  Prototype Possibilities     300
  Tue  Write Building Blocks       900
  Wed  Construct Major Content   1,700
  Thu  Add Features              2,200
  Fri  Refactor, Test, Deploy      300

  Customer: Looks great. Thank you!
  Boss: Only 300 lines of code in 5 days?
  Boss: How can you be more efficient?
  Me: I could take Fridays off.


I don't understand; your numbers add up to 5400, not 300.


Total lines of code kept increasing until he was able to refactor everything on Friday and cut them back down to 300.


The final column isn't LoC written that day; it's LoC in total. Friday was spent refactoring (what we in the old days called "rewriting", before the hipsters made it cool); taking the code that was already working, and making it smaller, lighter, better.
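A tiny sketch (using only the numbers from the table above) that recovers the per-day deltas from the cumulative column:

```python
# Cumulative end-of-day LoC totals from the table above.
totals = [("Mon", 300), ("Tue", 900), ("Wed", 1700), ("Thu", 2200), ("Fri", 300)]

prev = 0
deltas = {}
for day, total in totals:
    deltas[day] = total - prev  # net lines added (or removed) that day
    prev = total
    print(f"{day}: {deltas[day]:+d} net lines (running total {total})")
```

Friday's net change is -1,900 lines: the refactor deleted the week's scaffolding and kept only the 300 lines that mattered.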


> what we in the old days called "rewriting", before the hipsters made it cool

Calling it refactoring dates back to at least the early 90s[1], which I'm pretty sure predates "hipster hackers." If anyone "made it cool" it was probably Martin Fowler[2].

[1] http://www.ai.univ-paris8.fr/~lysop/opdyke-thesis.pdf

[2] http://www.amazon.com/gp/product/0201485672


I dislike this strongly. It is attempting to demean a group of people without really giving them a fair chance. It exaggerates and vilifies the "enterprise programmer" while lionizing the young-but-inexperienced smart programmer. It's a very common theme here on HN. It's playing into the narrative of the young entrepreneur that can drop out of college and build the next successful start-up. The fact is that most people in that situation will fail the first few times as they gain experience. People who can operate like Alan, though obviously Alan does not deliver results, are very valuable in a company. Systems have a tendency to get more complex and you will need people who have experience dealing with that complexity, managing it, and reducing it.

Let me address the parable a bit more directly.

The parable is very hard for me to relate to because to me it seems like the two characters are cherry picked such that they're vastly different in terms of skills. It looks to me like Charles is either much better than Alan at design or got lucky. I think the only wisdom that can be gleaned from the story is to not over-complicate your design. Charles and Alan both spend time upfront designing their code, but it seems that Charles came up with the better design and saw the simple kernel of the problem. That's the only real difference between the two. Alan saw a problem that needed a lot of work and specs to get right, but Charles saw it was actually a very simple problem.

I would say the lesson is to make sure you understand the problem well enough, but Alan is an experienced guy who still couldn't figure it out, so it must have been a hard problem to know how to work with in the first place. Maybe Alan was working defensively to make sure he had all his bases covered in case the design turned out to be different than he initially anticipated. I can't say.


This is indeed a comment-baity article. It is fiction: those two people didn't really exist or do those things. It gets programmers/developers riled up because it's so believable.

It is indeed the case that the characters are vastly different in terms of skill, and that is sort of the point. Alan did spend time developing his skills, but it was focused on frameworks and processes, and not effective programming or problem solving. Maybe he just didn't have the right start, didn't know what he didn't know, and just went the wrong way from there. But management, and Alan himself, can't tell the difference.

I think that's the ultimate take-away. Non-programmers and "bad" programmers can't recognize good programming, even if they look at results. Well they can, but it takes a lot of time and luck to come across comparable situations. It seems like an intractable problem. They don't know who to trust; and there doesn't seem to be a way to find out for them, except to be genuinely good at developing software... but how to know which way to go about that?


A lot of people seem to think this is a story about programming. It's not. It's about perceptions. A good programmer knows that a certain portion of his work is managing perceptions. This is part of a skill-set known as 'soft skills' and will get you farther than a perfect understanding of monads.


The issue that seems to be overlooked here is the objective value of solving the problem itself. Instead it focuses on the developers and their rewards.

A does a thing, it takes a long time, solves the problem and he becomes System Analyst.

C does a thing, it's shorter in both time and effort, and he leaves the company a year later.

What about the value produced by the solution?

It's clear that if a single (inexperienced and poorly communicative) developer can produce a satisfactory solution in a fraction of the time of a heavily architected approach, then the first approach (A) is overvalued.

The moral of the story seems to be something about how huffing and puffing and self-importance are rewarded over a thought-out, appropriate solution to the task at hand.

How about going forward in the story? A sequel-sequel?

At Automated, their unrealistic approach to planning and development, and their focus on titles and rewards led to bloated and unmaintainable software, where everyone in the company fought to make something "smart" that demonstrated their intelligence and worthiness of being rewarded - the problem specifications being nothing more than a means to achieving this.

At C, Charles either gained the experience necessary to understand the reasons for communicating efforts and planning to the rest of his management and went on to design appropriate solutions in another company, or he decided that being rewarded for his efforts was more important than solving problems and gaining domain experience and joined the above.


I know it's a parable and the objective is to send the message that "customers value complexity". In the enterprise we see lots of news where a business spends millions on a certain solution and we sometimes think it was overspending.

But I see another reason why the story reflects reality in some respects: experienced developers tend to overengineer a problem, and I recall the 'hello world' joke about programmer evolution. But at the same time, the novice will rarely deliver a good solution the first time.

In my point of view, the simpler solution will just work, but could carry lots of technical debt and could be hard to scale, while the complex one could turn out costly to support. It's very hard to find an optimal point between the two.


This brings to mind the maxim, "A job, no matter how small, will grow to fit the time and resources available".



Well stated. I will jot it down for my next meeting with my manager :)


Shhh. It's a secret. Use it to your advantage, don't tell your manager!

In all seriousness though, your manager probably won't take kindly to hearing your advice about project management.


If there is a second sequel to this story, it's that the one guy goes on to start a successful company, and the other guy is entrenched in the politics of his company.

I think this would be a better comparison if space invaders was left out. Or perhaps if it were, "he scribbled for a while, then played space invaders, then scribbled some more.", as opposed to him starting with nothing but space invaders. The other guy began by inventing problems rather than solving them.

It comes down to, a direct, though outwardly puzzling, solution that is apt VS. an indirect, political, frustrating(for everyone), bloated, slow solution.

If it weren't a programming department, and it were two different companies, survival of the fittest would kick in, and the better faster guy would be more successful. And, it has. We now learn from the successful companies that are similar to the space invader guy and it doesn't seem so weird. The fact that people were catching on to this idea in 1985 is pretty striking.

This is anecdotal and doesn't really teach the bigger picture. It's also not encouraging, as it doesn't lead the user to the correct path of action.

It really needs a second sequel where the space invader guy goes on to be wildly successful to be a good story.


That is why I love Perl: the manager cannot understand it and cannot say it was simple.


I wonder how much language popularity is affected by this effect.

We Lisp/FP fans crow about how easy and quick dev in more expressive languages is, but to a clueless manager that's potentially exactly the problem. If it doesn't look like work, then the manager is apt to wonder what they're paying you for.

Whereas JavaEE ...


APL should've been much more popular if prestige by intimidation was a major factor.


I can see exactly where Charles went wrong. Here is how the story should have gone:

Charles implements an entire interpreted language with garbage collection and an incomprehensibly coded virtual machine. Everything is documented and works flawlessly at that level, however. Nary a bug can be found by anyone experimenting with the language.

Charles writes the program in 30 lines of this language, which appear fairly simple on the surface, but rely on some deep semantics (that Alan's team doesn't even comprehend, for instance, let alone management, without studying).

In this case, nobody can dismiss the problem as being something easy that a junior programmer can solve, and Alan's team look like troglodytes from the Dark Ages. All the more so because their buggy, incomplete program is over 2500 lines, while Charles' language fits into 2300.

:)


Rarely (unless you are working on your hobby project) do you get a chance to make a program simple. There are inherently complex requirements that can't be changed for political reasons, complex legacy interfaces to conform to, etc.

Interestingly, most programmers seem to think that inherent complexity doesn't exist, i.e. that any requirements can be translated into a simple program. The result of this approach are leaky abstractions that later on cause much pain to the developer and anyone maintaining the codebase.


Quite an interesting article. The best part is that the article does not judge. It simply lays down facts for us and leaves the opinionated part onto us.


Is this sarcasm? The article pretty clearly judges in an extremely opinionated way.

It's a shame it never actually stated the specification of the program. It seems like that would be the kind of thing you would need to know to form a fair opinion.


It's not sarcasm. I really interpreted the article this way. Might not be the correct interpretation.


You got it right.

Perhaps a better way to put it would be to say that the article teaches by demonstration instead of teaching by assertion.


But, it's a fictional story. I can make up a fictional story to demonstrate anything, regardless if it's true in the real world.


I thought this was a fictional story.


It might be, my intention was to highlight the fact that the article is laid down as a set of facts with no judgement. It's not like Charles was a hero. It just shows how Alan was rewarded. It's like reading something as simple as fate.


It might be the most fictional story ever written about programming.

It's completely ridiculous.


I don't think it's ridiculous at all. Have you ever worked in the defense industry?


Yes, actually, I have.

So this story isn't ridiculous because it applies to a niche that it never mentions?


I have seen this kind of problem many times in real life. By starting to write the design doc and prototype code from the very beginning, you demonstrate effort and progress. You most likely get hold of the project and eventually deliver it, along with all the incremental complexity picked up during the process.

On the other hand, if you spend 2 months iterating through the requirements and alternative design choices, you are far more likely to come up with a better design, but your manager (or the entire company) would have no patience to watch you thinking in your head. As a result, I have seen software designs that could have been 10x or even 1000x better, but most people prefer to get something out first (this is especially necessary for startups).

Another random comment is on LoC per day. I worked at a few large companies. The statistics show the residual code is about 6-16 lines of code per business day per software engineer. A lot of time goes into design, debugging, testing, iterations, redesign, and refactoring.


This is a parable about whether you really understand the nature of the problem, or not. If you do then you can produce a concise definition (i.e. a better program in fewer LOC) and if you don't then it can get much more complex. It's like the difference between trying to make a machine fly by making wings out of feathers and having it flap, or understanding the principles of aerodynamics and making a fixed wing out of canvas.

However, for most non-trivial problems understanding everything up front isn't possible. To know what works you may need to create prototypes and iterate, abandoning things which don't work or which end in a hairball of complexity and getting opinions from testers. In that situation just intuiting a solution and then typing in the code won't work and the more formalised process might work better.


The parable is really about the nature of corporate politics and the art of perception management.

I'm not sure how obvious the underlying point is -> programming languages and methodologies are structured by political requirements.

Programmer A is writing software. He knows nothing about perception management and status plays.

Programmer B is furthering his career and angling for a raise. By accepting 'corporate coding standards' he has proven he knows how to play the game and uses the project to increase his status.

The quality of the finished product, in terms of user experience and practical usefulness, isn't nearly as important as the social processes that created it.

IMO the interesting question is how much this applies to coding culture and language and OS design as a whole.


> However, for most non-trivial problems understanding everything up front isn't possible.

You know, people keep saying that, but I'm yet to find one example of it in real life. Yes, there are contexts with many rules, but I never found one that you couldn't modularize into person-sized problems.

That said, several times I've solved in a month or two (it took six once) problems that other people had struggled with for years in the "corporate way" of development without any kind of success.


Wow, as someone who is going to start doing Computer Science next year, I had already heard that most of programming is about thinking, but had no idea that it was to the point of 5 lines of code per day being exceptional.


Green-field software should be well in excess of 5 lines a day. If the software is old to the point that nobody fully understands it any more, and previous maintainers have left the company, the number of production lines of code added per day may easily be lower than 5.

Older code has a lot more functionality, so every line of code is capable of leveraging more. 5 lines of code in a mature system may do more than 500 lines in a new system. But a bigger factor is figuring out which 5 lines to write. A third of the time can easily be spent researching the code trying to figure out a plan of attack, and the remainder iteratively debugging different variations of 5 lines validating (or invalidating) assumptions made about the code's emergent design during the planning phase.

And I'm not counting test code. Test code is usually cheap to write, if the code is testable. Writing lots of tests is an easy way to inflate lines written, should you be stuck somewhere in the dark ages where they actually measure productivity by such a discredited yardstick.


Don't wait until next year. Start programming now. School is good to learn computer science; school is not needed to learn how to program.

Don't read anything into the 5 lines of code per day. This is a pretty old article and even then, that would only be a reasonable figure if you take a large number of people on a large project and compute an average for everyone. In other words, counting refactoring, fixing bugs, re-writing things, doing administrative work, and so on towards the time elapsed. And even then it would vary so much from one project to the next that you couldn't get a reasonable representative number, though 5 LOC per day would certainly be possible.


I have already started programming; it is just that all the problem sets I have solved were easy enough that I could get to coding right away, and when I couldn't, I just thought I was dumb.


Don't ever let yourself think you are dumb. It's a lot more harmful to your psychology and to learning than it may at first appear.

If you can't figure something out, you are just lacking some knowledge you need. I wish I had learned that a long time ago.

Being dumb is never the explanation for not understanding or figuring out something.


In my experience, as an outrageously well paid programmer, I will touch a lot more than 5 lines a day whenever I am programming, especially if I include changes to the unit tests, integration tests and such. There are two caveats though:

First, you will not be coding every day. There's requirements gathering, setting up environments, build systems and such. Depending on what you are working on, this could be over 50% of your time.

And second, and perhaps more importantly, most of the lines you touch will replace lines that you, or someone else, wrote in the past, instead of just adding more functionality without touching the existing codebase. For instance, 5 months ago I started working on a project that had been written by two junior programmers who did not understand the language they were using, functional programming, or the problem they were solving. The code was full of repetition, bugs, and there was no way to make heads or tails of what it was actually solving. So I started refactoring, working on eliminating duplication, and trying to build abstractions. After a week, I had made a good dozen commits every day, but they were on top of each other: total LOC actually went down. Only after that week of coding that didn't add to the codebase could I see the places where refactoring just wouldn't do anymore, and was able to figure out what to rewrite, and how to add the additional requirements.

So I had spent a month on a 15K LOC codebase, and ended up with 12K LOC that did more, had more tests, ran faster, and made sense to the people that would end up owning the product in the long term. But yes, I mucked with over 100 lines of code most days.

Situations like that happen to experienced programmers all the time.


When working on an existing system, I find it's a good sign if your SLOC/day count goes negative for a while.


One contract I had, I dropped the line count of an inherited codebase by 50%. Much of which was necessary to simply be able to run (and debug) it interactively.

It still ran. It ran correctly. It ran faster than previously.

I never quite grasped what it was doing.

The company folded some time after I left it.


That is wisdom...


don't be discouraged, it's par for the course.

we're talking about quality lines of code: debugged, qa'd, and working as expected. perhaps even after the customer has had a chance to peruse the beta. you will be surprised how much time you'll spend in meetings and how hard navigating and contributing to large projects becomes.

curiously, the number of lines of quality code expected of a developer has not changed much since "the mythical man month" was written, which was about IBM machines in the 60s, despite the obvious advances in programming languages and programming environments. it's still around 10 or so.


For me, lines of code per day varies depending on what is being done and how.

If it's a meaty problem that is not well understood, or a language or framework I'm unfamiliar with, or I sense the situation is complex enough that an early bad decision could lead to unforeseen troubles later (and I should tread carefully), very few or no lines are written during the day.

On the other hand, if it is similar to a thing I have done many times before, and I'm comfortable with the problem, the language, and the framework, then there is much less time thinking and StackOverflowing and I can go at nearly typing speed. (Likewise, if the code is quick "one-off" type stuff for tests or throwaway work.) More than several hundred lines of code a day (JS or Python) in cases like these is very achievable.

This doesn't count things like HTML templates, CSS, etc. (although sometimes those take a bit of thought as well).


Remember that you will spend more time debugging, documenting and testing your code. And this is if you work somewhere that doesn't expect you to spend 10+ hours a week in meetings.

[edit] And you will also write a lot of code you throw away; sometimes you need to solve 90% of the problem the wrong way to discover the right way.


No, this is just an extreme example of the "don't try to be a smart guy" ideology. In practice 100-500 LOC per day is normal.


100-500 LOC per day is normal.

It is not. You are not writing a novel here. Yes, most of the time is spent thinking. Some days there is no coding because it is spent on just trying to figure out what to do.


While emotions may run high, it's important to remember that this is an empirical dispute. It can be resolved simply by looking at everyone's git history. For myself, the least code I've written on any day in the past year is 20 lines. My average is around 100. The most is a tad over 1500. Of course everyone's different and LoC is a terrible measure and it depends on the language, task at hand, etc. But in general, most of my time is spent "honing" code; testing/fixing corner cases and working around issues in other software. "Other software" includes browsers, filesystems, databases, JITs, but mostly browsers.

I personally don't spend much time thinking about software-related problems. At the risk of sounding conceited, I'll admit that most problems I encounter are pretty straightforward. Five minutes of uninterrupted thought is more than enough to get into diminishing returns.

It would be very interesting if GitHub pulled an OKCupid and published some statistical analysis of programmer behavior. They could put many of these disagreements to rest.


I'm curious to hear about how many problems you solve with these lines of code. I guess that's pretty much impossible to quantify though. Also, what languages do you work in?

I'm not that experienced, but I had a notable experience working with a 5000+ loc app that was a nightmare to maintain and extend. The last guy had basically reinvented every wheel. He was even, in my opinion, reimplementing the DOM in places with absolute positioning. Also, the code was poorly organized, with pieces of logic that could have been easily consolidated appearing throughout the app in multiple places so that they could not be abstracted out.

After about a month of trudging through his code and making almost no progress, I basically told my manager that we had to rewrite it (I had been thinking this the whole time, but I didn't want to be the junior dev who comes in and demands to scrap everything).

Me and a colleague paired on it for about 2 weeks and rewrote the whole thing entirely from scratch, leveraging several open source libraries and writing a few tested "internal libraries". The whole thing ended up being around ~800 loc when we were done, and it had the extra features that needed to be added, and was pretty bug free.

I'm not trying to blow my own horn here, in other instances I have spent way too long pondering about code and making it way more concise than necessary.

But unless you are a truly great coder who is cranking out 100 lines of good code per day (and I have no doubt that you are), I would be pretty suspicious about the amount that you are writing. I would worry that you are placing a great maintenance load on those who come after you.


(Note: I don't mean for this comment to be perceived as bragging or showmanship. It's just that... well, you asked about my background.)

Computers have fascinated me since before I can remember. I spend most of my waking hours in front of them. I've been honing my craft for over 20 years. I've written and maintained projects in C, C++, C#, Java, JavaScript (both browser and server-side)[1], Perl, PHP (we all make mistakes), Python, Ruby, and a couple Lisps. In my travels, I've discovered and reported bugs in popular software such as Firefox, Chrome, Node.js, Apple's XNU kernel, and libxml2.

I completely agree that some programmers are like machine guns, firing off vast quantities of poorly-aimed code. I try my best to avoid that. I hate sloppy code. I hate repetitive code. Most of all, I hate re-inventing the wheel. If a decent library exists, I'll use it. I have no qualms with something Not Invented Here.

I pair sometimes. I do code reviews often. And I use as many profiling, testing, and static analysis tools as I can get my hands on.

It sounds like your ordeal made you a better coder. Those sorts of experiences are indispensable, but I've found it takes more to keep improving. It's very useful to become an expert on programming, not just an expert at programming. There's a growing body of literature to aid anyone interested. McConnell's Code Complete is still great. Michael Feathers has a book called Working Effectively with Legacy Code. It contains some great techniques for incrementally improving hard-to-maintain projects. Lastly, browsing It Will Never Work in Theory[2] is a good way to stumble into some academic papers that apply to your own work.

1. "JavaScript" is such a nebulous term these days, but I've worked on JS codebases using tools ranging from nothing (vanilla JS) to JQuery to Google Closure to React.

2. http://neverworkintheory.org/


These metrics such as featured in Mythical Man Month are usually based on the SLOC count at the end of the project, divided by the time, not how much code was committed each day. Your churn might be 100+ LOC per day, but hopefully that isn't all permanent additions to the codebase -- bug count seems to be proportional to LOC.


Firstly, any real working programmer has days where they write no code. You are writing documentation, meeting with users and sponsors to discuss new features and schedules, doing code-reviews and mentoring, merging branches, debugging race conditions, meeting vendors, chasing dependencies in other parts of the company, a million other things. That brings the average LOC/day down.

I can't imagine anyone sustaining 100 LOC/day over the long term unless they think that cranking out HTML templates or something counts as programming.


I realize that I was referring to number of LOC touched a day, not change to the total codebase size. I'm not sure what the first poster had in mind.

For reference, I count myself as writing about ~40 LOC a day for a greenfield project (which I spend about 50% of my time on), measured by total codebase size.

And no, it is not HTML templates, although it is a fairly verbose language. And I do all the things you describe in your first paragraph.

In my experience, a good programmer is almost always a fast programmer, and non-trivial programs usually require a lot of LOC to get the job done.


I have to disagree here. This is definitely possible, especially on more greenfield projects. Most of the engineers on my team produced that much sustained. And my average is closer to 400/day even though a big chunk of my time is spent doing things other than programming (e.g. tech leadership / management). LOC isn't a great metric though and can be easily gamed.


It depends a lot on what kind of code you're writing. Don't expect to write 500 LOC/day if your project is to write a device driver or an improved computer vision algorithm.


>>an improved computer vision algorithm

I do this for remote sensing, and I might write 50 loc on the main thing, and often a bunch of python to test if the idea works.


I think it's actually a great little story because in the end, everyone got what they wanted.

The lone programmer did his work, got paid for it and left, hopefully to find a company with a better fit.

The team was hired and got paid to do their work and, while doing so, created a process that the company felt comfortable with. Perhaps they are still working there today.

Everyone seems to have taken what they needed and got what they wanted.

There are an infinite number of ways to slice a thing. Choose one and figure out if it works for you. This applies to both sides, the worker and the management.


The meta message of the parable is that programming is, in actual practice, an art, not a vocation or profession. For political / economic reasons we sometimes have to pretend it's a vocation or profession, but it really isn't in actual application. Software architecture should be in the "fine arts" department at university, not a branch of finance, engineering, or math.

In art, people expect the highest quality producers to instantly produce effortless appearing product (music, dance, painting, sport, drama, whatever) and are generally willing to pay for quality unless they're a hopelessly tasteless neanderthal. The only time effort is rewarded in art is when parents watch their (own) kids perform.

One epic fail of the parable is that it was written in an era of C (Pascal?) dominance, where 500 LOC meant something different than it does now. That would be at least 2000 lines of boilerplate-ish Java (think of the classic "enterprise hello world" Java implementation) or 10 lines of Clojure, and pretty much everything else fits in between. Something that has stayed constant over the decades, from my observation, is that a "complicated functional block" takes maybe three hours on long-term average, a "little bit less than before/after lunch hour", and the only determiner of how many LoC are produced in that time is language quality and programmer quality. Maybe another way to phrase it is that a correctly sized block on a project flowchart takes a couple hours in all languages for a given class of programmer ability.


While software architecture certainly is an art, I would not say it is unique in that aspect among the fields you mentioned. Both engineering and maths should be considered art in their own rights.

If you consider abstract mathematics, it is easily seen that many mathematicians find most of their inspiration in the aesthetics of their ideas. Somehow the way we teach maths obscures this fact from the students.

Oh, and if you didn't already: Read «Zen and The Art of Motorcycle Maintenance». Aside from being a great story, it goes into detail about how technical work becomes art when done with Quality in mind.


How neat to see this little article pop up from 1985. Professor Neil Rickert was one of my computer science profs while I was at Northern Illinois University. He was a pretty solid teacher and, beyond a doubt, knows his shit in and out. Definitely a highly respected professor within the computer science ranks. He's retired, or Professor Emeritus, now, I believe.


[deleted]


I don't know, maybe it took quite a bit of time and thinking to realize that the problem was simpler than it seemed. Maybe Charles was playing games to occupy his eyes and hands while he was thinking about the problem.


"I don't know, maybe it took quite a bit of time and thinking to realize that the problem was simpler than it seemed."

That's also how I interpreted it (based on personal experience).

"Maybe Charles was playing games to occupy his eyes and hands while he was thinking about the problem."

Or he may have been playing games while taking a break from thinking.


It's not that the problem was simple -- the other team's example shows that it was not. Rather, programmer C took the time to think about the problem enough, and write code that was elegant enough, that it __looked__ simple. Simple solutions almost always look easy in hindsight, but often are not simple to arrive at.


"It's not that the problem was simple -- the other team's example shows that it was not"

Not necessarily. It might show that the other team was either

(1) not smart enough to realize the problem's simplicity, or

(2) deliberately playing the political game of making the problem appear hard because they knew that in the end they'd get more recognition. The lead programmer, Alan, got to have three people working for him, which could have been a step up for him on the corporate ladder. The corporate world is full of this kind of empire-building.


The two teams were entirely independent and neither knew of the other, so he couldn't have been trying to make the other team look bad.

Likewise, if anything, Charles made it blatant that he was not 'working': he was playing games and scribbling on paper with his legs up on the table. Doesn't seem like feigning hard work.

Are you a programmer?


[deleted]


There is no C...


Sorry, skimmed too quickly, didn't realize the two companies were not communicating throughout the story.


And if the specs change after the code was written (as usually happens in real life), whose code would have been easier to update? There is a reason experienced people may put more effort into certain areas than junior programmers: they will have an idea of what is likely to change, and when.


Oh god, this thing's actually getting traction here.

Look, there's no moral to this story and very little to glean from it because it's about a situation that's ridiculous on its face. It's written to appeal to cowboy programmers to make them feel better about their prejudices in software development.

Can any of you actually relate a real-world case where a 4-month-long team-driven effort produced some code to solve a problem that could be solved by one cowboy in 3 months and 20% as many lines?


Yes, I did just that. Actually I would guess it was more like 2%.

This was the 90s, and the client was a big corporate programming consulting company. They wrote software for big corporations or the government. They couldn't get a website done internally.

The spec meeting was memorable, because they'd say "We want a form on the website that sends emails," and I'd say "Okay," and then they would look at each other, baffled. I realized later they expected pushback on every single item in the spec. We were just agreeing to everything without even thinking about it – their demands seemed modest to us.

So, we whipped up something that worked reasonably well with good old HTML and Perl, and even had some CMS-like features. It didn't take more than a few weeks.

I later met that internal team that was continuing to work on their ill-fated website project.

They were too elite to use something like Perl. They were working on a web server. From scratch. In C. They proudly showed me how their web server could serve a web page. They had even worked out how they could show a preformatted table of numbers, by printing it inside a textarea.


I think he meant "Cowboy vs. Normal", as opposed to, erm, "Normal vs. NIH-addled incompetence".


"Can any of you actually relate a real-world case where a 4-month-long team-driven effort produced some code to solve a problem that could be solved by one cowboy in 3 months and 20% as many lines?"

I've had the experience of refactoring my predecessor's code and ending up with 20% of the number of lines. Inexperienced programmers tend to write repetitive code instead of looking for abstractions to simplify the problem.


But sometimes you can see the possible abstractions only after the fact... I often end up deleting a lot of my own code that turned out to be unnecessary.


Then there's the situation in industrial control systems where sometimes you really itch to do that, but you can't because the return statements mess up the tight timing :)


Can't you use stuff like gcc's __attribute__((always_inline)), or just straight-up C macros, for cases like those? Worst comes to worst, you can always use m4/cpp/some other preprocessor, although that does tend to hurt...


If the compilers you use have been updated this millennium, yes. However, that's a big if.


There are people who don't care to simplify the code even after the fact.

I worked on a project where half or so of the code was copy-pasted from somewhere else in the same codebase.

The original author claimed that this way is better as it allowed him to see what the code does without jumping around the whole codebase.


Yes. I worked at a company that spent hundreds of thousands of dollars, and nearly a year, on a server monitoring system that never achieved a high enough timeseries resolution to actually solve the problem it was funded for. Then a 20-year veteran of the company stepped in and wrote a 50-line shell script, pushed it to the relevant machines, plugged it into the timeseries database, and solved the problem. He called it his 89¢ solution.


Eww. Were there extenuating factors though? The monitoring system was somebody's pet project, or was purchased under contract on the hopes and dreams of sales, or the monitoring software did work but not in that specific environment, or...?


I've seen this exact same thing too. The "extenuating" (if you could call it that) factor was that the monitoring software was written in house and it was just shit. It might have been a pet project initially, but it didn't end up that way. It was just continued out of some combination of habit, risk aversion and not invented here syndrome.

When dumped and replaced by one of the better OSS versions in a few days by a developer who got fed up, things "magically" got much better, very quickly.

No praise or career boost at all for the developer in question, though. I think a large number of people (from management down) were embarrassed by the whole affair and wanted to forget about it. It seems that praise can only be dished out if you didn't embarrass somebody important.


IMHO, the person selected to run the original project was a beneficiary of the Peter Principle, and on top of that, left the company before the project was completed. I wasn't directly involved, but I specifically remember the hero of the story complaining about having been "sold a bill of goods" -- i.e., the company was promised the new system would perform screamingly, but when push came to shove, it didn't.


wouldn't be the first time I've seen a shell script outperform an overengineered solution


Absolutely, I finished (as in actually got the system to meet requirements) a system in 6 weeks that a team of ~15 (all gone by the time I had got there) had failed to do in a year.

Their system had many thousands of classes, mine probably had less than 50. Theirs used pretty much every technology in J2EE (this was a few years back) - mine used a very small set. I didn't even have to work very hard....


Yes, I can relate, and I just barely started my career in software development.

Let's run a thought experiment: imagine picking 4 programmers and asking them to implement Git from scratch. Can you imagine any scenario in which they would take at least twice as long as Linus to produce the first working version? I certainly can.


Yes, I've seen situations like this. As a rule of thumb, problems that fit into the head of one person are more efficiently solved by one person on their own than by a team.


This, 1000 times, this. If you ever find yourself having difficulty cleanly separating a task, then you are going to spend more time defining the separation and integrating the parts than having a single person do the task.


> Can any of you actually relate a real-world case where a 4-month-long team-driven effort produced some code to solve a problem that could be solved by one cowboy in 3 months and 20% as many lines?

Yes, I have real-world experience of that kind of ratio. All it took was some unreasonable non-functional requirements, many coming from a "chief technical architect" who had the authority to overrule everyone and made a bunch of poor decisions.


> [Alan] asked his department manager to assign another three programmers as a programming team

The scenario's quite believable. Many "programmers" out there got transferred from a user department, lied on their CV, and/or cheated the aptitude test. They can't code very well. Or maybe their coding level's up to scratch but they can't coordinate and communicate with other programmers, hence meetings and numerous drafts of what's been agreed on. Or perhaps work is hard to find or some programmers are getting older, so they deliberately complicate the task or the program to keep themselves in a job.


I have known this story for perhaps ten years, and it is one of my favorites.

> Can any of you actually relate a real-world

I know about 5 cases. I have been on both sides of this story. Plus there are similar cases outside of programming.

Let's just call it luck, shall we?


I've encountered a lot of cases; in some of them I was on the slow-team side and in others I was the lone cowboy.

The problem is not whether it happens; the problem is that our industry still sucks a lot.


You're going to get a lot of "yes" responses to your questions that themselves are going to be embellishments (or outright lies), omit context, pretend that refactoring an existing set of code is the same as writing it from scratch under the same conditions, etc.

Our industry is filled with people who are "cowboy programmers" and think of themselves as Charlie when in reality they're more like Alan, and are attributing what is actually pure luck and circumstance to their own brilliance. There's an especially high concentration of this BS on HN. You only need review the HN threads about the HealthCare.gov website to see this in action.

That's not to say this sort of scenario doesn't exist, never occurs, etc. It's just that there is likely to be an extraordinarily high number of people who read HN who believe they have experienced it (on either side), even if an objective evaluation of the circumstances would reveal they haven't.


Yes. If you want to complain about the exact numbers, start with these:

> Charles announces he has completed the project. He submits a 500 line program.

> Charles did produce about 5 lines of code per day. This is perhaps a little above average.

Obviously that's not the point though.


This happened to me last year.


Yeah, the parable is wishful thinking by a developer who wants to be left alone and code without having to communicate with anybody. There are rational reasons that this is not a viable approach for most non-hobby software development.

A single developer outperforming a team is not that far-fetched, under ideal circumstances. It is obvious that there is a lot of overhead from communication and coordination in team development, and the biggest drop in efficiency comes from scaling from a single developer to a small team. If a project is just small enough that a single developer can handle it, that is clearly the most efficient approach. The problem is that the lone-developer approach is totally non-scalable. If it turns out halfway through the project that it will require 20% more work than originally estimated, then the lone developer will have to add two or three colleagues at a late stage; and since there is no written spec, the project is probably going to end up even later and with worse-quality code. But of course this is not considered in a parable written to show that the lone-coder approach is superior.

Also, the factor-of-five difference in lines of code and bug count for solving the same problem is not far-fetched either. But I think it has nothing to do with the number of developers, but rather with culture and competency. It might just as well be the lone developer who writes a lot of bad code to solve a simple problem. Of course it could be a very competent single developer compared to a team of mediocre programmers, but it could just as well be the other way around. And in a team development effort, the most competent developer may perform code reviews and so on, so the overall code quality is better than just the average competency of the developers on the team.

The major difference between the two approaches in the parable is that one team actually produces a written problem analysis and spec, while the cowboy has it all in his head. The story suggests that the lone developer produces a perfect problem analysis and is therefore able to write the perfect program without the overhead of producing a written spec. However, the boss/customer has no way of knowing until the program is finished. If there was an error or miscommunication in developing the problem analysis, it is too late to change now, or at least it will be much more costly than if it had been discovered in the specification stage. Also, in the real world it requires communication with the customer to develop and verify the spec. In the story the development happens completely without outside input, which almost never happens in the real world, even if developers wish it did.


Thanks. I have read this many years back, but could not find it anymore.


I can't wait to get out of software development.

I'm just a couple of years removed from uni and I'm already planning my exit. My first corporate job was a huge eye-opener. This type of stuff left and right with zero career advancement and complete disdain from corporate.

I'm not even thirty and I plan on retiring and starting my own business (tech or other) by the time I'm 40. The only good thing about tech is that I can actually find a job and that the pay is above average giving me the opportunity to take the money and run!

It's a complete shame too because I enjoy solving problems through code. I love learning about new technologies. But everywhere I look software developers are completely screwed, absolute wasteland of an industry.


This comment is more for others than for you, but there are good companies out there that are engineering focused. Almost none of them are corporations in which "software" is just one division of many.


> It's a complete shame too because I enjoy solving problems through code.

Who/what's there to stop you doing just that, once you have taken "the opportunity to take the money and run"?


If we pretend this was a real story, Charles's manager should really have spoken to Charles about playing Space Invaders for 2 weeks straight. While Charles may be getting some valuable thinking time in, I can imagine he's also pissing off his colleagues who will feel they're working to support a guy messing around.

The exact nature of that depends on Charles, his manager, and the company, but it is possible to have those conversations without diving in and accusing Charles of goofing off.

Also, the manager made a huge mistake in judging the code based on what he saw. He should have spoken to Charles about this and tried to glean some insight into how it came to look so easy. If I'm speculating, the manager was looking to punish Charles for his behaviour and went about it completely the wrong way.


And that is why this is a parable—not a real story, and not intended to be read as one.


This parable is somewhat untrue, as it pats the "good student" engineers on the back while throwing "cowboy programmers" out in the cold.

Unless you are producing libraries or frameworks, management don't care about internals of software. It's an engineers' problem next time they ask for a modification. Management care about having their problems solved. That's why relations between management and engineers are so notoriously dysfunctional: engineers want to produce art, while management wants to sell stuff to customers.

A manager inspecting code? Really? A manager congratulating engineers for releasing half-working software while going over budget? Seriously? Managers hate engineers as much as normal people hate lawyers: they make everything sound way too complicated, to carve out more comfort zone to roam in.

Lone cowboy programmers that can conjure working stuff in a minimum time are praised by management, and insulted by colleagues.

And, since this parable was probably written by a disgruntled, good scholar engineer, I think this whole story is a little biased...


I think you missed the point.


...though the idea that "engineers want to produce art, while management wants to sell stuff" has a lot of truth, and does have oblique relevance to the post, in the sense that Charles' "goofing off" was about thinking about the best way to abstract the problem and write it elegantly and concisely.


tell me, please



