For me, the real issue is ownership. I come from a finance background and have seen "business people" often run technology into the ground because they simply can't let go of ownership or control. They want to define too much and leave little in the hands of the developer. Development managers can be just as bad with their overarching methodologies. Really bad ones stifle the creativity of their developers by micromanaging and just end up making people miserable. I personally don't follow any particular methodology; instead, I agree a well-defined deliverable and deadline with each team member, and just let them get on with it (with whichever methodology they prefer). The only things we stipulate as a firm are the version control system and the test procedures. The whole team has an informal chat once a week or so to make sure things fit together properly - no daily stand-up rubbish. It's not complicated or even that structured, but it works and my colleagues seem happy. Basically our philosophy is: hire good people, make them responsible for something, and let them find the best way to deliver. And definitely don't tie them up in stupid admin tasks as stipulated by the latest fad.
"The greatest people are self managing, they don't need to be managed. Once they know what to do they will figure out how to do it. What they need is a common vision, and that's what leadership is. Leadership is having a vision, being able to articulate that, so the people around you can understand it, getting a consensus on a common vision."
"We were in stage where we went out and thought, Oh! We are going to be a big company, so let's hire professional management. We went and hired a bunch of professional management but it didn't work out all well. Most of them were bozos, they knew how to manage but they didn't how to do anything!"
"If you are a great person, why would you want to work with someone you can't learn anything from? You know what's interesting? You know who the best managers are? They are the great individual contributors who never ever want to be a manager but decide they have to be a manager because no one else is gonna do a job as good as them"
Well, that's the big secret, and it stays mostly secret because nobody wants to know it:
Programmers/developers are only effective if either the developer himself or enough people in the team have sufficiently deep domain knowledge.
That means you can only write accounting software if you are an accountant, in addition to a developer. You can replace "accounting" with anything else you want.
Software developers don't want to hear this because it means that being a developer is near useless: it allows them to express themselves in code but... they have nothing to express.
Accountants don't want to hear this because it means no generic software developer (or firm) can deliver on the software they want.
The real bad news for software devs is this: you'll do a lot better as a bad developer with expert domain knowledge than vice versa. This is why Excel sheets and VBA macros can run for decades when great and easily maintained software cannot: the knowledge they were written with is what makes the difference.
Of course both situations are what you constantly see in the real world. Software developers just making software that doesn't support the function it was written for, and really, really badly written pieces of crap software that work amazingly well.
>That means you can only write accounting software if you are an accountant, in addition to a developer. You can replace "accounting" with anything else you want.
That's overstated. You don't have to be a full-blown accountant; you just have to have enough accounting knowledge to do your job. I worked at a funds company for a few years and didn't know anything about the business when they hired me. But the domain knowledge I needed to be effective in my little area wasn't that difficult to pick up.
>The real bad news for software devs is this: you'll do a lot better as a bad developer with expert domain knowledge than vice versa.
Nobody is going to thank you for producing software that would have been great if only it worked the way you intended.
> That's overstated. You don't have to be a full-blown accountant
Nope, ideally, you should be better than your average "full-blown" accountant.
> Nobody is going to thank you for producing software that would have been great if only it worked the way you intended.
Look, there are minimum levels of competency for a lot of things before you can do anything. In similar fashion, you also need to be able to walk, or at least get around, have some modicum of understanding of how to run a business (even if you're just a TL)... and so on and so forth.
This says that the most successful people in the business-software space are people who have domain knowledge and software skills, and write software in their domain. And those people do quite well, dominating their various industries (Lexis-Nexis for law, for example).
Ownership is good, but only up to a point. In this approach, developers end up being independent contractors on a payroll.
If you focus only on deliverables and deadlines, you'll end up with developers using a mix of different approaches, libraries and even languages. It hurts team cohesion and makes the logistics of project management much harder since Joe can't take over Tom's code now that Tom has the flu.
As I see it, one of the main tasks of a technical manager is to set conventions so that everyone feels at least semi-comfortable with everyone else's code. That's not micro-management, that's management.
One view of management is that it's about making sure the cogs of the machine are completely interchangeable. Everything is uniform, orderly, perfect. Everyone is replaceable. No one's contribution can be distinguished from that of any other. Everything must be measurable and measured. See how the graphs in our status updates always show progress!
This is actually an extremely inefficient and demoralizing environment for those involved. Yeah, it's easy to take over when someone leaves, because they were so hamstrung by the environment that they never built anything interesting. And you're going to be doing a lot of this taking over, because people are always leaving, because they were hamstrung. So this idea that we can't trust individuals to stick around and do a good job, and that we have to make sure they never have enough power to do damage when they make a mistake, is a self-fulfilling prophecy. It makes them untrustworthy and drives them away.
It really is about ownership. Programmers who are achieving at a high level move much faster and do much greater things. It's worth letting them make mistakes to retain the best people and get their best work. The only catch is figuring out how to keep them accountable for their decisions. OK, you want to use this new tech or try this new architecture. How do we tie your compensation and career progress to the success of those decisions?
When the mantra among developers is "the only way to get a raise is to change jobs", and people are moving every couple of years - yeah, you need to plan for cogs.
No amount of ownership is going to cover for an otherwise brilliant developer who jumps ship once the project has shipped its MVP. And if that brilliant developer wrote that project in, say, Haskell (because they wanted to learn it), that project is probably doomed from a financial point of view - a quality Haskell developer willing to do maintenance on an existing project will likely cost more than the project is worth in the first place.
The root of this problem does lie in management, but it's a self-fulfilling prophecy at this point: employees have taken the lesson to heart. Even companies which do give fantastic benefits are still going to see high turnover, and need to account for that.
There's a big spectrum between "interchangeable cogs" and "nobody even writes in the same language or architecture" and I don't think either of those extremes would be a good development environment.
> One view of management is that it's about making sure the cogs of the machine are completely interchangeable.
That's not what I was talking about. Employees are neither bricks nor cogs. But having said that, if you run any long-term project, you should expect people to come and go, it's part of the game. Thinking that this will not happen in your project is simply negligent.
> This is actually an extremely inefficient and demoralizing environment for those involved.
It doesn't have to be. Working with common conventions and tools doesn't have to be demoralizing. Conventions should be set for a good reason, after an open discussion, and possibly even a vote ("Do you think that it's worth bringing in this library?", "Should we all use the same IDE?"). Also, conventions aren't set in stone, they can change over time.
This gives the team ownership of their project. It makes passing knowledge between team members easier, it makes it easy for team members to help each other when they've run into an issue, not to mention that it makes code reviews and various technical discussions a lot easier.
If Bob is writing in a different language or with a different set of libraries than the rest of us, who can review Bob's code? Who can help him out when he's stuck on some bug? Who can help him flesh out his design ideas for some feature he's writing?
Without team cohesion, you're creating not a team of developers, but a bunch of individuals who happen to work on related stuff. I don't know about you, but to me that sounds a lot more demoralizing. To me, one of the great benefits of working on a team is that we all get to learn from each other, plus we get to share a common goal. Agreeing to some common conventions seems like a small price to pay for that.
Team cohesion is definitely not about using the same IDE.
I worked at a place that essentially standardized on Vim. They emphasised pair programming so IDE standardization helped with that. If you already use Vim, sounds great. If not, probably sounds terrible. So what looked like team cohesion was homogeneity. They subtly turned away a lot of people who didn't fit the mold in what should be a minor detail.
I've worked at two companies that basically banned Redis. Having a conversation with an SRE who had a bad experience, trying to convince him it's worth having this tool available, is one of my worst professional memories. Consensus driven technical decisions suck. It's better to give people authority and responsibility. I would have happily admined it myself.
Nobody's going to review Bob's code, not really. If they're not working directly with him on the project and it's doing anything reasonably complex, then they don't have the context to say anything useful beyond catching typos. Things get rewritten every few years anyway.
Ownership (by devs) doesn't have to mean a hyper-personalized idiosyncratic creation. Part of the pride of ownership can be making something understandable and modifiable by other people.
The bigger problem is that many companies do not reward that, they reward doing it quickly and not asking too many questions.
The idea of interchangeable cogs is completely unattainable if a non-technical manager is the one hiring, replacing, etc. the cogs. If you think about it, it makes sense that the only type of person who could ever theoretically pull this off would be a master programmer who understands the project at such a deep level that tasks can be broken down into pieces consistently. I'm not saying it's possible, but it's definitely impossible for any non-technical manager to do.
A manager is not a developer, only developers can set good technical standards. A manager's job is to make sure there is a lead developer with clear authority to set technical standards.
A technical manager can certainly be a developer. I'm not talking about business people setting technical standards, I'm talking about tech leads/lead developers.
> one of the main tasks of a technical manager is to set conventions
I'd be more inclined to say that one of the main roles of any manager is facilitation, more than anything else.
Any manager that "sets convention" is somebody who wants the easy part of technology without the corresponding hard part.
Talking about technical solutions is easy. Implementing them is much harder.
I've come across multiple "technical managers" who were last hands-on with code more than x years ago, and they always end up talking out of their arse.
And that's not to mention the numerous wrong choices they've committed to, or spoken about at a high-level meeting, with zero notion of what's involved.
I think that we're talking about two different types of managers. I was talking about a technical manager, as in a tech lead/lead engineer/lead developer/whatever you want to call it - someone who has to manage and coordinate the day-to-day of people doing technical work (on top of taking part in said work). I think you're talking about someone more business-oriented.
Conventions set collaboratively by the team as a whole gain better traction than those dictated by a leader.
Common ownership does not end up with a random patchwork of technologies. Teams make collective decisions on them.
I might like to write CGI programs in Prolog with a MarkLogic db, but I'm not going to unilaterally decide to write my bit of a team application like that when everyone else writes WSGI Python applications with Postgres.
Managers manage and Bosses Boss. A good manager is like a lineman guarding a quarterback. Everyone wants access to the engineers for features that customers want, bug fixes, etc. A good manager won't let anyone near their engineers. They will filter everything. A boss will toss every... single... little (and big)... fire... at the engineers. A boss will then wonder why nothing ever gets done. A manager manages, staying on top of everything every day. A boss reacts and knee-jerks to everything that comes up "unexpectedly".
That's a pretty big 'if', though. Hiring good people is difficult. Having bad people is more common and that may be the justification for methodologies.
They don't exist to make good people more productive but to make mediocre hires marginally productive.
That is a very astute observation and I do agree with it. I would argue, though, that a lot of mediocre people are actually very smart, but have been trapped under micromanagement all their lives. As a result, they constantly expect to be told what to do. They have the ability to think cogently, but believe it imprudent to do so. I've hired a couple of "non-performers" in my firm from corporates, and once they've been thrown in the deep end a couple of times, they're actually pretty productive.
To be honest, my own senses were numbed by being a yes-man at a corporate job. My job title made it sound like I ran the world, but much of the work I was made to do was simply idiotic. And I was very happy to toe the line because it was easy, you could always spread blame and I didn't have to think much. Then I realised I have a limited amount of time on this planet and decided to do something more useful with my time...
I feel like this actually explains a surprising amount of things that look silly at face value.
Another example I've talked about a lot with my colleagues is company-wide coding style definitions. They usually have stuff like "never ever use goto", but then you have Linux kernel code using goto, and it seems all weird. But here too, the coding convention that feels "stupid" to the rockstar coder is in place to make the code of the summer intern even remotely usable after they are gone.
Not that it was implied, but it isn't just the "summer intern" who sometimes needs coding-convention guidance. It can be as useful as this: you come onto a project and at least have some chance of (mentally) parsing the code without a huge melting pot of styles across all parts of the project (you'll obviously still get some, but with a convention you at least start from a slightly better place).
goto is used in pure C code as a substitute for exceptions, to prevent unnecessary repetition of the resource-deallocation code. Even in C++ code, it is often way faster than exceptions. Goto is not scary.
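For anyone who hasn't seen the idiom, here's a minimal sketch of the goto-based cleanup pattern (the process() function is made up for illustration, not taken from the kernel or any real codebase):

    /* One exit path releases resources in reverse order of acquisition,
       so the unwinding code is written exactly once. */
    #include <stdio.h>
    #include <stdlib.h>

    int process(const char *path) {
        int ret = -1;                  /* assume failure until proven otherwise */
        FILE *f = NULL;
        char *buf = NULL;

        f = fopen(path, "r");
        if (!f)
            goto out;

        buf = malloc(4096);
        if (!buf)
            goto close_file;

        if (fread(buf, 1, 4096, f) == 0)
            goto free_buf;             /* error unwinds through the same path */

        ret = 0;                       /* success falls through the cleanup too */

    free_buf:
        free(buf);
    close_file:
        fclose(f);
    out:
        return ret;
    }

    int main(int argc, char **argv) {
        return (argc > 1 && process(argv[1]) == 0) ? 0 : 1;
    }

Each failure jumps to exactly the cleanup it needs, and success falls through the same labels, so the deallocation logic exists once instead of being copy-pasted at every early return.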
Very true. I've found at every company that I've worked at that a "bad" process followed consistently by everyone always outperforms the "perfect" process followed haphazardly.
This reminds me of the recent threads about people using Excel instead of "real" software. At the end of the day, you are trying to add value. As long as you aren't adding technical debt or other risks, the how matters much less.
Taking on technical debt isn’t necessarily a problem; zero-debt methodologies are the worst! The ones that acknowledge debt as necessary, and so deal with it, are much better.
>"business people" often run technology into the ground because they simply can't let go of ownership or control
Maybe I am one of those people? The first time I hired a developer (contract, and senior contract at that) I had written a prototype of the software I wanted, with the core features working, but it was clunky. It was to interface a modern piece of software to a legacy system with a poor API. I gave him the code and said that I wanted that same thing, but written in a professional and maintainable way. I asked him to let me know anything he was not sure about, and I would document it further. 'Let's be Agile about it, we don't need to write heaps of documentation', he said. Apart from having to make him recover from various flights of fancy about new features I hadn't asked for, he kept blundering on with things he hadn't understood properly or lacked the specific domain knowledge for. I had those things, and many were working in my prototype. Several times the project stalled due to a problem he couldn't fix, and I recovered it with my limited (at the time very limited) coding knowledge. In the end we went live with his solution, which never quite achieved its aims, but when the business found new requirements I couldn't face this again and wrote the whole thing from scratch.
One bad dev? Well, he was a lot like many of the ones I met subsequently, tbh. Far too eager to find the one vague thing in what you asked for and interpret it the wrong way. Far too quick to think that users should bend to fit the software, and far too willing to plow on with code when they should have been looking at a flow chart. The great development managers I have met are the ones that have spent considerable time exploring the domain they are working in, know how to talk to business people, and stop and ask when something is ambiguous. Sometimes you need to get out of the tech stack and think more in terms of processes.
My methodology is now something like this.
1) I write in plain English what I want (thanks Joel Spolsky)
2) I bullet point my definite requirements
3) I explain a process in simple flow chart blocks
4) I send this to my devs in good time
5) I sit down with them and explain it again, drawing charts as I go if necessary.
6) I invite and expect questions/challenges and note them down
7) I amend my docs and reissue it
8) I let someone else translate whatever I wrote into 'user stories' or whatever else they want to do
9) I test them against the requirements I first wrote, so now I know that I have conveyed my meaning correctly
10) I meet with them regularly and take plenty of time to just talk through where we are. I like to have a mix of business people and engineering in the room, because it makes the devs talk at different levels.
11) I get into UAT as soon as there is something to show, with the least tech-savvy people, who have no prior understanding of the project. Secretaries, clerks and call centre operators all find different faults than the tech people, because the tech people don't do their jobs every day. They ask all the straightforward questions that you never thought of.
Sounds obvious, but I meet with a lot of third party agencies and developers who look like they have never seen anything like requirements from a client! I have had them say things like 'but we use Agile, we will collect our own user stories'. How dumb is that: 'we are the smart ones, we will cut out the people with domain knowledge and guess'! I tell them they can do what they like, but I will test their finished article against my original requirements when I am deciding if I am going to pay for what they did!
The main thread of this is to force as much human interaction between the developers and the business as possible, all the way through the project.
Yes, when I started in this business, developers were expected to work hand in hand with the business to understand how the business worked and why a feature was warranted, then find an elegant solution in software, and even suggest new solutions to fine-tune the business processes encapsulated in software.
That still happens, but it seems the "cogging" of developers has largely made that a rarity as cheap offshore developers aren't expected to do that sort of thing anymore.
As an aside Spolsky should be required reading for any person who oversees any department that interacts with developers in any way. Most people think of development as a scientific endeavor, but it's largely artistic with mathematical tools.
I'm sure it is possible to have an artistic layer of software management that reduces everything to "write me a function that passes this test", which you could outsource to wherever. But having reduced all of the code to that level, you have done all of the hard work of writing it anyway, haven't you? By involving all of the junior devs in that reduction process, you are also teaching them how to do it, so they can be the next 'architect'.
An interesting and sort of related aside: I do some work with a company that makes old-fashioned 'Enterprise software'. Their process seems to be that the back-end dev gods write functionality that is convenient for the database, and then the front-end people make that accessible to humans in the most convenient way for them. So instead of the software working the way humans do, the humans have to work the way the database does! When you talk to their back-end devs, they talk down to you like you are an idiot for not following the way they work. Their db implementation is actually pretty good; the application, however, is only really usable by people who can write SQL! If their API didn't also suck, it would be tempting to re-skin the whole app.
>> Sounds obvious, but I meet with a lot of third party agencies and developers who look like they have never seen anything like requirements from a client!
The first job of a dev is to define the requirements with the clients. But most devs don't know how to do that, or don't want to.
Set a deadline for coming back with better information. If after, say, a week or two they still can't narrow down a timeline (to something like "1-10 weeks" or "6-18 months") then either simplify or abandon the task.
I would advise not asking for timelines at all, actually. Instead, ask roughly what they will have to do. How many tasks will it need? Do they need new data structures, or is it modifying an existing one? Same for logical components: modification, or addition? In either case, what are they modifying and what are they adding to?
If, as the manager, you don't know enough about the codebase to get a sense of "the last X development efforts on component Y took about Z weeks of effort", then it is your job to get to know the components better.
Note that doesn't mean building up the expertise to be able to make the changes yourself. Just that it should not be completely Greek to you.
Also, I think I figured out what I don't like about this answer - I don't want a dev to have outlined every task and designed the data structures to be used before they can figure out whether a feature will take closer to 6 or 18 months.
And this is where I'm confused: where does a dev learn the skill of figuring out how long a feature will take before figuring out the user flows and data structures?
I've never worked on a project where I was estimating something at a scale of 6 months. But if you ask me whether something will take 1 week or 4 weeks, before I've broken the task down, my intuition is going to scream "I don't know the answer to that."
If I told you "Somewhere around 1 week to 2.5 months", would you accept that answer? Or would you think I was trolling you and we should have a conversation about my performance and place at the company?
If I instead told you "2 weeks", how would that be anything other than a lie?
I get the impression we are talking past each other. I certainly don't want every task outlined. Indeed, I don't necessarily want to know new data structures. I would expect an idea of what has to be modified. And if that isn't known, then I'd expect any estimate to be bad.
That said, my main point is that time estimates lead to bad negotiations. If someone says it will take six months and you need it in five, what is on the negotiation table? Just a month? This is how our industry often finds itself in crunch time, making up for time estimates gone awry.
Whereas if there is a list of things that can be negotiated, you can order the construction such that things are natural cuts.
We do seem to be miscommunicating. If the estimate is six months and the deadline is five, then you ask what can be cut to make that work, and you talk in terms of reducing scope or quality of the work. And it doesn't work to just do things in order of importance - maybe if I need this feature this month it can be done, but if you give me three months then I can implement it in a more resilient/scalable/maintainable way that will also solve problems y and z. My point is probably that a lack of time estimates lead to bad planning.
My guess to where we are talking past each other is that I am asserting you can go from the work to the time. So, you should ask for that. You cannot go from the time to the work. (Again, both are assertions.) You seem to be saying you should just ask for the time, and only dig on the work if you aren't satisfied with the time.
If you are able to turn every estimation session into a series of back and forths where it is "how long?" followed by "why?" if you aren't satisfied, then I feel we are essentially agreeing. Whether you are asking for it or not, you want them to estimate the work required and to summarize it into a time.
And to be perfectly clear, going two people removed from the work, this is required. Similarly, getting 3 people removed from the work, the relevant question will not be "how long" but "how many dollars?"
Similarly, it would be nice to think everyone will eventually need the skill of estimating the value of a feature or product. Because, at the end of the day, that is what is most important.
However, I'm assuming anyone asking someone specifically for an estimate is one of their managers. And they should have more familiarity with what they are asking their colleagues to estimate. And I'm also asserting each of these skills is not trivial. And that they build on each other.
I would probably ask for a timeline even if I felt like I could predict it myself, to help them learn how to do it. And as a manager, you can't always be an expert - sometimes you're the new guy, for instance.
But a timeline is of seriously limited value. You can't burn down on time. You can't combine time estimates. You can't work uncertainty into time estimates. The list could keep going.
I agree no estimation process is perfect. Nor do I think you should do comprehensive estimates on new work. However, the thing you want to know is how much work there is. Not necessarily when you'd like to release a year from now.
Odds are high you have a deadline. So the incentive is high to keep the estimate below that line.
Of course you can work uncertainty into time estimates.
That aside, the incentive is high to keep the amount of work planned below the amount of work that can be done before the deadline. It doesn't matter how many abstract units or concepts you use, that's what you want to know if you have a deadline.
Working uncertainty into time estimates still just gives you two sets of times to consider, with no way of knowing why you missed the low one and just if you will hit the high one. (Unless you do really wide bars on your estimates, in which case, the high bar is going to be worthless.)
That is, if you give me a high-confidence and a low-confidence estimate, how do I know why you missed it when the time passes? More, how "close" to making it were you? If you got half of the work you estimated done by that time, I'd know you were about halfway there. If the date just passed, all I know is that you didn't make it.
And I used "I" up there. But it is really even more personal. All you know is that you didn't make it. You literally don't have anything to learn there.
Contrast that with any work you do: look back and quantify what you did, not how long it took to do it.
> If you got half of the work you estimated done by that time, I'd know you were about halfway there
And only 90% of the work remaining!
It sounds like you are imagining an estimate as a standalone number that isn't revisited or given more detail. If that's how you do it then of course you can't learn from it. It's supposed to be an ongoing process - if you said '3-6 months' when we started then after 2 months I'd expect a different answer. I'd also expect the initial estimate to be accompanied by an explanation that talks about the work to be done and which areas are causing uncertainty.
If you know of anything to read, watch, or work through to learn to estimate tasks, I would be extremely grateful if you could link or describe it here.
As it stands, I’ve only ever been able to estimate tasks if I’ve done similar ones a few times before. If asked about a new type of task, or one with a new toolset, I would currently have to refuse to give an estimate more fine-grained than a week; the stress and shame of lying to you would be too much otherwise.
Consider how you would estimate anything else. Say you want to retile a kitchen. Would you expect someone to just say "I could do this in a weekend"? Because that is what the TV shows typically seem to suggest?
Or would you feel more comfortable asking how much they will have to actually do? For example, they would have to disconnect the fridge and move it out of the way, disconnect the oven and move it out of the way, then buy enough tile and mortar for the space, which would be X boxes, and then clean and do the work.
The benefit of this way is that when you come back on day two and none of those tasks has been done, you know the job isn't likely to get done on day two.
So, try to use the same approach for programming. Don't just say "it will take 2 weeks." Instead, say it requires updating X component, modifying Y tests, incorporating Z new dependencies, etc...
As a team, you can try to put size estimates on each of these. But don't spend too much time on that. Experience is the secret, not perfect estimates. (That is, the more things the team has done, the better they will estimate what they can do.)
Right, my current approach, which works fairly well, is to spend some time writing out a couple of approaches and splitting them out into coherent, delegatable tasks. This results in a 2-3 page doc that I can check with other engineers and business stakeholders to be sure it does what we really want it to. It also means we notice if an assumption we made is untrue and we need to re-think the scope or approach of the project.
I’m just worried that at some point, somebody will come along and ask for an estimate and I’ll say I cannot give one and that this means I am not really a professional.
Just be sure to do retrospectives on these documents and I'd wager that you are beyond any skill that I can lay claim to. Honestly, sounds like you are already beyond my skill levels.
Can't say if that helps strengthen your claim to professional, but that practice certainly sounds professional to me.
So for time-based estimates, I've been given the advice to take what I think it will be and multiply by a factor of 4.
If it can be that far off (or more), what is the business value? And how do I know that the person I'm talking to will realize, when I say "this will be done by the 20th", that I am actually thinking "I have no clue"?
> I think programmers should pay much more attention to listening to and working with their peers than to rituals and tools, and that we should be skeptical of too much process or methodologies that promise to magically make everyone more productive.
Sounds pretty much like:
> Individuals and interactions over processes and tools [1]
I don't know why, but somehow people don't get that agile is not a methodology but a spirit.
Oh wow amen to this. I've started calling one of our managers "Reverend". Particularly when he begins a meeting with a statement that's starting to sound a lot like "we are gathered here today to...".
I'm sympathetic to agile but it does have a quasi-religious feel to it. I've noticed that its proponents make claims about the right way to do things with an unwarranted level of certainty. When I ask "why" (politely) I seldom get a satisfactory answer and often they get offended by the very question.
All methodologies are created for people who do not understand "why", they are inherently religious. Otherwise why would you need a methodology if you do understand all the "why"s? You can just develop one much more suited for your people and your projects and fine tune it over time.
And this is not helped by the true believers constantly saying "you're doing it wrong."
So, point me at someone doing it right then! Because the landscape is coated with people "doing it wrong", and since I'm doing this as a job, I don't have time to sift through piles of pyrite for a single nugget of gold.
You forgot the tools! The current "capital-A Agile" project I'm helping out at started with long sessions where management and Agile Coaches developed a highly sophisticated (buggy) JIRA ticket workflow.
Even now, when we ask for something simple like a new Confluence board, we have to actively push back against new rules, additional restrictions, and more gimmicky Atlassian plugins. It pains me that these misguided parasites are paid to make my life worse.
I've said something like this before, I'll probably say it again.
The change to Agile is often led by management, not by developers. And when it's led by management, it's done in a way that keeps management central to the development process (which is the opposite of original Agile), which means an over-focus on process.
Part of this is cynical survival skills: management wanna manage. The more forward-thinking ones probably realize that original Agile is an existential threat, and they can 'get ahead of' that threat by controlling how it's implemented. But most commonly, it's just plain simple myopia: Process guys will tend to view Agile as a process because they're process guys. And when they implement it, they will make management of the process the central role in everything.
As soon as they start to treat it as a religion, it's over. It's all about the team. I have worked with all the methodologies and it all comes down to the team. For a good team it doesn't matter if you do waterfall or agile; focus and dedication are the key.
There is such a thing as physics in software: between time, scope and people, one of them almost always has to give. The exceptions I've seen are with mature and well-bonded teams working on familiar scope they understand clearly, with a timeline they themselves defined.
I've found the best "methodology" for delivering decent results is sticking with short iterations. Software is often about doing something we've either not done before, in a way we've not done it before, or with people we've not done it with before. So we will have surprises (aka delays) on the way. The more frequently we check just what these delays are, the more realistic we can be about whether we can make it on time, or whether we need to cut scope or pull in more help to make it on time.
> There is such a thing as physics in software: between time, scope and people, one of them almost always has to give
This can be true but can also be completely false. Massive differences in productivity are possible depending on how individuals work together on a team.
True. Also, there is a fourth variable where you can cut corners (even if it's almost never a good idea): quality.
Great teams can produce much more than mediocre ones, but they too have a limit. When the deadline is set too close, one of these 4 things has to give, and it is good to know in advance which one that is, so the team can set its priorities accordingly.
> Massive differences in productivity are possible depending on how individuals work together on a team.
Addressed by this:
>> Exceptions I've seen are with mature and well-bonded teams working on familiar scope they understand clearly, with a timeline they themselves defined.
Add: process vs. getting work done. We have to be cognizant that every minute spent on process is not spent on writing deliverable code. I know, process can help ensure correctness etc. But when the job becomes the process, and not writing and delivering code, then everything slows to a standstill. "And we're going to keep having these meetings until we figure out why nothing's getting done around here."
> "And we're going to keep having these meetings, until we figure out why nothing's getting done around here"
Do you work with me?
I've been fighting this battle for the last 3 months. They've added almost an hour of meetings every morning at 9am. Half the office shows up at 8:30. That's enough time before the meeting to check email and get coffee, so essentially no work starts until 10. At 10 they get back to their desks, there's a bit of whining about whatever management has changed that day and how stupid the meetings are, some email correspondence and by 11 nothing is done and they go to lunch. They get back by 11:30-12 and finally start doing work. So your 8 hour workday turns into maybe 4 hours of real labor time, and no one works 100% of that.
This sits uneasily with me. I cannot remember any time when a Methodology was followed in practice. Just my bad luck?
The religious people (mentioned in the article) were harmed by false adherence. They adhered to the headlines and warped the substance of what the Methodology said. I remember (with pain) a place that wouldn't develop development scaffolding. They had rules for software development, good ones, motivated by achieving near-perfect uptime for customer-facing services. But implementing a scaffolding service or crontab to that standard was a lot of work.
Then there's the non-adherents who eroded the Methodology. Like the scrum shops that eroded scrum by deemphasising the product owner and stories until the result looked more like a waterfall.
The Methodologies may be broken as a whole but the practice I've seen was generally so distorting that I feel it's unfair to blame the Methodologies.
This would be an excellent point if there existed a methodology that people do manage to follow.
Some of the failure I've seen can be partly explained by people who wanted to have their cake and eat it too. Who wanted, say, the promised advantages of Scrum but were not willing to pay its costs (the lack of long-term plans and fixed finish dates).
That's not all. It's part of the explanation for some of the suckage I've experienced.
I do blame people for not making up their minds. The people who invented scrum were willing to give up some parts of long-term planning, and got remarkable results for that. They are not to blame when others later failed by not giving up blah.
Maybe some blame should go to consultants who oversold the benefits of Methodologies without stressing the costs: "YES, YOU ACTUALLY HAVE TO DO THIS, IT WILL WORK BADLY OTHERWISE".
I take the other view. I actively blame some of the people that "invented" these methodologies.
Being introspective of your capabilities and your achievements is an extremely valuable skill that I wish more of us had. However, selling your capabilities and giving vague promises that it will help with development if you only followed these practices is a deceitful way to make some money off of your reputation.
Worse, many of them do this by attacking those that came before them, but then taking the stance that their "teachings" are above attack. And that anybody that isn't getting the same benefits they did just aren't applying themselves correctly.
Maybe this is an embodiment of the rule that "all complex systems operate in failure mode most of the time." The fact of failure mode doesn't necessarily justify abandoning a system, because the system that replaces it will also run in failure mode.
Note: I understand "failure mode" to mean that rules are being ignored, or guards have been disabled.
Methodologies are like any other pattern of human social behavior: if they're not structured in such a way that individual actors have a reason to buy into them, then they don't work, no matter how great it could be if everyone just bought in all the way. Literally everything else in human civilization is like that, why should software development methodologies be any different?
I think methodologies could benefit by being treated like software, since that's effectively what they are - 'human' software to manage teams. That means:
* Frequent releases (i.e. do iterations or 'sprints' or whatever).
* Accept that your methodology has bugs and 'fix' those bugs between releases. Most software is horrible and buggy. Don't trust the "methodology gods" to have written a perfect piece of working software. It's probably half-assed and worked semi-well for their specific use case, so they 'released' it along with a reality distortion field.
* Accept that different use cases require different methodologies. Writing space shuttle code? You need vastly different team dynamics to a group of 10 people at a marketing agency running short lived campaigns.
* Follow the UNIX philosophy: don't have ONE methodology that you follow to a T - string together a bunch of small, self-contained rules and team processes that serve your purposes and iterate upon them.
tl;dr fuck scrum. it's the internet explorer of methodologies.
Scrum is great for non-software companies that need to write software.
They will not attract the most talented software developers (on average, not in all cases), and the business people for whom the software is a means to an end care more about consistency and predictability rather than quality.
As a result, fungible resources (humans), deeply regimented stories, regular delivery milestones (sprints), and consistent velocity ARE the best possible outcome.
I don't think it really matters what kind of company you work for. I've worked for many software and non-software companies, and the same issues crop up in both.

The main one is that scrum accelerates the accretion of technical debt, which "business people" can somehow not care about right up until the point where it drives them out of business.

It has some good ideas (retros, sprints, no deadlines) and some terrible ideas (treating team members as fungible, story pointing/velocity, too many meetings, the PO having to make decisions about specific pieces of tech debt).

My main problem with it is the teachers, coaches and promoters who take an all-or-nothing view of it and who treat deviations from the official 'scrum' policy as, by default, a problem with the team rather than, potentially, a bug in Scrum.

I used to think that it was a good base to work from, but after arguing fruitlessly with the people who take a religious approach to it, I've come to the opinion that it just needs to be trashed wherever possible, because the problems it does have will only be resolved by moving on to something else. Better to move on sooner rather than later.
In my experience, if you track velocity long enough, and 'categorise' the values based on things like team size, technology, etc., you get values that are useful for predicting how much work you should commit to in an iteration.
Yes, things change: productivity, morale, team members join and leave, some teams are shit at estimating, etc. - but at least in my experience, if you take an average you do arrive at a useful figure.
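As a toy illustration of that averaging (all numbers invented, and obviously only meaningful within one category of team and technology):

    /* Suggest a commitment for the next iteration from the mean of past
       velocities within a single category (same team size, same stack). */
    #include <stdio.h>

    static double mean(const double *xs, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; i++)
            sum += xs[i];
        return n > 0 ? sum / n : 0.0;
    }

    int main(void) {
        double velocity[] = { 21, 34, 25, 28, 30, 26 };   /* past sprints' points */
        int n = sizeof velocity / sizeof velocity[0];
        printf("suggested commitment: about %.0f points\n", mean(velocity, n));
        return 0;
    }

Nothing fancier than that is needed; the point is that the average converges on something useful once you have enough sprints in a category.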
In my experience if you have some sort of measure that looks a bit like productivity then senior management will latch on to it and treat it as a proxy for productivity.
That inevitably means that developers have an incentive to inflate their estimates, which means story point inflation.
My own view is that something analogous to Conway's Law is at work: software design reflects the constraints inherent in the software methodology used.
The problem is not software methodologies per se; it is trying to apply a methodology where its priorities are fundamentally at odds with the requirements and goals of the software being built. The root of the problem is the notion that there is one software methodology that is efficient and productive for all possible types of software development. I would argue that there is an optimal methodology for most software, but it is a different methodology for different types of software.
If we discard the oft-argued proposition that a PHP website, an embedded system, and a high-performance database kernel can -- and should -- all be developed with the same software methodology then this entire discussion goes away. A software methodology is a tool; they work best when you select the best one for the job.
I think this is why DevOps is becoming so popular. It's less about methodology.
There's no certificate, role, set of tools or prescriptive process. There's no specification, it's not a product, or job title. There's no one true voice on what DevOps is or isn't. It's about attitude, ideas, customs and behaviours. Culture, paradigms and philosophy. It's a way of thinking, a way of doing and a way of being. Practicing as well as preaching. It's a conversation. It's about taking the best experiences and sharing those with others.
The company I work at did this. The CTO and I even joked about it being "DevOps in name only". It's gotten better, but they're really still an ops team.
I mentioned to some people at my company that "devops" was meant to be more of a philosophy than a separate team and they were bewildered. But I guess I'm happier with ops being a separate team anyway.
Right? Ops, but an ops team you'd trust touching the code.
I feel like devops to most just means "ops but closer to the code now that there are so many code/release management tools and someone has to manage it all".
I've never met an ops guy who does anything resembling development, other than ones who officially split duties as a developer and an ops guy. So I still have no idea what devops actually is.
There's what it is, and what's practiced. DevOps is a culture change where operations and development work more closely with each other. There's more to it, but tying it to something practical: A shop that embraces the concept of DevOps is shortening the time between implementation, testing, and deployment and working towards the ideal of continuous deployment.
Consider Waterfall where development happens, then it's tossed over the wall to testing. Almost everyone acknowledges that this is a bad idea and so testing and development happens concurrently now, but not necessarily by the same people (can still have testers and developers). I guess DevTest doesn't have a good buzzword sound to it, but it's what we do.
DevOps sees the same issue with how developers finish the work, it gets tested, then it's tossed over to ops who don't really understand what the code does (not their fault, they didn't build it). DevOps brings the two groups closer together so ops still does ops, dev still does dev, but as a group they shorten the cycle so that dev can deal with the real issues faced by the operators instead of always lagging months and years behind. It's not necessarily an organizational change, the critical part is opening communication between the two groups.
In a way, it's an extension of agile (little-a, because I don't mean the shit the coaches sell), where the operators become the customer and development gets feedback from them. It's one of those "obvious" things that for some reason isn't very commonly practiced.
Now, that said, in practice it has issues. Management sees it like your comment and tries to make dev do ops or ops do dev, or mash them into one team (which may or may not work). More likely they try to make the devs do ops work and it's a disaster. It may run well, but features are added slowly. Or features are added, but the operations story is a nightmare. They end up understaffing the group (1 devop = 1 op + 1 dev, right?) and creating problems.
The original agile manifesto didn't have a certificate or set of tools either. There is plenty of DevOps tooling, and I've little doubt that management types will hijack DevOps for their own gain and offer certifications. It'll probably be easy for them, since there is no single 'DevOps'; everyone has their own idea of what it means to them.
In my experience, devops is more often than not a codeword for lower-prestige (and lower-compensation) grunt work that no one really wants to do but that is essential nonetheless. It is a good environment for camaraderie to develop.
ITIL is not really known outside of IT at large, bureaucratic companies that look at operations as a cost center (even in tech companies, there’s some basic internal humdrum IT, after all), and those places are probably where I’ve had my lowest professional compensation historically as an engineer who does infrastructure and operations at scale. A lot of these same places are basically going to fail at any interpretation of the term devops, because change is so unbearably slow and because most of these organizations are in crisis mode as a business, spending money to stay relevant as technology-first companies eat their lunch.
I find the problem with development "methodologies" is the people implementing them. Frequently it's the Agile Master, or someone who has read the book and been granted a certification, but doesn't understand the subtleties of company culture, team capabilities, the project, etc. They just want the "process" to be followed.
Methodologies help.
But ultimately there is no methodology that lets you make money, deliver products that buyers want, outsmart your competition, eliminate team/corporate politics and so on.
I don't know why people buy into the "when we'll be agile enough, everything's going to be ok; until then, let us use this whip and self-flagellate for not being agile enough" mindset.
It's not exactly how I interpret it. I believe a methodology functions due to proper embedding in the team. The embedding happens (or has happened) due to good personal interaction, shared beliefs and shared knowledge. In other words, the methodology is a result of the personal interactions of the team.
I've seen many teams claim to have switched from 'waterfall' to 'agile', but the only ones that haven't slipped back into a 'waterfall'-like mode after a few bad releases have been ones that put a strong emphasis on tools - specifically testing tools.
Good interpersonal dynamics are important, but the sheer level of irony that the first rule of the Agile manifesto emphasizes deprioritizing the main thing that will actually get you away from waterfall-like development is pretty staggering, IMHO.
Never mind. Forget the tests. If you have a meeting at 11am every morning where anybody who sits down is shouted at then you've done it. You're "agile".
The manifesto emphasises the importance of sharing the values of a method before actually applying the method. It's the difference between wisdom and dogma. Take tests, for example: blindly going for coverage metrics will be much less effective than having a deep, shared understanding of the pros and cons of testing. Knowing that your co-workers have a similar understanding of the methodologies makes the methodology itself of lesser (or no) importance.
Personally, I am of the opinion that a strong emphasis on test-driven development in the long run will cause waterfall-style development. Tests are all about risk-prevention, instead of risk-mitigation. Prevention eventually becomes exceedingly expensive, whereas mitigation is all about building robustness into the running system. Due to that, the scalability of mitigation systems, such as true micro-services or actor-systems, are inherently more dynamic and cause less latency in development.
I don't understand your last statement. It seems to confirm my position: "... anybody who sits down is shouted at ... ". The process (standing up, not sitting down) is less important than good team dynamics (not getting shouted at).
(edit: down-voters, please share why you down-vote! I'd like to know. Also, please don't down-vote based on opinion, but on weakness of argumentation instead.)
I think you are getting down voted for questioning TDD. TDD seems to be another religion. You're insulting a sacred cow for some.
Anyways, I would really like to flesh out the avenue of thought you've planted in me: risk prevention vs. mitigation, and how it applies to software development.
Well, at least I now know why you down-voted :) Sometimes I ask because I feel a comment was properly written, non-offensive and with reasonable argumentation. I put time and effort into finding the right words and thought through my argumentation multiple times before writing it down. Since English is not my mother-tongue, I'd like to know if i caused a possible misunderstanding. Hopefully others share my belief that down-voting should be reserved to punish abusive behaviour.
I understand the desire for an explanation, but more often than not, in my experience, a comment that has quickly been downvoted in a fit of follow-the-leader will soon be voted back up into positive territory. You just have to be patient :)
I usually only down vote people who begin or include 'i know I'm going to get down voted...' in their original post. If someone edits asking for explanation I usually give them leeway... It tells me they are monitoring the conversation and willing to engage thoughtfully.
>Take tests for example: blindly going for coverage metrics will be much less effective...
I agree. However, this is about tools and processes, so it's not really relevant to my argument, which is that you shouldn't explicitly deprioritize tools and processes.
>I don't understand your last statement.
I'm making fun of people who cargo-cult 'agile'. Actually standing up is the least important feature of stand-ups. Sitting down during a stand-up and seeing people's reactions is a good litmus test. It highlights those people who put a great emphasis on non-functional rituals.
I was trying to convey that fostering an environment within which tools and methodologies can be understood for their specific merits is more important than the tools and methodologies themselves. Or: a tool shouldn't be a crutch, but a training aid. At least, that's how I interpret it.
Obviously, the true spirit of the agile manifesto is the non-prescriptive, ambiguous wording.
Unfortunately, 'scrum' has come to be known as 'agile', which I think is an unfortunate consequence of all the cargo-culting going on. Rituals are fine, as long as they have some depth and identity to them. The Scrum rituals are shallow, simplistic and feeble. Without identity, it is no wonder many participants want them to be over as soon as possible...
I'll agree that the manifesto is vague enough that you could reinterpret it your way. However, its vagueness is hardly a point in its favor, since all that means is that people project whatever meaning they really want onto it. Meaning that it has very little meaning.
But it still strongly implies that tools basically don't matter, and I've worked on teams where people did choose to interpret it that way, which ironically led to waterfalling...
Yes, I agree with that standpoint. The agile manifesto is too vague to be used prescriptively. It requires a certain amount of intelligence and wisdom to acknowledge that. It's also not able to bootstrap itself: you need to be grounded in the philosophical underpinnings in order to understand it.
I really could go on to talk about those underpinnings, saying that they are mystic and grounded in certain Western cultural ideals (which are deteriorating at a rapid pace at the moment). But HN is probably not the right place to discuss it, and it is 1 AM here.
> you shouldn't explicitly deprioritize tools and processes
Is it fairly safe to say that the goal of most [software] companies isn't to just make good software or to develop [good] software fast but to find a repeatable process to make good software quickly? And this is why tools and processes are still needed.
Waterfall hasn't really seen widespread use in years (since the '70s or '80s?), well before Agile arrived in the aughts. It's more of a red herring, something to make agile look good by comparison. It's like saying memory foam is way better than sleeping on the floor (completely ignoring traditional mattresses).
We always used a pseudo-waterfall but with much shorter iterations. We called it cinnabun. For example, we would cut 4 CDs a year, so our iterations were about 3 months. We would plan what we wanted to do in the 3 months (bugs + features), code it, developer-test it and throw it to QA. Once we had a good build, we would distribute it, have a small celebration, then start over again.
It's similar to agile, but with the 3 month cycles, you could actually plan and design a lot better because you could see out a little further.
I suspect Agile is so popular because the business doesn't want to think through, make, and stick to a decision for 3 months.
You just said that waterfall hasn't been widespread since the 80s and then basically admitted to using it :)
These days it still happens, it's just that few people call it that unless they want to be rude.
IMHO waterfall is a natural state to revert to if you have a test deficit, technical debt and you don't trust releases that haven't gone through a round of manual testing first. The level of faith you have in your code/tests basically dictates the length of the mini-waterfall iterations.
Continuous delivery (the exact opposite of waterfall) comes equally naturally if you have no test deficit. It doesn't require some kind of paradigm shift or a special 'agile mindset', it just requires an automated regression test suite you can trust.
(this usually only happens when you write the tests first, but it's not strictly 100% necessary)
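To put some code where my mouth is: the whole 'pipeline' reduces to a gate like the one below. A minimal Python sketch, assuming pytest runs the regression suite and a hypothetical deploy.sh does the actual release:

    # cd_gate.py - a minimal continuous-delivery gate (sketch).
    import subprocess
    import sys

    def main() -> int:
        # Run the full automated regression suite.
        tests = subprocess.run(["pytest", "--quiet"])
        if tests.returncode != 0:
            print("regression suite failed; change is not releasable")
            return tests.returncode
        # Green suite: the change is releasable, so release it.
        return subprocess.run(["./deploy.sh"]).returncode

    if __name__ == "__main__":
        sys.exit(main())

The length of that script is the point: once the suite is trustworthy, there's nothing left for a release ritual to do.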
Well, the definition of waterfall I grew up with was that you plan the entire project end-to-end with MS Project or something similar, including requirements gathering, delivery milestones, timelines and resource allocation. This was mainly done when building massive, in-house mainframe or AS400 apps that were expected to keep running for decades.
Waterfall, as described, carried on through the PC revolution but was falling out of favor to iterative version scoping and planning. Iterations could be whatever you wanted. There were backlogs, usually a list of tickets in some software system like StarTeam or whatever. I think Visual SourceSafe had this as well. It's fairly simple, so our UI designer just built one in a week or so.
Agile and its short sprints (2-3 weeks) fit nicely with SaaS (websites) because the distribution cost (comparatively zero) doesn't lend itself to versioning (or isn't encumbered by it). I just find that the short iteration isn't ideal: the short cycle and the lack of long-vision planning don't allow designers to develop in a "big-picture" way.
>it just requires an automated regression test suite you can trust.
Testing doesn't really fit anywhere in the equation as a differentiator. There was automated unit testing before Agile; there just weren't any common frameworks, and it usually involved custom test harnesses (usually a console/terminal app to run and log stuff).
It also requires automated deployment which can be tricky, especially when there is a single / clustered relational database involved.
In conclusion, I like the Agile methodology, I just think the short 2-3 week iterations don't fit every organization and if you can handle 1-2 month sprints, you'll have better designed software because you can see further into the future.
>I just find that the short iteration isn't ideal: the short cycle and the lack of long-vision planning don't allow designers to develop in a "big-picture" way.
I don't think that kind of planning is particularly valuable. I've worked in many really unsuccessful companies that had a long term plan (which never really came to fruition) and some really successful companies which may have had a blurry long term vision about their general direction but mainly just reacted to customer input and market conditions as they saw fit.
The Linux kernel is developed in exactly this way too and it's hardly a model of an unsuccessful software project. It's not just me.
>Testing doesn't really fit anywhere in the equation as a differentiator. There was automated unit testing before Agile
Automated testing was barely used back then, though. XP, the progenitor of agile, did emphasize it.
It 'fits' because once you do enough of it (and you have automated releases), it just sort of stops making sense to have release schedules. Every change to the code base that gets through code review and the test suite is releasable, so why not release it?
>It also requires automated deployment which can be tricky
I think 2009 was the last time I worked on a system that didn't have automated deployment.
>I just think the short 2-3 week iterations don't fit every organization and if you can handle 1-2 month sprints, you'll have better designed software because you can see further into the future.
I've always aimed for iterations that are as short and tight as possible because the future - meaning how your software is really going to be used - is by far and away the least predictable thing you'll have to deal with. The most you can do is react to that quickly before veering too far down a dead end.
Hell, even when I develop software for myself I end up being surprised about how I actually end up using it.
There’s a difference between methods that are shown with evidence to help developers and methods that are merely marketed as such. The writeups on methodologies should distinguish between these. It’s also possible these people have thought about methodologies a long time without ever discovering the ones that work. Fagan Software Inspections, Mills’ Cleanroom Software Engineering, Meyer’s Eiffel Method, and Praxis’ Correct by Construction all worked if we’re talking about developers delivering products in acceptable timescale with low defects. They all let developers do their job in an iterative way providing extra tools or restrictions that help ensure quality and/or maintainability.
On the concurrency side, there were also SCOOP for Eiffel and Ravenscar for Ada, which eliminated race conditions by design. Some methodologies in the high-assurance sector used tools like the SPIN model checker for the same purpose. People spent a long time talking about those bugs while some design methods just removed them entirely. A lot less debugging and damage might have happened in industry if the aforementioned methods had received serious industry investment and improvement.
In the age of the Internet, I believe there is a way to conduct experiments that would yield an answer based on data. Spitballing here, but how about a kind of contest for 'points' in which a statistically significant number of devs volunteer to participate? They each provide a 'resume' (GitHub, LinkedIn, CV, ...) to the system. The programming task is presented, the devs self-assemble into teams, and they endeavor to complete the task.
As an example, let's say 100 devs jump in. The task is to create a simple Android app, with a requirements statement provided, with server back end, launch it into the app store, support it for some period with bug fixes and improvements, and then declare it '1.0 released' to wrap up the experiment.
What you'd wind up with is a variety of team sizes, a variety of team experience, a variety of development systems used, a variety of outcomes. But all building the same software.
The key would be that as many attributes of each team's efforts as possible would need to be recorded and entered as data to be studied in search of patterns.
Repeat this n times and I believe valuable insights could be gained.
Rather than trying to control for all the variables of team size, experience, method, you control for the end product being targeted and then look for insights into the variety of approaches that teams took.
The problem with the observational approach is generally that it's really hard to decorrelate the results. In your proposed experiment you control only for the project itself. What if good developers generally prefer modern programming languages, so they choose Roslin? Does that imply that Roslin is a better language for programmers who are not as good? Does it even show that they wouldn't have been even more productive using Java?
From the other direction, even if you get the value of controlling for the project itself, that might also add some bias. Could be that for a project with that setup waterfall actually works pretty well, but is it representative of projects overall? Are most software projects comparable to developing a simple Android app with a well defined specification up front?
I do agree that it would be good to do this kind of experiment, where multiple teams get tasked with building similar systems, to figure out what works. But I don't think it makes sense to actively avoid controlling for variables. That would make the results very hard to interpret and much less usable.
Really being agile basically amounts to "use best judgment at all times". It is highly unstructured because it provides absolute freedom. That means that the only people who can actually be agile (have the experience and discipline necessary), are the ones who don't benefit from being told to be so. Any time a team is told to "be Agile" to fix their problems, that team lacks the maturity to actually "be Agile".
Not really, although in the book "The Checklist Manifesto", Atul Gawande reports that having something akin to a "stand up meeting" before a surgery has been shown to reduce the likelihood of errors occurring during that surgery. And that brings us around to "agile" methodologies. I feel like most people agree that the underlying value system outlined in the agile manifesto made sense, and that many of the practices outlined can be useful, but at some point (scrum, maybe?) a particular set of practices got bundled up and sold as The One True Way of Doing Things, and then sanity went out the window. For example, test-driven development can be a useful tool for thinking about how to approach a problem, but of course it isn't the only way to write a program (see Ron Jeffries' hilarious attempt to TDD a sudoku solver as a counterexample). But sadly, many teams aren't in a position to think critically about what they do or don't do - instead, they have to show that they're "doing Agile".
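For what it's worth, the useful 'thinking tool' core of TDD is tiny: write a failing test that describes the behaviour you want, then write just enough code to make it pass. A minimal pytest-style sketch (the slugify function is made up for illustration):

    import re

    # Step 1: the test, written before the implementation exists,
    # pinning down the behaviour we actually want.
    def test_slugify():
        assert slugify("Hello World") == "hello-world"
        assert slugify("Agile, again?!") == "agile-again"

    # Step 2: just enough implementation to make the test pass.
    def slugify(text: str) -> str:
        words = re.findall(r"[a-z0-9]+", text.lower())
        return "-".join(words)

Used like that it's a design aid; sold as The One True Way it's a religion.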
I am aware of Kanban's origin and also of The Checklist Manifesto. My question was meant to stimulate pondering/debating about what a "methodology" is supposed to be and what, exactly, it is trying to accomplish.
A lot of the comments seem to say that "above average developers will always find the most productive way to self-organize."
Leaving aside the problem of always finding "above average people" - at least for now - I think that this has a fatal flaw: what happens when someone leaves and/or someone else joins the group?
I suppose that hospitals and the army have this happen fairly often; they must have "methodologies" catering to a wide spectrum of talents that also accomplish satisfactory results even when dealing with thorny, unexpected problems.
What do they use? Is this a “methodology”?
(One important thing that I am not sure is adequately represented in IT methodologies is having an established vocabulary to describe situations: we have "Patterns", but these are low-level and divorced from the actual business-specific scenario. This is just one example, but I think it helps point out that IT methodologies are trying to standardize the wrong elements.)
> But in terms of the only measurement that really matters—satisfying requirements on time and within budget—I haven’t seen any methodology deliver consistent results.
Good, Fast, Cheap. Pick two.
That's what you are doing when you are pitting requirements, a time frame, and a budget against each other.
The first problem is that this is a company-wide process, not just software development. The only thing development tells you is how long it will take given the budget. Development doesn't define the requirements or the budget.
The third statement in the Agile Manifesto, which tends to get overlooked:
"Customer collaboration over contract negotiation"
That is entirely a business process and ultimately determines both the requirements and the budget. It is something that is sorely lacking at most companies, regardless of how hard their engineering department tries to follow agile. It doesn't work without the full company buy-in.
> Try this thought experiment: Imagine two teams of programmers, working with identical requirements, schedules, and budgets, in the same environment, with the same language and development tools. One team uses waterfall/BDUF, the other uses agile techniques. It’s obvious this isn’t a good experiment: The individual skills and personalities of the team members, and how they communicate with each other, will have a much bigger effect than the methodology.
Another thought experiment: imagine getting two teams of programmers using the same methodologies and everything else and expecting the results to be the same. It's just not practical to perform studies like this because there are too many variables.
That's because you are using an extremely small sample; of course the individual differences are going to matter. Now, take a random sample of 1000 teams using waterfall and 1000 teams using agile, and you'll get much more meaningful data. The larger the sample, the more the individual differences get smoothed out.
Of course, not many people have the resources to do that kind of experiment.
> Of course, not many people have the resources to do that kind of experiment.
That's what I meant about it not being practical. Who would invest that amount of money? There are infinite variations of methodologies as well.
I'm not sure what the solution is but it's tiring seeing bad studies used to promote certain approaches.
>That's what I meant about it not being practical. Who would invest that amount of money?
Tons of organizations and governments could.
20 10-person teams * 5 methodologies = 1000 people; for a 1-month project, that's 1000 * $100,000 = $100M.
In the grand scheme of things that's an insignificant amount -- in a world where a business spends a billion dollars buying Instagram. A military could do that kind of spending buying a single airplane -- and such research could potentially have a huge impact / savings on future soft-eng projects.
And it could even be subsidized or made tax-deductible. Or the compensation could easily drop to something like $70K or $50K per month.
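Spelling out that back-of-the-envelope arithmetic (every figure is just the assumption above, nothing more):

    # Back-of-the-envelope cost of the proposed experiment.
    teams_per_methodology = 20
    people_per_team = 10
    methodologies = 5
    cost_per_person_month = 100_000  # USD, assumed fully-loaded rate

    people = teams_per_methodology * people_per_team * methodologies
    total_cost = people * cost_per_person_month  # one-month project
    print(people, total_cost)  # 1000 people, 100000000 ($100M)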
It would still result in basically anecdotes. Maybe team A picked a library that made life difficult for themselves or misinterpreted one of the requirements? Small things completely throw the results off.
Funny you mention that. The only "methodology" I feel I have enough anecdata to comfortably vouch for is YAGNI. In my experience, whichever team fucks themselves with bad library choices is going to lose this experiment 100% of the time, agile be damned.
Software methodologies don't work because we're not getting the fundamentals of software development right. Reorganizing your kitchen layout won't help your restaurant if your chefs are still struggling to make scrambled eggs.
The most important thing any software team needs is proper logging, monitoring and metrics. No matter how great your process and engineering culture, you'll need logging, monitoring and metrics; things will happen. The worst part is that this is relatively cheap and simple to do (at the scale most of us operate at), with huge rewards, and most teams still do it wrong: logging too much noise, collecting metrics that show what's right rather than what's wrong, swallowing exceptions, etc. This is the litmus test.
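To make the litmus test concrete, here's a minimal Python sketch of the difference between swallowing an exception and logging it with context. process_payment, PaymentError and the order shape are all made-up stand-ins:

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("orders")

    class PaymentError(Exception):
        pass

    def process_payment(order):
        raise PaymentError("card declined")  # stand-in failure

    def charge_badly(order):
        try:
            process_payment(order)
        except Exception:
            pass  # swallowed: the failure is invisible to everyone

    def charge(order):
        try:
            process_payment(order)
        except PaymentError:
            # One structured line with the context needed to debug it,
            # instead of silence or a wall of noise.
            logger.exception("payment failed for order_id=%s", order["id"])
            raise

Teams that write the second version by reflex pass the test; teams that write the first fail it, whatever methodology they follow.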
Next are automated tests. Unit tests, integration tests and fuzz testing. The downside with this is that it takes a long time to master. Yes, it costs time at first, but that's why you have senior developers who should be able to use tests to save time and teach others from their mistakes (like too much mocking).
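Fuzz testing in particular is cheaper than its reputation suggests. Assuming a property-based library like Hypothesis fits your stack, a first fuzz test is a few lines:

    from hypothesis import given, strategies as st

    # Property-based fuzzing: Hypothesis generates hundreds of inputs,
    # including the weird Unicode edge cases nobody writes by hand.
    @given(st.text())
    def test_utf8_round_trip(s):
        assert s.encode("utf-8").decode("utf-8") == s

The round-trip property here is just an example; the pattern is "state an invariant, let the tool hunt for counterexamples".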
Finally, code reviews and pair programming. Almost every line of code is an opportunity to teach and to learn. No methodology or tool can help if you hire junior programmers and don't do pair programming (or some other really involved mentoring, but I don't know of any).
Technical debt is real. Most of the time, when people don't have time to do things right, it's because they didn't take the time (or didn't know how) to do things right in the first place.
I don't particularly think the methodology merchants are in the business of improving software development. Their bottom line is affected by selling training, consultancy and certificates, so the single metric of a methodology "working" is whether, and how fast, it can grow its user base, keep them happy and keep them coming back for more. That might loosely correlate with improving software quality, but that is entirely secondary.
They don't work because it seems most places treat software development as a function separate from whatever the company does (with the exception of software companies which rarely follow one "methodology" but just do what works for them). Software is a way to tell a computer how to produce an outcome. The more important thing is how that outcome fits into the business, not the method of reaching it.
I like the article, and mostly agree, though I've come to look at it a little differently.
Agile/waterfall/etc. are PM methodologies used to manage software development projects. Agile in particular has limited use beyond software development.
Methods of developing software are things like: domain-driven design, data models first, TDD, etc.
Then there are programming paradigms: OOP, FP, Actor based, etc.
So the "techniques" the article puts into a single list are actually of different types. All of them have to be evaluated against the people who have to use them (don't force OOP on an FP team, or vice versa) and the type of problem to be solved (TDD is less useful for a simple UI project than for a complex algorithm involving time and lots of corner cases).
"Whether a methodology works or not depends on the criteria: team productivity, happiness, retention, conformity, predictability, accountability, communication, lines per day, man-months, code quality, artifacts produced, etc. Every methodology works if you measure the right thing. But in terms of the only measurement that really matters—satisfying requirements on time and within budget—I haven’t seen any methodology deliver consistent results."
You start with the premise that the requirements, time constraints and budget were all magically set correctly before the project began.
That's actually a good point. In my company most projects start with a fixed budget, a hard deadline and unclear always changing requirements. Everything is already set before the dev team gets involved.
That's a failure of the project manager. Before any deadline is set, the dev team leads are supposed to meet with all of the leads on the project, who in turn meet and come to a consensus with their teams. The results of the negotiations are supposed to start at the top with the idea and goals, go all the way down to the people who will be executing them, and then go all the way back up to the top, over and over, until a consensus is reached. Things may change, but in proper/formal project management a change management process is in place where the leads must come to a consensus before any change is approved.

Good management enforces this process, and good leadership at any level makes everyone above and below them aware of the implications of what everyone wants. Everyone has to be on board and working towards a common goal, and come to agreement on how best to reach that goal. The biggest problem I see is at the organizational level, where the project manager is not the one with authority to control the process. When that authority is missing, the people above and below them are not protected from the implications, and things go bad for everyone.

This is taught in PMP courses; what isn't taught is what happens when the organization doesn't give the project manager authority but instead turns them into a project coordinator or project expediter. A formally trained, knowledgeable and experienced project manager with the authority to control the process can and will make sure everyone is on board, but they can't do that when their authority is undermined by the organization, client, or teams. Likewise for the team leads. This is the definition of cooperation.
Since when did a customer actually have a solid understanding upfront that was 100% correct? That's just la-la land, and one of the reasons for the Agile approach. Iterating after discovery works.
I find if you have excellent programmers who enjoy the problem they are working on and are put under the right amount of pressure you get great software no matter what process is followed.
The thing that I feel these methodologies solve is reminding developers of the users.
Ultimately, a lot of people tend to lose track of higher-level objectives when working on the (admittedly complex at times) implementation details. This is probably the biggest productivity killer in the business.
How many of us have had that first demo with people external to the dev team, only for all the feedback to be super obvious things that could have been caught before a single line of code was written?
Because software has uncertainty (about requirements, environment, tolerances, complexity) and creative elements thrown in.
It's not a regular pipeline kind of workflow, like some Taylor-inspired assembly line or regular old civil engineering.
Besides, all those methodologies are unscientific BS invented by consultants, not something derived from actual studies (even when comparative studies are involved, they are laughable in scope by scientific standards).
Exactly. Software development is fundamentally unpredictable because you're always making something that's new, at least to the team doing the making.
After all, if you were repeating yourself, you'd just re-use the methods and classes and packages you'd already written; worst-case, you could copy+paste the code and tweak it.
And since you're doing something novel, of course you're not going to be able to predict how long it will take, beyond extremely broad guesses.
Thanks for the comment. I'm the author of the original article. I was surprised to find this old piece on the HN front page today.
With all respect, I don't misunderstand TDD or OOP. I agree that those aren't methodologies in the strict sense. But rigid adherence to OOP design or TDD can paralyze a team and focus development on goals that aren't the customer's priorities. OOP and TDD can influence how the team works and what shape the project takes just as much as waterfall or agile. When I read articles claiming that TDD is the sure path to reliable development, that's a methodological claim, not a technical practice.
I think we're really in "violent agreement" here. Rigid adherence to these "things" (not necessarily methodologies) never works. The benefits of what these "things" offer, to the degree that they are understood by the practitioners, are real. But they are often (perhaps categorically) misunderstood, applied haplessly and held to claims they never made in the first place.
Software is no different from any other engineering discipline. We absolutely do have software development methodologies that work; we have just decided to exchange quality for speed.
Better, faster, cheaper, you get to choose 2 and only 2, we've selected faster and cheaper.
It sounds like the author is not too fond of code reviews, test approaches, and linting rules. These things are based on personal preference and semi-religious beliefs, often leading to conflicts even on a two-person team.
> satisfying requirements on time and within budget—I haven’t seen any methodology deliver consistent results
There seems to be a false premise here. Methodologies don't deliver consistent results on time and within budget because they attempt to help figure out those software requirements, so that an actual problem can be solved rather than useless requirements satisfied.
> Maybe social skills come harder to programmers than to other people (I’m not convinced that’s true)
This is, in fact, the answer.
You haven't truly SEEN office politics until you've worked on a team of developers. I'm shocked a reality show hasn't come out yet about software development. It would make Survivor look like Family Matters.
I don't think you have ever seen real politics in a company. Work in marketing or advertising and you will see some real politics. Developers are way too blunt to be slick politicians.
I think this is bullshit. I tend to get on pretty damn well with most of my co-developers. The friction comes with managers trying to fit us into weirdly-shaped boxes.
This runs counter to a lot of my experience. I don't tend to see much office politics at all among software developers. Maybe because salary isn't tied as closely to rank, and it's so easy to shop around and jump ship if you feel like you're getting fucked.
From what I've seen, with us, fear is the motivator for the mistreatment of others, not greed.
It's the same story at every company: developers are under constant pressure from management, we aren't given the tools or help we need at the right time, we get no credit, but all the blame when things go wrong. When people feel threatened, the open and creative part of their brain shuts down and it becomes fight or flight and every man for himself. Devs become extremely rigid and dogmatic, as every bad work experience leaves a scar on us, and they resolve to never EVER make the same mistake again.
"Hell is other people" is like the personal motto of some programmers. You don't agree with me, fine, I'll just refactor your code when you're sleeping. People read random blog posts and then take it as holy writ, undisputable proof, and if you disagree then let me write a long email lecture to educate you about why I'm right and you're wrong. I've seen people get into physical FIGHTS at scrum, resulting in broken bones. I wish I was making this up.