Poll: What proportion of start-ups/non-start-ups use automated unit tests/TDD?
22 points by swombat on Feb 4, 2009 | hide | past | favorite | 40 comments
We're trying to resolve this question... Do you use automated unit testing and/or TDD on your start-up? Or on your non-start-up? Feel free to mention why or why not below.
I use TDD or automated unit testing on a start-up
69 points
I use TDD or automated unit testing in a non-start-up environment
63 points
I don't use any automated unit testing in a start-up
54 points
I don't use any automated unit testing in a non-start-up environment
40 points



I've admitted on a few occasions that I've never written a unit test for anything. This is something I keep meaning to change, if only I could find the time. For the time being, I can still keep the code dependencies and side-effects in my head so it has yet to cause any serious issues.

I still slap my own hand, though.


The problem is that many testing advocates are 100% testing diehards. They believe that anything under 100% test coverage in several disciplines (unit, functional, integration, etc) is just as bad as no testing.

There is, of course, a middle way. Implement as much testing as makes sense for you. Think Pareto: the right 20% of testing could give you 80% of the gains.

So implement integration tests that exercise your apps very broadly; as soon as one fails, you know you can start looking deeper. No need to always start with line-by-line unit tests :)
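A minimal sketch of that idea in Python's unittest (the three-layer "app" and its names are invented for illustration): one coarse test exercises the whole pipeline, and only when it fails do you drill into the individual layers.

```python
import unittest

# Hypothetical app with three layers: parse -> total -> report.
def parse(raw):
    return [int(x) for x in raw.split(",")]

def total(values):
    return sum(values)

def report(raw):
    return "total=%d" % total(parse(raw))

class BroadIntegrationTest(unittest.TestCase):
    # One coarse test covering all three layers end to end.
    # If it breaks, then start looking deeper, layer by layer.
    def test_whole_pipeline(self):
        self.assertEqual(report("1,2,3"), "total=6")
```

Run it with `python -m unittest` as usual; no line-by-line unit tests required to get that first safety net.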


You're right. And the problem with that is that the only opportunity for 100% coverage comes when you can start a project that way with the buy-in of everyone on it. That is only likely to happen when everyone is familiar with TDD and comfortable with it, which creates a substantial hurdle for a group adopting it for the first time.

I prefer the approach we used here (at a non-startup). We simply decided that we were going to use automated tests as much as possible on a new project. Everyone was encouraged to use them. Our coverage wasn't 100%. However, when you are committed to creating an automated test case to reproduce any bug regardless of how the bug was originally found, your coverage expands. Usually you can write variants of the same test to cover related areas.

I can't speak for every member of the group, but my total testing time went down slightly and my test coverage improved enormously. Full TDD is not necessary in every instance. Just because you can't immediately make the jump to full TDD doesn't mean that automated unit test tools won't make your code better, and your testing easier.
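The "test case to reproduce any bug, plus variants" practice described above might look like this in Python (the `slugify` function and the bug are hypothetical, purely for illustration):

```python
# Suppose a bug report said slugify() misbehaved on empty input. First,
# pin the bug down with a test that reproduces it; then fix the code;
# then write variants of the same test to cover related areas.
def slugify(title):
    words = title.lower().split()
    return "-".join(words) if words else "untitled"  # fix for the empty case

def test_bug_report_empty_title():
    assert slugify("") == "untitled"           # reproduces the reported bug

def test_variants_of_same_area():
    assert slugify("   ") == "untitled"        # whitespace-only: a close cousin
    assert slugify("Hello World") == "hello-world"

test_bug_report_empty_title()
test_variants_of_same_area()
```

Each bug fixed this way leaves behind a small cluster of tests, which is how coverage expands without a 100%-coverage mandate.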


I agree; it seems to me that, like many things in life, the answer lies between the extremes.


You and I both suffer from SDS: single developer syndrome.


This is a common fallacy regarding test driven development. I use BDD/TDD MORE as a single developer. I may spend significant time writing tests, but the product is significantly better and I spend less time tracking down bugs.

The time you "lose" on the frontend of development is made up on the back.


I know that in my mind. That's why I call it a syndrome - it is irrational. Without others around, you make worse decisions, because they seem to work immediately.


Don't worry, working with TDD is kind of like wearing a seat belt.

You could do it even if you've never had an accident; many people do.

But most of the time you start doing it AFTER you go off the road once.


> if I could only find the time

The asinine thing about this is that unit testing saves a massive amount of time in the long run. In the short run, even, it saves time.

You're really shooting yourself in the foot.


Are there any studies you can point to, or is it just personal anecdotal evidence?


http://www.rttsweb.com/research/studies/study_functionalTest...

"...automated test suite saved the client 117 days of manual effort per testing cycle."

Sure, it takes more time to construct any given feature with unit tests versus a completely untested solution. However, any time you need to change the code, add a feature, or change something it interacts with, you are saving an incredible amount of time both in testing and the prevention of bugs discovered in production.

I really don't understand how, in the long run, automated testing can't save you time. If you don't think so, you obviously haven't maintained any software.


Too busy building stuff ;)


We do use automated tests, but not test-first. The tests are written once the design and the initial implementation have settled down a bit.

I think production code absolutely needs good test coverage, otherwise changing things is a nail biting affair.


At Plurk.com we use automated testing, but we don't develop by the TDD principle or do unit testing - it's really hard to do with lots of code and lots of diversity (e.g. MySQL, JavaScript, Python, different layers of caching, Mako templates, Erlang etc.). We test the core functionality and cover the things that 80% of users would use. This has worked OK so far without too many bugs - even though we deploy new versions each day.


I recently finished contracting with a startup that relied heavily on outsourced programmers for 70% of their code. Use of TDD and CruiseControl was essential in keeping the code quality up while rapidly adding developers. I have to say, they managed it all pretty well.


I upvoted the first option, then realized I needed to upvote all of them. I have conformed, and will continue to conform, to each as the situation calls for and allows.


Which situations do you think call for "no automated tests"?


The quick and dirty jobs that we all do every now and then.


Yeah, many times when you are 'spiking' an idea you can work through a simple small example fastest without testing. I agree that when you are quickly trying to see if something is possible, testing will slow you down and isn't worth the benefits of TDD.

Then when I start to integrate that solution into real production code I go back and add some tests. (mostly integration tests)


These times are when you need automated testing the most.


I don't really need automated testing tools when I write a small program that I just use myself for some task.

If it works, it works; if it doesn't, I do it again.


I use (automated) unit tests on anything that is going to be re-used (library code), but all the 'one off' stuff (mostly web related) is unit-test free.

I don't think that's optimal, but that's the way it is, and due to time constraints a major refactoring of all the code I have running is not on my to-do list anytime soon.


My first ever useful website (http://www.newmetalarmy.com) had only a few unit tests and it's a real fucking pain to change one year after launch without breaking anything.

Now on my next site (http://www.beepmystuff.com) I've got many unit tests and I'm much more confident when I make large changes.

The two sites share a lot of code, so the first is benefiting from the second getting a bit of TLC.


We are very test focused, but I wouldn't claim we are TDD. We also have layers of testing which I consider important.

We have a quick set of unit tests that we run before every check-in (10 seconds).

Next, our CI kicks off after check-in and runs 10 minutes' worth of full system tests: booting up open-source software and running it against the system, destroying DBs and recreating them from scratch and running again, etc.

Finally, we have nightly tests which run for a few hours; at this point we have 15 projects run against our system. We run projects over and over to test stability, and we do security testing against the current code. We have multiple simulated users running against the system checking for performance bottlenecks. This phase doesn't catch new issues that often, but it helps ensure we don't regress in performance, security, or overall compatibility support.
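The destroy-and-recreate discipline in the middle layer can be sketched in a few lines of Python with sqlite3 (the schema and checks are invented for illustration): every run starts from a known-clean database, so results are reproducible across repeated cycles.

```python
import sqlite3

# Hypothetical system-test helper in the spirit described above:
# drop the database state, recreate the schema from scratch, then
# run the same checks against a guaranteed-clean slate.
def rebuild_db(conn):
    conn.execute("DROP TABLE IF EXISTS users")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.commit()

def system_check(conn):
    conn.execute("INSERT INTO users (name) VALUES ('alice')")
    (count,) = conn.execute("SELECT COUNT(*) FROM users").fetchone()
    return count

conn = sqlite3.connect(":memory:")
for _ in range(3):                    # run repeatedly, as a nightly cycle does
    rebuild_db(conn)
    assert system_check(conn) == 1    # clean slate every time, never 2 or 3
```

Because the rebuild runs before every cycle, a test that passes once passes every time; leftover state can never mask or cause a failure.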


We have tests for both Rails and front-end javascript. Not 100% coverage, but all of the critical code is covered. TDD is used to develop the more complicated domain specific features that we need to make sure are correct.

We also create tests to cover bugs to prevent regressions.


I spent a lot of time at the beginning of my project (bigtweet.com) writing some unit tests in the Perl Catalyst environment. I think that it was very worthwhile at the time, while I was changing structure, implementing exception handling, etc. Now that the project has stabilized and I'm mostly working on incremental functionality, I've largely skipped writing new tests (due mostly to time constraints).

I still run "prove" before any new release though.

I've also experimented with Perl code coverage early on with good success. Again, time constraints prevent me from pursuing this further for now.

If you have the time, I would highly encourage writing some unit tests for the core functionality.


Write tests first: http://c2.com/cgi/wiki?CodeUnitTestFirst

It's more fun that way. Instead of your code breaking when you add the test, it's broken until you add the functionality.

My code has many many tests. My co-workers would laugh at me, telling me I spent all my time writing tests. But if my code even RUNS, I'm confident that it is correct. I would get the last laugh, because my co-workers would always be doing frantic debugging right before a deadline because they just uncovered a silent bug.
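The test-first rhythm described above, reduced to its smallest form in Python (`word_count` is a hypothetical example, not from the thread): the test is written first, so the module is broken until the functionality arrives.

```python
# Step 1: write the test before the code exists. At this point the
# module is "broken until you add the functionality".
def test_word_count():
    assert word_count("to be or not to be") == {"to": 2, "be": 2, "or": 1, "not": 1}

# Step 2: write just enough code to make the test pass.
def word_count(text):
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

test_word_count()   # green: the functionality now satisfies the test
```

The point of the ordering is psychological as much as technical: a failing test is a to-do item, whereas a test added afterward only confirms what you already believe.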


Remember: almost every major piece of software you have ever used was not developed using TDD (think: does the Linux kernel have unit tests? And that's far more important than anything you're ever likely to develop!). That's nothing to be proud of, but if you are even aware of TDD and are considering where you can apply parts of it, you're ahead of 90% of the industry.


For Sponty (http://thesponty.com/mit) we write unit and functional (http level) tests for every server side functionality. We use Python, so writing tests is actually fun. This has saved us from many regression bugs, and made our release cycle very short. We can write a feature today, and release it tomorrow.
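An http-level functional test of the kind mentioned can be written with nothing but the Python standard library (the `/ping` endpoint and handler are invented for illustration, not Sponty's actual code): start a real server, make a real request, assert on the real response.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical endpoint under test.
class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"pong"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep test output quiet
        pass

def functional_test():
    # Bind port 0 so the OS picks a free port; run the server in a thread.
    server = HTTPServer(("127.0.0.1", 0), PingHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    try:
        url = "http://127.0.0.1:%d/ping" % server.server_port
        with urllib.request.urlopen(url) as resp:
            return resp.status, resp.read()
    finally:
        server.shutdown()

assert functional_test() == (200, b"pong")
```

Because the test goes through a real socket, it catches serialization, routing, and status-code regressions that a unit test of the handler alone would miss.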


Personally, I write a lot of tests. It dominates my development process. I'm always thinking about structuring the code I'm writing in such a way that it can be tested.

I didn't always. I think a project has to grow past a certain size, to the point where you can't immediately remember what all the parts do, before the need for tests becomes painfully apparent.


I do use automated testing, but trying to get more on TDD. I do slap myself in the face when something's not unit tested though.


I used to be pretty slack about it, but then started working in a non-startup environment where it's the only possible way to protect your code from random external breakages. I wouldn't code without unit testing even on my own stuff now, and ideally I'd have continuous integration if I'm working with even one other person.


We don't use TDD because we don't understand how to do it, even though unit testing in principle seems like a pretty good idea. On the other hand, I can't imagine unit testing preventing most of the non-trivial bugs that turn up in our project.


For a non-startup, I TDD where it makes sense, for example when I'm implementing from scratch and requirements are pretty vague. However, I try to cover existing code with automated tests where risk of change is very high (bad code).



By the way, do you know a good unit test framework for C programs?



Thank you!


A point on terminology: unit testing is an upstream, in-development process, usually done by development or CM. For example, a Java build tool like Maven can run unit tests during the build process. Automated testing, by contrast, is usually a downstream process done by QA. Currently I do SilkTest automation, using the 4Test language of Borland SilkTest; that falls under the "umbrella" of QA. In the past I have done unit testing (mostly CppUnit, the C++ port of JUnit), which fell under the "umbrella" of development.

In my experience, people adopt automation (i.e. GUI automation) from the QA end before they adopt unit testing from the development end, unless a test-driven development mentality is there from the get-go. A company that doesn't start with that mentality is more likely to adopt automation on the QA side first and only later, maybe, adopt unit testing in development. It seems like a company needs to do unit testing from the very start; otherwise it is very slow to adopt it. I think unit testing is a great idea, but it is hard to put into place from a management point of view unless it is there to begin with.


In addition to the cultural resistance, retrofitting unit tests into an existing codebase is usually a big technical challenge; certainly much harder than doing TDD to begin with.



