Hacker News

Tests that are slow to run don't get run often. People don't write tests first. They write fewer tests. They tend to test the happy paths.

I'm not saying we should avoid integration testing. It's necessary.

It doesn't mix well with PBT for a main test suite, in my experience.

You're better off making sure your domain types and business logic are encapsulated in your types and functions and don't require a database, a network, etc. PBT is great at specifying invariants and relations between functions and sets.
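A minimal sketch of what such an invariant looks like as a property-based check, hand-rolled here with the stdlib so it stays dependency-free (a real suite would use a framework like Hypothesis; the run-length codec is a hypothetical stand-in for pure domain logic):

```python
import random

def run_length_encode(s: str) -> list[tuple[str, int]]:
    """Pure domain function: 'aab' -> [('a', 2), ('b', 1)]."""
    pairs: list[tuple[str, int]] = []
    for ch in s:
        if pairs and pairs[-1][0] == ch:
            pairs[-1] = (ch, pairs[-1][1] + 1)
        else:
            pairs.append((ch, 1))
    return pairs

def run_length_decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * n for ch, n in pairs)

def check_roundtrip(trials: int = 1000) -> None:
    """Property: decode(encode(s)) == s for randomly generated inputs.
    No database, no network -- just the invariant over the pure functions."""
    rng = random.Random(0)  # seeded for reproducibility
    for _ in range(trials):
        s = "".join(rng.choice("ab") for _ in range(rng.randint(0, 50)))
        assert run_length_decode(run_length_encode(s)) == s

check_roundtrip()
```

Because the functions under test touch no external systems, a thousand trials run in milliseconds, which is what makes PBT practical as a main suite.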

Use your integration testing budget for testing the integration boundaries with the database and network.

Update: there's no cure for writing bad tests. You just have to write better ones.




Tests get run as often as they get run. There's no rule that says you have to run them less frequently if they're slower. It's not 1994; CPU time is cheap.

Realism is much more important than speed. A test that catches 20% more bugs and takes 5 seconds instead of 0.05s is just more valuable. It's simply not worth economizing on CPU cycles in 2024.

Yes, dependency injection (DI) can make sense sometimes, but definitely not all of the time.

The people who have ultra fast unrealistic tests inevitably end up leaning on manual testing as they experience bugs which their unit tests could never have caught.

>there's no cure for writing bad tests. you just have to write better ones.

Better means more realistic, hermetic, and less flaky. It doesn't necessarily mean ultra fast.



