But every place I've worked in the past 10 years has asked me to unit test everything, and unit testing specifically gets called out all the time.
Other types of tests are rarely even discussed, because many developers don't do much testing beyond unit tests. Sometimes another organization picks up the slack, like a dedicated test org, but often there isn't one.
Perhaps it's just my experience, but I've found unit tests often give a false sense of security, because most of the issues I have to fix (which are frankly the hard issues, not simple logic bugs) heavily involve how the code interacts with its environment. I'm talking OS level, user level, privilege escalation (such as UAC on Windows), networking, load, performance, and stress.
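To make the false-sense-of-security point concrete, here's a minimal sketch (the function, path, and test are all hypothetical) using Python's unittest.mock: the test goes green precisely because the things that break in production, permissions, disk state, the network, are what the mock is hiding.

    # A minimal sketch of the false-confidence problem (hypothetical
    # function and path) using Python's unittest.mock.
    import unittest
    from unittest.mock import patch

    def save_report(path, data):
        # In production this can fail with PermissionError, a full disk,
        # a flaky network share, UAC denying the write, etc.
        with open(path, "w") as f:
            f.write(data)
        return True

    class TestSaveReport(unittest.TestCase):
        @patch("builtins.open")
        def test_save_report_writes_data(self, mocked_open):
            # Green even on a machine where the real write would be
            # denied: the environment is exactly what the mock hides.
            self.assertTrue(save_report("C:/Program Files/app/r.txt", "ok"))
            mocked_open.assert_called_once_with("C:/Program Files/app/r.txt", "w")

    if __name__ == "__main__":
        unittest.main()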
It's not that I think unit tests aren't good; they are. But I think people focus on them too much, to the detriment of other testing. Integration testing should catch all the things the unit tests would catch as well; the only real difference is that you have two copies of the test, and the unit test gives you more granularity into what is failing when you look at the test suite results. But once you have a call stack or a logging facility (especially around error conditions), which is best practice anyway, the culprit is usually obvious the moment you look at the failing test.
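And to illustrate the logging point, a sketch (names are made up): if errors are logged where they occur, even a coarse end-to-end test result points at the right layer.

    # A sketch (hypothetical names) of error logging localizing failures:
    # the log line at the point of failure carries the context.
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("orders")

    def apply_discount(total, code):
        if code not in {"SAVE10", "SAVE20"}:
            # Logged where it happens; a failing end-to-end run already
            # says which layer rejected what.
            log.error("unknown discount code %r for total %s", code, total)
            raise ValueError(f"unknown discount code: {code}")
        return total * (0.9 if code == "SAVE10" else 0.8)

    def test_checkout_rejects_bad_code():
        try:
            apply_discount(100.0, "SAVE99")
            assert False, "expected a ValueError"
        except ValueError as e:
            assert "SAVE99" in str(e)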
I totally agree that unit tests are overrated for bug prevention in new code. I mainly like them because they (1) encourage decoupled code and (2) make future refactoring easier, especially in dynamically typed languages.
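A quick sketch of point 1 (hypothetical names): the mere act of writing a unit test pushes a hidden dependency, here the clock, out into a parameter, and that seam is the decoupling.

    # Making this unit-testable forces the clock out of the function.
    from datetime import datetime, timezone

    def make_greeting(name, now=None):
        # 'now' is injectable so a test can pin the time; production
        # callers just omit it.
        now = now or datetime.now(timezone.utc)
        period = ("morning" if now.hour < 12
                  else "afternoon" if now.hour < 18 else "evening")
        return f"Good {period}, {name}"

    def test_make_greeting_is_deterministic():
        fixed = datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)
        assert make_greeting("Ada", now=fixed) == "Good morning, Ada"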
Not having other tests is a huge mistake. I think I'm spoiled because I've spent my entire career at places that cared deeply about testing and proper TDD.
I think it's a mistake to assume that integration tests will capture all of the bugs unit tests will. Even with a fully white-box approach, the analogy I'd make is testing from the next room over with a waldo arm that can only move in a preset number of directions.
It's just much harder to put together tests that hit any level of internal granularity from a distance than it is when you control things close to the target. Theoretically it can be done, but that doesn't make it a good idea. Plus you potentially end up with a large body of brittle tests that can be invalidated by any hiccup in the stack.
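A sketch of that granularity gap (hypothetical helper): the unit test reaches an internal edge case in one line, while an integration test would have to steer a whole request through the stack just to land on that branch.

    # The unit test pokes the edge case directly; from "the next room
    # over" you'd need a server rigged to emit a negative Retry-After.
    def parse_retry_after(header_value):
        # Internal helper: seconds to wait, or 0 if unparseable.
        if header_value is None:
            return 0
        try:
            seconds = int(header_value)
        except ValueError:
            return 0  # e.g. an HTTP-date format we choose not to support
        return max(seconds, 0)

    def test_negative_retry_after_is_clamped():
        assert parse_retry_after("-5") == 0

    def test_garbage_retry_after_is_zero():
        assert parse_retry_after("Wed, 21 Oct 2015 07:28:00 GMT") == 0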
Integration tests really should concentrate on the behavior of the integration. That's not just a philosophical stance; it's about not hitting the hockey stick of diminishing returns in your testing while trying to maximize the value of the results.