But we don't know the full story (for each individual) - we don't know if it really is discrimination based on age or if there are other things going on. I've seen both things happen in medium-sized companies, so I wouldn't be surprised if it's the same at Microsoft (or other major companies, for that matter).
- case 1: a guy in his fifties writing C++ like it's C with classes, in the same style he did 20 years ago. Every piece of code is a complete trainwreck; I could go on for hours about how bad it is. Anyway, the programs he creates sort of work. Barely. But because there isn't anyone else to do his job, and management is stupid and too lazy to find someone else and would rather wait until it explodes in their faces (i.e. when he retires), he gets to stay. Until one day a new manager steps in, immediately sees the problem, and lets the guy go. Would you really call that discrimination, or rather the proper thing to do?
- case 2: the opposite. Decent guy in his fifties, hard working, always learning, writing proper code, not afraid to speak up. The latter eventually gets him fired because of (again) stupid management: one manager able to convince the rest that he's too old and too costly to keep, the rest of the pack too lazy to stand up and just letting it happen. That is discrimination.
There is also a trickier and more interesting case 3:
Experienced guy in his 50s, seen it all before. Remembers the era of object databases, knows why SQL was invented, and shuts down his hipster colleague's plans to use MongoDB. Remembers CORBA, SOA and J2EE and is skeptical about microservices. Understands why logging/metrics/dependency injection are necessary and pushes the RoR guys to use more verbose syntax to support them.
Younger folks dislike him for this. There is conflict. Management has a real problem, since they need the experience, but the senior guy can't build everything on his own.
dependency injection isn't needed in RoR, as mocking/stubbing is so simple, and built in (with RSpec, say)... (Same with metrics, they can easily be injected into the entire object tree.) Logging, sure, you need to write some of that.
There are arguments for NoSQL DBs as well; sometimes old knowledge is exactly that.
This doesn't seem to be a compelling example... But alas, I think I've taken this off topic.
> dependency injection isn't needed in RoR, as mocking/stubbing is so simple, and built in (with RSpec, say)
Disclaimer: not a RoR guy.
Dependency injection isn't about unit tests. Dependency injection is about weak dependencies, easier refactoring, and preventing bad design decisions from being a disaster 5 years down the line. Dependency injection is a good idea no matter what the language/framework is.
That's the point of the parent comment. A dev in his 50s would point these things out.
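For what it's worth, here is a minimal constructor-injection sketch in Java of what "weak dependencies" buys you (all names are made up for illustration): the report code depends only on an interface, so the storage implementation can change later without touching it.

    import java.util.List;

    public class ReportService {

        // The weak dependency: ReportService only knows this interface.
        public interface OrderStore {
            List<String> ordersFor(String customer);
        }

        private final OrderStore store;

        public ReportService(OrderStore store) {   // injected, not constructed in here
            this.store = store;
        }

        public int orderCount(String customer) {
            return store.ordersFor(customer).size();
        }
    }
    // Wiring happens in one place, e.g. (SqlOrderStore is hypothetical):
    //   ReportService reports = new ReportService(new SqlOrderStore(dataSource));
    // Swapping SqlOrderStore for something else later never touches ReportService.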
Dependency injection in practice is almost always implemented for tests.
In terms of architecture, it's usually a case of YAGNI. Every dependency injected is a configuration point that is almost never altered, except for mocks in testing. When this style infects a whole codebase, it makes it far harder to read and navigate, because code flow is dependent on runtime data flow. Heavy use of indirections that only ever go to the one place is a bad code smell, and it's a stench over the entire field of enterprise Java.
Code that is easy to read is easy to refactor. Except for natural architectural chokepoints that are typically intrinsic to the problem being solved, you're fooling yourself if you don't think architecture changes are needed for most refactoring - dependency injection isn't buying you what you think you're buying. Every injection point you create is a prediction about the future, about the possibilities of change, but there's one gotcha: the future is hard to predict, so most of your decisions are wrong.
You're better off being agile, following YAGNI, doing the simplest thing that will work, and altering it when requirements change.
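To make the smell concrete, the kind of indirection described above tends to look something like this (names are made up): an interface that only ever resolves to one class outside of test mocks.

    // An interface with exactly one real implementation.
    interface CustomerLookup {
        String nameFor(long customerId);
    }

    class DatabaseCustomerLookup implements CustomerLookup {   // the only real implementation
        @Override
        public String nameFor(long customerId) {
            return "customer-" + customerId;                   // stand-in for a real query
        }
    }

    class InvoicePrinter {
        private final CustomerLookup lookup;                   // injected "just in case"

        InvoicePrinter(CustomerLookup lookup) {
            this.lookup = lookup;
        }

        String header(long customerId) {
            // The reader has to chase what lookup is bound to at runtime,
            // even though in practice it is always DatabaseCustomerLookup.
            return "Invoice for " + lookup.nameFor(customerId);
        }
    }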
I've done DI many times for things other than tests - swapping out different algorithms is the most common case (BetaBanditCalculator -> TimeVaryingBayesianBanditCalculator -> HierarchicalPersonalCharacteristicCalculator, etc). It also makes testing easy, which is great, but that's far from the only use case.
DI tends to work great for the integration point between systems - e.g., connecting the REST or Thrift interface to the calculation backend and datastore.
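Roughly, the algorithm-swapping case looks like this (the calculator names come from above; the arm-selection logic here is only a placeholder, not a real bandit implementation):

    import java.util.List;
    import java.util.Random;

    interface BanditCalculator {
        int chooseArm(List<Double> observedRewards);
    }

    class BetaBanditCalculator implements BanditCalculator {
        private final Random random = new Random();

        @Override
        public int chooseArm(List<Double> observedRewards) {
            // Placeholder: a real version would sample from Beta posteriors.
            return random.nextInt(observedRewards.size());
        }
    }

    class Experiment {
        private final BanditCalculator calculator;   // injected, so implementations can be swapped

        Experiment(BanditCalculator calculator) {
            this.calculator = calculator;
        }

        int nextArm(List<Double> rewards) {
            return calculator.chooseArm(rewards);
        }
    }
    // new Experiment(new BetaBanditCalculator()) today,
    // new Experiment(new TimeVaryingBayesianBanditCalculator()) later; Experiment is unchanged.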
The fact that a concept is hard to use in Enterprise Java is a poor argument against anything other than Enterprise Java.
DI is nothing more than partially applied functions, which is useful to limit the number of values flowing explicitly to the call point. Limiting the data-flow diameter is good, as our brains can only handle so much information simultaneously.
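As a rough Java sketch of that equivalence (names invented): the dependency is applied once, up front, so call sites only pass the data that actually varies.

    import java.util.function.DoubleUnaryOperator;

    public class PartialApplicationDemo {

        interface RateSource {                       // the dependency
            double rateFor(String region);
        }

        // Fully explicit: the dependency flows to every call site.
        static double tax(RateSource rates, String region, double amount) {
            return amount * rates.rateFor(region);
        }

        // "Injected": the dependency is partially applied up front.
        static DoubleUnaryOperator taxCalculator(RateSource rates, String region) {
            return amount -> amount * rates.rateFor(region);
        }

        public static void main(String[] args) {
            RateSource flatRates = region -> 0.2;                 // stand-in implementation
            DoubleUnaryOperator euTax = taxCalculator(flatRates, "EU");
            System.out.println(euTax.applyAsDouble(100.0));       // 20.0, no RateSource at the call site
        }
    }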
On the other hand, DI frameworks are a "modern" way to do global objects. I have no clue what problem they solve or why using a DI framework is a good idea. I had the "pleasure" of working with a Java DI framework project last week; it turned 10 lines of code into 300 lines of boilerplate spread across 3 different packages.
Furthermore, I agree, the pervasive use of mocks is a smell. The point of tests is to have a test suite T.A that validates module A. Littering the test base with mocks of A which don't pass the test suite T.A is just a recipe for pain, both semantically and in increasing the amount of code change required by a refactoring. A is changing? Go chase the 100 mocks that make their own assumptions about A and fix them. Your tests are now part of the liability under change instead of being the safety net that tells you whether your module B, which depends on A, is still working.
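A small sketch of that drift (all names made up): module B is tested against a fake of A that encodes its own assumptions about A, so B's tests keep passing after A changes.

    import java.util.List;

    interface PriceSource {                        // module A's interface
        long priceOf(String sku);                  // suppose the real A now returns cents instead of euros
    }

    class FixedPriceSource implements PriceSource {    // the fake used in B's tests
        @Override
        public long priceOf(String sku) {
            return 10;                             // still assumes "10 euros", not 1000 cents
        }
    }

    class CartTotal {                              // module B
        private final PriceSource prices;

        CartTotal(PriceSource prices) {
            this.prices = prices;
        }

        long total(List<String> skus) {
            return skus.stream().mapToLong(prices::priceOf).sum();
        }
    }
    // B's tests against FixedPriceSource keep passing after A's change,
    // so they no longer tell you whether B still works against the real A.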
One extra point: what @barrkel is saying here works much better in a language which is fairly concise. In more verbose languages, any kind of change becomes more difficult (there is simply more code to change and more opportunities to make mistakes), so it is understandable that people want to build in "hooks" for extension.
Dependency injection should pretty much be renamed "unit testing in Java". Sure, you can come up with contrived examples like algorithm swapping or integration-point swapping, but I would hazard a guess that covers about 1% of the DI code written; the other 99% is for unit testing.
You can achieve weak dependencies, easy refactoring and avoid bad design without DI (maybe it's harder to do in Java though :\).
The magic question of DI is: how do you decide which dependencies are worth injecting vs. keeping as static/strong dependencies? In my experience it's "inject everything", which leads to convoluted code. The correct answer (imo) is "you can't, unless you can predict the future".