I realized this back in the mid-1990s, when suddenly everyone wanted a "webmaster" with 10 years of experience and HTML - a document markup language - became the most important "programming language" one could know.
Since then I have studiously avoided specializing in any technology. As soon as I feel like I've spent enough time in a particular stack to start to "know" it, I move on. I have refused to be pigeonholed into any particular tech.
The coding and language skills are the least important skills I have, and I intend to keep it that way. However, I can present an extremely long laundry list of technologies I have built solutions with - the length of the list, not the presence of any particular TLA on it, is the key to demonstrating my ability to learn.
Not sure why you think I haven't learned new technologies.
I have learned them whenever I needed to. Most new technologies are not that special. That makes them easy to learn, but also kind of annoying, because I can see they're just repeating a mistake I saw 20 years ago.
I obviously don't think you haven't learned new technologies in 25+ years. It seemed like the claim of ageism was that the kiddies expect you to know the hot new technology. But requiring candidates to know new technologies isn't applied exclusively to veterans. And it's definitely not a requirement at the other 90% of companies, which are using older technologies.
I thought you thought that, because you said it. Perhaps I missed a nuance; English is not my strongest language.
My point, going up this discussion thread a few clicks, was that new technology is not always better than old technology. Ageism comes into play when one's opinion about the new tech is dismissed just because one has some gray hair.
Welcome to the technology industry.