Not long ago I had to write some JS (had to, not wanted to) to do something on a site, and got something working based on what I knew, only to be derided by a "real JS developer" who said my code was "deprecated", "not following best practices", "not modern", etc.
He gave me his version, which was around 10x more code and only worked in a subset of the latest browsers, while mine not only worked in those but would probably run in everything since maybe IE5 or so...
Maybe that's considered a bug these days, but I don't want any of this trend-chasing. I write code to get things done. My users don't care about any of that; they want to get things done too.
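To give a rough flavor of the contrast I mean (the actual task isn't worth posting, so this is a made-up stand-in, not the code either of us wrote):

    // Hypothetical illustration only -- not the real code from either version.
    // Old-school style: getElementById plus an onclick property assignment,
    // which have worked since roughly the IE5-era DOM.
    var button = document.getElementById('show-details');
    button.onclick = function () {
      document.getElementById('details').style.display = 'block';
    };

    // A "modern" rewrite of the same thing: optional chaining and arrow
    // functions are syntax errors in older browsers, so the whole script
    // fails to parse there rather than degrading gracefully.
    document.querySelector('#show-details')?.addEventListener('click', () => {
      document.querySelector('#details').style.display = 'block';
    });

Neither snippet is the real thing, but it shows how newer syntax alone, before any framework or build step even enters the picture, can lock out older browsers while the plain DOM version keeps working.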
If that's his approach, he probably doesn't know JavaScript all that well. I remember I got invited to an interview one time and they were impressed by my code for their coding assignment, partly because it fit on less than a page when printed out (no minification), and they mentioned another applicant whose solution had taken six pages. And I was just like: how could it take that much code?
If someone ended up with 10x the code using "modern" JS and it did not work cross-browser, while yours did, including IE6 and up, then I strongly suspect I would prefer yours.
I think the "should be improved" is implicit in the first step. Does "can be improved" imply a suggestion? If not, the third step is the suggestion, and it's that third step I'm always seeing problems in. Not to mention that the third step, in a scenario of too much to learn, runs into the XKCD standards problem: https://xkcd.com/927/