It might be bad for the specific organization pushing its new solution, but for the software industry as a whole, it seems like the good side of "throw spaghetti at the wall and see what sticks". Many ideas will fail, but the good ones will stick around. So I'd say by all means, let people feel free to innovate. And those of us who consume the innovations just need to remember the mantra of "leading edge, not bleeding edge."
I didn't say it was bad (though the tone of my post definitely hinted at it). I agree with you. Following the trail of necessity can easily lead to locally optimal points, away from the global maximum.
It (and tech innovation in general) is also driven by people scratching their own itch. That's where we started. And we actually didn't start as a DB company, but as an IoT company storing lots of sensor data. We needed a certain kind of time-series database, tried several options, but nothing worked. So we built our own, and then realized other people needed it as well.
And we picked Postgres as our starting point exactly because it wasn't new and shiny but boring and it worked.
I have to say that a sentiment similar to this is why we built Timescale as a Postgres extension rather than starting a new database from scratch. We didn't want a shiny new thing just for kicks, but rather a product focused on solving a particular problem.
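For concreteness, here's a minimal sketch of what the extension model looks like from an application's point of view. This is an illustration, not our actual setup: the table, columns, and connection string are all hypothetical, and it assumes a Postgres instance with the TimescaleDB extension available.

    # Hedged sketch: ordinary Postgres access via psycopg2; all names
    # (database, table, columns) are hypothetical.
    import psycopg2

    conn = psycopg2.connect("dbname=metrics user=postgres")
    cur = conn.cursor()

    # Everything is plain SQL against a plain Postgres database...
    cur.execute("CREATE EXTENSION IF NOT EXISTS timescaledb;")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS conditions (
            time        TIMESTAMPTZ NOT NULL,
            device_id   TEXT NOT NULL,
            temperature DOUBLE PRECISION
        );
    """)

    # ...except this one call, which turns the table into a
    # time-partitioned hypertable. Existing drivers, tools, and SQL
    # keep working unchanged.
    cur.execute(
        "SELECT create_hypertable('conditions', 'time', if_not_exists => TRUE);"
    )
    conn.commit()
    cur.close()
    conn.close()

That's the appeal of the extension approach: the time-series machinery lives behind one call, and everything else stays boring Postgres.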
Relentless change is most likely the motivator for these innovations. However, I don't think it is needless innovation, or innovation for the sake of innovation. My personal opinion is that there isn't much innovation in what this article covers, but it is still happening elsewhere.
Changes in telemetry production (IoT and others) have fundamentally altered the requirements: ingest rates of millions (and often billions) of data points per second are now common. This is being driven by RSM and IoT. RSM is alluded to in my ACM Queue article: https://queue.acm.org/detail.cfm?id=3178371