In the first case you will find you are presented with some "new idea" which is so revolutionary it has its own set of buzzwords, its own methodology, and its own unique values. Except that it isn't new; it's like those folks who drag up popular HN posts from 2 or 3 or 4 years ago and re-post them for the quick karma hit. There are people who re-wrap old ideas and pass them off like holiday presents at a white elephant gift exchange. The trick there is to spend enough focus points to validate that the idea isn't really all that new, identify where (and if) it varies from what you've already seen/heard/done, and then either discard it or integrate it into the same slot.
I'd strongly argue that this is a very bad filter. Almost all good ideas are "not new". The fact that you recognise them as not new just means you've been around the block now. 3-4 years ago they were just as old - you were just younger and more naive.
Discarding an idea just because it's not new seems foolish. If it's good, use it - even if it was first dreamt up by the ancient Sumerians.
I'm a big fan of going back and re-asking the question, because things change and sometimes stuff that was impossible before isn't now.
As an example, one of the things that 'killed' Java[1] in '94 was that the team had built this very easy to use 'cartoon' type UI for video on demand applications. The problem? Our "set top" box was a SparcStation 10 with two CPUs and 64MB of memory and a cg6 frame buffer. That configuration cost $15,000 (roughly), and we talked to cable type folks who were providing the set top boxes (OpenTV) and plant operators (Palo Alto Cable Co-op), and they agreed they were not going to use more expensive set top boxes; they would spend perhaps $100 on one.
Running a complex, animated, graphical UI died that day.
But if you came to me today and said "We're going to have this really cool animation with a guy who runs around and makes suggestions and interacts with this world." I'd be totally cool with it because you can get enough CPU and graphics horsepower to do that for a retail price of $100 now.
So re-examining old problems with new information is a worthwhile thing to do. And that is what older engineers need to be prepared to do (and if done well they can quickly zero in on the changes, since they already know what went wrong the first time). But sometimes the problems with the first version still exist in the new shiny version, and in those situations, when you're not in a position to just say "Please don't do this, it won't work," it is very hard to stay focussed on getting through enough of the steps so that everyone else understands as well.
[1] - One of the pitches the Oak (later renamed Java) team made to management about why the group should continue was that it could help in Sun's battle with SGI and their efforts to dominate the interactive video market. But given the set top box requirement it was considered impractical. The server stuff lived on in "Sun Interactive" but the set top business was another dead end.
Yes, think of tablets and how many "history cycles" they required. The context of the idea is extremely important.
The lesson? If you have a rough idea about how the future will look in some domain, start building today. The difficult part is knowing what to build. For example, you need a fancy rich UI for your application but there are no components for it, so you start building your own GUI library; BUT at the same time juggernauts like Adobe, Sun, Microsoft, and Google are trying (in different technology eras) to move forward with Flash, Java, Win32, Silverlight, WPF, and HTML5.
What is the correct decision? Waiting like a Confucian can be the best option.