That's another case of the automation paradox though, is it not? The engineers are suddenly inundated with all kinds of information, and it isn't clear what the actual cause is. When you look at AF447, part of the problem was that the pilots appear to have been unaware that all the errors they were seeing stemmed from a single root cause: the loss of airspeed data after the pitot tubes iced over.
So most of the time you aren't looking at anything, then suddenly something happens and you get a flood of information about what exactly isn't working right now, any of which may or may not be useful in determining what the real problem is. And at that point humans are expected to step in quickly and react flawlessly, based on perfect knowledge of the failure.