You wouldn't say that if the camera were assisting with a serious surgery or firing a military rocket. In those cases it's not okay: the bar needs to be set very high, and there need to be penalties or fines for mistakes if we're going to jump on the fully autonomous bandwagon.
Those applications already have higher standards for general software reliability, to say nothing of AI. Obviously the bar is different for different applications.
Considering that at least one cameraman has lost his job recently, there's always going to be a social cost to moving to pure tech. It's never just one thing (i.e., it's never just a soccer match).