I don't disagree that this is an interesting and troubling trend to keep an eye on. The lowered cost of war absolutely has all the effects you describe, and that is, in a global sense, a scary thing with disconcerting implications.
However, it is philosophically untenable to take the position that we should be forced to risk a life in order to take one. That is an anachronism that only feels justified out of a misplaced sense of moral fairness. What we should be doing is focusing on better procedures to prevent civilian casualties. Drones actually represent a great opportunity to employ very stringent procedures to prevent such things, much more so than a fighter jet or troops on the ground. You can easily do things like have multiple people sign off before a missile is launched, employ machine learning to enhance images and attempt to classify and predict common sources of operator error, etc.
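To make the "multiple people sign off" idea concrete, here is a toy sketch of a quorum-style authorization gate. Everything in it (the Authorization class, required_signoffs, the operator names) is hypothetical, just illustrating the kind of procedural check I mean, not any real system:

    # Toy sketch: require N independent sign-offs before an action is authorized.
    from dataclasses import dataclass, field

    @dataclass
    class Authorization:
        target_id: str
        required_signoffs: int = 3
        signoffs: set = field(default_factory=set)

        def sign(self, operator_id: str) -> None:
            # A set ensures each operator counts only once toward the quorum.
            self.signoffs.add(operator_id)

        def approved(self) -> bool:
            return len(self.signoffs) >= self.required_signoffs

    auth = Authorization(target_id="example-target")
    auth.sign("operator_a")
    auth.sign("operator_b")
    assert not auth.approved()   # still one short of the quorum
    auth.sign("operator_c")
    assert auth.approved()

The point isn't the code itself, it's that this kind of gate is trivial to enforce in a drone control loop and essentially impossible to enforce for a pilot or a squad on the ground.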
I was irked that you were downvoted; hopefully my upvote brings you out of the gray.
===
You make solid points, and I don't disagree with you entirely. I do want to clarify: my point of view is not based on a sense of moral fairness. War is probably always going to be unfair; otherwise it's an even match and even more pointless. If I were for fairness I'd want the war to be fought locally and not just in other countries, but I don't want that.
What I want is for it to end. The ability to cover up the human costs by distance from the front, distance from the people fighting (on either side), means it's too easy for us to allow it to continue. I don't want US men and women in harm's way. I want them out of harm's way, and for us to stop killing people who (mostly) just want to live their lives. Some level of military action will probably be necessary somewhere in the world, as entangled as we are, at any given time. But going to more automated and unmanned systems allows us to continue to be far more violent than circumstances actually require, with little cost to the elected and appointed officials directing this violence in our name.
I basically agree entirely with you. However, I think almost everyone would. If you asked Obama or any of the generals running these drone ops to sign off on what you just said, I think they'd gladly do so. They'd then likely go on to say that the war they're waging is one of these unfortunate necessities. Further, that the civilian casualties horrify them, but they're doing what they can to stop them while still accomplishing the (in their minds) paramount goal of eliminating terrorist threats.
Ultimately it seems to me to come down to this: it's ok to use drones when you come down on the right side of that moral calculation, and not when you don't. Which, of course, makes the drones an irrelevant part of the equation in the first place, since that is the fundamental moral question of all use of force.
Maybe the answer is to re-personalize the nature of drone strikes. Maybe their video feeds should be made publicly available X amount of time after they've used their weapons?
If I get you correctly, another re-phrasing could be: if we put a human soldier into a dicey situation, like an unexpected mix of civilians and terrorists, we are obligated to get that human out even if doing so risks more soldiers and local civilians. But if we put a drone into the same dicey situation, we can just accept the loss of that drone and self-destruct it without harming any soldiers or civilians.