Playing God: creating technology or inventions that are far beyond our control, or whose unintended consequences could cause irreversible material or psychological damage to a large proportion of humanity. Such technology proliferates despite attempts to contain it, and it would proliferate under a variety of economic and political models, so there is hardly any way to stop it short of complete physical destruction.
The solution is simple: take it step by step, so that we only ever deploy technologies/inventions that are slightly beyond our control. Learning means making mistakes; we can control their magnitude, but ultimately, the only way to never make a mistake is to never do anything at all.
> We have never applied this kind of deployment so far. Take fossil fuels, for example -- we still can't control them.
Let's not forget that pretty much everything you consider nice and beneficial about modern existence - from advanced healthcare to opportunities and entertainment, even things as trivial as toilet paper, tissues, packaging, or paint - is built on top of petrochemical engineering. Sure, we're dealing with some serious second-order consequences, but if we overcome them, we'll end up better off than when we lived a "simple life more connected to nature".
> I'd argue that it's much better to live a simple life more connected to nature, even if that means more diseases and more manual labor.
Hard disagree.
> this life is at the expense of the DEATH of millions of nonhuman organisms
It's not like nature cares either way. Sure, we may be accelerating the disappearance of whole species, but even then we're more humane about the killing. It's really hard to do worse than nature, where every multicellular organism is constantly fighting against starvation, disease, or being assaulted and eaten alive.
I have the feeling that concepts like money and property fit your description, with the additional detail that much more than humanity has been impacted. I also feel that AI has a pretty good chance of being the only way to reverse the significant material/psychological damage those have already caused. It seems short-sighted not to weigh the whole of our "playing god" so far, and to dismiss the idea that we might be better off deferring to, or getting help from, a different intelligence at this point.