> Electricity, the Industrial Revolution, the internet, gene editing/bioengineering also came with unknown existential risks.
In a way, all of those things have led to where we are now. And might I point out that a form of bioengineering may have caused the latest global pandemic? Not taking these things seriously because "they haven't killed us all yet" seems a little shortsighted.
> "not taking these things seriously" is very different from regulating something in it's infancy because of fear of the unknown.
I mean, we should have better regulated carbon emissions generations ago. We'd have had a lot more time to deal with the problem, and we still haven't managed to in the century-plus we've known about it. And AI extinction risk is likely to move faster than carbon-related climate change.
> "not taking these things seriously" is very different from regulating something in it's infancy because of fear of the unknown.
I don't pretend to know the right balance of oversight and regulation to optimize for both safety and innovation, but the first step for those responsible is to recognize the potential dangers and unknowns and to produce plans that can be discussed and debated.
> Even accepting this hypothesis as true, is the answer that we should have regulated cell culture in the 50s when HeLa cell culture became a thing?
How about gain-of-function research? How about nuclear weapons? I lean libertarian but also recognize we have a responsibility to safeguard humanity.