"cataloguing these risks is important if we’re going to be serious about having an impact in important but ‘fragile’ fields like reducing extinction risk"
I certainly hope that I'm mistaken, but that sounds an awful lot like a piece of pro-eugenics propaganda.
Personally, I would suggest that an attitude of 'love thy neighbor' rather than 'we're the best set of people to fix your problem for you' is what's best for humanity.
I can tell you that the "extinction risks" EA people are likely to be concerned with are things like nuclear war, runaway AI, runaway bioweapons, catastrophic feedback loops related to global warming, etc. The "extinction" they're referring to is that of the entire human race.
If you thought they were referring to something else, I wonder where you got that idea.