> The issue is that the effective altruism crowd are focused exclusively on AI safety, at least in what they publish online.

Just looking at the topics section of the EA forums (https://forum.effectivealtruism.org/topics/all), we've got

- 466 posts in AI safety

- 1060 posts in existential risk (seems like a roughly even mix of pandemics, nuclear war, and AI)

- 816 posts in global health

- 193 posts in animal welfare

- 42 posts in biosecurity & pandemics


I've never been on this website before, but am I correct in understanding that the "biosecurity" category is about a month old?

It would be interesting to understand how much of the "existential risk" category is devoted to biosecurity or pandemics, and how recent those posts are.

Pending answers to the above, my tentative takeaway is that COVID - arguably the greatest threat to human health and wellbeing in the past several decades - was at best a minor blip on their radar until it actually happened. This raises serious questions in my mind about the community's predictive abilities and/or biases.


> am I correct in understanding that the "biosecurity" category is about a month old?

It looks like the Forum is confusing here: it has "Biosecurity & Pandemic Preparedness" [1] as one topic, and "Biosecurity" [2] as its largest subtopic. But when you click on the parent topic, it doesn't show the subtopic posts; I've filed a bug.

Older good biosecurity posts include:

* 2014: Good policy ideas that won’t happen (yet): https://forum.effectivealtruism.org/posts/n5CNeo9jxDsCit9dj/...

* 2017: Biosecurity as an EA Cause Area: https://forum.effectivealtruism.org/posts/eXgCREwL5PQWHEtvm/...

* 2021: Concrete Biosecurity Projects (some of which could be big): https://forum.effectivealtruism.org/posts/u5JesqQ3jdLENXBtB/...

[1] https://forum.effectivealtruism.org/topics/biosecurity-and-p...

[2] https://forum.effectivealtruism.org/topics/biosecurity


Thanks for the reading list. The 2017 talk/article is interesting in light of COVID:

"Some of the reasons I'm skeptical of natural risks are that first of all, they've never really happened before. Humans have obviously never been caused to go extinct by a natural risk, otherwise we would not be here talking. It doesn't seem like human civilization has come close to the brink of collapse because of a natural risk, especially in the recent past.

"You can argue about some things like the Black Death, which certainly caused very severe effects on civilization in certain areas in the past. But this implies a fairly low base rate. We should think in any given decade, there's a relatively low chance of some disease just emerging that could have such a devastating impact. Similarly, it seems like it rarely happens with nonhuman animals that a pathogen emerges that causes them to go extinct. I know there's one confirmed case in mammals. I don't know of any others. This scarcity of cases also implies that this isn't something that happens very frequently, so in any given decade, we should probably start with a prior that there's a low probability of a catastrophically bad natural pathogen occurring."

I wonder if this is a case of the extrapolation fallacy? Modern human movement patterns are significantly different from both animal behaviour and pre-modern human behaviour. Viruses spread more easily in the age of global travel; those that used to be too deadly to spread very far suddenly have more opportunity.

EDIT: Reading this more carefully, the speaker does actually address globalisation, but seems to dismiss it as a counterargument, and I'm not really sure why.


The rationalist and EA communities are often credited with taking Covid seriously much sooner than most people and the media did. https://unherd.com/2020/12/how-rational-have-you-been-this-y...


> Much sooner than most people and the media did.

This is a very low bar to jump over!

Anyway, I read through that article. There's a lot of guff about Newcomb's paradox and Eliezer, and a single paragraph referencing COVID, with two links supporting your statement. I've clicked through those links and come away unimpressed.

The first link is used to support the claim that "The rationalist community was well ahead of the public": https://slatestarcodex.com/2020/04/14/a-failure-but-not-of-p...

Scott rightly highlights how the media and institutions got things wrong, but offers only a few examples of "generic smart people on Twitter" getting things right in the early days of the outbreak. He confesses that he himself did not predict the seriousness of COVID.

The second link is used to support the claim "the wider tech community, were using masks and stocking up on essential goods, even as others were saying to worry about the flu instead": https://putanumonit.com/2020/02/27/seeing-the-smoke/

The linked article was written at the end of February, when panic-buying had already firmly set in across the US.

There is nothing here about the reasoning or predictive abilities of the rationalist or EA community specifically. Nor is there any compelling comparison of its response with that of the wider public.


I didn't really have much of a dog in this fight, but looking at that breakdown, I wonder which oil company is bankrolling them.

I know I'm jumping the gun just a little, but c'mon. An apocalypse list without climate?


> An apocalypse list without climate?

The topics listing isn't great, but there's been a lot of discussion about climate change on the forum: https://forum.effectivealtruism.org/topics/climate-change (239 posts)


Climate change is unlikely to kill everyone on Earth. Other risks, such as an engineered pandemic, an asteroid impact, or an AI apocalypse, could kill ~everyone. This is not to say that climate change isn't a real issue.


Well, we're past the asteroids, at least. [1]

The AIpocalypse seems incredibly unlikely. I'm a lot more worried about the nukes, and even if we end up with robotic overlords, I'd bet they'll be a whole lot better at administration than meat-admins have proven to be.

I'm with you on engineered superviruses. Feasible, likely, and incredibly high impact.

What I keep coming back to is the risk profile of all this stuff. That magic product of likelihood * impact. Global warming is happening now. And it's real bad - worse than I think we give it credit for.
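
(As a toy illustration of that likelihood * impact framing - the risks and all the numbers below are made-up placeholders, not actual estimates:)

    # Toy expected-harm ranking: score = likelihood * impact.
    # Every number here is illustrative only, not a real estimate.
    risks = {
        "climate change": (0.99, 30),        # (probability, relative impact)
        "nuclear war": (0.10, 80),
        "engineered pandemic": (0.05, 100),
    }

    # Rank risks by expected harm, highest first.
    for name, (p, impact) in sorted(
        risks.items(), key=lambda kv: -(kv[1][0] * kv[1][1])
    ):
        print(f"{name}: expected harm = {p * impact:.1f}")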

What I worry about is that we're the proverbial frog in the pot. Things get just slightly hotter each year, so we'll miss it when we actually boil.

[1] https://www.google.com/search?q=double+asteroid+redirection+...


So what you're saying is that if a risk can totally upend our society, destroy most of our cities, make most of our farmland unusable, destabilize geopolitics in a way that's almost certain to lead to war between nuclear powers (who also have the ability to engineer deadly diseases), and massively disrupt every ecosystem on earth, and there's enough evidence to say with almost complete certainty that this will come to pass without massive societal and political change and technological intervention... but it's unlikely to kill everyone... then it doesn't really deserve mention?


The cascading risks you mention are certainly real and serious, and deserve our best and most urgent efforts. Effective altruists are rightly concerned about these effects, e.g. https://80000hours.org/problem-profiles/climate-change/

My comment was written with the summary of that article in mind (I didn't make this clear):

> Climate change is going to significantly and negatively impact the world. Its impacts on the poorest people in our society and our planet’s biodiversity are cause for particular concern. Looking at the worst possible scenarios, it could be an important factor that increases existential threats from other sources, like great power conflicts, nuclear war, or pandemics. But because the worst potential consequences seem to run through those other sources, and these other risks seem larger and more neglected, we think most readers can have a greater impact in expectation working directly on one of these other risks.

There is an excellent chapter on the existential risks associated with climate change in Toby Ord's book The Precipice, which you can get a free copy of at https://80000hours.org/the-precipice/


Says whose crystal ball?


> 42 posts in biosecurity & pandemics

The way the EA forum handles parent topics is confusing, but this is only counting posts tagged with the top-level label. It has several subcategories, and there are hundreds of posts when you include them:

* 218: Biosecurity

* 176: Covid

* 54: Pandemic preparedness

* 40: Global catastrophic biological risk

* 39: Vaccines

* 20: Biotech

* 16: Life sciences

* 11: Dual-use

* 10: Biosurveillance

(Posts can have multiple labels, so the above list double-counts a bit. I don't see an easy way to extract the actual count from the forum, but it's somewhere between 218, the count for the largest subtopic, and 584, the sum of all the counts above; see the sketch below.)
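
(A quick sanity check on those bounds, treating each tag count as the size of a possibly-overlapping set - the number of distinct posts is at least the largest count and at most the sum:)

    # Bounds on distinct posts given per-tag counts, where a post may
    # carry several labels (so the tag sets can overlap arbitrarily).
    counts = [218, 176, 54, 40, 39, 20, 16, 11, 10]
    lower = max(counts)  # smaller tags could all be subsets of the largest
    upper = sum(counts)  # or no post carries more than one tag
    print(f"between {lower} and {upper} distinct posts")  # 218 to 584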



