
> to assume that SBF only turned to a life of crime because the EA movement lured him into it.

It's not an assumption. It's a matter of record, understood by anyone who knows the basics of the FTX saga. SBF is a lifelong utilitarian, and his co-conspirators were also committed EAs. Anyone who obscures that fact is abetting the campaign of obfuscation.

> Many of the people involved seem to have good intentions

Yup. That includes Sam and his friends. How did that turn out?

> a lot of money has been donated

To borrow your language, "it's quite a leap to assume" that anyone donated money just because EA lured them into it. How do we know they weren't going to behave altruistically in the absence of the movement?

Why is there an isolated demand for rigor when confronting the movement's adverse effects? Do you think that Sam is inherently a criminal, with ideology playing no role, whereas EA donors aren't inherently generous?

> to objectively good causes

Fair enough. I will concede that there were a few of these. In some cases, early EA principles may have helped people arrive at these good ideas in a way that wouldn't have happened without the movement.

It's just that I get annoyed when people dismiss the downsides, writing them off as aberrations or lone bad actors. And given the major flaws in the philosophy, it's hard to shake my suspicion that most of their longtermist explorations are worse than just "squishy". It seems quite plausible that they're doing more harm than good.




This type of argument can be used against pretty much any group that has ever existed. Either it's too broad to be a meaningful critique of the group itself, or it's really a critique of humans getting together in general.


Sorry, you've lost me here. Which part of my argument are you referring to?

I'm not criticizing a group, btw. I'm criticizing ideas. Ideas have specific consequences. Some ideas inspire good actions. Some inspire bad actions that outweigh the good.

When an idea claims to have big consequences in the distant future, we can look at its consequences in the present day to help us guess the likely nature of those future consequences.

Oxford deals in ideas, and gets to decide which ones to host and fund. Sometimes they get it wrong, mistaking bad ideas for good ones. That's unfortunate, but it's nice when they come around to the right assessment eventually.


The grandparent poster said it was quite a leap "to assume that SBF only turned to a life of crime because the EA movement lured him into it," and you responded, "It's not an assumption. It's a matter of record, understood by anyone who knows the basics of the FTX saga. SBF is a lifelong utilitarian, and his co-conspirators were also committed EAs. Anyone who obscures that fact is abetting the campaign of obfuscation." Your response never actually addressed the grandparent's statement, because you never showed which parts of Effective Altruism encouraged crime. Instead, you showed that the FTX criminals claimed they believed in Effective Altruism.

Every movement, religion, organization, company, and government has bad apples. Just because Sam Bankman-Fried and his executives said they followed Effective Altruism does not mean they really did. Could you please cite some evidence that Effective Altruism's beliefs encourage crime? Also, could you please explain why a belief or movement should be judged by what a few members do?

I am making this post because I do not like seeing groups, organizations, and movements blamed for what a few members do. It's unjust and unhelpful.


If you want a citation that EA belief does, in practice, encourage crime, I would direct you to literally all the reporting on FTX.

But if I'm reading you correctly, you're asking instead for a citation of some EA materials in which EA leadership says, "do fraud". There is no such citation. That is not how this works. A recurring pattern in history is that good-sounding ideas can have bad consequences, and when that happens, the people spreading them rarely come out and say "do bad stuff".

> please explain why a belief or movement should be judged by what a few members do?

A belief or movement should be judged by the total impact across all members. I'm not just talking about a naive weighted sum -- that would just be more utilitarianism, which is the evil idea we need to unlearn here.

A better framing is that you're trying to infer a risk distribution from the observed data. In so doing, you incorporate an understanding of the history of altruism, where it's way easier to do major damage than major good. Therefore, you heavily weight a downside event, because any observed failure is merely a taste of the true tail risk.

Tail risk is easy to underestimate (see the work of Nassim Taleb on this). If you throw out negative data points, you're making it even harder to appreciate the worst case.
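
To make the censoring point concrete, here's a toy Python sketch. All the distributions and numbers are invented for illustration (they're not anyone's real data or model): impacts are mostly small positives with a rare, heavy-tailed downside, and discarding the negative observations makes the estimated tail look benign while the true tail stays catastrophic.

    # Toy illustration (invented numbers, not a real model): censoring
    # negative observations hides a heavy left tail.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Mostly modest positive impacts...
    impact = rng.normal(loc=1.0, scale=0.5, size=n)
    # ...plus rare, Pareto-distributed catastrophes (~0.1% of cases).
    is_disaster = rng.random(n) < 0.001
    impact[is_disaster] = -50 * (rng.pareto(a=1.5, size=is_disaster.sum()) + 1)

    print("true mean:              ", impact.mean())
    print("true 0.1% quantile:     ", np.quantile(impact, 0.001))

    # "Write off the bad apples": drop every negative data point.
    kept = impact[impact >= 0]
    print("censored mean:          ", kept.mean())
    print("censored 0.1% quantile: ", np.quantile(kept, 0.001))

The censored estimates look great precisely because the only data points that could have warned you were thrown away.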

> I am making this post because I do not like seeing groups, organizations, and movements blamed for what a few members do.

I suppose this is a fine instinct in general. And I'm all for protecting minority viewpoints and their right to think freely.

But when a group makes wild claims about how its revolutionary thought process can discover shortcuts to astronomical benefits in the distant future, then they're operating on pure ideology without much of a feedback loop. In such a case, the question of the ideology's tail risk becomes extra important.

You may notice that by my reasoning here, the lessons from FTX should have less bearing on how we view the "neartermist" cause areas, where EAs are measuring their impact and reacting to facts on the ground. This is indeed true. I believe that early EA did a lot of good, and there are still a few "neartermist" EAs left, doing decent work.

So to bring the thread full circle, this distinction is all the more reason to celebrate the end of FHI, which helped drive the shift within EA toward longtermism, where their utilitarianism is a poor fit and, I would claim, a danger to humanity.



