I'm non-religious but many religions have a norm of charitable giving which I'm glad EA is trying to introduce for secular people.
I'm not a pure utilitarian and try to get a bit of feel-good consumption out of my charity.
I have an automatic monthly deduction to a cause I'm interested in, https://www.orangutan.org.au/, which buys up rainforest and protects orangutans in Borneo.
I go for this instead of "antimalarial bednets" (i.e. optimal human life-saving) because I have visited Borneo a few times, and protecting great apes matters to me personally.
I also kick in a few hundred bucks a year to sponsor friends doing various charity challenges.
I personally care about rainforests and orangutans as well. But if I had a monthly donation going towards orangutans I would want to know what impact it had. How much of my money actually goes to successful habitat preservation, rather than e.g. the running costs of the organisation? Does it cost $5 million to intervene to save one orangutan, or $5,000? What does the evidence look like?
If I imagined two rooms, one with a child about to die of malaria in it, and another with an orangutan in it, and you told me I could spend $7,000 to save the child or $50,000 to save the orangutan, I'd be very hard pressed to save the orangutan.
I think I would get the same feel-good consumption from more effective charitable giving, where my money went further and did more good in the world. The few thousand dollars needed to save a life from malaria would feel as good as, or better than, the tens of thousands of dollars needed to fail to save an orangutan.
And yet a consideration that, in my experience, rationalists ignore at their own expense is that human factors like familiarity weigh heavily in the thinking of the typical layperson. Many people might bristle at this line of argumentation and feel "told" or "admonished" by it.
I'm all for methods to encourage greater rationality. I'm all for a world where, someday, folks are doing this sort of calculus more often in their lives.
We're not there yet. The approach you are taking here falls short of being a persuasive, and thus effective, argument. IMO, the rationalist community itself needs to put more awareness and effort into getting in touch with what sort of, well, let's call them "less rational", methods work effectively for/with/"on" the general populace.
I know, dark arts and all that. I'm not suggesting we outright lie to people. I am suggesting that falling into the typical mind fallacy is a highly visible antipattern exhibited, in my experience, by rationalists. I'm suggesting that a world where a person helps save orangutans and rainforests and acts as a conscientious, charitable giver is better than the world where they don't. I'm suggesting we focus on that good, and then do work where we can to make the world even better.
Perhaps a look into the works of David Chapman [1] could help here. Perhaps a greater appreciation for the complexity and subjective reality of other humans' thought processes, and of how their rationalizations appear to them, would help you make better use of empathy and prosociality to become even more effective.
Buying or leasing rainforest, or lobbying government for protection, seems pretty cheap in the grand scheme of things: 11k wild orangutans for $3.3M last year is roughly $300 per orangutan per year.
There are about 70,000 humans per orangutan, so even though they are more genetically distant from us, rarity should count for something.
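Spelling out the arithmetic behind those figures (a rough sanity check; the spend and population numbers come from the comment above, while the ~8 billion world-population figure is my own assumption):

```python
# Back-of-envelope check of the cost and rarity figures (assumptions labeled).
charity_spend = 3_300_000      # dollars spent last year, from the comment
protected_orangutans = 11_000  # wild orangutans covered, from the comment

cost_per_orangutan = charity_spend / protected_orangutans
print(f"${cost_per_orangutan:.0f} per orangutan per year")

# The ~70,000 humans-per-orangutan ratio implies a total wild population
# on the order of 110k-120k (assuming a world population of ~8 billion).
world_population = 8_000_000_000
implied_wild_orangutans = world_population / 70_000
print(f"implied total wild population: ~{implied_wild_orangutans:,.0f}")
```

The cost figure works out to almost exactly $300 per orangutan per year, and the rarity ratio is consistent with published estimates of roughly 100k+ wild orangutans.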
I disagree with your choice of charity but I support your line of thinking and your choices. The fact that this charity has numbers puts it way ahead of the slush funds that do zero good (or negative good). I personally think that's how EA will go mainstream, by encouraging everyone to make more quantitative decisions that result in more funding to better charities.
This person is also straightforward with their reasoning: yeah, sure, it's not as effective as malaria nets, but it's a personal cause they care about. Most people don't think about how effective the causes they support are at all! It's mindblowing.
The trade-off is often not between charity X vs the most optimal charity, but any charity vs further consumption/savings etc.
Imagine EA ideas applied to fitness: sure, some people may stick to a perfect routine, but for others, just getting off the couch and doing an activity they like enough to keep doing is best.
I'm also really curious about HN's view of Effective Altruism, and how individuals here handle their charitable giving.