> We dedicated a team of ~5 research staff to work full-time for 2 weeks for each top charity.
> The majority of this dedicated team did not have significant experience investigating these areas, giving them the advantage of a relatively unbiased starting point.
> An additional 5-7 researchers with more expertise in these areas of our research provided support by answering questions and assessing the red team’s arguments.
So there was no red teaming. It was instead an independent analysis of the assumptions, actions and processes of the charities in question.
"Red teaming" has a different meaning in the effective altruist and less wrong communities. Here it means playing devil's advocate for an opposing view.[1] I can see why this would be confusing on HN, but let's not forget that cyber security folks borrowed this term from the military and intelligence communities.
Yeah, it's literally just an audit. I don't know why it annoys me so much that they called it "red teaming", but it's actually really pissing me off. Maybe I woke up on the wrong side of the bed today.
I think they conceptualised it as red teaming their *own* analysis.
GiveWell, being EA-ish, is generally staffed by people steeped in ratadj and/or nerd culture, so it seems unsurprising that this struck them as a natural way of describing it.
I mean, I do see your point, and "getting grumpy about such things" is very definitely not an experience I'm personally unfamiliar with, but in context I can't quite bring myself to be annoyed about this specific instance :D
Thanks. Should've spelled that out in the first place.
Too used to kicking around with people for whom it's basically a standard term (not always one used *positively*, depending on which group of people, but I forgot it's not always clear).
Genuinely nice to know EA circles are involving/including people who aren't already sufficiently steeped in the origin/history/context/whatever of the movement to find jargon like 'ratadj' as obvious as I thought it was when I originally typed it.
(and it was jargon, and I've just edited this comment because my original phrasing almost certainly read as condescending in a way that was absolutely not what I was meaning to say; mea culpa and my apologies to anybody who read the first version)
This does quite seriously strike me as an excellent sign for its sustainability, and while, yes, EA has a whole buuunch of weirdness to/around it, I'm unconvinced you could've got a first generation of something like EA without being tolerant of the people who brought the weirdness, so:
Agreed. I thought this was going to be about infiltrating charities to see how corrupt they were or something.
But granted, your typical charity has no idea what the jargon is in the security (or even broader technical) field, so "red teaming" is a vague enough phrase that it could mean anything.
Yes, thank you. It's a pet peeve of mine when tech people start using tech jargon for things that already have widely understood names. Like "audit", you numbskulls.
Okay... a lot of the time when HN flags something, I'm like "fair enough, whatever."
But this? Really? Can't imagine what the heck might make it flag-worthy other than the people who didn't know the origin of "red team" and polluted the comments.
In classic Hacker News style, all the comments so far are nitpicking the definition of the word "red-teaming". Yes, they are "red-teaming" their own research.