4 Comments
PSJ

I was involved in EA around 2014, and a principle that was mentioned back then seems to have since disappeared from the EA lexicon: one sign of a good intervention is that it dominates another currently funded intervention. That is, for every dollar you move from the old intervention to the new one, the move probabilistically improves every goal that dollar was serving. Moving funds from bednets to x-risk cannot dominate, because in at least some worlds the x-risk dollar is literally burned; it does nothing.

I don't think this was just an epistemic principle but also a practical one: when you're trying to convince somebody to move their money from one place to another, and they are honest about their reasons for donating, demonstrating dominance is very *convincing*. Modern EA seems to have stopped caring much about that, instead pursuing like-minded whales or homegrown earners-to-give (no commentary on the wisdom of that now).
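
For concreteness, here is a minimal sketch of one reading of that dominance test (statewise dominance across possible worlds); the worlds, goal names, and numbers below are invented purely for illustration:

```python
# A minimal sketch of the statewise-dominance check described above.
# The worlds, goals, and payoffs are made up for illustration only.

def dominates(a, b, worlds):
    """Intervention `a` dominates `b` if, in every possible world, `a` does
    at least as well as `b` on every goal, and strictly better somewhere.
    Interventions map a world to a {goal: value} dict."""
    strictly_better_somewhere = False
    for w in worlds:
        for goal, b_val in b(w).items():
            a_val = a(w).get(goal, 0)
            if a_val < b_val:
                return False          # a does worse on some goal in some world
            if a_val > b_val:
                strictly_better_somewhere = True
    return strictly_better_somewhere

# Toy example: in the "no_catastrophe" world the x-risk dollar does nothing,
# so moving money from bednets to x-risk cannot dominate (in either direction).
worlds = ["catastrophe_averted", "no_catastrophe"]
bednets = lambda w: {"lives_saved": 1}
xrisk   = lambda w: {"lives_saved": 10 if w == "catastrophe_averted" else 0}

print(dominates(xrisk, bednets, worlds))   # False: loses in "no_catastrophe"
print(dominates(bednets, xrisk, worlds))   # False: loses in "catastrophe_averted"
```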

JC

I don't think any interventions do that!

There's always going to be some crazy hypothetical world where the better intervention triggers some bizarre catastrophe. Like there's some world where if your malaria nets aren't hand-sewn by Frenchmen, a giant squid comes out from the depths and eats everyone.

King Nimrod

This principle seems way too restrictive to be useful beyond a few narrow applications.

Suppose you can roll a D6 or a D20 and receive the outcome in dollars. The D20 puts less probability on each of the outcomes 1 through 6, but surely you'd rather roll the D20!

Even holding the possible outcomes fixed (1 through 6), suppose you can distribute the probability mass however you want for this game. Surely you'd place it all on 6, even at the cost of a lower probability of rolling 1 through 5!

Maybe when we were exploring low-hanging fruit in global health and development it was common to find the dominance you describe, but I struggle to imagine it being desirable at this point.
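
To make the arithmetic in the dice example concrete, here is a quick sketch (nothing here is assumed beyond the dice as stated in the comment):

```python
# The D20 puts less probability on each of 1-6, yet its expected payoff
# is much higher than the D6's.
from fractions import Fraction

d6  = {k: Fraction(1, 6)  for k in range(1, 7)}
d20 = {k: Fraction(1, 20) for k in range(1, 21)}

ev = lambda die: sum(k * p for k, p in die.items())
print(ev(d6))                                    # 7/2  -> $3.50 on average
print(ev(d20))                                   # 21/2 -> $10.50 on average
print(sum(p for k, p in d20.items() if k <= 6))  # 3/10, vs. 1 for the D6
```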

JC
Mar 12 · Edited

Wow, that paper in footnote 5 is amazing! Fanaticism (in the strong sense used in that paper, where you should accept Pascal's-Mugging-like scenarios) seems clearly wrong, but they point out some bizarre consequences of denying it!

What do you make of this? How do you feel about extremely small probabilities of extremely high-value events (like a 10^-100 chance of 3↑↑↑↑3 blissful lives)? Should we put all EA resources towards those types of scenarios?
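
For what it's worth, here is why naive expected-value maximization endorses such gambles; the payoff sizes below are stand-ins I picked for illustration, since a number like 3↑↑↑↑3 can't actually be represented:

```python
# Fanatical expected-value comparison: a tiny chance of an enormous payoff
# vs. a sure, merely large payoff. The specific magnitudes are stand-ins.
from fractions import Fraction

p_tiny     = Fraction(1, 10**100)   # the 10^-100 probability
huge_value = 10**200                # stand-in for an astronomically large value
sure_value = 10**6                  # a guaranteed, merely large payoff

ev_gamble = p_tiny * huge_value     # = 10^100
print(ev_gamble > sure_value)       # True: naive EV maximization takes the gamble
```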
