As Adam Becker tells it in his book, EAs started out reasonable (“give as much as you can to charity, and research which charities do the most good”) but have descended into absurdities like “it is more important to fund rockets than to help starving people or prevent malaria, because maybe an asteroid will hit the Earth and kill everyone, starving or not”.
I haven’t read Becker’s book and probably won’t spend the time to do so. But if that summary is accurate, it’s a bad sign for the book, because plenty of EAs were bonkers all along.
(Becker’s previous book, about the interpretation of quantum mechanics, irritated me. It recapitulated earlier pop-science books while introducing historical and technical errors of its own: it got the basic description of the EPR thought experiment wrong, and it butchered the biography of Grete Hermann while acting self-righteous about sexist men overlooking her accomplishments. See previous rant.)
That Carl Shulman post from 2007 is hilarious.
The “two articles below” are by Yudkowsky.
User “gaverick” replies,
Shulman’s response begins,
Ray mothersodding Kurzweil!