  • Yudkowsky was trying to teach people how to think better – by guarding against their cognitive biases, being rigorous in their assumptions and being willing to change their thinking.

    No, he wasn’t.

    In 2010 he started publishing Harry Potter and the Methods of Rationality, a 662,000-word fan fiction that turned the original books on their head. In it, instead of a childhood as a miserable orphan, Harry was raised by an Oxford professor of biochemistry and knows science as well as magic.

    No, Hariezer Yudotter does not know science. He regurgitates the partial understanding and the outright misconceptions of his creator, who has read books but never had to pass an exam.

    Her personal philosophy also draws heavily on a branch of thought called “decision theory”, which forms the intellectual spine of Miri’s research on AI risk.

    This presumes that MIRI’s “research on AI risk” actually exists, i.e., that their pitiful output can be called “research” in a meaningful sense.

    “Ziz didn’t do the things she did because of decision theory,” a prominent rationalist told me. She used it “as a prop and a pretext, to justify a bunch of extreme conclusions she was reaching for regardless”.

    “Excuse me, Pot? Kettle is on line two.”

  • Ed Zitron:

    Sam Altman is talking about bringing online “tens of thousands” and then “hundreds of thousands” of GPUs. 10,000 GPUs cost them $113 million a year, 100,000 cost $1.13bn, so this is Sam Altman committing to billions of dollars of compute for an expensive model that lacks any real new use cases. Suicide.

    Also, $1.30 per hour per GPU is the Microsoft discount rate for OpenAI. Safe to assume there are other costs, but raw compute for GPT-4.5 is massive, and committing such resources at this time is truly fatalistic and suggests Altman has no other cards to play.
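    If you want to sanity-check Zitron’s arithmetic, here’s a quick back-of-the-envelope sketch in Python. The GPU counts and the $1.30/hour rate are his; the 24/7 utilization is my assumption, since he doesn’t state one:

    ```python
    # Back-of-the-envelope check of Zitron's figures.
    # Uses the $1.30/hour/GPU Microsoft discount rate he cites and
    # assumes round-the-clock utilization (8,760 hours/year).
    RATE_PER_GPU_HOUR = 1.30
    HOURS_PER_YEAR = 24 * 365  # 8,760

    def annual_compute_cost(num_gpus: int) -> float:
        """Yearly raw-compute cost in dollars for a given GPU count."""
        return num_gpus * RATE_PER_GPU_HOUR * HOURS_PER_YEAR

    for n in (10_000, 100_000):
        print(f"{n:>7,} GPUs: ${annual_compute_cost(n) / 1e9:.2f}bn/year")
    #  10,000 GPUs: $0.11bn/year (about $113.9M; Zitron rounds to $113M)
    # 100,000 GPUs: $1.14bn/year (the ~$1.13bn figure)
    ```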

  • The MIRI, CFAR, EA triumvirate promised not just that you could be the hero of your own story but that your heroism could be deployed in the service of saving humanity itself from certain destruction. Is it so surprising that this promise attracted people who were not prepared to be bit players in group housing dramas and abstract technical papers?

    Good point.

    Logic. Rationality. Intelligence. Somewhere in all these attempts to harness them for our shared humanity, they’d been warped and twisted to destroy it.

    Oh, the warping and twisting started long before Ziz. (The Sequences are cult shit.)