• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: July 2nd, 2024

  • So, you know Ross Scott, the Stop Killing Games guy?
    About two years ago he actually interviewed Yudkowsky. The context: Ross had discussed his article on one of his monthly streams and expressed skepticism that there was any threat from AI at all. Yudkowsky got wind of his skepticism and reached out to Ross to discuss the topic with him. He also requested that Ross not do any research on him.
    And here it is…
    https://www.youtube.com/watch?v=hxsAuxswOvM

    I can’t say I actually recommend watching it, because Yudkowsky spends the first 40 minutes of the discussion refusing to answer the question “So what is GPT-4, anyway?” (Not exactly that question, but it’s pretty close.)
    I don’t know what they discussed afterwards because I stopped watching it after that, but, well, it’s a thing that exists.

  • Also, if you’re worried about digital clones being tortured, you could just… not build it. Like, it can’t hurt you if it never exists.

    Imagine that conversation:
    “What did you do over the weekend?”
    “Built an omnicidal AI that scours the internet and creates digital copies of people based on their posting history and whatnot and tortures billions of them at once. Just the ones who didn’t help me build the omnicidal AI, though.”
    “WTF why.”
    “Because if I didn’t the omnicidal AI that only exists because I made it would create a billion digital copies of me and torture them for all eternity!”

    Like, I’d get it more if it were a “we accidentally made an omnicidal AI” situation, but this is supposed to be a very deliberate act: humanity intentionally builds an AI designed to torture digital beings based on real people, in the specific hope that it won’t torture the digital beings based on the builders themselves.