shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • I see this in a lot of news regarding AI, but these tools don’t generate pictures of anyone. They generate pictures that maybe look like someone, but they can’t undress you.

    If you make an AI generate “Billie Eilish boobs”, you’re not seeing a picture of her real boobs. You’re seeing a reproduction of her face on top of a reproduction of some boobs.

    These tools aren’t x-ray goggles, they’re the automated equivalent of the village creep cutting out celebrity faces to paste them onto pin-ups. We’ve had the same moral panic about Photoshop and I’m sure we’ll see the same thing happening again with whatever image manipulation technology comes next.

    We need to educate everyone, especially the elderly, that deep fakes exist. We can’t stop deep fakes, and even trying to is futile.

    In terms of blackmail, I think this actually provides an opportunity. Ex leaked some nudes? Deep fake. Hacker broke into your phone? Deep fakes. Anyone can generate nudes of anyone else with only a few pictures on their gaming PC, and modern models don’t actually fuck up the hands like people often claim they do. This even works on interactive videos. If we can get that message across, we can pretty much end the effectiveness of sexual blackmail.

    • raccoona_nongrata@beehaw.org · 1 year ago
      I don’t think just giving up and allowing porn deep fakes of people is really an acceptable answer here. The philosophical discussion of whether it’s “actually them” or not doesn’t really matter; it’s still intrusive, violating, and gross. It’s the same reason stealing someone’s identity is illegal: the identity the con man creates isn’t the real you, but damage can still be done with it.

      Maybe there’s nothing you can do about it on the dark web, but sites absolutely can manage deep fakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.

      The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.