shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • I see this in a lot of news regarding AI, but these tools don’t generate pictures of anyone. They generate pictures that maybe look like someone, but they can’t undress you.

    If you make an AI generate “Billie Eilish boobs”, you’re not seeing a picture of her real boobs. You’re seeing a reproduction of her face on top of a reproduction of some boobs.

    These tools aren’t x-ray goggles, they’re the automated equivalent of the village creep cutting out celebrity faces to paste them onto pin-ups. We’ve had the same moral panic about Photoshop and I’m sure we’ll see the same thing happening again with whatever image manipulation technology comes next.

    We need to educate everyone, especially the elderly, that deep fakes exist. We can’t stop deep fakes; even trying is futile.

    In terms of blackmailing, I think this actually provides an opportunity. Ex leaked some nudes? Deep fake. Hacker broke into your phone? Deep fakes. Anyone can generate nudes of anyone else with only a few pictures on their gaming PC and modern models don’t actually fuck up the hands like people often claim they do. This even works on interactive videos. If we can get that message across, we can pretty much end the effectiveness of sexual blackmail.

    • raccoona_nongrata@beehaw.org

      I don’t think just giving up and allowing porn deep fakes of people is really an acceptable answer here. The philosophical discussion of whether it’s “actually them” or not doesn’t really matter; it’s still intrusive, violating and gross. It’s the same reason stealing someone’s identity is illegal: it doesn’t matter that the identity created by the con man isn’t the real you, damage can still be done.

      Maybe there’s nothing you can do about it on the dark web, but sites absolutely can manage deepfakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.

      The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

  • Melmi@lemmy.blahaj.zone

    I worry that the cat is out of the bag on this. The tech for this stuff is out there, and you can run it on your home computer, so barring some sort of massive governmental overreach I don’t see a way to stop it.

    They can’t even stop piracy and there’s the full weight of the US copyright industry behind it. How are they going to stop this tech?

  • SteleTrovilo@beehaw.org

    The tech isn’t there yet. There are so often distracting flaws around the hands/feet. The AI doesn’t really know what a human is; it’s just endlessly re-combining existing material.

    • rhabarba@feddit.de (OP)

      As much as I loathe having to reveal this to you, the shapeliness of the hands should be semi-negligible to most people who would love to have an image created from the statement “I want to see Billie Eilish’s boobs”.