• flubba86@lemmy.world · 1 year ago

    Kinda weird that it details how badly this affected the girls’ mothers. The girls don’t get a say, but won’t someone please think of the mothers?!

    • NotAPenguin@kbin.social · 1 year ago

      It is pretty weird, like “The reaction was one of massive support for all the affected mothers”…?

    • ParsnipWitch@feddit.de · 1 year ago

      How do the girls not get a say? They asked their mothers for help, who then organised to find others who were affected.

    • EnderMB@lemmy.world · 1 year ago

      I imagine it leans into the idea of some people being “too young” to form a grown-up opinion.

      Really fucking weird, given the context is around their likeness being used for the purpose of porn.

  • MonkderZweite@feddit.ch · 1 year ago

    Violates the right to your own image. You are not allowed to upload images of a classmate to an AI cloud service without asking, nor to pass the generated images around.

    • MartianSands@sh.itjust.works · 1 year ago

      Is that an actual legal right? If you’ve described it accurately, then Facebook and Instagram would be completely illegal

      • rentar42@kbin.social · 1 year ago

        It depends on your location, different countries have very different laws.

        For example, in most countries it’s perfectly acceptable to have someone in a picture that you’re taking in public (for example you’re taking a picture of a building and someone happens to walk by). A notable exception to this is France, where apparently the right to one’s own image is quite strong, which technically makes most pictures of the Eiffel Tower illegal (as long as any one person is identifiable in it).

        Rules on taking (and distributing) a picture of one specific person who’s just doing random stuff in public are less uniform and vary by country. There’s often some protection that basically says “no, you can’t make fun of some random person for having the wrong t-shirt, they have a right to privacy”. A notable exception to that is usually “public figures” (which mostly means people in political, religious or commercial leadership positions): they mostly just have to accept being pictured wherever.

        Protection for pictures taken in a private setting is usually the strongest (so yes, if you post a picture of your 3 best friends at a small party in your home, you might have to ask them for permission!)

        How all of this applies to pictures that “aren’t real” but look disturbingly so is probably going to be fought over in court for a good while.

      • MonkderZweite@feddit.ch · 1 year ago

        No, a human right. And yeah, they mostly are. But it’s not Facebook doing the offending here but each of the teens, so nobody can really enforce it. Same as with phone numbers, except that those are actually protected by law in most countries.

    • CrayonRosary@lemmy.world · 1 year ago

      AI generates novel images, though. They are merely trained to produce your likeness. None of the pixels are from any source images.

      In this case, I’m mistaken. They used a clothing remover app on normal photos and did not train an AI.

      • stevedidwhat_infosec@infosec.pub · 1 year ago

        Jesus Christ, that’s not even close to AI; they literally stitched shit together like Photoshop

        By the way, where do you see that clothoff (the app mentioned in the article) doesn’t use trained AI models? I’m refraining from visiting their site to check myself, as I don’t really want to give them that traffic, and I figured I’d ask you directly instead since you already went there to verify, I assume.

        • CrayonRosary@lemmy.world · 1 year ago

          I didn’t say the app doesn’t use trained models. I said the students didn’t themselves train an embedding or LoRA against the other students’ faces in order to generate entirely new pics.

  • JackGreenEarth@lemm.ee · 1 year ago

    You can’t stop them being made; they’re just the same deepfakes people have been making before. It’s important to note that they’re not photos of people, they’re guesses made by an algorithm.

    • strider@feddit.nl · 1 year ago

      While you’re completely right, that’s hardly a consolation for those affected. The damage is done, even if it’s not actually real, because it will be convincing enough for at least some.

        • ParsnipWitch@feddit.de · 1 year ago

          I think the people who made the pictures have to face consequences. Otherwise this sends the message that behaving that way is fair game.

      • Cethin@lemmy.zip · 1 year ago

        This stuff can be run locally. It’s not something that can be stopped by just going after some service providing it. That may make it slightly less convenient to access, but if anyone wants it, it’ll be available. Pandora’s box has been opened and it can’t be closed.

    • maegul (he/they)@lemmy.ml · 1 year ago

      To push back on your attempt to minimise what’s going on here …

      Yes, they’re not actually photos of the girls. But, nor is a photo of a naked person actually the same as that person standing in front of you naked.

      If being seen naked is unwanted and embarrassing, why wouldn’t a photo of you naked be embarrassing too? And, to make my point, what difference does it make how realistic the photo is? An actual photo can be processed, taken under certain lighting, shot with a certain lens, or have been taken some time in the past … all factors that lessen how close it is to the subject’s current naked appearance. How unrealistic can a photo be before it’s no longer embarrassing?

      Psychologically, I’d say it’s pretty obvious that the embarrassment of a naked image is that someone else now has a relatively concrete image in their minds of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

      There’s probably some range of realism within which there’s an embarrassing effect, and I’d bet AI is very capable of getting in that range pretty easily these days.

      While the technology is out there now … that doesn’t mean our behaviours with it are automatically acceptable. Society adapts to the uses and abuses a new technology brings, and it seems pretty obvious that we’ve yet to culturally curb the abuses of this one.

    • Rayspekt@kbin.social · 1 year ago

      Exactly, the technology is out there and will not cease to exist. Maybe we’ll digitally sign our photos in the future so that deepfakes can be sorted out that way.
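      The signing idea in that comment can be sketched in a few lines. This is a toy illustration only, with a hypothetical shared secret standing in for the keys; real provenance efforts (such as C2PA Content Credentials) use public-key signatures embedded in the image’s metadata:

```python
# Toy sketch of "digitally signing photos" so edits and fakes fail verification.
# SECRET_KEY and the function names are hypothetical; real schemes use
# asymmetric keys and store the signature inside the file's metadata.
import hashlib
import hmac

SECRET_KEY = b"camera-private-key"  # hypothetical shared secret

def sign_photo(image_bytes: bytes) -> str:
    """Return a hex signature over the image's SHA-256 digest."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(SECRET_KEY, digest, hashlib.sha256).hexdigest()

def verify_photo(image_bytes: bytes, signature: str) -> bool:
    """True only if the bytes are exactly what was originally signed."""
    return hmac.compare_digest(sign_photo(image_bytes), signature)

original = b"\x89PNG...raw image bytes..."
tag = sign_photo(original)
assert verify_photo(original, tag)             # untouched photo verifies
assert not verify_photo(original + b"x", tag)  # any alteration fails
```

      A deepfake simply wouldn’t carry a valid signature from the original camera or owner, which is how such images could be “sorted out”.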

    • rentar42@kbin.social · 1 year ago

      I think that happened at least 10k years ago … it’s just that the spiral is getting faster and faster …