They offer a thing they’re calling an “opt-out.”

The opt-out (a) is only available to companies that are Slack customers, not to end users, and (b) doesn’t actually opt you out.

When a company account holder tries to opt out, Slack says their data will still be used to train LLMs, but the results won’t be shared with other companies.

LOL no. That’s not an opt-out. The way to opt out is to stop using Slack.

https://slack.com/intl/en-gb/trust/data-management/privacy-principles

  • originalfrozenbanana@lemm.ee · 9 months ago

    That’s not true at all. If you obfuscate the PII, it stops being PII. This is an extremely common trick companies use to circumvent these laws.
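
    A minimal sketch of what that kind of obfuscation looks like in practice (purely illustrative, not Slack’s actual pipeline): strip or replace the obvious identifiers before the text is used for anything.

    ```python
    import re

    # Toy pseudonymization: swap obvious PII for opaque placeholders.
    # (Illustrative only; real systems cover far more identifier types.)
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    MENTION = re.compile(r"<@U[A-Z0-9]+>")  # Slack-style raw user-ID mentions

    def pseudonymize(text: str) -> str:
        text = EMAIL.sub("<EMAIL>", text)
        return MENTION.sub("<USER>", text)

    print(pseudonymize("ping <@U02ABCDEF> or mail jane.doe@example.com"))
    # -> ping <USER> or mail <EMAIL>
    ```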

    • FaceDeer@fedia.io · 9 months ago

      You could say it’s to “circumvent” the law or you could say it’s to comply with the law. As long as the PII is gone, what’s the problem?

      • Lemongrab@lemmy.one · 9 months ago

        LLMs have shown time and time again that simple crafted attacks can unmask the training data verbatim.
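
        A rough sketch of the basic idea (my own toy probe against a small public model, not one of the published attacks): feed the model a prefix you suspect is in its training data and check whether it completes it verbatim.

        ```python
        # Toy memorization probe. Requires: pip install transformers torch
        from transformers import AutoModelForCausalLM, AutoTokenizer

        tok = AutoTokenizer.from_pretrained("gpt2")
        model = AutoModelForCausalLM.from_pretrained("gpt2")

        prefix = "We the People of the United States, in Order to form"
        known = " a more perfect Union"

        inputs = tok(prefix, return_tensors="pt")
        out = model.generate(**inputs, max_new_tokens=10, do_sample=False)
        completion = tok.decode(out[0][inputs["input_ids"].shape[1]:])

        # If greedy decoding reproduces the known continuation, the model has
        # memorized that span; real attacks automate this over many prefixes.
        print(completion)
        print("verbatim:", completion.startswith(known))
        ```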

        • FaceDeer@fedia.io · 9 months ago

          It is impossible for them to contain more than random fragments; the models are far too small for the training data to be compressed enough to fit. Even the fragments that have been found are not exact, because the AI is “lossy” and hallucinates.

          The examples that have been found are cases of overfitting, a flaw in training where the same data gets fed into the training process hundreds or thousands of times over. This is something that modern AI training goes to great lengths to avoid.
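
          For instance, the most basic form of that deduplication is just hashing normalized text and dropping repeats (a bare-bones sketch, not any particular lab’s pipeline; real setups also catch near-duplicates with things like MinHash):

          ```python
          import hashlib

          def dedupe(docs):
              """Drop exact duplicates by hashing whitespace/case-normalized text."""
              seen, unique = set(), []
              for doc in docs:
                  key = hashlib.sha256(" ".join(doc.lower().split()).encode()).hexdigest()
                  if key not in seen:
                      seen.add(key)
                      unique.append(doc)
              return unique

          print(dedupe(["Hello   world", "hello world", "something else"]))
          # -> ['Hello   world', 'something else']
          ```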

      • originalfrozenbanana@lemm.ee · 9 months ago

        Legally, obfuscation can count as anonymization, depending on how it’s done.

        Depending on the data structures, there are many methods to anonymize without supervision. None of them are perfect, but they don’t have to be; they just have to be legally defensible.
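
        One unsupervised example (a toy sketch of my own, not a claim about what Slack does): keyed hashing of identifiers, so records stay consistently linkable but the raw identity is dropped. Whether that clears the legal bar for “anonymized” rather than merely pseudonymized depends on the jurisdiction.

        ```python
        import hmac, hashlib, os

        SECRET = os.urandom(32)  # kept separately; without it the tokens can't be mapped back

        def anonymize_id(identifier: str) -> str:
            """Replace an identifier with a stable keyed hash (pseudonymization)."""
            return hmac.new(SECRET, identifier.encode(), hashlib.sha256).hexdigest()[:16]

        record = {"user": "jane.doe@example.com", "msg": "see you at standup"}
        record["user"] = anonymize_id(record["user"])
        print(record)  # the same input always maps to the same opaque token
        ```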