• MeatPilot@sh.itjust.works · 19 hours ago

    ChatGPT, how do I erase this evidence that Trump had Epstein murdered? Here are all the unredacted files for review.

  • lando55@lemmy.zip · 1 day ago

    Yeah, he really should know better, but why were the necessary controls not in place to prevent the C-suite from doing stupid things? I know it’s not possible to eliminate all risk, but enterprise-level DLP should really have caught this.

    • village604@adultswim.fan · 1 day ago (edited)

      You’re assuming that it wasn’t caught. He could have easily been informed and did it anyway because opsec is in opposition to their goals.

      They want to make us vulnerable.

      • NOT_RICK@lemmy.world · 1 day ago

        Definitely possible and even likely for at least some of them, but I would bet money a good deal of it is just hubris. A ton of these people give off the vibe that they earnestly believe they can do no wrong and know better than the “so called experts” because they’re so great and brilliant and strong. Anyone that tries to pierce that bubble is just a “jealous loser”.

    • scytale@piefed.zip · 19 hours ago

      > triggering multiple automated security warnings that are meant to stop the theft or unintentional disclosure of government material from federal networks

      They were in place, or at least they detected it even if they didn’t prevent it. That’s how they knew it happened.

    • Wildmimic@anarchist.nexus · 22 hours ago

      This is the same guy who failed a polygraph, then smeared the people who had told him a polygraph was only required if he wanted access to a highly classified program that a limited number of people are allowed to see, accusing them of “giving him misleading information.” (His predecessor never sought access because it wasn’t necessary for the job.)

      He also tried to remove Costello, someone at CISA regarded as “one of the agency’s top remaining technical talent” after around 1,000 employees were cut. He was blocked from doing so after others learned about it, though Costello had already received a letter giving him the choice to move to DHS or resign. Sources say Costello pushes back on policy and contracting decisions, probably because he knows better.

      He is Noem’s pet IT guy, brought along from South Dakota, and I think he’s out of his depth for sure, and probably compromised.

      • Tanoh@lemmy.world · 12 hours ago

        In his defense, the polygraph is pseudo-scientific bullshit. You “fail” or “pass” depending on what the examiner wants the result to be. It is just made up.

          • Tanoh@lemmy.world · 5 hours ago

            I don’t know why, but my guess would be this: everyone involved knows it is bullshit (the people working there, management, etc.), but it gives a convenient loophole to fire anyone who starts to stir something up: “oh, he/she failed the polygraph.”

            The people working there know it, so they are more likely to stay in line so they can “pass” their annual test.

  • TheObviousSolution@lemmy.ca · 21 hours ago

    AI will reach true intelligence when it is able to tell its users they probably shouldn’t be giving it that information.

    • BigPotato@lemmy.world · 20 hours ago

      I can’t tell you how many times I emailed someone just last week asking them not to send me sensitive information over insecure channels.

  • minorkeys@lemmy.world · 2 hours ago (edited)

    The government is run by incompetent but power-hungry cretins who have been convinced by the techbros that LLMs can finally make them competent at being in charge, leaving them grossly overconfident in their capabilities.

    Edit: Cretins

    Also, the sales and marketing bullshit around LLMs has fooled those people into believing that LLMs are actually capable of letting them run things competently.

  • tal@lemmy.today · 17 hours ago (edited)

    If the United States government wants to use ChatGPT on sensitive information, I’m pretty sure it could come to some kind of contract with OpenAI to set up a private cloud instance dedicated to that.

    I get that maybe this guy just wanted some kind of one-off use, but then arrange to have something set up for that.

    EDIT: To clarify, set up for that sort of thing, not this specific use. Like, have a way to throw up secure, temporary setups for particular users who just need one-off stuff for sensitive material.

    • frongt@lemmy.zip · 22 hours ago

      They can, and they probably do. But he wanted to use the public online version specifically and got an exemption. And of course he fucked it up. This is why management should be the last people to be granted extra access.