• leadore@lemmy.world · +86 / −1 · 9 days ago

    She has since jumped ship to become a director at another trendy company, PsyMed Ventures, which Newsweek described as a VC fund investing in mental and brain health. Many of the companies PsyMed invests in feature AI tools — which Ner says she still uses, albeit with a newfound sense of respect.

    She hasn’t learned a thing. She should have learned she’s susceptible and needs to stay as far away from the shit as possible.

    • halcyoncmdr@lemmy.world · +21 / −1 · 9 days ago

      How many drug addicts don’t distance themselves from drugs? Why would this be any different?

      • hendrik@palaver.p3x.de · +3 / −1 · 9 days ago (edited)

        I mean, I know a fair share of people who did manage to quit smoking, or cut down on weed or alcohol, especially after things went sideways for them… So… both are possible, and people demonstrate it every day. I mean, you’re certainly right: it ain’t easy to do some self-reflection and stay away from addictive things. It requires motivation, mental strength and a good amount of effort. Just saying… because it’s good not to portray it as if resignation and drugs were people’s only options…

  • wavebeam@lemmy.world · +9 / −18 · 9 days ago

    I have significant problems with AI, particularly its reckless deployment on tasks where it simply can’t provide real value (most of them), but I struggle to see these kinds of articles as anything other than the journalism version of the same lazy application.

    Blaming AI for a mental health issue is like blaming alcohol for making someone an asshole: they were an asshole before they got drunk; it just became more obvious while drunk. Same thing here. AI is not causing psychosis; it’s just revealing it in a place we’re not used to seeing this kind of thing come from: a computer.

    This article likely wasn’t written with AI, but making AI the subject and tying this person’s crisis to its use seems lazy at best, negligent at worst.

      • wavebeam@lemmy.world · +3 / −1 · 9 days ago

        I think my concern is with it being called “AI Psychosis”

        It doesn’t seem like an effective call to action for getting the individual treatment if we simply blame the AI for not being “safe enough” or something.

        • queermunist she/her@lemmy.ml · +7 · 9 days ago

          If we bring this back to alcohol, the alcohol absolutely is to blame for worsening symptoms. There are even the terms “alcohol-induced psychosis” and “alcohol-related psychosis” to describe the effect. Without the alcohol they are fine; with it, they enter psychosis.

          If someone is symptom-free without AI and experiences symptoms with AI, then calling it “AI psychosis” would be reasonable.

          • wavebeam@lemmy.world · +2 · 9 days ago

            I guess so. I’m just so wary of everything revolving around AI, even blaming it. If the idea of AI psychosis gets big but is then undermined by shoddy reporting or research, it could cause people to dismiss it like the McDonald’s hot coffee thing.

            I want AI companies to be held accountable, but only on appropriate grounds, so they can’t use shaky arguments against it to get out of that responsibility.

    • _g_be@lemmy.world · +12 · 9 days ago

      Yeah, AI isn’t causing psychosis; it’s amplifying and enabling people who might already have problems.

      What is your argument here? That since AI is not the direct cause, these articles are pointless? I think we’d want to know if, to use your example, a commonly available thing like alcohol were giving people psychosis.

    • Harvey656@lemmy.world · +2 · 9 days ago

      I don’t think anyone here is blaming AI for this woman’s mental issues; those clearly existed before generative AI. AI just happened to enable her issues, which the article points out.

    • 87Six@lemmy.zip · +1 / −1 · 9 days ago

      You’re right, but posting this in the Fuck AI community is like hating Nazis in the Tesla community.