• PeleSpirit@lemmy.world · 1 year ago

    It’ll be harder for AI to replicate many of the “soft skills” that define a good CEO, like “critical thinking, vision, creativity, teamwork, collaboration, inspiring people, being non-empathetic with a callous, narcissistic personality, being able to listen and see,” says Agarwal.

    Okay, the italics are mine, but it’s implied, right?

    • remotelove@lemmy.ca · 1 year ago

      Except for the implied bit you added, aren’t those words just copypasta for every single job opening ever? Everyone is a CEO, I suppose.

  • NatakuNox@lemmy.world · 1 year ago

    I worked IT for a multi-million-dollar startup. The board was made up of a couple of other CEOs from larger companies, and let me tell you: they are all idiots. They read books by other “successful” people and just parrot whatever they saw on FOX Business the night before… “FOX Business said we should fire all our full-time, well-trained, educated staff and replace them with part-timers so we can save on overhead.”

    I shit you not! Before I quit, they fired 40 of our customer service representatives, hired 120 part-time reps, scheduled them only 10 to 15 hours a week, and turned a huge profit the next quarter. Customer satisfaction was in the tank, but simply comping our service for disgruntled customers was still cheaper than the 40 full-time employees who kept customers happy. Capitalism, baby!

    • PoliticalAgitator@lemm.ee · 1 year ago

      Yep.

      Most of them are already in a position where they could retire tomorrow and maintain a level of comfort and luxury beyond what most people experience for even a day, for the rest of their lives and the rest of their children’s lives.

      If I was in that situation, I wouldn’t give a shit if I lost my job either.

      But it doesn’t end there, of course. They don’t just have millions of dollars sitting in a bank account; they’ve also got millions of dollars tied up in the success of companies.

      So if an AI takes their job and can fuck every possible person out of every possible penny with a psychopathic efficiency beyond anything we’ve ever imagined, they still win.

      And boy, the horrors could be endless. If an AI CEO runs multiple companies, who do you charge with price fixing and insider trading?

      What do we do when an AI spying on our phones and social media knows exactly how much money we have and can adjust prices in real time?

      What about when it’s able to personally astro-turf your social media and content aggregation sites to convince you to buy more and complain less?

      What about when it can arrange kickbacks for politicians that are perfectly calibrated to be the lowest bid needed to crush out any resistance or accountability?

      Your own personal, digital Elon Musk, following you around every second of the day, basing all his statements and decisions on every action you take, pushing his financial and political agenda, from the safety of an impenetrable mask.

      It’s heartbreaking to watch the death of human-created art, but it’s a papercut compared to the torture to come when AI replaces greedy psychopaths instead.

  • LillyPip@lemmy.ca · 1 year ago

    Translation:

    When AI makes your job redundant, you’ll be unemployed. Fuck you.

    But when AI makes my job redundant, I’ll keep my position and just take more naps and vacations whilst my computer works for me. I may also get a pay rise for my efficiency. Also fuck you.

    I honestly only skimmed the article because I kept getting more and more angry. Is that a fair summary, or am I selling these cunts short?

  • Poggervania@kbin.social · 1 year ago

    I’ll bet $5 we’ll start seeing more CEOs become CVOs, or Chief Visionary Officers, and the current pay for CEOs will switch over to CVOs because AI can’t be a visionary or some bullshit.

  • huginn@feddit.it · 1 year ago

    Not to say that CEOs are valuable, but they also believe LLMs can replace writers.

    There’s zero evidence they can (their work is maybe good enough for Big Bang Theory?), and yet CEOs are convinced otherwise.

    CEOs think that LLMs can replace programmers. Any programmer who has used Copilot knows that the fucking thing hallucinates all the time and is barely good enough to be documentation autocomplete.

    CEOs are gullible as fuck, and 50% of them thinking they can be replaced is just more evidence of their gullibility.

    CEOs aren’t that important but I guaranfuckingtee that an LLM with full autonomy would run a company straight into the ground.

    • winky88@startrek.website · 1 year ago

      And yet CEOs are convinced otherwise.

      Many CEOs are little more than MBAs in fancy suits. And let me tell you, an MBA is often more an indicator of incompetence than of competence.

      The truly sociopathic ones fake their way to the top. It’s all smoke and mirrors. Go ask Jim Farley.

    • Semi-Hemi-Demigod@kbin.social · 1 year ago

      “We should build a time machine”

      “Um, Mr. CEOBot, we build washing machines.”

      “Yes but if we devote all resources to building a time machine we can go back in time and get the original patents and optimize returns.”

  • EnderMB@lemmy.world · 1 year ago

    CEOs, above everything else, manage risk. They hedge their bets against the market and against their own workforce.

    While “AI” might be better at calculating risk from known factors, the joys of hallucination and an often poor grasp of what’s actually going on mean that an LLM would run a company into the ground hilariously fast.

    It’s why I wet myself laughing whenever someone suggests replacing HR or recruiters with AI. Their job is to protect the company, and all it takes is one poor AI decision for there to suddenly be a multi-million-dollar lawsuit, or for a bad actor to get hired and fuck the company up from the inside.

    AI tools are powerful, but that’s all they are, and all they will be for a long time. If anything, we’re likely to see regressions in ChatGPT’s performance as OpenAI fights legal battles over its use of protected data and over hallucinations, and slightly improved performance from Google/Amazon/Apple on their own initiatives.