• wonderingwanderer@sopuli.xyz
    9 hours ago

    That’s fucking hilarious. How many instances of this have there been now? And companies keep doubling down on AI? Fucking idiots. I’m not even savvy enough to call myself an amateur, and I know better than to make such a series of obvious mistakes that predictably led to this outcome.

    One possible concern, amid the amusement, is whether Anthropic programmed Claude to punish companies it sees as potential competition. Or is this just a completely bonkers, off-the-rails LLM making terrible decisions because it’s just a probabilistic model and not actually capable of abstract cognition?

    Either way, these people are idiots for giving a program enough permissions to wipe their drives, they’re idiots for storing their backups on the same network as their main drives, and they’re idiots for trusting a commercial LLM API when it would be cheaper to self-host their own model.

    • rumba@lemmy.zip
      8 hours ago

      AI writes code

      User vets code

      User runs code

      If you’re not lock-step watching that shit, you need to just be doing it yourself.

      • Landless2029@lemmy.world
        8 hours ago

        The problem is the owning class wants to cut out the human element so badly that they keep letting tools run wild.

      • wonderingwanderer@sopuli.xyz
        3 hours ago

        The point of what? The push for AI in industry?

        You’d have to ask someone else. I can only make conjectures, but I’d say it has something to do with companies feeling the need to justify their AI investments to their shareholders, so they double down on the sunk cost. Or maybe those shareholders also own stock in big-name AI companies. It’s hard to say exactly…