Altman’s remarks in his tweet drew an overwhelmingly negative reaction.

“You’re welcome,” one user responded. “Nice to know that our reward is our jobs being taken away.”

Others called him a “f***ing psychopath” and “scum.”

“Nothing says ‘you’re being replaced’ quite like a heartfelt thank you from the guy doing the replacing,” one user wrote.

  • Jakeroxs@sh.itjust.works · 6 hours ago

    Lmfao and computers are just for nerds

    Edit: OpenAI, Anthropic, etc. can all die, but LLMs won’t. You can run a local model.

    Now, I completely agree that the hype train is completely out of control and it’s a monetary bubble, but the tool itself is not going away.

    Edit2: I think the dotcom bubble is a good analogy: the underlying idea of the internet, online ordering, and all it could do was solid, just with an insane amount of hype on top that simply couldn’t be realized at the time. But now, the biggest companies ever are mainly internet/tech companies.

    • MartianRecon@lemmus.org · 6 hours ago

      Computers are input-output devices. You put things into a computer and it does what you tell it to do.

      LLMs do not do this; they just give you a facsimile of what they believe you want.

      LLMs will not go away, but their functionality is extremely limited, as has been proven by their failure to ‘change business forever.’

      And no, ‘but the tech isn’t there’ isn’t an argument right now. This is economics. The investment for its current capabilities is far outsized, and there will be a massive contraction.

      • Jakeroxs@sh.itjust.works · 5 hours ago

        We are so far beyond “a computer is just an input-output device,” realistically. There are thousands of layers built on top that produce what we know as a computer, and anywhere along that chain things can break or fail to perform as expected because another layer in the chain didn’t do what it was supposed to.

        Realistically, what’s the difference between a thing and the facsimile of a thing when the result is the same?

        • MartianRecon@lemmus.org · 5 minutes ago

          Semantics.

          A person creates something. LLMs just blurt out an approximation of what they think might be what you want.