• uniquethrowagay@feddit.org
    6 hours ago

    But you don’t really know. You can also explicitly tell it which coding standards to follow and it still won’t.

    That’s the problem with LLMs in general, isn’t it? It may give you the perfect answer. It may also give you the perfect sounding answer while being terribly incorrect. Often, the only way to notice is if you knew the answer in the first place.

    They can maybe be used to get a first draft of an email you don’t know how to start. Or to write a “funny” poem for the retirement party of Christine from Accounting that makes everyone cringe to death on the spot. Yet people treat them like a hyper-competent, all-knowing assistant. It’s maddening.

    • mcv@lemmy.zip
      5 hours ago

      Exactly. They’re trained to produce plausible answers, not correct ones. Sometimes they also happen to be correct, which is great, but you can never trust them.