• kadu@scribe.disroot.org · 7 hours ago

    AI doesn’t reason, so its output heavily depends on what’s been presented in the training set.

    Python is everywhere, and most importantly, whatever you can think of exists in Python: everything from critical bioinformatics tools to somebody learning programming for the first time and posting their prime number finder or sorting algorithm online.

    Rust? Not at that point yet, so the AI fails.

    • cheesybuddha@lemmy.world · 52 minutes ago

      I dunno man, I tried coding a simple HTTP listener with an LLM one time in Python (a language I’m unfamiliar with). Just something to sit on a port, listen for a request, and run a script.

      I ended up spending more time troubleshooting the maybe two dozen lines of code than I would have spent just looking up a tutorial online.
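      For what it’s worth, the kind of listener described above fits in the standard library with no dependencies. A minimal sketch, assuming a GET request should trigger the script (the actual script and port aren’t named in the comment, so the placeholder command and port 8000 here are hypothetical):

      ```python
      import subprocess
      from http.server import BaseHTTPRequestHandler, HTTPServer

      # Placeholder command; swap in the real script, e.g. ["python", "myscript.py"].
      SCRIPT = ["echo", "request received"]

      class Handler(BaseHTTPRequestHandler):
          def do_GET(self):
              # Run the script on each incoming GET and return its stdout.
              result = subprocess.run(SCRIPT, capture_output=True, text=True)
              self.send_response(200)
              self.send_header("Content-Type", "text/plain")
              self.end_headers()
              self.wfile.write(result.stdout.encode())

      if __name__ == "__main__":
          # Listen on all interfaces, port 8000, until interrupted.
          HTTPServer(("", 8000), Handler).serve_forever()
      ```

      Roughly the two dozen lines mentioned; `http.server` isn’t meant for production, but for “sit on a port and run a script” it avoids troubleshooting third-party code entirely.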

    • Spice Hoarder@lemmy.zip · 1 hour ago

      Yeah, everything I’ve seen looks like a classic case of overfitting. I only tried it because a coworker recommended it. It failed at problem solving and at choosing comparable dependencies. Completely jarring because, like you said, it could likely do it in JS and Python, but clearly not in Rust. I often wonder if 85%+ of the code you get from AI is stolen verbatim.