• BJW@lemmus.org
    23 hours ago

    I’d say 99.9% of people. You’re actually the first other person I’ve seen who doesn’t!

        • some_designer_dude@lemmy.world
          18 hours ago

          Then build better guardrails. These are the tools of the future. (And I intend both meanings of the word “tools”.) AI is very good at following rules. In their absence, they require someone far more experienced to drive them properly.

          • ClownStatue@piefed.social
            16 hours ago

            This is a really good point! It used to be “a computer is only as smart as its user.” The same can be said of AI: the model’s results are largely dictated by the prompt. While anyone can prompt an AI with whatever they want, it takes experience to use an AI to develop a project from idea to v1. At the end of the day, the AI can search the web better than me, and type faster than I can - but I know what I want my code to do, and I know how I want it done. Those two things don’t have to be mutually exclusive.