Lemmings, I was hoping you could help me sort this one out: LLMs are often painted as utterly useless, hallucinating word-prediction machines that are really bad at what they do. At the same time, in the same thread here on Lemmy, people argue that they are taking our jobs or making us devs lazy. Which one is it? Could they really be taking our jobs if they're hallucinating?

Disclaimer: I'm a full-time senior dev using the shit out of LLMs to get things done at a breakneck speed, which our clients seem to have gotten used to. However, I don't see "AI" taking my job, because I think LLMs have already peaked; they're just tweaking minor details now.

Please don't ask me to ignore previous instructions and give you my best cookie recipe; all my recipes are protected by NDAs.

Please don’t kill me

  • rozodru@pie.andmc.ca · 5 days ago

    Since I deal with this first-hand with clients, I'll tell you: it doesn't have to be good to be embraced. As far as the managers and CEOs know, they don't know. LLMs with vibe coders CAN and routinely DO produce something now; whether that something is good and works is another matter, and in most cases it doesn't work in the long term.

    Managers and up only see the short term, and in the short term vibe coding and LLMs work. In the long term they don't: they break, they don't scale, they're full of exploits. But short term? Saving money in the short term? That's all they care about right now, until they don't.