• jubilationtcornpone@sh.itjust.works · 23 points · 3 days ago

    “AI” coding tools can offer some value. The problem is that they often generate tech debt with payday-loan-level interest rates. What makes the rate so high is that now, on top of the code base’s existing tech debt, you also have a pile of code that no one understands, and the barrier to entry for software engineering has risen so far that fewer young people are actually learning how to be good programmers. Lots of organizations don’t give a shit about their rapidly growing mountain of tech debt today, but they sure will at the end of the week when the payment comes due.

    What their “leadership” fails to understand is that any idiot can churn out code. I’ve seen lots of terrible programmers generate millions of lines of really shitty code that somehow, by the power of the dark lord himself, manages to compile. That’s not what a software engineer actually does, though. Software engineers design operational systems with software; writing code is a secondary function of that. There are currently no AI agents that can successfully design a software system of any real complexity, because LLMs don’t actually understand anything.
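    To make the "compiles but isn't engineered" point concrete, here is an invented example (not from anyone's real code base): both functions below run and return the same numbers, but only one communicates the business rule it implements.

    ```python
    # Both functions compute an order total with a bulk discount.
    # The first compiles and "works", but buries intent in magic
    # numbers and meaningless names -- classic instant tech debt.

    def f(x, n):
        y = x * n
        if n > 9:
            y = y - y * 0.15
        return y

    # The second states the rule it implements, so the next reader
    # (or reviewer) can check it against the actual requirement.
    BULK_DISCOUNT_RATE = 0.15
    BULK_THRESHOLD = 10

    def total_price(unit_price: float, quantity: int) -> float:
        """Order total, with a bulk discount at or above the threshold."""
        subtotal = unit_price * quantity
        if quantity >= BULK_THRESHOLD:
            subtotal *= (1 - BULK_DISCOUNT_RATE)
        return subtotal
    ```

    A compiler accepts both equally happily; only a human maintaining the system a year later notices the difference.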

    • 1rre@discuss.tchncs.de · 6 points · 2 days ago

      I think it’s more of a “how you use it” thing, but you’re definitely right that AI agents can’t design systems properly.

      Some people I know have produced way more code, removed tech debt, and done it all without introducing any bugs since they started using AI. That’s because they don’t use it for anything beyond their skill set, they understand everything it’s doing, and they use it to catch mistakes they would otherwise have made. Other people use it without reviewing the output, or use it to try to do things beyond their skill set, and that’s how you end up with infinite tech debt and a whole host of bugs.

      Personally, I’ve recently started heavily using the AI code review bot we have at our company, both on my own code and on other people’s. Roughly half of what it says is wrong or hallucinated, but that’s not a problem, because I can tell when it is and just say no and redirect it. In exchange, it catches bugs and issues that most reviewers would glance right past, and it gives you a rubber duck that talks back.
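      A concrete example of the kind of bug a human reviewer's eye slides past but a review bot reliably flags (my own illustration, not from the bot in question): Python's shared mutable default argument.

      ```python
      # Easy to glance past in review: the default list is created ONCE,
      # at function definition time, and shared across every call.

      def append_log(entry, log=[]):      # bug: mutable default argument
          log.append(entry)
          return log

      first = append_log("a")             # ["a"]
      second = append_log("b")            # expected ["b"], actually ["a", "b"]

      # The fix: use None as a sentinel and build a fresh list per call.
      def append_log_fixed(entry, log=None):
          if log is None:
              log = []
          log.append(entry)
          return log
      ```

      It looks correct line by line, which is exactly why it survives casual review.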

  • thedeadwalking4242@lemmy.world · 7 points · 2 days ago

    Problem is the code is crap. Code without intent is crap. Code IS intent. LLMs are just statistical models with no understanding.

    They are fancy copy-paste machines. They have some minor “intelligence,” but it’s pretty much just the ability to associate like information. Their responses are nothing but compositions of their training data and local context.

    Granted, it has some uses. But writing your code, beyond autocomplete or extremely simple and straightforward tasks, isn’t one of them. LLM-generated code is not production-ready code.

  • cAUzapNEAGLb@lemmy.world · 49 points · 3 days ago

    I was having this exact conversation with one of the architects at my company. Top-down orders were to start integrating LLMs into our engineering workflows. So be it; they sign my paychecks, and selfishly I want those to keep flowing. But beyond the quality and slop concerns I had already raised (and which were disregarded), I was realizing that I was losing my intuition about the code I was releasing under my name. It takes me longer to answer questions, and I can’t just wing my answers based on intuition built during development, because I didn’t build any intuition during development when development was outsourced to the LLM. That’s a liability I’m trying to highlight to management for their risk analysis, and to myself for risk reduction.

    Intuition isn’t built from documentation (nor from the slopdocs LLMs generate, which no sane person will ever actually read). It’s built through the effort of comprehension, and there’ll be no shortcut to that.

    • boogiebored@lemmy.world · 7 points · 3 days ago

      I agree, and yet if this is what the paycheck signers WANT… and if we work for paychecks… then aren’t we giving them exactly what they asked for?

      • badgermurphy@lemmy.world · 15 points · 3 days ago

        Yes, but we also have to occupy and exist in our workplace. If it becomes a metaphorical dumpster fire, it is more stressful, less rewarding, and provides less opportunity for growth and development.

        If I get the same pay for worse working conditions, the deal was unilaterally made worse and it is fair game for complaint from the shafted party, the worker.

  • pinball_wizard@lemmy.zip · 48 points · 3 days ago

    But we’ve seen this pattern before. When code production gets cheap, the cost doesn’t disappear. It migrates. It moves from creation to comprehension.

  • violentfart@lemmy.world · 10 points · 3 days ago

    The profits also aggregate at the top. The problem is that it’s short-term, and the later cost isn’t funded by the top, because it’s a one-way flow.

    Record profits, new product release: they’re often followed immediately by layoffs so the next quarter’s metrics look extra good and they can cash in.

    The few poor souls remaining hold the fort until hiring starts again and the cash out cycle begins once more.