• AA5B@lemmy.world
    1 day ago

    I’m not buying this. Sure, minimizing dependencies is good practice, but not updating? That’s a recipe for disaster.

    It’s important to note that you can’t predict supply chain attacks or vulnerabilities, and vulnerabilities are far more common. And while frequent updates might expose you to a supply chain attack more quickly, they also mitigate it more quickly. Frequent updates, combined with vulnerability scanning and limiting downloads to reputable sources (ones that try to prevent supply chain attacks and discover them quickly), is a much better approach.

    There’s also the maintainability argument, which I’m having right now with a couple of our legacy software teams. Not updating can lock you into the past, for entire ecosystems of dependencies. You can’t update when you have to, you can’t take advantage of new features anywhere in the ecosystem, and it becomes an expensive emergency when something stops being maintained or has an unresolved vulnerability. If you’re continually keeping up, those choices and features stay easy.

    So the goal becomes: how do you automate your updates as smoothly as possible, so they don’t become noise or create extra work? Tools like Dependabot and Renovate have a lot of config options to help with that.
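    For example, a minimal Dependabot config (`.github/dependabot.yml`) can batch minor/patch bumps into one weekly PR so updates don’t turn into noise. The ecosystem and group names here are just an illustration, not anyone’s actual setup:

    ```yaml
    # .github/dependabot.yml — hypothetical example config
    version: 2
    updates:
      - package-ecosystem: "npm"   # whatever ecosystem your project uses
        directory: "/"
        schedule:
          interval: "weekly"
        open-pull-requests-limit: 5
        groups:
          # one combined PR for routine bumps instead of dozens of small ones
          minor-and-patch:
            update-types:
              - "minor"
              - "patch"
    ```

    Renovate offers similar grouping and scheduling knobs, plus automerge rules for updates that pass CI.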

    • corsicanguppy@lemmy.ca
      22 hours ago

      not updating? That’s a recipe for disaster.

      Not blindly updating.

      It’s a different thing.