AI-generated code is shipping to production without security review. The tools that generate the code don’t audit it. The developers using the tools often lack the security knowledge to catch what the models miss. This is a growing blind spot in the software supply chain.

  • Not a newt@piefed.ca
    3 hours ago

    Forget SQL injection and XSS: LLMs are bringing back unsanitised inputs wholesale, including reintroducing previously removed vulnerabilities. You can casually browse GitHub for submissions by Claude bot and find …/… vulns all over.
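
    The unsanitised-input pattern the comment refers to can be sketched as below. This is a hypothetical illustration (names and table are made up), showing how interpolating user input straight into SQL breaks out of the string literal, and how a parameterized query avoids it:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.execute("INSERT INTO users VALUES ('alice', 0)")

    user_input = "alice' OR '1'='1"  # attacker-controlled value

    # Vulnerable: f-string interpolation lets the quote escape the literal,
    # so the WHERE clause becomes: name = 'alice' OR '1'='1'
    unsafe = f"SELECT name FROM users WHERE name = '{user_input}'"
    print(conn.execute(unsafe).fetchall())  # matches every row

    # Safe: the driver binds the input as data, never as SQL syntax.
    safe = "SELECT name FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # no match
    ```

    The fix is a one-line change, which is exactly what makes it easy for a code generator (or a reviewer skimming its output) to miss.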