I recently saw a YouTube video about a service created to change an open source software's license.

  • One agent reads the code and gathers specs
  • Another agent, without access to the original code, creates equivalent software
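The two-step pipeline might be structured roughly like this. A minimal sketch, where the `spec_agent`/`impl_agent` split and the stubbed-out return values are hypothetical stand-ins for real model calls:

```python
# Sketch of the two-agent "clean room" pipeline described above.
# The hardcoded return values stand in for hypothetical LLM calls.

def spec_agent(source_code: str) -> str:
    """Agent 1: reads the original code and produces a behavioral spec.
    Only the spec, never the source itself, is passed onward."""
    # In a real system this would be a model call over source_code.
    return "spec: a function add(a, b) that returns the sum of a and b"

def impl_agent(spec: str) -> str:
    """Agent 2: writes new code from the spec alone, never seeing
    the original source (the 'clean room' separation)."""
    # In a real system this would be a model call over the spec.
    return "def add(a, b):\n    return a + b\n"

original = "def add(x, y):\n    return x + y\n"
spec = spec_agent(original)
reimplementation = impl_agent(spec)

namespace = {}
exec(reimplementation, namespace)
print(namespace["add"](2, 3))  # behavior matches the original
```

The whole legal argument rests on that separation: agent 2's output is supposed to derive only from the spec, not from the copyrighted expression in the original.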

In theory this should allow someone to take any open source software and change its license.

For a large portion of open source this likely isn't an issue, since nobody may care about the particular software, but for larger projects I wonder what sort of impact this may have. In particular, any open source software whose authors are making a living from donations or public support.

Has anyone read about, or thought of, a way to prevent one's code from having its license changed this way?

  • cole@lemdro.id · 8 hours ago

    I think it might be hard to argue that it is a clean-room implementation if the project is in the training data for the model, which it probably will have been.

    • fodor@lemmy.zip · 8 hours ago

      Yeah, this is a key point. It's pretty safe to say that any model generating a replacement for an open source project will itself have been trained on open source projects. If the people running the AI software make any mistake, they could be facing massive copyright violations.

      So I'm kind of interested in whether that type of risk is something it would be pragmatic for a company to take. There probably are some situations where it would be, but I'm not convinced that would happen too often.

      • cole@lemdro.id · 6 hours ago

        The irony here is that if you host your open source project somewhere it isn't being scraped by LLMs, your legal case might be weaker.

        What an interesting idea