• mindbleach@sh.itjust.works · 3 hours ago

    The doctors were better, until someone yanked the tool away. That’s how every tool works! Even going from a handsaw to a table saw and back will make you lose some skill with the handsaw, because your brain focused on higher-level goals and finer motions. That’s not proof a table saw is bad for woodworking. The problem is “and back.”

    since apparently AI can’t feed into AI without collapse

    Have you checked on that narrative? It’s been a while. Things stopped getting yellow. Improvements continued.

  • mindbleach@sh.itjust.works · 1 hour ago

        That’s a lot of “could” and “will” from an article a year old, primarily about concerns from two years ago, while image models today keep getting smaller and better. They didn’t find a second internet’s worth of JPEGs. Better training on the same data, or even better labels on less data, beats a simple obsession with scale.

        Yes, photocopying a photocopy will degrade, but diffusion is a denoising algorithm. Un-degrading an image is its central function. ‘Make it look less AI’ is how you get generative adversarial networks.

        Anyway, the grim truth is that the central concern is mistaken. Training data for cancer screening does not require that the patient survived.

        • ell1e@leminal.space · edited · 58 minutes ago

          The article links a study. What’s your study that collapse isn’t a concern?

          For what it’s worth, my worry was never specific to cancer; these doctors were just one measured example of what is likely a universal unlearning effect.