• WatDabney@sopuli.xyz · 123 points · 13 hours ago

    If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.

    • mindbleach@sh.itjust.works · 8 points · 13 hours ago

      Nothing made-up is CSAM. That is the entire point of the term “CSAM.”

      It’s like calling a horror movie murder.

      • ruuster13@lemmy.zip · 15 points · 10 hours ago

        The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”

        Were you too busy fapping to read the article?

      • ryper@lemmy.ca · 28 points · 12 hours ago

        It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

        • greenskye@lemmy.zip · 9 points · 10 hours ago

          I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

          I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated.

          I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

          • shani66@ani.social · 8 points · 9 hours ago

            Sure, I think it’s weird to really care about loli or furry or any other niche, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.

            • greenskye@lemmy.zip · 1 point · 1 hour ago

              Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

              But generating images of adults that don’t exist? Or even clearly drawn images that aren’t even realistic? I’ve seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky.

              Like let’s take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone’s fucked up fantasy. Yet lots of people want to make that into a thought crime.

              I’ve always thought that if there isn’t speech out there that makes you feel icky or gross then you don’t really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.

        • mindbleach@sh.itjust.works · 4 points · 12 hours ago

          You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.

          • deranger@sh.itjust.works · 12 points · 11 hours ago

            “The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.”

            You are completely wrong.

            https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

            “CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

            “Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

            • mindbleach@sh.itjust.works · 4 points · 11 hours ago

              RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

              We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

              • deranger@sh.itjust.works · 4 points · 11 hours ago

                Dude, you’re the only one who uses that strict definition. Go nuts with your prescriptivism, but I’m pretty sure it’s a lost cause.

          • VeganBtw@piefed.social · 6 points · 11 hours ago

            Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
            […]
            Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and *computer-generated images that appear to involve them*.
            (Emphasis mine)

            https://en.wikipedia.org/wiki/Child_pornography

            • mindbleach@sh.itjust.works · 3 points · 11 hours ago

              ‘These several things are illegal, including the real thing and several made-up things.’

              Please stop misusing the term that explicitly refers to the real thing.

              ‘No.’

          • rainwall@piefed.social · 10 points · 9 hours ago

            It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

            The child porn it’s generating is based on literal child porn, if not itself just actual child porn.

            • mindbleach@sh.itjust.works · 2 points · 9 hours ago

              You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

              Like combining unrelated concepts isn’t the whole fucking point?

              • CerebralHawks@lemmy.dbzer0.com · 2 points · 3 hours ago

                Yes, and they’ve been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI.

                Anna’s Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they’re using it to teach AI to copy music. (Note: Spotify streams at lower quality than other currently available sources, so AA will offer nothing new if/when they ever do release these torrents.)

                So, yes, that is exactly what they’re doing. They are training their models on all the data, not just all the legal data.

              • mcv@lemmy.zip · 9 points · 7 hours ago

                No, I think these billion-dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

              • stray@pawb.social · 4 points · 8 hours ago

                It literally can’t combine unrelated concepts, though. Not too long ago there was the issue where one (DALL-E?) couldn’t make a picture of a full glass of wine, because every glass of wine it had been trained on was half full; that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.