• Lvxferre [he/him]@mander.xyz · 3 hours ago

    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.

    • Atomic@sh.itjust.works · 2 hours ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      > Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 1 hour ago

        That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

        > Non-consensual porn victimises the person being depicted

        > This is still true if the porn in question is machine-generated

      • Lvxferre [he/him]@mander.xyz · 1 hour ago

        > That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

        > > Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        > There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 18 minutes ago

          Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

          > The real thing to talk about is the presence or absence of a victim.

          Which has never been an issue. It has never mattered for CSAM whether it’s fictional or not; it’s the depiction that is illegal.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 5 minutes ago

      I hate that the newest Unreal Tournament just kinda… disappeared. I mean, it’s still playable I think, just not online, and aside from a week or so after it launched, I ain’t ever heard anyone talking about it. It was okay… balance was not quite there, and it only had 2 maps when I last played it. But it had potential.

  • CerebralHawks@lemmy.dbzer0.com · 3 hours ago

    So he’s saying CSAM is free/protected speech? Got it.

    Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he’s done. And for what? Epic is in legal battles to get onto other companies’ platforms (Google’s Android and Apple’s iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I’m not saying he’s 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case: he’s effectively saying he could put CSAM on his own platform (i.e. Fortnite, a game aimed at children and teenagers) and be against you censoring his “free” and “protected” speech.

    I just see no good outcomes for what Sweeney is fighting for here.

    To address the political opponents angle, no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).

    • brachiosaurus@mander.xyz · 3 hours ago

      It’s called being so effective at marketing, and spending so much money on it, that people believe you don’t do anything wrong.

  • Fedizen@lemmy.world · 5 hours ago

    Absolutely insane take. The reason Grok can generate CP is that it was trained on it. Musk should be arrested just for owning that shit.

  • myfunnyaccountname@lemmy.zip · 5 hours ago

    What a weird thing to say for the man whose company is 35% owned by the Chinese government. His argument (the words said, not implied) isn’t directly about CSAM; it’s about political censorship. Wonder how his Chinese overlords are taking that statement.

  • Jankatarch@lemmy.world · 6 hours ago

    Did he take an oath against common sense? Is he bound by a curse to have bad takes for his entire life? Does he ragebait for a living? What the actual fuck is up with this man?

    • RaoulDook@lemmy.world · 1 hour ago

      It sounds like he’s simply a Trumptard, brainwashed with their Kool-Aid of doom. And again, I’m glad I never signed up for Epic.

  • bread@sh.itjust.works · 6 hours ago

    What a reprehensible, disingenuous representation of what he actually said. I’m not a fan of the guy, but PC Gamer is trash as well. Scary to see how people here are reacting just because it’s about X and AI.

    • popcar2@piefed.ca · 5 hours ago

      Yeah nobody in this thread went past the title, but that’s literally not what he said.

      He actually said that demanding X remove AI features is gatekeeping since competitors get to keep them, which is still a dumb take but very, very far from “Tim Sweeney loves child porn”…

    • D_C@sh.itjust.works · 6 hours ago

      I think it’s more a:
      “Guy who enjoys making child porn on xhitter gets angry when decent people want to ban it.”
      type situation.

      If I see someone arguing for something, then that’s because they want it. Or want to use it.