- cross-posted to:
- technology@lemmy.world
Perverts wanna perv
Tim Sweeney is a jackass.
IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.
Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.
This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.
And it applies to children and adults. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.
Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.
That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.
Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
There ARE victims, lots of them.
That is a lot of text for someone that couldn’t even be bothered to read a comment properly.
Non-consensual porn victimises the person being depicted
This is still true if the porn in question is machine-generated
The real thing to talk about is the presence or absence of a victim.
That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.
Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
There ARE victims, lots of them.
You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)
Learn to read dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.
Is this clear now?
Yes, it certainly comes across as you arguing for the opposite, since above you reiterated
The real thing to talk about is the presence or absence of a victim.
Which has never been an issue. It has never mattered for CSAM whether it’s fictional or not; it’s the depiction that is illegal.
Tim Epic sucks, and has always sucked. Bring back Unreal Tournament, you coward.
I hate that the newest Unreal Tournament just kinda… disappeared. I mean, it’s still playable, I think, just not online, and aside from a week or so after it launched I ain’t ever heard anyone talking about it. It was okay… balance was not quite there, and it only had 2 maps when I last played it. But it had potential.
More like Tim NotEpic.
So he’s saying CSAM is free/protected speech? Got it.
Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he’s done. And for what? Epic is in legal battles to get onto other companies’ platforms (Google’s Android and Apple’s iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I’m not saying he’s 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case, because he’s saying he could put CSAM on his own platform (i.e. Fortnite, a game aimed at children and teenagers) and he’d be against you censoring his “free” and “protected” speech.
I just see no good outcomes for what Sweeney is fighting for here.
To address the political opponents angle, no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).
Literally this meme again

It’s called being so effective at marketing, and spending so much money on it, that people believe you don’t do anything.
Isn’t this the “won’t somebody please think of the children” party?
Oh, they’re thinking of 'em all right!
Absolutely insane take. The reason Grok can generate CP is that it was trained on it. Musk should be arrested just for owning that shit.
sweeney sounds like a pedo
I wonder if his name showed up in the Epstein Files
What a weird thing to say by the man whose company is 35% owned by the Chinese government. His argument, the words said and not implied, isn’t directly about CSAM, it’s about political censorship. Wonder how his Chinese overlords are taking that statement.
Did he take an oath against common sense? Is he bound by a curse to have bad takes for his entire life? Does he ragebait as a living? What the actual fuck is up with this man?
He wants to see loads of AI generated porn of himself…
It sounds like he’s simply a Trumptard, brainwashed with their koolaid of doom. And again I’m glad I never signed up for Epic.
steam
does nothing
wins
What a reprehensible, disingenuous representation of what he actually said. I’m not a fan of the guy, but PC Gamer is trash as well. Scary to see how people here are reacting just because it’s about X and AI.
Yeah nobody in this thread went past the title, but that’s literally not what he said.
He actually said that demanding X remove AI features is gatekeeping since competitors get to keep them, which is still a dumb take, but very, very far from “Tim Sweeney loves child porn”…
Guy atomically made of shit takes has another shit take, colour me surprised.
I think it’s more a:
“Guy who enjoys making child porn on xhitter gets angry when decent people want to ban it.”
type situation. If I see someone arguing for something, it’s because they want it. Or want to use it.