I know this topic, as well as most online topics, is more about emotions than facts. But here we are. You probably don’t understand how Discord’s new policy is intended to help protect children because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety trainings over the years for professional reasons. Here’s how online child predators work.
They start by finding a kid with a secret. Just like a lion will generally choose to attack the weak gazelle, child predators go after vulnerable kids.
They find the kid with a secret and say, "hey, want to see some porn?", and of course the kid is curious. They don't start with anything bad; this is a process for them. But they will tell the kid, "be sure you don't tell your parents about this. This is our secret." Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them. They demand that the kid send them nude photos. They trap the kid in deeper and deeper secrets and guilt to get more and more out of them. In the worst cases this ends in an in-person meetup with the predator.
The easiest places for predators to start this process are porn sites that kids are already visiting in secret, and especially porn servers on Discord, where messaging between users is the main feature. Those are the kids who are most vulnerable.
So how is Discord's policy supposed to protect kids? The goal is to keep the vulnerable kids out of the spaces where they would be targeted in the first place.
So there you go. I'm all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions to this. They just don't want any kind of protection at all.


Isn't this an uneducated take as well? To me it sounds like it makes the situation worse for everyone. If you strip teens' access (and do a half-assed job of it), they're likely to move to the next, more depraved place to do the same thing. Which will be more illegal, and they'll be more exposed and more vulnerable there. We can see this with the newer social media laws in Australia, for example. And once they feel they've done something wrong, they're probably less motivated to talk to adults about it or to get help.
And then we have the issue with the adults as well. They can no longer talk about legitimate topics. They also have to pull down their pants and upload their biometric data, name, and ID to some shady companies. And we know this ends up in large databases that will in turn get leaked and shared with third parties. So they'll be more vulnerable as well, both to hackers and to the dark-enlightenment people like Peter Thiel and his political friends.
We also know from speaking to experts in the field, like police staff, that weird surveillance measures like Chat Control are utterly ineffective against the actual criminals. And not only that: they flood investigators with false positives, leaving them less time to do their real job. So it actively takes away from the supposed cause.
I mean, these things frequently sound great on some emotional level, until you think about them for two minutes, or look at the hard facts and numbers after they're implemented. Why an opaque procedure run by a private company, one that also involves guesswork and shady data handling? Wouldn't we instead need effective means to protect minors?
And I don't think the new Discord procedures do what you think they do. I think it's about limiting NSFW groups (and content). There's nothing stopping predators from grooming kids; they can still talk to each other after March. In fact, Discord calls it "teen by default", so the adult groomer will now be presented as a teen as well unless they register. That makes it worse?! How is there something "in defense of" that?
Edit: I mean, I'd love to actually do something for young people. But that would have to be something that actually tackles the issue, not something that makes the situation even worse for them. I've had some ideas on how to make the Fediverse cater more to young people and provide a safe space for them, but that requires an entirely different approach.