I know this topic, like most online topics, is more about emotion than fact. But here we are. You probably don’t understand how Discord’s new policy is intended to help protect children because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety training over the years for professional reasons. Here’s how online child predators work.

They start by finding a kid with a secret. Just like a lion will generally choose to attack the weakest gazelle, child predators go after vulnerable kids.

They find the kid with a secret and say, “hey, want to see some porn?”, and of course the kid is curious. They don’t start with anything bad. This is a process for them. But they will tell the kid, “be sure you don’t tell your parents about this. This is our secret.” Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them, demanding that the kid send them nude photos. They trap the kid in deeper and deeper secrets and guilt to get more and more out of them. In the worst cases this ends in an in-person meetup with the predator.

The easiest places for predators to start this process are porn sites that kids are already visiting in secret, especially porn servers on Discord, where messaging between users is the main feature. Those are the kids that are most vulnerable.

So how is Discord’s policy supposed to protect kids? The goal is to keep vulnerable kids out of the spaces where they would be targeted in the first place.

So there you go. I’m all ears for how to do this better. That’s one beef I have with the EFF right now: they offer no alternative solutions to this. They just don’t want any kind of protections at all.

  • 1dalm@lemmings.worldOP · 2 days ago

    “You do realise you’re interacting on a platform that would shut down if they had to do this because they can’t afford it.”

    Yes. And I also believe the fediverse community should take this problem more seriously than it currently does, and not just wait until the government forces them to take it seriously.

    One big difference is that the fediverse generally isn’t marketing itself to kids to bring them onto the network, unlike other networks that directly market themselves to kids to get them locked in at a young age.

    • Skavau@piefed.social · 2 days ago

      “Yes. And I also believe the fediverse community should take this problem more seriously than it currently does, and not just wait until the government forces them to take it seriously.”

      How would the government do that? The Forumverse has 40k members (which is tiny) and it’s split up into over 100 instances.

      Who do they try and talk to?

      How can the Fediverse “take it seriously” when they simply can’t afford to?

      • 1dalm@lemmings.worldOP · 2 days ago

        Honestly, saying “we can’t afford to take it seriously” is exactly what gets organizations in trouble.

        You can’t afford not to.

        • Skavau@piefed.social · 2 days ago

          “Honestly, saying ‘we can’t afford to take it seriously’ is exactly what gets organizations in trouble.”

          The fediverse isn’t an organisation.

          As I asked: How would the government do that? The Forumverse has 40k members (which is tiny) and it’s split up into over 100 instances.

          • 1dalm@lemmings.worldOP · 2 days ago

            You wouldn’t have to treat it like an organization. Go after individual hosts. If a police investigation found that a forumverse host was providing an opportunity for child predators to use their system to blackmail kids into sending nude photos of themselves, then I think the host, the individual, should be held responsible for what happens on their server, just like they would be held responsible if it happened in their house.

            • Skavau@piefed.social · 2 days ago

              You unironically think that governments are going after hosts that have, in many cases, less than 1000 active monthly users purely because they don’t have age-ID services on their platform?

              • 1dalm@lemmings.worldOP · 2 days ago

                100% they would. Yeah.

                If child pornography was found to be stored on a host’s server by one of their 1000 users, “I didn’t think you guys would care about a platform with less than 1000 monthly users” isn’t going to be a great argument in court.

                • Skavau@piefed.social · 2 days ago

                  “100% they would. Yeah.”

                  How would they know?

                  “If child pornography was found to be stored on a host’s server by one of their 1000 users, ‘I didn’t think you guys would care about a platform with less than 1000 monthly users’ isn’t going to be a great argument in court.”

                  You’re talking here specifically about child pornography, not just failing to age-verify users for ‘adult’ content. No server, to my knowledge, allows this.

                  • 1dalm@lemmings.worldOP · 2 days ago

                    “How would they know?”

                    Well, if, and really when, a predator is caught by the police, that police department will do a full investigation and find all the places where they were communicating with kids. Sooner or later, one will be found to have been using Lemmy. On that day, the host is going to need a good lawyer.

                    It’s not enough to simply “not allow this”. A person who lets anonymous strangers use their servers to store information in secret is asking for trouble. They need to take much more care than that.

                    And I never said that age verification is the only solution to this problem.