• 5too@lemmy.world
    7 hours ago

    As I understand it, most adult content producers aren’t actually interested in having minors using their sites. It seems like the easiest thing to do would be to have them add some “Adult Material” flag in their metadata, and let consumers respond as they wish to that tag - whether that’s done through browser settings, router nannyware, or whatever.

    Is there a technical reason this isn’t what’s being pushed for? I’m sure there’s lobbying and “optics” reasons for not doing this, but is there any practical reason for not pursuing this, or something like it?
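An "Adult Material" flag like the one suggested above already exists in at least one form: the RTA ("Restricted To Adults") label, a meta tag that sites can self-apply and that filtering software can check for. As a rough illustration, here is a minimal Python sketch of a client-side check for that tag; the function name and the extra "adult"/"mature" rating values are my own assumptions, not part of any standard.

```python
# Minimal sketch: detect a self-declared "adult content" flag in a page.
# The RTA label is an existing convention where sites include a tag like
#   <meta name="rating" content="RTA-5042-1996-1400-1577-RTA">
# so that browsers, routers, or nannyware can block the page client-side.
from html.parser import HTMLParser

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"


class RatingMetaParser(HTMLParser):
    """Collects the content of any <meta name="rating"> tag."""

    def __init__(self):
        super().__init__()
        self.rating = None

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "rating":
                self.rating = attr_map.get("content", "")


def is_flagged_adult(html_text: str) -> bool:
    """Return True if the page self-declares as adult content.

    'adult' and 'mature' are assumed extra values for illustration;
    only the RTA string is a widely recognized label.
    """
    parser = RatingMetaParser()
    parser.feed(html_text)
    if parser.rating is None:
        return False
    return parser.rating.strip().lower() in {RTA_LABEL.lower(), "adult", "mature"}
```

The point of a scheme like this is exactly what the comment describes: the site declares once, and enforcement happens wherever the consumer chooses - browser, router, or filtering software.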

    • SynonymousStoat@lemmy.world
      7 hours ago

      We already have multiple solutions for blocking children from websites that parents don’t want them to access, and the companies providing those solutions maintain their own databases of content, tagged by category, so that parents can have some control over what is blocked and what is not. This stuff has existed since the 90s; it’s nothing new. It requires parents taking the initiative, though, and really, when we get down to it, this is another “but think of the children” situation, where child safety is used as cover for making it easier to collect people’s biometric data online.
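To make the category-database idea above concrete, here is a hypothetical sketch of how such a filter resolves a lookup: the vendor ships a domain-to-category mapping, and the parent picks which categories to block. All domain names and categories here are invented for illustration; real products use far larger, continuously updated databases.

```python
# Hypothetical sketch of category-based parental-control filtering.
# A vendor-supplied database maps domains to content categories;
# the parent configures which categories should be blocked.
# Domains and categories below are made up for illustration.

CATEGORY_DB = {
    "example-adult-site.test": "adult",
    "example-gambling.test": "gambling",
    "example-news.test": "news",
}


def is_blocked(domain: str, blocked_categories: set) -> bool:
    """Return True if the domain (or any parent domain) is in a blocked category."""
    parts = domain.lower().split(".")
    # Walk from the full hostname up toward the registered domain,
    # so subdomains inherit the category of the parent site.
    for i in range(len(parts) - 1):
        candidate = ".".join(parts[i:])
        if CATEGORY_DB.get(candidate) in blocked_categories:
            return True
    return False
```

For example, `is_blocked("videos.example-adult-site.test", {"adult"})` matches via the parent domain, while an unlisted or unblocked category passes through.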

      • DireTech@sh.itjust.works
        6 hours ago

        The hell are you talking about? Yes, there are blocklists for adult domains, but that doesn’t actually block adult content, since it leaves sites like YouTube open. If you think there isn’t full-on sex on there, you’ll be surprised.

        The only thing that functions right now is whitelisting, and it is super annoying since so many apps open a web container inside the app. All this ID verification is nonsense, but providing an actually filtered internet is still nigh impossible for parents who aren’t tech-savvy.

        • SynonymousStoat@lemmy.world
          3 hours ago

          Pretty easy solution to that: don’t let your kid have access to YouTube without observing what they are watching. If a parent isn’t willing to learn how to set up parental controls and/or web filtering, and to take the time to observe what their child is consuming, then it shouldn’t be shoved onto the government and made a problem for everyone else.

          • DireTech@sh.itjust.works
            1 hour ago

            Yeah, that’s a nice idea, but so is parents being able to provide food and shelter for their children. However, most countries in the world have social services, because we recognize that what should happen and what actually happens don’t always overlap, and we don’t want kids to starve for their parents’ failings.

            Once you accept that some parents either won’t or, in many cases, can’t, the question is what you value more: the kids in those situations, or the companies profiting off the parents’ failures.

            Plus, YouTube is an easy example. There are thousands of other websites that host much worse. Making the websites themselves responsible for flagging the domain puts the onus on those most likely to have the technical know-how, rather than on those most likely to be ignorant.