TL;DR

ID scanning is becoming a more common requirement for entry to bars and clubs in Australia (and worldwide). A company called ScanTek, whose system is used in over 1,000 Australian venues, provides tools such as biometric matching of a person’s face to their ID, detecting fake IDs, flagging patrons, and automatically sharing data with other venues.

As well as verifying ages, ScanTek’s website boasts to business owners that its system can “collect marketing information from IDs and drivers licences, which business owners can use to target specific demographics with promotions”, though the company claims it does not share this data with third parties.

Australia’s privacy laws are vague: they don’t specify what can be collected or how it must be stored, and only say that companies shouldn’t keep data for longer than is “reasonable”.

  • Phoenixz@lemmy.ca · 46 points · 8 hours ago

    As well as verifying ages, ScanTek boasts “collect marketing information from IDs and drivers licences, which business owners can use to target specific demographics with promotions”

    And here we have the real reason why they want the scanning

    Added security to keep the random asshole out, sure, but the marketing is the point. Fuck that shit; I’d rather not go out to a bar at all than put up with this.

    • partofthevoice@lemmy.zip · 19 points · 8 hours ago

      Insurance companies are going to love that data. “Oh, Mr. Doe, your application says you drink but once a year. However, we see you went to Moe’s Tavern twice last year. Sorry, but we can’t cover the cost of your medicine with these application discrepancies present.”

        • a4ng3l@lemmy.world · 4 points · 3 hours ago

          Yeah, but it’s going to be more insidious than that: a few points in a vector somewhere inside a large model used to personalise a quote. You typically never know how they’re screwing you over. Even here in Europe, the right of access provided by the GDPR might not reveal that bullshit: once the model has been trained, the atomic data is eventually purged…