• sp3ctr4l@lemmy.dbzer0.com
      20 hours ago

      Unless I am missing something:

      Most people do not have a local LLM in their pocket right now.

Most people have, in their pocket right now, a client app that talks to a remote LLM, which ‘lives’ in an ecologically and economically dubious mega-datacenter.

      • GamingChairModel@lemmy.world
        20 hours ago

        Plenty of the AI functions on phones are on-device. I know the iPhone can do several text-processing tasks (summarizing, translating) offline, and Apple has an API for third-party developers to use the on-device models. And the Pixels have Gemini Nano on-device for certain offline functions.