• neon_nova@lemmy.dbzer0.com · 2 days ago

    I have a 16 GB MacBook Air M4.

    I like the idea of having a model I can run locally in the event of a long-term internet outage.

    Can you recommend a model that would be suitable for my computer?

      • neon_nova@lemmy.dbzer0.com · 2 days ago

        Thanks! I figured it’s low on RAM, but with the way things are going in the world, I’m thinking it might be better than nothing.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 2 days ago

          It’s entirely possible we’ll see fairly capable models that can run in 16 gigs of RAM in the near future. When Qwen 3.5 came out in February, you needed a server with hundreds of gigs of memory to run the 397bln-param model. Fast forward to a couple of weeks ago: 3.6 comes out with a 27bln-param version that beats the old 397bln-param one in every way. Just stop and think about how phenomenal that is: https://qwen.ai/blog?id=qwen3.6-27b

          So, it’s entirely possible people will find ways to optimize this stuff even further this year or the next, and we’ll get an even smaller model that’s more capable.
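          To put rough numbers on why 16 GB suddenly looks plausible: a model’s weight footprint is roughly parameter count × bytes per parameter, plus some runtime overhead for the KV cache and the runtime itself. A minimal back-of-envelope sketch (the 20% overhead factor is my own assumption, not anything from the Qwen announcement):

          ```python
          def model_memory_gb(params_billion: float, bits_per_param: float,
                              overhead: float = 1.2) -> float:
              """Rough RAM estimate: weights plus ~20% overhead (assumed) for
              KV cache and runtime. Not exact -- real usage depends on context
              length, quantization format, and the inference engine."""
              weight_bytes = params_billion * 1e9 * (bits_per_param / 8)
              return weight_bytes * overhead / 1e9

          # Weights alone for a 27bln-param model at 4-bit quantization:
          print(round(model_memory_gb(27, 4, overhead=1.0), 1))  # 13.5 GB

          # With the assumed 20% overhead it's a tight squeeze on 16 GB:
          print(round(model_memory_gb(27, 4), 1))  # 16.2 GB
          ```

          So a 27bln-param model at 4-bit quantization is right at the edge of a 16 GB machine, which is why further shrinkage (smaller param counts or lower-bit quantization) matters so much here.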

    • bountygiver [any]@lemmy.ml · 2 days ago

      A long-term internet outage is not that likely, but getting priced out of online models is quickly becoming reality.