• NotSteve_@piefed.ca · 17 hours ago

    What are you running that needs more than 32 GB? I’m only just barely being bottlenecked by my 24 GB when running games at 4K.

    • Jeena@piefed.jeena.net · 16 hours ago

      Two browsers full of tabs, but that’s not a problem. Once I start compiling AOSP, though (which I sometimes want to do for work at home instead of in the cloud, because it’s easier and faster to debug), it eats up all the RAM immediately and I have to give it 40 more GB of swap, and then the swapping is the bottleneck. Once that’s running, the computer can’t really do anything else; even the browser struggles.

      • usernamesAreTricky@lemmy.ml · 13 hours ago

        Have you tried just compiling it with fewer threads? That would almost certainly reduce the RAM usage, and might even make the compile go faster if you’re swapping that heavily. A rough sketch of the idea is below.
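
        Something like this back-of-envelope, in Python (the 2 GB-per-job figure and the helper are my own assumptions, not AOSP-documented numbers; tune them for your tree):

        ```python
        # Rough sketch: pick a RAM-safe parallel job count for a big
        # C++/AOSP-style build. Assumes ~2 GB peak RAM per compiler job
        # (a guess) and leaves some headroom for the OS and browsers.
        import os

        GB_PER_JOB = 2   # assumed peak RAM per parallel compile job
        RESERVE_GB = 8   # headroom for the rest of the system

        def safe_jobs(total_ram_gb: int) -> int:
            by_ram = max(1, (total_ram_gb - RESERVE_GB) // GB_PER_JOB)
            by_cpu = os.cpu_count() or 1
            return min(by_ram, by_cpu)

        # On a 32 GB machine: min((32 - 8) // 2, cores) = at most 12 jobs,
        # which you would then pass to the build, e.g. `m -j12` for AOSP.
        print(safe_jobs(32))
        ```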

        • Jeena@piefed.jeena.net · 6 hours ago

          Yeah, that’s what I’m doing, but I paid for the fast CPU and don’t want to make it the bottleneck ^^

    • hoshikarakitaridia@lemmy.world · 17 hours ago

      AI or servers, probably. I have 40 GB, and that’s what I would need more RAM for.

      I’m still salty, because I had the idea of going CPU & RAM sticks for AI inference literally days before the big AI companies did. And my stupid ass didn’t buy them in time before the prices skyrocketed. Fuck me, I guess.

      • NotMyOldRedditName@lemmy.world · 17 hours ago

        It does work, but it’s not really fast. I upgraded from 32 GB to 96 GB of DDR4 a year or so ago, and being able to play with the bigger models was fun, but it was so slow I couldn’t do anything productive with it.

        • Possibly linux@lemmy.zip · 17 hours ago

          You’re bottlenecked by memory bandwidth.

          You need DDR5 with lots of memory channels for it to be useful. Rough numbers below.
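
          Back-of-envelope: each generated token has to stream the full weights from RAM, so tokens/s is capped at roughly bandwidth ÷ model size. (The bandwidths below are theoretical peaks, and the 40 GB model size is my assumption, about a 70B model at 4-bit quantization.)

          ```python
          # Rough upper bound on generation speed for a CPU/RAM-bound LLM:
          # every token reads all weights once, so tokens/s <= bandwidth / size.

          def tokens_per_second(bandwidth_gbs: float, model_gb: float) -> float:
              return bandwidth_gbs / model_gb

          MODEL_GB = 40.0  # assumed: ~70B parameters at 4-bit quantization

          for name, bw in [
              ("dual-channel DDR4-3200", 51.2),       # 2 x 25.6 GB/s
              ("dual-channel DDR5-5600", 89.6),       # 2 x 44.8 GB/s
              ("8-channel DDR5-4800 server", 307.2),  # 8 x 38.4 GB/s
          ]:
              print(f"{name}: ~{tokens_per_second(bw, MODEL_GB):.1f} tokens/s")
          ```

          That’s why more channels help far more than just more capacity.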

        • tal@lemmy.today · 17 hours ago

          You can have applications where wall-clock time is not all that critical but large model size is valuable, or where a model is very sparse and so does little computation relative to its size, but for the major applications, like today’s generative AI chatbots, I think that’s correct.

          • NotMyOldRedditName@lemmy.world · 16 hours ago

            Ya, that’s fair. If I was doing something I didn’t care about time on, it did work. And we weren’t talking hours, but it could be many minutes.

      • panda_abyss@lemmy.ca · 15 hours ago

        I’m often using 100 GB of RAM for AI.

        Earlier this year I was going to buy a bunch of used servers with 1 TB of RAM, and I wish I had.