• StinkySocialist@lemmy.ml · 14 hours ago

    I may be slipping up on jargon.😅

I think the versions of DeepSeek you can get from Ollama are FOSS. I have that running on my homelab and can access it with Open WebUI. Are you looking for something like that? I could link some stuff.
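If it helps, the basic Ollama flow looks roughly like this (the model tag is just an example of one of the small DeepSeek distills — check ollama.com/library for what's actually available, and note the install script line assumes a Linux box):

```shell
# Install Ollama on Linux (convenience script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a small DeepSeek distill (example tag -- browse the Ollama library to confirm)
ollama pull deepseek-r1:8b

# Chat with it right in the terminal
ollama run deepseek-r1:8b
```

Once the Ollama server is running, Open WebUI can point at it and give you a browser chat interface instead of the terminal.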

  • armandoenlachamba@piefed.social · 13 hours ago

      Thanks! I will do some searching on my own, and your comment is a good starting point. I will probably ask you for links if I’m unable to find anything.

      May I ask what kind of hardware you use to run your LLMs? Like, do you have a rack full of GPUs?

    • StinkySocialist@lemmy.ml · 12 hours ago

        I got an old machine off eBay (see pic). I only run models that are 8B parameters or less.

        I got Ubuntu Server on it, then Docker running in that. In Docker I have Ollama, Open WebUI, Jellyfin and a game server. No issues running any of that.

        Edit: if you want something that can run better LLMs, I recommend more RAM and a better GPU.
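        In case it's useful, a minimal sketch of the Ollama + Open WebUI part of that stack as a docker-compose file (image tags, ports, and volume names here are assumptions — adjust to your setup, and a GPU passthrough section would be extra):

```yaml
# Minimal sketch: Ollama + Open WebUI in Docker Compose (assumed tags/ports, not an exact setup)
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama   # where pulled models are stored
    ports:
      - "11434:11434"               # Ollama's HTTP API

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # point the UI at the ollama service
    ports:
      - "3000:8080"                 # web UI on host port 3000
    depends_on:
      - ollama

volumes:
  ollama_data:
```

        With something like this, `docker compose up -d` brings both up and the UI is reachable at http://localhost:3000.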

          • StinkySocialist@lemmy.ml · 7 hours ago

            I use them mostly for helping me write emails or meal prepping tbh lol. I’ve used DeepSeek to help me with Python before, but if you’re not just dicking around like me you’d definitely want something more powerful.

            For image generation it sounds like this tool called ComfyUI is the way to go. I have it running in Docker but haven’t set anything up inside it yet.

            It’s pretty neat. I really set this up to help keep my data out of the hands of the corps and the feds lol.