• XLE@piefed.social · edit-2 · 9 hours ago

    @ikidd@lemmy.world @ingeanus@ttrpg.network do you two have a source for these supposed great models?

    • Riskable@programming.dev · 2 hours ago

      I personally love glm-5 and qwen3.5, specifically: https://ollama.com/library/qwen3.5:122b

      I’ve used them both for coding and they work really well (way better than you’d think). They’re also perfectly capable of the usual LLM chat stuff (e.g. check my grammar) but all the models (even older, smaller ones) are capable of that stuff these days.

      For a treat: have someone show you how to use some of these models to search the web! It’s amazing. You don’t see ads, you don’t have to comb through 12 pages of search results, and they read the pages at that moment (not cached) to give you summaries of the content. So when you click the link to go to the content, you know it’s the thing you were looking for.

      They’re not using a local index of the Internet; they’re searching on your behalf using whatever search engines you configured. It’s waaaaay better than ChatGPT (which uses Bing behind the scenes whether you like it or not) or Gemini (which uses Google, obviously). The (self-hosted) LLM will literally be running curl for you against Google, DuckDuckGo, Bing, or whatever TF else you want (simultaneously), then reading each of the search results and using your prompt to figure out which ones are most relevant. It’s sooooo nice!
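      The flow described above (fetch live results, then let a local model rank and summarize them) can be sketched roughly as below. This is a minimal illustration, not any particular tool’s implementation: `fetch_results` is a hypothetical stub standing in for live queries to DuckDuckGo/Google/etc., and the prompt it builds is the kind of thing you would send to a local model (e.g. via Ollama’s HTTP API).

```python
# Sketch of the "self-hosted LLM as search frontend" pattern.
# fetch_results() is a HYPOTHETICAL stub; a real setup would query one or
# more search engines and download each result page fresh (no cached index).

def fetch_results(query):
    # Stub data standing in for freshly fetched search-result pages.
    return [
        {"url": "https://example.com/a", "text": "Self-hosted LLM setup guide."},
        {"url": "https://example.com/b", "text": "Unrelated cooking recipes."},
    ]

def build_ranking_prompt(query, results):
    """Assemble the prompt a local model would receive to summarize
    each fetched page and rank them by relevance to the user's query."""
    lines = [f"User query: {query}", "", "Pages fetched just now:"]
    for i, r in enumerate(results, 1):
        lines.append(f"{i}. {r['url']}")
        lines.append(f"   {r['text']}")
    lines.append("")
    lines.append("Summarize each page and rank them by relevance to the query.")
    return "\n".join(lines)

prompt = build_ranking_prompt("self-hosted LLM web search",
                              fetch_results("self-hosted LLM web search"))
print(prompt)
```

      In a real setup the prompt would go to the local model (for Ollama, a POST to its `/api/generate` endpoint), and the model’s answer is what you read instead of a page of search-engine results.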

      FYI: Ollama.com’s library page is actually a great resource for finding info on all the models that can be self-hosted: https://ollama.com/library

    • ikidd@lemmy.world · 9 hours ago

      What inclined you to @ me into this? As far as I can see, I haven’t even replied in this thread, and you just seem like you’re on the warpath with anyone who wants to defend using LLMs. If Greg KH thinks it’s coming into its own, you might want to heed him.