Let me preface by saying I despise corpo LLM use and slop creation. I hate it.
However, it does seem like it could be an interesting, helpful tool if run locally in the CLI. I've seen quite a few people doing this. Again, it personally makes me feel like a lazy asshole when I use it, but it's not much different from web searching commands every minute (other than that the data used to train it is obtained by pure theft).
Have any of you tried this out?
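For anyone curious what "locally in the CLI" looks like in practice, a minimal sketch with Ollama might be (assuming Ollama is already installed; the model name is just an example pulled from later in this thread):

```shell
# Download the model weights to the local machine (one-time, can be large)
ollama pull gemma3

# One-shot question from the terminal; everything runs locally
ollama run gemma3 "How do I find files modified in the last 24 hours?"
```

Running `ollama run gemma3` with no prompt drops you into an interactive chat session instead.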


I’ve run several LLMs with Ollama (locally) and I have to say that it was fun, but it is not worth it at all. It does get many answers right, but that does not even come close to compensating for the amount of time spent generating bad answers and troubleshooting those. Not to mention the amount of energy the computer is using.
In the end I’d just rather spend my time actually learning the thing I’m supposed to solve, or skim through documentation if I just want the answer.
I have had really good luck with Alpaca, which uses Ollama.
Gemma3 has been great.
Alpaca is the GTK client for Ollama, right? I used it for a while to let my family have a go at local LLMs. It was very nice for them, but on my computer it ran significantly slower than they expected, so that’s that.
This has been my experience with LLMs in my day-to-day job. Thank you for the comment.
Thank you as well.