Let me preface by saying I despise corpo LLM use and slop creation. I hate it.
However, it does seem like it could be an interesting, helpful tool if run locally in the CLI. I’ve seen quite a few people doing this. Again, it personally makes me feel like a lazy asshole when I use it, but it’s not much different from web searching commands every minute (other than that the data used to train it is obtained by pure theft).
Have any of you tried this out?


Playing with it locally is the best way to do it.
Ollama is great, and believe it or not, I think Google's Gemma is the best for local stuff right now.
Agreed, Gemma is the best-performing model I can run in my 12GB of VRAM.
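
If you want to script it instead of just chatting in the terminal, Ollama also exposes a local HTTP API. Here's a minimal Python sketch, assuming the server is running on its default port and you've already pulled a Gemma model (the model tag below is just an example, check `ollama list` for what you actually have):

```python
import json
import urllib.request

# Assumes an Ollama server listening on the default localhost:11434
# and a Gemma model already pulled (the tag here is an example).
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "gemma2"  # swap for whatever tag `ollama list` shows on your machine

def ask(prompt: str) -> str:
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Explain what `tar -xzvf` does in one sentence."))
```

Everything stays on your own machine, so it's basically the offline equivalent of web searching a command, minus sending anything to a corpo server.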