Let me preface by saying I despise corpo llm use and slop creation. I hate it.

However, it does seem like it could be an interesting, helpful tool if run locally in the CLI. I’ve seen quite a few people doing this. Again, it personally makes me feel like a lazy asshole when I use it, but it’s not much different from web searching commands every minute (other than that the data used in training it is obtained by pure theft).

Have any of you tried this out?

  • arcayne@lemmy.today · 3 days ago

    Lowest barrier of entry would be to run a coder model (e.g. Qwen2.5-Coder-32B) on Ollama and interface with it via OpenCode. YMMV when it comes to which specific model will meet your needs and work best with your hardware, but Ollama makes it easy to bounce around and experiment.
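    For anyone wanting to try this, a rough sketch of the Ollama side might look like the following. The exact model tag and hardware figures here are assumptions, not guarantees: the 32B model is heavy, and smaller quantized variants exist if your GPU can’t handle it.

    ```shell
    # Sketch, assuming Ollama is already installed and its server is running.
    # Pull a coder model (large download; the 32B variant wants a lot of VRAM,
    # so swap in a smaller tag like qwen2.5-coder:7b on modest hardware).
    ollama pull qwen2.5-coder:32b

    # Quick interactive sanity check before wiring it up to a CLI frontend:
    ollama run qwen2.5-coder:32b "Write a POSIX shell one-liner that lists the 10 largest files under the current directory."
    ```

    Once the model responds locally, point your preferred frontend (OpenCode, per the comment above) at the Ollama endpoint and experiment from there.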