• Nelots@piefed.zip
    3 days ago

    I see your previous post got deleted. I’m just going to paste my old comment here in case you didn’t see it. Feel free to ignore it if you did, I guess:

    How good’s your computer? Running locally is always the best option, but an 8-13GB model is never going to be as good as the stuff you’d find hosted by major companies. But hey, no limits and it literally never leaves your PC.
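
    A rough back-of-envelope for what that “8–13GB model” means for your hardware: a quantized model file is roughly parameter count × bits per weight ÷ 8, and you need about that much free RAM/VRAM to load it. This is a sketch under that assumption, not an exact formula (real files add some overhead):

    ```python
    def est_model_size_gb(n_params: float, bits_per_weight: int) -> float:
        """Approximate quantized model file size in GB (decimal).

        Assumes size ~= params * bits / 8; real GGUF files are slightly larger
        due to embeddings, metadata, and mixed-precision layers.
        """
        return n_params * bits_per_weight / 8 / 1e9

    # A 13B-parameter model at 4-bit quantization vs. 8-bit:
    print(est_model_size_gb(13e9, 4))  # ~6.5 GB
    print(est_model_size_gb(13e9, 8))  # ~13 GB
    ```

    So the 8–13GB range the comment mentions corresponds roughly to a 13B-class model at common quantization levels.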

    You can find models on Huggingface, and if you don’t know what you’re looking for, there’s a subreddit where they have a weekly discussion on enthusiasts’ favorite models. I don’t remember the sub’s name, but you should be able to find it easily enough with a Google search like “reddit weekly AI model thread”. Go to the poster’s profile and you’ll find all of the old threads you can read through for recommendations.

  • yardratianSoma@lemmy.ca
    3 days ago

    FYI, I don’t roleplay often, but when I do, I do it with a local LLM in LM Studio, for instance. OpenAI and the others already have too much; they don’t need to know my kinks too.

    Offline AI has the benefit of being swappable, so if you don’t like the results, you can just swap in a new model.
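
    A sketch of why swapping is so cheap: local runners like LM Studio expose an OpenAI-compatible chat API, so changing models is just one field in the request. The base URL and model name below are placeholder assumptions, not values from this thread — check your own server settings:

    ```python
    import json

    # Assumed default for LM Studio's local server; yours may differ.
    BASE_URL = "http://localhost:1234/v1/chat/completions"

    def chat_request(model: str, user_message: str) -> dict:
        """Build an OpenAI-style chat-completion payload.

        Trying a different local model means changing only the "model" field;
        the rest of your code stays the same.
        """
        return {
            "model": model,  # e.g. whatever GGUF you currently have loaded
            "messages": [{"role": "user", "content": user_message}],
            "temperature": 0.8,
        }

    payload = chat_request("some-local-model", "Hello!")
    print(json.dumps(payload, indent=2))
    ```

    You’d POST that payload to `BASE_URL` with any HTTP client; since nothing leaves localhost, the privacy point above holds.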