I’m not sure what exactly you’re asking for, but the only way to ensure privacy with LLMs is to self-host.
I heard about an open-source AI roleplay website, but…
I see your previous post got deleted. I’m just going to paste my old comment here in case you didn’t see it. Feel free to ignore it if you did, I guess:
How good’s your computer? Running locally is always the best option, but an 8-13GB model is never going to be as good as the stuff you’d find hosted by major companies. But hey, no limits and it literally never leaves your PC.
You can find models on Huggingface, and if you don’t know what you’re looking for, there’s a subreddit where they have a weekly discussion on enthusiasts’ favorite models. I don’t remember the sub’s name, but you should be able to find it easily enough with a Google search like “reddit weekly AI model thread”. Go to the poster’s profile and you’ll find all of the old threads you can read through for recommendations.
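If you do go local and want a rough idea of what actually running one of those downloads looks like, here’s a minimal sketch using llama-cpp-python with a GGUF file from Huggingface. The model path and the prompts are placeholders, not a recommendation:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Placeholder path: point it at whatever GGUF file you downloaded from Huggingface.
llm = Llama(model_path="./models/your-model.Q4_K_M.gguf", n_ctx=4096)

reply = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a creative roleplay partner."},
        {"role": "user", "content": "Set the scene: a rainy night in a port town."},
    ],
    max_tokens=256,
)
print(reply["choices"][0]["message"]["content"])
```

Everything in that snippet runs on your own hardware; nothing gets sent anywhere.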
I don’t have a computer yet.
The cloud is someone else’s computer. Also we judge you.
Not for the role-play, but for the AI use.
bro 🥺
FYI, I don’t roleplay often, but when I do, it’s with a local LLM in LM Studio, for instance. OpenAI and the others already have too much; they don’t need to know my kinks too.
Offline AI has the benefit of being swappable, so if you don’t like the results, you can just switch to a different model.
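To make the “swappable” point concrete: LM Studio (mentioned above) and most other local runners expose an OpenAI-compatible server on localhost, so trying a different model is usually just loading it and changing one string. A rough sketch, assuming LM Studio’s default port of 1234 and a placeholder model name:

```python
# pip install openai
from openai import OpenAI

# LM Studio's local server speaks the OpenAI API; 1234 is its default port.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # key is ignored locally

# Swapping models just means loading a different one in LM Studio and changing this string.
MODEL = "your-local-model"  # placeholder name

resp = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Stay in character as a grumpy innkeeper."}],
)
print(resp.choices[0].message.content)
```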