Does anyone else have experience with koboldcpp? How do I make it give me longer outputs?
You’re part of the way there by setting the token count higher. Raising the context size lets the model “remember” more of the conversation, which helps it keep generating coherent responses up to that token count.
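If you end up driving it from a script instead of the web UI, here’s a minimal sketch of bumping those limits through koboldcpp’s KoboldAI-compatible HTTP API. The endpoint path and default port 5001 match what koboldcpp normally exposes, but the prompt text and the exact numbers are placeholders, and field names can shift between versions, so check against your build:

```python
import requests

API_URL = "http://localhost:5001/api/v1/generate"  # koboldcpp's default local endpoint

payload = {
    # Placeholder prompt -- swap in your own.
    "prompt": "Write a detailed description of a self-hosted home lab.",
    # How much history the model may "remember"; keep this at or below the
    # context size the server was launched with (e.g. --contextsize 4096).
    "max_context_length": 4096,
    # Number of new tokens to generate -- raising this is what gives longer replies.
    "max_length": 512,
    "temperature": 0.7,
}

resp = requests.post(API_URL, json=payload, timeout=300)
resp.raise_for_status()

# The KoboldAI-style API wraps the reply as {"results": [{"text": "..."}]}.
print(resp.json()["results"][0]["text"])
```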
If you haven’t already, go into the settings menu and make sure “Continue bot responses” is turned on. With it enabled, pressing the submit button with no input should make the bot keep adding onto its previous output (there’s a rough API-side equivalent sketched below).
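That toggle is a UI feature; if you’re scripting against koboldcpp instead, the closest equivalent I know of is simply resubmitting everything generated so far as the prompt and asking for more tokens. Same assumptions as above (default endpoint, KoboldAI-style fields), so treat it as a sketch:

```python
import requests

API_URL = "http://localhost:5001/api/v1/generate"  # same assumed default endpoint


def generate(prompt: str, max_length: int = 512) -> str:
    """Ask koboldcpp for up to max_length new tokens following `prompt`."""
    resp = requests.post(
        API_URL,
        json={
            "prompt": prompt,
            "max_length": max_length,
            "max_context_length": 4096,
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["results"][0]["text"]


story = "Once upon a time,"  # placeholder opening text
story += generate(story)     # first reply

# "Continue": resubmit everything so far with no new user text appended,
# which is roughly what pressing submit with an empty box does in the UI.
story += generate(story)
print(story)
```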
nginx is mature and has a lot of support online, and a lot of server projects assume you’re using it. I’ve only ever seen Caddy instructions on newer projects, and even then they usually include nginx instructions as well.
Plus, I already know how to use it.