

It could be considered biochemical warfare


along with the compose.yaml file, unless I need it on a different drive for some reason


yeah, I added rules that strongly discourage comments entirely when generating code


Cloudflaring my program
I’ll start using this one


and that indentation defaults in decent editors are usually language-dependent. I’m not familiar with these editors, but… come on: if they use one default for all file types, OP should use a better tool.


almost as if using a memory-safe language actually reduces memory-related CVEs


most repos use 4 spaces


it’s not like the whole driver is written in unsafe Rust


Maybe. The problems I have with Codeberg are the lack of support for private repos and the 100-repo limit. I have some personal stuff in version control that I prefer to keep private, like notes, dotfiles, and shell history.
At the same time, I’m not sure I want to maintain a self-hosted forge.


but the whole thing is self-hosted, not just the action runner, right?


not AI, and still not open source either


any alternatives to GH that allow private repos and self-hosted action runners?
btw, the prices of managed runners are going down, not increasing
https://docs.github.com/en/billing/reference/actions-runner-pricing#standard-github-hosted-runners
still good to have a self-hosted alternative though
ah right, my bad
fwiw, you can self-host a GitHub Actions runner
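On the workflow side, it’s mostly a matter of pointing a job at whatever runner you registered. A minimal sketch, assuming a hypothetical workflow and build script (only `runs-on: self-hosted` and the checkout action are the real parts):

```yaml
# minimal sketch: a job targeting a self-hosted runner instead of a managed one
# (workflow name, job name, and build command are placeholders)
name: ci
on: [push]

jobs:
  build:
    runs-on: self-hosted        # picked up by any runner registered for the repo/org
    steps:
      - uses: actions/checkout@v4
      - run: ./build.sh         # replace with your actual build step
```

The runner agent itself runs on your own hardware and is registered against the repo or org in its settings.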


maybe they resumed development, then. It was removed from the Ubuntu and RHEL repos about 5 years ago, when I had to look for an alternative


are you using a maintained alternative? Distros started removing it from their repos years ago because it was no longer maintained, afaik


I’m the only user of my setup, but I configure Docker Compose stacks, mount configs as bind mounts, and track everything in a git repo that I sync every now and then.
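Roughly this layout, as a minimal sketch (service name, image, and paths are made up): the config lives next to compose.yaml, so the whole stack is one git-tracked directory.

```yaml
# minimal sketch: config bind-mounted from the same directory as compose.yaml
# (service name, image, and paths are placeholders)
services:
  app:
    image: example/app:latest
    volumes:
      - ./config/app.conf:/etc/app/app.conf:ro  # tracked in the same git repo
    restart: unless-stopped
```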
It doesn’t require a GPU, but it runs better with one. It’s not for nothing that modern terminal emulators like Kitty, Alacritty, Ghostty, and WezTerm ship with GPU support. Sure, they just “render text”, but that’s a rendering workload that is highly parallelizable. There’s no reason to waste CPU cycles on it.
ok()