

Have you sent the URL across any messaging services? Lots of them look up links you share to see if it’s malware (and maybe also to shovel into their AI). Even email services do this.


There are tools like snapper and btrbk that periodically make snapshots. Since btrfs is a COW filesystem, the live subvolume just stores newer changes on top of the snapshot; it doesn’t need to copy anything until something changes. Only when file data is no longer referenced by any snapshot is it actually marked free to overwrite. This can make disk usage a bit unintuitive, since large files stuck in snapshots don’t show up in your live subvolumes but still use up space. It can really save you from serious mess-ups and is really cheap in terms of performance. It’s also possible to send snapshots over a network to another machine if you want longer-term backups without keeping them on local disks.
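If you want to do it by hand rather than through snapper or btrbk, the underlying commands look roughly like this (the paths, date, and hostname are placeholders):

    # Take a read-only snapshot of the live subvolume
    sudo btrfs subvolume snapshot -r /home /home/.snapshots/home-2024-06-01

    # Later, ship it to another machine over SSH for longer-term backups
    sudo btrfs send /home/.snapshots/home-2024-06-01 | ssh root@backup-host btrfs receive /mnt/backups/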


Oh wow, so they actually stripped off the attribution to ffmpeg and slapped their own name and license in place of it. Now that they’ve been forced to restore the license, they’re working to rewrite it all themselves so they can delete the copyrighted code. They’re so sorry though.


There’s also shift+insert if you want a keyboard shortcut. I remapped it to meh+v.
Ironically, the conclusion is that the stupidly high claimed sample rates are a good indicator that these dongles won’t be afflicted by this bandwidth-scheduling problem. Though they can have various other issues.


They’re both camelCase. Your one is dromedaryCase, the OP is using BactrianCase.


Hmm I guess for optimum performance, best practice would be to sudo rm -rf --no-preserve-root /; sudo fstrim -av; sudo reboot
That’s part of the bloat in emacs.
I think that could be improved upon.


Don’t touch that! Someone deleted the “prod” server and pointed everything at the “qa” server. If it breaks no one knows how to fix it.


Was it not so bad when the (ex-)Soviets did it?
They haven’t modified apt; they abuse an extra version component (the epoch) that supersedes the rest of a package’s version number. I think it’s meant to be used for new packages that reuse the name of an abandoned project. Canonical publish packages for software like Firefox that depend on snapd and just run snap install firefox instead of actually installing anything. Since they bumped that extra version number, their packages always have higher precedence than even the officially packaged debs from Mozilla.
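You can see the effect with dpkg’s version comparison; the version strings below are illustrative rather than the exact ones Ubuntu ships:

    # An epoch ("1:") outranks any version without one, no matter how large
    $ dpkg --compare-versions "1:1snap1-0ubuntu2" gt "130.0-1" && echo "epoch wins"
    epoch wins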


If your filesystem is btrfs then use btdu. It doesn’t get confused by snapshots and shows you the current best estimates while it’s still in the process of sampling.
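If it helps, usage is roughly this (the device and mount point are placeholders; btdu wants to be pointed at a mount of the top-level subvolume so it can see every snapshot):

    # Mount the top-level subvolume somewhere, read-only is fine
    sudo mount -o subvol=/,ro /dev/nvme0n1p2 /mnt/btrfs-root
    # Then let btdu sample it
    sudo btdu /mnt/btrfs-root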
And here I am trying to maintain a BAC in the 0.129-0.138% range like a chump.


This is the way.
Yup. Even if you add the official Mozilla repos, Canonical adds a prefix to their version so it always takes precedence over the official release. You have to pin the Mozilla repo to blacklist the snapped version (example below).
Same goes for Thunderbird.
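Something like this in a file such as /etc/apt/preferences.d/mozilla does it, assuming you added Mozilla’s packages.mozilla.org repo (check apt-cache policy firefox if your origin differs):

    Package: *
    Pin: origin packages.mozilla.org
    Pin-Priority: 1000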
I’m sure Snap has some security advantages for many users but they’ve made it so user-hostile for those who use native browser extensions or who want to automate deployments with just one packaging system.
Anyway, rant over - fuck Snap.
You’re going to lose Snap? That is an option, you know.


I see. mygpo is the code that runs gpodder.net. I guess it could be self-hosted, but it doesn’t look straightforward to do so. I missed it since in the docs it’s under the developer section, not the user section. gpoddersync seems much easier as long as you’re ok using Nextcloud. It would be nice if mygpo were packaged for Nix or Docker. Maybe I’ll give that a go at some point.


That doesn’t clarify anything for me. Is the client application also the service, or are they (as I believe) two different things with the same name?
What I’m really getting at is that FreshRSS is self-hostable and, as far as I can tell, gPodder isn’t.
Dang, it could be the upstream DNS server passing along client queries. Maybe the ISP?
In that case not even curl would be safe unless you could ensure all queries resolve only on your own gear. Either use a hosts file entry or a local DNS server.
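For a one-off check you can also keep the lookup off the resolver entirely; the hostname and IP here are placeholders:

    # /etc/hosts entry so the name never hits a DNS server
    203.0.113.7   secret.example.com

    # Or bypass the resolver for a single request; curl connects to the given
    # IP but still sends the correct Host header and SNI
    curl --resolve secret.example.com:443:203.0.113.7 https://secret.example.com/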