This doesn’t feel like something that should happen. Like at all. I don’t want experience repairing stuff. I want stuff not breaking. I know most people here treat an OS like a hobby, but for most people it’s a tool.
That’s true, but that sadly won’t help against a state forcing a company to put these things into the silicon. Not saying they do right now, but it’s a real possibility.
I mean, can’t they just audit a version that doesn’t have a backdoor or snooping? Verifying against the silicon is probably very hard.
How do you want to verify that a RISC core isn’t doing something funny?
I see the appeal of the package manager for a lot of things, but storage has gotten so incredibly cheap and fast that duplication is far less of a problem than the effort of making stuff work the traditional way. But I’m not a real Linux user. I don’t like tinkering; I want to download something and have it work. And the amazing thing is we can have both. If people like spending time packaging something, be my guest.
The funniest interaction I had recently: I downloaded a program that isn’t in my package manager and doesn’t have any sort of Flatpak/AppImage, so I grabbed it as a .deb, and it didn’t run because of some dependency. I could have cloned the Git repo and built it from source, which might have worked, but I was too lazy to. So I just downloaded the Windows .exe and ran it through Wine, which worked flawlessly.
But I like my applications years out of date, and I think it’s good that every distro has to spend man-hours packaging it individually.
I had a problem with an Intel HD 4000 on Arch.
How do you want to federate petabytes or even exabytes of content? And your second sentence leads to a monolithic instance.
You aren’t supposed to do serious work over these things. They should be a last resort, imo.