



Okay but why is it inherently funny though?
So we’re just doing food items with no other context as memes? What’s it gonna be next month? Pasta?


Wasn’t Linux first released in like 1993?


Okay so, this is less a line in the sand and more a 14 foot concrete wall topped with razor wire and guarded by marines with rifles with fixed bayonets in the sand:
I will not install an end-user application using Cargo, and I will say many mean things to anyone who suggests it.
Python’s Pip or PyPI or PyPy, whichever it is (two of those are nearly the same name for two completely different things and no one had their head slammed into a wall for doing that; proof that justice is a fictional concept), I can almost accept. You could almost get me drunk enough to accept distributing software via Python tooling, because Python is an interpreted language: whether you ship me your project as a .exe, a .deb, a flatpak, whatever, you’re shipping me the source code. Also, Python is a pretty standard inclusion on Linux distros, so Pip is likely to be present.
Few if any distros ship with Rust’s toolchain installed, and the officially recommended way to install it, per rust-lang.org, is to pipe curl into sh. Don’t ask end users to install a programming language to compile your software.
Go ahead and ask your fellow developers to compile your software; that’s how contributing and forking and all that open source goodness should be done. But not end users. Not for “Install and use as intended.” For that, distribute a compiled binary somehow; at the very least a dockerfile if a service or an appimage if an application. Don’t make people who don’t develop in Rust install the Rust compiler.
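For context, the install path being objected to versus the binary alternative looks roughly like this. The curl line is the one rust-lang.org actually recommends; the crate and AppImage names below are hypothetical stand-ins:

```shell
# The officially recommended Rust install: pipe curl into sh.
# This runs the rustup installer script, which pulls down the whole
# toolchain (rustc, cargo, etc.) just so an end user can build one app.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
cargo install some-end-user-app   # hypothetical crate name

# Versus shipping a compiled binary: no toolchain required.
chmod +x SomeApp-x86_64.AppImage  # hypothetical AppImage name
./SomeApp-x86_64.AppImage
```

The end user never touches a compiler in the second case; the developer builds once and ships the result.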


My thing is, a dumb phone has the features I would like to do without on my smart phone. Telephone and SMS ARE THE GODDAMN PROBLEM. If people who were not explicitly whitelisted by me out of band had no method of contacting me, that’d be great.


My problem is people keep infecting the world with software designed with Gnome’s “Mac with Meningitis” style sheet.


If you tell me to install an end-user facing application with a programming language’s package manager, I’m out. Like, Adafruit was at one point recommending Mu, a Python IDE for their own implementation of MicroPython, and the instructions were to install it with Pip. Nope. Not doing that.


Amazon sold me a defective planer that had sawdust in it. I was apparently the second to return it under warranty.


I aim for 10 years with a mid-life upgrade. I even do this with my laptops; my Inspiron got a new battery, a new CPU fan and an SSD for its 6th birthday. It’s 11 now.
My Ryzen 3600 rig is an HTPC now.


Mostly I have this distinct memory of badly communicating with a Verizon employee when I got my first smart phone, an LG Ally.
I remember asking “Is this a Droid?”, meaning “Is the make and model of this handset a Motorola Droid?”, and the reply was “They’re all droids,” meaning they all run the Android operating system. I miss LG phones, or at least the state of my personal life back when I had LG phones.


was it Verizon or Motorola?


They keep re-implementing things.
Take just the Start menu. You can see how 95 evolved into 98 evolved into ME, then they changed it for XP, and they never stopped making big pointless changes. In many cases, those big pointless changes have lengthened the trip from the bare desktop to the thing you need by adding pointless screens and dialogs. Or, like the Start menu, they drastically redesigned it such that a user used to Win XP tries to use 7 and just…stares at it, because it’s not what they were expecting. Windows 7’s Start menu might even be objectively better; Microsoft’s software engineers could very well have produced good research documentation about UI design, observed or polled users about what features they wanted, and then built the thing people seemed to want. But to people who got used to how it already worked, the new thing was bad because it was different.
I could be convinced Windows 8.1 is a mental unwellness simulator. In Sierra’s FMV horror game Phantasmagoria 2, the player character goes insane at work, and this is simulated by the paperwork he’s working on flashing scarier words for a split second. You’re reading this document and then near the bottom of the page an ordinary word like “recommended” turns to “murdered” for a few frames. Win 8.1’s animated tiles reminded me of that. Plus the whole “The desktop, with all normal Windows apps therein, is itself just an app that can be run in split screen next to special phone-like single-tasking apps, which pretty much only we will develop for, and we won’t include desktop versions, so you have to deal with this.” I hate Windows 8.1.
What’s real fun is you can tell when they abandoned work on a project by which drastically different UI it’s encrusted with. The modem dialer looks like Windows XP, the fax program looks like Vista, some things have the flat purple stank of 8, some things have the dark glass look of early 10.
Funnily enough, the main place I worry about resolution is on a desktop computer doing desktop computer stuff. My 1440p ultrawide is kind of decadent for games, but when I’m doing something I just want a bunch of real estate.
Just watching TV or movies…honestly I think I might like lower resolutions more. I’ve got a copy of Master and Commander on “fullscreen” DVD, 480p 4:3. I’d really like it to be 16:9 but I can’t come up with complaints about the video quality. I get immersed in that movie just fine at DVD quality. I’ve got a few films on Blu-Ray, and at 1080p film grain starts being noticeable. And the graininess of the shot changes from scene to scene as the film crew had to use different film stock and camera settings for different lighting conditions, so I spend the whole movie going “That scene looks pretty good, oh that’s grainy as hell, now it’s better.” Lower resolutions and/or bitrates smooth that out, but I think they actively preserve it on Blu-Ray because the data fits on the disc, there’s no internet pipe to push it down, and film grain is “authentic.”
So at 4k, it’s either going to display a level of detail that I’m sitting too far from the screen to notice, it’s going to look even noisier, or it’ll be smeared by compression rather than resolution because of bitrate limitations. So…?
Something I recently learned: it is effectively impossible to legally play 4K Blu-Rays on a modern PC; the licensed playback software requires Intel SGX, which Intel has since removed from its consumer CPUs.


Hey OP: forget it. throw your laptop in a lake and take up acoustic guitar.
If I understand correctly, the 1541 was initially launched for the VIC-20, whereas the Datasette and several Commodore printers that would remain compatible with the VIC-20, C64 and later were PET-era. This is what I think I’ve learned from YouTube; mine was an IBM household since before I was born.
Windows, like DOS and CP/M before it, was designed for a standalone microcomputer that the user had physical access to, so they lettered the drives A, B, and C. That would allow mounting 26 drives, which should be enough for everybody forever.
Linux, like UNIX before it, was designed to run on a minicomputer in a university basement accessed through a dumb terminal where the end user has no physical access to the hardware, so the file system presents as completely abstract.
In the modern paradigm of local PCs attached to network storage, both approaches have their disadvantages.
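A concrete sketch of the two approaches, using hypothetical device and server names: Windows hands each volume its own lettered root, while Linux grafts any volume, local or remote, into a single tree:

```shell
# Linux: every volume appears somewhere under the one root tree.
sudo mount /dev/sdb1 /mnt/backup                  # local disk (hypothetical device)
sudo mount -t nfs fileserver:/export /srv/data    # network share (hypothetical host)
# Both now look like ordinary directories; nothing in the path tells you
# which physical device or which server is actually behind them.

# Windows, for contrast: each volume gets its own lettered root,
# e.g. D: for the local disk, Z: for the mapped network share.
```

The abstraction cuts both ways: the Linux path hides where your data physically lives, while the Windows letter tells you it’s a separate volume but says nothing about where in your workflow it belongs.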
I wonder how UEFI treats it; diskette drives were kind of sacred in the old BIOS days. How modern Windows handles it is anyone’s guess, I’m sure it’s been rewritten by Copilot by now.
I had reason to use an optical drive lately, and even that was a blast from the past. Hitting eject, watching the light blink and then the drawer opens. USB-based storage just isn’t the same.
Hey, that’s a personal question.