

I aim for 10 years with a mid-life upgrade. I even do this with my laptops; my Inspiron got a new battery, a new CPU fan and an SSD for its 6th birthday. It’s 11 now.
My Ryzen 3600 rig is an HTPC now.
Linux gamer, retired aviator, profanity enthusiast




Mostly I have this distinct memory of badly communicating with a Verizon employee when I got my first smartphone, an LG Ally.
I remember asking “Is this a Droid?” Meaning “is the make and model of this handset a Motorola Droid?” And the reply was “They’re all droids.” meaning they all run the Android operating system. I miss LG phones, or at least the state of my personal life back when I had LG phones.


was it Verizon or Motorola?


They keep re-implementing things.
Just the Start menu. You can see how 95 evolved into 98 evolved into ME, then they changed it for XP, and they never stopped making big pointless changes. In many cases, those big pointless changes have lengthened the trip from the bare desktop to the thing you need by adding pointless screens and dialogs. Or, like the Start menu, they just drastically redesigned it, such that a user used to Win XP tries to use 7 and just…stares at it because it’s not what they were expecting. Windows 7’s Start menu might even be objectively better; Microsoft’s software engineers could very well have produced good research on UI design by observing or polling users about what features they wanted, and then built the thing people seemed to want. But to people who were used to how it already worked, the new thing was bad because it’s different.
I could be convinced Windows 8.1 is a mental unwellness simulator. In Sierra’s FMV horror game Phantasmagoria 2, the player character goes insane at work, and this is simulated by the paperwork he’s working on flashing scarier words for a split second. You’re reading this document and then near the bottom of the page an ordinary word like “recommended” turns to “murdered” for a few frames. Win 8.1’s animated tiles reminded me of that. Plus the whole “The desktop, and all normal Windows apps therein, is itself just an app that can be run in split screen next to special phone-like single-tasking apps, which pretty much only we will develop for, and we won’t include desktop versions, so you have to deal with this.” I hate Windows 8.1.
What’s real fun is you can tell when they abandoned work on a project by which drastically different UI it’s encrusted with. The modem dialer looks like Windows XP, the fax program looks like Vista, some things have the flat purple stank of 8, some things have the dark glass look of early 10.
Funnily enough, the main place I worry about resolution is on a desktop computer doing desktop computer stuff. My 1440p ultrawide is kind of decadent for games, but when I’m doing something I just want a bunch of real estate.
Just watching TV or movies…honestly I think I might like lower resolutions more. I’ve got a copy of Master and Commander on “fullscreen” DVD, 480p 4:3. I’d really like it to be 16:9 but I can’t come up with complaints about the video quality. I get immersed in that movie just fine at DVD quality. I’ve got a few films on Blu-Ray, and at 1080p film grain starts being noticeable. And the graininess of the shot changes from scene to scene as the film crew had to use different film stock and camera settings for different lighting conditions, so I spend the whole movie going “That scene looks pretty good, oh that’s grainy as hell, now it’s better.” Lower resolutions and/or bitrates smooth that out, but I think they actively preserve it on Blu-Ray because the data fits on the disc, there’s no internet pipe to push it down, and film grain is “authentic.”
So at 4k, it’s either going to display a level of detail that I’m sitting too far from the screen to notice, it’s going to look even noisier, or it’ll be smeared by compression rather than resolution because of bitrate limitations. So…?
Something I recently learned: it is outright impossible to legally play 4K Blu-rays on PC.


Hey OP: forget it. throw your laptop in a lake and take up acoustic guitar.
If I understand correctly, the 1541 was initially launched for the VIC-20, while the Datasette and several Commodore printers that would remain compatible with the VIC-20, C64 and later were PET-era designs. This is what I think I’ve learned from YouTube; mine was an IBM household since before I was born.
Windows, like DOS and CP/M before it, was designed for a standalone microcomputer that the user had physical access to, so they lettered the drives A, B, and C. That allows mounting 26 drives, which should be enough for everybody forever.
Linux, like UNIX before it, was designed to run on a minicomputer in a university basement accessed through a dumb terminal where the end user has no physical access to the hardware, so the file system presents as completely abstract.
In the modern paradigm of local PCs attached to network storage, both approaches have their disadvantages.
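The contrast between the two paradigms is easy to sketch. This is just an illustration, not real OS code, and the mount table below is made up:

```python
import string

# DOS/Windows paradigm: a fixed alphabet of drive letters.
# A: and B: were reserved for diskettes, C: for the first hard disk.
drive_letters = [f"{letter}:" for letter in string.ascii_uppercase]
print(drive_letters[:3])   # first three: A:, B:, C:
print(len(drive_letters))  # 26 -- the hard ceiling on lettered drives

# UNIX/Linux paradigm: one abstract tree; storage is grafted in at
# arbitrary mount points, so there's no fixed limit, and network
# storage lives in the same namespace as local disks.
mounts = {
    "/": "/dev/sda2",
    "/boot": "/dev/sda1",
    "/mnt/nas": "nas.local:/export/media",  # hypothetical NAS export
}
for point, device in mounts.items():
    print(f"{device} mounted at {point}")
```

The Windows side runs out of letters at Z:; the UNIX side never runs out of paths, but also never tells you which mount point is a slow network share until you try to read from it.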
I wonder how UEFI treats it; diskette drives were kind of sacred in the old BIOS days. How modern Windows handles it is anyone’s guess, I’m sure it’s been rewritten by Copilot by now.
I had reason to use an optical drive lately, and even that was a blast from the past. Hitting eject, watching the light blink and then the drawer opens. USB-based storage just isn’t the same.


Every slicer I’m aware of runs on Linux. I’ve got PrusaSlicer and slic3r installed right now. Cura is on Flathub. Hell, Simplify3D does or did offer a Linux version, though it was one of those janky .run installers where they translate the Windows install process as literally as possible to Linux.
As for modeling software, depends on what kind of modeling. I tend to use FreeCAD, but it’s mostly suitable for engineering and not art.


Ages ago when I still bothered with Octoprint, Cura Engine could be installed as a module, and you could slice an STL on a Raspberry Pi through Octoprint. I quickly gave up on that as a stupid gimmick because you pretty much always need to do adjustments in the plater, but once upon a time Cura could do it.


OP asked for software that runs well on a 10 year old laptop with 16GB of DDR3 and Linux. Saying that I found that Godot runs well on my laptop of similar configuration and vintage absolutely is relevant, you disingenuous troll.
I understand how Python modules work just fine, you install a module with Pip, and it’ll run on your computer and only your computer until your computer gets some update in the future because Python’s module versioning and dependency management are the worst in the business. Python also has a well-deserved reputation as a fast and performant language even running on old and limited systems…oh wait no it’s a sow in treacle. The more you implement in Python the slower it’s going to run. Can you name a commercial game that is implemented in Python, using modules like Pygame? I can’t.
If you’ve got the talent to open up a general purpose programming language and create a video game, use something like C# or Java, something designed for creating performant cross-platform graphical applications. Or, if you’re going to start gluing applications like Tiled and such together, you might as well go with something like Godot because that’s basically what you’re janking together.


I did call Godot lighter than Unity or Unreal, which I believe to be factually accurate. I have run Godot on a 2014 era laptop, it runs well on a system of that vintage.
It is a full featured 2D/3D game engine and development environment, which can be a lot to take in. A lot of what I learned about game development I learned from a Youtube channel called Clear Code, who made the same snake game in both Pygame and Godot.
Python and Pygame do away with the cluttered IDE, and you can build a functioning game in one file; then you translate those concepts to a more full-on game engine, which is going to be a bit more practicable for making larger games with things like tilesets and more complicated physics and collisions and whatnot. I’d hate to try making a Zelda-like game in something like Pygame. Fear the men who made A Link to the Past in 65816 assembly.


I’ll join the chorus recommending Godot. A lot lighter than Unity or Unreal, it’s open source, well documented and quite capable. It’s got a lot of features, in a lot of ways it isn’t “dead simple.”
I might recommend starting off using Python’s Pygame library. Do something like create Flappy Bird in it, that will give you a pretty good idea of how a video game works under the hood, and it’ll run on a potato.
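To give a flavor of what “how a video game works under the hood” means here: the guts of a Flappy Bird clone are mostly one tiny per-frame physics update. Here is a rough pure-Python sketch of just the bird’s vertical motion (all constants are invented for illustration; Pygame would supply the window, input and drawing around this):

```python
# Toy sketch of Flappy Bird's core physics. The constants are made up;
# tune them against your frame rate and sprite sizes.
GRAVITY = 0.5       # downward acceleration added every frame
FLAP_IMPULSE = -8.0  # instant upward kick (screen y grows downward)

def step(y, velocity, flapped):
    """Advance the bird one frame; returns (new_y, new_velocity)."""
    if flapped:
        velocity = FLAP_IMPULSE  # a flap replaces the current velocity
    velocity += GRAVITY          # gravity accumulates every frame
    return y + velocity, velocity

# Simulate a few frames: fall, fall, flap, fall.
y, v = 100.0, 0.0
for frame, flapped in enumerate([False, False, True, False]):
    y, v = step(y, v, flapped)
    print(f"frame {frame}: y={y:.1f} v={v:.1f}")
```

In a real Pygame version, the `flapped` flag would come from the event loop (a key or mouse event) and `y` would feed straight into where you blit the bird sprite; everything else is collision checks and scorekeeping.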
For pixel art you might go with LibreSprite or Pixelorama. These will allow you to create tile sets for backgrounds as well as character sprites.
If you’re looking to get into 3D art, you’ve basically got to go with Blender.


So there hasn’t been any RAM manufactured in Europe in nearly 20 years? Is that the point you’re making?


This, I think, is related to where I draw the line between a ‘moon’ and a ‘satellite.’ If an average human can hit escape velocity under muscle power, it’s a satellite. If you can strand someone there, it’s a moon.


The US doesn’t have an official language, it also doesn’t have an official national casserole.
English proficiency is required by law in a lot of places for various things; you’ll find it in just about every subpart of FAR parts 61 and 65, for example.
Amazon sold me a defective planer that had sawdust in it. I was apparently the second to return it under warranty.