I am running a Ryzen 5 3600 (not a 3600X), and a GTX 1660 Super.
It still gets the job done. And when it doesn’t, it’s in games that use DLSS as a crutch instead of being properly optimized. And fuck those games.
Built a new home server last year for literally everything.
Ryzen 5800XT.
It’s fine. Does the thing. Fuck off.
I love old hardware. It took me eleven years to finally decommission a top-tier CPU I got in 2015. And that’s only because I got an extra CPU and motherboard from a work upgrade.
Old hardware can be fine for sure. I’ve been thinking about replacing my 9 year old laptop with a newer used Thinkpad, but I’m probably going to buy a racing wheel instead. I think I’ll end up using that more.
Old hardware = better drivers
Depends how old. I have a Phenom system with an iGPU and an audio chip that went unsupported for a few years. Then after a few cycles of updates, it became supported again.
Same with the GPU of an old laptop with an Optimus setup. At some point nothing worked correctly, but then new nouveau (huh) modules came out and this old hardware could suddenly work much better than before.
Apparently I have a lot of hardware that goes through a phase of being unsupported in Linux for a while, to working better than ever before.
Oh fuck yeah, especially for laptops. I was pleasantly surprised when all of the hardware on my new to me 13 year old laptop just worked out of the box with Debian 13. I was expecting to have to fix something.
I feel like I occupy all places on this bell curve. The main rig gets retired into a server, which I usually overbuild so its later years are fine as a server. Then it becomes the old stuff. For tinkering around on projects, old stuff is preferred until its usefulness is proven, then it gets an upgrade. The old potato goes behind the TV until the steam machine comes out.
I’m still running a 1950X in my desktop. It works fine for most stuff, but the gaming performance is a bit low now with all of the Spectre mitigations. I was planning on upgrading, but that will have to wait a few more years now.
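If you’re curious how much of that is the mitigations, Linux exposes their status in sysfs. A quick sketch (standard kernel interface, paths may vary slightly by kernel version):

```shell
# List which CPU vulnerability mitigations are active on this machine
grep . /sys/devices/system/cpu/vulnerabilities/*

# To measure the performance cost, you can boot with the kernel
# parameter below to disable them all -- at your own risk, obviously:
#   mitigations=off
```

Benchmarking the same game with and without `mitigations=off` is the only way to know what they actually cost on a given first-gen Threadripper.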
I ran my previous laptop from 2012 until it died last year. I used it as my electronics workbench PC for several years because the battery wouldn’t hold a charge, but it still worked fine for running KiCad and programming microcontrollers. I replaced it with a used ThinkCentre that I got dirt cheap on eBay just before the RAM prices went crazy.
I’m using a 7700K from like 2017. I got me a 3070 and 32GB of RAM and I can still play 4K @ 60fps on games like Horizon Zero Dawn, Dead Island 2, Star Wars Outlaws (great game), and Cyberpunk (yes, fine with DLSS).
And older stuff like FO4, Battletech, HL2, Black Mesa, Fallen Order.
Would I like something more powerful? Sure. Cyberpunk natively rendered would be cool. But it won’t really change my day to day. For that I’d need to spend way more on a fancy monitor, which seems kinda moot when I’m already at 4K @ 60Hz.
This is so true! lol
At least in my social circle, the ones who use older hardware are either the ones who just do very basic tasks with them, or the ones with advanced tech skills. The average users tend to be so consumerist, expecting that better hardware will compensate for their lack of skills…
At one point I thought like the middle guy, but I didn’t have the money to do it so my computers were made from whatever parts I could scrounge up from dumpster diving, school auctions, or whatever. I was building or upgrading my computer every six months or so with what I found.
Once I had the money to buy basically whatever computer I wanted, I would build a high end machine and then not bother to upgrade it until I had a friend who needed one. I pass my old one on to them and the cycle repeats.
My laptops are still random auction finds. My current one came as part of a pair for $40. I popped a new SSD and battery into one and have ignored the other. I should see if Haiku supports it well enough to be worth daily driving.