not when your computer goes to the cloud.
When’s the last time you tried to emulate something? The Wii and PSP can handle up to N64 decently, and that hardware is going on 20 years old.
Depending on your phone you can emulate PS2 and even Switch games.
Just hold on to whatever you have, even phones, and emulator devs will figure out the rest. Accuracy has been king while hardware has been cheap and plentiful, but there’s no reason we can’t go back to per-game speed hacks, like old SNES emulators used, when hardware gets scarce.
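For context, by “speed hacks” I mean things like frame skipping or idle-loop skipping tuned per game. Here’s a minimal sketch of the frame-skip idea (all names hypothetical, not from any real emulator): run the game logic every frame, but only pay the rendering cost for a fraction of them.

```python
# Minimal frame-skip sketch: emulation logic runs every frame so the game
# stays correct, but rendering (usually the expensive part on weak
# hardware) only happens for one out of every `skip` frames.
def run_with_frameskip(emulate_frame, render_frame, total_frames, skip=3):
    """Emulate `total_frames` frames, rendering only every `skip`-th one.

    Returns the number of frames actually rendered.
    """
    rendered = 0
    for frame in range(total_frames):
        state = emulate_frame(frame)   # game logic must run every frame
        if frame % skip == 0:          # but only render a fraction of them
            render_frame(state)
            rendered += 1
    return rendered
```

With `skip=3`, one second of a 60 fps game means 60 frames of logic but only 20 rendered frames, which is why these hacks had to be tuned per game: some games tolerate dropped frames fine, others glitch or become unplayable.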
my main worry is that these things aren’t built to last. especially phones.
yeah, if they switched everyone to the cloud right now, i’d get maybe a decade out of my computer, a bit more if i can fix it, and then what?
consoles tend to last longer but they don’t do general computing, which is an important thing we’d be losing in the process.
Phones degrade through daily use as the charging port wears out or they get dropped and damaged. Beyond that, software updates assume more resources because they’re tuned to new models that have them. If new models don’t, we’ll see that change.
The only parts with a real use-by date are batteries, but you can run the device plugged in. Capacitors too, technically, but we’re long past the days of the capacitor plague; most should hold up for decades, as far as I’m aware.
As far as what we do a decade down the line?
I’d be shocked if business trends don’t shift back to owned hardware, which will in turn revitalize the consumer market. They’ll take functional parts that don’t pass QC for business use and rebrand them for consumers. Manufacturers have been doing that with tons of hardware for ages: RAM chips that don’t hit the clock speed for the premium product get binned together into a lower-grade, cheaper product instead of being trashed entirely.
In a lot of ways this is just another go-around of mainframes-and-terminals vs. personal computers. We had the cloud vs. on-prem version of that, and by now various companies have landed on either side based on their needs. Cloud is fairly stable, and we’re at the point where most companies can evaluate the pros and cons in a more clear-headed manner. Most are discovering that it makes sense to keep some things on-prem.
So now we get another go-around due to hardware scarcity from AI hype. We’re already seeing news stories about the importance of having actual metrics to judge the success of “AI-enhanced” 🤮 workflows. Companies can stay irrational longer than we’d like, but they can’t do it indefinitely.
Enough companies have enough legitimate use cases (and cash to burn) that on-premises hardware isn’t going to just die. Eventually that will trickle back down to consumers.
And if it somehow doesn’t, people will continue to figure out how to keep older stuff running. There have always been specialists doing it; now there will be more, due to more demand.
I’m not really seeing a lack of compute power in citizen hands that can’t be overcome with more resource-aware programming techniques, which will also come back into vogue if they have to.
There’s no point (financially or open-source wise) making software no one can run. People will adapt.
I’m not trying to say that it won’t suck, or that we aren’t likely to see some big changes coming. I just can’t imagine a future in which we don’t all adapt and keep moving forward.
The only way everything goes to unsalvageable shit is if people wholesale stop trying, and that’s not something I’ve ever seen people do as some mass homogeneous group in my life.