Remote computing almost never makes sense. Budgeting for continued access inevitably costs enough to buy something local - less powerful, but powerful enough. One year university supercomputers could run multiplayer first-person dungeon crawlers. The next year, so could an Apple II. (Christ, $1300 at launch? It did not do much more than the $600 TRS-80 and C64. The Apple I was only $666. Meanwhile a $150 Atari was better at action titles anyway.)
When networks advance faster than computing, there are glimpses of viability. Maybe there was a brief window where machines that struggled with Doom could have streamed Quake over dial-up… at 28.8 kbps… in RealPlayer quality… while paying by the minute for the phone call. Or maybe your first cable modem could have delivered Far Cry in standard-def MPEG-2, right between Halo 2 and the $300 launch of the 360, while Half-Life 2 ran on any damn thing.
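To see why those windows were so marginal, here's a back-of-envelope sketch of the bit budget each link actually gives a video codec. All the figures (window sizes, frame rates, the ~3 Mbps early cable speed) are illustrative assumptions, not numbers from anywhere in particular:

```python
# Back-of-envelope: how many bits per pixel does a link leave the codec?
# All link speeds and resolutions below are illustrative assumptions.

def bits_per_pixel(link_bps: float, width: int, height: int, fps: int) -> float:
    """Raw per-pixel bit budget if the whole link carries the stream."""
    return link_bps / (width * height * fps)

# 28.8 kbps dial-up, tiny RealPlayer-style window at 10 fps
dialup = bits_per_pixel(28_800, 160, 120, 10)
print(f"dial-up budget: {dialup:.2f} bits/pixel")  # ~0.15

# Early cable modem (~3 Mbps assumed) vs. SD at 30 fps;
# broadcast-quality MPEG-2 typically wants several Mbps on its own
cable = bits_per_pixel(3_000_000, 720, 480, 30)
print(f"cable budget:   {cable:.2f} bits/pixel")  # ~0.29
```

Either way the codec gets a fraction of a bit per pixel before you've spent anything on input latency or packet loss, which is roughly why both windows stayed hypothetical.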
Nowadays your phone runs Unreal 5 games. What else were you gonna stream games on? If you have a desktop, it’s probably for gaming. Set-top boxes keep Ouya-ing themselves, trying to become “mini-consoles” that cost too much, run poorly, and stop getting updates. Minimalist laptops like Chromebooks find themselves abandoned, even though the entire fucking pitch was an everlasting dumb terminal for the internet. The only place cloud gaming almost works is on laptops, and really only work laptops, because otherwise: buy a Steam Deck. You’re better off carrying a keyboard for normal desk use than a controller for gaming on the subway.
Remote computing makes sense from an environmental perspective. There would be a drastic reduction in e-waste if people were using zero clients instead of desktops.
Cloud gaming isn’t real.
Maybe in theory, but in practice, Chromebooks.