Game streaming services are never going to catch on because the capital needed to build out the infrastructure is ridiculous.
I don’t know about “never”, but I’ve made similar arguments on here predicated on the cost of building out the bandwidth. I don’t think we’re likely to get to the point any time soon where computers living in datacenters are a general-purpose replacement for non-mobile gaming, simply because of what it costs to carry the video from datacenter to monitor. Whatever benefit a remote GPU offers doesn’t compare well with the cost of effectively running a monitor-to-computer cable from every concurrently-used machine to the nearest datacenter.
But…I can think of specific cases where they’re competitive.
First, where power is your relevant constraint. If you’re using a cell phone or some other battery-powered device, it’s a way to deal with power limitations. I mean, even if you’re using something like a laptop without wall power, you probably don’t have more than 100 Wh of battery, absent USB-C and an external power station or something, due to airline restrictions on laptop battery size. If you want to be able to play a game for, say, 3 hours, then your power budget (not just for the GPU, but for everything) is something like 30 W. You’re not going to beat that limit unless the restrictions on battery size go away (which…maybe they will, as I understand that there are some more fire-safe battery chemistries out there).
And cell phone battery limits are typically even tighter, like 20 Wh. That means that for three hours of gaming, the phone’s size constraints cap your power budget at maybe 6 or 7 watts.
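Just to put that arithmetic in one place, a trivial back-of-the-envelope (the 3-hour session and the round-number capacities are assumptions, not measurements):

    # Sustainable average draw for a given battery and session length.
    # 100 Wh is the usual airline ceiling for laptop batteries; 20 Wh
    # is a generous phone battery. The 3-hour session is an assumption.

    def power_budget_w(capacity_wh: float, session_hours: float) -> float:
        """Average draw in watts that empties the battery in session_hours."""
        return capacity_wh / session_hours

    for device, capacity_wh in [("laptop", 100.0), ("phone", 20.0)]:
        budget = power_budget_w(capacity_wh, session_hours=3.0)
        print(f"{device}: ~{budget:.1f} W for the whole device")
    # laptop: ~33.3 W, phone: ~6.7 W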
If you want power-intensive rendering on those platforms, remote rendering is your only real option.
Second, there are (and could be more) video game genres where you need dynamically generated images, but where latency isn’t really a constraint. Like, a first-person shooter has real latency constraints: you need each frame back within a tightly bounded amount of time, and you have constraints on how many frames per second you need. But if you were dynamically rendering images for, I don’t know, an otherwise text-based adventure game, then the acceptable time to get a new frame illustrating a given scene might stretch to seconds. That drastically slashes the bandwidth required.
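Rough numbers for how much that buys you; the stream bitrate and frame size here are illustrative guesses, not measurements of any real service:

    # Continuous 60 fps stream vs. one still image per scene change.
    # The 20 Mbps stream and the 500 KB compressed frame are guesses.

    SHOOTER_STREAM_MBPS = 20.0  # assumed bitrate for a 1080p60 stream

    def stills_mbps(frame_kb: float, secs_between_frames: float) -> float:
        """Average Mbps when sending one compressed frame every few seconds."""
        return (frame_kb * 8.0 / 1000.0) / secs_between_frames

    adventure_mbps = stills_mbps(frame_kb=500.0, secs_between_frames=10.0)
    print(f"first-person shooter: {SHOOTER_STREAM_MBPS:.1f} Mbps sustained")
    print(f"illustrated adventure: {adventure_mbps:.2f} Mbps average")
    # roughly a 50x reduction, and the latency budget relaxes too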
What I don’t think is going to happen in the near future is “gaming PCs and non-portable video game consoles get moved to the datacenter”.
I’m confident in my “never” because of the capital economics: the service has to charge enough to pay for the infrastructure it requires plus some profit for the shareholders, while simultaneously being cheap enough to offer gamers a better value proposition than buying their own hardware. There’s no margin between those two limits, so the only markets left to appeal to are niches where local rendering performance is limited but network latency and bandwidth are not. Even then, gamers still have the option of streaming from their own hardware with Moonlight rather than paying for a third-party service, so the only customers left are the ones with more money than sense.
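To make the squeeze concrete, here’s the shape of the math with purely made-up numbers; every figure below is hypothetical, not data from any real provider:

    # The subscription price is squeezed between two bounds:
    #   ceiling = what a gamer's own hardware costs, amortized monthly
    #   floor   = the service's per-subscriber cost plus a profit margin
    # All numbers are hypotheticals chosen only to illustrate the shape.

    OWN_GPU_DOLLARS = 600.0       # hypothetical midrange card
    OWN_GPU_LIFE_MONTHS = 48.0    # hypothetical useful life
    ceiling = OWN_GPU_DOLLARS / OWN_GPU_LIFE_MONTHS           # $12.50/mo

    DC_COST_PER_GPU_MONTH = 80.0  # hypothetical: hardware, power, bandwidth
    SUBSCRIBERS_PER_GPU = 4.0     # hypothetical oversubscription ratio
    PROFIT_MULTIPLIER = 1.3       # hypothetical 30% margin target
    floor = DC_COST_PER_GPU_MONTH / SUBSCRIBERS_PER_GPU * PROFIT_MULTIPLIER

    print(f"ceiling (own hardware): ~${ceiling:.2f}/mo")
    print(f"floor (service break-even): ~${floor:.2f}/mo")
    # If the floor lands above the ceiling, there is no viable price.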
Don’t get me wrong, I love the concept (I even bought an OnLive microconsole back in the day and still regularly use a Steam Link to stream games to the living room TV), but it isn’t nearly convenient or performant enough to justify itself as a subscription service.