Having spent the bulk of my handheld gaming time with the Steam Deck, it was a bit of a shock last year to discover that PC gaming isn’t just possible on Android phones and retro handhelds; it’s coming on in leaps and bounds.
I’ve seen so many different games running beautifully, from older AAA titles like Tomb Raider and Prey (2017), all the way to more demanding ones like RDR2 and even Cyberpunk 2077 (no surprise that the last one is still an imperfect experience, as things stand…but it is possible!).
GameNative lets you play all manner of PC games on Android from GOG, Epic, and Steam.
I reached out to my friend Utkarsh, the lead developer of GameNative, to ask if he wanted to share his story and sit down for an interview.
We cover his background in development and gaming, how GameNative started and how it’s built, and what the future might bring for the project. This is an interview on what I think might be at least part of the future of handheld gaming, and I hope you find it interesting:
https://gardinerbryant.com/i-genuinely-feel-gamenative-could-replace-handheld-pcs/


This claim is a ridiculous overreach. There’s only so much computing power you can fit in a small space due to heat dissipation. You can’t beat thermodynamics. You can get a lot of games to run on lower end systems, but only if you’re willing to make a ton of compromises.
In no way are you going to be running something like Cyberpunk at 4k 60fps on a phone within the next 10 years. That’s what the “expensive, bulky gaming PCs” are for.
And I don’t get why they’re painting a target on the back of high end gaming hardware or even the Steam Deck. There’s another target that would be more beneficial to society to take out: consoles, particularly their locked-in ecosystems. Democratize gaming.
I don’t think the greater power of larger devices is being questioned. There just happens to be a threshold where a technically inferior but more accessible solution becomes “good enough” for most people that they never consider moving up.
Just look at mobile devices. Of everyone who accesses the internet, 75% do so via smartphone only. As someone who doesn’t even like desktops losing ground to laptops, that statistic scares me.
We know there’s a growing number of people who use their phone as their primary and only computing device. And the success of the Steam Deck is proof that a “good enough” experience can attract an audience. It’s also likely that Valve is planning for a future where the Steam Android app will be capable of installing and playing games locally, without the 30% Google tax.
None of that will change the fact that gaming will always push technology forward with the need for faster CPUs and GPUs, and that will never be the domain of phones, where efficiency is king. There is no reason to worry.
There’s also some people moving in the other direction, and I wouldn’t be surprised if that grows. My parents only had their smartphones for years, but recently had me pick out a laptop for them because trying to use their phones for everything was a headache.
I think one thing to consider is that cost of living has been going up in the US with wages not keeping up. So budgets are getting tighter, and if you can only afford a single device to buy, you’re going to buy the phone, even if a PC makes a lot of things significantly easier.
Tbh, I think we’ve reached a point of diminishing returns on video game graphics. Do we really need games to be any more photorealistic and power hungry than they are now?
That being said, I don’t think Android phones are going to usurp this domain any time soon. Power requirements for 4k 60fps are way too high, and mobile devices simply can’t dissipate enough heat to handle it unless there are enormous bumps in efficiency. And advancements in chip design have seriously slowed down the past few years.
I definitely think graphical fidelity is “good enough” now, but there’s still quite a bit of room for advancement in other areas that draw on the CPU and GPU, VR and local AI being a couple. I’ve been all in on VR since the Vive, and while I reject corporate AI as much as most people here, I do run local models occasionally and would like to have NPCs using the tech.
Need? No. Want? Absolutely.
There are two interesting articles that have shaped my view on this:
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
https://www.microsoft.com/en-us/research/wp-content/uploads/2018/02/perfectillusion.pdf
I’m not hung up on who is right about 1000Hz vs 1800Hz, only that >=1000Hz at >=1000fps is the goal. We’re a long way away from that when the best gaming CPU can only manage ~600fps in CS2 at 1080p.
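For anyone who doesn’t want to read the whole Blur Busters piece, the core idea as I understand it is that on a sample-and-hold display each frame is held on screen for the full refresh period, and motion blur scales with that hold time. A rough sketch of that relationship (my reading of their rule of thumb, not an exact formula from the article):

```python
# Approximate sample-and-hold motion blur, per the Blur Busters rule of thumb:
# blur (in pixels) ~= frame hold time * on-screen motion speed.
def blur_px(refresh_hz: float, motion_px_per_s: float) -> float:
    persistence_s = 1.0 / refresh_hz  # full persistence: frame held for the whole refresh
    return persistence_s * motion_px_per_s

for hz in (60, 360, 1000):
    print(f"{hz:>4}Hz: ~{blur_px(hz, 1000):.1f}px of blur at 1000 px/s motion")
# -> 60Hz ~16.7px, 360Hz ~2.8px, 1000Hz ~1.0px
```

That’s why the target is ~1000Hz rather than some incremental bump: it’s the point where blur drops to about a pixel at typical motion speeds.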
One of the Digital Foundry guys got hands-on time with a prototype monitor at CES and played a game at >500fps, and while he couldn’t really convey what it was like, it was clear that the experience was very different from playing on even 360Hz displays.
We’re at least 2-3 hardware generations away from being able to push >1000fps with relatively simple games and much further away for AAA games. I think it’s something worth looking forward to.
No, people aren’t going to want 1000+fps in games. As someone else pointed out in the thread, 4k 60fps is <5% of builds in Steam hardware surveys. Pushing framerates even higher just adds more and more cost, with diminishing returns.
If you could build a system that goes from 500fps to 1000fps, you’re theoretically reducing latency by 1ms (the perceived gain will most certainly be smaller). But how much more expensive is the 1000fps build? Based on tech trends, probably a lot more, since architectural improvements in chips have slowed down over the past few years. Right now, Nvidia is just pushing more and more power into its cards to get more performance, because efficiency has plateaued.
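To put numbers on that 1ms claim (a quick sketch of the frametime math only; real end-to-end latency also includes input sampling, render queuing, and display scanout, which is why the actual gain is smaller):

```python
# Frame delivery time in milliseconds at a given framerate.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

# Doubling 500fps to 1000fps halves the frametime: 2.0ms -> 1.0ms.
saved = frametime_ms(500) - frametime_ms(1000)
print(f"Frametime saved: {saved:.1f}ms per frame")  # 1.0ms
```

And each further doubling saves half as much again (1000 to 2000fps only buys 0.5ms), which is the diminishing-returns problem in a nutshell.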
Add to that, the human eye only sees up to 500fps in ideal conditions. Why would you pay a bunch of money for extra frames that you physically can’t see?
I’m not worried about the tech going away so much as the market percentage dropping to make enthusiast hardware more niche. Among other things, it makes enshittification in the space harder to fight.
Fair, but I don’t think that threshold will be passed by smartphones for at minimum a decade. If you want 4k resolution for games, you need 10-12GB of VRAM minimum. Claiming that high-end PCs will become pointless in 5 years suggests that the interviewee thinks mobile chips will surpass those requirements.
Except even among most current PC gamers, the threshold isn’t that high. 4K is still less than 5% of the market.
Also, I’d argue “anachronism” isn’t the same as “pointless.” It’s just claiming that something that was once more common will become less common.
Eh… “Anachronism” more suggests that they’ll be considered “out of place”. But that’s me nitpicking
Well, anachronism most literally means “misplaced in time.” You can go two directions with that, something being more at home in the future or more at home in the past. The former obviously doesn’t apply here, and I would consider my wording identical to the latter. A reduction in belonging implies a reduction in commonness.
You don’t need 4k on a phone though, might make it more doable.
Way to entirely miss my point
Cyberpunk came out in 2020. Are there games from 2010 that you would be surprised to see running at full speed on a high-end smartphone?
Trying to push the narrative to focus on 2010 games feels a bit like moving the goalposts, but I’ll bite.
Trying to run anything at 4k 60fps native is still challenging for a lot of systems today, even with older titles. Anything with high fidelity like The Last of Us would be a problem.
Plus, anything with a lot of characters on screen at the same time would likely be a struggle. I’ve done 4-person couch co-op of CoD: Black Ops Zombies on Xbox 360 (the system it was designed for), and it got choppy due to the number of zombies and perspectives the CPU had to handle. Open world games could end up in a similar situation.
Then you get games that usually end up heavily modded, like Skyrim and Fallout: New Vegas, that would likely be trouble from the start, and modern graphics mods still require fairly powerful systems to handle well.
Why? If we want to know what high-end phones ten years from now could do with a six-year-old game, isn’t the comparable question what today’s high-end phones can do with sixteen-year-old games?
Moore’s Law was always a marketing gimmick, but the progression of information technology has been such a steady cycle of “next year’s model will be even better” that it strikes me as a good starting point.
Advances in computation have slowed significantly over the past few years. Moore’s Law is generally considered to have been dead for the last decade. There’s a reason Nvidia has kept raising the power requirements on its top-end cards for the past two generations: they’re running out of potential optimizations, and the main route to higher compute is now to throw tons of power at it.
A better way to look at it is the Steam Deck. It only works because the TDP is 15W. If you wanted to make it more powerful, you’d need to figure out how to dissipate the extra thermal load. If instead you tried switching to ARM for increased efficiency, the extra layers of translation and emulation would put you about back where you started, meaning you’d still need to throw more power at it to get more performance.
Metro 2033