It just kinda makes no sense to me. How can you improve the framerate by predicting how the next frame should be rendered while reducing the overhead and not increasing it more than what it already takes to render the scene normally? Like even the simplistic concept of it sounds like pure magic. And yet… It’s real.


That is not strictly true; the added latency is roughly half a frame time at the original frame rate, not a full frame. That's because the input is not just the frame image but also the motion vectors (which direction each pixel moved) for the current frame. Frame gen also knows a lot about the image, like which bits have transparent pixels (which move in multiple directions at once) and when the game is done with the frame but still has to wait for the GPU (time which can be used for more work with little impact).
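The latency math above can be sketched roughly like this (illustrative only, not any vendor's exact pipeline; the function name is made up):

```python
# Rough latency math for frame interpolation.
# An interpolator has to hold back the newest rendered frame until the
# following one arrives, so on average the image on screen is about half
# a rendered-frame-time older than it would be without frame gen.

def added_latency_ms(render_fps: float) -> float:
    """Average extra display latency from interpolating between two real frames."""
    frame_time_ms = 1000.0 / render_fps
    return frame_time_ms / 2  # half a frame time at the *render* rate

print(added_latency_ms(60))  # ~8.3 ms extra on average at a 60 FPS render rate
```

Note the penalty shrinks as the real render rate rises, which is why frame gen feels worse when the base frame rate is already low.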
Frame gen has much better inputs than the old "motion smoothing" of televisions, the so-called "soap opera" mode, which increased latency far more and had no knowledge of how the source image was built, so it had to estimate all the motion from the pixels alone.
Stuff like DLSS5 is supposed to use the same inputs (source images and motion vectors); now that is magic to me.
If you only use the source image and motion vectors as input, so you're essentially predicting instead of interpolating, surely that introduces some stuttering now and then when the prediction has to be corrected? Or am I misunderstanding it?
They don’t ever do more than 4 predicted frames per 1 full frame, and usually just 1:1
That and the game can flag frames that are too different (camera cuts) to mitigate this problem.
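The camera-cut flag mentioned above could look something like this (a hypothetical sketch; the function and flag names are made up, not a real API):

```python
# Hypothetical camera-cut handling for frame gen.
# Motion vectors across a hard cut are meaningless, so when the game flags
# a cut, the generator skips the in-between frame instead of producing a
# smeared blend of two unrelated images.

def frames_to_show(prev_frame, next_frame, is_camera_cut, interpolate):
    """Return the frames to present for one real-frame interval."""
    if is_camera_cut:
        return [next_frame]  # no generated frame across a cut
    return [interpolate(prev_frame, next_frame), next_frame]
```

The trade-off is a single un-smoothed interval at each cut, which is far less noticeable than a garbage interpolated frame.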
What the game supplies is the current frame + motion vectors, but the frame-gen bits take over how the frames are displayed onscreen. This is where the extra latency comes from: at worst you are seeing one true frame behind what the game is rendering, while the presentation layer generates the intermediate frame(s).
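The presentation schedule described above can be sketched as follows (toy numbers and a made-up function, just to show the ordering, assuming 2x interpolation with one generated frame at each midpoint):

```python
# Toy presentation schedule for 2x frame generation.
# The game renders real frames at the given timestamps; the presentation
# layer interleaves a generated midpoint between each consecutive pair,
# so the screen always trails the newest rendered frame by up to one
# real frame interval.

def present_schedule(real_frame_times_ms):
    """Interleave generated midpoints between consecutive real frame timestamps."""
    shown = []
    for a, b in zip(real_frame_times_ms, real_frame_times_ms[1:]):
        shown.append(("real", a))
        shown.append(("generated", (a + b) / 2))
    shown.append(("real", real_frame_times_ms[-1]))
    return shown

print(present_schedule([0.0, 10.0, 20.0]))
# [('real', 0.0), ('generated', 5.0), ('real', 10.0), ('generated', 15.0), ('real', 20.0)]
```

Notice the generated frame at 5.0 ms can't be shown until the real frame at 10.0 ms exists, which is exactly the held-back-frame latency being discussed.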
Yeah, I simplified it to keep it at ELI5 level, but you’re right
It absolutely does increase latency though. If I’ve got the option for steady frame rates without frame gen, I’ll take it over frame gen. Frame gen was just about mandatory for Borderlands 4 at launch, and it gave me a convincing 80 FPS. After a performance patch, the game can get 60 FPS on my machine for real with a few of the settings knocked down, and it feels so much better.
Isn’t that the thing NVIDIA was found to be lying about?
Yeah, devs apparently saw it didn't use internal engine data much at all.
That was about the Yassify filter.