There were literally huge G-Sync logos on the boxes of the last three TVs I helped people buy. When you plug in a game console and press the settings button while my current display is in game mode, it pops up a large HUD element that says “VRR” and shows the type of VRR currently active plus the current framerate. Every other option and metric is hidden away in a sub-menu.
Not that this matters, because the point of VRR is that you don’t need to know it’s there. If it’s working, the drivers and the display should talk to each other transparently. The end result, if you have a Windows machine with VRR and a Linux machine that doesn’t support it and plug them both into the same display, is, again, that the Windows game will look smoother, regardless of how many fps it’s spitting out.
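(For what it’s worth, on Linux you can at least ask the kernel whether it thinks the display side of that handshake is possible. Rough sketch below, assuming a reasonably recent kernel that exposes a vrr_capable attribute on DRM connectors in sysfs; that attribute and path are assumptions, not a given on every kernel/driver combo, and older setups would need to query the DRM connector property through libdrm instead.)

    #!/usr/bin/env python3
    # Hedged sketch: list connected DRM connectors and, where the kernel
    # exposes it, whether they advertise VRR. The 'vrr_capable' sysfs
    # attribute is an assumption (newer kernels only); if it's absent,
    # query the DRM connector property via libdrm instead.
    from pathlib import Path

    for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
        status = conn / "status"
        if not status.exists() or status.read_text().strip() != "connected":
            continue  # skip disconnected or non-connector entries
        vrr = conn / "vrr_capable"
        if vrr.exists():
            print(f"{conn.name}: vrr_capable = {vrr.read_text().strip()}")
        else:
            print(f"{conn.name}: no vrr_capable attribute exposed here")

If that prints vrr_capable = 1 for your display and games still judder, the problem sits somewhere above the kernel, in the driver or compositor.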
And as always, a reminder I’ve given many, many, many times in my life, both personally and professionally: “it works on my machine” means nothing. It doesn’t mean there’s no bug, or that your code isn’t crap. Your anecdotal experience and mine aren’t interchangeable; if I have a showstopper bug and your seven friends don’t, there is still a showstopper bug.
Yes, that is a blind spot of my own making. I go into a store, I see G-Sync is Nvidia, and I assume it won’t work. I have been avoiding stuff that I know doesn’t work, or suspect won’t within the decade, for decades. I’ve been recommending that friends and family avoid certain specific brands and tech buzzwords on the basis that they probably won’t work in a few years, when the maker decides to drop support for version 1 and similar scenarios. Or the ‘surprise’ real-life case of Windows really crossing the line on how shitty it can get away with being, making people want to switch and come to me asking whether this or that Linux distro would work for them.