Yeah, usually the issue I’ve seen is when drivers are optimized for specific games to show off hardware performance, and then another application or game tries to emulate that game to take advantage of those optimizations. But then the optimizations change and the application starts creating conflicts and errors, just as an example. If drivers followed more open standards and optimized to those, instead of trying to squeeze out a few more FPS with proprietary tricks to market incremental upgrades, it would probably solve some of those issues. Otherwise, unless applications keep up with reverse engineering the proprietary stuff, relying on it ends up binding the application to a specific driver version and/or hardware. There’s some value in optimizing specific hardware for specific software; it’s how Macs and iPhones and such have always been successful. But outside of controlled ecosystems like Apple’s, not having reliable middle layers is a big burden for app developers.
But not long ago, even just desktop environments like KDE and GNOME were super unstable on NVIDIA drivers. That seems to be mostly a thing of the past, but older hardware does still have some of those issues, and a lot of Linux users came to Linux specifically for its support of older hardware. So there’s still some bad reputation out there, beyond what’s realistic to expect from a market driven by today’s profit rather than by keeping existing customers happy or earning future profit.