That’s… not really true, and not what that link shows. Those latency tests still show then-modern devices topping the list. They’re arguing that some then-modern low-end devices have more button-to-screen latency than older hardware (which they would, given he’s comparing single-threaded, single-tasking bare-metal machines from the 80s spitting signals straight out to a CRT against laptops with integrated graphics). And they’re saying that at the time (I presume the post dates from 2017, when the testing ends), this wasn’t well understood because people were benching the hardware and not the end-to-end latency factoring in the I/O… which was kinda true at the time but absolutely not anymore.
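To make the component-bench vs. end-to-end distinction concrete, here’s a toy sketch. Every stage name and number below is made up for illustration (not measurements from the post or from real hardware); the point is just that button-to-photon latency is the sum of every hop in the chain, so benching one fast component in isolation tells you little:

```python
# Toy illustration: end-to-end ("button-to-photon") latency is additive.
# All stage names and millisecond values are invented for illustration.

# Apple II-style path: CPU scans the keyboard and writes straight to
# memory that the CRT beam is continuously scanning out. Few hops.
apple_ii_stages_ms = {
    "keyboard scan": 8,
    "CPU writes to framebuffer": 1,
    "CRT scanout (worst case, one 60 Hz field)": 17,
}

# Generic 2010s laptop path: every extra hop adds buffering.
laptop_stages_ms = {
    "keyboard matrix scan + debounce": 10,
    "USB polling": 8,
    "OS input stack": 5,
    "compositor / application": 16,
    "GPU render + queued frames": 16,
    "LCD processing + pixel response": 15,
}

def end_to_end(stages: dict[str, int]) -> int:
    """End-to-end latency is the sum of every stage, not the best one."""
    return sum(stages.values())

print(end_to_end(apple_ii_stages_ms))  # sums the short pipeline
print(end_to_end(laptop_stages_ms))    # sums the long pipeline
```

With these made-up numbers the long pipeline loses despite having a vastly faster CPU and GPU, which is the whole shape of the original argument; swap in a 240Hz low-latency panel and 1000Hz-polling peripherals and the modern stack wins easily.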
I’d get in the weeds about how much (or how little) sense it makes to compare an Apple II drawing text on a CRT to typing in a PowerShell/Linux terminal window inside a desktop environment, but that’d be kind of unfair. Ten years ago this wasn’t a terrible observation to make with the limited tools the guy had available, and this sort of post made it popular to think about latency and made manufacturers of controllers, monitors and GPUs focus on it more.
What it does not show, though, is that an Apple II was faster than a modern gaming PC by any metric. Not in 2017, and sure as hell not in 2026, when 240Hz monitors are popular, 120Hz TVs are industry-standard, VRR is widely supported, and keyboard, controller, monitor and GPU manufacturers are obsessed with latency measurements. It’s not just fallacious, it’s wrong.