When the analog signal enters a CRT, only nanoseconds pass before that signal lights up the phosphor on the other side. That is different from how a typical digital display works (it buffers incoming frames before showing them), and that is where the near-zero input lag of a CRT comes from.
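To put rough numbers on the difference, here is a back-of-the-envelope sketch in C, assuming a 60 Hz signal and a display that buffers exactly one full frame (real displays vary in how much they buffer):

```c
#include <stdio.h>

int main(void) {
    /* A digital display that buffers one full frame before showing it
     * adds a whole frame period of latency on top of the signal path. */
    double refresh_hz = 60.0;
    double frame_ms = 1000.0 / refresh_hz;  /* ~16.7 ms per buffered frame */

    printf("one buffered frame at %.0f Hz: %.1f ms\n", refresh_hz, frame_ms);
    printf("analog CRT path: ~nanoseconds (signal propagation only)\n");
    return 0;
}
```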

> presumably you have some sort of digital to analog converter box

Yes, the CRT will display the output of that DAC in real time. There is no additional buffering or latency after that conversion.

The Atari 2600's video hardware had no framebuffer: the code running on the CPU was racing the electron beam, updating the graphics registers mid-scanline in real time while the beam was drawing that very line on the display. That is about as raw and lag-free as it gets (see the sketch below). Other 8-bit and 16-bit consoles likewise had no full framebuffer; their video chips generated each line on the fly from tile and sprite data.
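A minimal sketch of the idea, written as a host-side C model rather than real 6502 code (the register names COLUBK and WSYNC mirror the real TIA chip, but the structure here is illustrative, not cycle-accurate):

```c
#include <stdint.h>

#define VISIBLE_SCANLINES 192  /* NTSC 2600: roughly 192 visible lines */

/* Toy model of the TIA's state: the real chip has no frame memory,
 * only a handful of registers describing the *current* scanline. */
typedef struct {
    uint8_t colubk;  /* background color register (TIA COLUBK) */
} Tia;

/* On real hardware, writing to WSYNC halts the CPU until the beam
 * reaches the start of the next scanline; here it is just a stub. */
static void wsync(void) { /* wait for horizontal sync */ }

/* "Racing the beam": the CPU rewrites the TIA registers once per line
 * (or even mid-line) while the beam is drawing, so the whole frame is
 * generated in real time, with no framebuffer and no buffering lag. */
void draw_frame(Tia *tia, const uint8_t line_colors[VISIBLE_SCANLINES]) {
    for (int line = 0; line < VISIBLE_SCANLINES; line++) {
        tia->colubk = line_colors[line];  /* set graphics for this line */
        wsync();                          /* beam finishes drawing it   */
    }
}

int main(void) {
    Tia tia = {0};
    uint8_t colors[VISIBLE_SCANLINES];
    for (int i = 0; i < VISIBLE_SCANLINES; i++)
        colors[i] = (uint8_t)i;  /* simple per-line color gradient */
    draw_frame(&tia, colors);
    return 0;
}
```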

Another commenter here has noted that, in theory, you could do the same with digital displays. In practice, they buffer the input and add latency.