
CRT monitors had exceptional "motion clarity", because they worked more like a fast strobe light than like current flat screens, which display a sequence of frozen frames, each held for a fraction of a second. The former is apparently better at tricking our eyes into perceiving fluid motion. OLED monitors could theoretically emulate that, to some degree, with black frame insertion. But manufacturers are reluctant to implement it, perhaps because it causes wear. It also makes the screen dimmer.
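To make the idea concrete, here's a toy sketch (hypothetical function, not any vendor's implementation) of what black frame insertion does to a frame stream: each real frame is followed by an all-black one, doubling the output rate, and since each pixel is lit only half the time, average brightness is halved, which is why BFI dims the picture.

```python
# Toy sketch of black frame insertion (BFI). Frames are just lists of
# pixel luminance values; a real panel controller works on hardware
# timing, but the brightness tradeoff is the same.

def insert_black_frames(frames):
    """Interleave a black frame after every real frame.

    A 60 fps input becomes a 120 fps output in which the panel is
    dark half of the time, so mean luminance drops by half.
    """
    out = []
    for frame in frames:
        black = [0] * len(frame)
        out.append(frame)   # real frame: panel lit
        out.append(black)   # inserted black frame: panel dark
    return out

frames = [[200, 200], [100, 100]]
bfi = insert_black_frames(frames)

avg_in = sum(sum(f) for f in frames) / (len(frames) * 2)
avg_out = sum(sum(f) for f in bfi) / (len(bfi) * 2)
assert avg_out == avg_in / 2  # brightness halved
```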

CRTs also had no native resolution. The electron beam scan could instead be adjusted to the signal, so the display changed resolution physically, which made fuzzy interpolation unnecessary.

For a long time, they also had much higher refresh rates and better contrast than LCDs, though this has been matched in recent times.




I had a 21" Sony Trinitron as a computer monitor through the 2000s. 100 Hz at 1600x1200, it bested everything I bought after it for years. Until something in it popped, anyway.

CRTs didn't have a native resolution, but colour ones did have either shadow masks or aperture grilles with individual RGB subpixels, just like a flat panel. There was still interpolation in a sense, it was just done physically instead of by transforming a framebuffer. If you made a CRT electron gun accurate enough that it could reliably address those subpixels, it would then have a "native" resolution.

e: Also, a heads up that VR headsets strobe by default, it's part of what makes them work at all. Even LCD models, where they strobe the backlight.


Interesting, I wonder why manufacturers don't offer LCD backlight strobing as an option in TVs or monitors. It wouldn't cause wear like in organic displays. Perhaps the backlight doesn't get bright enough for that? I assume it should at least be possible with HDR TVs, as they have quite powerful backlights.
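The brightness question is just duty cycle arithmetic: if the backlight is lit only a fraction of each frame, the peak luminance has to scale up by the inverse of that fraction to keep the same perceived brightness (toy sketch, made-up function name):

```python
def required_peak_nits(target_nits, duty_cycle):
    """Peak backlight luminance needed to keep a target average
    brightness when the backlight is on only `duty_cycle` of each
    frame (0 < duty_cycle <= 1)."""
    return target_nits / duty_cycle

# A 25% strobe duty cycle needs 4x the steady-state brightness,
# which is roughly why strobing modes look so dim on ordinary panels.
assert required_peak_nits(300, 0.25) == 1200.0
```

An HDR backlight rated for well over 1000 nits would have the headroom; a 300-nit office monitor wouldn't.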



