A 20-year-old CRT monitor can be better than a 4K LCD (2019) (vice.com)
257 points by MrJagil on Oct 11, 2020 | 284 comments



Heh, everything is better on Tubes whether it is guitar amps or gaming monitors? :-)

I can pretty much assure you that a 240Hz refresh rate 4K LCD monitor will look better than your CRT :-) But it is perfectly valid to ask "What is the best experience I can get for $X?" and find that CRT solutions outperform LCD solutions at various price points.

That said, I suspect it is less about the "superiority" of the CRT than it is about the corners cut by the LCD manufacturer in terms of display fidelity. A lot of the early "high res" displays got there by sacrificing video image quality.

That those monitors aren't great for gaming is not surprising. It is also not surprising that, if that was the only monitor you'd ever gamed on, you would be impressed the first time you saw gaming on a better display.

If you consider the amount of RAM and processing power you have to have inside the monitor at 4K resolution, you start to understand why there is a thing like nVidia's G-Sync technology. That is a lot of bits to throw around. Similarly, a monitor that scales the 4K video stream down to 1080p, and has 10- or even 12-bit dynamic range on the pixels with full motion emulation, might give you a better looking image than a 4K display.

So many ways to optimize for particular markets.


The CRT monitors do have a faster response time and higher refresh rate (I've seen up to 640x480@480hz <1ms).

But you would be sacrificing fidelity for this "competitive advantage".


You might find this note[1] interesting to read. I would love to see a commercial off the shelf cathode ray tube (CRT) that could do 480 Hz, so please share with us a link if you can find it in your notes.

A CRT monitor with a refresh rate of 480 Hz allows just 2 mS (2.083333 mS to be precise) per frame for excitation of the phosphor. As you know (here is a similar explainer [2]), the phosphor is excited by arriving electrons, which transfer their kinetic energy on impact to the electrons of the particular phosphor. When that energy decays back to the base level, a photon is emitted at a frequency characteristic of the band gap between the excited state and the rest state of the electrons in the outer orbit of the crystal.

The more electrons you can get excited in the phosphor crystal, the brighter the display. But more electrons in the beam give it more inertia, so rapidly moving it from one side of the screen to the other requires a higher voltage potential.

I haven't seen any high refresh rate CRTs, but back in college we had a "flying spot" scanner with a very fast phosphor; it was enclosed in a black box and basically relied on photographic film for persistence (it was made by a company called "Dicomed"). The short persistence of the phosphor gave it a very high dynamic range of brightness, which allowed us to transfer digital photos with a high dynamic range to film without losing fidelity.

I haven't really followed the evolution of cathode ray tubes post the Sony Trinitron era. I would really love to play with the tube you mention, or at a minimum get the engineer's specification for it. I'm assuming it was full color? (The Dicomed's screen was a sort of yellowish white, and it had three filters that it would place over the screen to scan out a color image.)

[1] https://www.nasa.gov/centers/dryden/pdf/87778main_H-609.pdf

[2] https://www.phosphor-technology.com/how-do-phosphors-work/


P22 phosphors are the ones used in all Trinitron PC monitors. I get conflicting numbers on the actual decay time of these phosphors, but with a 240 fps camera the decay time is difficult to judge by flipping through frames. I'd have to make a proper setup to try to even give an estimate.

I can safely say the P22 decay time is below 1 ms. Here [0] it is stated to be 100 us for G & B and up to 1 ms for red. A commonly referenced list of phosphor decay times [1] lists it as “medium”. In general, CRT phosphors seem to have sub 1-ms decay times unless you want a long persistence.

The limiting factors will be in a monitor’s max horizontal scan rate (typically 130 kHz or lower) or the DAC’s maximum pixel clock (it used to be 400 MHz but most VGA adapters these days struggle to keep 200+ MHz stable).
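For anyone who wants to play with those numbers, here is a rough back-of-the-envelope check (Python) of what a given mode asks of the monitor and the DAC. The blanking overheads are ballpark guesses rather than real GTF/CVT timings, so treat the output as an estimate only:

    # Rough estimate of the horizontal scan rate and pixel clock a mode needs.
    # The blanking overheads below are assumptions, not real GTF/CVT timings.
    H_BLANK = 1.30   # assume ~30% extra horizontal time for blanking/retrace
    V_BLANK = 1.05   # assume ~5% extra lines for vertical blanking

    def mode_requirements(width, height, refresh_hz):
        h_scan_khz = refresh_hz * height * V_BLANK / 1e3
        pixel_clock_mhz = h_scan_khz * 1e3 * width * H_BLANK / 1e6
        return h_scan_khz, pixel_clock_mhz

    for w, h, r in [(1920, 1440, 80), (1024, 768, 160), (640, 480, 480)]:
        scan, clock = mode_requirements(w, h, r)
        verdict = "fits" if scan <= 130 and clock <= 400 else "exceeds the limits"
        print(f"{w}x{h}@{r}Hz: {scan:.0f} kHz, {clock:.0f} MHz -> {verdict}")

The 640x480@480Hz case mentioned upthread falls over on the horizontal scan rate alone (roughly 240 kHz), which is why I'd love to see the tube that supposedly does it.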

0. https://www.epanorama.net/documents/video/phosphor_decay.htm...

1. http://www.bunkerofdoom.com/tubes/crt/crt_phosphor_research....


Excellent data! Now you need to add the time to saturation (TTS), which gives you how long the beam has to sit on the phosphor to achieve the maximum amount of light. Different tubes do this differently, from modulating the energy in the beam itself to modulating "time on spot".

What you end up with at the end of all this research is a really nice idea of the "hard bits" of engineering a CRT for a particular application. When we looked at CRTs in depth in the EE program, one of the things that came across was how interconnected the things you had to vary were. The professor suggested that was why there were so many different models to choose from, even when they were all the same form factor.

Thanks for the links too, I've added the bunkerofdoom one to my collection.


And to be clear, the time it takes for a phosphor dot to go from "off" to "full bright" to "off" again is its frequency response. And every dot of phosphor has to be "visited" by the electron beam on every frame. With those two numbers you can get the absolute fastest you can scan through the dots with full dynamic range.

In the CRT designs we looked at in college they typically used a fixed "dwell time" of the beam, so it was on each pixel for a fixed amount of time, and modulated drive power to get different intensities. (some phosphors are non-linear in their response so you can correct that in the drive table).

And as I was reminded in email, the beam has to visit every pixel AND get back to the top of the screen for the next frame. So on a 480i screen that is 640 x 240 per frame at 30Hz, that is 30 frames per second times 640 * 240 pixels per frame, roughly 4.6 million pixels per second, so you have just about 0.22 uS (roughly 217 nS) to spend on each pixel. (less than that actually since with overscan you have 720 pixels per line, but it is a very short time)
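If it helps, here is that arithmetic in Python (still ignoring blanking and retrace time, which eat into the budget further):

    # Per-pixel dwell time for the 480i example: 640 x 240 visible pixels per
    # frame at 30 frames/s, ignoring blanking/retrace.
    frames_per_second = 30
    pixels_per_frame = 640 * 240

    dwell = 1 / (frames_per_second * pixels_per_frame)
    print(f"{dwell * 1e9:.0f} nS per pixel")                  # ~217 nS

    # With 720 pixels per line (overscan) it drops further:
    print(f"{1 / (30 * 720 * 240) * 1e9:.0f} nS per pixel")   # ~193 nS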


Somewhat tangential, but Dicomed also made pretty special large format digital backs at one point [1] that really have yet to be eclipsed in terms of chip size (definitely in quality). Interesting they also produced what sounds like a really high quality digital to film solution too.

[1] http://www.epi-centre.com/reports/9604cs.html?LMCL=xIZMrP


FYI the SI unit of the second is abbreviated with a lower case "s". Only units named after humans have uppercase abbreviations.


Are you talking about specialty industrial monitors? I've seen 60-90hz normally. I heard there were 120hz...but 480hz? Which company made that?


I’m used to seeing VGA CRTs with maximum vertical scan rates of 170 Hz and horizontal scan rates of up to 130 kHz. Impressive stuff to be sure, and still relevant compared to LCD in terms of motion clarity. If you can afford a 4K 120 Hz OLED with black frame insertion, that has all of the technical advantages.

However, CRT TVs playing 240p games still have those thicc scanlines and good nostalgic feels.


The CRT monitor being discussed in the article does 160hz at < 2K resolution and 80hz otherwise.


That 240hz panel will have terrible colour reproduction compared to the CRT. Only TN panels are clocked that fast, and TN are ugly.

If you want nicer colours you go for IPS, which has a very slow response time (especially when compared to a CRT).

For the only modern display technology that comes close to CRT in terms of colour, contrast and responsiveness, you need to go OLED. And that's great (OLED with scanline emulation and black-frame insertion is incredible!) but OLED has the issues everyone knows about regarding degradation of the organic compounds (burn in).

I want my cake and to eat it, I want great colours and contrast and a stupid fast response time, with a panel that won't burn-in my desktop.


Tech changes fast. Nano-IPS 144hz panels are becoming common now, with upper-90s DCI-P3 coverage, something you'd only dream of a few years ago.


Becoming common isn't quite how I'd phrase it, but they certainly are becoming available if you're willing to pay.

I've been interested in Nano IPS to replace my current ultrawide, but it appears the contrast ratio of Nano IPS panels isn't that great vs. my current MVA panel.


And their contrast is still shit.


I recently bought a couple of Alienware AW2521HFL monitors that are 8-bit IPS and do actually run at 240Hz. They don't even use FRC to achieve that 8-bit color! There's also the MSI MAG251RX which uses the same panel but adds FRC to get 10-bit color!

The downside to them is that they're 24.5" and 1920x1080. I'm ok with that because I paired them with my 34" 3440x1440 screen as side screens. However, it's a really low pixel density to use as a primary monitor.

That's the part that sucks. A nice 4k or 5k monitor like the LG Ultrafines will only do 60hz.

All that said, the Alienware screens seem expensive until you look at how much the other 240hz IPS screens cost. At the price, they also come with a nice 3 year warranty and a really good stand. The part that irritates me is they don't calibrate them. One was way off compared to the other, so I'm waiting on a warranty replacement just on the basis that I think that one is not going to be easy to correct.


These are solid points, though the IPS stuff is getting better. And if there is a mini-LED salesman out there, he will be calling :-), since it promises all the win of OLED and none of the downsides.

There was also the tech which was basically an array of really small 1-pixel CRTs, it is too bad they couldn't get that to scale to the pixel densities of OLED.


> IPS stuff is getting better though

Not when it comes to contrast. There has been pretty much no improvement on IPS black levels over the past years.


> If you consider the amount of RAM and processing power you have to have inside the monitor at 4K resolution

You need nearly zero memory inside an LCD monitor of any resolution. Technically all you need is 256 bytes for EDID and that’s it. You can drive the panel directly from the graphics card from DVI, HDMI and DisplayPort, often with a single format conversion chip.

In fact, Apple’s HD Cinema Displays were built using LG panels with a DVI interface built right into them (which isn’t the norm).

Source: I designed custom FPGA-based LCD panel interfaces for many years.
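If you're curious what that 256 bytes looks like, it's easy to poke at from userspace. A quick sketch (Python, Linux-only, assuming your driver exposes the blob under /sys/class/drm; the connector name below is just an example and will differ on your machine) that parses a few fields of the 128-byte base block:

    import struct

    # Read a connector's EDID from sysfs (path varies; adjust for your machine).
    path = "/sys/class/drm/card0-DP-1/edid"   # example connector name
    with open(path, "rb") as f:
        edid = f.read()

    base = edid[:128]
    assert base[:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID header"
    assert sum(base) % 256 == 0, "bad checksum"

    # Manufacturer ID: three 5-bit letters packed into bytes 8-9 (big-endian).
    mfg_raw = struct.unpack(">H", base[8:10])[0]
    mfg = "".join(chr(((mfg_raw >> shift) & 0x1F) + ord("A") - 1)
                  for shift in (10, 5, 0))
    product = struct.unpack("<H", base[10:12])[0]
    version, revision = base[18], base[19]

    print(f"{mfg} product 0x{product:04x}, EDID {version}.{revision}, "
          f"{len(edid)} bytes total")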


> I can pretty much assure you that with a 240Hz refresh rate 4K LCD monitor it will look better than your CRT :-)

Since you bring up the refresh rate, i'd assume that you include motion in the "look better" - in which case, i can easily respond with "no, it absolutely does not".

I have a CRT next to me which can do 120Hz at 640x480 (which is a low resolution but the CRT is also very small and i use it on a mid-2000s PC for playing some older games, so it doesn't bother me). It is a Samtron which is basically poor man's Trinitron, so not even among the best out there (i also have a Trinitron but that was also not among the best out there... it was one of the cheaper models).

Despite that, motion on this thing at 120Hz can only be described as liquid butter smooth. Just moving the mouse around makes you want to... well, keep moving the mouse around because it feels so good. FPS games feel amazing.

It is so good that i decided to buy a high refresh rate monitor for my main PC. I avoided that for a long time for two reasons: a) good monitors use high resolutions like 2560x1440, often at huge sizes like 27", and i do not really like high resolutions nor huge monitors (huge in terms of viewable area), and b) all flat panels have persistence issues, so chances were they wouldn't be that good.

But you know, having that CRT next to me and using it from time to time really made me want to have a similar experience on my main PC. So i decided to find something that would be close enough and bought a (rather expensive) 165Hz monitor. I mean, ok, how bad can it be?

I rarely get disappointed with new purchases i make, and i can't say i was completely disappointed, but i can easily say that if i hadn't experienced using a decent CRT for years (like many of those who praise modern display tech - assuming they ever experienced a CRT at all, things aren't getting younger), i'd probably be much more enthusiastic.

The thing is however, i had experienced a CRT and my brand new expensive monitor is far from being as good as the CRT i bought for barely 15 euros.

It is night and day. Not just something where you need to go back and forth to compare - i realized how much worse the new monitor was the moment i tried to move some windows around and launched a game i had also played on the older PC, even though i hadn't used the older PC in a while. All it takes is using both once to realize how much better the CRT is.

And it isn't like the new monitor doesn't feel smoother than 60Hz, but it just isn't as good as the old CRT i have next to me. At best it brings back some of the responsiveness i lost when Windows forced a vsync'd compositor on me. But i never used vsync in games, so that sort of responsiveness in games wasn't something i lost - i've been disabling vsync for almost two decades now, so any tearing not only doesn't bother me, it barely registers.

Also, aside from motion, CRTs (at least the decent ones) have much better contrast than any tech outside OLED (which isn't available in PC monitor form, at least not at a non-ridiculous size, non-ridiculous resolution and non-ridiculous price).

Sadly it isn't just a matter of older or cheaper monitors. It is just that modern monitor tech simply sucks at most things outside being flat and having high resolutions. It is fine for office work, etc., which is how i guess it became popular, but for gaming it just isn't as good as the better CRTs. (Sure, there were many crappy CRTs out there - and i am certain that anyone who complains that they dislike CRTs because they flickered used a crappy one - but people who say they prefer CRTs do not refer to the crappy ones, just as i'm certain that people who say things are better nowadays do not refer to crappy TNs with washed out colors either.)

> So many ways to optimize for particular markets.

I'd like an optimization for a high end CRT please :-/. The article mentions $500, i'd actually pay $1500 for a brand new (not old stock, i mean truly new) good CRT like those mentioned in the article.


I don't know what brand of LCD you might have, but do you have an opinion on the refresh rate settings? My Samsung 4k LCD has a setting that, I realized only a few days ago, I had jacked up to "Fastest" as soon as I got it (from its default of "Standard"). It apparently does something wonky with the backlight? (And without it I think the published refresh rate is nowhere near accurate, as that's more like the maximum you get if you are willing to accept the ramifications of this strobing?)


It is an Agon, though i do not remember the exact model name[0]. What you refer to is some trick a few high end monitors have to alleviate image persistence. I think some also do black frame insertion (i.e. they insert a black image in between updates), which also tries to reduce pixel persistence.

Both of them try to mimic how CRTs work (though i think black frame insertion is closer), but as a side effect they affect the image brightness (usually causing everything to become dimmer) and can introduce perceptible flickering. And yes, some monitors are indeed marketed with the theoretical refresh rate you'd achieve with these effects in action - you'd need to find dedicated monitor review sites (e.g. i've heard some good comments about rtings.com, though ignore the scores and look at the metrics... their scores take into consideration stuff you often won't care about) and of course look for comments in places like Reddit's /r/monitors subreddit from people who already bought a monitor.

[0] https://twitter.com/System32Comics/status/131319474720949862...


I recently got to experience a 240hz IPS monitor. Colors and viewing angles are way better than the 120hz TN monitor i had in the past.

That said, even at 240hz, motion is still blurry. If i move a window and try to read text in it in motion it's still not smooth like real life.

I wish i could experience a Trinitron CRT to compare the motion resolution! On Digital Foundry they make it sound the way you do: amazing. I wish i could judge for myself.


My experience is the opposite, a high refresh rate LCD is far more pleasant than my old CRT ever was.


What do you mean by "far more pleasant"? If it is about eye strain, your old CRT was probably either bad or you had it at a very low refresh rate. But a bad modern flat panel will have a lot of its own issues too (e.g. a very cheap Fujitsu monitor i bought some time ago had colors so washed out and awful that it was straining just to look at a static image on it).

If it is something else... i'm not sure what that would be. Can you elaborate?


It seems like 4k is a pretty darned wasteful part of the hedonic treadmill. At a more reasonable resolution, this processing power could be used for higher-end graphics algorithms, better AI, longer battery life, and so on.


I've had 4K monitors on my work laptop, my PC, and my TV for as long as that was possible. Nearly 7 years.

This has cost me thousands and thousands of dollars.

My logic is that when I use a computer, it's the monitor that I'm looking at. I don't sit next to the case staring at RGB LEDs. I can't "experience" a processor in any aspect except speed, and that's been perfectly acceptable for a decade.

But I'll stare at the screen for 8 hours a day for work, and an additional few hours at home.

Spending money on improving the image quality of something I look at for 10+ hours a day is absolutely worthwhile!

This is also the advice I give people: Buy a bigger, better monitor than you planned, and make sacrifices elsewhere. A larger 4K monitor is a very visible upgrade in the literal sense versus, say, a few hundred more megahertz on the processor.


I feel this way not only about monitors, but chairs and shoes, and for the same reasons.


I've heard the saying as "spend good money on what separates you from the ground" - tires, shoes/boots, mattress, etc.

In this sense, it'd be "spend good money on what separates you from the virtual". That is, of course, our I/O devices :)


And keyboard, mouse, headphones, bed, pillow, and underclothes.


>My logic is that when I use a computer, it's the monitor that I'm looking at.

Sure. But that doesn't get around the law of diminishing returns. It might not kick in at 4K, but would you buy 8K or 16K because "it's the monitor I'm looking at", when the only thing it does is waste CPU/GPU resources and battery -- for angular resolution lost to the human eye anyway?


>for angular resolution lost to the human eye anyway?

Personal experience tells me that the resolution difference between 4k and lower resolutions (for the same size screens) is Very Noticeable on my 27" monitors. (I have one 4k and one 1440p.) Text is noticeably easier to read on the higher resolution monitor. I think this is primarily because we sit much closer to monitors than to TVs. (I don't think I can tell when my TV plays 4k vs 1080p, as I sit about eight feet away.)

If I had a 30" monitor with 8k resolution, I would probably not be able to tell a difference, unless I went looking very closely to it, so I don't think I'd spend the money on that. However, at 40" or 42", I suspect one would notice it at the distances one normally sits from a computer monitor. At that point, my reason for not buying it would be that I can't afford it. ;)
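If you want to put a number on it, pixels per degree at your actual viewing distance is the useful figure; the usual rule of thumb is that 20/20 vision resolves about one arcminute, i.e. roughly 60 ppd, though that threshold is debated. A rough sketch (the viewing distances are just my guesses at typical setups):

    import math

    def pixels_per_degree(diag_in, res_w, res_h, distance_in):
        """Approximate angular pixel density at a given viewing distance."""
        pitch = diag_in / math.hypot(res_w, res_h)          # inches per pixel
        deg_per_pixel = math.degrees(2 * math.atan(pitch / (2 * distance_in)))
        return 1 / deg_per_pixel

    # 27" monitors at a ~24" desk distance, and a 55" TV at 8 feet
    print(f'27" 4K    at 24": {pixels_per_degree(27, 3840, 2160, 24):.0f} ppd')
    print(f'27" 1440p at 24": {pixels_per_degree(27, 2560, 1440, 24):.0f} ppd')
    print(f'55" 4K    at 96": {pixels_per_degree(55, 3840, 2160, 96):.0f} ppd')
    print(f'55" 1080p at 96": {pixels_per_degree(55, 1920, 1080, 96):.0f} ppd')

That lines up with what I see: the 27" 1440p panel lands well under 60 ppd at desk distance, the 4k one lands above it, and at eight feet even a 1080p TV is already past the threshold.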


I've seen an 8K monitor in person.. and... wow. It's like a glossy magazine that can move. It's absolutely astonishing how much additional detail you can see in fonts: subtle differences that are totally lost in normal rendering suddenly come to life. Each font has a unique personality that's just lost in the clunky rendering to a few dozen(!) pixels of a mere 4K display.


If you have poor eyesight it's even more noticeable! Personally, the higher dpi the better for me. I have a 24in 4k monitor and want to buy another but these sizes with 4k are rare


> Buy a bigger, better monitor than you planned

But! Do a bit of research. I sent back a 24" 1080p monitor and got a 21.5" one instead, because the pixel density was too low. And it always will be, but still there's tons of 24" monitors out there at 1080p because people think bigger is better.

Make sure you're going bigger and better in the right ways.


I feel this way about high refresh rate.

Having a 120Hz monitor feels like having a computer that keeps up.

(Provided your input devices are also fast enough.)


A larger monitor I can understand, but on a laptop, how can 4k resolution be visible if it's... too small to see?


That’s sort of the point: that you can’t see the pixels … Text looks amazing because you don’t have to anti-alias the jaggies away now that the pixels are so tiny. I much prefer text on a very high dpi monitor because everything just looks so crisp. Like a really good printed page, or something else physical. Going back to a low dpi monitor is real hard for me now; pixelated text is really hard to look at.


Well, you wouldn't run the computer at a 1:1 resolution, it would be a logical lower resolution displayed at a higher resolution. For a 4K display, everything would show at the size it would be on a 2K display — but actually rendering at 4K, everything is much crisper rather than just blown-up pixels.

Apple calls it Retina Display, Windows calls it display scale. Apple made it mainstream in 2010 (iPhone) and 2012 (iPad, Macs) whilst Microsoft seems to have made it much more mainstream since Windows 8 and the introduction of its Surface computers.
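To make the logical-vs-physical split concrete (just an illustration with a plain 2x integer scale; real OSes also do fractional scaling):

    # Logical ("points") vs physical resolution under 2x HiDPI scaling.
    physical = (3840, 2160)   # what the panel and GPU actually render
    scale = 2                 # integer scale factor, as on a typical "Retina" setup

    logical = (physical[0] // scale, physical[1] // scale)
    print(f"UI laid out at {logical[0]}x{logical[1]} points, "
          f"rendered with {scale * scale}x the pixels per point")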


Depends how close you sit to it (touchscreen models promote closer use) and how good your eyes are :)

I can definitely see the difference between 4K and Full HD on a 15".


Compared to what? A cheaper monitor? On a relative basis, spending $400 on a 4k monitor vs $250 on a similar 1440p monitor is wasteful. But if you compare to any other category of goods, it's a drop in the bucket. Most people don't even consider efficiency when adding thousands of dollars of options to a new car purchase. You can buy a lot of 4k monitors for the amount families spend on larger than needed houses throughout the suburbs of the United States.


Things are getting obscene at the high end of the treadmill though. Current high-end monitors like 38GN850G and X27 are already plumbing the $2000 price point and the X32 is expected to tip the scale at $3600 MSRP. "Compared to" just going out and buying a TV and using it as a monitor - you could buy a 55" OLED for your PC and a 65" for living room at that price. Or two 55" oleds, replace it in 3 years when it starts having burn-in from years of marathon Starcraft sessions or whatever.

There really isn't a lot to justify a 32" IPS 4K144 over a 48" OLED 4K120 other than burn-in.


Enthusiast, early adopter products have always been expensive. This doesn't seem that exceptional to me. It's not like tons and tons of people are buying that high end monitor. I'm also not sure what 4k 120hz tv you're referring to. Do they exist? Hdmi 2.1 isn't even done yet is it? How do you get a 4k 120hz signal into such a device? And how am I meant to fit it in my 3ft deep desk?

Fwiw I have a 4k 144 monitor that I got for $800-900. I believe I first heard about it on this site actually. 4k is important to me for text and 120hz is important to me for gaming. It's nice to have both, and gsync compatibility, all in one device. Well worth a small splurge.


Yes, high-end is a thing, but high-end gaming monitors have tripled in price over the last 3 years or so, and now they are to the point where they are competing with premium televisions which offer better overall picture quality. So you really have to figure out what you actually want.

IPS contrast is not good and OLED contrast is amazing (it's the only way to get "real" HDR except for VA, which has poor motion clarity), with the only real downside to OLED being burn-in (it's gotten massively better over the last few years, but heavy UI-type stuff in lightmode color schemes with fixed elements is the worst case scenario; this should not be your daily driver for Excel spreadsheets). But again, you have to bear in mind that you could literally buy two OLEDs for the price of that "premium" IPS monitor, so if it doesn't become obnoxiously burned in within 2.5 years or so (assuming a 5y lifespan on your IPS monitor) you are coming out ahead on burn-in.

> I'm also not sure what 4k 120hz tv you're referring to. Do they exist?

Yes, LG's OLED series has supported 4K120 input (not strobing/interpolation, true input at 120fps) since the last generation (C9, support continues with CX). Also supports HDMI VRR.

CX adds 120 Hz black frame insertion - so it can take 60 hz and alternate between a signal frame and a black frame for additional motion clarity.

> How do you get a 4k 120hz signal into such a device?

For the LG OLED series, HDMI 2.1, from an NVIDIA 3000 series or Radeon RDNA2 GPU.

Or, display stream compression over DP 1.4, which is what the 4K144 gaming panels are currently doing (none of them currently support HDMI 2.1). Also, that currently only gets you to 4K120 without chroma subsampling, another disadvantage of the current "gaming monitor" crop.

> And how am I meant to fit it in my 3ft deep desk?

Wall mount. CX 48" wall mounted 3 feet away is probably about the same as a 34" ultrawide that's 18" away (which is what I'm currently using) I'd think. Certainly a "big" monitor but the people who are really up shit creek are the ones with shallow desks and/or who are currently situated in a corner and can't wall mount.


I don't disagree at all with this.

In an adjacent field, graphics, we used to debate spending cycles on simulating things like lens flare - as if in a first person shooter you are wearing goggles with tubular optics? Sure it showed off mad math skilz but what was the point?

I do appreciate the lack of eyestrain for detail work on a high resolution CRT though.


at least on a 27" display, I mostly agree 2160p is overkill for games. it's hard to notice the additional sharpness over 1440p, and it breaks a lot of old games that don't support UI scaling. the performance hit isn't necessarily that bad though; at 2160p, I'm satisfied with much lower AA settings which almost splits the difference between 1440p and 2160p. if the game is really demanding, you can always drop down to 1080p with perfect scaling. it wasn't too long ago that 1080p was an acceptable resolution on a 27" display. not really an issue for me anyway. my 1080ti can do better than 60 fps in most games I play.

knowing what I know now, I'd almost rather have a higher refresh rate 1440p monitor. but now that I'm working from home, I really appreciate the higher dpi for coding. the difference in font rendering really is night and day. I can't unsee it.


yeees. Just look at the crappy Netflix 4k, which is less bandwidth than a 1080p Blu-ray (like 30% less iirc). Now, obviously, it's a better codec, but h265/VP9 (or whatever they use) won't give you a > 400% increase in efficiency over h264. So, people are buying new Fire TVs/TVs/GPUs all so that they can enjoy a 4k artifact show.

Meanwhile I've fallen under the hedonic treadmill and bought a TV. Sitting 3m from this 49inch screen, I play Netflix in 540p (on Linux) – and it totally looks fine. Well...

But, truth be told: I'm typing this on a 32inch 4K monitor. For work it's gorgeous (yet some people still buy FHD...).


Even worse, you can't see in 4K. :) https://www.youtube.com/watch?v=VxNBiAV4UnM


Yeah but could you play duck hunt?

And in the old old days you could skip the scanning and just go draw what you wanted where you wanted with vector graphics. (as long as there weren't too many "what"s)

There were even tricks like defocusing the beam.

:)


"I can pretty much assure you that with a 240Hz refresh rate 4K LCD"

It depends on what you will plug into it. CRTs are considered the best solution for playing unmodded old consoles (up to the PS2/Dreamcast/Xbox/Gamecube generation, and even the Wii) or VHS tapes. Games and consoles were made to look good on CRTs, and to take advantage of specific features of these TVs.


I agree with this and it is key. Video that has been "designed to be displayed on TV" looks better than stuff that wasn't designed to be shown there. The same is true for LCDs of course.


> On a CRT monitor, the screen is coated in millions of phosphor dots, with one red, green, and blue dot for every individual pixel.

Not true. The number of dot triads is usually greater than the number of pixels. The distance between them is specified as the “dot pitch,” which serves as a physical resolution cap. In aperture grilles they’re continuous RGB vertical lines, with the dot pitch being the horizontal distance between them.

Framebuffer pixels don’t align exactly with the dots, and that’s one reason why CRTs are so blurry. The other reason is that the DAC analog output doesn’t transition discretely between pixels. As the refresh rate and resolution get higher, the DAC has to spend less time on each pixel, so the output becomes blurred horizontally.
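To put rough numbers on the dot pitch point (the pitch and viewable width below are illustrative values for a 19"-class aperture grille tube, not figures from the article):

    # How many phosphor triads fit across a tube, vs. the framebuffer width.
    # Example numbers: ~0.24 mm aperture grille pitch, ~365 mm viewable width.
    pitch_mm = 0.24
    viewable_width_mm = 365

    triads_across = viewable_width_mm / pitch_mm
    for fb_width in (1024, 1600, 2048):
        print(f"{fb_width:>4} px framebuffer -> "
              f"{triads_across / fb_width:.2f} triads per pixel")

Once the framebuffer width approaches or exceeds the number of triads across the tube, each pixel no longer gets even one full triad, which is exactly the physical resolution cap described above.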


I loved the warmth of this blur. When I sold my old Mitsubishi Diamondtron, I tested it before shipping it, and watching an old DivX made me feel all confused. It looked nicer .. warmer .. something.


This idea is expressed by the Brian Eno quote:

“Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature. CD distortion, the jitteriness of digital video, the crap sound of 8-bit - all of these will be cherished and emulated as soon as they can be avoided. It’s the sound of failure: so much modern art is the sound of things going out of control, of a medium pushing to its limits and breaking apart. The distorted guitar sound is the sound of something too loud for the medium supposed to carry it. The blues singer with the cracked voice is the sound of an emotional cry too powerful for the throat that releases it. The excitement of grainy film, of bleached-out black and white, is the excitement of witnessing events too momentous for the medium assigned to record them.”


I don't think that applies; it didn't feel like 'home at last' but really 'the new made me forget what can be'.


I have a different problem with LCDs. I still think they're too bright. With a CRT you could set everything to white on black and you had almost no light coming out of the screen. Also brightness meant brightness and you could turn it way down for when working in dim light or darkness.

Of course i'm talking about text work here.

Maybe when OLED monitors become affordable we'll go back to monitors that aren't basically a lamp shining into your eyes all day. At least the oled on my phone looks like it doesn't put out light where it shouldn't.

For gaming, i'm not latency sensitive, but then i don't play twitch shooters any more. I'm more the kind that buys mid-low range video cards and turns on the fps limiter where it's available.


I’d recommend adding bias lighting to your monitor. It won’t make the monitor less bright, but it’ll light up the wall behind your monitor so there’s less of a contrast between the wall and your monitor, which reduces eye strain.


I tried that but it looked like too much effort. Instead, I never turn off the room lights any more when using monitors. Seems good enough, since OLED is coming anyway.


I use a normal desk lamp and point it at the wall behind my monitors. Super easy, cheap, and effective.

https://smile.amazon.com/dp/B083DG5CK1/



Same!

I do it for both the monitor and the TV, with two cheap "Ingared" lamps from IKEA; can recommend.


LED strips stuck to the monitor itself, pointed at the wall. Powered from a USB port on the monitor if it has one.


This is what I do. Cost what $10 and took about 5 minutes. Well worth it.


In the age of video conferences, this will also improve your appearance a lot. Soft, indirect light coming from behind the camera… you just build your own softbox.


I've been working from home for at least 15 out of the last 20 years. Never did a single video conference.


I actually have panel lights for this precise purpose that I turn on during meetings.


Yeah I have no idea why the lowest levels on monitors are so bright. Are there models that can go really low without the PWM flicker effect kicking in (that's another problem I have with monitors, I can notice the damn cheap PWM backlight)?

Laptop (and phone) displays don't seem to have this problem, the brightness goes really low...


I’ve always wished the Macbook’s lowest brightness setting was subdivided two or more times. It’s still too bright when I feel like reading in my dark room perhaps on the way to sleep. I have to use an app called Shady to cover the screen in a dimming overlay.


Have you tried using option-shift-[f1] to subdivide? or the brightness slider in the settings.


Interestingly, on my early 2015 13" MacBook Pro, the very lowest brightness setting seems to have no change in brightness on those last three subdivisions. Once I get down to one full "block" of brightness, option-shift-F1 moves the slider down in quarter-block increments but the screen gets no (perceptibly) dimmer until it shuts off entirely. All the other brightness settings have perceptible changes in brightness at every subdivision level.


Oh my god, thank you!

How does one find these "hidden" features?


Wow, I had no idea that was available.

Unfortunately it doesn't reduce the dimmest level on my MBP. The bottom 4 subdivisions look identical. Like level 0.25, 0.50, 0.75 and 1.00 are all the same brightness, only 1.25+ starts to get brighter.

The lowest level is still too bright at night. That's unfortunate.

However it does reduce the dimmest level of the keyboard illumination!

That's great. I always found the keyboard too bright at its lowest level at night, but some illumination at night would be helpful for obvious reasons. Now I can enjoy 3 subdivisions lower - thank you :-)


After lowering brightness to 0 the brightness can be lowered by lowering the contrast ratio. It’s obviously not perfect but does work.


Most better monitors and everything targeted at office use does not use PWM for backlight brightness control.

LCDs have poor blacks because they're a filter put in front of a lamp, so to render black, they have to try to block as much light as possible, but our visual perception is logarithmic and adaptive, making them look quite bad. The workaround is to add bias light behind the monitor; it quite literally provides a bias for the eye to keep it inside a certain adaption range, which reduces the perceived poorness of dark tones. (It is also more ergonomic.)


The vast majority of modern high-quality monitors have flicker-free backlights without PWM dimming.

https://www.rtings.com/monitor/tests/motion/image-flicker


Yes, there are a bunch of monitors now that do not use PWM for brightness control. And since PWM became known as a problem PWM rate is tested in professional benchmarks. So you definitely can get monitors that do not have that flicker, where the minimal brightness will also not be too high.


I am still waiting for a decent size e-ink screen (22"+) without backlight which should be enough for low-FPS stuff like backend coding, slack, terminal and text content web pages. Existing solutions are still too small - Dasung and Onyx Boox are 13.3" with HDMI.


i have never heard of an e-ink screen with fast enough refresh to be viable for coding


I think both of these are usable https://www.youtube.com/watch?v=NozoRkE0DTo they use various optimizations to avoid full panel refresh. Here is coding example with Dasung although it is hard to judge scrolling quality https://www.youtube.com/watch?v=OO0Qzuw18q8


How about a display that had a large e-ink screen plus a small LCD of maybe 2-4 lines at the bottom?

Go old school for editing. Use something like TECO or ed or Rob Pike, David Tilbrook, Hugh Redelmeier and Tom Duff's Unix version of QED [1].

The small LCD is for seeing the command you are currently typing and a little command history.

With those editors you entered editing commands but did not see the results until you asked for them. You'd tell the editor to show you the current line plus say 10 lines before and after. Then you'd give it commands to edit the current line, such as telling it to change the text "float" to "double", or telling it to insert a new line before the current line, and so on. When you had done enough changes that you needed to refresh your notion of the current state of the file, you'd ask it to show you again.

Even the older generation, slower e-ink screens would be fast enough for that kind of work.

Maybe instead of putting the small LCD display at the bottom of the e-ink screen, make it a separate unit that can attach to the e-ink screen or attach to the top of your keyboard or stand alone somewhere if you prefer.

[1] https://github.com/phonologus/QED


what is your minimum refresh rate needed for coding?


Scrolling would be painful on e-ink


Also programmers tend to type pretty fast after getting some experience. It's disconcerting enough when code completion slows down the editor, not sure i could stand a screen that can only refresh every full line.


This is why vim has multi-character movements.

The problem here doesn't seem that different from coding over a slow network, something that's been possible for quite a while.


Coding over a slow network is the sole reason I know to use vi :)

But i'd rather not do it daily.


When I'm coding I'd usually do pgup/pgdn, don't think this would be a huge issue.


Syntax highlighting is too important to me to use this.


Decent color e-ink tech is finally here:

https://gizmodo.com/the-first-color-e-ink-devices-are-finall...

It's just a matter of time before a major brand (Amazon, Microsoft, etc.) decides to take a chance and use it to make a big multipurpose tablet.


It is only the color coding one may miss, although that is going to change with faster color e-ink panels. There are still a lot of other ways to mark syntax: font, borders, underlines, italics, grayscale tones, background inversion.


I suspect that people leaving LCDs at their eye-burning default brightness (looks great in a store demo competing against the lighting and other monitors, but not at all for long-term work) is at least part of the reason for all the excessively low-contrast websites.


If you calibrate your LCD monitor you'll also end up with eye burning brightness as far as i know :)


It's mildly annoying but easily fixed with a touch of ambient lighting or a software tool like f.lux.

Brightness isn't nearly as bad for eye strain as contrast between the screen and ambient light.


Better even than f.lux was Nocturne (https://github.com/strider72/blacktree-nocturne), though it hasn’t worked in years. (Maybe a decade? I can’t even find screenshots.)

Nocturne would let you invert your entire screen, then shift it to monochrome (ideally red). Obviously useless for gaming or color work, but I’ve not run across anything since that’s so great for late night text work across the whole OS.


I've never seen Nocturne but on Windows 10 inverted colors+night shift works well (IIRC has an option for grayscale, too). On Linux you can achieve the same effect with xca+redshift (not sure about grayscale).


I have switched all programs to darkmode where possible. If not possible I don't use it.

f.lux can do invert and redshift though: shift+alt+end


I have OLED screens all over my house - I can never tell if the monitors are on or off unless the screensaver kicks in. 0 regrets spending that money.


By "monitors" you mean TVs? Or did you buy some of the $3-4k monitors that seem to be available?

At a quick google there's nothing under 3k and they're either 21" (too small) or 55" (fine for console gaming but not for programming if you ask me).


According to this youtube video [0], LG's OLED TVs work well as a monitors, at least the 48 inch one. Can't vouch for the quality of that channel though, and it is a sponsored video.

Regarding size for programming, I use a 40 inch monitor and wouldn't mind additional width.

[0] - https://www.youtube.com/watch?v=Xzp3fF6AL88


Funny, most programmers complain they don't have enough height. Me included.

Say, are there any 16:10 oleds? :)

Edit: Hmm, one 40" 4k instead of 2x24" 1920x1200. Maybe.

Not that LG has any 40" oleds, the smallest i see (locally) is 55".


I would usually agree with them - but I finally feel like I have just about enough height with my 40 incher. Not that having more would be harmful, but I'm worried that the webcam on top would start to look weird if it was higher.

Here is the 48 inch oled from LG: https://www.lg.com/us/tvs/lg-oled48cxpub-oled-4k-tv


VA panels have by far the blackest blacks of any LCD technology. Viewing angles and uniformity aren't as good as IPS, but they're the best option if you care about contrast.

https://www.rtings.com/monitor/tests/picture-quality/contras...


I think we need monitors with more dynamic range.


No thanks, because they add more light at the top not complete darkness at the bottom. The lantern shining in your eyes just gets brighter.

Incidentally this is how LCDs became so bright. So they can claim better contrast.


An OLED HDR1000 10-bit display can go from 0 to 1000 peak nit brightness with 10 bits per channel of color differences. Is that not good enough for you? It’s better than we’ve had for a long time with our 5 or 8 bit monitors that peak at like 250 nit.


Please don't let CRTs come back in style. After a while they tend to develop this headache inducing high pitched tone almost akin to tinnitus that is emitted constantly while they're powered on. It seems to be so high pitched most people cannot hear it at all but if you're one of the lucky few who can it can actually be really disruptive. Unfortunately it's also often loud enough to hear through doors, walls, etc... Please be mindful of this before setting up a CRT if you go down this path. For example a house might be better for this setup vs an apartment or condo.


For the longest while I used this to tell my friends in high school I had a mini-superpower. Because I knew when there was a television on in the house, and somehow none of them heard it.

There is an old telly in my apartment, I turned it on to test it last week and heard the tone again, must have been like 18+ years since I last turned one of those on.

It never gave me headaches but I definitely heard (and still hear it) all the time.


Yep. I experienced this too. My left ear hears 22kHz to this day and I could even hear plasma TVs.


I thought this was my superpower too as a kid. I don't really notice it anymore with all the digital equipment, but analog was noticeable.


My hearing is not even that great in some ways, but I can easily hear the old CRT whine. It blows my mind that not everyone can hear it.


Holy shit, I wasn't crazy all those years


When I was a kid I would always hunt down and turn off CRTs left on anywhere in the house since the sound annoyed me so much. Either my hearing has gotten worse or all the CRTs are gone now... probably both.


I read somewhere that you lose high pitch hearing first, which is why the young could hear the CRTs when the elderly could not. I tried to google for this (actually DDG) but all I got was SEO spam.


Something like this?

https://medlineplus.gov/genetics/condition/age-related-heari...

https://playback.fm/hearing-test

and

"gradual loss of sensitivity to higher frequencies with age is considered normal" (https://en.wikipedia.org/wiki/Hearing_range)


I wish there was audiophile equipment for older people that took that into account. Headphones and speakers, say, that only go to 14 KHz instead of 20 KHz. By not needing to design for as wide a frequency range, they should be able to either make them less expensive, or more accurate, or both.


Capping the hertz won't make it cheaper or easier to produce...


I doubt most headphones can reproduce sounds above 14 kHz.


They can


"The Mosquito" (https://en.wikipedia.org/wiki/The_Mosquito) came to my attention when we were walking out of a multi-story car park with my kids. They complained of an irritating repeating high pitched noise that gave one a lingering headache and sense of nausea that lasted for about 20 minutes after leaving the area. If I concentrate very hard I can hear it myself but age and probably one too many loud concerts has rendered that range basically silent to me.


Teen repellent! My school had one above the entrance.


Probably both. The CRT sound was between 15,625 and 15,734Hz which is high enough that you almost certainly lose it as you get older. It now lives only in our minds.
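Those two frequencies are just the analog TV line rates: total lines per frame times frame rate (PAL is 625 x 25, NTSC is 525 x ~29.97):

    # Horizontal scan (flyback) frequency = total lines per frame * frame rate
    pal_hz = 625 * 25                  # 15625 Hz
    ntsc_hz = 525 * 30000 / 1001       # ~15734 Hz
    print(f"PAL: {pal_hz} Hz, NTSC: {ntsc_hz:.0f} Hz")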


The responsible component in the CRT is probably the flyback transformer. Supposedly replacing it with one that is more well made or applying some high-voltage silicone putty to it should prevent the sound.

Of course, this DIY solution does not scale well, so it wouldn't necessarily help with a room full of CRTs with bad flyback transformers.

EDIT: the voltages involved are potentially lethal


Oh the flyback transformer noise or whatever it is, that you notice whenever you enter a room where a CRT is powered on. It's very annoying. But I love how old games look on my vintage CRT monitor!

Now that you've reminded me of it I swear I hear some high pitched noises coming from some other electronics. ;-p

At least you can escape the high pitched noise by going into another room. Low frequency rumble – usually from aircraft – is a nearly inescapable torture that cities have somehow decided that their residents must endure.


That was the flyback transformer. I could hear them too, but it never really bothered me much.

It was useful ability if you were responsible for closing up a computer lab at night.


You can often get a CRT to stop whining, at least for a little bit, by either turning it off for a few seconds, then back on; or degaussing it. At least that's my memory of the situation. I haven't owned a CRT monitor in 20 years, and I don't miss that 90-lb monster one bit. That may be because I carried it up stairs a few too many times. :-)


This noise bothers me intensely, has since I was about 4 years old; and still very much does. (31 atm)

It's why I don't keep any iMac G3 units around any more, even though they are gorgeous art pieces. The iMac G4, being LCD-based, obviously does not emit any such tone. :)


A horizontal scan rate of 16 kHz is audible. If CRTs were brought back today they very likely would not run NTSC. I wouldn’t worry about coil whine at 30+ kHz.


You might not hear it anymore by now if you are still of a CRT generation since that tone is 16kHz and you quickly lose the ability to hear it.


Besides that (I have the same problem), there's the obvious "electron GUN pointed at your face/eyes" problem.


maybe if you spoke a different language you wouldn't be bothered?

electrons can't travel very far in glass, or atmosphere, which is why the tube has a vacuum inside of it.


What does the language bit have to do with anything? I speak 2 foreign languages and I'm not a native English speaker.

There is still the fact that the CRT display is very eye straining. Due to both the flicker and the way it works. My eyes would constantly be red after using them for a long time.


> What does the language bit have to do with anything? I speak 2 foreign languages and I'm not a native English speaker.

capitalizing "GUN" as though it mattered looked rather like hoplophobia.

> There is still the fact that the CRT display is very eye straining. Due to both the flicker and the way it works. My eyes would constantly be red after using them for a long time.

then say that, don't spew some garbage about electrons, when they're not making it past the glass.

nobody can (reasonably) question how you experience the CRT, but we can prove that the electrons have nothing to do with it.


My eyes used to hurt a lot on CRTs. A few hours every day in front of one and I could not go outside without sunglasses, anything would make my eyes hurt.

Everything was fine if I went a week without using a computer.

With LCDs, that's not a problem anymore, and I like it.


I guess most people can’t see the flicker on a CRT set at 60Hz, but I can. It’s especially bad in a room with fluorescent lights. First thing I’d do upon sitting down at a new computer with a CRT monitor was to bump up the refresh rate to 75Hz so I didn’t get a headache.

Taking standardized tests in the early 2000s on locked down computers in fluorescent-lit test centers was torture.


I found the flickering at 60Hz unbearable, but at 75Hz it was barely noticeable. My sweet spot was at 80-85Hz. I tried up to 90Hz, but you had to sacrifice resolution, and past 90Hz I didn't notice much difference.

I think for today's LCD/OLEDs 90-120Hz should be the sweet spot, especially for gaming and motion. More than that is a waste of computing power. (There are monitors that offer 144-200Hz refresh rates, but that is mostly a waste.)


But the fundamental difference is that LCDs are static. Like, if you have a static image on screen, it does not refresh. A CRT is always refreshing at x Hz.

On LED backlit displays, it's the cheap PWM brightness controllers that can be noticeable. They're supposed to pulse hundreds of times a second to maintain the brightness setting, but as you go lower, it can be noticeable and tiring on the eyes.

On LCDs, refresh rates matter only in constantly changing pictures, like movies and games.

On a CRT, you're exposed to the constant refresh flicker even if you're editing a file or reading an article.

All in all, I'm happy with LCDs. Even in games, they're enough for me, I'm not that competitive.


An LCD at 20Hz won't give you a headache - an LCD pixel is always on (but only changes on refresh).

A CRT pixel is on for only a very short time, while the picture results from the eye's persistence of vision (which is also why CRT screens flicker when filmed in movies).


Any kind of Motion, scrolling, movies, games... etc... refresh rate matters.

Sure, cinema movies are 24fps, and they do manage, but you are usually not 2 feet away from the screen.


I’ve had a 240hz monitor for a while now and I wouldn’t say 144hz is a waste.

I do agree 100-120hz is the sweet spot, good balance of refresh rate and graphical quality.

I would not suggest a 240hz monitor if you are not playing competitive shooters a majority of the time.


I agree. By 'a waste', I didn't mean that it is not better; I meant that it is a diminishing returns curve, and you will end up sacrificing a lot to achieve it (resolution and graphical quality, as computing is not free) and it doesn't feel that much better. Going from 60Hz to 120Hz, or even 90Hz, is very noticeable by everyone, even in normal 2D content. Past 120Hz it is like, eh, just a bit better.

Between 120Hz 4k and 240Hz HD, I'd take the 120Hz 4k. Between 120Hz 4k and 90Hz 4k with full ray casting, I'd take the 90Hz (unless it is a competitive FPS).

etc...


BlurBusters argues even 1000Hz is worth it, interesting.

https://blurbusters.com/blur-busters-law-amazing-journey-to-...


most people absolutely can see the 60hz, the effect is simply subtle. if you look at a CRT from an angle it's more visible, or you can wave your hand in front of you and observe the strobe effect.

i definitely do not miss 60hz CRTs, they made me miserable.


I was so happy when I figured out how to switch to 60 Hz. Living in a PAL region I was stuck at 50 Hz. Later on PC I cranked up the Hz as high as it would go while still displaying something.

And yes, the flicker is easy to spot if you just sit in front of the screen and look just above it, preferably in a dark room. Or wave your hand in front of it. With LED lightbulbs you can also use your phone camera: move it really close to the lamp, and if you see banding, it's not good.


Yeah, I used the highest refresh rate I could force, usually 75-80 Hz. Some monitors could handle 90, and it made a difference, but it was still worse than LCD PWM flicker.


60Hz wasn't really a default for long though? Most CRTs in Windows 9x era would do 75Hz or 85Hz if you didn't push them above their native resolutions.


I'm pretty sure Windows 9x defaulted to 60 Hz even when the monitor could handle more, and most people just kept the default. I recall walking past computer labs in high school, seeing row after row of blinking rectangles.


I’m fine with 60 but I’m in a PAL region so all of my 90s consoles run at 50Hz. That is seriously uncomfortable to look at now and I have no idea how we all put up with it.


May not have been bad in the earlier sets as they would have used phosphors that persisted longer to compensate, but later on, manufacturing costs were driven down and the same tubes/coatings from 60hz sets were being used in 50hz sets, making the flicker much more apparent.


I'm using a PVM so it's high end kit, however perhaps because it also supports 60Hz they are unable to use the longer-glowing phosphors?


Presumably the phosphors in the PAL CRTs glowed for longer?


Many LEDs have an annoying flicker as well, which can be very annoying and distracting for me. Especially around christmas time, when everyone and their grandmother puts out thousands of cheap LEDs. Even when they're not trying to blink, they're still flickering. I suspect I'm sensitive to this kind of stuff because I suffer from migraines, but still, I can't imagine I'm the only one.


Christmas LEDs are technological abominations. They literally hurt to look at.


And they don't really twinkle like the heat driven, blinker style incandescent lights do. I loved to watch the pattern on the ceiling as a kid. It would change in subtle ways. Never the same, but sometimes really close.

I got a set and I am scared of them. Fire hazard. But I do run them once in a while for young ones at Xmas time. Just so they can check it out.


I have had the same feeling about early cfl lamps. Turning those on seemed to make the rooms darker and more dingy instantly


Yes! We tried to like those and had the same impression.

Amount of light seemed good, but quality was in the dumps.


I don't understand what it is about some LEDs that make your eyes hurt when you look at them.

Best way I can describe it is that my eyes can't focus on them and are constantly trying to.


The worst LED lights are 60Hz, very short duty cycle, very quick flash, with no diffusion. They're seemingly-infinitely small pinpricks of pure chroma flashing at an unpleasant speed. The worst are the blues.


I can't focus properly on bright blue things. There's a football stadium not far from where I live and its name alternates between red and blue in big text.

I know what it says but I cannot read it at night when it is blue.


> I can't focus properly on bright blue things.

No one can, for a bunch of reasons. The eye is mostly corrected for red-green wavelengths; the eye is a lot less sensitive to blue light; there are far fewer receptors for blue light, which reduces spatial resolution.

Monochromatic blue displays are an extreme UI antipattern and should not be used for anything. If combined with high luminosity, they become essentially unreadable unless close up. E.g. ultra-blue seven segment LED displays. VFDs can suffer from this, but this is also the reason why many of them use green-blue phosphors, and not blue phosphor (which would be possible).


It was impossible for me to spend 8 hours a day in front of a screen until the end of the 90s. I still have a 1999 CRT monitor and I don't have that problem. This feeling of pain in my eyes is something I wouldn't want to experience anymore. There are times when I miss my dot matrix printer.


This sensation may be caused by the flickering as the phosphor fades towards black before the screen is retraced. There are CRT monitors with long persistence phosphor that mitigate this, but it makes them unsuitable for motion video.

My favorite concept is that of the Tektronix vector terminals. They trace the image to a long persistence phosphor as necessary and then retain the image without retracing it by constantly bombarding the whole screen with just enough electrons for the already lit phosphor to stay lit. As a result, once the image is drawn the screen may have a little brighter black than a "normal" CRT monitor, but no flickering. I've never seen one in real life, but it seems like it would be great for ergonomics.


Check out this demonstration of "Space War" on a PDP-1 using a radar long-persistence CRT: https://youtu.be/1EWQYAfuMYw


I remember when I switched from a CRT to an LCD monitor. The difference was so dramatic I remember my head aching. For me it wasn't the brightness it was the curve vs flat. Going from looking at a curved screen for years to a flat one was a massive change.


Out of curiosity, did you ever have similar issues with flat CRTs?


I had a flat CRT TV, a 30-ish inch Toshiba widescreen. But I was seated on a couch ten feet away so no problems there. I never had a flat CRT computer monitor.

I think my reaction was due to being so close to the computer CRT monitor. My brain had adjusted to the curve but when I got the LCD I felt like I was going to throw up I felt so disoriented. It took about a week to get over it.


Same.

A CRT with refresh rates below like 60 and I could see the frames. It was torture. 120Hz or more didn't seem to be an issue, and it's certainly not a problem since I've had LCDs.

That being said, it might be fun to go back for a while.


While running my CRT monitor at 60 Hz recently I discovered that peripheral vision has better temporal acuity than foveal vision. Flicker is much more noticeable in things you’re not looking directly at.


I think this had a lot to do with low quality fluorescent light ballasts in offices. I agree that it seemed to happen a lot more with CRT, but I have experienced it with LCD too.


In 2008 or so, I ended up going to a neurologist because I was getting these brutal headaches that couldn’t be resolved by my GP. They couldn’t find anything either. And then I quit my job and no longer spent my days staring at a CRT under shitty fluorescent lights, and all the problems went away. Magic!


Some people call it an allergy to electricity, but most of them just have a problem with flickering lights. It's incredibly straining even if we don't notice the flicker.


On LCD monitors, I could notice that on very low brightness settings, which was likely due to PWM flicker. I guess I'm sensitive to it.


CRTs were really good. They really were.

I remember back in the day the HardOCP forum[0] had massive threads about people buying the FW900 in the mid-2000s. It definitely achieved legendary status. Sadly I never had a chance to use one.

I remember having a 21" NEC that let me play Quake 2 / 3 at 120hz at 640x480 in the mid-late 1990s and I think I paid like $120 for it back then from one of those refurbished monitor sites. It also did 1600x1200 at 60hz for non-gaming.

I still don't know how those refurbished sites stayed in business because they offered free shipping, but a decently sized CRT back then used to weigh like 60 pounds (30 kg).

Back then I remember waiting so many years to get an LCD because the input latency, refresh rates and color accuracy were horrendous for so long despite being 3x the price.

[0]: https://hardforum.com/threads/24-widescreen-crt-fw900-from-e...


In 2005, I went and bought a $400 19" LCD.

In 2006, I pawned it off on the family and bought a $65 21" Sun-branded Trinitron CRT. Used it for a couple years then sold it to a co-worker for $25.

It wasn't til 24" 1920x1200 LCDs became available that I felt like there was a wild improvement over a CRT. Yeah, the FW900 would be the same story, but you never saw those for sale locally.


For a long time the big advantage of LCD was just size and energy use. You lost a lot of the clarity with the conversion from analog VGA or SVGA signal in devices that predated DVI and HDMI connectors. It would be really hard to trade in my 28" 4k display for any CRT I've ever had.


For work, LCD is absolutely superior.

Gaming - especially shooters - however, is a totally different use case: refresh rate is absolutely necessary for quick reaction times, while something like readability is not as big of a factor (games tend to use big and easy-to-read fonts). Of course, you still need some color quality to spot enemies, but it's very different from reading text for several hours.


> Refresh rate is absolutely necessary for quick reaction times

This reminds me of weekend warrior bicyclists dropping $5k on components to save a few ounces of weight and go a tiny bit faster.

Most people are average enough at video games (or whatever their hobby of choice is) that they’re still going to be average whether they have a CRT or an LCD (or whatever pricey piece of gear they “need”).

Sure it’s fun to spend money on your hobby but to say it’s “absolutely essential” is a little silly.


> This reminds me of weekend warrior bicyclists dropping $5k on components to save a few ounces of weight and go a tiny bit faster.

I think anyone who is a step up from a casual gamer would recognize the difference between 60hz and 120hz if they were somewhat familiar with a game. Especially games that required tracking your opponent with a mouse, such as aiming a hit scan weapon like a lightning gun where your goal is to keep a continuous beam on them while both of you move unpredictably.

It was especially apparent in Quake 3 back in the day. If you didn't run at 125 fps with a 120hz refresh rate you were at a disadvantage at competitive levels. Not even world class competitive, but enough to be good enough to play in tournaments with thousands of dollars in prizes.

If you ever want to see hardcore technical analysis of hardware from a gamer's POV, try reading forum threads from the early 2000s when ball mice started to transition to laser and optical mice. Or using after market Teflon pads on your mouse feet to get more consistent friction.

Fortunately most of these things were only a few bucks. Most competitive gamers in Quake would also use config settings that reduced the graphics quality[0] of the game to maximize visibility and get consistent frame rates.

[0]: https://www.esreality.com/files/inlineimages/2011/82175-conf...


60 vs 120 Hz is obvious enough that you don't even need a shooter to notice the big difference. The mouse cursor is enough, scrolling is enough, dragging windows around is enough, scrolling in an RTS is very obvious etc.


> This reminds me of weekend warrior bicyclists dropping $5k on components to save a few ounces of weight and go a tiny bit faster.

Was in a high end bike store in Philadelphia in around 2007 or so (pre-crash). There's a guy in front of me at the repair desk who is holding a carbon fiber bottle cage. The tech is looking at it, because it's cracked. He tells the guy that he probably overtightened the bolt holding it on ("very easy to do on the carbon cages") and suggests that he just puts a metal (even titanium!) washer between the bolt and cage, and everything will be OK.

Guy says "yeah, but the extra weight of the washer would mean that it was pointless to buy the lightest possible cage"

I interrupt and ask the Guy if he ever adds slightly too much powder to his energy drink when going out for a ride.

/badaboom


If the guy in question was an optometrist with a beer gut, it would match the ridiculousness of the conversations I’ve heard in Portland’s high end bike shops.


When one craves that trance or flow state possible in gaming, average or not, the low latency may be essential.

Relative to elite players, it won't matter.

In terms of the experience? It really can be.

We all get old. Enjoying what potential we have on the way through can be worth a little money.

"Need" and "want" can be blurred here and it is OK.


60 fps games look and play obviously better than 30 fps games, to anyone. For something like a first person shooter, 90 fps feels obviously better than 60, I think to most people. In VR, meaningful latencies get even lower. I don't think we're really at the pro-athlete-gear types of improvement with most display technology yet.


I play a decent amount of BallisticNG, a high speed racing game. I'm not great at it, but when I upgraded from a 60Hz display to a 144Hz display, I suddenly started breaking all my previous track records by several seconds without trying.

Lower latency absolutely gives a noticeable advantage in competitive games.


Drag a window around on your monitor. Can you read its title when it's moving?

This is what it's about: clarity in motion. You assume that average modern monitors are in the 95th percentile and suffering from diminishing returns from better refresh technology. I disagree with that. Moving things on a modern display is far from smooth. Look at VR: that field is heavily impacted by this fact. Look at the tablets and phones migrating to >60Hz in the past 2 years.

Resolution in motion is an issue.

In terms of analogy, it is not about scraping nanoseconds as in high-frequency trading, for example, but more like moving your pen on your iPad and having the line not lag behind in a disturbing way.


> Sure it’s fun to spend money on your hobby but to say it’s “absolutely essential” is a little silly.

Well, when optimizing for FPS performance, it is :)

I agree that it's rather unnecessary in general, though. You can game on an old TV with 60hz interlaced and 30ms response time just fine. But then you're probably very much not in the market for dropping four-digit sums on a display.


> You lost a lot of the clarity with the conversion from analog VGA or SVGA signal in devices that predated DVI and HDMI

A VGA connection on non-crap hardware and cables should produce an image identical to digital connections.


> It would be really hard to trade in my 28" 4k display for any CRT I've ever had.

Agreed. I use an LCD today (25" 2560x1440) and I'm happy with it.

But I did wait until about 2007 or 2008 to get my first LCD, which was a 1600x1200 Dell FP2007 (which still worked the last time I plugged it in a year ago as a very temporary 2nd monitor while I waited for a replacement).


I had three FW900s that I got used around 2007. One from Craigslist and two from a liquidator. The liquidator ones had taken a bit of a beating. The craigslist one was great except for a couple inch diameter splotch of tinted image in one corner. I got rid of them during a move. Now that I know more about electronics repair I think they all could have been fixed. Doh.


Many a year ago, in college, I was helping a friend move. He had a giant 21" CRT, which seemed like it weighed eighty pounds. It was the dead of night, in the middle of winter. We were walking on ice, carrying it together. (We both had late-90s computer nerd biceps.) He slipped, and with a great exertion I managed to grab his side and prevent a fall to the pavement. Neither of us had much in the world at the time, and that sweet monitor was one of his most valuable possessions. Needless to say, he was grateful that I'd rescued his 1600x1200 view into the future.


Judging from the description, that may have been a G520: the 4:3 brother to the FW900. Truly something to carry into the future.


Gaming LCDs these days boast response times in the low single digit millisecond range. Even at 144Hz that’s less than a frame.

If there’s framebuffer-to-photons lag, I’d point my finger at DVI decoding before the physical movement of liquid crystals.

Also, for all the focus on “physical changes” there’s no mention of phosphor fade rate, which is a physical property of the CRT screen with a built-in trade off between latency and flicker.

Please let’s not have CRTs become the new Monster Cables.


The gray to gray response time is heavily gamed. There's some tech reporter investigation on it that shows how most monitors that game the GTG response time have slower overall response than monitors with a ~7ms time.


TFT Central's reviews are a great source for detailed measurements of response times: https://www.tftcentral.co.uk/reviews/asus_rog_swift_360hz_pg...


I used to have one of these (GDM-FW900) at an old job, a long time ago. They were great monitors for the time, top of the line. They weighed almost 300lbs, IIRC. And like all CRTs they can put out some heat.

LCDs at the time still left much to be desired so VFX houses weren't rushing to replace them with inferior, at the time, technology.

I remember running mine at a higher refresh rate. 75Hz and 100Hz were easily reachable. At 120Hz you started to drop resolution and the coils would sing. I miss the wide range of resolutions you could run. LCDs still haven't been able to replicate that without making everything a blurry, boxy mess.

I was able to take the broken smaller GDM-F500 (21") home. It was dim, and after a bit of research I found out replacing a capacitor and a few resistors would bring it back. But it was terrifying working directly on the HV board. Once it was working I had so much screen. Almost getting electrocuted was practically worth it.


Yes! I think the ability to watch lower-res video (even 480p) on them and still have a decent picture is what I miss the most.

It would be quite comical if the 300lb figure were correct, having lugged one around before. They hovered around 92lbs, though.


> It would be quite comical if the 300lb figure were correct, having lugged one around before. They hovered around 92lbs, though.

I might be mistaken then. There was also a Trinitron in the room, 30-something inches and required 3 people to pick up. The FW900 certainly felt like 300lbs, for a single person.


I had one of those that a programmer that worked for me sold me.

It did cost him something like $6000-9000 new.

The image was incredible, but it had lots of drawbacks:

1. It was huge and heavy. Most people who talk about "people's obsession with flatness" have never experienced a screen that is as deep as it is wide and tall. Attaching it to the wall was a nightmare.

2. It had problems with magnets. I put a big magnet like 1 meter away and it affected the screen.

3. It emitted X-Rays directly to your eyes. Not good.

In the end replacing it with LCDs was a great decision. Much better for the programming or CAD that I do.

OLED is great, as the article says, if you can recover the cost somewhat.


> 3. It emitted X-Rays directly to your eyes. Not good.

What's your source for this? Was it limited to the Sony?


No display made since at least the late 90s would emit X-rays.


Ugh. Keyboard hipsters are bad enough. Now we're going to have CRT hipsters!

It's like, name some piece of newly vintage tech, find its fans, start a movement, sell folks their shit nostalgia, lather, rinse, repeat


Being able to choose your desired mechanical response of the keyboard switches is useful for serious work.


For some, I guess. For others it is pointless materialism and social posturing that this hobby-slash-profession could really do without.


I wonder if there’ll eventually be enough demand for new CRTs, and if manufacturing technology has improved enough to make a CRT that is less of an ecological menace. If cost were only partly a constraint, how good could we build a CRT today?

I am still mesmerized by new display technologies / display technologies that I haven’t seen in a long time. Like the vector CRTs that exist in old Asteroids cabinets. The phosphors are really bright—brighter than you can imagine an LCD being.

I also remember the first time I saw e-ink and being shocked to see something so inert looking shift to a new image.

This article really captured my imagination. I’ve not seen a modern GPU driving high frame rates on a CRT, but now I’m very curious to do so. I’d imagine the experience would defy my intuitions the same as it appears to have done for the authors here.


https://en.wikipedia.org/wiki/Surface-conduction_electron-em...

These have almost all the advantages of a CRT, but without a lot of the downsides like bulkiness and materials usage.


Reading through the history section pisses me off. That seems like some really impressive tech; I wonder where it’d be today if it had succeeded.


They took up a huge amount of desk space though, which was quite inconvenient. But perhaps they could be made flatter?


Not without significant consequences. The closer the electron gun is to the phosphor, the lower the incidence angle of the electron beam becomes - think of aiming a laser at a card perpendicular to the beam, and gradually rotating the card toward the parallel. The same thing happens to an electron beam hitting an acutely angled screen as to a laser dot hitting an acutely angled card.

So you lose power density as the beam spreads out, which both dims the target phosphor dots (as they receive less energy) and causes adjacent dots to light dimly (as they receive energy "smeared" off the target point). To passively counteract this effect, you need to increase the curvature of the screen, which causes pincushion distortion in the resulting image. To actively counter it, you need to vary beam intensity with scan position, but that still doesn't resolve the loss of focus away from center.

That can only be fixed by using a narrower beam, so that the incidence angle matters less. Whether that's feasible I don't really know - this is about where my understanding of the relevant physics and electronics runs out, unfortunately. But if it can be done at all, I would intuitively expect the required control and drive electronics to be very expensive to produce, and maybe also to need regular tuning in a way that ordinary CRTs largely avoid.
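
A rough way to see the scale of that falloff (illustrative geometry only, with made-up distances rather than real tube dimensions): at incidence angle theta from the screen normal, the spot stretches by roughly 1/cos(theta) and the power density drops by cos(theta). A sketch in C:

    /* Illustrative only: how much an electron beam's spot stretches on a
       flat screen as the deflection grows. The distances are invented,
       not real tube dimensions. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double gun_to_screen_cm = 40.0;                 /* assumed throw distance */
        double offsets_cm[] = {0.0, 10.0, 20.0, 30.0};  /* distance from screen centre */
        for (int i = 0; i < 4; i++) {
            double theta = atan(offsets_cm[i] / gun_to_screen_cm); /* incidence angle from normal */
            printf("offset %4.0f cm: angle %5.1f deg, spot stretched %.2fx, power density %3.0f%%\n",
                   offsets_cm[i],
                   theta * 180.0 / acos(-1.0),
                   1.0 / cos(theta),
                   100.0 * cos(theta));
        }
        return 0;
    }

Make the tube shallower (smaller throw distance) and the same screen offset gives a steeper angle, so the corners dim and smear faster.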


The obsession with device thinness and smallness is a cancer. Engineering gets put into the wrong efforts, and cost, efficiency, and repair-ability are all damaged.

Modern lcd screens seem to be the worst of this. The screen itself is quite thin, but then the base, and the cables coming out the back mean it’s still effectively 3-7 inches “thick.” The thinness accomplishes nothing, and has made the device worse.


Sure, but I think you're looking at CRTs with rose tinted glasses if you think the thickness, size, and weight of those monitors was not a problem. I physically couldn't fit a couple of extra monitors, plus a vertical monitor for code, on my desk if they were as thick as the CRT I used to use ages ago. And having recently upgraded from a plasma TV to an LED - good lord, even the plasma TV was stupid heavy. Sometimes you want to be able to move physical objects around, you know? Cleaning, rearranging, etc. There's a difference between going from a 10 mm phone to a 9 mm phone and going from a 30 inch CRT to a 65 inch LED.


I used to have around 10 CRTs at a given moment in my collection, 5 of which were usually in my trunk, ready to go at a given moment.

The weight made transporting them absolutely horrendous. I find it ridiculous to think that it's not worth spending time, money, and research on smaller form factors if possible.

I will take the compactness and weight of a modern LCD over a CRT any day.

The only reason I had those CRTs was that the smash bros gaming community needed them, as the Nintendo GameCube natively outputs 480p, which makes modern displays untenable due to their upscalers, which take different amounts of time to upscale. 480p CRTs are consistent.


The thinness makes it take a little less space if you lose the base and switch to an armature (which is generally nicer anyway).


The base has to have depth or else the monitor will fall over. You can also pop the screen off the base and do whatever you want with the loose screen, at least for mine. Wires also typically are inserted parallel with the screen, so no trouble there.


At least back then we got to have desks that were bigger than a cabinet door, if only because they had to accommodate the CRT.


> I’ve not seen a modern GPU driving high frame rates on a CRT, but now I’m very curious to do so.

You can probably find some decent CRTs on Facebook Marketplace for very cheap (e.g. the other day I found a bunch of them for less than 10 euros each) - they won't be as good as those mentioned in the article (unless you get really lucky), but anything that can do 120Hz or above should be enough to let you see that.

A bigger problem would be connecting them to the modern GPU. Nvidia removed the DAC from their GPUs after the GTX 9xx series, and even that wasn't that great (AMD also removed it some time before). So you'll need to find some way to convert the digital signal to an analog VGA signal and a good DAC for that. I think there is some thread in the HardOCP forums about that, but personally I haven't tried to go down that rabbit hole (yet :-P).


Next will be hobbyists longing resurrection of plasma TVs :)


CRTs are still heavily used in the smash bros. melee gaming community as the nintendo gamecube natively outputs to 480i.

Using a modern 720p+ display involves upscaling, which causes latency. This latency is wildly inconsistent across tvs, which makes competitive play untenable.

There are monitors with extremely low response time (~1ms), BenQ was the go-to for a while. It used to be that the cost was prohibitive, but now finding CRTs is more prohibitive... especially finding a good trinitron.


I have a tiny Sony trinitron (I think) CRT. The screen is the size of a banana diagonally. I specifically have it around just for practicing SSBM.

Many people nowadays do just use their PC with a good monitor. There are a ton of hacks that you can do to reduce the lag that you get from a monitor: https://www.youtube.com/watch?v=J6B4t5fCEbQ

The video above is from Hax, a prominent smasher who has been part of the SSBM scene for a very long time.


I was going to mention this. I've heard stories about players scouring the streets on council cleanup day in the hopes to snag an old CRT.


> Taylor said in an interview that he's willing to pay up to $500.

I wonder what the shipping cost of those heavy monsters are.

Me, I've used CRTs for 25 years. I have zero nostalgia for them. I don't want to ever use one again. There's nothing about them I prefer.


Reminds me of a story from when I was in university around 2005. In our usenet group some people found a sale of used FW900s (definitely for less than 500€ each). They then organized a group buy, which had around 20 interested people.

Then they wondered how they would actually get all of those transferred across half the country. I think in the end they rented a truck or trailer to pick them up. It was definitely a bigger feat than getting an LCD shipped from your favorite online store.


I still want my desktop, which is currently a Dania dining room table, to be a display. Wouldn't that be cool, even with my coffee cup and junk sitting on it?

I also want a true wall-sized display. No, not a projected image.


I feel the same way about CRTs. Things are so much better now (to me anyway). I also feel the same way about loud keyboards. The current fetish for "clackety clack" keyboards cracks me up.


There was a really good article shared a few weeks ago about video memory on old systems that didn't use a frame buffer; instead the video data was generated in real time with only a couple of bytes of buffering. This had the effect of nearly zero lag, because sprite locations could be calculated right up until the moment the first row was drawn to the screen.

I wish I could remember exactly what that article was.


This was quite a popular technique to achieve fancy video effects on low powered hardware.

A good example is the various parallax and scrolling effects in the intro cinematic to Link’s Awakening on the Game Boy/Color, achieved by changing various hardware scroll registers in the horizontal blanking interval (time between one line being finished and the other starting). The main limitation is that you can only do horizontal effects, not vertical.

I’m wondering if the article you’re thinking of was the Wired piece on the Atari 2600? That famously ran all of its game logic in the H- and V-blank intervals.


I seem to recall the article being about Apple II hardware, but it might have been Atari 2600. They are of similar generations.

edit: Yep, you are right - the article is titled "Racing the Beam"

https://www.wired.com/2009/03/racing-the-beam/


I was part of the Commodore 64 demo scene back in the 80s, and this technique was heavily used. You only had so many sprites available, limited color palette, could only use one font (with a small set of glyphs), etc.

By setting up "raster interrupts", which would execute code when the CRT reached a specific raster line on the 320x200 display, you could do all sorts of trickery. The raster interrupt would trigger after it rendered the first horizontal line, and then you could change the pointer to the sprite/font/palette/etc. for the duration of the next display line. A common effect that everyone would learn first was to simply change the background color, so that every display line was a different color, producing a rainbow background.

In a sense, the [video] demo scene was almost entirely about this effect. How much could you do in the tiny number of CPU cycles available to you between the raster interrupt triggered at the end of one horizontal scan line, and the start of the next line, as the beam wrapped around? You could just barely execute about a dozen CPU instructions. Usually you'd flip a couple pointers that the video hardware would look at when it started rendering again, and then you'd pad in a couple no-op instructions so that the change occurred "off screen". This really happened in the margins of the CRT that were covered by a bezel, which meant that on some displays the transition really was invisible to the eye, but on others, there would be flickering at the edges as the background color changed. I used to test my demos on multiple displays to try to minimize the effect, having to shave off an instruction or two in order to make things look better.
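
For anyone curious what the smallest version of that trick looks like today, here's a minimal sketch in cc65-style C (the VIC-II register addresses and colour numbers are the standard documented ones; real demos did this with raster interrupts and hand-counted assembly rather than busy-waiting, so treat it purely as an illustration):

    /* Split the C64 screen into two background colours by racing the beam:
       poll the VIC-II raster register and change the background colour
       register when the beam passes a chosen line. */
    #include <stdint.h>

    #define VIC_RASTER  (*(volatile uint8_t *)0xD012)  /* current raster line (low 8 bits) */
    #define VIC_BGCOL   (*(volatile uint8_t *)0xD021)  /* background colour 0 */

    int main(void) {
        for (;;) {
            while (VIC_RASTER != 100) { }  /* wait for the beam to reach line 100 */
            VIC_BGCOL = 2;                 /* red below the split */
            while (VIC_RASTER != 250) { }  /* wait until the beam is near the bottom border */
            VIC_BGCOL = 6;                 /* back to blue before the next frame */
        }
        return 0;
    }

The interrupt-driven version does the same register writes, just triggered by the raster interrupt instead of a busy loop, which is what freed the CPU for everything else the demo was doing.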


If so, “sprite location could be calculated right up until the moment the first row was drawn to the screen” is a tad optimistic.

In theory, you could, but on the Atari 2600, during screen drawing, the CPU cycle budget for each line was 76 cycles (see https://cdn.hackaday.io/files/1646277043401568/Atari_2600_Pr...). Rounding up, that’s 40 instructions, and if the current line is different from the previous one, you’d have to update memory from that same budget.

Because of that, reading of user input and updating of player/object positions typically/always was done during the vertical blank interval.
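
The 76-cycle budget falls straight out of the published NTSC numbers (a back-of-the-envelope sketch; the constants are the standard 2600/NTSC figures):

    /* 2600 timing arithmetic: the 6507 runs at one third of the NTSC colour
       clock, and a scanline is 228 colour clocks wide. */
    #include <stdio.h>

    int main(void) {
        const double color_clock_hz = 3579545.0;        /* NTSC colour clock */
        const double cpu_hz = color_clock_hz / 3.0;     /* ~1.19 MHz 6507 */
        const int cycles_per_line = 228 / 3;            /* 76 CPU cycles per scanline */
        const int lines_per_frame = 262;                /* NTSC frame */
        printf("CPU clock: %.2f MHz\n", cpu_hz / 1e6);
        printf("Cycles per scanline: %d (%.1f us)\n",
               cycles_per_line, 1e6 * cycles_per_line / cpu_hz);
        printf("Cycles per frame: %d\n", cycles_per_line * lines_per_frame);
        return 0;
    }

At 2-7 cycles per 6502-family instruction, 76 cycles really is only a few dozen instructions, and much of that goes to feeding the TIA during the visible line.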


Color convergence on large CRT monitors is problematic. Back in the early 90s I spent $1200 out of my own pocket to buy a 20" CRT. At the time I was designing 3D rasterization algorithms and circuits and would spend hundreds of hours tweaking constants (how much difference loss does a 5b interpolation fraction introduce vs 6b, etc) and A/B testing to get the most cost effective & high quality results.

I bought and returned two and just lived with the defects of the 3rd one when I realized it was impossible to get good convergence across the display. I would confine my A/B tests to a specific area of the screen where convergence was best.

BTW, I forget the exact details, but color convergence at the factory was set with the monitor facing some particular direction, let's say north. If you set up the monitor facing east, the convergence would be off a little bit.

When quality LCDs became available, convergence was a non-issue, but they used dithering to attempt to increase the color gradations, so it didn't really help my kind of work anyway.


In my youth I bought one of those types of high end monitor second hand. Had 5 BNC inputs and required a resolution tweaking utility to change the sync polarity.

I spent hours tweaking convergence on the 9 million trim pots to get it just right. Best color gamut of any display I ever owned. Sadly the display physically broke something internal when I lost the included plastic screwdriver and tried using a metal one. The internal heads of the trim pots were apparently connected to the circuit.

The display was fixed to 1024x768x75Hz exactly (though you could squeeze in a bit more with modeline tweaking), but the colors were great.


I remember reading something by John Carmack back in the early Oculus days, where he was basically saying he wished he could use small CRTs in VR headsets for many of the same reasons the article highlights.


What stopped him?


Not sure. At a guess, weight? And I'm guessing no one was making CRTs small enough to be practical? Maybe power requirements?


As a former epileptic, the strobing that comes with CRTs is something i’m happy to be rid of. It wasn’t something I could see so much as perceive, but it’s a feeling my body still remembers.

That said modern monitor tech still has big problems. Quantum dot has a chance of shifting the field in the next 10 years but right now any monitor effectively excels at only 2-3 of: color gamut, color accuracy, pixel density, refresh rate, response time. HDR is a whole different topic that the industry hasn’t figured out.


I'm curious if you've ever had the chance to see an oled monitor in person and if it reacts similarly to a CRT for you. They have a global refresh/flash instead of one that's rasterizing so it might not have the same effect and they don't have the same phosphor fade of a CRT either. I'm also curious about the LCD monitors that strobe the backlight to reduce apparent motion blur (basically they shut the backlight off during the time that each pixel is fading between colors in each frame).


I own a 77” LG oled that I positively love and have not noticed any strobing. I’m also no longer epileptic, so maybe that also plays a role?


It might be a much higher frequency which you're not sensitive to. My Galaxy S5 had 240 Hz. You can figure it out using a digital camera. Set the shutter speed to e.g. 1/10s, and take a picture of a bright spot displayed on a dark background while moving the camera so you get "motion blur", and count the number of times the object repeats. Multiply by 10 and you have your refresh rate.
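
The arithmetic is just repeats divided by exposure time; a tiny sketch (the counts here are invented for illustration, not measurements):

    /* Estimate a display's flicker/PWM frequency from a long-exposure photo
       of a moving bright spot: frequency ~ repeats / exposure time. */
    #include <stdio.h>

    int main(void) {
        double exposure_s = 1.0 / 10.0;  /* 1/10 s shutter speed */
        int repeats = 24;                /* copies of the spot counted in the photo */
        printf("Estimated flicker rate: %.0f Hz\n", repeats / exposure_s);
        return 0;
    }

With a 1/10 s exposure, 24 copies of the spot works out to the 240 Hz figure mentioned above.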


The problem with monitors is much more with the display industry's preference for finding new ways to scam customers instead of commercializing display technology improvements.

The history of LCD monitors and TVs is a history of the industry increasing prices while finding new corners to cut by removing functionality that reviewers and users didn't know they needed to test for.

When users first realized that IPS was vastly superior to TN for most purposes, the industry responded by making low-quality 6-bit IPS panels, which the industry happily described as 'IPS' without disclosing that the panels can't display color gradients--and often flickered badly due to FRC. Early 6-bit panels were worse than TN in everyday use.

6-bit IPS is now essentially unavoidable in smaller, non-4K, screens.

Another fun scam the display industry concocted was releasing one production run of good quality 8-bit IPS monitors, to get positive professional reviews and user word of mouth, then making subsequent production runs under the same model number with 6-bit fraud-IPS, PVA, or even TN panels.

The low-frequency PWM backlighting scam (which probably saves all of ten cents per display) is now known well enough that reviewers will test for it, but the industry has developed other ways of preventing users from purchasing good-quality hardware at a fair price.

Dynamic contrast is a particularly insidious problem. This piece of joy from the display industry changes the backlight intensity and RGB pixel values depending on screen contents. As a result, the color tint of the display changes depending on screen contents. These constant color shifts make dynamic contrast monitors unusable for even non-professional design/photo/video work and prevents color calibration entirely. This technology is even applied, with no user option to disable it, to monitors sold as sRGB pre-calibrated and marketed towards mid-range color work. Officially, dynamic contrast is intended to save energy, but the actual intent is likely to force people who want stable colors to buy professional displays at ten times the price.

A new scam that's emerged since the start of the pandemic is color gamut restriction; it's now very difficult to get displays--especially laptop displays--that support more than 50% of the sRGB gamut. Displaying actual red is now something the industry has decided to exclusively paywall into professional panels targeted at the design market.

The problem with displays isn't technology; it's an industry that's built on the premise of ripping off consumers.


Unless you play video games, the space, power and heat aren't worth it. I didn't have much desk space and higher resolutions became really cheap to come by.

So maybe 8 years ago I got rid of my 2 fw900s and now use rotatable 4ks that I put side by side in portrait mode, essentially getting a ~4kx4k square (18:16). It's $750 or whatever 2 go for now well spent. Most modern laptops can even drive two monitors as long you get the adapters right. It's a really good work set up. Highly recommended.

I'd honestly say it's the most significant thing that's affected how I interact with computers (and I've done foot pedals, gestural systems, custom made input devices I've designed myself, repurposed midi device for UX, my own window manager, etc ... 2 4k monitors in portrait is the top of the list, really)


I love CRT displays myself. The standard 50/60hz is a bit rough, unless one is working on a high persistence phosphor. Older monochrome, amber, green screens often have that phosphor.

I like how simple they are to drive too. Managing a high resolution analog stream takes far fewer resources.

For personal electronics, retro fun, I recommend one.

Glowing phosphor in a glass tube just rocks!

There were variations too. I got to work on one of these in the 80's:

https://m.youtube.com/watch?v=T-F7ZySfgZ0

In a dim room, these are beautiful. Up to 4k resolution in the 70's, no display buffer. Surprisingly techy looking and feeling.

I will miss CRTs when they are no longer available. They should be made in 16:9 for a while longer. People would use them.

All that said, current flat panels continue to improve. They are fine for the majority of things most of us do.


Just a late edit:

My favorite thing to watch on CRT displays is well produced SD programs. They look really good.

At the peak of SD, I had tuned a great Sony to near what a PVM can do. Watching DVD movies on that via RGB or component in a dark room was a good experience. Huge dynamic range. The occasional trail from a bright entity in the program. Real black.

In many ways, many of us did not experience what was lost in the tech change. This revival makes sense in some ways.

Look at vinyl. It is a similar thing. The overall experience is really good. Gratifying.

Fact is, where there are limits, there is art. A great vinyl production is something I appreciate a lot. A similar one done digitally actually sounds better, but the art is not there.

The CRT is art. We had limits and the CRT bubbles up out of that just nailing it.


Ah man, memories, and not from 20 years ago, either. I bought a top-of-the-line 21" Trinitron in 1999 and after 4.99 years of the 5-year warranty something stopped working inside it so I shipped it back to Sony and they couldn't or didn't want to fix it and they didn't have any more to replace it so they shipped me a new 24" GDM-FW900 for free, which gave me years of trouble-free service. Eventually I had to get rid of it because video cards with respectable analog outputs became scarce. Younger computer users never had to deal with this but in the late stages of CRT technology the quality of the analog end of the video card (the RAMDAC) was a major differentiation between brands. Having a quality RAMDAC and a quality cable of minimal length made a visible difference in the resulting picture.


Trinitron were great monitors. Except you could never un-see the embedded wires once you noticed them.


> Except you could never un-see the embedded wires once you noticed them.

So true. Seeing those wires for the first time was a major WTH moment. And you could sort of forget to notice them for a while, but they always came back.


`xsetroot -mod 2 2` and I rarely thought about them.


I still use CRT, and I own a Radeon R9 380X.

I am worried by the fact that this card is the last one with a RAMDAC (newer ones don't have one), but any monitor better than mine is ludicrously expensive.

I dunno what to do now.


It's weird to me to see all this affection for CRTs here. I had a "flicker free" 120 Hz CRT and I could still easily see it flicker and got headaches when using it for more than a few hours.


TL;DR: For gaming, because ‘input lag’.

[In a CRT] the electron-to-photon exchange happens instantly. While CRTs do have some sources of lag — namely, the time spent buffering each video frame and scanning each line of the frame from top to bottom on the screen — those delays are on the order of microseconds. When you move your mouse or press a button on the keyboard, the response time is imperceptible...

By contrast, an LCD requires physical movement on the part of every pixel. On an LCD, the back of the display emits a constant stream of white light, which passes through a polarizer and onto an array of liquid crystals. Applying voltage to each crystal causes them to twist, altering the amount of light that comes through the screen's front polarizer.

Compared to electron-photon conversion, the physical movement of liquid crystals inside an LCD display takes a lot more time, introducing input lag. It also creates blurriness when there's a lot of motion happening across the screen.

Also TL;DR: CRTs to look for.

Sony FW900 16:10 CRT [also sold as] HP A7217A, SGI GDM-FW9011, and Sun GDM-FW9010 ... 16:9 CRT monitors including the Intergraph InterView 28HD96 and 24HD96... [and] you'll need a graphics card with an analog output, such as Nvidia's 900 series and AMD's 300 series cards, or a digital-to-analog converter.


The description is actually not very good.

Input lag doesn't purely exist because of liquid crystal movement and pixels changing colors. If it did, then an input lag of 50-100ms (which was common with older generation LCDs) would also mean the screen's response time (the time to change color) was that high, which would mean your picture would be a blurry mess.

The majority of input lag came from additional signal processing algorithms which were built into LCDs, which buffered the full picture for 1-2 frames. E.g. to apply overdrive algorithms, color correction, scaling, etc. Those are all running before the signal gets towards the crystals.

I think a lot of that was actually just bad design in the earlier generation of LCDs. And it was really bad, I had a screen with 100ms input lag and neither working nor gaming on it was really fun.

But it seems like manufacturers now understand the problem better and try to minimize the delay through signal processing.


>those delays are on the order of microseconds

That's wrong, on a 60Hz CRT it takes the beam 1/120th of a second to reach the center of the screen, which is an average lag of 8.3 milliseconds. There are LCDs with 9ms lag.
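
For comparison across refresh rates (trivial arithmetic, but it makes the point that scanout alone sets a floor on average latency):

    /* Average scanout delay of any raster display: a pixel halfway down the
       screen is drawn half a refresh period after the frame starts. */
    #include <stdio.h>

    int main(void) {
        double rates_hz[] = {60.0, 120.0, 144.0};
        for (int i = 0; i < 3; i++) {
            double frame_ms = 1000.0 / rates_hz[i];
            printf("%3.0f Hz: frame %5.2f ms, average scanout delay %4.2f ms\n",
                   rates_hz[i], frame_ms, frame_ms / 2.0);
        }
        return 0;
    }

So the ~8.3 ms average on a 60 Hz CRT is a property of the refresh rate, not the tube; a 144 Hz panel gets the same figure down to ~3.5 ms before any processing lag is added on top.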


This whole article is nonsense. A 60 Hz CRT is going to have more input lag than a 144 Hz LCD (with a decent response time), purely because of the lower frame rate.

Why is nobody calling this out?


> a digital-to-analog converter.

How much latency would that typically add?


If you're using DVI-D -> VGA or HDMI -> VGA, probably a couple of pixels worth on a simple converter, as it would be expensive and unnecessary to cache a whole frame.

DP has incompatible timings so I would expect a frame or so on a cheap (though significantly more expensive) converter.
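
To put "a couple of pixels" against "a frame or so" in numbers (assumed 1920x1200@60 mode; blanking is ignored, so the real pixel clock is a bit higher):

    /* Rough scale comparison for converter latency: a few pixel times versus
       one whole buffered frame. */
    #include <stdio.h>

    int main(void) {
        double width = 1920, height = 1200, refresh_hz = 60.0;  /* assumed mode */
        double pixel_clock_hz = width * height * refresh_hz;    /* lower bound, no blanking */
        double pixel_ns = 1e9 / pixel_clock_hz;
        printf("Pixel time: ~%.1f ns, so a couple of pixels is ~%.0f ns\n", pixel_ns, 2.0 * pixel_ns);
        printf("One buffered frame at %.0f Hz: %.1f ms\n", refresh_hz, 1000.0 / refresh_hz);
        return 0;
    }

That's the difference between latency you could never perceive and a full extra frame of lag.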


For external analog to digital converters/upscalers, with good ones you can have zero lag since there would be no frame buffer. I have an OSSC that I use to upscale the VGA signal from my Dreamcast to HDMI output and there is zero lag.



I remember my Sony Trinitron and Iiyama monitors fondly. RIP.


Had both, don't remember them fondly at all because I moved a lot. These 35kg monsters were the perfect combination of unwieldy and delicate at the same time...


I feel like CRTs today belong to the same nostalgic area as cast iron skillets, vinyl players and manual gearbox cars. Sure, they are good for certain things, and they excel in some of them, but really they are usually way too much hassle to use and acquire, and require too many sacrifices in other areas where they don't excel. Articles which compare such products usually completely ignore or outright forgive any disadvantages of the "nostalgic" product, something they wouldn't even think of doing for a product created in 2020 instead of several decades ago.


Haven’t seen any other mention of this in comments, but the sound of CRTs used to drive me mad, particularly CRT TVs. If I walked into a house where the TV was on, I could often hear it from the entrance!


I was using my 24 inch Sun monitor for a long, long time after LCDs came out; I wouldn't want to downgrade to a 17 or 19. Also, it was running at 1920x1200, and going to a 4:3 LCD with a smaller display seemed wrong.

Funny, when I bought my first 24 inch LCD, a Dell, color matched, they gave me a free PC. I put an Nvidia 8800GT in and gamed for quite a while. It came with a Pentium D 820, and after a few years I put a 960 in it. I think I got 5+ years of gaming on it.

Still using the LCD as my 2nd monitor next to my 27 inch 1440p widescreen G-Sync gaming monitor.


Some of the Sun monitors were incredible, back in the day. I remember using a GIS system on, IIRC, a Sun 4 workstation with Sony Trinitron monitor. Just razor sharp and fast as hell.


I bought the Sony Trinitron from the story back in 2000 at Fry's. When the salesman tried to upsell me on a service contract I accidentally laughed in his face. It happened before I could stop myself and Sony had an absolutely incredible record for quality at the time. I used that monitor heavily for over a decade, but finally replaced it because the phosphors were starting to go and the picture was getting dingy. Still it was a great piece of hardware.


Recommended video to see how a CRT works.

https://www.youtube.com/watch?v=3BJU2drrtCM


Also, a 20-year-old CRT monitor (21") consumes almost four times the electricity of an LCD, and can cost about 5x more to run, even with energy saving mode.


Gamers aren't really known to go out of their ways to save 100 or 200W.


In the early 90's I worked on prototypes for the new FAA air traffic control system. We sourced 2Kx2K CRTs from Sony that were the most incredible things to behold when used with hardware anti-aliased vector graphics. Almost 30 years ago. Where has the time gone?

https://en.wikipedia.org/wiki/Trinitron


Interestingly enough, the LG CX 48 inch TV has been pretty much targeted at gamers and has received some really good reviews, particularly for Nvidia setups. Granted, at this size they don't work well on desktops.

One Review https://youtu.be/IR6RnZI2uoY


Better at some things than some LCDs maybe, like taking up desk space, heating the room, that sort of thing.


don't forget the lead and the ionizing radiation!


I have one of the variants(HP A7217A) sitting away gathering dust at a relative's place. Its weight/size just made it too difficult to move with over the years.

I hadn't realized they're still desired now. I'll have to dust it off and see if there's a buyer out there.


This is part of the reason why, when LCD screens were being introduced as "the new thing", most large shops would not display them side by side with CRTs - so you didn't notice how much worse some aspects were, like responsiveness and sharpness.


Yeah, CRT was amazing. In my childhood I experimented with my 320x200 black and white CRT monitor and 4-color CGA, and went over the 720px resolution by manipulating the registers of the video adapter.


What's the energy consumption like for LCDs, OLEDs and CRTs? It wouldn't surprise me if using a CRT meant higher consumption. Shooting around electrons in a vacuum can't be cheap.


Now you just need to find a modern video card with a VGA output, because if you use a converter, you're going to introduce lag, and probably more lag than you shaved off going from LCD to CRT.


The only thing I miss are the resolutions. 15 years ago I had a Sony Trinitron running at *XGA. Today most monitors are still HD. And even most 4k monitors are in fact sharper HD displays.


This !

I recently bought two new monitors and had to return them, swapping them for a laserjet printer, because my ragged Acer (manufactured 2011) gave better FAR performance.


I’m still experiencing pthread lockups in KDE Plasma under Debian Bullseye (11).

It’s time to hook up that dumb 20" CRT to the serial port of my expensive rig with a 42" QLCD panel.

Grrrrr.


I stumbled upon this article while trying to figure out if i could hook up my ipad to a crt. If any one has any wise words for me on my journey, I’m all ears.


Apple sells both USB-C to VGA and Lightning to VGA adapters, just buy one of those depending on which iPad you have.


The biggest argument to use a CRT to me is interfacing to a computer using a literal particle accelerator pointed at your face. How metal is that?


I've picked a lot of junk off the streets but never managed to get a nice Grundig.


I miss 4:3


My lower back disagrees. Not having to lug super heavy CRT monitors is one of the best things about the light weight LCD.


that was a surprisingly good read


lol, thanks for the downvotes. so i guess it was not that good of a read after all :D


Ignore downvotes, especially early downvotes. There are some knee-jerk reactions these days.

(But do keep in mind that your initial comment didn't really add anything to any conversation. So it didn't really deserve any upvotes, either.)



