The Last Scan: Keeping old TVs alive (theverge.com)
57 points by Hooke on Feb 7, 2018 | 32 comments



Reluctant CRT owner here. I have one because it was the cheapest and easiest way to get a low-input-lag display for speedrunning. I paid $30 for my 27" Trinitron on Craigslist and had to drive a bit to pick it up. It's a monster of a TV, requiring a sturdy rack to sit on and a couple of strong friends if I want to move it anywhere else. The picture quality is great and the scanlines definitely add an old-school feel, but it doesn't look better than my SNES Classic on my generic HDTV.

It's not that you CAN'T speedrun on LCD displays, it's that it's expensive and inconvenient. Connecting a retro console like a SNES to a modern display requires converting one of composite/S-video/component/RGB SCART to DVI or HDMI, and most of the devices that do that aren't designed with latency in mind. If you search on Amazon you'll find a bunch of cheap knockoff upscalers that, if they even work, add 50ms+ of latency. The cheapest acceptable option is the OSSC, a custom-made upscaler for retro gaming that costs about $200. Many people who built their setups before OSSCs were easily available use XRGB Minis/Framemeisters, which run about $320 and add very slightly more lag. Both the OSSC and Framemeister expect SCART input, so there's some extra cost there for cables/converters.

After you buy an upscaler, you'll still need a monitor with acceptable latency after all the extra image processing is disabled. The Melee community uses particular models of Asus monitors that people have checked, but it's difficult to find good information on input lag in general because it's not something manufacturers care to advertise. If you're buying new, figure about $200 for something like an Asus VG245H.

Lastly you'll need DVI/HDMI capture and splitting/passthrough. Not an issue if you're already capturing modern consoles, but S-video capture for retro hardware is dirt cheap. A good 60fps HDMI capture setup is easily $100-200.

So, overall it costs $500-600 to replicate the CRT experience from a retro speedrunning perspective, and the results are a mixed bag: slightly (though imperceptibly) higher latency, but higher video capture quality and some space savings. For comparison, my CRT setup cost $30 for the TV, $35 for a GV-USB2 capture dongle and $20 for an S-video/composite splitter, under $100 total. Unless something happens to make low-input-lag HDMI more cost-effective, expect retro speedrunners to keep using CRTs as long as they're viable.
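To put those latency numbers in perspective, here's a rough back-of-the-envelope sketch (my own illustration, with hypothetical device figures) converting added display latency into 60fps frames, which is the unit that matters for frame-perfect tricks:

    # Rough illustration: how added display/upscaler latency translates into
    # 60fps frames, the unit that matters for frame-perfect tricks.
    FRAME_MS = 1000 / 60  # ~16.7ms per frame on a 60fps console

    def frames_of_lag(latency_ms):
        """Number of 60fps frames of delay introduced by a given latency."""
        return latency_ms / FRAME_MS

    # Hypothetical figures, roughly in line with the numbers quoted above.
    for label, ms in [("CRT (negligible)", 0),
                      ("decent gaming LCD", 10),
                      ("cheap knockoff upscaler", 50)]:
        print(f"{label}: {ms}ms ~ {frames_of_lag(ms):.1f} frames")

Even a couple of frames of delay is very noticeable when you're timing inputs to individual frames, which is why the 50ms+ knockoff upscalers are out of the question.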


Is there an international effort to document and preserve human knowledge so that if/when disaster strikes, we can get back on our feet sooner rather than later? If so, the simpler technologies should be documented with the highest priority, because they will be far easier to reconstruct than more advanced ones.

This might seem like a frivolous effort, but this and the documentation and conservation of the planet's biodiversity are honestly two of the most valuable things humanity can do. We mustn't recklessly rush ahead without planning fallback scenarios or preserving the valuable buffers that would cushion the impact of potential setbacks.


> Is there an international effort to document and preserve human knowledge so that if/when disaster strikes, we can get back on our feet sooner rather than later? If so, the simpler technologies should be documented with the highest priority, because they will be far easier to reconstruct than more advanced ones.

This is one of the things I have been thinking about a lot lately. Our current society is very fragile because of all the high tech.

A single large disaster could easily throw us back a few decades technologically, because the knowledge behind high tech (e.g. making computer chips, hard drives, ...) is concentrated in a few places. If those places and people are gone, we will have a serious problem.

We wouldn't even be able to produce solar cells after such a disaster; we would have to go back to burning coal and wood for electricity.

There needs to be some sort of book series that teaches our basic knowledge (medicine, electricity, math, physics, ...) in a way that lets it be restored fairly easily after a large disaster.


Physical chips would be a huge starting point for redeveloping computers a second time. The population loss associated with that kind of knowledge loss seems like a much bigger issue.

We need vast economies of scale to continue current chip production. A world population at, say, 1% of its current size would have trouble building and maintaining modern equipment.

PS: Also, with a smaller population, hydro power would easily meet our energy needs, at least long enough to bootstrap wind/solar again.


> We need vast economies of scale to continue current chip production

They wouldn't actually need current chips; recreating something along the lines of an early Z80 would be an enormous win for a society rebuilding itself from scratch.

> Long enough to bootstrap wind / solar again.

Wind power might actually be better for local development. You just need some magnets and some coils and you have electricity to power a few light bulbs (well, someone still has to build the light bulbs or LEDs...).


Depending on how it happened, a world population at 1% means there's all this infrastructure and existing inventory built to support up to 100x more people. Hopefully that would be enough to get a long way back to where we left off.


I saw a talk by Tim Lossen a couple of years ago where he spoke about what needs to be done to be able to rebuild civilization from the ground up.

https://www.youtube.com/watch?v=U3gTL-XJcgs


I would suggest that the biggest group to mourn CRT technology would be our feline overlords. I had several cats that used to perch themselves atop a screen, trying to capture my attention and keeping warm but blocking an air vent in the process.

The switch from PAL to DVB-T saw us recycle our last CRT telly. I had thrown out my Trinitron computer monitor some years earlier; its sharpness had deteriorated over time and text became too fuzzy, perhaps something to do with the electrons mentioned in the article.

I miss the vertical pixels. 1600x1200 was a standard, but LCDs embraced 1080p panels during the Blu-ray craze. Luckily I have a 1280x1024 LCD (rotated to portrait) for A4 documents.


> I miss the vertical pixels

This HD craze (especially for notebook displays) was one of the worst mistakes in recent computer history!

If I use a computer for reading and writing then I need a lot of vertical space and pixels! I don't care about the width at all!

I own a bunch of old IBM/Lenovo notebooks and I am convinced that a 14" display with a 4:3 aspect ratio (for example 1400x1050) is still the best type of display for a portable developer's notebook.

Newer Microsoft Surface devices go for a 3:2 aspect ratio, but if I were designing such a device I would make the screen even more square.


Agreed about the "HD craze", especially because of the following facts:

- "HD" in the context of laptop screens means "1366x768", which is unusable. It's funny that "high definition" means "the lowest resolution available on a PC". It makes explaining to family members which monitor is right for them very hard, because of course "HD" must be a good thing, right?

- Most entry level laptops come with said resolution (this may be starting to change in the past few years).

- Businesses, even software development companies, have a hard time understanding that higher resolution monitors are not for gaming or watching movies, but for actual work. In fact, they are more useful for working, because for watching movies a standard "HD" monitor is enough.


> "HD" in the context of laptop screens means "1366x768", which is unusable. It's funny that "high definition" means "the lowest resolution available on a PC".

Unmodified HD (without “full”) generally means 720p for other screens too; that's not a quirk specific to laptop screens. (Well, I guess it is, in that 720p is 1280×720, but you aren't complaining that 1366×768 is a higher resolution than it should be for the bare “HD” label.)

(Also, 1366x768 is not “unusable”, and I say that as someone who was habituated to a 1600×1200 desktop display for years before starting to use 1366×768 laptops a lot in parallel with it.)


You're right, unmodified HD is 720p for all screens (you have to admit it's funny "HD" means "low res"). It's just that I first encountered this annoying resolution in entry-level laptops. Even more irritating, if you wanted higher res you had to buy a "gaming" laptop, which is ridiculous since for gaming a 720p screen is more than enough.

I'm glad you don't consider it unusable for work, but you're in the minority[1]. The resolution sucks and there's no excuse for it. You can barely work with an IDE and it forces you to use an external monitor -- and I say this as someone who finds the Jeff Atwood "I can't work without 3 external monitors" kind of snobbery irritating.

[1] Likewise, there always used to be someone who didn't mind flickery low refresh rates in CRT monitors. Their opinion can be ignored since pretty much everyone else cares, even when they don't know what's happening and they just know their eyes and head hurt after a while.


> 1366x768

There should be a special place in hell reserved for companies that still produce that garbage. This resolution was already too low 10 years ago, and nowadays it's not usable at all.


IMO the problem is not the resolution itself, but the fact that it is the same for all screen sizes. 1366x768 is too high for an 11" screen but too low for a 15" one; the 16:9 ratio is also bad. The only company that gets this right is Apple: the DPI is about the same across all of the non-Retina laptops, yet the resolution varies.
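A quick pixel-density calculation (my own illustration; the panel sizes below are just representative examples) shows what that looks like in numbers: the same 1366x768 resolution lands at very different DPI depending on panel size, while a classic 4:3 14" panel sits in between.

    # Pixel density for a given resolution and panel diagonal.
    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        return hypot(width_px, height_px) / diagonal_in

    for w, h, diag, label in [
        (1366, 768, 11.6, '11.6" budget laptop'),
        (1366, 768, 15.6, '15.6" budget laptop'),
        (1400, 1050, 14.1, '14.1" 4:3 ThinkPad-style panel'),
    ]:
        print(f'{label}: {ppi(w, h, diag):.0f} PPI')

    # Prints roughly 135, 100 and 124 PPI: same resolution, very different
    # sharpness, which is exactly the problem with one-size-fits-all 1366x768.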


> If I use a computer for reading and writing then I need a lot of vertical space and pixels! I don't care about the width at all!

So, get a good rotatable VESA mount and use your widescreen monitor in portrait; more vertical space per $ than any other monitor if you “don’t care about width at all.”

> I own a bunch of old IBM/Lenovo notebooks and I am convinced that a 14" display with a 4:3 aspect ratio (for example 1400x1050) is still the best type of display for a portable developer's notebook.

I don't find the keyboard on anything short of a 15” widescreen to be good (I'd probably take a 17” 4:3), and the keyboard is pretty important to me for coding.


Well, if you use a 2-in-1 convertible notebook, you can simply use it in portrait mode :)


Vote with your wallet. 16:10 1920x1200 displays exist and they are often better than the 1080p displays, the reason being that, since the panel is larger (16:10 vs. 16:9 at the same width), they can't compete on price and have to compete on quality instead.

But nowadays they are starting to fetch a premium just because of low demand. The 30" 2560x1600 displays are quite expensive and there is nothing in between 24" and 30" if you want 16:10.


> "1600x1200 was a standard "

Man, I really miss my Dell UltraSharp 1600x1200 monitor. I loaned it to a friend whose only monitor had died and he couldn't afford a replacement right away. It had been driven by my Mac mini which had fallen out of use anyway so I didn't miss it at the time. When I finally got it back from him, he neglected to tell me his son had hit it with a toy and damaged the LCD near the top. It was unusable.

I think I'd almost prefer a 4:3 beast like that instead of my 1080p screen today; about the only thing that would suffer would be games optimized for widescreen. First person shooters in particular would probably be claustrophobic without the extra peripheral vision a wide ratio provides.


Some friends you have there...


I remember my parents bought a 32" widescreen CRT HDTV (1080i) back when HD was first becoming a thing. That was the heaviest TV set I've ever had to move in my life.

They offered to give it to me a few years later when they upgraded to an LCD... I wish I had taken it (no idea what happened to it now). It took me forever to find a decent CRT on Craigslist for retro gaming :)


My brother was a bit of a home theater enthusiast when we were growing up, and he bought a widescreen CRT HDTV; you aren't kidding about the weight. It took three of us tumbling it up the stairs just to get it up there (nearly killing us twice). It was a beautiful set, and he gifted it to me when he upgraded to an HD projector. I used it until I moved out, but I wasn't moving that battleship any further than the street; by that time LED panels had started coming out.

In retrospect I wish I had hung onto it but with the size and weight it just wasn't possible.


I bought a widescreen 32" CRT back in the day. I chose that particular model because it was the only one in that price range that had a VGA port (in addition to antenna, SCART and other PAL inputs).

It was advertised as handling a resolution of 640x480 via VGA. But I discovered that it could accept an 848x480 image from my desktop computer.

Coupled with a 50 GBP (~75 USD) Logitech wireless keyboard, I could surf the internet comfortably whilst leaning back on the sofa. In widescreen!

(This was maybe 17-18 years ago.)


Importantly, the widescreen ones always have post-processing to convert from the SD resolution to the resolution of the tube; they aren't like PC monitors, which actually switch resolutions. This adds about as much input lag as a modern HDTV :(


I am surprised that an article on CRT displays mentions John Logie Baird but not Philo Farnsworth and Vladimir Zworykin. Baird's TV was electromechanical, whereas the Farnsworth/Zworykin designs (the two were competitors) were all-electronic. Zworykin, in particular, is credited with coining the name "Cathode Ray Tube."


Will we someday see some low-latency, low-persistence TV screens, like the panels we can find in VR HMDs? These could easily replace most uses for a CRT TV.

What I miss the most on a modern TV or computer screen is the low persistence. Just look at sharp moving edges on a CRT versus an LCD screen (glxgears, for example). The difference is striking: you can follow the edges on a CRT, but not on a regular screen.

I actually suspect that the 144Hz gaming craze has more to do with persistence than framerate!
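For anyone curious why persistence dominates: on a sample-and-hold display, if your eye is tracking a moving object, the perceived smear is roughly the tracking speed multiplied by how long each frame stays lit. A small sketch of that arithmetic (my own illustration, with a hypothetical 1000 px/s pan speed):

    # Perceived motion blur on a sample-and-hold display, approximated as
    #   blur_px = tracking_speed_px_per_s * persistence_s
    def blur_px(speed_px_per_s, persistence_ms):
        return speed_px_per_s * persistence_ms / 1000

    speed = 1000  # hypothetical pan speed in pixels per second
    for label, persistence_ms in [
        ("60Hz LCD, full persistence", 1000 / 60),
        ("144Hz LCD, full persistence", 1000 / 144),
        ("CRT-like impulse display, ~1ms", 1.0),
    ]:
        print(f"{label}: ~{blur_px(speed, persistence_ms):.1f} px of smear")

    # ~17px vs ~7px vs ~1px: most of the 144Hz benefit for motion clarity
    # comes from the shorter hold time, not from the extra frames themselves.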


OLED displays might be the answer.


What I miss most about CRTs are accurate light gun video games. I tried Overkill on PS3 and while it was fun, the crosshair positioning was terrible. Rail shooters seem to have vanished into history (much like Ruby on Rails) (I kid! I kid!)


The Nintendo Wii's cursor positioning was perfect, and was as intuitive and direct as using a mouse. Instead of having to move a joystick to move your entire view, you just point the remote at an enemy and shoot. I wish the Wii had been slightly more powerful so that its FPS games were taken more seriously.


I'll agree that it's as intuitive as using a mouse. I'll strongly disagree that the cursor positioning was perfect or that "you just point the remote at an enemy and shoot".

Point the Wiimote generally at the screen, waggle it around a while to figure out when the sensor bar is in view, and then use mouse-style relative motion to get it pointed where you want. If I just point it at the TV, I don't know where the cursor will be, but it won't be exactly where I pointed.

A CRT with a lightgun is much more of a point and click situation.
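A rough sketch of why the two feel so different (my own illustration, not actual Wii code; the 1024x768 coordinate space is, as I understand it, what the Wiimote's IR camera reports, and everything else here is made up for the example): the Wii cursor is derived from where the sensor bar's IR dots land in the remote's camera image, so it only loosely corresponds to where the remote physically points at the screen, whereas a CRT light gun detects the beam at the exact spot it's aimed at.

    # Sketch: Wii-style pointing from IR dot positions seen by the remote's camera.
    CAM_W, CAM_H = 1024, 768  # Wiimote IR camera coordinate space

    def cursor_from_ir_dots(dots, screen_w, screen_h):
        # dots: (x, y) IR blob positions of the sensor bar LEDs.
        # Returns a cursor position, or None if the bar isn't in view
        # (the "waggle it around until the cursor appears" case).
        if len(dots) < 2:
            return None
        mid_x = sum(x for x, _ in dots) / len(dots)
        mid_y = sum(y for _, y in dots) / len(dots)
        # Invert both axes: the bar appears to move opposite to the remote.
        return ((1 - mid_x / CAM_W) * screen_w,
                (1 - mid_y / CAM_H) * screen_h)

    # Two hypothetical dots near the center of the camera image:
    print(cursor_from_ir_dots([(480, 390), (544, 390)], 1920, 1080))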


I still keep a CRT around for my C64 - a little 13" Sony Trinitron. It might see more use once I get around to replacing the electrolytic caps in my SNES and getting it working again.


Similar niche business keeping monitors going (catering to contemporary art) in London: http://www.the-block.org


Critical for Atari fans. Some of the old systems, without mods, need a cable input or antenna screw terminals.



