LCD TVs won’t see any further development (tomsguide.com)
219 points by belltaco on June 20, 2023 | 475 comments



While the headline implies it's talking about LCD displays, it actually refers to LCD panel technology. The innovation is in the backlight technology behind the LCD panel, so LCD displays still have an exciting future ahead of them.


To be blunt: An exciting future in mass-market and mid-to-high tier. Which is perfectly fine.

For Premium tier the cost of elevating the picture quality to a competitive level (with elaborate backlight tech, multiple layers of LCD, etc.) is simply too high.

Samsung was only riding LCD in premium TVs for so long because they didn't have the technology for large-size OLED. So they spent a ton of money to establish the "QLED" brand as a new name for "LCD with yet another backlight" and started telling people that brightness (not black level or contrast) is the most important thing for a TV (because a uniform LCD backlight can be brighter than per-pixel OLED)


The thing is: can I buy a big OLED and use it for whatever, without doing anything special to take care of it, and not have it suffer from burn in 10 years from now?


"burn-in" on OLED is actually the degradation of brightness of LEDs based on their operation hours. The impact is just more noticable on OLED because the pixels degrade individually (instead of LED-LCDs, where both brightness and color degrades uniformly across the whole zone of the backlight)

There are lots of methods now to make this degradation more uniform on OLED, like slightly shifting the image to distribute the load onto multiple subpixels, "cleaning" the panel when it's switched off, down to measuring the operating hours of individual pixels and adjusting the brightness of surrounding pixels to blend the degradation.

Overall you can choose what you want to degrade over those 10 years: the LED brightness and color accuracy of a whole zone / backlight unit (LED-LCD), or have the OLED panel actively compensate for its aging (which has gotten a lot better in the past years)
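
To make that concrete, here's a toy sketch of the per-pixel compensation idea (assumed decay numbers, not any vendor's actual driver logic): track accumulated sub-pixel stress and drive worn sub-pixels slightly harder so the image stays uniform.

    # Toy sketch of per-subpixel aging compensation (assumed decay rate,
    # not a real panel model): worn subpixels are driven harder so the
    # image stays uniform, at the cost of wearing them out faster.
    import numpy as np

    def accumulate_stress(stress_hours, frame, frame_time_h):
        # Brighter content ages a subpixel faster (frame values in [0, 1]).
        return stress_hours + frame * frame_time_h

    def compensate(frame, stress_hours, decay_per_khr=0.01):
        # Estimated remaining luminance of each subpixel (1.0 = new).
        remaining = 1.0 - decay_per_khr * (stress_hours / 1000.0)
        remaining = np.clip(remaining, 0.5, 1.0)
        # Boost worn subpixels so perceived output stays even.
        return np.clip(frame / remaining, 0.0, 1.0)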

Either way, I wouldn't put my money on QLED (which is not a technology but a brand). QLED is actually far worse on lifetime than OLED [1], but as it was driven by the marketing powerhouse of Samsung this information is barely known. Considering how expensive it is to produce and that it was basically only made by Samsung to buy time until they could produce OLED as well, I wouldn't expect it to actually become a mature technology. It's more likely that they will replace the Quantum Dot backlight with something cheaper and still keep the brand "QLED" (if they haven't done that already).

AH-IPS LCD panels with a zoned LED backlight probably still provide the most stable picture quality over a longer period of use, compromising on image quality compared to OLED. But the last really good TV with such a panel was produced quite a few years ago already...

[1] https://www.nature.com/articles/s41467-019-08749-2


I don't care much about average panel degradation.

I do care about seeing ghost letters or shapes on top of the latest TV series I'm watching.

Edit: The QLED article is from 2019, that's quite a few years ago for cutting-edge tech. I imagine the tech is constantly improving.


You might be interested in the RTings longevity test which is exactly about that. The current round is playing CNN for 20 hours a day for 2 years, and they already have some results at the 6 month mark.

https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


Thanks!

This is a great overview demo of current panels! You can already see some very slight degradation in some of LG's OLEDs, but more in many of the Sony/Vizio (and Samsung) OLED panels.

The others mostly look like standard LCD and DDI/Flex defects rather than burn-in so far.


You'll be fine with any of the existing technologies of today. In either case you will see luminance degradation over time; in the case of LED-LCD or QLED, also color shifts of backlight zones.

But none of this is new. In reality, I've barely heard people complain about their LCDs becoming weaker and weaker in brightness and color-tinted (usually yellow) each year. And yet, all of this happened to 100% of LED-LCD panels.

> Edit: The QLED article is from 2019, that's quite a few years ago for cutting-edge tech. I imagine the tech is constantly improving.

Yeah, there's a lot of research ongoing to improve QD lifetime as a foundation for new LED technology. Here's one from Dec. 2022 [1], and one from 2023 [2], both aimed at solving the performance stability of the technology to get it on par (!) with common LEDs.

[1] https://pubs.acs.org/doi/full/10.1021/acsaelm.2c01351

[2] https://pubmed.ncbi.nlm.nih.gov/36990060/


You're missing the point:

> I don't care much about average panel degradation.

> I do care about seeing ghost letters or shapes on top of the latest TV series I'm watching.

Nobody cares about "luminance degradation", they just don't want ghosting


I think the point you're trying to make is that you can calibrate away LCD backlight luminance degradation because it's uniform across the panel. And you don't notice it because your eyes adapt to the whole screen.

With OLED, not so much. OLED does have tech to compensate based on total pixel usage. You might notice that if you tried to swap a panel without its electronics. I'm not aware of how widespread it is, but I know iPhones can do it.


No I did not. Check the first sentence of my reply: "You'll be fine with any of the existing technologies of today."

And "ghosting" are artifacts on fast movement due to a lag in panel refresh, it's something entirely different.


> You'll be fine with any of the existing technologies of today

I think you're going too far with your statement. OLED/QD-OLED, and to a lesser degree WOLED, all suffer from significant burn-in, which makes them pretty bad for desktop use.

Previous LCD technologies will degrade over the years by losing luminance and color accuracy, but that's a pretty minor issue compared to OLED as far as I'm concerned.


From context, you know the post is talking about ghosting from image retention, not ghosting from motion blur. Please don't be pedantic. We're not all versed in the terminology.


I'm pretty sure that on HN, being pedantic is basically required by law.


Sorry, but I've been in that industry (LCD and OLED) for over 10 years and this doesn't match my understanding. QLED is basically an LCD technology with Quantum Dot on top. It is just Samsung's marketing.

Almost all mid-tier LCD TVs currently use QD (whether in the backlight or the color filter layers). They don't use Blue QD (only the longer-lifetime Red/Green), but crystalline InGaN in the Blue backlight. Other lower-end LCDs use R/G phosphor instead of QD to produce white, which also has long-term burn-in issues, but as you mention they are not individual to pixels. Color shift of the (global backlight) screens over years can be compensated, but since environmental lighting has a bigger effect on perceived color, I'm not sure why you'd bother (mini-LED can be worse).

Even the very best currently shipping OLEDs (not yet phosphorescent Blue) have visible burn-in at ~2000hr even with image shifting de-contrast techniques. Early 2014 QD (Red/Green) have visible burn-in at ~7khr and modern ones about 20khr. They're about 10x better than OLEDs, but you could still see it several years in (rather than months in for OLED).

I've got no great love for QLED, but R/G QD just doesn't have the same level of burn-in problems that fluorescent Blue OLED does, and neither do Blue InGaN LEDs.

https://en.tab-tv.com/samsung-has-released-recommendations-f...

I'm including this link, because the SID.org papers are all paywalled.


Same here, working with and in that industry for ~17 years on technology strategy. I'm no evangelist either, neither for LCD nor OLED.

I frankly don't know how you technically define "visible burn-in" in the context of QD (do you mean actual pure QD tech or Samsung's mixed use of the tech for QLED TVs?).

If we can agree, that: 1.) The jargon "burn-in" on OLED describes the luminance degradation of individual sub-pixels (i.e. a weaker red because the operation hours of a red sub-pixel were much higher than its neighbours).

2.) Samsung's "QLED" is a brand for an LCD panel with the backlight created using R/G QD and blue LEDs (instead of blue/yellow LEDs). The backlight on a standard QLED is separated into ~700 zones of the display, so you have huge clusters of pixels using the same backlight zone.

Then: what is the equivalent meaning of "burn-in" in the context of QLED? The luminance degradation of the whole zone?

Or is it the "legacy" meaning of burn-in on LCD: The occurence of weaker/dead pixels due to issues in the TFT layer (broken/weak transistors no longer properly rotating the crystals)?

Either way, I can't confirm and don't agree that OLED TVs will see a visible "burn-in" after ~2000hrs of normal use. First-gen models yes, but definitely not currently shipping models.


I suspect we agree more than disagree. I was on the driver/TCON development side. I remember the QD guys demoing new OLED TVs at DisplayWeek '17 that showed visible degradation from extreme content (B&W bars and logos), visible at 64 sRGB (10 min B/W, 1 min grey), after ~72-96 hrs of the show. At show setup it was invisible, and by the end of the show it looked terrible. You could swap the cables between TVs for comparison.

Similar to standard definitions of Mura, a method of defining visible burn-in would have to define the burn image (16 px B&W block or CNN logo?), the detection image (64 grey?), viewing distance (30 deg FoV?), and visible contrast (+/-4 MND?). I guarantee you all the OEMs have a spec.
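
Purely as an illustration of the kind of parameters such a spec pins down (every name and value below is made up, loosely following the quantities listed above, not any OEM's actual spec):

    # Hypothetical burn-in acceptance spec - illustrative only.
    from dataclasses import dataclass

    @dataclass
    class BurnInSpec:
        burn_image: str = "16px B&W checkerboard"  # what gets burned in
        detect_image: str = "flat 64/255 grey"     # what reveals the ghost
        viewing_fov_deg: float = 30.0              # viewing distance, as field of view
        visible_contrast_mnd: float = 4.0          # just-noticeable contrast threshold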

Here's a set of modern 20 hr/day tests at 6 months (posted above) where OLED has visible degradation (mostly Sony/Vizio but also Samsung and LG). The new ones are definitely better!

https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


As far as I can tell, the QLED screens on the market are not based on the QLED technology that article is about, and the lifetimes given in that article are obviously far too low to build a usable television set with. They claim non-uniform degradation between different colours, with blue in particular dropping to half the initial output after just 20 hours - that's nothing compared to the life of a TV set. Samsung QLED TVs seem to use standard LEDs in the backlight with a separate quantum dot layer that improves colour accuracy and gamut.


It is the same technology. Samsung is just compensating for the deficiency of blue QD by using conventional blue LEDs in their QLED TVs (creating white light with green and red QD, as you wrote). The point is that QD-material degradation still exists: along with the usual degradation of the blue LED, the QD material will also degrade and shift over time.

The research paper describes the lifetime and degradation in a uniform setup, as it's research for foundational QD-tech, not for TV panels.

There's a lot of research ongoing to improve QD lifetime as a foundation for new LED technology. Here's one from Dec. 2022 [1], and one from 2023 [2], both aimed at solving the performance stability of the technology to get it on par (!) with common LEDs.

[1] https://pubs.acs.org/doi/full/10.1021/acsaelm.2c01351

[2] https://pubmed.ncbi.nlm.nih.gov/36990060/


There is certainly a difference in either technology or implementation. I've seen LG OLED TVs get noticeable lettering burn-in after a couple hundred hours, and on the other end of the spectrum, that guy left an unmoving bright-dark pattern on his OLED Switch for over a year straight before seeing any.


> The impact is just more noticeable on OLED because the pixels degrade individually (instead of LED-LCDs, where both brightness and color degrade uniformly across the whole zone of the backlight)

Don't organic LEDs also degrade more than their "normal" counterparts?

> Overall you can choose what you want to degrade over those 10 years: the LED brightness and color accuracy of a whole zone / backlight unit (LED-LCD), or have the OLED panel actively compensate for its aging (which has gotten a lot better in the past years)

I'd imagine it would be FAR easier to compensate the backlight, as you just need to re-calibrate one string of LEDs that all see the same load over the lifetime of the monitor, vs. somehow tracking per-subpixel usage of each OLED


> Don't organic LEDs also degrade more than their "normal" counterparts?

Yeah, amplified due to their much smaller size, having less surface area for heat dissipation, and also depending on pixel color in RGB-OLED setups. On TVs, WOLED panels are used instead of RGB-OLED, which allows for better heat distribution and a uniform lifetime regardless of pixel color.

> I'd imagine it would be FAR easier to compensate the backlight, as you just need to re-calibrate one string of LEDs that all see the same load over the lifetime of the monitor, vs. somehow tracking per-subpixel usage of each OLED

For luminance there is nothing to compensate, as it simply degrades uniformly. The peak brightness just decreases over time.

Color-shifts cannot be re-calibrated that well because the shift happens due to the chemical decomposition process of the phosphor in each LED, which causes the color of the light going INTO the color-filter to shift (the base light that is filtered for RGB is no longer white).

Once it starts, it's not a binary process; the color keeps shifting. Manual recalibration often has no real benefit then, as the color shift continues and is not uniform across the whole display.


> Color-shifts cannot be re-calibrated that well because the shift happens due to the chemical decomposition process of the phosphor in each LED, which causes the color of the light going INTO the color-filter to shift (the base light that is filtered for RGB is no longer white).

But you can change the white balance of most monitors, right? So they at the very least need cold-white and warm-white LEDs; just change the balance of those. Using RGB LEDs in the backlight could also allow for that correction.


> It's more likely that they will replace the Quantum Dot backlight with something cheaper and still keep the brand "QLED" (if they haven't done that already).

Or something more expensive? Not sure if OLED is more expensive or not, but they are already using that as a backlight on the high-end. Specifically, blue which seems to be the issue for QLEDs if the paper is anything to go by.


My point was that "QLED" is actually a registered brand and doesn't represent a specific technology, so Samsung Display could change the underlying tech, adjust the meaning of "QLED" and keep going.

If they replace it with something more expensive they usually give it a new name to sell the higher price, like "Neo QLED"


The (tinfoil hat?) thing about the thing is: Does Samsung actually want their TVs to survive a whole 10 years?


Considering that most of them are Internet-connected now, they can just stop patching the firmware to obsolete them.

TV takes 10 minutes to load Netflix because it's part of a botnet? Better buy a new one.


Oh, Samsung are already "accidentally" sending out broken patches to TVs that happen to brick them. Coincidentally this happens just a few months after their warranty expires. Whoopsie!

It happened to me and it happened to a friend of mine, just a few months apart.

Never ever connect a Samsung TV to the Internet under any circumstances.


I'll take it one step further: never plug in a Samsung TV. Or better yet, never let one into your home.


This might backfire in EU countries since EU consumer protection laws require extended warranty for devices expected to last longer. For TVs it's 5 years I believe.

So if they're bricked after two years you could take it to the retailer.


With smartphones this would be a problem. With TVs you do still always have the option to use it disconnected/dumb with your own computer, chromecast, TV box, console, whatever.


If a corporation like Samsung became evil, I wonder if they could make a bunch of money running a DDOS botnet and attacking other casinos or something.


Do consumers want to pay the premium for such a TV? And would those that do choose Samsung?


The problem is Trust.

Consumers don't trust manufacturers. If I pay a 2x premium I have zero trust that the product is better made or will last longer. My $300 laptop worked for 10 years; my $3000 laptop had issues after 2 years.

We have a Toshiba TV that has lasted 15 years. It's kinda getting to the end of its useful life - it's 720p.

On the other hand, I don't expect image technology to advance at the same rate any more; we are at the useful limit of resolution, for example. So maybe if I was buying today, I would actually keep the TV for 20 years.


> Consumers don't trust manufacturers. If I pay a 2x premium I have zero trust that the product is better made or will last longer. My $300 laptop worked for 10 years; my $3000 laptop had issues after 2 years.

I'm confused — why are you talking about trusting "manufacturers" as a whole? Isn't the whole point of your anecdote that one manufacturer is trustworthy and the other is not?


As far as I am concerned, none of them can be trusted. For me there is no way to know which product is designed to last - I bought a few that are, by random chance alone.

Well maybe there are a handful, in some markets, but Miele does not make TVs


I don't really know much about TVs, but in my experience, Apple certainly makes long-lasting products — my 2008 and 2012 macbooks and 2014 phone are still in good shape, albeit with dead batteries. In cars, from a second-hand experience (I don't drive and hope to never start), Volvo still has great quality.

Personally, I know more about clothes. Once you switch from fast fashion (Zara, Bershka, Uniqlo) to quality brands (for me, it's Rick Owens, Margiela, Thom Krom and McQueen; adjust for your personal style), they do last a lot longer — I still wear T-shirts that I bought in 2016. However, it means that I shop much less often, own far fewer clothes, and they're all the same style (and the same colour — full black). And I see that people pay lip service to degrading quality, but what the majority of them _actually_ want is a lot of regular dopamine from shopping and a wide variety of garments.

You can eat out at fast food every day, or spend the same budget to eat at a Michelin-starred place once every couple of weeks (1 star; with 2 or 3 stars it would be once a month or less often). People tend to choose the former, but then blame "the industry" for their own choices.


That's the thing about the thing about the thing, I have a 10 year old Samsung LCD that's just fine.


I have 7 LN46A's now and left over parts from others that had panel defects.


Is it actually just fine, or have you not noticed the slow and gradual degradation of its color accuracy and brightness?


I haven't noticed it. So it's not a problem.

I DO notice OLED burn-in, so that *is* a problem.


Rtings are doing a major test which seems to indicate it's not a major issue for the majority of modern OLEDs: https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


Some of these look pretty rough. I would be pissed if my TV had the burn-in I see with some of those Sonys, for example.


There is a mix of newer and older OLED and LED TVs in there, and that is 6 months, 20 hours a day of watching CNN. That said, it doesn't look quite as good as the last time I checked on the results. The new QD-OLEDs seem to do worse than standard OLEDs.


If anything I am impressed. They are running these at max brightness with static images on screen for 20 hours a day. I would have guessed the burn in would have been way worse.


There's no control images there to tell you how much it actually degraded. The panels might not be uniform when brand new either.


At the top of the table, you can choose to view month 0, which would be the day they started the test. There's your control.


The real answer is that it doesn't matter: the vast majority of people buying bleeding-edge OLED TVs (yours truly included) will not use the same TV for 10 years.

And for exactly the same reason most people buying high-end computers and phones won't — the tech will have moved on in 5 years to such a degree that newer products will be significantly better.


I bought a 43" Sanyo LCD TV (720p) in 2004 that still works today. It's good enough for movies.

I have a 15" Samsung CRT TV from the 90's that still works great. I use it for retrogaming.

> the vast majority of people buying bleeding-edge OLED TVs (yours truly included) will not use the same TV for 10 years.

IMHO people mostly upgrade their TV because a bigger one comes out. At some point we will reach the maximum practical size for a home TV (kinda like how CD drives maxed out at 52x) and then I think people will buy them less often. TVs are already kinda cheap and boring, and if they don't keep getting bigger, the average consumer will eventually treat them as a commodity item (and maybe they will be priced that way too).


I agree. I am a moderate TV user. I bought a new one about 3 years ago, and went with a 65". I could have fit an 80 on the wall, but it didn't seem necessary. And I'm glad I didn't. The 65" is already too big if you get up off the couch and stand nearer to it. It's kind of disorienting to watch from any closer than the opposite side of the room. And it was a mid-range TV at the time and still looks plenty good for me. I don't foresee upgrading anytime soon, unless something goes wrong. Honestly the thing I like the least about it has nothing to do with the display. It's the annoying Android software.


Sure, but then by your description you're not someone who's interested in buying bleeding-edge tech (at least not for TVs).

I'm not saying those people don't exist — they clearly do!

But those are not the people who buy the most expensive/best TVs on the market.


You should certainly be prepared to buy a new TV/monitor after 10 years of regular use, with OLED. If not burn-in or color drift, brightness is likely to decrease noticeably.


Is burn in still a thing? I was under the impression burn in rate had been reduced to a reasonably negligible value in modern OLEDs.


Yes, burn in still exists for new flagship OLED panels

But you're very unlikely to experience it compared to past technologies, even if you are displaying a static image for extremely long periods


OLED panels will always suffer from burn-in in some form. What I meant by "is it a thing" is "will users ever experience it to a detectable degree". I guess the answer here is "no", then.

I've seen the video where the guy leaves a bright high-contrast static image on his Nintendo Switch OLED for 10,000 hours before finally seeing a small amount of burn-in, but that's not the level that should concern average users. Was just curious if typical large screens fared any worse than the new Switch.


I just want the tech to get to where it’s like consumer SSDs; they do have limited wear endurance but practically most home users will never get close to it anymore on high capacity modern SSDs.


Plus, the TVs have good default settings to prevent it even if you're negligent.


> even if you're negligent.

It's a TV!!! There shouldn't be any "negligent" :-)

Some people are fine with tiptoeing around the problem, I imagine most people just don't do anything and mess up their TV.

I shouldn't have to configure FVWM for 20 days to be able to operate the appliance correctly.


CRTs had the same problem for decades, so while I agree it shouldn’t I’m certainly not unfamiliar.


You wouldn't have to configure anything, even on older TVs. But there are things you can do that will eventually harm it, like leave the same bright image on the TV for hours. That's negligence, especially if the manual tells you not to.

The newer TVs can detect that scenario and turn the TV off automatically, unless you specifically disable that setting.

TVs are a luxury good, and there will always be ways to be negligent with anything. Leaving them out in the rain is negligent. Some cleaners will harm the screen, I'm sure. Using them in a too-hot or too-cold environment will harm them. I don't expect my TV to be battle-hardened, and I'm certainly not going to pay the price for that. Instead, I'll take care of it.


CRT TVs automatically degaussed every time they were turned on...


I have an LG... if you leave your Nintendo Switch paused on a static image, it's going to put a screensaver on after a while. It knows the image is static.


Reduced yes, negligible not really.


Which was always so silly. No amount of screen brightness can make up for loss of detail in a dark scene and even OLED maximum brightness is enough to hurt your eyes.


Small-format OLEDs are great, but large-format ones are not there yet.

1. Large-format OLED maximum brightness is very low. LG C2 peak SDR brightness is some 425 nits. A Samsung Neo-QLED QN90B does 1200 nits, and an iPhone 14 (small OLED) does 800 nits. In my experience, a room with windows facing the sun and thin curtains requires 1000-1500 nits SDR to be comfortably legible. 2000 would be nice to have, but that's not available for the masses yet.

2. That SDR brightness is only maintained for a 10% window, and quickly drops off for larger bright areas. At 50%, it's only 350 nits. The QN90B does 850 nits sustained at 50%.

3. The same panel does a peak of 600-700 nits in HDR, well below the 1000-4000 nits generally needed for HDR content. An iPhone 14 does 1200 nits; the QN90B does around 1700 nits. The high peak is needed to give a proper "blinding" effect to e.g. skies or lights, as opposed to just looking like SDR-style bright white surfaces, especially outside a dark cinema room.

4. WOLED, which is the dominant panel technology for large high-end OLED panels, loses color saturation quickly as it approaches its (already low) maximum brightness, due to relying on a white subpixel for raw luminance.

5. MiniLED backlights for LCD with 10k+ dimming zones - while a stopgap - get dimming artifacts far enough down to be competitive on contrast, while destroying OLED in brightness and color saturation.

Large-panel OLED will beat LCD, but right now it has many shortcomings. QD-OLED shows significant promise, with the latest generations finally starting to see some decent brightness, but it's early (and rtings report both thermal throttling and some quite unfortunate burn-in behavior on those).

And yes, 1000+ nits for SDR is retina-searing once the sun sets, but for those who have forgotten how sunlight feels, I can report that it is quite bright and requires effort to outshine. Something like ambient light control is necessary to adjust throughout the day so you don't accidentally go blind.


Personally, 500 nits full screen white at 3m away is enough to hurt my eyes. I’m grateful OLEDs exceed that only for very small areas and briefly.

Also, no amount of screen brightness can make dark scenes visible in a bright room. You have to make the room darker anyway, so why bother with a brighter screen?


Because lots of people like a bright living room and want to watch Youtube without pulling the curtains?

Sure, to get the full cinema experience you should darken the room, or just watch at night. But having a bright and legible screen without having to close the curtains is still worth a lot.


Same. Sometimes I even have to dial it down if I'm watching in a dark(er) room. I also try to have lights on in the room/hallway next to my living room just so my eyes don't bleed when a bright scene appears.


This is just plain false. Enough screen brightness works with current generation LCDs. OLED specifically is just incapable of delivering it yet. 500 nits is only bright in a darker room.

Darkening rooms in daytime for non-cinema watching is just a workaround, which is both impractical and not necessarily preferred. 1000-1500 nits is not perceived as bright in daylight, but is perfectly comfortable and legible. A little more would be nice, but not much.

(If you are light sensitive and it hurts because of that, then this is a different topic - but that is not the norm)


Glare from windows or reflected from light walls/furniture can easily exceed that, so any dark scenes (or dark editor theme) are entirely illegible. No amount of screen brightness can make blacks darker under glare, which will show up as the brightest white the screen can produce.

Darkening the environment is the only option for displaying any dark colours and it happens to make high screen brightness unnecessary.
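
Rough numbers for that point (values assumed for illustration): reflected room light adds the same luminance to black and to white alike, so the effective contrast collapses no matter how bright the panel gets.

    # Assume ~50 nits of reflected ambient light on the screen surface.
    glare = 50.0
    for peak_white in (500.0, 1500.0):
        black = 0.05 + glare            # near-perfect panel black + reflection
        white = peak_white + glare
        print(peak_white, round(white / black, 1))
    # -> ~11:1 and ~31:1 effective contrast, nowhere near the panel's native ratio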


I have bought a monitor with 600 nits so that in the summer I can work at home without drawing curtains. I can have sun in the room and still read the display without issues.


The current top-of-the-line LG OLEDs hit ~1500 nits in HDR: https://www.rtings.com/tv/reviews/lg/g3-oled

The QD-OLEDs from Samsung which use a similar trick as their "Quantum-Dot" LCDs do, also hit similar levels: https://www.rtings.com/tv/reviews/samsung/s95c-oled#page-tes...


Both of those tests report real SDR brightness of 500 nits, or 600 for 10% sustained. The QN90B test reports 1200 nits and 2000 (!) nits respectively: https://www.rtings.com/tv/reviews/samsung/qn90b-qled

Real scene SDR brightness is all that matters for use in a bright room, as it represents the overall brightness of the screen when playing normal content. HDR brightness on the other hand is only observed in certain scenes of particular HDR content mastered to use it - not to mention that such HDR content is more likely to be consumed in a dark room to enjoy the darker details not present in SDR.

I think the limiting factor right now on those new OLED sets is thermal throttling and power limits. But all is not bad - it shows great promise of a near future where self-emissive panels have good brightness.


Peak small-window nits is not the proper Rtings metric for viewing in a bright room. It's Sustained 100%, which is only ~300 nits on the brightest, most expensive OLED panel on the market. A bright LCD panel puts out 600-800 nits here. This is the closest metric to something like watching sports in a well-lit room.


rtings "real scene" brightness is a better measure than 100% sustained - I think some brightness limiting is acceptable for a fully white screen.

Still, that's 500 for those OLED sets vs 1200 for the QN90B.


When I was choosing a new TV I was a bit concerned about brightness. 2000 nits is so much better (on paper) than 400 nits. But I wasn't planning on putting my TV outdoors and got a C2.

My current brightness settings are at 30% when watching TV at night. The "bright room" setting bumps pixel brightness to 45%, which is good enough for a room with windows on 3 out of 4 walls. It is much brighter than the 10-year-old LCD it replaced. And I had no problem using that old one until I put the two screens side by side.


I also know people happy with their OLEDs, but if you use your OLED at anything less than 100% brightness throughout the day, the only reasonable explanation is that your living room is much darker than mine. Which is fine, it makes it easier for you to enjoy your TV, so good for you - but if someone suggested that an iPhone never needs more than 30% screen brightness (which, if the scales match, would be brighter than your TV setting), I'm sure everyone would agree that they were wrong, even if it is sometimes enough.

For reference, I have my QN95A on max brightness (with auto-dim to not be killed at night), and it's not quite as bright as I want, but bright enough to be comfortable - and I don't think I'm being unreasonable. I do have windows both left and right and can look straight at the sun from my couch position twice a day (sometimes both ways due to neighboring reflections), and for WAF reasons my curtains are the thin kind.


This simply hasn’t been my experience with a 43” OLED TV, which I use as both a TV and my main monitor when working. It’s facing away from the window, which I’m sure helps, but I’ve never had to draw the curtains to see what I’m doing.

Maybe if your TV is facing a window that's more of a problem, but if so I'd suggest not doing that. You're never going to get a good experience from a TV facing a light source because, if nothing else, it will reflect off the screen.


A brighter TV is less expensive than rearchitecting your living room.

I don't have any walls in my living room that don't face huge windows, so I had to splurge for a Samsung Q90. It does well for the most part. On sunny days I still miss even more brightness.


You look to have great experience in this field. What do you think about MLA OLED, and which one of these various technologies have the best chance of getting price-competitive with normal direct-lit LEDs at large screen sizes?


OLEDs really struggle in daylight, but are certainly fine in dimly lit rooms.


Our 77" LG C2 looks fine, even when we have all our windows open


I have a recent QLED, my parents-in-law have a recent OLED (LG), and they seem to be about the same. I'd say the QLED is easier to watch during daylight with sun in the room.


Having a Neo-QLED myself, the brightness is definitely a plus - still a bit dimmer than I'd like in daylight, but certainly better than having, you know, half the brightness with a regular OLED...

The main downside I can note is that the dimming zones are way too large - especially noticeable with local dimming set to High and credits scrolling. Not noticeable for normal content though. A mini-LED backlight would have made it indistinguishable from most OLEDs in anything but the darkest of cinema rooms...


Which generation do you have, 2022 or 2023? Dimming zones get smaller each generation. Neo-QLED branding started in 2022?


2021, QN95A75QE, an impulse-buy while they were massively discounted some time before the 2022 release. Mine has some ~700-ish dimming zones IIRC.

Moving bright content on a dark background reveals the dimming zone resolution as the zones chase the content - credits are the main thing that comes to mind, where the text brightness fluctuates as it transitions between zones. A circular loading indicator can also trip this, but the TV recognizes it after a rotation and then evenly illuminates the path.
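
For a sense of scale, a back-of-envelope sketch (assuming ~700 zones laid out in a grid roughly matching the panel's 16:9 aspect ratio):

    import math

    width, height, zones = 3840, 2160, 700
    cols = round(math.sqrt(zones * width / height))   # ~35 columns
    rows = round(zones / cols)                        # ~20 rows
    print(cols, rows, width // cols, height // rows)  # each zone ~ 109 x 108 px

So a line of credit text a dozen pixels tall still drives a backlight patch over a hundred pixels tall, which is exactly the fluctuation described above.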

Turning down the dimming from max would likely help, but some may be surprised to hear that credits are not my main content priority.

The brightness is great for my bright living room though, and worlds apart from the OLED sets I've seen in similar settings.

As an aside, super happy I got the 2021 - a friend got the S95B, and the newer Tizen UI is way more intrusive and more ad-filled.


Thanks!


I think all Neo-QLED TVs are mini-LED - something like 1000 dimming zones?


IIRC, 500-750-ish. Good mini-LED are in the several thousands.


You can also use a ton of small LEDs for the backlight, like what Apple calls the 'Liquid Retina XDR' display, and there's probably a world of controller software to improve on for years.


Which is what the LCD-TV industry did for years. It's no longer economical, except maybe in specific high-margin industries (like medical displays).

But even in those areas I'm confident that it's existing tech; I doubt that there's a lot of investment going on.

WOLED panels are on a trajectory from Premium to High-Tier, with production volume increasing each year, and RGB-OLED is increasing in panel size each year as well. The window for investing in R&D for a large-volume LCD panel which outperforms OLED in accuracy but at a lower price has already closed.


The generic term for this is mini-led which is not to be confused with micro-led. Apple is expected to start using MicroLED displays within a couple years.


I find it quite astonishing their newer monitors run iOS.


I believe the point is that the main research is now on OLED and micro-LED. Those are truly LCD-less. Instead of having a backlight that is variously dimmed by an LCD panel, they have self-emissive pixels.

Micro-LED truly aims to have just a single LED per (sub)pixel.


Micro-led seems to me to be one of those technologies that's always just around the corner.

Perhaps I'm a pessimist, but ever since the launch of the first mainstream OLED TVs several years ago, I've been hearing "OLED is just a stopgap, MicroLED is where it's at, and that's not far off now".

Ten years later... OLED is still where it's at, or LCD for a lot of people. I know these things take time, but the anticipation of this tech seems to have lasted at least a decade so far.


Maybe you're not old enough to remember, but OLED was also touted as the future for many years before it actually became available - I remember hearing about OLED when flat-panel LCD monitors were starting to become widespread in the early to mid 2000s, but it still took some years for OLEDs to appear in phones and then, much later, in TVs. Actually, the first device with an OLED display that I had was a cheapo Chinese USB-stick-with-integrated-MP3-player which I bought more out of curiosity about the OLED display and never actually used, because I already had a phone that could play MP3s (this one: https://en.wikipedia.org/wiki/Sony_Ericsson_K800i).


My dad had a textbook that said something like 'when current is applied, liquid crystals glow; although this effect is interesting, it is completely useless.' The first LCD watches came out as he was finishing school.


That's not how LCD Displays work, though.


The book was thrown away a few decades ago, I remember seeing it as a kid, but floods got it since. I'm not sure what the quote is.


They do if you power them wrong enough


The first commercial OLEDs launched in 2007 I think, I had an OLED phone in 2012 (AMOLED?), and then the TVs got 'big' commercially in, what, 2017ish?

My point isn't necessarily that MicroLED is endlessly delayed, so much as that people have been endlessly talking about it for years, and it's still years away from even high-end consumer models. I don't remember the same for OLED; in fact I remember people getting very hyped over LCD vs Plasma, and OLED not really getting a mention as a TV tech, even when it was in phones. Perhaps it's just my memory.


> Micro-led seems to me to be one of those technologies that's always just around the corner.

You can actually buy a MicroLED TV today, but they're huge and expensive:

https://www.samsung.com/us/televisions-home-theater/tvs/micr...


How funny they are selling a $150,000 display and still feel the need to inflate its size from 109.2" to 110".


Once it's powered on for 10 minutes, the heat causes the materials to expand.

Soon thereafter, the thrust from the cooling fans causes the TV to leave forever, but for those first few minutes it is a viewing experience beyond any other. Just last week, one was spotted cruising past the freeway at low altitude, out to sea. Some say it is still going.

Mind you, the display technology doesn't produce much excess heat. That would be the smart-TV CPU as it loads up a barrage of ads with which to torment watchers. Skip ad 14:56s


They are so massive that the difference is caused by the curvature of spacetime.


Huh? MicroLED is pretty new. I think the first research demonstration of a microLED display was in 2009 or 2010. You can buy microLED TVs right now. A bit more than 10 years to go from research to product is totally normal.


It’s a bit of a stretch to say they are available now. Yes, there are very low volumes of a few very expensive models being sold, but that’s not really what I’m talking about.


I thought MicroLED was about per pixel backlight for good old LCD.

Gonna hit wiki.

Edit: it's not LCD hehe. I have memorized some incorrect info


That's "mini-LED". MicroLED has small enough LEDs to use the LEDs directly as the pixels.


You seem to be right. I don't have them too confused, though. There have been MiniLED displays on the market for years, and they're all LCD.


Manufacturers blur this distinction on purpose to sell cheaper items


I believe mini-LED/zone dimming was a development expected to be followed by MicroLED, yet longevity and yield continued to be a problem for Micro, which left the "Mini" branding hanging in the air. IIRC.


Yeah, LCD tech isn't going anywhere; unless OLED is or becomes cheaper (long term, it probably will), they'll be churning out LED panels for decades to come yet.


There are companies still selling EL display panels.


Where are the old plasma displays? I suspect those, at least, are not being made any more?


Panasonic ended their Viera plasma line about 10 years ago, unfortunately. AFAIK they were the last and/or the best.

Mine still works great, although it still draws ~200 watts.


It's amazing that pixel density has been stagnant since 2014, when the first 5K TV (low ppi) and 5K desktop displays (>200 ppi) were released. It's 2023 now and it still takes a kidney to get 5K at >200 ppi, and we only recently got a 6K >200 ppi option for 2 kidneys. Pixel density, however, is stagnant.

I expect the industry realizes 6k is basically the stopping point so they're intentionally approaching it very slowly.

Edit: Updated with ppi specs to draw focus to pixel density of desktop displays.
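
For reference, the pixel-density math behind those numbers (using a 27" 5K panel as the assumed example):

    import math

    h_px, v_px, diag_in = 5120, 2880, 27.0
    ppi = math.hypot(h_px, v_px) / diag_in   # pixels along the diagonal / inches
    print(round(ppi, 1))                     # ~217.6 ppi, i.e. the ">200 ppi" class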


I saw a Samsung 8k screen and the resolution was amazing. Of course there was not much more content available than just that demo movie, but still it was interesting.

I do see individual pixels on retina screens but I don't notice them on more organic stuff like movies on my screen. I guess computer content is just perfect squares and then perfect circles and fonts with some jaggy edges where they're rounded off, which is really not that hard to spot on retina screens. The difference between 4K and 8K in my opinion is that you can better see little glimmers, textures and things like that. But the 8K demo movie of course was highly optimized to show exactly that.

What I really like is HDR on true black OLED, killer feature.

But what I want more than anything is a "compression" feature on the audio of movies. I mean I just want to hear what people are saying to each other without the neighbors calling the police when going into a scene with explosions and/or music. I'm using English subtitles for English movies nowadays, FFS!!!
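
(For what it's worth, some receivers and TVs do ship this as a "night mode" dynamic range compression option. A minimal sketch of the idea, with assumed threshold/ratio values: block RMS levels above a threshold get pulled down by a ratio. Real implementations add attack/release smoothing and lookahead.)

    import numpy as np

    def compress(signal, sr, threshold_db=-25.0, ratio=4.0, block_ms=50):
        # signal: mono float samples in [-1, 1], processed in short blocks.
        out = np.copy(signal)
        block = int(sr * block_ms / 1000)
        for start in range(0, len(signal), block):
            chunk = signal[start:start + block]
            rms = np.sqrt(np.mean(chunk ** 2)) + 1e-12
            level_db = 20 * np.log10(rms)
            if level_db > threshold_db:
                # Reduce anything above the threshold by the ratio.
                gain_db = (threshold_db - level_db) * (1 - 1 / ratio)
                out[start:start + block] = chunk * 10 ** (gain_db / 20)
        return out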


> "But what I want more than anything is a "compression" feature on the audio of movies. I mean I just want to hear what people are saying to each other without the neighbors calling the police when going into a scene with explosions and/or music."

This is due to the (misconfigured? buggy?) way that 5.1 surround audio sometimes gets mapped to 2-channel output. Dialogue in a film is usually carried on the centre channel, which in a proper cinema will be a pretty powerful and loud set of speakers. Music and surround effects are carried on the other channels.

But for whatever reason, sometimes when down-mixing the centre channel comes through way too quiet and gets drowned out by the others. This seems to have gotten better in recent years (Apple TV, Netflix, etc are either good at down-mixing or the streams come with good 2-channel audio), but some TVs still make a mess of it.
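
To make the mechanics concrete, here is a sketch of a plain 5.1-to-stereo downmix (the 0.707 coefficients follow the common ITU-style -3 dB recipe; exact gains vary by decoder, and an attenuated or dropped centre is exactly what buries dialogue):

    import numpy as np

    def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs, center_gain=0.707):
        # Dialogue lives mostly in C; a too-low center_gain buries it under
        # the music/effects in L/R/Ls/Rs. LFE is commonly discarded here.
        left = L + center_gain * C + 0.707 * Ls
        right = R + center_gain * C + 0.707 * Rs
        peak = max(np.max(np.abs(left)), np.max(np.abs(right)), 1.0)
        return left / peak, right / peak   # normalize to avoid clipping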


No, it's due to bad mixes being trendy these days. Downmix issues are certainly a thing, but the real problem starts in Hollywood, with auteurs like Nolan that no sound professionals can say 'No' to.


With Nolan at least, he's acknowledged it was a deliberate "creative decision".

> “We made carefully considered creative decisions,” Nolan explained. “There are particular moments in this film where I decided to use dialogue as a sound effect, so sometimes it’s mixed slightly underneath the other sound effects or in the other sound effects to emphasize how loud the surrounding noise is. It’s not that nobody has ever done these things before, but it’s a little unconventional for a Hollywood movie.”


Which just rephrases my point.


Also, in case it helps someone: OSMC/Kodi has a similar "downmix center mix level" option. Mine is always at -7 dB for 5.1 audio. Works like a charm most of the time.


Or if you have a PS5, plug the controller into the console (must be wired, not BT) and then headphones into the controller. The virtual surround is surprisingly convincing and sounds quite natural and not headphoney.


I'm not so sure that's the case. I have true 5.1 and I still have to have the center channel at a relative +7db minimum for the dialog to be as audible as I want it to be, compared to the rest of the audio.


Yeah, but that's probably because your centre channel is relatively smaller than on the cinema audio systems the 5.1 track was designed for. So setting it to +7dB at home is probably normal/expected.

After all, (Tenet aside) we don't experience these problems at actual cinemas, where the centre channel is a huge speaker stack behind the screen, and the surrounds are relatively small by comparison. But I agree that it's a common issue on on home setups.


If you've got a receiver, a lot of them have audio compression options. Not sure how well they work though, but they're there. I think some TVs have them too. A receiver should also let you increase the levels on the center speaker vs the others (maybe even if you only have a 'virtual' center where you don't have an actual center speaker and the audio goes to left and right). The center speaker gets mostly dialogue, so that can help a lot.


I have a Marantz receiver. They call this Dynamic EQ, and it seems to work quite well.


Ha, I thought I was weird for using subs while watching English stuff. I have small kids sleeping not so far away and can't wake them up just because some idiotic sound mixer thought that some explosion/crash/whatever should shatter your windows and that's how everybody should experience a given movie.

Older movies did so many things so much better. Sound 'friendliness' is definitely one of them.


Yeah, the blacks, absolutely. Recently upgraded to a PS5 (for 4K discs) and a Samsung S95B. Some of the most impressive discs I own are well-restored films from the B&W era. The dead blacks and (with HDR) bright highlights give it a wonderful, natural projected look. The quality is probably better than even the most pristine release print, since these days these sorts of restorations are done by scanning the original camera negative, so several analog generations better than prints for projection.


Agreed on both counts. 4K OLED > 8k LCD and dialogue is so hard to understand I have to resort to voice enhancement options.


I only watch TV wearing headphones.


I just bought a new receiver after my old one was killed by lightning, and I was very surprised to find that few receivers, and virtually no inexpensive ones (< $500), can pair with Bluetooth headphones. They all support Bluetooth, but only as an audio source.


If you use your receiver to send to a Bluetooth device, it's not really doing its job of digital to analog conversion or amplification anymore, it's just acting as an audio relay. Most times the source that is plugged into your receiver can do the pairing to Bluetooth, so the receiver is sort of redundant. I admit it'd be a nice feature to have for occasions when you want it, though.


The positive of using the receiver is that the receiver is also usually the AV switch determining what device is actually active. Otherwise you're having to change what device you're paired with when you change inputs.

Also, I haven't been shopping for things like Blu-Ray players and Rokus/Fire TVs/other media boxes in a while, do things like that really do Bluetooth pairing?


> Blu-Ray players and Rokus/Fire TVs

I'm not sure if any of those do bluetooth pairing these days. In retrospect I probably shouldn't have said "most" sources will do the pairing themselves. Some will.


I'd like it in the receiver because Bluetooth introduces a delay that my receiver can compensate for and I can configure that in one place rather than in every source.


If you’re in the Apple ecosystem, connecting Airpods to an Appletv is pretty flawless.


You're right. I've been thinking about picking one up because the software in my Sony TV is pretty bad and Sony stopped supporting it a couple of years ago (the TV is a 2019 model).

I'm probably going to buy one because the ethernet port in my TV recently failed and I had to switch it to WiFi. Streaming on the TV seems to be less reliable and an AppleTV would fix that as well.


It’s amusing that you want more dynamics to your visuals, but fewer dynamics to your audio…

Why the disconnect?


Vox had a good video on why dialog sounds terrible in movies/shows these days.

https://www.youtube.com/watch?v=VYJtb2YXae8

It boils down to a combination of:

- Actors speaking naturally instead of projecting toward a microphone

- Smaller microphones

- Dynamic range - big explosions have to sound loud, so dialog gets pushed quieter

- Downmixing from Atmos to 7.1, 5.1, stereo, mono

- TV speakers have gotten smaller and mounted on the back due to thinner TVs (and also poor-quality speakers on devices)

I guess the solution is to buy a surround sound system and hope it's good enough.


>> I guess the solution is to buy a surround sound system and hope it's good enough.

This made a huge difference for me. I went from TV > Sonos Beam > Sonos Beam + two surrounds. I've tested going back to just the Beam (I wanted to use the surrounds as stereo music speakers) but the huge decrease in my ability to understand dialog made me keep the surrounds. It's particularly problematic on TV shows where background music is used a lot. Having that come out of the surrounds and dialogue from the Beam made things so much better.


> I guess the solution is to buy a surround sound system and hope it's good enough.

Yet many people opt for a soundbar which has similar issues with the physics of thin/small speakers. I personally have a set of restored Bozak B-302As from 1958, big box speakers that are a solid meter cube, and they sound great :)


The consequences of high dynamic range video are trivial to enclose within the bounds of one room in a way that high dynamic range audio is decidedly not trivial to enclose.


Hearing loss or damage is more of a challenge than vision issues as well. Wider dynamic range doesn't hinder visual understanding the same as it does audio.


They did lay out their reasoning pretty clearly... Audio goes beyond the room/home, video does not.


I want more dynamics in audio too. But that requires keeping my speaker volume settings loud.

I can easily isolate light pollution by closing the door and the sun shades. But isolating sound is difficult.

At this point, going to the cinema is the only way to enjoy highly dynamic audio.


I recommend getting a used receiver from pre to early 2000s and getting some used passive speakers and a sub to go with it. I spent $100 on my setup and it sounds better than any sound bar and can be endlessly adjusted to match my room’s acoustics


Agree with this, lots of sound bars are crap and even the ones that aren't are very constrained by their form factor. Good speakers can actually make very dynamic audio be less annoying as well, by virtue of being less distorted. Same volume, more signal, less noise.


Yep and they are maintainable. I replaced the foams on the drivers and the tweeters when they gave out, $30 worth of parts and easy to do


Well, the only affordable one.

https://smyth-research.com/ works pretty great


Speaking of affordable, Apple TV + Dolby Atmos movies + AirPods Pro 2 with spatial audio enabled is probably the consumer friendly way to go for dynamic surround sound that won't get you evicted.


Dolby atmos is trash.


You should additionally explain why you think it's "trash", so that your comment might have some value to people.


We can probably draw an analogy with frame rate: why do movie fans insist on 24 fps, when TV is 50/60 fps, and you could play your games at 240 fps if you really wanted to?... Because "more" is not always better.

Btw. I do actually prefer LESS contrast in video sometimes, just as I prefer a compressed dynamic range in audio. This has a lot to do with the environment I'm watching / listening in, and the quality of my screen and headphones or speakers.

And finally, using dynamic range to "enhance" the sound of detonations is just awfully childish. If you are watching a movie just because the detonations are loud in it, I would argue you are watching the wrong movie...


> We can probably draw an analogy with frame rate: why do movie fans insist on 24 fps, when TV is 50/60 fps, and you could play your games at 240 fps if you really wanted to?... Because "more" is not always better.

That seems to be more "they can get away with more at 24fps", so using the same techniques but just upping the FPS results in more of it looking "fake" and a lot of "it doesn't look like the movies I'm used to".


Also sometimes referred to as the soap opera effect, as those shows tended to be shot on video instead of film and ended up with higher frame rates. Somehow we've associated that smooth motion with cheap and shallow content, which turns people off to it.


Maybe they’re looking for contrast in a particular scene, eg bright beside dark. If a sequence as a whole is very loud, while a later sequence is quiet, they might have to drop the volume down because of the loud part, effectively compressing the range of the quiet part to the point that it is unintelligible.


Play a podcast, drop the volume to 50% for a few minutes, then to 150%.

You'll understand right away.


TV resolution goes at the pace of available content, and everything goes at the pace of context (living room, 8-12' away), which is presently stuck at roughly what the human eye can resolve at that distance, which is about 4k.

There isn't a world where a 5K or 6K TV ever makes sense. A monitor, yes. But not a TV.
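
A back-of-envelope check of that claim (assumed numbers: a 65" 16:9 panel, 10 ft viewing distance, ~1 arcminute of visual acuity):

    import math

    diag_in, dist_in = 65.0, 120.0                       # 65" panel, 10 ft away
    width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)    # ~56.7" wide
    pixel_pitch = width_in / 3840                        # one 4K pixel, ~0.0148"
    arcmin = math.degrees(math.atan(pixel_pitch / dist_in)) * 60
    print(round(arcmin, 2))   # ~0.42 arcmin, already below the ~1 arcmin limit

At that distance a 4K pixel is already finer than typical acuity, which is why going past 4K buys nothing for a TV while it still matters for a monitor viewed at arm's length.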


> which is about 4k

4K does not exist because it's optimal for the human eye. It's dominant right now because anything higher carries a really significant bandwidth penalty, which would be hard to handle for both streaming and optical media.

If we lived in a world where Blu-ray discs could hold 1TB and global internet speeds were 10x what they are now, 8K TVs would be the norm.


People can't even tell the difference between 4k and 8k. You need a microscope.


People have been saying "You can't tell the difference between X and Y" since the dawn of HD graphics in general.

"You can't tell the difference between HD and FHD." FHD and QDH. QHD and 4K. 4k and 8K. 60FPS and 120. 120 and 240. A higher resolution texture. Higher resolution audio. A higher resolution mesh. Etc.

Every single time someone makes the claim it makes no difference, yet we do it regardless.

For something that seemingly makes no difference, people sure seem to continue improving it regardless. And shockingly, consumers seem to really enjoy the products too.


Yeah, but you actually can't tell the difference

https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-co...


Maybe. That test doesn't cover everything. Not all content, not all TV sizes, not all individuals, not all display technology, not all environments, not all compression tech, etc.

What about images that use HDR10+? Or Dolby Vision? What about completely uncompressed lossless video? What about common compressions like x264 and x265? What about on an 80-inch screen? What about dark vs bright scenes? Or even environments? (Human eyes handle images differently at low vs high brightness.)

I'm not saying the data is worthless, because it's not, but I am saying that I think it's too sweeping to say it's completely unnoticeable. Not enough data for that conclusion. Vision and display technology are both very complicated topics and there's always room for improvement IMO.


Pixel density has diminishing returns; we've known forever it's not linear. It's the same reason why phones aren't all 4K by now: nobody can tell the difference.


Every single upgrade in resolution has been diminishing returns, and that hasn't stopped anyone from doing it anyways.

The only reason it will ever stop is if we hit major technological or even physical barriers. And even then, we'll most likely just move onto another technology that allows yet higher resolution.


Now you're gonna tell me I don't need more than 44,100 Hz for sound.


You don’t need more than 44,100 kHz thanks to the Nyquist-Shannon sampling theorem.

https://en.m.wikipedia.org/wiki/Nyquist–Shannon_sampling_the...
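
For reference, the condition the theorem gives (B is the highest frequency in the signal, roughly 20 kHz for human hearing, so 44.1 kHz clears it with a little margin left for the reconstruction filter):

    f_s > 2B \;\Rightarrow\; x(t) = \sum_{n=-\infty}^{\infty} x[n]\,\mathrm{sinc}\!\left(\frac{t - nT}{T}\right), \qquad T = 1/f_s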


If you've got an ideal filter, maybe. Having some margin below half the sampling frequency helps filter design.

Then again, the frequency response of the loudspeaker is not flat, so I dunno if the sampling filter cutoff frequency matters.


Yeah but you can keep source material at 44.1k and oversample before the DAC.

And 24 bit is nice purely because you can regulate volume in software without losing fidelity (if DAC is good enough). But again, no such requirement on source, just the processing pipeline
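
(A rough sketch of the headroom argument, assuming a fixed-point pipeline and the usual ~6 dB-per-bit rule; the numbers are only illustrative.)

    # Digital attenuation throws away roughly one bit of resolution per 6 dB.
    # A 16-bit source at -30 dB software volume is left with ~11 effective bits;
    # a 24-bit processing path keeps ~19, i.e. still more than the source had.
    def effective_bits(pipeline_bits, attenuation_db):
        return pipeline_bits - attenuation_db / 6.02

    for bits in (16, 24):
        print(f"{bits}-bit pipeline at -30 dB -> ~{effective_bits(bits, 30):.1f} effective bits")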


The real rabbit hole is when you start looking into the frequency response of your eardrums


Sure but I guess the goal is for the replayed sound to sound the same as the original would.

According to Wiki the absolute max an amazing(?) ear can hear is 28kHz. You can maybe hear aliasing of higher frequencies too?

I can't hear any difference between 44.1kHz and faster sampling, but we can't refute it on purely theoretical grounds.


That's not what I'm getting at. For most purposes the 22KHz limit is a hard one. What I mean is that the eardrums do not have a linear response at any frequency.


I don't know if that is meant to be tongue in cheek, but for most ears it is physically impossible to discern any broader audio spectrum.


Actually it is, but I get stoned to death even when I share audible sound deltas between lossy and lossless codecs, so I'll just leave it at one sentence:

    Higher sampling rates and higher quality encodings primarily result in a broader, more impressive and immersive sound stage; extra detail is secondary, and in some cases elusive to chase after.


"more impressive and immersive sound stage" is a nice way to say "placebo".

Lossy codecs, at too low bitrates, can be a problem, yes.

But any content can be recorded and played back as 48kHz 24-bit FLAC, which is perfect. No amount of money you spend can improve upon that.


Here we go.

So I have a setup: Akai AM2850 Amplifier, a pair of HECO Celan GT 302 speakers, a Yamaha CD-S300 CD player, which has USB input with iPod interface and MP3 capability.

A well-mastered album (read: no brickwalling) like Wasting Light (Foo Fighters) or Brothers in Arms (Dire Straits) will give you a larger soundstage when you listen in front of this set from CD (or any lossless source, but let's keep everything on the same DAC), even compared with 320kbps MP3 (it's minimal, but audible).

This can be dismissed as placebo, alright, but audio is a hard science. Listening is subjective, and there's an aura of lies around it due to the great deal of real snake oil in this field.

Yet I've listened to the same amplifier for 30 years, and I know how it behaves with any genre, any sound source and any input level, so I can make sense of the subtle sound stage changes, because I'm so familiar with it.

Honestly, I've played in large orchestras and whatnot, and when you listen intently, the difference is there. It's the difference between a good cup of Tchibo and a good cup of Davidoff coffee, but it's there.

I'm currently not at my home, but if you want I can provide you a fresh set of sound deltas (FLAC vs. 320kbps MP3) when I return.


I agree that higher encoding quality can provide the differences you describe, but higher sampling rates (beyond 48kHz, that is) cannot.

Personally, I try to acquire all music on CD or as FLAC from e.g. Qobuz or Bandcamp. But I see no reason to go beyond 24-bit 48kHz for any kind of audio unless I plan on slowing it down for e.g., a slow-mo video.


I have a 24-bit WAV version of Radiohead's "OK Computer Oknotok" directly from the band, and it sounds absolutely beautiful, though I don't remember whether it's 48K or 96K.

In any case, I really don't believe that I have a system that can render 96K sound meaningfully differently than 48K. I'm also not sure that I can tell the difference unless I try very intently, either.

I'm also an avid Bandcamp shopper. Some of the people there make really beautiful music.

Lastly, if I'm happy with the sound I get, I don't run after numbers. I use a Behringer Bass V-Amp as my bass processor, and even though it's far inferior on paper, the result isn't, to my ears.

Heck, if Haggard can use this thing for live and studio, why can't I?


A sound stage is not quantifiable, is subjective and is probably not real. I've heard similar things said many times for a number of audio systems and encodings, and in a blind test it is just not discernible, IME.


It's discernible and measurable. It's called stereo separation and instrument placeability in mastering parlance. iZotope Ozone has a nice stereo imaging view to show the stereo separation in tracks.

When you mic a big orchestra (or any orchestra) clearly, you can capture the stereo image (which is generally done with non-instrument ambient microphones), too.

Mixing this ambience into the final stereo mix can improve instrument placement when done correctly, but it needs a good room with good acoustics. This is why we have Atmos and other tech. It allows you to virtually position channels in a room, and increases this separation, in theory. Nice when done subtly, ugly when done overboard.

Also, the speakers' sound rendering has something to do with it. For example, the Celans I have can fill a room with positionally correct instruments fairly impressively. What's more impressive is that Creative Gigaworks speakers are also capable of doing this while being desk speakers (they have kevlar cones and silk tweeters too, mind you).

There's a sensible upper limit to what's worth paying for audio systems. Mine is neither top of the line, nor too cheap to be true. It's a powerful yet vintage all-analog powerhouse, and I've had it for 30 years or so.

After some point, even if the resolution provided by the system is justified by the money it requires, sources that can saturate it are nearly impossible to find. So, when you find something you like, you stop there. I plan no changes to my audio system, for example.

There are an infinite number of variables in audio (system, source, room, placement, etc.), but sound resolution and soundstage are not placebo effects. If they were, a $4 headphone would sound the same as a $44 or even a $440 one, yet they don't.

Same for speakers, amps, etc. Until a certain point as I said.


When you hit a drum there are all sorts of short-duration, high-pitched components that make that chest thump all the more impressive.


48kHz is better for video because it divides more evenly by popular video frame rates - and besides, it's the standard for audio alongside video, so you might as well use it for everything.

So yes, for that reason alone you need more.


> 48khz is better for video due to the way it can be divided more evenly with popular video frame rates

Is the half sample per frame at 44,100 Hz and 24 fps really an issue? DVDs support 48kHz, but they only support 25 fps and 29.97 fps. You're going to run into so much NTSC content that I'm not sure even divisibility is a thing.
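
(For reference, just printing samples per frame: 24 fps divides evenly into 48 kHz but not 44.1 kHz, while 29.97 fps divides evenly into neither.)

    # Audio samples per video frame for common rates and frame rates.
    for rate in (44_100, 48_000):
        for fps in (24, 25, 30, 24_000 / 1001, 30_000 / 1001):
            print(f"{rate} Hz / {fps:.3f} fps = {rate / fps:.3f} samples per frame")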


And if you'll edit, remix, and slow the audio down, you might want 96kHz or 192kHz, sure.

But for the final delivery, there's no need to ever go beyond 48kHz at 24-bit.


Why do you keep repeating 24 bits? It is way too much, even at 16 bits the dynamic range is far more than what you can reasonably hear (unless you want to damage your ears!) - especially if you are listening in a real world environment, and not in a completely silent room... Vinyl has a significantly worse dynamic range, yet it is loved by hi-end fans, sometimes even preferred to CD!

24 bits is useful for mixing various sources, but otherwise it is a waste of space.
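
(The rule-of-thumb numbers, assuming ideal quantization: roughly 6 dB of dynamic range per bit.)

    # Theoretical dynamic range of ideal PCM: ~6.02*N + 1.76 dB.
    for bits in (16, 24):
        print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB")   # 16-bit: ~98 dB, 24-bit: ~146 dB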


You're absolutely right. Personally, my two most used playback setups cap out at 44.1kHz and 16-bit and that's good enough.


Unless you're editing. 192kHz recordings slowed down to the audible spectrum are gorgeous.


Tbh, if you're editing, remixing, etc., 32-bit float at higher sample rates obviously makes sense. But not for a final delivery, and not for your playback devices. Even when I edit 192kHz 32-bit float audio (e.g. for a slow-motion video sequence where I need to slow the audio to match), my playback setup still only supports 16-bit 48kHz (and I don't need more).


My 20+ years of being involved in broadcast/TV/video technology says otherwise about TV resolution going along with content. It's actually a weird chicken-and-egg situation that involves more than screens or content, such as delivery, the price of content creation and customer acceptance.


Doesn’t the TV size matter? My little 42” 4K at 12’ can’t be delivering the same fidelity that a monster 83” at 12’ is.


A larger TV is meant to be viewed from a further distance though. At the suggested distance, your eyes shouldn't be able to resolve the individual pixels.


You're right, the angular size is the true determining factor. But most people don't have a large enough living room to sit 12' from their TV :)

e.g. I've got a 48" UHD TV at 2.5m (8') distance, and a 27" UHD monitor that I use at 60cm distance, and while the TV is beyond the resolution limit of my eyes, the PC monitor isn't.


12 feet? The kids are working on strapping a pair of them to your face and calling it VR!


Even with really good film scanners, 4K seems to be about the limit of what’s worthwhile. Even at 4K itself compared to 1080, film grain is much more evident (not a bad thing per se, but a thing).


You've missed screen size as a factor. TVs have been getting larger by about an inch per year since flat screens took over, and that trend doesn't seem like it's stopping any time soon.

There's also the type of content and your personal visual acuity. If you're young, your eyesight is probably quite capable of resolving much finer details than normal 6/6 vision. I can tell the difference between 4K and 8K at a normal sitting distance on a 65" TV.

However... the difference isn't enough for me to care. I'm much more interested in brightness, colour, framerate and a load of other things over another resolution bump.


Producing content above 4K is readily achievable and even cheap, but most of the good stuff is still being produced in 4K. So why is that?


As someone who used to run a photography company: the total costs of producing 4K content (camera system, computer setup to handle file management and editing, time required to process and manage content, storage, headaches, etc.) are absolutely enormous compared to 1080p. Higher-resolution content like 8K seems like a nightmare for everyone but the consumer. I do think and hope we’ll get to the point where it’s not a big deal, but even today a brand new MacBook Pro will struggle to render a basic 1080p composite in After Effects. Even basic 1080p footage editing in Premiere Pro can overwhelm reasonably modern machines.

(Yes, I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up! Especially for professional workloads where people love to push the bar higher and higher.)


Also, from experience: The bitrate and bandwidth of streaming services is so bad that the actual resolution doesn't really matter.

A good high-quality 1080p export will have significantly higher perceived quality than a "4K" (UHD) video on YouTube, Netflix, Amazon Prime or Apple TV.

I've decided that I'll produce all content in 1080p at the current time, even though I'm recording video in DCI 4K oversampled from a 6K sensor. No viewer is ever going to see the difference anyway.


There are several studies which say that viewer emotional engagement caps out at 1080p, and the only things that drive more engagement beyond 1080p are HDR/WCG and HFR.

Customers are also not as willing to pay for 4k as those invested in UHD would like to think. It's a "nice to have" and having big numbers makes people feel good, but it doesn't actually make them love the content more.


> I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up!

Cloud-based After Effects render farms do help significantly though.


You still don't seem to get the point.


I think I do. The point is that you can build your video in After Effects locally at 1080p (or less) on a standard MacBook, and then render the video in the cloud at 8K really easily and at quite a low cost. You don't need to "do X on your Y machine". You can rent someone else's machine for that.


> and at quite a low cost

So, let's actually look at that. For a "low-budget" 8K pipeline you'd first need a camera that can shoot at 8K or more. Ideally you'd want more so you can crop-in or do stabilization in post. So you'd be shooting on the Blackmagic Ursa Mini Pro 12K (~$6k) [1].

Usually you're using 5-10% of the material that you shot in the final edit, for some movies that can go as low as 1% (e.g., Apocalypse Now [2]), but let's calculate with a 45min documentary and 7.5% material used.

That means you've now got 30 TB of raw material [3]. You'll obviously use proxies for editing, but at least for color grading and for delivery you'll need access to the full material.

So now you'll need to store 30 TB of raw material somewhere in the cloud accessible to the machine that's doing the delivery. Even assuming you've got a symmetric fiber connection so we can disregard potential traffic limits and transfer speeds for initially uploading the material, you'll still need to pay for the cloud storage.

Creative Cloud has a storage limit of 100 GB for individuals or 1 TB for business plans [4], which is obviously far too limiting, so you'd need to use something like LucidLink. As you'd be working with large video files, you'd need high-performance storage, so you'd have to calculate with their performance plan, which is another $80 per TB per month, so $2400 per month just for this one single project [5].
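
For anyone who wants to plug in their own numbers, the back-of-the-envelope version of that storage math (the data rate is an assumed figure; pull the real one for your codec settings from the capacity tables [3]):

    # Rough shooting-ratio math for a 12K raw pipeline; data rate is an assumption.
    final_runtime_min = 45
    usage_ratio = 0.075            # ~7.5% of shot material ends up in the edit
    data_rate_gb_per_min = 50      # assumed, varies a lot with codec/quality settings

    shot_minutes = final_runtime_min / usage_ratio
    raw_tb = shot_minutes * data_rate_gb_per_min / 1000
    print(f"{shot_minutes:.0f} min shot -> ~{raw_tb:.0f} TB raw")        # 600 min -> ~30 TB
    print(f"Cloud storage at $80/TB/month: ~${raw_tb * 80:.0f}/month")   # ~$2400/month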

________________________

[1] https://www.bhphotovideo.com/c/product/1578059-REG/blackmagi...

[2] https://books.google.de/books/about/?id=wB7cAAAAMAAJ

[3] https://www.braw.info/capacity/

[4] https://www.adobe.com/creativecloud/plans.html

[5] https://www.lucidlink.com/pricing


And of course all of the above may be "peanuts" for professional studios, but I am not aware of any hardware/workflow/money/things you can throw at the problem to make the editing and production experience smooth. There are tons of inconveniences, practical barriers, bandwidth issues (and I'm not even talking about network/cloud bandwidth, that's a whole other thing-- even just disk IO, moving stuff around via USB-C, backing up stuff, etc... It's all super labour-intensive and annoying).


Well, if you're editing with proxies (which is really easy with Resolve or Media Composer), the editing experience is really smooth. But that doesn't help for grading or delivery, where you'll still need the full resolution files.

Even with Gen 4 NVMe storage you'll quickly hit bottlenecks at those resolutions.


You'd be surprised at how much buffering and loading is needed just to play back 1080p content (even directly from your local machine's built-in SSD) in the video editor before it's rendered. It's incredibly frustrating to do video work on "regular"/prosumer (MacBook Pro) hardware.


> After Effects

A lot of that is on Adobe more than anything else, mind you. Same with Premiere, too. These tasks can be a lot faster than Adobe software allows them to be. But there is a limit, of course


Agreed 100%, and I hear Final Cut Pro has much better performance.

But realistically, a lot of work happens on Adobe or other (Autodesk, Houdini, etc...) products that also have their own issues.


Asking as a person completely out of that world: can DaVinci Resolve or Final Cut Pro do the same as After Effects and be faster?


I don't think these products are directly comparable. I think Final Cut Pro probably maps better to Adobe Premiere Pro... But even then, it's not a complete overlap. There are things you can easily do in Premiere that you can't easily do in FCP, and vice-versa. In general, I find that Premiere is a bit more powerful/flexible than FCP, but FCP is much better software and does what it does much smoother and much faster.


No, they're for completely different use cases. While it's technically possible to edit a video in After Effects, its main use case is compositing VFX and advanced motion graphics.


The demand for it would be from a minority.

A lot of people are watching content on their phones, for one thing.

I create some travel video content and up until last year, most of my clients weren't even fussed about 4K. I had been shooting at 5.1K for most of last year and then dropped to 4K this year because no one has needed it, but the storage and delivery costs are higher.


Cost of delivery. Most stuff is being streamed these days, and nobody wants to have to pay for the cost of streaming 100gb files. Plus the cost of commercial videography equipment is insane. Why upgrade before you have to?


The better the capture, the longer the shelf life of that footage.

So, while the pipeline might not run at more than 4K, the source material ideally is at a higher resolution.

And you need to make sure it still looks good on lower quality displays.


Nobody is making movies that anyone would really care to preserve for the coming decades anymore anyways.

Besides which, we took a big step back when we switched from 35mm to digital. Digital has only recently reached par quality.


The BBC probably thought that when they recorded over old Dr. Who episodes. It’s up to our grandchildren to decide what they want to see and what they don’t.


Because 4k (and even 1080p) is good enough for most people?


Yeah, for TVs that sounds right, but not for desktop displays. I wonder if desktop display pixel density has only stagnated because the TV industry has so much influence on the momentum, and the TV industry thinks 4K is good enough.


I think that influence is pretty clear. In my neck of the woods, when 16:9 PC monitors became a thing, the marketing copy was basically about how great they were for watching movies.

Only now are some manufacturers starting to put out taller monitors, but they're still rare, at least in my market.


Is it even needed on desktop displays? My experience has been that on anything under 32” you need to use scaling to make anything readable on a 4K display. I’m currently using a 43” 4K display as my main monitor, sitting not very far from it; I can’t distinguish individual pixels as it is, and frankly even this is uncomfortably large. I have gutters set up around the edges so that windows maximise to a comfortably viewable size.


That's why I'm happy to see new 8K TV releases. I think it's overkill for 99% of TV usage, but it's needed for PC displays.


Also, if you compare the same laptop available with a 1080p screen or a 4K screen, the former gets much better battery life.


Bandwidth. ISPs already throttle streaming and have data caps.


Every time you increase the resolution, all the requirements in terms of CPU, RAM, bandwidth and storage explode for production, media distribution and the end user.

It is not just replacing one screen for another on the client's side.


Right, and then there’s HUDs!


> TV resolution goes at the pace of available content

There is also AI upscaling, which can upscale content to a higher resolution.


I don't care about resolutions > 4K, like, at all.

Things I care about:

- More HiDPI laptops that are not 16:9, preferably 15"-16".

- More 4K Monitors in the 22"-24" size range, preferably not 16:9.

Having more of those things would have a much bigger impact on my daily display usage than a 5K TV. Apparently, the market doesn't share my preferences in displays very much. We're finally getting 16:10 and 3:2 alternatives in the laptop space, after more than a decade of enduring 16:9, but other than that, almost everything 4K is 16:9 14" and 27"-32".


Why do you dislike 16:9?


It's only good for content consumption.

If you do code or anything that has text in it, the vertical space needed is infinite.

And even if you edit 16:9 video content, you need extra space for your controls.

Incidentally, on a 16:10 monitor you can have a 16:9 movie or game take up the whole upper part of the screen and still have your dock or task bar visible.


For "infinite vertical space" you can do 9:16, i.e. portrait. That said, widescreen is quite useful for two 8:9 windows side by side, which works less well on more square ratios.


> For "infinite vertical space" you can do 9:16, i.e. portrait.

The top feels too high for me then. Most ergonomic advice says the top of the monitor should be at eye level, doesn't it?

Also you will run into some web interface that was designed with landscape in mind sooner or later.

I have a friend who keeps two 16:9 monitors, one landscape and one portrait. That may work.

> That said, widescreen is quite useful for two 8:9 windows side by side, which works less well on more square ratios.

Matter of taste. I prefer multiple smaller monitors so I can keep a full monitor in my field of view. What you describe requires a monitor so large that you can't look at it entirely from programming distance.


> Also you will run into some web interface that was designed with landscape in mind sooner or later.

It’s fun when websites assume I’m using a mobile because I’m on a rotated 1200x1920 display.


> Matter of taste. I prefer multiple smaller monitors so i can keep a full monitor in my field of view

I think this is mostly due to software support, e.g. for ultrawides (21:9) you often can have it pretend to be two or three separate monitors. At that point the only difference between multiple smaller monitors and one wide one is whether you have bezels.

That said, in my opinion the problem is not the aspect ratio but the number of pixels. For example, I could take my 2560x1440 (16:9) display and put a black bar on the right to make it 1920x1440. That would be a "better" ratio, but having the extra horizontal pixels does not hurt (e.g. for a dock). The problem is that Xx1080 is just not enough vertical pixels, period. No matter whether it is 1920x1080 or 1080x1080.


Oh my multiple smaller monitors are 1920x1200 :)

There are no 4k 16:10 monitors that I know of - one day I'll have to make the jump.


Huawei Mateview is 3840x2560, so you even get a few more vertical pixels for a 3:2 aspect ratio. There also is the Surface Studio at 4500 x 3000, which, however, is an all-in-one.

I haven't tried either of them, but there are (expensive) options.


The Surface Studio isn't a monitor is it?

And at a quick glance the Huawei Mateview has too many "features" to call it a monitor as well.


Definitely agree. For software development I always end up rotating my monitors to portrait. You are always going to have more lines in a file than will fit on a landscape 16:9 screen. A taller screen gives you a much better "view" of the context of the code.

And with window title bars, tabs, URL bars, bookmarks, headings and subheadings, the first Jira issue appears more than halfway down the screen.


Display technology is not the limiting factor in resolution viability - remember that driving pixels is expensive for the whole system. A 4K 120Hz panel takes ~30 Gb/s of data to drive. For an 8K display, this is ~120 Gb/s.

That's a lot of data: content needs to be continuously rendered, read from VRAM, sent over a link during scanout, and finally processed by a fast enough display controller to update the pixels. High-end systems can do this, but for these resolutions to become the norm you need to be able to do this from low-power media boxes and entry-level laptops. Only then will such displays become the norm and prices go down.
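
The raw numbers behind those figures, if anyone wants to check them (uncompressed 10-bit RGB, ignoring blanking intervals and DSC, so real links land a bit above or below):

    # Uncompressed scanout bandwidth: pixels * bits-per-pixel * refresh rate.
    def scanout_gbps(width, height, bpp, hz):
        return width * height * bpp * hz / 1e9

    print(scanout_gbps(3840, 2160, 30, 120))   # ~29.9 Gb/s for 4K 120Hz, 10-bit RGB
    print(scanout_gbps(7680, 4320, 30, 120))   # ~119.4 Gb/s for 8K 120Hz, 10-bit RGB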


That's by far the cheapest part of it; decoding will eat more power and transistors than just a few differential lines. Add a few orders of magnitude if you need to generate it in a video game.


Differential transceivers and high-speed circuit design are not cheap before economies of scale kick in for that particular unit, but the mention of throughput was not so much to denote the link as to denote the speed at which content must render, scan out and decode...


Sony uses screens with 3840 x 1644 pixel resolution and 120 Hz in their top-level smartphones, so a laptop is probably able to do significantly more.


Driving at that rate means either fully hardware-decoded video, or rendering at a much lower resolution or refresh rate.

Take an Xbox series X pulling some 200 watts as example - it can technically drive 4k@120, but most content won't render at that speed or resolution.

Now look at a power budget of 20W for a laptop, or a single watt for a phone.


Gaming consoles do expensive 3D rendering. 2D must be much cheaper, otherwise the Sony phone couldn't do it.


That's not really how rendering works - there is no 2D or 3D, just variable complexity. On that phone, some UI activities like simple scrolling in some apps might be able to run at a full 120Hz. Many apps won't, and games will either dynamically render at a much lower resolution to keep up (a common performance technique), or just render nowhere near 120Hz, with sub-60 being more likely.

And that's at 4k. This thread is about higher resolutions. To give the same meh performance at 6k, the phone would have to be more than 2x as fast in every metric. At 8k, more than 4x as fast. All while staying within the same power envelope.

That's a lot to ask for in a generational bump.


Wait, we've had 8k TVs since 2019: https://www.cnn.com/cnn-underscored/electronics/8k-resolutio.... It doesn't even seem that expensive: https://www.bestbuy.com/site/samsung-55-class-qn700b-neo-qle... (this is just the first result on Best Buy's site). Are we thinking of the same thing?


I should have been more specific about pixel density. TVs have very low pixel density. I've updated my comment for more clarity about this.


Pixel density doesn't matter; what matters is angular pixel density. Your TV is 2x bigger than your monitor, but you sit at least 2x further away, so the angular density is the same.


ah ok. thanks.. it is confusing


"8k" name in this case is just a marketing. It's not 8k, it's 4230p only.


8k is double the vertical and horizontal resolution of 4k. You can argue that 4k was deceptive, since it switched from measuring the vertical resolution (2160p) to the horizontal, but 8k is just sticking with the established standards.


4k was the marketers cashing in a puffery token they'd been carrying for quite a long time by accurately measuring the short side of displays.

They moved to measuring the long side AND exaggerating by rounding up in the same generation, stealing the name of a slightly-better existing standard in the process.

You're right that 8k makes sense so long as you accept 4k, though.


Well, 4K is called that because when 1080p became common at home, cinemas switched to DCI 4K and suddenly TV manufacturers had to compete with that, so they branded the closest they could get to DCI 4K as their own "UHD 4K".


8k is 7680×4320. 4k is 3840x2160.


Nope, that TV is 4320p (presumably 7680x4320).

"4k" means ~4000x2000 pixels and "8k" means ~8000x4000 pixels.

It is 8k and reasonably priced at $1000 new / $752.99 used.


No, 4k === 3840x2160 resolution.


Originally 4K came from the DCI spec for digital cinema, where it was 4096x2160. For consumer use it was shrunk to UHD, which is 3840x2160, but cinemas still use 4K (although most screens actually use 2K).

Now 4K refers to any image roughly 4000 pixels wide, according to Wikipedia [1].

[1] https://en.m.wikipedia.org/wiki/4K_resolution


All new/upgraded auditoriums were getting 4K (laser) projectors when I left the business 5 years ago. So I'm not sure most auditoriums only have 2K these days?


IMAX only started switching to 4K laser projectors a few years ago; they were one of the last hold-outs for 2K projectors (and many IMAX cinemas still use 2K projectors, sadly).


Are those cinema projectors or more like video projectors? The cinema ones are more heavy duty and not changed so often.


Cinema projectors. It should be said that I worked in the business in Norway which had converted all auditoriums to digital in 2010, so many theatres were looking at upgrading their projectors around the time I left for something else.


...and 3840x2160 is roughly 4000x2000, thus "4K".

It's close enough for the marketing department, and now that's what it's understood to mean.


There really isn't much benefit of going above 4K with either TVs or desktop displays and multiple drawbacks. I run dual 4K monitors on my desktop and the pixel size is smaller than you can resolve at reasonable viewing distances, while I've yet to find an OS that deals with HiDPI displays well (Windows and Linux are both terrible at this). If you want to game you are pushing 4X the pixels as with 1080p so you don't get good framerates unless you spend an exorbitant amount on a GPU.

For content viewing, 4K is probably the upper limit of what you want, and if you game there probably isn't much benefit for going above 1440p.


Different people have different use cases.

Surprise

My DSLR has been taking pictures bigger than 4K for a few years already. Guess what? You can see a difference between the 4K 27" and a retina display.

And no for this use case you don't need a magic GPU.

And yes, I also see a difference in resolution on my 4K OLED between 4K and 1440p.


> a retina display.

What does this mean? The "retina" resolutions are all over the place and depend on the device size and type. Also they seem to always be somewhere in the middle of pc options, e.g. for the macbook air 13 you have 1920x1200 (average pc) < 2560x1664 (air or expensive pc) < 3840x2400 (overkill pc). For your example at 27" that seems to be 4k (average pc) < 5k (studio display) < 6k (dell etc.) < 8k (overkill).


Ah, I'm not familiar with it.

For me retina means my MacBook Pro 14", which should be 3K x 2K.


All displays from Apple have different resolutions. Natively they are comparable to previous Apple displays of half the linear density (1280x800-ish for 13”, 1080p for 15” and so on).


Okay, so you can "see the difference" (positively, I assume?) of this to a 27" 4k monitor. What about 27" 6k or 14" 4k?

Side rant: Yeah, apple makes great stuff, but the naming is obnoxious. It is not "hidpi" it is "retina", not "high refresh rate" but "promotion" and then you have to look through the marketing material to figure out how '14" retina' compares to a normal UHD+ display.


If I can see the difference there, I would assume I'd also be able to see a difference between 27" 4K and 27" 6K.

My Canon 80D has a resolution of 6K x 4K.

I can also perceive the sharpness of text.

I don't mind not running 8K if driving the display pixels consumes too much energy. The performance argument, though, should not really exist. It's a lot more pixels, true, but there are plenty of techniques to separate render resolution from display resolution.

Even in gaming they have adaptive rendering, but the display resolution itself stays the same.


The power consumption is not so much on the GPU side. Denser pixels require stronger backlights because less light gets through, so even if you were to run the screen at half the max resolution, you'd get worse battery life. Similarly, OLED is awesome tech, but high brightness requires more energy than LED backlights.


My comment was based on visual acuity limits. A person with 20/20 vision and a 32" 4K display will hit that limit at around 2ft. Going to a higher resolution you won't be able to see individual pixels anymore unless you have better than average eyesight or sit close to your monitor.

The GPU bit was purely about games and how many pixels you are pushing in games vs cost for a minimum quality and performance level. For content consumption or creation use it isn't a problem.


Gaming on a proper 4K native display is superior to 1440p: I have both, and the clarity is worth it. In my opinion of course!


Speak for yourself. I want a single curved 55" 8k display to replace my three 32" 4k monitors. Going from 6k to 8k with no bezels is an upgrade I will pay $$$$ for.


For gaming it’s useful, but for work I like the separate panels. Multiple single-window screens beat overlapping windows.


Why overlapping? Nothing prevents you from tiling the large screen into multiple screens; just with zero bezels between them.


Then the temptation to drag the "bezels" is too great - I have poor impulse control.


I like having a 4k monitor with a somewhat underpowered video card for the task. Most games that are graphically intense let you set the 3D render scale to something lower. At 70% render scale I get the benefit of sharp 4K text and UI stuff but with the 3D parts being closer to 1440p resolution. Upscaling can be really ugly at lower pixel densities, but rendering 1440p scale on 4K is hardly distinguishable from 1440p native, in my experience.
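
(For reference, the internal render resolutions at a few scale settings on a 4K panel; just multiplication.)

    # Effective 3D render resolution for a render-scale slider on a 3840x2160 panel.
    for scale in (1.0, 0.85, 0.7, 0.5):
        w, h = round(3840 * scale), round(2160 * scale)
        print(f"scale {scale}: {w} x {h}")   # 0.7 -> 2688 x 1512, a bit above 1440p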


KDE has handled HiDPI really well since a few weeks ago. Try the current release of Ubuntu Studio on Wayland; it's working incredibly well.


I've got a 5K 27" display at home (the much maligned LG Ultrafine) and generally use a 4K 27" one at work. The difference is very obvious.

That said I wouldn't see a huge amount of point in going far _beyond_ 5K at that screen size, and 27" is about as big as I want to go for a screen, so for my purposes we've hit the limits (albeit only expensively).


I've been wearing glasses for most of my life due to nearsightedness, and I don't care at all about density anymore. Even 4k is a bit luxurious. As I get older, it matters even less. I simply cannot tell the pixels apart anymore.

The real limiting factor is display size at current densities. I can always use more real estate. A 49" super-ultrawide is still fairly expensive (but getting cheaper quite quickly), even at sub-4K densities.

Even with phones, Apple pretty much nailed it with the Retina display. The only thing I notice with newer displays is that the colours are better. In terms of smoothness, I can't tell the difference.


This. It's not like 16K displays wouldn't be desirable or useful in some use cases. It's that they aren't useful for most consumers.


3,400 PPI is in the display of Apple Vision Pro

That's an order of magnitude increase over current displays today.

https://wccftech.com/apple-vision-pro-retail-units-have-lowe...


An HMD is much closer to your eyes than a computer monitor, and a monitor will be much closer to your eyes than a TV in your living room. The closer it is to your eyes, the more PPI matters.


Please compare like with sort of like.

Oculus Quest 2 has about 800 PPI at roughly 1/12th the price.


Here in Japan, every big electronics store sells 8K televisions from a number of Japanese brands. NHK also broadcasts some (all?) of its content in 8K. I don’t know much about the technical aspects, so I can’t attest to whether it is “true” 8K being streamed, but it looks much better than the 4K OLED televisions, which also look amazing.


NHK only broadcasts 8K on one of their channels. And AFAIK all 8K TVs are in the premium segment.

The EU has new energy efficiency requirements which basically mean 8K in the EU market won't happen unless there are substantial improvements in technology. 8K has more exposure in Japan mainly because Japan was the first proponent of the technology.

I'd much rather we settle on 6K and HDR10, and leave room for 60Hz or 120Hz for the types of video that require those framerates.


Samsung has already sidestepped the EU's 90W regulation by shipping 8K displays this year with the display dimmed in the factory settings (you can turn this setting off).

There's also nothing physically stopping engineers from designing more efficient 8K displays that are sub 90W (in fairness they generally are more than double that atm, though)


Except for cost. Even before the EU regulation, 8K had been a hot topic between panel vendors and TV brands, mainly pushed by Japan. And the market was showing signs that 8K may end up just like 3D TV. The EU regulation was more like the nail in the coffin, forcing panel makers to focus on other aspects rather than pixel density, which is something much harder and more costly to achieve with self-emissive panel technology.


Also worth mentioning that not all pixels are created equal. Many years ago a Nokia phone’s camera had 44MP, but the images were worse than those from many 12MP phone cameras.


Yes. A 4K TV is enough for people with 20/20 vision, which is roughly ~80% of the population. And even with better vision, the extra pixels are mostly only visible in static screen comparisons. In motion, as in video, they are not as visible.

We really should have taken the middle ground between 8K and 4K and settled on 6K. Instead we now have 4K, which is good enough but not the best for most people.


It’s no secret that higher pixel counts are harder to achieve since computational power required grows approximately quadratically while GPU and CPU power improvements are more linear.


We also just didn't have any good ways of driving such high PPI displays until recently without having to resort to janky hacks like dual cable setups (and the lines down the middle that those tend to have).

A lot of laptops still don't support HDMI 2.1, and DP 2.0/2.1 only just came out (and I don't know if any displays are using it yet). There's also the bullshit the HDMI forum pulled where they relabeled HDMI 2.0 as HDMI 2.1 (years after HDMI 2.1 was out and established as "the one that can do 4k120hz no chroma subsampling" (a lot of implementations were around 40Gbps instead of the full 48Gbps because it was cheaper, and they didn't need the full 48Gbps for 4k120hz 10bit)), despite them having hugely different specs.

Displayport 2.0/2.1 is also a clusterfuck because they decided that it would max out at either 40Gbps or 80Gbps depending on the implementation (DP40 and DP80), and companies don't generally tell you what their implementation actually supports.

Macbook Pros only got HDMI 2.1 this year, iirc base and airs still don't have it. Intel iGPUs only do HDMI 2.0 and DP 1.4, AMD's Zen4 laptops can do HDMI 2.1 and DP 2.1 (idk what bandwidth). For dGPUs, AMD's 6000 series is HDMI 2.1+DP 1.4 (but there are issues with DSC), while 7000 series is HDMI 2.1 and DP 2.1 (DP80). Intel Arc dGPUs are DP 2.0 (DP40) and HDMI 2.1. Nvidia's 30 and 40 series are HDMI 2.1 and only DP 1.4.

edit: Also, people don't realize just how fucking huge an 8k framebuffer is, and you need multiple for a swapchain. Even for just basic 8bit colour you're looking at ~133MB per framebuffer.
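
The arithmetic behind that, for anyone who wants to check it (plain uncompressed buffers; real GPUs add their own compression and metadata on top):

    # Size of a single uncompressed framebuffer.
    def framebuffer_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / 1e6

    print(f"4K RGBA8:  {framebuffer_mb(3840, 2160):.0f} MB")           # ~33 MB
    print(f"8K RGBA8:  {framebuffer_mb(7680, 4320):.0f} MB")           # ~133 MB
    print(f"8K 16-bit float: {framebuffer_mb(7680, 4320, 8):.0f} MB")  # ~265 MB for HDR intermediates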


Great comment. I just bought a new Dell 6K display and I'm struggling to imagine a purpose for 8k, but it's incredible that we have the technology to drive that.


> I just bought a new Dell 6K display and I'm struggling to imagine a purpose for 8k

It's for large monitors. I run a 43" 4k tv as a monitor. I would absolutely benefit from a similar sized tv in 8k. And people who run closer to the 50" size would benefit even more.

I think you're probably right about traditionally sized monitors, though.


> janky hacks like dual cable setups

I’m a bit surprised we don’t see point-to-point ethernet cables doing this. They are standardised, reliable, and carry a lot of data. My napkin math says an 8K display at 40 bits per pixel and 60 fps is about 18 Gbps, well into the range of fancy ethernet.

And, if consumer electronics can drive down the costs of these interfaces, everyone wins.


Your math is significantly off here. That's about what it takes for 4k60. 8k60 10bit takes about 80Gbps.


Oops. I blame the napkin.

Anyway, with compression it should be a piece of cake. I never quite understood why HDMI even streams pixels and not refresh instructions.


> ~133MB per framebuffer

That's not a lot for people in the market for 8K monitors. 24G VRAM in the form of used 3090 cards is way cheaper than high end monitors. But yeah lack of DP2.1 support on current-gen cards like the 4090 is a problem

> you need multiple for a swapchain

There should be just two with variable refresh rate


On an 8GB (e.g. base MacBook) or 16GB laptop where you're sharing RAM between the CPU and iGPU, it's a ton.

People want 8k for better PPI for text rendering, not for games, meaning they want to use it with their work laptop.

> There should be just two with variable refresh rate

There's still a lot of software out there that's using DXGI blit model instead of DXGI flip model, and a triple buffered vsync+windowed blit app means you're looking at 5? framebuffers in flight at once. (ignoring any additional intermediate buffers for each app, and potentially having multiple overlapping windows drawing onscreen at one time).


Actually...

Thanks to Moore's Law, computer power has been growing exponentially since forever. Single-core performance has stagnated, sure, but total performance is still following the curve. Graphics is easy to parallelise, so throwing more cores at it is not a problem. In fact, this is precisely what GPUs do.

In other words, the available "operations per pixel per second" have gone up over the years, despite the increasing resolution!

In fact, we're at the point now that we can even do real time ray tracing, which was unthinkable at any typical resolution just a decade ago. Top-end RTX 4090 GPUs can even do ray tracing at 4K, which is just crazy.


>total performance is still following the curve

Nitpick: Moore's law is about number of transistors on microchips not performance.


Yes, but for GPUs performance scales fairly close to 1:1 with increasing transistor counts. Newer GPUs generally have more of the same type of compute units tiled out.

The original general-purpose GPU, the NVIDIA 8800 GT, had 112 cores. The RTX 4090 has 16,384! Each core has pretty much remained the same: a 32-bit arithmetic/logic unit (ALU).


Moore’s law is a thing, yes, but in the real world power consumption and heat dissipation seem to be the actual constraints rather than transistor counts.

Vertical scaling is of course possible, but it’s not really useful for driving a higher-resolution monitor in your living room because the bottlenecks are completely elsewhere.


>In fact, we're at the point now that we can even do real time ray tracing, which was unthinkable at any typical resolution just a decade ago

So are you saying that we should go back to the "just wait for the picture to render..." era just because we can have higher density?

Top-end RTX 4090s cater to a niche market.


The bottleneck is bandwidth and storage, not compute power.

Data from the sensor and to the display is usually (but not always) raw and uncompressed.

To go beyond 8K 120fps we will need fiber, or better compression.

We have good codecs to move data online, and specialized ASICs to do live encoding with good quality.

The current compression defined by HDMI is very rudimentary.


Power is definitely a major factor. Unless games do not exist in your world


Games do not exist in my world. Playing games is a waste of silicon, and human potential.


You're not the person I replied to. Why the hell did you respond to this. Having a bad day or something?

Quite ironic to complain about waste at the same time as making this pointless and aggressive comment


>Power is definitely a major factor. Unless games do not exist in your world

Maybe you don't realize you're on a worldwide forum where anyone can respond to anyone else for any reason? I'm perfectly entitled to respond to your comment as someone who does not game. The person you replied to isn't the only person in this entire thread. And suggesting that it's strange for me to reply to your comment is just strange in itself.


How does commenting on anonymous forums rank in your world?


For the 10 minutes every other day I spend doing it, it's just fine. I don't waste hours gaming like some people. LCD displays interest me, and so I came into the thread - is that reasonable enough to satisfy your question?


> GPU and CPU power improvements are more linear

Are they?


I think PC displays are not the innovation driver here. They get technology leftovers from TVs. And TVs don't care about pixel density because of viewing distance.

Meanwhile phones (and Apple's VR thing) have massive pixel density, but PC displays are more like scaled-down TVs than scaled-up phones.


> It's amazing that pixel density has been stagnant since 2014, when the first 5k TV (low ppi) and 5k desktop displays (>200 ppi) were released. It's 2023 now and it still takes a kidney to get 5k >200 ppi, and we only recently got the 6k >200 ppi option for 2 kidneys. Pixel density, however, is stagnant.

Maybe this is so for desktop size screens, but for larger things, the big advance in the last 18 months is real 8K 60Hz displays at 65 to 80 inch diagonal size going from "absurdly incredibly expensive" to being priced similar to where 4K displays were when they were introduced. 8K displays are absolutely viable now.

There is obviously a lack of real high bitrate 8K content unless you're shooting it yourself.


That's not true. You can get 643 PPI screens in a phone. Hell, the Apple VR headset is claimed to have 4000 PPI.

The reason pixel density has stagnated in TVs is that it's simply useless to go beyond 4K.


Most of the market doesn’t buy 4K. 1080p TVs from the early 2000s are still running strong.

The steam hardware survey says most gamers (a market infamous for their tech) still use 1080p displays. The latest survey puts 64% of gamers at 1080p.

https://store.steampowered.com/hwsurvey/processormfg/?sort=c...


The rationale behind buying PC monitors for gaming purposes is very different from the rationale behind buying TVs.

As a PC gamer, you take your GPU into account when choosing a monitor. It doesn't help to buy a huge screen, even if it's relatively cheap, when you can't afford the high-end GPU you need to drive those pixels.

The TV buyer doesn't face this problem at all. Usually no signal source is connected to the (smart) TV anyway, and the internal streaming apps have no problem playing 4K content, as long as the subscription allows it.


Also, gamers are often more concerned with refresh rate than pixel density.


If 44% of homes in America had 4K televisions in 2021 then it is probably over 50% ("most") by now.

https://www.statista.com/statistics/1247334/4k-ultra-hdtv-us...


Pixel density is pointless after you get above your retina’s “resolution”. I can’t really tell 4K from 1080p from my couch. What I can tell apart is HDR from SDR. I think a lot of progress will continue on that area because it results in higher perceived quality.

If I had a pet owl, they’d probably be the only ones in the house to be able to tell content apart by resolution.

Even for my 27” desk monitors, it’s almost pointless to go beyond 5K.


It just doesn't matter that much for the average consumer. Most office workers (that is, not silicon valley programmers) work on 1920x1080 and that's fine. There really is not much to gain by doubling or tripling the resolution.

I'm working on a 27" 2560x1440 screen. I can see pixels, but that really doesn't matter. Text is readable, nothing is blurry, I can do my work and get on with my day. Screens are good enough, they work, and there is not much to gain by having higher resolutions.


I feel like text that isn't crystal clear stresses my eyes more. I can feel eye fatigue and blurring after an intense day of work.

I would more than welcome higher density and crisper text any time. I'm on a 24" 1080p and unhappy about it.

I'm gonna buy a 34" widescreen at a much higher pixel density soon.


It's probably a matter of demand vs cost; I'm confident technology can make displays at a higher DPI, but... why? If it's not for consumer use (e.g. 4K, now 8K screens which are IMO completely gratuitous), if they can't scale up to produce millions of panels, it's not worth the investment beyond the research of whether it's possible.

Anyway, the next frontier is (has been?) VR, which needs / needed high DPI but small screens.


The only area where I’d think we want more pixels would be in medical imaging, but even then we can always zoom in on the interesting parts of the image (and rely on machines to look for interesting parts).


My phone screen is >600ppi. My e-book reader (e-ink) is 300ppi. For me 300ppi seems to be the point at which things start to look "pixel-less". A 300ppi e-ink display looks remarkably close to paper print quality. I think popularity of sans serif fonts online is mostly due to serif fonts looking shit on low PPI displays. It would be really nice to have a 300ppi monitor just so I could switch to more pleasing fonts.


If 200ppi is your spot you have to go back another decade to hit the IBM T220 (2001-2005): https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors


There are more metrics than just pixel density that they need to optimize for. HDR, WCG, refresh rates and latency, aging/color stability, ... 4k is good enough for most needs, so the demand is elsewhere. Plus the gaming market is limited by what GPUs can render.


It’s because of yield. More pixels per panel means more panels ruined by dead pixels.


I feel like these DPIs have improved beyond the point of it mattering. Diminishing returns, given the limits of human vision. 5K, 6K seem more about marketing, like 96kHz in the audiophile world. Makes little subjective difference.


5K (5120x2880 resolution) on a 27" monitor makes a difference for text. It looks crisp and clear at ~200 PPI, much better compared to 4K.

I still use an HP Z27q bought in 2016. I wonder why the industry has so few offerings for 27" 5K monitors.


I'm guessing it's due to the influence of the TV market on the digital display industry. TV users don't care for >4k, which could make manufacturing >4k more difficult. It's only a guess, and not a very educated one either.


It's 2880p. The p is for progressive lines.


I think there is actually a pretty common consensus that returns diminish at ~220 ppi, i.e. what Apple calls "Retina."


I got a 218 PPI 27-incher, and I was disappointed to find that English is crisp but my native language is a pain at small font sizes. My 350+ PPI mobile renders them fine at those sizes.


No, the consensus is that pixel density should be inversely related to viewing distance.


The scoreboard at Oracle Park (SF Giants) looks great and has 2.5 ppi. It’s a 2,032” 4,672 x 2,160 display.


Sure, and this is for the range of distances most common for desktop computer users.


Talking about PPI without including distance to the screen is like talking about how far your car can drive on a tank of gas without mentioning how big the gas tank is.

220 PPI on a TV screen that you're sitting 8 feet away from? More than most people will ever notice. 220 PPI on a monitor that you're 18-24 inches away from? Probably the ideal density. On a phone or tablet? Meh. On a VR display? Absolutely unusable.
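
A quick way to put numbers on that (assuming the usual ~1 arcminute of acuity for 20/20 vision, which plenty of people beat):

    # PPI at which a 20/20 eye (~1 arcminute) stops resolving individual pixels,
    # as a function of viewing distance.
    import math

    def retina_ppi(distance_in):
        return 1 / (distance_in * math.tan(math.radians(1 / 60)))

    for label, d in [("TV at 8 ft", 96), ("monitor at 20 in", 20), ("phone at 12 in", 12)]:
        print(f"{label}: ~{retina_ppi(d):.0f} PPI")   # ~36, ~172, ~287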


Oh god. I hate that Apple helped stick this BS in people's heads.

The human eye can resolve way, way past 220PPI. Even at distance.

I'm sitting here typing on a surface book with a 13" panel that is 267PPI and I would adore it to be double that. I run at native resolution with no scaling.

For the "average" human eye (20/20 vision) it's something like ~338ppi at 25cm right? If you have better vision (my corrected vision is 20/10 which is near the theoretical max for a human (20/8 I think is the max~?).

It's aggravating that people are like "over 300PPI is the max so screw improving". It's literally just the "max" at being able to discern the spacing between pixels accurately. Higher PPIs/DPIs still lead to an increase in objective clarity and crispness as you are able to improve aliasing etc.

Some of my old devices had PPIs in excess of 520PPI and I absolutely adored it (Note Edge). I would kill for a desktop monitor with similar PPI.

SO yeah, I guess "diminishing returns" is a thing but I wish it was at least a readily available option. I would adore a 27-30" monitor that was ~500ppi.. sighs


Yeah, as a fellow ultra-high dpi lover I'm just disappointed by the market offering a lot of things but not higher framerates.


> Oh god. I hate that Apple helped stick this BS in people's heads.

You do realize that without Apple pushing the others forward, most laptops would still be 1366x768, right?


I would guess that returns diminish a lot earlier, and just stop mattering entirely at around 220.


Read any old book on photography printing. If it is new enough to mention DPI, it insists on 300 for amateur prints.


That is not an entirely fair comparison, most print techniques are inherently less crisp because dots (pixels) tend to bleed into each other.


Oh, right. Now I have no clue how these numbers compare. But indeed they’re definitely not apples to apples.


I don't have a definite reference for anything between 185 and 217, but it is a big difference visually. Somewhere in that range, the pixels disappear at a reasonable viewing distance.


For TVs I would agree, but there’s a visible difference between 4K and 5K at 27”.

There doesn't seem to be any point going past 5K for 27” monitors though; that truly is overkill.


> 5k, 6k, seems more about marketing

On phones, maybe. 4K 32" monitors are nowhere close to retina


There's no reason to go higher. 4k at most sizes in most living rooms is already at a point of diminishing returns. 8k TVs do exist, but nobody bothers making content for them. There's no appetite for it.


I agree 4k is enough for TVs, but not for computer displays. 27"@4k still looks pretty bad on a desktop display. For desktop display, 220 ppi (27"@5k or 32"@6k) is perfect.


We're probably approaching the limits of human vision. Increased PPI wouldn't have a corresponding increase in visual clarity to be worth the price.


There’s obviously a limit to how small you can make something. I don’t really see how you could make things much smaller.


Most people won’t care for the extra definition going from 4k to 8k or 16k+. Barely perceptible.


Didn't apple just release a VR headset with 4k displays for each eye?


Yes but those displays are stretched over a much wider field of view. When I sit on my couch and look at my TV it is covering maybe 30 degrees of my FOV. A VR headset typically would like to have 100+ degree FOV, which means a 4K VR display will have way less fidelity than a 4K TV.


Right, but presumably you would need insane pixel density to achieve 4K at that range?


This is the fundamental struggle with VR displays. Human eyes have very good visual fidelity and it would be real nice to match or exceed that with a VR headset. This presents many technical challenges.


Yes, it is around 3400ppi. This has been achievable for a bit in very small displays (these are 1.4in), although I believe there are difficulties with anything larger.


I don't. Most people don't care and won't pay a premium


Clearly the tech is possible since phones have gone from 200 ppi to 400 but IMO desktop monitors simply don't need it.


I think desktops need at least >200.


Makes sense, but I hope that there’s meaningful progress on microLEDs soon. While OLED burn-in has improved a lot, it’s still a concern for the lengths of time that people keep screens for… my 5 year old VA panel TV still looks identical to the day that it was unboxed after thousands of hours of usage at high brightness, and I’d hope any TV replacing it would be capable of the same or better. Same goes for monitors.


microLEDs also have

* better linearity

* smaller size (that is, the light comes from a tiny pinpoint in the middle of the pixel rather than a whole rectangle being lit up). This actually helps make the display look a lot sharper.


You don't want small pixels. This causes optical aliasing and associated problems.

Edit: By small pixels I mean a low pixel aspect ratio (the area of the lit pixel vs the total area per pixel). If you consider this as a DSP problem (a brightness value over distance or angle, instead of a time series), there's effectively no reconstruction low-pass filter. You can think of a small lit pixel area with a larger gap in between as zero padding. This can make some things appear sharper, but by doing so it'll appear less like what the image source wants it to.


No, the perfect pixel is a sinc() function of appropriate bandwidth. A pixel of the form "dot with a dark area around it" is a better approximation of sinc() than a pixel of the form "fill all the available space evenly".

And it doesn't mean the bright part of a pixel should be as small as possible, no, it should be tuned for bandwidth.

Even eye receptors (cones) approximate sinc() in the same fashion.
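
A toy frequency-domain look at the tradeoff here (a sketch only: it models the lit area as a 1D box of width w, as a fraction of the pixel pitch, and evaluates its transform at a few spatial frequencies):

    # The box aperture's frequency response is |sinc(f*w)| (np.sinc = sin(pi x)/(pi x)).
    # A wide aperture (w=1.0) rolls off the image band below Nyquist (softening),
    # while a narrow aperture (w=0.25) keeps the band flat but passes more of the
    # spectral replicas above Nyquist -- hence neither extreme matches an ideal,
    # properly bandlimited (sinc-like) pixel.
    import numpy as np

    freqs = np.array([0.25, 0.5, 1.0, 1.5])   # cycles per pixel; 0.5 = Nyquist
    for w in (1.0, 0.5, 0.25):
        response = np.abs(np.sinc(freqs * w))
        print(f"aperture {w}: " +
              ", ".join(f"|H({f})|={r:.2f}" for f, r in zip(freqs, response)))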


Sure, but at least for RGB subpixel displays common to LCD TVs, each subpixel has neighboring dark-area not only in the form of the black matrix, but effectively also that of the two other colors.


I said pixel aspect ratio, but what I meant was pixel aperture ratio. I hope that's clear from the context.


I have a 12 year old LCD panel. Do OLEDs have similar longevity?


I know someone who uses an ancient Samsung OLED phone that they regularly left on overnight playing YouTube for years, and it has only the faintest burn-in, visible only on a pure white screen.

It's really not something I'd be concerned about.


As a kid, our Sony trinitron also ended up having burn-in due to leaving it on with a fixed NES game screen. Fun cycle.


Overnight YouTube videos? For sleeping to?


Makes sense to me; I like to listen to a podcast while falling asleep. I bet you could also find nice background noise YouTubes, rain and whatnot.


YouTube is not a bad use case for burn-in. Static UI is.


They had it in the half-screen view, so some of the UI got mildly burned in.


That would depend entirely on how many hours you used it. They don’t age like meat, simply with time; they age from having current run through the pixels. So their degradation depends on the kind of content you watch, for how long, and how bright you configure the picture. I have thousands of hours on a 6-year-old OLED and it looks perfect, but I only watch films in a dark room, so I don’t think that’s very demanding.


The Samsung Galaxy S III is 11 years old and had an AMOLED screen (does that still count as OLED?). People are still using them with custom ROMs somewhat, and I haven't heard of display issues.

There's also the first revision PlayStation Vita that had an OLED screen and is a similar age. I have heard of yellowing on some models, unsure how prevalent burn-in is. Of people I know with a Vita or several, most got a second gen only, or switched to a second gen and so the old one didn't get used as much.

Much less old, there's the Switch OLED, and WULFF DEN on YouTube has one he's leaving constantly turned on to see what happens to the screen. I think we're 1.5 years in or so and he's done a handful of videos showing what it looks like now.

https://www.youtube.com/watch?v=7TMyUDKqWWI


If you're using a 12 year old LCD panel, you're probably hobbling yourself with terrible resolution, response time, refresh rate, color gamut, viewing angle, backlight evenness, black level/contrast, etc.

It's absolutely bizarre how luddite HNers can be when it comes to keeping even remotely current on hardware.


If it meets the poster’s needs, what’s the problem? Better that old electronics continue to get used than for them to be junked.

Though my primary monitors are newer, I have monitors that are coming up on a decade old that get used as secondary monitors because I don’t need anything fancy for that use case… it just needs to be a functional screen. Some people have similarly undemanding needs for TVs.


I do notice the backlight unevenness in very dark scenes. I also wish it had HDR.

Everything else is fine; it's got local dimming so the contrast is really good, and viewing distance is such that higher resolution would do nothing.

Response and refresh are essentially a non issue for movies -- it can handle 24Hz content just fine.

Why should I throw a perfectly good box full of hazardous waste in the trash?


My OLED is only 7 years old but has no issues so far.


For the record, my 5 year old LG C7 has mild subtitle burn in, only appears on brown background.


I want my entire desktop surface to be a display. 4*8 feet more or less.

I also want a 0-width bezel, so they can be tiled seamlessly. Why? I want an entire wall to be a display.

I want, I want, I want, ... :-)


I also want my Roku box to be able to display PDF files. I'd even pay for it.

and a pony.


Roku records and collects multiple screenshots every second of whatever you're watching, even if the content is being streamed from another device, and they analyze the images to figure out exactly what you watch, when, how long/often, and use that to make assumptions about who you are and what you're into which can then be sold to anyone willing to pay for it (and/or leaked/hacked).

Maybe it doesn't matter much for the files you'd be viewing, but if roku could display PDF files they'd be doing the same thing there as well. Sadly, the more things roku can display for you, the more parts of your life they can siphon and use to stuff your dossier.


I didn't know that. It's pretty disappointing news. Looks like I might just get one of those media player boxes.


Just plug in a Raspberry Pi or PC stick of your choice. That and a wireless mouse or trackball. You'll probably want to move around the file.


I know I could plug in a computer to it. The Roku would be much more convenient.


What's the use case for displaying PDF files on a Roku? You would really read documents on your TV? I'd be interested in developing that if there's a market


I have an uncounted number of PDFs, easily in the many thousands. I'd like to lounge in my LazyBoy and read them. I'd like to read them while I do my exercises. I'd like to read them while in bed and sick of watching TV.

I'd upgrade to a 4K TV if I had a viewer for it.

I'd pony(!) up $200 for an app that did that. I don't know if other people would, but I would.

I find it strange that Roku comes with an app to play movie files, display jpgs, play song files, but it won't display a pdf. I've searched Roku for such an app, and googled for it, but came up empty.


Wouldn't you want a portrait mode monitor for your PDFs? I read them on a 24" UHD portrait mode display. It is so nice to have the PDF page fill the display so I don't have to scroll within each page.

Of course this doesn't solve the problem of reading them in bed! Or would it?

At least in the LazyBoy you could have a portrait mode monitor on an Ergotron arm and that would work pretty nicely?

I wrote about this in more detail here:

https://news.ycombinator.com/item?id=36179132


I actually like 2-page display, like an open book.


There are several options for the Apple TV [1]

There's probably something similar for Android TV.

Roku doesn't really have a third party app ecosystem

1: https://apps.apple.com/us/app/pdfontv/id1071669734 , https://apps.apple.com/us/app/easy-pdf-reader-for-all-cloud/... , https://apps.apple.com/us/app/pdf-viewer-for-tv/id1661646958


> Roku doesn't really have a third party app ecosystem

There seem to be a lot of 3rd party apps for it! I even have the SDK for it.


Right, but the ecosystem is nowhere near as robust as that of the other two platforms, as evidenced by the wealth of apps that solve OP's problem.

Lots of people have Rokus but not very many people are developing for them.


Android TV probably has a PDF app, so a Chromecast (with Android TV) should do the trick.


I don't want to buy another box. It's already a rat's nest of cables. I once looked through all the various "media player" boxes on Amazon. They all did video, pictures, and music. None did pdf.


Almost every phone can cast/screen mirror. Unless your Roku is really old it supports it.


OOC: what is your use case for PDFs on Roku?


I want my Roku box to play AAC.



It seems like the way this will be delivered is via AR, similar to what Apple demonstrated with the Vision Pro at WWDC, or by some evolution of UST projectors.


I already have set up a projector that fills a wall. I love it. But the problems are:

1. it only works after dark

2. fan noise

3. I have to constantly fiddle with the focus to get it crisp across the screen

4. you cannot have anything in the room in front of it

5. running wires to the ceiling mount is ugly

6. you gotta point the remote behind you, not at the screen (I know, I'm such a whiner)


1) Black-out shades are your friend

3) the awesome newly affordable technology is lasers - because of physics, there's no longer focusing to do.

5) Running wires to a TV is ugly too, unless you run conduit. Bugs the hell out of me to have dangling cords

6) I've always wondered about this. The remote should have the IR LEDs pointed backwards, like a V shape because it just makes more sense.

I'll add a 7. The blank space on your wall when not in use.

Either you have a screen that comes from somewhere, or the wall just has an empty space, but, quite obviously, the space the projector projects on needs to be free of stuff. I can't hang any art there, so there's a glaring empty space, at which point I might as well put a TV there. And the art-frame TVs, like the LG LS03B, are actually really quite nice.


> Black-out shades are your friend

It's amazing how much sunlight gets past the smallest gap.

> the awesome newly affordable technology is lasers - because of physics, there's no longer focusing to do.

Huh, I didn't know laser projectors were available now.


A big A/V conference concluded this past week, and there’s a decent looking portable laser on its way to the market from Optoma called the ML1080. Curious if anyone is aware of something similar in newness, quality and price.

I’m planning to pair it with an Apple TV. https://www.projectorcentral.com/ProjectorCentral-names-2023...


Philips has https://screeneo.com/ which seems to be in the same category to me.


> It's amazing how much sunlight gets past the smallest gap.

While true, my purchase from blinds.com included an L-shaped piece of plastic to block the gap between the wall and the blind, so it's dark enough to use my projector during the day.


2, 3, and 6 are resolved in newer projectors. 4 and 5 are also gone with ultra short throw projectors, although they do have to sit on something. And 1 can be solved with a screen or even the right sort of paint.


> even the right sort of paint

I'm using projector screen cloth. It doesn't get much better than that.

I've never seen anything that came close to an active display.


I use an ambient light rejecting screen. They only reflect light that hits them from directly below, to pair with a UST projector. It's still not comparable to an active display, but it is usable during daytime, even in rooms with large windows.


Sounds like you might be in the target market for the Apple Vision Pro. Floating screens anywhere with great UX. ;)

If their resolution is good enough for reading I could see using it.


There was the Microsoft Surface (no, not that Microsoft Surface, the other Microsoft Surface), circa 2007/8, though it was more like a huge "all in one", not particularly cheap/convenient, US$ 10,000 (in 2007):

https://www.windowscentral.com/microsoft-surface-pixelsense-...


Walter Bright calling for parlor walls to become a reality.


yes, dammit!

Wouldn't it be cool to have "wallpaper" that is whatever you want it to be? Like the view from Mt Everest? or the Amazon jungle?


Pretty sure I saw this in the Apple store.


I hope OLED's major flaw gets fixed at some point - burn in. As long as burn in exists, brightness will have to be carefully controlled - which means OLEDs can never get as bright as a traditional LCD, and they're dangerous to use in cases where content is frequently static like desktops.

For TVs, OLEDs are fantastic. But for now I feel they're too limited for desktops.

Still, it's an exciting time for display technology. So many advancements made in the last 10 years.


OLEDs are great for computers if you're one of those tech review youtube channels who can afford to completely replace a $3000 monitor every few years because you often have to buy stuff to review it anyway.


I agree. It's definitely gotten better with automatic brightness limiters, pixel shifting, pixel refreshes and all that. It's still not at a point I consider acceptable though.

If I have to micro-manage or change my usage of the computer at all to accommodate the display, the technology isn't there yet imo.


I’d wager it’s going to take a lot to get to the point where a huge portion of customers aren’t satisfied with LCD TVs at their current level of capability, especially if prices are only going down.


Sure, but that’s not what this is about. This is essentially manufacturers saying “we’re going to stop trying to make LCDs better and concentrate on other technologies”.


Well, according to the article they intend to continue making LCD TVs and you can expect them to get cheaper, so I expect them to have a long life as tried-and-true choices even if they're not being actively researched for improvement.


Agree. There's still plenty of people alive with 4k TVs today that remember when black and white broadcast CRT television was all there was.


Grew up with CRTs giving me horrible headaches, and LCDs were a breath of fresh air for my eyes. Headaches gone! Then came OLEDs and brought the headaches back.

Luckily Apple still sells the iPhone SE with LCD. I really don’t know what I will do once LCDs are gone.


Is it OLED displays themselves that bother you, or the truly horrid subpixel arrangements they use on phones?

I love my LG OLED TV, with a full matrix (RGBW at every pixel). Phones, using something like PenTile, usually have half the number of red and blue subpixels, and in funny arrangements at that. Phone OLED displays make my eyes hurt, especially when anything is scrolling.


Pentile is a pox on the entire phone industry that they use to justify using crappy low resolution displays under the guise of something higher resolution. "1080p" Pentile is really like an 880p RGB matrix display.
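
Roughly where a figure like "880p" comes from, via the usual subpixel-count argument (a sketch, not an official spec):

    # PenTile (RG-BG) has 2 subpixels per logical pixel vs 3 for an RGB stripe.
    # Equating total subpixel counts and keeping a 16:9 aspect ratio:
    import math

    subpixels = 1920 * 1080 * 2             # a "1080p" PenTile panel
    rgb_pixels = subpixels / 3              # equivalent full-RGB pixel count
    height = math.sqrt(rgb_pixels * 9 / 16)
    print(round(height))                    # ~882, i.e. roughly "880p"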


Is that the same with iPhones too?


Yes it is. Deranged subpixel arrangements aren't an inherent feature of OLEDs, but I believe they're easier to manufacture so they're used in everything. The only OLED product I remember having normal RGB stripe pixels was the original PSVR, and that was dropped for the PSVR 2.

Samsung especially likes to mess with this, these days they are using triangular pixels for some reason.


I wonder why? With CRTs there is noticeable flickering but no such issue happens with OLEDs. Is it just the contrast ratio?


Regarding CRT flicker: for monitors, you don't want to drive a CRT below 90 Hz (120 Hz is pretty stable), and even TVs were 100 Hz towards the end (refreshing frames multiple times). The problem was "cost effective" office PCs with graphics hardware only capable of a 60 Hz refresh, which isn't a great idea unless you have a monochrome monitor with slow phosphor. (Meaning, this was just an abuse of technology – and workers.)


> even TVs were at the end 100Hz (refreshing frames multiple times)

Maybe in Europe? But 100Hz is a terrible thing to do with 60Hz signals. Possibly US bound HD CRT TVs did 120Hz? But I don't recall hearing about those ever.


Yes, my experience was definitely European, as in PAL. It was rather neat, until the image panned… (The problem being, apparently you still have to alternate between fields in order to maintain the illusion of interlaced video, so you get an A-B-A-B C-D-C-D… sequence. Some attempts were made to improve the resulting interlace clash with comb filters, etc., but this only started to work somewhat properly when the days of CRTs were already numbered.) I actually don't know if there were similar developments with 60Hz standards.


Not entirely sure but I think OLED pixels modulate brightness with PWM, which causes high frequency (though invisible) flickering.
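
If that's the case, here's a minimal sketch of what PWM dimming means in practice (the frequency and brightness values are just illustrative):

    # PWM dimming toggles the pixel between full brightness and off; the duty
    # cycle sets the perceived (average) brightness, so lower brightness settings
    # mean a longer fully-dark portion of every cycle.
    pwm_hz = 240          # illustrative PWM frequency
    duty_cycle = 0.25     # 25% brightness setting
    period_ms = 1000 / pwm_hz
    print(f"on {period_ms * duty_cycle:.2f} ms, off {period_ms * (1 - duty_cycle):.2f} ms per cycle")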


LCDs have PWM-dimmed backlights, so if your laptop isn't set to maximum brightness, you're probably staring at a flickering surface too.

Perhaps a simpler explanation is just contrast? Both OLEDs and CRTs can produce much higher contrast than LCDs.



The PWM frequency is typically much higher for LCDs than OLEDs, if it is used, and much less likely to be a problem.


Some OLEDs don't use PWM, and some use PWM fast enough not to cause problems.

Also badly driven LCD backlights can flicker badly too.


My wife has issues with bright screens, for her, matte coatings/films are a lifesaver. Have you tried those?


I'm not sure how OLED can give you a headache, but not an LCD.


What about MicroLED? New Macs and iPad Pro use them.


Not the OP, but I am sensitive too and sadly found the MiniLED to be unusable as well. Had to switch from a 15” MBP to the low-cost 13” with the old-school 500-nit display.


You mean MiniLED?


i'm holding out for LED-C


Apple is thoughtfully calling the Vision Pro displays microOLED, not to be confused with microLED.


In this situation microLED should be the same as OLED; besides, those devices don't actually use it.


For what it is worth, I just replaced a 65” Samsung QLED from 2019 with a new 77” LG OLED and was really blown away by the massive difference in picture quality. The living room is full of windows, so the room is really sunny. Even at noon the picture quality is great and easy to see. The TV cost $2700 at Costco. The 2019 Samsung will get rotated into my office, replacing a 2016 Samsung whose ARC software was so buggy that it did not even work well with the Samsung soundbar (I had to unplug the TV once every 3-4 power-ons to get it working again).


What I'm mostly wondering these days is if anyone in any part of the display technology "stack" (so to speak) is working on improving the number of colors displayed to us by our screens. Or will we be limited to using 24-bit color forever (because it is "good enough")?

24-bit color is pretty good, mind you, yet there are everyday cases where we would appreciate the ability to view a more nuanced transition between adjacent colors on certain images and videos.
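
For a sense of scale, here's a quick sketch of what the standard per-channel bit depths buy you; the banding remark above is exactly what the 256-level case describes:

    # Levels per channel and total colors at common bit depths.  With only 256
    # levels per channel, a slow gradient across a wide screen has to hold each
    # level over many pixels, which is what shows up as visible banding.
    for bits in (8, 10, 12):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} levels/channel, {levels ** 3:,} colors")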


Many good monitors are able to display 30-bit pixels in the Display P3 color space (frequently described as the DCI P3 color space, even if there are small differences between these two color spaces).

I have been using only such monitors for more than a decade (e.g. I am still using a pair of relatively cheap Dell U2720Q; now there are newer, better models).

However most monitors by default display only 24-bit sRGB colors. Many users never change the default settings, so they never see the full color range of those monitors.

An annoying fact is that there are many programs, even expensive professional programs, which come with installers written in Java, and those installers crash whenever they see a 30-bit monitor, regardless of the operating system.

As a workaround, it is usually enough to temporarily change the monitor colors to 24-bit during installation, as the actual programs do not care about the color resolution, so they work after installation.

Color spaces larger than DCI P3 (e.g. that of Rec. 2020) can currently be displayed only by laser projectors.


HDR content is pretty damn mainstream and uses 10 bits per channel for 30-bit color. Some variants can do 12 bits per channel, but I believe this isn't often used.

Moreover, this content often comes with dynamic tone mapping to allow access to many more colors in a movie, whilst remaining limited to 30-bit color per frame (or usually, per scene)


> Some variants can do 12 bits per channel, but I believe this isn't often used.

Dolby Vision has 12-bit colour channels, but it's reportedly not as great a leap as going from 8-bit to HDR.

iPhones(12 onward) apparently support this standard, so anyone with such a device can see for themselves.


Dolby vision can use 12 bit channels, and they are proud of it, but it also has a 10 bit channel mode, and that is used essentially everywhere.



Lots of comments regarding LED brightness for sunlit spaces. If you are suffering from that, have a look at the Samsung The Frame (2022 and later) series. They have a special coating that removes reflections; I have installed multiple of those in very bright places and the matte finish works marvelously well.


Those are LCDs. The concern is about OLEDs (though realistically it’s one that is already largely solved at the high end, and that’ll percolate down)


I’m convinced consumer TVs/monitors are a dying product and we’ll all be using AR glasses by 2040 anyway. I hardly ever want to interact with a computer without using my xreal now.


How do you use the xreal? Looks interesting, never heard of it before.

Is it mostly an entertainment thing or do you actually do work with it?

Hows the pixel density? Does a virtual monitor look as good as a real 2k screen?


> Is it mostly an entertainment thing or do you actually do work with it?

Both. I just got a steam deck and my current plan is to sell my MacBook and just use Steam Deck and some AR glasses as my set up for everything. I can use the desktop mode of Steam Deck along with Distrobox and Podman for dev work and then I can play games, watch Netflix etc using the game mode. The other alternative I was thinking of was getting a Samsung Galaxy and using Dex with some glasses and then using a linux cloud machine for maximum portability but I'm leaning more towards the Steam Deck at the minute because it's a lot of fun and being able to play/code locally regardless of internet signal is a major plus.

> Hows the pixel density? Does a virtual monitor look as good as a real 2k screen?

It's only 1080p so if you're used to a MacBook screen or a 4k monitor it's not going to be anywhere near as clear. That being said, the ergonomics are sooooo much better that, in my opinion, it is worth the screen clarity sacrifice. No matter what monitor set up I had going on, (chair, monitor height, position etc) I would always end up hunching. With the glasses you get none of that. You can fix the screen in the air if you use the Nebula app but I prefer to just have it move with me (which is the default) so you literally have zero shoulder/back pain throughout the day as the screen just moves with you. As for the lack of clarity, I just stick the zoom level to 110% and everything is fine.

The Xreal does have a few other flaws:

- brightness levels are pretty low, so daytime use can sometimes be a pain. you'll definitely want to use light mode during the day and then will probably switch to dark mode at night.

- there's a faint halo ring on the periphery. doesn't interfere with the screen at all and I can ignore it but it's still annoying when you notice it.

- You can get a painful pinching on the sides of your head when you wear it for a while. Oddly, you can completely eliminate this by wearing a thin polyester running headband underneath. Why this works I don't know, we're literally talking a very thin bit of material here and it completely stops it, it's very odd.

- They don't have an accessory available that allows you to charge your device and use the glasses at the same time, you have to purchase either the Nubia Red Magic Gaming Dock or the Viture dock.

Like I said, I am willing to put up with all of these issues because the hunching has got to the point where I think I'm getting the beginnings of kyphosis and I'm going to have to do some work in the gym correcting it.

Personally, I think I'm going to be selling the XReal and purchasing a competitor, the Viture glasses, as they look to be a superior product. Reviews I've seen say they eliminate all the above pain points (far higher brightness, no halo, no pinching, accessory available for passthrough device charging - I've actually already bought this and it works a dream with the Xreal) as well as offering diopter adjustment and the really cool ability to just magnetically snap prescription lenses on to them rather than having to faff about like you do with the Xreal. I don't need these yet but could see it happening in the future as I get older so would be nice to have the easier option if need be. They just seem to have thought about the whole setup a bit better than Xreal have including a cool Android neckband accessory, magnetic power cable etc. There's also Rokid, but the styling puts me off. Links to all here so you can do your own comparison:

https://www.viture.com

https://xreal.com

https://global.rokid.com/

It's a really exciting time at the minute, I don't think it'll be long until we get someone releasing 4k glasses in this space at which point they will be the clearly superior option to monitors.


Really sad to see traditional LCDs tending toward obsolescence. The newer OLED and especially mini-LED technologies often give me migraine symptoms.


What could be causing your headache? OLED (and mini-LED is very similar) is just better on almost every count (looking at you, gradient banding): color, perfect black, brightness, high refresh rate.


PWM flicker probably


Same. There are now devices on the market that help you find out your sensitivity risk; I use this one when I go hardware shopping: https://flickeralliance.org/collections/tools/products/opple...


Had migraines when CCFL backlights were replaced by cheap LEDs with PWM flickering in 50/60Hz visible frequencies. Pretty much same shit happened to early LED bulbs. Just wait for the better evolution.


I read something similar about LG and OLED last year. Will look for a reference.

Update: https://www.techradar.com/opinion/come-in-regular-oled-tvs-y...


This sounds unlikely, and it's just one source. I'm pretty sure LCDs still make up the great majority of the TV, desktop, and laptop monitor market. Even expensive iPads still use LCD. Only for phone screens does OLED seem to be more common.


Oh, they’re going nowhere immediately. But if you’re a panel manufacturer, you’re probably winding up long term development; if the question is “how do we make better LCDs on a 5 year time horizon”, the answer is probably “that’s not worth investing in; in five years LCDs will be low-end only”.


In the past, OLED growth has been massively overestimated. E.g. Samsung entered with many years of delay relative to original projections, and even now only in the ultra-premium QD-OLED category. So I wouldn't expect LCD to become a niche product so fast.



Eh? No; as a technology reaches its end of useful life, it’s natural that investment into it falls off to ~nothing. LCD screens will still be made for years (and for niche uses for decades) but the writing is on the wall.

The Phoebus cartel deliberately made a then-current technology, one unlikely to be replaced anytime soon, worse. I can’t see any commonality at all.


Good riddance. The LED backlight bleed has overstayed its welcome.


Do micro-LEDs have the same burn-in problems as OLEDs?


I fear that the days of flicker-free displays are numbered. I can't imagine they'll invent something with per-pixel LEDs that doesn't use PWM.


For my information, when does the flicker bother anyone?


There is a whole forum of people who are deeply bothered by PWM: https://ledstrain.org/


CRT monitors, especially earlier ones running at 60 Hz, must have been terrible for them. Those were orders of magnitude more flickery than any LCD backlight.


Today I learned that liquid crystals are being used to develop antennas?!



