The state of external retina displays (caseyliss.com)
379 points by zain on Dec 27, 2021 | 336 comments



> Ridiculous Option: Apple Pro Display XDR

I'll admit, a monitor that costs more than my first car is unaffordable for 99% of the population.

But that said, many of the people reading this are pulling truly exotic salaries right now, after devoting untold hours to reaching the top of their profession.

Let's say you're a professional earning $300k plus in software.

You're often working 100% remote, using your monitor 8 hours a day, 5 days a week, just for work. Plus untold hours of HackerNews (strictly after 5pm of course).

I just find that's truly not an unlikely situation to be in.

Without reference to the technical merits of the Apple Display, I don't think dropping $5000 on a monitor is outlandish for any professional in the situation I've described.

My hairdresser for example has a $4000 pair of scissors. He uses the damn things every day, and he sharpens them twice a day.

Whilst it's not essential, taking pride and investing in the tools of your trade is not a thing to frown upon.


Some work tools get treated like capital or heirlooms and some get treated like consumables.

Scissors or a hand plane are decent "heirloom" items. Potentially you can retire with the same one you started with, or pass them down.

Electric drills start to lean toward consumable. Battery tech changes over the years and they do wear out in non-fixable ways.

Screen and computer tech advances so fast that 5 years is a "good" lifespan, and 10 years is exceptional.

So a $4k investment in scissors is a lifetime investment, while a $5k screen may be more like "$1k / year for a good monitor as a service" (plus $2k off your next refresh).
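
Back-of-the-envelope, that works out roughly like this (a sketch, assuming the 5-year lifespan and the $2k refresh credit above):

    monitor_price = 5000
    useful_years = 5            # "good" lifespan, per above
    resale_credit = 2000        # assumed value recovered at the next refresh

    gross = monitor_price / useful_years                  # $1000/year
    net = (monitor_price - resale_credit) / useful_years  # $600/year
    print(f"gross ${gross:.0f}/yr, net of resale ${net:.0f}/yr")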

I think it can still make sense, but very often when people buy the premium in computers or displays they are getting the same thing everyone else gets 30-50% cheaper 1 year later.


This is a good comment in general, but:

> I think it can still make sense, but very often when people buy the premium in computers or displays they are getting the same thing everyone else gets 30-50% cheaper 1 year later.

I'd normally agree, but the article's point is that this isn't happening with displays - we're stuck at 4K as the best available except for a handful of displays that are still expensive even years after introduction.

I bought my LG Ultrafine 5K in 2018, and it launched in 2016. There's still no real alternative - we seem to be in a monitor progress limbo.


Resolution is only a single spec. Other things like viewing angles, response time, refresh rate, brightness, HDR, etc. have been getting better and better for less. A premium 1080p monitor from 15 years ago will not be nearly as good as a modern reasonably priced one.


I got one of the 4K ones last year and it was the same price as when introduced. The only price reduction was due to inflation.

I assume the desktop monitor business is so low margin that it’s barely worth investing in these days.


There's a market of at least 1 (me) waiting for a HiDPI wide display. All of the 34 - 38 inch displays are low-DPI :( My LG 38" is 3840x1600. I want it to be 7680x3200 in the same size monitor


Dell offers a 32” 8K monitor. Driving that beast needs two DPs on your graphics card, however.


8 years ago you had to pay $1k for a 1440p 27" sRGB IPS panel. 2 years ago I spent $800 on a 4k 27" 144 Hz monitor with >95% P3 coverage.

Or another example: 8 years ago I spent $320 on a 1080p 27" (massive pixels) monitor with horrible color reproduction and contrast. 4 years ago I upgraded to a new monitor and for $300 I was able to get 4k 27" with >100% sRGB.

And when you consider inflation, progress is even greater.

It's the same with GPUs, and to a lesser extent CPUs and memory and storage. Moore's law is great in many ways, but as a buyer of high end technology it always stings to see the same thing be 30-50% cheaper a few years later.

Progress does seem to have slowed down within the past few years for monitors. I expect the pandemic is relevant there.


Five years ago, I got a 40-inch 4k@60Hz for under $400. I recently got a 43-inch 4k@60Hz for $450. The new one is lighter and somewhat more color accurate. I don't get the obsession with small pixels; my eyes prefer if I stand/sit a bit further from a bigger monitor, and it's great for pairing/teaching.


I spent a while with a 27" 4K monitor. It looked nice in HiDPI modes on my Mac, but I didn't get that much usable screen real estate (I was running it about 1440p). When the pandemic hit and I started working from home every day, I upgraded to a 43" 4k display. I now have the equivalent of 4x 21.5" 1080p displays, which is far more usable than 1x 1440p, even if it doesn't look quite as nice.


Progress is being made, especially in the lower and medium range of monitors.

Apple took a large leap into giant monitors with high DPI. Most people don't care a lot about it; when you are not that close to the monitor, the retina effect is not worth the cost difference for the mass market.

In the end, you are both correct. It is slow for the top of the line but super fast for everyone else upgrading mid-range monitors.


GPU is a bad example. Models purchased in 2016 cost slightly more than their original price now, secondhand, after 5 years of usage.


If you take into account pandemic shortages / price inflation, of course.

But a current $500 (MSRP) GPU thoroughly spanks a 5 year old $500 GPU. Even my 2 year old Radeon 5700xt is feeling out of date.


A current-generation $500 MSRP GPU is actually selling for no less than $900, and it's not due to inflation or the pandemic—it's due to cryptocurrency.


I thought crypto’s impact on GPUs largely ended, as they moved into far greater cost/efficiency ratios provided by ASICs? Is it for the non-bitcoin cryptos?


Yes, bitcoin mining has long been unprofitable on GPUs and is low-profit with ASICs. But most newer cryptocurrencies are designed specifically to be hard to make affordable ASICs for, usually by requiring GPU-like memory bandwidth. If you participate in a mining pool like NiceHash, your GPU will likely end up being put to work mining something like Monero but the pool will pay out in bitcoin.


Casual miners can use mining pools that still pay pretty well. Gamers especially are buying these cards and then joining a mining pool when not playing games to recoup some of their money.

This unfortunately leads to a cycle of more mining, even if the card isn’t optimized for it, as no card is a net negative cash flow atm even factoring in energy costs.


I bought a 30" Apple Cinema Display in the mid-2000s, and gave it away last year to a colleague who wanted to resurrect a period-correct Power Mac after upgrading to 5K screens. Not holding out much hope for the build quality to be the same sadly, but monitors certainly can be decade-long investments! FWIW 5K screens were 1100 apiece from Microcenter - vastly less money than the Cinema Display was in the UK back then!


My kids are rocking my old 23” cinemas, the newest of which is 15 years old.


> the same thing everyone else gets 30-50% cheaper 1 year later.

This is very true. I just got a previous-generation top-of-the-line CPU and GPU. It's way more power than I will ever need and if I went for latest gen I could have afforded barely mid-level for the same price[1]. Even taking into account the fact that latest gen is faster, I still got better performance for dollar (20-30% better depending on benchmarks).

[1] With regards to the CPU I'm factoring in the saving of not having to buy a new motherboard.


I am guessing scissors, esp. if they get sharpened 2x/day, will not last multiple decades (because sharpening an edge removes material).


High quality scissors, even with constant use, can easily last that long or even longer. Source: I used to work as a tailor. In fact many people in that industry prefer to use older German-made scissors as they are in many ways better than the stuff you can buy now.


He’s almost certainly referring to honing the blades. Actual sharpening would require equipment that would be just ridiculous to have around a hair salon.


A stone? A strop with polishing compounds?


Not necessarily. You can sharpen a knife (and I assume scissors) with a sharpening steel, which won't remove any material, just straighten the edge.


Honing is different than sharpening. Sharpening removes material, like a pencil. Honing merely aligns material.

A honing steel doesn't remove material, a sharpening steel does.


Eventually that folds over after becoming brittle.


Excellent point.


How do the batteries wear out in non fixable ways? I always thought you could put in new cells and be back in business.


Regarding ridiculously-expensive Apple heirloom tools: Go to eBay right now, and buy any Activation-Locked Apple device. It's OK, I'll wait.

=== ====

OK, now: try to use it to do something. Anything. Something as simple as write a NOTEPAD note, or play one song from iTunes.

=== ====

Nope! You are surely getting screenfuls of information telling you, this Apple device has been locked and only the original owner can unlock it.

I suppose this is to prevent people from stealing Apple devices and reselling them to pawn shops or on eBay and Craigslist, and it must be working. All thieves everywhere must know by now, don't even bother to try to steal an Apple device or a Tesla car, for as soon as you do it gets bricked from all the satellites in space.

=== ====

I'm writing this because I bought a nine year old iPad on eBay and have spent the last two weeks trying to get through to AL-SUPPORT Activation Lock support at Apple, asking them to unlock it. And of course they don't care, I am not the original owner of the device, and unless I am, it is as useless as a digital brick. Battery life, screen quality, beautiful craftsmanship of the item itself, nothing matters. It just sits there telling me I'm practically a criminal for even owning this device.

No matter that the original owner gave up on it long ago, when even the simplest apps like YouTube and Gmail stopped working on it, by design, intentional planned-obsolescence coming down from Apple themselves. With the iOS getting relentlessly updated every year, all the apps get forcibly recompiled and anything old just doesn't work any more at all.

They don't care about this device any more. It was just a real-world dongle they used to get information from and about the original owner, information they've got stored in their giant database in the Cloud. They don't really care about me at all.

I imagine the universe is now littered with these devices, an Oort Cloud of them completely surrounding the planet, mentally bricked by annual iOS updates and physically bricked by Activation Locks, I'm a criminal for owning one, and were I to bring it to an Apple store to complain, they would have no useful help for me at all, "Buy a new one!" they'll tell me, "that's our company policy!"

=== ====

Who is causing the problem here --- me, or Apple?


Depends.

If it was stolen from the original owner then you don't really own the device legally and this is the anti-theft part of activation lock functioning as designed. If you're an innocent third party to the original theft, your remedy is to get a full refund.

If it was just because the original owner forgot to disable activation lock, you should be chasing them to unlock it for you. If they refuse, your remedy is to get a full refund.

edit: reputable recycling places require you to have disabled activation lock on functional devices, or they don't pay out.


I've got a bizarre situation in which an OS upgrade bricked my Mac. I can't remember the details now, but it was something about an interface change in which I had managed to activate 2FA, but the login interface didn't have an input box for it.

The device was old enough that I just heaved a sigh and gave it to Apple for recycling.


There's actually a solution for that: https://support.apple.com/en-gb/HT204915

> What if I use two-factor authentication on a device running older software?

> If you use two-factor authentication with devices running older OS versions – such as an Apple TV (2nd or 3rd generation) – you may be asked to add your six-digit verification code to the end of your password when signing in. Get your verification code from a trusted device running iOS 9 and later or OS X El Capitan and later, or have it sent to your trusted phone number. Then type your password followed by the six-digit verification code directly into the password field.


This weirdly works in some other places. Iirc one of Amazon's seller login pages accepts the 2FA code appended to the password to avoid having to go through another page.


> It just sits there telling me I'm practically a criminal

I mean, you bought stolen goods. The law says you are culpable to some degree. If this was a legit sale, the original owner would be willing to do what's needed to make it useable.


The original owner dropped it off somewhere when his YouTube app stopped working, which probably happened years and years ago.


If you bought it from a less than reputable vendor, then this is the kind of thing that happens. If a reseller purchased this from the original owner in this condition, I would not call them a reputable dealer.

Apologies if this sounds like victim blaming, it's not what I'm going for. Just trying to enlighten on how to avoid this in the future.


What do you want exactly, for people to have the right to repair the products that they own? Ridiculous! It can't be done!


The startup dude with the Embody and topped out M1 Max with four roommates is such a meme today, though.

That's because it's incomplete.

A $5k screen is not so much a tool of the trade as an accessory for someone signalling themself to be valuable or valued. More commonly, coming to own these things is the result of a little professional envy.

Our software engineers spend all day developing. They sit in $2-300 chairs and work on their preferred displays (last gen dual Apple TB displays for the Mac guys, dual anythings for the Windows, 1440p ultrawides for the java team).

You know who the only person in our org with an XDR is? The CIO with a MacBook Air, for writing emails and teleconferencing. They got theirs because another C-level made a big deal about how they couldn't work without their dual 5K UltraFines, also for sending emails and teleconferencing.

None of our employees who could make use of an XDR display owns one because they either own a more industry-capable professional display (Flanders Scientific, Dolby, Sony pro series) or got an iMac Pro when they existed for under $5k.

I don't fault anyone who admires the XDR any more than I fault myself for liking any of the niche luxury items I do, but I wouldn't delude myself for even a moment into believing that an $8k watch is an essential investment because time management is an important skill to have.


I hope you understand you’re very alone in a crowd of sane people if you want to say that your engineers are happy in a $300 chair. A Steelcase or Herman Miller is table stakes for a decent software shop, and if you’re too ignorant or miserly to see it then it’s probably not a great place to work at. A good chair makes a huge difference for one’s quality of life. Please don’t skimp on it.


I'm not saying Herman Millers are a scam, I'm saying that we purchase chairs for close to 20k employees, and great chairs do exist for $2-300.

And I get it, I buy for quality, too. But no one is passing up offers over the faux pas of being presented with an inferior chair.


I tend to find that the essentials in a chair are the mesh back, the rest is just extra features.

And yeah good mesh back chairs come in at about $300 wholesale.

I find often that the ones available retail are not as good as the ones my big-ass corporation buys though, because furniture has a huge retail markup.


I have an Aeron, Embody, and Gesture. Yet, some of the $300-$400 mesh chairs at Staples feel better. Downside is I’ve never had one of those chairs last longer than 2-3 years before the mesh loosened up more than I’d like, but while they are good they are the best I’ve tried.


I use a Pro Display XDR with my MacBook Air because of people like you who write long posts about how dumb it is.


I do the same because I wanted the best productivity experience out there and have tried just about any monitor anyone could name before landing on this...

but I might start claiming this too.


Thanks, Apple pays me a commission every time someone uses that arrangement.

I never get a check, though, because they bet me back that they could slap a logo on a towelette and sell it for $20. Uncanny how evenly it cancels out.


You showed him!


the problem with the apple xdr display isn't just the price though. it's not a good monitor for a lot of users. it's limited to 60hz, its response time is really poor, it suffers from really obvious blooming and bleed at high brightness and it has bad off axis color accuracy and brightness. sure, if you care deeply about color accuracy and 6k resolution it's pretty good. these aren't the only metrics on which monitors are judged though

apple does a really poor job of supporting external displays. there's virtually no good options if you want something with high refresh, good latency and reasonable pixel density and color reproduction


I go into more detail below but I completely disagree with your take.

If anything for most users this thing is amazing (but expensive), and for the few users who were "tricked" into thinking this is a $5000 reference monitor it's not good...

-

Bleed is non-existent, right off the bat. In fact it doesn't make sense that it'd have bloom and backlight bleeding: aggressive HDR produces bloom... so the same aggressive HDR would hide BLB.

Bloom is also a complete non-issue for most users. I use this monitor primarily to consume text and casual media... I go weeks at a time before remembering this monitor has HDR enabled full-time, until some super thin loading spinner or something appears on a completely black background while the brightness is cranked to max but the room is kind of dark... it essentially takes a torture test to get the FALD to be problematic

Off-axis color accuracy, again, complete non-issue for most users. If I was producing Avatar 2 it might be an issue, but compared to any "normal" monitor it's a complete non-issue, especially when the trade off is exactly 0 backlight bleed, which is 100 times more annoying.

For "most users" there's no better productivity monitor. Even a $20,000 reference monitor would be useless for us since we're not going to be able to drive or configure it...


Their marketing compared it with a reference monitor. And if it was anywhere close to a reference monitor, pros would actually queue up and buy it.

But it wasn't even close. ( And I am still somewhat pissed off about it)


Fair enough, I wrote the above knowing absolutely nothing about the Apple display's specs. I'm just speaking in generalities.


Yes, the issue is not that it's expensive, it's that it doesn't justify that cost (serious question, what's that stand made out of to cost $1000?) If you can get an equivalent (or better) monitor for the same price (or less), then it's not really about spending an extremely generous salary to have better tools, it's about spending it to have a substandard tool that happens to have an Apple name attached to it.

If it's actually the best in enough areas to make it worthwhile, then that's great, and maybe it is worth the cost to some people. Maybe something like the Dell 8k monitor is a better fit for more people in the market for one of these, at a cheaper cost ($4k with a stand), more pixels, and similar feature set (if lacking some named features). Maybe not. I probably don't know enough to accurately judge, but it sure looks like Apple's trying to fleece some people that want the brand name to me.


It's not an issue that the screen is expensive, but we should take issue that the way Apple positioned the screen is beyond misleading. It's a hard lie.

During the launch, Apple generously rewarded it with the title "best Pro display on the market" and directly compared it to a 35k Sony reference monitor.

The problem is, Apple's screen is unusable as a reference monitor. The screen is not uniform, has a strong "vignette". The backlight zones create blooming artifacts. Color accuracy is fairly good, but not better than something found in a low-end BenQ.

Not only is it not the best pro monitor, it's not a pro monitor at all. It literally cannot be used as a reference monitor. It's basically a 6K LCD screen with all the problems that come with LCD.


If you can get an equivalent (or better) monitor for the same price (or less)

You can't; that's the point. There are no 6K monitors besides the ridiculous Apple one and the Dell 8K probably needs to be run at 2.5x which I don't know if macOS even supports.


I didn't realize it's only 60hz, that's pretty bad. You can get 4K gaming monitors that do 120hz for under $1000; even if the monitor itself looks dorky it's a very comfortable viewing experience scaled to 2560x1440 high density.


I just got a 32”, 4K, 120hz monitor for $1000.


I got a 32" 4k 144hz monitor for $1000 and given how few of these there are odds are it's a similar panel

It's nothing compared to the XDR. I have no idea why this person thinks the XDR is bad for everyday users.

- I got one as a programmer who mostly looks at text. If you didn't tell me it has HDR always enabled I wouldn't have known, they've done an excellent job with managing bloom on non-HDR content.

- The resolution makes text milky smooth in a way that I always thought 4k was this whole time

- 60hz has never bothered me despite regularly switching between this, the 144hz monitor, and a 360hz monitor....

- The accelerometer is neat enough; I have mine on a gas spring monitor arm and it can occasionally be helpful to rotate

- Build quality is unmatched by all the gamer aesthetic stuff out there. Remove "backlight bleed" from your vocabulary.


4K isn't dense enough for proper 2x scaling at that size though. You have to go fractional, which is okay, but not great.


Yeah, who could ever imagine using a 1080p 32” monitor…


It's pretty awful.

I got Dell's "5k" ultrawide before the XDR, and at the time M1 Macbooks couldn't do fractional scaling at that high of a resolution.

The result was essentially a wider 32" 1080p screen. And while the text was crisp, the space efficiency was absolutely awful.

I can't even imagine the 16:9 version of that...


I was originally making a snide joke (sorry) since it’s a popular form factor, and I can’t think of a better common option for 2x scaling than 4k 32”… but on review, I actually take your point on space efficiency, if we’re talking about it as a primary display.


I find 1080p at 27" to be comically big, let alone 32". Maybe my eyes are still young though.


I call bullshit on the $4000 scissors. The very very highest end goes to $1000 and most hairdressers use scissors around the $100 mark, which are still extremely good professional tools. Furthermore, if you have a very expensive pair of scissors, you are not going to be sharpening them twice a day... More like twice a year (and you take them to a professional, you don't do it yourself).


Playing devil's advocate, they probably strop the edge twice a day with a leather strop or a super fine grit whetstone. This isn't really sharpening as such, very little material will be removed from the blade (as you probably know), but to an observer it would appear to be 'sharpening' and would theoretically assist in maintaining the very minor burrs that would occur from daily use on hair; if the scissors have some bypass this may also be desirable, as a burred edge can interfere with the bypass action. Caveat: I am an amateur bladesmith, not a hairdresser, so I don't actually know much about scissors specifically and don't have a frame of reference for how much damage hair does to an edge.


I can't comment on the real world value of a $4k pair of professional scissors, but the price is not unfathomable if they are hand-made, made-to-order Japanese scissors made out of Japanese Damascus steel. They are likely to take a month or a few to get made and will outlast humanity. Yes, they probably require being sharpened, ahem, serviced by the same master who made them.

NHK World ran a program a few years back dedicated to the art of Japanese scissors making, and it was an enlightening experience to watch. The meticulous attention to every single aspect of the scissors, ranging from the carbon content of the steel to ways of sharpening them by hand (no machinery used), nearly imploded my mind. Or, perhaps, I was just that impressed.


Give OP a break, they're planning to show responses in this thread to their spouse later to get permission to buy themself a nice toy.

If they say their hairdresser uses $4000 scissors because that's what the real professionals pay, hey, let them dream.


I got in to sewing recently and none of my cheapo scissors actually worked for cutting fabric properly. I spent $20 for a two pack of fabric scissors at the fabric store and they work great. I can definitely see how $100 scissors would be just fine!

Also sharpening twice a day sounds more like a ritual than a real need. But what do I know I’m not in that industry.


Sounds like they cut the hair of people with hair like mine. My old hair dresser used to joke about needing to get her shears sharpened every time she cut my hair.


Obviously no one should bankrupt themselves spending more than they can afford on a monitor, but I agree completely with the sentiment of the parent.

In early high school I scrimped and saved and purchased a NEC 3FG flat CRT (13 inches I think) and I was so glad I spent the money—much sharper, bigger, and better than what I had before.

In late college it was the same story for a used 21” Sony CRT. The thing weighed an absolute ton but it was awesome. Crisp 1600x1200 at 85hz. I lugged it way too many places—coding in friends' basements and playing games.

As a young engineer, Apple released their first 1920x1200 24” LCD at $3500. I remember telling a coworker it was a bit too much for me but at $2k it would be a no-brainer. A year (or something) later I remember him telling me that Apple had just dropped the price to $2k. I ordered one the same day.

And, yes, today, I have a Cinema Display XDR.

I have never regretted spending money on monitors. The total number of hours of use that I get out of them (say 6 hours a day for 6 years ~= 13,000 hours) makes them one of the easiest things to justify spending money on.
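
The per-hour math, as a quick sketch (using the 6 h/day, 6-year figure above and the XDR's $5,000 display plus $1,000 stand):

    hours = 6 * 365 * 6          # ~13,140 hours over 6 years
    price = 5000 + 1000          # display plus the optional stand
    print(f"{hours} hours -> ${price / hours:.2f} per hour")  # ~$0.46/hour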


> I just find that's truly not an unlikely situation to be in.

Uhm, I would say that fully remote $300k+ jobs are still unlikely.


>> untold hours to reach the top of your profession... I just find that's truly not an unlikely situation to be in.

> Uhm, I would say that fully remote $300k+ jobs are still unlikely.

Well, yes, reaching the top of one's profession is unlikely. But those who fit the preconditions established by GP probably have no issues finding 300k remote.

(That said, I would never spend more than $300 on a monitor regardless of income... for me, there is literally no return because cheap monitors have gotten so darn good. The one possible exception would be an e-ink display that mimicked the coding/authoring experience of a standard display, but AFAIK that doesn't exist yet. That said, I'm not exactly paid for my aesthetic intuition ;-) )


Think of it as people who earn 300k being able to dictate that they work from home, and it sounds a lot more feasible


In my experience, the companies able to pay $300K salaries are able to dictate where their employees work, not the other way around.

FWIW, I’ve had a remote job in this pay range pre-pandemic, but even I wouldn’t say that it’s easy or common.


Not really. I was pleasantly surprised at the options out there when I looked recently.

Entry level dev, yeah going to be hard. Senior level dev with a decent resume? Shouldn't be an issue at all.


My experience working as a senior engineer in FAANG is you get a 2 year old Intel MacBook and maybe a grand to spend on some accessories. Like 80% of my peers thought their 60hz Dell 1440p monitor was state of the art and couldn’t tell the difference between a Celeron and an i9 10900x if their life depended on it.


Same experience.

You can get nicer stuff, but you should order it specifically, and explain why you need it.


possible != likely


fully remote is not a benefit for most serious programmers, it's a choice.

I know many dozens of engineers that work fully remote and make >> $300k.


I find it's just too much to justify. A bit more than $1k on the Ultrafine 5K was justifiable for me, and I've seriously considered buying another.

If it was $2k or so for the Apple Pro Display XDR, or if they did a middle-man option that provided the 6K resolution but wasn't as bright etc, I'd snap one up in an instant - but I want it mostly for text & software development. I don't need perfect colour accuracy (although, I do like it).


The 8k Dell would be that... if it wasn't so horrible to drive.

It's ironic, it's probably heavily discounted because no one can drive it... but if it had come out later with newer connectivity options, it'd be selling like hot cakes and just as expensive as an XDR


I make around half that, and the biggest problem with the XDR is the aspect ratio; the reason that is an issue is how Apple designed the XDR. It is primarily for people on the low end and medium end who edit / color grade video.

If you are a developer, try considering a 12.9" iPad Pro possibly? At least the aspect ratio is much better and the refresh rate is there, but it is also too small for the rest of the examples. I don't see Apple releasing a monitor that will be middle of the road, but I could see an LG refresh.

As a developer when I am doing my side projects I mainly make my terminal consume the middle third of the screen and have other relevant information or have it fill the screen. The only issue filling the screen is that you are possibly looking at the edges too much and that can cause eye strain.

But, would I buy an XDR again? Absolutely. Doing anything with this monitor makes every other monitor I own second rate. Also, content consumption is excellent to the point that it's brighter than the theaters around me. Also, having a Mac which is fully compatible with an XDR allows me to manage it without touching one button on the monitor. Its integration with the rest of the system is by far the most appealing option. I do own the non Nano option.


I bought the XDR pro display and the stand June 2020 after completing a challenging but decent paying contract.

It’s a fantastic piece of equipment. Apple’s business discount and Apple Card cash back program chips away at the initial cost.

I sell used Apple products regularly as I upgrade and they hold their value pretty well. XDR is not an exception, check eBay.

Liquidating an XDR in favor of say Apple VR or an upgraded XDR will reduce the cost of ownership a great deal. Do the math for yourself.

I use it with a 2018 Mac mini and standard Blackmagic egpu largely for full stack development. It’s amazing.

I occasionally edit video in final cut, and watched Dune in 4K HDR on it.

If you’re considering an XDR I would not let all the negative opinion here sway you too much.

You can read about the experiences other developers and non tech folks have had w the XDR in this thread: https://forums.macrumors.com/threads/pro-display-xdr-owners-...


“Let them eat cake” is all I hear from this comment.


Home computer displays for radiologists can cost well over $10K. Of course they don't have to pay for it themselves though. And they can make over $600K working from home reading images all day.


I'm quite curious how much your hairdresser charges you given that he uses a $4000 pair of scissors.


>"Without reference to the technical merits of the Apple Display, I don't think dropping $5000 on a monitor is outlandish for any professional in the situation I've described."

I can afford $5000. But I would not pay this money. I do just fine with a 4K 32" BenQ at $700. I use it for programming and for this task that $5000 brick offers no advantages for me. And it is not even a real pro display. Those go for much more than $5K. Try $20K and up.


Well yeah, but while I am a professional, I am not earning $300k plus. So some budget constraints apply. Ironically, I might have considered an Apple Pro Display when it came out, if Apple had sold a computer to go with it :p. Now with the M1 Max laptops, things are starting to look different.

But there are several issues with the Apple Pro Display if you are not a video artist:

- it uses a lot of power and even requires a fan.

- it does have only one input port. Of all companies, Apple might consider the use case that you need to connect your (work) laptop and your (private) Mac to your screen.

So the price isn't only pretty much outside of the budget of most mortals, even if you treat it as a once in a lifetime luxury expense, it is quite limited. I so hope either Dell makes a screen with the same panel, but with ports and somewhat less expensive, or Apple makes a consumer screen again. I hope so much it is larger than 27", I find this size too limited. And if they do, please give it more than one port.


>I'll admit, a monitor that costs more than my first car is beyond affordable for 99% of the population. But saying that, many of the people reading this are pulling truly exotic salaries right now. After devoting untold hours to reaching the top of their profession.

It's not even about "getting something expensive but good as a professional expense to invest in the tools of your trade" - as a developer might see it.

It's quite a bargain for what it is, period. This monitor has specific capabilities, and competitive monitors with those capabilities, used in the video and post-production industry, cost anywhere from $10,000 to $20,000, from vendors like Sony and such.


I think the issue is most employers are unlikely to buy a monitor like this for anyone except video editing professionals. That leaves it on the employee to purchase it out of their own salary, even though it is almost certainly a work only expense.


Counter point, many older more experienced hairdressers get away with $1000 scissors just fine. Your point stands though, pay for what you want. Just don't tell me it's the tools that make the operator.


People aren’t thinking clearly about this because of how cheap all monitors are. Here’s how I think about it:

For a software engineer, all monitors are worth $100k+. This particular monitor is worth $110k+. So yes, spend an extra few $k to get the best.


How exactly did you arrive at that conclusion?

Why would a consumer product that I can get for $800 be worth $100k to me, thus justifying overpaying for it? Am I supposed to buy a $50k wallet because I take it everywhere now? Or $250k glasses because I wear them all day?


Good question, but I have a good answer: you need to understand the amount of consumer surplus of each product. If a wallet cost $50k, you wouldn’t buy it, you’d shove your credit cards in your pants pocket or find a workaround. But if a monitor cost $100k, you or your company would simply buy it and get on with your life - assuming you’re a professional programmer. Monitors have a ridiculous amount of consumer surplus. Once you accept that a monitor is worth $100k of consumer surplus, it’s more intuitive that a 10% noticeably better monitor can have 10% more consumer surplus, regardless of the low market price.

And yes, strong vision correction is also potentially worth $250k if you make a programmer’s salary and that’s the only way you can work. It’s just important to operationalize “10% better” as a measure of how much it helps your output. If your productivity increases 10%, it’s worth an additional $25k cost.
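
Put as a quick sketch, with the numbers above treated purely as assumptions (a $250k salary standing in for the value of a programmer's output, and a hypothetical "10% better" monitor):

    salary = 250_000          # assumed annual value of the programmer's output
    productivity_gain = 0.10  # the hypothetical "10% better" monitor

    extra_value_per_year = salary * productivity_gain
    print(f"extra value: ${extra_value_per_year:,.0f}/year")  # $25,000/year
    # Next to that, a few $k of price premium on the monitor is noise --
    # which is the consumer-surplus point being made above.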

One more example: Imagine McDonald’s Big Mac Value Meals cost $1 each and every other meal cost $50. Assuming you make a $100k+ salary, I think you should opt for the $50 meals quite often, even though most of the country is eating Big Macs and thinking you must be crazy.


The hypotheticals make sense, but they're not reality.

As for your last example, it again makes sense on its own.

In case of these monitors though, I see it more like every burger costs $10 but there is also $40 one that has a nice logo stamped on the bun. Also you have to pay extra for a plate. I don't think I am buying that one very often.


Well obviously if you have that kind of cash you might as well, but it's a bit like drugs. You think "I need this to be productive" when you're really just stroking your vice - where does it stop?


I'm sorry, tell me more about the $4000 pair of scissors.


Pro Display XDR really only makes sense if you buy it with pre-tax dollars.


Someone could also buy a really nice monitor for $2k and donate the other 2k to a local food bank. A lot of people are struggling and hungry.


What if I buy a $5k monitor and donate more than that, are you satisfied then?

Why turn this into a morality play?


> What if I buy a $5k monitor and donate more than that, are you satisfied then?

I don't believe you would actually do this or you wouldn't make that comment, but either way don't do it to satisfy me.

> Why turn this into a morality play?

As I said, a lot of people are struggling and hungry. The point of the comment is to inject a little reality into the idea that a $4,000 monitor is "worth it" because someone is at the computer a lot. But the marginal happiness of an HN user buying a $4,000 monitor instead of a $2,000 one is very low, and there are other ways to spend extra cash. If you have extra money right now, consider donating it. It's cold outside and people need help.

Why get upset anyway? I think I made a fine suggestion. Someone could do what I suggested. Or not. You don't have to do it if you don't want to.


In Europe we don’t kindly ask for your extra money. We just tax it. Takes the moral dilemmas out of the equation.


You know, I wasn't upset...

at least not when you wrote this:

> Don't do it because I am satisfied or not.

but for some reason you felt the need to edit it to:

> I don't believe you would actually do this or you wouldn't make that comment, but either way don't do it to satisfy me.

You should probably rollback the edit.

I'm from Ghana, I probably donate what you've donated in a decade every month to people who need it more than any American ever will.

And not just money, time. I don't go back home just to see my family after all, I've spent months of my time working with my father on his USAID project in the country. Is taking a 6 month unpaid sabbatical and giving up 200k in pay enough for our resident patron saint of the poor?

I grow reallll tired of people like this. People in the "1st world" who based on their virtue signaling you'd assume live like monks but in reality live in relatively cozy excess completely unaware of half the reality the actual downtrodden face.

-

Semi-off topic but based on this comment I feel like you're exactly who needs to hear this. The degree to which virtue signaling has become an integral part of some people's sense of identity in this country is infuriating: It's not cold where I'm from, people still need help.

Getting giddy off slogans and token shows of kindness in one of the most privileged countries on earth...

Where I'm from "needing help" isn't a seasonal issue, and it's not even close to being as bad as it gets globally.

-

> But the marginal happiness of an HN user buying a $4,000 monitor instead of a $2,000 is very low, and there are other ways to spend extra cash.

What a joke. Did it ever occur to you that spending fractions of a percent more, compared to what you get paid to sit in front of the damn thing, is a marginal expense?

> Why get upset anyway? I think I made a fine suggestion. Someone could do what I suggested. Or not. You don't have to do it if you don't want to.

Because you have the audacity to talk so condescendingly based on someone's monitor choice. I couldn't resist replying in kind.


I’m not trying to tell anyone from Ghana how to spend their money. But the person I was responding to seems to know a lot of people making $300k and they seemed to be really good at convincing themselves to spend a lot of money on things. I made the rather meek suggestion to buy cheaper and donate. Right now people are freezing to death on park benches where I live so people really do need more help.

But other than that you’ve read too much into me. My entire philosophy is that we must change the way our economy operates to eliminate poverty. I’m setting my own life up so that all of my engineering work goes to support this goal. Everything I do is open source so people all over the world can benefit, and I am learning how to operate an engineering project sustainably that can stay open source without needing to cave to commercial interests. The project I am learning this on is an open source farming robot of my own design, and I have sunk a fair bit of my time for free into the project and earn less than half of what I did working at Google. It was my idea to operate the project as open source, and to intentionally collaborate with people all over the world to make a design that can be fabricated cheaply anywhere.

Once I learn how to manage this community oriented engineering project, the next project will be large scale free hot meal producing machines. I want to make free meals the way the Sikhs do in India, where an army of volunteers cooks 50,000 free meals a day at a single location, and collectively across India their non profit NGO produces over 1.7 million free meals a day. I want to use my skills in automation and engineering management to make open source machines to do the work of those volunteers, and if I succeed we will open a demonstration facility in Oakland that can serve hundreds of meals a day, scaling hopefully to thousands.

You mention virtue signaling. But I am not here for signaling. I really do find it weird when people on here talk about how they’re going to spend all their money on themselves. And I make this mild suggestion that they consider donating their money because I want to see how people respond. I wasn’t condescending, I just said someone could buy cheaper and use their excess money to help the needy. We have a real problem with consumerism in the USA and it is destroying the planet. I think it’s worth making a gentle suggestion to donate. And invariably someone gets upset and makes a big deal out of it. So today that person was you.

But I’m working very hard to do my part. Sharing my work with all and trying to make it sustainable. I taught a robotics class in Mauritius to some students from Ghana, and Kenya and South Africa and Ethiopia and Morocco. When I design my farming robot I have them in mind. Once our design is operational I want to find people in Nairobi who can build them, and I will help them every way I can. Hopefully some day one of my students will be able to use it. A few of them really wanted to bring farming robots back home.

EDIT: This linked comment below really nicely sums up what I am getting at. I’m not saying a developer shouldn’t have a nice monitor but there is a point at which it becomes extravagant, and I really don’t understand why you’ve fixated on me: https://news.ycombinator.com/item?id=29708573


This entire comment is walking back what you said because I'm not the exact person you presumed I was... based on the monitor I use. You understand how ridiculous that is right?

> I don't believe you would actually do this or you wouldn't make that comment

That's what you said to me, not someone else. And all you had to go on was me saying: Spending an extra $2000 on a tool you use 8 hours a day is not something to pass a moral judgement on.

Most people would not object to what I said: a $5000 tool is a pittance in comparison to what some not nearly as well compensated people end up spending on better tools.

So if despite that you try to turn it into an issue of donating more to your fellow man... what's the term for that?

What is a term for forcing virtue and morality into a conversation in a way that does more to signal your own position than actually add to the conversation in a cohesive way?

-

To put it more plainly: browbeating people over not spending $2000 less on a monitor is the height of virtue signaling in a forum where people are stating they make a living off of them.

I mean even in your reply you're doing it! People are talking about an expensive monitor and somehow you twist it into:

> "I really do find it weird when people on here talk about how they’re going to spend all their money on themselves."

You really don't see how nauseatingly disingenuous you're being? Talking about spending $5,000 on a monitor is tantamount to saying you spend all your money on yourself?

Didn't that exact wrong assumption already lead you astray with me?

My first comment said it all: Why make this a morality play.

It's the definition of virtue signaling.


Respectfully, I do not see how I am browbeating anyone when I say:

> Someone could also buy a really nice monitor for $2k and donate the other 2k to a local food bank. A lot of people are struggling and hungry.

And I think it is reasonable for me not to believe that a random internet commenter is going to make a $5000 donation while they accuse me of making a morality play. I'm not saying you will never donate $5000, but I do believe you will not make any additional donations due to our exchange. It just sounded like hot air.

But I can see that you want to tear me down, and I don't really care. We have strayed far beyond good faith conversation so I'm going to exit this now.


If you don't see the arrogance in telling someone talking about the tools of their trade to consider instead downgrading and donating to food banks, as if the two are mutually exclusive, or one topic begat the other: then there was never much of a conversation to be had.

You're free to martyr yourself though. Act like I'm just tearing you down. Not addressing anything you've said in these comments.

-

Maybe in a moment of introspection you might realize that no one would tear you down for suggesting food banks exist, and maybe, juuuuuust maybe you did something wrong when you:

- Tried to make it about choosing between spending money on tools and donating to food banks, as if the two are mutually exclusive.

- Made assumptions about a complete stranger's donation habits based on a monitor purchase and telling you not to make a tooling choice a morality play...

- Claimed that people talking about a monitor are actually saying they spend all their money on themselves.

Of course, it'll be easier to not do any of that and go on thinking that there's this weird Ghanaian guy who hates food banks and spends all his money on himself because he likes the Apple Pro Display XDR.


I have been waiting for good external Retina displays for years. That the situation has not improved after all this time is incredibly frustrating. Why is no display manufacturer [0] interested in setting itself apart by producing and marketing a lineup of reasonably priced external pixel-doubled (~200 ppi) displays? For some reason everything must be 16:9 and 4K with no regards to display size, resulting in some very awkward pixel densities. The fact that Apple is able to mass-manufacture a 5k display with a whole computer in it for just a little bit more than an UltraFine 5k is to me an indication that at least it’s technically feasible.

[0] There is LG, but these displays have their issues as the article explains.


The monitor market as a whole is pretty underwhelming.

TVs and mobile/portable devices get lots of attention. But PC monitors are very far down manufacturers' priorities.


Focus is on 1) gaming, where response time and high frame rate is more important than high resolution, and 2) regular offices, where price is more important than any other feature.

Add to this the fact that manufacturing displays is a costly affair, so there are no "artisanal" choices.


If you're looking for a 5k display, I've found Wikipedia's list of 5k devices to be valuable reference point: https://en.wikipedia.org/wiki/5K_resolution#List_of_devices_...

Personally, I'm hanging out for a 5120x2880 display larger than 27". We've had 27" 5120x2880 displays for 6 or 7 years now, you'd think someone would have taken that res to a larger panel by now. But nope, still waiting.


If a 5120x2880 display were larger than 27", then it wouldn't have a good pixel density. In my experience, either slightly above 100 or 200 ppi are the sweet spots for screen resolution, because you get along well with an integer scaling factor.
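
For reference, the pixel densities being discussed, as a quick sketch (marketing diagonals, so treat the results as approximate):

    from math import hypot

    def ppi(w_px, h_px, diagonal_in):
        return hypot(w_px, h_px) / diagonal_in

    for name, w, h, d in [('27" 5K', 5120, 2880, 27),
                          ('27" 4K', 3840, 2160, 27),
                          ('32" 4K', 3840, 2160, 32),
                          ('32" 8K', 7680, 4320, 32)]:
        print(f"{name}: {ppi(w, h, d):.0f} ppi")
    # ~218, ~163, ~138, ~275 -- only the 5K and 8K land near the ~200 ppi
    # integer-scaling sweet spot; the 4K sizes sit in the awkward middle.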


I'm aware of all of these monitors, but most are discontinued or unavailable. Luckily I've recently been able to acquire a used LG 5K monitor and it's pretty great most of the time, but my model has some display quality problems.


it's a matter of connectors and display bandwidth. Apple's extra-expensive display doesn't use HDMI or DisplayPort connectors, it uses a proprietary connector.

edit: I'm wrong on that. oops. apparently that invalidates my entire point. (it doesn't)

display bandwidth matters, and connectors are where display bandwidth goes to die. apple had to design a special one for the bandwidth requirements of that display.

it's not so much a matter of panels, but display protocol bandwidth and connectors that allow it.

even Microsoft have the Surface Studio, and its excellent 4500x3000 display, and it's only available as part of the Surface Studio because that's the cheapest way they can get the display bandwidth all the way to the screen. eliminate the connectors and hardwire it.

this is also why laptops (especially apple laptops) have such good displays. they don't have to destroy the signal integrity with connectors, and they can use more wires to carry the signal than HDMI or DisplayPort allow.

high resolution, high framerate monitors just won't happen over DisplayPort or HDMI without serious advances. I expect a new connector to appear before that.


DisplayPort over Thunderbolt is not a proprietary connector (this is what the Pro Display XDR uses).


It's really stupid to force Thunderbolt usage though when USB-C has an alt mode for DP directly.


I believe the reason, IIRC, is it's actually doing two DP streams because one isn't enough bandwidth for the full resolution, which you can't do in alt mode where the PC is literally using the pins of the connection as though it were a single DP cable (although this may be more viable with newer DP specs these days, not sure).


The DP altmode didn’t provide enough bandwidth, which is why the best experience is over the Thunderbolt alternate mode.

edit: you can in fact get HBR3 working, what's not working at full rate with DP instead of thunderbolt is the USB3 hub on the display.


Oh it does? Nice. Kinda expected Apple to go full asshole mode on that.


You mean Intel - older Thunderbolt displays don't support USB-C alt-mode because Intel didn't support it as input until Titan Ridge Thunderbolt controllers.


DisplayPort 1.4 (quite old and common) has 8k at 60Hz. DisplayPort 2.0 has 16k at 60Hz. How is that not enough?


I think at the time the 5k Mac screens arrived only DisplayPort 1.2 was available, and probably some of the chips they use still only support that. But indeed - since time moved on and DisplayPort can now do 8k/60Hz it would be nice to see more screens going beyond 4K and supporting it.


Why in the world do you need proof from me?

the proof is that video cards and monitors and TVs either simply don't support that bandwidth, or they're so expensive that they're essentially hand made.

That's how you know, you look at the market and see what's available, and what it costs. People would buy the heck out of this stuff if it were possible to make it work with common, sloppy, reusable connectors like HDMI or DisplayPort. The fact of the matter is that it isn't possible to make this stuff with removable connectors, yet.

That's why you see very high DPI displays in applications where there isn't any need for high cycle-count connectors, or only single-use connectors well before you see the same displays with high-cycle count connectors.

or they use display compression to fake it.

but what do I know? you did some google searches.


The "serious advances" is DSC, which the XDR is already using to support 6k over a single DisplayPort HBR2 link. Which, as others mentioned, can be carried via USB-C alt-mode in addition to Thunderbolt.

6k 120Hz and 5k 144Hz are both possible over a single HBR3 link with DSC. Such monitors don't exist because the panels don't exist; rather the closest panel for sale is perhaps the 5120x1440 at 240Hz in the Samsung G9. Which has the same bandwidth requirements as a hypothetical 5120x2880 at 120Hz.

If you want 240Hz at >4k, or 8k 120Hz, then sure that exceeds existing DisplayPort 1.4 and ThunderBolt 4 bandwidth capabilities even with DSC, and you'll need the upcoming DisplayPort 2.0 link rates. (or DSC+chroma subsampling)
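
Rough link-budget numbers behind that, as a sketch (10-bit color, blanking and protocol overhead ignored, and a nominal ~3:1 DSC ratio assumed):

    def gbit_per_s(w, h, hz, bpp=30):
        return w * h * hz * bpp / 1e9

    # effective payload of a 4-lane DP link: HBR2 ~17.28 Gbit/s, HBR3 ~25.92 Gbit/s
    for name, w, h, hz in [("6K @ 60Hz", 6016, 3384, 60),
                           ("5K @ 144Hz", 5120, 2880, 144),
                           ("8K @ 120Hz", 7680, 4320, 120)]:
        raw = gbit_per_s(w, h, hz)
        print(f"{name}: raw {raw:.1f} Gbit/s, ~{raw / 3:.1f} with DSC")
    # 6K60 ~36.6 raw / ~12.2 compressed -> fits HBR2; 5K144 ~63.7 / ~21.2 -> fits HBR3;
    # 8K120 ~119.4 / ~39.8 -> beyond both, hence waiting on DisplayPort 2.0 link rates.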


It's using a Thunderbolt 3 connector I wouldn't call that proprietary. There are quite a few TB3 monitors out there.


It uses a usb 3 connector, although, given bandwidth and power requirements, I guess it has to be rated for the task.


Why is <220 ppi a dealbreaker? 27 inch 4k displays seem fine to me for retina and they are quite abundant. Though I've tried 32 inches 4k, and there the resolution is not enough for text work.

Dell has a 32 inch 8k display which is ~280 ppi in stock.

https://www.dell.com/en-us/work/shop/dell-ultrasharp-32-8k-m...


If you're on macOS, your choice for a 27" 4K is realistically either the 'looks like 1080p' high DPI mode, or the 'looks like 1440p' high DPI mode. The 'looks like 1080p' mode makes all of the UI elements huge. The 'looks like 1440p' mode makes text slightly fuzzy as it's being downscaled from 5120x2880 to 3840x2160.

It's not terrible, but it's nowhere near as nice as the 27" 5K.
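
A small sketch of the render-then-downscale behaviour described above, using the two 'looks like' modes mentioned (the 2x backing-store factor is how macOS HiDPI works; the rest is just arithmetic):

    panel = (3840, 2160)  # physical 27" 4K panel
    for looks_like in [(1920, 1080), (2560, 1440)]:
        backing = (looks_like[0] * 2, looks_like[1] * 2)  # macOS renders at 2x
        ratio = panel[0] / backing[0]
        print(f"looks like {looks_like}: render {backing}, scale to panel by {ratio:.2f}")
    # 'looks like 1080p' renders 3840x2160 -> 1:1 on the panel (crisp, huge UI)
    # 'looks like 1440p' renders 5120x2880 -> downscaled by 0.75 (slightly soft text)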


That sounds like a MacOS problem, not a "220dpi is not enough" problem.


This specific issue is definitely a macOS problem - instead of doing proper resolution independent rendering, macOS just renders at 2x the 'looks like' resolution, and then scales the result to fit the panel.

But even on Windows, the choices at 27" 4K aren't great. 100% is too small. 200% is too large. 150% is quite nice, but I find apps regularly don't scale properly, and text still isn't as nice as 200% on a 5K display.

edit: and on Linux, it's similarly not great - integer scaling works quite well on most modern DEs. Fractional scaling is a mixed bag and a lot of apps have broken or compromised UI when it's in use.


I use a triple monitor setup with the LG 24UD58-B. 24", 4K, $300. 2X scaling is absolutely perfect on it. 27" is not a good monitor size for this, I agree.

As an aside, a user here recommended putting the outside monitors in vertical mode with the middle monitor in horizontal mode. It works extremely well for reading websites/documentation/terminal output while coding in the middle panel.


It's cheap enough that I've been tempted to pick up one of these to try, but I think going back to 24" would be a little frustrating for side-by-side editing in an IDE.

I think it could make a great secondary though.


> Fractional scaling is a mixed bag and a lot of apps have broken or compromised UI when it's in use.

I use 1.5× scaling in Sway, and haven’t observed even the slightest problem in any Wayland app. (I also use the high-DPI patches for XWayland, and run it at 3×, so that it’s an integer multiple of my scaling factor, a minor visual improvement on the very few X11 things I use that is probably not worth it.) I will note, however, that Firefox is rendering content at 2× and downscaling; setting the experimental widget.wayland.fractional_buffer_scale to 1.5 distinctly improves rendering, but had some very annoying bugs when I tried it several months ago which became debilitating maybe a couple of months ago, so I gave up on it.


The one that sticks in my memory the most was a bunch of issues with Qt-based apps in particular where UI would scale in 1x, 2x, 3x steps, but font size would scale with the fractional size specified.

Led to some pretty broken looking UI.


I have a few Qt apps and haven’t observed anything like this; most likely that’s been fixed. I’ve been using Wayland since April.

(One thing that is comically broken about Sway and I think Wayland in general is cursor sizes. Specifying `seat seat0 xcursor_theme Adwaita 96` nets me at least four or I think five different cursor sizes, varying by app, and at least one of the dodgy sizes is actually a fractional scaling bug, scaling by ceil(1.5) rather than 1.5.)


Ummm, I think 27" 4K at 200% is definitely fine? I just use it as 1080p with better text rendering and graphics. I don't even use it at 150%, because at a 50-60 viewing distance that isn't really a pleasant experience. Did you place the monitor very close to your eyes?


Display distance is between 3ft and 4ft away for me depending on exact seating posture - so somewhere around 1m.


Exactly. I ended up getting used to the fuzzy text, but my first reaction was "Ughhh, this is ugly".


I'm using a 28" 4K (with Wayland, which does scaling exactly like macOS) — used to use 1.5x with downscaling but switched to 2x "huge UI" for perfect crispness and honestly? It's not bad. It's actually good. Don't fear the big buttons :)


> switched to 2x "huge UI" for perfect crispness and honestly? It's not bad. It's actually good. Don't fear the big buttons :)

There's always a balance and trade-off here.

2x scaling with 4k really means you end up with the same screen real estate as 1080p.

In my opinion I would very much prefer 24" or 25" at 1440p with native scaling, which is ~120 PPI. I personally run both a 25" 1440p display and another 24" 1440p display at 1x. I still consider going from 1080p to 1440p one of the biggest quality of life upgrades I've encountered when I made the switch 7 years ago. It's the same level of "wow this is great" as going from an HDD to an SSD.

Before there were shortages you could get a high quality IPS panel monitor from a reputable brand (Dell, ViewSonic) with low input latency for around $300. 4k at 1.5x also gives you the same real estate as 1440p but with higher PPI, though I don't know how crisp that will be given the scaling ratio.


>2x scaling with 4k really means you end up with the same screen real estate as 1080p.

Yes and no. With much clearer looking text, I can use smaller fonts (13 -> 11 in IntelliJ) than on regular 1080p displays, so there's _some_ difference.


> 1.5x also gives you the same real estate as 1440p but with higher PPI but I don't know how crisp that will be given the scaling ratio.

1.5x is nice on a 27 inch 4k with font sizes of 14+. HiDPI monitors do seem to have bloom, so slightly bigger fonts with higher DPI are good to have.


For reference I'm at 25" 1440p with a font size of 9 using Consolas. Things look pretty crisp at native scaling and also very comfortable to read with no strain.

In terms of viewing space this fits (4) side by side code windows at 80 characters with a little bit of breathing room.


Yep, 27" 4k at 1.5x (plasma) I can have 50/50 or 60/40 with font size 14 and get 110+ characters, not just 80. 60/40 would be for an IDE with a bar or two on the side and still get almost 110 columns in the editor/console comfortably without eye strain. With 144 DPI fonts on plasma (using Hack), fonts seem plenty crisp to me down to size 5.

For screen size, I think there is some screen size that's most comfortable to view, and I want to say that's like ~40% of the human FOV... so, I think a 24" monitor and 15" laptop are most comfortable for me, but 27" might be a bit nicer if you're using it for media as well.


I misread a bit when replying in the other comment: for 4 side-by-side 80-column windows, you'd need to bring the font size down to 9, which isn't as comfortable as 14 for me in my side-by-side setup, but then 1.25x is also an option.


> you end up with the same screen real estate as 1080p.

Is that even a problem? I am not buying a better screen to torture my eyes, IMO. Smaller, sharper text won't really be easier to see than larger but blurrier text.


I think so, but it's personal opinion.

I wear glasses and 25" at 1440p has very clear text. I feel no strain. Even the grey text on HN is very readable from about 32" (81 cm) away which is my normal viewing distance. It's readable without much strain from 45" (114 cm) but I wouldn't want to use things that far away.

For reference my work issued laptop is a 13" 2020 MBP so I have experienced retina. I would take the 1440p screen real estate at ~120 PPI (24"/25") 10 out of 10 times vs going back to 1080p but with higher PPI. There's also the added benefit of never having to think or worry about poor UI scaling since not all apps handle scaling well, but that's not a deciding factor in my mind, more of a nice to have.


Then probably this is the reason.

I never need to run an app and editor side by side on the same screen. I always have 2+ screens in any setup, so I just throw one onto another screen if I ever want to run things side by side. And besides, I seldom code on my 27" 4K screens.

They are used mostly for web browsing and gaming, where not being able to put something side by side isn't even an issue.


Yes, for developers screen real estate is everything. You want crisp fonts and lots of space, both for code editing and of course for testing, where you often run your application side by side with your code window. At work I still use my old 30" 2560x1600 screen (basically the old Cinema Display) and it's just gorgeous. My iMac is way sharper, which is also great for image processing, but already noticeably smaller. You can squeeze a bit more onto the screen due to the higher resolution. My 24" 4k screen works with a scaled resolution (2300 horizontal resolution) but it's considerably worse. A hidpi 30" screen would be a dream.


> with Wayland, which does scaling exactly like macOS

I believe that depends entirely on the desktop environment. Sway does real fractional scaling rather than the idiotic downscaling way.


No, Sway supports downscaling. Augmenting downscaling with exact fractional scales is still in discussion: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/i... (And Firefox has a prototype implementation using wp_viewporter but you have to set the scale manually.)

Of course, independent of the desktop environment, you could just tell supporting applications to just manually render bigger UI (exactly how you would on X11 with some TOOLKIT_DPI variable thing per UI toolkit) and keep the monitor scale in the compositor at 1x. That's probably what you've been doing. That doesn't support multiple monitors with varying scale.


> Why is <220 ppi a dealbreaker?

Because it's very blocky and not pleasant to use.


I have been using Dell P2415Q (24" 180ppi) for a few years side-by-side with a Retina MacBook, and the difference is visible: I can see pixels on 180ppi text if I lean forward, but not on the 220ppi. Not a deal breaker, of course, but the "retina" marketing pitch isn't lying.

BTW, LG 24UD58 mentioned in the article is also only 180ppi.


I am using the same, as I needed an external screen for my MacBook. I use it in a scaled resolution for screen real estate (2300 horizontal pixels), which works, but isn't as sharp as my iMac.


> Why is <220 ppi a dealbreaker? 27 inch 4k displays seem fine to me for retina and they are quite abundant.

I regret buying a 27 inch 4K monitor each and every day.

Sure, the PPI is much better than my 24 inch 1200p Dell Ultrasharp, but a 27 inch 4K needs either fractional scaling, which is a mess, or scaling fonts, which makes UI elements look weird.

24 inch 4K and 27 inch 5K should be a standard but all monitor OEMs are interested in making shitty gaming monitors.


I wish Mac OS and Apple hardware supported 8k monitors. I have this monitor and love it, but I can't use it with a Macbook at higher than 4k.


My utterly unrealistic dream is: now that the first batch of 5k iMacs is approaching “old age,” some business would start selling kits that had the tools and instructions necessary to gut old iMacs, plus a new controller board that would convert the remaining screens to basic DisplayPort displays.

At ~$300 for the kit and around $500 for an old iMac on eBay, this could be somehow both more economical and likely more reliable than the displays currently on the market (see my other message in this thread). Also it’s just going to be a pity when those beautiful old computers start hitting the landfills.


I assume they use eDP[1] internally, so it won't be that hard to convert them to a monitor[2]

[1]: https://en.wikipedia.org/wiki/DisplayPort#eDP [2]: https://hackaday.com/2019/07/18/put-those-ipad-displays-to-w...


It’s been done, I just don’t have the know-how to acquire the parts or properly do the soldering. I need a kit that says “throw away this, plug in here.” https://9to5mac.com/2021/01/26/apple-5k-display-do-it-yourse...


The 27" 5K iMacs suffer from image retention as they get older. I'm typing this on a Late 2014, and the retention has started to become annoying.


This exists albeit more DIY than your dream: https://www.reddit.com/r/hackintosh/comments/hrlf8x/the_secr...


My current setup consists of 2x iMac 27 Retina sitting next to each other. I use Barrier (a software KVM) to switch mouse/keyboard between them. It works well for my needs and is an inexpensive way to get a 2x 5K monitor setup (total cost: 2100 EUR).

I am considering getting an M1 Mac mini and using my 2 current iMac 27 Retinas as external monitors. My plan is to get a cheap small monitor for the Mac mini, then use Barrier to use the iMacs as external monitors.

Has anyone here gone that route?


I wouldn't blame you for wanting to remove the Intel innards, but iMacs can already be used as external displays.

https://support.apple.com/en-us/HT204592

Also, you can just source the panels from AliExpress most of the time. That's how I got my first Korean Retina IPS display over a decade ago, and I'm doing it again now for a non-standard OLED panel for a project.


Target Display Mode doesn't work on any of the 5K iMacs, from what I understand.


You are correct! IIRC, it wouldn't have been possible (or at least, wouldn't have been very easy) with the external display protocols available at the time the first 5K iMacs came out, and they've never gone back and revisited the idea.


Thanks, I missed that.


Check your local craigslist listings. I bought a lightly used LG Ultrafine 5k in 2019 for $600 in SF.


I did something similar, and sold it for far more during the pandemic. I don’t think they’ve gotten cheap again, though I haven’t looked hard!


If you are considering the purchase of an old Apple Cinema Display, make sure it includes the "power brick" or "power supply".

It's a proprietary bit that combines the DVI and power onto a single cable that plugs into the back of the display.

Many resellers process used Apple displays just like every other brand. You don't sell a DVI or VGA cable with a monitor. It's no big deal.

But without the required Apple brick, your very nicely calibrated large LCD monitor is... just a brick.


A bit off topic: Mini LED screens are honestly somewhat disappointing to me. The blooming is a real thing and I can see it every time there is bright content on a dark background. I accept it because there is no better option really. I can see this blooming both on the MacBook Pro 16 M1 Max and on the iPad Pro 12.9. It looks especially bad when bright content moves slowly: you can see the edges of the backlight zones, and you can watch one zone turn off as another turns on. Awful.

I also own a Gigabyte AERO 15 OLED laptop and I must admit I prefer its screen over Apple's. White text on black looks amazing and to be honest feels a bit magical, like a thing floating in the air, especially if I look at the screen in a dark room. Code and terminal content look amazing. It's only 60Hz though, but for mostly static coding 60Hz is enough. Ubuntu works really well, so I don't have to deal with Windows for programming work.

I'm tempted to buy the LG OLED 27EP950. It's expensive, but honestly I'd pay that money to reproduce the Gigabyte AERO 15 OLED experience on a 27" screen. But I know it won't happen, because the LG screen will have worse pixel density; it's only 4K. It should be 5K at least, or even 5.5K or 6K.


Problem with OLED is the 240hz PWM, which is present in the AERO as well.

For instance, on my iPhone 13 Pro, I can see the on-off-on-off clearly once I’m below 20% brightness. The eye strain of oled is terribly annoying.

I agree with you on the blooming, but there are no options today that scratch all itches.


Are all laptops with OLED using PWM?

Some AMOLED phones are using DC dimming...


I’m only aware of the Gigabyte Aero (240hz), Dell XPS 13 (240hz), Dell XPS 15 (60hz!?), and Razer Blade 15 (60hz!?).

I wasn’t aware some AMOLED phones use dc dimming. I would love to have that. It’s so weird looking at your phone and seeing the screen turn on and off repeatedly.


These Apple displays were rumored to be microLED, and I was incredibly bummed to find out they’re just an iteration of regionally-backlit LCD’s. OLED is definitely the best option for contrast, and I’ve had no luck finding an OLED monitor that wouldn’t bankrupt me.

I bought a couple Dell U2720Q instead, and they’re great; until something more compelling is available.

https://www.dell.com/en-us/work/shop/dell-ultrasharp-27-4k-u...


LG 27GN950 seems like it hits a lot of the criteria others are discussing. It is not 220 PPI though.

4K IPS at 144Hz (or 120Hz depending on video card)

HDR

400 nits

It also rotates to portrait mode, has a built in backlight for eyestrain, is essentially borderless- and it's $800-900.

Have had it over a year with no issues.


Is it actually HDR, or is it 'HDR'? I have another LG monitor that claims HDR support but really it just squishes the SDR content to make space for HDR - so it makes everything look awful.


400 nits HDR is a marketing gimmick. For real HDR you need 1000 nits minimum, as well as a 10-bit (if not 12-bit) display.


I've heard others say that unless it has FALD, anything claiming HDR is basically lying. And I can't speak to how accurate that is, but this monitor has HDR600 which requires 10bit color depth and local dimming.

I think it looks better with it enabled than without, seems like darker darks / richer color, but I'm not an HDR or monitor enthusiast, so check better sources before making a purchase.


There are so few things on Windows that actually utilize HDR properly. The few I can come up with are some games from Ubisoft (the Tom Clancy's series) and Forza Horizon 5; you get a far better experience when playing them with HDR. The others are either non-adjustable, too dim, too bright, or have a totally incorrect light curve.


It's HDR600 so I suppose 'HDR' in quotes.

It's also matte which I like more than the LG 5k's glossy screen.


I have this as well, and I like it, but it's quite difficult to find a computer capable of driving it at 4K and 144Hz. You can only do that if your GPU supports Display Stream Compression. My GeForce 960 does not, so I can only drive it at 95Hz (I'd buy a better GPU, if I could find one...). It doesn't support HDMI 2.1 either, although there's a newer model, the 27GP950-B, that does.

It's kind of funny to me that the limiting factor in high-res, high-refresh-rate display technology is not so much the panels but the bandwidth of the cables and connectors.


I have the same problem. I got the Neo G9, which can do 240Hz, but if I want 10-bit HDR it’s limited to 60Hz on both my 2019 MBP and GeForce 1080 PC, because neither supports HDMI 2.1 or DSC. Windows can do 8-bit + dithering at 120Hz, which is actually fine, but it still feels like a silly limitation.


I have a computer and display combo that does 4K @ 144hz just fine with no stream compression. Instead it uses two DP cables at once. It’s a 2080Ti GPU and Acer Nitro XV3 display.


The 27GN950-B variant has up to 160Hz (overclocked) and 600 nits. Probably it's the successor of yours or something? I didn't even know a version without the `-B` exists. And it's around $1000 when not on sale, even lower when it is.


Curious as to what people think of https://consumer.huawei.com/en/monitors/mateview/specs/ - https://www.amazon.co.uk/dp/B09C6JZPZG

It's 28.2" and 3840x2560 - it's a 3:2 aspect ratio.

I think it's a very tempting option.


Thank you - that is very interesting ...

Perhaps I can pay this forward by mentioning the Eizo 1:1 (1920x1920) monitor @ 26.5" diagonal:

https://www.eizo.com/products/flexscan/ev2730q/


Looks interesting.

Includes audio - stereo speakers and microphone array.

Not sure the display can be used on a VESA mount; there are ports on the stand, but a detail illustration shows feed-through to ports on the hinge.

The Amazon UK site seems to be willing to ship it to my US address.

About $780, with that shipping cost.


Looks cool, but I haven’t heard of this monitor once before — I suspect it’s likely because Huawei’s brand has been mildly killed by the US govt (I also don’t see that monitor on US Amazon).


That price is a lot less than I expected.


A reasonable high-enough DPI option I've been watching for a while is a 55" 8K TV. We're close to the 8K utopia!

- HDMI 2.1 can push 8K @ 60Hz over a single cable (though I am told the colors are not full range? https://twitter.com/MaratTanalin/status/1426726300585185284 - "Afaik, 8K@60Hz via HDMI 2.1 is only possible with either DSC compression or chroma subsampling — both are lossy.")

- A 3090 can push the pixels

- 8K TV prices have come down to ~$1500-$2000

- In theory, some combination of Game Mode and/or manually turning off all forms of picture post-processing can get you into good-to-good-enough input lag

If you want to wait for monitor manufacturers to manufacture 8K monitors, well, while it's inevitable, it'll be a while. But for the bold and stout of heart, for those willing to take risks, a new adventure awaits you. 8K
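The claim in that tweet is easy to sanity-check with a rough sketch (Python; raw active pixels only, and the effective rate assumes HDMI 2.1's 48 Gbps FRL signalling with its 16b/18b coding):

    raw_10bit = 7680 * 4320 * 60 * 30 / 1e9   # 8K60, 10-bit RGB: ~59.7 Gbps
    raw_8bit  = 7680 * 4320 * 60 * 24 / 1e9   # even 8-bit RGB: ~47.8 Gbps
    effective = 48 * 16 / 18                  # HDMI 2.1 FRL after coding: ~42.7 Gbps
    print(raw_10bit, raw_8bit, effective)     # both exceed the link -> DSC or 4:2:0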


Dell P2415Q. https://www.dell.com/en-us/work/shop/cty/dell-24-ultra-hd-4k... you can still find it in stock on Amazon despite Dell not carrying it anymore. I have one, it's a glorious IPS panel for photo editing.


I just picked up a pair of P3222QE monitors. Kind of pricey but hands down the best monitors I've owned. I think it's too big to be "Retina" at 4K, but I really can't see any pixels.

I still find it hard to believe that I can't change the brightness via the keyboard on state-of-the-art monitors. Does anyone know of any monitors where this is possible, or do they mostly require you to go through the terrible built-in controls?

https://www.dell.com/en-us/work/shop/dell-32-4k-usb-c-hub-mo...


See https://lunar.fyi to change brightness via keyboard


Very cool! This led me to another project as well, for Windows: https://github.com/emoacht/Monitorian


I have a couple of these and I used to be a fan, but they just shit the bed constantly with macOS over a Thunderbolt dock for some reason. Needs constant power cycling, almost every time the laptop is plugged in.


Can confirm that I've had more than a decade of very satisfying experience using Dell displays. They have a very predictable menu and just work. The soundbar is a very good solution if you don't want speakers cluttering your desk.


I have two in my office and two at home. I have been using this monitor for 6-7 years now. I love this monitor. The bezels are huge though.


I have this monitor. It is insane value for money and gives you a beautiful picture.


Yep, having the same problem. I'd love an Ultrawide with comparable pixel density as the 14" Macbook Pro.

I'm sitting on a very large database of monitor data coming from Lunar's diagnostics and error reporting (https://lunar.fyi)

Maybe I should organize it into Postgres and then search for monitors with the PPI I want.

Or is there any easier solution than Postgres for fast ingesting some JSONs and filtering by arbitrarily nested fields?


Postgres? Seriously, for one-off playing around? Just use Airtable and you can do all the filtering and nesting you want. Plus then you can share it with us and we can collaboratively contribute.


Could I easily import 30k large and arbitrarily nested JSONs into Airtable?

I'd love to have something with an easy to use UI for filtering that can also be shared.


Yes, the Pro plan allows 50k rows per Airtable base. Arbitrary JSON can't be imported easily though, you'll have to convert it to CSV or use the API to dump data.


Depending on what you mean by large, you can go very far with SQLite. It even has a JSON extension if you want to drop blobs into there.


If you know SQLite, there is also a JSON extension for filtering. Here's a link to the examples section: https://www.sqlite.org/json1.html#examples_using_json_each_a...
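A minimal sketch of what that looks like from Python, assuming your SQLite build includes the JSON1 extension (most recent ones do); the field names here are made up, not Lunar's actual schema:

    import json, sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE monitors (doc TEXT)")
    con.execute("INSERT INTO monitors VALUES (?)",
                (json.dumps({"name": "Example 5K", "panel": {"ppi": 218, "hz": 60}}),))

    # json_extract reaches into arbitrarily nested fields of the stored blob
    rows = con.execute(
        "SELECT json_extract(doc, '$.name') FROM monitors "
        "WHERE json_extract(doc, '$.panel.ppi') >= 200").fetchall()
    print(rows)    # [('Example 5K',)]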


JQ can get you started without any db setup as you explore the data


typical ultrawide - 34" 3440x1440. that's 110ppi, regular density. (owning one, perfect size IMO, filling field of view almost exactly) if you wish retina (220ppi), that'll be 7K - which is alot of pixels. Nothing like this on market though


I’d immediately go with Elasticsearch for stuff like that.


The computer monitor market is at the mercy of gamers who want 1080p-1440p if it means getting 600fps in a game rather than 150fps at higher resolutions. It's why since 2015 there has been an arms race of refresh rates rather than of resolution. We had 4-5k retina panels in 2015; market conditions could have given us much higher quality panels if the entire thing weren't catered to young, cash-strapped gamers, whose opportunity cost when putting money towards a monitor is putting money towards a GPU.

Thankfully, as more and more public figures begin using OLED TVs instead of computer monitors, and as the monitor market takes after the larger consumer TV market, we will start to see better technologies compete with each other like OLED and FALD/miniLED.

It's a horrible time to buy a monitor right now, if you could wait even a year you should do so.

But keep in mind the relativity of "retina". It depends just as much on viewing distance as it does PPI. There are several handy charts you can view online relating the minimum noticeable viewing distance based on PPI.


Whilst true, it's more accurate to say the monitor market is hostage to the economics of LED panel manufacturing.

LEDs are normally produced as very large panels on production lines where every panel is made to a given PPI (and upper bound on the refresh rate).

Of course the technology mix is the technology mix, but the industry is built around reasonably large investments in these production lines, amortized across the entire screen industry.

A 27 inch 1080p monitor and a 24 inch 1080p monitor are produced on distinctly different production lines as it were, as they have a different PPI. But a tiny screen with the same PPI as said 27 inch 1080p likely came from the same factory in Korea.

Apple's panels are actually produced under contract by LG in Apple specific production lines, hence they can obtain exotic PPIs. Because they promise to use the line's capacity for multiple years.


As some of them have found (e.g. Linus's window-snapping burn-in in under six months), OLEDs have real drawbacks for use as standard desktop displays. It's why even Apple doesn't use OLEDs on their Macs yet, and you would expect them to if it were just a matter of "more money = more premium" with no deference to the gaming use case.


Agreed re: the medium-term issues of OLED.

But I can get a $1000 4k, 120hz OLED screen. Sure, it might be 42-48", but I could just as easily buy a deeper desk, wall mount it 4-5 feet away for the same effect. I'm currently sitting 3.5 feet from a 24" monitor. In this instance, the TV becomes "retina" past 3 feet.

https://www.designcompaniesranked.com/resources/is-this-reti...

Let's assume when used as a monitor without excessive amounts of care, irreparable burn-in takes 2 years. I could buy two of these TVs and wait out the whole nonsense the market is going through right now. Re: Linus, Keep in mind that window snap burn in was fixed with the pixel refresh function. And the conclusion he and Wendell came to in that video was that it isn't perfect, and it's often wasteful, but you can just get another monitor.


For many, many people, 2 years is not a reasonable life expectancy for a $1000 monitor. My current home monitors are 1 year old (€400 27" 1440p165 IPS), 4 years old (€500 27" 1440p144 IPS) and 7 years old (€250 21" 1080p60 IPS).


Ditto this. My home monitors are:

27" Ultrafine 5K - bought in 2018, so 3+ years old, $1000+. 27" LG 4K - bought in 2019, so 2+ years old, $600+. 27" Dell Ultrasharp 1440p - bought in 2016, $800+. 24" Dell Ultrasharp 1920x1200 - bought in 2013, $400+.

All of these are in active use still - even the 8 year old 24" is still a solid AH-IPS monitor with decent wide-gamut colour support and LED backlighting.

I'd be very disappointed in a $1k display that was showing irreparable issues after 2 years. I'd be positively angry at a $1k display with issues after 6 months.


Although you make an ok point, you're greatly exaggerating FPS. I'm on a beautiful 1440p 144hz LG and I really only need 150fps. This still often requires less-than-ultra video settings even with a 3080 ti. There is a very small niche of competitive gamers who want 240hz at 1080p and the fps that justifies it.



BlurBusters is an amazing resource for investigating HFR.


there are 360Hz monitors, now, and they are likely to sell well as soon as the silicon shortage eases, if they aren't already.

the real problem is that a 1080p monitor at 360Hz requires six times the bandwidth of a standard 1080p60 monitor, while a 120Hz 4k UHD monitor requires eight times the bandwidth of a 1080p60 monitor.

it's easier to reach the FPS targets incrementally than it is to reach the resolution targets in larger jumps.
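The ratios check out if you just count pixels per second (a rough sketch; real link bandwidth adds blanking and encoding overhead on top, but it scales roughly the same way):

    def pixel_rate(w, h, hz):
        return w * h * hz

    base = pixel_rate(1920, 1080, 60)
    print(pixel_rate(1920, 1080, 360) / base)   # 6.0 - the "six times" above
    print(pixel_rate(3840, 2160, 120) / base)   # 8.0 - the "eight times" above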


Isn't this the entire esports community though?

edit: This was intended to be a serious question. I thought the esports folks were mostly interested in minimizing input latency and maximizing framerate, to the exclusion of most other concerns. ESPN covers esports tournaments now, sometimes on the front page of espn.com, so I thought it was more popular than perhaps it is.


Being a small niche really isn't mutually exclusive with the eSports community, unless you extend the eSports community to everyone who has ever played LoL or CS:GO. I think more people still max out settings on those games than minimise them for the extra FPS.


The driver is really what manufacturers want to sell. We saw this with those awful 13xx by 768 screens no one wanted. They made loads of them and then had to sell them.

What I suspect is really going on is it's easier to overclock screens than it is to increase yield on high resolutions.


By defining Retina as 220 PPI, most options are excluded, as 4K at 27" or 32" is common but more isn't. A look at what Apple calls Retina, though, shows that angular density is a much better definition. The initial presentation by Jobs was about angular resolution, and the actual screens follow it. PPI ranges from 218 to 476, but angular density only between 57 and 92 pixels per degree.

So if you set a 27" or 32" screen at a decent distance you get "Retina" quality. If you want the screen closer maybe not. Unfortunately the market for more than 4K isn't really yet here for various reasons. Maybe if 8K catches on as a media format we take the next step. At that point only if you want to set the screen so close you can't view it all at once will it be worth it to go higher.


Yep, trying to find PPI based rules of thumb is silly.

The better definition of "retina" is far simpler: you render UI at >1x scale on it.


What does 1X scale mean? If your application is resolution independent, it doesn’t seem to be a meaningful distinction.


In a "logical pixel" system (macOS, Wayland, …Android??) the scale is the relationship between logical and physical pixels. In an "app go draw bigger stuff" system like Windows, it's just the factor of how much bigger the UI elements should be. E.g. in Windows settings you can set 150% scale and all UI elements will be 1.5X as large.

And the "good" scale depends mostly on PPI of the monitor, as well as user preference. On a 27" 4K monitor, 1x scale is near unusable: the UI elements would be microscopic. Windows defaults to 150%, but 175% and 200% are fine as well (here's the user preference aspect).


WPF and some other Windows APIs support resolution independence, so you don’t have an arbitrary size defined as 1X. It seems weird that Apple hasn’t been able to do resolution independence yet, and there is some arbitrary 1X that is based on monitor tech that was popular this decade (and definitely not the 800 x 600 or 640 x 480 resolutions that we used in the 90s).

Eventually, we should at least rename 2X to 1X and move on like we did before.


Directly using physical PPI has basically never taken off anywhere. Everyone is using "arbitrary 1x" because designers and developers have never started using physical dimensions for digital UI elements. Most people speak in "96-dpi-ish logical pixels" because it's much more convenient. Everyone knows what 128px approximately looks like. No one measures UI elements with a physical ruler in millimeters.


If you care about HDR, ironically enough there are only 4 options for a decent monitor as well - and almost all of them are nearly impossible to buy new. Some of these monitors are also years old: https://www.youtube.com/watch?v=lCcSK3R8HcM


27" 3840x2160 ASUS PG27UQ / ACER PREDATOR X27 (same panel in both monitors)

32" 3840x2160 ASUS PG32UQX

35" 3440x1440 ASUS PG35VQ

49" 5120x1440 Samsung Neo G9


I have a PG27UQ and a X27 side by side and it's bliss. They're on a custom built PC (so Mac Retina scaling issues are moot.) On Windows 11 I run at 125% and it's comfortable for me. (I Bought both new within 6 months of each other when they were about half off before the supply chain crunch.)


I bought a Neo G9 on Black Friday. It took 3 weeks to ship but they are available!


I have one of the LG Ultrafine 5K on my 14" M1 Max Macbook Pro - it is far more reliable with that Mac than with the 2018 15" I was using it with before. It's also worth swapping the TB3 cable if you still have issues.

After both of those I find it works absolutely flawlessly for me now, and it is comfortably the best external display I have ever used.

I would adore any option that is even remotely as good.


Huh, I’ll have to try the wife’s new 14” mbp and see if that is better. Too bad I’m a year away from a refresh.


Don't even get me started on brightness... good luck finding anything that's higher than 350 nits.


Why do you need so much brightness in an indoor environment? Personally I've more frequently had the opposite problem, i.e. minimum brightness only going down to the 30-50 nits level, which is quite bright in dark-ish environments.


One field that needs it is editing 4k+ HDR content that has 16 stops or more of dynamic range. All of the blacks would blend together on lower-output monitors like 400 nits. You really need something like 1000.

Here's a decent 18.4", but it costs $14k[1], or you could get the 32" for $30k.

[1] https://www.bhphotovideo.com/c/product/1676872-REG/canon_560...


Not the person you’re replying to, but better speed of thought at higher brightness levels. Sounds ridiculous, but it is so apparent in me that I even have a 20ms delta in simple response time between 300 nits and 200 nits in a moderately lit room - the same delta between me being fresh awake and being tired after being awake for 20 hours.

Additionally I’ve anecdotally observed that the dimmer the screen is relative to the background, the more disconnected I feel.


That's funny, because I've found something similar to be true for me. I wonder if that's a function of some measurement of interactive visual information.

For instance, I have a green screen CRT that I can work on at any nit as long as the contrast is basically infinite (text dim or bright), but as soon as the rest of the screen becomes illuminated, it's like I'm peering through a literal fog.

Contrast that to a dimmed 4k 10-bit panel, and I find myself dropping out of focus or losing my place. I'm looking for a particular OLED panel right now on AliExpress to test with.


With an open window near your desk, it's pretty easy to want a brighter display depending on the time of day. If you don't have natural light it is a much smaller concern.


I have to close my blinds to be able to see my monitors during the day.


People are different? I found it very straining on my eyes when I worked at a job where people liked to keep most of the lights off and the monitor brightness low.


The only currently acceptable external display is the LG 27-inch 5K.

Sadly the firmware is flaky, the webcam sucks, it has huge bezels, and is very thick.

Rumor is apple might have a new consumer priced external display next year. Let’s hope it’s true.


The 5K monitor model number and presumably the actual monitor has been updated a couple of times. Any chance the most recently updated design actually has corrected those issues?


I've been researching this for a couple of days before this post and am close to pulling the trigger on the LG 5K. The consensus from Googling and Twitter seems to be that, with the M1 and newer at least, the issues have been resolved. I'm still torn though, because if they are coming out with a new monitor in the next few months that also has some dedicated graphics processing and a better display, that will be really tempting. The resale value of the 5K will go down pretty quickly after purchase as a result (< 6 months).


How is the contrast and BLB on that panel? The other LG panels in this area seem to have sacrificed basically everything for low reaction times. Also, just one input, and that being Thunderbolt on top, won't work for a lot of people.

Edit: Oh I see, that's one of the LG-but-really-for-Apple-only displays.


My dream display would be close to retina PPI + 16:10 ratio.

Still running 2x 30” Apple Cinema Displays because I prefer the 1600px height.

They are awful displays by modern standards. But what I’d give to have a panel that is 5120 x 3200.


I’m in the same boat. I’m holding on to my 30” ACD until I can get a pixel-doubled version of it for under $2k.


Another major issue not mentioned with non-retina displays is that you have to choose between a decent on-screen text size and pixel scaling.

For example, I use a 4K 27" LG monitor. If I use native 4K resolution, the text is way too small. If I use 2:1 resolution, the text becomes too big and I lose real estate. I ended up with a scaled resolution with pixel approximation. I won't explain here why it may be a problem for some people, but it is.


I recently gave away one of two of my 7 year old 4K 27" monitors, thinking I could upgrade to something larger and higher resolution instead of using 2 monitors. I assumed I could get something around 35" and >4K but it doesn't exist. Aside from the Apple XDR display (32" 6K), there are 27" 4K monitors, a few 32" 4K monitors, or much larger (too large in my opinion) 4K or 8K TVs. Then there's a refresh rate arms race of smaller 1080p or 1440p monitors for gaming. And almost all of the options above aside from the XDR and large TVs look terrible next to a macbook. If you read monitor forums there's posts going back 2 years now saying "with HDMI 2.1 coming out soon, there will be more options." But nothing really appeared.

In the end I pulled the trigger on a 1440p 49" ultrawide[1] because it appeared on sale one day for $600 with only $50 shipping from US to Japan. I really like the form factor for working with references off to the side, but it's dim, washed-out, and blurry compared to the macbook I use it with. I'll be replacing it as soon as something better is available.

And why do all monitors have absolutely terrible UIs? I virtually never use any feature aside from switching inputs, and that usually takes 10 button presses. Usually the buttons are all in a row but represent up/down/left/right. Adding insult to injury, the monitor I'm now using has single button access to change the power LED color or add a crosshair to the screen. And if it loses signal for a split second, such as to change resolution, it either changes to another input or powers down. Either of these takes 10 seconds to fix. The manual shows a remote control with the buttons I want, but apparently it is no longer included, or available.

[1]https://www.monoprice.com/product?p_id=43305


There’s also a market of those small portable displays that are definitely ”retina”. I own a pair of those, one 15" and one 12.9", both UHD. Sadly, many of them, while not precisely “no name”, are of dubious quality when it comes to driver boards.

My 15" display likes to produce a loud “pop” sound with the picture going bright white for a fraction of an eyeblink. It’s also HDR without being HDR — in SDR mode, the picture is hideous and in HDR, it’s merely bearable.

The 12.9" display is way better. No issues whatsoever, I sometimes take it with me when on the road. And it’s simply gorgeous when you have a lot of text to work with.

I feel the sweet spot would be the 17" ones I’m seeing on Amazon. If only they had proper VESA holes, proper DP sockets instead of USB-C, and were reliable, I could replace my Dell P2415Qs with two of those. A good size, super small pixels, and just a fraction of the power consumed.


How are they using "a fraction of the power"?


10 W vs 45 W?


I’ve got a pair of the ultrafine 5ks. I’d happily trade them in on something that I can plug into and know that it will power up the display reliably.

Right now if I take my laptop off somewhere and come back, who knows if they’ll both power up. I’m frequently stuck on one display and no amount of unplugging and reconnecting or swearing helps.


I have 2x LG Ultrafine 5Ks as well.

I found the USB-C connector wears down over time and causes display and power issues. I've reflowed the connector itself, and am looking at doing a complete connector swap soon. The mechanicals of an active USB-C cable puts stress on the joints and causes them to crack and create unreliable connections. I wish there was a way to lock the connector in and use something else as strain relief.


I have an LG Ultrafine 5K and suffer from random display disconnections. I was wondering if I was just unlucky, but this kind of explains everything now. Is there any guide that you know of for doing the reflowing? It's at the point now where I need to spend up to an hour carefully tweaking the connector into the right position to provide a signal so I don't mind hacking away at it with the risk of it breaking.


Maybe try a powered usb-c hub and connect the displays to that instead of directly to the laptop?


You can't. These are thunderbolt 3 displays.

edit: and they use too much bandwidth to daisy chain with anything else, too.


One of them goes to a thunderbolt3 dock. When they display refuses to power up, it doesn’t matter if it’s plugged in direct or into the dock.

Also, I can see the display in the system report in the thunderbolt section, and devices daisy chained off of them work. It’s just the panel that’s being a bitch.



I have no brand loyalty whatsoever, but all my monitors have been Eizos since I bought Eizo Nanao Trinitron CRT monitors in the 1990s. I'm a lazy shopper and those things are high quality in every way. I still have FlexScan EV2303W's from 2009 or 2010 and use them daily.


While that looks like a great monitor, it's a 21.6" 4K OLED monitor that seems to retail for over $3000. It's less of a ridiculous option than the $6000 Apple 32" display, but for most of us (including, I'm fairly confident, original article author Casey Liss), it still falls into the "if you aren't sure you need this, you don't need this" category.


I bought two LG Ultrafine 5K displays in 2016 and they're still going strong. They're good — probably the best affordable retina monitor option available, but not perfect. For example, I wish I could daisy chain, and there is occasional ghosting when changing from a contrasty/bright screen to a dark screen. I also ran into problems this year with my MBP maxing out the CPU when running both displays at once — after 5 years without problems. Switching to one display solves the issue.

If Apple released an affordable pro level display with reasonably good specs, I'd make the jump.


I wish I could daisy chain too but unfortunately it’s a limitation of Thunderbolt only supporting 40Gbps. Daisy chaining two LG 5Ks would be 5120 * 2880 * 60hz * 30 bits/pixel * 2 = 53Gbps (if my math is correct)
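Your math checks out; a quick sketch (10-bit RGB, ignoring blanking):

    gbps = 5120 * 2880 * 60 * 30 * 2 / 1e9   # two 5K60 streams at 10 bits per channel
    print(gbps)                              # ~53.1, comfortably over Thunderbolt 3's 40 Gbps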


LG Ultrafines sold at Apple Stores are clearly co-designed or at least advised on by Apple. From packaging and unboxing to the mold lines of injection molded plastic parts, quality of die casts and stamped panels, clean uncluttered outer design with no buttons, it's almost like a different company made the monitor, compared to other LG ones.

I have a 4K one which is just amazing. Picture quality isn't everything (unless you work in a lab somewhere under time/budget constraints).


Are you having any image retention issues? Mine seems to hang onto images, especially along the outermost inch or so of the screen. Not a deal breaker--still love the screen--but kind of unacceptable given the price. But I bought it second hand so warranty isn't really an option for me.


That's the ghosting problem I referred to, although image retention is the accurate term. I never noticed that it was mostly the outer edges of the screen, but now that you mention it, the image persistence on mine seems to be mostly on the outer 2-4 inches of the screen.

Agreed. Not a dealbreaker, especially since it's such a great screen otherwise. Although given the price, it's a shame this problem exists.


Could you link out to the specific model? Thx


https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k... — It's the same LG 5k model from the post.


I really wanted the 27-inch LG UltraFine 5K (https://www.lg.com/us/monitors/lg-27md5kl-b-5k-uhd-led-monit...), but I heard many complaints about it not being as reliable as you'd expect, so I decided to get the LG UltraFine 4K (https://amzn.to/3qs2pFW) a little over a year ago, and I don't regret at all spending so much more on a display than I'd normally expect to.

Now, I want to update to Apple's Pro Display XDR, but can't justify paying ~7 times more on it for now. Although 24-inch and 4K was a huge improvement for someone like me, used to only using a MacBook display (I spend the workday programming, and love photography), I still feel like 4K isn't enough.

I can't comfortably open my editor/browser, while also watching a video, for example. For some time I've tried to use my MacBook 15 Pro as a second display, but that didn't work either. I hate the fact that there're small differences in color/brightness, even though the quality of either individually is superb (the external is slightly better)... The only thing that helped a little bit with regards to using multiple displays was using the iPad to play videos thanks to AirPlay, but this is suboptimal.


I have owned the LG 5K UltraFine monitor for 6 months and it's the greatest thing ever. I will never go back to 4K.

I drive it with an OWC Thunderbolt 3 dock with CalDigit Thunderbolt cables and a maxed out 13" 2020 MacBook Pro, and as of the latest version of macOS Big Sur it is rock solid.

It's mounted on a monitor arm, so I can't comment on the included stand.


How is the OWC dock? I just got the CalDigit, but super disappointed that the monitors often don't reconnect on wake from sleep.


I have both of these docks.

The OWC is my primary dock, but I did try the CalDigit one for a bit (both TB3.) Why? Same problem you're having - the monitors would never wake from sleep.

At the time I was using a Dell P2415Q with macOS and this monitor would just refuse to wake up from sleep or when I swapped laptops on the dock. It "just worked" far less than half of the time. I would have to hard power cycle by unplugging the monitor power cable - power button recycle was rarely sufficient.

I could never find a fix for this, but after a lot of searching I was led to believe that it was a problem with EDID and macOS with this particular monitor. I never confirmed this though.

The problem was almost entirely fixed with the latest macOS Big Sur update. I have to power cycle the monitor only a couple times a month now. Huge difference, and seems to be the case with both docks.


Slight differences in white balance, even after calibration, preclude the use of multiple monitors for me as well. Even if they are the same monitor, same model, our nervous system is very good at detecting these differences.

I have not settled on a new monitor either. I'm using a 24" that is slightly too small for split-screen use, but doable. 27" would be nice, but 32" would likely be the best here.

There are a few good ones in the works. LG announced a new "IPS black" technology, which offers better contrast than your MacBook screen (claimed 2000:1, we'll see). As well, LG is manufacturing a few monitors for Apple this year. Some of these will be miniLED like the XDR display, but cheaper as manufacturing costs have likely reduced significantly.

The only monitor I'd even consider right now is the LG 27GN950 or one of the 5k2k displays, though the latter are notoriously finicky I've heard and currently not usable on a Mac without a dedicated GPU.


The mount for the 4K display sucks. I don't know why the hell they designed it this way, but it's a rectangle that protrudes about 5cm in front of your display. This means it forces your keyboard to be placed closer than it should be, depending on the distance you use it at.


So I have a 28" 4K display (USB-C, ethernet, the whole shebang) and I don't notice a massive difference between that one and the MacBook one. Those displays are widely available and pretty cheap too.

Is there something wrong with me that I don't see a massive improvement between that PPI and the MBP one? To the point where I'd lose my mind about it and demand 5K there?


You probably sit farther away from your screen than most people. At least for me there's a pretty binary threshold where you can either see pixels or not, which makes a huge difference for text smoothness.


Same here. If I get up really close, I can tell the difference, but not the way I normally sit in my seat. It probably helps that I have a fairly deep desk and monitor arms, for maximum eye-monitor distance.

Retina is 326 PPI+ on iPhones. It’s 220 PPI+ on MacBooks. I have to say that my 170 PPI or so is plenty on my 4k 28” desktop monitors. Though I would be very very happy to upgrade to 120hz (without turning my laptop into a jet engine) and maybe 5k as well for perfect 2:1 1440p scaling.


I've been waiting for a solid 5K display that meets my requirements since 2011. In 2011 I said, "when I replace my iMac, I can't wait to have a few 5K 120Hz displays!"

In May 2021 I did my workstation build. I just bought new value 1440p panels while I sit here with my arms crossed, wondering when the 5K displays which'll meet my requirements will exist.


Got a Huawei Mateview recently - 28.2", 4K+, 3:2 aspect ratio, USB-C charging, £500! Fantastic monitor with a great feature set. Retina screen similar to M1 Macbook quality, can barely tell the difference. Looks just like the Apple Pro Display, but with a better aspect ratio for devs. 100% Recommended.


I am hanging on to dear life with a slowly dying Dell UP2715K in the hope that 2022 will be the year someone finally makes a decent replacement for the glossy 5k displays that came out years ago.

The fact that the pro xdr only has one port makes it a non starter for me. I need to connect more than one device to my monitors.


My UP2715K broke recently, and I thought, well, 4k can't be that bad; there are almost no 5k monitors, so perhaps the difference isn't that big of a deal. So I bought Dell's UP2720Q. Brilliant colors, can't complain at all on that point, but text is noticeably blurry compared to what I'm used to. This is such a bummer.


So sample size of 1, but I have had the "Apple" LG 5K for 3 months now and it is great. I was able to drop the TB dock and wire my external USB-C HDs right to it. It is hooked to M1 Mac Mini running the latest MacOS. This monitor replaced the "Apple" LG 4K which I ran for ~2 years.


Very basic geometry problem, in which I discover that a 32-inch display with a pixel density of 218 pixels/inch (to match Apple displays) requires 6108 x 3436 pixels.

An 8k display would be 40 inches.

x = (16/9)·y, x² + y² = 32², pixels = (218·x) × (218·y), x > 0, y > 0
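Same calculation as a small Python sketch (exact pixel counts differ by a percent or so depending on rounding and the precise ppi you plug in):

    import math

    def resolution(diag_in, ppi, aspect=(16, 9)):
        # width/height in inches from the diagonal and aspect ratio, then scale by ppi
        k = diag_in / math.hypot(*aspect)
        return round(aspect[0] * k * ppi), round(aspect[1] * k * ppi)

    def diagonal_inches(w_px, h_px, ppi):
        return math.hypot(w_px, h_px) / ppi

    print(resolution(32, 218))              # roughly (6080, 3420)
    print(diagonal_inches(7680, 4320, 218)) # ~40.4" - the "8K would be 40 inches" figure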


I have the lg ultrafine 4K. Picture quality is obviously a lot better than the matte and lower resolution displays available for windows. It adjusts brightness for the room which is also nice. It has quite an annoying bug though in that whenever there is a power outage, it never wakes up. I have to get another monitor, plug only that monitor in, plug the lg back in after restart, unplug other monitor just to get it working again.

Resolution is not quite retina but good enough. Sound quality is not as good as iMacs on built in speakers. It has no web cam but not sure I care.

I used to be content with windows and low resolution monitors. I think the eyes just adjust and make up for anything that is lost. I am not even sure resolution even matters. 4K video would be nice, but very little content is in 4k.


For your consideration a calculator.

https://www.designcompaniesranked.com/resources/is-this-reti...

According to its methodology, which seems reasonable, a 26" 4K display is "retina" at a typical 20" viewing distance, and a 27" at 21".

I think a 27" is so near as good as to be indistinguishable despite being a reasonable upgrade in screen area.

I have something very similar to this.

https://www.newegg.com/p/N82E16824025891

I'd easily put it in the position of the bargain choice.


Can anyone tell me how to force an M1 Mac Mini to produce crisp images on a 27" 1440p (~110dpi) screen? I've tried various things and cannot get it to render anywhere near the clarity of Windows. I'm not going to buy a 'Hi-DPI' (220dpi) screen just to solve Apple's rendering problems.


I have two of the LG 24-inch 4K monitors from this list. They’re great for web browsing and Terminal work.

I use them with an Intel Mac mini and unfortunately the power management doesn’t work. Sometimes on computer boot, the monitors just don’t come on, and sleep is problematic too. Sometimes unplugging and replugging the video cable works, sometimes no. Sometimes the only fix is another reboot, which seems to reset something.

I’ve found that Thunderbolt to Displayport cables work best, and I disabled all monitor power management both in the Mac and the monitors. Still sporadic issues though. Don’t know whether to blame the monitors or the Mac.

Other than that, no complaints.


>So, all-in, the Pro Display XDR is $7,000. Which is, charitably, absurd.

On what planet is it "absurd"? Because the calibrated monitors it competes with (from Sony, etc) cost around $10K to $20K.


It is absurd if you compare it not to other video reference monitors but to being the only alternative for a large screen with 200+ ppi resolution. The problem is the huge gap between this screen and almost every other screen out there. There are plenty of developers who would be happy to pay good money for a 30" 200+ ppi screen. But the Apple Pro Display is very expensive for that and has some weaknesses, being optimized for something else (video production). It also isn't a really great screen otherwise. It doesn't even come with a second video input.


I also found it difficult to shop for a monitor in the past couple of years. I ended up going for this: https://www.gigabyte.com/Monitor/AORUS-FV43U#kf

My constraints were proper 4k (no compromised height resolution) -- the 4:3 ratio is nice, but I was happy with most ratios. Overall, at 43 inches it's big, probably more suited to console gaming than PC, but the 165Hz refresh rate makes it suitable for gaming.


I use dual LG 24UD58-B’s (one landscape, one portrait) for coding and I’ve been satisfied.

I don’t run them at native res, but this never bothered me. I also don’t game or edit photos or video either.


I just want a 27" 5k display with 120hz and HDR 600 or better.


Why is it so difficult to get the right screen for a mac, relative to how easy it is with Windows? Is this a proactive design choice by Apple, or just a lack of care?


The issue is subpixel antialiasing. It got removed in Big Sur, and hence non-retina screens have blurry fonts.


But, why?


I will put another monitor into the mix because it is the first of its kind and might interest some: https://www.bhphotovideo.com/c/product/1621993-REG/lg_27ep95...

It is a full 27" OLED panel for $$$ ($3,000), but it is not the highest DPI at only 4K resolution


OLED isn't that great in the monitor space because you are going to get burn-in; it's only a matter of 'how long?'.


When the original Retina Macs came out, they basically ruined me for non-Retina laptop screens. However, regular 27" 2560x1440 screens were just fine.

In 2017 I was able to pick up a deal on the Anandtech or [H] forums, 2016 15" MacBook Pro and the original LG 4K USB-C display for $1600!

I still have the screen to this day. It is still amazing. I never ended up buying a 27" 5K display.


When I’m using the laptop as a standalone unit, the built-in screen resolution makes a difference. When it’s docked on my desk surrounded by two 24” monitors, I wouldn’t be able to tell the difference except, perhaps, by very subtle cues in text rendering.

I can, however, see clear artifacts on the two 100-dpi ish screens. A 150-dpi screen would work perfectly for me.


Not knowing too much about monitors and trying to search for "I want a 24in monitor that looks somewhat similar to my macbook" has been kind of challenging. Especially when you're shopping online. So many variables to tweak, and I have no idea which matter and which don't. I just want something that looks good.


I bought 2 XDRs because they are stupid-cheap compared to BARCO clinical options. Check out the Coronis and Steris lines if you really want to spend some money. It's been a while since I did the comparison, but as I recall, the BARCOs are not hands-down better than the XDR, despite 5x the price.


I've found running 4K @ 2x/200% scaling "retina" to be the best price/performance option right now. 5K displays are nowhere to be seen and 6K/8K have absurd prices.

My dream monitor would be a dual-5k super-ultrawide, even at 60Hz. That's at least 5 years away if not 10.


New Apple displays are rumored to be coming up: https://www.macrumors.com/2021/12/16/apple-displays-24-and-2...


A lot of this depends on the distance of the monitor from your face. High actual DPI matters more if the screen is close to your eyes. So a 24" 4k is great at close range, but if you move the screen 8in-12in back, a 32in 4K can work just as well.


Every time I buy the LG 5K I feel a little sad (it's overpriced for what it is, outdated, and limited with non-Macs), but every time I use it, I'm happy with the purchase.

Biggest downside (aside from the price) is extending the Apple lock-in.


I've also had a great experience w/ this monitor.

As mentioned elsewhere, it and my mac are sensitive to the cable, so it's worth trying a couple if they give problems. (More apple idiocy: what if we put the same connector everywhere, but made devices super sensitive to some qualities of the cables, connectors, or who knows what -- there's definitely no way this will be super confusing for users!)


This is also somewhat usb-c in a nutshell. Not all ports or cables are made equally, and the spec very much allows that.


I have two. One has a bad burn-in problem: an image takes 30 minutes to set in, and for the rest of the day I see that original image. The other has a weird color wave.

I used to buy an iMac 27 so I could have a great first screen. I’d like to go back, or have an Apple monitor again.


I can actually connect it to Ubuntu and use some code to simulate the physical buttons, so you are not totally locked into the Apple ecosystem.


Here's a handy website for finding bare display panels (panels without a frame or enclosure):

https://www.panelook.com/modelsearch.php?op=advancedsearch


When talking about high-end monitors, I'm always surprised that nearly nobody lists Eizo among the choices.

https://www.eizoglobal.com/


Fwiw, Eyoyo offers 4K 15” monitors, which qualify as _retina displays_. Those are usually around $150, depending on reseller and exact model.


What's the smallest 4K (external, I don't want your meme laptops with 4k displays in the recommendation pool) monitor currently on the market?


There's a variety of 15.6" portable monitors, most seeming to have the same specs and obviously based on some laptop panel.


I had completely forgotten about those when I wrote my question. IIRC, Dell makes a 24" 4K monitor, but I can't recall ever encountering a 22" monitor at 4K.



That's a 27" 4K, so it's already low PPI compared to the monitors in the article.


That's double the PPI compared to 27" 1080p ;) Granted, the usual "non-retina" 27-inch resolution was 1440p, but it's still a monitor you could use 2x scaling on. Yeah, 1.75x or even 1.5x would be preferable if you want more screen space, but I like the huge UI elements.


Exactly - 27" 1080p is too large, 27" 1440p is nice for UI size for me, and then it's about extra pixels for sharpness & clarity.

Unfortunately, the OSes I want to use (macOS and Linux) don't do fractional scaling well, though Linux can work here and is getting better. Windows does fractional scaling well in theory, but I find it works poorly in practice because app support for it is really patchy.


But only when they make a glossy one: https://www.youtube.com/watch?v=3mTV1TOblbA


I'm disappointed that this was about monitors and not something I could clip onto my glasses to project onto my retinas.


Thank Apple for fucking up the term Retina Display. It used to mean something very specific until the marketing department at Apple overheard it. Same happened earlier to "real time".


Oh don't worry, I'll bet they re-use it when they have a working retina display. And the trademark will still be in place!


I am currently looking at the Dell or Lenovo 40-inch 5K2K displays... I want to pair one with a 2020 MacBook Air. Any thoughts?


How is the Ultrafine “unreliable”? What functionality would a monitor even have that could possibly break?


Here’s a list of issues the LG 5k Ultrafine has had over the years:

The initial batch would get disrupted if a WiFi antenna (like the kind in an ordinary laptop) was too close. There was a recall to replace the shielding.

Sporadic dropping of devices connected to the internal USB-C hub (mine did this all the time when I had 2.5” HDs connected; it got better when I switched to SSDs, and mostly disappeared when I switched to an M1 Mac).

Ghosting (I currently have this problem bigly).

The pins in the powered USB-C socket that carry data are poorly soldered. With years of wear, this can result in displays that can power a laptop but not necessarily show a picture.

The whole product line is a big fat lemon.


I’m not sure about others, but for me it’s been issues when connecting the monitor to my laptop.

I bring my laptop in each morning and after I plug it into the monitor I have issues.

Often the monitor doesn’t correctly detect the MacBook, so it doesn’t turn on properly even though the MacBook responds by spreading everything across two screens. It can lead to 5-10 minutes of plugging and unplugging and switching the monitor on and off.

Not an experience anyone wants.


Why the insistence on Retina displays, though? If the pixel count is too low for the monitor size, place the monitor further away, and use more monitors if necessary. E.g. you could have 2x 30-inch 4K monitors: the DPI is low, but you can place the monitors further away to compensate.

Having monitors further away from the eyes may also help to prevent eye strain from focusing too close for too long.
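For a rough number on "further away to compensate": solving for the distance at which a panel crosses the usual ~60 pixels-per-degree (one arcminute per pixel) threshold, with purely illustrative inputs:

    import math

    def retina_distance_in(width_px, height_px, diagonal_in, target_ppd=60):
        # Distance (inches) at which the panel reaches target_ppd pixels per
        # degree of visual angle; one degree spans ~2*d*tan(0.5 deg) inches.
        ppi = math.hypot(width_px, height_px) / diagonal_in
        return target_ppd / (ppi * 2 * math.tan(math.radians(0.5)))

    print(round(retina_distance_in(3840, 2160, 30), 1))  # 30in 4K -> ~23 in
    print(round(retina_distance_in(3840, 2160, 27), 1))  # 27in 4K -> ~21 in

So a pair of 30-inch 4Ks pushed back to roughly two feet is already past the point where most eyes stop resolving individual pixels.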


It's shameful that Dell's 8K monitor is sabotaged by Apple.


I would add the LG CX/C1 48-inch. The lowest price has come down to around $1,000 + tax in recent months. It has all the features and packs a top-tier OLED panel.


I find that anything bigger than 32" at desktop distances is uncomfortable at best, unfortunately. 27" is definitely the sweet spot for me.


You need a deeper desk.


That’s basically a small TV - it’s nowhere near retina, and 4k is barely acceptable at 1/4 that size.


Unfortunately there are no 40-43 inch models like the other usual 4K TV sizes; 48" is the smallest we can get for now.


Where PC displays ought to be in 2021:

At the high-end: 6K or 8K 120Hz 1000 nit, 12-bit HDR in various sizes from 27" up to 40" where this kind of resolution makes sense.

At the "enthusiast" or "gamer" tier: 4K 240Hz with 10-bit HDR, either 1000 nit LCD or 600 nit OLED. (From 2022, RGB OLED with 1000 nit brightness)

In both cases wide-gamut colour (Display P3 or wider) should be available and "just work". Ideally much wider, near Rec.2020. I still use an 8-year-old(!) monitor with 100% AdobeRGB coverage. For some bizarre reason, this is a "high bar" that 99% of currently sold PC monitors don't reach, despite quantum dot filters being a thing since 2013(!!).

This isn't wishful thinking, this is all existing, off-the-shelf technology widely available elsewhere in the general consumer market -- just not for PCs.

So for example, my iPhone has a wide-gamut, 1000 nit HDR OLED screen with 120 Hz.

LG sells an 8K OLED with 120 Hz, and 240 Hz 4K televisions are commonplace. Most TVs have 1000+ nit HDR, and OLEDs will reach about 1000 nits in 2022.

Most of the above is simply unavailable in the PC display world, or is only available "piecemeal". I just got a laptop with an OLED 4K HDR display, but it is limited to 60 Hz. This isn't an input cable limitation; it's a built-in display with a 5 cm long cable!

There are exactly zero 120 Hz PC OLED monitors available on the market under 40" (i.e.: not a re-purposed television). None.

Back to my laptop: HDR "works" in the checkbox tick sense. The panel is physically very good, but even on Windows 11 with the absolute latest drivers it's hit & miss:

- Windows is not colour managed by default. Still. In the nearly-2022 era. Apps randomly look correct or have visibly "stretched" gamuts. Most things are limited to sRGB and can't use the display's native gamut, despite wide-gamut support for the UI being available as far back as Vista.

- HDR content is randomly correct by accident (YouTube), overbright and clipped horribly (also YouTube for some reason), or way too dark (NetFlix).

- Dolby Vision doesn't work at all. The "Dolby Access" app is abandonware that does nothing. (Again, both my TV and my phone can play DV content!)

- Games don't work in HDR at all on the internal OLED display, but do work with an external TV plugged in. So the GPU is capable of it, it just refuses to work!

- The panel is 10-bit capable, but only if HDR is turned on.

- With HDR on the screen stays black when the laptop resumes from sleep, making this a no-go for general use. This is a known issue that has persisted through at least three semi-annual releases of Windows, with a WONTFIX response from Microsoft!

This is where we are on the eve of 2022: a super-high-end Windows computer screws up the colours for apps, can't reliably play HDR, refuses to use it for games, and dies if HDR is used before it goes to sleep. It doesn't matter anyway, because HDR displays are rarer than hen's teeth and won't be widely available until 2023 at the earliest, some time after TV manufacturers stop making LCDs entirely and switch 100% of their production over to OLEDs.

Don't worry, the old LCD lines will be used to make faded, sub-sRGB displays for us poors.


How about the ASUS 32" PA329CV?


Literally going the opposite direction there - 32” at 4k means huuuuge pixels…


Get an iPad Pro. Problem solved.


An iPad Pro is only 12.9” and you’re stuck with one resolution. You’re better off buying the 24” LG 4K (I think; I don’t remember it being configurable).

That said, it makes a very good external monitor for mobile use.



