"The original XGA monitor, considered “high resolution” at the time, had a 16” diagonal and 82ppi, which translated 36 to 45 pixels per degree (ppd) from 0.5 meters to 0.8 meters (typical monitor viewing distance), respectively. Factoring in the estimated FOV and resolutions, the Apple Vision Pro is between 35 and 40 ppd or about the same as a 1987 monitor."
Okay, now wait a minute. It's easy to scoff at technology that was state of the art in 1987 and make it sound shitty, but 1024 by 768 with a 16-inch diagonal seems awfully close to the monitor I'm typing this on right now: a 1920 by 1080, 27-inch LCD, sitting three feet away.
I used https://qasimk.io/screen-ppd/ to compare some of the displays I have around to the IBM XGA monitor. I do have one 1440p monitor, but I also have a 1920x1080 that I use all the time, and when I'm actually in the office, the monitors there are 1920x1080 too.
To hear this person describe it, the monitor that I and a ton of other people use today has the resolution of an ancient relic. A 1024x768 monitor with a 16-inch diagonal has effectively the same pixels per inch as a 1920x1080 with a 27-inch diagonal. Is a 27-inch 1080p monitor really worth characterizing as obsolete, insufficient technology today?
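Here's a rough sketch of that math in Python, using the same geometry a PPD calculator does (small-angle approximation at screen center; the 3-foot viewing distance is the one I mentioned above, so treat it as a back-of-the-envelope check, not gospel):

    import math

    def ppi(w_px, h_px, diag_in):
        # pixels along the diagonal divided by the diagonal length in inches
        return math.hypot(w_px, h_px) / diag_in

    def ppd(ppi_val, dist_in):
        # pixels covered by one degree of visual angle at this distance
        # (small-angle approximation, measured at screen center)
        return ppi_val * dist_in * math.radians(1)

    print(ppi(1024, 768, 16))            # IBM XGA, 16" diagonal: ~80 ppi
    print(ppi(1920, 1080, 27))           # 1080p, 27" diagonal:   ~82 ppi (nearly the same)
    print(ppd(ppi(1920, 1080, 27), 36))  # that 27" panel at 3 feet: ~51 ppd

By this estimate, the 27" 1080p panel at three feet comes out around 51 ppd, which is actually above the article's 35-40 ppd figure for the Vision Pro, at least with this crude center-of-screen approximation.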
24" is generally considered the upper limit of where 1080p is reasonable. 27" is getting to be a notably low PPI for text and UI, although it's generally fine for games and other motion content.
Normally at 27" you'd want at least 1440p these days, and that's been pretty standard for around 10 years now. Those Pixio PX276/PX277 monitors and similar have been around for a long time even as budget products - the panels they were using come from cast-offs from the premium panels.
Which isn't to say you won't find 25" 1080p monitors and similar junk in offices, but... they're not buying you the good stuff either.
Not to say it's not fine for you! But generally people today would be looking for higher res, either 1440p or 4K, either of which is under $250 these days for a 27", and often a pretty nice one (nano-IPS, 144 Hz, wide color gamut, etc.).
It just feels wild to me that something you'd find in offices today would be dismissed as so ridiculous as to not even be usable for text. The article has to take for granted that any display with a resolution below 4K is not sufficient for reading text. That's unsupportable.
I'm not arguing that 1440p or 4K isn't better than 1080p. I'm not saying that monitors aren't cheaper than the Apple Vision Pro. I know that text at 1440p looks better than at 1080p.
I'm just saying it seems more ridiculous to claim, in effect, "the monitors that people use in offices all around the world today cannot be used as monitors, and welcome to 1987 if you think otherwise," just because there exist alternatives that people would prefer to use.
For goodness' sake, 61 percent of people running Steam have their resolution set to 1080p.
I'm just saying it's uncharitable for the author to make it seem as if the resolution that a vast swath of people use in 2023 is so inadequate as to not be worth taking seriously. He's saying "Yes, they will “work” as a monitor if you are desperate and have nothing else, but having multiple terrible monitors is not a solution many people will want", and buried in that is the assumption that the majority of people are using "terrible" monitors that only “work” in sarcasm quotes. I don't think that's reasonable.
IMHO 1080p is unacceptable today, and if my employer provided me with a laptop at that resolution it would be bad for productivity and a slap in the face. On a full-size monitor it's only worse. Retina screens have been a thing on laptops since 2012, and before that on mobile devices. Let's stop accepting fuzzy text as normal.
FWIW, I'm using a 1920x1080 14" screen when not on a 4K 28" screen. Both have almost identical pixels per degree. (They are identical in dot pitch, but I don't use exactly the same viewing distance for laptop vs. desk...)
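For anyone who wants to check, the dot-pitch claim is easy to verify, assuming the 4K panel is standard 3840x2160 UHD:

    import math

    # pixels along the diagonal / diagonal length in inches = PPI
    for w, h, diag in [(1920, 1080, 14), (3840, 2160, 28)]:
        print(f'{diag}": {math.hypot(w, h) / diag:.1f} ppi')
    # both come out to ~157 ppi, so at equal viewing distance
    # the pixels per degree would be equal too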
I would be absolutely aggravated to work at half the linear resolution. I would tolerate it for some fast-paced game as a practical tradeoff for framerate, but not for text or static image viewing.
I have seen a 16" 4K laptop and considered it overkill or even impractical, i.e. pixels are too small to resolve and naively scaled icons or text are unreadable without a magnifier. But my usual screens described up top are above that threshold for me, and I can resolve single pixel gaps much of the time.